Sample records for "large computational burden"

  1. A multiresolution approach to iterative reconstruction algorithms in X-ray computed tomography.

    PubMed

    De Witte, Yoni; Vlassenbroeck, Jelle; Van Hoorebeke, Luc

    2010-09-01

    In computed tomography, the application of iterative reconstruction methods in practical situations is impeded by their high computational demands. Especially in high-resolution X-ray computed tomography, where reconstruction volumes contain a large number of volume elements (several gigavoxels), this computational burden has prevented their widespread adoption. Besides the large number of calculations, iterative algorithms require the entire volume to be kept in memory during reconstruction, which quickly becomes cumbersome for large data sets. To overcome this obstacle, we present a novel multiresolution reconstruction approach, which greatly reduces the required amount of memory without significantly affecting the reconstructed image quality. It is shown that, combined with an efficient implementation on a graphics processing unit, the multiresolution approach enables the application of iterative algorithms in the reconstruction of large volumes at an acceptable speed using only limited resources.

  2. Exploring Cloud Computing for Large-scale Scientific Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, Guang; Han, Binh; Yin, Jian

    This paper explores cloud computing for large-scale data-intensive scientific applications. Cloud computing is attractive because it provides hardware and software resources on demand, which relieves the burden of acquiring and maintaining a huge amount of resources that may be used only once by a scientific application. However, unlike typical commercial applications, which often require only a moderate amount of ordinary resources, large-scale scientific applications often need to process enormous amounts of data in the terabyte or even petabyte range and require special high-performance hardware with low-latency connections to complete computation in a reasonable amount of time. To address these challenges, we build an infrastructure that can dynamically select high-performance computing hardware across institutions and dynamically adapt the computation to the selected resources to achieve high performance. We have also demonstrated the effectiveness of our infrastructure by building a systems biology application and an uncertainty quantification application for carbon sequestration, which can efficiently utilize data and computation resources across several institutions.

  3. Waveform inversion with source encoding for breast sound speed reconstruction in ultrasound computed tomography.

    PubMed

    Wang, Kun; Matthews, Thomas; Anis, Fatima; Li, Cuiping; Duric, Neb; Anastasio, Mark A

    2015-03-01

    Ultrasound computed tomography (USCT) holds great promise for improving the detection and management of breast cancer. Because they are based on the acoustic wave equation, waveform inversion-based reconstruction methods can produce images that possess improved spatial resolution properties over those produced by ray-based methods. However, waveform inversion methods are computationally demanding and have not been applied widely in USCT breast imaging. In this work, source encoding concepts are employed to develop an accelerated USCT reconstruction method that circumvents the large computational burden of conventional waveform inversion methods. This method, referred to as the waveform inversion with source encoding (WISE) method, encodes the measurement data using a random encoding vector and determines an estimate of the sound speed distribution by solving a stochastic optimization problem using a stochastic gradient descent algorithm. Both computer simulation and experimental phantom studies are conducted to demonstrate the use of the WISE method. The results suggest that the WISE method maintains the high spatial resolution of waveform inversion methods while significantly reducing the computational burden.
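
    The source-encoding trick described in this abstract can be illustrated with a toy linear model. This is a sketch only: the actual WISE method operates on the acoustic wave equation, and all operators and sizes below are made up. The idea is that instead of computing one gradient per source, a random ±1 encoding vector superposes all sources, so each stochastic gradient costs a single forward-model evaluation.

```python
import numpy as np

# Toy source-encoding demonstration. NOT the authors' wave-equation solver:
# the forward model here is linear, with hypothetical per-source operators A_s.
# With S sources, a full gradient costs S forward models; a random +/-1
# encoding vector w superposes the sources so one stochastic gradient costs
# a single forward model, yet is unbiased because E[w_s * w_t] = delta_st.
rng = np.random.default_rng(0)
S, m, n = 8, 20, 10                       # sources, measurements, unknowns
A = rng.standard_normal((S, m, n))        # hypothetical forward operators
x_true = rng.standard_normal(n)           # "sound speed" ground truth
data = np.einsum('smn,n->sm', A, x_true)  # noiseless per-source measurements

x = np.zeros(n)
step = 1e-3
for _ in range(2000):
    w = rng.choice([-1.0, 1.0], size=S)    # random encoding vector
    A_enc = np.einsum('s,smn->mn', w, A)   # encoded "super-source" operator
    d_enc = w @ data                       # correspondingly encoded data
    resid = A_enc @ x - d_enc
    x -= step * (A_enc.T @ resid)          # one stochastic gradient step

err = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
print(err < 1e-3)
```

    Because the encoding weights have zero mean and unit variance, the encoded gradient is an unbiased estimate of the full gradient over all sources, which is what justifies the stochastic gradient descent iteration.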

  4. A locally p-adaptive approach for Large Eddy Simulation of compressible flows in a DG framework

    NASA Astrophysics Data System (ADS)

    Tugnoli, Matteo; Abbà, Antonella; Bonaventura, Luca; Restelli, Marco

    2017-11-01

    We investigate the possibility of reducing the computational burden of LES models by employing local polynomial degree adaptivity in the framework of a high-order DG method. A novel degree adaptation technique, designed specifically to be effective for LES applications, is proposed, and its effectiveness is compared to that of other criteria already employed in the literature. The resulting locally adaptive approach achieves significant reductions in the computational cost of representative LES computations.

  5. Topics in programmable automation. [for materials handling, inspection, and assembly]

    NASA Technical Reports Server (NTRS)

    Rosen, C. A.

    1975-01-01

    Topics explored in the development of integrated programmable automation systems include: numerically controlled and computer controlled machining; machine intelligence and the emulation of human-like capabilities; large scale semiconductor integration technology applications; and sensor technology for asynchronous local computation without burdening the executive minicomputer which controls the whole system. The role and development of training aids, and the potential application of these aids to augmented teleoperator systems are discussed.

  6. Decentralized Optimal Dispatch of Photovoltaic Inverters in Residential Distribution Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dall'Anese, Emiliano; Dhople, Sairaj V.; Johnson, Brian B.

    Summary form only given. Decentralized methods for computing optimal real and reactive power setpoints for residential photovoltaic (PV) inverters are developed in this paper. It is known that conventional PV inverter controllers, which are designed to extract maximum power at unity power factor, cannot address secondary performance objectives such as voltage regulation and network loss minimization. Optimal power flow techniques can be utilized to select which inverters will provide ancillary services, and to compute their optimal real and reactive power setpoints according to well-defined performance criteria and economic objectives. Leveraging advances in sparsity-promoting regularization techniques and semidefinite relaxation, this paper shows how such problems can be solved with reduced computational burden and optimality guarantees. To enable large-scale implementation, a novel algorithmic framework is introduced - based on the so-called alternating direction method of multipliers - by which optimal power flow-type problems in this setting can be systematically decomposed into sub-problems that can be solved in a decentralized fashion by the utility and customer-owned PV systems with limited exchanges of information. Since the computational burden is shared among multiple devices and the requirement of all-to-all communication can be circumvented, the proposed optimization approach scales favorably to large distribution networks.
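
    The alternating direction method of multipliers mentioned in this abstract can be sketched on a deliberately simple consensus problem. This is not the paper's optimal power flow formulation; the local costs below are hypothetical. Each "device" minimizes a private quadratic cost while agreeing with a shared variable held by a coordinator, exchanging only its local estimate and dual variable, mirroring the limited-information message passing the abstract describes.

```python
import numpy as np

# Minimal consensus-ADMM sketch (illustrative only): device i minimizes
# f_i(x_i) = 0.5*(x_i - a_i)^2 subject to agreement x_i = z with a shared
# variable z held by the coordinator. The consensus minimizer is mean(a).
a = np.array([1.0, 3.0, 5.0, 7.0])   # hypothetical local target setpoints
n = len(a)
rho = 1.0                            # ADMM penalty parameter
x = np.zeros(n); z = 0.0; u = np.zeros(n)

for _ in range(100):
    # local (per-device) updates, solvable in closed form for this cost
    x = (a + rho * (z - u)) / (1.0 + rho)
    # coordinator update: average of device proposals
    z = np.mean(x + u)
    # dual updates push the devices toward consensus
    u = u + x - z

print(round(z, 4))   # prints 4.0, the mean of the local targets
```

    The appeal in the distributed setting is that each x-update uses only device-local data, so the per-iteration work is shared across the devices exactly as the abstract claims.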

  7. Tracking the NGS revolution: managing life science research on shared high-performance computing clusters.

    PubMed

    Dahlö, Martin; Scofield, Douglas G; Schaal, Wesley; Spjuth, Ola

    2018-05-01

    Next-generation sequencing (NGS) has transformed the life sciences, and many research groups are newly dependent upon computer clusters to store and analyze large datasets. This creates challenges for e-infrastructures accustomed to hosting computationally mature research in other sciences. Using data gathered from our own clusters at UPPMAX computing center at Uppsala University, Sweden, where core hour usage of ∼800 NGS and ∼200 non-NGS projects is now similar, we compare and contrast the growth, administrative burden, and cluster usage of NGS projects with projects from other sciences. The number of NGS projects has grown rapidly since 2010, with growth driven by entry of new research groups. Storage used by NGS projects has grown more rapidly since 2013 and is now limited by disk capacity. NGS users submit nearly twice as many support tickets per user, and 11 more tools are installed each month for NGS projects than for non-NGS projects. We developed usage and efficiency metrics and show that computing jobs for NGS projects use more RAM than non-NGS projects, are more variable in core usage, and rarely span multiple nodes. NGS jobs use booked resources less efficiently for a variety of reasons. Active monitoring can improve this somewhat. Hosting NGS projects imposes a large administrative burden at UPPMAX due to large numbers of inexperienced users and diverse and rapidly evolving research areas. We provide a set of recommendations for e-infrastructures that host NGS research projects. We provide anonymized versions of our storage, job, and efficiency databases.

  8. Tracking the NGS revolution: managing life science research on shared high-performance computing clusters

    PubMed Central

    2018-01-01

    Abstract Background Next-generation sequencing (NGS) has transformed the life sciences, and many research groups are newly dependent upon computer clusters to store and analyze large datasets. This creates challenges for e-infrastructures accustomed to hosting computationally mature research in other sciences. Using data gathered from our own clusters at UPPMAX computing center at Uppsala University, Sweden, where core hour usage of ∼800 NGS and ∼200 non-NGS projects is now similar, we compare and contrast the growth, administrative burden, and cluster usage of NGS projects with projects from other sciences. Results The number of NGS projects has grown rapidly since 2010, with growth driven by entry of new research groups. Storage used by NGS projects has grown more rapidly since 2013 and is now limited by disk capacity. NGS users submit nearly twice as many support tickets per user, and 11 more tools are installed each month for NGS projects than for non-NGS projects. We developed usage and efficiency metrics and show that computing jobs for NGS projects use more RAM than non-NGS projects, are more variable in core usage, and rarely span multiple nodes. NGS jobs use booked resources less efficiently for a variety of reasons. Active monitoring can improve this somewhat. Conclusions Hosting NGS projects imposes a large administrative burden at UPPMAX due to large numbers of inexperienced users and diverse and rapidly evolving research areas. We provide a set of recommendations for e-infrastructures that host NGS research projects. We provide anonymized versions of our storage, job, and efficiency databases. PMID:29659792

  9. 75 FR 39003 - SAFRA Act Payments to Loan Servicers for Job Retention

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-07

    ... obtain this document in an accessible format (e.g., braille, large print, audiotape, or computer diskette... Executive Order 12866 and its overall requirement of reducing regulatory burden that might result from these.../index.html . Waiver of Rulemaking and Delayed Effective Date Under the Administrative Procedure Act (APA...

  10. Breast ultrasound computed tomography using waveform inversion with source encoding

    NASA Astrophysics Data System (ADS)

    Wang, Kun; Matthews, Thomas; Anis, Fatima; Li, Cuiping; Duric, Neb; Anastasio, Mark A.

    2015-03-01

    Ultrasound computed tomography (USCT) holds great promise for improving the detection and management of breast cancer. Because they are based on the acoustic wave equation, waveform inversion-based reconstruction methods can produce images that possess improved spatial resolution properties over those produced by ray-based methods. However, waveform inversion methods are computationally demanding and have not been applied widely in USCT breast imaging. In this work, source encoding concepts are employed to develop an accelerated USCT reconstruction method that circumvents the large computational burden of conventional waveform inversion methods. This method, referred to as the waveform inversion with source encoding (WISE) method, encodes the measurement data using a random encoding vector and determines an estimate of the speed-of-sound distribution by solving a stochastic optimization problem using a stochastic gradient descent algorithm. Computer-simulation studies are conducted to demonstrate the use of the WISE method. Using a single graphics processing unit card, each iteration can be completed within 25 seconds for a 128 × 128 mm² reconstruction region. The results suggest that the WISE method maintains the high spatial resolution of waveform inversion methods while significantly reducing the computational burden.

  11. On the impact of approximate computation in an analog DeSTIN architecture.

    PubMed

    Young, Steven; Lu, Junjie; Holleman, Jeremy; Arel, Itamar

    2014-05-01

    Deep machine learning (DML) holds the potential to revolutionize machine learning by automating rich feature extraction, which has become the primary bottleneck of human engineering in pattern recognition systems. However, the heavy computational burden renders DML systems implemented on conventional digital processors impractical for large-scale problems. The highly parallel computations required to implement large-scale deep learning systems are well suited to custom hardware. Analog computation has demonstrated power efficiency advantages of multiple orders of magnitude relative to digital systems, while performing nonideal computations. In this paper, we investigate typical error sources introduced by analog computational elements and their impact on system-level performance in DeSTIN, a compositional deep learning architecture. These inaccuracies are evaluated on a pattern classification benchmark, clearly demonstrating the robustness of the underlying algorithm to the errors introduced by analog computational elements. A clear understanding of the impacts of nonideal computations is necessary to fully exploit the efficiency of analog circuits.

  12. How Computer Literacy and Socioeconomic Status Affect Attitudes Toward a Web-Based Cohort: Results From the NutriNet-Santé Study

    PubMed Central

    Méjean, Caroline; Andreeva, Valentina A; Kesse-Guyot, Emmanuelle; Fassier, Philippine; Galan, Pilar; Hercberg, Serge; Touvier, Mathilde

    2015-01-01

    Background In spite of the growing literature in the field of e-epidemiology, clear evidence about computer literacy or attitudes toward respondent burden among e-cohort participants is largely lacking. Objective We assessed the computer and Internet skills of participants in the NutriNet-Santé Web-based cohort. We then explored attitudes toward the study demands/respondent burden according to levels of computer literacy and sociodemographic status. Methods Self-reported data from 43,028 e-cohort participants were collected in 2013 via a Web-based questionnaire. We employed unconditional logistic and linear regression analyses. Results Approximately one-quarter of participants (23.79%, 10,235/43,028) reported being inexperienced in terms of computer use. Regarding attitudes toward participant burden, women tended to be more favorable (eg, “The overall website use is easy”) than were men (OR 0.65, 95% CI 0.59-0.71, P<.001), whereas better educated participants (>12 years of schooling) were less likely to accept the demands associated with participation (eg, “I receive questionnaires too often”) compared to their less educated counterparts (OR 1.62, 95% CI 1.48-1.76, P<.001). Conclusions A substantial proportion of participants had low computer/Internet skills, suggesting that this does not represent a barrier to participation in Web-based cohorts. Our study also suggests that several subgroups of participants with lower computer skills (eg, women or those with lower educational level) might more readily accept the demands associated with participation in the Web cohort. These findings can help guide future Web-based research strategies. PMID:25648178

  13. The burden of disease attributable to cannabis use in Canada in 2012.

    PubMed

    Imtiaz, Sameer; Shield, Kevin D; Roerecke, Michael; Cheng, Joyce; Popova, Svetlana; Kurdyak, Paul; Fischer, Benedikt; Rehm, Jürgen

    2016-04-01

    Cannabis use is associated with several adverse health effects. However, little is known about the cannabis-attributable burden of disease. This study used epidemiological modeling to quantify the age-, sex- and adverse-health-effect-specific cannabis-attributable (1) mortality, (2) years of life lost due to premature mortality (YLLs), (3) years of life lost due to disability (YLDs) and (4) disability-adjusted life years (DALYs) among Canadians aged ≥ 15 years in 2012. Using comparative risk assessment methodology, cannabis-attributable fractions were computed using Canadian exposure data and risk relations from large studies or meta-analyses. Outcome data were obtained from Canadian databases and the World Health Organization. The 95% confidence intervals (CIs) were computed using Monte Carlo methodology. Cannabis use was estimated to have caused 287 deaths (95% CI = 108, 609), 10,533 YLLs (95% CI = 4760, 20,833), 55,813 YLDs (95% CI = 38,175, 74,094) and 66,346 DALYs (95% CI = 47,785, 87,207), based on causal impacts on cannabis use disorders, schizophrenia, lung cancer and road traffic injuries. The cannabis-attributable burden of disease was highest among young people, and males accounted for twice the burden of females. Cannabis use disorders were the most important single cause of the cannabis-attributable burden of disease, accounting for most of the 55,813 years of life lost due to disability. Although the cannabis-attributable burden of disease was substantial, it was much lower than that of other commonly used legal and illegal substances. Moreover, the evidence base for cannabis-attributable harms was smaller. © 2015 Society for the Study of Addiction.
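
    As a hedged illustration of the comparative risk assessment steps described in this abstract (all numbers below are invented, not the study's inputs), an attributable burden and its Monte Carlo confidence interval can be computed from an exposure prevalence, a relative risk with its uncertainty, and an outcome count:

```python
import numpy as np

# Illustrative comparative-risk-assessment calculation with Monte Carlo
# uncertainty, using made-up inputs. The population-attributable fraction is
# PAF = p*(RR-1) / (p*(RR-1) + 1); attributable deaths = PAF * outcome deaths.
# The CI comes from resampling RR on the log scale, a standard practice in
# burden-of-disease work.
rng = np.random.default_rng(42)
p = 0.10                             # hypothetical prevalence of exposure
rr, rr_lo, rr_hi = 1.9, 1.4, 2.6     # hypothetical relative risk, 95% CI
deaths = 2000                        # hypothetical total deaths from outcome

log_se = (np.log(rr_hi) - np.log(rr_lo)) / (2 * 1.96)
rr_draws = np.exp(rng.normal(np.log(rr), log_se, size=100_000))
paf = p * (rr_draws - 1) / (p * (rr_draws - 1) + 1)
attrib = paf * deaths
lo, mid, hi = np.percentile(attrib, [2.5, 50, 97.5])
print(f"attributable deaths ~ {mid:.0f} (95% CI {lo:.0f}, {hi:.0f})")
```

    The same structure, repeated per age group, sex, and health outcome, and with YLL/YLD weights in place of death counts, yields the kind of estimates with Monte Carlo CIs that the study reports.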

  14. Boundary formulations for sensitivity analysis without matrix derivatives

    NASA Technical Reports Server (NTRS)

    Kane, J. H.; Guru Prasad, K.

    1993-01-01

    A new hybrid approach to continuum structural shape sensitivity analysis employing boundary element analysis (BEA) is presented. The approach uses iterative reanalysis to obviate the need to factor perturbed matrices in the determination of surface displacement and traction sensitivities via a univariate perturbation/finite difference (UPFD) step. The UPFD approach makes it possible to immediately reuse existing subroutines for computation of BEA matrix coefficients in the design sensitivity analysis process. The reanalysis technique computes economical response of univariately perturbed models without factoring perturbed matrices. The approach provides substantial computational economy without the burden of a large-scale reprogramming effort.

  15. Novel patch modelling method for efficient simulation and prediction uncertainty analysis of multi-scale groundwater flow and transport processes

    NASA Astrophysics Data System (ADS)

    Sreekanth, J.; Moore, Catherine

    2018-04-01

    The application of global sensitivity and uncertainty analysis techniques to groundwater models of deep sedimentary basins is typically challenged by large computational burdens and associated numerical stability issues. The highly parameterized approaches required for exploring the predictive uncertainty associated with the heterogeneous hydraulic characteristics of multiple aquifers and aquitards in these sedimentary basins exacerbate these issues. A novel Patch Modelling Methodology is proposed for improving the computational feasibility of stochastic modelling analysis of large-scale and complex groundwater models. The method incorporates a nested groundwater modelling framework that enables efficient simulation of groundwater flow and transport across multiple spatial and temporal scales, and allows different processes to be simulated at different model scales. Existing nested model methodologies are extended by employing 'joining predictions' to extrapolate prediction-salient information from one model scale to the next. This establishes a feedback mechanism supporting the transfer of information from child models to parent models, and from parent models to child models, in a computationally efficient manner. The feedback mechanism is simple and flexible and ensures that the salient small-scale features influencing a larger-scale prediction are transferred back to the larger scale without requiring the live coupling of models. This allows multiple groundwater flow and transport processes to be modelled using separate groundwater models, each built for the appropriate spatial and temporal scales, within a stochastic framework, while removing the computational burden associated with live model coupling. The utility of the method is demonstrated by application to an actual large-scale aquifer injection scheme in Australia.

  16. Fast Combinatorial Algorithm for the Solution of Linearly Constrained Least Squares Problems

    DOEpatents

    Van Benthem, Mark H.; Keenan, Michael R.

    2008-11-11

    A fast combinatorial algorithm can significantly reduce the computational burden when solving general equality and inequality constrained least squares problems with large numbers of observation vectors. The combinatorial algorithm provides a mathematically rigorous solution and operates at great speed by reorganizing the calculations to take advantage of the combinatorial nature of the problems to be solved. The combinatorial algorithm exploits the structure that exists in large-scale problems in order to minimize the number of arithmetic operations required to obtain a solution.
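
    The reorganization this abstract alludes to can be sketched as follows. This is an assumption-laden illustration of the combinatorial grouping idea, with randomly generated constraint patterns standing in for the active-set iteration of a real constrained least squares solver: observation columns that share the same pattern of unconstrained ("passive") variables are solved together, so each unique pattern costs one factorization instead of one per column.

```python
import numpy as np

# Sketch of the combinatorial grouping idea (not the patented algorithm
# itself): with many observation vectors, columns whose passive sets match
# can share a single least squares factorization.
rng = np.random.default_rng(1)
m, n, k = 30, 5, 1000                 # rows, unknowns, observation vectors
A = rng.standard_normal((m, n))
B = rng.standard_normal((m, k))

# hypothetical passive sets (variables allowed to be nonzero) per column;
# in a real solver these come from the active-set iteration
passive = rng.integers(0, 2, size=(n, k)).astype(bool)
passive[:, passive.sum(axis=0) == 0] = True   # avoid empty passive sets

X = np.zeros((n, k))
# group columns by identical passive set: one solve per unique set
keys, inverse = np.unique(passive, axis=1, return_inverse=True)
for g in range(keys.shape[1]):
    cols = np.flatnonzero(inverse == g)       # columns sharing this pattern
    idx = np.flatnonzero(keys[:, g])          # passive variables for pattern
    # all columns in the group are solved with a single factorization
    X[np.ix_(idx, cols)] = np.linalg.lstsq(A[:, idx], B[:, cols],
                                           rcond=None)[0]

# sanity check: a grouped solution matches an individual per-column solve
j = 0
idx = np.flatnonzero(passive[:, j])
ref = np.linalg.lstsq(A[:, idx], B[:, j], rcond=None)[0]
ok = bool(np.allclose(X[idx, j], ref))
print(ok)
```

    With k columns but far fewer unique passive sets, the number of factorizations drops from k to the number of unique sets, which is the kind of arithmetic-operation saving the abstract describes.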

  17. Health literacy and task environment influence parents' burden for data entry on child-specific health information: randomized controlled trial.

    PubMed

    Porter, Stephen C; Guo, Chao-Yu; Bacic, Janine; Chan, Eugenia

    2011-01-26

    Health care systems increasingly rely on patients' data entry efforts to organize and assist in care delivery through health information exchange. We sought to determine (1) the variation in burden imposed on parents by data entry efforts across paper-based and computer-based environments, and (2) the impact, if any, of parents' health literacy on the task burden. We completed a randomized controlled trial of parent-completed data entry tasks. Parents of children with attention deficit hyperactivity disorder (ADHD) were randomized based on the Test of Functional Health Literacy in Adults (TOFHLA) to either a paper-based or computer-based environment for entry of health information on their children. The primary outcome was the National Aeronautics and Space Administration Task Load Index (TLX) total weighted score. We screened 271 parents: 194 (71.6%) were eligible, and 180 of these (92.8%) constituted the study cohort. We analyzed 90 participants from each arm. Parents who completed information tasks on paper reported a higher task burden than those who worked in the computer environment: mean (SD) TLX scores were 22.8 (20.6) for paper and 16.3 (16.1) for computer. Assignment to the paper environment conferred a significant risk of higher task burden (F(1,178) = 4.05, P = .046). Adequate literacy was associated with lower task burden (decrease in burden score of 1.15 SD, P = .003). After adjusting for relevant child and parent factors, parents' TOFHLA score (beta = -.02, P = .02) and task environment (beta = .31, P = .03) remained significantly associated with task burden. A tailored computer-based environment provided an improved task experience for data entry compared to the same tasks completed on paper. Health literacy was inversely related to task burden.

  18. Microbial burden prediction model for unmanned planetary spacecraft

    NASA Technical Reports Server (NTRS)

    Hoffman, A. R.; Winterburn, D. A.

    1972-01-01

    The technical development of a computer program for predicting microbial burden on unmanned planetary spacecraft is outlined. The discussion includes the derivation of the basic analytical equations, the selection of a method for handling several random variables, the macrologic of the computer programs and the validation and verification of the model. The prediction model was developed to (1) supplement the biological assays of a spacecraft by simulating the microbial accretion during periods when assays are not taken; (2) minimize the necessity for a large number of microbiological assays; and (3) predict the microbial loading on a lander immediately prior to sterilization and other non-lander equipment prior to launch. It is shown that these purposes not only were achieved but also that the prediction results compare favorably to the estimates derived from the direct assays. The computer program can be applied not only as a prediction instrument but also as a management and control tool. The basic logic of the model is shown to have possible applicability to other sequential flow processes, such as food processing.

  19. GPU-accelerated iterative reconstruction for limited-data tomography in CBCT systems.

    PubMed

    de Molina, Claudia; Serrano, Estefania; Garcia-Blas, Javier; Carretero, Jesus; Desco, Manuel; Abella, Monica

    2018-05-15

    Standard cone-beam computed tomography (CBCT) involves the acquisition of at least 360 projections rotating through 360 degrees. Nevertheless, there are cases in which only a few projections can be taken in a limited angular span, such as during surgery, where rotation of the source-detector pair is limited to less than 180 degrees. Reconstruction of limited data with the conventional method proposed by Feldkamp, Davis and Kress (FDK) results in severe artifacts. Iterative methods may compensate for the lack of data by including additional prior information, although they imply a high computational burden and memory consumption. We present an accelerated implementation of an iterative method for CBCT following the Split Bregman formulation, which reduces computational time through GPU-accelerated kernels. The implementation enables the reconstruction of large volumes (>1024³ pixels) using partitioning strategies in forward- and back-projection operations. We evaluated the algorithm on small-animal data for different scenarios with different numbers of projections, angular spans, and projection sizes. Reconstruction time varied linearly with the number of projections and quadratically with projection size but remained almost unchanged with angular span. Forward- and back-projection operations represent 60% of the total computational burden. Efficient implementation using parallel processing and large-memory management strategies together with GPU kernels enables the use of advanced reconstruction approaches that are needed in limited-data scenarios. Our GPU implementation showed a significant time reduction (up to 48×) compared to a CPU-only implementation, reducing the total reconstruction time from several hours to a few minutes.

  20. Motivating students' participation in a computer networks course by means of magic, drama and games.

    PubMed

    Hilas, Constantinos S; Politis, Anastasios

    2014-01-01

    The recent economic crisis has forced many universities to cut expenses by packing students into large lecture groups. The problem with large auditoria is that they discourage dialogue between students and faculty and hinder participation. In addition, students in computer science courses usually find the field to be full of theoretical and technical concepts, and lack of understanding leads them to lose interest and/or motivation. Classroom experience shows that the lecturer can employ alternative teaching methods, especially for early-year undergraduate students, to capture their interest and introduce basic concepts. This paper describes some of the approaches that may be used to keep students interested and make them feel comfortable as they comprehend basic concepts in computer networks. The lecturing procedure was enriched with games, magic tricks, and dramatic representations. This approach was used experimentally for two semesters and the results were more than encouraging.

  1. Health Literacy and Task Environment Influence Parents' Burden for Data Entry on Child-Specific Health Information: Randomized Controlled Trial

    PubMed Central

    Guo, Chao-Yu; Bacic, Janine; Chan, Eugenia

    2011-01-01

    Background Health care systems increasingly rely on patients’ data entry efforts to organize and assist in care delivery through health information exchange. Objectives We sought to determine (1) the variation in burden imposed on parents by data entry efforts across paper-based and computer-based environments, and (2) the impact, if any, of parents’ health literacy on the task burden. Methods We completed a randomized controlled trial of parent-completed data entry tasks. Parents of children with attention deficit hyperactivity disorder (ADHD) were randomized based on the Test of Functional Health Literacy in Adults (TOFHLA) to either a paper-based or computer-based environment for entry of health information on their children. The primary outcome was the National Aeronautics and Space Administration Task Load Index (TLX) total weighted score. Results We screened 271 parents: 194 (71.6%) were eligible, and 180 of these (92.8%) constituted the study cohort. We analyzed 90 participants from each arm. Parents who completed information tasks on paper reported a higher task burden than those who worked in the computer environment: mean (SD) TLX scores were 22.8 (20.6) for paper and 16.3 (16.1) for computer. Assignment to the paper environment conferred a significant risk of higher task burden (F1,178 = 4.05, P = .046). Adequate literacy was associated with lower task burden (decrease in burden score of 1.15 SD, P = .003). After adjusting for relevant child and parent factors, parents’ TOFHLA score (beta = -.02, P = .02) and task environment (beta = .31, P = .03) remained significantly associated with task burden. Conclusions A tailored computer-based environment provided an improved task experience for data entry compared to the same tasks completed on paper. Health literacy was inversely related to task burden. 
Trial registration Clinicaltrials.gov NCT00543257; http://www.clinicaltrials.gov/ct2/show/NCT00543257 (Archived by WebCite at http://www.webcitation.org/5vUVH2DYR) PMID:21269990

  2. Methodology for computing the burden of disease of adverse events following immunization.

    PubMed

    McDonald, Scott A; Nijsten, Danielle; Bollaerts, Kaatje; Bauwens, Jorgen; Praet, Nicolas; van der Sande, Marianne; Bauchau, Vincent; de Smedt, Tom; Sturkenboom, Miriam; Hahné, Susan

    2018-03-24

    Composite disease burden measures such as disability-adjusted life-years (DALY) have been widely used to quantify the population-level health impact of disease or injury, but application has been limited for the estimation of the burden of adverse events following immunization. Our objective was to assess the feasibility of adapting the DALY approach for estimating adverse event burden. We developed a practical methodological framework, explicitly describing all steps involved: acquisition of relative or absolute risks and background event incidence rates, selection of disability weights and durations, and computation of the years lived with disability (YLD) measure, with appropriate estimation of uncertainty. We present a worked example, in which YLD is computed for 3 recognized adverse reactions following 3 childhood vaccination types, based on background incidence rates and relative/absolute risks retrieved from the literature. YLD provided extra insight into the health impact of an adverse event over presentation of incidence rates only, as severity and duration are additionally incorporated. As well as providing guidance for the deployment of DALY methodology in the context of adverse events associated with vaccination, we also identified where data limitations potentially occur. Burden of disease methodology can be applied to estimate the health burden of adverse events following vaccination in a systematic way. As with all burden of disease studies, interpretation of the estimates must consider the quality and accuracy of the data sources contributing to the DALY computation. © 2018 The Authors. Pharmacoepidemiology & Drug Safety Published by John Wiley & Sons Ltd.
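
    The YLD computation this framework describes can be illustrated with a minimal worked example. All inputs below are hypothetical, chosen only to show how a background incidence rate, a relative risk, a disability weight, and a duration combine:

```python
# Minimal worked YLD example following the steps described in the abstract,
# with entirely hypothetical numbers: excess adverse-event cases are derived
# from a background incidence rate and a relative risk over a risk window,
# then weighted by a disability weight and a mean duration.
pop_vaccinated = 100_000            # hypothetical vaccinated cohort size
background_rate = 50 / 100_000      # hypothetical events per person-year
relative_risk = 3.0                 # hypothetical RR during the risk window
risk_window_years = 42 / 365        # hypothetical 6-week risk window
disability_weight = 0.2             # hypothetical severity weight (0..1)
duration_years = 0.1                # hypothetical mean duration of the event

background_cases = pop_vaccinated * background_rate * risk_window_years
excess_cases = background_cases * (relative_risk - 1)
yld = excess_cases * disability_weight * duration_years
print(round(excess_cases, 2), round(yld, 3))   # prints 11.51 0.23
```

    As the abstract notes, the YLD adds insight beyond an incidence rate alone precisely because the disability weight and duration enter the product; uncertainty in each input would be propagated by Monte Carlo sampling as in standard DALY practice.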

  3. An Improved TA-SVM Method Without Matrix Inversion and Its Fast Implementation for Nonstationary Datasets.

    PubMed

    Shi, Yingzhong; Chung, Fu-Lai; Wang, Shitong

    2015-09-01

    Recently, the time-adaptive support vector machine (TA-SVM) was proposed for handling nonstationary datasets. Although attractive performance has been reported, and the classifier is distinctive in simultaneously solving several SVM subclassifiers locally and globally through an elegant SVM formulation in an alternative kernel space, the coupling of the subclassifiers introduces a matrix inversion, resulting in a high computational burden in large nonstationary dataset applications. To overcome this shortcoming, an improved TA-SVM (ITA-SVM) is proposed using a common vector shared by all the SVM subclassifiers involved. ITA-SVM not only retains an SVM formulation but also avoids the matrix inversion. Thus, we can realize its fast version, the improved time-adaptive core vector machine (ITA-CVM), for large nonstationary datasets by using the CVM technique. ITA-CVM has the merit of asymptotic linear time complexity for large nonstationary datasets and inherits the advantages of TA-SVM. The effectiveness of the proposed ITA-SVM and ITA-CVM classifiers is confirmed experimentally.

  4. Computer-assisted qualitative data analysis software.

    PubMed

    Cope, Diane G

    2014-05-01

    Advances in technology have provided new approaches for data collection and analysis for researchers. Data collection is no longer limited to paper-and-pencil format, and numerous methods are now available through Internet and electronic resources. With these techniques, researchers are not burdened with entering data manually, and data analysis is facilitated by software programs. Quantitative research is supported by computer software that eases the management of large data sets and speeds numeric statistical analysis. New technologies are emerging to support qualitative research with the availability of computer-assisted qualitative data analysis software (CAQDAS). CAQDAS will be presented with a discussion of advantages, limitations, controversial issues, and recommendations for this type of software use.

  5. Acceleration of color computer-generated hologram from three-dimensional scenes with texture and depth information

    NASA Astrophysics Data System (ADS)

    Shimobaba, Tomoyoshi; Kakue, Takashi; Ito, Tomoyoshi

    2014-06-01

    We propose acceleration of color computer-generated holograms (CGHs) from three-dimensional (3D) scenes that are expressed as texture (RGB) and depth (D) images. These images are obtained from 3D graphics libraries and RGB-D cameras: for example, OpenGL and Kinect, respectively. We can regard them as two-dimensional (2D) cross-sectional images along the depth direction. The generation of CGHs from the 2D cross-sectional images requires multiple diffraction calculations. If we use convolution-based diffraction such as the angular spectrum method, the diffraction calculation takes a long time and uses a large amount of memory, because it requires expanding the 2D cross-sectional images to avoid wraparound noise. In this paper, we first describe the acceleration of the diffraction calculation using "Band-limited double-step Fresnel diffraction," which does not require the expansion. Next, we describe color CGH acceleration using color space conversion. In general, color CGHs are generated in RGB color space; however, the same calculation must be repeated for each color component, so the computational burden of color CGH generation is three times that of monochrome CGH generation. We can reduce this burden by using YCbCr color space, because the 2D cross-sectional images in YCbCr color space can be down-sampled without impairing image quality.
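The color-space argument can be illustrated directly: converting each RGB slice to YCbCr and down-sampling only the chroma planes shrinks the data that must be diffracted. A sketch using the BT.601 conversion coefficients (an assumption; the abstract does not specify the exact conversion matrix):

```python
import numpy as np

def rgb_to_ycbcr(rgb):
    """Convert an RGB image (floats in [0, 1]) to Y, Cb, Cr planes (BT.601)."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cb = -0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 0.5 * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr

rgb = np.random.rand(256, 256, 3)    # one cross-sectional RGB slice
y, cb, cr = rgb_to_ycbcr(rgb)

# Down-sample only the chroma planes by 2 in each dimension; perception is far
# less sensitive to chroma resolution than to luma resolution.
cb_small, cr_small = cb[::2, ::2], cr[::2, ::2]

# Pixels to diffract: one full-resolution Y plane plus two quarter-resolution
# chroma planes, i.e. 1.5 planes' worth instead of 3 full RGB planes.
pixels = y.size + cb_small.size + cr_small.size
```

Per slice this halves the pixel count fed to the diffraction step (1.5x one plane versus 3x for RGB), consistent with the burden reduction the abstract describes.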

  6. The Cramér-Rao Bounds and Sensor Selection for Nonlinear Systems with Uncertain Observations.

    PubMed

    Wang, Zhiguo; Shen, Xiaojing; Wang, Ping; Zhu, Yunmin

    2018-04-05

    This paper considers the problems of the posterior Cramér-Rao bound and sensor selection for multi-sensor nonlinear systems with uncertain observations. In order to effectively overcome the difficulties caused by uncertainty, we investigate two methods to derive the posterior Cramér-Rao bound. The first method is based on the recursive formula of the Cramér-Rao bound and the Gaussian mixture model. Nevertheless, it needs to compute a complex integral based on the joint probability density function of the sensor measurements and the target state. The computational burden of this method is relatively high, especially in large sensor networks. Inspired by the idea of the expectation maximization algorithm, the second method introduces some 0-1 latent variables to deal with the Gaussian mixture model. Since the regularity condition of the posterior Cramér-Rao bound is not satisfied for the discrete uncertain system, we use continuous variables to approximate the discrete latent variables. Then, a new Cramér-Rao bound can be achieved by a limiting process of the Cramér-Rao bound of the continuous system. This avoids the complex integral, which reduces the computational burden. Based on the new posterior Cramér-Rao bound, the optimal solution of the sensor selection problem can be derived analytically, so it can be used for sensor selection in large-scale sensor networks. Two typical numerical examples verify the effectiveness of the proposed methods.

  7. Rationale, Design, and Methodological Aspects of the BUDAPEST-GLOBAL Study (Burden of Atherosclerotic Plaques Study in Twins-Genetic Loci and the Burden of Atherosclerotic Lesions).

    PubMed

    Maurovich-Horvat, Pál; Tárnoki, Dávid L; Tárnoki, Ádám D; Horváth, Tamás; Jermendy, Ádám L; Kolossváry, Márton; Szilveszter, Bálint; Voros, Viktor; Kovács, Attila; Molnár, Andrea Á; Littvay, Levente; Lamb, Hildo J; Voros, Szilard; Jermendy, György; Merkely, Béla

    2015-12-01

    The heritability of coronary atherosclerotic plaque burden, coronary geometry, and phenotypes associated with increased cardiometabolic risk are largely unknown. The primary aim of the Burden of Atherosclerotic Plaques Study in Twins-Genetic Loci and the Burden of Atherosclerotic Lesions (BUDAPEST-GLOBAL) study is to evaluate the influence of genetic and environmental factors on the burden of coronary artery disease. By design this is a prospective, single-center, classical twin study. In total, 202 twins (61 monozygotic pairs, 40 dizygotic same-sex pairs) were enrolled from the Hungarian Twin Registry database. All twins underwent non-contrast-enhanced computed tomography (CT) for the detection and quantification of coronary artery calcium and for the measurement of epicardial fat volumes. In addition, a single non-contrast-enhanced image slice was acquired at the level of L3-L4 to assess abdominal fat distribution. Coronary CT angiography was used for the detection and quantification of plaque, stenosis, and overall coronary artery disease burden. For the primary analysis, we will assess the presence and volume of atherosclerotic plaques. Furthermore, the 3-dimensional coronary geometry will be assessed based on the coronary CT angiography datasets. Additional phenotypic analyses will include per-patient epicardial and abdominal fat quantity measurements. Measurements obtained from monozygotic and dizygotic twin pairs will be compared to evaluate the genetic or environmental effects of the given phenotype. The BUDAPEST-GLOBAL study provides a unique framework to shed some light on the genetic and environmental influences of cardiometabolic disorders. © 2015 Wiley Periodicals, Inc.

  8. Fuel consumption optimization for smart hybrid electric vehicle during a car-following process

    NASA Astrophysics Data System (ADS)

    Li, Liang; Wang, Xiangyu; Song, Jian

    2017-03-01

    Hybrid electric vehicles (HEVs) offer great potential for saving energy and reducing emissions, while smart vehicles bring convenience and safety to drivers. By combining these two technologies, vehicles may achieve excellent performance in terms of dynamics, economy, environmental friendliness, safety, and comfort. Hence, a smart hybrid electric vehicle (s-HEV) is selected as the platform in this paper to study fuel-consumption optimization during a car-following process. The whole process is a multi-objective optimization problem, whose solution is not simply the addition of an energy management strategy (EMS) to adaptive cruise control (ACC), but a deep fusion of the two methods. The problem has more constraints, optimization objectives, and system states, which may result in a larger computational burden. Therefore, a novel fuel consumption optimization algorithm based on model predictive control (MPC) is proposed, and search heuristics are adopted in the receding-horizon optimization to reduce the computational burden. Simulations are carried out, and the results indicate that the proposed method achieves lower fuel consumption than the ACC+EMS method while maintaining car-following performance.

  9. Confidential and Authenticated Communications in a Large Fixed-Wing UAV Swarm

    DTIC Science & Technology

    2016-12-01

    either a UAV or a ground station. Asymmetric cryptography is not an option for swarm communications. It is a potential option for initially keying or ... each UAV grows ten bytes for each UAV in the swarm, and a 30% overhead is added on for worst-case cryptography. The resulting throughput is ... analysis in Section IV, we can predict the burden that cryptography places on the ODroid computer. Given that the average unencrypted message size was

  10. Associations Between Collateral Status and Thrombus Characteristics and Their Impact in Anterior Circulation Stroke.

    PubMed

    Alves, Heitor C; Treurniet, Kilian M; Dutra, Bruna G; Jansen, Ivo G H; Boers, Anna M M; Santos, Emilie M M; Berkhemer, Olvert A; Dippel, Diederik W J; van der Lugt, Aad; van Zwam, Wim H; van Oostenbrugge, Robert J; Lingsma, Hester F; Roos, Yvo B W E M; Yoo, Albert J; Marquering, Henk A; Majoie, Charles B L M

    2018-02-01

    Thrombus characteristics and collateral score are associated with functional outcome in patients with acute ischemic stroke. It has been suggested that they affect each other. The aim of this study is to evaluate the association between clot burden score, thrombus perviousness, and collateral score and to determine whether collateral score influences the association of thrombus characteristics with functional outcome. Patients with baseline thin-slice noncontrast computed tomography and computed tomographic angiography images from the MR CLEAN trial (Multicenter Randomized Clinical Trial of Endovascular Treatment of Acute Ischemic Stroke in the Netherlands) were included (n=195). Collateral score and clot burden scores were determined on baseline computed tomographic angiography. Thrombus attenuation increase was determined by comparing thrombus density on noncontrast computed tomography and computed tomographic angiography using a semiautomated method. The association of collateral score with clot burden score and thrombus attenuation increase was evaluated with linear regression. Mediation and effect modification analyses were used to assess the influence of collateral score on the association of clot burden score and thrombus attenuation increase with functional outcome. A higher clot burden score (B=0.063; 95% confidence interval, 0.008-0.118) and a higher thrombus attenuation increase (B=0.014; 95% confidence interval, 0.003-0.026) were associated with higher collateral score. Collateral score mediated the association of clot burden score with functional outcome. The association between thrombus attenuation increase and functional outcome was modified by the collateral score, and this association was stronger in patients with moderate and good collaterals. Patients with lower thrombus burden and higher thrombus perviousness scores had higher collateral score. 
The positive effect of thrombus perviousness on clinical outcome was only present in patients with moderate and high collateral scores. URL: http://www.trialregister.nl. Unique identifier: NTR1804 and URL: http://www.controlled-trials.com Unique identifier: ISRCTN10888758. © 2018 The Authors.

  11. An adaptive importance sampling algorithm for Bayesian inversion with multimodal distributions

    DOE PAGES

    Li, Weixuan; Lin, Guang

    2015-03-21

    Parametric uncertainties are encountered in the simulations of many physical systems, and may be reduced by an inverse modeling procedure that calibrates the simulation results to observations on the real system being simulated. Following Bayes’ rule, a general approach for inverse modeling problems is to sample from the posterior distribution of the uncertain model parameters given the observations. However, the large number of repetitive forward simulations required in the sampling process could pose a prohibitive computational burden. This difficulty is particularly challenging when the posterior is multimodal. We present in this paper an adaptive importance sampling algorithm to tackle these challenges. Two essential ingredients of the algorithm are: 1) a Gaussian mixture (GM) model adaptively constructed as the proposal distribution to approximate the possibly multimodal target posterior, and 2) a mixture of polynomial chaos (PC) expansions, built according to the GM proposal, as a surrogate model to alleviate the computational burden caused by computationally demanding forward model evaluations. In three illustrative examples, the proposed adaptive importance sampling algorithm demonstrates its capability of automatically finding a GM proposal with an appropriate number of modes for the specific problem under study, and of obtaining a sample that accurately and efficiently represents the posterior with a limited number of forward simulations.
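The non-adaptive core of such a scheme, a fixed GM proposal with self-normalized importance weights, fits in a few lines. The sketch below omits the adaptive refitting of the mixture and the PC surrogate, and the bimodal target is a hypothetical stand-in for an actual posterior:

```python
import numpy as np

rng = np.random.default_rng(0)

def unnorm_posterior(x):
    """Hypothetical bimodal, unnormalized posterior over one parameter."""
    return np.exp(-0.5 * ((x + 2) / 0.5) ** 2) + np.exp(-0.5 * ((x - 2) / 0.5) ** 2)

# Gaussian-mixture proposal: one (deliberately wider) component per mode.
means = np.array([-2.0, 2.0])
sds = np.array([0.7, 0.7])
w = np.array([0.5, 0.5])

def gm_pdf(x):
    comps = np.exp(-0.5 * ((x[:, None] - means) / sds) ** 2) / (sds * np.sqrt(2 * np.pi))
    return comps @ w

# Sample from the mixture, then compute self-normalized importance weights.
comp = rng.choice(2, size=20_000, p=w)
xs = rng.normal(means[comp], sds[comp])
weights = unnorm_posterior(xs) / gm_pdf(xs)
weights /= weights.sum()

post_mean = np.sum(weights * xs)  # symmetric target, so the mean is near 0
```

In the adaptive algorithm, the weighted sample would then be used to refit the GM proposal (e.g. by EM) and the loop repeated, with the PC surrogate standing in for the expensive forward model inside `unnorm_posterior`.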

  12. An adaptive importance sampling algorithm for Bayesian inversion with multimodal distributions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Weixuan; Lin, Guang, E-mail: guanglin@purdue.edu

    2015-08-01

    Parametric uncertainties are encountered in the simulations of many physical systems, and may be reduced by an inverse modeling procedure that calibrates the simulation results to observations on the real system being simulated. Following Bayes' rule, a general approach for inverse modeling problems is to sample from the posterior distribution of the uncertain model parameters given the observations. However, the large number of repetitive forward simulations required in the sampling process could pose a prohibitive computational burden. This difficulty is particularly challenging when the posterior is multimodal. We present in this paper an adaptive importance sampling algorithm to tackle these challenges. Two essential ingredients of the algorithm are: 1) a Gaussian mixture (GM) model adaptively constructed as the proposal distribution to approximate the possibly multimodal target posterior, and 2) a mixture of polynomial chaos (PC) expansions, built according to the GM proposal, as a surrogate model to alleviate the computational burden caused by computationally demanding forward model evaluations. In three illustrative examples, the proposed adaptive importance sampling algorithm demonstrates its capability of automatically finding a GM proposal with an appropriate number of modes for the specific problem under study, and of obtaining a sample that accurately and efficiently represents the posterior with a limited number of forward simulations.

  13. Singular value decomposition for collaborative filtering on a GPU

    NASA Astrophysics Data System (ADS)

    Kato, Kimikazu; Hosino, Tikara

    2010-06-01

    Collaborative filtering predicts customers' unknown preferences from known preferences. In collaborative filtering computations, a singular value decomposition (SVD) is needed to reduce the size of a large-scale matrix, decreasing the burden of the subsequent computation phase. In this application, SVD means a rough approximate factorization of a given matrix into smaller matrices. Webb (a.k.a. Simon Funk) presented an effective algorithm for computing this SVD during the open competition known as the Netflix Prize. The algorithm is iterative, improving the approximation error at each step. We give a GPU version of Webb's algorithm, implemented in CUDA and shown experimentally to be efficient.
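Webb's method is, at heart, stochastic gradient descent on the regularized squared error of a low-rank factorization, visiting only the known ratings. A minimal CPU sketch on toy data (the data and hyperparameters are illustrative; a CUDA version parallelizes these per-rating updates across threads):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy (user, item, rating) triples; the full matrix is never materialized.
ratings = [(0, 0, 5.0), (0, 1, 3.0), (1, 0, 4.0),
           (1, 2, 1.0), (2, 1, 2.0), (2, 2, 5.0)]
n_users, n_items, k = 3, 3, 2                  # k latent features
P = 0.1 * rng.standard_normal((n_users, k))    # user factor matrix
Q = 0.1 * rng.standard_normal((n_items, k))    # item factor matrix
lr, reg = 0.05, 0.02                           # learning rate, regularization

for epoch in range(200):
    for u, i, r in ratings:
        pu = P[u].copy()
        err = r - pu @ Q[i]                    # prediction error on this rating
        P[u] += lr * (err * Q[i] - reg * pu)   # gradient step on user factors
        Q[i] += lr * (err * pu - reg * Q[i])   # gradient step on item factors

rmse = np.sqrt(np.mean([(r - P[u] @ Q[i]) ** 2 for u, i, r in ratings]))
```

The approximation error shrinks with each pass, which is exactly the iterative improvement the abstract describes; on Netflix-scale data it is these per-rating updates that a GPU implementation batches.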

  14. Image segmentation evaluation for very-large datasets

    NASA Astrophysics Data System (ADS)

    Reeves, Anthony P.; Liu, Shuang; Xie, Yiting

    2016-03-01

    With the advent of modern machine learning methods and fully automated image analysis, there is a need for very large image datasets with documented segmentations for both computer algorithm training and evaluation. Current approaches of visual inspection and manual marking do not scale well to big data. We present a new approach that depends on fully automated algorithm outcomes for segmentation documentation, requires no manual marking, and provides quantitative evaluation for computer algorithms. The documentation of new image segmentations and new algorithm outcomes is achieved by visual inspection. The burden of visual inspection on large datasets is minimized by (a) customized visualizations for rapid review and (b) reducing the number of cases to be reviewed through analysis of quantitative segmentation evaluation. This method has been applied to a dataset of 7,440 whole-lung CT images for 6 different segmentation algorithms designed to fully automate the measurement of a number of very important quantitative image biomarkers. The results indicate 93% to 99% successful segmentation for these algorithms on this relatively large image database. The presented evaluation method may be scaled to much larger image databases.

  15. Burden of suicide in Poland in 2012: how could it be measured and how big is it?

    PubMed

    Orlewska, Katarzyna; Orlewska, Ewa

    2018-04-01

    The aim of our study was to estimate the health-related and economic burden of suicide in Poland in 2012 and to demonstrate the effects of using different assumptions on the disease burden estimation. Years of life lost (YLL) were calculated by multiplying the number of deaths by the remaining life expectancy. Local expected YLL (LEYLL) and standard expected YLL (SEYLL) were computed using Polish life expectancy tables and WHO standards, respectively. In the base case analysis LEYLL and SEYLL were computed with 3.5 and 0% discount rates, respectively, and no age-weighting. Premature mortality costs were calculated using a human capital approach, with discounting at 5%, and are reported in Polish zloty (PLN) (1 euro = 4.3 PLN). The impact of applying different assumptions on base-case estimates was tested in sensitivity analyses. The total LEYLLs and SEYLLs due to suicide were 109,338 and 279,425, respectively, with 88% attributable to male deaths. The cost of male premature mortality (2,808,854,532 PLN) was substantially higher than for females (177,852,804 PLN). Discounting and age-weighting have a large effect on the base case estimates of LEYLLs. The greatest impact on the estimates of suicide-related premature mortality costs was due to the value of the discount rate. Our findings provide quantitative evidence on the burden of suicide. In our opinion each of the demonstrated methods brings something valuable to the evaluation of the impact of suicide on a given population, but LEYLLs and premature mortality costs estimated according to national guidelines have the potential to be useful for local public health policymakers.
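The YLL arithmetic is compact enough to sketch. With continuous-time discounting at rate r, the years lost per death over remaining life expectancy L become (1 - e^(-rL))/r, which reduces to L as r goes to 0. The numbers below are illustrative, not the study's inputs:

```python
import math

def yll(deaths, life_expectancy, discount_rate):
    """Years of life lost, with optional continuous-time discounting."""
    if discount_rate == 0:
        return deaths * life_expectancy
    r, L = discount_rate, life_expectancy
    return deaths * (1 - math.exp(-r * L)) / r

# Undiscounted (as in the SEYLL base case, 0%) vs. discounted at 3.5%
# (as in the LEYLL base case), for a hypothetical 100 deaths at 40 remaining years
print(yll(100, 40.0, 0.0))    # 4000.0
print(yll(100, 40.0, 0.035))  # about 2152.6
```

The gap between the two runs illustrates why discounting has such a large effect on the base-case LEYLL estimates, over and above the difference between local and WHO-standard life-expectancy tables.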

  16. Adaptive surrogate model based multiobjective optimization for coastal aquifer management

    NASA Astrophysics Data System (ADS)

    Song, Jian; Yang, Yun; Wu, Jianfeng; Wu, Jichun; Sun, Xiaomin; Lin, Jin

    2018-06-01

    In this study, a novel surrogate-model-assisted multiobjective memetic algorithm (SMOMA) is developed for optimal pumping strategies in large-scale coastal groundwater problems. The proposed SMOMA integrates an efficient data-driven surrogate model with an improved non-dominated sorting genetic algorithm-II (NSGAII) that employs a local search operator to accelerate its convergence. The surrogate model, based on the Kernel Extreme Learning Machine (KELM), is developed and evaluated as an approximate simulator that generates the patterns of regional groundwater flow and salinity levels in coastal aquifers, greatly reducing the computational burden. The KELM model is adaptively retrained during the evolutionary search to satisfy the desired surrogate fidelity, which inhibits the accumulation of forecasting error and ensures correct convergence to the true Pareto-optimal front. The methodology is then applied to large-scale coastal aquifer management in Baldwin County, Alabama, with the objectives of minimizing the saltwater mass increase and maximizing the total pumping rate. The optimal solutions achieved by the adaptive surrogate model are compared against those obtained from a one-shot surrogate model and from the original simulation model. The adaptive surrogate model not only improves the prediction accuracy of Pareto-optimal solutions relative to the one-shot surrogate model, but also maintains solution quality equivalent to NSGAII coupled with the original simulation model, while reducing computation time by up to 94%. This study shows that the proposed methodology is a computationally efficient and promising tool for the multiobjective optimization of coastal aquifer management.

  17. Association between diabetes and different components of coronary atherosclerotic plaque burden as measured by coronary multidetector computed tomography.

    PubMed

    Yun, Chun-Ho; Schlett, Christopher L; Rogers, Ian S; Truong, Quynh A; Toepker, Michael; Donnelly, Patrick; Brady, Thomas J; Hoffmann, Udo; Bamberg, Fabian

    2009-08-01

    The aim of the study was to assess differences in the presence, extent, and composition of coronary atherosclerotic plaque burden as detected by coronary multidetector computed tomography (MDCT) between patients with and without diabetes mellitus. We compared coronary atherosclerotic plaques (any plaque, calcified [CAP], non-calcified [NCAP], and mixed plaque [MCAP])

  18. Control Synthesis of Discrete-Time T-S Fuzzy Systems: Reducing the Conservatism Whilst Alleviating the Computational Burden.

    PubMed

    Xie, Xiangpeng; Yue, Dong; Zhang, Huaguang; Peng, Chen

    2017-09-01

    The augmented multi-indexed matrix approach is a powerful tool for reducing the conservatism of control synthesis for discrete-time Takagi-Sugeno fuzzy systems. However, as a tradeoff, its computational burden is sometimes too heavy. Reducing the conservatism while alleviating the computational burden is an ideal but very challenging problem, and this paper seeks an efficient way to provide one satisfactory answer. In contrast to the augmented multi-indexed matrix approach in the literature, we design a more efficient slack variable approach under a general framework of homogeneous matrix polynomials. Thanks to the introduction of a new extended representation for homogeneous matrix polynomials, related matrices with the same coefficient are collected into one set, so the redundant terms of the augmented multi-indexed matrix approach can be removed; that is, the computational burden is alleviated. More importantly, because more useful information is incorporated into the control design, the conservatism of the proposed approach is also less than that of the augmented multi-indexed matrix approach. Finally, numerical experiments are given to show the effectiveness of the proposed approach.

  19. Fast and accurate genotype imputation in genome-wide association studies through pre-phasing

    PubMed Central

    Howie, Bryan; Fuchsberger, Christian; Stephens, Matthew; Marchini, Jonathan; Abecasis, Gonçalo R.

    2013-01-01

    Sequencing efforts, including the 1000 Genomes Project and disease-specific efforts, are producing large collections of haplotypes that can be used for genotype imputation in genome-wide association studies (GWAS). Imputing from these reference panels can help identify new risk alleles, but the use of large panels with existing methods imposes a high computational burden. To keep imputation broadly accessible, we introduce a strategy called “pre-phasing” that maintains the accuracy of leading methods while cutting computational costs by orders of magnitude. In brief, we first statistically estimate the haplotypes for each GWAS individual (“pre-phasing”) and then impute missing genotypes into these estimated haplotypes. This reduces the computational cost because: (i) the GWAS samples must be phased only once, whereas standard methods would implicitly re-phase with each reference panel update; (ii) it is much faster to match a phased GWAS haplotype to one reference haplotype than to match unphased GWAS genotypes to a pair of reference haplotypes. This strategy will be particularly valuable for repeated imputation as reference panels evolve. PMID:22820512

  20. Computer aided manual validation of mass spectrometry-based proteomic data.

    PubMed

    Curran, Timothy G; Bryson, Bryan D; Reigelhaupt, Michael; Johnson, Hannah; White, Forest M

    2013-06-15

    Advances in mass spectrometry-based proteomic technologies have increased the speed of analysis and the depth provided by a single analysis. Computational tools to evaluate the accuracy of peptide identifications from these high-throughput analyses have not kept pace with technological advances; currently the most common quality evaluation methods are based on statistical analysis of the likelihood of false positive identifications in large-scale data sets. While helpful, these calculations do not consider the accuracy of each identification, thus creating a precarious situation for biologists relying on the data to inform experimental design. Manual validation is the gold standard approach to confirm accuracy of database identifications, but is extremely time-intensive. To palliate the increasing time required to manually validate large proteomic datasets, we provide computer aided manual validation software (CAMV) to expedite the process. Relevant spectra are collected, catalogued, and pre-labeled, allowing users to efficiently judge the quality of each identification and summarize applicable quantitative information. CAMV significantly reduces the burden associated with manual validation and will hopefully encourage broader adoption of manual validation in mass spectrometry-based proteomics. Copyright © 2013 Elsevier Inc. All rights reserved.

  1. The Global Economic and Health Burden of Human Hookworm Infection.

    PubMed

    Bartsch, Sarah M; Hotez, Peter J; Asti, Lindsey; Zapf, Kristina M; Bottazzi, Maria Elena; Diemert, David J; Lee, Bruce Y

    2016-09-01

    Even though human hookworm infection is highly endemic in many countries throughout the world, its global economic and health impact is not well known. Without a better understanding of hookworm's economic burden worldwide, it is difficult for decision makers such as funders, policy makers, disease control officials, and intervention manufacturers to determine how much time, energy, and resources to invest in hookworm control. We developed a computational simulation model to estimate the economic and health burden of hookworm infection in every country, WHO region, and globally, in 2016 from the societal perspective. Globally, hookworm infection resulted in a total of 2,126,280 DALYs using 2004 disability weight estimates and 4,087,803 DALYs using 2010 disability weight estimates (excluding cognitive impairment outcomes). Including cognitive impairment did not significantly increase DALYs worldwide. Total productivity losses varied with the probability of anemia and the calculation method used, ranging from $7.5 billion to $138.9 billion annually using gross national income per capita as a proxy for annual wages, and from $2.5 billion to $43.9 billion using minimum wage as a proxy for annual wages. Even though hookworm is classified as a neglected tropical disease, its economic and health burden exceeded published estimates for a number of diseases that have received comparatively more attention than hookworm, such as rotavirus. Additionally, certain large countries that are transitioning to higher-income status, such as Brazil and China, still face a considerable hookworm burden.

  2. The Global Economic and Health Burden of Human Hookworm Infection

    PubMed Central

    Bartsch, Sarah M.; Hotez, Peter J.; Asti, Lindsey; Zapf, Kristina M.; Bottazzi, Maria Elena; Diemert, David J.; Lee, Bruce Y.

    2016-01-01

    Background Even though human hookworm infection is highly endemic in many countries throughout the world, its global economic and health impact is not well known. Without a better understanding of hookworm’s economic burden worldwide, it is difficult for decision makers such as funders, policy makers, disease control officials, and intervention manufacturers to determine how much time, energy, and resources to invest in hookworm control. Methodology/Principal Findings We developed a computational simulation model to estimate the economic and health burden of hookworm infection in every country, WHO region, and globally, in 2016 from the societal perspective. Globally, hookworm infection resulted in a total of 2,126,280 DALYs using 2004 disability weight estimates and 4,087,803 DALYs using 2010 disability weight estimates (excluding cognitive impairment outcomes). Including cognitive impairment did not significantly increase DALYs worldwide. Total productivity losses varied with the probability of anemia and the calculation method used, ranging from $7.5 billion to $138.9 billion annually using gross national income per capita as a proxy for annual wages, and from $2.5 billion to $43.9 billion using minimum wage as a proxy for annual wages. Conclusion Even though hookworm is classified as a neglected tropical disease, its economic and health burden exceeded published estimates for a number of diseases that have received comparatively more attention than hookworm, such as rotavirus. Additionally, certain large countries that are transitioning to higher-income status, such as Brazil and China, still face a considerable hookworm burden. PMID:27607360

  3. Cryptogenic Stroke and Nonstenosing Intracranial Calcified Atherosclerosis.

    PubMed

    Kamel, Hooman; Gialdini, Gino; Baradaran, Hediyeh; Giambrone, Ashley E; Navi, Babak B; Lerario, Michael P; Min, James K; Iadecola, Costantino; Gupta, Ajay

    2017-04-01

    Because some cryptogenic strokes may result from large-artery atherosclerosis that goes unrecognized as it causes <50% luminal stenosis, we compared the prevalence of nonstenosing intracranial atherosclerotic plaques ipsilateral to cryptogenic cerebral infarcts versus the unaffected side using imaging biomarkers of calcium burden. In a prospective stroke registry, we identified patients with cerebral infarction limited to the territory of one internal carotid artery (ICA). We included patients with stroke of undetermined etiology and, as controls, patients with cardioembolic stroke. We used noncontrast computed tomography to measure calcification in both intracranial ICAs, including qualitative calcium scoring and quantitative scoring using the Agatston-Janowitz (AJ) method. Within subjects, the Wilcoxon signed-rank sum test for nonparametric paired data was used to compare the calcium burden in the ICA upstream of the infarction versus the ICA on the unaffected side. We obtained 440 calcium measures from 110 ICAs in 55 patients. Among 34 patients with stroke of undetermined etiology, we found greater calcium in the ICA ipsilateral to the infarction (mean Modified Woodcock Visual Scale score, 6.7 ± 4.6) than on the contralateral side (5.4 ± 4.1) (P = .005). Among 21 patients with cardioembolic stroke, we found no difference in calcium burden ipsilateral to the infarction (6.7 ± 5.9) versus the contralateral side (7.3 ± 6.3) (P = .13). The results were similar using quantitative calcium measurements, including the AJ calcium scores. In patients with strokes of undetermined etiology, the burden of calcified intracranial large-artery plaque was associated with downstream cerebral infarction. Copyright © 2017 National Stroke Association. Published by Elsevier Inc. All rights reserved.

  4. The Nature of Episodic Memory Deficits in MCI with and without Vascular Burden

    ERIC Educational Resources Information Center

    Villeneuve, Sylvia; Massoud, Fadi; Bocti, Christian; Gauthier, Serge; Belleville, Sylvie

    2011-01-01

    This study measured episodic memory deficits in individuals with mild cognitive impairment (MCI) as a function of their vascular burden. Vascular burden was determined clinically by computing the number of vascular risk factors and diseases and neuroradiologically by assessing the presence and severity of white matter lesions (WML). Strategic…

  5. An increase in aerosol burden due to the land-sea warming contrast

    NASA Astrophysics Data System (ADS)

    Hassan, T.; Allen, R.; Randles, C. A.

    2017-12-01

    Climate models simulate an increase in most aerosol species in response to warming, particularly over the tropics and Northern Hemisphere midlatitudes. This increase in aerosol burden is related to a decrease in wet removal, primarily due to reduced large-scale precipitation. Here, we show that the increase in aerosol burden, and the decrease in large-scale precipitation, is related to a robust climate change phenomenon: the land-sea warming contrast. Idealized simulations with two state-of-the-art climate models, the National Center for Atmospheric Research Community Atmosphere Model version 5 (NCAR CAM5) and the Geophysical Fluid Dynamics Laboratory Atmospheric Model 3 (GFDL AM3), show that muting the land-sea warming contrast negates the increase in aerosol burden under warming. This is related to smaller decreases in near-surface relative humidity over land, and in turn, smaller decreases in large-scale precipitation over land, especially in the NH midlatitudes. Furthermore, additional idealized simulations with an enhanced land-sea warming contrast lead to the opposite result: larger decreases in relative humidity over land, larger decreases in large-scale precipitation, and larger increases in aerosol burden. Our results, which relate the increase in aerosol burden to the robust climate projection of enhanced land warming, add confidence that a warmer world will be associated with a larger aerosol burden.

  6. A method for interactive specification of multiple-block topologies

    NASA Technical Reports Server (NTRS)

    Sorenson, Reese L.; Mccann, Karen M.

    1991-01-01

    A method is presented for dealing with the vast amount of topological and other data which must be specified to generate a multiple-block computational grid. Specific uses of the graphical capabilities of a powerful scientific workstation are described which reduce the burden on the user of collecting and formatting such large amounts of data. A program to implement this method, 3DPREP, is described. A plotting transformation algorithm, some useful software tools, notes on programming, and a database organization are also presented. Example grids developed using the method are shown.

  7. Use of high-order spectral moments in Doppler weather radar

    NASA Astrophysics Data System (ADS)

    di Vito, A.; Galati, G.; Veredice, A.

    Three techniques to estimate the skewness and kurtosis of measured precipitation spectra are evaluated. These are: (1) an extension of the pulse-pair technique, (2) fitting the autocorrelation function with a least-squares polynomial and differentiating it, and (3) autoregressive spectral estimation. The third technique provides the best results but has an exceedingly large computational burden. The first technique does not supply any useful results due to the crude approximation of the derivatives of the ACF. The second technique requires further study to reduce its variance.
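
    The pulse-pair technique referenced in (1) estimates spectral moments from low-lag autocorrelations. A minimal sketch of the mean-velocity estimate, with illustrative radar parameters and a simulated noisy echo:

```python
# Pulse-pair mean-velocity estimate from the lag-1 autocorrelation (ACF).
# Wavelength, PRI and the simulated velocity are illustrative values.
import numpy as np

lam, T = 0.10, 1e-3              # wavelength (m), pulse repetition interval (s)
v_true = 8.0                     # simulated mean radial velocity (m/s)

rng = np.random.default_rng(1)
n = 2048
t = np.arange(n) * T
f_d = 2 * v_true / lam           # Doppler shift (Hz)
z = np.exp(2j * np.pi * f_d * t)
z += 0.1 * (rng.standard_normal(n) + 1j * rng.standard_normal(n))  # noise

R1 = np.mean(z[1:] * np.conj(z[:-1]))        # lag-1 autocorrelation
v_hat = lam * np.angle(R1) / (4 * np.pi * T)
print(v_hat)   # close to 8.0, within the Nyquist velocity lam/(4T) = 25 m/s
```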

  8. Methodological Framework for World Health Organization Estimates of the Global Burden of Foodborne Disease

    PubMed Central

    Devleesschauwer, Brecht; Haagsma, Juanita A.; Angulo, Frederick J.; Bellinger, David C.; Cole, Dana; Döpfer, Dörte; Fazil, Aamir; Fèvre, Eric M.; Gibb, Herman J.; Hald, Tine; Kirk, Martyn D.; Lake, Robin J.; Maertens de Noordhout, Charline; Mathers, Colin D.; McDonald, Scott A.; Pires, Sara M.; Speybroeck, Niko; Thomas, M. Kate; Torgerson, Paul R.; Wu, Felicia; Havelaar, Arie H.; Praet, Nicolas

    2015-01-01

    Background The Foodborne Disease Burden Epidemiology Reference Group (FERG) was established in 2007 by the World Health Organization to estimate the global burden of foodborne diseases (FBDs). This paper describes the methodological framework developed by FERG's Computational Task Force to transform epidemiological information into FBD burden estimates. Methods and Findings The global and regional burden of 31 FBDs was quantified, along with limited estimates for 5 other FBDs, using Disability-Adjusted Life Years in a hazard- and incidence-based approach. To accomplish this task, the following workflow was defined: outline of disease models and collection of epidemiological data; design and completion of a database template; development of an imputation model; identification of disability weights; probabilistic burden assessment; and estimating the proportion of the disease burden by each hazard that is attributable to exposure by food (i.e., source attribution). All computations were performed in R and the different functions were compiled in the R package 'FERG'. Traceability and transparency were ensured by sharing results and methods in an interactive way with all FERG members throughout the process. Conclusions We developed a comprehensive framework for estimating the global burden of FBDs, in which methodological simplicity and transparency were key elements. All the tools developed have been made available and can be translated into a user-friendly national toolkit for studying and monitoring food safety at the local level. PMID:26633883
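
    FERG implemented its computations in R; purely to illustrate the hazard- and incidence-based DALY arithmetic (DALY = YLL + YLD), here is a Python sketch with invented input numbers:

```python
# Incidence-based DALY arithmetic: years of life lost (YLL) plus years
# lived with disability (YLD). All input numbers are invented.
deaths = 1200                 # deaths attributed to the hazard
life_exp_at_death = 35.0      # residual standard life expectancy (years)
incident_cases = 50_000       # new cases of the associated health state
duration = 0.02               # mean duration of illness (years)
dis_weight = 0.25             # disability weight of the health state

yll = deaths * life_exp_at_death              # 42,000 years
yld = incident_cases * duration * dis_weight  # 250 years
daly = yll + yld
print(daly)                                   # 42250.0
```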

  9. Physical-geometric optics method for large size faceted particles.

    PubMed

    Sun, Bingqiang; Yang, Ping; Kattawar, George W; Zhang, Xiaodong

    2017-10-02

    A new physical-geometric optics method is developed to compute the single-scattering properties of faceted particles. It incorporates a general absorption vector to accurately account for inhomogeneous wave effects and subsequently yields analytical formulas that are effective and computationally efficient for absorptive scattering particles. A bundle of rays incident on a certain facet can be traced as a single beam. For a beam incident on multiple facets, a systematic beam-splitting technique based on computer graphics is used to split the original beam into several sub-beams so that each sub-beam is incident on only an individual facet. The new beam-splitting technique significantly reduces the computational burden. The present physical-geometric optics method can be generalized to arbitrary faceted particles with either convex or concave shapes and with a homogeneous or an inhomogeneous (e.g., a particle with a core) composition. The single-scattering properties of irregular convex homogeneous and inhomogeneous hexahedra are simulated and compared to their counterparts from two other methods, including a numerically rigorous method.

  10. Modeling Human Behavior with Fuzzy and Soft Computing Methods

    DTIC Science & Technology

    2017-12-13


  11. Estimated Global, Regional, and National Disease Burdens Related to Sugar-Sweetened Beverage Consumption in 2010

    PubMed Central

    Singh, Gitanjali M.; Micha, Renata; Khatibzadeh, Shahab; Lim, Stephen; Ezzati, Majid; Mozaffarian, Dariush

    2015-01-01

    Background Sugar-sweetened beverages (SSBs) are consumed globally and contribute to adiposity. However, the worldwide impact of SSBs on burdens of adiposity-related cardiovascular diseases (CVD), cancers, and diabetes has not been assessed by nation, age, and sex. Methods and Results We modeled global, regional, and national burdens of disease associated with SSB consumption by age/sex in 2010. Data on SSB consumption levels were pooled from national dietary surveys worldwide. The effects of SSB intake on BMI and diabetes, and of elevated BMI on CVD, diabetes, and cancers were derived from large prospective cohort pooling studies. Disease-specific mortality/morbidity data were obtained from the Global Burden of Diseases, Injuries, and Risk Factors 2010 Study. We computed cause-specific population-attributable fractions for SSB consumption, which were multiplied by cause-specific mortality/morbidity to compute estimates of SSB-attributable death/disability. Analyses were done by country/age/sex; uncertainties of all input data were propagated into final estimates. Worldwide, the model estimated 184,000 (95%UI=161,000–208,000) deaths/year attributable to SSB consumption: 133,000 (126,000–139,000) from diabetes, 45,000 (26,000–61,000) from CVD, and 6,450 (4,300–8,600) from cancers. 5.0% of SSB-related deaths occurred in low-income, 70.9% in middle-income, and 24.1% in high-income countries. Proportional mortality due to SSBs ranged from <1% in Japanese >65y to 30% in Mexicans <45y. Among the 20 most populous countries, Mexico had the largest absolute (405 deaths/million adults) and proportional (12.1%) deaths from SSBs. A total of 8.5 (2.8, 19.2) million disability-adjusted life years (DALYs) were related to SSB intake (4.5% of diabetes-related DALYs). Conclusions SSBs are a single, modifiable component of diet that can impact preventable death/disability in adults in high-, middle-, and low-income countries, indicating an urgent need for strong global prevention programs. PMID:26124185
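
    The attribution step above multiplies cause-specific population-attributable fractions (PAFs) by cause-specific mortality. A sketch of that arithmetic using Levin's formula, with invented prevalence, relative-risk, and death-count values rather than the study's pooled inputs:

```python
# Levin's population-attributable fraction, scaled against cause-specific
# deaths. Prevalence, relative risk and death counts are invented.
p_exposed = 0.30        # prevalence of high SSB intake
rr = 1.25               # relative risk of the outcome given exposure
cause_deaths = 400_000  # total deaths from the cause

paf = p_exposed * (rr - 1) / (p_exposed * (rr - 1) + 1)
attributable = paf * cause_deaths
print(round(paf, 4), round(attributable))
```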

  12. The Burden of Research on Trauma for Respondents: A Prospective and Comparative Study on Respondents' Evaluations and Predictors

    PubMed Central

    van der Velden, Peter G.; Bosmans, Mark W. G.; Scherpenzeel, Annette C.

    2013-01-01

    The possible burden of participating in trauma research is an important topic for Ethical Committees (ECs), Review Boards (RBs) and researchers. However, to what extent research on trauma is more burdensome than non-trauma research is unknown. Little is known about which factors explain respondents' evaluations of the burden: to what extent are they trauma-related or dependent on other factors such as personality and how respondents evaluate research in general? Data from a large probability-based multi-wave internet panel, with surveys on politics and values, personality and health in 2009 and 2011, and a survey on trauma in 2012, provided a unique opportunity to address these questions. Results among respondents confronted with such events in the past 2 years (N = 950) showed that questions on trauma were significantly and systematically evaluated as less pleasant (enjoyed less) and more difficult, but also as stimulating respondents to think about things more than almost all previous non-trauma surveys. Yet, the computed effect sizes indicated that the differences were (very) small and often meaningless. No differences were found between users and non-users of mental health services, in contrast to posttraumatic stress symptoms. Evaluations of the burden of previous surveys in 2011 on politics and values, personality and health most strongly, systematically and independently predicted the burden of questions on trauma, rather than posttraumatic stress symptoms, event-related coping self-efficacy or personality factors. For instance, multiple linear regression analyses showed that 30% of the variance in how (un)pleasant questions on trauma and life-events were evaluated was explained by how (un)pleasant the 3 surveys in 2011 were evaluated, in contrast to posttraumatic stress symptoms (not significant) and coping self-efficacy (5%). Findings question why ECs, RBs and researchers should be more critical of the possible burden of trauma research than of the possible burden of other non-trauma research. PMID:24204785

  13. Towards Dynamic Remote Data Auditing in Computational Clouds

    PubMed Central

    Khurram Khan, Muhammad; Anuar, Nor Badrul

    2014-01-01

    Cloud computing is a significant shift in computational paradigm, in which computing as a utility and remote data storage have great potential. Enterprises and businesses are now more interested in outsourcing their data to the cloud to lessen the burden of local data storage and maintenance. However, the outsourced data and the computation outcomes are not continuously trustworthy due to the data owners' lack of control and physical possession of the data. To address this issue, researchers have focused on designing remote data auditing (RDA) techniques. The majority of these techniques, however, are applicable only to static archive data and cannot audit dynamically updated outsourced data. We propose an effective RDA technique based on algebraic signature properties for cloud storage systems and also present a new data structure capable of efficiently supporting dynamic data operations such as append, insert, modify, and delete. Moreover, this data structure makes our method applicable to large-scale data with minimum computation cost. The comparative analysis with state-of-the-art RDA schemes shows that the proposed scheme is secure and highly efficient in terms of the computation and communication overhead on the auditor and server. PMID:25121114
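
    The algebraic-signature idea the scheme builds on can be illustrated with a toy signature over a prime field; its linearity is what lets an auditor check combined blocks cheaply. The field size, generator, and data below are illustrative, not the paper's construction.

```python
# Toy algebraic signature sig(b) = sum_i b_i * ALPHA**i (mod P).
# Its homomorphic property, sig(b + c) == sig(b) + sig(c) (mod P),
# is the algebraic fact RDA schemes of this kind exploit.
P, ALPHA = 2**31 - 1, 7

def sig(blocks):
    s, power = 0, 1
    for b in blocks:
        s = (s + b * power) % P
        power = (power * ALPHA) % P
    return s

b = [12, 99, 7, 3]
c = [5, 1, 44, 80]
combined = [(x + y) % P for x, y in zip(b, c)]
print(sig(combined) == (sig(b) + sig(c)) % P)   # True
```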

  14. Towards dynamic remote data auditing in computational clouds.

    PubMed

    Sookhak, Mehdi; Akhunzada, Adnan; Gani, Abdullah; Khurram Khan, Muhammad; Anuar, Nor Badrul

    2014-01-01

    Cloud computing is a significant shift in computational paradigm, in which computing as a utility and remote data storage have great potential. Enterprises and businesses are now more interested in outsourcing their data to the cloud to lessen the burden of local data storage and maintenance. However, the outsourced data and the computation outcomes are not continuously trustworthy due to the data owners' lack of control and physical possession of the data. To address this issue, researchers have focused on designing remote data auditing (RDA) techniques. The majority of these techniques, however, are applicable only to static archive data and cannot audit dynamically updated outsourced data. We propose an effective RDA technique based on algebraic signature properties for cloud storage systems and also present a new data structure capable of efficiently supporting dynamic data operations such as append, insert, modify, and delete. Moreover, this data structure makes our method applicable to large-scale data with minimum computation cost. The comparative analysis with state-of-the-art RDA schemes shows that the proposed scheme is secure and highly efficient in terms of the computation and communication overhead on the auditor and server.

  15. Implementation of an audio computer-assisted self-interview (ACASI) system in a general medicine clinic: patient response burden.

    PubMed

    Trick, W E; Deamant, C; Smith, J; Garcia, D; Angulo, F

    2015-01-01

    Routine implementation of instruments to capture patient-reported outcomes could guide clinical practice and facilitate health services research. Audio interviews facilitate self-interviews across literacy levels. To evaluate the time burden for patients, and the factors associated with response times, for an audio computer-assisted self-interview (ACASI) system integrated into the clinical workflow. We developed an ACASI system, integrated with a research data warehouse. Instruments for symptom burden, self-reported health, depression screening, tobacco use, and patient satisfaction were administered through touch-screen monitors in the general medicine clinic at the Cook County Health & Hospitals System from April 8, 2011, through July 27, 2012. We performed a cross-sectional study to evaluate the mean time burden per item and for each module of instruments; we evaluated factors associated with longer response latency. Among 1,670 interviews, the mean per-question response time was 18.4 [SD, 6.1] seconds. By multivariable analysis, age was most strongly associated with prolonged response time and increased per decade compared to < 50 years as follows (additional seconds per question; 95% CI): 50-59 years (1.4; 0.7 to 2.1 seconds); 60-69 (3.4; 2.6 to 4.1); 70-79 (5.1; 4.0 to 6.1); and 80-89 (5.5; 4.1 to 7.0). Response times were also longer for Spanish language (3.9; 2.9 to 4.9); no home computer use (3.3; 2.8 to 3.9); and low mental self-reported health (0.6; 0.0 to 1.1). However, most interviews were completed within 10 minutes. An ACASI software system can be included in a patient visit and adds minimal time burden. The burden was greatest for older patients, interviews in Spanish, and those with less computer exposure. A patient's self-reported health had minimal impact on response times.

  16. Computational Biomathematics: Toward Optimal Control of Complex Biological Systems

    DTIC Science & Technology

    2016-09-26

    …equations seems daunting. However, we are currently working on parameter estimation methods that show some promise. In this approach, we generate data from

  17. Multiplexed Predictive Control of a Large Commercial Turbofan Engine

    NASA Technical Reports Server (NTRS)

    Richter, Hanz; Singaraju, Anil; Litt, Jonathan S.

    2008-01-01

    Model predictive control is a strategy well-suited to handle the highly complex, nonlinear, uncertain, and constrained dynamics involved in aircraft engine control problems. However, it has thus far been infeasible to implement model predictive control in engine control applications, because of the combination of model complexity and the time allotted for the control update calculation. In this paper, a multiplexed implementation is proposed that dramatically reduces the computational burden of the quadratic programming optimization that must be solved online as part of the model-predictive-control algorithm. Actuator updates are calculated sequentially and cyclically in a multiplexed implementation, as opposed to the simultaneous optimization taking place in conventional model predictive control. Theoretical aspects are discussed based on a nominal model, and actual computational savings are demonstrated using a realistic commercial engine model.
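
    The multiplexed idea (one actuator re-optimized per control step instead of a simultaneous QP solve) can be sketched as coordinate-wise minimization of a quadratic cost; the cost matrices below are illustrative, not an engine model.

```python
# Sequential, cyclic single-actuator updates on the QP cost
# J(u) = 0.5 u'Hu + f'u, compared against the simultaneous optimum.
import numpy as np

H = np.array([[4.0, 1.0, 0.5],
              [1.0, 3.0, 0.2],
              [0.5, 0.2, 2.0]])   # positive-definite cost Hessian (illustrative)
f = np.array([-1.0, 2.0, 0.5])

u = np.zeros(3)
for step in range(30):            # 10 full cycles over 3 actuators
    i = step % 3                  # only actuator i is updated this step
    u[i] = -(f[i] + H[i] @ u - H[i, i] * u[i]) / H[i, i]

u_full = np.linalg.solve(H, -f)   # simultaneous (conventional MPC) optimum
print(np.max(np.abs(u - u_full)))  # the multiplexed updates converge to it
```

    Each cyclic update is a cheap scalar solve, which is the source of the computational savings; for a strictly convex cost the cycle converges to the same optimum the full QP would return.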

  18. Two-Level Verification of Data Integrity for Data Storage in Cloud Computing

    NASA Astrophysics Data System (ADS)

    Xu, Guangwei; Chen, Chunlin; Wang, Hongya; Zang, Zhuping; Pang, Mugen; Jiang, Ping

    Data storage in cloud computing can save capital expenditure and relieve the burden of storage management for users. As loss or corruption of stored files may happen, many researchers focus on verification of data integrity. However, massive numbers of users often bring large numbers of verification tasks for the auditor. Moreover, users also need to pay an extra fee for these verification tasks beyond the storage fee. Therefore, we propose a two-level verification of data integrity to alleviate these problems. The key idea is to routinely verify the data integrity by users and to arbitrate challenges between the user and the cloud provider by the auditor according to the MACs and ϕ values. Extensive performance simulations show that the proposed scheme clearly decreases the auditor's verification tasks and the ratio of wrong arbitrations.
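
    The user-side (first-level) check can be sketched with ordinary MACs; the ϕ-value arbitration of the paper is not modeled here, and the key and data are illustrative.

```python
# Routine user-side integrity check: recompute a MAC over the block
# retrieved from the provider and compare it with the stored tag.
import hmac, hashlib

key = b"user-secret-key"                  # illustrative key

def mac(data: bytes) -> bytes:
    return hmac.new(key, data, hashlib.sha256).digest()

stored = b"block contents held by the cloud provider"
tag = mac(stored)                         # retained by the user at upload time

ok_intact = hmac.compare_digest(mac(stored), tag)
ok_tampered = hmac.compare_digest(mac(stored + b"!"), tag)
print(ok_intact, ok_tampered)   # True False; a mismatch would be escalated
                                # to the auditor for arbitration
```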

  19. Addressing Curse of Dimensionality in Sensitivity Analysis: How Can We Handle High-Dimensional Problems?

    NASA Astrophysics Data System (ADS)

    Safaei, S.; Haghnegahdar, A.; Razavi, S.

    2016-12-01

    Complex environmental models are now the primary tool to inform decision makers for the current or future management of environmental resources under climate and environmental changes. These complex models often contain a large number of parameters that need to be determined by a computationally intensive calibration procedure. Sensitivity analysis (SA) is a very useful tool that not only allows for understanding the model behavior, but also helps in reducing the number of calibration parameters by identifying unimportant ones. The issue is that most global sensitivity techniques are themselves highly computationally demanding for generating robust and stable sensitivity metrics over the entire model response surface. Recently, a novel global sensitivity analysis method, Variogram Analysis of Response Surfaces (VARS), was introduced that can efficiently provide a comprehensive assessment of global sensitivity using the variogram concept. In this work, we aim to evaluate the effectiveness of this highly efficient GSA method in reducing the computational burden when applied to systems with an extra-large number of input factors (~100). We use a test function and a hydrological modelling case study to demonstrate the capability of the VARS method in reducing problem dimensionality by identifying important vs. unimportant input factors.
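
    The variogram that VARS builds its sensitivity metrics from can be sketched for a one-dimensional slice of a toy response surface; the model and sampling below are illustrative, not the VARS implementation.

```python
# Directional variogram gamma(h) = 0.5 * E[(y(x+h) - y(x))^2] of a toy
# model response sampled along one input factor.
import numpy as np

def model(x):                       # illustrative response surface
    return np.sin(3 * x) + 0.5 * x

x = np.linspace(0.0, 1.0, 101)      # evenly spaced samples along the factor
y = model(x)

def variogram(y, lag):
    d = y[lag:] - y[:-lag]
    return 0.5 * np.mean(d ** 2)

gammas = [variogram(y, lag) for lag in (1, 5, 10)]
print(gammas)   # grows with lag: the response varies strongly along this factor
```

    Comparing how quickly the variogram grows along each input factor is, loosely, how directional sensitivity is ranked.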

  20. INS/GNSS Tightly-Coupled Integration Using Quaternion-Based AUPF for USV.

    PubMed

    Xia, Guoqing; Wang, Guoqing

    2016-08-02

    This paper addresses the problem of integration of the Inertial Navigation System (INS) and Global Navigation Satellite System (GNSS) for the purpose of developing a low-cost, robust and highly accurate navigation system for unmanned surface vehicles (USVs). A tightly-coupled integration approach is one of the most promising architectures to fuse the GNSS data with INS measurements. However, the resulting system and measurement models turn out to be nonlinear, and the stochastic sensor measurement errors are non-Gaussian distributed in a practical system. The particle filter (PF), one of the most theoretically attractive non-linear/non-Gaussian estimation methods, is becoming more and more attractive in navigation applications. However, its large computational burden limits its practical usage. For the purpose of reducing the computational burden without degrading the system estimation accuracy, a quaternion-based adaptive unscented particle filter (AUPF), which combines the adaptive unscented Kalman filter (AUKF) with the PF, has been proposed in this paper. The unscented Kalman filter (UKF) is used in the algorithm to improve the proposal distribution and generate posterior estimates, which specify the PF importance density function for generating particles more intelligently. In addition, the computational complexity of the filter is reduced with the avoidance of the re-sampling step. Furthermore, a residual-based covariance matching technique is used to adapt the measurement error covariance. A trajectory simulator based on a dynamic model of the USV is used to test the proposed algorithm. Results show that the quaternion-based AUPF can significantly improve the overall navigation accuracy and reliability.

  1. Computers: Good for Education?

    ERIC Educational Resources Information Center

    Skinner, David

    1997-01-01

    Explores the use of computers in the classroom, and concludes that the burden should be on the computer industry to prove that it really has something to offer to the educational system. Instead, article notes, the computer industry is pushing computer use that has not been demonstrated to be an educational necessity. (SLD)

  2. A Carrier Estimation Method Based on MLE and KF for Weak GNSS Signals.

    PubMed

    Zhang, Hongyang; Xu, Luping; Yan, Bo; Zhang, Hua; Luo, Liyan

    2017-06-22

    Maximum likelihood estimation (MLE) has been researched for some acquisition and tracking applications of global navigation satellite system (GNSS) receivers and shows high performance. However, all current methods are derived and operated based on the sampling data, which results in a large computational burden. This paper proposes a low-complexity MLE carrier tracking loop for weak GNSS signals which processes the coherent integration results instead of the sampling data. First, the cost function of the MLE of signal parameters such as signal amplitude, carrier phase, and Doppler frequency is used to derive an MLE discriminator function. The optimal value of the cost function is searched iteratively by an efficient Levenberg-Marquardt (LM) method. Its performance, including the Cramér-Rao bound (CRB), dynamic characteristics, and computational burden, is analyzed by numerical techniques. Second, an adaptive Kalman filter is designed for the MLE discriminator to obtain smooth estimates of carrier phase and frequency. The performance of the proposed loop, in terms of sensitivity, accuracy and bit error rate, is compared with conventional methods by Monte Carlo (MC) simulations in both pedestrian-level and vehicle-level dynamic circumstances. Finally, an optimal loop which combines the proposed method and the conventional method is designed to achieve optimal performance in both weak and strong signal circumstances.
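
    The MLE search over amplitude, carrier phase and Doppler can be sketched as a Levenberg-Marquardt least-squares fit to coherent-integration outputs; the signal parameters, noise level, and starting point below are illustrative, not the paper's loop.

```python
# LM fit of (amplitude, phase, Doppler) to simulated coherent-integration
# results of a carrier. True parameter values are illustrative.
import numpy as np
from scipy.optimize import least_squares

T = 1e-3                                   # coherent integration period (s)
t = np.arange(50) * T
a_true, phi_true, f_true = 1.2, 0.7, 45.0  # amplitude, phase (rad), Doppler (Hz)

rng = np.random.default_rng(2)
z = a_true * np.exp(1j * (2 * np.pi * f_true * t + phi_true))
z += 0.05 * (rng.standard_normal(t.size) + 1j * rng.standard_normal(t.size))

def residuals(p):
    a, phi, f = p
    model = a * np.exp(1j * (2 * np.pi * f * t + phi))
    r = z - model
    return np.concatenate([r.real, r.imag])  # LM needs real residuals

fit = least_squares(residuals, x0=[1.0, 0.0, 44.0], method="lm")
a_hat, phi_hat, f_hat = fit.x
print(a_hat, f_hat)                        # close to 1.2 and 45.0
```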

  3. Three-Dimensional Terahertz Coded-Aperture Imaging Based on Matched Filtering and Convolutional Neural Network.

    PubMed

    Chen, Shuo; Luo, Chenggao; Wang, Hongqiang; Deng, Bin; Cheng, Yongqiang; Zhuang, Zhaowen

    2018-04-26

    As a promising radar imaging technique, terahertz coded-aperture imaging (TCAI) can achieve high-resolution, forward-looking, and staring imaging by producing spatiotemporal independent signals with coded apertures. However, there are still two problems in three-dimensional (3D) TCAI. Firstly, the large-scale reference-signal matrix based on meshing the 3D imaging area creates a heavy computational burden, thus leading to unsatisfactory efficiency. Secondly, it is difficult to resolve the target under low signal-to-noise ratio (SNR). In this paper, we propose a 3D imaging method based on matched filtering (MF) and convolutional neural network (CNN), which can reduce the computational burden and achieve high-resolution imaging for low SNR targets. In terms of the frequency-hopping (FH) signal, the original echo is processed with MF. By extracting the processed echo in different spike pulses separately, targets in different imaging planes are reconstructed simultaneously to decompose the global computational complexity, and then are synthesized together to reconstruct the 3D target. Based on the conventional TCAI model, we deduce and build a new TCAI model based on MF. Furthermore, the convolutional neural network (CNN) is designed to teach the MF-TCAI how to reconstruct the low SNR target better. The experimental results demonstrate that the MF-TCAI achieves impressive performance on imaging ability and efficiency under low SNR. Moreover, the MF-TCAI has learned to better resolve the low-SNR 3D target with the help of CNN. In summary, the proposed 3D TCAI can achieve: (1) low-SNR high-resolution imaging by using MF; (2) efficient 3D imaging by downsizing the large-scale reference-signal matrix; and (3) intelligent imaging with CNN. Therefore, the TCAI based on MF and CNN has great potential in applications such as security screening, nondestructive detection, medical diagnosis, etc.
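
    The matched-filtering step, taken in isolation, is a correlation of the received echo with the known transmitted waveform; this sketch omits the coded-aperture and frequency-hopping specifics and uses an invented random code.

```python
# Matched filter as correlation: the peak lag recovers the echo delay.
import numpy as np

rng = np.random.default_rng(3)
tx = rng.standard_normal(64)               # known transmitted code (invented)
delay = 100
rx = np.zeros(300)
rx[delay:delay + tx.size] += 0.8 * tx      # attenuated, delayed echo
rx += 0.2 * rng.standard_normal(rx.size)   # receiver noise

mf = np.correlate(rx, tx, mode="valid")    # matched-filter output vs. lag
print(int(np.argmax(mf)))                  # 100, the true delay
```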

  4. Input-output oriented computation algorithms for the control of large flexible structures

    NASA Technical Reports Server (NTRS)

    Minto, K. D.

    1989-01-01

    An overview is given of work in progress aimed at developing computational algorithms addressing two important aspects in the control of large flexible space structures; namely, the selection and placement of sensors and actuators, and the resulting multivariable control law design problem. The issue of sensor/actuator set selection is particularly crucial to obtaining a satisfactory control design, as clearly a poor choice will inherently limit the degree to which good control can be achieved. With regard to control law design, we are driven by concerns stemming from the practical issues associated with eventual implementation of multivariable control laws, such as reliability, limit protection, multimode operation, sampling rate selection, processor throughput, etc. Naturally, the burden imposed by dealing with these aspects of the problem can be reduced by ensuring that the complexity of the compensator is minimized. Our approach to these problems is based on extensions to input/output oriented techniques that have proven useful in the design of multivariable control systems for aircraft engines. In particular, we are exploring the use of relative gain analysis and the condition number as a means of quantifying the process of sensor/actuator selection and placement for shape control of a large space platform.
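
    The two screening quantities named above, the relative gain array and the condition number, are cheap to compute from a steady-state gain matrix; the matrix G here is illustrative, not a structure model.

```python
# Relative gain array RGA = G .* inv(G)' and condition number of a
# candidate sensor/actuator gain matrix G (illustrative values).
import numpy as np

G = np.array([[2.0, 0.5],
              [0.4, 1.5]])

rga = G * np.linalg.inv(G).T      # elementwise product
cond = np.linalg.cond(G)          # ratio of largest to smallest singular value

print(rga)    # rows and columns each sum to 1; diagonal dominance
print(cond)   # suggests pairing actuator i with sensor i
```

    A strongly diagonal RGA and a modest condition number together indicate a sensor/actuator set over which good decentralized control is achievable.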

  5. Gender differences in coronary plaque composition by coronary computed tomography angiography.

    PubMed

    Blaha, Michael J; Nasir, Khurram; Rivera, Juan J; Choi, Eue-Keun; Chang, Sung-A; Yoon, Yeonyee E; Chun, Eun Ju; Choi, Sang-il; Agatston, Arthur; Blumenthal, Roger S; Chang, Hyuk-Jae

    2009-12-01

    Coronary computed tomography angiography allows the differentiation of non-calcified (NCAP), calcified (CAP), and mixed coronary artery plaques (MCAP). Although males are thought to have a higher prevalence of atherosclerosis for a given age, there are currently few data regarding age-adjusted sex differences in plaque morphology and composition. We studied 1015 consecutive asymptomatic South Korean patients (49+/-10 years, 64% men) who underwent 64-slice coronary computed tomography angiography during a routine health evaluation. Coronary plaque characteristics were analyzed on a per-segment basis according to the modified AHA classification. Plaques with more than 50% calcified tissue were classified as CAP, plaques with less than 50% calcified tissue were classified as MCAP, and plaques without calcium were classified as NCAP. Multiple regression analysis was used to describe the cross-sectional association between sex and plaque-type burden (>or=2 affected segments) after adjustment for age and other cardiovascular risk factors. There was a greater prevalence of coronary plaque among men (13 vs. 4%, P<0.001). Males were more likely to have an increased burden of CAP (4 vs. 1%, P = 0.01) and MCAP (5 vs. 1%, P<0.001), whereas the burden of NCAP was similar across sex (2 vs. 1%, P = 0.28). After multivariable adjustment, men have six to seven times greater odds of having an increased burden of CAP and MCAP, whereas no sex difference was observed in the burden of NCAP. In this population of asymptomatic middle-aged Korean individuals, males had a significantly greater burden of MCAP and CAP. Future studies will determine whether these differences contribute to the accelerated cardiovascular risk observed in men.

  6. The Burden of Cardiovascular Diseases Among US States, 1990-2016.

    PubMed

    Roth, Gregory A; Johnson, Catherine O; Abate, Kalkidan Hassen; Abd-Allah, Foad; Ahmed, Muktar; Alam, Khurshid; Alam, Tahiya; Alvis-Guzman, Nelson; Ansari, Hossein; Ärnlöv, Johan; Atey, Tesfay Mehari; Awasthi, Ashish; Awoke, Tadesse; Barac, Aleksandra; Bärnighausen, Till; Bedi, Neeraj; Bennett, Derrick; Bensenor, Isabela; Biadgilign, Sibhatu; Castañeda-Orjuela, Carlos; Catalá-López, Ferrán; Davletov, Kairat; Dharmaratne, Samath; Ding, Eric L; Dubey, Manisha; Faraon, Emerito Jose Aquino; Farid, Talha; Farvid, Maryam S; Feigin, Valery; Fernandes, João; Frostad, Joseph; Gebru, Alemseged; Geleijnse, Johanna M; Gona, Philimon Nyakauru; Griswold, Max; Hailu, Gessessew Bugssa; Hankey, Graeme J; Hassen, Hamid Yimam; Havmoeller, Rasmus; Hay, Simon; Heckbert, Susan R; Irvine, Caleb Mackay Salpeter; James, Spencer Lewis; Jara, Dube; Kasaeian, Amir; Khan, Abdur Rahman; Khera, Sahil; Khoja, Abdullah T; Khubchandani, Jagdish; Kim, Daniel; Kolte, Dhaval; Lal, Dharmesh; Larsson, Anders; Linn, Shai; Lotufo, Paulo A; Magdy Abd El Razek, Hassan; Mazidi, Mohsen; Meier, Toni; Mendoza, Walter; Mensah, George A; Meretoja, Atte; Mezgebe, Haftay Berhane; Mirrakhimov, Erkin; Mohammed, Shafiu; Moran, Andrew Edward; Nguyen, Grant; Nguyen, Minh; Ong, Kanyin Liane; Owolabi, Mayowa; Pletcher, Martin; Pourmalek, Farshad; Purcell, Caroline A; Qorbani, Mostafa; Rahman, Mahfuzar; Rai, Rajesh Kumar; Ram, Usha; Reitsma, Marissa Bettay; Renzaho, Andre M N; Rios-Blancas, Maria Jesus; Safiri, Saeid; Salomon, Joshua A; Sartorius, Benn; Sepanlou, Sadaf Ghajarieh; Shaikh, Masood Ali; Silva, Diego; Stranges, Saverio; Tabarés-Seisdedos, Rafael; Tadele Atnafu, Niguse; Thakur, J S; Topor-Madry, Roman; Truelsen, Thomas; Tuzcu, E Murat; Tyrovolas, Stefanos; Ukwaja, Kingsley Nnanna; Vasankari, Tommi; Vlassov, Vasiliy; Vollset, Stein Emil; Wakayo, Tolassa; Weintraub, Robert; Wolfe, Charles; Workicho, Abdulhalik; Xu, Gelin; Yadgir, Simon; Yano, Yuichiro; Yip, Paul; Yonemoto, Naohiro; Younis, Mustafa; Yu, 
Chuanhua; Zaidi, Zoubida; Zaki, Maysaa El Sayed; Zipkin, Ben; Afshin, Ashkan; Gakidou, Emmanuela; Lim, Stephen S; Mokdad, Ali H; Naghavi, Mohsen; Vos, Theo; Murray, Christopher J L

    2018-04-11

    Cardiovascular disease (CVD) is the leading cause of death in the United States, but regional variation within the United States is large. Comparable and consistent state-level measures of total CVD burden and risk factors have not been produced previously. To quantify and describe levels and trends of lost health due to CVD within the United States from 1990 to 2016 as well as risk factors driving these changes. Using the Global Burden of Disease methodology, cardiovascular disease mortality, nonfatal health outcomes, and associated risk factors were analyzed by age group, sex, and year from 1990 to 2016 for all residents in the United States using standardized approaches for data processing and statistical modeling. Burden of disease was estimated for 10 groupings of CVD, and comparative risk analysis was performed. Data were analyzed from August 2016 to July 2017. Residing in the United States. Cardiovascular disease disability-adjusted life-years (DALYs). Between 1990 and 2016, age-standardized CVD DALYs for all states decreased. Several states had large rises in their relative rank ordering for total CVD DALYs among states, including Arkansas, Oklahoma, Alabama, Kentucky, Missouri, Indiana, Kansas, Alaska, and Iowa. The rate of decline varied widely across states, and CVD burden increased for a small number of states in the most recent years. Cardiovascular disease DALYs remained twice as large among men compared with women. Ischemic heart disease was the leading cause of CVD DALYs in all states, but the second most common varied by state. Trends were driven by 12 groups of risk factors, with the largest attributable CVD burden due to dietary risk exposures followed by high systolic blood pressure, high body mass index, high total cholesterol level, high fasting plasma glucose level, tobacco smoking, and low levels of physical activity. 
Increases in risk-deleted CVD DALY rates between 2006 and 2016 in 16 states suggest additional unmeasured risks beyond these traditional factors. Large disparities in total burden of CVD persist between US states despite marked improvements in CVD burden. Differences in CVD burden are largely attributable to modifiable risk exposures.

  7. Education: AIChE Probes Impact of Computer on Future Engineering Education.

    ERIC Educational Resources Information Center

    Krieger, James

    1983-01-01

Evaluates influence of computer-assisted instruction on engineering education, considering use of computers to remove burden of doing calculations and to provide interactive self-study programs of a tutorial/remedial nature. Cites universities requiring personal computer purchase, pointing out possibility for individualized design assignments.…

  8. 76 FR 1410 - Privacy Act of 1974; Computer Matching Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-10

    ...; Computer Matching Program AGENCY: Defense Manpower Data Center (DMDC), DoD. ACTION: Notice of a Computer... administrative burden, constitute a greater intrusion of the individual's privacy, and would result in additional... Liaison Officer, Department of Defense. Notice of a Computer Matching Program Among the Defense Manpower...

  9. Worldwide burden of gastric cancer in 2010 attributable to high sodium intake in 1990 and predicted attributable burden for 2030 based on exposures in 2010.

    PubMed

    Peleteiro, Bárbara; Barros, Susana; Castro, Clara; Ferro, Ana; Morais, Samantha; Lunet, Nuno

    2016-08-01

Assessing the impact that patterns of Na intake may have on gastric cancer will provide a more comprehensive estimation of Na reduction as a primary prevention approach. We aimed to estimate the proportion of gastric cancer cases that are attributable to Na intake above the recommendation by the WHO (≤2 g/d) throughout the world in 2010, as well as expected values for 2030. Population attributable fractions (PAF) were computed for 187 countries, using Na intakes in 1990 and 2010 and estimates of the association between Na intake and gastric cancer, assuming a time lag of 20 years. Median PAF ranged from 10·1% in low to 22·5% in very high Human Development Index (HDI) countries in men (P<0·001) and from 7·2% to 16·6%, respectively, among women (P<0·001). An increase in median PAF until 2030 is expected in most settings, except for countries classified as low HDI, in both sexes. High Na intakes account for a large proportion of gastric cancer cases, and proportions are expected to increase in almost all of the countries. Intensified efforts to diminish Na intake in virtually all populations are needed to further reduce gastric cancer burden.
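For a binary exposure, the population attributable fraction has a simple closed form (Levin's formula). The study itself works with continuous Na-intake distributions and a 20-year lag, so the sketch below is a simplified illustration of the PAF concept, not the authors' exact estimator.

```python
def paf(prevalence, relative_risk):
    """Levin's population attributable fraction for a binary exposure.

    prevalence: fraction of the population exposed (e.g. Na intake > 2 g/d)
    relative_risk: disease risk in the exposed relative to the unexposed
    """
    excess = prevalence * (relative_risk - 1.0)
    return excess / (excess + 1.0)

# e.g. if half the population is exposed at a relative risk of 2,
# one third of cases are attributable to the exposure
fraction = paf(0.5, 2.0)
```

Country-level estimates like those in the abstract then follow from plugging each country's exposure distribution and risk estimate into a formula of this family.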

  10. 78 FR 24466 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-25

    .... Affected Public: Private Sector: Businesses or other for-profits. Estimated Annual Burden Hours: 24,206,448... correctly computed. Affected Public: Private Sector: Businesses or other for-profits. Estimated Annual...: Private Sector: Businesses or other for-profits. Estimated Annual Burden Hours: 51,024. OMB Number: 1545...

  11. Energy intensity of computer manufacturing: hybrid assessment combining process and economic input-output methods.

    PubMed

    Williams, Eric

    2004-11-15

    The total energy and fossil fuels used in producing a desktop computer with 17-in. CRT monitor are estimated at 6400 megajoules (MJ) and 260 kg, respectively. This indicates that computer manufacturing is energy intensive: the ratio of fossil fuel use to product weight is 11, an order of magnitude larger than the factor of 1-2 for many other manufactured goods. This high energy intensity of manufacturing, combined with rapid turnover in computers, results in an annual life cycle energy burden that is surprisingly high: about 2600 MJ per year, 1.3 times that of a refrigerator. In contrast with many home appliances, life cycle energy use of a computer is dominated by production (81%) as opposed to operation (19%). Extension of usable lifespan (e.g. by reselling or upgrading) is thus a promising approach to mitigating energy impacts as well as other environmental burdens associated with manufacturing and disposal.
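The lifespan-extension argument is simple amortization: annual burden equals production energy spread over the service life plus yearly operating energy. The 3-year lifespan and ~500 MJ/yr operating figure below are assumed values chosen to be consistent with the abstract's 2600 MJ/yr total and 81%/19% split, not numbers taken from the paper.

```python
def annual_energy(production_mj, lifespan_years, operation_mj_per_year):
    """Annual life cycle energy: amortized production plus operation."""
    return production_mj / lifespan_years + operation_mj_per_year

three_year = annual_energy(6400, 3, 500)  # ~2633 MJ/yr, production-dominated
six_year = annual_energy(6400, 6, 500)    # doubling the lifespan cuts ~40%
```

Because production dominates, stretching the service life attacks the largest term directly, which is why reselling or upgrading beats efficiency gains in operation for this product class.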

  12. Combined multi-spectrum and orthogonal Laplacianfaces for fast CB-XLCT imaging with single-view data

    NASA Astrophysics Data System (ADS)

    Zhang, Haibo; Geng, Guohua; Chen, Yanrong; Qu, Xuan; Zhao, Fengjun; Hou, Yuqing; Yi, Huangjian; He, Xiaowei

    2017-12-01

Cone-beam X-ray luminescence computed tomography (CB-XLCT) is an attractive hybrid imaging modality with the potential to monitor the metabolic processes of nanophosphor-based drugs in vivo. Single-view data reconstruction, a key issue in CB-XLCT imaging, enables the effective study of dynamic XLCT imaging, but it suffers from serious ill-posedness in the inverse problem. In this paper, a multi-spectrum strategy, based on the third-order simplified spherical harmonic approximation model, is adopted to alleviate the ill-posedness of the reconstruction. An orthogonal Laplacianfaces-based method is then proposed to reduce the large computational burden without degrading imaging quality. Both simulated data and in vivo experimental data were used to evaluate the efficiency and robustness of the proposed method. The results are satisfactory in terms of both localization and quantitative recovery at high computational efficiency, indicating that the proposed method is practical and promising for single-view CB-XLCT imaging.

  13. Morphometric synaptology of a whole neuron profile using a semiautomatic interactive computer system.

    PubMed

    Saito, K; Niki, K

    1983-07-01

We propose a new method for morphometric synaptology that processes all synapses and boutons around an HRP-marked neuron on a large composite electron micrograph, rather than a qualitative or piecemeal quantitative study of a particular synapse and/or bouton not positioned on the surface of the neuron. This approach requires the development of both neuroanatomical procedures, by which a specific whole neuronal profile is identified, and specialized tools that support the collection and analysis of a large volume of morphometric data from composite electron micrographs, in order to reduce the burden on the morphologist. The present report also describes the complete, reliable semi-automatic interactive computer system for gathering and analyzing morphometric data that has been under development in our laboratory. A morphologist performs the pattern-recognition portion using a large-format tablet digitizer and a menu-sheet command, and the system registers the various morphometric values of many different neurons and performs statistical analysis. Examples of morphometric measurements and analysis show the usefulness and efficiency of the proposed system and method.

  14. Some Experience with Interactive Computing in Teaching Introductory Statistics.

    ERIC Educational Resources Information Center

    Diegert, Carl

    Students in two biostatistics courses at the Cornell Medical College and in a course in applications of computer science given in Cornell's School of Industrial Engineering were given access to an interactive package of computer programs enabling them to perform statistical analysis without the burden of hand computation. After a general…

  15. Fast MPEG-CDVS Encoder With GPU-CPU Hybrid Computing

    NASA Astrophysics Data System (ADS)

    Duan, Ling-Yu; Sun, Wei; Zhang, Xinfeng; Wang, Shiqi; Chen, Jie; Yin, Jianxiong; See, Simon; Huang, Tiejun; Kot, Alex C.; Gao, Wen

    2018-05-01

The compact descriptors for visual search (CDVS) standard from the ISO/IEC Moving Picture Experts Group (MPEG) has succeeded in enabling interoperability for efficient and effective image retrieval by standardizing the bitstream syntax of compact feature descriptors. However, the intensive computation of the CDVS encoder unfortunately hinders its wide deployment in industry for large-scale visual search. In this paper, we revisit the merits of the low-complexity design of the CDVS core techniques and present a very fast CDVS encoder by leveraging the massive parallel execution resources of the GPU. We shift the computation-intensive and parallel-friendly modules to state-of-the-art GPU platforms, on which thread-block allocation and memory access are jointly optimized to eliminate performance loss. In addition, operations with heavy data dependence are allocated to the CPU to spare the GPU extra, unnecessary computational burden. Furthermore, we demonstrate that the proposed fast CDVS encoder works well with convolutional neural network approaches, which have already leveraged the advantages of GPU platforms, and yields significant performance improvements. Comprehensive experimental results over benchmarks show that the fast CDVS encoder using GPU-CPU hybrid computing is promising for scalable visual search.

  16. Fast inverse scattering solutions using the distorted Born iterative method and the multilevel fast multipole algorithm

    PubMed Central

    Hesford, Andrew J.; Chew, Weng C.

    2010-01-01

    The distorted Born iterative method (DBIM) computes iterative solutions to nonlinear inverse scattering problems through successive linear approximations. By decomposing the scattered field into a superposition of scattering by an inhomogeneous background and by a material perturbation, large or high-contrast variations in medium properties can be imaged through iterations that are each subject to the distorted Born approximation. However, the need to repeatedly compute forward solutions still imposes a very heavy computational burden. To ameliorate this problem, the multilevel fast multipole algorithm (MLFMA) has been applied as a forward solver within the DBIM. The MLFMA computes forward solutions in linear time for volumetric scatterers. The typically regular distribution and shape of scattering elements in the inverse scattering problem allow the method to take advantage of data redundancy and reduce the computational demands of the normally expensive MLFMA setup. Additional benefits are gained by employing Kaczmarz-like iterations, where partial measurements are used to accelerate convergence. Numerical results demonstrate both the efficiency of the forward solver and the successful application of the inverse method to imaging problems with dimensions in the neighborhood of ten wavelengths. PMID:20707438
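Stripped of the electromagnetic machinery, the DBIM loop is successive linearization: at each iteration the forward map is linearized about the current background estimate and the resulting linear problem is solved for a perturbation update. The toy `forward`/`jacobian` pair below stands in for the MLFMA-accelerated scattering solver and is purely illustrative.

```python
import numpy as np

def successive_linearization(measured, forward, jacobian, x0, n_iter=20):
    """DBIM-style loop: linearize about the current estimate, solve, update."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        residual = measured - forward(x)   # data misfit at current background
        J = jacobian(x)                    # forward model linearized about x
        dx, *_ = np.linalg.lstsq(J, residual, rcond=None)
        x = x + dx
    return x

# toy nonlinear forward model (illustrative, not a scattering operator)
forward = lambda x: np.array([x[0] ** 2 + x[1], x[1] ** 2])
jacobian = lambda x: np.array([[2.0 * x[0], 1.0], [0.0, 2.0 * x[1]]])
truth = np.array([1.0, 2.0])
estimate = successive_linearization(forward(truth), forward, jacobian, [0.5, 1.5])
```

The expensive step in the real method is evaluating `forward` and `jacobian`, which is exactly what the MLFMA accelerates to linear time per solve.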

  17. Numerical algorithms for scatter-to-attenuation reconstruction in PET: empirical comparison of convergence, acceleration, and the effect of subsets.

    PubMed

    Berker, Yannick; Karp, Joel S; Schulz, Volkmar

    2017-09-01

The use of scattered coincidences for attenuation correction of positron emission tomography (PET) data has recently been proposed. For practical applications, convergence speeds require further improvement, yet there exists a trade-off between convergence speed and the risk of non-convergence. In this respect, a maximum-likelihood gradient-ascent (MLGA) algorithm and a previously proposed two-branch back-projection (2BP) algorithm were evaluated. MLGA was combined with the Armijo step size rule and accelerated using conjugate gradients, Nesterov's momentum method, and data subsets of different sizes. In 2BP, we varied the subset size, an important determinant of convergence speed and computational burden. We used three sets of simulation data to evaluate the impact of a spatial scale factor. The Armijo step size rule allowed 10-fold larger step sizes than native MLGA. Conjugate gradients and Nesterov momentum led to slightly faster, yet non-uniform, convergence; improvements were mostly confined to later iterations, possibly due to the non-linearity of the problem. MLGA with data subsets achieved faster, uniform, and predictable convergence, with a speed-up factor equal to the number of subsets and no increase in computational burden. By contrast, the computational burden of 2BP increased linearly with the number of subsets due to repeated evaluation of the objective function, and convergence was limited to the case of many (and therefore small) subsets, which resulted in a high computational burden. Possibilities for improving 2BP appear limited. While general-purpose acceleration methods appear insufficient for MLGA, the results suggest that data subsets are a promising way to improve MLGA performance.
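The subset acceleration is the ordered-subsets idea: cycle through data subsets, taking one gradient step per subset, so an epoch performs as many updates as there are subsets for the cost of one full-data gradient pass. The quadratic toy objective below is an assumption for illustration, not the PET scatter likelihood.

```python
def subset_gradient_ascent(subset_grads, x0, step, n_epochs):
    """Cycle through per-subset gradients; each epoch = len(subset_grads) updates."""
    x = float(x0)
    for _ in range(n_epochs):
        for grad in subset_grads:
            x += step * grad(x)
    return x

# two subsets of the toy objective sum_i -(x - m_i)^2 / 2 with means 1.5 and 3.5;
# the full-data maximizer is their average, 2.5
grads = [lambda x: 1.5 - x, lambda x: 3.5 - x]
result = subset_gradient_ascent(grads, x0=0.0, step=0.1, n_epochs=200)
```

With a fixed step the iterate settles into a small limit cycle around the full-data optimum; a decaying step size removes the residual bias.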

  18. Estimation of Model's Marginal likelihood Using Adaptive Sparse Grid Surrogates in Bayesian Model Averaging

    NASA Astrophysics Data System (ADS)

    Zeng, X.

    2015-12-01

A large number of model executions are required to obtain alternative conceptual models' predictions and their posterior probabilities in Bayesian model averaging (BMA). The posterior model probability is estimated from each model's marginal likelihood and prior probability. This heavy computational burden hinders the implementation of BMA prediction, especially for elaborate marginal likelihood estimators. To overcome it, an adaptive sparse grid (SG) stochastic collocation method is used to build surrogates for alternative conceptual models in a numerical experiment on a synthetic groundwater model. Because BMA predictions depend on the posterior model weights (or marginal likelihoods), this study also evaluated four marginal likelihood estimators: the arithmetic mean estimator (AME), harmonic mean estimator (HME), stabilized harmonic mean estimator (SHME), and thermodynamic integration estimator (TIE). The results demonstrate that TIE is accurate in estimating conceptual models' marginal likelihoods, and BMA-TIE has better predictive performance than the other BMA predictions. TIE is also highly stable: repeated estimates of a conceptual model's marginal likelihood by TIE show significantly less variability than those of the other estimators. In addition, the SG surrogates efficiently facilitate BMA predictions, especially for BMA-TIE. The number of model executions needed to build the surrogates is 4.13%, 6.89%, 3.44%, and 0.43% of the model executions required by BMA-AME, BMA-HME, BMA-SHME, and BMA-TIE, respectively.
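Two of the estimators compared have one-line definitions: the arithmetic mean estimator averages likelihoods of parameter draws from the prior, while the harmonic mean estimator takes the harmonic mean of likelihoods of draws from the posterior (its notorious instability is what motivates the stabilized variant and TIE). A minimal sketch:

```python
import numpy as np

def arithmetic_mean_estimator(prior_likelihoods):
    """AME: p(D) ~= mean of p(D|theta_i) over theta_i drawn from the prior."""
    return float(np.mean(prior_likelihoods))

def harmonic_mean_estimator(posterior_likelihoods):
    """HME: p(D) ~= harmonic mean of p(D|theta_i) over posterior draws."""
    return float(1.0 / np.mean(1.0 / np.asarray(posterior_likelihoods)))
```

TIE instead integrates the expected log-likelihood along a path of power posteriors, which costs more model runs but is far more stable, consistent with the study's findings.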

  19. Accurate and efficient seismic data interpolation in the principal frequency wavenumber domain

    NASA Astrophysics Data System (ADS)

    Wang, Benfeng; Lu, Wenkai

    2017-12-01

Seismic data irregularity, caused by economic limitations, environmental constraints on acquisition, or bad-trace elimination, can degrade the performance of downstream multi-channel algorithms such as surface-related multiple elimination (SRME), even though some of them can partially tolerate irregularity. Accurate interpolation to provide the necessary complete data is therefore a prerequisite, but its wide application is constrained by the large computational burden for huge data volumes, especially in 3D exploration. For accurate and efficient interpolation, the curvelet transform (CT)-based projection onto convex sets (POCS) method in the principal frequency wavenumber (PFK) domain is introduced. The complex-valued principal-frequency components can characterize the original signal with high accuracy while being only about half its size, which provides a reasonable efficiency improvement. The irregularity of the observed data is transformed into incoherent noise in the PFK domain, and curvelet coefficients may be sparser when the CT is performed on PFK-domain data, enhancing interpolation accuracy. The performance of the POCS-based algorithms using the complex-valued CT in the time space (TX), principal frequency space, and PFK domains is compared. Numerical examples on synthetic and field data demonstrate the validity and effectiveness of the proposed method: with less computational burden, it achieves a better interpolation result, and it can easily be extended to higher dimensions.
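The POCS iteration alternates two projections: sparsify by thresholding in a transform domain, then restore the observed traces. The sketch below substitutes a 2-D FFT for the curvelet transform so it stays self-contained; the decaying-threshold schedule and the toy section are illustrative assumptions.

```python
import numpy as np

def pocs_interpolate(section, mask, n_iter=50, t_hi=0.9, t_lo=0.01):
    """POCS: hard-threshold in the FFT domain, then re-insert observed samples."""
    x = section * mask
    for k in range(n_iter):
        # exponentially decaying threshold, relative to the largest coefficient
        t = t_hi * (t_lo / t_hi) ** (k / max(n_iter - 1, 1))
        coeff = np.fft.fft2(x)
        coeff[np.abs(coeff) < t * np.abs(coeff).max()] = 0.0
        estimate = np.real(np.fft.ifft2(coeff))
        x = section * mask + estimate * (1.0 - mask)  # enforce data consistency
    return x

# toy "seismic section": a single plane wave with ~30% of traces dead
n = 32
section = np.cos(2 * np.pi * 2 * np.arange(n) / n)[:, None] * np.ones((1, n))
rng = np.random.default_rng(0)
mask = (rng.random((n, n)) < 0.7).astype(float)
reconstructed = pocs_interpolate(section, mask)
```

Running the same loop on half-size principal-frequency data is where the paper's efficiency gain comes from; the projection structure is unchanged.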

  20. Implementation of an Audio Computer-Assisted Self-Interview (ACASI) System in a General Medicine Clinic

    PubMed Central

    Deamant, C.; Smith, J.; Garcia, D.; Angulo, F.

    2015-01-01

Summary Background Routine implementation of instruments to capture patient-reported outcomes could guide clinical practice and facilitate health services research. Audio interviews facilitate self-interviews across literacy levels. Objectives To evaluate time burden for patients, and factors associated with response times for an audio computer-assisted self-interview (ACASI) system integrated into the clinical workflow. Methods We developed an ACASI system, integrated with a research data warehouse. Instruments for symptom burden, self-reported health, depression screening, tobacco use, and patient satisfaction were administered through touch-screen monitors in the general medicine clinic at the Cook County Health & Hospitals System during April 8, 2011-July 27, 2012. We performed a cross-sectional study to evaluate the mean time burden per item and for each module of instruments; we evaluated factors associated with longer response latency. Results Among 1,670 interviews, the mean per-question response time was 18.4 [SD, 6.1] seconds. By multivariable analysis, age was most strongly associated with prolonged response time and increased per decade compared to < 50 years as follows (additional seconds per question; 95% CI): 50–59 years (1.4; 0.7 to 2.1 seconds); 60–69 (3.4; 2.6 to 4.1); 70–79 (5.1; 4.0 to 6.1); and 80–89 (5.5; 4.1 to 7.0). Response times also were longer for Spanish language (3.9; 2.9 to 4.9); no home computer use (3.3; 2.8 to 3.9); and low mental self-reported health (0.6; 0.0 to 1.1). However, most interviews were completed within 10 minutes. Conclusions An ACASI software system can be included in a patient visit and adds minimal time burden. The burden was greatest for older patients, interviews in Spanish, and for those with less computer exposure. A patient's self-reported health had minimal impact on response times. PMID:25848420

  1. 77 FR 57194 - Proposed Collection; Comment Request for Form 13285-A

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-17

    ... Form 13285-A, Reducing Tax Burden on America's Taxpayers. DATES: Written comments should be received [email protected] . SUPPLEMENTARY INFORMATION: Title: Reducing Tax Burden on America's Taxpayers. OMB Number... taxpaying public's help to identify meaningful taxpayer burden reduction opportunities that impact a large...

  2. The economic burden of occupational non-melanoma skin cancer due to solar radiation.

    PubMed

    Mofidi, Amirabbas; Tompa, Emile; Spencer, James; Kalcevich, Christina; Peters, Cheryl E; Kim, Joanne; Song, Chaojie; Mortazavi, Seyed Bagher; Demers, Paul A

    2018-06-01

    Solar ultraviolet (UV) radiation is the second most prevalent carcinogenic exposure in Canada and is similarly important in other countries with large Caucasian populations. The objective of this article was to estimate the economic burden associated with newly diagnosed non-melanoma skin cancers (NMSCs) attributable to occupational solar radiation exposure. Key cost categories considered were direct costs (healthcare costs, out-of-pocket costs (OOPCs), and informal caregiver costs); indirect costs (productivity/output costs and home production costs); and intangible costs (monetary value of the loss of health-related quality of life (HRQoL)). To generate the burden estimates, we used secondary data from multiple sources applied to computational methods developed from an extensive review of the literature. An estimated 2,846 (5.3%) of the 53,696 newly diagnosed cases of basal cell carcinoma (BCC) and 1,710 (9.2%) of the 18,549 newly diagnosed cases of squamous cell carcinoma (SCC) in 2011 in Canada were attributable to occupational solar radiation exposure. The combined total for direct and indirect costs of occupational NMSC cases is $28.9 million ($15.9 million for BCC and $13.0 million for SCC), and for intangible costs is $5.7 million ($0.6 million for BCC and $5.1 million for SCC). On a per-case basis, the total costs are $5,670 for BCC and $10,555 for SCC. The higher per-case cost for SCC is largely a result of a lower survival rate, and hence higher indirect and intangible costs. Our estimates can be used to raise awareness of occupational solar UV exposure as an important causal factor in NMSCs and can highlight the importance of occupational BCC and SCC among other occupational cancers.

  3. Dynamic resource allocation scheme for distributed heterogeneous computer systems

    NASA Technical Reports Server (NTRS)

    Liu, Howard T. (Inventor); Silvester, John A. (Inventor)

    1991-01-01

This invention relates to resource allocation in computer systems and, more particularly, to a method and associated apparatus for shortening response time and improving the efficiency of a heterogeneous distributed networked computer system by reallocating jobs queued up for busy nodes to idle or less busy nodes. In accordance with the algorithm (SIDA for short), load sharing is initiated by the server device in a manner such that extra overhead is not imposed on the system during heavily loaded conditions. The algorithm employed in the present invention uses a dual-mode, server-initiated approach. Jobs are transferred from heavily burdened nodes (i.e., over a high threshold limit) to lightly burdened nodes at the initiation of the receiving node when: (1) a job finishes at a node that is burdened below a pre-established threshold level, or (2) a node has been idle for a period of time as established by a wakeup timer at the node. The invention uses a combination of the local queue length and the local service rate ratio at each node as the workload indicator.
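The dual-mode, receiver-initiated transfer rule can be sketched as a pull protocol: a node that finishes a job while below its threshold, or whose wakeup timer fires while idle, pulls a queued job from the most heavily burdened node above its threshold. The class and threshold value below are illustrative assumptions, not the patent's exact data structures.

```python
from collections import deque

class Node:
    def __init__(self, name, threshold=3):
        self.name = name
        self.threshold = threshold   # high-burden threshold from the algorithm
        self.queue = deque()         # jobs waiting at this node

    @property
    def load(self):
        return len(self.queue)

def maybe_pull(receiver, nodes):
    """Receiver-initiated transfer, triggered on job completion below
    threshold or on an idle wakeup timer; returns the moved job or None."""
    if receiver.load >= receiver.threshold:
        return None                           # receiver is not lightly loaded
    donor = max(nodes, key=lambda n: n.load)  # most heavily burdened node
    if donor is receiver or donor.load <= donor.threshold:
        return None                           # no node is over its threshold
    job = donor.queue.popleft()
    receiver.queue.append(job)
    return job

# demo: node "a" is overloaded, idle node "b" pulls one job
a, b = Node("a"), Node("b")
a.queue.extend(range(5))
moved = maybe_pull(b, [a, b])
```

Because only lightly loaded receivers initiate transfers, busy nodes pay no coordination cost, which is the point of the server-initiated design under heavy load.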

  4. Direct discriminant locality preserving projection with Hammerstein polynomial expansion.

    PubMed

    Chen, Xi; Zhang, Jiashu; Li, Defang

    2012-12-01

Discriminant locality preserving projection (DLPP) is a linear approach that encodes discriminant information into the objective of locality preserving projection and improves its classification ability. To enhance the nonlinear description ability of DLPP, its objective function can be optimized in a reproducing kernel Hilbert space, forming kernel-based discriminant locality preserving projection (KDLPP). However, KDLPP suffers from the following problems: 1) a large computational burden; 2) the absence of explicit mapping functions, which adds further computation when projecting a new sample into the low-dimensional subspace; and 3) an inability to obtain the optimal discriminant vectors that maximize the objective of DLPP. To overcome these weaknesses, this paper proposes a direct discriminant locality preserving projection with Hammerstein polynomial expansion (HPDDLPP). The proposed HPDDLPP directly implements the objective of DLPP in a high-dimensional second-order Hammerstein polynomial space without matrix inversion, extracting the optimal discriminant vectors for DLPP without a large computational burden. Compared with some other related classical methods, experimental results on face and palmprint recognition problems indicate the effectiveness of the proposed HPDDLPP.

  5. Estimated Global, Regional, and National Disease Burdens Related to Sugar-Sweetened Beverage Consumption in 2010.

    PubMed

    Singh, Gitanjali M; Micha, Renata; Khatibzadeh, Shahab; Lim, Stephen; Ezzati, Majid; Mozaffarian, Dariush

    2015-08-25

Sugar-sweetened beverages (SSBs) are consumed globally and contribute to adiposity. However, the worldwide impact of SSBs on burdens of adiposity-related cardiovascular diseases (CVDs), cancers, and diabetes mellitus has not been assessed by nation, age, and sex. We modeled global, regional, and national burdens of disease associated with SSB consumption by age/sex in 2010. Data on SSB consumption levels were pooled from national dietary surveys worldwide. The effects of SSB intake on body mass index and diabetes mellitus, and of elevated body mass index on CVD, diabetes mellitus, and cancers were derived from large prospective cohort pooling studies. Disease-specific mortality/morbidity data were obtained from the Global Burden of Diseases, Injuries, and Risk Factors 2010 Study. We computed cause-specific population-attributable fractions for SSB consumption, which were multiplied by cause-specific mortality/morbidity to compute estimates of SSB-attributable death/disability. Analyses were done by country/age/sex; uncertainties of all input data were propagated into final estimates. Worldwide, the model estimated 184 000 (95% uncertainty interval, 161 000-208 000) deaths/y attributable to SSB consumption: 133 000 (126 000-139 000) from diabetes mellitus, 45 000 (26 000-61 000) from CVD, and 6450 (4300-8600) from cancers. Five percent of SSB-related deaths occurred in low-income, 70.9% in middle-income, and 24.1% in high-income countries. Proportional mortality attributable to SSBs ranged from <1% in Japanese adults >65 years of age to 30% in Mexicans <45 years of age. Among the 20 most populous countries, Mexico had the largest absolute (405 deaths/million adults) and proportional (12.1%) deaths from SSBs. A total of 8.5 (2.8, 19.2) million disability-adjusted life years were related to SSB intake (4.5% of diabetes mellitus-related disability-adjusted life years). 
SSBs are a single, modifiable component of diet that can impact preventable death/disability in adults in high-, middle-, and low-income countries, indicating an urgent need for strong global prevention programs. © 2015 American Heart Association, Inc.

  6. Incidence and lifetime costs of injuries in the United States

    PubMed Central

    Corso, P; Finkelstein, E; Miller, T; Fiebelkorn, I; Zaloshnja, E

    2006-01-01

    Background Standardized methodologies for assessing economic burden of injury at the national or international level do not exist. Objective To measure national incidence, medical costs, and productivity losses of medically treated injuries using the most recent data available in the United States, as a case study for similarly developed countries undertaking economic burden analyses. Method The authors combined several data sets to estimate the incidence of fatal and non‐fatal injuries in 2000. They computed unit medical and productivity costs and multiplied these costs by corresponding incidence estimates to yield total lifetime costs of injuries occurring in 2000. Main outcome measures Incidence, medical costs, productivity losses, and total costs for injuries stratified by age group, sex, and mechanism. Results More than 50 million Americans experienced a medically treated injury in 2000, resulting in lifetime costs of $406 billion; $80 billion for medical treatment and $326 billion for lost productivity. Males had a 20% higher rate of injury than females. Injuries resulting from falls or being struck by/against an object accounted for more than 44% of injuries. The rate of medically treated injuries declined by 15% from 1985 to 2000 in the US. For those aged 0–44, the incidence rate of injuries declined by more than 20%; while persons aged 75 and older experienced a 20% increase. Conclusions These national burden estimates provide unequivocal evidence of the large health and financial burden of injuries. This study can serve as a template for other countries or be used in intercountry comparisons. PMID:16887941

  7. Association Between Traumatic Brain Injury-Related Brain Lesions and Long-term Caregiver Burden.

    PubMed

    Guevara, Andrea Brioschi; Demonet, Jean-Francois; Polejaeva, Elena; Knutson, Kristine M; Wassermann, Eric M; Grafman, Jordan; Krueger, Frank

    2016-01-01

To investigate the association between traumatic brain injury (TBI)-related brain lesions and long-term caregiver burden in relation to dysexecutive syndrome. National Institute of Neurological Disorders and Stroke, National Institutes of Health, Bethesda, Maryland. A total of 256 participants: 105 combat veterans with TBI, 23 healthy control combat veterans (HCv), and 128 caregivers. Caregiver burden was assessed with the Zarit Burden Interview at 40 years postinjury. Participants with penetrating TBI were compared with HCv on perceived caregiver burden and neuropsychological assessment measures. Computed tomographic scan data (overlay lesion maps of participants with penetrating TBI whose caregivers report a significantly high burden) were combined with behavioral statistical analyses to identify brain lesions associated with caregiver burden. Burden was greater in caregivers of veterans with TBI than in caregivers of HCv. Caregivers of participants with lesions affecting cognitive and behavioral indicators of dysexecutive syndrome (ie, the left dorsolateral prefrontal cortex and dorsal anterior cingulate cortex) showed greater long-term burden than caregivers of participants with lesions elsewhere in the brain. TBI-related brain lesions have a lasting effect on long-term caregiver burden through cognitive and behavioral factors associated with dysexecutive syndrome.

  8. Nephele: a cloud platform for simplified, standardized and reproducible microbiome data analysis.

    PubMed

    Weber, Nick; Liou, David; Dommer, Jennifer; MacMenamin, Philip; Quiñones, Mariam; Misner, Ian; Oler, Andrew J; Wan, Joe; Kim, Lewis; Coakley McCarthy, Meghan; Ezeji, Samuel; Noble, Karlynn; Hurt, Darrell E

    2018-04-15

    Widespread interest in the study of the microbiome has resulted in data proliferation and the development of powerful computational tools. However, many scientific researchers lack the time, training, or infrastructure to work with large datasets or to install and use command line tools. The National Institute of Allergy and Infectious Diseases (NIAID) has created Nephele, a cloud-based microbiome data analysis platform with standardized pipelines and a simple web interface for transforming raw data into biological insights. Nephele integrates common microbiome analysis tools as well as valuable reference datasets like the healthy human subjects cohort of the Human Microbiome Project (HMP). Nephele is built on the Amazon Web Services cloud, which provides centralized and automated storage and compute capacity, thereby reducing the burden on researchers and their institutions. https://nephele.niaid.nih.gov and https://github.com/niaid/Nephele. darrell.hurt@nih.gov.

  9. A microphysical parameterization of aqSOA and sulfate formation in clouds

    NASA Astrophysics Data System (ADS)

    McVay, Renee; Ervens, Barbara

    2017-07-01

    Sulfate and secondary organic aerosol (cloud aqSOA) can be chemically formed in cloud water. Model implementation of these processes represents a computational burden due to the large number of microphysical and chemical parameters. Chemical mechanisms have been condensed by reducing the number of chemical parameters. Here an alternative is presented to reduce the number of microphysical parameters (number of cloud droplet size classes). In-cloud mass formation is surface and volume dependent due to surface-limited oxidant uptake and/or size-dependent pH. Box and parcel model simulations show that using the effective cloud droplet diameter (proportional to total volume-to-surface ratio) reproduces sulfate and aqSOA formation rates within ≤30% as compared to full droplet distributions; other single diameters lead to much greater deviations. This single-class approach reduces computing time significantly and can be included in models when total liquid water content and effective diameter are available.
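    As an illustration of the single-diameter idea, the sketch below computes the effective diameter as the ratio of total droplet volume to total droplet surface, d_eff = sum(n*d^3) / sum(n*d^2), in line with the abstract's volume-to-surface definition. The five size classes and number concentrations are hypothetical, not the authors' data:

```python
import numpy as np

def effective_diameter(diameters_um, number_conc):
    """Effective diameter: total droplet volume over total droplet surface,
    expressed in diameter units (d_eff = sum(n*d^3) / sum(n*d^2))."""
    d = np.asarray(diameters_um, dtype=float)
    n = np.asarray(number_conc, dtype=float)
    return np.sum(n * d**3) / np.sum(n * d**2)

# Hypothetical 5-class droplet size distribution standing in for a full
# cloud spectrum (diameters in micrometres, concentrations in cm^-3).
d_classes = [4.0, 8.0, 12.0, 20.0, 30.0]
n_classes = [120.0, 180.0, 90.0, 25.0, 5.0]

d_eff = effective_diameter(d_classes, n_classes)
```

    A chemistry scheme run on this one diameter replaces per-class microphysics, which is where the computational saving described in the abstract comes from.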

  10. Nephele: a cloud platform for simplified, standardized and reproducible microbiome data analysis

    PubMed Central

    Weber, Nick; Liou, David; Dommer, Jennifer; MacMenamin, Philip; Quiñones, Mariam; Misner, Ian; Oler, Andrew J; Wan, Joe; Kim, Lewis; Coakley McCarthy, Meghan; Ezeji, Samuel; Noble, Karlynn; Hurt, Darrell E

    2018-01-01

    Motivation: Widespread interest in the study of the microbiome has resulted in data proliferation and the development of powerful computational tools. However, many scientific researchers lack the time, training, or infrastructure to work with large datasets or to install and use command line tools. Results: The National Institute of Allergy and Infectious Diseases (NIAID) has created Nephele, a cloud-based microbiome data analysis platform with standardized pipelines and a simple web interface for transforming raw data into biological insights. Nephele integrates common microbiome analysis tools as well as valuable reference datasets like the healthy human subjects cohort of the Human Microbiome Project (HMP). Nephele is built on the Amazon Web Services cloud, which provides centralized and automated storage and compute capacity, thereby reducing the burden on researchers and their institutions. Availability and implementation: https://nephele.niaid.nih.gov and https://github.com/niaid/Nephele. Contact: darrell.hurt@nih.gov. PMID: 29028892

  11. On the placement of active members in adaptive truss structures for vibration control

    NASA Technical Reports Server (NTRS)

    Lu, L.-Y.; Utku, S.; Wada, B. K.

    1992-01-01

    The problem of optimal placement of active members which are used for vibration control in adaptive truss structures is investigated. The control scheme is based on the method of eigenvalue assignment as a means of shaping the transient response of the controlled adaptive structures, and the minimization of required control action is considered as the optimization criterion. To this end, a performance index which measures the control strokes of active members is formulated in an efficient way. In order to reduce the computation burden, particularly for the case where the locations of active members have to be selected from a large set of available sites, several heuristic searching schemes are proposed for obtaining the near-optimal locations. The proposed schemes significantly reduce the computational complexity of placing multiple active members to the order of that when a single active member is placed.
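    The greedy selection idea can be sketched as follows. This is not the paper's performance index (which measures active-member control strokes under eigenvalue assignment); as a stand-in, the toy below greedily picks actuator sites that maximize the trace of a finite-horizon controllability Gramian, a common energy-based placement heuristic. All system matrices and candidate sites are made up:

```python
import numpy as np

def gramian_trace(A, B, horizon=20):
    """Trace of a finite-horizon discrete controllability Gramian
    W = sum_{i=0}^{h-1} A^i B B^T (A^T)^i; a larger trace loosely
    corresponds to less control effort."""
    W = np.zeros((A.shape[0], A.shape[0]))
    Ai = np.eye(A.shape[0])
    for _ in range(horizon):
        AiB = Ai @ B
        W += AiB @ AiB.T
        Ai = Ai @ A
    return np.trace(W)

def greedy_placement(A, candidates, k):
    """Greedily pick k actuator input vectors from `candidates`, adding at
    each round the site that most increases the placement metric."""
    chosen, remaining = [], list(range(len(candidates)))
    for _ in range(k):
        scores = []
        for j in remaining:
            B = np.column_stack([candidates[i] for i in chosen + [j]])
            scores.append((gramian_trace(A, B), j))
        best = max(scores)[1]
        chosen.append(best)
        remaining.remove(best)
    return chosen

rng = np.random.default_rng(0)
n, m, k = 6, 8, 3
A = 0.9 * np.linalg.qr(rng.standard_normal((n, n)))[0]  # stable dynamics
candidates = [rng.standard_normal(n) for _ in range(m)]  # candidate sites
sites = greedy_placement(A, candidates, k)
```

    Because the Gramian trace is additive across actuator columns, greedy selection happens to be exactly optimal for this particular metric; richer indices (such as the smallest Gramian eigenvalue, or the stroke-based index of the paper) make the combinatorial search genuinely hard, which is why heuristic schemes like those proposed are needed.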

  12. Decomposition and model selection for large contingency tables.

    PubMed

    Dahinden, Corinne; Kalisch, Markus; Bühlmann, Peter

    2010-04-01

    Large contingency tables summarizing categorical variables arise in many areas. One example is in biology, where large numbers of biomarkers are cross-tabulated according to their discrete expression level. Interactions of the variables are of great interest and are generally studied with log-linear models. The structure of a log-linear model can be visually represented by a graph from which the conditional independence structure can then be easily read off. However, since the number of parameters in a saturated model grows exponentially in the number of variables, this generally comes with a heavy computational burden. Even if we restrict ourselves to models of lower-order interactions or other sparse structures, we are faced with the problem of a large number of cells which play the role of sample size. This is in sharp contrast to high-dimensional regression or classification procedures because, in addition to a high-dimensional parameter, we also have to deal with the analogue of a huge sample size. Furthermore, high-dimensional tables naturally feature a large number of sampling zeros, which often leads to the nonexistence of the maximum likelihood estimate. We therefore present a decomposition approach, where we first divide the problem into several lower-dimensional problems and then combine these to form a global solution. Our methodology is computationally feasible for log-linear interaction models with many categorical variables, some or all of which may have many levels. We demonstrate the proposed method on simulated data and apply it to a biomedical problem in cancer research.

  13. Influence of the Quantity of Aortic Valve Calcium on the Agreement Between Automated 3-Dimensional Transesophageal Echocardiography and Multidetector Row Computed Tomography for Aortic Annulus Sizing.

    PubMed

    Podlesnikar, Tomaz; Prihadi, Edgard A; van Rosendael, Philippe J; Vollema, E Mara; van der Kley, Frank; de Weger, Arend; Ajmone Marsan, Nina; Naji, Franjo; Fras, Zlatko; Bax, Jeroen J; Delgado, Victoria

    2018-01-01

    Accurate aortic annulus sizing is key for selection of the appropriate transcatheter aortic valve implantation (TAVI) prosthesis size. The present study compared novel automated 3-dimensional (3D) transesophageal echocardiography (TEE) software and multidetector row computed tomography (MDCT) for aortic annulus sizing and investigated the influence of the quantity of aortic valve calcium (AVC) on the selection of TAVI prosthesis size. A total of 83 patients with severe aortic stenosis undergoing TAVI were evaluated. Maximal and minimal aortic annulus diameter, perimeter, and area were measured. AVC was assessed with computed tomography. The low and high AVC burden groups were defined according to the median AVC score. Overall, 3D TEE measurements slightly underestimated the aortic annulus dimensions as compared with MDCT (mean differences for maximum diameter, minimum diameter, perimeter, and area: -1.7 mm, 0.5 mm, -2.7 mm, and -13 mm², respectively). The agreement between 3D TEE and MDCT on aortic annulus dimensions was superior among patients with low AVC burden (<3,025 arbitrary units) compared with patients with high AVC burden (≥3,025 arbitrary units). The interobserver variability was excellent for both methods. 3D TEE and MDCT led to the same prosthesis size selection in 88%, 95%, and 81% of patients in the total population, the low, and the high AVC burden group, respectively. In conclusion, the novel automated 3D TEE imaging software allows accurate and highly reproducible measurements of the aortic annulus dimensions and shows excellent agreement with MDCT to determine the TAVI prosthesis size, particularly in patients with low AVC burden. Copyright © 2017 The Author(s). Published by Elsevier Inc. All rights reserved.

  14. Strategies for Large Scale Implementation of a Multiscale, Multiprocess Integrated Hydrologic Model

    NASA Astrophysics Data System (ADS)

    Kumar, M.; Duffy, C.

    2006-05-01

    Distributed models simulate hydrologic state variables in space and time while taking into account the heterogeneities in terrain, surface and subsurface properties, and meteorological forcings. The computational cost and complexity of these models increase with their tendency to accurately simulate a large number of interacting physical processes at fine spatio-temporal resolution in a large basin. A hydrologic model run on a coarse spatial discretization of the watershed with a limited number of physical processes carries a lighter computational load, but this negatively affects the accuracy of model results and restricts the physical realization of the problem. It is therefore imperative to have an integrated modeling strategy (a) which can be universally applied at various scales in order to study the tradeoffs between computational complexity (determined by spatio-temporal resolution), accuracy, and predictive uncertainty in relation to various approximations of physical processes; (b) which can be applied at adaptively different spatial scales in the same domain by taking into account the local heterogeneity of topography and hydrogeologic variables; and (c) which is flexible enough to incorporate different numbers and approximations of process equations depending on model purpose and computational constraints. An efficient implementation of this strategy becomes all the more important for the Great Salt Lake river basin, which is relatively large (~89,000 sq. km) and complex in terms of hydrologic and geomorphic conditions. The types and time scales of the hydrologic processes that dominate in different parts of the basin also differ. Part of the snowmelt runoff generated in the Uinta Mountains infiltrates and contributes as base flow to the Great Salt Lake over a time scale of decades to centuries. The adaptive strategy helps capture the steep topographic and climatic gradient along the Wasatch front.
Here we present the aforesaid modeling strategy along with an associated hydrologic modeling framework which facilitates a seamless, computationally efficient and accurate integration of the process model with the data model. The flexibility of this framework allows multiscale, multiresolution, adaptive refinement/de-refinement and nested modeling simulations with the least computational burden. However, performing these simulations, and calibrating these models, over a large basin at high spatio-temporal resolutions is computationally intensive and requires increasing computing power. With the advent of parallel processing architectures, high computing performance can be achieved by parallelizing the existing serial integrated-hydrologic-model code, i.e. running the same model simulation on a network of many processors and thereby reducing the time needed to obtain a solution. The paper also discusses the implementation of the integrated model on parallel processors, including the mapping of the problem onto a multiprocessor environment, methods to couple hydrologic processes through interprocessor communication, the model data structure, and the parallel numerical algorithms used to obtain high performance.

  15. Impacts of different characterizations of large-scale background on simulated regional-scale ozone over the continental United States

    EPA Science Inventory

    This study analyzes simulated regional-scale ozone burdens both near the surface and aloft, estimates process contributions to these burdens, and calculates the sensitivity of the simulated regional-scale ozone burden to several key model inputs with a particular emphasis on boun...

  16. Lack of communication and control: experiences of distance caregivers of parents with advanced cancer.

    PubMed

    Mazanec, Polly; Daly, Barbara J; Ferrell, Betty Rolling; Prince-Paul, Maryjo

    2011-05-01

    To explore the new and complex phenomenon of distance caregiving in the advanced cancer population. Qualitative. A large comprehensive cancer center in the midwestern region of the United States. 14 distance caregivers of parents with advanced cancer. Patients with advanced lung, gastrointestinal, and gynecologic malignancies consented to have their distance caregiving adult children contacted to participate in the study. Responses to three open-ended questions guided the tape-recorded telephone interviews with the distance caregivers. Following transcription, content analysis with inductive coding was performed. Two major themes, communication and control, and five subthemes, benefits and burdens of distance caregiving, dealing with uncertainty, direct action through information seeking, protecting, and staying connected, emerged from the data. Distance caregivers experience some of the same stressors that local caregivers of patients with cancer experience. In addition, they have unique psychosocial needs related to the burden of geographic distance. Distance caregivers could benefit from nursing interventions targeted at their unique needs. Innovative interventions using Web-based computer technology for improved communication, as well as supportive care interventions, may be helpful.

  17. WIRELESS Computing in Schools: Reach Out and Touch the World.

    ERIC Educational Resources Information Center

    Null, Linda; Teschner, Randy

    Many elementary and secondary schools tie in with local colleges and universities and use modems to access the computing power available at these higher education facilities. To help alleviate the financial burden of long-distance phone charges, work has begun on using the airways instead of phone lines for computer communication. An interest in…

  18. Advancing the efficiency and efficacy of patient reported outcomes with multivariate computer adaptive testing.

    PubMed

    Morris, Scott; Bass, Mike; Lee, Mirinae; Neapolitan, Richard E

    2017-09-01

    The Patient Reported Outcomes Measurement Information System (PROMIS) initiative developed an array of patient reported outcome (PRO) measures. To reduce the number of questions administered, PROMIS utilizes unidimensional item response theory and unidimensional computer adaptive testing (UCAT), which means a separate set of questions is administered for each measured trait. Multidimensional item response theory (MIRT) and multidimensional computer adaptive testing (MCAT) simultaneously assess correlated traits. The objective was to investigate the extent to which MCAT reduces patient burden relative to UCAT in the case of PROs. One MIRT and 3 unidimensional item response theory models were developed using the related traits anxiety, depression, and anger. Using these models, MCAT and UCAT performance was compared with simulated individuals. Surprisingly, the root mean squared error for both methods increased with the number of items. These results were driven by large errors for individuals with low trait levels. A second analysis focused on individuals aligned with item content. For these individuals, both MCAT and UCAT accuracies improved with additional items. Furthermore, MCAT reduced the test length by 50%. For the PROMIS Emotional Distress banks, neither UCAT nor MCAT provided accurate estimates for individuals at low trait levels. Because the items in these banks were designed to detect clinical levels of distress, there is little information for individuals with low trait values. However, trait estimates for individuals targeted by the banks were accurate and MCAT asked substantially fewer questions. By reducing the number of items administered, MCAT can allow clinicians and researchers to assess a wider range of PROs with less patient burden. © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com
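    The unidimensional CAT loop the abstract compares against can be sketched in a few lines. This is not the PROMIS implementation: the toy below uses a two-parameter logistic (2PL) model with an entirely made-up item bank, selects each item by maximum Fisher information at the current trait estimate, and updates the estimate with a grid-based posterior:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical 2PL item bank: discriminations a, difficulties b.
n_items = 60
a = rng.uniform(1.0, 2.5, n_items)
b = rng.uniform(-2.5, 2.5, n_items)

grid = np.linspace(-4, 4, 161)
prior = np.exp(-0.5 * grid**2)          # standard-normal prior on theta
prior /= prior.sum()

def p_correct(theta, i):
    return 1.0 / (1.0 + np.exp(-a[i] * (theta - b[i])))

theta_true = 0.8
posterior = prior.copy()
administered = []
for _ in range(15):
    theta_hat = np.sum(grid * posterior)           # current EAP estimate
    # Fisher information of each unseen item at theta_hat; administer the max.
    info = np.array([a[i]**2 * p_correct(theta_hat, i) * (1 - p_correct(theta_hat, i))
                     if i not in administered else -1.0 for i in range(n_items)])
    item = int(np.argmax(info))
    administered.append(item)
    # Simulated examinee response, then Bayes update on the grid.
    x = rng.random() < p_correct(theta_true, item)
    like = p_correct(grid, item) if x else 1 - p_correct(grid, item)
    posterior *= like
    posterior /= posterior.sum()

theta_hat = np.sum(grid * posterior)
se = np.sqrt(np.sum((grid - theta_hat)**2 * posterior))
```

    An MCAT generalizes this loop by tracking a joint posterior over several correlated traits, so a response to an anxiety item also sharpens the depression and anger estimates, which is where the reported reduction in test length comes from.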

  19. Genomics and privacy: implications of the new reality of closed data for the field.

    PubMed

    Greenbaum, Dov; Sboner, Andrea; Mu, Xinmeng Jasmine; Gerstein, Mark

    2011-12-01

    Open source and open data have been driving forces in bioinformatics in the past. However, privacy concerns may soon change the landscape, limiting future access to important data sets, including personal genomics data. Here we survey this situation in some detail, describing, in particular, how the large scale of the data from personal genomic sequencing makes it especially hard to share data, exacerbating the privacy problem. We also go over various aspects of genomic privacy: first, there is basic identifiability of subjects having their genome sequenced. However, even for individuals who have consented to be identified, there is the prospect of very detailed future characterization of their genotype, which, unanticipated at the time of their consent, may be more personal and invasive than the release of their medical records. We go over various computational strategies for dealing with the issue of genomic privacy. One can "slice" and reformat datasets to allow them to be partially shared while securing the most private variants. This is particularly applicable to functional genomics information, which can be largely processed without variant information. For handling the most private data there are a number of legal and technological approaches: for example, modifying the informed consent procedure to acknowledge that privacy cannot be guaranteed, and/or employing a secure cloud computing environment. Cloud computing in particular may allow access to the data in a more controlled fashion than the current practice of downloading and computing on large datasets. Furthermore, it may be particularly advantageous for small labs, given that the burden of many privacy issues falls disproportionately on them in comparison to large corporations and genome centers. Finally, we discuss how education of future genetics researchers will be important, with curricula emphasizing privacy and data security.
However, teaching personal genomics with identifiable subjects in the university setting will, in turn, create additional privacy issues and social conundrums. © 2011 Greenbaum et al.

  20. Researching and Reducing the Health Burden of Stroke

    MedlinePlus

    ... the result of continuing research to map the brain and interface it with a computer to enable stroke patients to regain function. How important is the new effort to map the human brain? The brain is more complex than any computer ...

  1. SU-F-BRD-09: A Random Walk Model Algorithm for Proton Dose Calculation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yao, W; Farr, J

    2015-06-15

    Purpose: To develop a random walk model algorithm for calculating proton dose with balanced computation burden and accuracy. Methods: A random walk (RW) model is sometimes referred to as a density Monte Carlo (MC) simulation. In MC proton dose calculation, the use of a Gaussian angular distribution of protons due to multiple Coulomb scatter (MCS) is convenient, but in RW the use of a Gaussian angular distribution requires extremely large computation and memory. Thus, our RW model adopts a spatial distribution derived from the angular one to accelerate the computation and to decrease the memory usage. From the physics and comparison with the MC simulations, we have determined and analytically expressed the critical variables affecting the dose accuracy in our RW model. Results: Besides variables such as MCS, stopping power, and the energy spectrum after energy absorption, which have been extensively discussed in the literature, the following variables were found to be critical in our RW model: (1) the inverse square law, which can significantly reduce the computation burden and memory; (2) the non-Gaussian spatial distribution after MCS; and (3) the mean direction of scatters at each voxel. In comparison to MC results, taken as reference, for a water phantom irradiated by mono-energetic proton beams from 75 MeV to 221.28 MeV, the gamma test pass rate was 100% for the 2%/2mm/10% criterion. For a highly heterogeneous phantom consisting of water embedded with a 10 cm cortical bone and a 10 cm lung in the Bragg peak region of the proton beam, the gamma test pass rate was greater than 98% for the 3%/3mm/10% criterion. Conclusion: We have determined the key variables in our RW model for proton dose calculation. Compared with commercial pencil beam algorithms, our RW model much improves the dose accuracy in heterogeneous regions, and is about 10 times faster than MC simulations.
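    The flavor of a random-walk dose calculation can be caricatured with a toy depth-dose simulation. The sketch below is not the authors' algorithm: it uses only the textbook Bragg-Kleeman range-energy relation (R = alpha*E^p, with alpha ≈ 0.0022 cm·MeV^-p and p ≈ 1.77 for protons in water) and a crude Gaussian step-length straggling, to show the Bragg peak forming near the end of range and why per-step sampling dominates the computational burden:

```python
import numpy as np

rng = np.random.default_rng(1)

ALPHA, P = 0.0022, 1.77           # Bragg-Kleeman constants, protons in water

def stopping_power(E):
    """dE/dz in MeV/cm implied by the range rule R = ALPHA * E**P."""
    return E**(1 - P) / (P * ALPHA)

E0 = 100.0                        # MeV; nominal range ~7.6 cm in water
dz = 0.02                         # cm nominal step
bins = np.zeros(600)              # depth-dose tally, 0.02 cm bins up to 12 cm

for _ in range(2000):             # transport 2000 protons
    E, z = E0, 0.0
    while E > 0.5:
        step = dz * (1 + 0.05 * rng.standard_normal())  # crude straggling
        dE = min(E, stopping_power(E) * step)           # energy lost in step
        idx = int(z / dz)
        if idx < bins.size:
            bins[idx] += dE                             # deposit locally
        E -= dE
        z += step

peak_depth = np.argmax(bins) * dz  # should sit near the Bragg-Kleeman range
```

    Because the stopping power grows as the proton slows, most of the energy lands in the last few bins, reproducing the Bragg peak; a production RW model additionally tracks lateral spatial spread from MCS, which this toy omits.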

  2. Program Aids Specification Of Multiple-Block Grids

    NASA Technical Reports Server (NTRS)

    Sorenson, R. L.; Mccann, K. M.

    1993-01-01

    3DPREP computer program aids specification of multiple-block computational grids. Highly interactive graphical preprocessing program designed for use on a powerful graphical scientific computer workstation. Divided into three main parts, each corresponding to a principal graphical-and-alphanumeric display. Relieves the user of some of the burden of collecting and formatting the many data needed to specify blocks and grids, and prepares input data for NASA's 3DGRAPE grid-generating computer program.

  3. Diverse power iteration embeddings: Theory and practice

    DOE PAGES

    Huang, Hao; Yoo, Shinjae; Yu, Dantong; ...

    2015-11-09

    Manifold learning, especially spectral embedding, is known as one of the most effective learning approaches for high dimensional data, but in real-world applications constructing spectral embeddings for large datasets imposes a serious computational burden. To overcome this computational complexity, we propose a novel efficient embedding construction, Diverse Power Iteration Embedding (DPIE). DPIE shows almost the same effectiveness as spectral embeddings and yet is three orders of magnitude faster than spectral embeddings computed from eigen-decomposition. Our DPIE is unique in that (1) it finds linearly independent embeddings and thus shows diverse aspects of the dataset; (2) the proposed regularized DPIE is effective when many embeddings are needed; (3) we show how to efficiently orthogonalize DPIE if needed; and (4) the Diverse Power Iteration Value (DPIV) provides the importance of each DPIE like an eigenvalue. As a result, these various aspects of DPIE and DPIV make our algorithm easy to apply to various applications, and we also show the effectiveness and efficiency of DPIE on clustering, anomaly detection, and feature selection as case studies.
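    DPIE builds on power-iteration embeddings; the minimal sketch below (in the spirit of power iteration clustering, not the DPIE algorithm itself, and with made-up data) shows the core trick: iterating v ← Wv with a row-normalized affinity matrix makes the vector locally constant within clusters long before it converges globally, so a truncated iteration already acts as a cheap one-dimensional embedding:

```python
import numpy as np

rng = np.random.default_rng(7)

# Two well-separated 1-D Gaussian blobs (30 points each).
X = np.concatenate([rng.normal(0.0, 0.5, 30), rng.normal(10.0, 0.5, 30)])

# RBF affinity and its row-stochastic normalization W = D^-1 A.
A = np.exp(-(X[:, None] - X[None, :])**2 / (2 * 1.0**2))
W = A / A.sum(axis=1, keepdims=True)

# Power iteration: entries flatten within each cluster long before the
# vector converges globally to a constant.
v = rng.random(60)
for _ in range(500):
    v = W @ v
    v = v / np.abs(v).sum()       # keep the scale fixed

within_std = max(v[:30].std(), v[30:].std())  # per-cluster flatness
```

    Each power iteration costs one sparse matrix-vector product instead of a full eigen-decomposition, which is the source of the speedup the abstract reports; DPIE then adds mechanisms to keep multiple such embeddings linearly independent.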

  4. Fast MPEG-CDVS Encoder With GPU-CPU Hybrid Computing.

    PubMed

    Duan, Ling-Yu; Sun, Wei; Zhang, Xinfeng; Wang, Shiqi; Chen, Jie; Yin, Jianxiong; See, Simon; Huang, Tiejun; Kot, Alex C; Gao, Wen

    2018-05-01

    The compact descriptors for visual search (CDVS) standard from the ISO/IEC Moving Picture Experts Group has succeeded in enabling interoperability for efficient and effective image retrieval by standardizing the bitstream syntax of compact feature descriptors. However, the intensive computation of a CDVS encoder unfortunately hinders its wide deployment in industry for large-scale visual search. In this paper, we revisit the merits of the low complexity design of CDVS core techniques and present a very fast CDVS encoder by leveraging the massive parallel execution resources of the graphics processing unit (GPU). We elegantly shift the computation-intensive and parallel-friendly modules to state-of-the-art GPU platforms, in which the thread block allocation as well as the memory access mechanism are jointly optimized to eliminate performance loss. In addition, operations with heavy data dependence are allocated to the CPU to relieve the GPU of extra, unnecessary computation. Furthermore, we demonstrate that the proposed fast CDVS encoder works well with convolutional neural network approaches, leveraging the advantages of GPU platforms harmoniously and yielding significant performance improvements. Comprehensive experimental results over benchmarks show that the fast CDVS encoder using GPU-CPU hybrid computing is promising for scalable visual search.

  5. Software Engineering Support of the Third Round of Scientific Grand Challenge Investigations: Earth System Modeling Software Framework Survey

    NASA Technical Reports Server (NTRS)

    Talbot, Bryan; Zhou, Shu-Jia; Higgins, Glenn; Zukor, Dorothy (Technical Monitor)

    2002-01-01

    One of the most significant challenges in large-scale climate modeling, as well as in high-performance computing in other scientific fields, is that of effectively integrating many software models from multiple contributors. A software framework facilitates the integration task, both in the development and runtime stages of the simulation. Effective software frameworks reduce the programming burden for the investigators, freeing them to focus more on the science and less on the parallel communication implementation, while maintaining high performance across numerous supercomputer and workstation architectures. This document surveys numerous software frameworks for potential use in Earth science modeling. Several frameworks are evaluated in depth, including Parallel Object-Oriented Methods and Applications (POOMA), Cactus (from the relativistic physics community), Overture, Goddard Earth Modeling System (GEMS), the National Center for Atmospheric Research Flux Coupler, and UCLA/UCB Distributed Data Broker (DDB). Frameworks evaluated in less detail include ROOT, Parallel Application Workspace (PAWS), and Advanced Large-Scale Integrated Computational Environment (ALICE). A host of other frameworks and related tools are referenced in this context. The frameworks are evaluated individually and also compared with each other.

  6. Machine learning from computer simulations with applications in rail vehicle dynamics

    NASA Astrophysics Data System (ADS)

    Taheri, Mehdi; Ahmadian, Mehdi

    2016-05-01

    The application of stochastic modelling for learning the behaviour of multibody dynamics (MBD) models is investigated. Post-processing data from a simulation run are used to train the stochastic model that estimates the relationship between model inputs (suspension relative displacement and velocity) and the output (sum of suspension forces). The stochastic model can be used to reduce the computational burden of the MBD model by replacing a computationally expensive subsystem in the model (the suspension subsystem). With minor changes, the stochastic modelling technique is able to learn the behaviour of a physical system and integrate its behaviour within MBD models. The technique is highly advantageous for MBD models where real-time simulations are necessary, or for models that have a large number of repeated substructures, e.g. modelling a train with a large number of railcars. The fact that the training data are acquired prior to the development of the stochastic model precludes conventional sampling plan strategies, such as Latin Hypercube sampling plans, where simulations are performed using the inputs dictated by the sampling plan. Since the sampling plan greatly influences the overall accuracy and efficiency of the stochastic predictions, a sampling plan suitable for the process is developed in which the most space-filling subset of the acquired data with ? sample points that best describes the dynamic behaviour of the system under study is selected as the training data.
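    A simplified stand-in for the surrogate idea: the paper trains a stochastic model on post-processed MBD output, whereas the sketch below fits a plain least-squares surrogate mapping suspension relative displacement and velocity to total suspension force, on synthetic spring-damper data with made-up parameters:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic "simulation output": a nonlinear spring-damper suspension force
# F = k*x + k3*x^3 + c*v (hypothetical parameters standing in for MBD data).
k, k3, c = 5.0e4, 2.0e6, 3.0e3
x = rng.uniform(-0.05, 0.05, 500)       # relative displacement (m)
v = rng.uniform(-1.0, 1.0, 500)         # relative velocity (m/s)
F = k * x + k3 * x**3 + c * v

# Surrogate: a linear-in-parameters basis fitted by least squares.
Phi = np.column_stack([x, x**3, v])
coef, *_ = np.linalg.lstsq(Phi, F, rcond=None)

# The cheap surrogate can now stand in for the expensive subsystem.
x_new, v_new = 0.02, -0.3
F_pred = coef @ np.array([x_new, x_new**3, v_new])
```

    Because the training data come from an earlier simulation run rather than a designed experiment, the paper's space-filling subset selection replaces the Latin Hypercube step that would normally dictate where these (x, v) samples lie.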

  7. Estimating Skin Cancer Risk: Evaluating Mobile Computer-Adaptive Testing.

    PubMed

    Djaja, Ngadiman; Janda, Monika; Olsen, Catherine M; Whiteman, David C; Chien, Tsair-Wei

    2016-01-22

    Response burden is a major detriment to questionnaire completion rates. Computer adaptive testing may offer advantages over non-adaptive testing, including a reduction in the number of items required for precise measurement. Our aim was to compare the efficiency of non-adaptive testing (NAT) and computer adaptive testing (CAT) facilitated by Partial Credit Model (PCM)-derived calibration to estimate skin cancer risk. We used a random sample from a population-based Australian cohort study of skin cancer risk (N=43,794). All 30 items of the skin cancer risk scale were calibrated with the Rasch PCM. A total of 1000 cases generated following a normal distribution (mean [SD] 0 [1]) were simulated using three Rasch models with three fixed-item (dichotomous, rating scale, and partial credit) scenarios, respectively. We calculated the comparative efficiency and precision of CAT and NAT (shortening of questionnaire length and a count difference number ratio less than 5%, using independent t tests). We found that use of CAT led to a smaller person standard error of the estimated measure than NAT, with substantially higher efficiency but no loss of precision, reducing response burden by 48%, 66%, and 66% for the dichotomous, Rating Scale Model, and PCM models, respectively. CAT-based administrations of the skin cancer risk scale could substantially reduce participant burden without compromising measurement precision. A mobile computer adaptive test was developed to help people efficiently assess their skin cancer risk.
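    The Rasch partial credit model used for the calibration assigns category probabilities from cumulative step difficulties; a minimal sketch with hypothetical thresholds (not the study's calibrated item parameters):

```python
import numpy as np

def pcm_probs(theta, deltas):
    """Rasch partial credit model: P(X = x | theta) for x = 0..m, where
    deltas are the m step difficulties (threshold parameters)."""
    # Cumulative numerators exp(sum_{j<=x} (theta - delta_j)); the empty
    # sum for x = 0 gives exp(0) = 1.
    steps = np.concatenate([[0.0], np.cumsum(theta - np.asarray(deltas))])
    num = np.exp(steps - steps.max())          # numerically stabilized
    return num / num.sum()

# Hypothetical 4-category item (3 step difficulties).
deltas = [-1.0, 0.0, 1.5]
p_low = pcm_probs(-2.0, deltas)
p_high = pcm_probs(2.0, deltas)
expected_low = np.dot(np.arange(4), p_low)     # expected item score
expected_high = np.dot(np.arange(4), p_high)
```

    A PCM-based CAT then administers whichever item's category probabilities are most informative at the respondent's current trait estimate, which is how the reported 48-66% reduction in items becomes possible.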

  8. Optical computing using optical flip-flops in Fourier processors: use in matrix multiplication and discrete linear transforms.

    PubMed

    Ando, S; Sekine, S; Mita, M; Katsuo, S

    1989-12-15

    An architecture and algorithms for matrix multiplication using optical flip-flops (OFFs) in optical processors are proposed, based on residue arithmetic. The proposed system is capable of processing all elements of the matrices in parallel, utilizing the information retrieving ability of optical Fourier processors. The employment of OFFs enables bidirectional data flow, leading to a simpler architecture, and the contribution of residue-to-decimal (or residue-to-binary) conversion to operation time can be largely reduced by processing all elements in parallel. The calculated characteristics of operation time suggest a promising use of the system in real-time 2-D linear transforms.
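    The residue-arithmetic principle behind such architectures can be shown numerically: each modulus channel computes the matrix product independently (the part an optical processor carries out in parallel), and the channels are recombined by the Chinese remainder theorem. The moduli and matrices below are arbitrary illustrations, not the paper's design values:

```python
import numpy as np
from math import prod

MODULI = [251, 255, 256]           # pairwise coprime moduli
M = prod(MODULI)

def crt(residues):
    """Recombine per-modulus residues into the unique value mod M."""
    x = 0
    for r, m in zip(residues, MODULI):
        Mi = M // m
        x += r * Mi * pow(Mi, -1, m)   # modular inverse of Mi mod m
    return x % M

rng = np.random.default_rng(5)
A = rng.integers(0, 10, (4, 4))
B = rng.integers(0, 10, (4, 4))

# One independent (parallelizable) matrix product per modulus channel.
C_res = [((A % m) @ (B % m)) % m for m in MODULI]

# Element-wise CRT reconstruction; exact because entries of A@B are < M.
C = np.array([[crt([int(Cr[i, j]) for Cr in C_res]) for j in range(4)]
              for i in range(4)])
```

    The residue channels never carry, which is what makes them attractive for parallel optical hardware; the CRT step is the residue-to-decimal conversion whose relative cost the abstract argues is amortized over all matrix elements.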

  9. Indicators for the automated analysis of drug prescribing quality.

    PubMed

    Coste, J; Séné, B; Milstein, C; Bouée, S; Venot, A

    1998-01-01

    Irrational and inconsistent drug prescribing has a considerable impact on morbidity, mortality, health service utilization, and community burden. However, few studies have addressed the methodology of processing the information contained in drug orders to study the quality of drug prescriptions and prescriber behavior. We present a comprehensive set of quantitative indicators for the quality of drug prescriptions that can be derived from a drug order. These indicators were constructed using explicit a priori criteria that were previously validated on the basis of scientific data. Automatic computation is straightforward using a relational database system, such that large sets of prescriptions can be processed with minimal human effort. We illustrate the feasibility and value of this approach using a large set of 23,000 prescriptions for several diseases, selected from a nationally representative prescriptions database. Our study may find direct and wide application in the epidemiology of medical practice and in quality control procedures.
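    The kind of automated indicator computation described can be sketched against a relational database. The schema and the indicator below (share of prescriptions exceeding a reference maximum daily dose) are hypothetical illustrations, not the authors' validated criteria:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE drug_reference (drug TEXT PRIMARY KEY, max_daily_dose REAL);
CREATE TABLE prescription (rx_id INTEGER, drug TEXT, daily_dose REAL);

INSERT INTO drug_reference VALUES ('amoxicillin', 3000), ('ibuprofen', 2400);
INSERT INTO prescription VALUES
  (1, 'amoxicillin', 1500), (2, 'amoxicillin', 3500),
  (3, 'ibuprofen', 1200),   (4, 'ibuprofen', 3000), (5, 'ibuprofen', 2400);
""")

-- = None  # (placeholder removed)
```
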

  10. Rationale and design of dal-PLAQUE: A study assessing efficacy and safety of dalcetrapib on progression or regression of atherosclerosis using magnetic resonance imaging and 18F-fluorodeoxyglucose positron emission tomography/computed tomography

    PubMed Central

    Fayad, Zahi A.; Mani, Venkatesh; Woodward, Mark; Kallend, David; Bansilal, Sameer; Pozza, Joseph; Burgess, Tracy; Fuster, Valentin; Rudd, James H. F.; Tawakol, Ahmed; Farkouh, Michael E.

    2014-01-01

    dal-PLAQUE is a placebo-controlled multicenter study designed to assess the effect of dalcetrapib on imaging measures of plaque inflammation and plaque burden. dal-PLAQUE is a multimodality imaging study in the context of the large dal-HEART Program. Decreased high-density lipoprotein cholesterol is linked to increased risk of coronary heart disease (CHD). Dalcetrapib, a compound that increases high-density lipoprotein cholesterol by modulating cholesteryl ester transfer protein, is being studied to assess if it can reduce the progression of atherosclerotic disease and thereby decrease cardiovascular morbidity and mortality. Patients with CHD or CHD-risk equivalents were randomized to receive 600 mg dalcetrapib or placebo daily for 24 months, in addition to conventional lipid-lowering medication and other medications for cardiovascular risk factors. The primary outcomes are the effect of dalcetrapib on 18F-fluorodeoxyglucose positron emission tomography target-to-background ratio after 6 months and magnetic resonance imaging (MRI) plaque burden (wall area, wall thickness, total vessel area, and wall area/total vessel area ratio) after 12 months. Secondary objectives include positron emission tomography target-to-background ratio at 3 months and MRI plaque burden at 6 and 24 months; plaque composition at 6, 12, and 24 months; and aortic compliance at 6 months. A tertiary objective is to examine the dynamic contrast-enhanced MRI parameters of plaque neovascularization. In total, 189 subjects entered screening, and 130 were randomized. dal-PLAQUE will provide important information on the effects of dalcetrapib on markers of inflammation and atherosclerotic plaque burden and, thereby, on the safety of cholesteryl ester transfer protein modulation with dalcetrapib. Results are expected in 2011. PMID:21835280

  11. Economic Burden of Heart Failure: Investigating Outpatient and Inpatient Costs in Abeokuta, Southwest Nigeria

    PubMed Central

    Ogah, Okechukwu S.; Stewart, Simon; Onwujekwe, Obinna E.; Falase, Ayodele O.; Adebayo, Saheed O.; Olunuga, Taiwo; Sliwa, Karen

    2014-01-01

    Background: Heart failure (HF) is a deadly, disabling and often costly syndrome world-wide. Unfortunately, there is a paucity of data describing its economic impact in sub-Saharan Africa; a region in which the number of relatively younger cases will inevitably rise. Methods: Health economic data were extracted from a prospective HF registry in a tertiary hospital situated in Abeokuta, southwest Nigeria. Outpatient and inpatient costs were computed from a representative cohort of 239 HF cases, including personnel, diagnostic and treatment resources used for their management over a 12-month period. Indirect costs were also calculated. The annual cost per person was then calculated. Results: Mean age of the cohort was 58.0±15.1 years and 53.1% were men. The total computed cost of HF care in Abeokuta was 76,288,845 Nigerian Naira (US$508,595), translating to 319,200 Naira (US$2,128) per patient per year. The total cost of in-patient care (46% of total health care expenditure) was estimated as 34,996,477 Naira (about 301,230 US dollars), comprising 17,899,977 Naira (50.9%; US$114,600) in direct costs and 17,806,500 Naira (49.1%; US$118,710) in indirect costs. Out-patient cost was estimated as 41,292,368 Naira (US$275,282). The relatively high cost of outpatient care was largely due to the cost of transportation for monthly follow-up visits. Payments were mostly made through out-of-pocket spending. Conclusion: The economic burden of HF in Nigeria is particularly high considering the relatively young age of affected cases, a minimum wage of 18,000 Naira (US$120) per month and the considerable component of out-of-pocket spending for those affected. Health reforms designed to mitigate the individual and societal burden imposed by the syndrome are required. PMID:25415310

  12. Economic burden of heart failure: investigating outpatient and inpatient costs in Abeokuta, Southwest Nigeria.

    PubMed

    Ogah, Okechukwu S; Stewart, Simon; Onwujekwe, Obinna E; Falase, Ayodele O; Adebayo, Saheed O; Olunuga, Taiwo; Sliwa, Karen

    2014-01-01

    Heart failure (HF) is a deadly, disabling and often costly syndrome world-wide. Unfortunately, there is a paucity of data describing its economic impact in sub-Saharan Africa; a region in which the number of relatively younger cases will inevitably rise. Health economic data were extracted from a prospective HF registry in a tertiary hospital situated in Abeokuta, southwest Nigeria. Outpatient and inpatient costs were computed from a representative cohort of 239 HF cases, including personnel, diagnostic and treatment resources used for their management over a 12-month period. Indirect costs were also calculated. The annual cost per person was then calculated. Mean age of the cohort was 58.0 ± 15.1 years and 53.1% were men. The total computed cost of HF care in Abeokuta was 76,288,845 Nigerian Naira (US$508,595), translating to 319,200 Naira (US$2,128) per patient per year. The total cost of in-patient care (46% of total health care expenditure) was estimated as 34,996,477 Naira (about 301,230 US dollars), comprising 17,899,977 Naira (50.9%; US$114,600) in direct costs and 17,806,500 Naira (49.1%; US$118,710) in indirect costs. Out-patient cost was estimated as 41,292,368 Naira (US$275,282). The relatively high cost of outpatient care was largely due to the cost of transportation for monthly follow-up visits. Payments were mostly made through out-of-pocket spending. The economic burden of HF in Nigeria is particularly high considering the relatively young age of affected cases, a minimum wage of 18,000 Naira (US$120) per month and the considerable component of out-of-pocket spending for those affected. Health reforms designed to mitigate the individual and societal burden imposed by the syndrome are required.

  13. Saving Energy and Money: A Lesson in Computer Power Management

    ERIC Educational Resources Information Center

    Lazaros, Edward J.; Hua, David

    2012-01-01

    In this activity, students will develop an understanding of the economic impact of technology by estimating the cost savings of power management strategies in the classroom. Students will learn how to adjust computer display settings to influence the impact that the computer has on the financial burden to the school. They will use mathematics to…

  14. 75 FR 70899 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-19

    ... submit to the Office of Management and Budget (OMB) for clearance the following proposal for collection... Annual Burden Hours: 2,952. Public Computer Center Reports (Quarterly and Annually) Number of Respondents... specific to Infrastructure and Comprehensive Community Infrastructure, Public Computer Center, and...

  15. Lessons learnt on the analysis of large sequence data in animal genomics.

    PubMed

    Biscarini, F; Cozzi, P; Orozco-Ter Wengel, P

    2018-04-06

    The 'omics revolution has made a large amount of sequence data available to researchers and the industry. This has had a profound impact in the field of bioinformatics, stimulating unprecedented advancements in this discipline. This is usually looked at from the perspective of human 'omics, in particular human genomics. Plant and animal genomics, however, have also been deeply influenced by next-generation sequencing technologies, with several genomics applications now popular among researchers and the breeding industry. Genomics tends to generate huge amounts of data, and genomic sequence data account for an increasing proportion of big data in biological sciences, due largely to decreasing sequencing and genotyping costs and to large-scale sequencing and resequencing projects. The analysis of big data poses a challenge to scientists, as data gathering currently takes place at a faster pace than data processing and analysis, and the associated computational burden is increasingly taxing, making even simple manipulation, visualization and transfer of data a cumbersome operation. The time consumed by processing and analysing huge data sets may come at the expense of data quality assessment and critical interpretation. Additionally, when analysing lots of data, something is likely to go awry (the software may crash or stop), and it can be very frustrating to track the error. We herein review the most relevant issues related to tackling these challenges and problems, from the perspective of animal genomics, and provide researchers who lack extensive computing experience with guidelines that will help when processing large genomic data sets. © 2018 Stichting International Foundation for Animal Genetics.

  16. SAChES: Scalable Adaptive Chain-Ensemble Sampling.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Swiler, Laura Painton; Ray, Jaideep; Ebeida, Mohamed Salah

    We present the development of a parallel Markov Chain Monte Carlo (MCMC) method called SAChES, Scalable Adaptive Chain-Ensemble Sampling. This capability is targeted to Bayesian calibration of computationally expensive simulation models. SAChES involves a hybrid of two methods: Differential Evolution Monte Carlo followed by Adaptive Metropolis. Both methods involve parallel chains. Differential evolution allows one to explore high-dimensional parameter spaces using loosely coupled (i.e., largely asynchronous) chains. Loose coupling allows the use of large chain ensembles, with far more chains than the number of parameters to explore. This reduces the per-chain sampling burden and enables high-dimensional inversions and the use of computationally expensive forward models. The large number of chains can also ameliorate the impact of silent errors, which may affect only a few chains. The chain ensemble can also be sampled to provide an initial condition when an aberrant chain is re-spawned. Adaptive Metropolis takes the best points from the differential evolution and efficiently hones in on the posterior density. The multitude of chains in SAChES is leveraged to (1) enable efficient exploration of the parameter space; and (2) ensure robustness to silent errors, which may be unavoidable in extreme-scale computational platforms of the future. This report outlines SAChES, describes four papers that are the result of the project, and discusses some additional results.
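
    The core of the Differential Evolution Monte Carlo stage can be illustrated with a short sketch. This is not SAChES's actual implementation (its chains run asynchronously in parallel); it is a minimal serial Python sketch of one DE-MC sweep, and all function and parameter names here are assumptions:

```python
import numpy as np

def demc_step(chains, log_post, gamma=None, eps=1e-6, rng=None):
    """One Differential Evolution Monte Carlo sweep over a chain ensemble.

    Each chain proposes a jump along the difference of two other randomly
    chosen chains, so the proposal adapts to the posterior's scale without
    a hand-tuned covariance.  (Illustrative sketch, not SAChES's API.)
    """
    rng = np.random.default_rng() if rng is None else rng
    n, d = chains.shape
    gamma = 2.38 / np.sqrt(2 * d) if gamma is None else gamma  # standard DE-MC scaling
    out = chains.copy()
    for i in range(n):
        others = [j for j in range(n) if j != i]
        r1, r2 = rng.choice(others, size=2, replace=False)
        prop = chains[i] + gamma * (chains[r1] - chains[r2]) + eps * rng.standard_normal(d)
        # Metropolis accept/reject keeps the target posterior invariant
        if np.log(rng.uniform()) < log_post(prop) - log_post(chains[i]):
            out[i] = prop
    return out

# Usage: sample a standard 2-D Gaussian with an ensemble of 20 chains
rng = np.random.default_rng(0)
log_post = lambda x: -0.5 * float(x @ x)
ensemble = 3.0 * rng.standard_normal((20, 2))
for _ in range(500):
    ensemble = demc_step(ensemble, log_post, rng=rng)
```

    Because each proposal is built from differences of other chains' states, the sampler needs no tuned proposal covariance, which is what permits the large, loosely coupled ensembles described above.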

  17. State of inequality in malaria intervention coverage in sub-Saharan African countries.

    PubMed

    Galactionova, Katya; Smith, Thomas A; de Savigny, Don; Penny, Melissa A

    2017-10-18

    Scale-up of malaria interventions over the last decade has yielded a significant reduction in malaria transmission and disease burden in sub-Saharan Africa. We estimated economic gradients in the distribution of these efforts and of their impacts within and across endemic countries. Using Demographic and Health Surveys, we computed equity metrics to characterize the distribution of malaria interventions in 30 endemic countries, proxying economic position with an asset-wealth index. Gradients were summarized in a concentration index, tabulated against level of coverage, and compared among interventions, across countries, and against respective trends over the period 2005-2015. There remain broad differences in coverage of malaria interventions and their distribution by wealth within and across countries. In most, economic gradients are lacking or favor the poorest for vector control; malaria services delivered through the formal healthcare sector are much less equitable. Scale-up of interventions in many countries improved access across the wealth continuum; in some, these efforts consistently prioritized the poorest. Expansions in control programs generally narrowed coverage gaps between economic strata; gradients persist in countries where growth was slower in the poorest quintile or where baseline inequality was large. Despite progress, malaria is consistently concentrated in the poorest, with the degree of inequality in burden far surpassing that expected given gradients in the distribution of interventions. Economic gradients in the distribution of interventions persist over time, limiting progress toward equity in malaria control. We found that, in countries with large baseline inequality in the distribution of interventions, even a small bias in expansion favoring the least poor yielded large gradients in intervention coverage, while pro-poor growth failed to close the gap between the poorest and least poor.
We demonstrated that dimensions of disadvantage compound for the poor; a lack of economic gradients in the distribution of malaria services does not translate to equity in coverage nor can it be interpreted to imply equity in distribution of risk or disease burden. Our analysis testifies to the progress made by countries in narrowing economic gradients in malaria interventions and highlights the scope for continued monitoring of programs with respect to equity.
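
    The concentration index used to summarize these gradients can be computed as twice the covariance between coverage and fractional wealth rank, divided by mean coverage. A minimal Python sketch (the function name and interface are assumptions, not the authors' code):

```python
import numpy as np

def concentration_index(wealth, coverage):
    """Wagstaff-type concentration index (hypothetical minimal sketch).

    Individuals are ranked from poorest to richest by the asset-wealth
    proxy; the index is twice the covariance of coverage with fractional
    rank, divided by mean coverage.  Negative values mean coverage is
    concentrated among the poorer, positive among the wealthier.
    """
    order = np.argsort(wealth)
    h = np.asarray(coverage, dtype=float)[order]
    n = len(h)
    rank = (np.arange(1, n + 1) - 0.5) / n   # fractional rank in (0, 1)
    return 2.0 * np.cov(h, rank, bias=True)[0, 1] / h.mean()

# Usage: coverage concentrated in the richest 40% yields a positive index
wealth = np.arange(100.0)
pro_rich = (wealth >= 60).astype(float)
ci = concentration_index(wealth, pro_rich)
```

    The sign convention matches the abstract's usage: a pro-poor intervention gives a negative index, uniform coverage gives zero.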

  18. The burden on informal caregivers of people with bipolar disorder.

    PubMed

    Ogilvie, Alan D; Morant, Nicola; Goodwin, Guy M

    2005-01-01

    Caregivers of people with bipolar disorder may experience a different quality of burden than is seen with other illnesses. A better understanding of their concerns is necessary to improve the training of professionals working with this population. Conceptualizing caregiver burden in a conventional medical framework may not focus enough on issues important to caregivers, or on cultural and social issues. Perceptions of caregivers about bipolar disorder have important effects on levels of burden experienced. It is important to distinguish between caregivers' experience of this subjective burden and objective burden as externally appraised. Caregivers' previous experiences of health services may influence their beliefs about the illness. Caregiver burden is associated with depression, which affects patient recovery by adding stress to the living environment. The objective burden on caregivers of patients with bipolar disorder is significantly higher than for those with unipolar depression. Caregivers of bipolar patients have high levels of expressed emotion, including critical, hostile, or over-involved attitudes. Several measures have been developed to assess the care burden of patients with depressive disorders, but these may be inappropriate for patients with bipolar disorder because of its cyclical nature and the stresses arising from manic and hypomanic episodes. Inter-episode symptoms pose another potential source of burden in patients with bipolar disorder. Subsyndromal depressive symptoms are common in this phase of the illness, resulting in severe and widespread impairment of function. Despite the importance of assessing caregiver burden in bipolar disorder, relevant literature is scarce. The specific effects of mania and inter-episode symptoms have not been adequately addressed, and there is a lack of existing measures to assess burden adequately, causing uncertainty regarding how best to structure family interventions to optimally alleviate burden.
The relatively few studies into caregiver burden in bipolar disorder may largely reflect experiences in the US Veterans Affairs health service, and the findings may therefore be limited in their generalizability. Nevertheless, available data suggest that caregiver burden is high and largely neglected in bipolar disorder. Clinically effective, well-targeted and practically viable interventions are needed. However, services cannot be enhanced on a rational basis without an improved understanding of caregiver burden and the capacity to measure and target it, nor can the impact of any change in services be evaluated.

  19. Trends in gastric cancer mortality and in the prevalence of Helicobacter pylori infection in Portugal.

    PubMed

    Morais, Samantha; Ferro, Ana; Bastos, Ana; Castro, Clara; Lunet, Nuno; Peleteiro, Bárbara

    2016-07-01

    Portugal has the highest gastric cancer mortality rates in Western Europe, along with high prevalences of Helicobacter pylori infection. Monitoring their trends is essential to predict the burden of this cancer. We aimed to quantify time trends in gastric cancer mortality in Portugal and in each administrative region, and to compute short-term predictions, as well as to describe the prevalence of H. pylori infection, through a systematic review. Joinpoint analyses were used to identify significant changes in sex-specific trends in gastric cancer age-standardized mortality rates (ASMR) and to estimate annual percent changes (APC). The most recent trends were considered to compute estimates up to 2020 by adjusting Poisson regression models. We searched PubMed and IndexRMP to identify studies carried out in Portugal reporting the prevalence of H. pylori. Gastric cancer mortality has been decreasing in Portugal since 1971 in men (from ASMR=55.3/100 000; APC=-2.4, 95% confidence interval: -2.5 to -2.3) and since 1970 in women (from ASMR=28.0/100 000; APC=-2.8, 95% confidence interval: -2.9 to -2.7), although large regional differences were observed. Predicted ASMR for 2015 and 2020 were 18.8/100 000 and 16.7/100 000 for men and 8.5/100 000 and 7.4/100 000 for women, respectively. The prevalence of H. pylori varied from almost 5% at 0.5-2 years to just over 90% at 70 years or more. No consistent variation was observed since the 1990s. The downward trends in mortality rates are expected to remain in the next decades. The high prevalence of H. pylori infection across age groups and studies from different periods shows a large potential for decrease in the burden of gastric cancer in Portugal.
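
    The annual percent change (APC) reported above comes from a log-linear trend in the rates. A minimal Python stand-in for a single joinpoint segment (the full joinpoint analysis additionally searches for the change-points between segments before estimating each slope):

```python
import numpy as np

def annual_percent_change(years, rates):
    """APC from a log-linear fit: 100 * (exp(slope) - 1).

    A minimal stand-in for one joinpoint segment; the real joinpoint
    analysis also searches for the change-points between segments.
    """
    slope = np.polyfit(np.asarray(years, float), np.log(rates), 1)[0]
    return 100.0 * np.expm1(slope)

# Usage: a rate falling 2.4% per year is recovered exactly
years = np.arange(20)
rates = 55.3 * 0.976 ** years   # e.g. an ASMR per 100,000 declining over time
apc = annual_percent_change(years, rates)
```

    On real data the fit is not exact, and a Poisson regression on counts (as used for the predictions to 2020) is preferred over ordinary least squares on log rates.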

  20. Years of life lost due to influenza-attributable mortality in older adults in the Netherlands: a competing risks approach.

    PubMed

    McDonald, Scott A; van Wijhe, Maarten; van Asten, Liselotte; van der Hoek, Wim; Wallinga, Jacco

    2018-02-06

    We estimated the influenza mortality burden in adults 60 years of age and older in the Netherlands in terms of years of life lost, taking into account competing mortality risks. Weekly laboratory surveillance data for influenza and other respiratory pathogens and weekly extreme temperature served as covariates in Poisson regression models fitted to weekly age-group-specific mortality data for the period 1999/2000 through 2012/13. Burden for age-groups 60-64 through 85-89 years was computed as years of life lost before age 90 (YLL90) using restricted mean lifetimes survival analysis and accounting for competing risks. Influenza-attributable mortality burden was greatest for persons aged 80-84 years, at 914 YLL90 per 100,000 persons (95% uncertainty interval: 867, 963), followed by 85-89 years (787 YLL90/100,000; 95% uncertainty interval: 741, 834). Ignoring competing mortality risks in the computation of influenza-attributable YLL90 would lead to substantial over-estimation of burden, from 3.5% for 60-64 years to 82% for persons aged 80-89 years at death. Failure to account for competing mortality risks has implications for the accuracy of disease burden estimates, especially among persons aged 80 years and older. As the mortality burden borne by the elderly is notably high, prevention initiatives may benefit from being redesigned to more effectively prevent infection in the oldest age-groups. © The Author(s) 2018. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health.
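
    The role of competing risks in this computation can be illustrated with a small sketch: years of life lost before the horizon are taken as the gain in restricted mean lifetime when the influenza hazard is deleted from the all-cause hazard. This is a simplified discrete-time Python sketch, not the authors' estimator; the function name and hazard inputs are assumptions:

```python
import numpy as np

def yll90_competing(age_at_risk, all_cause_hazard, flu_hazard, horizon=90):
    """Influenza YLL90 via restricted mean lifetimes (simplified sketch).

    Years of life lost before `horizon` are the gain in restricted mean
    lifetime when the influenza hazard is deleted from the all-cause
    hazard, so high competing mortality at old ages shrinks the loss.
    Hazards are per-year arrays indexed by age (illustrative inputs).
    """
    ages = np.arange(age_at_risk, horizon)
    s_all = np.exp(-np.cumsum(all_cause_hazard[ages]))       # observed survival
    s_deleted = np.exp(-np.cumsum(all_cause_hazard[ages] - flu_hazard[ages]))
    # area between the cause-deleted and observed survival curves
    return float(np.sum(s_deleted - s_all))

# Usage: an 80-year-old facing high competing mortality loses far less
# than the naive horizon - age = 10 years
all_h = np.full(120, 0.05); all_h[80:] = 0.15   # illustrative hazards
flu_h = np.zeros(120); flu_h[60:] = 0.01
yll = yll90_competing(80, all_h, flu_h)
```

    Crediting each influenza death with its full remaining life expectancy instead, as if no competing hazards existed, is what produces the over-estimation of up to 82% reported above.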

  1. Biological and statistical approaches to predicting human lung cancer risk from silica.

    PubMed

    Kuempel, E D; Tran, C L; Bailer, A J; Porter, D W; Hubbs, A F; Castranova, V

    2001-01-01

    Chronic inflammation is a key step in the pathogenesis of particle-elicited fibrosis and lung cancer in rats, and possibly in humans. In this study, we compute the excess risk estimates for lung cancer in humans with occupational exposure to crystalline silica, using both rat and human data, and using both a threshold approach and linear models. From a toxicokinetic/dynamic model fit to lung burden and pulmonary response data from a subchronic inhalation study in rats, we estimated the minimum critical quartz lung burden (Mcrit) associated with reduced pulmonary clearance and increased neutrophilic inflammation. A chronic study in rats was also used to predict the human excess risk of lung cancer at various quartz burdens, including mean Mcrit (0.39 mg/g lung). We used a human kinetic lung model to link the equivalent lung burdens to external exposures in humans. We then computed the excess risk of lung cancer at these external exposures, using data of workers exposed to respirable crystalline silica and using Poisson regression and lifetable analyses. Finally, we compared the lung cancer excess risks estimated from male rat and human data. We found that the rat-based linear model estimates were approximately three times higher than those based on human data (e.g., 2.8% in rats vs. 0.9-1% in humans, at mean Mcrit lung burden or associated mean working lifetime exposure of 0.036 mg/m3). Accounting for variability and uncertainty resulted in 100-1000 times lower estimates of human critical lung burden and airborne exposure. This study illustrates that assumptions about the relevant biological mechanism, animal model, and statistical approach can all influence the magnitude of lung cancer risk estimates in humans exposed to crystalline silica.

  2. Planetary protection implementation on Mars Reconnaissance Orbiter mission

    NASA Astrophysics Data System (ADS)

    Barengoltz, J.; Witte, J.

    2008-09-01

    In August 2005 NASA launched a large orbiting science observatory, the Mars Reconnaissance Orbiter (MRO), for what is scheduled to be a 5.4-year mission. High resolution imaging of the surface is a principal goal of the mission. One consequence of this goal however is the need for a low science orbit. Unfortunately this orbit fails the required 20-year orbit life set in NASA Planetary Protection (PP) requirements [NASA. Planetary protection provisions for robotic extraterrestrial missions, NASA procedural requirements NPR 8020.12C, NASA HQ, Washington, DC, April 2005.]. So rather than sacrifice the science goals of the mission by raising the science orbit, the MRO Project chose to be the first orbiter to pursue the bio-burden reduction approach. Cleaning alone for a large orbiter like MRO is insufficient to achieve the bio-burden threshold requirement in NASA PP requirements. The burden requirement for an orbiter includes spores encapsulated in non-metallic materials and trapped in joints, as well as located on all internal and external surfaces (the total spore burden). Total burden estimates are dominated by the mated and encapsulated burden. The encapsulated burden cannot be cleaned. The total burden of a smaller orbiter (e.g., Mars Odyssey) likely could not have met the requirement by cleaning; for the large MRO it is clearly impossible. Of course, a system-level partial sterilization, with its attendant costs and system design issues, could have been employed. In the approach taken by the MRO Project, hardware which will burn up (completely vaporize or ablate) before reaching the surface or will at least attain high temperature (500 °C for 0.5 s or more) due to entry heating was exempt from burden accounting. Thus the bio-burden estimate was reduced. 
Lockheed Martin engineers developed a process to perform what is called breakup and burn-up (B&B) analysis. The use of the B&B analysis to comply with the spore burden requirement is the main subject of this article. However, several components aboard the orbiter were predicted to fail the minimum time-at-temperature requirements (or could not conservatively be shown to meet the conditions). An implementation plan was generated to address the highest contributors to the bio-burden assessment that fail to meet the requirements. The spore burden for these components was estimated by direct and proxy burden assays, NASA PP specifications, and dry heat microbial reduction, as appropriate. Items on the orbiter that required rework during assembly were also individually assessed. MRO met the spore burden requirement based on the B&B analysis, the MRO Planetary Protection Implementation Plan, and verification by the NASA Planetary Protection Officer's (PPO) independent assays. The compliance was documented in the MRO PP Pre-Launch Report. MRO was approved for flight by the NASA PPO.

  3. Variable-Complexity Multidisciplinary Optimization on Parallel Computers

    NASA Technical Reports Server (NTRS)

    Grossman, Bernard; Mason, William H.; Watson, Layne T.; Haftka, Raphael T.

    1998-01-01

    This report covers work conducted under grant NAG1-1562 for the NASA High Performance Computing and Communications Program (HPCCP) from December 7, 1993, to December 31, 1997. The objective of the research was to develop new multidisciplinary design optimization (MDO) techniques which exploit parallel computing to reduce the computational burden of aircraft MDO. The design of the High-Speed Civil Transport (HSCT) aircraft was selected as a test case to demonstrate the utility of our MDO methods. The three major tasks of this research grant were: (1) development of parallel multipoint approximation methods for the aerodynamic design of the HSCT; (2) use of parallel multipoint approximation methods for structural optimization of the HSCT; and (3) mathematical and algorithmic development, including support in the integration of parallel computation for items (1) and (2). These tasks have been accomplished with the development of a response surface methodology that incorporates multi-fidelity models. For the aerodynamic design we were able to optimize with up to 20 design variables, using hundreds of expensive Euler analyses together with thousands of inexpensive linear theory simulations. We have thereby demonstrated the application of CFD to a large aerodynamic design problem. For predicting structural weight we were able to combine hundreds of structural optimizations of refined finite element models with thousands of optimizations based on coarse models. Computations have been carried out on the Intel Paragon with up to 128 nodes. The parallel computation allowed us to perform combined aerodynamic-structural optimization using state-of-the-art models of complex aircraft configurations.
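
    The response surface methodology at the heart of this work fits an inexpensive polynomial surrogate to sampled analyses so the optimizer never calls the costly solver directly. A minimal quadratic-surface sketch in Python (illustrative only; the function names are assumptions, and the actual work combined surfaces built from both cheap linear-theory and expensive Euler analyses):

```python
import numpy as np

def _design(X):
    """Quadratic polynomial basis: 1, x_i, x_i * x_j (i <= j)."""
    X = np.atleast_2d(np.asarray(X, float))
    n, d = X.shape
    cols = [np.ones(n)] + [X[:, i] for i in range(d)]
    cols += [X[:, i] * X[:, j] for i in range(d) for j in range(i, d)]
    return np.column_stack(cols)

def fit_response_surface(X, y):
    """Least-squares quadratic response surface; returns a predictor."""
    coef, *_ = np.linalg.lstsq(_design(X), y, rcond=None)
    return lambda x: _design(x) @ coef

# Usage: recover an exactly quadratic "analysis" from 9 sampled runs
X = np.linspace(-1.0, 1.0, 9).reshape(-1, 1)
y = 1.0 + 2.0 * X[:, 0] + 3.0 * X[:, 0] ** 2
surrogate = fit_response_surface(X, y)
pred = surrogate([[0.5]])[0]    # exact value is 2.75
```

    In a multi-fidelity setting, one such surface can be fitted to many cheap analyses and a second to the cheap-versus-expensive discrepancy at a few points, their sum serving as the surrogate for optimization.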

  4. Brain tumor segmentation in 3D MRIs using an improved Markov random field model

    NASA Astrophysics Data System (ADS)

    Yousefi, Sahar; Azmi, Reza; Zahedi, Morteza

    2011-10-01

    Markov Random Field (MRF) models have recently been suggested for MRI brain segmentation by a large number of researchers. By exploiting Markovianity, which represents the local property, MRF models are able to solve a global optimization problem locally. But they still carry a heavy computational burden, especially when they use stochastic relaxation schemes such as Simulated Annealing (SA). In this paper, a new 3D-MRF model is put forward to raise the speed of convergence. The search procedure of SA is fairly localized, which prevents it from exploring a wide diversity of solutions, and it suffers from several other limitations. In comparison, the Genetic Algorithm (GA) has a good capability for global search but is weak in hill climbing. Our proposed algorithm combines SA with an improved GA (IGA) to optimize the solution, which speeds up convergence. Moreover, the proposed algorithm outperforms the traditional 2D-MRF in the quality of the solution.
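
    The single-site stochastic relaxation that makes SA expensive for MRFs can be sketched as follows. This is a generic 2D Potts-model annealer in Python, not the paper's SA+IGA hybrid; the energy terms, cooling schedule, and parameter values are illustrative assumptions:

```python
import numpy as np

def sa_mrf_segment(img, means, beta=1.5, t0=3.0, cooling=0.95, sweeps=60, rng=None):
    """Simulated annealing on a Potts MRF for 2D image labeling (sketch).

    Energy per site = squared distance to the class mean (data term) plus
    beta times the number of 4-neighbours with a different label
    (smoothness term).  Single-site Metropolis updates under a geometric
    cooling schedule.
    """
    rng = np.random.default_rng() if rng is None else rng
    H, W = img.shape
    means = np.asarray(means, float)
    labels = np.argmin((img[..., None] - means) ** 2, axis=-1)  # max-likelihood init

    def local_energy(y, x, lab):
        e = (img[y, x] - means[lab]) ** 2
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if 0 <= ny < H and 0 <= nx < W:
                e += beta * (labels[ny, nx] != lab)
        return e

    T = t0
    for _ in range(sweeps):
        for y in range(H):
            for x in range(W):
                new = int(rng.integers(len(means)))
                if new == labels[y, x]:
                    continue
                dE = local_energy(y, x, new) - local_energy(y, x, labels[y, x])
                if dE < 0 or rng.uniform() < np.exp(-dE / T):
                    labels[y, x] = new
        T *= cooling   # geometric cooling toward a (near-)greedy regime
    return labels

# Usage: recover a noisy two-region image
rng = np.random.default_rng(1)
img = np.zeros((16, 16)); img[:, 8:] = 1.0
img = img + rng.normal(0.0, 0.2, img.shape)
seg = sa_mrf_segment(img, [0.0, 1.0], rng=rng)
```

    Every pixel must be revisited at every temperature step, which is exactly the cost that hybridizing SA with a globally searching GA is meant to reduce.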

  5. Development of the Assessment of Burden of COPD tool: an integrated tool to measure the burden of COPD.

    PubMed

    Slok, Annerika H M; in 't Veen, Johannes C C M; Chavannes, Niels H; van der Molen, Thys; Rutten-van Mölken, Maureen P M H; Kerstjens, Huib A M; Salomé, Philippe L; Holverda, Sebastiaan; Dekhuijzen, P N Richard; Schuiten, Denise; Asijee, Guus M; van Schayck, Onno C P

    2014-07-10

    In deciding on the treatment plan for patients with chronic obstructive pulmonary disease (COPD), the burden of COPD as experienced by patients should be the core focus. It is therefore important for daily practice to develop a tool that can both assess the burden of COPD and facilitate communication with patients in clinical practice. This paper describes the development of an integrated tool to assess the burden of COPD in daily practice. A definition of the burden of COPD was formulated by a Dutch expert team. Interviews showed that patients and health-care providers agreed on this definition. We found no existing instruments that fully measured burden of disease according to this definition. However, the Clinical COPD Questionnaire meets most requirements, and was therefore used and adapted. The adapted questionnaire is called the Assessment of Burden of COPD (ABC) scale. In addition, the ABC tool was developed, of which the ABC scale is the core part. The ABC tool is a computer program with an algorithm that visualises outcomes and provides treatment advice. The next step in the development of the tool is to test the validity and effectiveness of both the ABC scale and tool in daily practice.

  6. Development of the Assessment of Burden of COPD tool: an integrated tool to measure the burden of COPD

    PubMed Central

    Slok, Annerika H M; in ’t Veen, Johannes C C M; Chavannes, Niels H; van der Molen, Thys; Rutten-van Mölken, Maureen P M H; Kerstjens, Huib A M; Salomé, Philippe L; Holverda, Sebastiaan; Dekhuijzen, PN Richard; Schuiten, Denise; Asijee, Guus M; van Schayck, Onno C P

    2014-01-01

    In deciding on the treatment plan for patients with chronic obstructive pulmonary disease (COPD), the burden of COPD as experienced by patients should be the core focus. It is therefore important for daily practice to develop a tool that can both assess the burden of COPD and facilitate communication with patients in clinical practice. This paper describes the development of an integrated tool to assess the burden of COPD in daily practice. A definition of the burden of COPD was formulated by a Dutch expert team. Interviews showed that patients and health-care providers agreed on this definition. We found no existing instruments that fully measured burden of disease according to this definition. However, the Clinical COPD Questionnaire meets most requirements, and was therefore used and adapted. The adapted questionnaire is called the Assessment of Burden of COPD (ABC) scale. In addition, the ABC tool was developed, of which the ABC scale is the core part. The ABC tool is a computer program with an algorithm that visualises outcomes and provides treatment advice. The next step in the development of the tool is to test the validity and effectiveness of both the ABC scale and tool in daily practice. PMID:25010353

  7. Predictors of caregiving burden: impact of subjective health, negative affect, and loneliness of octogenarians and centenarians.

    PubMed

    Lee, Kyuho; Martin, Peter; Poon, Leonard W

    2017-11-01

    This study aimed (1) to determine whether octogenarian and centenarian care recipients' self-reports on physical, social, and emotional status differ from caregivers' reports, (2) to assess associations between octogenarian and centenarian care recipients' poor physical, social, and emotional status and caregiver burden, and (3) to determine which report, the care recipients' self-report or the caregivers' report, more accurately predicted levels of caregiver burden. Self-ratings and caregiver informant ratings were obtained from 309 participants of the Georgia Centenarian Study. Care recipients' health, negative affect, and loneliness were reported by both the caregivers and the care recipients for the analyses. Differences between care recipients' and caregivers' reports were assessed by t-test. Blockwise multiple regression analysis was conducted to assess predictors of caregiver burden. Caregivers' reports on the three measures were significantly higher than self-reports. Care recipients' negative affect and loneliness as reported by caregivers, but not their physical health, predicted higher caregiver burden. Care recipients' own reports did not predict caregiver burden. Caregivers perceived care recipients' social and emotional status more negatively, and caregivers' negative perceptions of care recipients' well-being were an important predictor of caregiver burden.

  8. Print Station Operation. Microcomputing Working Paper Series.

    ERIC Educational Resources Information Center

    Wozny, Lucy Anne

    During the academic year 1983-84, Drexel University instituted a new policy requiring all incoming students to have access to a microcomputer. The computer chosen to fulfill this requirement was the Macintosh from Apple Computer, Inc. Although this requirement put an additional financial burden on the Drexel student, the university administration…

  9. A Model for Integrating Technology and Learning in Public Health Education

    ERIC Educational Resources Information Center

    Bardzell, Shaowen; Bardzell, Jeffrey; So, Hyo-Jeong; Lee, Junghun

    2004-01-01

    As computer interfaces emerge as an instructional medium, instructors transitioning from the classroom continue to bear the burden of designing effective instruction. The medium of the computer interface, and the kinds of learning and interactive possibilities it affords, presumably changes the delivery of learner-centered instruction.…

  10. Developing and Applying Smartphone Apps in Online Courses

    ERIC Educational Resources Information Center

    Yan, Gongjun; Rawat, Danda B.; Shi, Hui; Alnusair, Awny

    2014-01-01

    Online courses provide students flexible access to class at anytime and anywhere. Most online courses currently rely on computer-based delivery. However, computers still burden instructors and students with limited mobility and flexibility. To provide more convenient access to online courses, smartphones have been increasingly adopted as a mobile…

  11. Computer-Based Testing: Test Site Security.

    ERIC Educational Resources Information Center

    Rosen, Gerald A.

    Computer-based testing places great burdens on all involved parties to ensure test security. A task analysis of test site security might identify the areas of protecting the test, protecting the data, and protecting the environment as essential issues in test security. Protecting the test involves transmission of the examinations, identifying the…

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smeets, Albert J., E-mail: radiol@eztilburg.nl; Nijenhuis, Robbert J.; Rooij, Willem Jan van

    Uterine artery embolization (UAE) in patients with a large fibroid burden is controversial. Anecdotal reports describe serious complications and limited clinical results. We report the long-term clinical and magnetic resonance (MR) results in a large series of women with a dominant fibroid of >10 cm and/or a uterine volume of >700 cm³. Seventy-one consecutive patients (mean age, 42.5 years; median, 40 years; range, 25-52 years) with a large fibroid burden were treated by UAE between August 2000 and April 2005. Volume reduction and infarction rate of the dominant fibroid and uterus were assessed by comparing the baseline and latest follow-up MRIs. Patients were clinically followed at various time intervals after UAE with standardized questionnaires. There were no serious complications of UAE. During a mean follow-up of 48 months (median, 59 months; range, 6-106 months), 10 of 71 patients (14%) had a hysterectomy. Mean volume reduction of the fibroid and uterus was 44% and 43%, respectively. Mean infarction rate of the dominant fibroid and overall fibroid infarction rate were 86% and 87%, respectively. In the vast majority of patients there was a substantial improvement of symptoms. Clinical results were similar in patients with a dominant fibroid >10 cm and in patients with large uterine volumes due to diffuse fibroid disease. In conclusion, our results indicate that the risk of serious complications after UAE in patients with a large fibroid burden is not increased. Moreover, clinical long-term results are as good as in other patients treated with UAE. Therefore, a large fibroid burden should not be considered a contraindication for UAE.

  13. A Modeling Framework for Optimal Computational Resource Allocation Estimation: Considering the Trade-offs between Physical Resolutions, Uncertainty and Computational Costs

    NASA Astrophysics Data System (ADS)

    Moslehi, M.; de Barros, F.; Rajagopal, R.

    2014-12-01

    Hydrogeological models that represent flow and transport in subsurface domains are usually large-scale, with excessive computational complexity and uncertain characteristics. Uncertainty quantification for predicting flow and transport in heterogeneous formations often entails a numerical Monte Carlo framework, which repeatedly simulates the model according to a random field representing the hydrogeological characteristics of the site. The physical resolution (e.g., the grid resolution associated with the physical space) is customarily chosen based on recommendations in the literature, independent of the number of Monte Carlo realizations. This practice may lead to either excessive computational burden or inaccurate solutions. We propose an optimization-based methodology that considers the trade-off between the following conflicting objectives: time associated with computational costs, statistical convergence of the model predictions, and physical errors corresponding to numerical grid resolution. In this research, we optimally allocate computational resources by developing a model of the overall error based on a joint statistical and numerical analysis, and by optimizing that error model subject to a given computational constraint. The derived expression for the overall error explicitly accounts for the joint dependence between the discretization error of the physical space and the statistical error associated with Monte Carlo realizations. The accuracy of the proposed framework is verified by applying it to several computationally intensive examples. This framework helps hydrogeologists choose the physical and statistical resolutions that minimize the error under a given computational budget. Moreover, the influence of the available computational resources and the geometric properties of the contaminant source zone on the optimum resolutions is investigated. We conclude that the computational cost associated with optimal allocation can be substantially reduced compared with prevalent recommendations in the literature.
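
    The trade-off described above can be sketched numerically. The error and cost models below are generic stand-ins (the coefficients, exponents, and budget form are illustrative assumptions, not the paper's derived expression): discretization error scales like h^p, Monte Carlo error like 1/sqrt(N), and the budget caps N times the number of grid cells.

```python
import math

# Hypothetical overall-error model: discretization error ~ h^p plus
# Monte Carlo statistical error ~ 1/sqrt(N). Coefficients are illustrative.
def total_error(h, n_mc, c_disc=1.0, p=2, c_stat=1.0):
    return c_disc * h**p + c_stat / math.sqrt(n_mc)

def cost(h, n_mc, domain=1.0, dim=2):
    # cost grows with realizations and with the number of grid cells (1/h)^dim
    return n_mc * (domain / h)**dim

def best_allocation(budget, h_grid, n_grid):
    # brute-force search over candidate resolutions and realization counts
    feasible = [(total_error(h, n), h, n)
                for h in h_grid for n in n_grid
                if cost(h, n) <= budget]
    return min(feasible)  # (error, h, n_mc)

err, h_opt, n_opt = best_allocation(
    budget=1e6,
    h_grid=[0.2, 0.1, 0.05, 0.02, 0.01],
    n_grid=[100, 400, 1600, 6400, 25600])
```

    Note how the optimum is neither the finest grid nor the most realizations: the budget is split between the two error sources.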

  14. Generation of binary holograms for deep scenes captured with a camera and a depth sensor

    NASA Astrophysics Data System (ADS)

    Leportier, Thibault; Park, Min-Chul

    2017-01-01

    This work presents binary hologram generation from images of a real object acquired from a Kinect sensor. Since hologram calculation from a point-cloud or polygon model presents a heavy computational burden, we adopted a depth-layer approach to generate the holograms. This method enables us to obtain holographic data of large scenes quickly. Our investigations focus on the performance of different methods, iterative and noniterative, to convert complex holograms into binary format. Comparisons were performed to examine the reconstruction of the binary holograms at different depths. We also propose to modify the direct binary search algorithm to take into account several reference image planes. Then, deep scenes featuring multiple planes of interest can be reconstructed with better efficiency.

  15. Carbon dioxide: Global warning for nephrologists

    PubMed Central

    Marano, Marco; D’Amato, Anna; Cantone, Alessandra

    2016-01-01

    The large prevalence of respiratory acid-base disorders overlapping metabolic acidosis in the hemodialysis population should prompt nephrologists to deal with the partial pressure of carbon dioxide (pCO2) expected for the reduced bicarbonate concentration. Which formula is most suitable to compute pCO2 is reviewed. Then, the neglected issue of CO2 content in the dialysis fluid is put under the spotlight: a considerable amount of CO2 enters the patients’ bloodstream during every hemodialysis treatment, and “acidosis by dialysate” may occur if the lungs do not properly clear away this burden of CO2. Moreover, vascular access recirculation may be easily diagnosed by detecting CO2 in the arterial line of the extracorporeal circuit, since CO2-enriched blood from the filter reenters the arterial needle. PMID:27648406
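
    One common candidate among the formulas such a review compares is Winter's formula for the expected compensatory pCO2 in metabolic acidosis (whether it is the one the authors ultimately favour is not stated in the abstract):

```python
# Winter's formula: expected pCO2 (mmHg) = 1.5 * [HCO3-] + 8, within +/- 2.
# A measured pCO2 below the range suggests a superimposed respiratory
# alkalosis; above it, a superimposed respiratory acidosis.
def expected_pco2_winters(bicarbonate_meq_l):
    center = 1.5 * bicarbonate_meq_l + 8
    return center - 2, center + 2

# e.g. a typical pre-dialysis bicarbonate of 16 mEq/L
lo, hi = expected_pco2_winters(16)  # -> (30.0, 34.0)
```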

  16. Carbon dioxide: Global warning for nephrologists.

    PubMed

    Marano, Marco; D'Amato, Anna; Cantone, Alessandra

    2016-09-06

    The large prevalence of respiratory acid-base disorders overlapping metabolic acidosis in the hemodialysis population should prompt nephrologists to deal with the partial pressure of carbon dioxide (pCO2) expected for the reduced bicarbonate concentration. Which formula is most suitable to compute pCO2 is reviewed. Then, the neglected issue of CO2 content in the dialysis fluid is put under the spotlight: a considerable amount of CO2 enters the patients' bloodstream during every hemodialysis treatment, and "acidosis by dialysate" may occur if the lungs do not properly clear away this burden of CO2. Moreover, vascular access recirculation may be easily diagnosed by detecting CO2 in the arterial line of the extracorporeal circuit, since CO2-enriched blood from the filter reenters the arterial needle.

  17. Automated Help System For A Supercomputer

    NASA Technical Reports Server (NTRS)

    Callas, George P.; Schulbach, Catherine H.; Younkin, Michael

    1994-01-01

    Expert-system software developed to provide automated system of user-helping displays in supercomputer system at Ames Research Center Advanced Computer Facility. Users located at remote computer terminals connected to supercomputer and each other via gateway computers, local-area networks, telephone lines, and satellite links. Automated help system answers routine user inquiries about how to use services of computer system. Available 24 hours per day and reduces burden on human experts, freeing them to concentrate on helping users with complicated problems.

  18. Fast Image Subtraction Using Multi-cores and GPUs

    NASA Astrophysics Data System (ADS)

    Hartung, Steven; Shukla, H.

    2013-01-01

    Many important image processing techniques in astronomy require a massive number of computations per pixel. Among them is an image differencing technique known as Optimal Image Subtraction (OIS), which is very useful for detecting and characterizing transient phenomena. Like many image processing routines, OIS computations increase proportionally with the number of pixels being processed, and the number of pixels in need of processing is increasing rapidly. Utilizing many-core graphics processing unit (GPU) technology in hybrid conjunction with multi-core CPU and computer clustering technologies, this work presents a new astronomy image processing pipeline architecture. The chosen OIS implementation focuses on the 2nd-order spatially varying kernel with the Dirac delta function basis, a powerful image differencing method that has seen limited deployment in part because of its heavy computational burden. This tool can process standard image calibration and OIS differencing in a fashion that scales with the increasing data volume. It employs several parallel processing technologies in a hierarchical fashion in order to best utilize each of their strengths. The Linux/Unix based application can operate on a single computer, or on an MPI-configured cluster, with or without GPU hardware. With GPU hardware available, even low-cost commercial video cards, the OIS convolution and subtraction times for large images can be accelerated by up to three orders of magnitude.
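
    The core OIS operation, convolving the reference image with a fitted kernel and subtracting the result from the science image, can be sketched as follows. This is a deliberately naive pure-Python version with a single constant kernel; the pipeline's 2nd-order spatially varying Dirac-delta-basis kernel and GPU acceleration are far beyond this sketch.

```python
# Naive 2-D convolution with zero padding (lists of lists, no NumPy).
def convolve2d(image, kernel):
    ih, iw = len(image), len(image[0])
    kh, kw = len(kernel), len(kernel[0])
    pad_h, pad_w = kh // 2, kw // 2
    out = [[0.0] * iw for _ in range(ih)]
    for y in range(ih):
        for x in range(iw):
            acc = 0.0
            for ky in range(kh):
                for kx in range(kw):
                    sy, sx = y + ky - pad_h, x + kx - pad_w
                    if 0 <= sy < ih and 0 <= sx < iw:
                        acc += image[sy][sx] * kernel[ky][kx]
            out[y][x] = acc
    return out

# Difference image D = science - (kernel (*) reference); transients show
# up as significant residuals in D.
def difference_image(science, reference, kernel):
    blurred = convolve2d(reference, kernel)
    return [[s - b for s, b in zip(srow, brow)]
            for srow, brow in zip(science, blurred)]
```

    The per-pixel quadruple loop makes the O(pixels x kernel-size) cost, and hence the appeal of GPUs, explicit.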

  19. AA9int: SNP Interaction Pattern Search Using Non-Hierarchical Additive Model Set.

    PubMed

    Lin, Hui-Yi; Huang, Po-Yu; Chen, Dung-Tsa; Tung, Heng-Yuan; Sellers, Thomas A; Pow-Sang, Julio; Eeles, Rosalind; Easton, Doug; Kote-Jarai, Zsofia; Amin Al Olama, Ali; Benlloch, Sara; Muir, Kenneth; Giles, Graham G; Wiklund, Fredrik; Gronberg, Henrik; Haiman, Christopher A; Schleutker, Johanna; Nordestgaard, Børge G; Travis, Ruth C; Hamdy, Freddie; Neal, David E; Pashayan, Nora; Khaw, Kay-Tee; Stanford, Janet L; Blot, William J; Thibodeau, Stephen N; Maier, Christiane; Kibel, Adam S; Cybulski, Cezary; Cannon-Albright, Lisa; Brenner, Hermann; Kaneva, Radka; Batra, Jyotsna; Teixeira, Manuel R; Pandha, Hardev; Lu, Yong-Jie; Park, Jong Y

    2018-06-07

    The use of single nucleotide polymorphism (SNP) interactions to predict complex diseases has received growing attention during the past decade, but related statistical methods are still immature. We previously proposed the SNP Interaction Pattern Identifier (SIPI) approach to evaluate 45 SNP interaction patterns. SIPI is statistically powerful but suffers from a large computational burden. For large-scale studies, it is necessary to use a powerful and computation-efficient method. The objective of this study is to develop an evidence-based mini-version of SIPI as a screening tool or for solitary use, and to evaluate the impact of inheritance mode and model structure on detecting SNP-SNP interactions. We tested two candidate approaches: the 'Five-Full' and 'AA9int' methods. The Five-Full approach is composed of the five full interaction models considering three inheritance modes (additive, dominant and recessive). The AA9int approach is composed of nine interaction models obtained by considering a non-hierarchical model structure and the additive mode. Our simulation results show that AA9int has statistical power similar to SIPI and is superior to the Five-Full approach, and that the impact of the non-hierarchical model structure is greater than that of the inheritance mode in detecting SNP-SNP interactions. In summary, AA9int is recommended as a powerful tool to be used either alone or as the screening stage of a two-stage approach (AA9int+SIPI) for detecting SNP-SNP interactions in large-scale studies. The 'AA9int' and 'parAA9int' functions (standard and parallel computing versions) are included in the SIPI R package, which is freely available at https://linhuiyi.github.io/LinHY_Software/. hlin1@lsuhsc.edu. Supplementary data are available at Bioinformatics online.

  20. A parallel algorithm for viewshed analysis in three-dimensional Digital Earth

    NASA Astrophysics Data System (ADS)

    Feng, Wang; Gang, Wang; Deji, Pan; Yuan, Liu; Liuzhong, Yang; Hongbo, Wang

    2015-02-01

    Viewshed analysis, often supported by geographic information systems, is widely used in three-dimensional (3D) Digital Earth systems. Many of the analyses involve the siting of features and real-time decision-making. Viewshed analysis is usually performed at a large scale, which poses substantial computational challenges as geographic datasets continue to become increasingly large. Previous research on viewshed analysis has generally been limited to a single data structure (i.e., the DEM), which cannot be used to analyze viewsheds in complicated scenes. In this paper, a real-time algorithm for viewshed analysis in Digital Earth is presented using the parallel computing power of graphics processing units (GPUs). An occlusion volume for each geometric entity in the neighbor space of the viewpoint is generated according to the line-of-sight. The region within the occlusion is marked in a stencil buffer within the programmable 3D visualization pipeline, and the marked region is concurrently drawn in red. In contrast to traditional algorithms based on line-of-sight, the new algorithm, in which the viewshed calculation is integrated with the rendering module, is more efficient and stable, and brings the generated viewshed closer to the reality of the virtual geographic environment. No DEM interpolation, which would add computational burden, is needed. The algorithm was implemented in a 3D Digital Earth system (GeoBeans3D) with the DirectX application programming interface (API) and has been widely used in a range of applications.
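
    For contrast, the traditional line-of-sight viewshed that the GPU rendering approach replaces can be sketched on a 1-D terrain profile (a toy stand-in for a DEM): a cell is visible when its sight-line slope from the viewer exceeds every slope encountered closer to the viewer.

```python
# Classic max-slope sweep along a 1-D terrain profile. A cell is visible
# if its slope relative to the eye exceeds the running maximum, i.e. it
# rises above every prior obstruction in that direction.
def visible_cells(profile, viewer_index, viewer_height=2.0):
    eye = profile[viewer_index] + viewer_height
    visible = {viewer_index}
    for direction in (1, -1):          # sweep right, then left
        max_slope = float("-inf")
        i = viewer_index + direction
        while 0 <= i < len(profile):
            dist = abs(i - viewer_index)
            slope = (profile[i] - eye) / dist
            if slope > max_slope:      # pokes above all closer terrain
                visible.add(i)
                max_slope = slope
            i += direction
    return visible

# terrain: the ridge at index 3 hides the valley behind it,
# but the taller hill at index 6 is visible again
vis = visible_cells([0, 0, 0, 5, 0, 0, 10], viewer_index=0)
```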

  1. Bayesian analysis of zero inflated spatiotemporal HIV/TB child mortality data through the INLA and SPDE approaches: Applied to data observed between 1992 and 2010 in rural North East South Africa

    NASA Astrophysics Data System (ADS)

    Musenge, Eustasius; Chirwa, Tobias Freeman; Kahn, Kathleen; Vounatsou, Penelope

    2013-06-01

    Longitudinal mortality data with few deaths usually have problems of zero-inflation. This paper presents and applies two Bayesian models which cater for zero-inflation, spatial and temporal random effects. To reduce the computational burden experienced when a large number of geo-locations are treated as a Gaussian field (GF) we transformed the field to a Gaussian Markov Random Fields (GMRF) by triangulation. We then modelled the spatial random effects using the Stochastic Partial Differential Equations (SPDEs). Inference was done using a computationally efficient alternative to Markov chain Monte Carlo (MCMC) called Integrated Nested Laplace Approximation (INLA) suited for GMRF. The models were applied to data from 71,057 children aged 0 to under 10 years from rural north-east South Africa living in 15,703 households over the years 1992-2010. We found protective effects on HIV/TB mortality due to greater birth weight, older age and more antenatal clinic visits during pregnancy (adjusted RR (95% CI)): 0.73(0.53;0.99), 0.18(0.14;0.22) and 0.96(0.94;0.97) respectively. Therefore childhood HIV/TB mortality could be reduced if mothers are better catered for during pregnancy as this can reduce mother-to-child transmissions and contribute to improved birth weights. The INLA and SPDE approaches are computationally good alternatives in modelling large multilevel spatiotemporal GMRF data structures.
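
    The zero-inflation these models cater for can be illustrated with the zero-inflated Poisson mass function (the building block only; the paper's Bayesian models add spatial and temporal random effects fitted via SPDE/INLA):

```python
import math

# Zero-inflated Poisson: with probability pi_zero the count is a
# structural zero; otherwise it is Poisson(lam). Excess zeros therefore
# inflate P(Y = 0) above the plain Poisson value exp(-lam).
def zip_pmf(k, lam, pi_zero):
    poisson = math.exp(-lam) * lam**k / math.factorial(k)
    if k == 0:
        return pi_zero + (1 - pi_zero) * poisson
    return (1 - pi_zero) * poisson
```

    With pi_zero = 0 the plain Poisson model is recovered, which is what makes the mixture a natural fit for mortality data with few deaths.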

  2. Modeling Trends in Tropospheric Aerosol Burden & Its Radiative Effects

    EPA Science Inventory

    Large changes in emissions of aerosol precursors have occurred across the southeast U.S., North America, as well as the northern hemisphere. The spatial heterogeneity and contrasting trends in the aerosol burden is resulting in differing effects on regional radiative balance. Mul...

  3. The financial and health burden of diabetic ambulatory care sensitive hospitalisations in Mexico.

    PubMed

    Lugo-Palacios, David G; Cairns, John

    2016-01-01

    To estimate the financial and health burden of diabetic ambulatory care sensitive hospitalisations (ACSH) in Mexico during 2001-2011. We identified ACSH due to diabetic complications in general hospitals run by local health ministries and estimated their financial cost using diagnostic related groups. The health burden estimation assumes that patients would not have experienced complications if they had received appropriate primary care and computes the associated Disability-Adjusted Life Years (DALYs). The financial cost of diabetic ACSH increased by 125% in real terms and their health burden in 2010 accounted for 4.2% of total DALYs associated with diabetes in Mexico. Avoiding preventable hospitalisations could free resources within the health system for other health purposes. In addition, patients with ACSH suffer preventable losses of health that should be considered when assessing the performance of any primary care intervention.

  4. An optimal control strategy for hybrid actuator systems: Application to an artificial muscle with electric motor assist.

    PubMed

    Ishihara, Koji; Morimoto, Jun

    2018-03-01

    Humans use multiple muscles to generate such joint movements as an elbow motion. With multiple lightweight and compliant actuators, joint movements can also be efficiently generated. Similarly, robots can use multiple actuators to efficiently generate a one-degree-of-freedom movement. For this movement, the desired joint torque must be properly distributed to each actuator. One approach to cope with this torque distribution problem is an optimal control method. However, solving the optimal control problem at each control time step has not been deemed a practical approach due to its large computational burden. In this paper, we propose a computationally efficient method to derive an optimal control strategy for a hybrid actuation system composed of multiple actuators, where each actuator has different dynamical properties. We investigated a singularly perturbed system of the hybrid actuator model that subdivided the original large-scale control problem into smaller subproblems so that the optimal control outputs for each actuator can be derived at each control time step, and applied our proposed method to our pneumatic-electric hybrid actuator system. Our method derived a torque distribution strategy for the hybrid actuator by dealing with the difficulty of solving real-time optimal control problems. Copyright © 2017 The Author(s). Published by Elsevier Ltd. All rights reserved.

  5. An ISVD-based Euclidian structure from motion for smartphones

    NASA Astrophysics Data System (ADS)

    Masiero, A.; Guarnieri, A.; Vettore, A.; Pirotti, F.

    2014-06-01

    The development of Mobile Mapping systems over the last decades has made it possible to quickly collect georeferenced spatial measurements by means of sensors mounted on mobile vehicles. Despite the large number of applications that can potentially take advantage of such systems, their cost currently limits their use to certain specialized organizations, companies, and universities. However, the recent worldwide diffusion of powerful mobile devices typically embedded with GPS, an Inertial Navigation System (INS), and imaging sensors is enabling the development of small and compact mobile mapping systems. More specifically, this paper considers the development of a 3D reconstruction system based on photogrammetry methods for smartphones (or other similar mobile devices). The limited computational resources available in such systems and the users' request for real-time reconstructions impose very stringent requirements on the computational burden of the 3D reconstruction procedure. This work takes advantage of certain recently developed mathematical tools (incremental singular value decomposition) and of photogrammetry techniques (structure from motion, Tomasi-Kanade factorization) to achieve very computationally efficient Euclidean 3D reconstruction of the scene. Furthermore, thanks to the instrumentation for localization embedded in the device, the obtained 3D reconstruction can be properly georeferenced.
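
    The Tomasi-Kanade factorization behind this pipeline exploits the fact that, under an affine camera model, the centered 2F x P measurement matrix of P features tracked over F frames has rank 3, so a truncated SVD splits it into motion and shape. A batch sketch with NumPy follows (the paper's contribution is an *incremental* SVD update; this is the plain offline version):

```python
import numpy as np

# Split a rank-3 centered measurement matrix W (2F x P) into a 2F x 3
# motion (camera) matrix and a 3 x P shape (structure) matrix.
def factorize(W):
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    motion = U[:, :3] * np.sqrt(s[:3])            # 2F x 3
    shape = np.sqrt(s[:3])[:, None] * Vt[:3]      # 3 x P
    return motion, shape

# Synthetic check: build W from a known rank-3 product and recover it.
rng = np.random.default_rng(0)
R = rng.standard_normal((8, 3))    # 4 frames -> 8 measurement rows
S = rng.standard_normal((3, 20))   # 20 tracked feature points
motion, shape = factorize(R @ S)
```

    The factorization is only unique up to a 3x3 invertible matrix; recovering the metric (Euclidean) frame requires the additional orthonormality constraints on the camera rows.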

  6. Sensitivity of a computer adaptive assessment for measuring functional mobility changes in children enrolled in a community fitness programme.

    PubMed

    Haley, Stephen M; Fragala-Pinkham, Maria; Ni, Pengsheng

    2006-07-01

    To examine the relative sensitivity to detect functional mobility changes with a full-length parent questionnaire compared with a computerized adaptive testing version of the questionnaire after a 16-week group fitness programme. Prospective, pre- and posttest study with a 16-week group fitness intervention. Three community-based fitness centres. Convenience sample of children (n = 28) with physical or developmental disabilities. A 16-week group exercise programme held twice a week in a community setting. A full-length (161 items) paper version of a mobility parent questionnaire based on the Pediatric Evaluation of Disability Inventory, but expanded to include expected skills of children up to 15 years old was compared with a 15-item computer adaptive testing version. Both measures were administered at pre- and posttest intervals. Both the full-length Pediatric Evaluation of Disability Inventory and the 15-item computer adaptive testing version detected significant changes between pre- and posttest scores, had large effect sizes, and standardized response means, with a modest decrease in the computer adaptive test as compared with the 161-item paper version. Correlations between the computer adaptive and paper formats across pre- and posttest scores ranged from r = 0.76 to 0.86. Both functional mobility test versions were able to detect positive functional changes at the end of the intervention period. Greater variability in score estimates was generated by the computerized adaptive testing version, which led to a relative reduction in sensitivity as defined by the standardized response mean. Extreme scores were generally more difficult for the computer adaptive format to estimate with as much accuracy as scores in the mid-range of the scale. However, the reduction in accuracy and sensitivity, which did not influence the group effect results in this study, is counterbalanced by the large reduction in testing burden.

  7. Two-stage atlas subset selection in multi-atlas based image segmentation.

    PubMed

    Zhao, Tingting; Ruan, Dan

    2015-06-01

    Fast-growing access to large databases and cloud-stored data presents a unique opportunity for multi-atlas based image segmentation, and also presents challenges in heterogeneous atlas quality and computational burden. This work aims to develop a novel two-stage method tailored to the special needs that arise with a large atlas collection of varied quality, so that high-accuracy segmentation can be achieved at low computational cost. An atlas subset selection scheme is proposed to substitute a significant portion of the computationally expensive full-fledged registration in the conventional scheme with a low-cost alternative. More specifically, the authors introduce a two-stage atlas subset selection method. In the first stage, an augmented subset is obtained based on a low-cost registration configuration and a preliminary relevance metric; in the second stage, the subset is further narrowed down to a fusion set of the desired size, based on full-fledged registration and a refined relevance metric. An inference model is developed to characterize the relationship between the preliminary and refined relevance metrics, and a proper augmented subset size is derived to ensure that the desired atlases survive the preliminary selection with high probability. The performance of the proposed scheme has been assessed with cross-validation on two clinical datasets consisting of manually segmented prostate and brain magnetic resonance images, respectively. The proposed scheme demonstrates end-to-end segmentation performance comparable to the conventional single-stage selection method, but with a significant computation reduction. Compared with the alternative computation reduction method, their scheme improves the mean and median Dice similarity coefficient values from (0.74, 0.78) to (0.83, 0.85) and from (0.82, 0.84) to (0.95, 0.95) for prostate and corpus callosum segmentation, respectively, with statistical significance. The authors have developed a novel two-stage atlas subset selection scheme for multi-atlas based segmentation. It achieves good segmentation accuracy with significantly reduced computation cost, making it a suitable configuration in the presence of extensive heterogeneous atlases.
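
    The two-stage selection logic can be sketched with placeholder scoring functions (both metrics below are hypothetical stand-ins for the paper's registration-based relevance metrics):

```python
# Stage 1 ranks the whole pool with a cheap, noisy relevance score and
# keeps an augmented subset; stage 2 applies the expensive refined score
# only to those survivors and keeps the final fusion set.
def two_stage_select(atlases, cheap_score, refined_score,
                     augmented_size, fusion_size):
    augmented = sorted(atlases, key=cheap_score, reverse=True)[:augmented_size]
    return sorted(augmented, key=refined_score, reverse=True)[:fusion_size]

pool = list(range(100))  # hypothetical atlas ids
chosen = two_stage_select(
    pool,
    cheap_score=lambda a: -abs(a - 40),    # noisy proxy for relevance
    refined_score=lambda a: -abs(a - 42),  # "true" relevance
    augmented_size=20, fusion_size=5)
```

    The augmented subset must be large enough that the truly relevant atlases survive the noisy first stage, which is exactly what the paper's inference model sizes it for.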

  8. Tradeoffs Between Synchronization, Communication, and Work in Parallel Linear Algebra Computations

    DTIC Science & Technology

    2014-01-25

    Demmel, Electrical Engineering and Computer Sciences, University of California at Berkeley, Technical Report No. UCB/EECS-2014-8, http://www.eecs.berkeley.edu/Pubs/TechRpts/2014/EECS-2014-8.html, January 25, 2014.

  9. Law School Experience in Pervasive Electronic Communications.

    ERIC Educational Resources Information Center

    Shiels, Rosemary

    1994-01-01

    Installation of a schoolwide local area computer network at Chicago-Kent College of Law (Illinois) is described. Uses of electronic mail within a course on computer law are described. Additional social, administrative, and research uses of electronic mail are noted as are positive effects and emerging problems (e.g., burdens on recipients and…

  10. Evaluating privacy-preserving record linkage using cryptographic long-term keys and multibit trees on large medical datasets.

    PubMed

    Brown, Adrian P; Borgs, Christian; Randall, Sean M; Schnell, Rainer

    2017-06-08

    Integrating medical data from different sources by record linkage is a powerful technique increasingly used in medical research. Under many jurisdictions, the unique personal identifiers needed for linking records are unavailable. Since sensitive attributes, such as names, have to be used instead, privacy regulations usually demand encrypting these identifiers. The corresponding set of techniques for privacy-preserving record linkage (PPRL) has received widespread attention. One recent method is based on Bloom filters. Due to their superior resilience against cryptographic attacks, composite Bloom filters (cryptographic long-term keys, CLKs) are considered best practice for privacy in PPRL. Real-world performance of these techniques on large-scale data has been unknown up to now. Using a large subset of Australian hospital admission data, we tested the performance of an innovative PPRL technique (CLKs using multibit trees) against a gold standard derived from clear-text probabilistic record linkage. Linkage time and linkage quality (recall, precision and F-measure) were evaluated. Clear-text probabilistic linkage resulted in marginally higher precision and recall than CLKs. PPRL required more computing time, but 5 million records could still be de-duplicated within one day. However, the PPRL approach required fine-tuning of parameters. We argue that the increased privacy of PPRL comes at the price of small losses in precision and recall and a large increase in computational burden and setup time. These costs seem acceptable in most applied settings, but they have to be considered in the decision to apply PPRL. Further research on the optimal automatic choice of parameters is needed.
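
    A minimal sketch of the Bloom-filter encoding underlying such linkage: a field value is split into character bigrams, each bigram is hashed into a fixed-size bit set, and encodings are compared with the Dice coefficient. This is a basic single-field filter with assumed parameters (1000 bits, 10 hashes), not the composite CLK or the multibit-tree blocking of the study.

```python
import hashlib

# Encode a name as the set of bit positions hit by hashing each padded
# character bigram num_hashes times (seeded SHA-256 as the hash family).
def bloom_encode(name, size=1000, num_hashes=10):
    bits = set()
    padded = f"_{name.lower()}_"
    for i in range(len(padded) - 1):
        bigram = padded[i:i + 2]
        for seed in range(num_hashes):
            digest = hashlib.sha256(f"{seed}:{bigram}".encode()).hexdigest()
            bits.add(int(digest, 16) % size)
    return bits

# Dice coefficient on set bits: similar names share most bigrams, hence
# most bit positions, so their encodings stay comparable after encryption.
def dice(a, b):
    return 2 * len(a & b) / (len(a) + len(b))

sim_close = dice(bloom_encode("schmidt"), bloom_encode("schmitt"))
sim_far = dice(bloom_encode("schmidt"), bloom_encode("jones"))
```

    Tolerance to typos (sim_close stays high) is what lets probabilistic linkage work on the encrypted identifiers.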

  11. Computational Burden Resulting from Image Recognition of High Resolution Radar Sensors

    PubMed Central

    López-Rodríguez, Patricia; Fernández-Recio, Raúl; Bravo, Ignacio; Gardel, Alfredo; Lázaro, José L.; Rufo, Elena

    2013-01-01

    This paper presents a methodology for high-resolution radar image generation and automatic target recognition, emphasizing the computational cost involved in the process. In order to obtain focused inverse synthetic aperture radar (ISAR) images, certain signal processing algorithms must be applied to the information sensed by the radar. From actual data collected by radar, the stages and algorithms needed to obtain ISAR images are reviewed, including high-resolution range profile generation, motion compensation and ISAR formation. Target recognition is achieved by comparing the generated set of actual ISAR images with a database of ISAR images generated by electromagnetic software. High-resolution radar image generation and target recognition are burdensome and time-consuming processes, so analysis of their computational complexity is of great interest in determining the most suitable implementation platform. To this end, and since target identification must be completed in real time, the computational burden of both processes, image generation and comparison with the database, is explained separately. Conclusions are drawn about implementation platforms and calculation efficiency in order to reduce time consumption in a possible future implementation. PMID:23609804

  12. Computational burden resulting from image recognition of high resolution radar sensors.

    PubMed

    López-Rodríguez, Patricia; Fernández-Recio, Raúl; Bravo, Ignacio; Gardel, Alfredo; Lázaro, José L; Rufo, Elena

    2013-04-22

    This paper presents a methodology for high-resolution radar image generation and automatic target recognition, emphasizing the computational cost involved in the process. In order to obtain focused inverse synthetic aperture radar (ISAR) images, certain signal processing algorithms must be applied to the information sensed by the radar. From actual data collected by radar, the stages and algorithms needed to obtain ISAR images are reviewed, including high-resolution range profile generation, motion compensation and ISAR formation. Target recognition is achieved by comparing the generated set of actual ISAR images with a database of ISAR images generated by electromagnetic software. High-resolution radar image generation and target recognition are burdensome and time-consuming processes, so analysis of their computational complexity is of great interest in determining the most suitable implementation platform. To this end, and since target identification must be completed in real time, the computational burden of both processes, image generation and comparison with the database, is explained separately. Conclusions are drawn about implementation platforms and calculation efficiency in order to reduce time consumption in a possible future implementation.

  13. DEMONSTRATION OF LOW COST, LOW BURDEN EXPOSURE MONITORING STRATEGIES

    EPA Science Inventory

    This study is designed to develop and demonstrate relevant, low-cost, low-burden monitoring strategies that could be used in large longitudinal exposure/epidemiological studies, such as the National Children's Study. The focus of this study is on (1) recruiting and retaining p...

  14. RESPIRATORY EPIDEMIOLOGY OF HOUSEHOLD AIR POLLUTION EXPOSURES IN DEVELOPING COUNTRIES

    EPA Science Inventory

    Acute and chronic respiratory diseases impose a huge public health burden in the developing world. A large and growing body of scientific evidence indicates that household air pollution exposures contribute substantially to this burden. The most important source of indoor air p...

  15. Patterns of Objective and Subjective Burden of Informal Caregivers in Multiple Sclerosis.

    PubMed

    Bayen, E; Papeix, C; Pradat-Diehl, P; Lubetzki, C; Joël, M E

    2015-01-01

    Home care for patients with Multiple Sclerosis (MS) relies largely on informal caregivers (ICs). We assessed ICs' objective burden (Resource Utilization in Dementia, measuring informal care time (ICT)) and ICs' subjective burden (Zarit Burden Inventory (ZBI)). ICs (N = 99) were spouses (70%), mean age 52 years, assisting disabled patients with a mean EDSS (Expanded Disability Status Scale) of 5.5, with executive dysfunction (mean DEX (Dysexecutive Questionnaire) score of 25) and a duration of MS ranging from 1 to 44 years. Objective burden was high (mean ICT = 6.5 hours/day), mostly consisting of supervision time. Subjective burden was moderate (mean ZBI = 27.3). Multivariate analyses showed that both burdens were positively correlated with higher levels of EDSS and DEX, whereas coresidency and the IC's female gender correlated with objective burden only, and the IC's poor mental health status with subjective burden only. When considering MS aggressiveness, neither burden was correlated with a longer duration of MS; rather, both increased for patients with severe and early dysexecutive dysfunction and for patients classified as fast progressors according to the Multiple Sclerosis Severity Score. Evaluation of the MS disability course and the IC's personal situation is crucial to understanding the burden process and to implementing adequate interventions in MS.

  16. The disease burden of human cystic echinococcosis based on HDRs from 2001 to 2014 in Italy

    PubMed Central

    Brundu, Diego; Stegel, Giovanni; Loi, Federica; Rolesu, Sandro; Masu, Gabriella; Ledda, Salvatore; Masala, Giovanna

    2017-01-01

    Background Cystic echinococcosis (CE) is an important neglected zoonotic parasitic infection belonging to the subgroup of seven Neglected Zoonotic Diseases (NZDs) included in the World Health Organization’s official list of 18 Neglected Tropical Diseases (NTDs). CE causes serious global human health concerns and leads to significant economic losses arising from the costs of medical treatment, morbidity, life impairments and fatality rates in human cases. Moreover, CE is endemic in several Italian Regions. The aim of this study is to perform a detailed analysis of the economic burden of hospitalization and treatment costs and to estimate the Disability Adjusted Life Years (DALYs) of CE in Italy. Methods and findings In the period from 2001 to 2014, the direct costs of 21,050 Hospital Discharge Records (HDRs) belonging to 12,619 patients with at least one CE-related diagnosis code were analyzed in order to quantify the economic burden of CE. An average of 901 CE cases occurred per annum (range: 480–1,583). Direct costs include expenses for hospitalization and medical and surgical treatment incurred by public and private hospitals and were computed on an individual basis according to Italian Health Ministry legislation. Moreover, we estimated the DALYs for each patient. The Italian financial burden of CE is around €53 million; the national average economic burden per annum is around €4 million; the DALYs of the population from 2001 to 2014 are 223.35 annually, or 5.26 DALYs per 100,000 inhabitants. Conclusion In Italy, human CE is responsible for significant economic losses in the public health sector. In humans, costs associated with CE have been shown to have a great impact on affected individuals, their families and the community as a whole. This study could be used as a tool to prioritize and make decisions with regard to a surveillance system for this largely preventable yet neglected disease. It demonstrates the need to implement a CE control program aimed at preventing the considerable economic and social losses it causes in high-incidence areas. PMID:28746395

  17. Global Economic Burden of Norovirus Gastroenteritis

    PubMed Central

    Bartsch, Sarah M.; Lopman, Benjamin A.; Ozawa, Sachiko; Hall, Aron J.; Lee, Bruce Y.

    2016-01-01

    Background Despite accounting for approximately one fifth of all acute gastroenteritis illnesses, norovirus has received comparatively less attention than other infectious pathogens. With several candidate vaccines under development, characterizing the global economic burden of norovirus could help funders, policy makers, public health officials, and product developers determine how much attention and resources to allocate to advancing these technologies to prevent and control norovirus. Methods We developed a computational simulation model to estimate the economic burden of norovirus in every country/area (233 total) stratified by WHO region and globally, from the health system and societal perspectives. We considered direct costs of illness (e.g., clinic visits and hospitalization) and productivity losses. Results Globally, norovirus resulted in a total of $4.2 billion (95% UI: $3.2–5.7 billion) in direct health system costs and $60.3 billion (95% UI: $44.4–83.4 billion) in societal costs per year. Disease amongst children <5 years cost society $39.8 billion, compared to $20.4 billion for all other age groups combined. Costs per norovirus illness varied by both region and age and were highest among adults ≥55 years. Productivity losses represented 84–99% of total costs, varying by region. While low and middle income countries and high income countries had similar disease incidence (10,148 vs. 9,935 illnesses per 100,000 persons), high income countries generated 62% of global health system costs. In sensitivity analysis, the probability of hospitalization had the largest impact on health system cost estimates ($2.8 billion globally, assuming no hospitalization costs), while the probability of missing productive days had the largest impact on societal cost estimates ($35.9 billion globally, with a 25% probability of missing productive days). Conclusions The total economic burden is greatest in young children but the highest cost per illness is among older age groups in some regions. These large costs overwhelmingly are from productivity losses resulting from acute illness. Low, middle, and high income countries all have a considerable economic burden, suggesting that norovirus gastroenteritis is a truly global economic problem. Our findings can help identify which age group(s) and/or geographic regions may benefit the most from interventions. PMID:27115736
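    The cost structure described here, probabilities of clinic visits and hospitalization times unit costs plus productivity losses, with 95% uncertainty intervals from simulation, can be sketched with a simple Monte Carlo model. All parameter ranges below are hypothetical placeholders, not the study's inputs, which vary by country, region, and age group:

```python
import random

random.seed(1)

# Hypothetical per-illness inputs as uniform uncertainty ranges (NOT the study's data).
params = {
    "p_clinic":     (0.10, 0.20),      # probability of an outpatient visit
    "p_hosp":       (0.005, 0.02),     # probability of hospitalization
    "cost_clinic":  (30.0, 80.0),      # $ per visit
    "cost_hosp":    (1000.0, 3000.0),  # $ per stay
    "p_missed_day": (0.25, 0.75),      # probability of missing a productive day
    "daily_wage":   (40.0, 120.0),     # $ per day
}

def one_draw():
    d = {k: random.uniform(*v) for k, v in params.items()}
    health_system = d["p_clinic"] * d["cost_clinic"] + d["p_hosp"] * d["cost_hosp"]
    societal = health_system + d["p_missed_day"] * d["daily_wage"]  # adds productivity loss
    return health_system, societal

draws = sorted(h for h, _ in (one_draw() for _ in range(10_000)))
lo, hi = draws[250], draws[9750]   # 95% uncertainty interval on per-illness health-system cost
```

    Multiplying such per-illness costs by incidence per country and aggregating by region is what yields the kind of regional and global totals the abstract reports.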

  18. Global Economic Burden of Norovirus Gastroenteritis.

    PubMed

    Bartsch, Sarah M; Lopman, Benjamin A; Ozawa, Sachiko; Hall, Aron J; Lee, Bruce Y

    2016-01-01

    Despite accounting for approximately one fifth of all acute gastroenteritis illnesses, norovirus has received comparatively less attention than other infectious pathogens. With several candidate vaccines under development, characterizing the global economic burden of norovirus could help funders, policy makers, public health officials, and product developers determine how much attention and resources to allocate to advancing these technologies to prevent and control norovirus. We developed a computational simulation model to estimate the economic burden of norovirus in every country/area (233 total) stratified by WHO region and globally, from the health system and societal perspectives. We considered direct costs of illness (e.g., clinic visits and hospitalization) and productivity losses. Globally, norovirus resulted in a total of $4.2 billion (95% UI: $3.2-5.7 billion) in direct health system costs and $60.3 billion (95% UI: $44.4-83.4 billion) in societal costs per year. Disease amongst children <5 years cost society $39.8 billion, compared to $20.4 billion for all other age groups combined. Costs per norovirus illness varied by both region and age and were highest among adults ≥55 years. Productivity losses represented 84-99% of total costs, varying by region. While low and middle income countries and high income countries had similar disease incidence (10,148 vs. 9,935 illnesses per 100,000 persons), high income countries generated 62% of global health system costs. In sensitivity analysis, the probability of hospitalization had the largest impact on health system cost estimates ($2.8 billion globally, assuming no hospitalization costs), while the probability of missing productive days had the largest impact on societal cost estimates ($35.9 billion globally, with a 25% probability of missing productive days). The total economic burden is greatest in young children but the highest cost per illness is among older age groups in some regions. These large costs overwhelmingly are from productivity losses resulting from acute illness. Low, middle, and high income countries all have a considerable economic burden, suggesting that norovirus gastroenteritis is a truly global economic problem. Our findings can help identify which age group(s) and/or geographic regions may benefit the most from interventions.

  19. A fast radiative transfer model for visible through shortwave infrared spectral reflectances in clear and cloudy atmospheres

    NASA Astrophysics Data System (ADS)

    Wang, Chenxi; Yang, Ping; Nasiri, Shaima L.; Platnick, Steven; Baum, Bryan A.; Heidinger, Andrew K.; Liu, Xu

    2013-02-01

    A computationally efficient radiative transfer model (RTM) for calculating visible (VIS) through shortwave infrared (SWIR) reflectances is developed for use in satellite and airborne cloud property retrievals. The full radiative transfer equation (RTE) for combinations of cloud, aerosol, and molecular layers is solved approximately by using six independent RTEs that assume the plane-parallel approximation along with a single-scattering approximation for Rayleigh scattering. Each of the six RTEs can be solved analytically if the bidirectional reflectance/transmittance distribution functions (BRDF/BTDF) of the cloud/aerosol layers are known. The adding/doubling (AD) algorithm is employed to account for overlapped cloud/aerosol layers and non-Lambertian surfaces. Two approaches are used to mitigate the significant computational burden of the AD algorithm. First, the BRDF and BTDF of single cloud/aerosol layers are pre-computed using the discrete ordinates radiative transfer program (DISORT) implemented with 128 streams, and second, the required integral in the AD algorithm is numerically implemented on a twisted icosahedral mesh. A concise surface BRDF simulator associated with the MODIS land surface product (MCD43) is merged into a fast RTM to accurately account for non-isotropic surface reflectance. The resulting fast RTM is evaluated with respect to its computational accuracy and efficiency. The simulation bias between DISORT and the fast RTM is large (e.g., relative error >5%) only when both the solar zenith angle (SZA) and the viewing zenith angle (VZA) are large (i.e., SZA>45° and VZA>70°). For general situations, i.e., cloud/aerosol layers above a non-Lambertian surface, the fast RTM calculation rate is faster than that of the 128-stream DISORT by approximately two orders of magnitude.
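    The adding/doubling step at the heart of this model combines the reflectances and transmittances of stacked layers through a geometric series of inter-layer bounces. A scalar, single-angle sketch with an assumed thin-layer r and t; the actual algorithm operates on full BRDF/BTDF matrices integrated over the angular mesh:

```python
def add_layers(r1, t1, r2, t2):
    # Adding method for two plane-parallel layers (scalar sketch).
    # Multiple reflections between the layers sum to a geometric series:
    # 1 + r1*r2 + (r1*r2)**2 + ... = 1/(1 - r1*r2).
    s = 1.0 / (1.0 - r1 * r2)
    return r1 + t1 * r2 * t1 * s, t1 * t2 * s  # combined (reflectance, transmittance)

# Doubling: build a 128-sublayer slab from 7 doublings of one thin sublayer.
r, t = 0.01, 0.98   # assumed thin-layer reflectance/transmittance (slightly absorbing)
for _ in range(7):
    r, t = add_layers(r, t, r, t)
```

    Each doubling halves the number of adding operations compared with stacking sublayers one at a time, which is part of why pre-computing single-layer BRDF/BTDF tables makes the combined calculation cheap.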

  20. National Fusion Collaboratory: Grid Computing for Simulations and Experiments

    NASA Astrophysics Data System (ADS)

    Greenwald, Martin

    2004-05-01

    The National Fusion Collaboratory Project is creating a computational grid designed to advance scientific understanding and innovation in magnetic fusion research by facilitating collaborations, enabling more effective integration of experiments, theory and modeling and allowing more efficient use of experimental facilities. The philosophy of FusionGrid is that data, codes, analysis routines, visualization tools, and communication tools should be thought of as network-available services, easily used by the fusion scientist. In such an environment, access to services is stressed rather than portability. By building on a foundation of established computer science toolkits, deployment time can be minimized. These services all share the same basic infrastructure for secure authentication and resource authorization, which allows stakeholders to control their own resources such as computers, data and experiments. Code developers can control intellectual property, and fair use of shared resources can be demonstrated and controlled. A key goal is to shield scientific users from implementation details so that transparency and ease of use are maximized. The first FusionGrid service deployed was the TRANSP code, a widely used tool for transport analysis. Tools for run preparation, submission, monitoring and management have been developed and shared among a wide user base. This approach saves user sites the laborious effort of maintaining such a large and complex code while at the same time reducing the burden on the development team by avoiding the need to support a large number of heterogeneous installations. Shared visualization and A/V tools are being developed and deployed to enhance long-distance collaborations. These include desktop versions of the Access Grid, a highly capable multi-point remote conferencing tool, and capabilities for sharing displays and analysis tools over local and wide-area networks.

  1. Discrete element simulation of charging and mixed layer formation in the ironmaking blast furnace

    NASA Astrophysics Data System (ADS)

    Mitra, Tamoghna; Saxén, Henrik

    2016-11-01

    The burden distribution in the ironmaking blast furnace plays an important role for the operation as it affects the gas flow distribution, heat and mass transfer, and chemical reactions in the shaft. This work studies certain aspects of burden distribution by small-scale experiments and numerical simulation by the discrete element method (DEM). Particular attention is focused on the complex layer-formation process and the problems associated with estimating the burden layer distribution by burden profile measurements. The formation of mixed layers is studied, and a computational method for estimating the extent of the mixed layer, as well as its voidage, is proposed and applied on the results of the DEM simulations. In studying a charging program and its resulting burden distribution, the mixed layers of coke and pellets were found to show lower voidage than the individual burden layers. The dynamic evolution of the mixed layer during the charging process is also analyzed. The results of the study can be used to gain deeper insight into the complex charging process of the blast furnace, which is useful in the design of new charging programs and for mathematical models that do not consider the full behavior of the particles in the burden layers.
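    One ingredient of the mixed-layer analysis proposed above, estimating voidage from DEM particle data, reduces to comparing total particle volume with the volume of a sampling region. A minimal sketch in which randomly placed spheres stand in for DEM output; the particle counts, sizes, and slab dimensions are invented, and partial overlap of spheres with the slab faces is ignored:

```python
import math, random

random.seed(0)

def slab_voidage(spheres, z_lo, z_hi, area):
    # Void fraction of a horizontal slab; spheres are assigned to the slab by
    # their centre coordinate (partial overlap with slab faces ignored for brevity).
    solid = sum(4.0 / 3.0 * math.pi * r ** 3
                for (x, y, z, r) in spheres if z_lo <= z <= z_hi)
    return 1.0 - solid / (area * (z_hi - z_lo))

# Hypothetical mixed layer: 400 small "pellets" and 40 larger "coke" particles
# scattered in a 1 m x 1 m x 0.1 m slab (random positions, not a real packing).
spheres = ([(random.random(), random.random(), random.uniform(0, 0.1), 0.006)
            for _ in range(400)] +
           [(random.random(), random.random(), random.uniform(0, 0.1), 0.02)
            for _ in range(40)])
eps = slab_voidage(spheres, 0.0, 0.1, 1.0)
```

    Sweeping the slab boundaries through the bed height gives a voidage profile, from which the extent of the mixed layer and its lower voidage relative to the pure coke and pellet layers can be read off.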

  2. Reverse Engineering Cellular Networks with Information Theoretic Methods

    PubMed Central

    Villaverde, Alejandro F.; Ross, John; Banga, Julio R.

    2013-01-01

    Building mathematical models of cellular networks lies at the core of systems biology. It involves, among other tasks, the reconstruction of the structure of interactions between molecular components, which is known as network inference or reverse engineering. Information theory can help in the goal of extracting as much information as possible from the available data. A large number of methods founded on these concepts have been proposed in the literature, not only in biology journals, but in a wide range of areas. Their critical comparison is difficult due to the different focuses and the adoption of different terminologies. Here we attempt to review some of the existing information theoretic methodologies for network inference, and clarify their differences. While some of these methods have achieved notable success, many challenges remain, among which we can mention dealing with incomplete measurements, noisy data, counterintuitive behaviour emerging from nonlinear relations or feedback loops, and computational burden of dealing with large data sets. PMID:24709703
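    The mutual-information quantities underlying many of these network-inference methods can be estimated directly from data. A minimal plug-in estimator after equal-width discretization; real tools use adaptive binning or k-nearest-neighbour estimators and significance thresholds, which is exactly where the computational burden on large data sets arises:

```python
import math, random
from collections import Counter

def mutual_info(xs, ys, bins=4):
    # Plug-in mutual information (in nats) of two equal-length samples
    # after equal-width discretization into `bins` bins.
    def disc(v):
        lo, hi = min(v), max(v)
        return [min(int((x - lo) / (hi - lo + 1e-12) * bins), bins - 1) for x in v]
    dx, dy = disc(xs), disc(ys)
    n = len(xs)
    pxy, px, py = Counter(zip(dx, dy)), Counter(dx), Counter(dy)
    return sum(c / n * math.log(c * n / (px[a] * py[b]))
               for (a, b), c in pxy.items())

random.seed(2)
x = [random.gauss(0, 1) for _ in range(2000)]
y = [xi + random.gauss(0, 0.3) for xi in x]    # strongly coupled to x
z = [random.gauss(0, 1) for _ in range(2000)]  # independent of x
# mutual_info(x, y) is large; mutual_info(x, z) is near zero
```

    Relevance-network methods apply such an estimator to every gene pair and draw an edge where the score clears a threshold, which is quadratic in the number of variables and motivates the scalability concerns noted above.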

  3. Impacts of different characterizations of large-scale background on simulated regional-scale ozone over the continental United States

    NASA Astrophysics Data System (ADS)

    Hogrefe, Christian; Liu, Peng; Pouliot, George; Mathur, Rohit; Roselle, Shawn; Flemming, Johannes; Lin, Meiyun; Park, Rokjin J.

    2018-03-01

    This study analyzes simulated regional-scale ozone burdens both near the surface and aloft, estimates process contributions to these burdens, and calculates the sensitivity of the simulated regional-scale ozone burden to several key model inputs with a particular emphasis on boundary conditions derived from hemispheric or global-scale models. The Community Multiscale Air Quality (CMAQ) model simulations supporting this analysis were performed over the continental US for the year 2010 within the context of the Air Quality Model Evaluation International Initiative (AQMEII) and Task Force on Hemispheric Transport of Air Pollution (TF-HTAP) activities. CMAQ process analysis (PA) results highlight the dominant role of horizontal and vertical advection on the ozone burden in the mid-to-upper troposphere and lower stratosphere. Vertical mixing, including mixing by convective clouds, couples fluctuations in free-tropospheric ozone to ozone in lower layers. Hypothetical bounding scenarios were performed to quantify the effects of emissions, boundary conditions, and ozone dry deposition on the simulated ozone burden. Analysis of these simulations confirms that the characterization of ozone outside the regional-scale modeling domain can have a profound impact on simulated regional-scale ozone. This was further investigated by using data from four hemispheric or global modeling systems (Chemistry - Integrated Forecasting Model (C-IFS), CMAQ extended for hemispheric applications (H-CMAQ), the Goddard Earth Observing System model coupled to chemistry (GEOS-Chem), and AM3) to derive alternate boundary conditions for the regional-scale CMAQ simulations. 
The regional-scale CMAQ simulations using these four different boundary conditions showed that the largest ozone abundance in the upper layers was simulated when using boundary conditions from GEOS-Chem, followed by the simulations using C-IFS, AM3, and H-CMAQ boundary conditions, consistent with the analysis of the ozone fields from the global models along the CMAQ boundaries. Using boundary conditions from AM3 yielded higher springtime ozone column burdens in the middle and lower troposphere compared to boundary conditions from the other models. For surface ozone, the differences between the AM3-driven CMAQ simulations and the CMAQ simulations driven by other large-scale models are especially pronounced during spring and winter where they can reach more than 10 ppb for seasonal mean ozone mixing ratios and as much as 15 ppb for domain-averaged daily maximum 8 h average ozone on individual days. In contrast, the differences between the C-IFS-, GEOS-Chem-, and H-CMAQ-driven regional-scale CMAQ simulations are typically smaller. Comparing simulated surface ozone mixing ratios to observations and computing seasonal and regional model performance statistics revealed that boundary conditions can have a substantial impact on model performance. Further analysis showed that boundary conditions can affect model performance across the entire range of the observed distribution, although the impacts tend to be lower during summer and for the very highest observed percentiles. The results are discussed in the context of future model development and analysis opportunities.

  4. [Psychosocial burdens in the teaching profession--initial results of the Questionnaire of Occupational Burdens in Teaching (QOBT)].

    PubMed

    Pyzalski, Jacek

    2008-01-01

    This article presents the results obtained using a new tool for measuring psychosocial burdens in the teaching profession--the Questionnaire of Occupational Burdens in Teaching (QOBT). In its first theoretical part, some typologies of stressors in teaching, developed in other countries, are presented and the need to construct a new tool in Poland is discussed. In this part, the construction process of the new tool and its three scales, comprising Conflict Situations, Organizational Burdens and Lack of Work Sense, are described. The psychometric features of the new questionnaire (e.g., Cronbach's α = 0.63-0.84) are also given. The results are based on a large random sample of teachers from the Łódź voivodeship. The results did not show significant differences in the level of occupational burdens between men and women. Generally, neither seniority nor age is related to the level of burdens. One exception is Organizational Burdens, which affect older teachers slightly more. The study also showed the need to incorporate activities at the organizational level into programs on occupational stress.
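    The reliability figures reported for the questionnaire (Cronbach's alpha per scale) are computed from item variances and the variance of the total score. A short sketch with invented toy scores, not the study's data:

```python
from statistics import pvariance

def cronbach_alpha(items):
    # items: one score list per questionnaire item, respondents in the same order.
    # alpha = k/(k-1) * (1 - sum of item variances / variance of the total score)
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]
    return k / (k - 1) * (1 - sum(pvariance(i) for i in items) / pvariance(totals))

# Toy data: five respondents on a hypothetical three-item scale.
alpha = cronbach_alpha([[2, 4, 3, 5, 1],
                        [3, 5, 3, 4, 2],
                        [2, 4, 4, 5, 1]])
```

    Values in the 0.63-0.84 range reported above indicate acceptable to good internal consistency for the three scales.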

  5. Between-Country Inequalities in the Neglected Tropical Disease Burden in 1990 and 2010, with Projections for 2020.

    PubMed

    Stolk, Wilma A; Kulik, Margarete C; le Rutte, Epke A; Jacobson, Julie; Richardus, Jan Hendrik; de Vlas, Sake J; Houweling, Tanja A J

    2016-05-01

    The World Health Organization (WHO) has set ambitious time-bound targets for the control and elimination of neglected tropical diseases (NTDs). Investing in NTDs is not only seen as good value for money, but is also advocated as a pro-poor policy since it would improve population health in the poorest populations. We studied the extent to which the disease burden from nine NTDs (lymphatic filariasis, onchocerciasis, schistosomiasis, soil-transmitted helminths, trachoma, Chagas disease, human African trypanosomiasis, leprosy, visceral leishmaniasis) was concentrated in the poorest countries in 1990 and 2010, and how this would change by 2020 if the WHO targets are met. Our analysis was based on 1990 and 2010 data from the Global Burden of Disease (GBD) 2010 study and on projections of the 2020 burden. Low and lower-middle income countries together accounted for 69% and 81% of the global burden in 1990 and 2010 respectively. Only the soil-transmitted helminths and Chagas disease caused a considerable burden in upper-middle income countries. The global burden from these NTDs declined by 27% between 1990 and 2010, but the reduction largely came to the benefit of upper-middle income countries. Achieving the WHO targets would lead to a further 55% reduction in the global burden between 2010 and 2020 in each country income group, and 81% of the global reduction would occur in low and lower-middle income countries. The GBD 2010 data show the burden of the nine selected NTDs in DALYs is strongly concentrated in low and lower-middle income countries, which implies that the beneficial impact of NTD control eventually also largely comes to the benefit of these same countries. While the nine NTDs became increasingly concentrated in developing countries in the 1990-2010 period, this trend would be rectified if the WHO targets were met, supporting the pro-poor designation.

  6. Computer Program Re-layers Engineering Drawings

    NASA Technical Reports Server (NTRS)

    Crosby, Dewey C., III

    1990-01-01

    RULCHK computer program aids in structuring layers of information pertaining to part or assembly designed with software described in article "Software for Drawing Design Details Concurrently" (MFS-28444). Checks and optionally updates structure of layers for part. Enables designer to construct model and annotate its documentation without burden of manually layering part to conform to standards at design time.

  7. Burden of Hemoglobinopathies (Thalassemia, Sickle Cell Disorders and G6PD Deficiency) in Iran, 1990-2010: findings from the Global Burden of Disease Study 2010.

    PubMed

    Rezaei, Nazila; Naderimagham, Shohreh; Ghasemian, Anoosheh; Saeedi Moghaddam, Sahar; Gohari, Kimia; Zareiy, Saeid; Sobhani, Sahar; Modirian, Mitra; Kompani, Farzad

    2015-08-01

    Hemoglobinopathies are known as the most common genetic disorders in Iran. The paper aims to provide global estimates of deaths and disability adjusted life years (DALYs) due to hemoglobinopathies in Iran by sex and age during 1990 to 2010 and describe the challenges due to limitations of the Global Burden of Disease Study 2010 (GBD 2010). GBD 2010 estimates of the numbers of deaths and years of life lost (YLLs) due to premature mortality were calculated using the Cause of Death Ensemble model (CODEm). Years lived with disability (YLDs) were computed by multiplication of prevalence, the disability weight for occurrence of sequelae, and the duration of symptoms. Prevalence was estimated through a systematic search of published and available unpublished data sources, with a Bayesian meta-regression model developed for GBD 2010. Disability weights were produced using data collected from population-based surveys. Uncertainty from all inputs was incorporated into the computations of DALYs using simulation methods. We aim to present and critically appraise the results of GBD 2010 and provide some recommendations for reaching better conclusions about the burden of hemoglobinopathies in Iran. Between 1990 and 2010, the overall deaths attributed to hemoglobinopathies decreased from 0.51% to 0.36% of total deaths, with the corresponding burden declining from 1% to 0.82% of total DALYs. There was a reduction in death and DALY rates for all ages, and the rates followed the same pattern in Iranian men and women. The highest DALYs for hemoglobinopathies, thalassemia, sickle cell disorder, and glucose-6-phosphate dehydrogenase deficiency (G6PD-D) were found in those aged less than 5 years. The collective burden of all of these hemoglobin disorders was lower in 2010 than in 1990. Although the screening programs in Iran have been very successful in reducing the number of thalassemia patients between 1990 and 2010, in order to provide a better estimation of the burden of hemoglobin disorders, it is necessary to perform a national and sub-national study of hemoglobinopathies using multiple national and sub-national surveys.
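    The burden measure used here follows the standard GBD decomposition, DALY = YLL + YLD, with YLDs computed as prevalence times disability weight times duration, as the record describes. A sketch with entirely hypothetical inputs, not Iran's actual figures:

```python
def yll(deaths_by_age, life_expectancy):
    # Years of life lost: deaths at each age times remaining standard life expectancy.
    return sum(n * life_expectancy[age] for age, n in deaths_by_age.items())

def yld(prevalence, disability_weight, duration_years):
    # Years lived with disability, prevalence-based as described in the record.
    return prevalence * disability_weight * duration_years

# Hypothetical inputs: standard life expectancy at three ages, deaths at those
# ages, and one disabling sequela with an assumed disability weight of 0.12.
life_exp = {0: 86.0, 30: 57.0, 60: 28.0}
dalys = yll({0: 10, 30: 5, 60: 2}, life_exp) + yld(1200, 0.12, 1.0)
```

    In GBD 2010 the inputs on both sides of this sum carry uncertainty, which is why the study propagates it through simulation rather than a single point calculation.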

  8. Computers in medicine. Virtual rehabilitation: dream or reality?

    PubMed

    Berra, Kathy

    2006-08-01

    Coronary heart disease is the number one cause of death for men and women in the United States and internationally. Identification of persons at risk for cardiovascular disease and reduction of cardiovascular risk factors are key to managing this tremendous societal burden. The Internet holds great promise in helping to identify and manage persons at high risk of a cardiac or vascular event: it can assess risk and provide support for cardiovascular risk reduction for large numbers of persons in a cost-effective and time-efficient manner. The purpose of this report is to describe important advances in the use of the Internet in identifying persons at risk for a cardiovascular event and in providing interventions designed to reduce this risk.

  9. Association of aortic valve calcification to the presence, extent, and composition of coronary artery plaque burden: from the Rule Out Myocardial Infarction using Computer Assisted Tomography (ROMICAT) trial.

    PubMed

    Mahabadi, Amir A; Bamberg, Fabian; Toepker, Michael; Schlett, Christopher L; Rogers, Ian S; Nagurney, John T; Brady, Thomas J; Hoffmann, Udo; Truong, Quynh A

    2009-10-01

    Aortic valve calcification (AVC) is associated with cardiovascular risk factors and coronary artery calcification. We sought to determine whether AVC is associated with the presence and extent of overall plaque burden, as well as with plaque composition (calcified, mixed, and noncalcified). We examined 357 subjects (mean age 53 +/- 12 years, 61% male) who underwent contrast-enhanced electrocardiogram-gated 64-slice multidetector computed tomography from the ROMICAT trial for the assessment of presence and extent of coronary plaque burden according to the 17-coronary segment model and presence of AVC. Patients with AVC (n = 37, 10%) were more likely than those without AVC (n = 320, 90%) to have coexisting presence of any coronary plaque (89% vs 46%, P < .001) and had a greater extent of coronary plaque burden (6.4 vs 1.8 segments, P < .001). Those with AVC had >3-fold increased odds of having any plaque (adjusted odds ratio [OR] 3.6, P = .047) and an increase of 2.5 segments of plaque (P < .001) as compared to those without AVC. When stratified by plaque composition, AVC was associated most with calcified plaque (OR 5.2, P = .004), then mixed plaque (OR 3.2, P = .02), but not with noncalcified plaque (P = .96). Aortic valve calcification is associated with the presence and greater extent of coronary artery plaque burden and may be part of the later stages of the atherosclerosis process, as its relation is strongest with calcified plaque, less with mixed plaque, and nonsignificant with noncalcified plaque. If AVC is present, consideration for aggressive medical therapy may be warranted.
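    An unadjusted odds ratio can be reconstructed from the percentages above (89% of the 37 AVC subjects versus 46% of the 320 without AVC had plaque, giving approximate counts of 33/4 and 147/173); the paper's adjusted OR of 3.6 additionally controls for covariates, so the two figures differ. A sketch using the Woolf confidence interval:

```python
import math

def odds_ratio(a, b, c, d):
    # Unadjusted odds ratio for a 2x2 table with a Woolf 95% confidence interval.
    # Rows: with/without exposure (AVC); columns: outcome present/absent (plaque).
    or_ = (a * d) / (b * c)
    half = 1.96 * math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return or_, or_ * math.exp(-half), or_ * math.exp(half)

# Counts reconstructed from the abstract's percentages (approximate).
or_, lo, hi = odds_ratio(33, 4, 147, 173)
```

    With only 4 AVC subjects free of plaque, the interval is wide, which is consistent with the borderline P = .047 reported for the adjusted estimate.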

  10. Incidental findings in imaging research: evaluating incidence, benefit, and burden.

    PubMed

    Orme, Nicholas M; Fletcher, Joel G; Siddiki, Hassan A; Harmsen, W Scott; O'Byrne, Megan M; Port, John D; Tremaine, William J; Pitot, Henry C; McFarland, Elizabeth G; Robinson, Marguerite E; Koenig, Barbara A; King, Bernard F; Wolf, Susan M

    2010-09-27

    Little information exists concerning the frequency and medical significance of incidental findings (IFs) in imaging research. Medical records of research participants undergoing a research imaging examination interpreted by a radiologist during January through March 2004 were reviewed, with 3-year clinical follow-up. An expert panel reviewed all IFs generating clinical action to determine medical benefit/burden on the basis of predefined criteria. The frequency of IFs that generated further clinical action was estimated by modality, body part, age, and sex, along with net medical benefit or burden. Of 1426 research imaging examinations, 567 (39.8%) had at least 1 IF (1055 total). Risk of an IF increased significantly by age (odds ratio [OR], 1.5; 95% confidence interval, 1.4-1.7 per decade increase). Abdominopelvic computed tomography generated more IFs than other examinations (OR, 18.9 vs ultrasonography; 9.2% with subsequent clinical action), with computed tomography of the thorax and magnetic resonance imaging of the head next (OR, 11.9 and 5.9; 2.8% and 2.2% with action, respectively). Of the 567 examinations with an IF, 35 (6.2%) generated clinical action, resulting in clear medical benefit in 1.1% (6 of 567) and clear medical burden in 0.5% (3 of 567). Medical benefit/burden was usually unclear (26 of 567 [4.6%]). Frequency of IFs in imaging research examinations varies significantly by imaging modality, body region, and age. Research imaging studies at high risk for generating IFs can be identified. Routine evaluation of research images by radiologists may result in identification of IFs in a high number of cases and subsequent clinical action to address them in a small but significant minority. Such clinical action can result in medical benefit to a small number of patients.

  11. A multinational review of recent trends and reports in dementia caregiver burden.

    PubMed

    Torti, Frank M; Gwyther, Lisa P; Reed, Shelby D; Friedman, Joëlle Y; Schulman, Kevin A

    2004-01-01

    This systematic review of the literature focuses on the influence of ethnic, cultural, and geographic factors on the caregivers of patients with dementia. In particular, we explore the impact of cultural expectations on five important questions: 1) Do the characteristics of dementia affect caregiver burden? 2) Do characteristics of the caregiver independently predict burden? 3) Does the caregiver affect patient outcomes? 4) Does support or intervention for the caregiver result in reduced caregiver burden or improved patient outcomes? 5) Finally, do patient interventions result in reduced caregiver burden or improved patient outcomes? Our findings suggest that noncognitive, behavioral disturbances of patients with dementia result in increased caregiver burden and that female caregivers bear a particularly heavy burden across cultures, particularly in Asian societies. Caregiver burden influences time to medical presentation of patients with dementia, patient condition at presentation, and patient institutionalization. Moreover, interventions designed to reduce caregiver burden have been largely, although not universally, unsuccessful. Pharmacological treatments for symptoms of dementia were found to be beneficial in reducing caregiver burden. The consistency of findings across studies, geographic regions, cultural differences, and healthcare delivery systems is striking. Yet, there are critical differences in cultural expectations and social resources. Future interventions to reduce caregiver burden must consider these differences, identify patients and caregivers at greatest risk, and develop targeted programs that combine aspects of a number of interventional strategies.

  12. Real-time, autonomous precise satellite orbit determination using the global positioning system

    NASA Astrophysics Data System (ADS)

    Goldstein, David Ben

    2000-10-01

    The desire for autonomously generated, rapidly available, and highly accurate satellite ephemeris is growing with the proliferation of constellations of satellites and the cost and overhead of ground tracking resources. Autonomous Orbit Determination (OD) may be done on the ground in a post-processing mode or in real-time on board a satellite and may be accomplished days, hours or immediately after observations are processed. The Global Positioning System (GPS) is now widely used as an alternative to ground tracking resources to supply observation data for satellite positioning and navigation. GPS is accurate, inexpensive, provides continuous coverage, and is an excellent choice for autonomous systems. In an effort to estimate precise satellite ephemeris in real-time on board a satellite, the Goddard Space Flight Center (GSFC) created the GPS Enhanced OD Experiment (GEODE) flight navigation software. This dissertation offers alternative methods and improvements to GEODE to increase on board autonomy and real-time total position accuracy and precision without increasing computational burden. First, GEODE is modified to include a Gravity Acceleration Approximation Function (GAAF) to replace the traditional spherical harmonic representation of the gravity field. Next, an ionospheric correction method called Differenced Range Versus Integrated Doppler (DRVID) is applied to correct for ionospheric errors in the GPS measurements used in GEODE. Then, Dynamic Model Compensation (DMC) is added to estimate unmodeled and/or mismodeled forces in the dynamic model and to provide an alternative process noise variance-covariance formulation. Finally, a Genetic Algorithm (GA) is implemented in the form of Genetic Model Compensation (GMC) to optimize DMC forcing noise parameters. 
Application of GAAF, DRVID and DMC improved GEODE's position estimates by 28.3% when applied to GPS/MET data collected in the presence of Selective Availability (SA), 17.5% when SA is removed from the GPS/MET data, and 10.8% on SA-free TOPEX data. Position estimates with RSS errors below 1 meter are now achieved using SA-free TOPEX data. DRVID increases the computational burden, while GAAF and DMC reduce it. The net effect of applying GAAF, DRVID and DMC is an improvement in GEODE's accuracy and precision without an increase in computational burden.
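
    Dynamic Model Compensation is typically built on a first-order Gauss-Markov model of the unmodeled acceleration, i.e. exponentially correlated process noise. The sketch below is an illustrative stand-in, not GEODE's actual implementation; the correlation time `tau`, noise level `sigma`, and step size are assumed values.

```python
import math
import random

def gauss_markov_step(a_prev, dt, tau, sigma, rng):
    """One discrete step of a first-order Gauss-Markov acceleration:
    a_k = phi * a_{k-1} + w_k, with phi = exp(-dt/tau) and
    Var(w_k) = sigma^2 * (1 - phi^2), so the process stays stationary
    with steady-state standard deviation sigma."""
    phi = math.exp(-dt / tau)
    w = rng.gauss(0.0, sigma * math.sqrt(1.0 - phi * phi))
    return phi * a_prev + w

rng = random.Random(42)
tau, sigma, dt = 600.0, 1e-7, 10.0  # assumed: correlation time [s], noise [km/s^2], step [s]
a = 0.0
samples = []
for _ in range(10_000):
    a = gauss_markov_step(a, dt, tau, sigma, rng)
    samples.append(a)

mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
print(mean, var)  # sample variance should approach sigma^2 = 1e-14
```

    In a filter such as GEODE's, the compensating acceleration is appended to the state vector and the same correlation model supplies the process-noise covariance.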

  13. Virtual Cloud Computing: Effects and Application of Hastily Formed Networks (HFN) for Humanitarian Assistance/Disaster Relief (HA/DR) Missions

    DTIC Science & Technology

    2011-09-01

    Virtual Cloud Computing: Effects and Application of Hastily Formed Networks (HFN) for Humanitarian Assistance/Disaster Relief (HA/DR) Missions, by Mark K. Morris.

  14. Side effect burden of antipsychotic drugs in real life - Impact of gender and polypharmacy.

    PubMed

    Iversen, Trude Seselie Jahr; Steen, Nils Eiel; Dieset, Ingrid; Hope, Sigrun; Mørch, Ragni; Gardsjord, Erlend Strand; Jørgensen, Kjetil Nordbø; Melle, Ingrid; Andreassen, Ole A; Molden, Espen; Jönsson, Erik G

    2018-03-02

    Antipsychotic-associated side effects are well known and represent a significant treatment challenge. Still, few large studies have investigated the overall side effect burden of antipsychotics in real-life settings. To describe the occurrence of side effects and perceived burden of antipsychotics in a large naturalistic sample, taking polypharmacy and patient characteristics into account. Patients (n=1087) with psychotic disorders were assessed for side effects using the Udvalg for Kliniske Undersøgelser (UKU) side effect rating scale in addition to assessment of clinical and pharmacological data. Statistical analyses were performed controlling for possible confounding factors. Use of antipsychotics showed significant associations to neurologic and sexual symptoms, sedation and weight gain, and >75% of antipsychotic users reported side effects. More side effects were observed in patients using several antipsychotics (p=0.002), with increasing total dose (p=0.021), and with antipsychotics in combination with other psychotropic drugs. Patients and investigators evaluated the side effect burden differently, particularly related to severity, gender and antipsychotic dose. Twice as many females described the side effect burden as severe (p=0.004). Patients with psychotic disorders have a high occurrence of symptoms associated with use of antipsychotics, and polypharmacy and female gender are seemingly risk factors for reporting a severe side effect burden. Due to the cross-sectional design, evaluation of causality is tentative, and these findings should be further investigated in prospective studies. Copyright © 2017 Elsevier Inc. All rights reserved.

  15. GPU-based Space Situational Awareness Simulation utilising Parallelism for Enhanced Multi-sensor Management

    NASA Astrophysics Data System (ADS)

    Hobson, T.; Clarkson, V.

    2012-09-01

    As a result of continual space activity since the 1950s, there are now a large number of man-made Resident Space Objects (RSOs) orbiting the Earth. Because of the large number of objects and their relative speeds, the possibility of destructive collisions involving important space assets is now of significant concern to users and operators of space-borne technologies. As a result, a growing number of international agencies are researching methods for improving techniques to maintain Space Situational Awareness (SSA). Computer simulation is a method commonly used by many countries to validate competing methodologies prior to full-scale adoption. The use of supercomputing and/or reduced-scale testing is often necessary to effectively simulate such a complex problem on today's computers. Recently, the authors presented a simulation aimed at reducing the computational burden by selecting the minimum level of fidelity necessary for contrasting methodologies and by utilising multi-core CPU parallelism for increased computational efficiency. The resulting simulation runs on a single PC while maintaining the ability to effectively evaluate competing methodologies. Nonetheless, the ability to control the scale and expand upon the computational demands of the sensor management system is limited. In this paper, we examine the advantages of increasing the parallelism of the simulation by means of General Purpose computing on Graphics Processing Units (GPGPU). As many sub-processes pertaining to SSA management are independent, we demonstrate how parallelisation via GPGPU has the potential to significantly enhance not only research into techniques for maintaining SSA, but also the level of sophistication of existing space surveillance sensors and sensor management systems. Nonetheless, the use of GPGPU imposes certain limitations and adds to the implementation complexity, both of which require consideration to achieve an effective system. 
We discuss these challenges and how they can be overcome. We further describe an application of the parallelised system where visibility prediction is used to enhance sensor management. This facilitates significant improvement in maximum catalogue error when RSOs become temporarily unobservable. The objective is to demonstrate the enhanced scalability and increased computational capability of the system.
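
    Much of the per-object work in catalogue maintenance is independent across RSOs, which is what makes the problem attractive for GPGPU. As a minimal illustration of that data parallelism (not the authors' simulation; the toy catalogue and the circular-orbit propagation model are assumptions), each loop iteration below is independent and could become one GPU thread:

```python
import math

MU = 398600.4418  # Earth's gravitational parameter [km^3/s^2]

def propagate_mean_anomaly(a_km, m0, t):
    """Two-body, circular-orbit mean-anomaly update for one object:
    M(t) = M0 + n*t, with mean motion n = sqrt(mu/a^3)."""
    n = math.sqrt(MU / a_km**3)
    return (m0 + n * t) % (2.0 * math.pi)

# Each catalogue entry is independent, so this loop is embarrassingly
# parallel: on a GPU, each iteration would map onto its own thread.
catalogue = [(6700.0 + 350.0 * i, 0.1 * i) for i in range(100)]  # (a [km], M0), illustrative
state = [propagate_mean_anomaly(a, m0, 3600.0) for a, m0 in catalogue]
print(len(state))  # 100
```

    Real SSA pipelines add perturbations, measurement updates, and visibility checks per object, but the thread-per-RSO decomposition is the same.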

  16. Burden of Fasciola hepatica Infection among children from Paucartambo in Cusco, Peru.

    PubMed

    Lopez, Martha; White, A Clinton; Cabada, Miguel M

    2012-03-01

    There is a high prevalence of fascioliasis in the Peruvian highlands, but most cases remain undiagnosed. The burden of disease caused by chronic subclinical infection is largely unknown. We studied school-age children from a district in Paucartambo Province in Cusco, Peru to evaluate the burden of disease caused by subclinical fascioliasis. Parasite eggs and/or larvae were identified in 46.2% of subjects, including Fasciola hepatica in 10.3% of subjects. Fascioliasis was independently associated with anemia (adjusted odds ratio = 3.01 [1.10-8.23]). Subclinical fascioliasis was common among children and strongly associated with anemia. Anemia should be recognized as an important component of the burden of disease from fascioliasis.

  17. OSA Imaging and Applied Optics Congress Support

    DTIC Science & Technology

    2017-02-16

    The Imaging and Applied Optics Congress was a four-day meeting that encompassed the latest advances in computational imaging research, ranging from theoretical work to experimental demonstration and verification, and emphasizing integration of …

  18. Cost of speech-language interventions for children and youth with foetal alcohol spectrum disorder in Canada.

    PubMed

    Popova, Svetlana; Lange, Shannon; Burd, Larry; Shield, Kevin; Rehm, Jürgen

    2014-12-01

    This study, which is part of a large economic project on the overall burden and cost associated with Foetal Alcohol Spectrum Disorder (FASD) in Canada, estimated the cost of 1:1 speech-language interventions among children and youth with FASD for Canada in 2011. The number of children and youth with FASD and speech-language disorder(s) (SLD), the distribution of the level of severity, and the number of hours needed to treat were estimated using data from the available literature. The cost of 1:1 speech-language interventions was computed using the average cost per hour for speech-language pathologists. It was estimated that approximately 37,928 children and youth with FASD had SLD in Canada in 2011. Using the most conservative approach, the annual cost of 1:1 speech-language interventions among children and youth with FASD is substantial, ranging from $72.5 million to $144.1 million Canadian dollars. Speech-language pathologists should be aware of the disproportionate number of children and youth with FASD who have SLD and the need for early identification to improve access to early intervention. Early identification and access to high-quality services may have a role in decreasing the risk of developing secondary disabilities and in reducing the economic burden of FASD on society.
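
    The costing approach described is a simple prevalence × treatment intensity × unit cost product. A minimal sketch: only the 37,928 figure comes from the abstract; the hours of therapy and hourly rate below are illustrative placeholders, not the study's severity-specific inputs.

```python
def intervention_cost(n_children, hours_per_child, hourly_rate):
    """Total cost of 1:1 intervention: a prevalence x intensity x
    unit-cost product, as used in burden-of-cost studies."""
    return n_children * hours_per_child * hourly_rate

n = 37_928  # children/youth with FASD and SLD (from the abstract)
# Hours of therapy and the hourly rate are assumed placeholder values:
low = intervention_cost(n, 20, 95.0)
high = intervention_cost(n, 40, 95.0)
print(low, high)
```

    Varying the hours-needed-to-treat assumption is what produces a cost range rather than a point estimate.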

  19. Shifting the burden: the private sector's response to the AIDS epidemic in Africa.

    PubMed

    Rosen, Sydney; Simon, Jonathon L

    2003-01-01

    As the economic burden of human immunodeficiency virus/acquired immunodeficiency syndrome (HIV/AIDS) increases in sub-Saharan Africa, allocation of the burden among levels and sectors of society is changing. The private sector has more scope to avoid the economic burden of AIDS than governments, households, or nongovernmental organizations, and the burden is being systematically shifted away from the private sector. Common practices that transfer the burden to households and government include pre-employment screening, reductions in employee benefits, restructured employment contracts, outsourcing of low skilled jobs, selective retrenchments, and changes in production technologies. Between 1997 and 1999 more than two-thirds of large South African employers reduced the level of health care benefits or increased employee contributions. Most firms also have replaced defined-benefit retirement funds, which expose the firm to large annual costs but provide long-term support for families, with defined-contribution funds, which eliminate risks to the firm but provide little for families of younger workers who die of AIDS. Contracting out previously permanent jobs is also shielding firms from benefit and turnover costs, effectively shifting the responsibility to care for affected workers and their families to households, nongovernmental organizations, and the government. Many of these changes are responses to globalization that would have occurred in the absence of AIDS, but they are devastating for the households of employees with HIV/AIDS. We argue that the shift in the economic burden of AIDS is a predictable response by business to which a deliberate public policy response is needed. Countries should make explicit decisions about each sector's responsibilities if a socially desirable allocation is to be achieved.

  20. Shifting the burden: the private sector's response to the AIDS epidemic in Africa.

    PubMed Central

    Rosen, Sydney; Simon, Jonathon L.

    2003-01-01

    As the economic burden of human immunodeficiency virus/acquired immunodeficiency syndrome (HIV/AIDS) increases in sub-Saharan Africa, allocation of the burden among levels and sectors of society is changing. The private sector has more scope to avoid the economic burden of AIDS than governments, households, or nongovernmental organizations, and the burden is being systematically shifted away from the private sector. Common practices that transfer the burden to households and government include pre-employment screening, reductions in employee benefits, restructured employment contracts, outsourcing of low skilled jobs, selective retrenchments, and changes in production technologies. Between 1997 and 1999 more than two-thirds of large South African employers reduced the level of health care benefits or increased employee contributions. Most firms also have replaced defined-benefit retirement funds, which expose the firm to large annual costs but provide long-term support for families, with defined-contribution funds, which eliminate risks to the firm but provide little for families of younger workers who die of AIDS. Contracting out previously permanent jobs is also shielding firms from benefit and turnover costs, effectively shifting the responsibility to care for affected workers and their families to households, nongovernmental organizations, and the government. Many of these changes are responses to globalization that would have occurred in the absence of AIDS, but they are devastating for the households of employees with HIV/AIDS. We argue that the shift in the economic burden of AIDS is a predictable response by business to which a deliberate public policy response is needed. Countries should make explicit decisions about each sector's responsibilities if a socially desirable allocation is to be achieved. PMID:12751421

  1. Chemotherapy appointment scheduling under uncertainty using mean-risk stochastic integer programming.

    PubMed

    Alvarado, Michelle; Ntaimo, Lewis

    2018-03-01

    Oncology clinics are often burdened with scheduling large volumes of cancer patients for chemotherapy treatments under limited resources such as the number of nurses and chairs. These cancer patients require a series of appointments over several weeks or months, and the timing of these appointments is critical to the treatment's effectiveness. Additionally, the appointment duration, the acuity levels of each appointment, and the availability of clinic nurses are uncertain. The timing constraints, stochastic parameters, rising treatment costs, and increased demand for outpatient oncology clinic services motivate the need for efficient appointment schedules and clinic operations. In this paper, we develop three mean-risk stochastic integer programming (SIP) models, referred to as SIP-CHEMO, for the problem of scheduling individual chemotherapy patient appointments and resources. These mean-risk models are presented and an algorithm is devised to improve computational speed. Computational experiments were conducted using a simulation model, and the results indicate that the risk-averse SIP-CHEMO model with the expected excess mean-risk measure can decrease patient waiting times and nurse overtime by 42% and 27%, respectively, when compared with deterministic scheduling algorithms.
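
    The expected-excess mean-risk measure combines the expected cost with a penalty on scenario costs above a fixed target, E[c] + λ·E[(c − η)⁺]. A minimal sketch of that objective with illustrative scenario data, not the paper's SIP-CHEMO model:

```python
def mean_risk_expected_excess(costs, probs, target, lam):
    """Mean-risk objective E[c] + lambda * E[(c - target)^+]: expected
    cost plus a weighted penalty on the expected amount by which
    scenario costs exceed the target."""
    expected = sum(p * c for p, c in zip(probs, costs))
    excess = sum(p * max(c - target, 0.0) for p, c in zip(probs, costs))
    return expected + lam * excess

# Illustrative scenarios (assumed, not from the paper): cost and probability
costs = [100.0, 120.0, 150.0, 200.0]
probs = [0.4, 0.3, 0.2, 0.1]
obj = mean_risk_expected_excess(costs, probs, target=130.0, lam=2.0)
print(obj)  # 126.0 expected + 2 * 11.0 excess = 148.0
```

    Raising λ makes the schedule more risk-averse: solutions that occasionally blow past the target are penalized even if their average cost is low.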

  2. Cellular burdens and biological effects on tissue level caused by inhaled radon progenies.

    PubMed

    Madas, B G; Balásházy, I; Farkas, Á; Szoke, I

    2011-02-01

    In the case of radon exposure, the spatial distribution of deposited radioactive particles is highly inhomogeneous in the central airways. The objective of this research is to investigate the consequences of this heterogeneity for cellular burdens in the bronchial epithelium and to study the possible biological effects at the tissue level. Applying computational fluid and particle dynamics techniques, the deposition distribution of inhaled radon daughters has been determined in a bronchial airway model for 23 min of work in the New Mexico uranium mine, corresponding to 0.0129 WLM exposure. A numerical epithelium model based on experimental data has been utilised to quantify cellular hits and doses. Finally, a carcinogenesis model considering cell death-induced cell-cycle shortening has been applied to assess the biological responses. The present computations reveal that cellular dose may reach 1.5 Gy, which is several orders of magnitude higher than tissue dose. The results are in agreement with the histological finding that the uneven deposition distribution of radon progenies may lead to an inhomogeneous spatial distribution of tumours in the bronchial airways. In addition, at the macroscopic level, the relationship between cancer risk and radiation burden appears to be non-linear.
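
    The gap between cellular and tissue dose can be illustrated with a back-of-the-envelope microdosimetry calculation: a few alpha-particle traversals deposit enough energy in a nucleus-sized mass to reach doses near the 1.5 Gy reported. All parameter values below are assumed typical figures, not the paper's:

```python
import math

MEV_TO_J = 1.602176634e-13  # joules per MeV

def cellular_dose_gy(n_hits, let_kev_per_um, chord_um, nucleus_radius_um):
    """Absorbed dose to a cell nucleus from alpha traversals:
    dose [Gy] = deposited energy [J] / nucleus mass [kg].  Energy per
    traversal is approximated as LET x chord length, a common
    microdosimetry shortcut."""
    energy_j = n_hits * let_kev_per_um * chord_um * 1e-3 * MEV_TO_J
    volume_m3 = (4.0 / 3.0) * math.pi * (nucleus_radius_um * 1e-6) ** 3
    mass_kg = 1000.0 * volume_m3  # unit-density tissue
    return energy_j / mass_kg

# Assumed values: LET ~100 keV/um, 8 um chord, 4 um nucleus radius,
# 3 traversals at a deposition hot spot.
dose = cellular_dose_gy(3, 100.0, 8.0, 4.0)
print(round(dose, 2))  # ~1.43 Gy
```

    Averaging the same deposited energy over grams of tissue instead of picograms of nucleus is what makes tissue dose orders of magnitude smaller.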

  3. Quantification of source region influences on the ozone burden

    NASA Astrophysics Data System (ADS)

    Treffeisen, Renate; Grunow, Katja; Möller, Detlev; Hainsch, Andreas

    A project was performed to quantify different influences on the ozone burden. It could be shown that large-scale meteorological influences determine a very large percentage of the ozone concentration; local measures intended to reduce peak ozone concentrations in summer consequently turn out to be not very effective. The aim of this paper is to quantify regional emission influences on the ozone burden. The investigation of these influences is possible by comparing the ozone (O3) and oxidant (Ox = O3 + NO2) concentrations at high-elevation sites downwind and upwind of a source region using back trajectories. It has been shown that a separation between the large-scale meteorological and regional ozone burdens at these sites is possible. This method is applied to an important emission area in Germany, the Ruhrgebiet. On average, no significant ozone contribution of this area to the regional ozone concentration could be found. A large part of the ozone concentration is highly correlated with synoptic weather systems, which exert a dominant influence on the local ozone concentrations. Significant contributions of related photochemical ozone formation in the source area of 13-15% have been found only during favourable meteorological situations, identified by an hourly maximum day temperature above 25°C. This is important with respect to the EU daughter directive to EU 96/62/EC (Official Journal L296 (1996) 55) because Member States should explore the possibilities of local measures to avoid the exceedance of threshold values and, if effective local measures exist, implement them.

  4. An Efficient Multiscale Finite-Element Method for Frequency-Domain Seismic Wave Propagation

    DOE PAGES

    Gao, Kai; Fu, Shubin; Chung, Eric T.

    2018-02-13

    The frequency-domain seismic-wave equation, that is, the Helmholtz equation, has many important applications in seismological studies, yet is very challenging to solve, particularly for large geological models. Iterative solvers, domain decomposition, or parallel strategies can partially alleviate the computational burden, but these approaches may still encounter nontrivial difficulties in complex geological models where a sufficiently fine mesh is required to represent the fine-scale heterogeneities. We develop a novel numerical method to solve the frequency-domain acoustic wave equation on the basis of the multiscale finite-element theory. We discretize a heterogeneous model with a coarse mesh and employ carefully constructed high-order multiscale basis functions to form the basis space for the coarse mesh. Solved from medium- and frequency-dependent local problems, these multiscale basis functions can effectively capture the medium's fine-scale heterogeneity and the source's frequency information, leading to a discrete system matrix with a much smaller dimension compared with those from conventional methods. We then obtain an accurate solution to the acoustic Helmholtz equation by solving only a small linear system instead of a large linear system constructed on the fine mesh in conventional methods. We verify our new method using several models of complicated heterogeneities, and the results show that our new multiscale method can solve the Helmholtz equation in complex models with high accuracy and extremely low computational costs.
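
    For reference, the frequency-domain acoustic wave equation referred to here is the heterogeneous Helmholtz equation, which in one standard form (notation assumed here, not necessarily the authors') reads:

```latex
\nabla^{2} u(\mathbf{x}) + \frac{\omega^{2}}{c^{2}(\mathbf{x})}\, u(\mathbf{x}) = f(\mathbf{x}),
\qquad
u \;\approx\; u_{H} = \sum_{i} c_{i}\, \phi_{i}^{\mathrm{ms}},
```

    where c(x) is the heterogeneous wave speed, ω the angular frequency, f the source, and the multiscale basis functions φᵢᵐˢ are obtained from medium- and frequency-dependent local problems on the coarse elements, so the global system is assembled only in the coefficients cᵢ.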

  5. An Efficient Multiscale Finite-Element Method for Frequency-Domain Seismic Wave Propagation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gao, Kai; Fu, Shubin; Chung, Eric T.

    The frequency-domain seismic-wave equation, that is, the Helmholtz equation, has many important applications in seismological studies, yet is very challenging to solve, particularly for large geological models. Iterative solvers, domain decomposition, or parallel strategies can partially alleviate the computational burden, but these approaches may still encounter nontrivial difficulties in complex geological models where a sufficiently fine mesh is required to represent the fine-scale heterogeneities. We develop a novel numerical method to solve the frequency-domain acoustic wave equation on the basis of the multiscale finite-element theory. We discretize a heterogeneous model with a coarse mesh and employ carefully constructed high-order multiscale basis functions to form the basis space for the coarse mesh. Solved from medium- and frequency-dependent local problems, these multiscale basis functions can effectively capture the medium's fine-scale heterogeneity and the source's frequency information, leading to a discrete system matrix with a much smaller dimension compared with those from conventional methods. We then obtain an accurate solution to the acoustic Helmholtz equation by solving only a small linear system instead of a large linear system constructed on the fine mesh in conventional methods. We verify our new method using several models of complicated heterogeneities, and the results show that our new multiscale method can solve the Helmholtz equation in complex models with high accuracy and extremely low computational costs.

  6. Making Classical Ground State Spin Computing Fault-Tolerant

    DTIC Science & Technology

    2010-06-24

    "…approaches to perebor (brute-force searches) algorithms," IEEE Annals of the History of Computing 6, 384–400 (1984); D. Bacon and S. T. Flammia, "Adiabatic gate teleportation," Phys. Rev. Lett. 103, 120504 (2009); D. Bacon and S. T. Flammia, "Adiabatic cluster state quantum computing…"

  7. A case study of tuning MapReduce for efficient Bioinformatics in the cloud

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shi, Lizhen; Wang, Zhong; Yu, Weikuan

    The combination of the Hadoop MapReduce programming model and cloud computing allows biological scientists to analyze next-generation sequencing (NGS) data in a timely and cost-effective manner. Cloud computing platforms remove the burden of IT facility procurement and management from end users and provide ease of access to Hadoop clusters. However, biological scientists are still expected to choose appropriate Hadoop parameters for running their jobs. More importantly, the available Hadoop tuning guidelines are either obsolete or too general to capture the particular characteristics of bioinformatics applications. In this paper, we aim to minimize the cloud computing cost spent on bioinformatics data analysis by optimizing the extracted significant Hadoop parameters. When using MapReduce-based bioinformatics tools in the cloud, the default settings often lead to resource underutilization and wasteful expenses. We choose k-mer counting, a representative application used in a large number of NGS data analysis tools, as our study case. Experimental results show that, with the fine-tuned parameters, we achieve a total of 4× speedup compared with the original performance (using the default settings). Finally, this paper presents an exemplary case for tuning MapReduce-based bioinformatics applications in the cloud, and documents the key parameters that could lead to significant performance benefits.
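
    The k-mer counting study case maps naturally onto MapReduce: mappers emit (k-mer, count) pairs from each read and reducers sum counts per k-mer. A toy, in-process sketch of that logic (illustrative only, not the actual Hadoop job):

```python
from collections import Counter
from functools import reduce

def mapper(read, k):
    """Map phase: emit every k-mer in one read, with a count of 1 each,
    gathered into a Counter of partial counts."""
    return Counter(read[i:i + k] for i in range(len(read) - k + 1))

def reducer(c1, c2):
    """Reduce phase: merge two partial k-mer count tables by summing."""
    return c1 + c2

reads = ["GATTACA", "ATTACAG", "TTACAGA"]  # toy reads, illustrative
counts = reduce(reducer, (mapper(r, 3) for r in reads))
print(counts["TTA"])  # 3
```

    In a real Hadoop job, the shuffle phase groups equal k-mers across mappers before the reducers sum them; parameters governing that shuffle (buffer sizes, reducer count, compression) are exactly the kind the paper tunes.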

  8. GenomeVIP: a cloud platform for genomic variant discovery and interpretation

    PubMed Central

    Mashl, R. Jay; Scott, Adam D.; Huang, Kuan-lin; Wyczalkowski, Matthew A.; Yoon, Christopher J.; Niu, Beifang; DeNardo, Erin; Yellapantula, Venkata D.; Handsaker, Robert E.; Chen, Ken; Koboldt, Daniel C.; Ye, Kai; Fenyö, David; Raphael, Benjamin J.; Wendl, Michael C.; Ding, Li

    2017-01-01

    Identifying genomic variants is a fundamental first step toward the understanding of the role of inherited and acquired variation in disease. The accelerating growth in the corpus of sequencing data that underpins such analysis is making the data-download bottleneck more evident, placing substantial burdens on the research community to keep pace. As a result, the search for alternative approaches to the traditional “download and analyze” paradigm on local computing resources has led to a rapidly growing demand for cloud-computing solutions for genomics analysis. Here, we introduce the Genome Variant Investigation Platform (GenomeVIP), an open-source framework for performing genomics variant discovery and annotation using cloud- or local high-performance computing infrastructure. GenomeVIP orchestrates the analysis of whole-genome and exome sequence data using a set of robust and popular task-specific tools, including VarScan, GATK, Pindel, BreakDancer, Strelka, and Genome STRiP, through a web interface. GenomeVIP has been used for genomic analysis in large-data projects such as the TCGA PanCanAtlas and in other projects, such as the ICGC Pilots, CPTAC, ICGC-TCGA DREAM Challenges, and the 1000 Genomes SV Project. Here, we demonstrate GenomeVIP's ability to provide high-confidence annotated somatic, germline, and de novo variants of potential biological significance using publicly available data sets. PMID:28522612

  9. Burden of potentially pathologic copy number variants is higher in children with isolated congenital heart disease and significantly impairs covariate-adjusted transplant-free survival.

    PubMed

    Kim, Daniel Seung; Kim, Jerry H; Burt, Amber A; Crosslin, David R; Burnham, Nancy; Kim, Cecilia E; McDonald-McGinn, Donna M; Zackai, Elaine H; Nicolson, Susan C; Spray, Thomas L; Stanaway, Ian B; Nickerson, Deborah A; Heagerty, Patrick J; Hakonarson, Hakon; Gaynor, J William; Jarvik, Gail P

    2016-04-01

    Copy number variants (CNVs) are duplications or deletions of genomic regions. Large CNVs are potentially pathogenic and are overrepresented in children with congenital heart disease (CHD). We sought to determine the frequency of large CNVs in children with isolated CHD, and to evaluate the relationship of these potentially pathogenic CNVs with transplant-free survival. These cases are derived from a prospective cohort of patients with nonsyndromic CHD (n = 422) identified before first surgery. Healthy pediatric controls (n = 500) were obtained from the electronic Medical Records and Genetic Epidemiology Network, and CNV frequency was contrasted for CHD cases and controls. CNVs were determined algorithmically; subsequently screened for >95% overlap between 2 methods, size (>300 kb), quality score, overlap with a gene, and novelty (absent from databases of known, benign CNVs); and separately validated by quantitative polymerase chain reaction. Survival likelihoods for cases were calculated using Cox proportional hazards modeling to evaluate the joint effect of CNV burden and known confounders on transplant-free survival. Children with nonsyndromic CHD had a higher burden of potentially pathogenic CNVs compared with pediatric controls (12.1% vs 5.0%; P = .00016). Presence of a CNV was associated with significantly decreased transplant-free survival after surgery (hazard ratio, 3.42; 95% confidence interval, 1.66-7.09; P = .00090) with confounder adjustment. We confirm that children with isolated CHD have a greater burden of rare/large CNVs. We report a novel finding that these CNVs are associated with an adjusted 2.55-fold increased risk of death or transplant. These data suggest that CNV burden is an important modifier of survival after surgery for CHD. Copyright © 2016 The American Association for Thoracic Surgery. Published by Elsevier Inc. All rights reserved.
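
    The case-control contrast reported above can be roughly reconstructed as an odds ratio. The counts below are approximate back-calculations from the stated percentages (12.1% of 422 cases, 5.0% of 500 controls), not the study's exact table:

```python
def odds_ratio(a, b, c, d):
    """Odds ratio for a 2x2 table: (a/b) / (c/d), where a/b are
    carriers/non-carriers among cases and c/d among controls."""
    return (a / b) / (c / d)

# Approximate counts reconstructed from the reported percentages:
cnv_cases, cases = 51, 422        # ~12.1% of 422 CHD cases
cnv_controls, controls = 25, 500  # 5.0% of 500 controls
or_ = odds_ratio(cnv_cases, cases - cnv_cases, cnv_controls, controls - cnv_controls)
print(round(or_, 2))  # ~2.61
```

    Note this unadjusted odds ratio describes CNV prevalence; the 3.42 figure in the abstract is a covariate-adjusted hazard ratio for transplant-free survival from the Cox model, a different quantity.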

  10. Patients' self-perceived burden, caregivers' burden and quality of life for amyotrophic lateral sclerosis patients: a cross-sectional study.

    PubMed

    Geng, Dan; Ou, RuWei; Miao, XiaoHui; Zhao, LiHong; Wei, QianQian; Chen, XuePing; Liang, Yan; Shang, HuiFang; Yang, Rong

    2017-10-01

    This study surveys the quality of life of amyotrophic lateral sclerosis patients and the factors associated with amyotrophic lateral sclerosis patients' self-perceived burden and their caregivers' burden. Burdens of patients with amyotrophic lateral sclerosis and their caregivers in the Chinese population are largely unknown. A cross-sectional study was conducted among 81 pairs of amyotrophic lateral sclerosis patients and their caregivers. Amyotrophic lateral sclerosis patients' self-perceived burden and caregivers' burden were assessed by the Self-Perceived Burden Scale and Zarit-Burden Interview, respectively. Quality of life of amyotrophic lateral sclerosis patients was measured using the World Health Organization Quality of Life-Bref. The amyotrophic lateral sclerosis Functional Rating Scale-Revised questionnaire was used to estimate patients' physical function. Both patients and caregivers reported a mild to moderate burden. The World Health Organization Quality of Life-Bref scores were decreased in respondents with lower amyotrophic lateral sclerosis Functional Rating Scale-Revised, higher Self-Perceived Burden Scale and higher Zarit-Burden Interview scores. Self-Perceived Burden Scale scores were associated with patients' knowledge of amyotrophic lateral sclerosis, respiratory function and female sex. Zarit-Burden Interview scores were associated with caregivers' age, patients' motor function and out-of-pocket payment. With increases in amyotrophic lateral sclerosis patients' self-perceived burden and caregivers' burden, quality of life of amyotrophic lateral sclerosis patients decreased. Female patients, patients who knew more about the disease, and those with severe respiratory dysfunction were subject to a higher self-perceived burden. Older caregivers and caregivers of patients with severe motor dysfunction and more out-of-pocket payment experienced more care burdens. 
Our study suggests that paying more attention to female amyotrophic lateral sclerosis patients might benefit patients in China and other South-East Asian countries influenced by the Confucian concept of ethics. There is an urgent need to expand medical insurance coverage for amyotrophic lateral sclerosis in China and other developing countries. Long-term, adequate support is needed to relieve caregivers' burden. To improve the quality of life of patients, relieving both patients' self-perceived burden and caregivers' burden is likely to be essential. © 2016 John Wiley & Sons Ltd.

  11. Socioeconomic differences in the burden of disease in Sweden.

    PubMed Central

    Ljung, Rickard; Peterson, Stefan; Hallqvist, Johan; Heimerson, Inger; Diderichsen, Finn

    2005-01-01

    OBJECTIVE: We sought to analyse how much of the total burden of disease in Sweden, measured in disability-adjusted life years (DALYs), is a result of inequalities in health between socioeconomic groups. We also sought to determine how this unequal burden is distributed across different disease groups and socioeconomic groups. METHODS: Our analysis used data from the Swedish Burden of Disease Study. We studied all Swedish men and women in three age groups (15-44, 45-64, 65-84) and five major socioeconomic groups. The 18 disease and injury groups that contributed to 65% of the total burden of disease were analysed using attributable fractions and the slope index of inequality and the relative index of inequality. FINDINGS: About 30% of the burden of disease among women and 37% of the burden among men is a differential burden resulting from socioeconomic inequalities in health. A large part of this unequally distributed burden falls on unskilled manual workers. The largest contributors to inequalities in health for women are ischaemic heart disease, depression and neurosis, and stroke. For men, the largest contributors are ischaemic heart disease, alcohol addiction and self-inflicted injuries. CONCLUSION: This is the first study to use socioeconomic differences, measured by socioeconomic position, to assess the burden of disease using DALYs. We found that in Sweden one-third of the burden of the diseases we studied is unequally distributed. Studies of socioeconomic inequalities in the burden of disease that take both mortality and morbidity into account can help policy-makers understand the magnitude of inequalities in health for different disease groups. PMID:15744401

  12. Adaptive MCMC in Bayesian phylogenetics: an application to analyzing partitioned data in BEAST.

    PubMed

    Baele, Guy; Lemey, Philippe; Rambaut, Andrew; Suchard, Marc A

    2017-06-15

    Advances in sequencing technology continue to deliver increasingly large molecular sequence datasets that are often heavily partitioned in order to accurately model the underlying evolutionary processes. In phylogenetic analyses, partitioning strategies involve estimating conditionally independent models of molecular evolution for different genes and different positions within those genes, requiring a large number of evolutionary parameters that have to be estimated, leading to an increased computational burden for such analyses. The past two decades have also seen the rise of multi-core processors, in both the central processing unit (CPU) and graphics processing unit (GPU) markets, enabling massively parallel computations that are not yet fully exploited by many software packages for multipartite analyses. We here propose a Markov chain Monte Carlo (MCMC) approach using an adaptive multivariate transition kernel to estimate in parallel a large number of parameters, split across partitioned data, by exploiting multi-core processing. Across several real-world examples, we demonstrate that our approach enables the estimation of these multipartite parameters more efficiently than standard approaches that typically use a mixture of univariate transition kernels. In one case, when estimating the relative rate parameter of the non-coding partition in a heterochronous dataset, MCMC integration efficiency improves by > 14-fold. Our implementation is part of the BEAST code base, a widely used open source software package to perform Bayesian phylogenetic inference. guy.baele@kuleuven.be. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
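    The adaptive multivariate kernel described above can be caricatured in a few dozen lines. The sketch below is not the BEAST implementation: the target density, tuning constants, and adaptation schedule are all invented for illustration. It adapts a joint Gaussian proposal to the empirical covariance of past samples, in the spirit of Haario-style adaptive Metropolis:

```python
import math, random

random.seed(1)

# Log-density of a correlated 2-D Gaussian target (a hypothetical stand-in
# for a posterior over two partition-specific rate parameters).
def log_target(x, y, rho=0.8):
    return -(x * x - 2 * rho * x * y + y * y) / (2 * (1 - rho * rho))

def chol2(c):
    # Cholesky factor of a 2x2 covariance matrix [[a, b], [b, d]].
    a, b, d = c[0][0], c[0][1], c[1][1]
    l11 = math.sqrt(a)
    l21 = b / l11
    l22 = math.sqrt(max(d - l21 * l21, 1e-12))
    return l11, l21, l22

def adaptive_metropolis(n_iter=20000, adapt_start=500, eps=1e-6):
    x, y = 0.0, 0.0
    lp = log_target(x, y)
    samples = []
    # Running moments for the empirical covariance of past samples.
    sx = sy = sxx = syy = sxy = 0.0
    cov = [[1.0, 0.0], [0.0, 1.0]]   # initial proposal covariance
    scale = 2.38 ** 2 / 2            # common scaling heuristic for dim = 2
    for i in range(1, n_iter + 1):
        l11, l21, l22 = chol2(cov)
        z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
        s = math.sqrt(scale)
        px = x + s * l11 * z1
        py = y + s * (l21 * z1 + l22 * z2)
        lpp = log_target(px, py)
        if math.log(random.random()) < lpp - lp:
            x, y, lp = px, py, lpp   # accept the joint move
        samples.append((x, y))
        sx += x; sy += y
        sxx += x * x; syy += y * y; sxy += x * y
        if i > adapt_start:          # adapt only after a short burn-in
            mx, my = sx / i, sy / i
            cov = [[sxx / i - mx * mx + eps, sxy / i - mx * my],
                   [sxy / i - mx * my, syy / i - my * my + eps]]
    return samples

samples = adaptive_metropolis()
mean_x = sum(s[0] for s in samples) / len(samples)
```

    Because the proposal covariance learns the correlation between the parameters, both coordinates move jointly, which is the same reason a multivariate kernel mixes faster than a mixture of univariate kernels on strongly correlated partitioned-data parameters.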

  13. A Biophysico-computational Perspective of Breast Cancer Pathogenesis and Treatment Response

    DTIC Science & Technology

    2006-03-01

    A Biophysico-computational Perspective of Breast Cancer Pathogenesis and Treatment Response. Principal Investigator: Valerie M. Weaver, Ph.D. Grant number: W81XWH-05-1-0330.

  14. Evaluation of the Validity and Response Burden of Patient Self-Report Measures of the Pain Assessment Screening Tool and Outcomes Registry (PASTOR).

    PubMed

    Cook, Karon F; Kallen, Michael A; Buckenmaier, Chester; Flynn, Diane M; Hanling, Steven R; Collins, Teresa S; Joltes, Kristin; Kwon, Kyung; Medina-Torne, Sheila; Nahavandi, Parisa; Suen, Joshua; Gershon, Richard

    2017-07-01

    In 2009, the Army Pain Management Task Force was chartered. On the basis of their findings, the Department of Defense recommended a comprehensive pain management strategy that included development of a standardized pain assessment system that would collect patient-reported outcomes data to inform the patient-provider clinical encounter. The result was the Pain Assessment Screening Tool and Outcomes Registry (PASTOR). The purpose of this study was to assess the validity and response burden of the patient-reported outcome measures in PASTOR. Data for analyses were collected from 681 individuals who completed PASTOR at baseline and follow-up as part of their routine clinical care. The survey tool included self-report measures of pain severity and pain interference (measured using the National Institutes of Health Patient-Reported Outcome Measurement Information System [PROMIS] and the Defense and Veterans Pain Rating scale). PROMIS measures of pain correlates also were administered. Validation analyses included estimation of score associations among measures, comparison of scores of known groups, responsiveness, ceiling and floor effects, and response burden. Results of psychometric testing provided substantial evidence for the validity of PASTOR self-report measures in this population. Expected associations among scores largely supported the concurrent validity of the measures. Scores effectively distinguished among respondents on the basis of their self-reported impressions of general health. PROMIS measures were administered using computer adaptive testing and each, on average, required less than 1 minute to administer. Statistical and graphical analyses demonstrated the responsiveness of PASTOR measures over time. Reprint & Copyright © 2017 Association of Military Surgeons of the U.S.

  15. The prevalence and burden of mental and substance use disorders in Australia: Findings from the Global Burden of Disease Study 2015.

    PubMed

    Ciobanu, Liliana G; Ferrari, Alize J; Erskine, Holly E; Santomauro, Damian F; Charlson, Fiona J; Leung, Janni; Amare, Azmeraw T; Olagunju, Andrew T; Whiteford, Harvey A; Baune, Bernhard T

    2018-05-01

    Timely and accurate assessments of disease burden are essential for developing effective national health policies. We used the Global Burden of Disease Study 2015 to examine burden due to mental and substance use disorders in Australia. For each of the 20 mental and substance use disorders included in Global Burden of Disease Study 2015, systematic reviews of epidemiological data were conducted, and data modelled using a Bayesian meta-regression tool to produce prevalence estimates by age, sex, geography and year. Prevalence for each disorder was then combined with a disorder-specific disability weight to give years lived with disability, as a measure of non-fatal burden. Fatal burden was measured as years of life lost due to premature mortality, which were calculated by combining the number of deaths due to a disorder with the life expectancy remaining at the time of death. Disability-adjusted life years were calculated by summing years lived with disability and years of life lost to give a measure of total burden. Uncertainty was calculated around all burden estimates. Mental and substance use disorders were the leading cause of non-fatal burden in Australia in 2015, explaining 24.3% of total years lived with disability, and were the second leading cause of total burden, accounting for 14.6% of total disability-adjusted life years. There was no significant change in the age-standardised disability-adjusted life year rates for mental and substance use disorders from 1990 to 2015. Global Burden of Disease Study 2015 found that mental and substance use disorders were leading contributors to disease burden in Australia. Despite several decades of national reform, the burden of mental and substance use disorders remained largely unchanged between 1990 and 2015. To reduce this burden, effective population-level prevention strategies are required in addition to effective interventions of sufficient duration and coverage.

  16. The WorkPlace distributed processing environment

    NASA Technical Reports Server (NTRS)

    Ames, Troy; Henderson, Scott

    1993-01-01

    Real time control problems require robust, high performance solutions. Distributed computing can offer high performance through parallelism and robustness through redundancy. Unfortunately, implementing distributed systems with these characteristics places a significant burden on the applications programmers. Goddard Code 522 has developed WorkPlace to alleviate this burden. WorkPlace is a small, portable, embeddable network interface which automates message routing, failure detection, and re-configuration in response to failures in distributed systems. This paper describes the design and use of WorkPlace, and its application in the construction of a distributed blackboard system.

  17. Burden of Fasciola hepatica Infection among Children from Paucartambo in Cusco, Peru

    PubMed Central

    Lopez, Martha; White, A. Clinton; Cabada, Miguel M.

    2012-01-01

    There is a high prevalence of fascioliasis in the Peruvian highlands, but most cases remain undiagnosed. The burden of disease caused by chronic subclinical infection is largely unknown. We studied school-age children from a district in Paucartambo Province in Cusco, Peru to evaluate the burden of disease caused by subclinical fascioliasis. Parasite eggs and/or larvae were identified in 46.2% of subjects, including Fasciola hepatica in 10.3% of subjects. Fascioliasis was independently associated with anemia (adjusted odds ratio = 3.01 [1.10–8.23]). Subclinical fascioliasis was common among children and strongly associated with anemia. Anemia should be recognized as an important component of the burden of disease from fascioliasis. PMID:22403322

  18. Stratospheric areal distribution of water vapor burden and the jet stream

    NASA Technical Reports Server (NTRS)

    Kuhn, P. M.; Magaziner, E.; Stearns, L. P.

    1976-01-01

    Radiometrically inferred areal observations of the atmospheric water vapor burden have been made in the 270 to 520 per cm spectral band over the western U.S. and the extreme eastern Pacific from the NASA C-141 Kuiper Airborne Observatory. Previously, very few observations from the upper troposphere and lower stratosphere over such a broad area had been made. A total of 30,600 individual observations from eight separate synoptic situations involving eight jet maxima were computer-averaged over 2-deg latitude x 2-deg longitude boxes and related to the polar continental jet. Mean water vapor burdens ranged from 0.00046 to 0.00143 g per sq cm at 13.4 km, with a striking peak just north of the jet wind maximum over a region of strong upward vertical motion.

  19. Systematic Review of the Economic Burden of Pulmonary Arterial Hypertension.

    PubMed

    Gu, Shuyan; Hu, Huimei; Dong, Hengjin

    2016-06-01

    Pulmonary arterial hypertension (PAH), as a life-threatening disease with no efficient cure, may impose a tremendous economic burden on patients and healthcare systems. However, most existing studies have mainly emphasised epidemiology and medications, while large observational studies reporting on the economic burden are currently lacking. To review and evaluate evidence on the costs of PAH and the cost effectiveness of PAH treatments, and to summarise the corresponding cost drivers. Systematic literature searches were conducted in English-language databases (PubMed, Web of Science, ScienceDirect) and Chinese-language databases (China National Knowledge Infrastructure, Wanfang Data, Chongqing VIP) to identify studies (published from 2000 to 2014) assessing the costs of PAH or the cost effectiveness of PAH treatments. The search results were independently reviewed and extracted by two reviewers. Costs were converted into 2014 US dollars. Of 1959 citations identified in the initial search, 19 papers were finally included in this analysis: eight on the economic burden of PAH and 11 on economic evaluation of PAH treatments. The economic burden on patients with PAH was rather large, with direct healthcare costs per patient per month varying from $2476 to $11,875, but none of the studies reported indirect costs. Sildenafil was universally reported to be a cost-effective treatment, with lower costs and better efficacy than other medications. Medical costs were reported to be the key cost drivers. The economic burden of patients with PAH is substantial, while the paucity of comprehensive country-specific evidence in this area and the lack of reports on indirect costs of PAH warrant researchers' concern, especially in China.

  20. Steady shape analysis of tomographic pumping tests for characterization of aquifer heterogeneities

    USGS Publications Warehouse

    Bohling, Geoffrey C.; Zhan, Xiaoyong; Butler, James J.; Zheng, Li

    2002-01-01

    Hydraulic tomography, a procedure involving the performance of a suite of pumping tests in a tomographic format, provides information about variations in hydraulic conductivity at a level of detail not obtainable with traditional well tests. However, analysis of transient data from such a suite of pumping tests represents a substantial computational burden. Although steady state responses can be analyzed to reduce this computational burden significantly, the time required to reach steady state will often be too long for practical applications of the tomography concept. In addition, uncertainty regarding the mechanisms driving the system to steady state can propagate to adversely impact the resulting hydraulic conductivity estimates. These disadvantages of a steady state analysis can be overcome by exploiting the simplifications possible under the steady shape flow regime. At steady shape conditions, drawdown varies with time but the hydraulic gradient does not. Thus transient data can be analyzed with the computational efficiency of a steady state model. In this study, we demonstrate the value of the steady shape concept for inversion of hydraulic tomography data and investigate its robustness with respect to improperly specified boundary conditions.
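    The steady shape property is easy to verify with the late-time Cooper-Jacob approximation to the Theis solution, a standard textbook result rather than this paper's inversion code; the pumping-test parameters below are purely illustrative. Drawdown keeps growing with time, but the drawdown difference between two observation radii, i.e. the hydraulic gradient, does not:

```python
import math

# Hypothetical confined-aquifer test parameters (illustrative values only).
Q = 0.01   # pumping rate, m^3/s
T = 1e-3   # transmissivity, m^2/s
S = 1e-4   # storativity, dimensionless

def drawdown(r, t):
    # Cooper-Jacob late-time approximation to the Theis solution:
    # s(r, t) = (Q / 4 pi T) * ln(2.25 T t / (r^2 S))
    return Q / (4 * math.pi * T) * math.log(2.25 * T * t / (r * r * S))

r1, r2 = 10.0, 20.0
for t in (3600.0, 7200.0, 14400.0):
    s1, s2 = drawdown(r1, t), drawdown(r2, t)
    # s1 grows with t, but s1 - s2 (the gradient between r1 and r2) does not.
    print(t, round(s1, 4), round(s1 - s2, 4))

# At steady shape the difference equals (Q / 4 pi T) * ln(r2^2 / r1^2),
# independent of both time and storativity.
expected = Q / (4 * math.pi * T) * math.log((r2 / r1) ** 2)
```

    This time-independence of the gradient is exactly what lets transient tomography data be inverted with the computational cost of a steady state model.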

  1. Model predictive control design for polytopic uncertain systems by synthesising multi-step prediction scenarios

    NASA Astrophysics Data System (ADS)

    Lu, Jianbo; Xi, Yugeng; Li, Dewei; Xu, Yuli; Gan, Zhongxue

    2018-01-01

    A common objective of model predictive control (MPC) design is the large initial feasible region, low online computational burden as well as satisfactory control performance of the resulting algorithm. It is well known that interpolation-based MPC can achieve a favourable trade-off among these different aspects. However, the existing results are usually based on fixed prediction scenarios, which inevitably limits the performance of the obtained algorithms. So by replacing the fixed prediction scenarios with the time-varying multi-step prediction scenarios, this paper provides a new insight into improvement of the existing MPC designs. The adopted control law is a combination of predetermined multi-step feedback control laws, based on which two MPC algorithms with guaranteed recursive feasibility and asymptotic stability are presented. The efficacy of the proposed algorithms is illustrated by a numerical example.

  2. An Assessment of the Intestinal Lumen as a Site for Intervention in Reducing Body Burdens of Organochlorine Compounds

    PubMed Central

    Jandacek, Ronald J.; Genuis, Stephen J.

    2013-01-01

    Many individuals maintain a persistent body burden of organochlorine compounds (OCs) as well as other lipophilic compounds, largely as a result of airborne and dietary exposures. Ingested OCs are typically absorbed from the small intestine along with dietary lipids. Once in the body, stored OCs can mobilize from adipose tissue storage sites and, along with circulating OCs, are delivered into the small intestine via hepatic processing and biliary transport. Retained OCs are also transported into both the large and small intestinal lumen via non-biliary mechanisms involving both secretion and desquamation from enterocytes. OCs and some other toxicants can be reabsorbed from the intestine, where they take part in enterohepatic circulation (EHC). While dietary fat facilitates the absorption of OCs from the small intestine, it has little effect on OCs within the large intestine. Non-absorbable dietary fats and fat absorption inhibitors, however, can reduce the re-absorption of OCs and other lipophiles involved in EHC and may enhance the secretion of these compounds into the large intestine, thereby hastening their elimination. Clinical studies are currently underway to determine the efficacy of using non-absorbable fats and inhibitors of fat absorption in facilitating the elimination of persistent body burdens of OCs and other lipophilic human contaminants. PMID:23476122

  4. Two-stage atlas subset selection in multi-atlas based image segmentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, Tingting, E-mail: tingtingzhao@mednet.ucla.edu; Ruan, Dan, E-mail: druan@mednet.ucla.edu

    2015-06-15

    Purpose: Fast growing access to large databases and cloud stored data presents a unique opportunity for multi-atlas based image segmentation and also presents challenges in heterogeneous atlas quality and computation burden. This work aims to develop a novel two-stage method tailored to the special needs in the face of large atlas collection with varied quality, so that high-accuracy segmentation can be achieved with low computational cost. Methods: An atlas subset selection scheme is proposed to substitute a significant portion of the computationally expensive full-fledged registration in the conventional scheme with a low-cost alternative. More specifically, the authors introduce a two-stage atlas subset selection method. In the first stage, an augmented subset is obtained based on a low-cost registration configuration and a preliminary relevance metric; in the second stage, the subset is further narrowed down to a fusion set of desired size, based on full-fledged registration and a refined relevance metric. An inference model is developed to characterize the relationship between the preliminary and refined relevance metrics, and a proper augmented subset size is derived to ensure that the desired atlases survive the preliminary selection with high probability. Results: The performance of the proposed scheme has been assessed with cross validation based on two clinical datasets consisting of manually segmented prostate and brain magnetic resonance images, respectively. The proposed scheme demonstrates comparable end-to-end segmentation performance as the conventional single-stage selection method, but with significant computation reduction. Compared with the alternative computation reduction method, their scheme improves the mean and median Dice similarity coefficient value from (0.74, 0.78) to (0.83, 0.85) and from (0.82, 0.84) to (0.95, 0.95) for prostate and corpus callosum segmentation, respectively, with statistical significance. Conclusions: The authors have developed a novel two-stage atlas subset selection scheme for multi-atlas based segmentation. It achieves good segmentation accuracy with significantly reduced computation cost, making it a suitable configuration in the presence of extensive heterogeneous atlases.
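    The two-stage idea can be sketched with mock relevance metrics standing in for the low-cost and full-fledged registrations; every name, size, and noise level below is an assumption for illustration, not the authors' implementation:

```python
import random

random.seed(0)

# Hypothetical atlas pool: each atlas has a "true" relevance to the target
# image. cheap_score approximates it noisily (low-cost registration);
# fine_score approximates it closely (full-fledged registration) at a
# much higher computational price.
N_ATLASES = 200
true_rel = [random.random() for _ in range(N_ATLASES)]

def cheap_score(i):
    return true_rel[i] + random.gauss(0, 0.15)  # preliminary relevance metric

def fine_score(i):
    return true_rel[i] + random.gauss(0, 0.02)  # refined relevance metric

def two_stage_select(augmented_size=30, fusion_size=10):
    # Stage 1: rank ALL atlases by the cheap metric, keep an augmented
    # subset large enough that good atlases survive with high probability.
    stage1 = sorted(range(N_ATLASES), key=cheap_score, reverse=True)
    augmented = stage1[:augmented_size]
    # Stage 2: run the expensive metric only on the augmented subset and
    # narrow it down to the fusion set.
    stage2 = sorted(augmented, key=fine_score, reverse=True)
    return stage2[:fusion_size]

fusion = two_stage_select()
# Expensive registrations performed: 30 instead of 200.
```

    The augmented-subset size plays the role the paper's inference model formalizes: it must be large enough that atlases mis-ranked by the noisy preliminary metric still survive into stage 2.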

  5. Solar Power Tower Integrated Layout and Optimization Tool | Concentrating

    Science.gov Websites

    methods to reduce the overall computational burden while generating accurate and precise results. These methods have been developed as part of the U.S. Department of Energy (DOE) SunShot Initiative research

  6. Calibration of Complex Subsurface Reaction Models Using a Surrogate-Model Approach

    EPA Science Inventory

    Application of model assessment techniques to complex subsurface reaction models involves numerous difficulties, including non-trivial model selection, parameter non-uniqueness, and excessive computational burden. To overcome these difficulties, this study introduces SAMM (Simult...

  7. Electric Power Distribution System Model Simplification Using Segment Substitution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reiman, Andrew P.; McDermott, Thomas E.; Akcakaya, Murat

    Quasi-static time-series (QSTS) simulation is used to simulate the behavior of distribution systems over long periods of time (typically hours to years). The technique involves repeatedly solving the load-flow problem for a distribution system model and is useful for distributed energy resource (DER) planning. When a QSTS simulation has a small time step and a long duration, the computational burden of the simulation can be a barrier to integration into utility workflows. One way to relieve the computational burden is to simplify the system model. The segment substitution method of simplifying distribution system models introduced in this paper offers model bus reduction of up to 98% with a simplification error as low as 0.2% (0.002 pu voltage). In contrast to existing methods of distribution system model simplification, which rely on topological inspection and linearization, the segment substitution method uses black-box segment data and an assumed simplified topology.
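    The black-box intuition behind segment substitution can be shown with a trivially small case: a series feeder segment with no mid-segment loads looks, from its terminals, like one equivalent impedance that can be fitted from terminal measurements alone. This is an illustrative caricature under that assumption, not the paper's algorithm, which handles full feeder segments:

```python
# Three series sections of a feeder segment, as complex impedances (ohms).
# With no mid-segment load, the same current flows through every section.
segment_z = [0.01 + 0.02j, 0.015 + 0.03j, 0.02 + 0.025j]

def terminal_voltage(v_in, i_load):
    # Detailed model: accumulate the voltage drop section by section.
    v = v_in
    for z in segment_z:
        v -= z * i_load
    return v

# "Measure" the segment at its terminals (black-box data) and fit a single
# equivalent impedance for the simplified model.
v_in, i_load = 1.0 + 0j, 0.5 - 0.1j
v_out = terminal_voltage(v_in, i_load)
z_equiv = (v_in - v_out) / i_load

# The 3-section segment is now one branch: internal buses are eliminated,
# which is the source of the bus-count reduction in QSTS models.
```

    In the fitted model, each load-flow solve touches one branch instead of three, which is where the QSTS speedup comes from when this is repeated over millions of time steps.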

  8. Acoustic Biometric System Based on Preprocessing Techniques and Linear Support Vector Machines

    PubMed Central

    del Val, Lara; Izquierdo-Fuente, Alberto; Villacorta, Juan J.; Raboso, Mariano

    2015-01-01

    Drawing on the results of an acoustic biometric system based on a MSE classifier, a new biometric system has been implemented. This new system preprocesses acoustic images, extracts several parameters and finally classifies them, based on Support Vector Machine (SVM). The preprocessing techniques used are spatial filtering; segmentation, based on a Gaussian Mixture Model (GMM), to separate the person from the background; masking, to reduce the dimensions of the images; and binarization, to reduce the size of each image. An analysis of classification error and a study of the sensitivity of the error versus the computational burden of each implemented algorithm are presented. This allows the selection of the most relevant algorithms, according to the benefits required by the system. A significant improvement of the biometric system has been achieved by reducing the classification error, the computational burden and the storage requirements. PMID:26091392
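    A minimal sketch of the masking and binarization steps on a toy "acoustic image" follows; the spatial filtering, GMM segmentation, and SVM stages are omitted, and the data and thresholds are invented for illustration:

```python
# Toy "acoustic image": a grid of intensities. Masking crops the image to
# the region of interest and binarization reduces each pixel to one bit,
# shrinking the representation fed to a downstream (e.g., linear SVM)
# classifier. Illustrative sketch only, not the authors' implementation.

image = [
    [0.1, 0.2, 0.1, 0.1],
    [0.1, 0.9, 0.8, 0.1],
    [0.2, 0.7, 0.9, 0.2],
    [0.1, 0.1, 0.2, 0.1],
]

def mask_bounding_box(img, thresh):
    # Masking step: keep only the bounding box of above-threshold pixels.
    rows = [r for r, row in enumerate(img) if any(v > thresh for v in row)]
    cols = [c for c in range(len(img[0]))
            if any(row[c] > thresh for row in img)]
    return [row[min(cols):max(cols) + 1]
            for row in img[min(rows):max(rows) + 1]]

def binarize(img, thresh):
    # Binarization step: one bit per remaining pixel.
    return [[1 if v > thresh else 0 for v in row] for row in img]

masked = mask_bounding_box(image, 0.5)
features = binarize(masked, 0.5)
# 4x4 floats -> 2x2 bits: a much smaller input for the classifier, which is
# the storage and computational-burden reduction the abstract refers to.
```
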

  10. The Disease Burden of Childhood Adversities in Adults: A Population-Based Study

    ERIC Educational Resources Information Center

    Cuijpers, Pim; Smit, Filip; Unger, Froukje; Stikkelbroek, Yvonne; ten Have, Margreet; de Graaf, Ron

    2011-01-01

    Objectives: There is much evidence showing that childhood adversities have considerable effects on the mental and physical health of adults. It could be assumed therefore, that the disease burden of childhood adversities is high. It has not yet been examined, however, whether this is true. Method: We used data of a large representative sample (N =…

  11. Computational issues in complex water-energy optimization problems: Time scales, parameterizations, objectives and algorithms

    NASA Astrophysics Data System (ADS)

    Efstratiadis, Andreas; Tsoukalas, Ioannis; Kossieris, Panayiotis; Karavokiros, George; Christofides, Antonis; Siskos, Alexandros; Mamassis, Nikos; Koutsoyiannis, Demetris

    2015-04-01

    Modelling of large-scale hybrid renewable energy systems (HRES) is a challenging task, for which several open computational issues exist. HRES comprise typical components of hydrosystems (reservoirs, boreholes, conveyance networks, hydropower stations, pumps, water demand nodes, etc.), which are dynamically linked with renewables (e.g., wind turbines, solar parks) and energy demand nodes. In such systems, apart from the well-known shortcomings of water resources modelling (nonlinear dynamics, unknown future inflows, large number of variables and constraints, conflicting criteria, etc.), additional complexities and uncertainties arise due to the introduction of energy components and associated fluxes. A major difficulty is the need for coupling two different temporal scales, given that in hydrosystem modeling, monthly simulation steps are typically adopted, yet for a faithful representation of the energy balance (i.e. energy production vs. demand) a much finer resolution (e.g. hourly) is required. Another drawback is the increase of control variables, constraints and objectives, due to the simultaneous modelling of the two parallel fluxes (i.e. water and energy) and their interactions. Finally, since the driving hydrometeorological processes of the integrated system are inherently uncertain, it is often essential to use synthetically generated input time series of large length, in order to assess the system performance in terms of reliability and risk, with satisfactory accuracy. 
To address these issues, we propose an effective and efficient modeling framework, key objectives of which are: (a) the substantial reduction of control variables, through parsimonious yet consistent parameterizations; (b) the substantial decrease of the computational burden of simulation, by linearizing the combined water and energy allocation problem of each individual time step and solving each local sub-problem through very fast linear network programming algorithms; and (c) the substantial decrease of the required number of function evaluations for detecting the optimal management policy, using an innovative, surrogate-assisted global optimization approach.
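    For intuition on the per-step linearization: in the simplest case of a single source feeding independent, linearly valued demands, the per-step linear program collapses to a greedy fill in order of unit benefit. The sketch below is that special case only, with invented numbers; real HRES models solve the general problem with network programming algorithms:

```python
# One simulation step: allocate a limited water volume among demands with
# linear (constant unit-benefit) objectives. With a single source and no
# network capacities, the LP optimum is the greedy allocation by unit
# benefit. Illustrative caricature; values are hypothetical.

available = 100.0  # water available this step (hm^3)

# (name, requested volume, unit benefit) -- hypothetical numbers.
demands = [
    ("irrigation", 60.0, 1.0),
    ("hydropower", 80.0, 2.5),
    ("water supply", 30.0, 4.0),
]

def allocate(available, demands):
    # Serve demands in decreasing order of unit benefit until the
    # resource runs out (optimal for this single-constraint LP).
    alloc = {}
    for name, need, _benefit in sorted(demands, key=lambda d: -d[2]):
        take = min(need, available)
        alloc[name] = take
        available -= take
    return alloc

plan = allocate(available, demands)
# water supply and hydropower are served first; irrigation gets the rest.
```

    Repeating such a fast local solve at every (e.g., hourly) step is what keeps long stochastic simulations of the coupled water-energy system tractable.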

  12. Lithographic image simulation for the 21st century with 19th-century tools

    NASA Astrophysics Data System (ADS)

    Gordon, Ronald L.; Rosenbluth, Alan E.

    2004-01-01

    Simulation of lithographic processes in semiconductor manufacturing has gone from a crude learning tool 20 years ago to a critical part of yield enhancement strategy today. Although many disparate models, championed by equally disparate communities, exist to describe various photoresist development phenomena, these communities would all agree that the one piece of the simulation picture that can, and must, be computed accurately is the image intensity in the photoresist. The imaging of a photomask onto a thin-film stack is one of the only phenomena in the lithographic process that is described fully by well-known, definitive physical laws. Although many approximations are made in the derivation of the Fourier transform relations between the mask object, the pupil, and the image, these and their impacts are well-understood and need little further investigation. The imaging process in optical lithography is modeled as a partially-coherent, Kohler illumination system. As Hopkins has shown, we can separate the computation into 2 pieces: one that takes information about the illumination source, the projection lens pupil, the resist stack, and the mask size or pitch, and the other that only needs the details of the mask structure. As the latter piece of the calculation can be expressed as a fast Fourier transform, it is the first piece that dominates. This piece involves computation of a potentially large number of numbers called Transmission Cross-Coefficients (TCCs), which are correlations of the pupil function weighted with the illumination intensity distribution. The advantage of performing the image calculations this way is that the computation of these TCCs represents an up-front cost, not to be repeated if one is only interested in changing the mask features, which is the case in Model-Based Optical Proximity Correction (MBOPC). 
The downside, however, is that the number of these expensive double integrals that must be performed increases as the square of the mask unit cell area; this number can cause even the fastest computers to balk if one needs to study medium- or long-range effects. One can reduce this computational burden by approximating with a smaller area, but accuracy is usually a concern, especially when building a model that will purportedly represent a manufacturing process. This work will review the current methodologies used to simulate the intensity distribution in air above the resist and address the above problems. More to the point, a methodology has been developed to eliminate the expensive numerical integrations in the TCC calculations, as the resulting integrals in many cases of interest can be either evaluated analytically or replaced by analytical functions accurate to within machine precision. With the burden of computing these numbers lightened, more accurate representations of the image field can be realized, and better overall models are then possible.
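    The "expensive double integral" behind each TCC can be made concrete with a brute-force quadrature sketch. This is a hedged illustration only: an ideal binary circular pupil, a top-hat partially coherent source of fill factor sigma, NA-normalized frequencies, and shifts along one frequency axis are all simplifying assumptions, not the authors' analytical method.

```python
import numpy as np

def tcc(f1, f2, sigma=0.5, n=201, fmax=2.0):
    """Transmission cross-coefficient TCC(f1, f2) = integral of
    J(f) * P(f + f1) * conj(P(f + f2)) over the source plane, evaluated by
    2-D quadrature -- the costly operation the abstract refers to.
    Shifts f1, f2 are applied along one axis only, for simplicity."""
    f = np.linspace(-fmax, fmax, n)
    fx, fy = np.meshgrid(f, f)
    J = (fx**2 + fy**2 <= sigma**2).astype(float)            # top-hat source
    P = lambda gx, gy: (gx**2 + gy**2 <= 1.0).astype(float)  # ideal pupil
    integrand = J * P(fx + f1, fy) * P(fx + f2, fy)
    df = f[1] - f[0]
    return integrand.sum() * df * df

# TCC(0,0) is the total source energy passed by the pupil; a shifted
# second argument gives a partial overlap strictly between 0 and TCC(0,0)
t00 = tcc(0.0, 0.0)
t01 = tcc(0.0, 0.8)
```

    The paper's point is precisely that this per-pair numerical integration can, in many cases of interest, be replaced by closed-form overlap expressions.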

  13. Economic Burden of Colorectal Cancer in Korea

    PubMed Central

    Byun, Ju-Young; Oh, In-Hwan; Kim, Young Ae; Seo, Hye-Young; Lee, Yo-Han

    2014-01-01

    Objectives The incidence and survival rate of colorectal cancer in Korea are increasing because of improved screening, treatment technologies, and lifestyle changes. As the population ages, the economic cost of the disease rises accordingly. This study was conducted to estimate the economic burden of colorectal cancer utilizing claims data from the Health Insurance Review and Assessment Service. Methods The economic burden of colorectal cancer was estimated using prevalence data; patients were defined as those who received ambulatory treatment from medical institutions or who had been hospitalized due to colorectal cancer under International Classification of Disease 10th revision codes C18-C21. The economic burden was calculated as direct costs and indirect costs. Results The prevalence rate (per 100 000 people) of those treated for colorectal cancer during 2010 was 165.48. The total economic burden of colorectal cancer in 2010 was 3 trillion and 100 billion Korean won (KRW): direct costs were 1 trillion and 960 billion KRW (62.85%) and indirect costs were 1 trillion and 160 billion KRW (37.15%). Conclusions Colorectal cancer has a large economic burden. Efforts should be made to reduce the economic burden of the disease through primary and secondary prevention. PMID:24744825

  14. The quality-of-life burden of knee osteoarthritis in New Zealand adults: A model-based evaluation.

    PubMed

    Abbott, J Haxby; Usiskin, Ilana M; Wilson, Ross; Hansen, Paul; Losina, Elena

    2017-01-01

    Knee osteoarthritis is a leading global cause of health-related quality of life loss. The aim of this project was to quantify health losses arising from knee osteoarthritis in New Zealand (NZ) in terms of quality-adjusted life years (QALYs) lost. The Osteoarthritis Policy Model (OAPol), a validated Monte Carlo computer simulation model, was used to estimate QALYs lost due to knee osteoarthritis in the NZ adult population aged 40-84 over their lifetimes from the base year of 2006 until death. Data were from the NZ Health Survey, NZ Burden of Diseases, NZ Census, and relevant literature. QALYs were derived from NZ EQ-5D value set 2. Sensitivity to health state valuation and to disease and pain prevalence was assessed in secondary analyses. Based on NZ EQ-5D health state valuations, mean health losses due to knee osteoarthritis over people's lifetimes in NZ are 3.44 QALYs per person, corresponding to 467,240 QALYs across the adult population. Average estimated per-person QALY losses are higher for non-Māori females (3.55) than Māori females (3.38), and higher for non-Māori males (3.34) than Māori males (2.60). The proportion of QALYs lost out of the total quality-adjusted life expectancy for those without knee osteoarthritis is similar across all subgroups, ranging from 20 to 23 percent. At both the individual and population levels, knee osteoarthritis is responsible for large lifetime QALY losses. QALY losses are higher for females than males due to greater prevalence of knee osteoarthritis and higher life expectancy, and lower for Māori than non-Māori due to lower life expectancy. Large health gains are potentially realisable from public health and policy measures aimed at decreasing incidence, progression, pain, and disability of osteoarthritis.

  15. First experiences with model based iterative reconstructions influence on quantitative plaque volume and intensity measurements in coronary computed tomography angiography.

    PubMed

    Precht, H; Kitslaar, P H; Broersen, A; Gerke, O; Dijkstra, J; Thygesen, J; Egstrup, K; Lambrechtsen, J

    2017-02-01

    Investigate the influence of adaptive statistical iterative reconstruction (ASIR) and the model-based iterative reconstruction algorithm (Veo) in coronary computed tomography angiography (CCTA) images on quantitative measurements of plaque volumes and intensities in coronary arteries. Three patients each had three independent dose-reduced CCTA scans performed, reconstructed with 30% ASIR (CTDIvol 6.7 mGy), 60% ASIR (CTDIvol 4.3 mGy) and Veo (CTDIvol 1.9 mGy). Coronary plaque analysis was performed for each CCTA, measuring volumes, plaque burden and intensities. Plaque volume and plaque burden show a decreasing tendency from ASIR to Veo: median plaque volume was 314 mm³ and 337 mm³ for ASIR versus 252 mm³ for Veo, and plaque burden was 42% and 44% for ASIR versus 39% for Veo. The lumen and vessel volumes decrease slightly from 30% ASIR to 60% ASIR, from 498 mm³ to 391 mm³ for lumen volume and from 939 mm³ to 830 mm³ for vessel volume. The intensities did not change overall between the different reconstructions for either lumen or plaque. We found a tendency of decreasing plaque volumes and plaque burden but no change in intensities with the use of low-dose Veo CCTA (1.9 mGy) compared to dose-reduced ASIR CCTA (6.7 mGy and 4.3 mGy), although more studies are warranted. Copyright © 2016 The College of Radiographers. Published by Elsevier Ltd. All rights reserved.

  16. Development and Evaluation of Stereographic Display for Lung Cancer Screening

    DTIC Science & Technology

    2008-12-01

    burden. Application of GPUs – With the evolution of commodity graphics processing units (GPUs) for accelerating games on personal computers, over the...units, which are designed for rendering computer games , are readily available and can be programmed to perform the kinds of real-time calculations...575-581, 1994. 12. Anderson CM, Saloner D, Tsuruda JS, Shapeero LG, Lee RE. "Artifacts in maximun-intensity-projection display of MR angiograms

  17. Stochastic Simulation of Biomolecular Networks in Dynamic Environments

    PubMed Central

    Voliotis, Margaritis; Thomas, Philipp; Grima, Ramon; Bowsher, Clive G.

    2016-01-01

    Simulation of biomolecular networks is now indispensable for studying biological systems, from small reaction networks to large ensembles of cells. Here we present a novel approach for stochastic simulation of networks embedded in the dynamic environment of the cell and its surroundings. We thus sample trajectories of the stochastic process described by the chemical master equation with time-varying propensities. A comparative analysis shows that existing approaches can either fail dramatically, or else can impose impractical computational burdens due to numerical integration of reaction propensities, especially when cell ensembles are studied. Here we introduce the Extrande method which, given a simulated time course of dynamic network inputs, provides a conditionally exact and several orders-of-magnitude faster simulation solution. The new approach makes it feasible to demonstrate—using decision-making by a large population of quorum sensing bacteria—that robustness to fluctuations from upstream signaling places strong constraints on the design of networks determining cell fate. Our approach has the potential to significantly advance both understanding of molecular systems biology and design of synthetic circuits. PMID:27248512
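    The thinning idea behind Extrande can be sketched for a toy birth-death process driven by a time-varying input. This is a hedged illustration, not the paper's algorithm in full: for simplicity a single global propensity bound B is used instead of Extrande's finite look-ahead bound, and the process and rate constants are arbitrary choices.

```python
import math
import random

def extrande(x0, k_of_t, gamma, t_end, B, seed=1):
    """Thinning sketch: candidate events fire at constant rate B, which must
    upper-bound the total propensity; each candidate is accepted as a birth
    or a death in proportion to the current propensities, otherwise it is a
    'virtual' event that leaves the state unchanged."""
    rng = random.Random(seed)
    t, x, traj = 0.0, x0, [(0.0, x0)]
    while True:
        t += rng.expovariate(B)                 # next candidate time
        if t >= t_end:
            break
        a_birth = k_of_t(t)                     # time-varying input propensity
        a_death = gamma * x                     # linear decay propensity
        assert a_birth + a_death <= B, "propensity bound violated"
        u = rng.random() * B
        if u < a_birth:
            x += 1                              # birth fires
        elif u < a_birth + a_death:
            x -= 1                              # death fires
        traj.append((t, x))                     # virtual events recorded too
    return traj

# oscillating input k(t) = 5(1 + sin t); the mean copy number settles
# near mean(k)/gamma = 5/0.1 = 50
traj = extrande(x0=0, k_of_t=lambda t: 5.0 * (1.0 + math.sin(t)),
                gamma=0.1, t_end=50.0, B=50.0)
```

    The appeal over numerically integrating propensities is visible even here: no quadrature of k(t) is ever needed, only point evaluations at candidate times.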

  18. Virtual gastrointestinal colonoscopy in combination with large bowel endoscopy: Clinical application

    PubMed Central

    He, Qing; Rao, Ting; Guan, Yong-Song

    2014-01-01

    Although colorectal cancer (CRC) has not been the leading cancer killer worldwide for years, owing to the exponential development of computed tomography (CT), magnetic resonance imaging, and positron emission tomography/CT, as well as virtual colonoscopy for early detection, CRC-related mortality is still high. The objective of CRC screening is to reduce the burden of CRC and thereby the morbidity and mortality rates of the disease. It is believed that this goal can be achieved by regularly screening the average-risk population, enabling the detection of cancer at early, curable stages, and of polyps before they become cancerous. Large-scale screening with multimodality imaging approaches plays an important role in reaching that goal to detect polyps, Crohn's disease, ulcerative colitis and CRC at an early stage. This article reviews representative imaging procedures for various screening options and updates the detection, staging and re-staging of CRC patients for determining the optimal therapeutic method and forecasting the risk of CRC recurrence and the overall prognosis. The combined use of virtual colonoscopy and conventional endoscopy, and the advantages and limitations of these modalities, are also discussed. PMID:25320519

  19. Probabilistic assessment of the impact of coal seam gas development on groundwater: Surat Basin, Australia

    NASA Astrophysics Data System (ADS)

    Cui, Tao; Moore, Catherine; Raiber, Matthias

    2018-05-01

    Modelling cumulative impacts of basin-scale coal seam gas (CSG) extraction is challenging due to the long time frames and spatial extent over which impacts occur combined with the need to consider local-scale processes. The computational burden of such models limits the ability to undertake calibration and sensitivity and uncertainty analyses. A framework is presented that integrates recently developed methods and tools to address the computational burdens of an assessment of drawdown impacts associated with rapid CSG development in the Surat Basin, Australia. The null space Monte Carlo method combined with singular value decomposition (SVD)-assisted regularisation was used to analyse the uncertainty of simulated drawdown impacts. The study also describes how the computational burden of assessing local-scale impacts was mitigated by adopting a novel combination of a nested modelling framework which incorporated a model emulator of drawdown in dual-phase flow conditions, and a methodology for representing local faulting. This combination provides a mechanism to support more reliable estimates of regional CSG-related drawdown predictions. The study indicates that uncertainties associated with boundary conditions are reduced significantly when expressing differences between scenarios. The results are analysed and distilled to enable the easy identification of areas where the simulated maximum drawdown impacts could exceed trigger points associated with legislative `make good' requirements; trigger points require that either an adjustment in the development scheme or other measures are implemented to remediate the impact. This report contributes to the currently small body of work that describes modelling and uncertainty analyses of CSG extraction impacts on groundwater.
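    The null space Monte Carlo idea the study relies on can be illustrated in a few lines: perturb the calibrated parameters only along directions the linearized observations cannot see, so every draw honours the calibration to first order. The Jacobian, problem sizes, and scaling below are illustrative assumptions, not the Surat Basin model.

```python
import numpy as np

def null_space_draws(J, p_cal, n_draws, scale=1.0, seed=0):
    """Null space Monte Carlo sketch: J is the observation Jacobian at the
    calibrated parameter set p_cal.  Perturbations are confined to the null
    space of J, so J @ (draw - p_cal) = 0 and the simulated observations are
    (to first order) unchanged by each draw."""
    rng = np.random.default_rng(seed)
    _, s, Vt = np.linalg.svd(J)
    rank = int(np.sum(s > 1e-10 * s[0]))
    V_null = Vt[rank:].T                # basis of the uncalibrated directions
    return [p_cal + scale * V_null @ rng.standard_normal(V_null.shape[1])
            for _ in range(n_draws)]

# 3 observations constraining 6 parameters -> a 3-dimensional null space
J = np.array([[1.0, 0.0, 2.0, 0.0, 0.0, 1.0],
              [0.0, 1.0, 0.0, 1.0, 0.0, 0.0],
              [1.0, 1.0, 0.0, 0.0, 1.0, 0.0]])
draws = null_space_draws(J, p_cal=np.ones(6), n_draws=100)
```

    In practice each draw would still be re-calibrated cheaply (e.g. via SVD-assisted regularisation) before use, which is where the computational savings over naive Monte Carlo come from.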

  20. Grid Computing and Collaboration Technology in Support of Fusion Energy Sciences

    NASA Astrophysics Data System (ADS)

    Schissel, D. P.

    2004-11-01

    The SciDAC Initiative is creating a computational grid designed to advance scientific understanding in fusion research by facilitating collaborations, enabling more effective integration of experiments, theory and modeling, and allowing more efficient use of experimental facilities. The philosophy is that data, codes, analysis routines, visualization tools, and communication tools should be thought of as easy-to-use, network-available services. Access to services is stressed rather than portability. Services share the same basic security infrastructure, so stakeholders can control their own resources, which also helps ensure fair use of those resources. The collaborative control room is being developed using the open-source Access Grid software, which enables secure group-to-group collaboration with capabilities beyond teleconferencing, including application sharing and control. The ability to effectively integrate off-site scientists into a dynamic control room will be critical to the success of future international projects like ITER. Grid computing, the secure integration of computer systems over high-speed networks to provide on-demand access to data analysis capabilities and related functions, is being deployed as an alternative to traditional resource sharing among institutions. The first grid computational service deployed was the transport code TRANSP, and it included tools for run preparation, submission, monitoring and management. This approach saves user sites the laborious effort of maintaining a complex code while at the same time reducing the burden on developers by avoiding support of a large number of heterogeneous installations. This tutorial will present the philosophy behind an advanced collaborative environment, give specific examples, and discuss its usage beyond FES.

  1. Ordinal optimization and its application to complex deterministic problems

    NASA Astrophysics Data System (ADS)

    Yang, Mike Shang-Yu

    1998-10-01

    We present in this thesis a new perspective on a general class of optimization problems characterized by large deterministic complexities. Many problems of real-world concern today lack analyzable structures and almost always involve a high level of difficulty and complexity in the evaluation process. Advances in computer technology allow us to build computer models to simulate the evaluation process through numerical means, but the high complexity remains, taxing the simulation with an exorbitant computing cost for each evaluation. Such a resource requirement makes local fine-tuning of a known design difficult under most circumstances, let alone global optimization. Kolmogorov equivalence of complexity and randomness in computation theory is introduced to resolve this difficulty by converting the complex deterministic model into a stochastic pseudo-model composed of a simple deterministic component and a white-noise-like stochastic term. The resulting randomness is then dealt with by a noise-robust approach called Ordinal Optimization. Ordinal Optimization utilizes Goal Softening and Ordinal Comparison to achieve an efficient and quantifiable selection of designs in the initial search process. The approach is substantiated by a case study in the turbine blade manufacturing process. The problem involves the optimization of the manufacturing process of the integrally bladed rotor in the turbine engines of U.S. Air Force fighter jets. The intertwining interactions among the material, thermomechanical, and geometrical changes make the current FEM approach prohibitively uneconomical in the optimization process. The generalized OO approach to complex deterministic problems is applied here with great success. Empirical results indicate a saving of nearly 95% in computing cost.
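    The ordinal comparison and goal softening steps can be sketched as follows: rank candidate designs with a cheap, noise-corrupted estimate (the "stochastic pseudo-model") and keep a softened set of observed top performers rather than insisting on the single best. The problem size and noise level below are illustrative choices, not the turbine-blade study's.

```python
import random

def ordinal_select(true_perf, noise_sd, s, seed=0):
    """Ordinal-optimization sketch: rank designs by a cheap noisy estimate
    (true performance + white noise) and keep the observed top-s
    ('goal softening').  Smaller performance values are better."""
    rng = random.Random(seed)
    noisy = {d: p + rng.gauss(0.0, noise_sd) for d, p in true_perf.items()}
    return sorted(noisy, key=noisy.get)[:s]

# 1000 designs with true costs 0..999, evaluated under huge noise (sd = 100);
# order is far more robust to noise than value, so the softened selection
# almost surely still contains some of the true top-50
true_perf = {d: float(d) for d in range(1000)}
selected = ordinal_select(true_perf, noise_sd=100.0, s=50)
```

    The design choice here mirrors the thesis: estimating *which* designs are good (order) tolerates far more noise than estimating *how* good they are (value), so a crude model suffices for the initial screening.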

  2. The Impact of Medication Anticholinergic Burden on Cognitive Performance in People With Schizophrenia.

    PubMed

    Ang, Mei San; Abdul Rashid, Nur Amirah; Lam, Max; Rapisarda, Attilio; Kraus, Michael; Keefe, Richard S E; Lee, Jimmy

    2017-12-01

    Cognitive deficits are prevalent in people with schizophrenia and associated with functional impairments. In addition to antipsychotics, pharmacotherapy in schizophrenia often includes other psychotropics, and some of these agents possess anticholinergic properties, which may impair cognition. The objective of this study was to explore the association between medication anticholinergic burden and cognition in schizophrenia. Seven hundred five individuals with schizophrenia completed a neuropsychological battery comprising Judgment of Line Orientation Test, Wechsler Abbreviated Scale of Intelligence Matrix Reasoning, Continuous Performance Test-Identical Pairs Version, and the Brief Assessment of Cognition in Schizophrenia. Cognitive g and 3 cognitive factor scores that include executive function, memory/fluency, and speed of processing/vigilance, which were derived from a previously published analysis, were entered as cognitive variables. Anticholinergic burden was computed using 2 anticholinergic scales: Anticholinergic Burden Scale and Anticholinergic Drug Scale. Duration and severity of illness, antipsychotic dose, smoking status, age, and sex were included as covariates. Anticholinergic burden was associated with poorer cognitive performance in cognitive g, all 3 cognitive domains and most cognitive tasks in multivariate analyses. The associations were statistically significant, but the effect sizes were small (for Anticholinergic Burden Scale, Cohen f = 0.008; for Anticholinergic Drug Scale, Cohen f = 0.017). Although our results showed a statistically significant association between medications with anticholinergic properties and cognition in people with schizophrenia, the impact is of doubtful or minimal clinical significance.
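    Scales such as the Anticholinergic Burden Scale assign each drug an integer rating and sum the ratings over a patient's medication list. A minimal sketch follows; the drug ratings shown are placeholders for illustration, not the published scale values, which must be taken from the validated instruments.

```python
# Placeholder ratings in the 0-3 style of published anticholinergic scales
# (illustrative values only, NOT the real instrument).
RATINGS = {"olanzapine": 3, "quetiapine": 3, "haloperidol": 1, "sertraline": 1}

def anticholinergic_burden(medications, scale=RATINGS):
    """Patient-level burden = sum of per-drug anticholinergic ratings;
    drugs not listed on the scale contribute zero."""
    return sum(scale.get(drug.lower(), 0) for drug in medications)

print(anticholinergic_burden(["Olanzapine", "Sertraline"]))  # 4
print(anticholinergic_burden(["Lorazepam"]))                 # 0 (not rated)
```

    Because different scales rate the same drug differently, studies like this one typically report results under two or more scales, as done here with the Anticholinergic Burden Scale and the Anticholinergic Drug Scale.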

  3. Tumor Burden Analysis on Computed Tomography by Automated Liver and Tumor Segmentation

    PubMed Central

    Linguraru, Marius George; Richbourg, William J.; Liu, Jianfei; Watt, Jeremy M.; Pamulapati, Vivek; Wang, Shijun; Summers, Ronald M.

    2013-01-01

    The paper presents the automated computation of hepatic tumor burden from abdominal CT images of diseased populations, including images with inconsistent enhancement. The automated segmentation of livers is addressed first. A novel three-dimensional (3D) affine-invariant shape parameterization is employed to compare local shape across organs. By generating a regular sampling of the organ's surface, this parameterization can be effectively used to compare features of a set of closed 3D surfaces point-to-point, while avoiding common problems with the parameterization of concave surfaces. From an initial segmentation of the livers, the areas of atypical local shape are determined using training sets. A geodesic active contour locally corrects the segmentations of the livers in abnormal images. Graph cuts segment the hepatic tumors using shape and enhancement constraints. Liver segmentation errors are reduced significantly and all tumors are detected. Finally, support vector machines and feature selection are employed to reduce the number of false tumor detections. A tumor detection true positive fraction of 100% is achieved at 2.3 false positives/case and the tumor burden is estimated with 0.9% error. Results from the test data demonstrate the method's robustness in analyzing livers from difficult clinical cases to allow the temporal monitoring of patients with hepatic cancer. PMID:22893379

  4. Discriminative Random Field Models for Subsurface Contamination Uncertainty Quantification

    NASA Astrophysics Data System (ADS)

    Arshadi, M.; Abriola, L. M.; Miller, E. L.; De Paolis Kaluza, C.

    2017-12-01

    Application of flow and transport simulators for prediction of the release, entrapment, and persistence of dense non-aqueous phase liquids (DNAPLs) and associated contaminant plumes is a computationally intensive process that requires specification of a large number of material properties and hydrologic/chemical parameters. Given its computational burden, this direct simulation approach is particularly ill-suited for quantifying both the expected performance and uncertainty associated with candidate remediation strategies under real field conditions. Prediction uncertainties primarily arise from limited information about contaminant mass distributions, as well as the spatial distribution of subsurface hydrologic properties. Application of direct simulation to quantify uncertainty would, thus, typically require simulating multiphase flow and transport for a large number of permeability and release scenarios to collect statistics associated with remedial effectiveness, a computationally prohibitive process. The primary objective of this work is to develop and demonstrate a methodology that employs measured field data to produce equi-probable stochastic representations of a subsurface source zone that capture the spatial distribution and uncertainty associated with key features that control remediation performance (i.e., permeability and contamination mass). Here we employ probabilistic models known as discriminative random fields (DRFs) to synthesize stochastic realizations of initial mass distributions consistent with known, and typically limited, site characterization data. Using a limited number of full scale simulations as training data, a statistical model is developed for predicting the distribution of contaminant mass (e.g., DNAPL saturation and aqueous concentration) across a heterogeneous domain. 
Monte-Carlo sampling methods are then employed, in conjunction with the trained statistical model, to generate realizations conditioned on measured borehole data. Performance of the statistical model is illustrated through comparisons of generated realizations with the `true' numerical simulations. Finally, we demonstrate how these realizations can be used to determine statistically optimal locations for further interrogation of the subsurface.

  5. BROCCOLI: Software for fast fMRI analysis on many-core CPUs and GPUs

    PubMed Central

    Eklund, Anders; Dufort, Paul; Villani, Mattias; LaConte, Stephen

    2014-01-01

    Analysis of functional magnetic resonance imaging (fMRI) data is becoming ever more computationally demanding as temporal and spatial resolutions improve, and large, publicly available data sets proliferate. Moreover, methodological improvements in the neuroimaging pipeline, such as non-linear spatial normalization, non-parametric permutation tests and Bayesian Markov Chain Monte Carlo approaches, can dramatically increase the computational burden. Despite these challenges, there do not yet exist any fMRI software packages which leverage inexpensive and powerful graphics processing units (GPUs) to perform these analyses. Here, we therefore present BROCCOLI, a free software package written in OpenCL (Open Computing Language) that can be used for parallel analysis of fMRI data on a large variety of hardware configurations. BROCCOLI has, for example, been tested with an Intel CPU, an Nvidia GPU, and an AMD GPU. These tests show that parallel processing of fMRI data can lead to significantly faster analysis pipelines. This speedup can be achieved on relatively standard hardware, but further, dramatic speed improvements require only a modest investment in GPU hardware. BROCCOLI (running on a GPU) can perform non-linear spatial normalization to a 1 mm³ brain template in 4–6 s, and run a second level permutation test with 10,000 permutations in about a minute. These non-parametric tests are generally more robust than their parametric counterparts, and can also enable more sophisticated analyses by estimating complicated null distributions. Additionally, BROCCOLI includes support for Bayesian first-level fMRI analysis using a Gibbs sampler. The new software is freely available under GNU GPL3 and can be downloaded from github (https://github.com/wanderine/BROCCOLI/). PMID:24672471
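    The second-level permutation test mentioned above can be sketched in plain serial NumPy; this is a CPU illustration of the kind of group-level test such GPU packages accelerate, with the two-sample design and 10,000 permutations chosen for illustration.

```python
import numpy as np

def permutation_pvalue(group_a, group_b, n_perm=10000, seed=0):
    """Two-sample permutation test on the difference of means: the null
    distribution is built by relabeling subjects, so no parametric
    assumptions about the noise distribution are needed."""
    rng = np.random.default_rng(seed)
    data = np.concatenate([group_a, group_b])
    n_a = len(group_a)
    observed = np.mean(group_a) - np.mean(group_b)
    null = np.empty(n_perm)
    for i in range(n_perm):
        perm = rng.permutation(data)            # relabel subjects
        null[i] = perm[:n_a].mean() - perm[n_a:].mean()
    # two-sided p-value with the standard +1 correction
    return (np.sum(np.abs(null) >= abs(observed)) + 1) / (n_perm + 1)
```

    In imaging, this loop runs per voxel (often with a max-statistic correction across voxels), which is why permutation testing is such a natural target for GPU parallelism.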

  6. A Practical, Robust Methodology for Acquiring New Observation Data Using Computationally Expensive Groundwater Models

    NASA Astrophysics Data System (ADS)

    Siade, Adam J.; Hall, Joel; Karelse, Robert N.

    2017-11-01

    Regional groundwater flow models play an important role in decision making regarding water resources; however, the uncertainty embedded in model parameters and model assumptions can significantly hinder the reliability of model predictions. One way to reduce this uncertainty is to collect new observation data from the field. However, determining where and when to obtain such data is not straightforward. There exist a number of data-worth and experimental design strategies developed for this purpose. However, these studies often ignore issues related to real-world groundwater models such as computational expense, existing observation data, high-parameter dimension, etc. In this study, we propose a methodology, based on existing methods and software, to efficiently conduct such analyses for large-scale, complex regional groundwater flow systems for which there is a wealth of available observation data. The method utilizes the well-established d-optimality criterion, and the minimax criterion for robust sampling strategies. The so-called Null-Space Monte Carlo method is used to reduce the computational burden associated with uncertainty quantification. And, a heuristic methodology, based on the concept of the greedy algorithm, is proposed for developing robust designs with subsets of the posterior parameter samples. The proposed methodology is tested on a synthetic regional groundwater model, and subsequently applied to an existing, complex, regional groundwater system in the Perth region of Western Australia. The results indicate that robust designs can be obtained efficiently, within reasonable computational resources, for making regional decisions regarding groundwater level sampling.
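    The greedy design-selection heuristic can be sketched with a D-optimality objective: repeatedly add the candidate observation that most increases the information-matrix log-determinant. The synthetic Jacobian and the regularization term are assumptions for illustration; the study's implementation additionally uses the minimax criterion, null space Monte Carlo samples, and PEST-style tooling.

```python
import numpy as np

def greedy_d_optimal(J, k, reg=1e-9):
    """Greedy D-optimality sketch: J[i] is the sensitivity (Jacobian row) of
    candidate observation i with respect to the model parameters.  At each
    step, add the observation maximizing log det(J_S^T J_S + reg*I)."""
    n, p = J.shape
    chosen, rest = [], list(range(n))

    def logdet(idx):
        Js = J[idx]
        return np.linalg.slogdet(Js.T @ Js + reg * np.eye(p))[1]

    for _ in range(k):
        best = max(rest, key=lambda i: logdet(chosen + [i]))
        chosen.append(best)
        rest.remove(best)
    return chosen

rng = np.random.default_rng(0)
J = rng.standard_normal((40, 3))   # 40 candidate observations, 3 parameters
picks = greedy_d_optimal(J, k=5)   # the 5 most informative locations
```

    Greedy selection is not guaranteed optimal, but for monotone submodular-like objectives of this kind it is the standard way to keep the combinatorial search tractable.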

  7. Arranging computer architectures to create higher-performance controllers

    NASA Technical Reports Server (NTRS)

    Jacklin, Stephen A.

    1988-01-01

    Techniques for integrating microprocessors, array processors, and other intelligent devices in control systems are reviewed, with an emphasis on the (re)arrangement of components to form distributed or parallel processing systems. Consideration is given to the selection of the host microprocessor, increasing the power and/or memory capacity of the host, multitasking software for the host, array processors to reduce computation time, the allocation of real-time and non-real-time events to different computer subsystems, intelligent devices to share the computational burden for real-time events, and intelligent interfaces to increase communication speeds. The case of a helicopter vibration-suppression and stabilization controller is analyzed as an example, and significant improvements in computation and throughput rates are demonstrated.

  8. The UP-TECH project, an intervention to support caregivers of Alzheimer's disease patients in Italy: preliminary findings on recruitment and caregiving burden in the baseline population.

    PubMed

    Chiatti, Carlos; Rimland, Joseph M; Bonfranceschi, Franco; Masera, Filippo; Bustacchini, Silvia; Cassetta, Laura

    2015-01-01

    The paper describes recruitment results and characteristics of the UP-TECH clinical trial sample, including level of care services use, informal caregiver burden and its determinants. UP-TECH is designed to test innovative care solutions for community-dwelling patients with moderate stage Alzheimer's disease and their caregivers in Italy. Four hundred and fifty patient-caregiver dyads were randomized into three arms receiving different combinations of services, composed of case management interventions, nurse visits, assistive technology and educational brochures. The research nurses administered a questionnaire comprising an in-depth socio-demographic assessment and several clinical scales, such as Novak's Caregiver Burden Inventory. Analyses of baseline data were conducted using uni- and bi-variate statistics. Linear regressions were computed to identify de-confounded correlates of caregiver burden. Four hundred and thirty-eight patient-caregiver dyads were recruited and randomized. In our sample, patients are predominantly women (71.5%), with an average age of 81.5 years and a mean Mini-Mental State Examination score of 16.2. Caregivers are mostly women (66.2%) and offspring (55.7%), with a mean caregiver burden score of 27.6. They provide more than 50 hours of care per week, while receiving an almost negligible support from public services. Factors associated with caregiver burden are female gender, kinship and the patient's behavioral disturbances. The most important factor associated with lower burden is the employment of a live-in care worker. The paper provides a comprehensive description of moderate stage Alzheimer's disease patients and their caregivers, suggesting useful markers of caregiver burden. The well-balanced randomization assures the reliability of the study data-set for prospective evaluation of care strategies.

  9. Computationally efficient algorithms for real-time attitude estimation

    NASA Technical Reports Server (NTRS)

    Pringle, Steven R.

    1993-01-01

    For many practical spacecraft applications, algorithms for determining spacecraft attitude must combine inputs from diverse sensors and provide redundancy in the event of sensor failure. A Kalman filter is suitable for this task; however, it may impose a computational burden that can be avoided by suboptimal methods. A suboptimal estimator is presented which was implemented successfully on the Delta Star spacecraft, which performed a 9-month SDI flight experiment in 1989. This design sought to minimize algorithm complexity to accommodate the limitations of an 8K guidance computer. The algorithm used is interpreted in the framework of Kalman filtering, and a derivation is given for the computation.
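    A constant-gain estimator of the kind the abstract describes can be sketched for a single axis: integrate the rate gyro, then apply a fixed measurement gain in place of the full Kalman covariance propagation. The gain, rates, and noiseless sensor below are illustrative assumptions, not the Delta Star design.

```python
def fixed_gain_attitude(gyro_rates, att_fixes, dt, k=0.1):
    """Fixed-gain ('steady-state Kalman-like') single-axis attitude sketch:
    propagate the angle with the rate gyro, then nudge it toward each
    attitude-sensor fix by a constant gain k.  Dropping the covariance
    update is what removes most of the Kalman filter's computational burden."""
    theta = att_fixes[0]
    estimates = []
    for w, z in zip(gyro_rates, att_fixes):
        theta += w * dt            # propagate with the gyro rate
        theta += k * (z - theta)   # constant-gain measurement update
        estimates.append(theta)
    return estimates

# constant 0.01 rad/s rotation with a noiseless attitude sensor, so the
# estimate should track the true angle closely
dt = 0.1
truth = [0.01 * dt * i for i in range(100)]
est = fixed_gain_attitude([0.01] * 100, truth, dt)
```

    With noisy sensors, k trades gyro drift rejection against measurement noise; a Kalman filter would compute that trade-off optimally at each step, at the cost the abstract is avoiding.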

  10. A radiobiological model of metastatic burden reduction for molecular radiotherapy: application to patients with bone metastases

    NASA Astrophysics Data System (ADS)

    Denis-Bacelar, Ana M.; Chittenden, Sarah J.; Murray, Iain; Divoli, Antigoni; McCready, V. Ralph; Dearnaley, David P.; O'Sullivan, Joe M.; Johnson, Bernadette; Flux, Glenn D.

    2017-04-01

    Skeletal tumour burden is a biomarker of prognosis and survival in cancer patients. This study proposes a novel method based on the linear quadratic model to predict the reduction in metastatic tumour burden as a function of the absorbed doses delivered from molecular radiotherapy treatments. The range of absorbed doses necessary to eradicate all the bone lesions and to reduce the metastatic burden was investigated in a cohort of 22 patients with bone metastases from castration-resistant prostate cancer. A metastatic burden reduction curve was generated for each patient, which predicts the reduction in metastatic burden as a function of the patient mean absorbed dose, defined as the mean of all the lesion absorbed doses in any given patient. In the patient cohort studied, the median of the patient mean absorbed dose predicted to reduce the metastatic burden by 50% was 89 Gy (interquartile range: 83-105 Gy), whilst a median of 183 Gy (interquartile range: 107-247 Gy) was found necessary to eradicate all metastases in a given patient. The absorbed dose required to eradicate all the lesions was strongly correlated with the variability of the absorbed doses delivered to multiple lesions in a given patient (r  =  0.98, P  <  0.0001). The metastatic burden reduction curves showed a potential large reduction in metastatic burden for a small increase in absorbed dose in 91% of patients. The results indicate the range of absorbed doses required to potentially obtain a significant survival benefit. The metastatic burden reduction method provides a simple tool that could be used in routine clinical practice for patient selection and to indicate the required administered activity to achieve a predicted patient mean absorbed dose and reduction in metastatic tumour burden.
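    The flavor of a linear-quadratic burden model can be sketched as follows, with heavy hedging: the quadratic beta term is dropped (often justified at the low dose rates of molecular radiotherapy), and the alpha value and equal lesion weights are illustrative assumptions, not the paper's fitted parameters.

```python
import math

def burden_remaining(lesion_doses_gy, alpha=0.14):
    """Fraction of the initial metastatic burden surviving treatment:
    each lesion's clonogen survival follows exp(-alpha * D), and lesions
    are weighted equally (i.e. equal initial volumes assumed)."""
    survivals = [math.exp(-alpha * d) for d in lesion_doses_gy]
    return sum(survivals) / len(survivals)

uniform = burden_remaining([50.0, 50.0, 50.0])        # same dose everywhere
heterogeneous = burden_remaining([10.0, 50.0, 90.0])  # same mean dose
```

    Even in this toy form, the heterogeneous case leaves far more residual burden at the same mean absorbed dose, because the coldest lesion dominates the sum; this echoes the paper's finding that the dose needed to eradicate all lesions correlates strongly with inter-lesion dose variability.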

  11. Disease burden attributed to alcohol: How methodological advances in the Global Burden of Disease 2013 study have changed the estimates in Sweden.

    PubMed

    Kellerborg, Klas; Danielsson, Anna-Karin; Allebeck, Peter; Coates, Matthew M; Agardh, Emilie

    2016-08-01

    The Global Burden of Disease (GBD) study continuously refines its estimates as new data and methods become available. In the latest iteration of the study, GBD 2013, changes were made related to the disease burden attributed to alcohol. The aim of this study was to briefly present these changes and to compare the disease burden attributed to alcohol in Swedish men and women in 2010 using previous and updated methods. In the GBD study, the contribution of alcohol to the burden of disease is estimated by theoretically assessing how much of the disease burden could be avoided by reducing the consumption of alcohol to zero. The updated methods mainly concern improved measurement of alcohol consumption, including less severe alcohol dependence, assigning the most severe injuries and removing the protective effect of drinking on cardiovascular diseases if combined with binge drinking. The overall disease burden attributed to alcohol in 2010 increased by 14% when using the updated methods. Women accounted for this overall increase, mainly because the updated methods led to an overall higher alcohol consumption in women. By contrast, the overall burden decreased in men, one reason being the lower overall alcohol consumption with the new methods. In men, the inclusion of less severe alcohol dependence resulted in a large decrease in the alcohol-attributed disease burden. This was, however, evened out to a great extent by the increase in cardiovascular disease and injuries. Conclusions: When using the updated GBD methods, the overall disease burden attributed to alcohol increased in women, but not in men. © 2016 the Nordic Societies of Public Health.
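
The GBD-style counterfactual ("how much burden would be avoided at zero consumption") is commonly operationalised through the population attributable fraction (PAF). A minimal sketch, with made-up exposure prevalences and relative risks:

```python
def attributable_fraction(prevalence, relative_risk):
    """Population attributable fraction for a categorical exposure:
    PAF = sum(p_i*(RR_i - 1)) / (sum(p_i*(RR_i - 1)) + 1),
    i.e. the share of burden avoided if exposure dropped to zero."""
    excess = sum(p * (rr - 1.0) for p, rr in zip(prevalence, relative_risk))
    return excess / (excess + 1.0)

# Illustrative drinking categories (abstainers excluded): light, heavy, binge
prev = [0.30, 0.10, 0.05]        # hypothetical category prevalences
rr   = [1.1, 1.8, 2.5]           # hypothetical relative risks for a cause
paf = attributable_fraction(prev, rr)
attributable_dalys = paf * 100_000   # hypothetical total DALYs for the cause
```

Changing the consumption measurements or the relative risks (as the GBD 2013 update did) moves the PAF directly, which is how the methodological revisions translate into revised burden estimates.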

  12. Pharmacogenomic Research in South Africa: Lessons Learned and Future Opportunities in the Rainbow Nation.

    PubMed

    Warnich, Louise; Drögemöller, Britt I; Pepper, Michael S; Dandara, Collet; Wright, Galen E B

    2011-09-01

    South Africa, like many other developing countries, stands to benefit from novel diagnostics and drugs developed with pharmacogenomics guidance, given the high prevalence of disease burden in the region. This includes both communicable (e.g., HIV/AIDS and tuberculosis) and non-communicable (e.g., diabetes and cardiovascular) diseases. For example, although only 0.7% of the world's population lives in South Africa, the country carries 17% of the global HIV/AIDS burden and 5% of the global tuberculosis burden. Nobel Peace Prize Laureate Archbishop Emeritus Desmond Tutu coined the term Rainbow Nation, referring to a land of wealth in its many diverse peoples and cultures. It is now timely and necessary to reflect on how best to approach new genomics biotechnologies in a manner that carefully considers the public health needs and extant disease burden in the region. The aim of this paper is to document and review the advances in pharmacogenomics in South Africa and, importantly, to evaluate the direction that future research should take. Previous research has shown that the populations in South Africa exhibit unique allele frequencies and novel genetic variation in pharmacogenetically relevant genes, often differing from other African and global populations. The high level of genetic diversity, low linkage disequilibrium and the presence of rare variants in these populations call into question the feasibility of using current commercially available genotyping platforms, and may partially account for the genotype-phenotype discordance observed in past studies. However, the employment of high-throughput technologies for genomic research, within the context of large clinical trials, combined with interdisciplinary studies and appropriate regulatory guidelines, should help accelerate pharmacogenomic discoveries in high-priority therapeutic areas in South Africa. Finally, we suggest that projects such as the H3Africa Initiative, the SAHGP and PGENI should play an integral role in the coordination of genomic research in South Africa, as well as in other African countries, by providing infrastructure and capital to local researchers, and by helping to address the computational and statistical bottlenecks encountered at present.

  13. Pharmacogenomic Research in South Africa: Lessons Learned and Future Opportunities in the Rainbow Nation

    PubMed Central

    Warnich, Louise; Drögemöller, Britt I; Pepper, Michael S; Dandara, Collet; Wright, Galen E.B

    2011-01-01

    South Africa, like many other developing countries, stands to benefit from novel diagnostics and drugs developed with pharmacogenomics guidance, given the high prevalence of disease burden in the region. This includes both communicable (e.g., HIV/AIDS and tuberculosis) and non-communicable (e.g., diabetes and cardiovascular) diseases. For example, although only 0.7% of the world’s population lives in South Africa, the country carries 17% of the global HIV/AIDS burden and 5% of the global tuberculosis burden. Nobel Peace Prize Laureate Archbishop Emeritus Desmond Tutu coined the term Rainbow Nation, referring to a land of wealth in its many diverse peoples and cultures. It is now timely and necessary to reflect on how best to approach new genomics biotechnologies in a manner that carefully considers the public health needs and extant disease burden in the region. The aim of this paper is to document and review the advances in pharmacogenomics in South Africa and, importantly, to evaluate the direction that future research should take. Previous research has shown that the populations in South Africa exhibit unique allele frequencies and novel genetic variation in pharmacogenetically relevant genes, often differing from other African and global populations. The high level of genetic diversity, low linkage disequilibrium and the presence of rare variants in these populations call into question the feasibility of using current commercially available genotyping platforms, and may partially account for the genotype-phenotype discordance observed in past studies. However, the employment of high-throughput technologies for genomic research, within the context of large clinical trials, combined with interdisciplinary studies and appropriate regulatory guidelines, should help accelerate pharmacogenomic discoveries in high-priority therapeutic areas in South Africa. Finally, we suggest that projects such as the H3Africa Initiative, the SAHGP and PGENI should play an integral role in the coordination of genomic research in South Africa, as well as in other African countries, by providing infrastructure and capital to local researchers, and by helping to address the computational and statistical bottlenecks encountered at present. PMID:22563365

  14. The vulnerable plaque: the real villain in acute coronary syndromes.

    PubMed

    Liang, Michael; Puri, Aniket; Devlin, Gerard

    2011-01-01

    The term "vulnerable plaque" refers to a vascular lesion that is prone to rupture and may result in life-threatening events, including myocardial infarction. It consists of a thin-cap fibroatheroma with a large lipid core which is highly thrombogenic. Acute coronary syndromes often result from rupture of vulnerable plaques, which are frequently only moderately stenosed and not visible by conventional angiography. Several invasive and non-invasive strategies have been developed to assess the burden of vulnerable plaques. Intravascular ultrasound provides a two-dimensional cross-sectional image of the arterial wall and can help assess the plaque burden and composition. Optical coherence tomography offers superior resolution over intravascular ultrasound. High-resolution magnetic resonance imaging provides non-invasive imaging for visualizing fibrous cap thickness and rupture in plaques. In addition, it may be of value in assessing the effects of treatments, such as lipid-lowering therapy. Technical issues, however, limit its clinical applicability. The role of multi-slice computed tomography, a well-established screening tool for coronary artery disease, remains to be determined. Fractional flow reserve (FFR) may provide physiological functional assessment of plaque vulnerability; however, its role in the management of vulnerable plaque requires further study. Treatment of the vulnerable patient may involve systemic therapy, which currently includes statins, ACE inhibitors, beta-blockers, aspirin, and calcium-channel blockers, and in the future local therapeutic options such as drug-eluting stents or photodynamic therapy.

  15. SELECTION AND CALIBRATION OF SUBSURFACE REACTIVE TRANSPORT MODELS USING A SURROGATE-MODEL APPROACH

    EPA Science Inventory

    While standard techniques for uncertainty analysis have been successfully applied to groundwater flow models, extension to reactive transport is frustrated by numerous difficulties, including excessive computational burden and parameter non-uniqueness. This research introduces a...

  16. Trusted Computing Exemplar: Life Cycle Management Plan

    DTIC Science & Technology

    2014-12-12

    Further distribution of all or part of this report is authorized.

  17. Investigating burden of informal caregivers in England, Finland and Greece: an analysis with the short form of the Burden Scale for Family Caregivers (BSFC-s).

    PubMed

    Konerding, Uwe; Bowen, Tom; Forte, Paul; Karampli, Eleftheria; Malmström, Tomi; Pavi, Elpida; Torkki, Paulus; Graessel, Elmar

    2018-02-01

    The burden of informal caregivers might show itself in different ways in different cultures. Understanding these differences is important for developing culture-specific measures aimed at alleviating caregiver burden. Hitherto, no findings regarding such cultural differences between European countries have been available. In this paper, differences between English, Finnish and Greek informal caregivers of people with dementia are investigated. A secondary analysis was performed with data from 36 English, 42 Finnish and 46 Greek caregivers obtained with the short form of the Burden Scale for Family Caregivers (BSFC-s). The probabilities of endorsing the BSFC-s items were investigated by computing a logit model with items and countries as categorical factors. Statistically significant deviation of the data from this model was taken as evidence of country-specific response patterns. The two-factorial logit model explains the responses to the items quite well (McFadden's pseudo-R-square: 0.77). There are, however, also statistically significant deviations (p < 0.05). English caregivers have a stronger tendency to endorse items addressing impairments in individual well-being; Finnish caregivers have a stronger tendency to endorse items addressing the conflict between the demands of care and those of the remaining social life; and Greek caregivers have a stronger tendency to endorse items addressing impairments in physical health. Caregiver burden shows itself differently in English, Finnish and Greek caregivers. Accordingly, measures for alleviating caregiver burden in these three countries should address different aspects of the caregivers' lives.
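
A two-factor logit model of this kind, summarised by McFadden's pseudo-R-square, can be reproduced on synthetic endorsement data along the following lines. The item and country effects below are arbitrary, and the fit uses a plain Newton-Raphson (IRLS) loop rather than any particular statistics package:

```python
import numpy as np

rng = np.random.default_rng(0)
n_items, n_countries, n_resp = 10, 3, 40

# Synthetic endorsements: each observation is (item, country, endorsed 0/1)
size = n_items * n_countries * n_resp
item = rng.integers(0, n_items, size=size)
country = rng.integers(0, n_countries, size=size)
logit_true = 0.3 * item - 0.5 * country - 1.0   # arbitrary "true" effects
y = (rng.random(size) < 1 / (1 + np.exp(-logit_true))).astype(float)

# Design matrix: intercept + item dummies + country dummies (reference coding)
X = np.column_stack([np.ones(size)]
                    + [(item == i).astype(float) for i in range(1, n_items)]
                    + [(country == c).astype(float) for c in range(1, n_countries)])

def fit_logit(X, y, iters=50):
    """Logistic regression by Newton-Raphson; returns coefficient vector."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1 / (1 + np.exp(-X @ beta))
        H = X.T @ (X * (p * (1 - p))[:, None])        # Hessian of the log-lik
        beta += np.linalg.solve(H + 1e-8 * np.eye(X.shape[1]), X.T @ (y - p))
    return beta

def log_lik(X, y, beta):
    p = np.clip(1 / (1 + np.exp(-X @ beta)), 1e-12, 1 - 1e-12)
    return float(np.sum(y * np.log(p) + (1 - y) * np.log(1 - p)))

ll_model = log_lik(X, y, fit_logit(X, y))
ones = np.ones((size, 1))
ll_null = log_lik(ones, y, fit_logit(ones, y))        # intercept-only model
mcfadden_r2 = 1 - ll_model / ll_null
```

McFadden's pseudo-R-square compares the fitted log-likelihood with that of an intercept-only model; values near 0.77, as in the study, indicate that the two factors alone reproduce the endorsement probabilities well.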

  18. The Humanistic Burden of Type 1 Diabetes Mellitus in Europe: Examining Health Outcomes and the Role of Complications.

    PubMed

    Rydén, Anna; Sörstadius, Elisabeth; Bergenheim, Klas; Romanovschi, Alexandru; Thorén, Fredrik; Witt, Edward A; Sternhufvud, Catarina

    2016-01-01

    Diagnoses of Type 1 Diabetes Mellitus (T1DM) in Europe appear to be on the rise. Therefore, it is imperative that researchers understand the potential impact that increases in prevalence could have on the affected individuals as well as on society as a whole. Accordingly, this study examined the humanistic and economic burden of T1DM in patients relative to those without the condition across a number of health outcomes, including health status, work productivity loss, activity impairment, and healthcare resource use. Survey data from a large, representative sample of EU adults (the EU National Health and Wellness Survey) were examined. Results suggest that overall burden is higher for those diagnosed with T1DM than for respondents without diabetes, and that burden increases as complications associated with T1DM increase. Taken together, these results suggest that treatment strategies for T1DM should balance clinical, humanistic, and economic burden, and that patients should be educated on the role of complications in disease outcomes.

  19. Role of Social Support in Predicting Caregiver Burden

    PubMed Central

    Rodakowski, Juleen; Skidmore, Elizabeth R.; Rogers, Joan C.; Schulz, Richard

    2012-01-01

    Objective To examine the unique contribution of social support to burden in caregivers of adults aging with spinal cord injuries (SCI). Design Secondary analyses of cross-sectional data from a large cohort of adults aging with SCI and their primary caregivers. Setting Multiple community locations in Pittsburgh, PA, and Miami, FL. Participants Caregivers of community-dwelling adults aging with SCI (n=173) were interviewed as part of a multisite randomized clinical trial. The mean age of caregivers was 53 years (SD=15) and of care-recipients 55 years (SD=13). Interventions Not applicable. Main Outcome Measures The primary outcome was caregiver burden measured with the Abridged Version of the Zarit Burden Interview. A hierarchical multiple regression analysis examined the effect of social support (social integration, received social support, and negative social interactions) on burden in caregivers of adults aging with SCI, while controlling for demographic and caregiving characteristics. Results After controlling for demographic and caregiving characteristics, social integration (β̂ =−.16, P<.05), received social support (β̂ =−.15, P<.05), and negative social interactions (β̂ =.21, P<.01) were significant independent predictors of caregiver burden. Conclusions Findings demonstrate that social support is an important factor associated with burden in caregivers of adults aging with SCI. Social support should be considered in assessments and interventions designed to identify and reduce caregiver burden. PMID:22824248
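
The hierarchical-regression logic (enter a demographics block first, then the social-support block, and read off the added explained variance) can be sketched on synthetic data as below. All variable names and effect sizes are hypothetical, with the support coefficients loosely echoing the abstract's reported betas:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 173  # cohort size from the abstract

# Synthetic predictors: demographics/caregiving block, then support block
age = rng.normal(53, 15, n)
hours_care = rng.normal(30, 10, n)
social_integration = rng.normal(0, 1, n)
received_support = rng.normal(0, 1, n)
negative_interactions = rng.normal(0, 1, n)
burden = (0.01 * age + 0.05 * hours_care
          - 0.16 * social_integration - 0.15 * received_support
          + 0.21 * negative_interactions + rng.normal(0, 1, n))

def r_squared(X, y):
    """R^2 from an ordinary least-squares fit with intercept."""
    X1 = np.column_stack([np.ones(len(y)), X])
    coef, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ coef
    return 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

step1 = r_squared(np.column_stack([age, hours_care]), burden)
step2 = r_squared(np.column_stack([age, hours_care, social_integration,
                                   received_support, negative_interactions]),
                  burden)
delta_r2 = step2 - step1   # unique contribution of the social-support block
```

The increment `delta_r2` is the quantity a hierarchical analysis reports as the social-support block's unique contribution after the control variables are already in the model.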

  20. Effects of Turbulence Model on Prediction of Hot-Gas Lateral Jet Interaction in a Supersonic Crossflow

    DTIC Science & Technology

    2015-07-01

    The three-dimensional, compressible, Reynolds-averaged Navier-Stokes (RANS) equations are solved using a finite-volume method with point-implicit time integration. High-performance computing time was provided by the US Department of Defense (DOD) High Performance Computing Modernization program at the US Army Research Laboratory.

  1. A geostatistical approach to the change-of-support problem and variable-support data fusion in spatial analysis

    NASA Astrophysics Data System (ADS)

    Wang, Jun; Wang, Yang; Zeng, Hui

    2016-01-01

    A key issue to address in synthesizing spatial data with variable support in spatial analysis and modeling is the change-of-support problem. We present an approach for solving the change-of-support and variable-support data fusion problems. This approach is based on geostatistical inverse modeling that explicitly accounts for differences in spatial support. The inverse model is applied here to produce both the best predictions on a target support and prediction uncertainties, based on one or more measurements, while honoring those measurements. Spatial data covering large geographic areas often exhibit spatial nonstationarity, and their large size can pose computational challenges. We developed a local-window geostatistical inverse modeling approach to accommodate spatial nonstationarity and alleviate the computational burden. We conducted experiments using synthetic and real-world raster data. Synthetic data were generated and aggregated to multiple supports and downscaled back to the original support to analyze the accuracy of spatial predictions and the correctness of prediction uncertainties. Similar experiments were conducted for real-world raster data. Real-world data with variable support were statistically fused to produce single-support predictions and associated uncertainties. The modeling results demonstrate that geostatistical inverse modeling can produce accurate predictions and associated prediction uncertainties. It is shown that the suggested local-window geostatistical inverse modeling approach offers a practical way to solve the well-known change-of-support problem and the variable-support data fusion problem in spatial analysis and modeling.
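
The core of a geostatistical inverse model with explicit support differences can be sketched in a few lines: a linear aggregation operator H maps the fine support to the measurement support, and the best fine-support prediction is a kriging-type update that honors the coarse measurements. The 1-D grid, covariance model, and parameters below are illustrative, not the authors' configuration:

```python
import numpy as np

n = 60                      # fine-support grid cells (1-D for clarity)
x = np.arange(n, dtype=float)

# Prior covariance on the fine support: exponential model (range/sill assumed)
Q = np.exp(-np.abs(x[:, None] - x[None, :]) / 10.0)

# Aggregation operator H: each coarse observation is the mean of a 10-cell block
H = np.zeros((n // 10, n))
for b in range(n // 10):
    H[b, 10 * b:10 * (b + 1)] = 1.0 / 10.0

rng = np.random.default_rng(2)
truth = rng.multivariate_normal(np.zeros(n), Q)
z = H @ truth               # coarse-support (block-averaged) measurements

R = 1e-8 * np.eye(len(z))   # tiny nugget so measurements are honored
gain = Q @ H.T @ np.linalg.inv(H @ Q @ H.T + R)
s_hat = gain @ z            # best prediction on the fine (target) support
V = Q - gain @ H @ Q        # posterior covariance = prediction uncertainty
```

Re-aggregating the downscaled field with H recovers the measurements almost exactly, which is the "honoring measurements" property the abstract refers to; the diagonal of V gives the per-cell prediction uncertainty.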

  2. Management of eWork health issues: a new perspective on an old problem.

    PubMed

    Kirk, Elizabeth; Strong, Jenny

    2010-01-01

    Contact centres are vehicles for a rapidly growing group of knowledge workers, or eWorkers. Using computers and high-speed telecommunications connections as work tools, these employees spend long hours performing mentally demanding work while maintaining static, physically stressful, seated positions. The complex interplay between job demands, work environment, and individual differences combine to produce high levels of physical discomfort among eWorkers. This paper discusses a new view that has emerged, one that focuses on the management rather than the elimination of work related upper limb disorders (WRULD) and computer vision syndrome (CVS) issues that are prevalent among eWorkers. It also reviews a cultural shift among practitioners and business that moves towards a consultative process and the sharing of knowledge among all stakeholders. The controlled work conditions and large single location workforce found within contact centres provide the opportunity to understand the personal and industry cost of eWork injuries and the ability to develop and review new multifaceted interventions. Advances in training and workplace design aimed at decreasing discomfort and injury and reducing the associated economic burden may then be adapted for all eWorkforce groups.

  3. Design and implementation of a high performance network security processor

    NASA Astrophysics Data System (ADS)

    Wang, Haixin; Bai, Guoqiang; Chen, Hongyi

    2010-03-01

    The last few years have seen significant progress in the field of application-specific processors. One example is network security processors (NSPs), which perform various cryptographic operations specified by network security protocols and help to offload the computation-intensive burdens from network processors (NPs). This article presents a high performance NSP system architecture implementation intended for both internet protocol security (IPSec) and secure socket layer (SSL) protocol acceleration, which are widely employed in virtual private network (VPN) and e-commerce applications. The efficient dual one-way pipelined data transfer skeleton and optimised integration scheme of the heterogeneous parallel crypto engine arrays lead to a Gbps-rate NSP, which is programmable with domain-specific descriptor-based instructions. The descriptor-based control flow fragments large data packets and distributes them to the crypto engine arrays, which fully utilises the parallel computation resources and improves the overall system data throughput. A prototyping platform for this NSP design is implemented with a Xilinx XC3S5000-based FPGA chip set. Results show that the design gives a peak throughput for the IPSec ESP tunnel mode of 2.85 Gbps with over 2100 full SSL handshakes per second at a clock rate of 95 MHz.

  4. An approximation method for improving dynamic network model fitting.

    PubMed

    Carnegie, Nicole Bohme; Krivitsky, Pavel N; Hunter, David R; Goodreau, Steven M

    There has been a great deal of interest recently in the modeling and simulation of dynamic networks, i.e., networks that change over time. One promising model is the separable temporal exponential-family random graph model (ERGM) of Krivitsky and Handcock, which treats the formation and dissolution of ties in parallel at each time step as independent ERGMs. However, the computational cost of fitting these models can be substantial, particularly for large, sparse networks. Fitting cross-sectional models for observations of a network at a single point in time, while still a non-negligible computational burden, is much easier. This paper examines model fitting when the available data consist of independent measures of cross-sectional network structure and the duration of relationships under the assumption of stationarity. We introduce a simple approximation to the dynamic parameters for sparse networks with relationships of moderate or long duration and show that the approximation method works best in precisely those cases where parameter estimation is most likely to fail: networks with very little change at each time step. We consider a variety of cases: Bernoulli formation and dissolution of ties, independent-tie formation and Bernoulli dissolution, independent-tie formation and dissolution, and dependent-tie formation models.
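
For the simplest Bernoulli case, the idea of recovering dynamic parameters from cross-sectional density and mean relationship duration can be sketched as follows (a simplified reading of the approach, not the authors' code): the per-step dissolution hazard is the reciprocal of the mean duration, and the formation probability is chosen so that the target density is stationary.

```python
import numpy as np

def approx_dynamic_params(density, mean_duration):
    """Bernoulli-case approximation: dissolution hazard from mean tie
    duration, formation probability chosen so that the target
    cross-sectional density is the stationary state."""
    p_diss = 1.0 / mean_duration
    p_form = density * p_diss / (1.0 - density)   # balance: y*p_diss = (1-y)*p_form
    return p_form, p_diss

def simulate(n_dyads, p_form, p_diss, steps, seed=3):
    """Simulate independent dyads forming/dissolving ties each step."""
    rng = np.random.default_rng(seed)
    ties = np.zeros(n_dyads, dtype=bool)
    for _ in range(steps):
        form = (~ties) & (rng.random(n_dyads) < p_form)
        keep = ties & (rng.random(n_dyads) >= p_diss)
        ties = form | keep
    return ties.mean()

p_form, p_diss = approx_dynamic_params(density=0.05, mean_duration=20)
final_density = simulate(200_000, p_form, p_diss, steps=400)
```

Simulating a long run of independent dyads with these two probabilities recovers the target cross-sectional density, which is exactly the consistency property the approximation relies on.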

  5. An unsupervised technique for optimal feature selection in attribute profiles for spectral-spatial classification of hyperspectral images

    NASA Astrophysics Data System (ADS)

    Bhardwaj, Kaushal; Patra, Swarnajyoti

    2018-04-01

    Inclusion of spatial information along with spectral features plays a significant role in the classification of remote sensing images. Attribute profiles have already proved their ability to represent spatial information. In order to incorporate proper spatial information, multiple attributes are required, and for each attribute large profiles need to be constructed by varying the filter parameter values within a wide range. Thus, the constructed profiles that represent the spectral-spatial information of a hyperspectral image have huge dimension, which leads to the Hughes phenomenon and increases the computational burden. To mitigate these problems, this work presents an unsupervised feature selection technique that selects, from the constructed high-dimensional multi-attribute profile, a subset of filtered images that is sufficiently informative to discriminate well among classes. In this regard, the proposed technique exploits genetic algorithms (GAs). The fitness function of the GAs is defined in an unsupervised way with the help of mutual information. The effectiveness of the proposed technique is assessed using a one-against-all support vector machine classifier. The experiments conducted on three hyperspectral data sets show the robustness of the proposed method in terms of computation time and classification accuracy.
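
A genetic-algorithm feature selector of the general kind described can be sketched as below. The unsupervised fitness here rewards feature variance and penalises mean pairwise correlation as a crude stand-in for the paper's mutual-information criterion; all data shapes, rates, and population sizes are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(4)
n_samples, n_feat = 300, 30
X = rng.normal(size=(n_samples, n_feat))
X[:, 10:20] = X[:, :10] + 0.05 * rng.normal(size=(n_samples, 10))  # redundant copies

def fitness(mask):
    """Unsupervised score: mean feature variance minus mean absolute
    pairwise correlation (a stand-in for the MI-based criterion)."""
    idx = np.flatnonzero(mask)
    if len(idx) < 2:
        return -np.inf
    sub = X[:, idx]
    relevance = sub.var(axis=0).mean()
    corr = np.abs(np.corrcoef(sub, rowvar=False))
    redundancy = (corr.sum() - len(idx)) / (len(idx) * (len(idx) - 1))
    return relevance - redundancy

def genetic_select(pop_size=40, gens=60, p_mut=0.02):
    """Bit-mask GA: truncation selection, one-point crossover, bit-flip mutation."""
    pop = rng.random((pop_size, n_feat)) < 0.3
    for _ in range(gens):
        scores = np.array([fitness(m) for m in pop])
        parents = pop[np.argsort(scores)[::-1][: pop_size // 2]]
        cut = rng.integers(1, n_feat, size=len(parents))
        kids = np.array([np.concatenate([parents[i][: cut[i]],
                                         parents[(i + 1) % len(parents)][cut[i]:]])
                         for i in range(len(parents))])
        kids ^= rng.random(kids.shape) < p_mut
        pop = np.vstack([parents, kids])
    scores = np.array([fitness(m) for m in pop])
    return pop[int(np.argmax(scores))]

best = genetic_select()
```

The selector tends to avoid picking both a feature and its near-duplicate, which is the behaviour the abstract's redundancy-aware, unsupervised fitness is designed to produce.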

  6. Efficient combination of a 3D Quasi-Newton inversion algorithm and a vector dual-primal finite element tearing and interconnecting method

    NASA Astrophysics Data System (ADS)

    Voznyuk, I.; Litman, A.; Tortel, H.

    2015-08-01

    A Quasi-Newton method for reconstructing the constitutive parameters of three-dimensional (3D) penetrable scatterers from scattered field measurements is presented. This method is adapted for handling large-scale electromagnetic problems while keeping the memory requirements and computation time as low as possible. The forward scattering problem is solved by applying the finite-element tearing and interconnecting full-dual-primal (FETI-FDP2) method, which shares the same spirit as the domain decomposition methods for finite element methods. The idea is to split the computational domain into smaller non-overlapping sub-domains in order to simultaneously solve local sub-problems. Various strategies are proposed in order to efficiently couple the inversion algorithm with the FETI-FDP2 method: a separation into permanent and non-permanent sub-domains is performed, iterative solvers are favored for resolving the interface problem, and a marching-on-in-anything initial guess selection further accelerates the process. The computational burden is also reduced by applying the adjoint state vector methodology. Finally, the inversion algorithm is validated against measurements extracted from the 3D Fresnel database.

  7. The impact of individual-level heterogeneity on estimated infectious disease burden: a simulation study.

    PubMed

    McDonald, Scott A; Devleesschauwer, Brecht; Wallinga, Jacco

    2016-12-08

    Disease burden is not evenly distributed within a population; this uneven distribution can be due to individual heterogeneity in progression rates between disease stages. Composite measures of disease burden that are based on disease progression models, such as the disability-adjusted life year (DALY), are widely used to quantify the current and future burden of infectious diseases. Our goal was to investigate to what extent ignoring the presence of heterogeneity could bias DALY computation. Simulations using individual-based models for hypothetical infectious diseases with short and long natural histories were run assuming either "population-averaged" progression probabilities between disease stages, or progression probabilities that were influenced by an a priori defined individual-level frailty (i.e., heterogeneity in disease risk) distribution, and DALYs were calculated. Under the assumption of heterogeneity in transition rates and increasing frailty with age, the short natural history disease model predicted 14% fewer DALYs compared with the homogeneous population assumption. Simulations of a long natural history disease indicated that assuming homogeneity in transition rates when heterogeneity was present could overestimate total DALYs, in the present case by 4% (95% quantile interval: 1-8%). The consequences of ignoring population heterogeneity should be considered when defining transition parameters for natural history models and when interpreting the resulting disease burden estimates.
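
The individual-based setup can be illustrated with a minimal two-stage simulation; all transition probabilities, disability weights, and durations below are invented for illustration. Note that with a mean-1 frailty acting linearly on a single transition the expected totals coincide, so the biases reported above arise from richer structure (e.g., frailty increasing with age across chained stages) than this sketch includes.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 100_000
p_severe, p_die = 0.10, 0.30          # stage-transition probabilities (illustrative)
dw_mild, dw_severe = 0.05, 0.40       # disability weights (illustrative)
dur_mild, dur_severe, yll = 0.5, 2.0, 15.0

def total_dalys(frailty):
    """DALYs for a two-stage natural history: everyone has a mild episode,
    a frailty-scaled fraction progresses to severe, of whom some die.
    DALYs = years lived with disability (YLD) + years of life lost (YLL)."""
    p_ind = np.clip(p_severe * frailty, 0, 1)    # individual progression risk
    severe = rng.random(n) < p_ind
    dead = severe & (rng.random(n) < p_die)
    yld = dw_mild * dur_mild * n + dw_severe * dur_severe * severe.sum()
    return yld + yll * dead.sum()

homogeneous = total_dalys(np.ones(n))
# Gamma-distributed frailty with mean 1 (heterogeneity in disease risk)
heterogeneous = total_dalys(rng.gamma(shape=2.0, scale=0.5, size=n))
```

Making the frailty interact with age or with successive transitions, as in the study, is what drives the homogeneous and heterogeneous totals apart systematically rather than only through sampling noise.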

  8. Measuring the burden of preventable diabetic hospitalisations in the Mexican Institute of Social Security (IMSS).

    PubMed

    Lugo-Palacios, David G; Cairns, John; Masetto, Cynthia

    2016-08-02

    The prevalence of diabetes among adults in Mexico has increased markedly from 6.7 % in 1994 to 14.7 % in 2015. Although the main diabetic complications can be prevented or delayed with timely and effective primary care, a high percentage of diabetic patients have developed them, imposing an important preventable burden on Mexican society and on the health system. This paper estimates the financial and health burden caused by potentially preventable hospitalisations due to diabetic complications in hospitals operated by the largest social security institution in Latin America, the Mexican Institute of Social Security (IMSS), in the period 2007-2014. Hospitalisations in IMSS hospitals whose main cause was a diabetic complication were identified. The financial burden was estimated using IMSS diagnosis-related groups. To estimate the health burden, DALYs were computed under the assumption that patients would not have experienced complications if they had received timely and effective primary care. A total of 322,977 hospitalisations due to five diabetic complications were identified during the period studied, of which hospitalisations due to kidney failure and diabetic foot represent 78 %. The financial burden increased by 8.4 % in real terms between 2007 and 2014. However, when measured as cost per IMSS affiliate, it decreased by 11.3 %. The health burden had an overall decrease of 13.6 % and the associated DALYs in 2014 reached 103,688. Resources used for the hospital treatment of diabetic complications are then not available for other health care interventions. In order to prevent these hospitalisations, more resources might need to be invested in primary care; the first step could be to consider the financial burden of these hospitalisations as a potential target for switching resources from hospital care to primary care services. However, more evidence of the effectiveness of different primary care interventions is needed to know how much of the burden could be prevented by better primary care.

  9. Physical and Mental Health Status of Gulf War and Gulf Era Veterans: Results From a Large Population-Based Epidemiological Study.

    PubMed

    Dursa, Erin K; Barth, Shannon K; Schneiderman, Aaron I; Bossarte, Robert M

    2016-01-01

    The aim of the study was to report the mental and physical health of a population-based cohort of Gulf War and Gulf Era veterans 20 years after the war. A multimode (mail, Web, or computer-assisted telephone interviewing) health survey of 14,252 Gulf War and Gulf Era veterans was conducted. The survey consisted of questions about general, physical, mental, reproductive, and functional health. Gulf War veterans report a higher prevalence of almost all queried physical and mental health conditions. The population as a whole, however, has a significant burden of disease, including high body mass index and multiple comorbid conditions. Gulf War veterans continue to report poorer health than Gulf Era veterans, 20 years after the war. Chronic disease management and interventions to improve health and wellness among both Gulf War and Gulf Era veterans are necessary.

  10. Coordinated scheduling for dynamic real-time systems

    NASA Technical Reports Server (NTRS)

    Natarajan, Swaminathan; Zhao, Wei

    1994-01-01

    In this project, we addressed issues in coordinated scheduling for dynamic real-time systems. In particular, we concentrated on the design and implementation of a new distributed real-time system called R-Shell. The design objective of R-Shell is to provide computing support for space programs that have large, complex, fault-tolerant distributed real-time applications. In R-Shell, the approach is based on the concept of scheduling agents, which reside in the application run-time environment, and are customized to provide just those resource management functions which are needed by the specific application. With this approach, we avoid the need for a sophisticated OS which provides a variety of generalized functionality, while still not burdening application programmers with heavy responsibility for resource management. In this report, we discuss the R-Shell approach, summarize the achievements of the project, and describe a preliminary prototype of the R-Shell system.

  11. Comparison of Manual and Automated Measurements of Tracheobronchial Airway Geometry in Three Balb/c Mice.

    PubMed

    Islam, Asef; Oldham, Michael J; Wexler, Anthony S

    2017-11-01

    Mammalian lungs comprise large numbers of tracheobronchial airways that transition from the trachea to the alveoli. Studies as wide-ranging as pollutant deposition and lung development rely on accurate characterization of these airways. Advancements in CT imaging and the value of computational approaches in eliminating the burden of manual measurement are providing increased efficiency in obtaining these geometric data. In this study, we compare an automated method to a manual one for the first six generations of three Balb/c mouse lungs. We find good agreement between manual and automated methods and that much of the disagreement can be attributed to method precision. Using the automated method, we then provide anatomical data for the entire tracheobronchial airway tree from three Balb/c mice. Anat Rec, 300:2046-2057, 2017. © 2017 Wiley Periodicals, Inc.

  12. Blending Qualitative and Computational Linguistics Methods for Fidelity Assessment: Experience with the Familias Unidas Preventive Intervention.

    PubMed

    Gallo, Carlos; Pantin, Hilda; Villamar, Juan; Prado, Guillermo; Tapia, Maria; Ogihara, Mitsunori; Cruden, Gracelyn; Brown, C Hendricks

    2015-09-01

    Careful fidelity monitoring and feedback are critical to implementing effective interventions. A wide range of procedures exist to assess fidelity; most are derived from observational assessments (Schoenwald and Garland, Psychol Assess 25:146-156, 2013). However, these fidelity measures are resource intensive for research teams in efficacy/effectiveness trials, and are often unattainable or unmanageable for the host organization to rate when the program is implemented on a large scale. We present a first step towards automated processing of linguistic patterns in fidelity monitoring of a behavioral intervention using an innovative mixed methods approach to fidelity assessment that uses rule-based, computational linguistics to overcome major resource burdens. Data come from an effectiveness trial of the Familias Unidas intervention, an evidence-based, family-centered preventive intervention found to be efficacious in reducing conduct problems, substance use and HIV sexual risk behaviors among Hispanic youth. This computational approach focuses on "joining," which measures the quality of the working alliance of the facilitator with the family. Quantitative assessments of reliability are provided. Kappa scores between a human rater and a machine rater for the new method for measuring joining reached 0.83. Early findings suggest that this approach can reduce the high cost of fidelity measurement and the time delay between fidelity assessment and feedback to facilitators; it also has the potential for improving the quality of intervention fidelity ratings.
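The reliability statistic reported above is Cohen's kappa, which corrects raw rater agreement for the agreement expected by chance. A minimal sketch of the computation, on invented toy fidelity codes rather than the study's transcripts:

```python
from collections import Counter

def cohen_kappa(rater_a, rater_b):
    """Cohen's kappa: rater agreement corrected for chance agreement."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    # Chance agreement: probability both raters emit the same code independently
    expected = sum(counts_a[c] * counts_b.get(c, 0) for c in counts_a) / n ** 2
    return (observed - expected) / (1 - expected)

# Invented per-turn fidelity codes: 1 = "joining" behavior present, 0 = absent
human =   [1, 1, 0, 1, 0, 0, 1, 1, 0, 1]
machine = [1, 1, 0, 1, 0, 1, 1, 1, 0, 1]
print(round(cohen_kappa(human, machine), 2))
```

A kappa of 0.83, as reported for the joining measure, indicates strong agreement beyond chance; values near 0 mean the raters agree no more often than chance would predict.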

  13. Text processing for technical reports (direct computer-assisted origination, editing, and output of text)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    De Volpi, A.; Fenrick, M. R.; Stanford, G. S.

    1980-10-01

    Documentation often is a primary residual of research and development. Because of this important role and because of the large amount of time consumed in generating technical reports, particularly those containing formulas and graphics, an existing data-processing computer system has been adapted to provide text processing of technical documents. Emphasis has been on accuracy, turnaround time, and time savings for staff and secretaries, for the types of reports normally produced in the reactor development program. The computer-assisted text-processing system, called TXT, has been implemented to benefit primarily the originator of technical reports. The system is of particular value to professional staff, such as scientists and engineers, who have responsibility for generating much correspondence or lengthy, complex reports or manuscripts - especially if prompt turnaround and high accuracy are required. It can produce text that contains special Greek or mathematical symbols. Written in FORTRAN and MACRO, the program TXT operates on a PDP-11 minicomputer under the RSX-11M multitask multiuser monitor. Peripheral hardware includes videoterminals, electrostatic printers, and magnetic disks. Either data- or word-processing tasks may be performed at the terminals. The repertoire of operations has been restricted so as to minimize user training and memory burden. Secretarial staff may be readily trained to make corrections from annotated copy. Some examples of camera-ready copy are provided.

  14. Dynamic response analysis of structure under time-variant interval process model

    NASA Astrophysics Data System (ADS)

    Xia, Baizhan; Qin, Yuan; Yu, Dejie; Jiang, Chao

    2016-10-01

    Due to the aggressiveness of environmental factors, the variation of dynamic loads, the degeneration of material properties and the wear of machine surfaces, parameters related to the structure are distinctly time-variant. The typical model for time-variant uncertainties is the random process model, which is constructed on the basis of a large number of samples. In this work, we propose a time-variant interval process model which can be effectively used to deal with time-variant uncertainties when only limited information is available. Two methods are then presented for the dynamic response analysis of the structure under the time-variant interval process model. The first is the direct Monte Carlo method (DMCM), whose computational burden is relatively high. The second is the Monte Carlo method based on the Chebyshev polynomial expansion (MCM-CPE), whose computational efficiency is high. In MCM-CPE, the dynamic response of the structure is approximated by Chebyshev polynomials, which can be efficiently calculated, and the variational range of the dynamic response is then estimated from the samples yielded by the Monte Carlo method. To resolve the dependency phenomenon of interval operations, affine arithmetic is integrated into the Chebyshev polynomial expansion. The computational effectiveness and efficiency of MCM-CPE are verified by two numerical examples: a spring-mass-damper system and a shell structure.
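The MCM-CPE idea — approximate the response by a cheap Chebyshev surrogate, then run the Monte Carlo sampling only on the surrogate — can be sketched for a single uncertain interval parameter. The spring-mass-damper values below are invented for illustration, not taken from the paper's examples, and the affine-arithmetic refinement is omitted:

```python
import numpy as np
from numpy.polynomial import chebyshev as C

# Steady-state displacement amplitude of a damped SDOF system under harmonic
# load, with interval stiffness k in [k_lo, k_hi] (all values hypothetical).
m, c, omega, f0 = 1.0, 0.4, 2.0, 1.0
def response(k):
    return f0 / np.sqrt((k - m * omega**2)**2 + (c * omega)**2)

k_lo, k_hi = 8.0, 12.0
# Step 1: build a cheap Chebyshev surrogate from a few exact evaluations.
nodes = C.chebpts1(8)  # Chebyshev nodes on [-1, 1]
k_nodes = 0.5 * (k_hi - k_lo) * nodes + 0.5 * (k_hi + k_lo)
coef = C.chebfit(nodes, response(k_nodes), deg=7)

# Step 2: Monte Carlo over the interval, evaluating only the surrogate.
rng = np.random.default_rng(0)
samples = rng.uniform(-1.0, 1.0, 20000)
vals = C.chebval(samples, coef)
print(vals.min(), vals.max())  # estimated bounds of the response range
```

The expensive model is evaluated only 8 times; the 20,000 Monte Carlo samples hit only the polynomial, which is the source of the method's speedup.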

  15. Blending Qualitative and Computational Linguistics Methods for Fidelity Assessment: Experience with the Familias Unidas Preventive Intervention

    PubMed Central

    Gallo, Carlos; Pantin, Hilda; Villamar, Juan; Prado, Guillermo; Tapia, Maria; Ogihara, Mitsunori; Cruden, Gracelyn; Brown, C Hendricks

    2014-01-01

    Careful fidelity monitoring and feedback are critical to implementing effective interventions. A wide range of procedures exist to assess fidelity; most are derived from observational assessments (Schoenwald et al., 2013). However, these fidelity measures are resource intensive for research teams in efficacy/effectiveness trials, and are often unattainable or unmanageable for the host organization to rate when the program is implemented on a large scale. We present a first step towards automated processing of linguistic patterns in fidelity monitoring of a behavioral intervention using an innovative mixed methods approach to fidelity assessment that uses rule-based, computational linguistics to overcome major resource burdens. Data come from an effectiveness trial of the Familias Unidas intervention, an evidence-based, family-centered preventive intervention found to be efficacious in reducing conduct problems, substance use and HIV sexual risk behaviors among Hispanic youth. This computational approach focuses on “joining,” which measures the quality of the working alliance of the facilitator with the family. Quantitative assessments of reliability are provided. Kappa scores between a human rater and a machine rater for the new method for measuring joining reached 0.83. Early findings suggest that this approach can reduce the high cost of fidelity measurement and the time delay between fidelity assessment and feedback to facilitators; it also has the potential for improving the quality of intervention fidelity ratings. PMID:24500022

  16. Interspinous Process Decompression: Expanding Treatment Options for Lumbar Spinal Stenosis

    PubMed Central

    Nunley, Pierce D.; Shamie, A. Nick; Blumenthal, Scott L.; Orndorff, Douglas; Geisler, Fred H.

    2016-01-01

    Interspinous process decompression is a minimally invasive implantation procedure employing a stand-alone interspinous spacer that functions as an extension blocker to prevent compression of neural elements without direct surgical removal of tissue adjacent to the nerves. The Superion® spacer is the only FDA approved stand-alone device available in the US. It is also the only spacer approved by the CMS to be implanted in an ambulatory surgery center. We computed the within-group effect sizes from the Superion IDE trial and compared them to results extrapolated from two randomized trials of decompressive laminectomy. For the ODI, effect sizes were all very large (>1.0) for Superion and laminectomy at 2, 3, and 4 years. For ZCQ, the 2-year Superion symptom severity (1.26) and physical function (1.29) domains were very large; laminectomy effect sizes were very large (1.07) for symptom severity and large for physical function (0.80). Current projections indicate a marked increase in the number of patients with spinal stenosis. Consequently, there remains a keen interest in minimally invasive treatment options that delay or obviate the need for invasive surgical procedures, such as decompressive laminectomy or fusion. Stand-alone interspinous spacers may fill a currently unmet treatment gap in the continuum of care and help to reduce the burden of this chronic degenerative condition on the health care system. PMID:27819001

  17. Interspinous Process Decompression: Expanding Treatment Options for Lumbar Spinal Stenosis.

    PubMed

    Nunley, Pierce D; Shamie, A Nick; Blumenthal, Scott L; Orndorff, Douglas; Block, Jon E; Geisler, Fred H

    2016-01-01

    Interspinous process decompression is a minimally invasive implantation procedure employing a stand-alone interspinous spacer that functions as an extension blocker to prevent compression of neural elements without direct surgical removal of tissue adjacent to the nerves. The Superion® spacer is the only FDA approved stand-alone device available in the US. It is also the only spacer approved by the CMS to be implanted in an ambulatory surgery center. We computed the within-group effect sizes from the Superion IDE trial and compared them to results extrapolated from two randomized trials of decompressive laminectomy. For the ODI, effect sizes were all very large (>1.0) for Superion and laminectomy at 2, 3, and 4 years. For ZCQ, the 2-year Superion symptom severity (1.26) and physical function (1.29) domains were very large; laminectomy effect sizes were very large (1.07) for symptom severity and large for physical function (0.80). Current projections indicate a marked increase in the number of patients with spinal stenosis. Consequently, there remains a keen interest in minimally invasive treatment options that delay or obviate the need for invasive surgical procedures, such as decompressive laminectomy or fusion. Stand-alone interspinous spacers may fill a currently unmet treatment gap in the continuum of care and help to reduce the burden of this chronic degenerative condition on the health care system.

  18. Burden Analysis of Rare Microdeletions Suggests a Strong Impact of Neurodevelopmental Genes in Genetic Generalised Epilepsies

    PubMed Central

    Trucks, Holger; Schulz, Herbert; de Kovel, Carolien G.; Kasteleijn-Nolst Trenité, Dorothée; Sonsma, Anja C. M.; Koeleman, Bobby P.; Lindhout, Dick; Weber, Yvonne G.; Lerche, Holger; Kapser, Claudia; Schankin, Christoph J.; Kunz, Wolfram S.; Surges, Rainer; Elger, Christian E.; Gaus, Verena; Schmitz, Bettina; Helbig, Ingo; Muhle, Hiltrud; Stephani, Ulrich; Klein, Karl M.; Rosenow, Felix; Neubauer, Bernd A.; Reinthaler, Eva M.; Zimprich, Fritz; Feucht, Martha; Møller, Rikke S.; Hjalgrim, Helle; De Jonghe, Peter; Suls, Arvid; Lieb, Wolfgang; Franke, Andre; Strauch, Konstantin; Gieger, Christian; Schurmann, Claudia; Schminke, Ulf; Nürnberg, Peter; Sander, Thomas

    2015-01-01

    Genetic generalised epilepsy (GGE) is the most common form of genetic epilepsy, accounting for 20% of all epilepsies. Genomic copy number variations (CNVs) constitute important genetic risk factors of common GGE syndromes. In our present genome-wide burden analysis, large (≥ 400 kb) and rare (< 1%) autosomal microdeletions with high calling confidence (≥ 200 markers) were assessed by the Affymetrix SNP 6.0 array in European case-control cohorts of 1,366 GGE patients and 5,234 ancestry-matched controls. We aimed to: 1) assess the microdeletion burden in common GGE syndromes, 2) estimate the relative contribution of recurrent microdeletions at genomic rearrangement hotspots and non-recurrent microdeletions, and 3) identify potential candidate genes for GGE. We found a significant excess of microdeletions in 7.3% of GGE patients compared to 4.0% in controls (P = 1.8 x 10^-7; OR = 1.9). Recurrent microdeletions at seven known genomic hotspots accounted for 36.9% of all microdeletions identified in the GGE cohort and showed a 7.5-fold increased burden (P = 2.6 x 10^-17) relative to controls. Microdeletions affecting either a gene previously implicated in neurodevelopmental disorders (P = 8.0 x 10^-18, OR = 4.6) or an evolutionarily conserved brain-expressed gene related to autism spectrum disorder (P = 1.3 x 10^-12, OR = 4.1) were significantly enriched in the GGE patients. Microdeletions found only in GGE patients harboured a high proportion of genes previously associated with epilepsy and neuropsychiatric disorders (NRXN1, RBFOX1, PCDH7, KCNA2, EPM2A, RORB, PLCB1). Our results demonstrate that the significantly increased burden of large and rare microdeletions in GGE patients is largely confined to recurrent hotspot microdeletions and microdeletions affecting neurodevelopmental genes, suggesting a strong impact of fundamental neurodevelopmental processes in the pathogenesis of common GGE syndromes. PMID:25950944
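The headline excess (7.3% of cases vs. 4.0% of controls, OR = 1.9) can be reproduced from the reported rates and cohort sizes. The counts below are back-computed by rounding, so they may differ by a patient or two from the study's true counts:

```python
import math

# Counts reconstructed from the reported rates (7.3% of 1,366 GGE cases,
# 4.0% of 5,234 controls); slight rounding relative to the paper.
case_del, case_n = round(0.073 * 1366), 1366
ctrl_del, ctrl_n = round(0.040 * 5234), 5234

# Odds ratio for carrying >= 1 large rare microdeletion
odds_case = case_del / (case_n - case_del)
odds_ctrl = ctrl_del / (ctrl_n - ctrl_del)
or_ = odds_case / odds_ctrl

# Two-proportion z-test as a quick sanity check on the excess
p_pool = (case_del + ctrl_del) / (case_n + ctrl_n)
se = math.sqrt(p_pool * (1 - p_pool) * (1 / case_n + 1 / ctrl_n))
z = (case_del / case_n - ctrl_del / ctrl_n) / se
print(round(or_, 2), round(z, 1))
```

The z statistic above 5 is consistent with the very small P value the study reports from its genome-wide burden test.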

  19. The impact of precipitation evaporation on the atmospheric aerosol distribution in EC-Earth v3.2.0

    NASA Astrophysics Data System (ADS)

    de Bruine, Marco; Krol, Maarten; van Noije, Twan; Le Sager, Philippe; Röckmann, Thomas

    2018-04-01

    The representation of aerosol-cloud interaction in global climate models (GCMs) remains a large source of uncertainty in climate projections. Due to its complexity, precipitation evaporation is either ignored or taken into account in a simplified manner in GCMs. This research explores various ways to treat aerosol resuspension and determines the possible impact of precipitation evaporation and subsequent aerosol resuspension on global aerosol burdens and distribution. The representation of aerosol wet deposition by large-scale precipitation in the EC-Earth model has been improved by utilising additional precipitation-related 3-D fields from the dynamical core, the Integrated Forecasting System (IFS) general circulation model, in the chemistry and aerosol module Tracer Model, version 5 (TM5). A simple approach of scaling aerosol release with evaporated precipitation fraction leads to an increase in the global aerosol burden (+7.8 to +15 % for different aerosol species). However, when taking into account the different sizes and evaporation rate of raindrops following Gong et al. (2006), the release of aerosols is strongly reduced, and the total aerosol burden decreases by -3.0 to -8.5 %. Moreover, inclusion of cloud processing based on observations by Mitra et al. (1992) transforms scavenged small aerosol to coarse particles, which enhances removal by sedimentation and hence leads to a -10 to -11 % lower aerosol burden. Finally, when these two effects are combined, the global aerosol burden decreases by -11 to -19 %. Compared to the Moderate Resolution Imaging Spectroradiometer (MODIS) satellite observations, aerosol optical depth (AOD) is generally underestimated in most parts of the world in all configurations of the TM5 model and although the representation is now physically more realistic, global AOD shows no large improvements in spatial patterns. 
Similarly, the agreement of the vertical profile with Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP) satellite measurements does not improve significantly. We show, however, that aerosol resuspension has a considerable impact on the modelled aerosol distribution and needs to be taken into account.

  20. 78 FR 29812 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-21

    .... Affected Public: Private Sector: Businesses or other for-profits. Estimated Annual Burden Hours: 81,190... was computed and deposited. Affected Public: Private Sector: Businesses or other for-profits... taxpayer examinations. Affected Public: Private Sector: Businesses or other for-profits. Estimated Annual...

  1. Differences in coronary plaque composition with aging measured by coronary computed tomography angiography.

    PubMed

    Tota-Maharaj, Rajesh; Blaha, Michael J; Rivera, Juan J; Henry, Travis S; Choi, Eue-Keun; Chang, Sung-A; Yoon, Yeonyee E; Chun, Eun Ju; Choi, Sang-Il; Blumenthal, Roger S; Chang, Hyuk-Jae; Nasir, Khurram

    2012-07-12

    Little is known about the independent impact of aging on coronary plaque morphology and composition in the era of cardiac computed tomography angiography (CCTA). We studied 1015 consecutive asymptomatic South Korean subjects (49 ± 10 years, 64% men) who underwent 64-slice CCTA during routine health evaluation. Coronary plaque characteristics were analyzed on a per-segment basis according to the modified AHA classification. Plaques with >50% calcified tissue were classified as calcified (CAP), plaques with <50% calcified tissue were classified as mixed (MCAP), and plaques without calcium were classified as non-calcified (NCAP). Multiple regression analysis was employed to describe the cross-sectional association between age tertile and plaque type burden (≥ 2 affected segments) after adjustment for other cardiovascular risk factors. The prevalence of coronary plaque increased with age (1st tertile: 7.5%; 3rd tertile: 38.5%; p<0.001). The relative contribution of NCAP to overall plaque burden decreased with age from nearly 50% in the first tertile to approximately 20% in the third, while there was a reciprocal increase in both MCAP and CAP subtypes. In multivariable analysis, patients in the oldest tertile had a 2.5-fold increase in burden of NCAP, yet a nearly 40-fold increase in MCAP and 16-fold increase in CAP compared to the youngest tertile. In conclusion, CCTA is an effective method for measuring age-related differences in the burden of individual coronary plaque subtypes. Future research is needed to determine whether the increase in mixed and calcified plaques seen with aging produces an independent contribution to the age-related increase in cardiovascular risk. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  2. Quantitative coronary plaque analysis predicts high-risk plaque morphology on coronary computed tomography angiography: results from the ROMICAT II trial.

    PubMed

    Liu, Ting; Maurovich-Horvat, Pál; Mayrhofer, Thomas; Puchner, Stefan B; Lu, Michael T; Ghemigian, Khristine; Kitslaar, Pieter H; Broersen, Alexander; Pursnani, Amit; Hoffmann, Udo; Ferencik, Maros

    2018-02-01

    Semi-automated software can provide quantitative assessment of atherosclerotic plaques on coronary CT angiography (CTA). The relationship between established qualitative high-risk plaque features and quantitative plaque measurements has not been studied. We analyzed the association between quantitative plaque measurements and qualitative high-risk plaque features on coronary CTA. We included 260 patients with plaque who underwent coronary CTA in the Rule Out Myocardial Infarction/Ischemia Using Computer Assisted Tomography (ROMICAT) II trial. Quantitative plaque assessment and qualitative plaque characterization were performed on a per coronary segment basis. Quantitative coronary plaque measurements included plaque volume, plaque burden, remodeling index, and diameter stenosis. In qualitative analysis, high-risk plaque was present if positive remodeling, low CT attenuation plaque, napkin-ring sign or spotty calcium were detected. Univariable and multivariable logistic regression analyses were performed to assess the association between quantitative and qualitative high-risk plaque assessment. Among 888 segments with coronary plaque, high-risk plaque was present in 391 (44.0%) segments by qualitative analysis. In quantitative analysis, segments with high-risk plaque had higher total plaque volume, low CT attenuation plaque volume, plaque burden and remodeling index. Quantitatively assessed low CT attenuation plaque volume (odds ratio 1.12 per 1 mm³, 95% CI 1.04-1.21), positive remodeling (odds ratio 1.25 per 0.1, 95% CI 1.10-1.41) and plaque burden (odds ratio 1.53 per 0.1, 95% CI 1.08-2.16) were associated with high-risk plaque. Quantitative coronary plaque characteristics (low CT attenuation plaque volume, positive remodeling and plaque burden) measured by semi-automated software correlated with qualitative assessment of high-risk plaque features.
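The per-unit odds ratios above translate into logistic-regression coefficients via beta = ln(OR). A small sketch of how such coefficients combine on the log-odds scale; the scenario values are invented, and this ignores covariance between predictors, so it is illustrative only:

```python
import math

# Reported per-unit odds ratios, interpreted as logistic coefficients
# (illustrative back-calculation from the abstract, not the fitted model).
or_per_unit = {
    "low_attenuation_volume": 1.12,  # per 1 mm^3
    "remodeling_index": 1.25,        # per 0.1
    "plaque_burden": 1.53,           # per 0.1
}
beta = {k: math.log(v) for k, v in or_per_unit.items()}

def log_odds_change(d_volume_mm3, d_remodeling, d_burden):
    """Change in log-odds of qualitative high-risk plaque for given changes
    in the three quantitative measurements (units as in the abstract)."""
    return (beta["low_attenuation_volume"] * d_volume_mm3
            + beta["remodeling_index"] * d_remodeling / 0.1
            + beta["plaque_burden"] * d_burden / 0.1)

# Hypothetical segment: +5 mm^3 low-attenuation volume, +0.2 remodeling
# index, +0.1 plaque burden, relative to a reference segment.
delta = log_odds_change(5, 0.2, 0.1)
print(round(math.exp(delta), 2))  # combined odds multiplier
```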

  3. Computed Tomography-Based Biomarker for Longitudinal Assessment of Disease Burden in Pulmonary Tuberculosis.

    PubMed

    Gordaliza, P M; Muñoz-Barrutia, A; Via, L E; Sharpe, S; Desco, M; Vaquero, J J

    2018-05-29

    Computed tomography (CT) images enable capturing specific manifestations of tuberculosis (TB) that are undetectable using common diagnostic tests, which suffer from limited specificity. In this study, we aimed to automatically quantify the burden of Mycobacterium tuberculosis (Mtb) using biomarkers extracted from X-ray CT images. Nine macaques were aerosol-infected with Mtb and treated with various antibiotic cocktails. Chest CT scans were acquired in all animals at specific times independently of disease progression. First, a fully automatic segmentation of the healthy lungs from the acquired chest CT volumes was performed and air-like structures were extracted. Next, unsegmented pulmonary regions corresponding to damaged parenchymal tissue and TB lesions were included. CT biomarkers were extracted by classification of the probability distribution of the intensity of the segmented images into three tissue types: (1) healthy tissue (parenchyma free from infection); (2) soft diseased tissue; and (3) hard diseased tissue. The probability distribution of tissue intensities was assumed to follow a Gaussian mixture model. The thresholds identifying each region were automatically computed using an expectation-maximization algorithm. The estimated longitudinal course of TB infection shows that subjects that have followed the same antibiotic treatment present a similar response (relative change in the diseased volume) with respect to baseline. More interestingly, the correlation between the diseased volume (soft tissue + hard tissue), which was manually delineated by an expert, and the automatically extracted volume with the proposed method was very strong (R² ≈ 0.8). We present a methodology that is suitable for automatic extraction of a radiological biomarker from CT images for TB disease burden. The method could be used to describe the longitudinal evolution of Mtb infection in a clinical trial devoted to the design of new drugs.
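The intensity-classification step — fitting a three-component Gaussian mixture by expectation-maximization — can be sketched in one dimension on synthetic intensities. The Hounsfield-unit means and class proportions below are invented stand-ins for the study's lung data:

```python
import numpy as np

# 1-D, 3-component Gaussian mixture fit by EM, mimicking the tissue-intensity
# classification (healthy / soft diseased / hard diseased). Synthetic data.
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-850, 60, 4000),   # healthy, air-like lung (HU)
                    rng.normal(-300, 80, 800),    # soft diseased tissue
                    rng.normal(40, 50, 400)])     # hard diseased tissue

K = 3
w = np.full(K, 1 / K)                       # mixture weights
mu = np.quantile(x, [0.2, 0.85, 0.99])      # spread-out initial means
var = np.full(K, x.var())                   # broad initial variances

for _ in range(200):  # EM iterations
    # E-step: responsibility of each component for each voxel intensity
    pdf = np.exp(-0.5 * (x[:, None] - mu)**2 / var) / np.sqrt(2 * np.pi * var)
    r = w * pdf
    r /= r.sum(axis=1, keepdims=True)
    # M-step: re-estimate weights, means, variances from responsibilities
    nk = r.sum(axis=0)
    w = nk / len(x)
    mu = (r * x[:, None]).sum(axis=0) / nk
    var = (r * (x[:, None] - mu)**2).sum(axis=0) / nk

order = np.argsort(mu)
print(np.round(mu[order]))  # estimated class means, low to high
```

Per-class thresholds then fall where the posterior probability switches between adjacent components.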

  4. Adaptive Core Simulation Employing Discrete Inverse Theory - Part I: Theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abdel-Khalik, Hany S.; Turinsky, Paul J.

    2005-07-15

    Use of adaptive simulation is intended to improve the fidelity and robustness of important core attribute predictions such as core power distribution, thermal margins, and core reactivity. Adaptive simulation utilizes a selected set of past and current reactor measurements of reactor observables, i.e., in-core instrumentation readings, to adapt the simulation in a meaningful way. A meaningful adaption will result in high-fidelity and robust adapted core simulator models. To perform adaption, we propose an inverse theory approach in which the multitudes of input data to core simulators, i.e., reactor physics and thermal-hydraulic data, are to be adjusted to improve agreement with measured observables while keeping core simulator models unadapted. At first glance, devising such adaption for typical core simulators with millions of input and observables data would spawn not only several prohibitive challenges but also numerous disparaging concerns. The challenges include the computational burdens of the sensitivity-type calculations required to construct Jacobian operators for the core simulator models. Also, the computational burdens of the uncertainty-type calculations required to estimate the uncertainty information of core simulator input data present a demanding challenge. The concerns however are mainly related to the reliability of the adjusted input data. The methodologies of adaptive simulation are well established in the literature of data adjustment. We adopt the same general framework for data adjustment; however, we refrain from solving the fundamental adjustment equations in a conventional manner. We demonstrate the use of our so-called Efficient Subspace Methods (ESMs) to overcome the computational and storage burdens associated with the core adaption problem. We illustrate the successful use of ESM-based adaptive techniques for a typical boiling water reactor core simulator adaption problem.

  5. Lead burdens and behavioral impairments of the lined shore crab Pachygrapsus crassipes

    USGS Publications Warehouse

    Hui, C.A.

    2002-01-01

    Sublethal burdens of lead impair behaviors critical to survival in a variety of animals. In a test arena, I measured refuge-seeking behaviors of adult, male, lined shore crabs from lead-free and lead-contaminated sites. The body sizes of the test groups did not differ although the mean total body lead burdens differed by over 2,300%. A lead-contaminated environment does not appear to affect growth. Each of the 31 crabs had at least six trials in the arena. The fraction of trials with more than one pause, number of pauses per trial, mean time per pause, and the fraction of time a crab spent in pauses did not differ between groups. The absence of behavioral effects of the lead burdens may be because a large portion of the lead burden was sequestered in the carapace. The neurological and other soft tissues would then have lower levels of lead. Predators that ingest primarily soft tissues would have little exposure to the lead burden of these crabs. Those that also ingest the carapace may benefit from its high calcium content that inhibits lead uptake from the gut, regardless of the location of lead in the crab body.

  6. Computer generated hologram from point cloud using graphics processor.

    PubMed

    Chen, Rick H-Y; Wilkinson, Timothy D

    2009-12-20

    Computer generated holography is an extremely demanding and complex task when it comes to providing realistic reconstructions with full parallax, occlusion, and shadowing. We present an algorithm designed for data-parallel computing on modern graphics processing units to alleviate the computational burden. We apply Gaussian interpolation to create a continuous surface representation from discrete input object points. The algorithm maintains a potential occluder list for each individual hologram plane sample to keep the number of visibility tests to a minimum. We experimented with two approximations that simplify and accelerate occlusion computation. It is observed that letting several neighboring hologram plane samples share visibility information on object points leads to significantly faster computation without causing noticeable artifacts in the reconstructed images. Computing a reduced sample set via nonuniform sampling is also found to be an effective acceleration technique.
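The core per-point workload that the paper parallelizes on the GPU is the accumulation of one spherical wavefront per object point over every hologram sample. A serial sketch of just that accumulation (omitting the Gaussian surface interpolation and all occlusion testing; parameters are illustrative):

```python
import numpy as np

# Minimal point-cloud hologram: each object point contributes a spherical
# wavefront to every hologram-plane sample. Parameters are hypothetical.
wavelength = 532e-9            # green laser, metres
k = 2 * np.pi / wavelength     # wavenumber
pitch, n = 8e-6, 256           # hologram pixel pitch and resolution

ys, xs = np.mgrid[0:n, 0:n] * pitch
points = [  # (x, y, z, amplitude): object points in front of the hologram
    (n * pitch / 2, n * pitch / 2, 0.05, 1.0),
    (n * pitch / 3, n * pitch / 3, 0.07, 0.8),
]
field = np.zeros((n, n), dtype=complex)
for px, py, pz, amp in points:
    r = np.sqrt((xs - px)**2 + (ys - py)**2 + pz**2)
    field += amp * np.exp(1j * k * r) / r   # spherical wave contribution

hologram = np.angle(field)  # phase-only hologram
print(hologram.shape)
```

Because each (point, sample) pair is independent, this double loop maps directly onto data-parallel GPU threads, which is where the paper's speedup comes from.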

  7. Gender differences in caregiver burden and its determinants in family members of terminally ill cancer patients.

    PubMed

    Schrank, Beate; Ebert-Vogel, Alexandra; Amering, Michaela; Masel, Eva K; Neubauer, Marie; Watzke, Herbert; Zehetmayer, Sonja; Schur, Sophie

    2016-07-01

    Female family caregivers consistently report higher levels of stress and burden compared to male caregivers. Explanations for the apparently higher psychological vulnerability of female caregivers are largely missing to date. This study assesses the correlates and determinants of caregiver burden in family caregivers of advanced cancer patients with a specific focus on gender differences. Three hundred and eight self-identified main informal caregivers of advanced cancer patients were cross-sectionally assessed using structured questionnaires for caregiver burden and hypothesised determinants of burden, including sociodemographic characteristics, caring arrangements, support needs, hope and coping style. Gender differences and predictors of burden were assessed using t-tests, chi-squared tests and univariate linear regression. Significant univariate predictors were entered in an analysis of covariance separately for men and women. Burden was significantly higher in women. Hope was the most significant protective factor against burden in both genders, together with perceived fulfilment of support needs. Only in women were emotion-oriented coping and being in employment while caring significantly predictive of higher burden in the multivariate analysis. The model explained 36% of the variance in burden in men and 29% in women. Psychological support interventions for family caregivers should take gender-specific risk factors into account. Interventions focusing on keeping up hope while caring for a terminally ill family member may be a valuable addition to palliative services to improve support for family carers. Women may benefit from interventions that address adaptive coping and strategies to deal with the dual demands of employment and caring. Copyright © 2015 John Wiley & Sons, Ltd.

  8. A transfer function type of simplified electrochemical model with modified boundary conditions and Padé approximation for Li-ion battery: Part 1. lithium concentration estimation

    NASA Astrophysics Data System (ADS)

    Yuan, Shifei; Jiang, Lei; Yin, Chengliang; Wu, Hongjie; Zhang, Xi

    2017-06-01

    To guarantee the safety, high efficiency and long lifetime of lithium-ion batteries, an advanced battery management system requires a physically meaningful yet computationally efficient battery model. The pseudo-two-dimensional (P2D) electrochemical model can provide physical information about the lithium concentration and potential distributions across the cell dimension. However, the extensive computational burden caused by the temporal and spatial discretization limits its real-time application. In this research, we propose a new simplified electrochemical model (SEM) by modifying the boundary conditions for the electrolyte diffusion equations, which significantly facilitates the analytical solving process. To obtain a reduced-order transfer function, the Padé approximation method is then adopted to simplify the derived transcendental impedance solution. The proposed model with its reduced-order transfer function is computationally compact and preserves physical meaning through parameters such as the solid/electrolyte diffusion coefficients (Ds & De) and the particle radius. Simulation shows that the proposed simplified model maintains high accuracy for electrolyte-phase concentration (Ce) predictions, with 0.8% and 0.24% modeling error respectively, when compared to the rigorous model under 1C-rate pulse charge/discharge and urban dynamometer driving schedule (UDDS) profiles. Meanwhile, the simplified model yields a significantly reduced computational burden, which benefits its real-time application.
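The Padé step replaces a transcendental transfer function with a low-order rational one matched to its Taylor series. A generic sketch of computing an [m/n] Padé approximant from Taylor coefficients, using exp(s) as a stand-in for the paper's transcendental impedance:

```python
import numpy as np
from math import factorial

def pade(c, m, n):
    """[m/n] Padé approximant from Taylor coefficients c[0] + c[1]*s + ...
    Returns (p, q), ascending coefficients with q[0] = 1, such that
    p(s)/q(s) matches the series through order m + n."""
    # Denominator: sum_{j=0..n} q[j]*c[m+i-j] = 0 for i = 1..n, with q[0] = 1
    A = np.array([[c[m + i - j] if 0 <= m + i - j < len(c) else 0.0
                   for j in range(1, n + 1)] for i in range(1, n + 1)])
    b = -np.array([c[m + i] for i in range(1, n + 1)])
    q = np.concatenate([[1.0], np.linalg.solve(A, b)])
    # Numerator: p[i] = sum_{j=0..min(i,n)} q[j]*c[i-j]
    p = np.array([sum(q[j] * c[i - j] for j in range(min(i, n) + 1))
                  for i in range(m + 1)])
    return p, q

c = [1 / factorial(i) for i in range(6)]  # Taylor coefficients of exp(s)
p, q = pade(c, 2, 2)
s = 0.5
approx = np.polyval(p[::-1], s) / np.polyval(q[::-1], s)
print(round(approx, 6), round(np.exp(s), 6))
```

For exp(s) the [2/2] approximant is (1 + s/2 + s²/12)/(1 - s/2 + s²/12); the same machinery applied to the impedance series yields the reduced-order transfer function the paper uses in real time.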

  9. Medical and economic burden of influenza in the elderly population in central and eastern European countries

    PubMed Central

    Kovács, Gábor; Kaló, Zoltán; Jahnz-Rozyk, Karina; Kyncl, Jan; Csohan, Agnes; Pistol, Adriana; Leleka, Mariya; Kipshakbaev, Rafail; Durand, Laure; Macabeo, Bérengère

    2014-01-01

    Influenza affects 5–15% of the population during an epidemic. In Western Europe, vaccination of at-risk groups forms the cornerstone of influenza prevention. However, vaccination coverage of the elderly (>65 y) is often low in Central and Eastern Europe (CEE), potentially because a paucity of country-specific data limits evidence-based policy making. Therefore, the medical and economic burden of influenza was estimated in elderly populations in the Czech Republic, Hungary, Kazakhstan, Poland, Romania, and Ukraine. Data covering national influenza vaccination policies, surveillance and reporting, healthcare costs, populations, and epidemiology were obtained via literature review, open-access websites and databases, and interviews with experts. A simplified model of patient treatment flow incorporating cost, population, and incidence/prevalence data was used to calculate the influenza burden per country. In the elderly, influenza represented a large burden on the assessed healthcare systems, with yearly excess hospitalization rates of ~30/100,000. The burden varied between countries and was likely influenced by population size, surveillance system, healthcare provision, and vaccine coverage. The greatest burden was found in Poland, where direct costs were over EUR 5 million. Substantial differences in data availability and quality were identified; to fully quantify the burden of influenza in CEE, influenza reporting systems should be standardized. This study most probably underestimates the real burden of influenza; nevertheless, the public health problem is recognized worldwide and will further increase with population aging. Extending influenza vaccination of the elderly may be a cost-effective way to reduce the burden of influenza in CEE. PMID:24165394

  10. The Burden of Mental Disorders in the Eastern Mediterranean Region, 1990-2013

    PubMed Central

    Charara, Raghid; Forouzanfar, Mohammad; Naghavi, Mohsen; Moradi-Lakeh, Maziar; Afshin, Ashkan; Vos, Theo; Daoud, Farah; Wang, Haidong; El Bcheraoui, Charbel; Khalil, Ibrahim; Hamadeh, Randah R.; Khosravi, Ardeshir; Rahimi-Movaghar, Vafa; Khader, Yousef; Al-Hamad, Nawal; Makhlouf Obermeyer, Carla; Rafay, Anwar; Asghar, Rana; Rana, Saleem M.; Shaheen, Amira; Abu-Rmeileh, Niveen M. E.; Husseini, Abdullatif; Abu-Raddad, Laith J.; Khoja, Tawfik; Al Rayess, Zulfa A.; AlBuhairan, Fadia S.; Hsairi, Mohamed; Alomari, Mahmoud A.; Ali, Raghib; Roshandel, Gholamreza; Terkawi, Abdullah Sulieman; Hamidi, Samer; Refaat, Amany H.; Westerman, Ronny; Kiadaliri, Aliasghar Ahmad; Akanda, Ali S.; Ali, Syed Danish; Bacha, Umar; Badawi, Alaa; Bazargan-Hejazi, Shahrzad; Faghmous, Imad A. D.; Fereshtehnejad, Seyed-Mohammad; Fischer, Florian; Jonas, Jost B.; Kuate Defo, Barthelemy; Mehari, Alem; Omer, Saad B.; Pourmalek, Farshad; Uthman, Olalekan A.; Mokdad, Ali A.; Maalouf, Fadi T.; Abd-Allah, Foad; Akseer, Nadia; Arya, Dinesh; Borschmann, Rohan; Brazinova, Alexandra; Brugha, Traolach S.; Catalá-López, Ferrán; Degenhardt, Louisa; Ferrari, Alize; Haro, Josep Maria; Horino, Masako; Hornberger, John C.; Huang, Hsiang; Kieling, Christian; Kim, Daniel; Kim, Yunjin; Knudsen, Ann Kristin; Mitchell, Philip B.; Patton, George; Sagar, Rajesh; Satpathy, Maheswar; Savuon, Kim; Seedat, Soraya; Shiue, Ivy; Skogen, Jens Christoffer; Stein, Dan J.; Tabb, Karen M.; Whiteford, Harvey A.; Yip, Paul; Yonemoto, Naohiro; Murray, Christopher J. L.; Mokdad, Ali H.

    2017-01-01

    The Eastern Mediterranean Region (EMR) is witnessing an increase in chronic disorders, including mental illness. With ongoing unrest, this is expected to rise. This is the first study to quantify the burden of mental disorders in the EMR. We used data from the Global Burden of Disease study (GBD) 2013. DALYs (disability-adjusted life years) allow assessment of both premature mortality (years of life lost–YLLs) and nonfatal outcomes (years lived with disability–YLDs). DALYs are computed by adding YLLs and YLDs for each age-sex-country group. In 2013, mental disorders contributed 5.6% of the total disease burden in the EMR (1894 DALYs/100,000 population): 2519 DALYs/100,000 (2590/100,000 males, 2426/100,000 females) in high-income countries, 1884 DALYs/100,000 (1618/100,000 males, 2157/100,000 females) in middle-income countries, and 1607 DALYs/100,000 (1500/100,000 males, 1717/100,000 females) in low-income countries. Females had a greater proportion of burden due to mental disorders than did males of equivalent ages, except for those under 15 years of age. The highest proportion of DALYs occurred in the 25–49 age group, with a peak in the 35–39 years age group (5344 DALYs/100,000). The burden of mental disorders in the EMR increased from 1726 DALYs/100,000 in 1990 to 1912 DALYs/100,000 in 2013 (a 10.8% increase). Within the mental disorders group in the EMR, depressive disorders accounted for most DALYs, followed by anxiety disorders. Among EMR countries, Palestine had the largest burden of mental disorders. Nearly all EMR countries had a higher mental disorder burden compared to the global level. Our findings call for EMR ministries of health to increase provision of mental health services and to address the stigma of mental illness. Moreover, our results showing the accelerating burden of mental disorders are alarming as the region is seeing an increased level of instability. Indeed, mental health problems, if not properly addressed, will lead to an increased burden of disease in the region. PMID:28095477
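    The DALY identity the abstract relies on (DALYs = YLLs + YLDs, accumulated per stratum) can be sketched generically. This is a simplified illustration without GBD refinements such as age weighting or discounting, and the numbers below are invented, not GBD data.

```python
def yll(deaths, residual_life_expectancy):
    """Years of life lost: deaths times standard life expectancy at age of death."""
    return deaths * residual_life_expectancy

def yld(cases, disability_weight):
    """Years lived with disability: cases times a disability weight in [0, 1]."""
    return cases * disability_weight

def daly(deaths, residual_life_expectancy, cases, disability_weight):
    """DALYs for one age-sex-country stratum: add YLLs and YLDs."""
    return (yll(deaths, residual_life_expectancy)
            + yld(cases, disability_weight))

# Illustrative stratum only: 10 deaths each losing 30 years, plus
# 1,000 prevalent cases with disability weight 0.2.
total = daly(10, 30, 1000, 0.2)  # 300 YLLs + 200 YLDs = 500 DALYs
```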

  12. Unsteady Propeller Hydrodynamics

    DTIC Science & Technology

    2001-06-01

    coupling routines, making the code more robust while decreasing the computational burden over current methods. Finally, a higher order quadratic influence ... function technique was implemented within the wake to more accurately define the induction velocity at the trailing edge, which has suffered in the past due to lack of discretization.

  13. Electric Power Distribution System Model Simplification Using Segment Substitution

    DOE PAGES

    Reiman, Andrew P.; McDermott, Thomas E.; Akcakaya, Murat; ...

    2017-09-20

    Quasi-static time-series (QSTS) simulation is used to simulate the behavior of distribution systems over long periods of time (typically hours to years). The technique involves repeatedly solving the load-flow problem for a distribution system model and is useful for distributed energy resource (DER) planning. When a QSTS simulation has a small time step and a long duration, the computational burden of the simulation can be a barrier to integration into utility workflows. One way to relieve the computational burden is to simplify the system model. The segment substitution method of simplifying distribution system models introduced in this paper offers model bus reduction of up to 98% with a simplification error as low as 0.2% (0.002 pu voltage). Finally, in contrast to existing methods of distribution system model simplification, which rely on topological inspection and linearization, the segment substitution method uses black-box segment data and an assumed simplified topology.

  14. m2-ABKS: Attribute-Based Multi-Keyword Search over Encrypted Personal Health Records in Multi-Owner Setting.

    PubMed

    Miao, Yinbin; Ma, Jianfeng; Liu, Ximeng; Wei, Fushan; Liu, Zhiquan; Wang, Xu An

    2016-11-01

    Online personal health record (PHR) is more inclined to shift data storage and search operations to cloud server so as to enjoy the elastic resources and lessen computational burden in cloud storage. As multiple patients' data is always stored in the cloud server simultaneously, it is a challenge to guarantee the confidentiality of PHR data and allow data users to search encrypted data in an efficient and privacy-preserving way. To this end, we design a secure cryptographic primitive called as attribute-based multi-keyword search over encrypted personal health records in multi-owner setting to support both fine-grained access control and multi-keyword search via Ciphertext-Policy Attribute-Based Encryption. Formal security analysis proves our scheme is selectively secure against chosen-keyword attack. As a further contribution, we conduct empirical experiments over real-world dataset to show its feasibility and practicality in a broad range of actual scenarios without incurring additional computational burden.

  16. Comparison of numerical weather prediction based deterministic and probabilistic wind resource assessment methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Jie; Draxl, Caroline; Hopson, Thomas

    Numerical weather prediction (NWP) models have been widely used for wind resource assessment. Model runs with higher spatial resolution are generally more accurate, yet extremely computationally expensive. An alternative approach is to use data generated by a low resolution NWP model, in conjunction with statistical methods. In order to analyze the accuracy and computational efficiency of different types of NWP-based wind resource assessment methods, this paper performs a comparison of three deterministic and probabilistic NWP-based wind resource assessment methodologies: (i) a coarse resolution (0.5 degrees x 0.67 degrees) global reanalysis data set, the Modern-Era Retrospective Analysis for Research and Applications (MERRA); (ii) an analog ensemble methodology based on the MERRA, which provides both deterministic and probabilistic predictions; and (iii) a fine resolution (2-km) NWP data set, the Wind Integration National Dataset (WIND) Toolkit, based on the Weather Research and Forecasting model. Results show that: (i) as expected, the analog ensemble and WIND Toolkit perform significantly better than MERRA confirming their ability to downscale coarse estimates; (ii) the analog ensemble provides the best estimate of the multi-year wind distribution at seven of the nine sites, while the WIND Toolkit is the best at one site; (iii) the WIND Toolkit is more accurate in estimating the distribution of hourly wind speed differences, which characterizes the wind variability, at five of the available sites, with the analog ensemble being best at the remaining four locations; and (iv) the analog ensemble computational cost is negligible, whereas the WIND Toolkit requires large computational resources. 
Future efforts could focus on the combination of the analog ensemble with intermediate resolution (e.g., 10-15 km) NWP estimates, to considerably reduce the computational burden while providing accurate deterministic estimates and reliable probabilistic assessments.
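    The analog ensemble idea in this record can be sketched as a nearest-neighbor lookup (a generic one-dimensional illustration, not the authors' implementation): for each new coarse-model forecast, locate the most similar historical forecasts and take their matching observations as the predictive ensemble.

```python
import numpy as np

def analog_ensemble(forecast, hist_forecasts, hist_obs, n_analogs=5):
    """Return the observations paired with the n historical forecasts
    closest to `forecast`; these form the predictive ensemble."""
    distances = np.abs(hist_forecasts - forecast)   # similarity metric
    idx = np.argsort(distances)[:n_analogs]         # best analogs
    return hist_obs[idx]

# Toy archive: past coarse-model forecasts and the winds actually observed.
hist_fc = np.array([1.0, 2.0, 3.0, 10.0])
hist_ob = np.array([1.1, 2.2, 2.9, 9.5])
ensemble = analog_ensemble(2.1, hist_fc, hist_ob, n_analogs=2)
```

    Because the archive lookup replaces any further model integration, the method's computational cost is negligible once the historical data set exists, which matches the trade-off described in the abstract.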

  17. The current total economic burden of diabetes mellitus in the Netherlands.

    PubMed

    Peters, M L; Huisman, E L; Schoonen, M; Wolffenbuttel, B H R

    2017-09-01

    Insight into the total economic burden of diabetes mellitus (DM) is essential for decision makers and payers. Currently available estimates for the Netherlands only include part of the total burden or are no longer up-to-date. Therefore, this study aimed to determine the current total economic burden of DM and its complications in the Netherlands, by including all the relevant cost components. The study combined a systematic literature review to identify all relevant published information and a targeted review to identify relevant information in the grey literature. The identified evidence was then combined to estimate the current total economic burden. In 2016, there were an estimated 1.1 million DM patients in the Netherlands, of whom approximately 10% had type 1 and 90% had type 2 DM. The estimated current total economic burden of DM was € 6.8 billion in 2016. Healthcare costs (excluding costs of complications) were € 1.6 billion, direct costs of complications were € 1.3 billion and indirect costs due to productivity losses, welfare payments and complications were € 4.0 billion. DM and its complications pose a substantial economic burden to the Netherlands, which is expected to rise due to changing demographics and lifestyle. Indirect costs, such as welfare payments, accounted for a large portion of the current total economic burden of DM, while these cost components are often not included in cost estimations. Publicly available data for key cost drivers such as complications were scarce.

  18. Caregiver Burden in Semantic Dementia with Right- and Left-Sided Predominant Cerebral Atrophy and in Behavioral-Variant Frontotemporal Dementia.

    PubMed

    Koyama, Asuka; Hashimoto, Mamoru; Fukuhara, Ryuji; Ichimi, Naoko; Takasaki, Akihiro; Matsushita, Masateru; Ishikawa, Tomohisa; Tanaka, Hibiki; Miyagawa, Yusuke; Ikeda, Manabu

    2018-01-01

    Caregiver burden is a serious concern for family caregivers of dementia patients, but its nature is unclear in patients with semantic dementia (SD). This study aimed to clarify caregiver burden for right- (R > L) and left-sided (L > R) predominant SD versus behavioral-variant frontotemporal dementia (bvFTD) patients. Using the Japanese version of the Zarit Burden Interview (ZBI) and the Neuropsychiatric Inventory, we examined caregiver burden and behavioral and psychological symptoms of dementia (BPSD) in 43 first-visit outpatient/family caregiver dyads (bvFTD, 20 dyads; SD [L > R], 13 dyads; SD [R > L], 10 dyads). We found a significant difference in ZBI score between the 3 diagnostic groups. Post hoc tests revealed a significantly higher ZBI score in the bvFTD than in the SD (L > R) group. The ZBI scores in the SD (L > R) and SD (R > L) groups were not significantly different, although the effect size was large. Caregiver burden was significantly correlated with BPSD scores in all groups and was correlated with activities of daily living and instrumental activities of daily living decline in the bvFTD and SD (R > L) groups. Caregiver burden was highest in the bvFTD group, comparatively high in the SD (R > L) group, and lowest in the SD (L > R) group. Adequate support and intervention for caregivers should be tailored to differences in caregiver burden between these patient groups.

  19. Building a measurement framework of burden of treatment in complex patients with chronic conditions: a qualitative study.

    PubMed

    Eton, David T; Ramalho de Oliveira, Djenane; Egginton, Jason S; Ridgeway, Jennifer L; Odell, Laura; May, Carl R; Montori, Victor M

    2012-01-01

    Burden of treatment refers to the workload of health care as well as its impact on patient functioning and well-being. We set out to build a conceptual framework of issues descriptive of burden of treatment from the perspective of the complex patient, as a first step in the development of a new patient-reported measure. We conducted semistructured interviews with patients seeking medication therapy management services at a large, academic medical center. All patients had a complex regimen of self-care (including polypharmacy), and were coping with one or more chronic health conditions. We used framework analysis to identify and code themes and subthemes. A conceptual framework of burden of treatment was outlined from emergent themes and subthemes. Thirty-two patients (20 female, 12 male, age 26-85 years) were interviewed. Three broad themes of burden of treatment emerged including: the work patients must do to care for their health; problem-focused strategies and tools to facilitate the work of self-care; and factors that exacerbate the burden felt. The latter theme encompasses six subthemes including challenges with taking medication, emotional problems with others, role and activity limitations, financial challenges, confusion about medical information, and health care delivery obstacles. We identified several key domains and issues of burden of treatment amenable to future measurement and organized them into a conceptual framework. Further development work on this conceptual framework will inform the derivation of a patient-reported measure of burden of treatment.

  20. Finding the Right Fit: Assessing the Impact of Traditional v. Large Lecture/Small Lab Course Formats on a General Education Course

    ERIC Educational Resources Information Center

    Wildermuth, Susan M.; French, Tammy; Fredrick, Edward

    2013-01-01

    This study explores alternative approaches for teaching general education courses burdened with serving extremely large enrollments. It compares the effectiveness of a self-contained course in which each course section is taught by one instructor to a large lecture/small lab format in which all course enrollees attend one large lecture section and…

  1. Symptoms of depression in non-routine caregivers: the role of caregiver strain and burden.

    PubMed

    Phillips, Anna C; Gallagher, Stephen; Hunt, Kate; Der, Geoff; Carroll, Douglas

    2009-11-01

    The origins and persistence of psychological morbidity in caregivers are not fully understood. The present analysis examined the relationship between the strain and burden of caregiving and depression and anxiety in a large community sample. Social support and sleep quality were investigated as potential mediators. Cross-sectional and prospective observational study. Individuals caring for someone other than their own child (N=393) were identified from a population of 2,079. Caregiving strain and burden, social support, and sleep quality were assessed. Participants completed the hospital anxiety and depression scale at the same time and 5 years later. Caregiving strain and burden were associated with depression and anxiety symptoms cross-sectionally, and with a worsening of symptoms 5 years later. Sleep quality appeared to mediate the cross-sectional relationships. The demands of caregiving and associated sleep disruption contribute to symptoms of depression and anxiety in caregivers.

  2. Investigation of the relative fine and coarse mode aerosol loadings and properties in the Southern Arabian Gulf region

    NASA Astrophysics Data System (ADS)

    Kaku, Kathleen C.; Reid, Jeffrey S.; Reid, Elizabeth A.; Ross-Langerman, Kristy; Piketh, Stuart; Cliff, Steven; Al Mandoos, Abdulla; Broccardo, Stephen; Zhao, Yongjing; Zhang, Jianglong; Perry, Kevin D.

    2016-03-01

    The aerosol chemistry environment of the Arabian Gulf region is extraordinarily complex, with high concentrations of dust aerosols from surrounding deserts mixed with anthropogenic aerosols originating from a large petrochemical industry and pockets of highly urbanized areas. Despite the high levels of aerosols experienced by this region, little research has been done to explore the chemical composition of both the anthropogenic and mineral dust portion of the aerosol burden. The intensive portion of the United Arab Emirates Unified Aerosol Experiment (UAE2), conducted during August and September 2004 was designed in part to resolve the aerosol chemistry through the use of multiple size-segregated aerosol samplers. The coarse mode mass (derived by subtracting the PM2.5 aerosol mass from the PM10 mass) is largely dust at 76% ± 7% of the total coarse mode mass, but is significantly impacted by anthropogenic pollution, primarily sulfate and nitrate. The PM2.5 aerosol mass also contains a large dust burden, at 38% ± 26%, but the anthropogenic component dominates. The total aerosol burden has significant impact not only on the atmosphere, but also the local population, as the air quality levels for both the PM10 and PM2.5 aerosol masses reached unhealthy levels for 24% of the days sampled.

  3. The Convergence of High Performance Computing and Large Scale Data Analytics

    NASA Astrophysics Data System (ADS)

    Duffy, D.; Bowen, M. K.; Thompson, J. H.; Yang, C. P.; Hu, F.; Wills, B.

    2015-12-01

    As the combinations of remote sensing observations and model outputs have grown, scientists are increasingly burdened with both the necessity and complexity of large-scale data analysis. Scientists are increasingly applying traditional high performance computing (HPC) solutions to solve their "Big Data" problems. While this approach has the benefit of limiting data movement, the HPC system is not optimized to run analytics, which can create problems that permeate throughout the HPC environment. To solve these issues and to alleviate some of the strain on the HPC environment, the NASA Center for Climate Simulation (NCCS) has created the Advanced Data Analytics Platform (ADAPT), which combines both HPC and cloud technologies to create an agile system designed for analytics. Large, commonly used data sets are stored in this system in a write once/read many file system, such as Landsat, MODIS, MERRA, and NGA. High performance virtual machines are deployed and scaled according to the individual scientist's requirements specifically for data analysis. On the software side, the NCCS and GMU are working with emerging commercial technologies and applying them to structured, binary scientific data in order to expose the data in new ways. Native NetCDF data is being stored within a Hadoop Distributed File System (HDFS) enabling storage-proximal processing through MapReduce while continuing to provide accessibility of the data to traditional applications. Once the data is stored within HDFS, an additional indexing scheme is built on top of the data and placed into a relational database. This spatiotemporal index enables extremely fast mappings of queries to data locations to dramatically speed up analytics. These are some of the first steps toward a single unified platform that optimizes for both HPC and large-scale data analysis, and this presentation will elucidate the resulting and necessary exascale architectures required for future systems.

  4. Highly Scalable Matching Pursuit Signal Decomposition Algorithm

    NASA Technical Reports Server (NTRS)

    Christensen, Daniel; Das, Santanu; Srivastava, Ashok N.

    2009-01-01

    Matching Pursuit Decomposition (MPD) is a powerful iterative algorithm for signal decomposition and feature extraction. MPD decomposes any signal into linear combinations of its dictionary elements, or atoms. A best-fit atom from an arbitrarily defined dictionary is determined through cross-correlation. The selected atom is subtracted from the signal and this procedure is repeated on the residual in the subsequent iterations until a stopping criterion is met. The reconstructed signal reveals the waveform structure of the original signal. However, a sufficiently large dictionary is required for an accurate reconstruction; this in turn increases the computational burden of the algorithm, thus limiting its applicability and level of adoption. The purpose of this research is to improve the scalability and performance of the classical MPD algorithm. Correlation thresholds were defined to prune insignificant atoms from the dictionary. The Coarse-Fine Grids and Multiple Atom Extraction techniques were proposed to decrease the computational burden of the algorithm. The Coarse-Fine Grids method enabled the approximation and refinement of the parameters for the best fit atom. The ability to extract multiple atoms within a single iteration enhanced the effectiveness and efficiency of each iteration. These improvements were implemented to produce an improved Matching Pursuit Decomposition algorithm entitled MPD++. Disparate signal decomposition applications may require a particular emphasis on accuracy or computational efficiency. The prominence of the key signal features required for the proper signal classification dictates the level of accuracy necessary in the decomposition. The MPD++ algorithm may be easily adapted to accommodate the imposed requirements. Certain feature extraction applications may require rapid signal decomposition. 
The full potential of MPD++ may be utilized to produce substantial performance gains while extracting only slightly less energy than the standard algorithm. When the utmost accuracy must be achieved, the modified algorithm extracts atoms more conservatively but still exhibits computational gains over classical MPD. The MPD++ algorithm was demonstrated using an over-complete dictionary on real-life data. Computational times were reduced by factors of 1.9 and 44 for the emphases of accuracy and performance, respectively. The modified algorithm extracted similar amounts of energy compared to classical MPD. The degree of the improvement in computational time depends on the complexity of the data, the initialization parameters, and the breadth of the dictionary. The results of the research confirm that the three modifications successfully improved the scalability and computational efficiency of the MPD algorithm. Correlation Thresholding decreased the time complexity by reducing the dictionary size. Multiple Atom Extraction also reduced the time complexity by decreasing the number of iterations required for a stopping criterion to be reached. The Coarse-Fine Grids technique enabled complicated atoms with numerous variable parameters to be effectively represented in the dictionary. Due to the nature of the three proposed modifications, they are capable of being stacked and have cumulative effects on the reduction of the time complexity.
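    The core MPD loop described above (cross-correlate, select the best-fit atom, subtract, repeat until a stopping criterion) can be sketched as follows. This is a minimal generic matching pursuit, not the MPD++ code, and the function and variable names are illustrative.

```python
import numpy as np

def matching_pursuit(signal, dictionary, n_iter=20, tol=1e-8):
    """Greedy matching pursuit: each iteration picks the unit-norm atom
    (a column of `dictionary`) with the largest |cross-correlation| with
    the residual, subtracts its projection, and repeats until the best
    correlation falls below `tol` (the stopping criterion)."""
    residual = np.asarray(signal, dtype=float).copy()
    coeffs = np.zeros(dictionary.shape[1])
    for _ in range(n_iter):
        corr = dictionary.T @ residual          # cross-correlation step
        k = int(np.argmax(np.abs(corr)))        # best-fit atom index
        if abs(corr[k]) < tol:
            break                               # stopping criterion met
        coeffs[k] += corr[k]
        residual -= corr[k] * dictionary[:, k]  # subtract selected atom
    return coeffs, residual
```

    The paper's modifications slot naturally into this loop: correlation thresholding prunes dictionary columns before the `corr` computation, and multiple atom extraction subtracts several well-correlated atoms per pass instead of one.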

  5. Medical students as human subjects in educational research

    PubMed Central

    Sarpel, Umut; Hopkins, Mary Ann; More, Frederick; Yavner, Steven; Pusic, Martin; Nick, Michael W.; Song, Hyuksoon; Ellaway, Rachel; Kalet, Adina L.

    2013-01-01

    Introduction Special concerns often arise when medical students are themselves the subjects of education research. A recently completed large, multi-center randomized controlled trial of computer-assisted learning modules for surgical clerks provided the opportunity to explore the perceived level of risk of studies where medical students serve as human subjects by reporting on: 1) the response of Institutional Review Boards (IRBs) at seven institutions to the same study protocol; and 2) the thoughts and feelings of students across study sites about being research subjects. Methods From July 2009 to August 2010, all third-year medical students at seven collaborating institutions were eligible to participate. Patterns of IRB review of the same protocol were compared. Participation burden was calculated in terms of the time spent interacting with the modules. Focus groups were conducted with medical students at each site. Transcripts were coded by three independent reviewers and analyzed using Atlas.ti. Results The IRBs at the seven participating institutions granted full (n=1), expedited (n=4), or exempt (n=2) review of the WISE Trial protocol. 995 (73% of those eligible) consented to participate, and 207 (20%) of these students completed all outcome measures. The average time to complete the computer modules and associated measures was 175 min. Common themes in focus groups with participant students included the desire to contribute to medical education research, the absence of coercion to consent, and the low-risk nature of the research. Discussion Our findings demonstrate that risk assessment and the extent of review utilized for medical education research vary among IRBs. Despite variability in the perception of risk implied by differing IRB requirements, students themselves felt education research was low risk and did not consider themselves to be vulnerable. The vast majority of eligible medical students were willing to participate as research subjects. 
Participants acknowledged the time demands of their participation and were readily able to withdraw when those burdens became unsustainable. PMID:23443075

  6. Knowledge management: an application to wildfire prevention planning

    Treesearch

    Daniel L Schmoldt

    1989-01-01

    Residential encroachment into wildland areas places an additional burden on fire management activities. Prevention programs, fuel management efforts, and suppression strategies, previously employed in wildland areas, require modification for protection of increased values at risk in this interface area. Knowledge-based computer systems are being investigated as...

  7. Disease Burden of 32 Infectious Diseases in the Netherlands, 2007-2011.

    PubMed

    van Lier, Alies; McDonald, Scott A; Bouwknegt, Martijn; Kretzschmar, Mirjam E; Havelaar, Arie H; Mangen, Marie-Josée J; Wallinga, Jacco; de Melker, Hester E

    2016-01-01

Infectious disease burden estimates provided by a composite health measure give a balanced view of the true impact of a disease on a population, allowing the relative impact of diseases that differ in severity and mortality to be monitored over time. This article presents the first national disease burden estimates for a comprehensive set of 32 infectious diseases in the Netherlands. The average annual disease burden was computed for the period 2007-2011 for selected infectious diseases in the Netherlands using the disability-adjusted life years (DALY) measure. The pathogen- and incidence-based approach was adopted to quantify the burden due to both morbidity and premature mortality associated with all short- and long-term consequences of infection. Natural history models, disease progression probabilities, disability weights, and other parameters were adapted from previous research. Annual incidence was obtained from statutory notification and other surveillance systems, and was corrected for under-ascertainment and under-reporting. The highest average annual disease burden was estimated for invasive pneumococcal disease (9444 DALYs/year; 95% uncertainty interval [UI]: 8911-9961) and influenza (8670 DALYs/year; 95% UI: 8468-8874), which represent 16% and 15% of the total burden of all 32 diseases, respectively. The remaining 30 diseases ranked by number of DALYs/year from high to low were: HIV infection, legionellosis, toxoplasmosis, chlamydia, campylobacteriosis, pertussis, tuberculosis, hepatitis C infection, Q fever, norovirus infection, salmonellosis, gonorrhoea, invasive meningococcal disease, hepatitis B infection, invasive Haemophilus influenzae infection, shigellosis, listeriosis, giardiasis, hepatitis A infection, infection with STEC O157, measles, cryptosporidiosis, syphilis, rabies, variant Creutzfeldt-Jakob disease, tetanus, mumps, rubella, diphtheria, and poliomyelitis.
The very low burden for the latter five diseases can be attributed to the National Immunisation Programme. The average disease burden per individual varied from 0.2 (95% UI: 0.1-0.4) DALYs per 100 infections for giardiasis, to 5081 and 3581 (95% UI: 3540-3611) DALYs per 100 infections for rabies and variant Creutzfeldt-Jakob disease, respectively. For guiding and supporting public health policy decisions regarding the prioritisation of interventions and preventive measures, estimates of disease burden and the comparison of burden between diseases can be informative. Although the collection of disease-specific parameters and estimation of incidence is a process subject to continuous improvement, the current study established a baseline for assessing the impact of future public health initiatives.
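The DALY arithmetic underlying such estimates can be sketched in a few lines. This is a minimal, undiscounted YLD + YLL calculation with entirely hypothetical inputs, not the study's pathogen-specific natural history models:

```python
def dalys(cases, disability_weight, duration_years, deaths, residual_life_expectancy):
    """DALYs = YLD + YLL (no discounting or age weighting)."""
    yld = cases * disability_weight * duration_years      # morbidity component
    yll = deaths * residual_life_expectancy               # mortality component
    return yld + yll

# Hypothetical pathogen: 10,000 incident cases, disability weight 0.05,
# a two-week illness, and 20 deaths at a mean residual life expectancy of 35 years.
annual_burden = dalys(10_000, 0.05, 14 / 365, 20, 35.0)
print(round(annual_burden, 1))  # → 719.2 (dominated by the mortality term here)
```

Because YLL scales with deaths and residual life expectancy, severe but rare diseases (such as rabies in the abstract above) can carry an enormous burden per 100 infections despite a small case count.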

  8. Using Commercial-off-the-Shelf Computer Games to Train and Educate Complexity and Complex Decision-Making

    DTIC Science & Technology

    2008-09-01

Jean Piaget is one of the pioneers of constructivist learning theory; Piaget states that knowledge is constructed and learning occurs through an...the mechanics of each game. For instance, if a training program is developed around the U.S. Army's America's Army computer game, then little funds...

  9. Maximizing Computational Capability with Minimal Power

    DTIC Science & Technology

    2009-03-01

Chip-Scale Energy and Power... and Heat...Optical bench: mounting posts, polarizers, XYZ translator, optical slide, imager chip, LCD interfaced with the computer; VMM computational pixel...Signal routing power/memory: power does not include off-chip communication (i.e. accessing memory). Power = ½ C·Vdd²·f for CMOS. Chip to chip (10 pF load min

  10. International Symposium on 21st Century Challenges in Computational Engineering and Science

    DTIC Science & Technology

    2010-02-26

Grant number: FA9550-09-1-0648.

  11. Assessing self-care and social function using a computer adaptive testing version of the pediatric evaluation of disability inventory.

    PubMed

    Coster, Wendy J; Haley, Stephen M; Ni, Pengsheng; Dumas, Helene M; Fragala-Pinkham, Maria A

    2008-04-01

    To examine score agreement, validity, precision, and response burden of a prototype computer adaptive testing (CAT) version of the self-care and social function scales of the Pediatric Evaluation of Disability Inventory compared with the full-length version of these scales. Computer simulation analysis of cross-sectional and longitudinal retrospective data; cross-sectional prospective study. Pediatric rehabilitation hospital, including inpatient acute rehabilitation, day school program, outpatient clinics; community-based day care, preschool, and children's homes. Children with disabilities (n=469) and 412 children with no disabilities (analytic sample); 38 children with disabilities and 35 children without disabilities (cross-validation sample). Not applicable. Summary scores from prototype CAT applications of each scale using 15-, 10-, and 5-item stopping rules; scores from the full-length self-care and social function scales; time (in seconds) to complete assessments and respondent ratings of burden. Scores from both computer simulations and field administration of the prototype CATs were highly consistent with scores from full-length administration (r range, .94-.99). Using computer simulation of retrospective data, discriminant validity, and sensitivity to change of the CATs closely approximated that of the full-length scales, especially when the 15- and 10-item stopping rules were applied. In the cross-validation study the time to administer both CATs was 4 minutes, compared with over 16 minutes to complete the full-length scales. Self-care and social function score estimates from CAT administration are highly comparable with those obtained from full-length scale administration, with small losses in validity and precision and substantial decreases in administration time.
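The adaptive item-selection loop behind a CAT can be illustrated with a toy staircase procedure: administer the unused item whose difficulty is closest to the current ability estimate, nudge the estimate after each response, and stop after a fixed item count (the analogue of the 15-, 10-, and 5-item stopping rules above). This is a pedagogical simplification, not the IRT scoring used by the PEDI prototype:

```python
def cat_session(item_difficulties, respond, max_items=10):
    """Toy computer adaptive test: pick the unused item nearest the current
    ability estimate, step the estimate up or down by the response, shrink
    the step, and stop after max_items (a fixed-length stopping rule)."""
    theta, step, used = 0.0, 1.0, set()
    for _ in range(min(max_items, len(item_difficulties))):
        j = min((i for i in range(len(item_difficulties)) if i not in used),
                key=lambda i: abs(item_difficulties[i] - theta))
        used.add(j)
        theta += step if respond(item_difficulties[j]) else -step
        step *= 0.7  # geometric step-shrinking lets the estimate settle
    return theta

# Simulated respondent who passes any item easier than ability 1.5
bank = [-2.0, -1.0, 0.0, 0.5, 1.0, 1.25, 1.5, 1.75, 2.0, 3.0]
estimate = cat_session(bank, lambda b: b < 1.5)
```

Fewer items means a faster test at the cost of a noisier estimate, which is exactly the precision/burden trade-off the stopping-rule comparison in the abstract quantifies.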

  12. Impact of injection therapy on retinal patients with diabetic macular edema or retinal vein occlusion.

    PubMed

    Sivaprasad, Sobha; Oyetunde, Sesan

    2016-01-01

    An important factor in the choice of therapy is the impact it has on the patient's quality of life. This survey aimed to understand treatment burden, treatment-related anxiety and worry, and practical issues such as appointment attendance and work absence in patients receiving injection therapy for diabetic macular edema (DME) or retinal vein occlusion (RVO). A European sample of 131 retinal patients completed a detailed questionnaire to elucidate the impact of injection therapy on individuals with DME or RVO. RVO and DME greatly impact a patient's quality of life. An intensive injection regimen and the requirements for multiple hospital visits place a large practical burden on the patient. Each intravitreal injection appointment (including travel time) was reported to take an average of 4.5 hours, with a total appointment burden over 6 months of 13.5 hours and 20 hours for RVO and DME patients, respectively. This creates a significant burden on patient time and may make appointment attendance difficult. Indeed, 53% of working patients needed to take at least 1 day off work per appointment and 71% of patients required a carer's assistance at the time of the injection appointment, ~6.3 hours per injection. In addition to practical issues, three-quarters of patients reported experiencing anxiety about their most recent injection treatment, with 54% of patients reporting that they were anxious for at least 2 days prior to the injection. Patients' most desired improvement to their treatment regimen was to have fewer injections and to require fewer appointments, to achieve the same visual results. Patients' quality of life is clearly very affected by having to manage an intensive intravitreal injection regimen, with a considerable treatment burden having a large negative effect. 
Reducing the appointment burden to achieve the same visual outcomes and the provision of additional support for patients to attend appointments would greatly benefit those receiving intravitreal injection therapies for DME and RVO.

  13. Who You Gonna Call? Responding to a Medical Emergency with the Strategic National Stockpile

    DTIC Science & Technology

    2004-06-01

    pharmaceuticals. The logistical burden on the local and state responders who receive, dispense, and distribute the SNS is considerable. The time and...bulk drugs in the SNS are repackaged on- site , the labeling machines that accompany each of the push packs can be programmed to provide the required...burden on local responders , who already will be facing a significant task in dispensing the SNS materiel to a (potentially) large affected population

  14. Under-reporting of sputum smear-positive tuberculosis cases in Kenya.

    PubMed

    Tollefson, D; Ngari, F; Mwakala, M; Gethi, D; Kipruto, H; Cain, K; Bloss, E

    2016-10-01

    Although an estimated three million tuberculosis (TB) cases worldwide are missed by national TB programs annually, the level of under-reporting of diagnosed cases in high TB burden settings is largely unknown. To quantify and describe under-reporting of sputum smear-positive TB cases in Kenya. A national-level retrospective TB inventory study was conducted. All sputum smear-positive TB cases diagnosed by public or private laboratories during 1 April-30 June 2013 were extracted from laboratory registers in 73 randomly sampled subcounties and matched to TB cases in the national TB surveillance system (TIBU). Bivariate and multivariate analyses were conducted. In the subcounties sampled, 715 of 3409 smear-positive TB cases in laboratory registers were not found in TIBU. The estimated level of under-reporting of smear-positive TB cases in Kenya was 20.7% (95%CI 18.4-23.0). Under-reporting was greatest in subcounties with a high TB burden. Unreported cases were more likely to be patients aged ⩾55 years, have scanty smear results, and be diagnosed at large facilities, private facilities, and facilities in high TB burden regions. In Kenya, one fifth of smear-positive TB cases diagnosed during the study period went unreported, suggesting that the true TB burden is higher than reported. TB surveillance in Kenya should be strengthened to ensure all diagnosed TB cases are reported.

  15. Controlling measles using supplemental immunization activities: A mathematical model to inform optimal policy

    PubMed Central

    Verguet, Stéphane; Johri, Mira; Morris, Shaun K.; Gauvreau, Cindy L.; Jha, Prabhat; Jit, Mark

    2015-01-01

    Background The Measles & Rubella Initiative, a broad consortium of global health agencies, has provided support to measles-burdened countries, focusing on sustaining high coverage of routine immunization of children and supplementing it with a second dose opportunity for measles vaccine through supplemental immunization activities (SIAs). We estimate optimal scheduling of SIAs in countries with the highest measles burden. Methods We develop an age-stratified dynamic compartmental model of measles transmission. We explore the frequency of SIAs in order to achieve measles control in selected countries and two Indian states with high measles burden. Specifically, we compute the maximum allowable time period between two consecutive SIAs to achieve measles control. Results Our analysis indicates that a single SIA will not control measles transmission in any of the countries with high measles burden. However, regular SIAs at high coverage levels are a viable strategy to prevent measles outbreaks. The periodicity of SIAs differs between countries and even within a single country, and is determined by population demographics and existing routine immunization coverage. Conclusions Our analysis can guide country policymakers deciding on the optimal scheduling of SIA campaigns and the best combination of routine and SIA vaccination to control measles. PMID:25541214
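The role of SIAs can be illustrated with a toy, non-age-stratified SIR model in which a campaign periodically immunizes a fraction of the remaining susceptibles; all parameter values below are illustrative, not the paper's calibrated estimates:

```python
def sir_with_sias(beta=0.3, gamma=0.1, sia_every=100, sia_coverage=0.9, days=1000):
    """Daily-step SIR dynamics with pulse vaccination every `sia_every` days.
    State variables are fractions of a closed population."""
    s, i, r = 0.9, 0.001, 0.099
    peak = i
    for t in range(1, days + 1):
        if t % sia_every == 0:
            moved = sia_coverage * s      # an SIA immunizes susceptibles directly
            s, r = s - moved, r + moved
        new_inf = beta * s * i            # incidence this day
        rec = gamma * i                   # recoveries this day
        s -= new_inf
        i += new_inf - rec
        r += rec
        peak = max(peak, i)
    return s, i, r, peak

s, i, r, peak = sir_with_sias()
```

The maximum allowable interval between SIAs in the paper corresponds to keeping the susceptible pool below the epidemic threshold (effective reproduction number under one) between campaigns; lengthening `sia_every` or lowering `sia_coverage` lets susceptibles accumulate until an outbreak is possible again.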

  16. Serum N-propeptide of collagen IIA (PIIANP) as a marker of radiographic osteoarthritis burden.

    PubMed

    Daghestani, Hikmat N; Jordan, Joanne M; Renner, Jordan B; Doherty, Michael; Wilson, A Gerry; Kraus, Virginia B

    2017-01-01

    Cartilage homeostasis relies on a balance of catabolism and anabolism of cartilage matrix. Our goal was to evaluate the burden of radiographic osteoarthritis and serum levels of type IIA procollagen amino terminal propeptide (sPIIANP), a biomarker representing type II collagen synthesis, in osteoarthritis. OA burden was quantified on the basis of radiographic features as total joint faces with an osteophyte, joint space narrowing, or in the spine, disc space narrowing. sPIIANP was measured in 1,235 participants from the Genetics of Generalized Osteoarthritis study using a competitive enzyme-linked immunosorbent assay. Separate multivariable linear regression models, adjusted for age, sex, and body mass index and additionally for ipsilateral osteophytes or joint/disc space narrowing, were used to assess the independent association of sPIIANP with osteophytes and with joint/disc space narrowing burden in knees, hips, hands and spine, individually and together. After full adjustment, sPIIANP was significantly associated with a lesser burden of hip joint space narrowing and knee osteophytes. sPIIANP was associated with a lesser burden of hand joint space narrowing but a greater burden of hand osteophytes; these results were only evident upon adjustment for osteoarthritic features in all other joints. There were no associations of sPIIANP and features of spine osteoarthritis. Higher cartilage collagen synthesis, as reflected in systemic PIIANP concentrations, was associated with lesser burden of osteoarthritic features in lower extremity joints (knees and hips), even accounting for osteoarthritis burden in hands and spine, age, sex and body mass index. These results suggest that pro-anabolic agents may be appropriate for early treatment to prevent severe lower extremity large joint osteoarthritis.

  17. A Large-Scale Design Integration Approach Developed in Conjunction with the Ares Launch Vehicle Program

    NASA Technical Reports Server (NTRS)

    Redmon, John W.; Shirley, Michael C.; Kinard, Paul S.

    2012-01-01

    This paper presents a method for performing large-scale design integration, taking a classical 2D drawing envelope and interface approach and applying it to modern three dimensional computer aided design (3D CAD) systems. Today, the paradigm often used when performing design integration with 3D models involves a digital mockup of an overall vehicle, in the form of a massive, fully detailed, CAD assembly; therefore, adding unnecessary burden and overhead to design and product data management processes. While fully detailed data may yield a broad depth of design detail, pertinent integration features are often obscured under the excessive amounts of information, making them difficult to discern. In contrast, the envelope and interface method results in a reduction in both the amount and complexity of information necessary for design integration while yielding significant savings in time and effort when applied to today's complex design integration projects. This approach, combining classical and modern methods, proved advantageous during the complex design integration activities of the Ares I vehicle. Downstream processes, benefiting from this approach by reducing development and design cycle time, include: Creation of analysis models for the Aerodynamic discipline; Vehicle to ground interface development; Documentation development for the vehicle assembly.

  18. CHESS improves cancer caregivers' burden and mood: results of an eHealth RCT.

    PubMed

    DuBenske, Lori L; Gustafson, David H; Namkoong, Kang; Hawkins, Robert P; Atwood, Amy K; Brown, Roger L; Chih, Ming-Yuan; McTavish, Fiona; Carmack, Cindy L; Buss, Mary K; Govindan, Ramaswamy; Cleary, James F

    2014-10-01

    Informal caregivers (family and friends) of people with cancer are often unprepared for their caregiving role, leading to increased burden or distress. Comprehensive Health Enhancement Support System (CHESS) is a Web-based lung cancer information, communication, and coaching system for caregivers. This randomized trial reports the impact on caregiver burden, disruptiveness, and mood of providing caregivers access to CHESS versus the Internet with a list of recommended lung cancer websites. A total of 285 informal caregivers of patients with advanced nonsmall cell lung cancer were randomly assigned to a comparison group that received Internet or a treatment group that received Internet and CHESS. Caregivers were provided a computer and Internet service if needed. Written surveys were completed at pretest and during the intervention period bimonthly for up to 24 months. Analyses of covariance (ANCOVAs) compared the intervention's effect on caregivers' disruptiveness and burden (CQOLI-C), and negative mood (combined Anxiety, Depression, and Anger scales of the POMS) at 6 months, controlling for blocking variables (site, caregiver's race, and relationship to patient) and the given outcome at pretest. Caregivers randomized to CHESS reported lower burden, t(84) = 2.36, p = .021, d = .39, and negative mood, t(86) = 2.82, p = .006, d = .44, than those in the Internet group. The effect on disruptiveness was not significant. Although caring for someone with a terminal illness will always exact a toll on caregivers, eHealth interventions like CHESS may improve caregivers' understanding and coping skills and, as a result, ease their burden and mood.

  19. CHESS Improves Cancer Caregivers’ Burden and Mood: Results of an eHealth RCT

    PubMed Central

    DuBenske, Lori L.; Gustafson, David H.; Namkoong, Kang; Hawkins, Robert P.; Atwood, Amy K.; Brown, Roger L.; Chih, Ming-Yuan; McTavish, Fiona; Carmack, Cindy L.; Buss, Mary K.; Govindan, Ramaswamy; Cleary, James F.

    2014-01-01

    Objective Informal caregivers (family and friends) of people with cancer are often unprepared for their caregiving role, leading to increased burden or distress. CHESS (Comprehensive Health Enhancement Support System) is a web-based lung cancer information, communication and coaching system for caregivers. This randomized trial reports the impact on caregiver burden, disruptiveness and mood of providing caregivers access to CHESS versus the Internet with a list of recommended lung cancer websites. Methods 285 informal caregivers of patients with advanced non-small cell lung cancer were randomly assigned to a comparison group that received Internet or a treatment group that received Internet and CHESS. Caregivers were provided a computer and Internet service if needed. Written surveys were completed at pretest and during the intervention period bimonthly for up to 24 months. ANCOVA analyses compared the intervention’s effect on caregivers’ disruptiveness and burden (CQOLI-C), and negative mood (combined Anxiety, Depression, and Anger scales of the POMS) at six months, controlling for blocking variables (site, caregiver’s race, and relationship to patient) and the given outcome at pretest. Results Caregivers randomized to CHESS reported lower burden [t (84) = 2.36, p = .021, d= .39] and negative mood [t (86) = 2.82, p = .006, d= .44] than those in the Internet group. The effect on disruptiveness was not significant. Conclusions Although caring for someone with a terminal illness will always exact a toll on caregivers, eHealth interventions like CHESS may improve caregivers’ understanding and coping skills and, as a result, ease their burden and mood. PMID:24245838

  20. Environmental Health Related Socio-Spatial Inequalities: Identifying “Hotspots” of Environmental Burdens and Social Vulnerability

    PubMed Central

    Shrestha, Rehana; Flacke, Johannes; Martinez, Javier; van Maarseveen, Martin

    2016-01-01

Differential exposure to multiple environmental burdens and benefits and their distribution across a population with varying vulnerability can contribute heavily to health inequalities. Particularly relevant are areas with high cumulative burdens and high social vulnerability, termed “hotspots”. This paper develops an index-based approach to assess these multiple burdens and benefits in combination with vulnerability factors at a detailed intra-urban level. The method is applied to the city of Dortmund, Germany. Using non-spatial and spatial methods we assessed inequalities and identified “hotspot” areas in the city. We found modest inequalities burdening more vulnerable groups in Dortmund (CI = −0.020 at p < 0.05). At the detailed intra-urban level, however, inequalities showed strong geographical patterns. Large numbers of “hotspots” exist in the northern part of the city compared to the southern part. A holistic assessment, particularly at a detailed local level, considering both environmental burdens and benefits and their distribution across populations of differing vulnerability, is essential to inform environmental justice debates and to mobilize local stakeholders. Locating “hotspot” areas at this detailed spatial level can serve as a basis to develop interventions that target vulnerable groups to ensure a health-conducive and equitable environment. PMID:27409625
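The hotspot logic can be sketched as flagging areas that fall in the upper tail of both a normalized cumulative-burden index and a social-vulnerability index. The min-max normalization, quantile cutoff, and district values below are illustrative assumptions, not the paper's index construction:

```python
import numpy as np

def flag_hotspots(burden_index, vulnerability_index, q=0.75):
    """Mark areas in the top quartile of BOTH min-max-normalized indices."""
    def norm(x):
        x = np.asarray(x, dtype=float)
        return (x - x.min()) / (np.ptp(x) + 1e-9)   # guard against zero range
    b, v = norm(burden_index), norm(vulnerability_index)
    return (b >= np.quantile(b, q)) & (v >= np.quantile(v, q))

# Four hypothetical districts; only the last scores high on both indices.
flags = flag_hotspots([1, 2, 3, 10], [0, 1, 2, 9])
```

Requiring the conjunction of both tails is what distinguishes a hotspot from an area that is merely burdened or merely vulnerable.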

  1. Information Operations and FATA Integration into the National Mainstream

    DTIC Science & Technology

    2012-09-01

Edward L. Fisher...Table of Contents: I. Introduction...i. Computer Network Operations...j. CNO as an IO Core Capability

  2. Design Matters: The Impact of CAPI on Interview Length

    ERIC Educational Resources Information Center

    Watson, Nicole; Wilkins, Roger

    2015-01-01

    Computer-assisted personal interviewing (CAPI) offers many attractive benefits over paper-and-pencil interviewing. There is, however, mixed evidence on the impact of CAPI on interview "length," an important survey outcome in the context of length limits imposed by survey budgets and concerns over respondent burden. In this article,…

  3. Thermal Cycle Annealing and its Application to Arsenic-Ion Implanted HgCdTe

    DTIC Science & Technology

    2014-06-26

Sina Simingalam, Priyalal Wijewarnasuriya, Mulpuri V. Rao...Implanted HgCdTe. Sina Simingalam (a,b,c), Priyalal Wijewarnasuriya (b), Mulpuri V. Rao (c); a. School of Physics, Astronomy and Computational Sciences, George

  4. The Design and Development of an Evaluation System for Online Instruction.

    ERIC Educational Resources Information Center

    Wentling, Tim L.; Johnson, Scott D.

    This paper describes the conceptualization and development of an evaluation system that can be used to monitor and evaluate online instructional efforts. The evaluation system addresses concerns of both program administrators and course instructors. Computer technology is used to provide partial automation to reduce respondent burden and to…

  5. 45 CFR 74.51 - Monitoring and reporting program performance.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... quantitative data should be related to cost data for computation of unit costs. (2) Reasons why established..., analysis and explanation of cost overruns or high unit costs. (e) Recipients shall submit the original and..., “Controlling Paperwork Burdens on the Public,” when requesting performance data from recipients. ...

  6. Salient regions detection using convolutional neural networks and color volume

    NASA Astrophysics Data System (ADS)

    Liu, Guang-Hai; Hou, Yingkun

    2018-03-01

Convolutional neural networks are an important technique in machine learning, pattern recognition, and image processing. In order to reduce the computational burden and extend the classical LeNet-5 model to the field of saliency detection, we propose a simple and novel computing model based on the LeNet-5 network. In the proposed model, hue, saturation, and intensity are utilized to extract depth cues, and we then integrate the depth cues and color volume into saliency detection, following the basic structure of the feature integration theory. Experimental results show that the proposed computing model outperforms some existing state-of-the-art methods on the MSRA1000 and ECSSD datasets.
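The hue/saturation/intensity decomposition used as input can be computed with the standard HSI conversion formulas. The saliency step below is a crude global-contrast stand-in for the paper's LeNet-5-based model, shown only to make the feature-integration idea concrete:

```python
import numpy as np

def rgb_to_hsi(img):
    """Split an RGB image (values in [0, 1]) into hue, saturation, intensity."""
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    eps = 1e-8
    intensity = (r + g + b) / 3.0
    saturation = 1.0 - 3.0 * np.minimum(np.minimum(r, g), b) / (r + g + b + eps)
    num = 0.5 * ((r - g) + (r - b))
    den = np.sqrt((r - g) ** 2 + (r - b) * (g - b)) + eps
    hue = np.arccos(np.clip(num / den, -1.0, 1.0))
    hue = np.where(b > g, 2.0 * np.pi - hue, hue) / (2.0 * np.pi)  # scale to [0, 1]
    return hue, saturation, intensity

def global_contrast_saliency(img):
    """Average each HSI channel's absolute deviation from its mean, then
    min-max normalize: a simple proxy for a learned saliency map."""
    maps = [np.abs(ch - ch.mean()) for ch in rgb_to_hsi(img)]
    sal = sum(maps) / len(maps)
    return (sal - sal.min()) / (sal.max() - sal.min() + 1e-8)

# Gray image with a red patch; the patch should dominate the saliency map.
img = np.full((8, 8, 3), 0.5)
img[2:4, 2:4] = [1.0, 0.0, 0.0]
sal = global_contrast_saliency(img)
```

Computing per-channel conspicuity maps and fusing them is the basic structure of feature integration theory; the paper replaces this hand-crafted contrast step with a trained convolutional network.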

  7. A call to strengthen the global strategy against schistosomiasis and soil-transmitted helminthiasis: the time is now.

    PubMed

    Lo, Nathan C; Addiss, David G; Hotez, Peter J; King, Charles H; Stothard, J Russell; Evans, Darin S; Colley, Daniel G; Lin, William; Coulibaly, Jean T; Bustinduy, Amaya L; Raso, Giovanna; Bendavid, Eran; Bogoch, Isaac I; Fenwick, Alan; Savioli, Lorenzo; Molyneux, David; Utzinger, Jürg; Andrews, Jason R

    2017-02-01

    In 2001, the World Health Assembly (WHA) passed the landmark WHA 54.19 resolution for global scale-up of mass administration of anthelmintic drugs for morbidity control of schistosomiasis and soil-transmitted helminthiasis, which affect more than 1·5 billion of the world's poorest people. Since then, more than a decade of research and experience has yielded crucial knowledge on the control and elimination of these helminthiases. However, the global strategy has remained largely unchanged since the original 2001 WHA resolution and associated WHO guidelines on preventive chemotherapy. In this Personal View, we highlight recent advances that, taken together, support a call to revise the global strategy and guidelines for preventive chemotherapy and complementary interventions against schistosomiasis and soil-transmitted helminthiasis. These advances include the development of guidance that is specific to goals of morbidity control and elimination of transmission. We quantify the result of forgoing this opportunity by computing the yearly disease burden, mortality, and lost economic productivity associated with maintaining the status quo. Without change, we estimate that the population of sub-Saharan Africa will probably lose 2·3 million disability-adjusted life-years and US$3·5 billion of economic productivity every year, which is comparable to recent acute epidemics, including the 2014 Ebola and 2015 Zika epidemics. We propose that the time is now to strengthen the global strategy to address the substantial disease burden of schistosomiasis and soil-transmitted helminthiasis. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. A call to strengthen the global strategy for schistosomiasis and soil-transmitted helminthiasis: the time is now

    PubMed Central

    Lo, Nathan C.; Addiss, David G.; Hotez, Peter J.; King, Charles H.; Stothard, J. Russell; Evans, Darin S.; Colley, Daniel G.; Lin, William; Coulibaly, Jean T.; Bustinduy, Amaya L.; Raso, Giovanna; Bendavid, Eran; Bogoch, Isaac I.; Fenwick, Alan; Savioli, Lorenzo; Molyneux, David; Utzinger, Jürg; Andrews, Jason R.

    2016-01-01

    Summary In 2001, the World Health Assembly (WHA) passed the landmark WHA 54.19 resolution for global scale up of mass administration of anthelminthic drugs for morbidity control of schistosomiasis and soil-transmitted helminthiasis (STH), which affect over 1.5 billion of the world's poorest people. Since then, over a decade of research and experience has yielded critical new knowledge on the control and elimination of these helminthiases. However, the global strategy has remained largely unchanged since the original 2001 WHA resolution and associated World Health Organization (WHO) guidelines on preventive chemotherapy. Here, we highlight recent advances that, taken together, support a call to revise the global strategy and guidelines for preventive chemotherapy and complementary interventions against schistosomiasis and STH. This includes the development of guidance that is specific to goals of “morbidity control” and “elimination of transmission.” We quantify the result of forgoing this opportunity by computing the yearly disease burden, mortality, and lost economic productivity associated with maintaining status quo. Without change, we estimate that the population of sub-Saharan Africa will likely lose 2.3 million disability-adjusted life years and US$3.5 billion of economic productivity every year, which is comparable to recent acute epidemics, including the 2014 Ebola and 2015 Zika epidemics. We propose that the time is now to strengthen the global strategy to address the substantial disease burden of schistosomiasis and STH. PMID:27914852

  9. Using an Explicit Emission Tagging Method in Global Modeling of Source-Receptor Relationships for Black Carbon in the Arctic: Variations, Sources and Transport Pathways

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Hailong; Rasch, Philip J.; Easter, Richard C.

    2014-11-27

We introduce an explicit emission tagging technique in the Community Atmosphere Model to quantify source-region-resolved characteristics of black carbon (BC), focusing on the Arctic. Explicit tagging of BC source regions without perturbing the emissions makes it straightforward to establish source-receptor relationships and transport pathways, providing a physically consistent and computationally efficient approach to produce a detailed characterization of the destiny of regional BC emissions and the potential for mitigation actions. Our analysis shows that the contributions of major source regions to the global BC burden are not proportional to the respective emissions due to strong region-dependent removal rates and lifetimes, while the contributions to BC direct radiative forcing show a near-linear dependence on their respective contributions to the burden. Distant sources contribute to BC in remote regions mostly in the mid- and upper troposphere, having much less impact on lower-level concentrations (and deposition) than on burden. Arctic BC concentrations, deposition and source contributions all have strong seasonal variations. Eastern Asia contributes the most to the wintertime Arctic burden. Northern Europe emissions are more important to both surface concentration and deposition in winter than in summer. The largest contribution to Arctic BC in the summer is from Northern Asia. Although local emissions contribute less than 10% to the annual mean BC burden and deposition within the Arctic, the per-emission efficiency is much higher than for major non-Arctic sources. The interannual variability (1996-2005) due to meteorology is small in annual mean BC burden and radiative forcing but is significant in yearly seasonal means over the Arctic. When a slow aging treatment of BC is introduced, the increase of BC lifetime and burden is source-dependent.
Global BC forcing-per-burden efficiency also increases primarily due to changes in BC vertical distributions. The relative contribution from major non-Arctic sources to the Arctic BC burden increases only slightly, although the contribution of Arctic local sources is reduced by a factor of 2 due to the slow aging treatment.

  10. Infrastructure for Large-Scale Tests in Marine Autonomy

    DTIC Science & Technology

    2012-02-01

  11. Injury prevention and other international public health initiatives.

    PubMed

    Razzak, Junaid A; Sasser, Scott M; Kellermann, Arthur L

    2005-02-01

Injuries, whether caused by unintentional or intentional events, are a significant public health problem. The burden of injury is greatest in low- and middle-income countries and among individuals of low socioeconomic status living in high-income countries. Most of these injuries are preventable. Emergency physicians can play an important role in reducing the global burden of injuries by providing expert care and by identifying, implementing, and evaluating population-based countermeasures to prevent and control injuries. The strategy used in a particular country depends in large part on the nature of the local problem, the concerns of the population, the availability of resources, and competing demands. Even simple countermeasures may have a big impact in reducing the global burden of death and disability due to injury.

  12. The Gap Procedure: for the identification of phylogenetic clusters in HIV-1 sequence data.

    PubMed

    Vrbik, Irene; Stephens, David A; Roger, Michel; Brenner, Bluma G

    2015-11-04

In the context of infectious disease, sequence clustering can be used to provide important insights into the dynamics of transmission. Cluster analysis is usually performed using a phylogenetic approach whereby clusters are assigned on the basis of sufficiently small genetic distances and high bootstrap support (or posterior probabilities). The computational burden involved in this phylogenetic threshold approach is a major drawback, especially when a large number of sequences are being considered. In addition, this method requires a skilled user to specify the appropriate threshold values, which may vary widely depending on the application. This paper presents the Gap Procedure, a distance-based clustering algorithm for the classification of DNA sequences sampled from individuals infected with the human immunodeficiency virus type 1 (HIV-1). Our heuristic algorithm bypasses the need for phylogenetic reconstruction, thereby supporting the quick analysis of large genetic data sets. Moreover, this fully automated procedure relies on data-driven gaps in sorted pairwise distances to infer clusters, so no user-specified threshold values are required. The clustering results obtained by the Gap Procedure on both real and simulated data closely agree with those found using the threshold approach, while requiring only a fraction of the time to complete the analysis. Apart from the dramatic gains in computational time, the Gap Procedure is highly effective in finding distinct groups of genetically similar sequences and obviates the need for subjective user-specified values. The clusters of genetically similar sequences returned by this procedure can be used to detect patterns in HIV-1 transmission and thereby aid in the prevention, treatment and containment of the disease.
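The core gap idea is simple enough to sketch. The following is a minimal, illustrative version in plain NumPy on Euclidean points rather than genetic distances; the function names and the single-linkage grouping are a simplification for illustration, not the published procedure:

```python
import numpy as np

def gap_threshold(dists):
    """Pick a distance cutoff inside the largest gap of the sorted
    pairwise distances (the data-driven-gap idea, illustratively)."""
    d = np.sort(dists)
    gaps = np.diff(d)
    i = int(np.argmax(gaps))           # index of the widest gap
    return (d[i] + d[i + 1]) / 2.0     # cutoff in the middle of that gap

def cluster(points):
    """Single-linkage clustering using the gap-derived cutoff."""
    n = len(points)
    pairs = [(i, j) for i in range(n) for j in range(i + 1, n)]
    dists = np.array([np.linalg.norm(points[i] - points[j]) for i, j in pairs])
    cut = gap_threshold(dists)
    # Union-find: merge every pair closer than the cutoff.
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x
    for (i, j), d in zip(pairs, dists):
        if d < cut:
            parent[find(i)] = find(j)
    return [find(i) for i in range(n)]  # cluster label per point
```

For example, six points forming two well-separated clouds produce a cutoff inside the wide gap between within-group and between-group distances, so the two clouds come out as two clusters without any user-specified threshold.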

  13. Personalized, Shareable Geoscience Dataspaces For Simplifying Data Management and Improving Reproducibility

    NASA Astrophysics Data System (ADS)

    Malik, T.; Foster, I.; Goodall, J. L.; Peckham, S. D.; Baker, J. B. H.; Gurnis, M.

    2015-12-01

Research activities are iterative, collaborative, and now data- and compute-intensive. Such research activities mean that even the many researchers who work in small laboratories must often create, acquire, manage, and manipulate large amounts of diverse data and keep track of complex software. They face difficult data and software management challenges, and data sharing and reproducibility are neglected. There is significant federal investment in powerful cyberinfrastructure, in part to lessen the burden associated with modern data- and compute-intensive research. Similarly, geoscience communities are establishing research repositories to facilitate data preservation. Yet we observe that a large fraction of the geoscience community continues to struggle with data and software management. The reason, studies suggest, is not lack of awareness but rather that tools do not adequately support time-consuming data life cycle activities. Through the NSF/EarthCube-funded GeoDataspace project, we are building personalized, shareable dataspaces that help scientists connect their individual or research group efforts with the community at large. The dataspaces provide a lightweight, multiplatform research data management system with tools for recording research activities in what we call geounits, so that a geoscientist can at any time snapshot and preserve, both for their own use and to share with the community, all data and code required to understand and reproduce a study. A software-as-a-service (SaaS) deployment model enhances usability of core components and integration with widely used software systems. In this talk we will present the open-source GeoDataspace project and demonstrate how it is enabling reproducibility across geoscience domains of hydrology, space science, and modeling toolkits.

  14. Manyscale Computing for Sensor Processing in Support of Space Situational Awareness

    NASA Astrophysics Data System (ADS)

    Schmalz, M.; Chapman, W.; Hayden, E.; Sahni, S.; Ranka, S.

    2014-09-01

    Increasing image and signal data burden associated with sensor data processing in support of space situational awareness implies continuing computational throughput growth beyond the petascale regime. In addition to growing applications data burden and diversity, the breadth, diversity and scalability of high performance computing architectures and their various organizations challenge the development of a single, unifying, practicable model of parallel computation. Therefore, models for scalable parallel processing have exploited architectural and structural idiosyncrasies, yielding potential misapplications when legacy programs are ported among such architectures. In response to this challenge, we have developed a concise, efficient computational paradigm and software called Manyscale Computing to facilitate efficient mapping of annotated application codes to heterogeneous parallel architectures. Our theory, algorithms, software, and experimental results support partitioning and scheduling of application codes for envisioned parallel architectures, in terms of work atoms that are mapped (for example) to threads or thread blocks on computational hardware. Because of the rigor, completeness, conciseness, and layered design of our manyscale approach, application-to-architecture mapping is feasible and scalable for architectures at petascales, exascales, and above. Further, our methodology is simple, relying primarily on a small set of primitive mapping operations and support routines that are readily implemented on modern parallel processors such as graphics processing units (GPUs) and hybrid multi-processors (HMPs). In this paper, we overview the opportunities and challenges of manyscale computing for image and signal processing in support of space situational awareness applications. We discuss applications in terms of a layered hardware architecture (laboratory > supercomputer > rack > processor > component hierarchy). 
Demonstration applications include performance analysis and results in terms of execution time as well as storage, power, and energy consumption for bus-connected and/or networked architectures. The feasibility of the manyscale paradigm is demonstrated by addressing four principal challenges: (1) architectural/structural diversity, parallelism, and locality, (2) masking of I/O and memory latencies, (3) scalability of design as well as implementation, and (4) efficient representation/expression of parallel applications. Examples will demonstrate how manyscale computing helps solve these challenges efficiently on real-world computing systems.

  15. [Computer-aided prescribing: from utopia to reality].

    PubMed

    Suárez-Varela Ubeda, J; Beltrán Calvo, C; Molina López, T; Navarro Marín, P

    2005-05-31

To determine whether the introduction of computer-aided prescribing helped reduce the administrative burden at primary care centers. Descriptive, cross-sectional design. Torreblanca Health Center in the province of Seville, southern Spain. From 29 October 2003 to the present, a pilot project involving nine pharmacies in the basic health zone served by this health center has been running to evaluate computer-aided prescribing (the Receta XXI project) with real patients. All patients on the center's list of patients who came to the center for an administrative consultation to renew prescriptions for medications or supplies for long-term treatment. Total number of administrative visits per patient for patients who came to the center to renew prescriptions for long-term treatment, as recorded by the Diraya system (Historia Clinica Digital del Ciudadano, or Citizen's Digital Medical Record) during the period from February to July 2004. Total number of the same type of administrative visits recorded by the previous system (TASS) during the period from February to July 2003. The mean number of administrative visits per month during the period from February to July 2003 was 160, compared to a mean of 64 visits during the period from February to July 2004. The reduction in the number of visits for prescription renewal was 60%. Introducing a system for computer-aided prescribing significantly reduced the number of administrative visits for prescription renewal for long-term treatment. This could help reduce the administrative burden considerably in primary care if the system were used in all centers.

  16. Computed tomographic-based quantification of emphysema and correlation to pulmonary function and mechanics.

    PubMed

    Washko, George R; Criner, Gerald J; Mohsenifar, Zab; Sciurba, Frank C; Sharafkhaneh, Amir; Make, Barry J; Hoffman, Eric A; Reilly, John J

    2008-06-01

Computed tomographic-based indices of emphysematous lung destruction may highlight differences in disease pathogenesis and further enable the classification of subjects with chronic obstructive pulmonary disease (COPD). While there are multiple techniques that can be utilized for such radiographic analysis, there is very little published information comparing the performance of these methods in a clinical case series. Our objective was to examine several quantitative and semi-quantitative methods for the assessment of the burden of emphysema apparent on computed tomographic scans and compare their ability to predict lung mechanics and function. Automated densitometric analysis was performed on 1094 computed tomographic scans collected upon enrollment into the National Emphysema Treatment Trial. Trained radiologists performed an additional visual grading of emphysema on high-resolution CT scans. Full pulmonary function test results were available for correlation, with a subset of subjects having additional measurements of lung static recoil. There was a wide range of emphysematous lung destruction apparent on the CT scans, and univariate correlations to measures of lung function were of modest strength. No single method of CT scan analysis clearly outperformed the rest of the group. Quantification of the burden of emphysematous lung destruction apparent on CT scan is a weak predictor of lung function and mechanics in severe COPD, with no uniformly superior method found to perform this analysis. The CT-based quantification of emphysema may augment pulmonary function testing in the characterization of COPD by providing complementary phenotypic information.

  17. Inputs for subject-specific computational fluid dynamics simulation of blood flow in the mouse aorta.

    PubMed

    Van Doormaal, Mark; Zhou, Yu-Qing; Zhang, Xiaoli; Steinman, David A; Henkelman, R Mark

    2014-10-01

Mouse models are an important way of exploring relationships between blood hemodynamics and eventual plaque formation. We have developed a mouse model of aortic regurgitation (AR) that produces large changes in plaque burden with changes in hemodynamics [Zhou et al., 2010, "Aortic Regurgitation Dramatically Alters the Distribution of Atherosclerotic Lesions and Enhances Atherogenesis in Mice," Arterioscler. Thromb. Vasc. Biol., 30(6), pp. 1181-1188]. In this paper, we explore the amount of detail needed for realistic computational fluid dynamics (CFD) calculations in this experimental model. The CFD calculations use inputs based on experimental measurements from ultrasound (US), micro computed tomography (CT), and both anatomical magnetic resonance imaging (MRI) and phase contrast MRI (PC-MRI). The adequacy of five different levels of model complexity ((a) subject-specific CT data from a single mouse; (b) subject-specific CT centerlines with radii from US; (c) same as (b) but with MRI-derived centerlines; (d) average CT centerlines and averaged vessel radius and branching vessels; and (e) same as (d) but with averaged MRI centerlines) is evaluated by demonstrating their impact on relative residence time (RRT) outputs. The paper concludes by demonstrating the necessity of subject-specific geometry and recommends for inputs the use of CT or anatomical MRI for establishing the aortic centerlines, M-mode US for scaling the aortic diameters, and a combination of PC-MRI and Doppler US for estimating the spatial and temporal characteristics of the input waveforms.

  18. [Occupational burdens in special educators working with intellectually disabled students].

    PubMed

    Plichta, Piotr

    2014-01-01

The article presents the results of a study of psychosocial burdens in special educators (specialists in the field of oligophrenopedagogy) working with intellectually disabled students. In the theoretical part, the specific context of occupational stress in special educators was introduced. Additionally, the need for a broader research context regarding occupational stress and the risk of burnout in special educators working with intellectually disabled individuals was discussed. The results were obtained using Plichta and Pyzalski's Questionnaire of Occupational Burdens in Teaching (QOBT). The presented results are based on a research sample (N = 100) of female special educators teaching intellectually disabled students attending special schools in the city of Lodz. The obtained results were compared with the results coming from a large random sample (N = 429) of public school teachers working with non-intellectually disabled children from the Lodz voivodeship and referred to the norms of the QOBT. The results show a significant percentage of respondents reporting a high level of occupational burdens (conflict situations - 45%, organizational burdens - 31%, lack of work sense - 40%, global score - 40%). Seniority is not related to the level of burdens. Some significant differences concerning the level of occupational burdens between the two groups of teachers were found. The study showed, e.g., the strong need for supporting special educators in the workplace context and the need for implementing preventive and remedial measures at both individual and organizational levels (especially in terms of improving personal relationships in the workplace). Generally, the results show similarity of the stressors' ranking in special educators and school teachers working with non-intellectually disabled children.

  19. Estimating the burden of foodborne diseases in Japan

    PubMed Central

    Kumagai, Yuko; Gilmour, Stuart; Ota, Erika; Momose, Yoshika; Onishi, Toshiro; Bilano, Ver Luanni Feliciano; Kasuga, Fumiko; Sekizaki, Tsutomu

    2015-01-01

Objective: To assess the burden posed by foodborne diseases in Japan using methods developed by the World Health Organization’s Foodborne Disease Burden Epidemiology Reference Group (FERG). Methods: Expert consultation and statistics on food poisoning during 2011 were used to identify three common causes of foodborne disease in Japan: Campylobacter and Salmonella species and enterohaemorrhagic Escherichia coli (EHEC). We conducted systematic reviews of English and Japanese literature on the complications caused by these pathogens, by searching Embase, the Japan medical society abstract database and Medline. We estimated the annual incidence of acute gastroenteritis from reported surveillance data, based on estimated probabilities that an affected person would visit a physician and have gastroenteritis confirmed. We then calculated disability-adjusted life-years (DALYs) lost in 2011, using the incidence estimates along with disability weights derived from published studies. Findings: In 2011, foodborne disease caused by Campylobacter species, Salmonella species and EHEC led to an estimated loss of 6099, 3145 and 463 DALYs in Japan, respectively. These estimated burdens are based on the pyramid reconstruction method; are largely due to morbidity rather than mortality; and are much higher than those indicated by routine surveillance data. Conclusion: Routine surveillance data may indicate foodborne disease burdens that are much lower than the true values. Most of the burden posed by foodborne disease in Japan comes from secondary complications. The tools developed by FERG appear useful in estimating disease burdens and setting priorities in the field of food safety. PMID:26478611
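The DALY arithmetic underlying such estimates can be sketched in a few lines. The input numbers below are purely illustrative and are not the paper's figures:

```python
def dalys(cases, deaths, life_exp_remaining, disability_weight, duration_years):
    """DALYs = YLL + YLD.
    YLL (years of life lost)          = deaths * remaining life expectancy at death
    YLD (years lived with disability) = cases * disability weight * mean duration
    """
    yll = deaths * life_exp_remaining
    yld = cases * disability_weight * duration_years
    return yll + yld

# Illustrative inputs only: many mild cases, few deaths, so morbidity
# (YLD) dominates the total, as the paper found for Japan.
example_burden = dalys(cases=300_000, deaths=5, life_exp_remaining=40,
                       disability_weight=0.06, duration_years=0.02)
```

With these made-up numbers, YLL is 200 years and YLD is 360 years, so most of the burden comes from morbidity rather than mortality.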

  20. A Kinetic Model Describing Injury-Burden in Team Sports.

    PubMed

    Fuller, Colin W

    2017-12-01

    Injuries in team sports are normally characterised by the incidence, severity, and location and type of injuries sustained: these measures, however, do not provide an insight into the variable injury-burden experienced during a season. Injury burden varies according to the team's match and training loads, the rate at which injuries are sustained and the time taken for these injuries to resolve. At the present time, this time-based variation of injury burden has not been modelled. To develop a kinetic model describing the time-based injury burden experienced by teams in elite team sports and to demonstrate the model's utility. Rates of injury were quantified using a large eight-season database of rugby injuries (5253) and exposure (60,085 player-match-hours) in English professional rugby. Rates of recovery from injury were quantified using time-to-recovery analysis of the injuries. The kinetic model proposed for predicting a team's time-based injury burden is based on a composite rate equation developed from the incidence of injury, a first-order rate of recovery from injury and the team's playing load. The utility of the model was demonstrated by examining common scenarios encountered in elite rugby. The kinetic model developed describes and predicts the variable injury-burden arising from match play during a season of rugby union based on the incidence of match injuries, the rate of recovery from injury and the playing load. The model is equally applicable to other team sports and other scenarios.
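The composite rate idea (injuries accrue in proportion to playing load and resolve by a first-order process) amounts to the differential equation dB/dt = i·L(t) − k·B, where B is the current injury burden, i the injury incidence, L the playing load and k the recovery rate. A forward-Euler sketch with invented parameter values:

```python
def injury_burden(incidence, recovery_rate, load, days, dt=0.1):
    """Forward-Euler integration of dB/dt = incidence*load(t) - recovery_rate*B,
    where B is the current injury burden (e.g. players unavailable).
    All parameter values used with this sketch are illustrative."""
    B, t, history = 0.0, 0.0, []
    while t < days:
        B += dt * (incidence * load(t) - recovery_rate * B)
        t += dt
        history.append(B)
    return history
```

With a constant load L, the burden relaxes to the steady state i·L/k, the model's long-run squad unavailability; a varying in-season load makes the burden rise and fall around that level.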

  1. Contributions of the ARM Program to Radiative Transfer Modeling for Climate and Weather Applications

    NASA Technical Reports Server (NTRS)

    Mlawer, Eli J.; Iacono, Michael J.; Pincus, Robert; Barker, Howard W.; Oreopoulos, Lazaros; Mitchell, David L.

    2016-01-01

    Accurate climate and weather simulations must account for all relevant physical processes and their complex interactions. Each of these atmospheric, ocean, and land processes must be considered on an appropriate spatial and temporal scale, which leads these simulations to require a substantial computational burden. One especially critical physical process is the flow of solar and thermal radiant energy through the atmosphere, which controls planetary heating and cooling and drives the large-scale dynamics that moves energy from the tropics toward the poles. Radiation calculations are therefore essential for climate and weather simulations, but are themselves quite complex even without considering the effects of variable and inhomogeneous clouds. Clear-sky radiative transfer calculations have to account for thousands of absorption lines due to water vapor, carbon dioxide, and other gases, which are irregularly distributed across the spectrum and have shapes dependent on pressure and temperature. The line-by-line (LBL) codes that treat these details have a far greater computational cost than can be afforded by global models. Therefore, the crucial requirement for accurate radiation calculations in climate and weather prediction models must be satisfied by fast solar and thermal radiation parameterizations with a high level of accuracy that has been demonstrated through extensive comparisons with LBL codes. See attachment for continuation.

  2. Space-variant filtering for correction of wavefront curvature effects in spotlight-mode SAR imagery formed via polar formatting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jakowatz, C.V. Jr.; Wahl, D.E.; Thompson, P.A.

    1996-12-31

Wavefront curvature defocus effects can occur in spotlight-mode SAR imagery when reconstructed via the well-known polar formatting algorithm (PFA) under certain scenarios that include imaging at close range, use of very low center frequency, and/or imaging of very large scenes. The range migration algorithm (RMA), also known as seismic migration, was developed to accommodate these wavefront curvature effects. However, the along-track upsampling of the phase history data required of the original version of range migration can in certain instances represent a major computational burden. A more recent version of migration processing, the Frequency Domain Replication and Downsampling (FReD) algorithm, obviates the need to upsample, and is accordingly more efficient. In this paper the authors demonstrate that the combination of traditional polar formatting with appropriate space-variant post-filtering for refocus can be as efficient or even more efficient than FReD under some imaging conditions, as demonstrated by the computer-simulated results in this paper. The post-filter can be pre-calculated from a theoretical derivation of the curvature effect. The conclusion is that the new polar formatting with post-filtering algorithm (PF2) should be considered as a viable candidate for a spotlight-mode image formation processor when curvature effects are present.

  3. Optimum aggregation of geographically distributed flexible resources in strategic smart-grid/microgrid locations

    DOE PAGES

    Bhattarai, Bishnu P.; Myers, Kurt S.; Bak-Jensen, Brigitte; ...

    2017-05-17

This paper determines optimum aggregation areas for a given distribution network considering the spatial distribution of loads and the costs of aggregation. An elitist genetic algorithm combined with hierarchical clustering and a Thevenin network reduction is implemented to compute strategic locations and aggregate demand within each area. The aggregation reduces large distribution networks having thousands of nodes to an equivalent network with a few aggregated loads, thereby significantly reducing the computational burden. Furthermore, it not only helps distribution system operators make faster operational decisions by understanding during which time of day they will need flexibility, from which specific area, and in what amount, but also enables the flexibility stemming from small distributed resources to be traded in various power/energy markets. A combination of central and local aggregation schemes, where a central aggregator enables market participation while local aggregators materialize the accepted bids, is implemented to realize this concept. The effectiveness of the proposed method is evaluated by comparing network performance with and without aggregation. Finally, for a given network configuration, the steady-state performance of the aggregated network is highly accurate (≈ ±1.5% error), in contrast to the very high errors associated with forecasts of individual consumer demand.
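As a deliberately simplified stand-in for the paper's GA-plus-hierarchical-clustering scheme, plain k-means over consumer coordinates already illustrates the core reduction: many consumers collapse to a few aggregated loads placed at cluster centroids. All function names and the data layout below are invented for the sketch:

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Plain k-means on (x, y) coordinates; a toy substitute for the
    paper's elitist GA + hierarchical clustering."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    assign = [0] * len(points)
    for _ in range(iters):
        for i, p in enumerate(points):
            assign[i] = min(range(k),
                            key=lambda c: (p[0] - centers[c][0]) ** 2
                                        + (p[1] - centers[c][1]) ** 2)
        for c in range(k):
            members = [points[i] for i in range(len(points)) if assign[i] == c]
            if members:  # keep the old center if a cluster empties out
                centers[c] = (sum(m[0] for m in members) / len(members),
                              sum(m[1] for m in members) / len(members))
    return centers, assign

def aggregate_demand(consumers, k):
    """consumers: list of (x, y, demand_kw).
    Returns k aggregated loads as (centroid, total_kw)."""
    pts = [(x, y) for x, y, _ in consumers]
    centers, assign = kmeans(pts, k)
    totals = [0.0] * k
    for (_, _, kw), c in zip(consumers, assign):
        totals[c] += kw
    return list(zip(centers, totals))
```

A network model then only needs the k aggregated loads rather than every consumer node, which is the source of the computational saving the abstract describes.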

  4. Efficient simulation of intrinsic, extrinsic and external noise in biochemical systems

    PubMed Central

    Pischel, Dennis; Sundmacher, Kai; Flassig, Robert J.

    2017-01-01

    Abstract Motivation: Biological cells operate in a noisy regime influenced by intrinsic, extrinsic and external noise, which leads to large differences of individual cell states. Stochastic effects must be taken into account to characterize biochemical kinetics accurately. Since the exact solution of the chemical master equation, which governs the underlying stochastic process, cannot be derived for most biochemical systems, approximate methods are used to obtain a solution. Results: In this study, a method to efficiently simulate the various sources of noise simultaneously is proposed and benchmarked on several examples. The method relies on the combination of the sigma point approach to describe extrinsic and external variability and the τ-leaping algorithm to account for the stochasticity due to probabilistic reactions. The comparison of our method to extensive Monte Carlo calculations demonstrates an immense computational advantage while losing an acceptable amount of accuracy. Additionally, the application to parameter optimization problems in stochastic biochemical reaction networks is shown, which is rarely applied due to its huge computational burden. To give further insight, a MATLAB script is provided including the proposed method applied to a simple toy example of gene expression. Availability and implementation: MATLAB code is available at Bioinformatics online. Contact: flassig@mpi-magdeburg.mpg.de Supplementary information: Supplementary data are available at Bioinformatics online. PMID:28881987
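The τ-leaping component is easy to illustrate in isolation on a toy birth-death model of gene expression (constant production, linear degradation). The sigma-point layer for extrinsic/external variability is omitted, and the code below is a stdlib-only sketch rather than the authors' MATLAB implementation:

```python
import math
import random

def tau_leap(k_prod, k_deg, x0, t_end, tau=0.01, seed=1):
    """Tau-leaping for a birth-death process: production at rate k_prod,
    degradation at rate k_deg * x. Reaction counts over each leap are
    Poisson-distributed, which is the core tau-leaping approximation."""
    rng = random.Random(seed)
    def poisson(lam):
        # Knuth's method; adequate for the small per-leap means used here
        L, p, n = math.exp(-lam), 1.0, 0
        while True:
            p *= rng.random()
            if p <= L:
                return n
            n += 1
    x, t = x0, 0.0
    while t < t_end:
        births = poisson(k_prod * tau)       # production events in this leap
        deaths = poisson(k_deg * x * tau)    # degradation events in this leap
        x = max(0, x + births - deaths)      # copy number cannot go negative
        t += tau
    return x
```

Averaged over many runs, the copy number settles near the deterministic steady state k_prod/k_deg, while individual trajectories keep the intrinsic fluctuations that a deterministic model would miss.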

  6. A Spatial Division Clustering Method and Low Dimensional Feature Extraction Technique Based Indoor Positioning System

    PubMed Central

    Mo, Yun; Zhang, Zhongzhao; Meng, Weixiao; Ma, Lin; Wang, Yao

    2014-01-01

Indoor positioning systems based on the fingerprint method are widely used due to the large number of existing devices with a wide range of coverage. However, extensive positioning regions with a massive fingerprint database may cause high computational complexity and error margins; therefore, clustering methods are widely applied as a solution. Traditional clustering methods in positioning systems, though, can only measure the similarity of the Received Signal Strength without being concerned with the continuity of physical coordinates. Besides, outages of access points could result in asymmetric matching problems which severely affect the fine positioning procedure. To solve these issues, in this paper we propose a positioning system based on the Spatial Division Clustering (SDC) method for clustering the fingerprint dataset subject to physical distance constraints. With the Genetic Algorithm and Support Vector Machine techniques, SDC can achieve higher coarse positioning accuracy than traditional clustering algorithms. In terms of fine localization, based on the Kernel Principal Component Analysis method, the proposed positioning system outperforms its counterparts based on other feature extraction methods in low dimensionality. Apart from balancing the online matching computational burden, the new positioning system exhibits advantageous performance on radio map clustering, and also shows better robustness and adaptability in the asymmetric matching problem aspect. PMID:24451470

  7. Fast bi-directional prediction selection in H.264/MPEG-4 AVC temporal scalable video coding.

    PubMed

    Lin, Hung-Chih; Hang, Hsueh-Ming; Peng, Wen-Hsiao

    2011-12-01

In this paper, we propose a fast algorithm that efficiently selects the temporal prediction type for the dyadic hierarchical-B prediction structure in H.264/MPEG-4 temporal scalable video coding (SVC). We make use of the strong correlations in prediction type inheritance to eliminate the superfluous computations for the bi-directional (BI) prediction in the finer partitions, 16×8/8×16/8×8, by referring to the best temporal prediction type of 16×16. In addition, we carefully examine the relationship in motion bit-rate costs and distortions between the BI and the uni-directional temporal prediction types. As a result, we construct a set of adaptive thresholds to remove the unnecessary BI calculations. Moreover, for the block partitions smaller than 8×8, either the forward prediction (FW) or the backward prediction (BW) is skipped based upon the information of their 8×8 partitions. Hence, the proposed schemes can efficiently reduce the extensive computational burden in calculating the BI prediction. As compared to the JSVM 9.11 software, our method reduces the encoding time by 48% to 67% for a large variety of test videos over a wide range of coding bit-rates, with only a minor coding performance loss.

  8. Three-Dimensional Terahertz Coded-Aperture Imaging Based on Single Input Multiple Output Technology.

    PubMed

    Chen, Shuo; Luo, Chenggao; Deng, Bin; Wang, Hongqiang; Cheng, Yongqiang; Zhuang, Zhaowen

    2018-01-19

    As a promising radar imaging technique, terahertz coded-aperture imaging (TCAI) can achieve high-resolution, forward-looking, and staring imaging by producing spatiotemporal independent signals with coded apertures. In this paper, we propose a three-dimensional (3D) TCAI architecture based on single input multiple output (SIMO) technology, which can reduce the coding and sampling times sharply. The coded aperture applied in the proposed TCAI architecture loads either purposive or random phase modulation factor. In the transmitting process, the purposive phase modulation factor drives the terahertz beam to scan the divided 3D imaging cells. In the receiving process, the random phase modulation factor is adopted to modulate the terahertz wave to be spatiotemporally independent for high resolution. Considering human-scale targets, images of each 3D imaging cell are reconstructed one by one to decompose the global computational complexity, and then are synthesized together to obtain the complete high-resolution image. As for each imaging cell, the multi-resolution imaging method helps to reduce the computational burden on a large-scale reference-signal matrix. The experimental results demonstrate that the proposed architecture can achieve high-resolution imaging with much less time for 3D targets and has great potential in applications such as security screening, nondestructive detection, medical diagnosis, etc.

  9. Hybrid sentiment analysis utilizing multiple indicators to determine temporal shifts of opinion in OSNs

    NASA Astrophysics Data System (ADS)

    White, Joshua S.; Hall, Robert T.; Fields, Jeremy; White, Holly M.

    2016-05-01

    Utilization of traditional sentiment analysis for predicting the outcome of an event on a social network depends on: a precise understanding of which topics relate to the event, selective elimination of trends that do not fit, and in most cases, expert knowledge of the major players in the event. Sentiment analysis has traditionally taken one of two approaches to derive a quantitative value from qualitative text: the "bag of words" model, and the use of natural language processing (NLP) to attempt a real understanding of the text. These methods yield very similar accuracy results, with the exception of some special use cases, but both impose a large computational burden on the analytic system. Newer approaches share this problem. No matter which approach is used, SA typically caps out around 80% accuracy. However, accuracy reflects only polarity and degree of polarity, nothing else. In this paper we present a method for hybridizing traditional SA methods to better determine shifts in opinion over time within social networks. This hybridization process involves augmenting traditional SA measurements with contextual understanding and knowledge about writers' demographics. Our goal is not only to improve accuracy, but to do so with minimal impact on computation requirements.
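    The hybridization idea (augmenting a cheap polarity score with contextual and demographic weights) might be sketched as follows. The lexicon, the weights, and the function names are all invented for illustration and are not the authors' system.

    ```python
    # Tiny illustrative polarity lexicon (invented values).
    LEXICON = {"good": 1.0, "great": 2.0, "bad": -1.0, "terrible": -2.0}

    def bag_of_words_score(text):
        """Plain bag-of-words polarity: sum of per-token lexicon values."""
        return sum(LEXICON.get(tok, 0.0) for tok in text.lower().split())

    def hybrid_score(text, context_weight=1.0, demographic_weight=1.0):
        """Augment the raw polarity with contextual and demographic factors;
        here they are simple multiplicative weights standing in for the
        richer adjustments the paper proposes."""
        return bag_of_words_score(text) * context_weight * demographic_weight

    print(bag_of_words_score("great product bad support"))                 # 1.0
    print(hybrid_score("great product bad support", context_weight=0.5))   # 0.5
    ```

    The design point is that the augmentation is applied after the cheap base score, so the extra signals need not increase the per-document computational cost of the underlying SA pass.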

  10. A kriging metamodel-assisted robust optimization method based on a reverse model

    NASA Astrophysics Data System (ADS)

    Zhou, Hui; Zhou, Qi; Liu, Congwei; Zhou, Taotao

    2018-02-01

    The goal of robust optimization methods is to obtain a solution that is both optimal and relatively insensitive to uncertainty factors. Most existing robust optimization approaches use outer-inner nested optimization structures, which require a large amount of computational effort because the robustness of each candidate solution delivered from the outer level must be evaluated in the inner level. In this article, a kriging metamodel-assisted robust optimization method based on a reverse model (K-RMRO) is first proposed, in which the nested optimization structure is reduced to a single-loop optimization structure to ease the computational burden. Because it ignores the interpolation uncertainties from kriging, K-RMRO may yield non-robust optima. Hence, an improved kriging-assisted robust optimization method based on a reverse model (IK-RMRO) is presented to take the interpolation uncertainty of the kriging metamodel into consideration. In IK-RMRO, an objective switching criterion is introduced to determine whether the inner-level robust optimization or the kriging metamodel replacement should be used to evaluate the robustness of design alternatives. The proposed criterion is developed according to whether or not the robust status of the individual can be changed by the interpolation uncertainties from the kriging metamodel. Numerical and engineering cases are used to demonstrate the applicability and efficiency of the proposed approach.
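    A minimal sketch of the surrogate idea: a kriging (Gaussian-process) mean prediction stands in for the expensive model when screening the robustness of a candidate. The kernel, the toy test function, and the uncertainty box are illustrative assumptions, not the K-RMRO/IK-RMRO formulation.

    ```python
    import numpy as np

    def rbf(a, b, length=0.3):
        """Squared-exponential kernel between two 1-D point sets."""
        return np.exp(-((a[:, None] - b[None, :]) ** 2) / (2 * length**2))

    def true_f(x):                       # expensive model (toy stand-in)
        return np.sin(5 * x) + 0.5 * x

    # Fit the kriging surrogate once from a handful of expensive evaluations.
    x_train = np.linspace(0.0, 1.0, 12)
    y_train = true_f(x_train)
    K = rbf(x_train, x_train) + 1e-8 * np.eye(12)   # nugget for stability
    alpha = np.linalg.solve(K, y_train)

    def predict(x):
        """Kriging mean prediction at new points."""
        return rbf(x, x_train) @ alpha

    def robustness(x0, delta=0.05, n=11):
        """Worst-case predicted response in an uncertainty box around x0;
        this cheap sweep replaces an inner-loop optimization."""
        xs = np.linspace(x0 - delta, x0 + delta, n)
        return predict(xs).max()

    print(robustness(0.5))
    ```

    Every robustness query costs only a few kernel evaluations instead of repeated runs of the true model, which is exactly why the nested structure can be collapsed.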

  11. Microphysics-based black carbon aging in a global CTM: constraints from HIPPO observations and implications for global black carbon budget

    NASA Astrophysics Data System (ADS)

    He, Cenlin; Li, Qinbin; Liou, Kuo-Nan; Qi, Ling; Tao, Shu; Schwarz, Joshua P.

    2016-03-01

    We develop and examine a microphysics-based black carbon (BC) aerosol aging scheme that accounts for condensation, coagulation, and heterogeneous chemical oxidation processes in a global 3-D chemical transport model (GEOS-Chem) by interpreting the BC measurements from the HIAPER Pole-to-Pole Observations (HIPPO, 2009-2011) with the model. We convert aerosol mass in the model to number concentration by assuming lognormal aerosol size distributions and compute the microphysical BC aging rate (excluding chemical oxidation aging) explicitly from the condensation of soluble materials onto hydrophobic BC and the coagulation between hydrophobic BC and preexisting soluble particles. Chemical oxidation aging is tested in a sensitivity simulation. The microphysical aging rate is ~4 times higher in the lower troposphere over source regions than that from a fixed aging scheme with an e-folding time of 1.2 days. The higher aging rate reflects the large emissions of sulfate-nitrate and secondary organic aerosol precursors, and hence faster BC aging through condensation and coagulation. In contrast, the microphysical aging is more than 5-fold slower than the fixed aging in remote regions, where condensation and coagulation are weak. Globally, BC microphysical aging is dominated by condensation, while the contribution of coagulation is largest over eastern China, India, and central Africa. The fixed aging scheme overestimates HIPPO BC throughout the troposphere by a factor of 6 on average. The microphysical scheme reduces this discrepancy by a factor of ~3, particularly in the middle and upper troposphere. It also leads to a 3-fold reduction in model bias in the latitudinal BC column burden averaged along the HIPPO flight tracks, with the largest improvements in the tropics. The resulting global annual mean BC lifetime is 4.2 days and BC burden is 0.25 mg m-2, with 7.3 % of the burden at high altitudes (above 5 km). Wet scavenging accounts for 80.3 % of global BC deposition. We find that, in source regions, the microphysical aging rate is insensitive to aerosol size distribution, condensation threshold, and chemical oxidation aging, while the opposite holds in remote regions, where the aging rate is orders of magnitude smaller. As a result, global BC burden and lifetime show little sensitivity (< 5 % change) to these three factors.
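    The contrast between the fixed e-folding scheme and the faster or slower microphysical aging can be illustrated with first-order conversion kinetics. The ~4x and >5-fold multipliers are taken from the abstract; everything else (times, the single-rate treatment) is a toy assumption, not a GEOS-Chem calculation.

    ```python
    import math

    def hydrophobic_fraction(t_days, k_per_day):
        """First-order conversion: fraction of BC still hydrophobic at time t."""
        return math.exp(-k_per_day * t_days)

    k_fixed = 1.0 / 1.2             # fixed scheme: e-folding time of 1.2 days
    k_micro_source = 4.0 * k_fixed  # ~4x faster over source regions (abstract)
    k_micro_remote = k_fixed / 5.0  # >5-fold slower in remote regions (abstract)

    for label, k in [("fixed", k_fixed),
                     ("source", k_micro_source),
                     ("remote", k_micro_remote)]:
        print(f"{label}: {hydrophobic_fraction(1.0, k):.3f} hydrophobic after 1 day")
    ```

    The spread (roughly 0.04 versus 0.85 of BC still hydrophobic after one day) shows why a single global e-folding time cannot match both source and remote regions at once.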

  12. Microphysics-based black carbon aging in a global CTM: constraints from HIPPO observations and implications for global black carbon budget

    NASA Astrophysics Data System (ADS)

    He, C.; Li, Q.; Liou, K. N.; Qi, L.; Tao, S.; Schwarz, J. P.

    2015-11-01

    We develop and examine a microphysics-based black carbon (BC) aerosol aging scheme that accounts for condensation and coagulation processes in a global 3-D chemical transport model (GEOS-Chem) by interpreting the BC measurements from the HIAPER Pole-to-Pole Observations (HIPPO, 2009-2011) with the model. We convert aerosol mass in the model to number concentration by assuming lognormal aerosol size distributions and compute the microphysical BC aging rate explicitly from the condensation of soluble materials onto hydrophobic BC and the coagulation between hydrophobic BC and preexisting soluble particles. The resulting aging rate is ∼ 4 times higher in the lower troposphere over source regions than that from a fixed aging scheme with an e-folding time of 1.2 days. The higher aging rate reflects the large emissions of sulfate-nitrate and secondary organic aerosol precursors, and hence faster BC aging through condensation and coagulation. In contrast, the microphysical aging is more than fivefold slower than the fixed aging in remote regions, where condensation and coagulation are weak. Globally, BC microphysical aging is dominated by condensation, while the contribution of coagulation is largest over East China, India, and Central Africa. The fixed aging scheme overestimates HIPPO BC throughout the troposphere by a factor of 6 on average. The microphysical scheme reduces this discrepancy by a factor of ∼ 3, particularly in the middle and upper troposphere. It also leads to a threefold reduction in model bias in the latitudinal BC column burden averaged along the HIPPO flight tracks, with the largest improvements in the tropics. The resulting global annual mean BC lifetime is 4.2 days and BC burden is 0.25 mg m-2, with 7.3 % of the burden at high altitudes (above 5 km). Wet scavenging accounts for 80.3 % of global BC deposition. We find that, in source regions, the microphysical aging rate is insensitive to aerosol size distribution, condensation threshold, and chemical oxidation aging, while the opposite holds in remote regions, where the aging rate is orders of magnitude smaller. As a result, global BC burden and lifetime show little sensitivity (< 5 % change) to these three factors.

  13. Growing Epidemic of Coronary Heart Disease in Low- and Middle-Income Countries

    PubMed Central

    Gaziano, Thomas A.; Bitton, Asaf; Anand, Shuchi; Abrahams-Gessel, Shafika; Murphy, Adrianna

    2010-01-01

    Coronary heart disease (CHD) is the single largest cause of death in developed countries and is one of the leading causes of disease burden in developing countries. In 2001, there were 7.3 million deaths due to CHD worldwide. Three-fourths of global deaths due to CHD occurred in low- and middle-income countries. The rapid rise in CHD burden in most low- and middle-income countries is due to socio-economic changes, increases in life span, and the acquisition of lifestyle-related risk factors. The CHD death rate, however, varies dramatically across the developing countries. The varying incidence, prevalence, and mortality rates reflect the different levels of risk factors, other competing causes of death, availability of resources to combat CVD, and the stage of the epidemiologic transition in which each country or region finds itself. The economic burden of CHD is equally large, but solutions exist to manage this growing burden. PMID:20109979

  14. Use of vaccines as probes to define disease burden

    PubMed Central

    Feikin, Daniel R; Scott, J Anthony G; Gessner, Bradford D

    2015-01-01

    Vaccine probe studies have emerged in the past 15 years as a useful way to characterise disease. By contrast, traditional studies of vaccines focus on defining vaccine effectiveness or efficacy. The underlying basis for the vaccine probe approach is that the difference in disease burden between vaccinated and unvaccinated individuals can be ascribed to the vaccine-specific pathogen. Vaccine probe studies can increase understanding of a vaccine’s public health value. For instance, even when a vaccine has a seemingly low efficacy, a high baseline disease incidence can lead to a large vaccine-preventable disease burden, justifying population-based vaccine introduction. So far, vaccines have been used as probes to characterise disease syndromes caused by Haemophilus influenzae type b, pneumococcus, rotavirus, and early infant influenza. However, vaccine probe studies have enormous potential and could be used more widely in epidemiology, for example, to define the vaccine-preventable burden of malaria, typhoid, paediatric influenza, and dengue, and to identify causal interactions between different pathogens. PMID:24553294

  15. Blood pressure and the global burden of cardiovascular disease.

    PubMed

    Rodgers, A; MacMahon, S

    1999-01-01

    Cardiovascular disease is responsible for a large and increasing proportion of death and disability worldwide, and half of this burden occurs in Asia. This study assessed the possible effects of population-wide (2% lower DBP for all) and targeted (7% lower DBP for those with usual DBP ≥95 mmHg) blood pressure interventions in Asia, using data from surveys of blood pressure levels, the Global Burden of Disease Project, Eastern Asian cohort studies, and randomised trials of blood pressure lowering. Overall, each of the two interventions would be expected to avert about one million deaths per year throughout Asia in 2020, and these benefits would be approximately additive. About half a million deaths might be averted annually by each intervention in China alone, with about four-fifths of this benefit due to averted stroke. The relative benefits of these two strategies are similar to estimates made for US and UK populations. However, the absolute benefits are many times greater, owing to the size of the predicted CVD burden in Asia.

  16. The quality-of-life burden of knee osteoarthritis in New Zealand adults: A model-based evaluation

    PubMed Central

    Wilson, Ross; Hansen, Paul; Losina, Elena

    2017-01-01

    Background Knee osteoarthritis is a leading global cause of health-related quality of life loss. The aim of this project was to quantify health losses arising from knee osteoarthritis in New Zealand (NZ) in terms of quality-adjusted life years (QALYs) lost. Methods The Osteoarthritis Policy Model (OAPol), a validated Monte Carlo computer simulation model, was used to estimate QALYs lost due to knee osteoarthritis in the NZ adult population aged 40–84 over their lifetimes from the base year of 2006 until death. Data were from the NZ Health Survey, NZ Burden of Diseases, NZ Census, and relevant literature. QALYs were derived from NZ EQ-5D value set 2. Sensitivity to health-state valuation and to disease and pain prevalence was assessed in secondary analyses. Results Based on NZ EQ-5D health state valuations, mean health losses due to knee osteoarthritis over people’s lifetimes in NZ are 3.44 QALYs per person, corresponding to 467,240 QALYs across the adult population. Average estimated per-person QALY losses are higher for non-Māori females (3.55) than Māori females (3.38), and higher for non-Māori males (3.34) than Māori males (2.60). The proportion of QALYs lost out of the total quality-adjusted life expectancy for those without knee osteoarthritis is similar across all subgroups, ranging from 20 to 23 percent. Conclusions At both the individual and population levels, knee osteoarthritis is responsible for large lifetime QALY losses. QALY losses are higher for females than males due to greater prevalence of knee osteoarthritis and higher life expectancy, and lower for Māori than non-Māori due to lower life expectancy. Large health gains are potentially realisable from public health and policy measures aimed at decreasing the incidence, progression, pain, and disability of osteoarthritis. PMID:29065119

  17. Cough Frequency During Treatment Associated With Baseline Cavitary Volume and Proximity to the Airway in Pulmonary TB.

    PubMed

    Proaño, Alvaro; Bui, David P; López, José W; Vu, Nancy M; Bravard, Marjory A; Lee, Gwenyth O; Tracey, Brian H; Xu, Ziyue; Comina, Germán; Ticona, Eduardo; Mollura, Daniel J; Friedland, Jon S; Moore, David A J; Evans, Carlton A; Caligiuri, Philip; Gilman, Robert H

    2018-06-01

    Cough frequency and duration constitute a biomarker that can be used in low-resource settings without the need for laboratory culture and that has been associated with transmission and treatment response. Radiologic characteristics associated with increased cough frequency may be important in understanding transmission. The relationship between cough frequency and cavitary lung disease has not been studied. We analyzed data from 41 adults who were HIV negative and had culture-confirmed, drug-susceptible pulmonary TB throughout treatment. Cough recordings were based on the Cayetano Cough Monitor, and sputum samples were evaluated using microscopic observation drug susceptibility broth culture; among culture-positive samples, bacillary burden was assessed by means of time to positivity. CT scans were analyzed by a US-board-certified radiologist and a computer-automated algorithm. The algorithm evaluated cavity volume and cavitary proximity to the airway. CT scans were obtained within 1 month of treatment initiation. We related small (≤7 mL) versus large (>7 mL) cavities, and cavities located closer to (≤10 mm) versus farther from (>10 mm) the airway, to cough frequency and cough cessation through treatment day 60. Cough frequency during treatment was twofold higher in participants with large cavity volumes (rate ratio [RR], 1.98; P = .01) and cavities located closer to the airway (RR, 2.44; P = .001). Comparably, cough ceased three times faster in participants with smaller cavities (adjusted hazard ratio [HR], 2.89; P = .06) and those farther from the airway (adjusted HR, 3.61; P = .02). Similar results were found for bacillary burden and culture conversion during treatment. Cough frequency during treatment is greater and lasts longer in patients with larger cavities, especially those closer to the airway. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.

  18. Wireless Cloud Computing on Guided Missile Destroyers: A Business Case Analysis

    DTIC Science & Technology

    2013-06-01


  19. 76 FR 23854 - Reclassification of Motorcycles (Two and Three Wheeled Vehicles) in the Guide to Reporting...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-28

    ... and the unintended consequences of misclassification. Harley Davidson Motor Company (HDMC) stated that... concerns about the administrative, logistical and financial burdens of providing information based on the... estimated that the cost of updating their computers to process the information included in the new guidance...

  20. C-SWAT: The Soil and Water Assessment Tool with consolidated input files in alleviating computational burden of recursive simulations

    USDA-ARS?s Scientific Manuscript database

    The temptation to include more model parameters and high-resolution input data, together with the availability of powerful optimization and uncertainty analysis algorithms, has significantly increased the complexity of hydrologic and water quality modeling. However, the ability to take advantage of sophist...

  1. Automated Network Mapping and Topology Verification

    DTIC Science & Technology

    2016-06-01

    collection of information includes amplifying data about the networked devices such as hardware details, logical addressing schemes, operating ... The current military reliance on computer networks for operational missions and administrative duties makes network

  2. Activities and Reflection for Influencing Beliefs about Learning with Smartphones

    ERIC Educational Resources Information Center

    Cochrane, Robert

    2015-01-01

    English education in Japan faces numerous challenges, including an English as a Foreign Language (EFL) context, mandatory English classes, and an exam-oriented education system. Computer technology and the almost universal possession of smartphones can ease the burden of learning, but only if these tools are used effectively. Japanese university…

  3. 77 FR 61470 - Agency Information Collection Activities: Request for Comments for a New Information Collection

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-09

    ... information; and (4) ways that the burden could be minimized, including the use of computer technology... electronic technology, without reducing the quality of the collected information. All comments should include... reporting to FHWA on the projects, use of Recovery Act funds, and jobs supported. States and FLMA that...

  4. The Effects of Integrating Social Learning Environment with Online Learning

    ERIC Educational Resources Information Center

    Raspopovic, Miroslava; Cvetanovic, Svetlana; Medan, Ivana; Ljubojevic, Danijela

    2017-01-01

    The aim of this paper is to present the learning and teaching styles using the Social Learning Environment (SLE), which was developed based on the computer supported collaborative learning approach. To avoid burdening learners with multiple platforms and tools, SLE was designed and developed in order to integrate existing systems, institutional…

  5. Modeling Booklet Effects for Nonequivalent Group Designs in Large-Scale Assessment

    ERIC Educational Resources Information Center

    Hecht, Martin; Weirich, Sebastian; Siegle, Thilo; Frey, Andreas

    2015-01-01

    Multiple matrix designs are commonly used in large-scale assessments to distribute test items to students. These designs comprise several booklets, each containing a subset of the complete item pool. Besides reducing the test burden of individual students, using various booklets allows aligning the difficulty of the presented items to the assumed…

  6. SAMSAN- MODERN NUMERICAL METHODS FOR CLASSICAL SAMPLED SYSTEM ANALYSIS

    NASA Technical Reports Server (NTRS)

    Frisch, H. P.

    1994-01-01

    SAMSAN was developed to aid the control system analyst by providing a self-consistent set of computer algorithms that support large-order control system design and evaluation studies, with an emphasis placed on sampled system analysis. Control system analysts have access to a vast array of published algorithms to solve an equally large spectrum of controls-related computational problems. The analyst usually spends considerable time and effort bringing these published algorithms to an integrated operational status and often finds them less general than desired. SAMSAN reduces the burden on the analyst by providing a set of algorithms that have been well tested and documented, and that can be readily integrated for solving control system problems. Algorithm selection for SAMSAN has been biased toward numerical accuracy for large-order systems, with computational speed and portability being considered important but not paramount. In addition to containing relevant subroutines from EISPACK for eigen-analysis and from LINPACK for the solution of linear systems and related problems, SAMSAN contains the following not so generally available capabilities: 1) reduction of a real non-symmetric matrix to block diagonal form via a real similarity transformation matrix which is well conditioned with respect to inversion, 2) solution of the generalized eigenvalue problem with balancing and grading, 3) computation of all zeros of the determinant of a matrix of polynomials, 4) matrix exponentiation and the evaluation of integrals involving the matrix exponential, with an option to first block diagonalize, 5) root locus and frequency response for single-variable transfer functions in the S, Z, and W domains, 6) several methods of computing zeros for linear systems, and 7) the ability to generate documentation "on demand". All matrix operations in the SAMSAN algorithms assume non-symmetric matrices with real double precision elements. There is no fixed size limit on any matrix in any SAMSAN algorithm; however, it is generally agreed by experienced users, and in the numerical error analysis literature, that computation with non-symmetric matrices of order greater than about 200 should be avoided or treated with extreme care. SAMSAN attempts to support the needs of application-oriented analysis by providing: 1) a methodology with unlimited growth potential, 2) a methodology to ensure that associated documentation is current and available "on demand", 3) a foundation of basic computational algorithms that most controls analysis procedures are based upon, 4) a set of check-out and evaluation programs which demonstrate usage of the algorithms on a series of problems structured to expose the limits of each algorithm's applicability, and 5) capabilities which support both a priori and a posteriori error analysis for the computational algorithms provided. The SAMSAN algorithms are coded in FORTRAN 77 for batch or interactive execution and have been implemented on a DEC VAX computer under VMS 4.7. An effort was made to assure that the FORTRAN source code was portable, and thus SAMSAN may be adaptable to other machine environments. The documentation is included on the distribution tape or can be purchased separately at the price below. SAMSAN version 2.0 was developed in 1982 and updated to version 3.0 in 1988.
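    Capability 4 above, evaluating integrals involving the matrix exponential, is commonly computed with the augmented-matrix identity exp([[A, B], [0, 0]] T) = [[e^{AT}, (∫_0^T e^{As} ds) B], [0, I]]. The sketch below uses SciPy rather than SAMSAN's FORTRAN routines, with toy matrices chosen only for illustration.

    ```python
    import numpy as np
    from scipy.linalg import expm

    A = np.array([[0.0, 1.0],
                  [-2.0, -3.0]])    # stable toy system (eigenvalues -1, -2)
    B = np.array([[0.0],
                  [1.0]])
    T = 0.1

    # Build the augmented matrix [[A, B], [0, 0]] and exponentiate once.
    n, m = A.shape[0], B.shape[1]
    M = np.zeros((n + m, n + m))
    M[:n, :n] = A
    M[:n, n:] = B
    Phi = expm(M * T)

    Ad = Phi[:n, :n]   # state-transition matrix e^{AT}
    Bd = Phi[:n, n:]   # integral term (int_0^T e^{As} ds) @ B

    print(np.round(Ad, 4))
    print(np.round(Bd, 4))
    ```

    This is the standard trick for discretizing a continuous-time LTI system (Ad, Bd are the zero-order-hold discrete matrices), and it avoids quadrature entirely.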

  7. Assessing self-care and social function using a computer adaptive testing version of the Pediatric Evaluation of Disability Inventory (accepted for publication, Archives of Physical Medicine and Rehabilitation)

    PubMed Central

    Coster, Wendy J.; Haley, Stephen M.; Ni, Pengsheng; Dumas, Helene M.; Fragala-Pinkham, Maria A.

    2009-01-01

    Objective To examine score agreement, validity, precision, and response burden of a prototype computer adaptive testing (CAT) version of the Self-Care and Social Function scales of the Pediatric Evaluation of Disability Inventory (PEDI) compared to the full-length version of these scales. Design Computer simulation analysis of cross-sectional and longitudinal retrospective data; cross-sectional prospective study. Settings Pediatric rehabilitation hospital, including inpatient acute rehabilitation, day school program, outpatient clinics; community-based day care, preschool, and children’s homes. Participants Four hundred sixty-nine children with disabilities and 412 children with no disabilities (analytic sample); 38 children with disabilities and 35 children without disabilities (cross-validation sample). Interventions Not applicable. Main Outcome Measures Summary scores from prototype CAT applications of each scale using 15-, 10-, and 5-item stopping rules; scores from the full-length Self-Care and Social Function scales; time (in seconds) to complete assessments and respondent ratings of burden. Results Scores from both computer simulations and field administration of the prototype CATs were highly consistent with scores from full-length administration (all r values between .94 and .99). Using computer simulation of retrospective data, discriminant validity and sensitivity to change of the CATs closely approximated those of the full-length scales, especially when the 15- and 10-item stopping rules were applied. In the cross-validation study the time to administer both CATs was 4 minutes, compared to over 16 minutes to complete the full-length scales. Conclusions Self-Care and Social Function score estimates from CAT administration are highly comparable to those obtained from full-length scale administration, with small losses in validity and precision and substantial decreases in administration time. PMID:18373991
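    A schematic CAT loop may clarify how a fixed-count stopping rule (the 15-, 10-, and 5-item rules above) works: administer the item whose difficulty is nearest the current ability estimate, update the estimate, and stop after the allotted items. The item bank, the Rasch-style update, and all numbers are invented for illustration; this is not the PEDI CAT algorithm.

    ```python
    import math

    def rasch_p(theta, b):
        """Probability of a positive response under the Rasch model."""
        return 1.0 / (1.0 + math.exp(-(theta - b)))

    def run_cat(item_bank, respond, max_items=10, step=0.5):
        """Run a toy CAT: item_bank maps item id -> difficulty, respond(item, b)
        returns True for a positive response; stop after max_items."""
        theta, remaining = 0.0, dict(item_bank)
        for _ in range(max_items):
            # Most informative Rasch item = difficulty nearest current theta.
            item = min(remaining, key=lambda i: abs(remaining[i] - theta))
            b = remaining.pop(item)
            theta += step if respond(item, b) else -step  # crude update rule
            step *= 0.8                                   # shrink the step
        return theta

    bank = {f"item{i}": b for i, b in enumerate([-2, -1, -0.5, 0, 0.5, 1, 2, 3])}
    true_theta = 1.0
    est = run_cat(bank, lambda item, b: b < true_theta, max_items=5)
    print(round(est, 2))  # -> 0.63
    ```

    Even this crude 5-item loop homes in near the true ability, which is the intuition behind the paper's finding that short stopping rules lose little precision.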

  8. Self-reported financial burden of cancer care and its effect on physical and mental health-related quality of life among US cancer survivors.

    PubMed

    Kale, Hrishikesh P; Carroll, Norman V

    2016-04-15

    Cancer-related financial burden has been linked to cancer survivors (CS) forgoing/delaying medical care, skipping follow-up visits, and discontinuing medications. To the authors' knowledge, little is known regarding the effect of financial burden on the health-related quality of life of CS. The authors analyzed 2011 Medical Expenditure Panel Survey data. Financial burden was present if one of the following problems was reported: borrowed money/declared bankruptcy, worried about paying large medical bills, unable to cover the cost of medical care visits, or other financial sacrifices. The following outcomes were evaluated: Physical Component Score (PCS) and Mental Component Score (MCS) of the 12-Item Short-Form Health Survey (SF-12), depressed mood, psychological distress, and worry related to cancer recurrence. The authors also assessed the effect of the number of financial problems on these outcomes. Of the 19.6 million CS analyzed, 28.7% reported financial burden. Among them, the average PCS (42.3 vs 44.9) and MCS (48.1 vs 52.1) were lower for those with financial burden versus those without. In adjusted analyses, CS with financial burden had significantly lower PCS (β = -2.45), and MCS (β = -3.05), had increased odds of depressed mood (odds ratio, 1.95), and were more likely to worry about cancer recurrence (odds ratio, 3.54). Survivors reporting ≥ 3 financial problems reported statistically significant and clinically meaningful differences (≥3 points) in the mean PCS and MCS compared with survivors without financial problems. Cancer-related financial burden was associated with lower health-related quality of life, increased risk of depressed mood, and a higher frequency of worrying about cancer recurrence among CS. © 2015 American Cancer Society.

  9. A systematic review of the direct economic burden of type 2 diabetes in china.

    PubMed

    Hu, Huimei; Sawhney, Monika; Shi, Lizheng; Duan, Shengnan; Yu, Yunxian; Wu, Zhihong; Qiu, Guixing; Dong, Hengjin

    2015-03-01

    Type 2 diabetes is associated with acute and chronic complications and poses a large economic, social, and medical burden on patients and their families as well as society. This study aims to evaluate the direct economic burden of type 2 diabetes in China. A systematic review on cost of illness, health care costs, direct service costs, drug costs, and health expenditures in relation to type 2 diabetes was conducted up to 2014 using databases such as PubMed, EBSCO, Elsevier ScienceDirect, and Web of Science, and a series of Chinese databases, including Wanfang Data, China National Knowledge Infrastructure (CNKI), and the China Science and Technology Journal Database. Factors influencing hospitalization and drug fees were also identified. The outcomes of interest were: (1) estimation of the direct economic burden, including hospitalization and outpatient costs, of type 2 diabetes patients in China; and (2) evaluation of the factors influencing the direct economic burden. Articles focusing only on the cost-effectiveness analysis of diabetes drugs were excluded. The direct economic burden of type 2 diabetes has increased over time in China; in 2008, the direct medical cost reached $9.1 billion, and both outpatient and inpatient costs have increased. Income level, type of medical insurance, the level of hospital care, and the type and number of complications are the primary factors influencing diabetes-related hospitalization costs. Compared to urban areas, the direct non-medical cost of type 2 diabetes in rural areas is significantly greater. The direct economic burden of type 2 diabetes poses a significant challenge to China, and measures need to be taken to reduce the prevalence and severity of diabetes and the associated hospitalization costs.

  10. Trajectories of caregiver burden in families of adult cystic fibrosis patients.

    PubMed

    Wojtaszczyk, Ann; Glajchen, Myra; Portenoy, Russell K; Berdella, Maria; Walker, Patricia; Barrett, Malcolm; Chen, Jack; Plachta, Amy; Balzano, Julie; Fresenius, Ashley; Wilder, Kenya; Langfelder-Schwind, Elinor; Dhingra, Lara

    2017-10-17

    Little is known about the experience of family caregivers of adults with cystic fibrosis (CF). This information is important for the identification of caregivers at risk for burden. This was a longitudinal analysis of survey data obtained from caregivers of adult CF patients participating in an early intervention palliative care trial. Caregivers completed the validated Brief Assessment Scale for Caregivers (BASC) repeatedly over a 28-month period. Mixed-effects modeling evaluated multivariate associations with positive and negative caregiver perceptions over time. Of the 54 caregivers, 47.9% were spouses. The mean age was 50.9 years (SD = 13.2); 72.2% were women; 75.9% were married; and 63.0% were employed. At baseline, the BASC revealed large variations in positive and negative perceptions of caregiving. Although average scores over time were unchanging, variation was greater across caregivers than within caregivers (0.49 vs. 0.27, respectively). At baseline, the positive impact of caregiving in the sample was higher than the negative impact. Multivariate analysis revealed that patients' baseline pulmonary function and their full-time employment status predicted caregiver burden over time. Caregivers of CF patients varied in their positive and negative caregiving experiences, although burden levels in individual caregivers were stable over time. When the disease was advanced, caregivers of CF patients experienced more overall burden but also more positive impact. This suggests that the role of caregivers may become more meaningful as disease severity worsens. In addition, full-time patient employment was associated with lower caregiver burden regardless of disease severity. This suggests that burden in CF caregivers may be predicted by financial strain or benefits conferred by patient employment. These associations require further investigation to determine whether highly burdened caregivers can be identified and assisted using tailored interventions.

  11. Factors that lessen the burden of treatment in complex patients with chronic conditions: a qualitative study.

    PubMed

    Ridgeway, Jennifer L; Egginton, Jason S; Tiedje, Kristina; Linzer, Mark; Boehm, Deborah; Poplau, Sara; de Oliveira, Djenane Ramalho; Odell, Laura; Montori, Victor M; Eton, David T

    2014-01-01

    Patients with multiple chronic conditions (multimorbidity) often require ongoing treatment and complex self-care. This workload and its impact on patient functioning and well-being are, together, known as treatment burden. This study reports on factors that patients with multimorbidity draw on to lessen perceptions of treatment burden. Interviews (n=50) and focus groups (n=4 groups, five to eight participants per group) were conducted with patients receiving care in a large academic medical center or an urban safety-net hospital. Interview data were analyzed using qualitative framework analysis methods, and themes and subthemes were used to identify factors that mitigate burden. Focus groups were held to confirm these findings and clarify any new issues. This study was part of a larger program to develop a patient-reported measure of treatment burden. Five major themes emerged from the interview data. These included: 1) problem-focused strategies, like routinizing self-care, enlisting support of others, planning for the future, and using technology; 2) emotion-focused coping strategies, like maintaining a positive attitude, focusing on other life priorities, and spirituality/faith; 3) questioning the notion of treatment burden as a function of adapting to self-care and comparing oneself to others; 4) social support (informational, tangible, and emotional assistance); and 5) positive aspects of health care, like coordination of care and beneficial relationships with providers. Additional subthemes arising from focus groups included preserving autonomy/independence and being proactive with providers. Patients attempt to lessen the experience of treatment burden using a variety of personal, social, and health care resources. Assessing these factors in tandem with patient perceptions of treatment burden can provide a more complete picture of how patients fit complex self-care into their daily lives.

  12. Hsp-27 levels and thrombus burden relate to clinical outcomes in patients with ST-segment elevation myocardial infarction

    PubMed Central

    Tian, Maozhou; Zhu, Lingmin; Lin, Hongyang; Lin, Qiaoyan; Huang, Peng; Yu, Xiao; Jing, Yanyan

    2017-01-01

    High thrombus burden, subsequent distal embolization, and myocardial no-reflow remain large obstacles that may negate the benefits of urgent coronary revascularization in patients with ST-segment elevation myocardial infarction (STEMI). However, the biological function of Hsp-27 and its clinical association with thrombus burden and clinical outcomes in patients with STEMI are not clear. Consecutive patients (n = 146) with STEMI undergoing primary percutaneous coronary intervention (pPCI) within 12 hours from the onset of symptoms were enrolled in this prospective study in the Affiliated Yantai Yuhuangding Hospital of Qingdao University, Yantai, Shandong, P.R. China. Patients were divided into low thrombus burden and high thrombus burden groups. The present study demonstrated that patients with high thrombus burden had higher plasma Hsp-27 levels ([32.0 ± 8.6 vs. 58.0 ± 12.3] ng/mL, P < 0.001). The median value of Hsp-27 levels in all patients with STEMI was 45 ng/mL. Using receiver operating characteristic (ROC) curve analysis, plasma Hsp-27 levels were of significant diagnostic value for high thrombus burden (AUC, 0.847; 95% CI, 0.775–0.918; P < 0.01). Multivariate Cox regression analysis demonstrated that Hsp-27 > 45 ng/mL (HR 2.801, 95% CI 1.296–4.789, P = 0.001) was positively correlated with the incidence of major adverse cardiovascular events (MACE). Kaplan-Meier survival analysis demonstrated that MACE-free survival at 180-day follow-up was significantly lower in patients with Hsp-27 > 45 ng/mL (log rank = 10.28, P < 0.001). Our data demonstrate that plasma Hsp-27 was positively correlated with high thrombus burden and the incidence of MACE in patients with STEMI who underwent pPCI. PMID:29088740

  13. Discrete square root filtering - A survey of current techniques.

    NASA Technical Reports Server (NTRS)

    Kaminskii, P. G.; Bryson, A. E., Jr.; Schmidt, S. F.

    1971-01-01

    Current techniques in square root filtering are surveyed and related by applying a duality association. Four efficient square root implementations are suggested, and compared with three common conventional implementations in terms of computational complexity and precision. It is shown that the square root computational burden should not exceed the conventional by more than 50% in most practical problems. An examination of numerical conditioning predicts that the square root approach can yield twice the effective precision of the conventional filter in ill-conditioned problems. This prediction is verified in two examples.
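The square-root idea the survey compares can be illustrated with Potter's measurement update, which propagates a square root S of the covariance (P = S Sᵀ) instead of P itself. Below is a minimal pure-Python sketch for a scalar measurement; the helper names and the toy numbers are ours, not the survey's:

```python
import math

# Potter's square-root measurement update for a scalar measurement
# z = h^T x + v, var(v) = r. The covariance is carried as S with
# P = S S^T, which is where the precision advantage comes from.

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

def potter_update(S, h, r):
    """Return the updated square root and the Kalman gain."""
    n = len(S)
    phi = [sum(S[i][j] * h[i] for i in range(n)) for j in range(n)]  # S^T h
    a = 1.0 / (sum(p * p for p in phi) + r)        # 1 / innovation variance
    gamma = a / (1.0 + math.sqrt(a * r))           # Potter's scalar
    s_phi = matvec(S, phi)                         # S phi
    K = [a * x for x in s_phi]                     # Kalman gain
    S_new = [[S[i][j] - gamma * s_phi[i] * phi[j] for j in range(n)]
             for i in range(n)]
    return S_new, K

# Tiny 2-state example: observe the first state.
S = [[2.0, 0.0], [0.5, 1.0]]    # P = S S^T
h = [1.0, 0.0]
r = 0.5
S_new, K = potter_update(S, h, r)
```

Because S carries roughly half the dynamic range of P, the recursion retains about twice the effective precision of the conventional covariance update in ill-conditioned problems, which is the conditioning advantage the survey quantifies.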

  14. State estimation for distributed systems with sensing delay

    NASA Astrophysics Data System (ADS)

    Alexander, Harold L.

    1991-08-01

    Control of complex systems such as remote robotic vehicles requires combining data from many sensors where the data may often be delayed by sensory processing requirements. The number and variety of sensors make it desirable to distribute the computational burden of sensing and estimation among multiple processors. Classic Kalman filters do not lend themselves to distributed implementations or delayed measurement data. The alternative Kalman filter designs presented in this paper are adapted for delays in sensor data generation and for distribution of computation for sensing and estimation over a set of networked processors.
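One standard way to accommodate delayed sensor data, in the spirit of (though not necessarily identical to) the filter designs discussed here, is to buffer recent filter states and measurements, rewind to the delayed sample's time slot, and refilter forward. A scalar random-walk sketch; the model and all names are illustrative:

```python
# Scalar Kalman filter with a history buffer. A measurement that arrives
# `steps_late` steps after its timestamp is folded in by rewinding to the
# stored state just before that slot and replaying the buffered data.

class ScalarKF:
    def __init__(self, x0, p0, q, r):
        self.x, self.p, self.q, self.r = x0, p0, q, r
        self.history = []              # (x, p, z) after each step

    def _step(self, z):
        self.p += self.q               # predict (random-walk dynamics)
        k = self.p / (self.p + self.r)
        self.x += k * (z - self.x)     # update
        self.p *= (1.0 - k)

    def filter(self, z):
        self._step(z)
        self.history.append((self.x, self.p, z))

    def filter_delayed(self, z, steps_late):
        """Rewind `steps_late` steps, insert z, replay newer measurements."""
        replay = [h[2] for h in self.history[-steps_late:]]
        anchor = self.history[-steps_late - 1]
        self.x, self.p = anchor[0], anchor[1]
        del self.history[-steps_late:]
        for zz in [z] + replay:
            self.filter(zz)

kf = ScalarKF(0.0, 1.0, 0.01, 1.0)
for z in (1.0, 1.2, 0.9):
    kf.filter(z)
kf.filter_delayed(1.1, steps_late=2)   # 1.1 belonged two steps back
```

After the replay, the estimate is identical to what an undelayed filter would have produced from the in-order sequence, at the cost of storing and re-running the buffered steps.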

  15. Disease Burden of 32 Infectious Diseases in the Netherlands, 2007-2011

    PubMed Central

    Bouwknegt, Martijn; Kretzschmar, Mirjam E.; Mangen, Marie-Josée J.; Wallinga, Jacco; de Melker, Hester E.

    2016-01-01

    Background Infectious disease burden estimates provided by a composite health measure give a balanced view of the true impact of a disease on a population, allowing the relative impact of diseases that differ in severity and mortality to be monitored over time. This article presents the first national disease burden estimates for a comprehensive set of 32 infectious diseases in the Netherlands. Methods and Findings The average annual disease burden was computed for the period 2007–2011 for selected infectious diseases in the Netherlands using the disability-adjusted life years (DALY) measure. The pathogen- and incidence-based approach was adopted to quantify the burden due to both morbidity and premature mortality associated with all short and long-term consequences of infection. Natural history models, disease progression probabilities, disability weights, and other parameters were adapted from previous research. Annual incidence was obtained from statutory notification and other surveillance systems, which was corrected for under-ascertainment and under-reporting. The highest average annual disease burden was estimated for invasive pneumococcal disease (9444 DALYs/year; 95% uncertainty interval [UI]: 8911–9961) and influenza (8670 DALYs/year; 95% UI: 8468–8874), which represents 16% and 15% of the total burden of all 32 diseases, respectively. The remaining 30 diseases ranked by number of DALYs/year from high to low were: HIV infection, legionellosis, toxoplasmosis, chlamydia, campylobacteriosis, pertussis, tuberculosis, hepatitis C infection, Q fever, norovirus infection, salmonellosis, gonorrhoea, invasive meningococcal disease, hepatitis B infection, invasive Haemophilus influenzae infection, shigellosis, listeriosis, giardiasis, hepatitis A infection, infection with STEC O157, measles, cryptosporidiosis, syphilis, rabies, variant Creutzfeldt-Jakob disease, tetanus, mumps, rubella, diphtheria, and poliomyelitis. 
The very low burden for the latter five diseases can be attributed to the National Immunisation Programme. The average disease burden per individual varied from 0.2 (95% UI: 0.1–0.4) DALYs per 100 infections for giardiasis, to 5081 and 3581 (95% UI: 3540–3611) DALYs per 100 infections for rabies and variant Creutzfeldt-Jakob disease, respectively. Conclusions For guiding and supporting public health policy decisions regarding the prioritisation of interventions and preventive measures, estimates of disease burden and the comparison of burden between diseases can be informative. Although the collection of disease-specific parameters and estimation of incidence is a process subject to continuous improvement, the current study established a baseline for assessing the impact of future public health initiatives. PMID:27097024
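The DALY accounting behind estimates like these combines years lived with disability (YLD) and years of life lost (YLL). A toy sketch of the arithmetic; the numbers are invented for illustration and are not from the Dutch study:

```python
# DALY = YLD + YLL for an incidence-based, pathogen-level estimate.

def daly(cases, deaths, disability_weight, duration_years, life_exp_at_death):
    yld = cases * disability_weight * duration_years   # years lived with disability
    yll = deaths * life_exp_at_death                   # years of life lost
    return yld + yll

# e.g. 10,000 mild cases lasting ~3 weeks each, plus 50 deaths at a mean
# residual life expectancy of 30 years:
burden = daly(cases=10_000, deaths=50, disability_weight=0.2,
              duration_years=0.05, life_exp_at_death=30.0)
```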

  16. Global burden of cancer attributable to high body-mass index in 2012: a population-based study

    PubMed Central

    Byrnes, Graham; Renehan, Andrew G; Stevens, Gretchen A; Ezzati, Majid; Ferlay, Jacques; Miranda, J. Jaime; Romieu, Isabelle; Dikshit, Rajesh; Forman, David; Soerjomataram, Isabelle

    2015-01-01

    Background Excess body mass index (BMI) is associated with increased risk of cancer. To inform public health policy and future research, we estimated the global burden of cancer attributable to excess BMI. Methods Population attributable fractions (PAFs) were derived using relative risks and BMI estimates in adults by age, sex and country. Assuming a 10-year lag period, PAFs were calculated using BMI estimates in 2002. GLOBOCAN 2012 was used to compute numbers of new cancer cases attributable to excess BMI. In an alternative scenario, we computed the proportion of potentially avoidable cancers assuming that populations maintained their BMI level observed in 1982. Secondary analyses were performed to test the model and estimate the impact of hormone replacement therapy (HRT) and smoking. Findings Worldwide, we estimated that 481,000 or 3·6% of all new cancer cases in 2012 were attributable to excess BMI. PAFs were greater in women than in men (5·4% versus 1·9%). The burden was concentrated in countries with very high and high human development index (HDI; PAF: 5·3% and 4·8%) compared with countries with moderate and low HDI (PAF: 1·6% and 1·0%). Corpus uteri, post-menopausal breast and colon cancers accounted for approximately two-thirds (64%) of excess BMI-attributable cancers. One fourth (~118,000) of all cases related to excess BMI in 2012 could be attributed to the rise in BMI since 1982. Interpretation These findings further underpin the need for a global effort to abate the rising trends in population-level excess weight. Assuming that the relationship between excess BMI and cancer is causal and the current pattern of population weight gain continues, this will likely augment the future burden of cancer. Funding World Cancer Research Fund, Marie Curie Fellowship, the National Health and Medical Research Council Australia and US NIH. PMID:25467404
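Population attributable fractions of this kind are commonly derived with Levin's formula from exposure prevalence and relative risk. The study's actual computation stratifies by age, sex, country and BMI category, so the single-level form below is only a generic sketch with invented numbers:

```python
# Levin's population attributable fraction for one exposure level.

def paf(prevalence, rr):
    """Fraction of cases attributable to an exposure with relative risk rr."""
    excess = prevalence * (rr - 1.0)
    return excess / (excess + 1.0)

# invented example: 30% of adults exposed, RR = 1.5, 100,000 annual cases
attributable_cases = paf(prevalence=0.3, rr=1.5) * 100_000
```

Attributable case counts then follow by multiplying the PAF into the incidence for the matching stratum, as done above with GLOBOCAN 2012 counts.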

  17. The hidden economic burden of air pollution-related morbidity: evidence from the Aphekom project.

    PubMed

    Chanel, Olivier; Perez, Laura; Künzli, Nino; Medina, Sylvia

    2016-12-01

    Public decision-makers commonly use health impact assessments (HIA) to quantify the impacts of various regulation policies. However, standard HIAs do not consider that chronic diseases (CDs) can be both caused and exacerbated by a common factor, and generally focus on exacerbations. As an illustration, exposure to near road traffic-related pollution (NRTP) may affect the onset of CDs, and general ambient or urban background air pollution (BP) may exacerbate these CDs. We propose a comprehensive HIA that explicitly accounts for both the acute effects and the long-term effects, making it possible to compute the overall burden of disease attributable to air pollution. A case study applies the two HIA methods to two CDs (asthma in children and coronary heart disease (CHD) in adults over 65) for ten European cities, totaling 1.89 million 0-17-year-old children and 1.85 million adults aged 65 and over. We compare the current health effects with those that might, hypothetically, be obtained if exposure to NRTP was as low for those living close to busy roads as it is for those living farther away, and if annual mean concentrations of both PM10 and NO2 (taken as markers of general urban air pollution) were no higher than 20 μg/m³. Returning an assessment of €0.55 million (95% CI 0-0.95), the HIA based on acute effects alone accounts for only about 6.2% of the annual hospitalization burden computed with the comprehensive method [€8.81 million (95% CI 3-14.4)], and for about 0.15% of the overall economic burden of air pollution-related CDs [€370 million (95% CI 106-592)]. Morbidity effects thus impact the health system more directly and strongly than previously believed. These findings may clarify the full extent of benefits from any public health or environmental policy involving CDs due to and exacerbated by a common factor.

  18. The relationship between caregiving self-efficacy and depressive symptoms in family caregivers of patients with Alzheimer disease: a longitudinal study.

    PubMed

    Grano, Caterina; Lucidi, Fabio; Violani, Cristiano

    2017-07-01

    Caregiving for a relative with dementia has been associated with negative consequences for mental health. Self-efficacy has been shown to correlate negatively with depression, but the long-term association between caregiver burden, caregiver self-efficacy, and depressive symptoms remains largely unexplored. The aim of the present study was to evaluate whether different self-efficacy domains partially mediated the relationship between caregiving burden and depression. A three-wave design was used, with an initial assessment and follow-ups three months later and one year later. One hundred seventy caregivers of patients with AD responded to measures of caregiver burden, caregiving self-efficacy, and depressive symptoms. Data were analyzed by means of structural equation models. The tested model provided support for the guiding hypothesis. Burden at the time of the first assessment (T1) significantly influenced depression one year later, and the relationship between burden at T1 and depressive symptoms one year later was partially mediated by self-efficacy for controlling upsetting thoughts. The findings of the present study provide evidence that, over a considerable length of time, the effects of caregiver burden on depressive symptoms can be explained by the caregivers' efficacy beliefs in controlling upsetting thoughts related to the caregiving tasks. Interventions for caregivers of patients with AD may help them in tackling negative thoughts about the caregiving role.

  19. Burden of illness in functional gastrointestinal disorder--the consequences for the individual and society.

    PubMed

    Glise, H; Wiklund, I; Hallerbäck, B

    1998-01-01

    To review the consequences of functional gastrointestinal disorders (FGD), i.e. heartburn without esophagitis, dyspepsia and IBS, for the individual and society. Current publications indicate that functional gastrointestinal disorders are more prevalent than organic gastrointestinal disorders in the population. Symptoms, not the organic finding per se, are most important to the individual. Functional disorders are furthermore linked to somatic symptoms from other parts of the body, as well as to mental health. Together they constitute a large medical burden on society in terms of consultations, drug consumption and surgery. Social costs are further increased by problems at work and a considerable increase in absenteeism. Functional gastrointestinal disorders should be taken more seriously by the medical community and society, since the burden of illness appears much larger than previously anticipated.

  20. Nonlocal Intracranial Cavity Extraction

    PubMed Central

    Manjón, José V.; Eskildsen, Simon F.; Coupé, Pierrick; Romero, José E.; Collins, D. Louis; Robles, Montserrat

    2014-01-01

    Automatic and accurate methods to estimate normalized regional brain volumes from MRI data are valuable tools that may help to obtain an objective diagnosis and follow-up of many neurological diseases. To estimate such regional brain volumes, the intracranial cavity volume (ICV) is often used for normalization. However, the high variability of brain shape and size due to normal intersubject variability, normal changes occurring over the lifespan, and abnormal changes due to disease makes the ICV estimation problem challenging. In this paper, we present a new approach to perform ICV extraction based on the use of a library of prelabeled brain images to capture the large variability of brain shapes. To this end, an improved nonlocal label fusion scheme based on the BEaST technique is proposed to increase the accuracy of the ICV estimation. The proposed method is compared with recent state-of-the-art methods and the results demonstrate an improved performance both in terms of accuracy and reproducibility while maintaining a reduced computational burden. PMID:25328511
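The nonlocal label-fusion vote at the heart of BEaST-style methods weights each library patch by its intensity similarity to the target patch. A toy sketch with patches as flat lists; the smoothing parameter and the example data are invented, not the paper's:

```python
import math

# Nonlocal-means style label fusion: each library patch votes for its
# label with weight exp(-d^2 / h^2), d^2 = mean squared intensity
# difference to the target patch.

def nl_vote(target_patch, library, h2=0.1):
    """library: list of (patch, label); returns a fused label in [0, 1]."""
    num = den = 0.0
    for patch, label in library:
        d2 = sum((a - b) ** 2 for a, b in zip(target_patch, patch)) / len(patch)
        w = math.exp(-d2 / h2)
        num += w * label
        den += w
    return num / den

lib = [([0.9, 0.8, 0.9], 1.0),   # example "inside intracranial cavity" patch
       ([0.1, 0.0, 0.2], 0.0)]   # example background patch
fused = nl_vote([0.85, 0.8, 0.9], lib)   # close to the first patch
```

Thresholding the fused value (e.g. at 0.5) yields the final binary ICV label; the multiresolution scheme in the paper applies this vote coarse-to-fine to keep the search tractable.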

  1. Phyx: phylogenetic tools for unix.

    PubMed

    Brown, Joseph W; Walker, Joseph F; Smith, Stephen A

    2017-06-15

    The ease with which phylogenomic data can be generated has drastically escalated the computational burden for even routine phylogenetic investigations. To address this, we present phyx: a collection of programs written in C++ to explore, manipulate, analyze and simulate phylogenetic objects (alignments, trees and MCMC logs). Modelled after Unix/GNU/Linux command line tools, individual programs perform a single task and operate on standard I/O streams that can be piped to quickly and easily form complex analytical pipelines. Because of the stream-centric paradigm, memory requirements are minimized (often only a single tree or sequence in memory at any instance), and hence phyx is capable of efficiently processing very large datasets. phyx runs on POSIX-compliant operating systems. Source code, installation instructions, documentation and example files are freely available under the GNU General Public License at https://github.com/FePhyFoFum/phyx. Contact: eebsmith@umich.edu. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press.
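phyx itself is a set of C++ command-line tools, but its stream-centric paradigm, holding only one tree in memory at any instance, can be sketched in a few lines of Python. The tip-count heuristic below assumes plain one-tree-per-line newick with no quoted labels and is an illustration, not phyx's API:

```python
import io

# Stream one newick tree per line: memory stays constant regardless of
# how many trees the input contains.

def tips(newick):
    """Tip count of a single newick tree: commas + 1 (unquoted labels)."""
    return newick.count(",") + 1

def stream_tip_counts(handle):
    """Yield tip counts one tree at a time."""
    for line in handle:
        line = line.strip()
        if line:
            yield tips(line)

trees = io.StringIO("((a,b),c);\n(((a,b),(c,d)),e);\n")
counts = list(stream_tip_counts(trees))
```

In a shell pipeline the same pattern lets tools be chained (`cat trees.tre | tool_a | tool_b`) without ever materializing the whole file, which is the design point the abstract describes.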

  2. Phyx: phylogenetic tools for unix

    PubMed Central

    Brown, Joseph W.; Walker, Joseph F.; Smith, Stephen A.

    2017-01-01

    Abstract Summary: The ease with which phylogenomic data can be generated has drastically escalated the computational burden for even routine phylogenetic investigations. To address this, we present phyx: a collection of programs written in C++ to explore, manipulate, analyze and simulate phylogenetic objects (alignments, trees and MCMC logs). Modelled after Unix/GNU/Linux command line tools, individual programs perform a single task and operate on standard I/O streams that can be piped to quickly and easily form complex analytical pipelines. Because of the stream-centric paradigm, memory requirements are minimized (often only a single tree or sequence in memory at any instance), and hence phyx is capable of efficiently processing very large datasets. Availability and Implementation: phyx runs on POSIX-compliant operating systems. Source code, installation instructions, documentation and example files are freely available under the GNU General Public License at https://github.com/FePhyFoFum/phyx. Contact: eebsmith@umich.edu. Supplementary information: Supplementary data are available at Bioinformatics online. PMID:28174903

  3. Open-source framework for power system transmission and distribution dynamics co-simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Renke; Fan, Rui; Daily, Jeff

    The promise of the smart grid entails more interactions between the transmission and distribution networks, and there is an immediate need for tools to provide the comprehensive modelling and simulation required to integrate operations at both transmission and distribution levels. Existing electromagnetic transient simulators can perform simulations with integration of transmission and distribution systems, but the computational burden is high for large-scale system analysis. For transient stability analysis, currently there are only separate tools for simulating transient dynamics of the transmission and distribution systems. In this paper, we introduce an open-source co-simulation framework, the "Framework for Network Co-Simulation" (FNCS), together with a decoupled simulation approach that links existing transmission and distribution dynamic simulators through FNCS. FNCS is a middleware interface and framework that manages the interaction and synchronization of the transmission and distribution simulators. Preliminary testing results show the validity and capability of the proposed open-source co-simulation framework and the decoupled co-simulation methodology.
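The decoupled co-simulation loop that a middleware like FNCS coordinates can be sketched as two toy simulators exchanging boundary values each time step. This illustrates only the synchronization pattern; the models and names are invented and this is not the FNCS API:

```python
# At each step the "middleware" hands the distribution side's aggregate
# load to the transmission side, advances it, then hands the resulting
# substation voltage back to the distribution side.

def transmission_step(load_kw):
    # toy model: substation voltage (per unit) sags with load
    return 1.0 - 1e-5 * load_kw

def distribution_step(voltage_pu):
    # toy model: aggregate feeder load (kW) scales with voltage
    return 5000.0 * voltage_pu

def cosimulate(steps):
    voltage, load = 1.0, 5000.0
    trace = []
    for _ in range(steps):
        voltage = transmission_step(load)    # T side uses D's last load
        load = distribution_step(voltage)    # D side uses T's new voltage
        trace.append((voltage, load))
    return trace

trace = cosimulate(10)   # converges to the coupled fixed point
```

Keeping each simulator unmodified and exchanging only boundary quantities is what lets existing transmission and distribution dynamic simulators be linked without a monolithic electromagnetic transient model.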

  4. PDEMOD: Software for control/structures optimization

    NASA Technical Reports Server (NTRS)

    Taylor, Lawrence W., Jr.; Zimmerman, David

    1991-01-01

    Because of the possibility of adverse interaction between the control system and the structural dynamics of large, flexible spacecraft, great care must be taken to ensure stability and system performance. Because of the high cost of insertion of mass into low Earth orbit, it is prudent to optimize the roles of the structure and control systems simultaneously. Because of the difficulty and the computational burden of modeling and analyzing the control/structure system dynamics, the total problem is often split and treated iteratively. It would aid design if the control/structure system dynamics could be represented in a single system of equations. With the use of the software PDEMOD (Partial Differential Equation Model), it is now possible to optimize structure and control systems simultaneously. The distributed parameter modeling approach enables embedding the control system dynamics into the same equations as the structural dynamics model. By doing this, the current difficulties involved in model order reduction are avoided. The NASA Mini-MAST truss is used as an example for studying integrated control/structure design.

  5. Multiple Vehicle Detection and Segmentation in Malaysia Traffic Flow

    NASA Astrophysics Data System (ADS)

    Fariz Hasan, Ahmad; Fikri Che Husin, Mohd; Affendi Rosli, Khairul; Norhafiz Hashim, Mohd; Faiz Zainal Abidin, Amar

    2018-03-01

    Vision-based systems are widely used in the field of Intelligent Transportation Systems (ITS) to extract large amounts of information for analyzing traffic scenes. The rapid growth in the number of vehicles on the road, together with the significant increase in camera deployments, has dictated the need for traffic surveillance systems. Such a system can take over the burdensome tasks previously performed by human operators in a traffic monitoring centre. This paper concentrates on developing multiple vehicle detection and segmentation for monitoring through Closed Circuit Television (CCTV) video. The system automatically segments vehicles extracted from heavy traffic scenes using optical flow estimation alongside a blob analysis technique to detect moving vehicles. Prior to segmentation, the blob analysis technique computes the region of interest corresponding to each moving vehicle, which is then used to create a bounding box around that vehicle. Experimental validation of the proposed system was performed, and the algorithm is demonstrated on various sets of traffic scenes.
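The blob-analysis step described above (labeling connected foreground regions in the motion mask and boxing those above a minimum area) can be sketched in pure Python. The mask, the 4-connectivity choice and the area threshold are illustrative, not the authors' parameters:

```python
from collections import deque

# Label 4-connected components in a binary foreground mask (as produced
# by thresholding optical-flow magnitude), discard tiny noise blobs, and
# return bounding boxes (x_min, y_min, x_max, y_max).

def blobs(mask, min_area=2):
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]
    boxes = []
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                q, cells = deque([(r, c)]), []
                seen[r][c] = True
                while q:                       # BFS flood fill
                    y, x = q.popleft()
                    cells.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < rows and 0 <= nx < cols \
                                and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            q.append((ny, nx))
                if len(cells) >= min_area:     # area filter rejects noise
                    ys = [y for y, _ in cells]
                    xs = [x for _, x in cells]
                    boxes.append((min(xs), min(ys), max(xs), max(ys)))
    return boxes

mask = [[0, 1, 1, 0, 0],
        [0, 1, 1, 0, 0],
        [0, 0, 0, 0, 1],   # the lone pixel is noise, removed by min_area
        [0, 0, 0, 0, 0]]
boxes = blobs(mask)
```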

  6. Software Support for Transiently Powered Computers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Van Der Woude, Joel Matthew

    With the continued reduction in the size and cost of computing, power becomes an increasingly heavy burden on system designers for embedded applications. While energy harvesting techniques are an increasingly desirable solution for many deeply embedded applications where size and lifetime are priorities, previous work has shown that energy harvesting provides insufficient power for long-running computation. We present Ratchet, which to the authors' knowledge is the first automatic, software-only checkpointing system for energy harvesting platforms. We show that Ratchet provides a means to extend computation across power cycles consistent with those experienced by energy harvesting devices. We demonstrate the correctness of our system under frequent failures and show that it has an average overhead of 58.9% across a suite of benchmarks representative of embedded applications.

  7. Hybrid dual-Fourier tomographic algorithm for fast three-dimensional optical image reconstruction in turbid media

    NASA Technical Reports Server (NTRS)

    Alfano, Robert R. (Inventor); Cai, Wei (Inventor)

    2007-01-01

    A reconstruction technique for reducing the computational burden of 3D image processing, wherein the reconstruction procedure comprises an inverse and a forward model. The inverse model uses a hybrid dual Fourier algorithm that combines a 2D Fourier inversion with a 1D matrix inversion to provide high-speed inverse computations. The inverse algorithm uses a hybrid transfer to provide fast Fourier inversion for data from multiple sources and multiple detectors. The forward model is based on an analytical cumulant solution of a radiative transfer equation. The accurate analytical form of the solution to the radiative transfer equation provides an efficient formalism for fast computation of the forward model.

  8. The Within-Year Concentration of Medical Care: Implications for Family Out-of-Pocket Expenditure Burdens

    PubMed Central

    Selden, Thomas M

    2009-01-01

    Objective To examine the within-year concentration of family health care and the resulting exposure of families to short periods of high expenditure burdens. Data Source Household data from the pooled 2003 and 2004 Medical Expenditure Panel Survey (MEPS) yielding nationally representative estimates for the nonelderly civilian noninstitutionalized population. Study Design The paper examines the within-year concentration of family medical care use and the frequency with which family out-of-pocket expenditures exceeded 20 percent of family income, computed at the annual, quarterly, and monthly levels. Principal Findings On average among families with medical care, 49 percent of all (charge-weighted) care occurred in a single month, and 63 percent occurred in a single quarter. Nationally, 27 percent of the study population experienced at least 1 month in which out-of-pocket expenditures exceeded 20 percent of income. Monthly 20 percent burden rates were highest among the poor, at 43 percent, and were close to or above 30 percent for all but the highest income group (families above four times the federal poverty line). Conclusions Within-year spikes in health care utilization can create financial pressures missed by conventional annual burden analyses. Within-year health-related financial pressures may be especially acute among lower-income families due to low asset holdings. PMID:19674431
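The monthly burden screen described above reduces to simple arithmetic: flag any month whose out-of-pocket spending exceeds 20 percent of monthly income, and measure concentration as the top month's share of annual charges. A sketch on invented data, not MEPS records:

```python
# Monthly out-of-pocket burden screen and within-year concentration.

def monthly_burden_months(oop_by_month, annual_income, threshold=0.20):
    """Months (1-12) where out-of-pocket spending exceeds the threshold
    share of that month's income."""
    monthly_income = annual_income / 12.0
    return [m for m, oop in enumerate(oop_by_month, start=1)
            if oop > threshold * monthly_income]

def top_month_share(charges_by_month):
    """Fraction of annual charges concentrated in the single largest month."""
    total = sum(charges_by_month)
    return max(charges_by_month) / total if total else 0.0

oop = [0, 0, 40, 600, 30, 0, 0, 0, 0, 20, 0, 0]   # $600 spike in April
months = monthly_burden_months(oop, annual_income=30_000)
share = top_month_share(oop)
```

An annual screen on the same family (total $690 against $30,000, or 2.3 percent) would miss the April spike entirely, which is the paper's central point.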

  9. Prevalence and burden of primary headache in Akaki textile mill workers, Ethiopia.

    PubMed

    Takele, Getahun Mengistu; Tekle Haimanot, Redda; Martelletti, Paolo

    2008-04-01

    Headache disorders are among the most common complaints worldwide. Migraine, tension-type and cluster headaches account for the majority of primary headaches and impose a substantial burden on the individual, family or society at large. The burden is immense on workers, women and children in terms of missed work and school days. The few existing studies show a relatively lower prevalence of primary headaches in Africa as compared to Europe and America; there may be many reasons for this lower prevalence. The objective of this study is to determine the prevalence and burden of primary headaches among the Akaki textile factory workers, which may provide data at the local and international level toward the campaign of lifting the burden of headache worldwide. The overall 1-year prevalence of all types of primary headaches was found to be 16.4%, and that of migraine was 6.2%. The prevalence of migraine in females was 10.1% while it was 3.7% in males. The prevalence of tension-type headaches was found to be 9.8%; this was 16.3% in females as compared to 5.7% in males. The burden of the primary headaches in terms of lost workdays, gross under-recognition and absence of effective treatment is tremendous. In conclusion, the prevalence of primary headaches in the Akaki textile mill workers is significant, particularly in females, and the burden is massive in a setting of poverty and ignorance. We recommend the availability and administration of specific therapy for factory workers with primary headaches, and a well-designed community-based study covering the whole nation's rural and urban population.

  10. Caregiver burden among adults caring for orphaned children in rural South Africa

    PubMed Central

    Kidman, Rachel; Thurman, Tonya R.

    2014-01-01

    The AIDS epidemic has created an unprecedented number of orphans. While largely absorbed by extended family, this additional responsibility can weigh heavily on their caregivers. The concept of caregiver burden captures multiple dimensions of well-being (e.g., physical, social and psychological). Measuring the extent and determinants of caregiving burden can inform the design of programmes to ease the negative consequences of caregiving. This study uses the baseline data from a study assessing interventions for orphans and vulnerable adolescents in the Eastern Cape, South Africa. Orphan caregivers (n = 726) completed an adapted version of the 12-item Zarit Burden Interview. In addition to basic caregiver and household demographics, the survey also collected information on AIDS-related illness and recent deaths. Descriptive data are presented, followed by multivariate Poisson regression models to explore factors associated with caregiver burden. Approximately 40% of caregivers reported high levels of orphan caregiving burden. Feelings of stress and inadequacy concerning their care responsibilities as well as anger towards the child were common. Household food insecurity was the most important predictor of orphan caregiving burden (marginal effect = 7.82; p < 0.001 for those reporting severe hunger); income was also a significant determinant. When other AIDS impacts were added to the model, only the AIDS-related illness of the caregiver was significantly associated with burden (marginal effect = 3.77; p < 0.001). This study suggests that caregivers with economic vulnerability and those struggling with their own AIDS-related illness feel most overburdened. These findings are particularly relevant to service providers who must identify caregivers in need of immediate assistance and allocate limited resources effectively. 
To alleviate caregiver burden, programmes must foster greater economic security (e.g., by facilitating access to social grants or directly providing cash transfers) and coordinate services with home-based care programmes serving the chronically ill. PMID:24999368

  11. Understanding dengue virus evolution to support epidemic surveillance and counter-measure development.

    PubMed

    Pollett, S; Melendrez, M C; Maljkovic Berry, I; Duchêne, S; Salje, H; Cummings, D A T; Jarman, R G

    2018-04-25

    Dengue virus (DENV) causes a profound burden of morbidity and mortality, and its global burden is rising due to the co-circulation of four divergent DENV serotypes in the ecological context of globalization, travel, climate change, urbanization, and expansion of the geographic range of the Ae. aegypti and Ae. albopictus vectors. Understanding DENV evolution offers valuable opportunities to enhance surveillance and response to DENV epidemics via advances in RNA virus sequencing, bioinformatics, phylogenetic and other computational biology methods. Here we provide a scoping overview of the evolution and molecular epidemiology of DENV and the range of ways that evolutionary analyses can be applied as a public health tool against this arboviral pathogen. Copyright © 2018. Published by Elsevier B.V.

  12. Low rank alternating direction method of multipliers reconstruction for MR fingerprinting.

    PubMed

    Assländer, Jakob; Cloos, Martijn A; Knoll, Florian; Sodickson, Daniel K; Hennig, Jürgen; Lattanzi, Riccardo

    2018-01-01

    The proposed reconstruction framework addresses the reconstruction accuracy, noise propagation and computation time for magnetic resonance fingerprinting. Based on a singular value decomposition of the signal evolution, magnetic resonance fingerprinting is formulated as a low rank (LR) inverse problem in which one image is reconstructed for each singular value under consideration. This LR approximation of the signal evolution reduces the computational burden by reducing the number of Fourier transformations. Also, the LR approximation improves the conditioning of the problem, which is further improved by extending the LR inverse problem to an augmented Lagrangian that is solved by the alternating direction method of multipliers. The root mean square error and the noise propagation are analyzed in simulations. For verification, in vivo examples are provided. The proposed LR alternating direction method of multipliers approach shows a reduced root mean square error compared to the original fingerprinting reconstruction, to a LR approximation alone and to an alternating direction method of multipliers approach without a LR approximation. Incorporating sensitivity encoding allows for further artifact reduction. The proposed reconstruction provides robust convergence, reduced computational burden and improved image quality compared to other magnetic resonance fingerprinting reconstruction approaches evaluated in this study. Magn Reson Med 79:83-96, 2018. © 2017 International Society for Magnetic Resonance in Medicine.
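The low-rank idea in the abstract rests on the observation that fingerprint signal evolutions are highly compressible: a truncated SVD of the dictionary of time courses captures them in a few temporal basis vectors, so only a handful of coefficient images (one per retained singular value) must be reconstructed. A minimal numpy sketch of that compression step, using a toy exponential-decay dictionary (the dictionary and rank here are illustrative, not from the paper):

```python
import numpy as np

# Hypothetical sketch of the low-rank (LR) compression behind MR
# fingerprinting reconstruction: temporal signal evolutions are highly
# correlated, so a truncated SVD captures them with few components.
rng = np.random.default_rng(0)

n_timepoints, n_atoms = 500, 2000
# Toy dictionary: smooth exponential-like decays with varying rates
rates = rng.uniform(0.005, 0.05, n_atoms)
t = np.arange(n_timepoints)[:, None]
D = np.exp(-t * rates[None, :])             # (time, atoms)

# SVD of the signal evolution; keep only a rank-k temporal subspace
U, s, Vt = np.linalg.svd(D, full_matrices=False)
k = 5
Uk = U[:, :k]                               # temporal basis

# Project the dictionary into the LR subspace and reconstruct
D_lr = Uk @ (Uk.T @ D)
rel_err = np.linalg.norm(D - D_lr) / np.linalg.norm(D)
print(f"rank-{k} relative error: {rel_err:.2e}")
```

In the full reconstruction, this is the step that trades one Fourier transform per time frame for one per retained singular value, which is where the computational savings come from.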

  13. Understanding Loan Use and Debt Burden among Low-Income and Minority Students at a Large Urban Community College

    ERIC Educational Resources Information Center

    Luna-Torres, Maria; McKinney, Lyle; Horn, Catherine; Jones, Sara

    2018-01-01

    This study examined a sample of community college students from a diverse, large urban community college system in Texas. To gain a deeper understanding about the effects of background characteristics on student borrowing behaviors and enrollment outcomes, the study employed descriptive statistics and regression techniques to examine two separate…

  14. Tutorial: Parallel Computing of Simulation Models for Risk Analysis.

    PubMed

    Reilly, Allison C; Staid, Andrea; Gao, Michael; Guikema, Seth D

    2016-10-01

    Simulation models are widely used in risk analysis to study the effects of uncertainties on outcomes of interest in complex problems. Often, these models are computationally complex and time consuming to run. This latter point may be at odds with time-sensitive evaluations or may limit the number of parameters that are considered. In this article, we give an introductory tutorial focused on parallelizing simulation code to better leverage modern computing hardware, enabling risk analysts to better utilize simulation-based methods for quantifying uncertainty in practice. This article is aimed primarily at risk analysts who use simulation methods but do not yet utilize parallelization to decrease the computational burden of these models. The discussion is focused on conceptual aspects of embarrassingly parallel computer code and software considerations. Two complementary examples are shown using the languages MATLAB and R. A brief discussion of hardware considerations is located in the Appendix. © 2016 Society for Risk Analysis.
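The "embarrassingly parallel" pattern the tutorial targets can be sketched in a few lines: each simulation replication is independent, so replications are farmed out to a worker pool and the results collected afterwards. This is a hypothetical sketch, not the article's MATLAB/R code; a thread pool keeps it portable, but for CPU-bound models one would swap in a process pool as noted in the comments.

```python
import random
from multiprocessing.pool import ThreadPool

def one_replication(seed):
    """One Monte Carlo replication: estimate a toy failure probability."""
    rng = random.Random(seed)
    trials = 10_000
    failures = sum(rng.random() < 0.1 for _ in range(trials))
    return failures / trials

seeds = range(8)  # one independent seed per replication

# ThreadPool keeps this sketch portable; for CPU-bound simulation models
# you would normally use multiprocessing.Pool or
# concurrent.futures.ProcessPoolExecutor so replications use separate cores.
with ThreadPool(processes=4) as pool:
    estimates = pool.map(one_replication, seeds)

mean_estimate = sum(estimates) / len(estimates)
print(f"pooled estimate: {mean_estimate:.3f}")
```

Because replications never communicate, the only design decisions are how to split seeds across workers and how to aggregate, which is exactly why this class of code parallelizes with so little effort.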

  15. Machine learning based Intelligent cognitive network using fog computing

    NASA Astrophysics Data System (ADS)

    Lu, Jingyang; Li, Lun; Chen, Genshe; Shen, Dan; Pham, Khanh; Blasch, Erik

    2017-05-01

    In this paper, a Cognitive Radio Network (CRN) based on artificial intelligence is proposed to distribute the limited radio spectrum resources more efficiently. The CRN framework can analyze the time-sensitive signal data close to the signal source using fog computing with different types of machine learning techniques. Depending on the computational capabilities of the fog nodes, different features and machine learning techniques are chosen to optimize spectrum allocation. Also, the computing nodes send a periodic signal summary, which is much smaller than the original signal, to the cloud so that the overall system spectrum resource allocation strategies are dynamically updated. Applying fog computing, the system is more adaptive to the local environment and robust to spectrum changes. As most of the signal data is processed at the fog level, it further strengthens the system security by reducing the communication burden of the communications network.

  16. Comparative study of surrogate models for groundwater contamination source identification at DNAPL-contaminated sites

    NASA Astrophysics Data System (ADS)

    Hou, Zeyu; Lu, Wenxi

    2018-05-01

    Knowledge of groundwater contamination sources is critical for effectively protecting groundwater resources, estimating risks, mitigating disaster, and designing remediation strategies. Many methods for groundwater contamination source identification (GCSI) have been developed in recent years, including the simulation-optimization technique. This study proposes utilizing a support vector regression (SVR) model and a kernel extreme learning machine (KELM) model as surrogate models. The surrogate model is key because it replaces the simulation model, reducing the huge computational burden of iterations in the simulation-optimization technique used to solve GCSI problems, especially GCSI problems of aquifers contaminated by dense nonaqueous phase liquids (DNAPLs). A comparative study between the Kriging, SVR, and KELM models is reported. Additionally, there is analysis of the influence of parameter optimization and the structure of the training sample dataset on the approximation accuracy of the surrogate model. It was found that the KELM model was the most accurate surrogate model, and its performance was significantly improved after parameter optimization. The approximation accuracy of the surrogate model to the simulation model did not always improve with increasing numbers of training samples. Using the appropriate number of training samples was critical for improving the performance of the surrogate model and avoiding unnecessary computational workload. It was concluded that the KELM model developed in this work could reasonably predict system responses in given operation conditions. Replacing the simulation model with a KELM model considerably reduced the computational burden of the simulation-optimization process and also maintained high computation accuracy.
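The surrogate-modelling workflow the abstract describes (fit a cheap kernel machine to a small set of expensive simulator runs, then query the surrogate inside the optimization loop) can be sketched with plain numpy. Kernel ridge regression stands in here for the KELM surrogate, to which it is closely related (both are kernelized least squares with a ridge term); the one-dimensional "simulator" and all parameter values are illustrative, not from the study.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulation_model(x):
    """Toy stand-in for an expensive groundwater transport simulator."""
    return np.sin(3 * x) + 0.5 * x

def rbf(A, B, gamma=10.0):
    """Gaussian (RBF) kernel between two 1-D sample sets."""
    d2 = (A[:, None] - B[None, :]) ** 2
    return np.exp(-gamma * d2)

# "Training samples" = a small design of expensive simulator runs
X_train = rng.uniform(0, 1, 40)
y_train = simulation_model(X_train)

lam = 1e-6                                  # ridge regularization
K = rbf(X_train, X_train)
alpha = np.linalg.solve(K + lam * np.eye(len(X_train)), y_train)

def surrogate(x):
    """Cheap prediction used in place of a full simulation run."""
    return rbf(np.atleast_1d(x), X_train) @ alpha

X_test = np.linspace(0.05, 0.95, 50)
err = np.max(np.abs(surrogate(X_test) - simulation_model(X_test)))
print(f"max surrogate error: {err:.3f}")
```

Once fitted, each surrogate evaluation is a small matrix-vector product, which is what removes the "huge computational burden of iterations" the abstract refers to; the choice of training-set size mirrors the study's finding that more samples do not always help.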

  17. A Scalable, Out-of-Band Diagnostics Architecture for International Space Station Systems Support

    NASA Technical Reports Server (NTRS)

    Fletcher, Daryl P.; Alena, Rick; Clancy, Daniel (Technical Monitor)

    2002-01-01

    The computational infrastructure of the International Space Station (ISS) is a dynamic system that supports multiple vehicle subsystems such as Caution and Warning, Electrical Power Systems and Command and Data Handling (C&DH), as well as scientific payloads of varying size and complexity. The dynamic nature of the ISS configuration coupled with the increased demand for payload support places a significant burden on the inherently resource constrained computational infrastructure of the ISS. Onboard system diagnostics applications are hosted on computers that are elements of the avionics network while ground-based diagnostic applications receive only a subset of available telemetry, down-linked via S-band communications. In this paper we propose a scalable, out-of-band diagnostics architecture for ISS systems support that uses a read-only connection for C&DH data acquisition, which provides a lower cost of deployment and maintenance (versus a higher-criticality read-write connection). The diagnostics processing burden is off-loaded from the avionics network to elements of the on-board LAN that have a lower overall cost of operation and increased computational capacity. A superset of diagnostic data, richer in content than the configured telemetry, is made available to Advanced Diagnostic System (ADS) clients running on wireless handheld devices, affording the crew greater mobility for troubleshooting and providing improved insight into vehicle state. The superset of diagnostic data is made available to the ground in near real-time via an out-of-band downlink, providing a high level of fidelity between vehicle state and test, training and operational facilities on the ground.

  18. Big Data Challenges in Global Seismic 'Adjoint Tomography' (Invited)

    NASA Astrophysics Data System (ADS)

    Tromp, J.; Bozdag, E.; Krischer, L.; Lefebvre, M.; Lei, W.; Smith, J.

    2013-12-01

    The challenge of imaging Earth's interior on a global scale is closely linked to the challenge of handling large data sets. The related iterative workflow involves five distinct phases, namely, 1) data gathering and culling, 2) synthetic seismogram calculations, 3) pre-processing (time-series analysis and time-window selection), 4) data assimilation and adjoint calculations, 5) post-processing (pre-conditioning, regularization, model update). In order to implement this workflow on modern high-performance computing systems, a new seismic data format is being developed. The Adaptable Seismic Data Format (ASDF) is designed to replace currently used data formats with a more flexible format that allows for fast parallel I/O. The metadata is divided into abstract categories, such as "source" and "receiver", along with provenance information for complete reproducibility. The structure of ASDF is designed keeping in mind three distinct applications: earthquake seismology, seismic interferometry, and exploration seismology. Existing time-series analysis tool kits, such as SAC and ObsPy, can be easily interfaced with ASDF so that seismologists can use robust, previously developed software packages. ASDF accommodates an automated, efficient workflow for global adjoint tomography. Manually managing the large number of simulations associated with the workflow can rapidly become a burden, especially with increasing numbers of earthquakes and stations. Therefore, it is of importance to investigate the possibility of automating the entire workflow. Scientific Workflow Management Software (SWfMS) allows users to execute workflows almost routinely. SWfMS provides additional advantages. In particular, it is possible to group independent simulations in a single job to fit the available computational resources. They also give a basic level of fault resilience as the workflow can be resumed at the correct state preceding a failure. 
Some of the best candidates for our particular workflow are Kepler and Swift, and the latter appears to be the most serious candidate for a large-scale workflow on a single supercomputer, remaining sufficiently simple to accommodate further modifications and improvements.

  19. Neurite, a Finite Difference Large Scale Parallel Program for the Simulation of Electrical Signal Propagation in Neurites under Mechanical Loading

    PubMed Central

    García-Grajales, Julián A.; Rucabado, Gabriel; García-Dopico, Antonio; Peña, José-María; Jérusalem, Antoine

    2015-01-01

    With the growing body of research on traumatic brain injury and spinal cord injury, computational neuroscience has recently focused its modeling efforts on neuronal functional deficits following mechanical loading. However, in most of these efforts, cell damage is generally only characterized by purely mechanistic criteria, functions of quantities such as stress, strain or their corresponding rates. The modeling of functional deficits in neurites as a consequence of macroscopic mechanical insults has been rarely explored. In particular, a quantitative mechanically based model of electrophysiological impairment in neuronal cells, Neurite, has only very recently been proposed. In this paper, we present the implementation details of this model: a finite difference parallel program for simulating electrical signal propagation along neurites under mechanical loading. Following the application of a macroscopic strain at a given strain rate produced by a mechanical insult, Neurite is able to simulate the resulting neuronal electrical signal propagation, and thus the corresponding functional deficits. The simulation of the coupled mechanical and electrophysiological behaviors requires computationally expensive calculations that increase in complexity as the network of the simulated cells grows. The solvers implemented in Neurite—explicit and implicit—were therefore parallelized using graphics processing units in order to reduce the burden of the simulation costs of large scale scenarios. Cable Theory and Hodgkin-Huxley models were implemented to account for the electrophysiological passive and active regions of a neurite, respectively, whereas a coupled mechanical model accounting for the neurite mechanical behavior within its surrounding medium was adopted as a link between electrophysiology and mechanics.
This paper provides the details of the parallel implementation of Neurite, along with three different application examples: a long myelinated axon, a segmented dendritic tree, and a damaged axon. The capabilities of the program to deal with large scale scenarios, segmented neuronal structures, and functional deficits under mechanical loading are specifically highlighted. PMID:25680098
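The cable-theory component mentioned above, which governs the passive regions of a neurite, reduces to a diffusion-type PDE that is straightforward to discretize with the same explicit finite differences the paper's solvers use. The following is a minimal sketch (not Neurite itself, and with illustrative parameters) of the passive cable equation tau*dV/dt = lambda^2*V_xx - V, with a voltage clamp at one end:

```python
import numpy as np

# Illustrative passive-cable parameters (not from the paper)
n = 101
dx = 0.01            # cm, spatial step
dt = 1e-5            # s, time step (small enough for explicit stability)
tau = 0.01           # s, membrane time constant
lam = 0.05           # cm, electrotonic length constant

V = np.zeros(n)
V[0] = 1.0           # clamped voltage at the proximal end

coeff = (lam ** 2) * dt / (tau * dx ** 2)
assert coeff < 0.5, "explicit scheme stability condition"

for _ in range(20_000):
    # Explicit finite-difference update of the interior nodes
    V[1:-1] += coeff * (V[2:] - 2 * V[1:-1] + V[:-2]) - (dt / tau) * V[1:-1]
    V[0], V[-1] = 1.0, V[-2]   # clamped / sealed-end boundaries

# At steady state the voltage decays roughly as exp(-x/lambda)
print(f"V at one length constant: {V[int(lam / dx)]:.3f}")
```

Each interior node depends only on its two neighbours, which is precisely the structure that makes such solvers amenable to the GPU parallelization the paper describes; the active (Hodgkin-Huxley) regions add nonlinear ionic currents to the same update.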

  20. Refined stratified-worm-burden models that incorporate specific biological features of human and snail hosts provide better estimates of Schistosoma diagnosis, transmission, and control.

    PubMed

    Gurarie, David; King, Charles H; Yoon, Nara; Li, Emily

    2016-08-04

    Schistosoma parasites sustain a complex transmission process that cycles between a definitive human host, two free-swimming larval stages, and an intermediate snail host. Multiple factors modify their transmission and affect their control, including heterogeneity in host populations and environment, the aggregated distribution of human worm burdens, and features of parasite reproduction and host snail biology. Because these factors serve to enhance local transmission, their inclusion is important in attempting accurate quantitative prediction of the outcomes of schistosomiasis control programs. However, their inclusion raises many mathematical and computational challenges. To address these, we have recently developed a tractable stratified worm burden (SWB) model that occupies an intermediate place between simpler deterministic mean worm burden models and the very computationally-intensive, autonomous agent models. To refine the accuracy of model predictions, we modified an earlier version of the SWB by incorporating factors representing essential in-host biology (parasite mating, aggregation, density-dependent fecundity, and random egg-release) into demographically structured host communities. We also revised the snail component of the transmission model to reflect a saturable form of human-to-snail transmission. The new model allowed us to realistically simulate overdispersed egg-test results observed in individual-level field data. We further developed a Bayesian-type calibration methodology that accounted for model and data uncertainties. The new model methodology was applied to multi-year, individual-level field data on S. haematobium infections in coastal Kenya. We successfully derived age-specific estimates of worm burden distributions and worm fecundity and crowding functions for children and adults. Estimates from the new SWB model were compared with those from the older, simpler SWB with some substantial differences noted. 
We validated our new SWB estimates in prediction of drug treatment-based control outcomes for a typical Kenyan community. The new version of the SWB model provides a better tool to predict the outcomes of ongoing schistosomiasis control programs. It reflects parasite features that augment and perpetuate transmission, while it also readily incorporates differences in diagnostic testing and human sub-population differences in treatment coverage. Once extended to other Schistosoma species and transmission environments, it will provide a useful and efficient tool for planning control and elimination strategies.
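The "aggregated distribution of human worm burdens" that the SWB model is built to capture is classically described by a negative binomial distribution, whose variance far exceeds its mean (overdispersion), unlike a Poisson distribution. A hypothetical numpy sketch of that contrast, with illustrative parameter values not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(2)

mean_burden = 10.0
k = 0.3              # aggregation parameter (small k = strong clumping)

# Negative binomial with aggregation k and mean m maps onto numpy's
# (n, p) parameterization via n = k and p = k / (k + m).
n_hosts = 100_000
burdens = rng.negative_binomial(k, k / (k + mean_burden), size=n_hosts)
poisson = rng.poisson(mean_burden, size=n_hosts)

vmr_nb = burdens.var() / burdens.mean()       # variance-to-mean ratio
vmr_pois = poisson.var() / poisson.mean()
print(f"NB variance/mean: {vmr_nb:.1f}, Poisson: {vmr_pois:.1f}")
```

A variance-to-mean ratio far above one means a few hosts carry most of the worms, which is why reproducing this aggregation (and the resulting overdispersed egg counts) matters for realistic predictions of diagnosis and control.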

  1. Water resources planning and management : A stochastic dual dynamic programming approach

    NASA Astrophysics Data System (ADS)

    Goor, Q.; Pinte, D.; Tilmant, A.

    2008-12-01

    Allocating water between different users and uses, including the environment, is one of the most challenging tasks facing water resources managers and has always been at the heart of Integrated Water Resources Management (IWRM). As water scarcity is expected to increase over time, allocation decisions among the different uses will have to be made taking into account the complex interactions between water and the economy. Hydro-economic optimization models can capture those interactions while prescribing efficient allocation policies. Many hydro-economic models found in the literature are formulated as large-scale non-linear optimization problems (NLP), seeking to maximize net benefits from the system operation while meeting operational and/or institutional constraints and describing the main hydrological processes. However, those models rarely incorporate the uncertainty inherent to the availability of water, essentially because of the computational difficulties associated with stochastic formulations. This presentation describes a stochastic programming model that can identify economically efficient allocation policies in large-scale multipurpose multireservoir systems. The model is based on stochastic dual dynamic programming (SDDP), an extension of traditional SDP that is not affected by the curse of dimensionality. SDDP identifies efficient allocation policies while considering the hydrologic uncertainty. The objective function includes the net benefits from the hydropower and irrigation sectors, as well as penalties for not meeting operational and/or institutional constraints. To be able to implement the efficient decomposition scheme that removes the computational burden, the one-stage SDDP problem has to be a linear program. Recent developments improve the representation of the non-linear and mildly non-convex hydropower function through a convex hull approximation of the true hydropower function.
This model is illustrated on a cascade of 14 reservoirs on the Nile river basin.
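The core trick that keeps each one-stage SDDP problem a linear program is replacing the concave benefit-to-go function of stored water with an outer approximation built from linear "Benders" cuts. A hypothetical toy sketch in numpy (the benefit function and cut points are invented for illustration):

```python
import numpy as np

def true_future_benefit(s):
    """Toy concave benefit-to-go of stored water (unknown to the solver)."""
    return 10 * np.sqrt(s)

# Sample cuts at a few trial storage levels: each cut is the tangent
# line  b + g * s, which never under-estimates a concave function.
cut_points = np.array([1.0, 4.0, 9.0, 16.0])
gradients = 5.0 / np.sqrt(cut_points)          # derivative of 10*sqrt(s)
intercepts = true_future_benefit(cut_points) - gradients * cut_points

def approx_future_benefit(s):
    """Outer approximation: pointwise minimum over all cuts."""
    return np.min(intercepts + gradients * s)

for s in (2.0, 8.0, 12.0):
    exact, approx = true_future_benefit(s), approx_future_benefit(s)
    assert approx >= exact - 1e-9              # cuts never under-estimate
    print(f"s={s:4.1f}: exact {exact:6.2f}, cut approximation {approx:6.2f}")
```

Because the future benefit enters the one-stage problem only through these linear cuts, the stage problem stays an LP however many reservoirs there are, which is why SDDP escapes the curse of dimensionality that defeats state-discretized SDP.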

  2. A two-step sensitivity analysis for hydrological signatures in Jinhua River Basin, East China

    NASA Astrophysics Data System (ADS)

    Pan, S.; Fu, G.; Chiang, Y. M.; Xu, Y. P.

    2016-12-01

    Owing to model complexity and the large number of parameters, calibration and sensitivity analysis are difficult processes for distributed hydrological models. In this study, a two-step sensitivity analysis approach is proposed for analyzing the hydrological signatures in Jinhua River Basin, East China, using the Distributed Hydrology-Soil-Vegetation Model (DHSVM). A rough sensitivity analysis is first conducted to obtain preliminary influential parameters via Analysis of Variance, greatly reducing the number of parameters from eighty-three to sixteen. Afterwards, the sixteen parameters are further analyzed with a variance-based global sensitivity analysis, i.e., Sobol's sensitivity analysis method, to achieve robust sensitivity rankings and parameter contributions. Parallel computing is applied to reduce the computational burden of the variance-based sensitivity analysis. The results reveal that only a few model parameters are significantly sensitive, including rain LAI multiplier, lateral conductivity, porosity, field capacity, wilting point of clay loam, understory monthly LAI, understory minimum resistance and root zone depths of croplands. Finally, several hydrological signatures are used to investigate the performance of DHSVM. Results show that high values of the efficiency criteria did not guarantee good performance on the hydrological signatures. For most samples from Sobol's sensitivity analysis, water yield was simulated very well; however, the lowest and maximum annual daily runoffs were underestimated, and most of the seven-day minimum runoffs were overestimated. Nevertheless, good performance on these three signatures still exists in a number of samples. Analysis of peak flow shows that small and medium floods are simulated well, while large floods are slightly underestimated. This work supports further multi-objective calibration of the DHSVM model and indicates where to improve the reliability and credibility of model simulation.
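The variance-based (Sobol') step described above can be sketched with the standard pick-freeze estimation scheme: evaluate the model on two independent sample matrices A and B, then on hybrids of the two where a single input column is swapped, and combine the outputs into first-order indices. A hypothetical numpy sketch on a toy additive model whose true indices are known (0.8, 0.2, 0.0), rather than on DHSVM itself:

```python
import numpy as np

rng = np.random.default_rng(3)

def model(X):
    """Toy stand-in for a hydrological model: Y = 4*x1 + 2*x2 + 0*x3."""
    return 4 * X[:, 0] + 2 * X[:, 1] + 0 * X[:, 2]

n, d = 200_000, 3
A = rng.uniform(0, 1, (n, d))
B = rng.uniform(0, 1, (n, d))
yA, yB = model(A), model(B)
var_y = yA.var()

S1 = []
for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                 # swap in column i from B
    yABi = model(ABi)
    # Saltelli estimator for the first-order index of input i
    S1.append(np.mean(yB * (yABi - yA)) / var_y)

print("first-order indices:", [f"{s:.2f}" for s in S1])
```

Each index requires n extra model runs, so the full analysis costs n*(d + 2) evaluations; this multiplicative cost in the number of inputs is exactly why the study screens eighty-three parameters down to sixteen first and parallelizes the remaining runs.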

  3. Cost Sharing, Family Health Care Burden, and the Use of Specialty Drugs for Rheumatoid Arthritis

    PubMed Central

    Karaca-Mandic, Pinar; Joyce, Geoffrey F; Goldman, Dana P; Laouri, Marianne

    2010-01-01

    Objectives To examine the impact of benefit generosity and household health care financial burden on the demand for specialty drugs in the treatment of rheumatoid arthritis (RA). Data Sources/Study Setting Enrollment, claims, and benefit design information for 35 large private employers during 2000–2005. Study Design We estimated multivariate models of the effects of benefit generosity and household financial burden on initiation and continuation of biologic therapies. Data Extraction Methods We defined initiation of biologic therapy as first-time use of etanercept, adalimumab, or infliximab, and we constructed an index of plan generosity based on coverage of biologic therapies in each plan. We estimated the household's burden by summing up the annual out-of-pocket (OOP) expenses of other family members. Principal Findings Benefit generosity affected both the likelihood of initiating a biologic and continuing drug therapy, although the effects were stronger for initiation. Initiation of a biologic was lower in households where other family members incurred high OOP expenses. Conclusions The use of biologic therapy for RA is sensitive to benefit generosity and household financial burden. The increasing use of coinsurance rates for specialty drugs (as under Medicare Part D) raises concern about adverse health consequences. PMID:20831715

  4. INDIVIDUAL-BASED MODELS: POWERFUL OR POWER STRUGGLE?

    PubMed

    Willem, L; Stijven, S; Hens, N; Vladislavleva, E; Broeckhove, J; Beutels, P

    2015-01-01

    Individual-based models (IBMs) offer endless possibilities to explore various research questions but come with high model complexity and computational burden. Large-scale IBMs have become feasible, but the novel hardware architectures require adapted software. The increased model complexity also requires systematic exploration to gain thorough system understanding. We elaborate on the development of IBMs for vaccine-preventable infectious diseases and on model exploration with active learning. Investment in IBM simulator code can lead to significant runtime reductions. We found large performance differences due to data locality: sorting the population once reduced simulation time by a factor of two, and storing person attributes separately instead of using person objects also seemed more efficient. Next, we improved model performance by up to 70% by structuring potential contacts based on health status before processing disease transmission. The active learning approach we present is based on iterative surrogate modelling and model-guided experimentation. Symbolic regression is used for nonlinear response surface modelling with automatic feature selection. We illustrate our approach using an IBM for influenza vaccination. After optimizing the parameter space, we observed an inverse relationship between vaccination coverage and the clinical attack rate reinforced by herd immunity. These insights can be used to focus and optimise research activities, and to reduce both dimensionality and decision uncertainty.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    DOREN,NEALL E.

    Wavefront curvature defocus effects occur in spotlight-mode SAR imagery when reconstructed via the well-known polar-formatting algorithm (PFA) under certain imaging scenarios. These include imaging at close range, using a very low radar center frequency, utilizing high resolution, and/or imaging very large scenes. Wavefront curvature effects arise from the unrealistic assumption of strictly planar wavefronts illuminating the imaged scene. This dissertation presents a method for the correction of wavefront curvature defocus effects under these scenarios, concentrating on the generalized squint-mode imaging scenario and its computational aspects. This correction is accomplished through an efficient one-dimensional, image domain filter applied as a post-processing step to PFA. This post-filter, referred to as SVPF, is precalculated from a theoretical derivation of the wavefront curvature effect and varies as a function of scene location. Prior to SVPF, severe restrictions were placed on the imaged scene size in order to avoid defocus effects under these scenarios when using PFA. The SVPF algorithm eliminates the need for scene size restrictions when wavefront curvature effects are present, correcting for wavefront curvature in broadside as well as squinted collection modes while imposing little additional computational penalty for squinted images. This dissertation covers the theoretical development, implementation and analysis of the generalized, squint-mode SVPF algorithm (of which broadside-mode is a special case) and provides examples of its capabilities and limitations as well as offering guidelines for maximizing its computational efficiency. Tradeoffs between the PFA/SVPF combination and other spotlight-mode SAR image formation techniques are discussed with regard to computational burden, image quality, and imaging geometry constraints.
It is demonstrated that other methods fail to exhibit a clear computational advantage over polar-formatting in conjunction with SVPF. This research concludes that PFA in conjunction with SVPF provides a computationally efficient spotlight-mode image formation solution that solves the wavefront curvature problem for most standoff distances and patch sizes, regardless of squint, resolution or radar center frequency. Additional advantages are that SVPF is not iterative and has no dependence on the visual contents of the scene, resulting in a deterministic computational complexity which typically adds only thirty percent to the overall image formation time.

  6. A Novel Approach for Determining Source-Receptor Relationships of Aerosols in Model Simulations

    NASA Astrophysics Data System (ADS)

    Ma, P.; Gattiker, J.; Liu, X.; Rasch, P. J.

    2013-12-01

    The climate modeling community usually performs sensitivity studies in the 'one-factor-at-a-time' fashion. However, owing to the a priori unknown complexity and nonlinearity of the climate system and simulation response, it is computationally expensive to systematically identify the cause-and-effect of multiple factors in climate models. In this study, we use a Gaussian Process emulator, based on a small number of Community Atmosphere Model Version 5.1 (CAM5) simulations (constrained by meteorological reanalyses) using a Latin Hypercube experimental design, to demonstrate that it is possible to characterize model behavior accurately and very efficiently without any modifications to the model itself. We use the emulator to characterize the source-receptor relationships of black carbon (BC), focusing specifically on describing the constituent burden and surface deposition rates from emissions in various regions. Our results show that the emulator is capable of quantifying the contribution of aerosol burden and surface deposition from different source regions, finding that most of current Arctic BC comes from remote sources. We also demonstrate that the sensitivity of the BC burdens to emission perturbations differs for various source regions. For example, emission growth in Africa, where dry convection is strong, results in a moderate increase of the BC burden over the globe, while the same emission growth in the Arctic leads to a significant increase of local BC burdens and surface deposition rates. These results provide insights into the dynamical, physical, and chemical processes of the climate model, and the conclusions may have policy implications for making cost-effective global and regional pollution management strategies.
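The emulation strategy described above (fit a Gaussian process to a small designed set of expensive model runs, then query the cheap posterior mean as a stand-in for the model) can be sketched in numpy. Everything below is a hypothetical one-dimensional illustration: the "expensive model", design size, and kernel length-scale are invented, not the CAM5 setup.

```python
import numpy as np

def expensive_model(x):
    """Toy stand-in for a climate-model response (e.g. BC burden vs. emission)."""
    return np.sin(2 * np.pi * x) + x

def k(A, B, ell=0.2):
    """Squared-exponential covariance with length-scale ell."""
    return np.exp(-0.5 * (A[:, None] - B[None, :]) ** 2 / ell**2)

X_design = np.linspace(0, 1, 12)           # small experimental design
y = expensive_model(X_design)              # the only "expensive" runs

noise = 1e-8                               # jitter for numerical stability
K = k(X_design, X_design) + noise * np.eye(len(X_design))
alpha = np.linalg.solve(K, y)

def emulator_mean(x):
    """GP posterior mean = cheap surrogate for the expensive model."""
    return k(np.atleast_1d(x), X_design) @ alpha

X_test = np.linspace(0.0, 1.0, 101)
err = np.max(np.abs(emulator_mean(X_test) - expensive_model(X_test)))
print(f"max emulator error: {err:.4f}")
```

In the study's setting the design is a Latin hypercube over many emission factors rather than a 1-D grid, but the payoff is the same: after a handful of full simulations, sensitivity questions are answered by evaluating the emulator, not the climate model.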

  7. Global and regional trends in particulate air pollution and attributable health burden over the past 50 years

    NASA Astrophysics Data System (ADS)

    Butt, E. W.; Turnock, S. T.; Rigby, R.; Reddington, C. L.; Yoshioka, M.; Johnson, J. S.; Regayre, L. A.; Pringle, K. J.; Mann, G. W.; Spracklen, D. V.

    2017-10-01

    Long-term exposure to ambient particulate matter (PM2.5, mass of particles with an aerodynamic dry diameter of < 2.5 μm) is a major risk factor to the global burden of disease. Previous studies have focussed on present day or future health burdens attributed to ambient PM2.5. Few studies have estimated changes in PM2.5 and attributable health burdens over the last few decades, a period where air quality has changed rapidly. Here we used the HadGEM3-UKCA coupled chemistry-climate model, integrated exposure-response relationships, demographic and background disease data to provide the first estimate of the changes in global and regional ambient PM2.5 concentrations and attributable health burdens over the period 1960 to 2009. Over this period, global mean population-weighted PM2.5 concentrations increased by 38%, dominated by increases in China and India. Global attributable deaths increased by 89% to 124% over the period 1960 to 2009, dominated by large increases in China and India. Population growth and ageing contributed mostly to the increases in attributable deaths in China and India, highlighting the importance of demographic trends. In contrast, decreasing PM2.5 concentrations and background disease dominated the reduction in attributable health burden in Europe and the United States. Our results shed light on how future projected trends in demographics and uncertainty in the exposure-response relationship may provide challenges for future air quality policy in Asia.

  8. The Burden of Cancer in Korea during 2012: Findings from a Prevalence-Based Approach.

    PubMed

    Gong, Young Hoon; Yoon, Seok Jun; Jo, Min Woo; Kim, Arim; Kim, Young Ae; Yoon, Jihyun; Seo, Hyeyoung; Kim, Dongwoo

    2016-11-01

    Cancer causes a significant deterioration in health and premature death and is a national socioeconomic burden. This study aimed to measure the burden of cancer using the disability-adjusted life year (DALY) metric based on the newly adopted methodology from the Global Burden of Disease Study in 2010. This study was conducted based on data from the Korean National Cancer Registry. The DALYs were calculated using a prevalence-based method instead of the incidence-based method used by previous studies. The total burden of cancer in 2012 was 3,470.79 DALYs per 100,000 persons. Lung cancer carried the largest burden, followed by liver, stomach, colorectal, and breast cancer. The DALYs for lung, liver, stomach, colon and rectum, and pancreatic cancer were high in men, whereas the DALYs for breast, lung, stomach, colorectal, and liver cancer were high in women. Health loss from leukemia and cancer of the brain and nervous system was prevalent for those younger than age 20; from stomach, breast, and liver for those aged 30-50; and from lung, colon and rectum, and pancreas for a large proportion of individuals over the age of 60. The most important differences were that the DALYs were calculated by prevalence and that other components of the DALYs were measured from a population-based perspective. Thus, prevalence-based DALYs could provide more suitable data for decision making in the healthcare field.
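The prevalence-based DALY arithmetic the study adopts combines years of life lost to premature mortality (YLL) with years lived with disability (YLD), where YLD is computed from prevalent cases rather than incident cases. A minimal sketch of that calculation; all numbers below are illustrative placeholders, not the Korean registry data:

```python
# Hypothetical inputs for one cancer site in one population
deaths = 1_000                    # cancer deaths in the population
life_expectancy_at_death = 15.0   # average residual life expectancy (years)
yll = deaths * life_expectancy_at_death

prevalent_cases = 20_000          # people living with the cancer
disability_weight = 0.29          # severity on the 0 (full health) to 1 (death) scale
yld = prevalent_cases * disability_weight   # prevalence-based YLD

daly = yll + yld
population = 1_000_000
daly_per_100k = daly * 100_000 / population
print(f"DALY = {daly:,.0f}  ({daly_per_100k:,.0f} per 100,000)")
```

Under the incidence-based alternative, YLD would instead multiply new cases by disability weight and average duration; using prevalence ties the estimate to the people currently living with disease, which is the methodological difference the abstract highlights.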

  9. Life Course Challenges Faced by Siblings of Individuals with Schizophrenia May Increase Risk for Depressive Symptoms.

    PubMed

    Smith, Matthew J; Greenberg, Jan S; Sciortino, Sarah A; Sandoval, Gisela M; Lukens, Ellen P

    Research suggests siblings of individuals with schizophrenia are at a heightened risk for depressive symptomatology. Research has not yet examined whether the strains of growing up with a brother or sister with schizophrenia contribute to this risk. This study examined whether early life course burdens associated with an emerging mental illness, and current objective and subjective caregiver burden predicted depressive symptoms in siblings of individuals with schizophrenia. Forty-one siblings of individuals with schizophrenia were recruited from a large study of schizophrenia neurobiology to complete a self-administered questionnaire and a neuropsychological test battery. Early life course burdens and current objective and subjective burdens explained incremental variance in depressive symptoms of siblings of individuals with schizophrenia after accounting for gender and global neurocognitive function. Higher levels of depressive symptoms among siblings were associated with perceptions of being stigmatized by the community (β=.37, p <.01), and perceiving that the brother or sister's emerging illness negatively impacted the sibling's social life during childhood and adolescence (β=.39, p <.01). Taking on adult responsibilities while the sibling was growing up was found to be protective against depressive symptoms in adulthood (β= -.36, p <.01). Early life course burdens associated with having a sibling with schizophrenia and current subjective burden provide insight into psychosocial factors that may contribute to the risk for depression in this sibling group. Mental health service providers and psychoeducation programs would benefit by considering these factors when developing family-based interventions.

  10. Tuberculosis DALY-Gap: Spatial and Quantitative Comparison of Disease Burden Across Urban Slum and Non-slum Census Tracts.

    PubMed

    Marlow, Mariel A; Maciel, Ethel Leonor Noia; Sales, Carolina Maia Martins; Gomes, Teresa; Snyder, Robert E; Daumas, Regina Paiva; Riley, Lee W

    2015-08-01

    To quantitatively assess disease burden due to tuberculosis between populations residing in and outside of urban informal settlements in Rio de Janeiro, Brazil, we compared disability-adjusted life years (DALYs), or "DALY-gap." Using the 2010 Brazilian census definition of informal settlements as aglomerados subnormais (AGSN), we allocated tuberculosis (TB) DALYs to AGSN vs non-AGSN census tracts based on geocoded addresses of TB cases reported to the Brazilian Information System for Notifiable Diseases in 2005 and 2010. DALYs were calculated based on the 2010 Global Burden of Disease methodology. DALY-gap was calculated as the difference between age-adjusted DALYs/100,000 population between AGSN and non-AGSN. Total TB DALY in Rio in 2010 was 16,731 (266 DALYs/100,000). DALYs were higher in AGSN census tracts (306 vs 236 DALYs/100,000), yielding a DALY-gap of 70 DALYs/100,000. Attributable DALY fraction for living in an AGSN was 25.4%. DALY-gap was highest for males 40-59 years of age (501 DALYs/100,000) and in census tracts with <60% electricity (12,327 DALYs/100,000). DALY-gap comparison revealed spatial and quantitative differences in TB burden between slum vs non-slum census tracts that were not apparent using traditional measures of incidence and mortality. This metric could be applied to compare TB burden or burden for other diseases in mega-cities with large informal settlements for more targeted resource allocation and evaluation of intervention programs.
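    The DALY-gap metric itself is simple arithmetic over the study's reported age-adjusted rates; note that the study's 25.4% attributable fraction additionally weights by population, which this sketch omits:

```python
# Age-adjusted TB DALYs per 100,000, as reported in the abstract.
daly_agsn = 306.0       # slum (AGSN) census tracts
daly_non_agsn = 236.0   # non-slum census tracts

# DALY-gap: absolute difference in age-adjusted rates.
daly_gap = daly_agsn - daly_non_agsn

# Crude fraction of the AGSN rate explained by the gap; the study's
# 25.4% attributable fraction additionally weights by population.
crude_fraction = daly_gap / daly_agsn
```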

  11. Factors influencing aquatic-to-terrestrial contaminant transport to terrestrial arthropod consumers in a multiuse river system.

    PubMed

    Alberts, Jeremy M; Sullivan, S Mažeika P

    2016-06-01

    Emerging aquatic insects are important vectors of contaminant transfer from aquatic to terrestrial food webs. However, the environmental factors that regulate contaminant body burdens in nearshore terrestrial consumers remain largely unexplored. We investigated the relative influences of riparian landscape composition (i.e., land use and nearshore vegetation structure) and contaminant flux via the emergent aquatic insect subsidy on selenium (Se) and mercury (Hg) body burdens of riparian ants (Formica subsericea) and spiders of the family Tetragnathidae along 11 river reaches spanning an urban-rural land-use gradient in Ohio, USA. Model-selection results indicated that fine-scale land cover (e.g., riparian zone width, shrub cover) in the riparian zone was positively associated with reach-wide body burdens of Se and Hg in both riparian F. subsericea and tetragnathid spiders (i.e., the total magnitude of Hg and Se concentrations in ant and spider populations, respectively, for each reach). River distance downstream of Columbus, Ohio - where study reaches were impounded and flowed through a large urban center - was also implicated as an important factor. Although stable-isotope analysis suggested that emergent aquatic insects were likely vectors of Se and Hg to tetragnathid spiders (but not to F. subsericea), emergent insect contaminant flux was not a significant predictor of reach-wide spider body burdens of either Hg or Se. Improved understanding of the pathways and influences that control aquatic-to-terrestrial contaminant transport will be critical for effective risk management and remediation. Copyright © 2016 Elsevier Ltd. All rights reserved.

  12. Relationships among Symptom Management Burden, Coping Responses, and Caregiver Psychological Distress at End of Life.

    PubMed

    Washington, Karla T; Wilkes, Chelsey M; Rakes, Christopher R; Otten, Sheila J; Parker Oliver, Debra; Demiris, George

    2018-05-04

    Family caregivers (FCGs) face numerous stressors and are at heightened risk of psychological distress. While theoretical explanations exist linking caregiving stressors with outcomes such as anxiety and depression, limited testing of these theories has occurred among FCGs of patients nearing the end of life. Researchers sought to evaluate mediational relationships among burden experienced by hospice FCGs because of symptom management demands, caregivers' coping responses, and caregivers' psychological distress. Quantitative data for this descriptive exploratory study were collected through survey. Hypothesized relationships among caregiver variables were examined with structural equation modeling. Respondents were FCGs (N = 228) of hospice patients receiving services from a large, non-profit community hospice in the Mid-Southern United States. Burden associated with managing hospice patients' psychological symptoms was shown to predict psychological distress for FCGs. Caregivers' use of escape-avoidance coping responses mediated this relationship. Results suggest that FCGs would benefit from additional tools to address patients' psychological symptoms at end of life. When faced with psychological symptom management burden, caregivers need a range of coping skills as alternatives to escape-avoidance coping.

  13. Vaccination against herpes zoster in developed countries: state of the evidence.

    PubMed

    Drolet, Mélanie; Oxman, Michael N; Levin, Myron J; Schmader, Kenneth E; Johnson, Robert W; Patrick, David; Mansi, James A; Brisson, Marc

    2013-05-01

    Although progress has been made in the treatment of herpes zoster (HZ) and postherpetic neuralgia (PHN), available therapeutic options are only partially effective. Given evidence that a live-attenuated varicella-zoster-virus vaccine is effective at reducing the incidence of HZ, PHN and the burden of illness, policymakers and clinicians are being asked to make recommendations regarding the use of the zoster vaccine. In this report, we summarize the evidence regarding the: (1) burden of illness; (2) vaccine efficacy and safety; and (3) cost-effectiveness of vaccination, to assist evidence-based policy making and guide clinicians in their recommendations. First, there is general agreement that the overall burden of illness associated with HZ and PHN is substantial. Second, the safety and efficacy of the zoster vaccine at reducing the burden of illness due to HZ and the incidence of PHN have been clearly demonstrated in large placebo-controlled trials. However, uncertainty remains about the vaccine's duration of protection. Third, vaccination against HZ is likely to be cost-effective when the vaccine is given at approximately 65 y of age, if vaccine duration is longer than 10 y.

  14. Vaccination against herpes zoster in developed countries

    PubMed Central

    Drolet, Mélanie; Oxman, Michael N.; Levin, Myron J.; Schmader, Kenneth E.; Johnson, Robert W.; Patrick, David; Mansi, James A.; Brisson, Marc

    2013-01-01

    Although progress has been made in the treatment of herpes zoster (HZ) and postherpetic neuralgia (PHN), available therapeutic options are only partially effective. Given evidence that a live-attenuated varicella-zoster-virus vaccine is effective at reducing the incidence of HZ, PHN and the burden of illness, policymakers and clinicians are being asked to make recommendations regarding the use of the zoster vaccine. In this report, we summarize the evidence regarding the: (1) burden of illness; (2) vaccine efficacy and safety; and (3) cost-effectiveness of vaccination, to assist evidence-based policy making and guide clinicians in their recommendations. First, there is general agreement that the overall burden of illness associated with HZ and PHN is substantial. Second, the safety and efficacy of the zoster vaccine at reducing the burden of illness due to HZ and the incidence of PHN have been clearly demonstrated in large placebo-controlled trials. However, uncertainty remains about the vaccine’s duration of protection. Third, vaccination against HZ is likely to be cost-effective when the vaccine is given at approximately 65 y of age, if vaccine duration is longer than 10 y. PMID:23324598

  15. Fast Algorithms for Designing Unimodular Waveform(s) With Good Correlation Properties

    NASA Astrophysics Data System (ADS)

    Li, Yongzhe; Vorobyov, Sergiy A.

    2018-03-01

    In this paper, we develop new fast and efficient algorithms for designing single/multiple unimodular waveforms/codes with good auto- and cross-correlation or weighted correlation properties, which are highly desired in radar and communication systems. The waveform design is based on the minimization of the integrated sidelobe level (ISL) and weighted ISL (WISL) of waveforms. As the corresponding optimization problems can quickly grow to large scale with increasing code length and number of waveforms, the main issue becomes the development of fast large-scale optimization techniques. A further difficulty is that the corresponding optimization problems are non-convex, while the required accuracy is high. Therefore, we formulate the ISL and WISL minimization problems as non-convex quartic optimization problems in the frequency domain, and then simplify them into quadratic problems by means of the majorization-minimization technique, which is one of the basic techniques for addressing large-scale and/or non-convex optimization problems. While designing our fast algorithms, we identify and exploit inherent algebraic structures in the objective functions to rewrite them into quartic forms and, in the case of WISL minimization, to derive an alternative quartic form which permits the quartic-quadratic transformation. Our algorithms are applicable to large-scale unimodular waveform design problems, as they are proved to have lower or comparable computational burden (analyzed theoretically) and faster convergence speed (confirmed by comprehensive simulations) than the state-of-the-art algorithms. In addition, the waveforms designed by our algorithms demonstrate better correlation properties than their counterparts.
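    The ISL objective these algorithms minimize can be stated directly for a single code. The brute-force evaluation below is a sketch of the metric only, not the paper's fast FFT-based algorithm, and the quadratic-phase polyphase code used as input is a hypothetical example:

```python
import numpy as np

def isl(code):
    """Integrated sidelobe level of a single unimodular code: the sum of
    |r(k)|^2 over all nonzero lags k of the aperiodic autocorrelation."""
    n = len(code)
    r = np.correlate(code, code, mode="full")  # lags -(n-1) .. (n-1)
    sidelobes = np.delete(r, n - 1)            # drop the zero-lag peak
    return float(np.sum(np.abs(sidelobes) ** 2))

# A hypothetical quadratic-phase polyphase code (unit-modulus entries).
n = 64
phases = np.pi * np.arange(n) * (np.arange(n) + 1) / n
code = np.exp(1j * phases)
code_isl = isl(code)
```

    Evaluating the objective this way costs O(N^2) per code; the paper's contribution is precisely to avoid such direct computation at scale via frequency-domain reformulation and majorization-minimization.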

  16. Software Infrastructure for Computer-aided Drug Discovery and Development, a Practical Example with Guidelines.

    PubMed

    Moretti, Loris; Sartori, Luca

    2016-09-01

    In the field of Computer-Aided Drug Discovery and Development (CADDD), a proper software infrastructure is essential for everyday investigations. The creation of such an environment should be carefully planned and implemented with certain features in order to be productive and efficient. Here we describe a solution to integrate standard computational services into a functional unit that empowers modelling applications for drug discovery. This system allows users with various levels of expertise to run in silico experiments automatically and without the burden of file formatting for different software, managing the actual computation, keeping track of the activities, and graphical rendering of the structural outcomes. To showcase the potential of this approach, the performance of five different docking programs on an HIV-1 protease test set is presented. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. Cognition, glucose metabolism and amyloid burden in Alzheimer’s disease

    PubMed Central

    Furst, Ansgar J.; Rabinovici, Gil D.; Rostomian, Ara H.; Steed, Tyler; Alkalay, Adi; Racine, Caroline; Miller, Bruce L.; Jagust, William J.

    2010-01-01

    We investigated relationships between glucose metabolism, amyloid load and measures of cognitive and functional impairment in Alzheimer’s disease (AD). Patients meeting criteria for probable AD underwent [11C]PIB and [18F]FDG PET imaging and were assessed on a set of clinical measures. PIB Distribution volume ratios and FDG scans were spatially normalized and average PIB counts from regions-of-interest (ROI) were used to compute a measure of global PIB uptake. Separate voxel-wise regressions explored local and global relationships between metabolism, amyloid burden and clinical measures. Regressions reflected cognitive domains assessed by individual measures, with visuospatial tests associated with more posterior metabolism, and language tests associated with metabolism in the left hemisphere. Correlating regional FDG uptake with these measures confirmed these findings. In contrast, no correlations were found between either voxel-wise or regional PIB uptake and any of the clinical measures. Finally, there were no associations between regional PIB and FDG uptake. We conclude that regional and global amyloid burden does not correlate with clinical status or glucose metabolism in AD. PMID:20417582

  18. A computational tool integrating host immunity with antibiotic dynamics to study tuberculosis treatment.

    PubMed

    Pienaar, Elsje; Cilfone, Nicholas A; Lin, Philana Ling; Dartois, Véronique; Mattila, Joshua T; Butler, J Russell; Flynn, JoAnne L; Kirschner, Denise E; Linderman, Jennifer J

    2015-02-21

    While active tuberculosis (TB) is a treatable disease, many complex factors prevent its global elimination. Part of the difficulty in developing optimal therapies is the large design space of antibiotic doses, regimens and combinations. Computational models that capture the spatial and temporal dynamics of antibiotics at the site of infection can aid in reducing the design space of costly and time-consuming animal pre-clinical and human clinical trials. The site of infection in TB is the granuloma, a collection of immune cells and bacteria that form in the lung, and new data suggest that penetration of drugs throughout granulomas is problematic. Here we integrate our computational model of granuloma formation and function with models for plasma pharmacokinetics, lung tissue pharmacokinetics and pharmacodynamics for two first line anti-TB antibiotics. The integrated model is calibrated to animal data. We make four predictions. First, antibiotics are frequently below effective concentrations inside granulomas, leading to bacterial growth between doses and contributing to the long treatment periods required for TB. Second, antibiotic concentration gradients form within granulomas, with lower concentrations toward their centers. Third, during antibiotic treatment, bacterial subpopulations are similar for INH and RIF treatment: mostly intracellular with extracellular bacteria located in areas non-permissive for replication (hypoxic areas), presenting a slowly increasing target population over time. Finally, we find that on an individual granuloma basis, pre-treatment infection severity (including bacterial burden, host cell activation and host cell death) is predictive of treatment outcome. Copyright © 2014 Elsevier Ltd. All rights reserved.

  19. A computational tool integrating host immunity with antibiotic dynamics to study tuberculosis treatment

    DOE PAGES

    Pienaar, Elsje; Cilfone, Nicholas A.; Lin, Philana Ling; ...

    2014-12-08

    While active tuberculosis (TB) is a treatable disease, many complex factors prevent its global elimination. Part of the difficulty in developing optimal therapies is the large design space of antibiotic doses, regimens and combinations. Computational models that capture the spatial and temporal dynamics of antibiotics at the site of infection can aid in reducing the design space of costly and time-consuming animal pre-clinical and human clinical trials. The site of infection in TB is the granuloma, a collection of immune cells and bacteria that form in the lung, and new data suggest that penetration of drugs throughout granulomas is problematic. In this paper, we integrate our computational model of granuloma formation and function with models for plasma pharmacokinetics, lung tissue pharmacokinetics and pharmacodynamics for two first line anti-TB antibiotics. The integrated model is calibrated to animal data. We make four predictions. First, antibiotics are frequently below effective concentrations inside granulomas, leading to bacterial growth between doses and contributing to the long treatment periods required for TB. Second, antibiotic concentration gradients form within granulomas, with lower concentrations toward their centers. Third, during antibiotic treatment, bacterial subpopulations are similar for INH and RIF treatment: mostly intracellular with extracellular bacteria located in areas non-permissive for replication (hypoxic areas), presenting a slowly increasing target population over time. In conclusion, we find that on an individual granuloma basis, pre-treatment infection severity (including bacterial burden, host cell activation and host cell death) is predictive of treatment outcome.

  20. COMETS2: An advanced MATLAB toolbox for the numerical analysis of electric fields generated by transcranial direct current stimulation.

    PubMed

    Lee, Chany; Jung, Young-Jin; Lee, Sang Jun; Im, Chang-Hwan

    2017-02-01

    Since there is no way to measure the electric current generated by transcranial direct current stimulation (tDCS) inside the human head through in vivo experiments, numerical analysis based on the finite element method has been widely used to estimate the electric field inside the head. In 2013, we released a MATLAB toolbox named COMETS, which has been used by a number of groups and has helped researchers gain insight into the electric field distribution during stimulation. The aim of this study was to develop an advanced MATLAB toolbox, named COMETS2, for the numerical analysis of the electric field generated by tDCS. COMETS2 can generate rectangular pad electrodes of any size at any position on the scalp surface. To reduce the large computational burden incurred when repeatedly testing multiple electrode locations and sizes, a new technique to decompose the global stiffness matrix was proposed. As examples of potential applications, we observed the effects of electrode sizes and displacements on the results of the electric field analysis. The proposed mesh decomposition method significantly enhanced the overall computational efficiency. We implemented an automatic electrode modeler for the first time, and proposed a new technique to enhance the computational efficiency. In this paper, an efficient toolbox for tDCS analysis is introduced (freely available at http://www.cometstool.com). It is expected that COMETS2 will be a useful toolbox for researchers who want to benefit from the numerical analysis of electric fields generated by tDCS. Copyright © 2016. Published by Elsevier B.V.

  1. A computational tool integrating host immunity with antibiotic dynamics to study tuberculosis treatment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pienaar, Elsje; Cilfone, Nicholas A.; Lin, Philana Ling

    While active tuberculosis (TB) is a treatable disease, many complex factors prevent its global elimination. Part of the difficulty in developing optimal therapies is the large design space of antibiotic doses, regimens and combinations. Computational models that capture the spatial and temporal dynamics of antibiotics at the site of infection can aid in reducing the design space of costly and time-consuming animal pre-clinical and human clinical trials. The site of infection in TB is the granuloma, a collection of immune cells and bacteria that form in the lung, and new data suggest that penetration of drugs throughout granulomas is problematic. In this paper, we integrate our computational model of granuloma formation and function with models for plasma pharmacokinetics, lung tissue pharmacokinetics and pharmacodynamics for two first line anti-TB antibiotics. The integrated model is calibrated to animal data. We make four predictions. First, antibiotics are frequently below effective concentrations inside granulomas, leading to bacterial growth between doses and contributing to the long treatment periods required for TB. Second, antibiotic concentration gradients form within granulomas, with lower concentrations toward their centers. Third, during antibiotic treatment, bacterial subpopulations are similar for INH and RIF treatment: mostly intracellular with extracellular bacteria located in areas non-permissive for replication (hypoxic areas), presenting a slowly increasing target population over time. In conclusion, we find that on an individual granuloma basis, pre-treatment infection severity (including bacterial burden, host cell activation and host cell death) is predictive of treatment outcome.

  2. Detecting epistasis with the marginal epistasis test in genetic mapping studies of quantitative traits

    PubMed Central

    Zeng, Ping; Mukherjee, Sayan; Zhou, Xiang

    2017-01-01

    Epistasis, commonly defined as the interaction between multiple genes, is an important genetic component underlying phenotypic variation. Many statistical methods have been developed to model and identify epistatic interactions between genetic variants. However, because of the large combinatorial search space of interactions, most epistasis mapping methods face enormous computational challenges and often suffer from low statistical power due to multiple test correction. Here, we present a novel, alternative strategy for mapping epistasis: instead of directly identifying individual pairwise or higher-order interactions, we focus on mapping variants that have non-zero marginal epistatic effects—the combined pairwise interaction effects between a given variant and all other variants. By testing marginal epistatic effects, we can identify candidate variants that are involved in epistasis without the need to identify the exact partners with which the variants interact, thus potentially alleviating much of the statistical and computational burden associated with standard epistatic mapping procedures. Our method is based on a variance component model, and relies on a recently developed variance component estimation method for efficient parameter inference and p-value computation. We refer to our method as the “MArginal ePIstasis Test”, or MAPIT. With simulations, we show how MAPIT can be used to estimate and test marginal epistatic effects, produce calibrated test statistics under the null, and facilitate the detection of pairwise epistatic interactions. We further illustrate the benefits of MAPIT in a QTL mapping study by analyzing the gene expression data of over 400 individuals from the GEUVADIS consortium. PMID:28746338

  3. GPU-accelerated Lattice Boltzmann method for anatomical extraction in patient-specific computational hemodynamics

    NASA Astrophysics Data System (ADS)

    Yu, H.; Wang, Z.; Zhang, C.; Chen, N.; Zhao, Y.; Sawchuk, A. P.; Dalsing, M. C.; Teague, S. D.; Cheng, Y.

    2014-11-01

    Existing research on patient-specific computational hemodynamics (PSCH) relies heavily on software for the anatomical extraction of blood arteries. Data reconstruction and mesh generation have to be done using existing commercial software due to the gap between medical image processing and CFD, which increases the computational burden and introduces inaccuracy during data transformation, thus limiting the medical applications of PSCH. We use the lattice Boltzmann method (LBM) to solve the level-set equation over an Eulerian distance field and implicitly and dynamically segment the artery surfaces from radiological CT/MRI imaging data. The segments feed seamlessly into the LBM-based CFD computation of PSCH, so that explicit mesh construction and extra data management are avoided. The LBM is ideally suited to GPU (graphics processing unit)-based parallel computing. Parallel acceleration on the GPU achieves excellent performance in PSCH computation. An application study will be presented which segments an aortic artery from a chest CT dataset and models the PSCH of the segmented artery.

  4. Severe Traumatic Brain Injury

    MedlinePlus

    ... but it also has a large societal and economic toll. The estimated economic cost of TBI in 2010, including direct and ... P, Miller T and associates. The Incidence and Economic Burden of Injuries in the United States. New ...

  5. Confronting the “Indian summer monsoon response to black carbon aerosol” with the uncertainty in its radiative forcing and beyond

    DOE PAGES

    Kovilakam, Mahesh; Mahajan, Salil

    2016-06-28

    While black carbon aerosols (BC) are believed to modulate the Indian monsoons, the radiative forcing estimate of BC suffers from large uncertainties globally. In this paper, we analyze a suite of idealized experiments forced with a range of BC concentrations that span a large swath of the latest estimates of its global radiative forcing. Within those bounds of uncertainty, summer precipitation over the Indian region increases nearly linearly with the increase in BC burden. The linearity holds even as the BC concentration is increased to levels resembling those hypothesized in nuclear winter scenarios, despite large surface cooling over India and adjoining regions. The enhanced monsoonal circulation is associated with a linear increase in the large-scale meridional tropospheric temperature gradient. The precipitable water over the region also increases linearly with an increase in BC burden, due to increased moisture transport from the Arabian sea to the land areas. The wide range of Indian monsoon response elicited in these experiments emphasizes the need to reduce the uncertainty in BC estimates to accurately quantify their role in modulating the Indian monsoons. Finally, the increase in monsoonal circulation in response to large BC concentrations contrasts earlier findings that the Indian summer monsoon may break down following a nuclear war.

  6. Confronting the “Indian summer monsoon response to black carbon aerosol” with the uncertainty in its radiative forcing and beyond

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kovilakam, Mahesh; Mahajan, Salil

    While black carbon aerosols (BC) are believed to modulate the Indian monsoons, the radiative forcing estimate of BC suffers from large uncertainties globally. In this paper, we analyze a suite of idealized experiments forced with a range of BC concentrations that span a large swath of the latest estimates of its global radiative forcing. Within those bounds of uncertainty, summer precipitation over the Indian region increases nearly linearly with the increase in BC burden. The linearity holds even as the BC concentration is increased to levels resembling those hypothesized in nuclear winter scenarios, despite large surface cooling over India and adjoining regions. The enhanced monsoonal circulation is associated with a linear increase in the large-scale meridional tropospheric temperature gradient. The precipitable water over the region also increases linearly with an increase in BC burden, due to increased moisture transport from the Arabian sea to the land areas. The wide range of Indian monsoon response elicited in these experiments emphasizes the need to reduce the uncertainty in BC estimates to accurately quantify their role in modulating the Indian monsoons. Finally, the increase in monsoonal circulation in response to large BC concentrations contrasts earlier findings that the Indian summer monsoon may break down following a nuclear war.

  7. Further reduction of minimal first-met bad markings for the computationally efficient synthesis of a maximally permissive controller

    NASA Astrophysics Data System (ADS)

    Liu, GaiYun; Chao, Daniel Yuh

    2015-08-01

    To date, research on supervisor design for flexible manufacturing systems has focused on speeding up the computation of optimal (maximally permissive) liveness-enforcing controllers. Recent deadlock prevention policies for systems of simple sequential processes with resources (S3PR) reduce the computational burden by considering only the minimal portion of all first-met bad markings (FBMs). Maximal permissiveness is ensured by not forbidding any live state. This paper proposes a method to further reduce the size of the minimal set of FBMs, so as to solve the integer linear programming problems efficiently while maintaining maximal permissiveness, using a vector-covering approach. This paper improves on previous work and achieves the simplest structure with the minimal number of monitors.

  8. Solving subsurface structural problems using a computer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Witte, D.M.

    1987-02-01

    Until recently, the solution of subsurface structural problems has required a combination of graphical construction, trigonometry, time, and patience. Recent advances in software available for both mainframe and microcomputers now reduce the time and potential error of these calculations by an order of magnitude. Software for analysis of deviated wells, three point problems, apparent dip, apparent thickness, and the intersection of two planes, as well as the plotting and interpretation of these data can be used to allow timely and accurate exploration or operational decisions. The available computer software provides a set of utilities, or tools, rather than a comprehensive, intelligent system. The burden for selection of appropriate techniques, computation methods, and interpretations still lies with the explorationist user.

  9. Automatic detection of new tumors and tumor burden evaluation in longitudinal liver CT scan studies.

    PubMed

    Vivanti, R; Szeskin, A; Lev-Cohain, N; Sosna, J; Joskowicz, L

    2017-11-01

    Radiological longitudinal follow-up of liver tumors in CT scans is the standard of care for disease progression assessment and for liver tumor therapy. Finding new tumors in the follow-up scan is essential to determine malignancy, to evaluate the total tumor burden, and to determine treatment efficacy. Since new tumors are typically small, they may be missed by examining radiologists. We describe a new method for the automatic detection and segmentation of new tumors in longitudinal liver CT studies and for liver tumor burden quantification. Its inputs are the baseline and follow-up CT scans, the baseline tumor delineations, and a tumor appearance prior model. Its outputs are the new tumor segmentations in the follow-up scan, the tumor burden quantification in both scans, and the tumor burden change. Our method is the first comprehensive method explicitly designed to find new liver tumors. It integrates information from the scans, the baseline known tumor delineations, and a tumor appearance prior model in the form of a global convolutional neural network classifier. Unlike other deep learning-based methods, it does not require large tagged training sets. Our experimental results on 246 tumors, of which 97 were new tumors, from 37 longitudinal liver CT studies with radiologist-approved ground-truth segmentations, yield a true-positive new-tumor detection rate of 86% versus 72% with stand-alone detection, and a tumor burden volume overlap error of 16%. New tumor detection and tumor burden volumetry are important for diagnosis and treatment. Our new method enables a simplified, radiologist-friendly workflow that is potentially more accurate and reliable than the existing one by automatically and accurately following known tumors and detecting new tumors in the follow-up scan.
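    Total tumor burden and its change between scans reduce to summing per-tumor volumes; the per-tumor volumes below are purely illustrative, not from the study's dataset:

```python
# Hypothetical per-tumor volumes (cm^3) delineated in each scan; the
# values are illustrative only, not from the study's 37-case dataset.
baseline_volumes = [4.2, 1.1, 0.8]
followup_volumes = [3.9, 1.4, 0.9, 0.6]   # last entry: a new tumor

burden_baseline = sum(baseline_volumes)
burden_followup = sum(followup_volumes)
burden_change = burden_followup - burden_baseline
new_tumor_volume = followup_volumes[-1]
```

    The hard part the paper addresses is producing the follow-up delineations themselves, including the new tumor, automatically; once segmentations exist, burden quantification is straightforward bookkeeping.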

  10. Poverty-related and neglected diseases - an economic and epidemiological analysis of poverty relatedness and neglect in research and development.

    PubMed

    von Philipsborn, Peter; Steinbeis, Fridolin; Bender, Max E; Regmi, Sadie; Tinnemann, Peter

    2015-01-01

    Economic growth in low- and middle-income countries (LMIC) has raised interest in how disease burden patterns are related to economic development. Meanwhile, poverty-related diseases are considered to be neglected in terms of research and development (R&D). We develop intuitive and meaningful metrics to measure how different diseases are related to poverty and how neglected they are in the current R&D system. We measured how diseases are related to economic development with the income relation factor (IRF), defined as the ratio of disability-adjusted life-years (DALYs) per 100,000 inhabitants in LMIC to that in high-income countries. We calculated the IRF for 291 diseases and injuries and 67 risk factors included in the Global Burden of Disease Study 2010. We measured neglect in R&D with the neglect factor (NF), defined as the ratio of disease burden in DALYs (as a percentage of the total global disease burden) to R&D expenditure (as a percentage of total global health-related R&D expenditure) for 26 diseases. The disease burden varies considerably with the level of economic development, as shown by the IRF (median: 1.38; interquartile range (IQR): 0.79-6.3). Comparison of IRFs from 1990 to 2010 highlights general patterns of the global epidemiological transition. The 26 poverty-related diseases included in our analysis of neglect in R&D are responsible for 13.8% of the global disease burden, but receive only 1.34% of global health-related R&D expenditure. Within this group, the NF varies considerably (median: 19; IQR: 6-52). The IRF is an intuitive and meaningful metric to highlight shifts in global disease burden patterns. A large shortfall exists in global R&D spending for poverty-related and neglected diseases, with strong variation between diseases.
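    Both metrics are plain ratios, so they can be computed directly from GBD-style inputs. A minimal sketch (the function names are ours; the group-level check reuses the figures quoted in the abstract):

```python
def income_relation_factor(dalys_per_100k_lmic, dalys_per_100k_hic):
    """IRF: per-capita disease burden in LMIC relative to high-income countries."""
    return dalys_per_100k_lmic / dalys_per_100k_hic

def neglect_factor(burden_share_pct, rd_share_pct):
    """NF: share of global disease burden divided by share of global R&D spending."""
    return burden_share_pct / rd_share_pct

# group-level figures from the abstract: 13.8% of burden vs 1.34% of R&D spending
group_nf = neglect_factor(13.8, 1.34)  # ~10.3 for the 26 diseases combined
```

    An IRF above 1 marks a disease concentrated in LMIC; an NF above 1 marks a disease funded below its burden share.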

  11. Volumetric bone mineral density (vBMD), bone structure, and structural geometry among rural South Indian, US Caucasian, and Afro-Caribbean older men.

    PubMed

    Jammy, Guru Rajesh; Boudreau, Robert M; Singh, Tushar; Sharma, Pawan Kumar; Ensrud, Kristine; Zmuda, Joseph M; Reddy, P S; Newman, Anne B; Cauley, Jane A

    2018-05-22

    Peripheral quantitative computed tomography (pQCT) provides biomechanical estimates of bone strength. Rural South Indian men have reduced biomechanical indices of bone strength compared to US Caucasian and Afro-Caribbean men, suggesting an underlying higher risk of osteoporotic fractures and a greater future fracture burden among rural South Indian men. Geographical and racial comparisons of bone mineral density (BMD) have largely focused on DXA measures of areal BMD. In contrast, pQCT measures volumetric BMD (vBMD) and bone structural geometry, and provides estimates of biomechanical strength. To further understand potential geographical and racial differences in skeletal health, we compared pQCT measures among US Caucasian, Afro-Caribbean, and rural South Indian men. We studied men aged ≥ 60 years enrolled in the Mobility and Independent Living among Elders Study (MILES) in rural South India (N = 245), the Osteoporotic Fractures in Men Study (MrOS) in the US (N = 1148), and the Tobago Bone Health Study (N = 828). The BMI (kg/m²) of rural South Indian men (21.6) was significantly lower than that of the US Caucasian (28) and Afro-Caribbean men (26.9). Adjusting for age, height, body weight, and grip strength, rural South Indian men compared to US Caucasians had significantly lower trabecular vBMD [-1.3 to -1.5 standard deviations (SD)] and cortical thickness [-0.8 to -1.2 SD], significantly higher endosteal circumference [0.5 to 0.8 SD], but similar cortical vBMD. Afro-Caribbean men compared to US Caucasians had similar trabecular vBMD but significantly higher cortical vBMD [0.9 to 1.2 SD], SSIp [0.2 to 1.4 SD], and tibial endosteal circumference [1 SD]. In conclusion, compared to US Caucasians, rural South Indian men have reduced bone strength (lower trabecular vBMD) and Afro-Caribbean men have greater bone strength (higher cortical vBMD). These results suggest an underlying higher risk of osteoporotic fractures and a greater future fracture burden among rural South Indian men.

  12. Internet Architecture: Lessons Learned and Looking Forward

    DTIC Science & Technology

    2006-12-01

    Geoffrey G. Xie, Department of Computer Science, Naval Postgraduate School, April 2006.

  13. Intelligent Command and Control Demonstration Setup and Presentation Instructions

    DTIC Science & Technology

    2017-12-01

    by Laurel C Sadler and Somiya Metu, Computational and Information Sciences Directorate, ARL.

  14. Unitary Transformations in 3D Vector Representation of Qutrit States

    DTIC Science & Technology

    2018-03-12

    Vinod K Mishra, Computational and Information Sciences Directorate, ARL. Approved for public release.

  15. Coexistence of Named Data Networking (NDN) and Software-Defined Networking (SDN)

    DTIC Science & Technology

    2017-09-01

    by Vinod Mishra, Computational and Information Sciences Directorate, ARL.

  16. Effects of an Automated Telephone Support System on Caregiver Burden and Anxiety: Findings from the REACH for TLC Intervention Study

    ERIC Educational Resources Information Center

    Mahoney, Diane Feeney; Tarlow, Barbara J.; Jones, Richard N.

    2003-01-01

    Purpose: We determine the main outcome effects of a 12-month computer-mediated automated interactive voice response (IVR) intervention designed to assist family caregivers managing persons with disruptive behaviors related to Alzheimer's disease (AD). Design and Methods: We conducted a randomized controlled study of 100 caregivers, 51 in the usual…

  17. High Energy Computed Tomographic Inspection of Munitions

    DTIC Science & Technology

    2016-11-01

    Contents: System Background; Unique Features; Scattering Estimating Device; Distortion and Geometric Calibration.

  18. Time Orientation and Human Performance

    DTIC Science & Technology

    2004-06-01

    Work with Computing Systems 2004. H.M. Khalid, M.G. Helander, A.W. Yeo (Eds.). Kuala Lumpur: Damai Sciences. With increased globalization, understanding the various cultures and people's attitudes and behaviours is crucial…

  19. The emerging role of large eddy simulation in industrial practice: challenges and opportunities.

    PubMed

    Hutton, A G

    2009-07-28

    That class of methods for treating turbulence gathered under the banner of large eddy simulation is poised to enter mainstream engineering practice. There is a growing body of evidence that such methods offer a significant stretch in industrial capability over solely Reynolds-averaged Navier-Stokes (RANS)-based modelling. A key enabling development will be the adaptation of innovative processor architectures, resulting from the huge investment in the gaming industry, to engineering analysis. This promises to reduce the computational burden to practicable levels. However, there are many lessons to be learned from the history of the past three decades. These lessons should be analysed in order to inform, if not modulate, the unfolding of this next cycle in the development of industrial modelling capability. This provides the theme for this paper, which is written very much from the standpoint of the informed practitioner rather than the innovator; someone with a strong motivation to improve significantly the competence with which industrial turbulent flows are treated. It is asserted that the reliable deployment of the methodology in the industrial context will prove to be a knowledge-based discipline, as was the case with RANS-based modelling, if not more so. The community at large should collectively make great efforts to put in place that knowledge base from which best practice advice can be derived at the very start of this cycle of advancement and continue to enrich it as the cycle progresses.

  20. Impact of Dietary and Metabolic Risk Factors on Cardiovascular and Diabetes Mortality in South Asia: Analysis From the 2010 Global Burden of Disease Study.

    PubMed

    Yakoob, Mohammad Y; Micha, Renata; Khatibzadeh, Shahab; Singh, Gitanjali M; Shi, Peilin; Ahsan, Habibul; Balakrishna, Nagalla; Brahmam, Ginnela N V; Chen, Yu; Afshin, Ashkan; Fahimi, Saman; Danaei, Goodarz; Powles, John W; Ezzati, Majid; Mozaffarian, Dariush

    2016-12-01

    To quantify cardiovascular disease and diabetes deaths attributable to dietary and metabolic risks by country, age, sex, and time in South Asian countries, we used the 2010 Global Burden of Disease national surveys to characterize risk factor levels by age and sex. We derived the etiological effects of each risk factor on disease endpoints, by age, from meta-analyses, and defined optimal exposure levels. We combined these inputs with cause-specific mortality rates to compute population-attributable fractions as a percentage of total cardiometabolic deaths. Suboptimal diet was the leading cause of cardiometabolic mortality in 4 of 5 countries, with population-attributable fractions ranging from 40.7% (95% uncertainty interval = 37.4, 44.1) in Bangladesh to 56.9% (95% uncertainty interval = 52.4, 61.5) in Pakistan. High systolic blood pressure was the second leading cause, except in Bangladesh, where it superseded suboptimal diet. These were followed in all nations by high fasting plasma glucose, low fruit intake, and low whole grain intake. Other prominent burdens were more variable, such as low intake of vegetables, low omega-3 fats, and high sodium intake in India, Nepal, and Pakistan. Important similarities and differences are evident in the cardiometabolic mortality burdens of modifiable dietary and metabolic risks across these countries, informing health policy and program priorities.
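    For a single dichotomous risk factor, the population-attributable fraction follows Levin's classic formula; the GBD comparative risk assessment generalizes this to continuous exposure distributions and counterfactual levels, but the core arithmetic looks like the sketch below (the example numbers are illustrative, not the study's):

```python
def paf(prevalence, relative_risk):
    """Levin's population-attributable fraction for a dichotomous risk factor."""
    excess = prevalence * (relative_risk - 1.0)
    return excess / (excess + 1.0)

def attributable_deaths(prevalence, relative_risk, cause_deaths):
    """Deaths that would be averted if exposure dropped to the optimal level."""
    return paf(prevalence, relative_risk) * cause_deaths

# e.g. 30% exposure prevalence with RR = 2.0 attributes ~23% of cause deaths
share = paf(0.3, 2.0)
```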

  1. Perspectives on diagnostic strategies for hyperglycemia in pregnancy - Dealing with the barriers and challenges in South Asia.

    PubMed

    Kapur, Anil; Divakar, Hema; Seshiah, Veeraswamy

    2018-02-02

    Estimates indicate that South Asia accounts for over two fifths of the global burden of hyperglycemia in pregnancy (HIP), and the ongoing nutritional and epidemiological transition may make the situation worse. Given their higher risk, all women of South Asian descent need to be tested for HIP. With approximately 37 million births annually in the region, roughly 37 million women must be tested each year, placing a huge burden on the region's fragile, inadequately resourced health systems, where awareness is poor and trained manpower is lacking. Recommendations for testing must therefore be pragmatic, feasible, convenient, and cost effective. The Diabetes in Pregnancy Study Group India (DIPSI) has proposed a simple testing protocol that is endorsed by the Indian national guideline on GDM and by the FIGO guideline on HIP for use in South Asia. This testing protocol has received widespread support in the region. Despite the many challenges, it is encouraging that in the four large countries in the region - Bangladesh, India, Pakistan and Sri Lanka - which account for over 80% of the estimated burden of HIP in South Asia, large-scale credible programs have been initiated to address the identified barriers. Copyright © 2018. Published by Elsevier B.V.

  2. Technical aspects of CT imaging of the spine.

    PubMed

    Tins, Bernhard

    2010-11-01

    This review article discusses technical aspects of computed tomography (CT) imaging of the spine. Patient positioning, and its influence on image quality and movement artefact, is discussed. Particular emphasis is placed on the choice of scan parameters and their relation to image quality and radiation burden to the patient. Strategies to reduce radiation burden and artefact from metal implants are outlined. Data acquisition, processing, image display and steps to reduce artefact are reviewed. CT imaging of the spine is put into context with other imaging modalities for specific clinical indications or problems. This article aims to explain the underlying principles of image acquisition and to provide a rough guide for clinical problems without being prescriptive. Individual practice will always vary and reflect differences in local experience, technical provisions and clinical requirements.

  3. Estimating the Global Burden of Endemic Canine Rabies

    PubMed Central

    Hampson, Katie; Coudeville, Laurent; Lembo, Tiziana; Sambo, Maganga; Kieffer, Alexia; Attlan, Michaël; Barrat, Jacques; Blanton, Jesse D.; Briggs, Deborah J.; Cleaveland, Sarah; Costa, Peter; Freuling, Conrad M.; Hiby, Elly; Knopf, Lea; Leanes, Fernando; Meslin, François-Xavier; Metlin, Artem; Miranda, Mary Elizabeth; Müller, Thomas; Nel, Louis H.; Recuenco, Sergio; Rupprecht, Charles E.; Schumacher, Carolin; Taylor, Louise; Vigilato, Marco Antonio Natal; Zinsstag, Jakob; Dushoff, Jonathan

    2015-01-01

    Background Rabies is a notoriously underreported and neglected disease of low-income countries. This study aims to estimate the public health and economic burden of rabies circulating in domestic dog populations, globally and on a country-by-country basis, allowing an objective assessment of how much this preventable disease costs endemic countries. Methodology/Principal Findings We established relationships between rabies mortality and rabies prevention and control measures, which we incorporated into a model framework. We used data derived from extensive literature searches and questionnaires on disease incidence, control interventions and preventative measures within this framework to estimate the disease burden. The burden of rabies impacts on public health sector budgets, local communities and livestock economies, with the highest risk of rabies in the poorest regions of the world. This study estimates that globally canine rabies causes approximately 59,000 (95% Confidence Intervals: 25-159,000) human deaths, over 3.7 million (95% CIs: 1.6-10.4 million) disability-adjusted life years (DALYs) and 8.6 billion USD (95% CIs: 2.9-21.5 billion) economic losses annually. The largest component of the economic burden is due to premature death (55%), followed by direct costs of post-exposure prophylaxis (PEP, 20%) and lost income whilst seeking PEP (15.5%), with only limited costs to the veterinary sector due to dog vaccination (1.5%), and additional costs to communities from livestock losses (6%). Conclusions/Significance This study demonstrates that investment in dog vaccination, the single most effective way of reducing the disease burden, has been inadequate and that the availability and affordability of PEP needs improving. Collaborative investments by medical and veterinary sectors could dramatically reduce the current large, and unnecessary, burden of rabies on affected communities. 
Improved surveillance is needed to reduce uncertainty in burden estimates and to monitor the impacts of control efforts. PMID:25881058
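    The component shares reported above translate into absolute costs by simple multiplication. A sketch using the abstract's figures (the listed shares sum to 98%; the small remainder is not itemized in the abstract):

```python
# shares of the ~8.6 billion USD annual economic burden, as reported above
components = {
    "premature death": 0.55,
    "direct PEP costs": 0.20,
    "lost income while seeking PEP": 0.155,
    "livestock losses": 0.06,
    "dog vaccination (veterinary sector)": 0.015,
}
total_usd_bn = 8.6
breakdown_usd_bn = {k: round(v * total_usd_bn, 2) for k, v in components.items()}
# premature death alone accounts for roughly 4.7 billion USD per year
```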

  4. An Evidence-Based Practical Approach to Pediatric Otolaryngology in the Developing World.

    PubMed

    Belcher, Ryan H; Molter, David W; Goudy, Steven L

    2018-06-01

    Despite humanitarian otolaryngology groups traveling in record numbers to resource-limited areas to treat pediatric otolaryngology disease processes and train local providers, a large burden of unmet need remains. Little information has been published from the developing world from an otolaryngology standpoint. As would be expected, the information that is available concerns some of the most common pediatric otolaryngology diseases and surgical burdens, including childhood hearing loss, otitis media, adenotonsillectomies, airway obstructions requiring tracheostomies, foreign body aspirations, and craniomaxillofacial surgeries, including cleft lip and palate. Copyright © 2018 Elsevier Inc. All rights reserved.

  5. Intelligent Space Tube Optimization for speeding ground water remedial design.

    PubMed

    Kalwij, Ineke M; Peralta, Richard C

    2008-01-01

    An innovative Intelligent Space Tube Optimization (ISTO) two-stage approach facilitates solving complex nonlinear flow and contaminant transport management problems. It reduces the computational effort of designing optimal ground water remediation systems and strategies for an assumed set of wells. ISTO's stage 1 defines an adaptive mobile space tube, with overlapping multidimensional subspaces, that lengthens toward the optimal solution. Stage 1 generates several strategies within the space tube, trains neural surrogate simulators (NSS) using the limited space tube data, and optimizes using an advanced genetic algorithm (AGA) with the NSS. Stage 1 speeds the evaluation of assumed well locations and combinations. For a large complex plume of solvents and explosives, ISTO stage 1 reaches within 10% of the optimal solution 25% faster than an efficient AGA coupled with comprehensive tabu search (AGCT) does by itself. ISTO input parameters include the space tube radius and the number of strategies used to train the NSS per cycle. Larger radii can speed convergence to optimality for optimization runs that achieve it, but might increase the number of runs required to reach it. ISTO stage 2 automatically refines the stage 1 NSS-AGA optimal strategy using heuristic optimization (we used AGCT) without NSS surrogates, exploring the entire solution space. ISTO is applicable in many heuristic optimization settings in which the numerical simulator is computationally intensive and one would like to reduce that burden.
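    The sample-fit-step skeleton of stage 1 can be sketched compactly. The real method trains neural surrogate simulators inside the space tube and searches them with a genetic algorithm; the sketch below substitutes a linear least-squares surrogate and a plain descent step, so everything in it (the objective, step rule, and parameters) is illustrative only:

```python
import numpy as np

def tube_surrogate_step(f, x, radius, n_train=30, rng=None):
    """One stage-1-style cycle: sample candidate strategies inside the tube
    segment around x, fit a cheap surrogate, and move against its gradient."""
    rng = rng or np.random.default_rng(0)
    P = x + rng.uniform(-radius, radius, size=(n_train, x.size))
    y = np.array([f(p) for p in P])
    D = np.hstack([np.ones((n_train, 1)), P - x])   # design: [1, displacement]
    coef, *_ = np.linalg.lstsq(D, y, rcond=None)    # surrogate: c + g . d
    g = coef[1:]
    step = -0.25 * g
    n = np.linalg.norm(step)
    if n > radius:                                   # stay inside the tube
        step *= radius / n
    return x + step

# toy "remediation cost" with optimum at (3, -2); the tube walks toward it
cost = lambda p: (p[0] - 3.0) ** 2 + (p[1] + 2.0) ** 2
x = np.zeros(2)
rng = np.random.default_rng(0)
for _ in range(40):
    x = tube_surrogate_step(cost, x, radius=0.5, rng=rng)
```

    The appeal, as in ISTO, is that the expensive simulator is called only on the small training sample per cycle while the surrogate absorbs the search effort.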

  6. An Improved Nested Sampling Algorithm for Model Selection and Assessment

    NASA Astrophysics Data System (ADS)

    Zeng, X.; Ye, M.; Wu, J.; WANG, D.

    2017-12-01

    The multimodel strategy is a general approach for treating model structure uncertainty in recent research. The unknown groundwater system is represented by several plausible conceptual models, each assigned a weight that represents the plausibility of that model. In the Bayesian framework, the posterior model weight is computed as the product of the model's prior weight and its marginal likelihood (also termed model evidence). As a result, estimating marginal likelihoods is crucial for reliable model selection and assessment in multimodel analysis. The nested sampling estimator (NSE) is a newly proposed algorithm for marginal likelihood estimation. NSE searches the parameter space gradually from low-likelihood to high-likelihood regions, and this evolution proceeds iteratively via a local sampling procedure, so the efficiency of NSE is dominated by the strength of that local sampling procedure. Currently, the Metropolis-Hastings (M-H) algorithm and its variants are often used for local sampling in NSE. However, M-H is not an efficient sampling algorithm for high-dimensional or complex likelihood functions. To improve the performance of NSE, it is feasible to integrate a more efficient and elaborate sampling algorithm, DREAMzs, into the local sampling. In addition, to overcome the computational burden of the many repeated model executions required in marginal likelihood estimation, an adaptive sparse grid stochastic collocation method is used to build surrogates for the original groundwater model.
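    The NSE loop itself is short: live points shrink the enclosed prior mass geometrically while the worst point is replaced by a draw from the likelihood-constrained prior, here via a short Metropolis-style random walk (the component DREAMzs would replace). Everything below, including the toy 1-D Gaussian likelihood under a uniform prior on [0, 1], is an illustrative sketch, not the paper's groundwater setup:

```python
import math, random

def log_likelihood(theta):
    # toy likelihood: Gaussian bump at 0.5 under a U(0, 1) prior (assumed example)
    return -0.5 * ((theta - 0.5) / 0.1) ** 2

def logaddexp(a, b):
    if a == -math.inf:
        return b
    m = max(a, b)
    return m + math.log(math.exp(a - m) + math.exp(b - m))

def nested_sampling(n_live=100, n_iter=1000, seed=0):
    """Estimate the log marginal likelihood (evidence) by nested sampling."""
    rng = random.Random(seed)
    live = [rng.random() for _ in range(n_live)]
    log_z = -math.inf
    log_width = math.log(1.0 - math.exp(-1.0 / n_live))  # first prior-mass shell
    for _ in range(n_iter):
        worst = min(live, key=log_likelihood)
        log_l_min = log_likelihood(worst)
        log_z = logaddexp(log_z, log_width + log_l_min)   # accumulate evidence
        log_width -= 1.0 / n_live                         # shells shrink geometrically
        # replace the worst point via a walk constrained to L(theta) > L(worst)
        theta = rng.choice(live)
        for _ in range(20):
            prop = theta + rng.gauss(0.0, 0.05)
            if 0.0 <= prop <= 1.0 and log_likelihood(prop) > log_l_min:
                theta = prop
        live[live.index(worst)] = theta
    return log_z

log_z_hat = nested_sampling()  # true value is ln(0.1 * sqrt(2*pi)) ~ -1.38
```

    The quality of the constrained replacement step is exactly where a stronger local sampler pays off, which is the paper's motivation for DREAMzs.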

  7. Efficient simulation of intrinsic, extrinsic and external noise in biochemical systems.

    PubMed

    Pischel, Dennis; Sundmacher, Kai; Flassig, Robert J

    2017-07-15

    Biological cells operate in a noisy regime influenced by intrinsic, extrinsic and external noise, which leads to large differences between individual cell states. Stochastic effects must be taken into account to characterize biochemical kinetics accurately. Since the exact solution of the chemical master equation, which governs the underlying stochastic process, cannot be derived for most biochemical systems, approximate methods are used instead. In this study, a method to efficiently simulate the various sources of noise simultaneously is proposed and benchmarked on several examples. The method combines the sigma point approach, which describes extrinsic and external variability, with the τ-leaping algorithm, which accounts for the stochasticity of probabilistic reactions. Comparison of our method to extensive Monte Carlo calculations demonstrates an immense computational advantage at an acceptable loss of accuracy. Additionally, we show its application to parameter optimization problems in stochastic biochemical reaction networks, which is rarely attempted due to the huge computational burden. To give further insight, a MATLAB script is provided that applies the proposed method to a simple toy example of gene expression. MATLAB code is available at Bioinformatics online. flassig@mpi-magdeburg.mpg.de. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
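    The intrinsic-noise half of such a scheme, τ-leaping, advances the state by Poisson-distributed reaction counts over a fixed leap. A self-contained sketch on a toy constitutive gene-expression model (production and degradation of one protein; the sigma-point treatment of extrinsic and external noise is omitted, and the rate constants are made up):

```python
import math, random

def _poisson(rng, lam):
    # Knuth's method; adequate for the small leap rates in this sketch
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

def tau_leap_birth_death(k_prod=10.0, k_deg=0.1, x0=0, t_end=100.0, tau=0.5, seed=1):
    """Tau-leaping for X -> X+1 at rate k_prod and X -> X-1 at rate k_deg*X."""
    rng = random.Random(seed)
    x, t = x0, 0.0
    while t < t_end:
        n_prod = _poisson(rng, k_prod * tau)   # births during the leap
        n_deg = _poisson(rng, k_deg * x * tau)  # deaths during the leap
        x = max(0, x + n_prod - n_deg)
        t += tau
    return x

x_final = tau_leap_birth_death()  # fluctuates around k_prod/k_deg = 100 copies
```

    Compared with an exact Gillespie simulation, one leap replaces many individual reaction events, which is where the speedup over brute-force Monte Carlo comes from.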

  8. Sequential Design of Experiments to Maximize Learning from Carbon Capture Pilot Plant Testing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Soepyan, Frits B.; Morgan, Joshua C.; Omell, Benjamin P.

    Pilot plant test campaigns can be expensive and time-consuming. Therefore, it is of interest to maximize the amount of learning and the efficiency of the test campaign given the limited number of experiments that can be conducted. This work investigates the use of sequential design of experiments (SDOE) to overcome these challenges by demonstrating its usefulness for a recent solvent-based CO2 capture plant test campaign. Unlike traditional design of experiments methods, SDOE regularly uses information from ongoing experiments to determine the optimum locations in the design space for subsequent runs within the same experiment. However, there are challenges that need to be addressed, including reducing the high computational burden of efficiently updating the model, and the need to incorporate the methodology into a computational tool. We address these challenges by applying SDOE in combination with a software tool, the Framework for Optimization, Quantification of Uncertainty and Surrogates (FOQUS) (Miller et al., 2014a, 2016, 2017). The results of applying SDOE to a pilot plant test campaign for CO2 capture suggest that, relative to traditional design of experiments methods, SDOE can more effectively reduce the uncertainty of the model, thus decreasing technical risk. Future work includes integrating SDOE into FOQUS and using SDOE to support additional large-scale pilot plant test campaigns.
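    The heart of an SDOE step is choosing the next run where the current model is most uncertain. A minimal stand-in, assuming a linear regression model whose prediction variance at a candidate x is proportional to x'(X'X)⁻¹x over the runs X completed so far (FOQUS uses much richer uncertainty machinery; the function and data below are illustrative):

```python
import numpy as np

def next_run(X_done, candidates):
    """Return the candidate run with the largest linear-model prediction variance."""
    XtX_inv = np.linalg.inv(X_done.T @ X_done + 1e-9 * np.eye(X_done.shape[1]))
    scores = np.einsum("ij,jk,ik->i", candidates, XtX_inv, candidates)
    return candidates[int(np.argmax(scores))]

# three completed runs clustered at low settings (column 0 is the intercept);
# the far-out candidate is the most informative next experiment
done = np.array([[1.0, 0.1, 0.1],
                 [1.0, 0.2, 0.1],
                 [1.0, 0.1, 0.2]])
cand = np.array([[1.0, 0.15, 0.15],
                 [1.0, 0.90, 0.90]])
chosen = next_run(done, cand)
```

    After the chosen run is executed, its result is appended to the design and the selection repeats, which is the "sequential" part of SDOE.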

  9. Nonuniform update for sparse target recovery in fluorescence molecular tomography accelerated by ordered subsets.

    PubMed

    Zhu, Dianwen; Li, Changqing

    2014-12-01

    Fluorescence molecular tomography (FMT) is a promising imaging modality that has been actively studied in the past two decades, since it can locate a specific tumor position three-dimensionally in small animals. However, it remains challenging to obtain fast, robust and accurate reconstructions of the fluorescent probe distribution in small animals due to the large computational burden, the noisy measurements and the ill-posed nature of the inverse problem. In this paper we propose a nonuniform preconditioning method in combination with L1 regularization and an ordered subsets technique (NUMOS) to accommodate the different updating needs of different pixels, to enhance sparsity and suppress noise, and to further boost the convergence of approximate solutions for fluorescence molecular tomography. Using both simulated data and a phantom experiment, we found that the proposed nonuniform updating method outperforms its popular uniform counterpart by obtaining a more localized, less noisy, more accurate image, while greatly reducing computational cost. The ordered subsets (OS) technique provided additional 5-fold and 3-fold speedups for the simulation and phantom experiments, respectively, without degrading image quality. When compared with popular L1 algorithms such as the iterative soft-thresholding algorithm (ISTA) and the fast iterative soft-thresholding algorithm (FISTA), NUMOS also obtains a better image in a much shorter time.

  10. Energy-Aware Computation Offloading of IoT Sensors in Cloudlet-Based Mobile Edge Computing.

    PubMed

    Ma, Xiao; Lin, Chuang; Zhang, Han; Liu, Jianwei

    2018-06-15

    Mobile edge computing is proposed as a promising computing paradigm to relieve the excessive burden on data centers and mobile networks induced by the rapid growth of the Internet of Things (IoT). This work introduces a cloud-assisted multi-cloudlet framework to provision scalable services in cloudlet-based mobile edge computing. Because of the constrained computation resources of cloudlets and the limited communication resources of wireless access points (APs), IoT sensors that make identical computation offloading decisions interact with each other. To optimize the processing delay and energy consumption of computation tasks, this paper presents a theoretical analysis of the computation offloading decision problem of IoT sensors. In more detail, the problem is formulated as a computation offloading game, and the condition for Nash equilibrium is derived using the theory of potential games. By exploiting the finite improvement property of the game, the Computation Offloading Decision (COD) algorithm is designed to provide decentralized computation offloading strategies for IoT sensors. Simulation results demonstrate that the COD algorithm can significantly reduce the system cost compared with the random-selection algorithm and the cloud-first algorithm. Furthermore, the COD algorithm scales well with an increasing number of IoT sensors.
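    The finite improvement property guarantees that repeated unilateral best responses terminate at a Nash equilibrium, which is how such a decentralized algorithm can be organized. A toy sketch with a made-up congestion-style cost model (offloading cost grows with the number of sensors sharing the AP; this is not the paper's COD cost function):

```python
def best_response_dynamics(local_cost, base_offload_cost, congestion, max_rounds=100):
    """Best-response iteration for a congestion-style offloading game.

    decision[i] = 1 means sensor i offloads to the cloudlet, 0 means it
    computes locally; offloading cost rises linearly with the number of
    offloaders sharing the AP, making this a potential game, so the loop
    stops at a Nash equilibrium."""
    n = len(local_cost)
    decision = [0] * n
    for _ in range(max_rounds):
        changed = False
        for i in range(n):
            others = sum(decision) - decision[i]          # other offloaders
            offload_cost = base_offload_cost[i] + congestion * (others + 1)
            best = 1 if offload_cost < local_cost[i] else 0
            if best != decision[i]:
                decision[i] = best
                changed = True
        if not changed:                                    # no unilateral improvement
            break
    return decision

# two sensors with expensive local computation offload; two cheap ones stay local
eq = best_response_dynamics(
    local_cost=[10.0, 10.0, 4.0, 3.0],
    base_offload_cost=[1.0, 1.0, 1.0, 1.0],
    congestion=2.0,
)
```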

  11. The health and economic burden of chickenpox and herpes zoster in Belgium.

    PubMed

    Bilcke, J; Ogunjimi, B; Marais, C; de Smet, F; Callens, M; Callaert, K; van Kerschaver, E; Ramet, J; van Damme, P; Beutels, P

    2012-11-01

    Varicella-zoster virus causes chickenpox (CP) and, after reactivation, herpes zoster (HZ). Vaccines are available against both diseases, warranting an assessment of the pre-vaccination burden of disease. We collected data from relevant Belgian databases and performed five surveys of CP and HZ patients. The rates at which a general practitioner is visited at least once for CP and HZ are 346 and 378/100 000 person-years, respectively. The average CP and HZ hospitalization rates are 5·3 and 14·2/100 000 person-years, respectively. The direct medical cost of HZ is about twice that of CP. The quality-adjusted life years lost by ambulatory CP patients consulting a physician are more than double those of patients not consulting a physician (0·010 vs. 0·004). In conclusion, both diseases cause a substantial burden in Belgium.

  12. General Revenue Financing of Medicare: Who Will Bear the Burden?

    PubMed Central

    Johnson, Janet L.; Long, Stephen H.

    1982-01-01

    Two recent national advisory committees on Social Security recommended major shifts in Medicare financing to preserve the financial viability of the Social Security trust funds. This paper estimates the income redistribution consequences of the two proposals, in contrast to current law, using a micro-simulation model of taxes and premiums. These estimates show that while the current Medicare financing package is mildly progressive, the new proposals would substantially increase income redistribution under the program. Two insights provided by separate estimates, for families headed by the elderly (persons age 65 or over) versus those headed by the non-elderly, are: 1) the surprisingly large Medicare tax burdens on families headed by the elderly under the current financing package of payroll taxes, general revenues, and enrollee premiums; and 2) the substantial increases in these burdens under proposed shifts toward increased general revenue financing. PMID:10309601

  13. Coping with Prescription Drug Cost Sharing: Knowledge, Adherence, and Financial Burden

    PubMed Central

    Reed, Mary; Brand, Richard; Newhouse, Joseph P; Selby, Joe V; Hsu, John

    2008-01-01

    Objective Assess patient knowledge of and response to drug cost sharing. Study Setting Adult members of a large prepaid, integrated delivery system. Study Design/Data Collection Telephone interviews with 932 participants (72 percent response rate) who reported knowledge of the structures and amounts of their prescription drug cost sharing. Participants reported cost-related changes in their drug adherence, any financial burden, and other cost-coping behaviors. Actual cost sharing amounts came from administrative databases. Principal Findings Overall, 27 percent of patients knew all of their drug cost sharing structures and amounts. After adjustment for individual characteristics, additional patient cost sharing structures (tiers and caps), and higher copayment amounts were associated with reporting decreased adherence, financial burden, or other cost-coping behaviors. Conclusions Patient knowledge of their drug benefits is limited, especially for more complex cost sharing structures. Patients also report a range of responses to greater cost sharing, including decreasing adherence. PMID:18370979

  14. Global Sensitivity Analysis for Large-scale Socio-hydrological Models using the Cloud

    NASA Astrophysics Data System (ADS)

    Hu, Y.; Garcia-Cabrejo, O.; Cai, X.; Valocchi, A. J.; Dupont, B.

    2014-12-01

    In the context of coupled human and natural systems (CHNS), incorporating human factors into water resource management provides an opportunity to understand the interactions between human and environmental systems. A multi-agent system (MAS) model is designed to couple with the physically-based Republican River Compact Administration (RRCA) groundwater model, in an attempt to understand the declining water table and base flow in the heavily irrigated Republican River basin. For the MAS model, we defined five behavioral parameters (κ_pr, ν_pr, κ_prep, ν_prep and λ) to characterize an agent's pumping behavior given the uncertainties of future crop prices and precipitation. κ and ν describe the agent's beliefs in its prior knowledge of the mean and variance of crop prices (κ_pr, ν_pr) and precipitation (κ_prep, ν_prep), and λ describes the agent's attitude towards fluctuations in crop profits. Note that these behavioral parameters, which serve as inputs to the MAS model, are highly uncertain and often not directly measurable. We therefore estimate the influence of these behavioral parameters on the coupled models using Global Sensitivity Analysis (GSA). In this paper, we address two main challenges arising from GSA with such a large-scale socio-hydrological model by using Hadoop-based cloud computing techniques and a Polynomial Chaos Expansion (PCE) based variance decomposition approach. As a result, 1,000 scenarios of the coupled models are completed within two hours under the Hadoop framework, rather than the roughly 28 days required to run those scenarios sequentially. Based on the model results, GSA using PCE is able to measure the impacts of the spatial and temporal variations of these behavioral parameters on crop profits and the water table, and thus identifies two influential parameters, κ_pr and λ. The major contribution of this work is a methodological framework for the application of GSA to large-scale socio-hydrological models. This framework seeks a balance between the heavy computational burden of model execution and the number of model evaluations required by GSA, through a combination of Hadoop-based cloud computing, which evaluates the socio-hydrological model efficiently, and PCE, whose coefficients yield the sensitivity indices directly.
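    The PCE step above is what makes the variance decomposition cheap: once the expansion coefficients are fitted, Sobol' sensitivity indices follow from simple sums of squared coefficients, with no further model runs. A minimal sketch (coefficients and multi-indices invented for illustration, not taken from the RRCA study):

    ```python
    # Hypothetical sketch: Sobol' indices from PCE coefficients.
    # For an orthonormal PCE f(x) = sum_k c_k * Psi_k(x), the variance
    # decomposes as Var = sum_{k != 0} c_k**2; the first-order index of
    # input i sums the squared coefficients of terms involving i only.

    # Multi-indices (per-input polynomial degree of each basis term) and
    # coefficients below are illustrative, not fitted values.
    terms = [
        ((0, 0), 1.20),   # mean term (excluded from the variance)
        ((1, 0), 0.80),   # depends on input 1 only
        ((2, 0), 0.10),
        ((0, 1), 0.40),   # depends on input 2 only
        ((1, 1), 0.05),   # interaction term
    ]

    var = sum(c**2 for idx, c in terms if any(idx))

    def first_order(i):
        """S_i: fraction of variance from terms involving input i alone."""
        num = sum(c**2 for idx, c in terms
                  if idx[i] > 0
                  and all(d == 0 for j, d in enumerate(idx) if j != i))
        return num / var

    def total_order(i):
        """S_Ti: fraction of variance from all terms involving input i."""
        num = sum(c**2 for idx, c in terms if idx[i] > 0)
        return num / var

    print(round(first_order(0), 3), round(total_order(0), 3))
    ```

    The expensive part is fitting the coefficients, which is exactly what the parallel Hadoop runs supply; the indices themselves then cost nothing extra.
    
    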

  15. Lightweight fuzzy processes in clinical computing.

    PubMed

    Hurdle, J F

    1997-09-01

    In spite of advances in computing hardware, many hospitals still have a hard time finding extra capacity in their production clinical information systems to run artificial intelligence (AI) modules, for example: to support real-time drug-drug or drug-lab interaction checks; to track infection trends; to monitor compliance with case-specific clinical guidelines; or to monitor/control biomedical devices such as an intelligent ventilator. Historically, adding AI functionality was not a major design concern when a typical clinical system was originally specified. AI technology is usually retrofitted 'on top of the old system' or run 'off line' in tandem with it, to ensure that the routine workload still gets done with as little impact from the AI side as possible. Compounding the burden on system performance, most institutions have witnessed a long and increasing trend in intramural and extramural reporting (e.g. the collection of data for a quality-control report in microbiology, or a meta-analysis of a suite of coronary artery bypass graft techniques), and these place an ever-growing burden on the typical computer system's performance. We discuss a promising approach to adding extra AI processing power to a heavily-used system based on the notion of 'lightweight fuzzy processing' (LFP), that is, fuzzy modules designed from the outset to impose a small computational load. A formal model for a useful subclass of fuzzy systems is defined below and used as a framework for the automated generation of LFPs. By reducing the arithmetic complexity of the model (a hand-crafted process) and the data complexity of the model (an automated process), we show how LFPs can be generated for three sample datasets of clinical relevance.
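    A 'lightweight' fuzzy module of this kind can be little more than a handful of piecewise-linear membership functions and a weighted-average defuzzifier. The sketch below is illustrative only: the clinical thresholds and the alert rule are invented, and the paper's formal LFP model is not reproduced here.

    ```python
    # Minimal sketch of a lightweight fuzzy module (illustrative only).
    # Triangular membership functions and a tiny rule base keep the
    # arithmetic cheap enough for a heavily loaded clinical system.

    def tri(x, a, b, c):
        """Triangular membership: rises from a to b, falls from b to c."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    def creatinine_alert(creatinine_mg_dl):
        """Fuzzy degree-of-alert for a hypothetical renal-function monitor."""
        normal = tri(creatinine_mg_dl, 0.2, 0.9, 1.4)
        elevated = tri(creatinine_mg_dl, 1.0, 1.8, 2.6)
        high = tri(creatinine_mg_dl, 2.0, 4.0, 12.0)
        # Rule strengths map directly to output levels 0.0 / 0.5 / 1.0;
        # defuzzify with a weighted average (a cheap centroid surrogate).
        w = [(normal, 0.0), (elevated, 0.5), (high, 1.0)]
        total = sum(m for m, _ in w)
        return sum(m * v for m, v in w) / total if total else 0.0

    print(round(creatinine_alert(2.2), 2))
    ```

    The whole evaluation is a few comparisons, multiplies, and one division per input, which is the kind of arithmetic simplicity the LFP idea trades on.
    
    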

  16. Computers in medicine: liability issues for physicians.

    PubMed

    Hafner, A W; Filipowicz, A B; Whitely, W P

    1989-07-01

    Physicians routinely use computers to store, access, and retrieve medical information. As computer use becomes even more widespread in medicine, failure to utilize information systems may be seen as a violation of professional custom and lead to findings of professional liability. Even when a technology is not widespread, failure to incorporate it into medical practice may give rise to liability if the technology is accessible to the physician and reduces risk to the patient. Improvement in the availability of medical information sources imposes a greater burden on the physician to keep current and to obtain informed consent from patients. To routinely perform computer-assisted literature searches for informed consent and diagnosis is 'good medicine'. Clinical and diagnostic applications of computer technology now include computer-assisted decision making with the aid of sophisticated databases. Although such systems will expand the knowledge base and competence of physicians, malfunctioning software raises a major liability question. Also, complex computer-driven technology is used in direct patient care. Defective or improperly used hardware or software can lead to patient injury, thus raising additional complicated questions of professional liability and product liability.

  17. Disease and Health Inequalities Attributable to Air Pollutant Exposure in Detroit, Michigan

    PubMed Central

    Milando, Chad W.; Williams, Guy O.; Batterman, Stuart A.

    2017-01-01

    The environmental burden of disease is the mortality and morbidity attributable to exposure to air pollution and other stressors. The inequality metrics used in cumulative impact and environmental justice studies can be incorporated into environmental burden studies to better understand the health disparities of ambient air pollutant exposures. This study examines the diseases and health disparities attributable to air pollutants for the Detroit urban area. We apportion this burden to various groups of emission sources and pollutants, and show how the burden is distributed among demographic and socioeconomic subgroups. The analysis uses spatially-resolved estimates of exposures, baseline health rates, age-stratified populations, and demographic characteristics that serve as proxies for increased vulnerability, e.g., race/ethnicity and income. Based on current levels, exposures to fine particulate matter (PM2.5), ozone (O3), sulfur dioxide (SO2), and nitrogen dioxide (NO2) are responsible for more than 10,000 disability-adjusted life years (DALYs) per year, causing an annual monetized health impact of $6.5 billion. This burden is mainly driven by PM2.5 and O3 exposures, which cause 660 premature deaths each year among the 945,000 individuals in the study area. NO2 exposures, largely from traffic, are important for respiratory outcomes among older adults and children with asthma, e.g., 46% of air-pollution related asthma hospitalizations are due to NO2 exposures. Based on quantitative inequality metrics, the greatest inequality of health burdens results from industrial and traffic emissions. These metrics also show disproportionate burdens among Hispanic/Latino populations due to industrial emissions, and among low income populations due to traffic emissions. Attributable health burdens are a function of exposures, susceptibility and vulnerability (e.g., baseline incidence rates), and population density. 
Because of these dependencies, inequality metrics should be calculated using the attributable health burden when feasible to avoid potentially underestimating inequality. Quantitative health impact and inequality analyses can inform health and environmental justice evaluations, providing important information to decision makers for prioritizing strategies to address exposures at the local level. PMID:29048385
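    The attributable-burden arithmetic underlying such estimates can be sketched as follows, assuming a log-linear concentration-response function; the coefficient, counterfactual level, and subgroup numbers below are invented, not the study's values.

    ```python
    # Hedged sketch of an attributable-burden calculation (log-linear
    # concentration-response; all numbers invented for illustration).
    import math

    def attributable_deaths(pm25, baseline_rate, population,
                            beta=0.005827, counterfactual=5.8):
        """Deaths attributable to PM2.5 above a counterfactual level.

        beta corresponds to a relative risk of ~1.06 per 10 ug/m3
        (an assumption, not the study's fitted value).
        """
        delta = max(pm25 - counterfactual, 0.0)
        rr = math.exp(beta * delta)
        af = 1.0 - 1.0 / rr          # population attributable fraction
        return af * baseline_rate * population

    # Two hypothetical subgroups: exposure differs by a factor of 1.5,
    # but baseline mortality also differs, so the burden disparity
    # exceeds the exposure disparity alone.
    a = attributable_deaths(pm25=12.0, baseline_rate=0.008, population=100_000)
    b = attributable_deaths(pm25=18.0, baseline_rate=0.011, population=100_000)
    print(round(a, 1), round(b, 1), round(b / a, 2))
    ```

    Because the more exposed subgroup here also has the higher baseline rate, its burden ratio exceeds the exposure ratio, which is the abstract's reason for computing inequality metrics on attributable burdens rather than on exposures.
    
    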

  18. The Global Burden of Mental, Neurological and Substance Use Disorders: An Analysis from the Global Burden of Disease Study 2010

    PubMed Central

    Whiteford, Harvey A.; Ferrari, Alize J.; Degenhardt, Louisa; Feigin, Valery; Vos, Theo

    2015-01-01

    Background The Global Burden of Disease Study 2010 (GBD 2010) estimated that a substantial proportion of the world’s disease burden came from mental, neurological and substance use disorders. In this paper, we used GBD 2010 data to investigate time-, year-, region- and age-specific trends in burden due to mental, neurological and substance use disorders. Method For each disorder, prevalence data were assembled from systematic literature reviews. DisMod-MR, a Bayesian meta-regression tool, was used to model prevalence by country, region, age, sex and year. Prevalence data were combined with disability weights derived from survey data to estimate years lived with disability (YLDs). Years lost to premature mortality (YLLs) were estimated by multiplying deaths occurring as a result of a given disorder by the reference standard life expectancy at the age at which death occurred. Disability-adjusted life years (DALYs) were computed as the sum of YLDs and YLLs. Results In 2010, mental, neurological and substance use disorders accounted for 10.4% of global DALYs, 2.3% of global YLLs and 28.5% of global YLDs, making them the leading cause of YLDs. Mental disorders accounted for the largest proportion of DALYs (56.7%), followed by neurological disorders (28.6%) and substance use disorders (14.7%). DALYs peaked in early adulthood for mental and substance use disorders but were more consistent across age for neurological disorders. Females accounted for more DALYs in all mental and neurological disorders, except for mental disorders occurring in childhood, schizophrenia, substance use disorders, Parkinson’s disease and epilepsy, where males accounted for more DALYs. Overall DALYs were highest in Eastern Europe/Central Asia and lowest in East Asia/the Pacific. Conclusion Mental, neurological and substance use disorders contribute to a significant proportion of disease burden. 
Health systems can respond by implementing established, cost effective interventions, or by supporting the research necessary to develop better prevention and treatment options. PMID:25658103
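    The DALY accounting described in the Method section reduces to two products and a sum. A minimal sketch with invented numbers (not GBD 2010 estimates):

    ```python
    # Sketch of the DALY arithmetic: DALYs = YLDs + YLLs, where
    # YLD = prevalent cases * disability weight and
    # YLL = deaths * standard life expectancy at the age of death.
    # All inputs below are hypothetical.

    def ylds(prevalent_cases, disability_weight):
        return prevalent_cases * disability_weight

    def ylls(deaths, life_expectancy_at_death):
        return deaths * life_expectancy_at_death

    def dalys(prevalent_cases, disability_weight, deaths, life_expectancy):
        return (ylds(prevalent_cases, disability_weight)
                + ylls(deaths, life_expectancy))

    # Hypothetical disorder: 1.2 million prevalent cases with disability
    # weight 0.25, and 4,000 deaths at a mean residual life expectancy
    # of 30 years.
    total = dalys(1_200_000, 0.25, 4_000, 30.0)
    print(total)  # 420000.0 = 300000.0 YLDs + 120000.0 YLLs
    ```
    
    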

  19. The effect of postal questionnaire burden on response rate and answer patterns following admission to intensive care: a randomised controlled trial.

    PubMed

    Hatch, Robert; Young, Duncan; Barber, Vicki; Harrison, David A; Watkinson, Peter

    2017-03-27

    The effects of postal questionnaire burden on return rates and answers given are unclear following treatment on an intensive care unit (ICU). We aimed to establish the effects of different postal questionnaire burdens on return rates and answers given. Design: A parallel group randomised controlled trial. We assigned patients by computer-based randomisation to one of two questionnaire packs (Group A and Group B). Patients from 26 ICUs in the United Kingdom. Patients who had received at least 24 h of level 3 care and were 16 years of age or older. Patients did not know that there were different questionnaire burdens. The study included 18,490 patients, of whom 12,170 were eligible to be sent a questionnaire pack at 3 months. We sent 12,105 questionnaires (6112 to Group A and 5993 to Group B). The Group A pack contained demographic and EuroQol 5 Dimensions 3 Level (EQ-5D-3L) questionnaires, making four questionnaire pages. The Group B pack also contained the Hospital Anxiety and Depression Scale (HADS) and the Post-Traumatic Stress Disorder Checklist-Civilian (PCL-C) questionnaires, making eight questionnaire pages in total. The primary outcome was the questionnaire return rate 3 months after ICU discharge, by group. In Group A, 2466/6112 (40.3%) participants responded at 3 months; in Group B, 2315/5993 (38.6%) responded (difference 1.7%; CI for difference 0-3.5%; p = 0.053). Group A reported better functionality than Group B in the EQ-5D-3L mobility (41% versus 37% reporting no problems, p = 0.003) and anxiety/depression (59% versus 55% reporting no problems, p = 0.017) domains. In survivors of intensive care, questionnaire burden had no significant effect on return rates. However, questionnaire burden affected answers to the same questionnaire (EQ-5D-3L). ISRCTN69112866 (assigned 02/05/2006).
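    The reported comparison of return rates is consistent with a standard two-proportion z-test on the published counts; the trial does not state which test was used, so this is an assumption.

    ```python
    # Sketch of a two-proportion z-test that approximately reproduces
    # the reported 3-month return-rate comparison (assumption: the
    # trial may have used a different test).
    import math

    def two_prop_z(x1, n1, x2, n2):
        """Two-sided z-test for the difference of two proportions."""
        p1, p2 = x1 / n1, x2 / n2
        pooled = (x1 + x2) / (n1 + n2)
        se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
        z = (p1 - p2) / se
        p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided tail prob.
        return p1 - p2, z, p_value

    diff, z, p = two_prop_z(2466, 6112, 2315, 5993)
    print(round(100 * diff, 1), round(p, 3))  # ~1.7 percentage points, ~0.053
    ```
    
    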

  20. Association of traditional cardiovascular risk factors with coronary plaque sub-types assessed by 64-slice computed tomography angiography in a large cohort of asymptomatic subjects.

    PubMed

    Rivera, Juan J; Nasir, Khurram; Cox, Pedro R; Choi, Eue-Keun; Yoon, Yeonyee; Cho, Iksung; Chun, Eun-Ju; Choi, Sang-Il; Blumenthal, Roger S; Chang, Hyuk-Jae

    2009-10-01

    Although prior studies have shown that traditional cardiovascular (CV) risk factors are associated with the burden of coronary atherosclerosis, less is known about the relationship of risk factors with coronary plaque sub-types. Coronary computed tomography angiography (CCTA) allows assessment of both total disease burden and plaque characteristics. In this study, we investigate the relationship between traditional CV risk factors and the presence and extent of coronary plaque sub-types in a large group of asymptomatic individuals. The study population consisted of 1015 asymptomatic Korean subjects (53 ± 10 years; 64% male) free of known CV disease who underwent 64-slice CCTA as part of a health screening evaluation. We analyzed plaque characteristics on a per-segment basis according to the modified American Heart Association classification. Plaques in which calcified tissue occupied more than 50% of the plaque area were classified as calcified (CAP), those with <50% calcified area as mixed (MCAP), and plaques without any calcium as non-calcified (NCAP). A total of 215 (21%) subjects had coronary plaque while 800 (79%) had no identifiable disease. Multivariate regression analysis demonstrated that increased age (per decade) and gender were the strongest predictors of the presence of any coronary plaque or the presence of at least one segment of CAP and MCAP (any plaque-age: OR 2.89; 95% CI 2.34, 3.56; male gender: OR 5.21; 95% CI 3.20, 8.49; CAP-age: OR 2.75; 95% CI 2.12, 3.58; male gender: OR 4.78; 95% CI 2.48, 9.23; MCAP-age: OR 2.62; 95% CI 2.02, 3.39; male gender: OR 4.15; 95% CI 2.17, 7.94). The strongest predictors of the presence of any NCAP were gender (OR 3.56; 95% CI 1.96-6.55) and diabetes mellitus (OR 2.87; 95% CI 1.63-5.08). 
    In the multivariate analysis of the presence of ≥2 coronary segments with a given plaque sub-type, male gender was the strongest predictor for CAP (OR 7.31; 95% CI 2.12, 25.20) and MCAP (OR 5.54; 95% CI 1.84, 16.68), whereas smoking was the strongest predictor of ≥2 coronary segments with NCAP (OR 4.86; 95% CI 1.68, 14.07). Low-density lipoprotein cholesterol (LDL-C) was a predictor only of the presence and extent of mixed coronary plaque. Age and gender are overall the strongest predictors of atherosclerosis as assessed by CCTA in this large asymptomatic Korean population, and these two risk factors are not particularly associated with a specific coronary plaque sub-type. Smoking is a strong predictor of NCAP, which previous reports have suggested is a more vulnerable lesion. Whether a specific plaque sub-type is associated with a worse prognosis remains to be determined by future prospective studies.
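    For orientation, this is how an odds ratio and its Wald 95% CI come out of a 2×2 table; the counts below are invented, and the study's ORs are adjusted multivariate estimates, which a raw 2×2 table cannot reproduce.

    ```python
    # Sketch: unadjusted odds ratio with a Wald 95% CI from a 2x2 table.
    # Counts are hypothetical, for illustration only.
    import math

    def odds_ratio_ci(a, b, c, d, z=1.96):
        """OR and Wald CI for table [[exposed cases a, exposed controls b],
                                     [unexposed cases c, unexposed controls d]]."""
        or_ = (a * d) / (b * c)
        se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
        lo = math.exp(math.log(or_) - z * se_log)
        hi = math.exp(math.log(or_) + z * se_log)
        return or_, lo, hi

    # Hypothetical: 40 of 200 smokers vs 20 of 300 non-smokers with
    # >=2 NCAP segments.
    or_, lo, hi = odds_ratio_ci(40, 160, 20, 280)
    print(round(or_, 2), round(lo, 2), round(hi, 2))
    ```
    
    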

  1. Open surgical management of pediatric urolithiasis: A developing country perspective.

    PubMed

    Rizvi, Syed A; Sultan, Sajid; Ijaz, Hussain; Mirza, Zafar N; Ahmed, Bashir; Saulat, Sherjeel; Umar, Sadaf Aba; Naqvi, Syed A

    2010-10-01

    To describe the decision factors and outcomes of open surgical procedures in the management of children with stone disease. Between January 2004 and December 2008, 3969 surgical procedures were performed in 3053 children with stone disease. Procedures employed included the minimally invasive techniques of shockwave lithotripsy (SWL), percutaneous nephrolithotomy (PCNL), ureterorenoscopy (URS), perurethral cystolithotripsy (PUCL) and percutaneous cystolithotripsy (PCCL), as well as open surgery. Demographics, clinical history, operative procedures, complications, and outcomes were recorded for all patients from sociomedical records. Of the 3969 surgeries, 2794 (70%) used minimally invasive surgery (MIS) techniques (SWL 19%, PCNL 16%, URS 18.9%, and PUCL+PCCL 16%) and 1175 (30%) were open surgeries. The main factors necessitating open surgery were large stone burden (37%), anatomical abnormalities (16%), stones with renal failure (34%), gross hydronephrosis with a thin cortex (58%), urinary tract infection (UTI) (25%), and failed MIS (18%). Nearly 50% of the open surgeries were necessitated by economic constraints and long distance from the center, where one-time treatment was preferred by the patient. Stone-free rates for open surgeries were pyelolithotomy 91%, ureterolithotomy 100%, and cystolithotomy 100%, with a complication rate of up to 3%. In developing countries, large stone burden, neglected stones with renal failure, the paucity of urological facilities, and the residence of poor patients far from tertiary centers make open surgery the therapy of choice in about one third of patients. Open surgery provides success rates comparable to MIS, although the burden and nature of the disease are more complex. The scope of open surgery will remain wide for a large population in developing countries for a considerable time.

  2. Why did ancient people have atherosclerosis?: from autopsies to computed tomography to potential causes.

    PubMed

    Thomas, Gregory S; Wann, L Samuel; Allam, Adel H; Thompson, Randall C; Michalik, David E; Sutherland, M Linda; Sutherland, James D; Lombardi, Guido P; Watson, Lucia; Cox, Samantha L; Valladolid, Clide M; Abd El-Maksoud, Gomaa; Al-Tohamy Soliman, Muhammad; Badr, Ibrahem; el-Halim Nur el-Din, Abd; Clarke, Emily M; Thomas, Ian G; Miyamoto, Michael I; Kaplan, Hillard S; Frohlich, Bruno; Narula, Jagat; Stewart, Alexandre F R; Zink, Albert; Finch, Caleb E

    2014-06-01

    Computed tomographic findings of atherosclerosis in the ancient cultures of Egypt, Peru, the American Southwest and the Aleutian Islands challenge our understanding of the fundamental causes of atherosclerosis. Could these findings be true? If so, what traditional risk factors might be present in these cultures that could explain this apparent paradox? The recent computed tomographic findings are consistent with multiple autopsy studies dating as far back as 1852 that demonstrate calcific atherosclerosis in ancient Egyptians and Peruvians. A nontraditional cause of atherosclerosis that could explain this burden of atherosclerosis is the microbial and parasitic inflammatory burden likely to be present in ancient cultures inherently lacking modern hygiene and antimicrobials. Patients with chronic systemic inflammatory diseases of today, including systemic lupus erythematosus, rheumatoid arthritis, and human immunodeficiency virus infection, experience premature atherosclerosis and coronary events. Might the chronic inflammatory load of ancient times secondary to infection have resulted in atherosclerosis? Smoke inhalation from the use of open fires for daily cooking and illumination represents another potential cause. Undiscovered risk factors could also have been present, potential causes that current technology cannot measure in serum or other tissue. A synthesis of these findings suggests that a gene-environment interplay is causal for atherosclerosis. That is, humans have an inherent genetic susceptibility to atherosclerosis, whereas the speed and severity of its development are secondary to known and potentially unknown environmental factors. Copyright © 2014 World Heart Federation (Geneva). Published by Elsevier B.V. All rights reserved.

  3. Optimal Super Dielectric Material

    DTIC Science & Technology

    2015-09-01

    ...containing liquid with dissolved ionic species will form large dipoles, polarized opposite the applied field. Large dipole SDM placed between the...electrodes of a parallel plate capacitor will reduce the net field to an unprecedented extent. This family of materials can form materials with

  4. Evaluation of a recycling process for printed circuit board by physical separation and heat treatment.

    PubMed

    Fujita, Toyohisa; Ono, Hiroyuki; Dodbiba, Gjergj; Yamaguchi, Kunihiko

    2014-07-01

    Printed circuit boards (PCBs) from discarded personal computers (PCs) and hard disk drives were crushed by explosion in water or by mechanical comminution in order to disintegrate the attached parts. More parts were stripped from the PCBs of PCs, which are composed of epoxy resin, than from the PCBs of household appliances, which are composed of phenol resin. In an attempt to raise the copper grade of the PCBs by removing other components, a carbonization treatment was investigated. The crushed PCBs without surface-mounted parts were carbonized under a nitrogen atmosphere at 873-1073 K. After screening, the char was classified by size into oversized pieces, undersized pieces and powder. The copper foil and glass fiber pieces were liberated and collected in the undersized fraction, and the copper foil was easily liberated from the glass fiber by a stamping treatment. Among the mounted parts, the multi-layered ceramic capacitors (MLCCs), which contain nickel, were carbonized at 873 K. Magnetic separation was carried out first at a lower magnetic field strength of 0.1 T and then at 0.8 T. In the +0.5 mm size fraction, the nickel grade in the magnetic product increased from 0.16% to 6.7% and the nickel recovery was 74%. The other useful mounted parts are tantalum capacitors, which were collected from the mounted parts. The tantalum-sintered bodies were separated from the molded resins by heat treatment at 723-773 K in an air atmosphere followed by screening at 0.5 mm. Silica was removed, and a tantalum grade of 70% was obtained after heating above 823 K and separation. Next, the economics of Cu recycling from PCBs were evaluated. The energy consumption of the new process increased and its treatment cost was about three times that of the conventional process, but its environmental burden decreased. 
    The nickel recovery process for finely ground particles likewise increased energy use and energy cost compared with the conventional process; however, its environmental burden was lower. The tantalum recovery process used more heat for the treatment, so its energy consumption increased by 50% compared with the conventional process. However, because the market price of tantalum is very high, the profit from tantalum recovery offsets this, and the environmental burden also decreased with tantalum recycling. Tantalum recovery is therefore a very important step in PCB recycling. Without tantalum recovery, the energy consumption and treatment cost of the new process increase, though the environmental burden still decreases. Copyright © 2014 Elsevier Ltd. All rights reserved.

  5. Project U17 : Trusted Truck II (phase C).

    DOT National Transportation Integrated Search

    2009-07-01

    The states conduct close to 750,000 roadside inspections of commercial vehicles per year. Even with this seemingly large number of inspections, the states are still being overwhelmed with the burden of performing inspections in a fashion that ensures...

  6. Graph Representations of Flow and Transport in Fracture Networks using Machine Learning

    NASA Astrophysics Data System (ADS)

    Srinivasan, G.; Viswanathan, H. S.; Karra, S.; O'Malley, D.; Godinez, H. C.; Hagberg, A.; Osthus, D.; Mohd-Yusof, J.

    2017-12-01

    Flow and transport of fluids through fractured systems are governed by properties and interactions at the micro-scale. Retaining information about the micro-structure, such as fracture length, orientation, aperture and connectivity, in mesh-based computational models means solving for millions to billions of degrees of freedom and quickly renders the problem computationally intractable. Our approach represents fracture networks as graphs, mapping fractures to nodes and intersections to edges, thereby greatly reducing the computational burden. Additionally, we use machine learning techniques to build simulators on the graph representation, trained on data from mesh-based high fidelity simulations, to speed up computation by orders of magnitude. We demonstrate our methodology on ensembles of discrete fracture networks, dividing the data into training and validation sets. Our machine-learned graph-based solvers achieve over 3 orders of magnitude speedup without any significant sacrifice in accuracy.
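    The fracture-to-node, intersection-to-edge mapping can be sketched in a few lines; once the network is a graph, questions such as "is there a connected flow path between two fractures?" become cheap traversals instead of continuum solves. The fracture data below are invented (real networks come from DFN generators and mesh-based simulations).

    ```python
    # Sketch of the graph representation: each fracture becomes a node,
    # each fracture-fracture intersection an edge.
    from collections import defaultdict

    # Hypothetical intersection list: pairs of fractures that touch.
    intersections = [("f1", "f2"), ("f2", "f3"), ("f2", "f4"), ("f3", "f5")]

    graph = defaultdict(set)
    for a, b in intersections:
        graph[a].add(b)
        graph[b].add(a)

    def flow_path_exists(src, dst):
        """Simple graph search: can fluid travel from src to dst through
        connected fractures? A cheap stand-in for transport physics."""
        seen, frontier = {src}, [src]
        while frontier:
            node = frontier.pop()
            if node == dst:
                return True
            for nxt in graph[node] - seen:
                seen.add(nxt)
                frontier.append(nxt)
        return False

    print(flow_path_exists("f1", "f5"), flow_path_exists("f1", "f6"))
    ```

    The machine-learned solvers described above go further, predicting flow and transport quantities on this graph rather than just connectivity.
    
    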

  7. Computational Approaches to the Chemical Equilibrium Constant in Protein-ligand Binding.

    PubMed

    Montalvo-Acosta, Joel José; Cecchini, Marco

    2016-12-01

    The physiological role played by protein-ligand recognition has motivated the development of several computational approaches to the ligand binding affinity. Some of them, termed rigorous, have a strong theoretical foundation but involve too much computation to be generally useful. Some others alleviate the computational burden by introducing strong approximations and/or empirical calibrations, which also limit their general use. Most importantly, there is no straightforward correlation between the predictive power and the level of approximation introduced. Here, we present a general framework for the quantitative interpretation of protein-ligand binding based on statistical mechanics. Within this framework, we re-derive self-consistently the fundamental equations of some popular approaches to the binding constant and pinpoint the inherent approximations. Our analysis represents a first step towards the development of variants with optimum accuracy/efficiency ratio for each stage of the drug discovery pipeline. © 2016 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
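    For orientation, the quantity all of these approaches target is the standard binding constant and its relation to the binding free energy; this is the textbook form, not the paper's derivation:

    ```latex
    % Binding equilibrium P + L <=> PL, with standard concentration C°:
    K_b \;=\; \frac{[PL]}{[P]\,[L]}\,C^{\circ}
         \;=\; e^{-\Delta G_b^{\circ}/k_B T},
    \qquad
    \Delta G_b^{\circ} \;=\; -\,k_B T \ln K_b .
    ```

    The approaches surveyed differ in how they estimate ΔG_b°, from rigorous free-energy calculations to heavily approximated scoring functions.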

  8. The global burden of cholera

    PubMed Central

    Lopez, Anna Lena; You, Young Ae; Kim, Young Eun; Sah, Binod; Maskery, Brian; Clemens, John

    2012-01-01

    Abstract Objective To estimate the global burden of cholera using population-based incidence data and reports. Methods Countries with a recent history of cholera were classified as endemic or non-endemic, depending on whether they had reported cholera cases in at least three of the five most recent years. The percentages of the population in each country that lacked access to improved sanitation were used to compute the populations at risk for cholera, and incidence rates from published studies were applied to groups of countries to estimate the annual number of cholera cases in endemic countries. The estimates of cholera cases in non-endemic countries were based on the average numbers of cases reported from 2000 to 2008. Literature-based estimates of cholera case-fatality rates (CFRs) were used to compute the variance-weighted average cholera CFRs for estimating the number of cholera deaths. Findings About 1.4 billion people are at risk for cholera in endemic countries. An estimated 2.8 million cholera cases occur annually in such countries (uncertainty range: 1.4–4.3) and an estimated 87 000 cholera cases occur in non-endemic countries. The incidence is estimated to be greatest in children less than 5 years of age. Every year about 91 000 people (uncertainty range: 28 000 to 142 000) die of cholera in endemic countries and 2500 people die of the disease in non-endemic countries. Conclusion The global burden of cholera, as determined through a systematic review with clearly stated assumptions, is high. The findings of this study provide a contemporary basis for planning public health interventions to control cholera. PMID:22461716
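    The variance-weighted averaging of case-fatality rates can be sketched as follows, treating each study's CFR as a binomial proportion; the rates and sample sizes below are invented, not the review's data.

    ```python
    # Sketch of inverse-variance weighting: each study's case-fatality
    # rate (CFR) is weighted by the reciprocal of its variance, so more
    # precise studies count for more. Inputs are hypothetical.

    def variance_weighted_mean(rates_and_n):
        """Inverse-variance weighted mean of binomial proportions."""
        weight_sum, weighted_total = 0.0, 0.0
        for p, n in rates_and_n:
            var = p * (1 - p) / n      # binomial variance of a proportion
            w = 1.0 / var
            weight_sum += w
            weighted_total += w * p
        return weighted_total / weight_sum

    # (CFR, sample size) per hypothetical study.
    studies = [(0.020, 5000), (0.035, 1200), (0.050, 400)]
    cfr = variance_weighted_mean(studies)
    print(round(100 * cfr, 2))  # pooled CFR as a percentage
    ```

    Note that the pooled value sits closest to the largest, most precise study, which is the intended behavior of the weighting.
    
    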

  9. Emissions from residential energy use dominate exposure to ambient fine particulate matter in India

    NASA Astrophysics Data System (ADS)

    Conibear, L.; Butt, E. W.; Knote, C. J.; Arnold, S.; Spracklen, D. V.

    2017-12-01

    Exposure to ambient particulate matter of less than 2.5 µm in diameter (PM2.5) is a leading cause of disease burden in India. Information on the source contributions to the burden of disease attributable to ambient PM2.5 exposure is critical to support the national and sub-national control of air pollution. Previous studies analysing the contributions of different emission sectors to disease burden in India have been limited by coarse model resolutions and a lack of extensive PM2.5 observations before 2016. We use a regional numerical weather prediction model online-coupled with chemistry, evaluated against extensive surface observations, to make the first high resolution study of the contributions of seven emission sectors to the disease burden associated with ambient PM2.5 exposure in India. We find that residential energy use is the dominant contributing emission sector. Removing air pollution emissions from residential energy use would reduce population-weighted annual mean ambient PM2.5 concentrations by 52%, reducing the number of premature mortalities caused by exposure to ambient PM2.5 by 26%, equivalent to 268,000 (95% uncertainty interval (95UI): 167,000-360,000) lives every year. The smaller fractional reduction in mortality burden is due to the non-linear exposure-response relationship at the high PM2.5 concentrations observed across India and consequently large reductions in emissions are required to reduce the health burden from ambient PM2.5 exposure in India. Keywords: ambient air quality, India, residential energy use, health impact, particulate matter, WRF-Chem
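    The non-linearity argument can be made concrete with a saturating exposure-response curve: at high concentrations the marginal risk per µg/m3 is small, so a large fractional cut in concentration yields a smaller fractional cut in attributable deaths. All parameters below are invented, not WRF-Chem or GBD values.

    ```python
    # Sketch of a concave (IER-like) exposure-response curve and the
    # resulting sub-proportional mortality benefit. Parameters invented.
    import math

    def relative_risk(c, c0=5.8, alpha=1.8, beta=0.04):
        """Hypothetical saturating exposure-response curve."""
        if c <= c0:
            return 1.0
        return 1.0 + alpha * (1.0 - math.exp(-beta * (c - c0)))

    def paf(c):
        """Population attributable fraction at concentration c."""
        rr = relative_risk(c)
        return (rr - 1.0) / rr

    c_now = 60.0
    c_cut = c_now * (1 - 0.52)                 # a 52% concentration cut
    avoided = 1.0 - paf(c_cut) / paf(c_now)    # fractional cut in deaths
    print(round(avoided, 2))
    ```

    With these invented parameters the avoided fraction is well below 0.52, illustrating (not reproducing) the 52%-concentration versus 26%-mortality gap reported above.
    
    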

  10. Automated, quantitative measures of grey and white matter lesion burden correlates with motor and cognitive function in children with unilateral cerebral palsy.

    PubMed

    Pagnozzi, Alex M; Dowson, Nicholas; Doecke, James; Fiori, Simona; Bradley, Andrew P; Boyd, Roslyn N; Rose, Stephen

    2016-01-01

    White and grey matter lesions are the most prevalent type of injury observable in the Magnetic Resonance Images (MRIs) of children with cerebral palsy (CP). Previous studies investigating the impact of lesions in children with CP have been qualitative, limited by the lack of automated segmentation approaches in this setting. As a result, the quantitative relationship between lesion burden and clinical function has yet to be established. In this study, we perform automatic lesion segmentation on a large cohort (107 children with unilateral CP and 18 healthy children) with a new, validated method for segmenting both white matter (WM) and grey matter (GM) lesions. The method has better accuracy (94%) than the best current methods (73%) and requires only standard structural MRI sequences. The anatomical lesion burdens most predictive of clinical scores of motor, cognitive, visual and communicative function were identified using the Least Absolute Shrinkage and Selection Operator (LASSO). The improved segmentations enabled identification of significant correlations between regional lesion burden and clinical performance, which conform to known structure-function relationships. Model performance was validated in an independent test set, with significant correlations observed between both WM and GM regional lesion burden and motor function (p < 0.008), and between WM and GM lesions alone and cognitive and visual function, respectively (p < 0.008). The significant correlation of GM lesions with functional outcome highlights the serious implications GM lesions, in addition to WM lesions, have for prognosis, and the utility of structural MRI alone for quantifying lesion burden and planning therapy interventions.
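    LASSO selects predictive regions by shrinking uninformative coefficients exactly to zero. A minimal coordinate-descent sketch on invented toy data (the study's actual predictors were regional lesion burdens and its outcomes clinical scores):

    ```python
    # Sketch of LASSO via coordinate descent with soft-thresholding.
    # Data are invented; feature 1 is irrelevant and should be zeroed.

    def soft_threshold(rho, lam):
        if rho > lam:
            return rho - lam
        if rho < -lam:
            return rho + lam
        return 0.0

    def lasso(X, y, lam, iters=200):
        """Coordinate descent; columns of X assumed standardized
        (zero mean, unit variance) and y centered."""
        n, p = len(X), len(X[0])
        beta = [0.0] * p
        for _ in range(iters):
            for j in range(p):
                # Correlation of feature j with the partial residual.
                rho = sum(
                    X[i][j] * (y[i] - sum(X[i][k] * beta[k]
                                          for k in range(p) if k != j))
                    for i in range(n)
                ) / n
                beta[j] = soft_threshold(rho, lam)  # unit variance: no rescale
        return beta

    # Toy data: y depends on feature 0 only; LASSO zeroes feature 1.
    X = [[1.0, 1.0], [-1.0, 1.0], [1.0, -1.0], [-1.0, -1.0]]
    y = [2.1, -1.9, 2.0, -2.0]
    print([round(b, 2) for b in lasso(X, y, lam=0.1)])  # → [1.9, 0.0]
    ```

    The exact zeroing of weak coefficients is what makes LASSO a feature selector, not just a regularizer, and is why it suits identifying the most predictive anatomical regions.
    
    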

  11. Influence of personality on depression, burden, and health-related quality of life in family caregivers of persons with dementia.

    PubMed

    Kim, Sun Kyung; Park, Myonghwa; Lee, Yunhwan; Choi, Seong Hye; Moon, So Young; Seo, Sang Won; Park, Kyung Won; Ku, Bon D; Han, Hyun Jeong; Park, Kee Hyung; Han, Seol-Heui; Kim, Eun-Joo; Lee, Jae-Hong; Park, Sun A; Shim, Yong S; Kim, Jong Hun; Hong, Chang Hyung; Na, Duk L; Ye, Byoung Seok; Kim, Hee Jin; Moon, Yeonsil

    2017-02-01

    Personality may predispose family caregivers to experience caregiving differently in similar situations and influence the outcomes of caregiving. A limited body of research has examined the role of some personality traits for health-related quality of life (HRQoL) among family caregivers of persons with dementia (PWD) in relation to burden and depression. Data from a large clinic-based national study in South Korea, the Caregivers of Alzheimer's Disease Research (CARE), were analyzed (N = 476). Path analysis was performed to explore the association between family caregivers' personality traits and HRQoL. With depression and burden as mediating factors, direct and indirect associations between five personality traits and HRQoL of family caregivers were examined. Results demonstrated the mediating role of caregiver burden and depression in linking two personality traits (neuroticism and extraversion) and HRQoL. Neuroticism and extraversion directly and indirectly influenced the mental HRQoL of caregivers. Neuroticism and extraversion only indirectly influenced their physical HRQoL. Neuroticism increased the caregiver's depression, whereas extraversion decreased it. Neuroticism only was mediated by burden to influence depression and mental and physical HRQoL. Personality traits can influence caregiving outcomes and be viewed as an individual resource of the caregiver. A family caregiver's personality characteristics need to be assessed for tailoring support programs to get the optimal benefits from caregiver interventions.

  12. Economic losses and burden of disease by medical conditions in Norway.

    PubMed

    Kinge, Jonas Minet; Sælensminde, Kjartan; Dieleman, Joseph; Vollset, Stein Emil; Norheim, Ole Frithjof

    2017-06-01

    We explore the correlation between disease-specific estimates of economic losses and the burden of disease, based on data for Norway in 2013 from the Global Burden of Disease (GBD) project and the Norwegian Directorate of Health. The diagnostic categories were equivalent to the ICD-10 chapters. Mental disorders topped the list of the costliest conditions in Norway in 2013, and musculoskeletal disorders caused the highest production loss, while neoplasms caused the greatest burden in terms of DALYs. There was a positive and significant association between economic losses and burden of disease. Neoplasms, circulatory diseases, and mental and musculoskeletal disorders all contributed to large health care expenditures. Non-fatal conditions with a high prevalence in working populations, like musculoskeletal and mental disorders, caused the largest production loss, while fatal conditions such as neoplasms and circulatory disease did not, since they occur mostly at old age. The magnitude of the production loss varied with the estimation method. The estimates presented in this study did not include reductions in future consumption by net recipients due to premature deaths. Non-fatal diseases are thus even more burdensome, relative to fatal diseases, than the production loss in this study suggests. Hence, ignoring production losses may underestimate the economic losses from chronic diseases in countries with an epidemiological profile similar to Norway's. Copyright © 2017 Elsevier B.V. All rights reserved.

  13. Implications of Functional Capacity Loss and Fatality for Vehicle Safety Prioritization.

    PubMed

    McMurry, Timothy L; Sherwood, Chris; Poplin, Gerald S; Seguí-Gómez, María; Crandall, Jeff

    2015-01-01

    We investigate the use of the Functional Capacity Index (FCI) as a tool for establishing vehicle safety priorities by comparing the life year burden of injuries to the burden of fatality in frontal and side automotive crashes. We demonstrate FCI's utility by investigating in detail the resulting disabling injuries and their life year costs. We selected occupants in the 2000-2013 NASS-CDS database involved in frontal and side crashes, merged their injuries with FCI, and then used the merged data to estimate each occupant's overall functional loss. Lifetime functional loss was assessed by combining this measure of impairment with the occupants' expected future life spans, estimated from the Social Security Administration's Actuarial Life Table. Frontal crashes produce a large number of disabling injuries, particularly to the lower extremities. In our population, these crashes are estimated to account for approximately 400,000 life years lost to disability in comparison with 500,000 life years lost to fatality. Victims of side crashes experienced a higher rate of fatality but a significantly lower rate of disabling injury (0.3% vs. 1.0%), resulting in approximately 370,000 life years lost to fatality versus 50,000 life years lost to disability. The burden of disabling injuries to car crash survivors should be considered when setting vehicle safety design priorities. In frontal crashes this burden in life years is similar to the burden attributable to fatality.
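
    The life-year arithmetic this record describes (fractional functional loss multiplied by expected remaining life span, summed over occupants) can be sketched as follows; the occupant values below are hypothetical illustrations, not NASS-CDS figures.

```python
def life_years_lost_to_disability(impairment, remaining_life_years):
    """Life years lost = fractional functional loss x expected remaining life span."""
    return impairment * remaining_life_years

def cohort_burden(occupants):
    """Sum disability life years over a cohort of (impairment, remaining_years) pairs."""
    return sum(life_years_lost_to_disability(i, y) for i, y in occupants)

# Hypothetical cohort: (FCI-derived impairment fraction, remaining life years)
occupants = [(0.20, 40.0), (0.05, 55.0), (0.60, 30.0)]
print(round(cohort_burden(occupants), 2))  # 28.75
```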

  14. Monitoring somatic symptoms in patients with mental disorders: Sensitivity to change and minimal clinically important difference of the Somatic Symptom Scale - 8 (SSS-8).

    PubMed

    Gierk, Benjamin; Kohlmann, Sebastian; Hagemann-Goebel, Marion; Löwe, Bernd; Nestoriuc, Yvonne

    2017-09-01

    The SSS-8 is a brief questionnaire for the assessment of somatic symptom burden. This study examines its sensitivity to change and the minimal clinically important difference (MCID) in patients with mental disorders. 55 outpatients with mental disorders completed the SSS-8 and measures of anxiety, depression, and disability before and after receiving treatment. Effect sizes and correlations between the change scores were calculated. The MCID was estimated using a one standard error of measurement threshold and the change in disability as an external criterion. There was a medium decline in somatic symptom burden for the complete sample (n=55, d_z = 0.53) and a large decline in a subgroup with very high somatic symptom burden at baseline (n=11, d_z = 0.94). Decreases in somatic symptom burden were associated with decreases in anxiety (r=0.68, p<0.001), depression (r=0.62, p<0.001) and disability (r=0.51, p<0.001). The MCID was estimated as a 3-point decrease. The SSS-8 is sensitive to change. A 3-point decrease reflects a clinically important improvement. Due to its brevity and sound psychometric properties, the SSS-8 is useful for monitoring somatic symptom burden. Copyright © 2017 Elsevier Inc. All rights reserved.
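
    The two statistics named above, the paired-samples effect size d_z and a one-standard-error-of-measurement (SEM) threshold, can be sketched as below; the scores and the reliability value are hypothetical, not the study's data.

```python
import math

def cohens_dz(pre, post):
    """Paired-samples effect size: mean change divided by SD of change scores."""
    diffs = [a - b for a, b in zip(pre, post)]  # pre minus post: a decline is positive
    n = len(diffs)
    mean = sum(diffs) / n
    sd = math.sqrt(sum((d - mean) ** 2 for d in diffs) / (n - 1))
    return mean / sd

def sem_threshold(sd_baseline, reliability):
    """One standard error of measurement: SD * sqrt(1 - reliability)."""
    return sd_baseline * math.sqrt(1 - reliability)

# Hypothetical SSS-8 scores (0-32 scale) before and after treatment
pre = [14, 18, 9, 21, 12, 16]
post = [10, 15, 8, 16, 9, 13]
print(round(cohens_dz(pre, post), 2))      # 2.38
print(round(sem_threshold(6.0, 0.80), 2))  # 2.68
```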

  15. Burden of disease from inadequate water, sanitation and hygiene in low- and middle-income settings: a retrospective analysis of data from 145 countries

    PubMed Central

    Prüss-Ustün, Annette; Bartram, Jamie; Clasen, Thomas; Colford, John M; Cumming, Oliver; Curtis, Valerie; Bonjour, Sophie; Dangour, Alan D; De France, Jennifer; Fewtrell, Lorna; Freeman, Matthew C; Gordon, Bruce; Hunter, Paul R; Johnston, Richard B; Mathers, Colin; Mäusezahl, Daniel; Medlicott, Kate; Neira, Maria; Stocks, Meredith; Wolf, Jennyfer; Cairncross, Sandy

    2014-01-01

    Objective To estimate the burden of diarrhoeal diseases from exposure to inadequate water, sanitation and hand hygiene in low- and middle-income settings and provide an overview of the impact on other diseases. Methods For estimating the impact of water, sanitation and hygiene on diarrhoea, we selected exposure levels with both sufficient global exposure data and a matching exposure-risk relationship. Global exposure data were estimated for the year 2012, and risk estimates were taken from the most recent systematic analyses. We estimated attributable deaths and disability-adjusted life years (DALYs) by country, age and sex for inadequate water, sanitation and hand hygiene separately, and as a cluster of risk factors. Uncertainty estimates were computed on the basis of uncertainty surrounding exposure estimates and relative risks. Results In 2012, 502 000 diarrhoea deaths were estimated to be caused by inadequate drinking water and 280 000 deaths by inadequate sanitation. The most likely estimate of disease burden from inadequate hand hygiene amounts to 297 000 deaths. In total, 842 000 diarrhoea deaths are estimated to be caused by this cluster of risk factors, which amounts to 1.5% of the total disease burden and 58% of diarrhoeal diseases. In children under 5 years old, 361 000 deaths could be prevented, representing 5.5% of deaths in that age group. Conclusions This estimate confirms the importance of improving water and sanitation in low- and middle-income settings for the prevention of diarrhoeal disease burden. It also underscores the need for better data on exposure and risk reductions that can be achieved with provision of reliable piped water, community sewage with treatment and hand hygiene. PMID:24779548

  16. Burden of disease from inadequate water, sanitation and hygiene in low- and middle-income settings: a retrospective analysis of data from 145 countries.

    PubMed

    Prüss-Ustün, Annette; Bartram, Jamie; Clasen, Thomas; Colford, John M; Cumming, Oliver; Curtis, Valerie; Bonjour, Sophie; Dangour, Alan D; De France, Jennifer; Fewtrell, Lorna; Freeman, Matthew C; Gordon, Bruce; Hunter, Paul R; Johnston, Richard B; Mathers, Colin; Mäusezahl, Daniel; Medlicott, Kate; Neira, Maria; Stocks, Meredith; Wolf, Jennyfer; Cairncross, Sandy

    2014-08-01

    To estimate the burden of diarrhoeal diseases from exposure to inadequate water, sanitation and hand hygiene in low- and middle-income settings and provide an overview of the impact on other diseases. For estimating the impact of water, sanitation and hygiene on diarrhoea, we selected exposure levels with both sufficient global exposure data and a matching exposure-risk relationship. Global exposure data were estimated for the year 2012, and risk estimates were taken from the most recent systematic analyses. We estimated attributable deaths and disability-adjusted life years (DALYs) by country, age and sex for inadequate water, sanitation and hand hygiene separately, and as a cluster of risk factors. Uncertainty estimates were computed on the basis of uncertainty surrounding exposure estimates and relative risks. In 2012, 502,000 diarrhoea deaths were estimated to be caused by inadequate drinking water and 280,000 deaths by inadequate sanitation. The most likely estimate of disease burden from inadequate hand hygiene amounts to 297,000 deaths. In total, 842,000 diarrhoea deaths are estimated to be caused by this cluster of risk factors, which amounts to 1.5% of the total disease burden and 58% of diarrhoeal diseases. In children under 5 years old, 361,000 deaths could be prevented, representing 5.5% of deaths in that age group. This estimate confirms the importance of improving water and sanitation in low- and middle-income settings for the prevention of diarrhoeal disease burden. It also underscores the need for better data on exposure and risk reductions that can be achieved with provision of reliable piped water, community sewage with treatment and hand hygiene. © 2014 The Authors. Tropical Medicine and International Health published by John Wiley & Sons Ltd.
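
    The attribution step described in these two records can be illustrated with the standard population-attributable-fraction formula used in comparative risk assessment; the prevalence, relative risk, and death count below are hypothetical, not the study's estimates.

```python
def attributable_fraction(prevalence, relative_risk):
    """Population attributable fraction: p(RR - 1) / (p(RR - 1) + 1)."""
    excess = prevalence * (relative_risk - 1)
    return excess / (excess + 1)

def attributable_deaths(total_deaths, prevalence, relative_risk):
    """Deaths attributable to the risk factor among all cause-specific deaths."""
    return total_deaths * attributable_fraction(prevalence, relative_risk)

# Hypothetical: 50% exposed to inadequate sanitation, RR = 2.0, 1,000,000 diarrhoea deaths
print(round(attributable_deaths(1_000_000, 0.5, 2.0)))  # 333333
```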

  17. Global Burden of Disease of Mercury Used in Artisanal Small-Scale Gold Mining.

    PubMed

    Steckling, Nadine; Tobollik, Myriam; Plass, Dietrich; Hornberg, Claudia; Ericson, Bret; Fuller, Richard; Bose-O'Reilly, Stephan

    Artisanal small-scale gold mining (ASGM) is the world's largest anthropogenic source of mercury emission. Gold miners are highly exposed to metallic mercury and suffer occupational mercury intoxication. The global disease burden as a result of this exposure is largely unknown because the informal character of ASGM restricts the availability of reliable data. To estimate the prevalence of occupational mercury intoxication and the disability-adjusted life years (DALYs) attributable to chronic metallic mercury vapor intoxication (CMMVI) among ASGM gold miners globally and in selected countries. Estimates of the number of artisanal small-scale gold (ASG) miners were extracted from reviews supplemented by a literature search. Prevalence of moderate CMMVI among miners was determined by compiling a dataset of available studies that assessed frequency of intoxication in gold miners using a standardized diagnostic tool and biomonitoring data on mercury in urine. Severe cases of CMMVI were not included because it was assumed that these persons can no longer be employed as miners. Cases in workers' families and communities were not considered. Years lived with disability as a result of CMMVI among ASG miners were quantified by multiplying the number of prevalent cases of CMMVI by the appropriate disability weight. No deaths are expected to result from CMMVI and therefore years of life lost were not calculated. Disease burden was calculated by multiplying the prevalence rate with the number of miners for each country and the disability weight. Sensitivity analyses were performed using different assumptions on the number of miners and the intoxication prevalence rate. Globally, 14-19 million workers are employed as ASG miners. Based on human biomonitoring data, between 25% and 33% of these miners-3.3-6.5 million miners globally-suffer from moderate CMMVI. 
The resulting global burden of disease is estimated to range from 1.22 (uncertainty interval [UI] 0.87-1.61) to 2.39 (UI 1.69-3.14) million DALYs. This study presents the first global and country-based estimates of disease burden caused by mercury intoxication in ASGM. Data availability and quality limit the results, and the total disease burden is likely undercounted. Despite these limitations, the data clearly indicate that mercury intoxication in ASG miners is a major, largely neglected global health problem. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
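
    The YLD-only burden calculation the record describes (prevalent cases multiplied by a disability weight, with no years of life lost since moderate CMMVI is assumed non-fatal) can be sketched as follows. The miner counts and prevalence bounds come from the abstract; the disability weight is a hypothetical placeholder, not the study's value.

```python
def yld(miners, prevalence, disability_weight):
    """Years lived with disability = prevalent cases x disability weight.
    No YLL term: moderate intoxication is treated as non-fatal, as in the study."""
    return miners * prevalence * disability_weight

# Abstract figures: 14-19 million miners, 25-33% moderate CMMVI prevalence.
# The disability weight of 0.2 is a hypothetical placeholder.
low = yld(14_000_000, 0.25, 0.2)
high = yld(19_000_000, 0.33, 0.2)
print(round(low), round(high))  # 700000 1254000
```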

  18. The development of a public optometry system in Mozambique: a Cost Benefit Analysis.

    PubMed

    Thompson, Stephen; Naidoo, Kovin; Harris, Geoff; Bilotto, Luigi; Ferrão, Jorge; Loughman, James

    2014-09-23

    The economic burden of uncorrected refractive error (URE) is thought to be high in Mozambique, in large part a consequence of the lack of resources and systems to tackle this largely avoidable problem. The Mozambique Eyecare Project (MEP) has established the first optometry training and human resource deployment initiative to address the burden of URE in Lusophone Africa. The nature of the MEP programme provides the opportunity to determine, using Cost Benefit Analysis (CBA), whether investing in the establishment and delivery of a comprehensive system for optometry human resource development and public sector deployment is economically justifiable for Lusophone Africa. A CBA methodology was applied across the period 2009-2049. Costs associated with establishing and operating a school of optometry, and a programme to address uncorrected refractive error, were included. Benefits were calculated using a human capital approach to valuing sight. Disability weightings from the Global Burden of Disease study were applied. Costs were subtracted from benefits to provide the net societal benefit, which was discounted to provide the net present value using a 3% discount rate. Using the most recently published disability weightings, the potential exists, through the correction of URE in 24.3 million potentially economically productive persons, to achieve a net present value societal benefit of up to $1.1 billion by 2049, at a Benefit-Cost ratio of 14:1. When CBA assumptions are varied as part of the sensitivity analysis, the results suggest the societal benefit could lie in the range of $649 million to $9.6 billion by 2049. This study demonstrates that a programme designed to address the burden of refractive error in Mozambique is economically justifiable in terms of the increased productivity that would result due to its implementation.
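
    The discounting arithmetic behind the net present value and benefit-cost ratio described above can be sketched as follows; the cash-flow streams are hypothetical, not the MEP figures.

```python
def npv(flows, rate=0.03):
    """Net present value of yearly flows, discounted at a constant rate.
    Year 0 is undiscounted."""
    return sum(f / (1 + rate) ** t for t, f in enumerate(flows))

def benefit_cost_ratio(benefits, costs, rate=0.03):
    """Ratio of discounted benefits to discounted costs."""
    return npv(benefits, rate) / npv(costs, rate)

# Hypothetical 3-year streams (benefits lag the up-front programme costs)
benefits = [0.0, 100.0, 150.0]
costs = [50.0, 10.0, 10.0]
print(round(benefit_cost_ratio(benefits, costs), 2))  # 3.45
```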

  19. Examining the association of smoking with work productivity and associated costs in Japan.

    PubMed

    Suwa, Kiyomi; Flores, Natalia M; Yoshikawa, Reiko; Goto, Rei; Vietri, Jeffrey; Igarashi, Ataru

    2017-09-01

    Smoking is associated with significant health and economic burden globally, including an increased risk of many leading causes of mortality and significant impairments in work productivity. This burden is attenuated by successful tobacco cessation, including reduced risk of disease and improved productivity. The current study aimed to show the benefits of smoking cessation for workplace productivity and decreased costs associated with loss of work impairment. The data source was the 2011 Japan National Health and Wellness Survey (n = 30,000). Respondents aged 20-64 were used in the analyses (n = 23,738) and were categorized into: current smokers, former smokers, and never smokers. Generalized linear models controlling for demographics and health characteristics examined the relationship of smoking status with the Work Productivity and Activity Impairment questionnaire (WPAI-GH) endpoints, as well as estimated indirect costs. Current smokers reported the greatest overall work impairment, including absenteeism (i.e. work time missed) and presenteeism (i.e. impairment while at work); however, after controlling for covariates, there were no significant differences between former smokers and never smokers on overall work impairment. Current smokers and former smokers had greater activity impairment (i.e. impairment in daily activities) than never smokers. Current smokers reported the highest indirect costs (i.e. costs associated with work impairment); however, after controlling for covariates, there were no significant differences between former smokers and never smokers on indirect costs. Smoking exerts a large health and economic burden; however, smoking cessation attenuates this burden. The current study provides important further evidence of this association, with former smokers appearing statistically indistinguishable from never smokers in terms of work productivity loss and associated indirect costs among a large representative sample of Japanese workers. 
This report highlights the workplace benefits of smoking cessation across productivity markers and cost-savings.

  20. Precision phenotyping, panomics, and system-level bioinformatics to delineate complex biologies of atherosclerosis: rationale and design of the "Genetic Loci and the Burden of Atherosclerotic Lesions" study.

    PubMed

    Voros, Szilard; Maurovich-Horvat, Pal; Marvasty, Idean B; Bansal, Aruna T; Barnes, Michael R; Vazquez, Gustavo; Murray, Sarah S; Voros, Viktor; Merkely, Bela; Brown, Bradley O; Warnick, G Russell

    2014-01-01

    Complex biological networks of atherosclerosis are largely unknown. The main objective of the Genetic Loci and the Burden of Atherosclerotic Lesions study is to assemble comprehensive biological networks of atherosclerosis using advanced cardiovascular imaging for phenotyping, a panomic approach to identify underlying genomic, proteomic, metabolomic, and lipidomic underpinnings, analyzed by systems biology-driven bioinformatics. By design, this is a hypothesis-free unbiased discovery study collecting a large number of biologically related factors to examine biological associations between genomic, proteomic, metabolomic, lipidomic, and phenotypic factors of atherosclerosis. The Genetic Loci and the Burden of Atherosclerotic Lesions study (NCT01738828) is a prospective, multicenter, international observational study of atherosclerotic coronary artery disease. Approximately 7500 patients are enrolled and undergo non-contrast-enhanced coronary calcium scanning by CT for the detection and quantification of coronary artery calcium, as well as coronary artery CT angiography for the detection and quantification of plaque, stenosis, and overall coronary artery disease burden. In addition, patients undergo whole genome sequencing, DNA methylation, whole blood-based transcriptome sequencing, unbiased proteomics based on mass spectrometry, as well as metabolomics and lipidomics on a mass spectrometry platform. The study is analyzed in 3 subsequent phases, and each phase consists of a discovery cohort and an independent validation cohort. For the primary analysis, the primary phenotype will be the presence of any atherosclerotic plaque, as detected by cardiac CT. Additional phenotypic analyses will include per patient maximal luminal stenosis defined as 50% and 70% diameter stenosis. Single-omic and multi-omic associations will be examined for each phenotype; putative biomarkers will be assessed for association, calibration, discrimination, and reclassification. 
Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.

  1. Interest of a simple on-line screening registry for measuring ICU burden related to an influenza pandemic.

    PubMed

    Richard, Jean-Christophe Marie; Pham, Tài; Brun-Buisson, Christian; Reignier, Jean; Mercat, Alain; Beduneau, Gaëtan; Régnier, Bernard; Mourvillier, Bruno; Guitton, Christophe; Castanier, Matthias; Combes, Alain; Le Tulzo, Yves; Brochard, Laurent

    2012-07-09

    The specific burden imposed on Intensive Care Units (ICUs) during the 2009 A/H1N1 influenza pandemic has been poorly explored. An on-line screening registry allowed a daily report of the ICU bed occupancy rate by flu-infected patients (Flu-OR) admitted to French ICUs. We conducted a prospective inception cohort study with the results of an on-line screening registry designed for daily assessment of ICU burden. Among the 108 centers participating in the French H1N1 research network on mechanical ventilation (REVA) - French Society of Intensive Care (SRLF) registry, 69 ICUs belonging to seven large geographical areas voluntarily participated in a website screening registry. The aim was to assess daily the ICU bed occupancy rate by influenza-infected and non-infected patients for at least three weeks. Three hundred ninety-one critically ill infected patients were enrolled in the cohort, representing a subset of 35% of the whole French 2009 pandemic cohort; 73% were mechanically ventilated, 13% required extracorporeal membrane oxygenation (ECMO) and 22% died. The global Flu-OR in these ICUs was only 7.6%, but it exceeded a predefined 15% critical threshold in 32 ICUs for a total of 103 weeks. Flu-ORs were significantly higher in University than in non-University hospitals. The peak ICU burden was poorly predicted by observations obtained at the level of large geographical areas. The peak Flu-OR during the pandemic significantly exceeded a 15% critical threshold in almost half of the ICUs, with an uneven distribution across time, geographical areas, and University versus non-University hospitals. An on-line assessment of Flu-OR via a simple dedicated registry may help better match resources to needs.
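
    The registry's core metric, the flu bed-occupancy rate checked against a 15% critical threshold, reduces to a simple calculation; the weekly rates below are hypothetical illustrations.

```python
def flu_occupancy_rate(flu_beds, total_beds):
    """Fraction of ICU beds occupied by influenza-infected patients (Flu-OR)."""
    return flu_beds / total_beds

def weeks_over_threshold(weekly_rates, threshold=0.15):
    """Indices of weeks whose Flu-OR exceeds the predefined critical threshold."""
    return [i for i, r in enumerate(weekly_rates) if r > threshold]

# Hypothetical weekly mean Flu-ORs for one ICU
rates = [0.05, 0.12, 0.18, 0.22, 0.09]
print(weeks_over_threshold(rates))  # [2, 3]
```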

  2. Parallel computing in experimental mechanics and optical measurement: A review (II)

    NASA Astrophysics Data System (ADS)

    Wang, Tianyi; Kemao, Qian

    2018-05-01

    With advantages such as non-destructiveness, high sensitivity and high accuracy, optical techniques have been successfully applied to the measurement of various important physical quantities in experimental mechanics (EM) and optical measurement (OM). However, in pursuit of higher image resolutions for higher accuracy, the computational burden of optical techniques has become much heavier. Therefore, in recent years, heterogeneous platforms composed of hardware such as CPUs and GPUs have been widely employed to accelerate these techniques due to their cost-effectiveness, short development cycle, easy portability, and high scalability. In this paper, we analyze various works by first illustrating their different architectures, and then introducing their various parallel patterns for high-speed computation. Next, we review the effects of CPU and GPU parallel computing in EM & OM applications across a broad scope, including digital image/volume correlation, fringe pattern analysis, tomography, hyperspectral imaging, computer-generated holograms, and integral imaging. In our survey, we have found that high parallelism can always be exploited in such applications for the development of high-performance systems.

  3. Direct migration motion estimation and mode decision to decoder for a low-complexity decoder Wyner-Ziv video coding

    NASA Astrophysics Data System (ADS)

    Lei, Ted Chih-Wei; Tseng, Fan-Shuo

    2017-07-01

    This paper addresses the problem of high-complexity decoding in traditional Wyner-Ziv video coding (WZVC). The key focus is the migration of two traditionally computationally complex encoder algorithms, namely motion estimation and mode decision, to the decoder. In order to reduce the computational burden in this process, the proposed architecture adopts the partial boundary matching algorithm and four flexible types of block mode decision at the decoder. This approach does away with the need for motion estimation and mode decision at the encoder. The experimental results show that the proposed padding-block-based WZVC not only decreases decoder complexity to approximately one hundredth that of state-of-the-art DISCOVER decoding but also outperforms the DISCOVER codec by 3 to 4 dB.
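
    A much-simplified sketch of boundary matching at the decoder: each candidate displacement is scored by how well the row above the candidate block in the reference frame matches the already-decoded pixels above the missing block. This is a one-row toy version, not the paper's partial boundary matching algorithm, which uses several partial boundaries; the frame below is a contrived example.

```python
def boundary_sad(a, b):
    """Sum of absolute differences between two equal-length pixel rows."""
    return sum(abs(x - y) for x, y in zip(a, b))

def best_motion_vector(ref, top_boundary, block_x, block_y, size, search=1):
    """Pick the displacement whose candidate block in the reference frame
    best matches the decoded boundary row above the missing block."""
    best = None
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = block_y + dy, block_x + dx
            if y - 1 < 0 or y + size > len(ref) or x < 0 or x + size > len(ref[0]):
                continue  # candidate (plus its boundary row) must lie inside the frame
            cand = ref[y - 1][x:x + size]  # row above the candidate block
            cost = boundary_sad(cand, top_boundary)
            if best is None or cost < best[0]:
                best = (cost, (dx, dy))
    return best[1]

# Toy 6x6 reference frame; the true match sits one pixel to the right
ref = [
    [0, 0, 0, 0, 0, 0],
    [0, 0, 0, 7, 9, 0],
    [0, 0, 0, 0, 0, 0],
    [0, 0, 0, 0, 0, 0],
    [0, 0, 0, 0, 0, 0],
    [0, 0, 0, 0, 0, 0],
]
print(best_motion_vector(ref, [7, 9], 2, 2, 2))  # (1, 0)
```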

  4. Propagation of Statistical Noise Through a Two-Qubit Maximum Likelihood Tomography

    DTIC Science & Technology

    2018-04-01

    Daniel E Jones, Brian T Kirby, and Michael Brodsky; Computational and Information Sciences Directorate, ARL. Approved for public release. (No abstract recovered; the retrieved text consisted of Standard Form 298 report-documentation boilerplate.)

  5. Neuron Design in Neuromorphic Computing Systems and Its Application in Wireless Communications

    DTIC Science & Technology

    2017-03-01

    Stated objectives: (a) data representation using hardware spike-timing-dependent encoding for neuromorphic processors; (b) exploring the applications of neuromorphic computing in wireless communications. The envisioned architecture will serve as the foundation for unprecedented capabilities in real-time applications such as MIMO channel estimation. (The remainder of the retrieved text was Standard Form 298 report-documentation boilerplate.)

  6. Computational Study of the Structure and Mechanical Properties of the Molecular Crystal RDX

    DTIC Science & Technology

    2011-01-01

    Doctor of Philosophy dissertation, 2011, directed by Assistant Professor Santiago D. Solares, Department of Mechanical Engineering. (No abstract recovered; the retrieved text consisted of Standard Form 298 report-documentation boilerplate.)

  7. A fuzzy structural matching scheme for space robotics vision

    NASA Technical Reports Server (NTRS)

    Naka, Masao; Yamamoto, Hiromichi; Homma, Khozo; Iwata, Yoshitaka

    1994-01-01

    In this paper, we propose a new fuzzy structural matching scheme for space stereo vision, which is based on the fuzzy properties of image regions and effectively reduces the computational burden of the subsequent low-level matching process. Three-dimensional distance images of a space truss structural model are estimated using this scheme from stereo images sensed by charge-coupled device (CCD) TV cameras.

  8. Improving imperfect data from health management information systems in Africa using space-time geostatistics.

    PubMed

    Gething, Peter W; Noor, Abdisalan M; Gikandi, Priscilla W; Ogara, Esther A A; Hay, Simon I; Nixon, Mark S; Snow, Robert W; Atkinson, Peter M

    2006-06-01

    Reliable and timely information on disease-specific treatment burdens within a health system is critical for the planning and monitoring of service provision. Health management information systems (HMIS) exist to address this need at national scales across Africa but are failing to deliver adequate data because of widespread underreporting by health facilities. Faced with this inadequacy, vital public health decisions often rely on crudely adjusted regional and national estimates of treatment burdens. This study has taken the example of presumed malaria in outpatients within the largely incomplete Kenyan HMIS database and has defined a geostatistical modelling framework that can predict values for all data that are missing through space and time. The resulting complete set can then be used to define treatment burdens for presumed malaria at any level of spatial and temporal aggregation. Validation of the model has shown that these burdens are quantified to an acceptable level of accuracy at the district, provincial, and national scale. The modelling framework presented here provides, to our knowledge for the first time, reliable information from imperfect HMIS data to support evidence-based decision-making at national and sub-national levels.

  9. Health Cobenefits and Transportation-Related Reductions in Greenhouse Gas Emissions in the San Francisco Bay Area

    PubMed Central

    Woodcock, James; Co, Sean; Ostro, Bart; Fanai, Amir; Fairley, David

    2013-01-01

    Objectives. We quantified health benefits of transportation strategies to reduce greenhouse gas emissions (GHGE). Methods. Statistics on travel patterns and injuries, physical activity, fine particulate matter, and GHGE in the San Francisco Bay Area, California, were input to a model that calculated the health impacts of walking and bicycling short distances usually traveled by car or driving low-emission automobiles. We measured the change in disease burden in disability-adjusted life years (DALYs) based on dose–response relationships and the distributions of physical activity, particulate matter, and traffic injuries. Results. Increasing median daily walking and bicycling from 4 to 22 minutes reduced the burden of cardiovascular disease and diabetes by 14% (32 466 DALYs), increased the traffic injury burden by 39% (5907 DALYS), and decreased GHGE by 14%. Low-carbon driving reduced GHGE by 33.5% and cardiorespiratory disease burden by less than 1%. Conclusions. Increased physical activity associated with active transport could generate a large net improvement in population health. Measures would be needed to minimize pedestrian and bicyclist injuries. Together, active transport and low-carbon driving could achieve GHGE reductions sufficient for California to meet legislative mandates. PMID:23409903
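
    The net population health impact reported above combines averted and added burden; using the abstract's own figures for the active-transport scenario:

```python
def net_dalys_averted(averted, added):
    """Net population health impact: DALYs averted minus DALYs added."""
    return sum(averted) - sum(added)

# Figures from the abstract's active-transport scenario
averted = [32466]  # cardiovascular disease and diabetes burden averted
added = [5907]     # traffic injury burden added
print(net_dalys_averted(averted, added))  # 26559
```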

  10. Improving Imperfect Data from Health Management Information Systems in Africa Using Space–Time Geostatistics

    PubMed Central

    Gething, Peter W; Noor, Abdisalan M; Gikandi, Priscilla W; Ogara, Esther A. A; Hay, Simon I; Nixon, Mark S; Snow, Robert W; Atkinson, Peter M

    2006-01-01

    Background Reliable and timely information on disease-specific treatment burdens within a health system is critical for the planning and monitoring of service provision. Health management information systems (HMIS) exist to address this need at national scales across Africa but are failing to deliver adequate data because of widespread underreporting by health facilities. Faced with this inadequacy, vital public health decisions often rely on crudely adjusted regional and national estimates of treatment burdens. Methods and Findings This study has taken the example of presumed malaria in outpatients within the largely incomplete Kenyan HMIS database and has defined a geostatistical modelling framework that can predict values for all data that are missing through space and time. The resulting complete set can then be used to define treatment burdens for presumed malaria at any level of spatial and temporal aggregation. Validation of the model has shown that these burdens are quantified to an acceptable level of accuracy at the district, provincial, and national scale. Conclusions The modelling framework presented here provides, to our knowledge for the first time, reliable information from imperfect HMIS data to support evidence-based decision-making at national and sub-national levels. PMID:16719557

  11. Microbial translocation and skeletal muscle in young and old vervet monkeys.

    PubMed

    Kavanagh, Kylie; Brown, Richelle N; Davis, Ashley T; Uberseder, Beth; Floyd, Edison; Pfisterer, Bianca; Shively, Carol A

    2016-06-01

    Intestinal barrier dysfunction leads to microbial translocation (MT) and inflammation in vertebrate and invertebrate animal models. Age is recently recognized as a factor leading to MT, and in some human and animal model studies, MT was associated with physical function. We evaluated sarcopenia, inflammation, MT biomarkers, and muscle insulin sensitivity in healthy female vervet monkeys (6-27 years old). Monkeys were fed consistent diets and had large and varied environments to facilitate physical activity, and stable social conditions. Aging led to sarcopenia as indicated by reduced walking speeds and muscle mass, but general metabolic health was similar in older monkeys (n = 25) as compared to younger ones (n = 26). When older monkeys were physically active, their MT burden approximated that in young monkeys; however, when older monkeys were sedentary, MT burden was dramatically increased. MT levels were positively associated with inflammatory burden and negatively associated with skeletal muscle insulin sensitivity. Time spent being active was positively associated with insulin sensitivity as expected, but this relationship was specifically modified by the individual monkey's MT, not inflammatory burden. Our data supports clinical observations that MT interacts with physical function as a factor in healthy aging.

  12. Assessment of physical server reliability in multi cloud computing system

    NASA Astrophysics Data System (ADS)

    Kalyani, B. J. D.; Rao, Kolasani Ramchand H.

    2018-04-01

Business organizations nowadays function with more than one cloud provider. Spreading cloud deployment across multiple service providers creates room for competitive pricing that reduces the burden on an enterprise's spending budget. To assess the software reliability of a multi-cloud application, a layered reliability assessment paradigm is considered with three levels of abstraction: the application layer, the virtualization layer, and the server layer. The reliability of each layer is assessed separately, and the results are combined to obtain the reliability of the multi-cloud application. In this paper, we focus on how to assess the reliability of the server layer, presenting the required algorithms and exploring the steps of the assessment.
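The layer-combination step described in this abstract can be made concrete with a small sketch. The structural assumptions here are mine, not the paper's: the three layers are treated as a series system (the application works only if every layer works), and redundant physical servers within the server layer are treated as parallel components. All reliability figures are hypothetical placeholders.

```python
from math import prod

def server_layer_reliability(server_reliabilities):
    """Parallel redundancy: the server layer fails only if every
    replica physical server fails (independence assumed)."""
    return 1.0 - prod(1.0 - r for r in server_reliabilities)

def system_reliability(layer_reliabilities):
    """Series combination across the application, virtualization,
    and server layers (independence assumed)."""
    return prod(layer_reliabilities)

# Hypothetical figures: two redundant servers at 0.95 each,
# application layer at 0.99, virtualization layer at 0.98.
r_server = server_layer_reliability([0.95, 0.95])        # 0.9975
r_system = system_reliability([0.99, 0.98, r_server])
```

Whether series/parallel combination is appropriate depends on the deployment topology; correlated failures across providers would require a more elaborate model.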

  13. Data association approaches in bearings-only multi-target tracking

    NASA Astrophysics Data System (ADS)

    Xu, Benlian; Wang, Zhiquan

    2008-03-01

To meet the requirements on computational complexity and correctness of data association in multi-target tracking, two algorithms are proposed in this paper. Algorithm 1 is developed from a modified version of the dual simplex method and has the advantage of a direct and explicit form of the optimal solution. Algorithm 2 is based on the idea of Algorithm 1 combined with a rotational sort method; it retains the advantages of Algorithm 1 while further reducing the computational burden, with a complexity only 1/N that of Algorithm 1. Finally, numerical analyses are carried out to evaluate the performance of the two data association algorithms.

  14. Flexible binding simulation by a novel and improved version of virtual-system coupled adaptive umbrella sampling

    NASA Astrophysics Data System (ADS)

    Dasgupta, Bhaskar; Nakamura, Haruki; Higo, Junichi

    2016-10-01

Virtual-system coupled adaptive umbrella sampling (VAUS) enhances sampling along a reaction coordinate by using a virtual degree of freedom. However, both VAUS and regular adaptive umbrella sampling (AUS) remain computationally expensive. To further decrease the computational burden, improvements of VAUS for all-atom explicit-solvent simulation are presented here. The improvements include probability distribution calculation by a Markov approximation, parameterization of biasing forces by iterative polynomial fitting, and force scaling. Applied to the study of Ala-pentapeptide dimerization in explicit solvent, the improved VAUS showed an advantage over regular AUS and makes larger biological systems amenable to simulation.

  15. Preliminary performance analysis of an interplanetary navigation system using asteroid based beacons

    NASA Technical Reports Server (NTRS)

    Jee, J. Rodney; Khatib, Ahmad R.; Muellerschoen, Ronald J.; Williams, Bobby G.; Vincent, Mark A.

    1988-01-01

    A futuristic interplanetary navigation system using transmitters placed on selected asteroids is introduced. This network of space beacons is seen as a needed alternative to the overly burdened Deep Space Network. Covariance analyses on the potential performance of these space beacons located on a candidate constellation of eight real asteroids are initiated. Simplified analytic calculations are performed to determine limiting accuracies attainable with the network for geometric positioning. More sophisticated computer simulations are also performed to determine potential accuracies using long arcs of range and Doppler data from the beacons. The results from these computations show promise for this navigation system.

  16. Modeling inelastic phonon scattering in atomic- and molecular-wire junctions

    NASA Astrophysics Data System (ADS)

    Paulsson, Magnus; Frederiksen, Thomas; Brandbyge, Mads

    2005-11-01

    Computationally inexpensive approximations describing electron-phonon scattering in molecular-scale conductors are derived from the nonequilibrium Green’s function method. The accuracy is demonstrated with a first-principles calculation on an atomic gold wire. Quantitative agreement between the full nonequilibrium Green’s function calculation and the newly derived expressions is obtained while simplifying the computational burden by several orders of magnitude. In addition, analytical models provide intuitive understanding of the conductance including nonequilibrium heating and provide a convenient way of parameterizing the physics. This is exemplified by fitting the expressions to the experimentally observed conductances through both an atomic gold wire and a hydrogen molecule.

  17. An MPI-IO interface to HPSS

    NASA Technical Reports Server (NTRS)

    Jones, Terry; Mark, Richard; Martin, Jeanne; May, John; Pierce, Elsie; Stanberry, Linda

    1996-01-01

    This paper describes an implementation of the proposed MPI-IO (Message Passing Interface - Input/Output) standard for parallel I/O. Our system uses third-party transfer to move data over an external network between the processors where it is used and the I/O devices where it resides. Data travels directly from source to destination, without the need for shuffling it among processors or funneling it through a central node. Our distributed server model lets multiple compute nodes share the burden of coordinating data transfers. The system is built on the High Performance Storage System (HPSS), and a prototype version runs on a Meiko CS-2 parallel computer.

  18. Association of Burden of Atrial Fibrillation With Risk of Ischemic Stroke in Adults With Paroxysmal Atrial Fibrillation: The KP-RHYTHM Study.

    PubMed

    Go, Alan S; Reynolds, Kristi; Yang, Jingrong; Gupta, Nigel; Lenane, Judith; Sung, Sue Hee; Harrison, Teresa N; Liu, Taylor I; Solomon, Matthew D

    2018-05-16

Atrial fibrillation is a potent risk factor for stroke, but whether the burden of atrial fibrillation in patients with paroxysmal atrial fibrillation independently influences the risk of thromboembolism remains controversial. To determine if the burden of atrial fibrillation characterized using noninvasive, continuous ambulatory monitoring is associated with the risk of ischemic stroke or arterial thromboembolism in adults with paroxysmal atrial fibrillation. This retrospective cohort study conducted from October 2011 to October 2016 at 2 large integrated health care delivery systems used an extended continuous cardiac monitoring system to identify adults who were found to have paroxysmal atrial fibrillation on 14-day continuous ambulatory electrocardiographic monitoring. The burden of atrial fibrillation was defined as the percentage of analyzable wear time in atrial fibrillation or flutter during the up to 14-day monitoring period. Ischemic stroke and other arterial thromboembolic events occurring while patients were not taking anticoagulation were identified through November 2016 using electronic medical records and were validated by manual review. We evaluated the association of the burden of atrial fibrillation with thromboembolism while not taking anticoagulation after adjusting for the Anticoagulation and Risk Factors in Atrial Fibrillation (ATRIA) or CHA2DS2-VASc stroke risk scores. Among 1965 adults with paroxysmal atrial fibrillation, the mean (SD) age was 69 (11.8) years, 880 (45%) were women, 496 (25%) were persons of color, the median ATRIA stroke risk score was 4 (interquartile range [IQR], 2-7), and the median CHA2DS2-VASc score was 3 (IQR, 1-4). The median burden of atrial fibrillation was 4.4% (IQR, 1.1%-17.23%). Patients with a higher burden of atrial fibrillation were less likely to be women or of Hispanic ethnicity, but had more prior cardioversion attempts compared with those who had a lower burden.
After adjusting for either ATRIA or CHA2DS2-VASc stroke risk scores, the highest tertile of atrial fibrillation burden (≥11.4%) was associated with a more than 3-fold higher adjusted rate of thromboembolism while not taking anticoagulants (adjusted hazard ratios, 3.13 [95% CI, 1.50-6.56] and 3.16 [95% CI, 1.51-6.62], respectively) compared with the combined lower 2 tertiles of atrial fibrillation burden. Results were consistent across demographic and clinical subgroups. A greater burden of atrial fibrillation is associated with a higher risk of ischemic stroke independent of known stroke risk factors in adults with paroxysmal atrial fibrillation.

  19. Dioxins and Cardiovascular Mortality: A Review (EHP)

    EPA Science Inventory

    In spite of its large public health burden, the risk factors for cardiovascular disease remain incompletely understood. Here we review the association of cardiovascular disease (CVD) mortality with exposure to dioxin, a pollutant resulting from the production and combustion of ch...

  20. How language production shapes language form and comprehension

    PubMed Central

    MacDonald, Maryellen C.

    2012-01-01

    Language production processes can provide insight into how language comprehension works and language typology—why languages tend to have certain characteristics more often than others. Drawing on work in memory retrieval, motor planning, and serial order in action planning, the Production-Distribution-Comprehension (PDC) account links work in the fields of language production, typology, and comprehension: (1) faced with substantial computational burdens of planning and producing utterances, language producers implicitly follow three biases in utterance planning that promote word order choices that reduce these burdens, thereby improving production fluency. (2) These choices, repeated over many utterances and individuals, shape the distributions of utterance forms in language. The claim that language form stems in large degree from producers' attempts to mitigate utterance planning difficulty is contrasted with alternative accounts in which form is driven by language use more broadly, language acquisition processes, or producers' attempts to create language forms that are easily understood by comprehenders. (3) Language perceivers implicitly learn the statistical regularities in their linguistic input, and they use this prior experience to guide comprehension of subsequent language. In particular, they learn to predict the sequential structure of linguistic signals, based on the statistics of previously-encountered input. Thus, key aspects of comprehension behavior are tied to lexico-syntactic statistics in the language, which in turn derive from utterance planning biases promoting production of comparatively easy utterance forms over more difficult ones. This approach contrasts with classic theories in which comprehension behaviors are attributed to innate design features of the language comprehension system and associated working memory. The PDC instead links basic features of comprehension to a different source: production processes that shape language form. 
PMID:23637689

  1. A mobile platform for automated screening of asthma and chronic obstructive pulmonary disease.

    PubMed

    Chamberlain, Daniel B; Kodgule, Rahul; Fletcher, Richard Ribon

    2016-08-01

Chronic Obstructive Pulmonary Disease (COPD) and asthma each represent a large proportion of the global disease burden; COPD is the third leading cause of death worldwide and asthma is one of the most prevalent chronic diseases, afflicting over 300 million people. Much of this burden is concentrated in the developing world, where patients lack access to physicians trained in the diagnosis of pulmonary disease. As a result, these patients experience high rates of underdiagnosis and misdiagnosis. To address this need, we present a mobile platform capable of screening for asthma and COPD. Our solution is based on a mobile smart phone and consists of an electronic stethoscope, a peak flow meter application, and a patient questionnaire. These data are combined with a machine learning algorithm to identify patients with asthma and COPD. To test and validate the design, we collected data from 119 healthy and sick participants using our custom mobile application and ran the analysis on a PC. For comparison, all subjects were examined by an experienced pulmonologist using a full pulmonary testing laboratory. Employing a two-stage logistic regression model, our algorithms were first able to identify patients with either asthma or COPD from the general population, yielding an ROC curve with an AUC of 0.95. Then, after identifying these patients, our algorithm was able to distinguish between patients with asthma and patients with COPD, yielding an ROC curve with an AUC of 0.97. This work represents an important milestone towards creating a self-contained mobile phone-based platform that can be used for screening and diagnosis of pulmonary disease in many parts of the world.
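The two-stage cascade described in this abstract (stage 1: sick vs. healthy; stage 2: asthma vs. COPD, applied only to stage-1 positives) can be sketched in plain Python. The weights, feature vectors, and 0.5 decision threshold below are entirely hypothetical; the paper's actual models are fit to stethoscope, peak-flow, and questionnaire data.

```python
from math import exp

def logistic(z):
    """Standard logistic (sigmoid) function."""
    return 1.0 / (1.0 + exp(-z))

def two_stage_screen(features, stage1_w, stage2_w, threshold=0.5):
    """Two-stage logistic regression cascade.

    Stage 1 separates sick (asthma or COPD) from healthy; stage 2
    runs only on stage-1 positives and separates COPD from asthma.
    All weights here are illustrative, not learned coefficients.
    """
    p_sick = logistic(sum(w * f for w, f in zip(stage1_w, features)))
    if p_sick < threshold:
        return "healthy"
    p_copd = logistic(sum(w * f for w, f in zip(stage2_w, features)))
    return "COPD" if p_copd >= threshold else "asthma"

# Made-up weights and features purely to exercise the control flow.
label = two_stage_screen([1.0, 0.2], [2.0, -1.0], [-1.0, 3.0])  # "asthma"
```

One design benefit of the cascade is that each stage can be trained and calibrated on the sub-population it actually sees, rather than forcing a single three-way classifier.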

  2. Chemistry in CESM-SE: Evaluation, Performance and Optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lamarque, Jean-Francois; Conley, Andrew; Vitt, Francis

    2016-01-06

The proposed work focused on the development of the chemistry representation within the Spectral Element (SE) dynamical core as implemented in the Community Earth System Model (CESM). More specifically, a main focus was on the ability of SE to accurately represent tracer transport. The proposed approach was to incrementally increase the complexity of the problem, starting from specified two-dimensional flow and tracers to simulations using specified dynamics and full chemistry. As demonstrated below, we have successfully studied all aspects of the proposed work, although only part of the work has been published in the refereed literature so far. Furthermore, because the SE dynamical core has been found to have several deficiencies that are still being investigated for solution, not all proposed tasks were finalized. In addition to the tests for SE performance, in an effort to decrease the computational burden of interactive chemistry, especially in the case of a large number of chemical species and chemical reactions, development of a faster chemical solver and its implementation on GPUs has been carried out in CESM under the leadership of John Drake (U. Tennessee).

  3. iPad: Semantic annotation and markup of radiological images.

    PubMed

    Rubin, Daniel L; Rodriguez, Cesar; Shah, Priyanka; Beaulieu, Chris

    2008-11-06

Radiological images contain a wealth of information, such as anatomy and pathology, which is often not explicit and computationally accessible. Information schemes are being developed to describe the semantic content of images, but such schemes can be unwieldy to operationalize because there are few tools to enable users to capture structured information easily as part of the routine research workflow. We have created iPad, an open source tool enabling researchers and clinicians to create semantic annotations on radiological images. iPad hides the complexity of the underlying image annotation information model from users, permitting them to describe images and image regions using a graphical interface that maps their descriptions to structured ontologies semi-automatically. Image annotations are saved in a variety of formats, enabling interoperability among medical records systems, image archives in hospitals, and the Semantic Web. Tools such as iPad can help reduce the burden of collecting structured information from images, and could ultimately enable researchers and physicians to exploit images on a very large scale and glean the biological and physiological significance of image content.

  4. Recent developments in blast furnace process control within British Steel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Warren, P.W.

    1995-12-01

British Steel generally operates seven blast furnaces on four integrated works. All furnaces have been equipped with comprehensive instrumentation and data logging computers over the past eight years. The four Scunthorpe furnaces practice coal injection up to 170 kg/tHM (340 lb/THM), the remainder injecting oil at up to 100 kg/tHM (200 lb/THM). Distribution control is effected by Paul Wurth Bell-Less Tops on six of the seven furnaces, and Movable Throat Armour with bells on the remaining one. All have at least one sub burden probe. The blast furnace operator has a vast quantity of data and signals to consider and evaluate when attempting to achieve the objective of providing a consistent supply of hot metal. Techniques have been, and are being, developed to assist the operator to interpret large numbers of signals. A simple operator guidance system has been developed to provide advice, based on current operating procedures and interpreted data. Further development will involve the use of a sophisticated Expert System software shell.

  5. The burden of pediatric diarrhea: a cross-sectional study of incurred costs and perceptions of cost among Bolivian families.

    PubMed

    Burke, Rachel M; Rebolledo, Paulina A; Embrey, Sally R; Wagner, Laura Danielle; Cowden, Carter L; Kelly, Fiona M; Smith, Emily R; Iñiguez, Volga; Leon, Juan S

    2013-08-02

    Worldwide, acute gastroenteritis represents an enormous public health threat to children under five years of age, causing one billion episodes and 1.9 to 3.2 million deaths per year. In Bolivia, which has one of the lower GDPs in South America, an estimated 15% of under-five deaths are caused by diarrhea. Bolivian caregiver expenses related to diarrhea are believed to be minimal, as citizens benefit from universal health insurance for children under five. The goals of this report were to describe total incurred costs and cost burden associated with caregivers seeking treatment for pediatric gastroenteritis, and to quantify relationships among costs, cost burden, treatment setting, and perceptions of costs. From 2007 to 2009, researchers interviewed caregivers (n=1,107) of pediatric patients (<5 years of age) seeking treatment for diarrhea in sentinel hospitals participating in Bolivia's diarrheal surveillance program across three main geographic regions. Data collected included demographics, clinical symptoms, direct costs (e.g. medication, consult fees) and indirect costs (e.g. lost wages). Patient populations were similar across cities in terms of gender, duration of illness, and age, but familial income varied significantly (p<0.05) when stratified on appointment type. Direct, indirect, and total costs to families were significantly higher for inpatients as compared to outpatients of urban (p<0.001) and rural (p<0.05) residence. Consult fees and indirect costs made up a large proportion of total costs. Forty-five percent of patients' families paid ≥1% of their annual household income for this single diarrheal episode. The perception that cost was affecting family finances was more frequent among those with higher actual cost burden. This study demonstrated that indirect costs due to acute pediatric diarrhea were a large component of total incurred familial costs. 
Additionally, familial costs associated with a single diarrheal episode affected the actual and perceived financial situation of a large number of caregivers. These data serve as a baseline for societal diarrheal costs before and immediately following the implementation of the rotavirus vaccine and highlight the serious economic importance of a diarrheal episode to Bolivian caregivers.

  6. The burden of pediatric diarrhea: a cross-sectional study of incurred costs and perceptions of cost among Bolivian families

    PubMed Central

    2013-01-01

    Background Worldwide, acute gastroenteritis represents an enormous public health threat to children under five years of age, causing one billion episodes and 1.9 to 3.2 million deaths per year. In Bolivia, which has one of the lower GDPs in South America, an estimated 15% of under-five deaths are caused by diarrhea. Bolivian caregiver expenses related to diarrhea are believed to be minimal, as citizens benefit from universal health insurance for children under five. The goals of this report were to describe total incurred costs and cost burden associated with caregivers seeking treatment for pediatric gastroenteritis, and to quantify relationships among costs, cost burden, treatment setting, and perceptions of costs. Methods From 2007 to 2009, researchers interviewed caregivers (n=1,107) of pediatric patients (<5 years of age) seeking treatment for diarrhea in sentinel hospitals participating in Bolivia’s diarrheal surveillance program across three main geographic regions. Data collected included demographics, clinical symptoms, direct costs (e.g. medication, consult fees) and indirect costs (e.g. lost wages). Results Patient populations were similar across cities in terms of gender, duration of illness, and age, but familial income varied significantly (p<0.05) when stratified on appointment type. Direct, indirect, and total costs to families were significantly higher for inpatients as compared to outpatients of urban (p<0.001) and rural (p<0.05) residence. Consult fees and indirect costs made up a large proportion of total costs. Forty-five percent of patients’ families paid ≥1% of their annual household income for this single diarrheal episode. The perception that cost was affecting family finances was more frequent among those with higher actual cost burden. Conclusions This study demonstrated that indirect costs due to acute pediatric diarrhea were a large component of total incurred familial costs. 
Additionally, familial costs associated with a single diarrheal episode affected the actual and perceived financial situation of a large number of caregivers. These data serve as a baseline for societal diarrheal costs before and immediately following the implementation of the rotavirus vaccine and highlight the serious economic importance of a diarrheal episode to Bolivian caregivers. PMID:23915207

  7. The economic cost of physical inactivity in China.

    PubMed

    Zhang, Juan; Chaaban, Jad

    2013-01-01

To estimate the total economic burden of physical inactivity in China. The costs of physical inactivity combine the medical and non-medical costs of five major Non-Communicable Diseases (NCDs) associated with inactivity. National data from the Chinese Behavioral Risk Factors Surveillance Surveys (2007) and the National Health Service Survey (2003) are used to compute population attributable risks (PARs) of inactivity for each major NCD. Costs specific to inactivity are obtained by multiplying each disease's costs by its PAR, incorporating the effects of inactivity mediated through overweight and obesity. Physical inactivity contributes between 12% and 19% to the risks associated with the five major NCDs in China, namely coronary heart disease, stroke, hypertension, cancer, and type 2 diabetes. Physical inactivity imposes a substantial economic burden on the country, as it alone is responsible for more than 15% of the medical and non-medical yearly costs of the main NCDs. The high economic burden of physical inactivity implies the need to develop more programs and interventions that address this modifiable behavioral risk, in order to curb the rising NCD epidemic in China. Copyright © 2012 Elsevier Inc. All rights reserved.
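The cost-attribution step in this abstract (scaling each disease's total cost by the PAR of inactivity for that NCD, then summing) can be sketched as follows. The cost figures and PAR values below are hypothetical placeholders in the 12-19% range the abstract reports, not the study's actual inputs.

```python
def inactivity_cost(disease_costs, pars):
    """Cost attributable to physical inactivity: for each NCD,
    (medical + non-medical cost) scaled by its population
    attributable risk (PAR), summed over diseases."""
    return sum(disease_costs[d] * pars[d] for d in disease_costs)

# Hypothetical yearly costs (arbitrary currency units) and PARs.
costs = {"CHD": 10.0, "stroke": 8.0, "hypertension": 12.0,
         "cancer": 20.0, "type2_diabetes": 9.0}
pars = {"CHD": 0.15, "stroke": 0.12, "hypertension": 0.19,
        "cancer": 0.13, "type2_diabetes": 0.17}

total = inactivity_cost(costs, pars)  # 8.87 with these placeholders
```

The same pattern generalizes to any risk factor once disease-specific PARs are available; the hard part, as in the paper, is estimating the PARs themselves from surveillance data.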

  8. Emotional and cognitive health correlates of leisure activities in older Latino and Caucasian women

    PubMed Central

    Herrera, Angelica P.; Meeks, Thomas W.; Dawes, Sharron E.; Hernandez, Dominique M.; Thompson, Wesley K.; Sommerfeld, David H.; Allison, Matthew A.; Jeste, Dilip V.

    2011-01-01

    This study examined differences in the frequency of leisure activity participation and relationships to depressive symptom burden and cognition in Latino and Caucasian women. Cross-sectional data were obtained from a demographically matched subsample of Latino and Caucasian (n = 113 each) post-menopausal women (age ≥60), interviewed in 2004–06 for a multi-ethnic cohort study of successful aging in San Diego County. Frequencies of engagement in 16 leisure activities and associations between objective cognitive performance and depressive symptom burden by ethnicity were identified using bivariate and linear regression, adjusted for physical functioning and demographic covariates. Compared to Caucasian women, Latinas were significantly more likely to be caregivers and used computers less often. Engaging in organized social activity was associated with fewer depressive symptoms in both groups. Listening to the radio was positively correlated with lower depressive symptom burden for Latinas, and better cognitive functioning in Caucasians. Cognitive functioning was better in Latinas who read and did puzzles. Housework was negatively associated with Latinas’ emotional health and Caucasians’ cognitive functioning. Latino and Caucasian women participate in different patterns of leisure activities. Additionally, ethnicity significantly affects the relationship between leisure activities and both emotional and cognitive health. PMID:21391135

  9. Modeling the Regulatory Mechanisms by Which NLRX1 Modulates Innate Immune Responses to Helicobacter pylori Infection

    PubMed Central

    Philipson, Casandra W.; Bassaganya-Riera, Josep; Viladomiu, Monica; Kronsteiner, Barbara; Abedi, Vida; Hoops, Stefan; Michalak, Pawel; Kang, Lin; Girardin, Stephen E.; Hontecillas, Raquel

    2015-01-01

    Helicobacter pylori colonizes half of the world’s population as the dominant member of the gastric microbiota resulting in a lifelong chronic infection. Host responses toward the bacterium can result in asymptomatic, pathogenic or even favorable health outcomes; however, mechanisms underlying the dual role of H. pylori as a commensal versus pathogenic organism are not well characterized. Recent evidence suggests mononuclear phagocytes are largely involved in shaping dominant immunity during infection mediating the balance between host tolerance and succumbing to overt disease. We combined computational modeling, bioinformatics and experimental validation in order to investigate interactions between macrophages and intracellular H. pylori. Global transcriptomic analysis on bone marrow-derived macrophages (BMDM) in a gentamycin protection assay at six time points unveiled the presence of three sequential host response waves: an early transient regulatory gene module followed by sustained and late effector responses. Kinetic behaviors of pattern recognition receptors (PRRs) are linked to differential expression of spatiotemporal response waves and function to induce effector immunity through extracellular and intracellular detection of H. pylori. We report that bacterial interaction with the host intracellular environment caused significant suppression of regulatory NLRC3 and NLRX1 in a pattern inverse to early regulatory responses. To further delineate complex immune responses and pathway crosstalk between effector and regulatory PRRs, we built a computational model calibrated using time-series RNAseq data. Our validated computational hypotheses are that: 1) NLRX1 expression regulates bacterial burden in macrophages; and 2) early host response cytokines down-regulate NLRX1 expression through a negative feedback circuit. 
This paper applies modeling approaches to characterize the regulatory role of NLRX1 in mechanisms of host tolerance employed by macrophages to respond to and/or to co-exist with intracellular H. pylori. PMID:26367386

  10. Rapid simulation of spatial epidemics: a spectral method.

    PubMed

    Brand, Samuel P C; Tildesley, Michael J; Keeling, Matthew J

    2015-04-07

    Spatial structure and hence the spatial position of host populations plays a vital role in the spread of infection. In the majority of situations, it is only possible to predict the spatial spread of infection using simulation models, which can be computationally demanding especially for large population sizes. Here we develop an approximation method that vastly reduces this computational burden. We assume that the transmission rates between individuals or sub-populations are determined by a spatial transmission kernel. This kernel is assumed to be isotropic, such that the transmission rate is simply a function of the distance between susceptible and infectious individuals; as such this provides the ideal mechanism for modelling localised transmission in a spatial environment. We show that the spatial force of infection acting on all susceptibles can be represented as a spatial convolution between the transmission kernel and a spatially extended 'image' of the infection state. This representation allows the rapid calculation of stochastic rates of infection using fast-Fourier transform (FFT) routines, which greatly improves the computational efficiency of spatial simulations. We demonstrate the efficiency and accuracy of this fast spectral rate recalculation (FSR) method with two examples: an idealised scenario simulating an SIR-type epidemic outbreak amongst N habitats distributed across a two-dimensional plane; the spread of infection between US cattle farms, illustrating that the FSR method makes continental-scale outbreak forecasting feasible with desktop processing power. The latter model demonstrates which areas of the US are at consistently high risk for cattle-infections, although predictions of epidemic size are highly dependent on assumptions about the tail of the transmission kernel. Copyright © 2015 Elsevier Ltd. All rights reserved.
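The core of the FSR method described above, representing the spatial force of infection as a convolution of the transmission kernel with an "image" of the infectious state and evaluating it with FFTs, can be sketched on a regular grid. The grid size and Gaussian kernel below are illustrative assumptions, not the paper's calibrated transmission kernel.

```python
import numpy as np

def force_of_infection(infectious, kernel):
    """Force of infection on every grid cell as the circular
    convolution of the transmission kernel with the infectious
    'image', computed via 2-D FFTs in O(n log n)."""
    return np.real(np.fft.ifft2(np.fft.fft2(infectious) * np.fft.fft2(kernel)))

# Toy setup: one infectious habitat at the grid centre and an
# isotropic Gaussian kernel (hypothetical width of 4 cells).
n = 64
y, x = np.mgrid[0:n, 0:n]
d2 = (x - n // 2) ** 2 + (y - n // 2) ** 2
# ifftshift moves the kernel's peak to index (0, 0), as the FFT
# convolution theorem expects.
kernel = np.fft.ifftshift(np.exp(-d2 / (2 * 4.0 ** 2)))
infectious = np.zeros((n, n))
infectious[n // 2, n // 2] = 1.0

foi = force_of_infection(infectious, kernel)  # peaks at the infectious cell
```

Compared with the naive pairwise sum over all susceptible-infectious pairs (quadratic in the number of habitats), recomputing rates this way after each event is what makes continental-scale stochastic simulation feasible; note the FFT implies periodic boundaries, so real applications pad the domain.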

  11. Semiquantitative visual approach to scoring lung cancer treatment response using computed tomography: a pilot study.

    PubMed

    Gottlieb, Ronald H; Kumar, Prasanna; Loud, Peter; Klippenstein, Donald; Raczyk, Cheryl; Tan, Wei; Lu, Jenny; Ramnath, Nithya

    2009-01-01

    Our objective was to compare a newly developed semiquantitative visual scoring (SVS) method with the current standard, the Response Evaluation Criteria in Solid Tumors (RECIST) method, in the categorization of treatment response and reader agreement for patients with metastatic lung cancer followed by computed tomography. The 18 subjects (5 women and 13 men; mean age, 62.8 years) were from an institutional review board-approved phase 2 study that evaluated a second-line chemotherapy regimen for metastatic (stages III and IV) non-small cell lung cancer. Four radiologists, blinded to the patient outcome and each other's reads, evaluated the change in the patients' tumor burden from the baseline to the first restaging computed tomographic scan using either the RECIST or the SVS method. We compared the numbers of patients placed into the partial response, the stable disease (SD), and the progressive disease (PD) categories (Fisher exact test) and observer agreement (kappa statistic). Requiring the concordance of 3 of the 4 readers resulted in the RECIST placing 17 (100%) of 17 patients in the SD category compared with the SVS placing 9 (60%) of 15 patients in the partial response, 5 (33%) of the 15 patients in the SD, and 1 (6.7%) of the 15 patients in the PD categories (P < 0.0001). Interobserver agreement was higher among the readers using the SVS method (kappa, 0.54; P < 0.0001) compared with that of the readers using the RECIST method (kappa, -0.01; P = 0.5378). Using the SVS method, the readers more finely discriminated between the patient response categories with superior agreement compared with the RECIST method, which could potentially result in large differences in early treatment decisions for advanced lung cancer.

  12. Cardiac computed tomography in patients with symptomatic new-onset atrial fibrillation, rule-out acute coronary syndrome, but with intermediate pretest probability for coronary artery disease admitted to a chest pain unit.

    PubMed

    Koopmann, Matthias; Hinrichs, Liane; Olligs, Jan; Lichtenberg, Michael; Eckardt, Lars; Böse, Dirk; Möhlenkamp, Stefan; Waltenberger, Johannes; Breuckmann, Frank

    2018-01-24

    Atrial fibrillation (AF) and coronary artery disease (CAD) may be encountered coincidently in a large portion of patients. However, data on coronary artery calcium burden in such patients are lacking. Thus, we sought to determine the value of cardiac computed tomography (CCT) in patients presenting with new-onset AF associated with an intermediate pretest probability for CAD admitted to a chest pain unit (CPU). Calcium scores (CS) of 73 new-onset, symptomatic AF subjects without typical clinical, electrocardiographic, or laboratory signs of acute coronary syndrome (ACS) admitted to our CPU were analyzed. In addition, results from computed tomography angiography (CTA) were related to coronary angiography findings whenever available. Calcium scores of zero were found in 25%. Median Agatston score was 77 (interquartile range: 1-270) with gender- and territory-specific dispersal. CS scores above average were present in about 50%, high (> 400)-to-very high (> 1000) CS scores were found in 22%. Overall percentile ranking showed a relative accordance to the reference percentile distribution. Additional CTA was performed in 47%, revealing stenoses in 12%. Coronary angiography was performed in 22% and resulted in coronary intervention or surgical revascularization in 7%. On univariate analysis, CS > 50th percentile failed to serve as an independent determinant of significant stenosis during catheterization. Within a CPU setting, relevant CAD was excluded or confirmed in almost 50%, the latter with a high proportion of coronary angiographies and subsequent coronary interventions, underlining the diagnostic value of CCT in symptomatic, non-ACS, new-onset AF patients when admitted to a CPU.

  13. Dynamic emulation modelling for the optimal operation of water systems: an overview

    NASA Astrophysics Data System (ADS)

    Castelletti, A.; Galelli, S.; Giuliani, M.

    2014-12-01

    Despite sustained increases in computing power over recent decades, computational limitations remain a major barrier to the effective and systematic use of large-scale, process-based simulation models in rational environmental decision-making. Whereas complex models may provide clear advantages when the goal of the modelling exercise is to enhance our understanding of the natural processes, they introduce problems of model identifiability caused by over-parameterization and suffer from a high computational burden when used in management and planning problems. As a result, increasing attention is now being devoted to emulation modelling (or model reduction) as a way of overcoming these limitations. An emulation model, or emulator, is a low-order approximation of the process-based model that can be substituted for it in order to solve highly resource-demanding problems. In this talk, an overview of emulation modelling within the context of the optimal operation of water systems will be provided. Particular emphasis will be given to Dynamic Emulation Modelling (DEMo), a special type of model complexity reduction in which the dynamic nature of the original process-based model is preserved, with consequent advantages in a wide range of problems, particularly feedback control problems. This will be contrasted with traditional non-dynamic emulators (e.g. response surface and surrogate models) that have been studied extensively in recent years and are mainly used for planning purposes. A number of real-world numerical experiences will be used to support the discussion, ranging from multi-outlet water quality control in water reservoirs, through erosion/sedimentation rebalancing in the operation of run-of-river power plants, to salinity control in lakes and reservoirs.
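
    The emulation trade the abstract describes, paying for a batch of expensive model runs once and then answering queries cheaply, can be illustrated with a deliberately simple static emulator. DEMo proper reduces a dynamic state-space model, which this stdlib-only sketch does not attempt; the stand-in model and grid choices are hypothetical.

```python
import bisect
import math

def expensive_model(x):
    """Stand-in for a costly process-based simulation (hypothetical dynamics)."""
    return math.sin(3.0 * x) + 0.5 * x

def build_emulator(model, lo, hi, n=201):
    """Tabulate the model on a grid once, then answer queries by linear interpolation."""
    xs = [lo + (hi - lo) * i / (n - 1) for i in range(n)]
    ys = [model(x) for x in xs]  # the only expensive evaluations
    def emulator(x):
        i = min(max(bisect.bisect_right(xs, x) - 1, 0), n - 2)
        t = (x - xs[i]) / (xs[i + 1] - xs[i])
        return ys[i] + t * (ys[i + 1] - ys[i])
    return emulator

emu = build_emulator(expensive_model, 0.0, 2.0)
```

    In an optimisation loop that queries the model many thousands of times, the 201 upfront evaluations are quickly amortised; a practical emulator would also quantify its approximation error.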

  14. FUX-Sim: Implementation of a fast universal simulation/reconstruction framework for X-ray systems.

    PubMed

    Abella, Monica; Serrano, Estefania; Garcia-Blas, Javier; García, Ines; de Molina, Claudia; Carretero, Jesus; Desco, Manuel

    2017-01-01

    The availability of digital X-ray detectors, together with advances in reconstruction algorithms, creates an opportunity for bringing 3D capabilities to conventional radiology systems. The downside is that reconstruction algorithms for non-standard acquisition protocols are generally based on iterative approaches that involve a high computational burden. The development of new flexible X-ray systems could benefit from computer simulations, which may enable performance to be checked before expensive real systems are implemented. The development of simulation/reconstruction algorithms in this context poses three main difficulties. First, the algorithms deal with large data volumes and are computationally expensive, thus leading to the need for hardware and software optimizations. Second, these optimizations are limited by the high flexibility required to explore new scanning geometries, including fully configurable positioning of source and detector elements. And third, the evolution of the various hardware setups increases the effort required for maintaining and adapting the implementations to current and future programming models. Previous works lack support for completely flexible geometries and/or compatibility with multiple programming models and platforms. In this paper, we present FUX-Sim, a novel X-ray simulation/reconstruction framework that was designed to be flexible and fast. Optimized implementation for different families of GPUs (CUDA and OpenCL) and multi-core CPUs was achieved thanks to a modularized approach based on a layered architecture and parallel implementation of the algorithms for both architectures. A detailed performance evaluation demonstrates that for different system configurations and hardware platforms, FUX-Sim maximizes performance with the CUDA programming model (5 times faster than other state-of-the-art implementations). Furthermore, the CPU and OpenCL programming models allow FUX-Sim to be executed over a wide range of hardware platforms.
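
    The layered-architecture idea, a geometry-agnostic core over interchangeable compute backends, can be sketched as a registry that falls back from preferred to available implementations. The backend names and the toy projector below are hypothetical and do not reproduce FUX-Sim's actual C++/CUDA/OpenCL API.

```python
_BACKENDS = {}

def register_backend(name):
    """Decorator adding a backend class to the registry under a given name."""
    def decorator(cls):
        _BACKENDS[name] = cls
        return cls
    return decorator

class Backend:
    @staticmethod
    def available():
        return False
    def forward_project(self, volume, angle_deg):
        raise NotImplementedError

@register_backend("cuda")
class CudaBackend(Backend):
    @staticmethod
    def available():
        return False  # pretend no GPU is present on this machine

@register_backend("cpu")
class CpuBackend(Backend):
    @staticmethod
    def available():
        return True
    def forward_project(self, volume, angle_deg):
        # Toy parallel-beam "projection" of a 2-D list: column sums at 0 degrees,
        # row sums at 90 degrees; a real projector handles arbitrary geometry.
        if angle_deg == 0:
            return [sum(col) for col in zip(*volume)]
        if angle_deg == 90:
            return [sum(row) for row in volume]
        raise ValueError("toy projector supports 0 or 90 degrees only")

def get_backend(preference=("cuda", "opencl", "cpu")):
    """Pick the first registered backend that reports itself available."""
    for name in preference:
        cls = _BACKENDS.get(name)
        if cls is not None and cls.available():
            return cls()
    raise RuntimeError("no usable backend")
```

    Because every backend implements the same interface, the reconstruction layer above it never changes when hardware does, which is the maintainability point the paper makes.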

  15. A combined Fuzzy and Naive Bayesian strategy can be used to assign event codes to injury narratives.

    PubMed

    Marucci-Wellman, H; Lehto, M; Corns, H

    2011-12-01

    Bayesian methods show promise for classifying injury narratives from large administrative datasets into cause groups. This study examined a combined approach where two Bayesian models (Fuzzy and Naïve) were used to either classify a narrative or select it for manual review. Injury narratives were extracted from claims filed with a workers' compensation insurance provider between January 2002 and December 2004. Narratives were separated into a training set (n=11,000) and prediction set (n=3,000). Expert coders assigned two-digit Bureau of Labor Statistics Occupational Injury and Illness Classification event codes to each narrative. Fuzzy and Naïve Bayesian models were developed using manually classified cases in the training set. Two semi-automatic machine coding strategies were evaluated. The first strategy assigned cases for manual review if the Fuzzy and Naïve models disagreed on the classification. The second strategy selected additional cases for manual review from the Agree dataset using prediction strength to reach a level of 50% computer coding and 50% manual coding. When agreement alone was used as the filtering strategy, the majority were coded by the computer (n=1,928, 64%), leaving 36% for manual review. The overall combined (human plus computer) sensitivity was 0.90 and positive predictive value (PPV) was >0.90 for 11 of the 18 two-digit event categories. Implementing the second strategy improved results, with an overall sensitivity of 0.95 and PPV >0.90 for 17 of 18 categories. A combined Naïve-Fuzzy Bayesian approach can classify some narratives with high accuracy and identify others most beneficial for manual review, reducing the burden on human coders.
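
    The agree/disagree filter can be sketched with two small classifiers: a multinomial Naive Bayes (product of word likelihoods) and a "fuzzy" Bayes that classifies by the single most diagnostic word, routing a narrative to manual review when they disagree. The four training narratives below are hypothetical stand-ins for the 11,000 coded claims.

```python
import math
from collections import Counter, defaultdict

TRAIN = [
    ("worker fell from ladder", "fall"),
    ("slipped and fell on wet floor", "fall"),
    ("hand caught in machine", "caught"),
    ("finger caught in press", "caught"),
]

word_counts = defaultdict(Counter)  # class label -> word frequencies
class_counts = Counter()
for text, label in TRAIN:
    class_counts[label] += 1
    word_counts[label].update(text.split())
VOCAB = {w for counts in word_counts.values() for w in counts}

def p_word_given_class(word, label):
    # Add-one (Laplace) smoothed likelihood.
    total = sum(word_counts[label].values())
    return (word_counts[label][word] + 1) / (total + len(VOCAB))

def naive_bayes(text):
    # Class prior times the product of word likelihoods, in log space.
    def score(label):
        prior = math.log(class_counts[label] / len(TRAIN))
        return prior + sum(math.log(p_word_given_class(w, label))
                           for w in text.split() if w in VOCAB)
    return max(class_counts, key=score)

def fuzzy_bayes(text):
    # Classify by the single word with the highest class posterior.
    best_label, best_post = None, -1.0
    for w in text.split():
        if w not in VOCAB:
            continue
        joint = {c: p_word_given_class(w, c) * class_counts[c] / len(TRAIN)
                 for c in class_counts}
        z = sum(joint.values())
        for c, j in joint.items():
            if j / z > best_post:
                best_label, best_post = c, j / z
    return best_label

def route(text):
    """Auto-code when the two models agree; otherwise flag for manual review."""
    naive, fuzzy = naive_bayes(text), fuzzy_bayes(text)
    return (naive, False) if naive == fuzzy else (None, True)
```

    The two models fail in different ways (the product model is swayed by many weakly informative words, the max model by one strong word), so their agreement is a usable proxy for prediction confidence.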

  16. Administrative work consumes one-sixth of U.S. physicians' working hours and lowers their career satisfaction.

    PubMed

    Woolhandler, Steffie; Himmelstein, David U

    2014-01-01

    Doctors often complain about the burden of administrative work, but few studies have quantified how much time clinicians devote to administrative tasks. We quantified the time U.S. physicians spent on administrative tasks, and its relationship to their career satisfaction, based on a nationally representative survey of 4,720 U.S. physicians working 20 or more hours per week in direct patient care. The average doctor spent 8.7 hours per week (16.6% of working hours) on administration. Psychiatrists spent the highest proportion of their time on administration (20.3%), followed by internists (17.3%) and family/general practitioners (17.3%). Pediatricians spent the least amount of time, 6.7 hours per week or 14.1% of professional time. Doctors in large practices, those in practices owned by a hospital, and those with financial incentives to reduce services spent more time on administration. More extensive use of electronic medical records was associated with a greater administrative burden. Doctors spending more time on administration had lower career satisfaction, even after controlling for income and other factors. Current trends in U.S. health policy--a shift to employment in large practices, the implementation of electronic medical records, and the increasing prevalence of financial risk sharing--are likely to increase doctors' paperwork burdens and may decrease their career satisfaction.

  17. Ethical implications of excessive cluster sizes in cluster randomised trials.

    PubMed

    Hemming, Karla; Taljaard, Monica; Forbes, Gordon; Eldridge, Sandra M; Weijer, Charles

    2018-02-20

    The cluster randomised trial (CRT) is commonly used in healthcare research. It is the gold-standard study design for evaluating healthcare policy interventions. A key characteristic of this design is that as more participants are included, in a fixed number of clusters, the increase in achievable power will level off. CRTs with cluster sizes that exceed the point of levelling-off will have excessive numbers of participants, even if they do not achieve nominal levels of power. Excessively large cluster sizes may have ethical implications due to exposing trial participants unnecessarily to the burdens of both participating in the trial and the potential risks of harm associated with the intervention. We explore these issues through the use of two case studies. Where data are routinely collected, available at minimum cost and the intervention poses low risk, the ethical implications of excessively large cluster sizes are likely to be low (case study 1). However, to maximise the social benefit of the study, identification of excessive cluster sizes can allow for prespecified and fully powered secondary analyses. In the second case study, while there is no burden through trial participation (because the outcome data are routinely collected and non-identifiable), the intervention might be considered to pose some indirect risk to patients and risks to the healthcare workers. In this case study it is therefore important that the inclusion of excessively large cluster sizes is justifiable on other grounds (perhaps to show sustainability). In any randomised controlled trial, including evaluations of health policy interventions, it is important to minimise the burdens and risks to participants. Funders, researchers and research ethics committees should be aware of the ethical issues of excessively large cluster sizes in cluster trials. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
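
    The levelling-off of power that the authors describe follows from the standard design effect for cluster randomisation, 1 + (m - 1) * ICC, which caps the effective number of independent observations at k / ICC no matter how large the clusters grow. A minimal sketch, with hypothetical cluster count and intracluster correlation:

```python
def effective_sample_size(k, m, icc):
    """Effective number of independent observations for k clusters of size m,
    using the standard design effect 1 + (m - 1) * ICC."""
    return k * m / (1 + (m - 1) * icc)

# With k clusters and ICC fixed, enlarging clusters runs into a hard ceiling.
k, icc = 20, 0.05
ceiling = k / icc  # 400 effective observations, however large m grows
```

    With 20 clusters and ICC = 0.05, going from 50 to 500 participants per cluster adds 9,000 participants but fewer than 100 effective observations, which is precisely the ethical concern about recruiting beyond the levelling-off point.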

  18. The potential impact of increased treatment rates for alcohol dependence in the United Kingdom in 2004.

    PubMed

    Shield, Kevin D; Rehm, Jürgen; Rehm, Maximilien X; Gmel, Gerrit; Drummond, Colin

    2014-02-05

    Alcohol consumption has been linked to a considerable burden of disease in the United Kingdom (UK), with most of this burden due to heavy drinking and Alcohol Dependence (AD). However, AD is undertreated in the UK, with only 8% of those individuals with AD being treated in England and only 6% of those individuals with AD being treated in Scotland. Thus, the objective of this paper is to quantify the deaths that would have been avoided in the UK in 2004 if the treatment rate for AD had been increased. Data on the prevalence of AD, alcohol consumption, and mortality were obtained from the Adult Psychiatric Morbidity Survey, the Global Information System on Alcohol and Health, and the 2004 Global Burden of Disease study, respectively. Data on the effectiveness of pharmacological treatment and Motivational Interviewing/Cognitive Behavioural Therapy were obtained from Cochrane reviews and meta-analyses. Simulations were used to model the number of deaths under different treatment scenarios. Sensitivity analyses were performed to model the effects of Brief Interventions and to examine the effect of using AD prevalence data obtained from the National Institute for Health and Clinical Excellence. In the UK, 320 female and 1,385 male deaths would have been avoided if treatment coverage of pharmacological treatment had been increased to 20%. This decrease in the number of deaths represents 7.9% of all alcohol-attributable deaths (7.0% of all alcohol-attributable deaths for women and 8.1% of all alcohol-attributable deaths for men). If we used lower AD prevalence rates obtained from the National Institute for Health and Clinical Excellence, then treatment coverage of pharmacological treatment in hospitals for 20% of the population with AD would have resulted in the avoidance of 529 deaths in 2004 (99 deaths avoided for women and 430 deaths avoided for men). Increasing AD treatment in the UK would have led to a large number of deaths being avoided in 2004. Increased AD treatment rates would not only reduce mortality but also the large burden of disability and morbidity attributable to AD, as well as the associated social and economic burdens.
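
    The scenario arithmetic behind such estimates can be illustrated very crudely: deaths averted scale with the attributable deaths, the increase in treatment coverage, and treatment effectiveness. The paper's simulations are considerably richer (modelling consumption distributions and cause-specific relative risks), and every number below is hypothetical, chosen only to show the calculation shape.

```python
def deaths_averted(attributable_deaths, baseline_coverage, target_coverage,
                   treatment_effectiveness):
    """Deaths avoided when treatment coverage rises, under the simplifying
    assumption that effectiveness is the fraction of a treated person's
    alcohol-attributable mortality risk that is removed."""
    extra_covered = target_coverage - baseline_coverage
    return attributable_deaths * extra_covered * treatment_effectiveness

# Illustrative inputs only; these are NOT the paper's data.
example = deaths_averted(21_500, 0.08, 0.20, 0.66)
```

    A real analysis would also propagate uncertainty in each input, for example by Monte Carlo sampling of the effectiveness estimate, rather than reporting a single point value.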

  19. Perspectives on Sharing Models and Related Resources in Computational Biomechanics Research.

    PubMed

    Erdemir, Ahmet; Hunter, Peter J; Holzapfel, Gerhard A; Loew, Leslie M; Middleton, John; Jacobs, Christopher R; Nithiarasu, Perumal; Löhner, Rainald; Wei, Guowei; Winkelstein, Beth A; Barocas, Victor H; Guilak, Farshid; Ku, Joy P; Hicks, Jennifer L; Delp, Scott L; Sacks, Michael; Weiss, Jeffrey A; Ateshian, Gerard A; Maas, Steve A; McCulloch, Andrew D; Peng, Grace C Y

    2018-02-01

    The role of computational modeling for biomechanics research and related clinical care will be increasingly prominent. The biomechanics community has been developing computational models routinely for exploration of the mechanics and mechanobiology of diverse biological structures. As a result, a large array of models, data, and discipline-specific simulation software has emerged to support endeavors in computational biomechanics. Sharing computational models and related data and simulation software began as a utilitarian interest and has now become a necessity. Exchange of models, in support of knowledge exchange provided by scholarly publishing, has important implications. Specifically, model sharing can facilitate assessment of reproducibility in computational biomechanics and can provide an opportunity for repurposing and reuse, and a venue for medical training. The community's desire to investigate biological and biomechanical phenomena crossing multiple systems, scales, and physical domains also motivates sharing of modeling resources, as blending of models developed by domain experts will be a required step for comprehensive simulation studies as well as the enhancement of their rigor and reproducibility. The goal of this paper is to understand current perspectives in the biomechanics community for the sharing of computational models and related resources. Opinions on opportunities, challenges, and pathways to model sharing, particularly as part of the scholarly publishing workflow, were sought. A group of journal editors and a handful of investigators active in computational biomechanics were approached to collect short opinion pieces as a part of a larger effort of the IEEE EMBS Computational Biology and the Physiome Technical Committee to address model reproducibility through publications. A synthesis of these opinion pieces indicates that the community recognizes the necessity and usefulness of model sharing. There is a strong will to facilitate model sharing, and there are corresponding initiatives by the scientific journals. Outside the publishing enterprise, infrastructure to facilitate model sharing in biomechanics exists, and simulation software developers are interested in accommodating the community's needs for sharing of modeling resources. Encouragement for the use of standardized markups, concerns related to quality assurance, acknowledgement of increased burden, and importance of stewardship of resources are noted. In the short term, it is advisable that the community builds upon recent strategies and experiments with new pathways for continued demonstration of model sharing, its promotion, and its utility. Nonetheless, the need for a long-term strategy to unify approaches in sharing computational models and related resources is acknowledged. Development of a sustainable platform supported by a culture of open model sharing will likely evolve through continued and inclusive discussions bringing all stakeholders to the table, e.g., by possibly establishing a consortium.

  20. Reductions in fish-community contamination following lowhead dam removal linked more to shifts in food-web structure than sediment pollution.

    PubMed

    Davis, Robert P; Sullivan, S Mažeika P; Stefanik, Kay C

    2017-12-01

    Recent increases in dam removals have prompted research on ecological and geomorphic river responses, yet contaminant dynamics following dam removals are poorly understood. We investigated changes in sediment concentrations and fish-community body burdens of mercury (Hg), selenium (Se), polychlorinated biphenyls (PCB), and chlorinated pesticides before and after two lowhead dam removals in the Scioto and Olentangy Rivers (Columbus, Ohio). These changes were then related to documented shifts in fish food-web structure. Seven study reaches were surveyed from 2011 to 2015, including controls, upstream and downstream of the previous dams, and upstream restored vs. unrestored. For most contaminants, fish-community body burdens declined following dam removal and converged across study reaches by the last year of the study in both rivers. Aldrin and dieldrin body burdens in the Olentangy River declined more rapidly in the upstream-restored vs. the upstream-unrestored reach, but were indistinguishable by year three post dam removal. No upstream-downstream differences were observed in body burdens in the Olentangy River, but aldrin and dieldrin body burdens were 138 and 148% higher, respectively, in downstream reaches than in upstream reaches of the Scioto River following dam removal. The strongest relationships between trophic position and body burdens were observed with PCBs and Se in the Scioto River, and with dieldrin in the Olentangy River. Food-chain length - a key measure of trophic structure - was only weakly related to aldrin body burdens, and unrelated to other contaminants. Overall, we demonstrate that lowhead dam removal may effectively reduce ecosystem contamination, largely via shifts in fish food-web dynamics versus sediment contaminant concentrations. This study presents some of the first findings documenting ecosystem contamination following dam removal and will be useful in informing future dam removals. Copyright © 2017 Elsevier Ltd. All rights reserved.
