Sample records for optimal statistical quality

  1. Sb2Te3 and Its Superlattices: Optimization by Statistical Design.

    PubMed

    Behera, Jitendra K; Zhou, Xilin; Ranjan, Alok; Simpson, Robert E

    2018-05-02

    The objective of this work is to demonstrate the usefulness of fractional factorial design for optimizing the crystal quality of chalcogenide van der Waals (vdW) crystals. We statistically analyze the growth parameters of highly c-axis-oriented Sb2Te3 crystals and Sb2Te3-GeTe phase change vdW heterostructured superlattices. The statistical significance of the growth parameters of temperature, pressure, power, buffer materials, and buffer layer thickness was found by fractional factorial design and response surface analysis. Temperature, pressure, power, and their second-order interactions are the major factors that significantly influence the quality of the crystals. Additionally, using tungsten rather than molybdenum as a buffer layer significantly enhances the crystal quality. Fractional factorial design minimizes the number of experiments that are necessary to find the optimal growth conditions, resulting in an order of magnitude improvement in the crystal quality. We highlight that statistical design-of-experiments methods, which are more commonly used in product design, should be considered more broadly by those designing and optimizing materials.
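
    As an illustration of the screening strategy described in the record above, the sketch below builds a two-level 2^(5-1) fractional factorial (resolution V) in coded units and estimates main effects by least squares. The factor names, the aliasing generator, and the simulated response are placeholders for illustration, not the growth settings reported in the record.

    ```python
    # Minimal sketch of a two-level 2^(5-1) fractional factorial screen with the fifth
    # factor aliased to the four-way interaction (generator E = ABCD). Factor names and
    # the simulated response are illustrative placeholders.
    import itertools
    import numpy as np

    factors = ["temperature", "pressure", "power", "buffer_thickness"]  # four base factors
    base = np.array(list(itertools.product([-1, 1], repeat=len(factors))), dtype=float)
    e = base.prod(axis=1, keepdims=True)          # fifth factor aliased with the 4-way interaction
    design = np.hstack([base, e])                  # 16 runs instead of 2^5 = 32

    rng = np.random.default_rng(0)
    # Placeholder response: pretend crystal quality depends mainly on the first two factors.
    y = 1.0 + 0.8 * design[:, 0] + 0.5 * design[:, 1] + 0.3 * design[:, 0] * design[:, 1] \
        + rng.normal(scale=0.1, size=len(design))

    # Estimate the intercept and main effects by least squares on the coded design matrix.
    X = np.hstack([np.ones((len(design), 1)), design])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    for name, b in zip(["intercept"] + factors + ["buffer_material"], coef):
        print(f"{name:>16s}: {b:+.3f}")
    ```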

  2. Emerging Techniques for Dose Optimization in Abdominal CT

    PubMed Central

    Platt, Joel F.; Goodsitt, Mitchell M.; Al-Hawary, Mahmoud M.; Maturen, Katherine E.; Wasnik, Ashish P.; Pandya, Amit

    2014-01-01

    Recent advances in computed tomographic (CT) scanning techniques such as automated tube current modulation (ATCM), optimized x-ray tube voltage, and better use of iterative image reconstruction have allowed maintenance of good CT image quality with reduced radiation dose. ATCM varies the tube current during scanning to account for differences in patient attenuation, ensuring a more homogeneous image quality, although selection of the appropriate image quality parameter is essential for achieving optimal dose reduction. Reducing the x-ray tube voltage is best suited for evaluating iodinated structures, since the effective energy of the x-ray beam will be closer to the k-edge of iodine, resulting in higher attenuation of iodine. The optimal kilovoltage for a CT study should be chosen on the basis of imaging task and patient habitus. The aim of iterative image reconstruction is to identify factors that contribute to noise on CT images with use of statistical models of noise (statistical iterative reconstruction) and selective removal of noise to improve image quality. The degree of noise suppression achieved with statistical iterative reconstruction can be customized to minimize the effect of altered image quality on CT images. Unlike with statistical iterative reconstruction, model-based iterative reconstruction algorithms model both the statistical noise and the physical acquisition process, allowing CT to be performed with further reduction in radiation dose without an increase in image noise or loss of spatial resolution. Understanding these recently developed scanning techniques is essential for optimization of imaging protocols designed to achieve the desired image quality with a reduced dose. © RSNA, 2014 PMID:24428277

  3. Explicit optimization of plan quality measures in intensity-modulated radiation therapy treatment planning.

    PubMed

    Engberg, Lovisa; Forsgren, Anders; Eriksson, Kjell; Hårdemark, Björn

    2017-06-01

    To formulate convex planning objectives for treatment plan multicriteria optimization with explicit relationships to the dose-volume histogram (DVH) statistics used in plan quality evaluation. Conventional planning objectives are designed to minimize the violation of DVH statistics thresholds using penalty functions. Although successful in guiding the DVH curve towards these thresholds, conventional planning objectives offer limited control of the individual points on the DVH curve (doses-at-volume) used to evaluate plan quality. In this study, we abandon the usual penalty-function framework and propose planning objectives that more closely relate to DVH statistics. The proposed planning objectives are based on mean-tail-dose, resulting in convex optimization. We also demonstrate how to adapt a standard optimization method to the proposed formulation in order to obtain a substantial reduction in computational cost. We investigated the potential of the proposed planning objectives as tools for optimizing DVH statistics through juxtaposition with the conventional planning objectives on two patient cases. Sets of treatment plans with differently balanced planning objectives were generated using either the proposed or the conventional approach. Dominance in the sense of better distributed doses-at-volume was observed in plans optimized within the proposed framework. The initial computational study indicates that the DVH statistics are better optimized and more efficiently balanced using the proposed planning objectives than using the conventional approach. © 2017 American Association of Physicists in Medicine.
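
    The abstract above does not reproduce its formulas; the sketch below assumes the standard tail-mean (CVaR-style) reading of mean-tail-dose, the convex surrogate for dose-at-volume that the abstract refers to. The dose values are synthetic placeholders.

    ```python
    # Minimal sketch of mean-tail-dose: the mean dose of the hottest (or coldest) fraction
    # of a structure's voxels, a convex stand-in for dose-at-volume DVH statistics.
    import numpy as np

    def upper_mean_tail_dose(dose, volume_fraction):
        """Mean dose of the hottest `volume_fraction` of voxels (upper tail, for OARs)."""
        n_tail = max(1, int(round(volume_fraction * dose.size)))
        return np.sort(dose)[-n_tail:].mean()

    def lower_mean_tail_dose(dose, volume_fraction):
        """Mean dose of the coldest `volume_fraction` of voxels (lower tail, for targets)."""
        n_tail = max(1, int(round(volume_fraction * dose.size)))
        return np.sort(dose)[:n_tail].mean()

    rng = np.random.default_rng(1)
    oar_dose = rng.gamma(shape=4.0, scale=5.0, size=10_000)   # illustrative OAR voxel doses, Gy
    print("mean dose to hottest 5%:", round(upper_mean_tail_dose(oar_dose, 0.05), 2), "Gy")
    print("mean dose to coldest 95%:", round(lower_mean_tail_dose(oar_dose, 0.95), 2), "Gy")
    ```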

  4. Dispositional optimism and sleep quality: a test of mediating pathways

    PubMed Central

    Cribbet, Matthew; Kent de Grey, Robert G.; Cronan, Sierra; Trettevik, Ryan; Smith, Timothy W.

    2016-01-01

    Dispositional optimism has been related to beneficial influences on physical health outcomes. However, its links to global sleep quality and the psychological mediators responsible for such associations are less studied. This study thus examined if trait optimism predicted global sleep quality, and if measures of subjective well-being were statistical mediators of such links. A community sample of 175 participants (93 men, 82 women) completed measures of trait optimism, depression, and life satisfaction. Global sleep quality was assessed using the Pittsburgh Sleep Quality Index. Results indicated that trait optimism was a strong predictor of better PSQI global sleep quality. Moreover, this association was mediated by depression and life satisfaction in both single and multiple mediator models. These results highlight the importance of optimism for the restorative process of sleep, as well as the utility of multiple mediator models in testing distinct psychological pathways. PMID:27592128

  5. Dispositional optimism and sleep quality: a test of mediating pathways.

    PubMed

    Uchino, Bert N; Cribbet, Matthew; de Grey, Robert G Kent; Cronan, Sierra; Trettevik, Ryan; Smith, Timothy W

    2017-04-01

    Dispositional optimism has been related to beneficial influences on physical health outcomes. However, its links to global sleep quality and the psychological mediators responsible for such associations are less studied. This study thus examined if trait optimism predicted global sleep quality, and if measures of subjective well-being were statistical mediators of such links. A community sample of 175 participants (93 men, 82 women) completed measures of trait optimism, depression, and life satisfaction. Global sleep quality was assessed using the Pittsburgh Sleep Quality Index. Results indicated that trait optimism was a strong predictor of better PSQI global sleep quality. Moreover, this association was mediated by depression and life satisfaction in both single and multiple mediator models. These results highlight the importance of optimism for the restorative process of sleep, as well as the utility of multiple mediator models in testing distinct psychological pathways.
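
    The two records above are the PubMed Central and PubMed versions of the same study. As a rough illustration of the single-mediator analysis they describe, the sketch below estimates an indirect effect (optimism through depression to sleep quality) with a percentile bootstrap; the data and effect sizes are synthetic, not the study's.

    ```python
    # Minimal sketch of a regression-based single-mediator model with a percentile
    # bootstrap of the indirect effect a*b. All data are synthetic.
    import numpy as np

    rng = np.random.default_rng(2)
    n = 175
    optimism = rng.normal(size=n)
    depression = -0.5 * optimism + rng.normal(size=n)               # mediator
    sleep_quality = -0.4 * depression + 0.2 * optimism + rng.normal(size=n)

    def ols_coef(y, X):
        """Coefficients of y ~ X with an intercept prepended."""
        X1 = np.column_stack([np.ones(len(X)), X])
        return np.linalg.lstsq(X1, y, rcond=None)[0]

    def indirect_effect(x, m, y):
        a = ols_coef(m, x[:, None])[1]                              # x -> m
        b = ols_coef(y, np.column_stack([m, x]))[1]                 # m -> y, controlling for x
        return a * b

    boot = np.empty(2000)
    for i in range(boot.size):
        idx = rng.integers(0, n, size=n)
        boot[i] = indirect_effect(optimism[idx], depression[idx], sleep_quality[idx])

    lo, hi = np.percentile(boot, [2.5, 97.5])
    print(f"indirect effect = {indirect_effect(optimism, depression, sleep_quality):.3f}, "
          f"95% bootstrap CI [{lo:.3f}, {hi:.3f}]")
    ```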

  6. Subjective audio quality evaluation of embedded-optimization-based distortion precompensation algorithms.

    PubMed

    Defraene, Bruno; van Waterschoot, Toon; Diehl, Moritz; Moonen, Marc

    2016-07-01

    Subjective audio quality evaluation experiments have been conducted to assess the performance of embedded-optimization-based precompensation algorithms for mitigating perceptible linear and nonlinear distortion in audio signals. It is concluded with statistical significance that the perceived audio quality is improved by applying an embedded-optimization-based precompensation algorithm, both when (i) nonlinear distortion alone and when (ii) a combination of linear and nonlinear distortion is present. Moreover, a significant positive correlation is reported between the collected subjective and objective PEAQ audio quality scores, supporting the validity of using PEAQ to predict the impact of linear and nonlinear distortion on the perceived audio quality.

  7. Assessment of quality outcomes for robotic pancreaticoduodenectomy: identification of the learning curve.

    PubMed

    Boone, Brian A; Zenati, Mazen; Hogg, Melissa E; Steve, Jennifer; Moser, Arthur James; Bartlett, David L; Zeh, Herbert J; Zureikat, Amer H

    2015-05-01

    Quality assessment is an important instrument to ensure optimal surgical outcomes, particularly during the adoption of new surgical technology. The use of the robotic platform for complex pancreatic resections, such as the pancreaticoduodenectomy, requires close monitoring of outcomes during its implementation phase to ensure patient safety is maintained and the learning curve identified. To report the results of a quality analysis and learning curve during the implementation of robotic pancreaticoduodenectomy (RPD). A retrospective review of a prospectively maintained database of 200 consecutive patients who underwent RPD in a large academic center from October 3, 2008, through March 1, 2014, was evaluated for important metrics of quality. Patients were analyzed in groups of 20 to minimize demographic differences and optimize the ability to detect statistically meaningful changes in performance. Robotic pancreaticoduodenectomy. Optimization of perioperative outcome parameters. No statistical differences in mortality rates or major morbidity were noted during the study. Statistical improvements in estimated blood loss and conversions to open surgery occurred after 20 cases (600 mL vs 250 mL [P = .002] and 35.0% vs 3.3% [P < .001], respectively), incidence of pancreatic fistula after 40 cases (27.5% vs 14.4%; P = .04), and operative time after 80 cases (581 minutes vs 417 minutes [P < .001]). Complication rates, lengths of stay, and readmission rates showed continuous improvement that did not reach statistical significance. Outcomes for the last 120 cases (representing optimized metrics beyond the learning curve) included a mean operative time of 417 minutes, median estimated blood loss of 250 mL, a conversion rate of 3.3%, 90-day mortality of 3.3%, a clinically significant (grade B/C) pancreatic fistula rate of 6.9%, and a median length of stay of 9 days. Continuous assessment of quality metrics allows for safe implementation of RPD. We identified several inflexion points corresponding to optimization of performance metrics for RPD that can be used as benchmarks for surgeons who are adopting this technology.

  8. An intelligent case-adjustment algorithm for the automated design of population-based quality auditing protocols.

    PubMed

    Advani, Aneel; Jones, Neil; Shahar, Yuval; Goldstein, Mary K; Musen, Mark A

    2004-01-01

    We develop a method and algorithm for deciding the optimal approach to creating quality-auditing protocols for guideline-based clinical performance measures. An important element of the audit protocol design problem is deciding which guideline elements to audit. Specifically, the problem is how and when to aggregate individual patient case-specific guideline elements into population-based quality measures. The key statistical issue involved is the trade-off between increased reliability with more general population-based quality measures versus increased validity from individually case-adjusted but more restricted measures done at a greater audit cost. Our intelligent algorithm for auditing protocol design is based on hierarchically modeling incrementally case-adjusted quality constraints. We select quality constraints to measure using an optimization criterion based on statistical generalizability coefficients. We present results of the approach from a deployed decision support system for a hypertension guideline.

  9. The effect of statistical noise on IMRT plan quality and convergence for MC-based and MC-correction-based optimized treatment plans.

    PubMed

    Siebers, Jeffrey V

    2008-04-04

    Monte Carlo (MC) is rarely used for IMRT plan optimization outside of research centres due to the extensive computational resources or long computation times required to complete the process. Time can be reduced by degrading the statistical precision of the MC dose calculation used within the optimization loop. However, this eventually introduces optimization convergence errors (OCEs). This study determines the statistical noise levels tolerated during MC-IMRT optimization under the condition that the optimized plan has OCEs <100 cGy (1.5% of the prescription dose). Seven-field prostate IMRT treatment plans for 10 prostate patients are used in this study. Pre-optimization is performed for deliverable beams with a pencil-beam (PB) dose algorithm. Further deliverable-based optimization proceeds using: (1) MC-based optimization, where dose is recomputed with MC after each intensity update or (2) a once-corrected (OC) MC-hybrid optimization, where an MC dose computation defines beam-by-beam dose correction matrices that are used during a PB-based optimization. Optimizations are performed with nominal per beam MC statistical precisions of 2, 5, 8, 10, 15, and 20%. Following optimizer convergence, beams are re-computed with MC using 2% per beam nominal statistical precision and the 2 PTV and 10 OAR dose indices used in the optimization objective function are tallied. For both the MC-optimization and OC-optimization methods, statistical equivalence tests found that OCEs are less than 1.5% of the prescription dose for plans optimized with nominal statistical uncertainties of up to 10% per beam. The achieved statistical uncertainty in the patient for the 10% per beam simulations from the combination of the 7 beams is ~3% with respect to maximum dose for voxels with D>0.5D(max). The MC dose computation time for the OC-optimization is only 6.2 minutes on a single 3 GHz processor with results clinically equivalent to high precision MC computations.

  10. Optimal experimental designs for fMRI when the model matrix is uncertain.

    PubMed

    Kao, Ming-Hung; Zhou, Lin

    2017-07-15

    This study concerns optimal designs for functional magnetic resonance imaging (fMRI) experiments when the model matrix of the statistical model depends on both the selected stimulus sequence (fMRI design) and the subject's uncertain feedback (e.g. answer) to each mental stimulus (e.g. question) presented to her/him. While practically important, this design issue is challenging. This is mainly because the information matrix cannot be fully determined at the design stage, making it difficult to evaluate the quality of the selected designs. To tackle this challenging issue, we propose an easy-to-use optimality criterion for evaluating the quality of designs, and an efficient approach for obtaining designs optimizing this criterion. Compared with a previously proposed method, our approach requires much less computing time to achieve designs with high statistical efficiencies. Copyright © 2017 Elsevier Inc. All rights reserved.

  11. [Applications of the hospital statistics management system].

    PubMed

    Zhai, Hong; Ren, Yong; Liu, Jing; Li, You-Zhang; Ma, Xiao-Long; Jiao, Tao-Tao

    2008-01-01

    The Hospital Statistics Management System is built on an Office Automation Platform of the Shandong provincial hospital system. Its workflow, role, and permission (access-control) technologies are used to standardize and optimize the statistics management program within the total quality control of hospital statistics. The system's applications combine the office automation platform with hospital statistics management, providing a practical example of a modern hospital statistics management model.

  12. A Data-Driven Solution for Performance Improvement

    NASA Technical Reports Server (NTRS)

    2002-01-01

    Marketed as the "Software of the Future," Optimal Engineering Systems P.I. EXPERT(TM) technology offers statistical process control and optimization techniques that are critical to businesses looking to restructure or accelerate operations in order to gain a competitive edge. Kennedy Space Center granted Optimal Engineering Systems the funding and aid necessary to develop a prototype of the process monitoring and improvement software. Completion of this prototype demonstrated that it was possible to integrate traditional statistical quality assurance tools with robust optimization techniques in a user-friendly format that is visually compelling. Using an expert system knowledge base, the software allows the user to determine objectives, capture constraints and out-of-control processes, predict results, and compute optimal process settings.

  13. A combined geostatistical-optimization model for the optimal design of a groundwater quality monitoring network

    NASA Astrophysics Data System (ADS)

    Kolosionis, Konstantinos; Papadopoulou, Maria P.

    2017-04-01

    Monitoring networks provide essential information for water resources management, especially in areas with significant groundwater exploitation due to extensive agricultural activities. In this work, a simulation-optimization framework is developed based on heuristic optimization methodologies and geostatistical modeling approaches to obtain an optimal design for a groundwater quality monitoring network. Groundwater quantity and quality data obtained from 43 existing observation locations at 3 different hydrological periods in the Mires basin in Crete, Greece will be used in the proposed framework, via regression kriging, to develop the spatial distribution of nitrate concentration in the aquifer of interest. Based on the existing groundwater quality mapping, the proposed optimization tool will determine a cost-effective observation well network that contributes significant information to water managers and authorities. The elimination of observation wells that add little or no beneficial information to groundwater level and quality mapping of the area can be achieved using estimation uncertainty and statistical error metrics without affecting the assessment of the groundwater quality. Given the high maintenance cost of groundwater monitoring networks, the proposed tool could be used by water regulators in the decision-making process to obtain an efficient network design.

  14. Optimization of hole generation in Ti/CFRP stacks

    NASA Astrophysics Data System (ADS)

    Ivanov, Y. N.; Pashkov, A. E.; Chashhin, N. S.

    2018-03-01

    The article aims to describe methods for improving the surface quality and hole accuracy in Ti/CFRP stacks by optimizing cutting methods and drill geometry. The research is based on the fundamentals of machine building, theory of probability, mathematical statistics, and experiment planning and manufacturing process optimization theories. Statistical processing of experiment data was carried out by means of Statistica 6 and Microsoft Excel 2010. Surface geometry in Ti stacks was analyzed using a Taylor Hobson Form Talysurf i200 Series Profilometer, and in CFRP stacks using a Bruker ContourGT-Kl Optical Microscope. Hole shapes and sizes were analyzed using a Carl Zeiss CONTURA G2 Measuring machine, and temperatures in cutting zones were recorded with a FLIR SC7000 Series Infrared Camera. Models of multivariate analysis of variance were developed. They show the effects of drilling modes on surface quality and accuracy of holes in Ti/CFRP stacks. The task of multicriteria drilling process optimization was solved. Optimal cutting technologies which improve performance were developed. Methods for assessing thermal tool and material expansion effects on the accuracy of holes in Ti/CFRP/Ti stacks were developed.

  15. Statistical efficiency of adaptive algorithms.

    PubMed

    Widrow, Bernard; Kamenetsky, Max

    2003-01-01

    The statistical efficiency of a learning algorithm applied to the adaptation of a given set of variable weights is defined as the ratio of the quality of the converged solution to the amount of data used in training the weights. Statistical efficiency is computed by averaging over an ensemble of learning experiences. A high quality solution is very close to optimal, while a low quality solution corresponds to noisy weights and less than optimal performance. In this work, two gradient descent adaptive algorithms are compared, the LMS algorithm and the LMS/Newton algorithm. LMS is simple and practical, and is used in many applications worldwide. LMS/Newton is based on Newton's method and the LMS algorithm. LMS/Newton is optimal in the least squares sense. It maximizes the quality of its adaptive solution while minimizing the use of training data. Many least squares adaptive algorithms have been devised over the years, but no other least squares algorithm can give better performance, on average, than LMS/Newton. LMS is easily implemented, but LMS/Newton, although of great mathematical interest, cannot be implemented in most practical applications. Because of its optimality, LMS/Newton serves as a benchmark for all least squares adaptive algorithms. The performances of LMS and LMS/Newton are compared, and it is found that under many circumstances, both algorithms provide equal performance. For example, when both algorithms are tested with statistically nonstationary input signals, their average performances are equal. When adapting with stationary input signals and with random initial conditions, their respective learning times are on average equal. However, under worst-case initial conditions, the learning time of LMS can be much greater than that of LMS/Newton, and this is the principal disadvantage of the LMS algorithm. But the strong points of LMS are ease of implementation and optimal performance under important practical conditions. For these reasons, the LMS algorithm has enjoyed very widespread application. It is used in almost every modem for channel equalization and echo cancelling. Furthermore, it is related to the famous backpropagation algorithm used for training neural networks.
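
    For readers unfamiliar with the baseline algorithm discussed in the record above, the sketch below runs plain LMS on a toy system-identification problem. The unknown filter, step size, and noise level are illustrative choices, not values from the paper.

    ```python
    # Minimal sketch of the LMS adaptive filter: w(k+1) = w(k) + 2*mu*e(k)*x(k),
    # identifying an unknown FIR filter from noisy input/output data.
    import numpy as np

    rng = np.random.default_rng(3)
    true_w = np.array([0.6, -0.3, 0.1])            # unknown system (illustrative)
    n_taps, mu, n_samples = len(true_w), 0.01, 5000

    x = rng.normal(size=n_samples)                  # input signal
    d = np.convolve(x, true_w)[:n_samples] + 0.01 * rng.normal(size=n_samples)  # desired output

    w = np.zeros(n_taps)
    for k in range(n_taps, n_samples):
        x_k = x[k - n_taps + 1:k + 1][::-1]         # most recent samples, newest first
        e = d[k] - w @ x_k                          # a-priori error
        w = w + 2 * mu * e * x_k                    # LMS weight update

    print("estimated weights:", np.round(w, 3), " true weights:", true_w)
    ```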

  16. Volume reconstruction optimization for tomo-PIV algorithms applied to experimental data

    NASA Astrophysics Data System (ADS)

    Martins, Fabio J. W. A.; Foucaut, Jean-Marc; Thomas, Lionel; Azevedo, Luis F. A.; Stanislas, Michel

    2015-08-01

    Tomographic PIV is a three-component volumetric velocity measurement technique based on the tomographic reconstruction of a particle distribution imaged by multiple camera views. In essence, the performance and accuracy of this technique are highly dependent on the parametric adjustment and the reconstruction algorithm used. Although synthetic data have been widely employed to optimize experiments, the resulting reconstructed volumes might not have optimal quality. The purpose of the present study is to offer quality indicators that can be applied to data samples in order to improve the quality of velocity results obtained by the tomo-PIV technique. The methodology proposed can potentially lead to a significant reduction in the time required to optimize a tomo-PIV reconstruction, while also leading to better quality velocity results. Tomo-PIV data provided by a six-camera turbulent boundary-layer experiment were used to optimize the reconstruction algorithms according to this methodology. Velocity statistics measurements obtained by optimized BIMART, SMART and MART algorithms were compared with hot-wire anemometer data and velocity measurement uncertainties were computed. Results indicated that BIMART and SMART algorithms produced reconstructed volumes of quality equivalent to the standard MART, with the benefit of reduced computational time.
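
    The reconstruction step optimized in the study above is iterative and multiplicative; the sketch below shows a schematic MART update on a tiny dense toy system. Real tomo-PIV builds the voxel-to-pixel weight matrix from camera calibrations and uses sparse storage, so this is illustrative only.

    ```python
    # Schematic MART (multiplicative algebraic reconstruction technique) iteration on a
    # toy linear system A e = b, standing in for the voxel-intensity reconstruction.
    import numpy as np

    rng = np.random.default_rng(4)
    n_pixels, n_voxels = 80, 60
    A = rng.uniform(0.0, 1.0, size=(n_pixels, n_voxels))        # voxel-to-pixel weights (toy)
    e_true = np.zeros(n_voxels)
    e_true[rng.choice(n_voxels, size=6, replace=False)] = 1.0   # a few "particles"
    b = A @ e_true                                              # recorded pixel intensities

    e = np.ones(n_voxels)                                       # uniform initial guess
    mu = 0.5                                                    # relaxation parameter
    for _ in range(50):                                         # MART sweeps over all pixels
        for i in range(n_pixels):
            proj = A[i] @ e
            if proj > 0:
                e *= (b[i] / proj) ** (mu * A[i])               # multiplicative per-voxel update

    print("relative reconstruction error:", round(np.linalg.norm(e - e_true) / np.linalg.norm(e_true), 3))
    ```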

  17. Direct labeling of serum proteins by fluorescent dye for antibody microarray.

    PubMed

    Klimushina, M V; Gumanova, N G; Metelskaya, V A

    2017-05-06

    Analysis of serum proteome by antibody microarray is used to identify novel biomarkers and to study signaling pathways including protein phosphorylation and protein-protein interactions. Labeling of serum proteins is important for optimal performance of the antibody microarray. Proper choice of fluorescent label and optimal concentration of protein loaded on the microarray ensure good quality of imaging that can be reliably scanned and processed by the software. We have optimized direct serum protein labeling using fluorescent dye Arrayit Green 540 (Arrayit Corporation, USA) for antibody microarray. Optimized procedure produces high quality images that can be readily scanned and used for statistical analysis of protein composition of the serum. Copyright © 2017 Elsevier Inc. All rights reserved.

  18. Identification of optimal mask size parameter for noise filtering in 99mTc-methylene diphosphonate bone scintigraphy images.

    PubMed

    Pandey, Anil K; Bisht, Chandan S; Sharma, Param D; ArunRaj, Sreedharan Thankarajan; Taywade, Sameer; Patel, Chetan; Bal, Chandrashekhar; Kumar, Rakesh

    2017-11-01

    99mTc-methylene diphosphonate (99mTc-MDP) bone scintigraphy images have a limited number of counts per pixel. A noise filtering method based on local statistics of the image produces better results than a linear filter. However, the mask size has a significant effect on image quality. In this study, we have identified the optimal mask size that yields a good smooth bone scan image. Forty-four bone scan images were processed using mask sizes of 3, 5, 7, 9, 11, 13, and 15 pixels. The input and processed images were reviewed in two steps. In the first step, the images were inspected and the mask sizes that produced images with significant loss of clinical detail in comparison with the input image were excluded. In the second step, the image quality of the 40 sets of images (each set had the input image and its corresponding three processed images with 3-, 5-, and 7-pixel masks) was assessed by two nuclear medicine physicians. They selected one good smooth image from each set of images. The image quality was also assessed quantitatively with a line profile. Fisher's exact test was used to find statistically significant differences between images processed with the 5- and 7-pixel masks at a 5% cut-off. A statistically significant difference was found between the image quality with the 5- and 7-pixel masks at P=0.00528. The identified optimal mask size to produce a good smooth image was found to be 7 pixels. The best mask size for the Jong-Sen Lee filter was found to be 7×7 pixels, which yielded 99mTc-MDP bone scan images with the highest acceptable smoothness.
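
    A minimal sketch of a local-statistics (Lee-type) filter with a configurable mask size, the parameter tuned in the study above, is shown below. The phantom image and the global noise-variance estimate are crude placeholders; clinical images and a proper noise model would replace them.

    ```python
    # Minimal sketch of an adaptive local-statistics filter:
    # out = local_mean + k * (img - local_mean), with k = local_var / (local_var + noise_var).
    import numpy as np
    from scipy.ndimage import uniform_filter

    def lee_filter(img, mask_size, noise_var):
        img = img.astype(float)
        local_mean = uniform_filter(img, size=mask_size)
        local_sq_mean = uniform_filter(img * img, size=mask_size)
        local_var = np.maximum(local_sq_mean - local_mean ** 2, 0.0)
        k = local_var / (local_var + noise_var + 1e-12)
        return local_mean + k * (img - local_mean)

    rng = np.random.default_rng(5)
    phantom = np.zeros((128, 128))
    phantom[40:90, 55:75] = 30.0                      # a "high-uptake" region on a low background
    noisy = rng.poisson(phantom + 3.0).astype(float)  # low-count Poisson noise

    # Crude noise-variance estimate: for Poisson counts, variance is roughly the mean count.
    smoothed = lee_filter(noisy, mask_size=7, noise_var=noisy.mean())
    print("background std before:", round(noisy[:30, :30].std(), 2),
          " after:", round(smoothed[:30, :30].std(), 2))
    ```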

  19. Implementation of quality by design principles in the development of microsponges as drug delivery carriers: Identification and optimization of critical factors using multivariate statistical analyses and design of experiments studies.

    PubMed

    Simonoska Crcarevska, Maja; Dimitrovska, Aneta; Sibinovska, Nadica; Mladenovska, Kristina; Slavevska Raicki, Renata; Glavas Dodov, Marija

    2015-07-15

    Microsponges drug delivery system (MDDC) was prepared by double emulsion-solvent-diffusion technique using rotor-stator homogenization. Quality by design (QbD) concept was implemented for the development of MDDC with potential to be incorporated into semisolid dosage form (gel). Quality target product profile (QTPP) and critical quality attributes (CQA) were defined and identified, accordingly. Critical material attributes (CMA) and Critical process parameters (CPP) were identified using quality risk management (QRM) tool, failure mode, effects and criticality analysis (FMECA). CMA and CPP were identified based on results obtained from principal component analysis (PCA-X&Y) and partial least squares (PLS) statistical analysis along with literature data, product and process knowledge and understanding. FMECA identified amount of ethylcellulose, chitosan, acetone, dichloromethane, span 80, tween 80 and water ratio in primary/multiple emulsions as CMA and rotation speed and stirrer type used for organic solvent removal as CPP. The relationship between identified CPP and particle size as CQA was described in the design space using design of experiments - one-factor response surface method. Obtained results from statistically designed experiments enabled establishment of mathematical models and equations that were used for detailed characterization of influence of identified CPP upon MDDC particle size and particle size distribution and their subsequent optimization. Copyright © 2015 Elsevier B.V. All rights reserved.

  20. Computer Optimization of Biodegradable Nanoparticles Fabricated by Dispersion Polymerization.

    PubMed

    Akala, Emmanuel O; Adesina, Simeon; Ogunwuyi, Oluwaseun

    2015-12-22

    Quality by design (QbD) in the pharmaceutical industry involves designing and developing drug formulations and manufacturing processes which ensure predefined drug product specifications. QbD helps to understand how process and formulation variables affect product characteristics and subsequent optimization of these variables vis-à-vis final specifications. Statistical design of experiments (DoE) identifies important parameters in a pharmaceutical dosage form design followed by optimizing the parameters with respect to certain specifications. DoE establishes in mathematical form the relationships between critical process parameters together with critical material attributes and critical quality attributes. We focused on the fabrication of biodegradable nanoparticles by dispersion polymerization. Aided by statistical software, a d-optimal mixture design was used to vary the components (crosslinker, initiator, stabilizer, and macromonomers) to obtain twenty nanoparticle formulations (PLLA-based nanoparticles) and thirty formulations (poly-ɛ-caprolactone-based nanoparticles). Scheffe polynomial models were generated to predict particle size (nm), zeta potential, and yield (%) as functions of the composition of the formulations. Simultaneous optimizations were carried out on the response variables. Solutions were returned from simultaneous optimization of the response variables for component combinations to (1) minimize nanoparticle size; (2) maximize the surface negative zeta potential; and (3) maximize percent yield to make the nanoparticle fabrication an economic proposition.
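
    As a rough illustration of the modeling step in the record above, the sketch below fits a second-order Scheffe mixture polynomial (no intercept, component fractions summing to one) by least squares. The component names, mixture points, and response are synthetic placeholders.

    ```python
    # Minimal sketch of fitting a second-order Scheffe mixture model
    # y = sum_i b_i*x_i + sum_{i<j} b_ij*x_i*x_j (no intercept) to synthetic mixture data.
    import itertools
    import numpy as np

    rng = np.random.default_rng(6)
    n_runs, n_comp = 20, 4                              # e.g. crosslinker, initiator, stabilizer, macromonomer
    X = rng.dirichlet(np.ones(n_comp), size=n_runs)     # mixture fractions, rows sum to 1

    # Synthetic "particle size" response for illustration only.
    y = 200 * X[:, 0] + 150 * X[:, 1] + 120 * X[:, 2] + 300 * X[:, 3] \
        - 180 * X[:, 0] * X[:, 3] + rng.normal(scale=5, size=n_runs)

    pairs = list(itertools.combinations(range(n_comp), 2))
    design = np.column_stack([X] + [X[:, i] * X[:, j] for i, j in pairs])
    coef, *_ = np.linalg.lstsq(design, y, rcond=None)

    terms = [f"x{i+1}" for i in range(n_comp)] + [f"x{i+1}*x{j+1}" for i, j in pairs]
    for t, b in zip(terms, coef):
        print(f"{t:>6s}: {b:+.1f}")
    ```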

  1. Interpretation of the results of statistical measurements [search for basic probability model].

    NASA Technical Reports Server (NTRS)

    Olshevskiy, V. V.

    1973-01-01

    For random processes, the calculated probability characteristic and the measured statistical estimate are used in a quality functional, which defines the difference between the two functions. Based on the assumption that the statistical measurement procedure is organized so that the parameters for a selected model are optimized, it is shown that the interpretation of experimental research is a search for a basic probability model.

  2. [Quantitative structure-gas chromatographic retention relationship of polycyclic aromatic sulfur heterocycles using molecular electronegativity-distance vector].

    PubMed

    Li, Zhenghua; Cheng, Fansheng; Xia, Zhining

    2011-01-01

    The chemical structures of 114 polycyclic aromatic sulfur heterocycles (PASHs) have been studied by the molecular electronegativity-distance vector (MEDV). The linear relationships between gas chromatographic retention index and the MEDV have been established by a multiple linear regression (MLR) model. The results of variable selection by stepwise multiple regression (SMR) and the predictive ability of the optimization model appraised by leave-one-out cross-validation showed that the optimization model, with a correlation coefficient (R) of 0.9947 and a cross-validated correlation coefficient (Rcv) of 0.9940, possessed the best statistical quality. Furthermore, when the 114 PASH compounds were divided into calibration and test sets in the ratio of 2:1, the statistical analysis showed that our models possess almost equal statistical quality, very similar regression coefficients, and good robustness. The quantitative structure-retention relationship (QSRR) model established may provide a convenient and powerful method for predicting the gas chromatographic retention of PASHs.
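
    A minimal sketch of the fit and leave-one-out validation statistics reported in the record above is given below, using scikit-learn; the descriptor matrix is a synthetic stand-in for the MEDV descriptors, so the numbers it prints are not the paper's.

    ```python
    # Minimal sketch of R and leave-one-out cross-validated Rcv for a multiple linear
    # regression model, on synthetic descriptor data.
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import LeaveOneOut, cross_val_predict

    rng = np.random.default_rng(7)
    n_compounds, n_descriptors = 114, 6
    X = rng.normal(size=(n_compounds, n_descriptors))
    retention_index = X @ rng.uniform(50, 200, size=n_descriptors) + rng.normal(scale=20, size=n_compounds)

    model = LinearRegression().fit(X, retention_index)
    fitted = model.predict(X)
    loo_pred = cross_val_predict(LinearRegression(), X, retention_index, cv=LeaveOneOut())

    r = np.corrcoef(retention_index, fitted)[0, 1]
    r_cv = np.corrcoef(retention_index, loo_pred)[0, 1]
    print(f"R = {r:.4f}, Rcv = {r_cv:.4f}")
    ```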

  3. Statistical model for speckle pattern optimization.

    PubMed

    Su, Yong; Zhang, Qingchuan; Gao, Zeren

    2017-11-27

    Image registration is the key technique of optical metrologies such as digital image correlation (DIC), particle image velocimetry (PIV), and speckle metrology. Its performance depends critically on the quality of the image pattern, and thus pattern optimization attracts extensive attention. In this article, a statistical model is built to optimize speckle patterns that are composed of randomly positioned speckles. It is found that the process of speckle pattern generation is essentially a filtered Poisson process. The dependence of measurement errors (including systematic errors, random errors, and overall errors) upon speckle pattern generation parameters is characterized analytically. By minimizing the errors, formulas for the optimal speckle radius are presented. Although the primary motivation is from the field of DIC, we believe that scholars in other optical measurement communities, such as PIV and speckle metrology, will benefit from these discussions.
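
    The abstract above describes speckle generation as a filtered Poisson process; the sketch below generates such a pattern (a Poisson-distributed number of Gaussian spots at uniformly random positions). The speckle radius and density are illustrative parameters, not the paper's optimal values.

    ```python
    # Minimal sketch of a filtered-Poisson speckle pattern: a random (Poisson) number of
    # Gaussian spots placed at uniformly random positions on the image plane.
    import numpy as np

    def make_speckle_pattern(size=128, density=0.02, radius=3.0, seed=0):
        """Return a synthetic speckle image of shape (size, size) with values in [0, 1]."""
        rng = np.random.default_rng(seed)
        n_speckles = rng.poisson(density * size * size)          # Poisson-distributed count
        centers = rng.uniform(0, size, size=(n_speckles, 2))     # random speckle positions
        yy, xx = np.mgrid[0:size, 0:size]
        img = np.zeros((size, size))
        for cy, cx in centers:
            img += np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2 * radius ** 2))
        return np.clip(img, 0.0, 1.0)

    pattern = make_speckle_pattern()
    print("speckle coverage (fraction of bright pixels):", round((pattern > 0.5).mean(), 3))
    ```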

  4. Mitigating Provider Uncertainty in Service Provision Contracts

    NASA Astrophysics Data System (ADS)

    Smith, Chris; van Moorsel, Aad

    Uncertainty is an inherent property of open, distributed and multiparty systems. The viability of the mutually beneficial relationships which motivate these systems relies on rational decision-making by each constituent party under uncertainty. Service provision in distributed systems is one such relationship. Uncertainty is experienced by the service provider in his ability to deliver a service with selected quality level guarantees due to inherent non-determinism, such as load fluctuations and hardware failures. Statistical estimators utilized to model this non-determinism introduce additional uncertainty through sampling error. Inability of the provider to accurately model and analyze uncertainty in the quality level guarantees can result in the formation of sub-optimal service provision contracts. Emblematic consequences include loss of revenue, inefficient resource utilization and erosion of reputation and consumer trust. We propose a utility model for contract-based service provision to provide a systematic approach to optimal service provision contract formation under uncertainty. Performance prediction methods to enable the derivation of statistical estimators for quality level are introduced, with analysis of their resultant accuracy and cost.

  5. Spread spectrum image watermarking based on perceptual quality metric.

    PubMed

    Zhang, Fan; Liu, Wenyu; Lin, Weisi; Ngan, King Ngi

    2011-11-01

    Efficient image watermarking calls for full exploitation of the perceptual distortion constraint. Second-order statistics of visual stimuli are regarded as critical features for perception. This paper proposes a second-order statistics (SOS)-based image quality metric, which considers the texture masking effect and the contrast sensitivity in Karhunen-Loève transform domain. Compared with the state-of-the-art metrics, the quality prediction by SOS better correlates with several subjectively rated image databases, in which the images are impaired by the typical coding and watermarking artifacts. With the explicit metric definition, spread spectrum watermarking is posed as an optimization problem: we search for a watermark to minimize the distortion of the watermarked image and to maximize the correlation between the watermark pattern and the spread spectrum carrier. The simple metric guarantees the optimal watermark a closed-form solution and a fast implementation. The experiments show that the proposed watermarking scheme can take full advantage of the distortion constraint and improve the robustness in return.

  6. Feasibility study of using statistical process control to customized quality assurance in proton therapy.

    PubMed

    Rah, Jeong-Eun; Shin, Dongho; Oh, Do Hoon; Kim, Tae Hyun; Kim, Gwe-Ya

    2014-09-01

    To evaluate and improve the reliability of proton quality assurance (QA) processes and to provide an optimal customized tolerance level using the statistical process control (SPC) methodology. The authors investigated the consistency check of dose per monitor unit (D/MU) and range in proton beams to see whether it was within the tolerance level of the daily QA process. This study analyzed the difference between the measured and calculated ranges along the central axis to improve the patient-specific QA process in proton beams by using process capability indices. The authors established a customized tolerance level of ±2% for D/MU and ±0.5 mm for beam range in the daily proton QA process. In the authors' analysis of the process capability indices, the patient-specific range measurements were capable of a specification limit of ±2% in clinical plans. SPC methodology is a useful tool for customizing the optimal QA tolerance levels and improving the quality of proton machine maintenance, treatment delivery, and ultimately patient safety.
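
    A minimal sketch of the process-capability indices used in the study above to set QA tolerance levels is shown below, computing Cp and Cpk for synthetic daily D/MU deviations against a ±2% specification.

    ```python
    # Minimal sketch of process-capability indices Cp and Cpk for daily QA deviations
    # against lower/upper specification limits. The QA measurements are synthetic.
    import numpy as np

    def capability_indices(x, lsl, usl):
        """Cp and Cpk for data x against lower/upper specification limits."""
        mu, sigma = x.mean(), x.std(ddof=1)
        cp = (usl - lsl) / (6 * sigma)
        cpk = min(usl - mu, mu - lsl) / (3 * sigma)
        return cp, cpk

    rng = np.random.default_rng(8)
    dmu_deviation_pct = rng.normal(loc=0.1, scale=0.45, size=120)   # daily D/MU deviations, %
    cp, cpk = capability_indices(dmu_deviation_pct, lsl=-2.0, usl=2.0)
    print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")   # values above ~1.33 usually indicate a capable process
    ```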

  7. Fast machine-learning online optimization of ultra-cold-atom experiments.

    PubMed

    Wigley, P B; Everitt, P J; van den Hengel, A; Bastian, J W; Sooriyabandara, M A; McDonald, G D; Hardman, K S; Quinlivan, C D; Manju, P; Kuhn, C C N; Petersen, I R; Luiten, A N; Hope, J J; Robins, N P; Hush, M R

    2016-05-16

    We apply an online optimization process based on machine learning to the production of Bose-Einstein condensates (BEC). BEC is typically created with an exponential evaporation ramp that is optimal for ergodic dynamics with two-body s-wave interactions and no other loss rates, but likely sub-optimal for real experiments. Through repeated machine-controlled scientific experimentation and observations our 'learner' discovers an optimal evaporation ramp for BEC production. In contrast to previous work, our learner uses a Gaussian process to develop a statistical model of the relationship between the parameters it controls and the quality of the BEC produced. We demonstrate that the Gaussian process machine learner is able to discover a ramp that produces high quality BECs in 10 times fewer iterations than a previously used online optimization technique. Furthermore, we show the internal model developed can be used to determine which parameters are essential in BEC creation and which are unimportant, providing insight into the optimization process of the system.

  8. Fast machine-learning online optimization of ultra-cold-atom experiments

    PubMed Central

    Wigley, P. B.; Everitt, P. J.; van den Hengel, A.; Bastian, J. W.; Sooriyabandara, M. A.; McDonald, G. D.; Hardman, K. S.; Quinlivan, C. D.; Manju, P.; Kuhn, C. C. N.; Petersen, I. R.; Luiten, A. N.; Hope, J. J.; Robins, N. P.; Hush, M. R.

    2016-01-01

    We apply an online optimization process based on machine learning to the production of Bose-Einstein condensates (BEC). BEC is typically created with an exponential evaporation ramp that is optimal for ergodic dynamics with two-body s-wave interactions and no other loss rates, but likely sub-optimal for real experiments. Through repeated machine-controlled scientific experimentation and observations our ‘learner’ discovers an optimal evaporation ramp for BEC production. In contrast to previous work, our learner uses a Gaussian process to develop a statistical model of the relationship between the parameters it controls and the quality of the BEC produced. We demonstrate that the Gaussian process machine learner is able to discover a ramp that produces high quality BECs in 10 times fewer iterations than a previously used online optimization technique. Furthermore, we show the internal model developed can be used to determine which parameters are essential in BEC creation and which are unimportant, providing insight into the optimization process of the system. PMID:27180805
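
    The two records above are the PubMed and PubMed Central versions of the same paper. As a rough illustration of the Gaussian-process learner they describe, the sketch below runs a model-and-acquire loop with scikit-learn on a synthetic one-dimensional objective standing in for the experiment; the kernel, acquisition rule, and parameters are illustrative assumptions, not the authors' implementation.

    ```python
    # Minimal sketch of Gaussian-process online optimization: fit a GP to observed
    # (parameter, quality) pairs, then pick the next setting by an upper-confidence-bound rule.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    def run_experiment(ramp_param):
        """Stand-in for the real evaporation-ramp experiment: returns a noisy quality score."""
        return np.exp(-(ramp_param - 0.62) ** 2 / 0.02) + 0.05 * rng.normal()

    rng = np.random.default_rng(9)
    candidates = np.linspace(0, 1, 200)[:, None]
    X_obs = rng.uniform(0, 1, size=(3, 1))                    # a few initial random settings
    y_obs = np.array([run_experiment(x[0]) for x in X_obs])

    kernel = RBF(length_scale=0.2) + WhiteKernel(noise_level=0.01)
    for _ in range(12):                                       # online optimization iterations
        gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X_obs, y_obs)
        mean, std = gp.predict(candidates, return_std=True)
        x_next = candidates[np.argmax(mean + 2.0 * std)]      # upper-confidence-bound pick
        X_obs = np.vstack([X_obs, [x_next]])
        y_obs = np.append(y_obs, run_experiment(x_next[0]))

    best = X_obs[np.argmax(y_obs)][0]
    print(f"best ramp parameter found: {best:.3f} (true optimum of the toy objective: 0.62)")
    ```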

  9. Optimization of Thick, Large Area YBCO Film Growth Through Response Surface Methods

    NASA Astrophysics Data System (ADS)

    Porzio, J.; Mahoney, C. H.; Sullivan, M. C.

    2014-03-01

    We present our work on the optimization of thick, large area YBa2Cu3O7-δ (YBCO) film growth through response surface methods. Thick, large area films have commercial uses and have recently been used in dramatic demonstrations of levitation and suspension. Our films are grown via pulsed laser deposition and we have optimized growth parameters via response surface methods. Response surface methodology is a statistical tool to optimize selected quantities with respect to a set of variables. We optimized our YBCO films' critical temperatures, thicknesses, and structures with respect to three PLD growth parameters: deposition temperature, laser energy, and deposition pressure. We will present an overview of YBCO growth via pulsed laser deposition, the statistical theory behind response surface methods, and the application of response surface methods to pulsed laser deposition growth of YBCO. Results from the experiment will be presented in a discussion of the optimized film quality. Supported by NSF grant DMR-1305637.
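
    As an illustration of the response-surface step in the record above, the sketch below fits a quadratic model of a quality response over two coded growth parameters and solves for the stationary point. The data are synthetic, not the reported PLD growth conditions.

    ```python
    # Minimal sketch of a quadratic response-surface fit and its stationary point,
    # on synthetic (temperature, pressure) settings in coded units.
    import numpy as np

    rng = np.random.default_rng(10)
    n_runs = 30
    temp, press = rng.uniform(-1, 1, n_runs), rng.uniform(-1, 1, n_runs)   # coded units
    tc = 90 - 4 * (temp - 0.3) ** 2 - 6 * (press + 0.2) ** 2 + rng.normal(scale=0.3, size=n_runs)

    # Quadratic model: y = b0 + b1*t + b2*p + b11*t^2 + b22*p^2 + b12*t*p
    X = np.column_stack([np.ones(n_runs), temp, press, temp**2, press**2, temp * press])
    b0, b1, b2, b11, b22, b12 = np.linalg.lstsq(X, tc, rcond=None)[0]

    # Stationary point: set the gradient of the fitted surface to zero and solve.
    H = np.array([[2 * b11, b12], [b12, 2 * b22]])
    t_opt, p_opt = np.linalg.solve(H, -np.array([b1, b2]))
    print(f"predicted optimum (coded units): temperature = {t_opt:+.2f}, pressure = {p_opt:+.2f}")
    ```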

  10. Incidence of flare-ups and evaluation of quality after retreatment of resorcinol-formaldehyde resin ("Russian Red Cement") endodontic therapy.

    PubMed

    Gound, Tom G; Marx, David; Schwandt, Nathan A

    2003-10-01

    The purpose of this retrospective study was to evaluate the quality of treatment and incidence of flare-ups when teeth with resorcinol-formaldehyde resin are retreated in a postgraduate endodontic clinic. Fifty-eight cases were included in this study. Obturated and unfilled canal space was measured on radiographs. Forty-eight percent of the total canal space was filled before retreatment; 90% was filled after retreatment. After retreatment, obturations were rated as optimal in 59%, improved in 33%, unchanged in 6%, and worse in 2%. Seven patients (12%) had postretreatment flare-ups. Data were statistically analyzed using the Cochran-Armitage Test for Discrete Variables. No statistical difference in the incidence of flare-ups was found in teeth that before treatment had more than half the canal space filled compared to teeth with less than half, cases with pre-existing periradicular radiolucencies compared to cases with normal periradicular appearance, symptomatic cases compared to asymptomatic cases, or cases with optimal fillings after retreatment compared to less than optimal cases. It was concluded that teeth with resorcinol-formaldehyde fillings might be retreated with a good prognosis for improving the radiographic quality, but a higher than normal incidence of flare-ups may occur.

  11. Optimizing construction quality management of pavements using mechanistic performance analysis.

    DOT National Transportation Integrated Search

    2004-08-01

    This report presents a statistical-based algorithm that was developed to reconcile the results from several pavement performance models used in the state of practice with systematic process control techniques. These algorithms identify project-specif...

  12. Research on the optimization of air quality monitoring station layout based on spatial grid statistical analysis method.

    PubMed

    Li, Tianxin; Zhou, Xing Chen; Ikhumhen, Harrison Odion; Difei, An

    2018-05-01

    In recent years, with the significant increase in urban development, it has become necessary to optimize the current air monitoring stations so that they reflect the quality of air in the environment. To assess the spatial representativeness of the air monitoring stations, Beijing's regional monitoring station data from 2012 to 2014 were used to calculate the monthly mean particulate matter (PM10) concentration, and the spatial distribution of PM10 concentration across the whole region was derived through the IDW interpolation method and a spatial grid statistical method in GIS. The spatial distribution and its variation across Beijing's districts were analyzed with the gridding model (1.5 km × 1.5 km cell resolution) over the 3-year period, combining the variation and spatial overlay of the PM10 concentration data; the resulting spatial distribution showed how often the total PM10 concentration exceeded the standard. It is very important to optimize the layout of the existing air monitoring stations by combining the concentration distribution of air pollutants with the spatial region using GIS.
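
    A minimal sketch of the inverse-distance-weighted (IDW) interpolation step described in the record above is shown below, mapping point observations onto a regular grid. Station coordinates and concentrations are synthetic placeholders, not the Beijing monitoring data.

    ```python
    # Minimal sketch of IDW interpolation of station readings onto a rectangular grid.
    import numpy as np

    def idw_grid(station_xy, values, grid_x, grid_y, power=2.0):
        """Inverse-distance-weighted interpolation of point observations onto a grid."""
        gx, gy = np.meshgrid(grid_x, grid_y)
        grid_pts = np.column_stack([gx.ravel(), gy.ravel()])
        d = np.linalg.norm(grid_pts[:, None, :] - station_xy[None, :, :], axis=2)
        w = 1.0 / np.maximum(d, 1e-9) ** power
        est = (w * values).sum(axis=1) / w.sum(axis=1)
        return est.reshape(gy.shape)

    rng = np.random.default_rng(11)
    stations = rng.uniform(0, 30, size=(25, 2))                 # station coordinates, km
    pm10 = rng.uniform(40, 160, size=25)                        # monthly mean PM10, ug/m^3
    grid = idw_grid(stations, pm10, np.arange(0, 30, 1.5), np.arange(0, 30, 1.5))

    print("grid shape:", grid.shape, " cells above 100 ug/m^3:", int((grid > 100).sum()))
    ```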

  13. A solution quality assessment method for swarm intelligence optimization algorithms.

    PubMed

    Zhang, Zhaojun; Wang, Gai-Ge; Zou, Kuansheng; Zhang, Jianhua

    2014-01-01

    Nowadays, swarm intelligence optimization has become an important optimization tool and is widely used in many fields of application. In contrast to many successful applications, the theoretical foundation is rather weak. Therefore, there are still many problems to be solved. One problem is how to quantify the performance of an algorithm in finite time, that is, how to evaluate the solution quality obtained by the algorithm for practical problems. This greatly limits application to practical problems. A solution quality assessment method for intelligent optimization is proposed in this paper. It is an experimental analysis method based on the analysis of the search space and the characteristics of the algorithm itself. Instead of "value performance," the "ordinal performance" is used as the evaluation criterion in this method. The feasible solutions were clustered according to distance to divide solution samples into several parts. Then, the solution space and the "good enough" set can be decomposed based on the clustering results. Last, using statistical knowledge, the evaluation result can be obtained. To validate the proposed method, some intelligent algorithms such as ant colony optimization (ACO), particle swarm optimization (PSO), and the artificial fish swarm algorithm (AFS) were used to solve the traveling salesman problem. Computational results indicate the feasibility of the proposed method.

  14. Feasibility study of using statistical process control to customized quality assurance in proton therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rah, Jeong-Eun; Oh, Do Hoon; Shin, Dongho

    Purpose: To evaluate and improve the reliability of proton quality assurance (QA) processes and to provide an optimal customized tolerance level using the statistical process control (SPC) methodology. Methods: The authors investigated the consistency check of dose per monitor unit (D/MU) and range in proton beams to see whether it was within the tolerance level of the daily QA process. This study analyzed the difference between the measured and calculated ranges along the central axis to improve the patient-specific QA process in proton beams by using process capability indices. Results: The authors established a customized tolerance level of ±2% for D/MU and ±0.5 mm for beam range in the daily proton QA process. In the authors’ analysis of the process capability indices, the patient-specific range measurements were capable of a specification limit of ±2% in clinical plans. Conclusions: SPC methodology is a useful tool for customizing the optimal QA tolerance levels and improving the quality of proton machine maintenance, treatment delivery, and ultimately patient safety.

  15. Automated selection of the optimal cardiac phase for single-beat coronary CT angiography reconstruction.

    PubMed

    Stassi, D; Dutta, S; Ma, H; Soderman, A; Pazzani, D; Gros, E; Okerlund, D; Schmidt, T G

    2016-01-01

    Reconstructing a low-motion cardiac phase is expected to improve coronary artery visualization in coronary computed tomography angiography (CCTA) exams. This study developed an automated algorithm for selecting the optimal cardiac phase for CCTA reconstruction. The algorithm uses prospectively gated, single-beat, multiphase data made possible by wide cone-beam imaging. The proposed algorithm differs from previous approaches because the optimal phase is identified based on vessel image quality (IQ) directly, compared to previous approaches that included motion estimation and interphase processing. Because there is no processing of interphase information, the algorithm can be applied to any sampling of image phases, making it suited for prospectively gated studies where only a subset of phases are available. An automated algorithm was developed to select the optimal phase based on quantitative IQ metrics. For each reconstructed slice at each reconstructed phase, an image quality metric was calculated based on measures of circularity and edge strength of through-plane vessels. The image quality metric was aggregated across slices, while a metric of vessel-location consistency was used to ignore slices that did not contain through-plane vessels. The algorithm performance was evaluated using two observer studies. Fourteen single-beat cardiac CT exams (Revolution CT, GE Healthcare, Chalfont St. Giles, UK) reconstructed at 2% intervals were evaluated for best systolic (1), diastolic (6), or systolic and diastolic phases (7) by three readers and the algorithm. Pairwise inter-reader and reader-algorithm agreement was evaluated using the mean absolute difference (MAD) and concordance correlation coefficient (CCC) between the reader and algorithm-selected phases. A reader-consensus best phase was determined and compared to the algorithm selected phase. In cases where the algorithm and consensus best phases differed by more than 2%, IQ was scored by three readers using a five point Likert scale. There was no statistically significant difference between inter-reader and reader-algorithm agreement for either MAD or CCC metrics (p > 0.1). The algorithm phase was within 2% of the consensus phase in 15/21 of cases. The average absolute difference between consensus and algorithm best phases was 2.29% ± 2.47%, with a maximum difference of 8%. Average image quality scores for the algorithm chosen best phase were 4.01 ± 0.65 overall, 3.33 ± 1.27 for right coronary artery (RCA), 4.50 ± 0.35 for left anterior descending (LAD) artery, and 4.50 ± 0.35 for left circumflex artery (LCX). Average image quality scores for the consensus best phase were 4.11 ± 0.54 overall, 3.44 ± 1.03 for RCA, 4.39 ± 0.39 for LAD, and 4.50 ± 0.18 for LCX. There was no statistically significant difference (p > 0.1) between the image quality scores of the algorithm phase and the consensus phase. The proposed algorithm was statistically equivalent to a reader in selecting an optimal cardiac phase for CCTA exams. When reader and algorithm phases differed by >2%, image quality as rated by blinded readers was statistically equivalent. By detecting the optimal phase for CCTA reconstruction, the proposed algorithm is expected to improve coronary artery visualization in CCTA exams.

  16. Improving alignment in Tract-based spatial statistics: evaluation and optimization of image registration.

    PubMed

    de Groot, Marius; Vernooij, Meike W; Klein, Stefan; Ikram, M Arfan; Vos, Frans M; Smith, Stephen M; Niessen, Wiro J; Andersson, Jesper L R

    2013-08-01

    Anatomical alignment in neuroimaging studies is of such importance that considerable effort is put into improving the registration used to establish spatial correspondence. Tract-based spatial statistics (TBSS) is a popular method for comparing diffusion characteristics across subjects. TBSS establishes spatial correspondence using a combination of nonlinear registration and a "skeleton projection" that may break topological consistency of the transformed brain images. We therefore investigated the feasibility of replacing the two-stage registration-projection procedure in TBSS with a single, regularized, high-dimensional registration. To optimize registration parameters and to evaluate registration performance in diffusion MRI, we designed an evaluation framework that uses native space probabilistic tractography for 23 white matter tracts, and quantifies tract similarity across subjects in standard space. We optimized parameters for two registration algorithms on two diffusion datasets of different quality. We investigated reproducibility of the evaluation framework, and of the optimized registration algorithms. Next, we compared registration performance of the regularized registration methods and TBSS. Finally, feasibility and effect of incorporating the improved registration in TBSS were evaluated in an example study. The evaluation framework was highly reproducible for both algorithms (R² = 0.993; 0.931). The optimal registration parameters depended on the quality of the dataset in a graded and predictable manner. At optimal parameters, both algorithms outperformed the registration of TBSS, showing feasibility of adopting such approaches in TBSS. This was further confirmed in the example experiment. Copyright © 2013 Elsevier Inc. All rights reserved.

  17. A fast inverse treatment planning strategy facilitating optimized catheter selection in image-guided high-dose-rate interstitial gynecologic brachytherapy.

    PubMed

    Guthier, Christian V; Damato, Antonio L; Hesser, Juergen W; Viswanathan, Akila N; Cormack, Robert A

    2017-12-01

    Interstitial high-dose rate (HDR) brachytherapy is an important therapeutic strategy for the treatment of locally advanced gynecologic (GYN) cancers. The outcome of this therapy is determined by the quality of dose distribution achieved. This paper focuses on a novel yet simple heuristic for catheter selection for GYN HDR brachytherapy and its comparison against state-of-the-art optimization strategies. The proposed technique is intended to act as a decision-supporting tool to select a favorable needle configuration. The presented heuristic for catheter optimization is based on a shrinkage-type algorithm (SACO). It is compared against state-of-the-art planning in a retrospective study of 20 patients who previously received image-guided interstitial HDR brachytherapy using a Syed Neblett template. From those plans, template orientation and position are estimated via a rigid registration of the template with the actual catheter trajectories. All potential straight trajectories intersecting the contoured clinical target volume (CTV) are considered for catheter optimization. Retrospectively generated plans and clinical plans are compared with respect to dosimetric performance and optimization time. All plans were generated with one single run of the optimizer lasting 0.6-97.4 s. Compared to manual optimization, SACO yields a statistically significant (P ≤ 0.05) improved target coverage while at the same time fulfilling all dosimetric constraints for organs at risk (OARs). Comparing inverse planning strategies, dosimetric evaluation for SACO and "hybrid inverse planning and optimization" (HIPO), as the gold standard, shows no statistically significant difference (P > 0.05). However, SACO provides the potential to reduce the number of used catheters without compromising plan quality. The proposed heuristic for needle selection provides fast catheter selection with optimization times suited for intraoperative treatment planning. Compared to manual optimization, the proposed methodology results in fewer catheters without a clinically significant loss in plan quality. The proposed approach can be used as a decision support tool that guides the user to find the ideal number and configuration of catheters. © 2017 American Association of Physicists in Medicine.

  18. Image-guided optimization of the ECG trace in cardiac MRI.

    PubMed

    Barnwell, James D; Klein, J Larry; Stallings, Cliff; Sturm, Amanda; Gillespie, Michael; Fine, Jason; Hyslop, W Brian

    2012-03-01

    Improper electrocardiogram (ECG) lead placement resulting in suboptimal gating may lead to reduced image quality in cardiac magnetic resonance imaging (CMR). A patient-specific systematic technique for rapid optimization of lead placement may improve CMR image quality. A rapid 3-dimensional image of the thorax was used to guide the realignment of ECG leads relative to the cardiac axis of the patient in forty consecutive adult patients. Using our novel approach and consensus reading of pre- and post-correction ECG traces, seventy-three percent of patients had a qualitative improvement in their ECG tracings, and no patient had a decrease in quality of their ECG tracing following the correction technique. Statistically significant improvement was observed independent of gender, body mass index, and cardiac rhythm. This technique provides an efficient option to improve the quality of the ECG tracing in patients who have a poor quality ECG with standard techniques.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stassi, D.; Ma, H.; Schmidt, T. G., E-mail: taly.gilat-schmidt@marquette.edu

    Purpose: Reconstructing a low-motion cardiac phase is expected to improve coronary artery visualization in coronary computed tomography angiography (CCTA) exams. This study developed an automated algorithm for selecting the optimal cardiac phase for CCTA reconstruction. The algorithm uses prospectively gated, single-beat, multiphase data made possible by wide cone-beam imaging. The proposed algorithm differs from previous approaches because the optimal phase is identified based on vessel image quality (IQ) directly, compared to previous approaches that included motion estimation and interphase processing. Because there is no processing of interphase information, the algorithm can be applied to any sampling of image phases, making it suited for prospectively gated studies where only a subset of phases are available. Methods: An automated algorithm was developed to select the optimal phase based on quantitative IQ metrics. For each reconstructed slice at each reconstructed phase, an image quality metric was calculated based on measures of circularity and edge strength of through-plane vessels. The image quality metric was aggregated across slices, while a metric of vessel-location consistency was used to ignore slices that did not contain through-plane vessels. The algorithm performance was evaluated using two observer studies. Fourteen single-beat cardiac CT exams (Revolution CT, GE Healthcare, Chalfont St. Giles, UK) reconstructed at 2% intervals were evaluated for best systolic (1), diastolic (6), or systolic and diastolic phases (7) by three readers and the algorithm. Pairwise inter-reader and reader-algorithm agreement was evaluated using the mean absolute difference (MAD) and concordance correlation coefficient (CCC) between the reader and algorithm-selected phases. A reader-consensus best phase was determined and compared to the algorithm selected phase. In cases where the algorithm and consensus best phases differed by more than 2%, IQ was scored by three readers using a five point Likert scale. Results: There was no statistically significant difference between inter-reader and reader-algorithm agreement for either MAD or CCC metrics (p > 0.1). The algorithm phase was within 2% of the consensus phase in 15/21 of cases. The average absolute difference between consensus and algorithm best phases was 2.29% ± 2.47%, with a maximum difference of 8%. Average image quality scores for the algorithm chosen best phase were 4.01 ± 0.65 overall, 3.33 ± 1.27 for right coronary artery (RCA), 4.50 ± 0.35 for left anterior descending (LAD) artery, and 4.50 ± 0.35 for left circumflex artery (LCX). Average image quality scores for the consensus best phase were 4.11 ± 0.54 overall, 3.44 ± 1.03 for RCA, 4.39 ± 0.39 for LAD, and 4.50 ± 0.18 for LCX. There was no statistically significant difference (p > 0.1) between the image quality scores of the algorithm phase and the consensus phase. Conclusions: The proposed algorithm was statistically equivalent to a reader in selecting an optimal cardiac phase for CCTA exams. When reader and algorithm phases differed by >2%, image quality as rated by blinded readers was statistically equivalent. By detecting the optimal phase for CCTA reconstruction, the proposed algorithm is expected to improve coronary artery visualization in CCTA exams.
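    A minimal sketch of the phase-selection step as described, assuming per-slice vessel image-quality scores (e.g., circularity times edge strength) are computed upstream and passed in as a hypothetical iq_metric array; the vessel-location consistency check is reduced to a boolean mask.

        import numpy as np

        def select_best_phase(iq_metric, vessel_present):
            """Pick the reconstruction phase with the best aggregated vessel image quality.

            iq_metric:      (n_phases, n_slices) per-slice score (hypothetical input).
            vessel_present: (n_phases, n_slices) boolean; slices without a consistent
                            through-plane vessel are ignored.
            """
            scores = np.where(vessel_present, iq_metric, np.nan)
            phase_score = np.nanmean(scores, axis=1)   # aggregate IQ across usable slices
            return int(np.nanargmax(phase_score)), phase_score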

  20. Investigation into the influence of laser energy input on selective laser melted thin-walled parts by response surface method

    NASA Astrophysics Data System (ADS)

    Liu, Yang; Zhang, Jian; Pang, Zhicong; Wu, Weihui

    2018-04-01

    Selective laser melting (SLM) provides a feasible way to manufacture complex thin-walled parts directly; however, the energy input during the SLM process, derived from the laser power, scanning speed, layer thickness, scanning space, and other parameters, has a great influence on the quality of the thin wall. The aim of this work is to relate the thin wall's responses, namely track width, surface roughness, and hardness, to the process parameters considered in this research (laser power, scanning speed, and layer thickness) and to find the optimal manufacturing conditions. Design of experiments (DoE) was used by implementing a central composite design to achieve better manufacturing quality. Mathematical models derived from the statistical analysis were used to establish the relationships between the process parameters and the responses. Also, the effects of the process parameters on each response were determined. Then, a numerical optimization was performed to find the optimal process set at which the quality features are at their desired values. Based on this study, the relationship between the process parameters and the SLMed thin-walled structure was revealed; thus, the corresponding optimal process parameters can be used to manufacture thin-walled parts with high quality.
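    A small sketch of the statistical core described here, assuming coded factor levels and measured responses are available: a full quadratic response surface is fit by least squares and then minimized over a coded grid as a stand-in for the paper's numerical optimization step.

        import numpy as np
        from itertools import product

        def fit_quadratic_rsm(X, y):
            """Fit a full quadratic response surface y ~ b0 + sum bi*xi + sum bij*xi*xj.
            X: (n_runs, 3) coded levels of laser power, scan speed, layer thickness.
            y: (n_runs,) measured response, e.g. surface roughness."""
            x1, x2, x3 = X.T
            A = np.column_stack([np.ones(len(y)), x1, x2, x3,
                                 x1*x2, x1*x3, x2*x3, x1**2, x2**2, x3**2])
            beta, *_ = np.linalg.lstsq(A, y, rcond=None)
            return beta

        def predict(beta, x):
            x1, x2, x3 = x
            terms = np.array([1, x1, x2, x3, x1*x2, x1*x3, x2*x3, x1**2, x2**2, x3**2])
            return float(terms @ beta)

        def grid_minimize(beta, levels=np.linspace(-1, 1, 21)):
            # brute-force search over the coded design space, a stand-in for desirability optimization
            return min(product(levels, repeat=3), key=lambda x: predict(beta, x))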

  1. Taguchi Approach to Design Optimization for Quality and Cost: An Overview

    NASA Technical Reports Server (NTRS)

    Unal, Resit; Dean, Edwin B.

    1990-01-01

    Calibrations to the existing cost of doing business in space indicate that establishing a human presence on the Moon and Mars with the Space Exploration Initiative (SEI) will require resources felt by many to be more than the national budget can afford. In order for SEI to succeed, we must actually design and build space systems at lower cost this time, even with tremendous increases in quality and performance requirements, such as extremely high reliability. This implies that both government and industry must change the way they do business. Therefore, new philosophy and technology must be employed to design and produce reliable, high quality space systems at low cost. In recognizing the need to reduce cost and improve quality and productivity, the Department of Defense (DoD) and the National Aeronautics and Space Administration (NASA) have initiated Total Quality Management (TQM). TQM is a revolutionary management strategy in quality assurance and cost reduction. TQM requires complete management commitment, employee involvement, and use of statistical tools. The quality engineering methods of Dr. Taguchi, employing design of experiments (DOE), are among the most important statistical tools of TQM for designing high quality systems at reduced cost. Taguchi methods provide an efficient and systematic way to optimize designs for performance, quality, and cost. Taguchi methods have been used successfully in Japan and the United States in designing reliable, high quality products at low cost in such areas as automobiles and consumer electronics. However, these methods are just beginning to see application in the aerospace industry. The purpose of this paper is to present an overview of the Taguchi methods for improving quality and reducing cost, describe the current state of applications, and discuss their role in identifying cost-sensitive design parameters.
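    A brief illustration of one standard Taguchi calculation (not specific to this paper): the larger-the-better signal-to-noise ratio and the factor main effects computed from an orthogonal-array experiment.

        import numpy as np

        def sn_larger_is_better(y):
            """Taguchi signal-to-noise ratio for a larger-the-better characteristic."""
            y = np.asarray(y, dtype=float)
            return -10.0 * np.log10(np.mean(1.0 / y**2))

        def main_effects(levels, sn):
            """Average S/N per factor level; the level with the highest mean S/N is preferred.
            levels: (n_runs, n_factors) integer level codes from an orthogonal array.
            sn:     (n_runs,) S/N ratio of each run."""
            levels, sn = np.asarray(levels), np.asarray(sn, dtype=float)
            effects = []
            for f in range(levels.shape[1]):
                effects.append({int(lv): float(sn[levels[:, f] == lv].mean())
                                for lv in np.unique(levels[:, f])})
            return effects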

  2. Optimal combining of ground-based sensors for the purpose of validating satellite-based rainfall estimates

    NASA Technical Reports Server (NTRS)

    Krajewski, Witold F.; Rexroth, David T.; Kiriaki, Kiriakie

    1991-01-01

    Two problems related to radar rainfall estimation are described. The first part is a description of a preliminary data analysis for the purpose of statistical estimation of rainfall from multiple (radar and raingage) sensors. Raingage, radar, and joint radar-raingage estimation is described, and some results are given. Statistical parameters of rainfall spatial dependence are calculated and discussed in the context of optimal estimation. Quality control of radar data is also described. The second part describes radar scattering by ellipsoidal raindrops. An analytical solution is derived for the Rayleigh scattering regime. Single and volume scattering are presented. Comparison calculations with the known results for spheres and oblate spheroids are shown.

  3. Comparative effectiveness research methodology using secondary data: A starting user's guide.

    PubMed

    Sun, Maxine; Lipsitz, Stuart R

    2018-04-01

    The use of secondary data, such as claims or administrative data, in comparative effectiveness research has grown tremendously in recent years. We believe that the current review can help investigators relying on secondary data to (1) gain insight into both the methodologies and statistical methods, (2) better understand the necessity of rigorous planning before initiating a comparative effectiveness investigation, and (3) optimize the quality of their investigations. Specifically, we review concepts of adjusted analyses and confounders, methods of propensity score and instrumental variable analyses, risk prediction models (logistic and time-to-event), decision-curve analysis, as well as the interpretation of the P value and hypothesis testing. Overall, we hope that this review can help investigators who rely on secondary data for comparative effectiveness research to better understand the necessity of rigorous planning before study start and to gain better insight into the choice of statistical methods, so as to optimize the quality of the research study. Copyright © 2017 Elsevier Inc. All rights reserved.

  4. Artificial neural networks in evaluation and optimization of modified release solid dosage forms.

    PubMed

    Ibrić, Svetlana; Djuriš, Jelena; Parojčić, Jelena; Djurić, Zorica

    2012-10-18

    Implementation of the Quality by Design (QbD) approach in pharmaceutical development has compelled researchers in the pharmaceutical industry to employ Design of Experiments (DoE) as a statistical tool, in product development. Among all DoE techniques, response surface methodology (RSM) is the one most frequently used. Progress of computer science has had an impact on pharmaceutical development as well. Simultaneous with the implementation of statistical methods, machine learning tools took an important place in drug formulation. Twenty years ago, the first papers describing application of artificial neural networks in optimization of modified release products appeared. Since then, a lot of work has been done towards implementation of new techniques, especially Artificial Neural Networks (ANN) in modeling of production, drug release and drug stability of modified release solid dosage forms. The aim of this paper is to review artificial neural networks in evaluation and optimization of modified release solid dosage forms.

  5. Artificial Neural Networks in Evaluation and Optimization of Modified Release Solid Dosage Forms

    PubMed Central

    Ibrić, Svetlana; Djuriš, Jelena; Parojčić, Jelena; Djurić, Zorica

    2012-01-01

    Implementation of the Quality by Design (QbD) approach in pharmaceutical development has compelled researchers in the pharmaceutical industry to employ Design of Experiments (DoE) as a statistical tool, in product development. Among all DoE techniques, response surface methodology (RSM) is the one most frequently used. Progress of computer science has had an impact on pharmaceutical development as well. Simultaneous with the implementation of statistical methods, machine learning tools took an important place in drug formulation. Twenty years ago, the first papers describing application of artificial neural networks in optimization of modified release products appeared. Since then, a lot of work has been done towards implementation of new techniques, especially Artificial Neural Networks (ANN) in modeling of production, drug release and drug stability of modified release solid dosage forms. The aim of this paper is to review artificial neural networks in evaluation and optimization of modified release solid dosage forms. PMID:24300369

  6. Statistical approaches to optimize detection of MIB off-flavor in aquaculture raised channel catfish

    USDA-ARS?s Scientific Manuscript database

    The catfish industry prides itself on preventing inadvertent sale of off-flavor fish. Typically, several fish are taste tested over several weeks before pond harvest to confirm good fish flavor quality. We collected several data sets of analytically measured off-flavor concentrations in catfish to...

  7. Optimal Experimental Design for Model Discrimination

    PubMed Central

    Myung, Jay I.; Pitt, Mark A.

    2009-01-01

    Models of a psychological process can be difficult to discriminate experimentally because it is not easy to determine the values of the critical design variables (e.g., presentation schedule, stimulus structure) that will be most informative in differentiating them. Recent developments in sampling-based search methods in statistics make it possible to determine these values, and thereby identify an optimal experimental design. After describing the method, it is demonstrated in two content areas in cognitive psychology in which models are highly competitive: retention (i.e., forgetting) and categorization. The optimal design is compared with the quality of designs used in the literature. The findings demonstrate that design optimization has the potential to increase the informativeness of the experimental method. PMID:19618983

  8. Clinical implementation of a knowledge based planning tool for prostate VMAT.

    PubMed

    Powis, Richard; Bird, Andrew; Brennan, Matthew; Hinks, Susan; Newman, Hannah; Reed, Katie; Sage, John; Webster, Gareth

    2017-05-08

    A knowledge-based planning tool has been developed and implemented for prostate VMAT radiotherapy plans providing a target average rectum dose value based on previously achievable values for similar rectum/PTV overlap. The purpose of this planning tool is to highlight sub-optimal clinical plans and to improve plan quality and consistency. A historical cohort of 97 VMAT prostate plans was interrogated using a RayStation script and used to develop a local model for predicting optimum average rectum dose based on individual anatomy. A preliminary validation study was performed whereby historical plans identified as "optimal" and "sub-optimal" by the local model were replanned in a blinded study by four experienced planners and compared to the original clinical plan to assess whether any improvement in rectum dose was observed. The predictive model was then incorporated into a RayStation script and used as part of the clinical planning process. Planners were asked to use the script during planning to provide a patient-specific prediction for optimum average rectum dose and to optimise the plan accordingly. Plans identified as "sub-optimal" in the validation study observed a statistically significant improvement in average rectum dose compared to the clinical plan when replanned whereas plans that were identified as "optimal" observed no improvement when replanned. This provided confidence that the local model can identify plans that were suboptimal in terms of rectal sparing. Clinical implementation of the knowledge-based planning tool reduced the population-averaged mean rectum dose by 5.6 Gy. There was a small but statistically significant increase in total MU and femoral head dose and a reduction in conformity index. These did not affect the clinical acceptability of the plans and no significant changes to other plan quality metrics were observed. The knowledge-based planning tool has enabled substantial reductions in population-averaged mean rectum dose for prostate VMAT patients. This suggests plans are improved when planners receive quantitative feedback on plan quality against historical data.
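    A minimal sketch of the kind of anatomy-driven prediction described, assuming a table of historical rectum/PTV overlap fractions and achieved mean rectum doses is available: a simple linear model predicts an achievable dose and flags plans exceeding it by a margin. The clinical tool's actual model and RayStation scripting are not reproduced here.

        import numpy as np

        def fit_rectum_dose_model(overlap_fraction, mean_rectum_dose):
            """Linear fit of achievable mean rectum dose vs. rectum/PTV overlap,
            a simplified stand-in for the registry-based knowledge model."""
            coeffs = np.polyfit(overlap_fraction, mean_rectum_dose, deg=1)
            return np.poly1d(coeffs)

        def flag_suboptimal(model, overlap_fraction, achieved_dose, margin_gy=2.0):
            """Flag a plan whose achieved mean rectum dose exceeds the prediction by a margin (margin is illustrative)."""
            predicted = model(overlap_fraction)
            return achieved_dose > predicted + margin_gy, predicted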

  9. Quantitative Analysis of the Effect of Iterative Reconstruction Using a Phantom: Determining the Appropriate Blending Percentage

    PubMed Central

    Kim, Hyun Gi; Lee, Young Han; Choi, Jin-Young; Park, Mi-Suk; Kim, Myeong-Jin; Kim, Ki Whang

    2015-01-01

    Purpose To investigate the optimal blending percentage of adaptive statistical iterative reconstruction (ASIR) at a reduced radiation dose while preserving a degree of image quality and texture that is similar to that of standard-dose computed tomography (CT). Materials and Methods The CT performance phantom was scanned with standard and dose reduction protocols including reduced mAs or kVp. Image quality parameters including noise, spatial, and low-contrast resolution, as well as image texture, were quantitatively evaluated after applying various blending percentages of ASIR. The optimal blending percentage of ASIR that preserved image quality and texture compared to standard dose CT was investigated in each radiation dose reduction protocol. Results As the percentage of ASIR increased, noise and spatial resolution decreased, whereas low-contrast resolution increased. In the texture analysis, an increasing percentage of ASIR resulted in an increase of angular second moment, inverse difference moment, and correlation and in a decrease of contrast and entropy. The 20% and 40% dose reduction protocols with 20% and 40% ASIR blending, respectively, resulted in an optimal quality of images with preservation of the image texture. Conclusion Blending 40% ASIR with the 40% reduced tube-current product can maximize radiation dose reduction and preserve adequate image quality and texture. PMID:25510772

  10. Nozzle Mounting Method Optimization Based on Robot Kinematic Analysis

    NASA Astrophysics Data System (ADS)

    Chen, Chaoyue; Liao, Hanlin; Montavon, Ghislain; Deng, Sihao

    2016-08-01

    Nowadays, the application of industrial robots in thermal spray is gaining more and more importance. A desired coating quality depends on factors such as a balanced robot performance, a uniform scanning trajectory and stable parameters (e.g. nozzle speed, scanning step, spray angle, standoff distance). These factors also affect the mass and heat transfer as well as the coating formation. Thus, the kinematic optimization of all these aspects plays a key role in order to obtain an optimal coating quality. In this study, the robot performance was optimized from the aspect of nozzle mounting on the robot. An optimized nozzle mounting for a type F4 nozzle was designed, based on the conventional mounting method, from the point of view of robot kinematics and validated on a virtual robot. Robot kinematic parameters were obtained from the simulation by offline programming software and analyzed by statistical methods. The energy consumption of different nozzle mounting methods was also compared. The results showed that it was possible to reasonably assign the amount of robot motion to each axis during the process, thereby achieving a constant nozzle speed. Thus, it is possible to optimize robot performance and to economize robot energy.

  11. Objective assessment of image quality. IV. Application to adaptive optics

    PubMed Central

    Barrett, Harrison H.; Myers, Kyle J.; Devaney, Nicholas; Dainty, Christopher

    2008-01-01

    The methodology of objective assessment, which defines image quality in terms of the performance of specific observers on specific tasks of interest, is extended to temporal sequences of images with random point spread functions and applied to adaptive imaging in astronomy. The tasks considered include both detection and estimation, and the observers are the optimal linear discriminant (Hotelling observer) and the optimal linear estimator (Wiener). A general theory of first- and second-order spatiotemporal statistics in adaptive optics is developed. It is shown that the covariance matrix can be rigorously decomposed into three terms representing the effect of measurement noise, random point spread function, and random nature of the astronomical scene. Figures of merit are developed, and computational methods are discussed. PMID:17106464
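    A compact sketch of the linear Hotelling observer referenced in this record, assuming vectorized signal-present and signal-absent image realizations are available; the spatiotemporal extension and the three-term covariance decomposition discussed in the paper are omitted.

        import numpy as np

        def hotelling_observer(signal_images, background_images):
            """Hotelling template and detectability for a signal-known-exactly task.
            Each input is (n_realizations, n_pixels); images should be vectorized."""
            mean_s = signal_images.mean(axis=0)
            mean_b = background_images.mean(axis=0)
            delta = mean_s - mean_b
            # pooled covariance of the two classes, with a mild ridge for numerical stability
            K = 0.5 * (np.cov(signal_images, rowvar=False) + np.cov(background_images, rowvar=False))
            K = K + 1e-6 * np.trace(K) / K.shape[0] * np.eye(K.shape[0])
            w = np.linalg.solve(K, delta)          # Hotelling template
            snr2 = float(delta @ w)                # squared Hotelling SNR (detectability)
            return w, snr2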

  12. Improved method for reliable HMW-GS identification by RP-HPLC and SDS-PAGE in common wheat cultivars

    USDA-ARS?s Scientific Manuscript database

    The accurate identification of alleles for high-molecular weight glutenins (HMW-GS) is critical for wheat breeding programs targeting end-use quality. RP-HPLC methods were optimized for separation of HMW-GS, resulting in enhanced resolution of 1By and 1Dx subunits. Statistically significant differe...

  13. Factor analysis in optimization of formulation of high content uniformity tablets containing low dose active substance.

    PubMed

    Lukášová, Ivana; Muselík, Jan; Franc, Aleš; Goněc, Roman; Mika, Filip; Vetchý, David

    2017-11-15

    Warfarin is an intensively discussed drug with a narrow therapeutic range. There have been cases of bleeding attributed to varying content or altered quality of the active substance. Factor analysis is useful for finding suitable technological parameters leading to high content uniformity of tablets containing a low amount of active substance. The composition of the tabletting blend and the technological procedure were set with respect to factor analysis of previously published results. The correctness of the set parameters was checked by manufacturing and evaluating tablets containing 1-10 mg of warfarin sodium. The robustness of the suggested technology was checked by using a "worst case scenario" and statistical evaluation of European Pharmacopoeia (EP) content uniformity limits with respect to the Bergum division and the process capability index (Cpk). To evaluate the quality of the active substance and tablets, a dissolution method was developed (water; EP apparatus II; 25 rpm), allowing for statistical comparison of dissolution profiles. The obtained results prove the suitability of factor analysis to optimize the composition with respect to previously manufactured batches, and thus the use of meta-analysis under industrial conditions is feasible. Copyright © 2017 Elsevier B.V. All rights reserved.
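    A small illustration of the process capability index mentioned here (Cpk), computed for hypothetical assay results against example content limits; the Bergum approach itself is not reproduced.

        import numpy as np

        def cpk(values, lsl, usl):
            """Process capability index for a measured quality attribute (e.g., % of label claim)."""
            mu, sigma = np.mean(values), np.std(values, ddof=1)
            return min(usl - mu, mu - lsl) / (3.0 * sigma)

        # illustrative use with hypothetical assay results and limits of 85-115% of label claim:
        # print(cpk(assay_results, lsl=85.0, usl=115.0))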

  14. Impact of growth rate on graphene lattice-defect formation within a single crystalline domain.

    PubMed

    Chin, Hao-Ting; Lee, Jian-Jhang; Hofmann, Mario; Hsieh, Ya-Ping

    2018-03-06

    Chemical vapor deposition (CVD) is promising for the large scale production of graphene and other two-dimensional materials. Optimization of the CVD process for enhancing their quality is a focus of ongoing effort and significant progress has been made in decreasing the defectiveness associated with grain boundaries and nucleation spots. However, little is known about the quality and origin of structural defects in the outgrowing lattice which are present even in single-crystalline material and represent the limit of current optimization efforts. We here investigate the formation kinetics of such defects by controlling graphene's growth rate over a wide range using nanoscale confinements. Statistical analysis of Raman spectroscopic results shows a clear trend between growth rate and defectiveness that is in quantitative agreement with a model where defects are healed preferentially at the growth front. Our results suggest that low growth rates are required to avoid the freezing of lattice defects and form high quality material. This conclusion is confirmed by a fourfold enhancement in graphene's carrier mobility upon optimization of the growth rate.

  15. Assessment of water quality monitoring for the optimal sensor placement in lake Yahuarcocha using pattern recognition techniques and geographical information systems.

    PubMed

    Jácome, Gabriel; Valarezo, Carla; Yoo, Changkyoo

    2018-03-30

    Pollution and the eutrophication process are increasing in lake Yahuarcocha and constant water quality monitoring is essential for a better understanding of the patterns occurring in this ecosystem. In this study, key sensor locations were determined using spatial and temporal analyses combined with geographical information systems (GIS) to assess the influence of weather features, anthropogenic activities, and other non-point pollution sources. A water quality monitoring network was established to obtain data on 14 physicochemical and microbiological parameters at each of seven sample sites over a period of 13 months. A spatial and temporal statistical approach using pattern recognition techniques, such as cluster analysis (CA) and discriminant analysis (DA), was employed to classify and identify the most important water quality parameters in the lake. The original monitoring network was reduced to four optimal sensor locations based on a fuzzy overlay of the interpolations of concentration variations of the most important parameters.

  16. A heuristic statistical stopping rule for iterative reconstruction in emission tomography.

    PubMed

    Ben Bouallègue, F; Crouzet, J F; Mariano-Goulart, D

    2013-01-01

    We propose a statistical stopping criterion for iterative reconstruction in emission tomography based on a heuristic statistical description of the reconstruction process. The method was assessed for MLEM reconstruction. Based on Monte-Carlo numerical simulations and using a perfectly modeled system matrix, our method was compared with classical iterative reconstruction followed by low-pass filtering in terms of Euclidian distance to the exact object, noise, and resolution. The stopping criterion was then evaluated with realistic PET data of a Hoffman brain phantom produced using the GATE platform for different count levels. The numerical experiments showed that compared with the classical method, our technique yielded significant improvement of the noise-resolution tradeoff for a wide range of counting statistics compatible with routine clinical settings. When working with realistic data, the stopping rule allowed a qualitatively and quantitatively efficient determination of the optimal image. Our method appears to give a reliable estimation of the optimal stopping point for iterative reconstruction. It should thus be of practical interest as it produces images with similar or better quality than classical post-filtered iterative reconstruction with a mastered computation time.
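    A simplified sketch of MLEM with a heuristic stopping rule, assuming a system matrix A and measured counts y are given; the stopping test here is a crude log-likelihood plateau check, not the paper's statistical criterion.

        import numpy as np

        def mlem_with_stopping(A, y, n_iter_max=200, tol=1e-3):
            """MLEM reconstruction with a heuristic data-fit stopping rule.

            A: (n_bins, n_voxels) system matrix, y: (n_bins,) measured counts.
            Stops when the Poisson log-likelihood improves by less than `tol`
            (relative), a simplified stand-in for the published stopping rule."""
            x = np.ones(A.shape[1])
            sens = A.sum(axis=0)                       # sensitivity image
            prev_ll = -np.inf
            for k in range(n_iter_max):
                proj = A @ x + 1e-12
                x *= (A.T @ (y / proj)) / (sens + 1e-12)   # classic MLEM multiplicative update
                ll = np.sum(y * np.log(proj) - proj)       # Poisson log-likelihood (up to a constant)
                if ll - prev_ll < tol * abs(prev_ll):
                    break
                prev_ll = ll
            return x, k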

  17. Statistical analysis for validating ACO-KNN algorithm as feature selection in sentiment analysis

    NASA Astrophysics Data System (ADS)

    Ahmad, Siti Rohaidah; Yusop, Nurhafizah Moziyana Mohd; Bakar, Azuraliza Abu; Yaakub, Mohd Ridzwan

    2017-10-01

    This research paper proposes a hybrid of ant colony optimization (ACO) and k-nearest neighbor (KNN) algorithms as a feature selection method for choosing relevant features from customer review datasets. Information gain (IG), genetic algorithm (GA), and rough set attribute reduction (RSAR) were used as baseline algorithms in a performance comparison with the proposed algorithm. This paper also discusses the significance test, which was used to evaluate the performance differences between the ACO-KNN, IG-GA, and IG-RSAR algorithms. The study evaluated the performance of the ACO-KNN algorithm using precision, recall, and F-score, which were validated using parametric statistical significance tests. The evaluation process has statistically proven that the ACO-KNN algorithm is significantly improved compared to the baseline algorithms. In addition, the experimental results have proven that ACO-KNN can be used as a feature selection technique in sentiment analysis to obtain a quality, optimal feature subset that represents the actual data in customer review data.

  18. Statistical analysis of global horizontal solar irradiation GHI in Fez city, Morocco

    NASA Astrophysics Data System (ADS)

    Bounoua, Z.; Mechaqrane, A.

    2018-05-01

    Accurate knowledge of the solar energy reaching the ground is necessary for sizing and optimizing the performance of solar installations. This paper describes a statistical analysis of the global horizontal solar irradiation (GHI) at Fez city, Morocco. For better reliability, we first applied a set of check procedures to test the quality of the hourly GHI measurements. We then eliminated the erroneous values, which are generally due to measurement or cosine-effect errors. Statistical analysis shows that the annual mean daily value of GHI is approximately 5 kWh/m²/day. Daily and monthly mean values and other parameters are also calculated.

  19. Dynamic statistical optimization of GNSS radio occultation bending angles: advanced algorithm and performance analysis

    NASA Astrophysics Data System (ADS)

    Li, Y.; Kirchengast, G.; Scherllin-Pirscher, B.; Norman, R.; Yuan, Y. B.; Fritzer, J.; Schwaerz, M.; Zhang, K.

    2015-08-01

    We introduce a new dynamic statistical optimization algorithm to initialize ionosphere-corrected bending angles of Global Navigation Satellite System (GNSS)-based radio occultation (RO) measurements. The new algorithm estimates background and observation error covariance matrices with geographically varying uncertainty profiles and realistic global-mean correlation matrices. The error covariance matrices estimated by the new approach are more accurate and realistic than in simplified existing approaches and can therefore be used in statistical optimization to provide optimal bending angle profiles for high-altitude initialization of the subsequent Abel transform retrieval of refractivity. The new algorithm is evaluated against the existing Wegener Center Occultation Processing System version 5.6 (OPSv5.6) algorithm, using simulated data on two test days from January and July 2008 and real observed CHAllenging Minisatellite Payload (CHAMP) and Constellation Observing System for Meteorology, Ionosphere, and Climate (COSMIC) measurements from the complete months of January and July 2008. The following is achieved for the new method's performance compared to OPSv5.6: (1) significant reduction of random errors (standard deviations) of optimized bending angles down to about half of their size or more; (2) reduction of the systematic differences in optimized bending angles for simulated MetOp data; (3) improved retrieval of refractivity and temperature profiles; and (4) realistically estimated global-mean correlation matrices and realistic uncertainty fields for the background and observations. Overall the results indicate high suitability for employing the new dynamic approach in the processing of long-term RO data into a reference climate record, leading to well-characterized and high-quality atmospheric profiles over the entire stratosphere.
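    A minimal sketch of the statistical optimization step in its usual linear form, assuming observed and background bending-angle profiles and their error covariance matrices are supplied; the dynamic estimation of those covariances, which is this paper's contribution, is not shown.

        import numpy as np

        def statistically_optimized_profile(alpha_obs, alpha_bg, R, B):
            """Optimal linear combination of observed and background bending-angle
            profiles, in the standard form used for high-altitude initialization:

                alpha_opt = alpha_bg + B (B + R)^{-1} (alpha_obs - alpha_bg)

            alpha_obs, alpha_bg: (n_levels,) observed and background profiles.
            R, B: (n_levels, n_levels) observation and background error covariances,
                  e.g. built from uncertainty profiles and correlation matrices."""
            gain = B @ np.linalg.inv(B + R)
            return alpha_bg + gain @ (alpha_obs - alpha_bg)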

  20. Optimization of the p-xylene oxidation process by a multi-objective differential evolution algorithm with adaptive parameters co-derived with the population-based incremental learning algorithm

    NASA Astrophysics Data System (ADS)

    Guo, Zhan; Yan, Xuefeng

    2018-04-01

    Different operating conditions of p-xylene oxidation have different influences on the product, purified terephthalic acid. It is necessary to obtain the optimal combination of reaction conditions to ensure the quality of the products, cut down on consumption and increase revenues. A multi-objective differential evolution (MODE) algorithm co-evolved with the population-based incremental learning (PBIL) algorithm, called PBMODE, is proposed. The PBMODE algorithm was designed as a co-evolutionary system. Each individual has its own parameter individual, which is co-evolved by PBIL. PBIL uses statistical analysis to build a model based on the corresponding symbiotic individuals of the superior original individuals during the main evolutionary process. The results of simulations and statistical analysis indicate that the overall performance of the PBMODE algorithm is better than that of the compared algorithms and it can be used to optimize the operating conditions of the p-xylene oxidation process effectively and efficiently.

  1. [Quality by design approaches for pharmaceutical development and manufacturing of Chinese medicine].

    PubMed

    Xu, Bing; Shi, Xin-Yuan; Wu, Zhi-Sheng; Zhang, Yan-Ling; Wang, Yun; Qiao, Yan-Jiang

    2017-03-01

    Pharmaceutical quality is built by design, formed in the manufacturing process, and improved during the product's lifecycle. Based on a comprehensive literature review of pharmaceutical quality by design (QbD), the essential ideas and implementation strategies of pharmaceutical QbD were interpreted. Considering the complex nature of Chinese medicine, the "4H" model was proposed for implementing QbD in the pharmaceutical development and industrial manufacture of Chinese medicine products. "4H" is an acronym for holistic design, holistic information analysis, holistic quality control, and holistic process optimization, which is consistent with the holistic concept of Chinese medicine theory. Holistic design aims at constructing both the quality problem space from patient requirements and the quality solution space from multidisciplinary knowledge. Holistic information analysis emphasizes understanding the quality pattern of Chinese medicine by integrating and mining multisource data and information at a relatively high level. Batch-to-batch quality consistency and manufacturing system reliability can be realized by comprehensive application of inspective, statistical, predictive, and intelligent quality control strategies. Holistic process optimization improves product quality and process capability during product lifecycle management. The implementation of QbD is useful for eliminating the ecosystem contradictions in the pharmaceutical development and manufacturing process of Chinese medicine products, and helps guarantee cost-effectiveness. Copyright© by the Chinese Pharmaceutical Association.

  2. Multi-objective optimization of laser-scribed micro grooves on AZO conductive thin film using Data Envelopment Analysis

    NASA Astrophysics Data System (ADS)

    Kuo, Chung-Feng Jeffrey; Quang Vu, Huy; Gunawan, Dewantoro; Lan, Wei-Luen

    2012-09-01

    The laser scribing process has been considered an effective approach for surface texturization of thin film solar cells. In this study, a systematic method for optimizing the multi-objective process parameters of a fiber laser system was proposed to achieve excellent quality characteristics, such as the minimum scribing line width, the flattest trough bottom, and the fewest processing edge surface bumps, for increasing the incident light absorption of thin film solar cells. First, the Taguchi method (TM) obtained useful statistical information through an orthogonal array with relatively few experiments. However, TM is only appropriate for optimizing single-objective problems and has to rely on engineering judgment for multi-objective problems, which can cause some degree of uncertainty. The back-propagation neural network (BPNN) and data envelopment analysis (DEA) were utilized to estimate the incomplete data and derive the optimal process parameters of the laser scribing system. In addition, the analysis of variance (ANOVA) method was applied to identify the significant factors which have the greatest effects on the quality of the scribing process; in other words, by putting more emphasis on these controllable and influential factors, the quality characteristics of the scribed thin film could be effectively enhanced. The experiments were carried out on ZnO:Al (AZO) transparent conductive thin film with a thickness of 500 nm, and the results proved that the proposed approach yields better improvements than TM, which is only superior at improving one quality characteristic while sacrificing the others. The results of confirmation experiments showed the reliability of the proposed method.

  3. Naturalness preservation image contrast enhancement via histogram modification

    NASA Astrophysics Data System (ADS)

    Tian, Qi-Chong; Cohen, Laurent D.

    2018-04-01

    Contrast enhancement is a technique for enhancing image contrast to obtain better visual quality. Since many existing contrast enhancement algorithms tend to produce over-enhanced results, naturalness preservation needs to be considered in the framework of image contrast enhancement. This paper proposes a naturalness-preserving contrast enhancement method, which adopts histogram matching to improve the contrast and uses image quality assessment to automatically select the optimal target histogram. Both contrast improvement and naturalness preservation are considered in the target histogram, so this method can avoid the over-enhancement problem. In the proposed method, the optimal target histogram is a weighted sum of the original histogram, a uniform histogram, and a Gaussian-shaped histogram. A structural metric and a statistical naturalness metric are then used to determine the weights of the corresponding histograms. Finally, the contrast-enhanced image is obtained by matching the optimal target histogram. The experiments demonstrate that the proposed method outperforms the compared histogram-based contrast enhancement algorithms.
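    A small sketch of the core idea, assuming an 8-bit grayscale image: build a target histogram as a weighted sum of the original, uniform, and Gaussian-shaped histograms, then apply classic histogram matching. The weights and Gaussian width are fixed here for illustration, whereas the paper selects them with structural and naturalness metrics.

        import numpy as np

        def enhance_with_target_histogram(img, w_orig=0.4, w_uni=0.3, w_gauss=0.3, sigma=64.0):
            """Match an 8-bit grayscale image to a weighted target histogram."""
            levels = np.arange(256)
            h_orig, _ = np.histogram(img, bins=256, range=(0, 256))
            h_orig = h_orig / h_orig.sum()
            h_uni = np.full(256, 1.0 / 256)
            h_gauss = np.exp(-0.5 * ((levels - 127.5) / sigma) ** 2)
            h_gauss = h_gauss / h_gauss.sum()
            target = w_orig * h_orig + w_uni * h_uni + w_gauss * h_gauss

            cdf_src = np.cumsum(h_orig)
            cdf_tgt = np.cumsum(target)
            mapping = np.searchsorted(cdf_tgt, cdf_src).clip(0, 255)  # classic histogram matching
            return mapping[img.astype(np.uint8)]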

  4. Optimizing patient/caregiver satisfaction through quality of communication in the pediatric emergency department.

    PubMed

    Locke, Robert; Stefano, Mariane; Koster, Alex; Taylor, Beth; Greenspan, Jay

    2011-11-01

    Optimizing patient/family caregiver satisfaction with emergency department (ED) encounters has advantages for improving patient health outcomes, adherence with medical plans, patient rights, and shared participation in care, provider satisfaction, improved health economics, institutional market share, and liability reduction. The variables that contribute to an optimal outcome in the pediatric ED setting have been less well investigated. The specific hypothesis tested was that patient/family caregiver-provider communication and 24-hour postdischarge phone contact would be associated with an increased frequency of highest possible satisfaction scores. A consecutive set of Press Ganey satisfaction survey responses between June and December 2009 in a large tertiary referral pediatric ED was evaluated. Press Ganey responses were subsequently linked to defined components of the electronic medical record associated with each survey respondent's ED visit to ascertain specific objective ED data. Multivariate modeling utilizing generalized linear equations was achieved to obtain a composite model of drivers of patient/caregiver satisfaction. Primary drivers of satisfaction and willingness to return or refer others to the ED were as follows: being informed about delays, ease of the insurance process, overall physician rating, registered nurse attention to needs, control of pain, and successful completion of postdischarge phone call to a family caregiver. Multiple wait time variables that were statistically significant in univariate modeling, including total length of time in the ED, time in waiting room, comfort of waiting room, time in treatment room, and play items, were not statistically significant once controlling for the other variables in the model. Type of insurance, race, patient age, or time of year did not influence the models. Achieving optimal patient/caregiver satisfaction scores in the pediatric ED is highly dependent on the quality of the interpersonal interaction and communication of ED activities. Wait time and other throughput variables are less important than perceived quality of the health interaction and interpersonal communication. Patient satisfaction has advantages greater than market share and should be considered a component of the care-delivery paradigm.

  5. Developing a Continuous Quality Improvement Assessment Using a Patient-Centered Approach in Optimizing Systemic Lupus Erythematosus Disease Control.

    PubMed

    Updyke, Katelyn Mariko; Urso, Brittany; Beg, Shazia; Solomon, James

    2017-10-09

    Systemic lupus erythematosus (SLE) is a multi-organ, autoimmune disease in which patients lose self-tolerance and develop immune complexes which deposit systemically causing multi-organ damage and inflammation. Patients often experience unpredictable flares of symptoms with poorly identified triggers. Literature suggests exogenous exposures may contribute to flares in symptoms. An online pilot survey was marketed globally through social media to self-reported SLE patients with the goal to identify specific subpopulations who are susceptible to disease state changes based on analyzed exogenous factors. The pilot survey was promoted for two weeks, 80 respondents fully completed the survey and were included in statistical analysis. Descriptive statistical analysis was performed on de-identified patient surveys and compared to previous literature studies reporting known or theorized triggers in the SLE disease state. The pilot survey identified similar exogenous triggers compared to previous literature, including antibiotics, increasing beef intake, and metal implants. The goal of the pilot survey is to utilize similar questions to develop a detailed internet-based patient interactive form that can be edited and time stamped as a method to promote continuous quality improvement assessments. The ultimate objective of the platform is to interact with SLE patients from across the globe longitudinally to optimize disease control and improve quality of care by allowing them to avoid harmful triggers.

  6. Developing a Continuous Quality Improvement Assessment Using a Patient-Centered Approach in Optimizing Systemic Lupus Erythematosus Disease Control

    PubMed Central

    Urso, Brittany; Beg, Shazia; Solomon, James

    2017-01-01

    Systemic lupus erythematosus (SLE) is a multi-organ, autoimmune disease in which patients lose self-tolerance and develop immune complexes which deposit systemically causing multi-organ damage and inflammation. Patients often experience unpredictable flares of symptoms with poorly identified triggers. Literature suggests exogenous exposures may contribute to flares in symptoms. An online pilot survey was marketed globally through social media to self-reported SLE patients with the goal to identify specific subpopulations who are susceptible to disease state changes based on analyzed exogenous factors. The pilot survey was promoted for two weeks, 80 respondents fully completed the survey and were included in statistical analysis. Descriptive statistical analysis was performed on de-identified patient surveys and compared to previous literature studies reporting known or theorized triggers in the SLE disease state. The pilot survey identified similar exogenous triggers compared to previous literature, including antibiotics, increasing beef intake, and metal implants. The goal of the pilot survey is to utilize similar questions to develop a detailed internet-based patient interactive form that can be edited and time stamped as a method to promote continuous quality improvement assessments. The ultimate objective of the platform is to interact with SLE patients from across the globe longitudinally to optimize disease control and improve quality of care by allowing them to avoid harmful triggers. PMID:29226052

  7. An RSM Study of the Effects of Simulation Work and Metamodel Specification on the Statistical Quality of Metamodel Estimates

    DTIC Science & Technology

    1994-03-01

    optimize, and perform "what-if" analysis on a complicated simulation model of the greenhouse effect. Regression metamodels were applied to several modules of...the large integrated assessment model of the greenhouse effect. In this study, the metamodels gave "acceptable forecast errors" and were shown to
  8. Optimization of Premix Powders for Tableting Use.

    PubMed

    Todo, Hiroaki; Sato, Kazuki; Takayama, Kozo; Sugibayashi, Kenji

    2018-05-08

    Direct compression is a popular choice as it provides the simplest way to prepare a tablet. It can be easily adopted when the active pharmaceutical ingredient (API) is unstable in water or under thermal drying. An optimal formulation of preliminarily mixed powders (premix powders), prepared in advance for tableting use, is beneficial. The aim of this study was to find the optimal formulation of premix powders composed of lactose (LAC), cornstarch (CS), and microcrystalline cellulose (MCC) by using statistical techniques. Based on the "Quality by Design" concept, a (3,3)-simplex lattice design consisting of the three components LAC, CS, and MCC was employed to prepare the model premix powders. A response surface method incorporating thin-plate spline interpolation (RSM-S) was applied to estimate the optimum premix powders for tableting use. The effect of tablet shape, identified by the surface curvature, on the optimization was also investigated. The optimum premix powder was effective when applied to a small quantity of API, although its function was limited for formulations containing a large amount of API. Statistical techniques are valuable for exploiting new functions of well-known materials such as LAC, CS, and MCC.
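    A short illustration of the (3,3) simplex-lattice mixture design mentioned here, enumerating the candidate LAC/CS/MCC proportion blends; the RSM-S fit itself is not reproduced.

        from itertools import product

        def simplex_lattice(q=3, m=3):
            """Enumerate a {q, m} simplex-lattice mixture design: all proportions on the
            grid {0, 1/m, 2/m, ..., 1} that sum to one. For q=3, m=3 this gives the ten
            candidate LAC/CS/MCC blends of a (3,3) lattice."""
            points = []
            for combo in product(range(m + 1), repeat=q):
                if sum(combo) == m:
                    points.append(tuple(c / m for c in combo))
            return points

        # len(simplex_lattice()) == 10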

  9. GLOBAL SOLUTIONS TO FOLDED CONCAVE PENALIZED NONCONVEX LEARNING

    PubMed Central

    Liu, Hongcheng; Yao, Tao; Li, Runze

    2015-01-01

    This paper is concerned with solving nonconvex learning problems with a folded concave penalty. Although their global solutions entail desirable statistical properties, optimization techniques that guarantee global optimality in a general setting are lacking. In this paper, we show that a class of nonconvex learning problems is equivalent to general quadratic programs. This equivalence enables us to develop mixed integer linear programming reformulations, which admit finite algorithms that find a provably global optimal solution. We refer to this reformulation-based technique as mixed integer programming-based global optimization (MIPGO). To our knowledge, this is the first global optimization scheme with a theoretical guarantee for folded concave penalized nonconvex learning with the SCAD penalty (Fan and Li, 2001) and the MCP penalty (Zhang, 2010). Numerical results indicate that MIPGO significantly outperforms the state-of-the-art solution scheme, local linear approximation, and other alternative solution techniques in the literature in terms of solution quality. PMID:27141126
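    For reference, a sketch of the SCAD penalty named here, written from its commonly cited piecewise form (readers should rely on Fan and Li, 2001 for the authoritative definition); the MIPGO reformulation itself is not shown.

        import numpy as np

        def scad_penalty(t, lam, a=3.7):
            """SCAD penalty p_lam(|t|):
               lam*|t|                                   for |t| <= lam
               (2*a*lam*|t| - t^2 - lam^2)/(2*(a-1))     for lam < |t| <= a*lam
               lam^2*(a+1)/2                             for |t| > a*lam"""
            t = np.abs(np.asarray(t, dtype=float))
            small = t <= lam
            mid = (t > lam) & (t <= a * lam)
            pen = np.empty_like(t)
            pen[small] = lam * t[small]
            pen[mid] = (2 * a * lam * t[mid] - t[mid] ** 2 - lam ** 2) / (2 * (a - 1))
            pen[~small & ~mid] = lam ** 2 * (a + 1) / 2
            return pen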

  10. The compartment bag test (CBT) for enumerating fecal indicator bacteria: Basis for design and interpretation of results.

    PubMed

    Gronewold, Andrew D; Sobsey, Mark D; McMahan, Lanakila

    2017-06-01

    For the past several years, the compartment bag test (CBT) has been employed in water quality monitoring and public health protection around the world. To date, however, the statistical basis for the design and recommended procedures for enumerating fecal indicator bacteria (FIB) concentrations from CBT results have not been formally documented. Here, we provide that documentation following protocols for communicating the evolution of similar water quality testing procedures. We begin with an overview of the statistical theory behind the CBT, followed by a description of how that theory was applied to determine an optimal CBT design. We then provide recommendations for interpreting CBT results, including procedures for estimating quantiles of the FIB concentration probability distribution, and the confidence of compliance with recognized water quality guidelines. We synthesize these values in custom user-oriented 'look-up' tables similar to those developed for other FIB water quality testing methods. Modified versions of our tables are currently distributed commercially as part of the CBT testing kit. Published by Elsevier B.V.
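    A minimal sketch of the most-probable-number likelihood that underlies compartment tests of this kind, assuming scipy is available and using placeholder compartment volumes rather than the actual CBT design; the published look-up tables and confidence procedures are not reproduced.

        import numpy as np
        from scipy.optimize import minimize_scalar

        def mpn_estimate(volumes_ml, positives, report_volume_ml=100.0):
            """Maximum-likelihood MPN estimate of FIB concentration (per report_volume_ml)
            from positive/negative compartment results, using the standard Poisson model."""
            v = np.asarray(volumes_ml, dtype=float)
            pos = np.asarray(positives, dtype=bool)

            def neg_log_lik(log_c):
                c = np.exp(log_c)                       # concentration per mL
                p_pos = np.clip(1.0 - np.exp(-c * v), 1e-12, 1 - 1e-12)
                return -np.sum(np.where(pos, np.log(p_pos), -c * v))

            res = minimize_scalar(neg_log_lik, bounds=(-10, 5), method="bounded")
            return np.exp(res.x) * report_volume_ml

        # e.g. mpn_estimate([1, 3, 10, 30, 56], [True, True, True, False, False])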

  11. Operation quality assessment model for video conference system

    NASA Astrophysics Data System (ADS)

    Du, Bangshi; Qi, Feng; Shao, Sujie; Wang, Ying; Li, Weijian

    2018-01-01

    The video conference system has become an important support platform for smart grid operation and management, and its operation quality is of growing concern to grid enterprises. First, an evaluation indicator system covering network, business, and operation maintenance aspects was established on the basis of the video conference system's operation statistics. Then, an operation quality assessment model combining a genetic algorithm with a regularized BP neural network was proposed, which outputs the operation quality level of the system within a time period and provides company managers with optimization advice. The simulation results show that the proposed evaluation model offers the advantages of fast convergence and high prediction accuracy compared with a regularized BP neural network alone, and its generalization ability is superior to that of LM-BP and Bayesian BP neural networks.

  12. Analytical procedure validation and the quality by design paradigm.

    PubMed

    Rozet, Eric; Lebrun, Pierre; Michiels, Jean-François; Sondag, Perceval; Scherder, Tara; Boulanger, Bruno

    2015-01-01

    Since the adoption of the ICH Q8 document concerning the development of pharmaceutical processes following a quality by design (QbD) approach, there have been many discussions on the opportunity for analytical procedure developments to follow a similar approach. While development and optimization of analytical procedure following QbD principles have been largely discussed and described, the place of analytical procedure validation in this framework has not been clarified. This article aims at showing that analytical procedure validation is fully integrated into the QbD paradigm and is an essential step in developing analytical procedures that are effectively fit for purpose. Adequate statistical methodologies have also their role to play: such as design of experiments, statistical modeling, and probabilistic statements. The outcome of analytical procedure validation is also an analytical procedure design space, and from it, control strategy can be set.

  13. Comparison of quality of obturation and instrumentation time using hand files and two rotary file systems in primary molars: A single-blinded randomized controlled trial.

    PubMed

    Govindaraju, Lavanya; Jeevanandan, Ganesh; Subramanian, E M G

    2017-01-01

    In the permanent dentition, different rotary systems are used for canal cleaning and shaping. Rotary instrumentation in pediatric dentistry is an emerging concept, and very few studies have compared the efficiency of rotary instrumentation for canal preparation in primary teeth. Hence, this study was performed to compare the obturation quality and instrumentation time of two rotary file systems (ProTaper and Mtwo) with hand files in primary molars. Forty-five primary mandibular molars were randomly allotted to one of three groups. Instrumentation was done using K-files in Group 1, ProTaper in Group 2, and Mtwo in Group 3. Instrumentation time was recorded. The canal filling quality was assessed as underfill, optimal fill, or overfill. Statistical analysis was done using the Chi-square test, ANOVA, and post hoc Tukey test. No significant difference was observed in the quality of obturation among the three groups. Intergroup comparison of the instrumentation time showed a statistically significant difference between the three groups. The use of rotary instrumentation in primary teeth results in a marked reduction in instrumentation time and improves the quality of obturation.

  14. MO-C-18A-01: Advances in Model-Based 3D Image Reconstruction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, G; Pan, X; Stayman, J

    2014-06-15

    Recent years have seen the emergence of CT image reconstruction techniques that exploit physical models of the imaging system, photon statistics, and even the patient to achieve improved 3D image quality and/or reduction of radiation dose. With numerous advantages in comparison to conventional 3D filtered backprojection, such techniques bring a variety of challenges as well, including: a demanding computational load associated with sophisticated forward models and iterative optimization methods; nonlinearity and nonstationarity in image quality characteristics; a complex dependency on multiple free parameters; and the need to understand how best to incorporate prior information (including patient-specific prior images) within the reconstruction process. The advantages, however, are even greater – for example: improved image quality; reduced dose; robustness to noise and artifacts; task-specific reconstruction protocols; suitability to novel CT imaging platforms and noncircular orbits; and incorporation of known characteristics of the imager and patient that are conventionally discarded. This symposium features experts in 3D image reconstruction, image quality assessment, and the translation of such methods to emerging clinical applications. Dr. Chen will address novel methods for the incorporation of prior information in 3D and 4D CT reconstruction techniques. Dr. Pan will show recent advances in optimization-based reconstruction that enable potential reduction of dose and sampling requirements. Dr. Stayman will describe a "task-based imaging" approach that leverages models of the imaging system and patient in combination with a specification of the imaging task to optimize both the acquisition and reconstruction process. Dr. Samei will describe the development of methods for image quality assessment in such nonlinear reconstruction techniques and the use of these methods to characterize and optimize image quality and dose in a spectrum of clinical applications. Learning Objectives: Learn the general methodologies associated with model-based 3D image reconstruction. Learn the potential advantages in image quality and dose associated with model-based image reconstruction. Learn the challenges associated with computational load and image quality assessment for such reconstruction methods. Learn how imaging task can be incorporated as a means to drive optimal image acquisition and reconstruction techniques. Learn how model-based reconstruction methods can incorporate prior information to improve image quality, ease sampling requirements, and reduce dose.

  15. OPTIMIZING THROUGH CO-EVOLUTIONARY AVALANCHES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    S. BOETTCHER; A. PERCUS

    2000-08-01

    We explore a new general-purpose heuristic for finding high-quality solutions to hard optimization problems. The method, called extremal optimization, is inspired by "self-organized criticality," a concept introduced to describe emergent complexity in many physical systems. In contrast to genetic algorithms, which operate on an entire "gene pool" of possible solutions, extremal optimization successively replaces extremely undesirable elements of a sub-optimal solution with new, random ones. Large fluctuations, called "avalanches," ensue that efficiently explore many local optima. Drawing upon models used to simulate far-from-equilibrium dynamics, extremal optimization complements approximation methods inspired by equilibrium statistical physics, such as simulated annealing. With only one adjustable parameter, its performance has proved competitive with more elaborate methods, especially near phase transitions. Those phase transitions are found in the parameter space of most optimization problems, and have recently been conjectured to be the origin of some of the hardest instances in computational complexity. We will demonstrate how extremal optimization can be implemented for a variety of combinatorial optimization problems. We believe that extremal optimization will be a useful tool in the investigation of phase transitions in combinatorial optimization problems, and hence valuable in elucidating the origin of computational complexity.
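    A compact sketch of tau-EO on an Ising-like toy energy, assuming a symmetric coupling matrix J; the fitness definition, parameter value, and problem choice are illustrative and not taken from the paper.

        import numpy as np

        def tau_eo(J, tau=1.4, n_steps=20000, seed=0):
            """tau-EO sketch: minimize E = -1/2 * sum_ij J_ij s_i s_j by repeatedly
            replacing an extremely 'unfit' variable. Ranks are sampled with
            probability proportional to rank^(-tau), the single adjustable parameter."""
            rng = np.random.default_rng(seed)
            n = J.shape[0]
            s = rng.choice([-1, 1], size=n)
            rank_probs = np.arange(1, n + 1, dtype=float) ** (-tau)
            rank_probs /= rank_probs.sum()

            def energy(spins):
                return float(-0.5 * spins @ J @ spins)

            best_s, best_e = s.copy(), energy(s)
            for _ in range(n_steps):
                fitness = s * (J @ s)                      # low values mark frustrated variables
                order = np.argsort(fitness)                # worst first
                flip = order[rng.choice(n, p=rank_probs)]  # biased toward the worst, never purely greedy
                s[flip] *= -1
                e = energy(s)
                if e < best_e:
                    best_e, best_s = e, s.copy()
            return best_s, best_e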

  16. Impact of database quality in knowledge-based treatment planning for prostate cancer.

    PubMed

    Wall, Phillip D H; Carver, Robert L; Fontenot, Jonas D

    2018-03-13

    This article investigates dose-volume prediction improvements in a common knowledge-based planning (KBP) method using a Pareto plan database compared with using a conventional, clinical plan database. Two plan databases were created using retrospective, anonymized data of 124 volumetric modulated arc therapy (VMAT) prostate cancer patients. The clinical plan database (CPD) contained planning data from each patient's clinically treated VMAT plan, which were manually optimized by various planners. The multicriteria optimization database (MCOD) contained Pareto-optimal plan data from VMAT plans created using a standardized multicriteria optimization protocol. Overlap volume histograms, incorporating fractional organ at risk volumes only within the treatment fields, were computed for each patient and used to match new patient anatomy to similar database patients. For each database patient, CPD and MCOD KBP predictions were generated for D10, D30, D50, D65, and D80 of the bladder and rectum in a leave-one-out manner. Prediction achievability was evaluated through a replanning study on a subset of 31 randomly selected database patients using the best KBP predictions, regardless of plan database origin, as planning goals. MCOD predictions were significantly lower than CPD predictions for all 5 bladder dose-volumes and rectum D50 (P = .004) and D65 (P < .001), whereas CPD predictions for rectum D10 (P = .005) and D30 (P < .001) were significantly less than MCOD predictions. KBP predictions were statistically achievable in the replans for all predicted dose-volumes, excluding D10 of the bladder (P = .03) and rectum (P = .04). Compared with clinical plans, replans showed significant average reductions in Dmean for bladder (7.8 Gy; P < .001) and rectum (9.4 Gy; P < .001), while maintaining statistically similar planning target volume, femoral head, and penile bulb dose. KBP dose-volume predictions derived from Pareto plans were more optimal overall than those resulting from manually optimized clinical plans, which significantly improved KBP-assisted plan quality. This work investigates how the plan quality of knowledge databases affects the performance and achievability of dose-volume predictions from a common knowledge-based planning approach for prostate cancer. Bladder and rectum dose-volume predictions derived from a database of standardized Pareto-optimal plans were compared with those derived from clinical plans manually designed by various planners. Dose-volume predictions from the Pareto plan database were significantly lower overall than those from the clinical plan database, without compromising achievability. Copyright © 2018 Elsevier Inc. All rights reserved.

  17. Impact of fair bowel preparation quality on adenoma and serrated polyp detection: data from the New Hampshire colonoscopy registry by using a standardized preparation-quality rating.

    PubMed

    Anderson, Joseph C; Butterly, Lynn F; Robinson, Christina M; Goodrich, Martha; Weiss, Julia E

    2014-09-01

    The effect of colon preparation quality on adenoma detection rates (ADRs) is unclear, partly because of lack of uniform colon preparation ratings in prior studies. The New Hampshire Colonoscopy Registry collects detailed data from colonoscopies statewide, by using a uniform preparation quality scale after the endoscopist has cleaned the mucosa. To compare the overall and proximal ADR and serrated polyp detection rates (SDR) in colonoscopies with differing levels of colon preparation quality. Cross-sectional. New Hampshire statewide registry. Patients undergoing colonoscopy. We examined colon preparation quality for 13,022 colonoscopies, graded by using specific descriptions provided to endoscopists. ADR and SDR are the number of colonoscopies with at least 1 adenoma or serrated polyp (excluding those in the rectum and/or sigmoid colon) detected divided by the total number of colonoscopies, for the preparation categories: optimal (excellent and/or good), fair, and poor. Overall/proximal ADR/SDR. The overall detection rates in examinations with fair colon preparation quality (SDR 8.9%; 95% confidence interval [CI], 7.4-10.7, ADR 27.1%; 95% CI, 24.6-30.0) were similar to rates observed in colonoscopies with optimal preparation quality (SDR 8.8%; 95% CI, 8.3-9.4, ADR 26.3%; 95% CI, 25.6-27.2). This finding also was observed for rates in the proximal colon. A logistic regression model (including withdrawal time) found that proximal ADR was statistically lower in the poor preparation category (odds ratio 0.45; 95% CI, 0.24-0.84; P < .01) than in adequately prepared colons. Homogeneous population. In our sample, there was no significant difference in overall or proximal ADR or SDR between colonoscopies with fair versus optimal colon preparation quality. Poor colon preparation quality may reduce the proximal ADR. Published by Mosby, Inc.

  18. UrQt: an efficient software for the Unsupervised Quality trimming of NGS data.

    PubMed

    Modolo, Laurent; Lerat, Emmanuelle

    2015-04-29

    Quality control is a necessary step of any Next Generation Sequencing analysis. Although customary, this step still requires manual interventions to empirically choose tuning parameters according to various quality statistics. Moreover, current quality control procedures that provide a "good quality" data set, are not optimal and discard many informative nucleotides. To address these drawbacks, we present a new quality control method, implemented in UrQt software, for Unsupervised Quality trimming of Next Generation Sequencing reads. Our trimming procedure relies on a well-defined probabilistic framework to detect the best segmentation between two segments of unreliable nucleotides, framing a segment of informative nucleotides. Our software only requires one user-friendly parameter to define the minimal quality threshold (phred score) to consider a nucleotide to be informative, which is independent of both the experiment and the quality of the data. This procedure is implemented in C++ in an efficient and parallelized software with a low memory footprint. We tested the performances of UrQt compared to the best-known trimming programs, on seven RNA and DNA sequencing experiments and demonstrated its optimality in the resulting tradeoff between the number of trimmed nucleotides and the quality objective. By finding the best segmentation to delimit a segment of good quality nucleotides, UrQt greatly increases the number of reads and of nucleotides that can be retained for a given quality objective. UrQt source files, binary executables for different operating systems and documentation are freely available (under the GPLv3) at the following address: https://lbbe.univ-lyon1.fr/-UrQt-.html .
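
    A much simplified stand-in for the segmentation idea (not UrQt's probabilistic model): score each base +1 if its phred quality reaches the threshold and -1 otherwise, then keep the maximum-scoring contiguous run of bases.

        # Threshold-based read trimming via a maximum-scoring subarray.
        def trim_read(phred_scores, threshold=20):
            best_sum = best_start = best_end = 0
            cur_sum, cur_start = 0, 0
            for i, q in enumerate(phred_scores):
                if cur_sum <= 0:
                    cur_sum, cur_start = 0, i          # restart the candidate segment
                cur_sum += 1 if q >= threshold else -1
                if cur_sum > best_sum:
                    best_sum, best_start, best_end = cur_sum, cur_start, i + 1
            return best_start, best_end                # half-open interval to keep

        quals = [8, 11, 15, 34, 36, 38, 37, 35, 33, 12, 9]
        print(trim_read(quals, threshold=20))          # -> (3, 9): the clean middle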

  19. Optimal threshold estimator of a prognostic marker by maximizing a time-dependent expected utility function for a patient-centered stratified medicine.

    PubMed

    Dantan, Etienne; Foucher, Yohann; Lorent, Marine; Giral, Magali; Tessier, Philippe

    2018-06-01

    Defining thresholds of prognostic markers is essential for stratified medicine. Such thresholds are mostly estimated from purely statistical measures regardless of patient preferences, potentially leading to unacceptable medical decisions. Quality-Adjusted Life-Years are a widely used preferences-based measure of health outcomes. We develop a time-dependent Quality-Adjusted Life-Years-based expected utility function for censored data that should be maximized to estimate an optimal threshold. We performed a simulation study to compare estimated thresholds when using the proposed expected utility approach and purely statistical estimators. Two applications illustrate the usefulness of the proposed methodology, which was implemented in the R package ROCt (www.divat.fr). First, by reanalysing data of a randomized clinical trial comparing the efficacy of prednisone vs. placebo in patients with chronic liver cirrhosis, we demonstrate the utility of treating patients with a prothrombin level higher than 89%. Second, we reanalyse the data of an observational cohort of kidney transplant recipients and conclude that the Kidney Transplant Failure Score is not useful for adapting the frequency of clinical visits. Applying such a patient-centered methodology may improve future transfer of novel prognostic scoring systems or markers in clinical practice.
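
    A hedged, stripped-down sketch of the idea (censoring and the time-dependent QALY model are omitted): grid-search the marker threshold that maximizes mean realized utility when only patients above the threshold are treated; the data and utilities are simulated.

        # Expected-utility threshold selection on simulated data.
        import numpy as np

        def optimal_threshold(marker, utility_treated, utility_untreated):
            candidates = np.unique(marker)
            def expected_utility(c):
                treat = marker > c
                return np.mean(np.where(treat, utility_treated, utility_untreated))
            utilities = [expected_utility(c) for c in candidates]
            best = int(np.argmax(utilities))
            return candidates[best], utilities[best]

        rng = np.random.default_rng(0)
        marker = rng.uniform(0, 100, 500)
        u_untreated = 5 - 0.03 * marker + rng.normal(0, 0.5, 500)  # worse with higher marker
        u_treated = 4.5 + rng.normal(0, 0.5, 500)                  # flat benefit, up-front cost
        print(optimal_threshold(marker, u_treated, u_untreated))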

  20. In-situ transesterification of seeds of invasive Chinese tallow trees (Triadica sebifera L.) in a microwave batch system (GREEN(3)) using hexane as co-solvent: Biodiesel production and process optimization.

    PubMed

    Barekati-Goudarzi, Mohamad; Boldor, Dorin; Nde, Divine B

    2016-02-01

    In-situ transesterification (simultaneous extraction and transesterification) of Chinese tallow tree seeds into methyl esters using a batch microwave system was investigated in this study. A high degree of oil extraction and efficient conversion of oil to biodiesel were found in the proposed range. The process was further optimized in terms of product yields and conversion rates using the Doehlert optimization methodology. Based on the experimental results and statistical analysis, the optimal production yield conditions for this process were determined as: catalyst concentration of 1.74 wt.%, solvent ratio of about 3 (v/w), reaction time of 20 min and temperature of 58.1°C. 1H NMR was used to calculate reaction conversion. All methyl esters produced using this method met ASTM biodiesel quality specifications. Copyright © 2015 Elsevier Ltd. All rights reserved.

  1. Using Statistical Process Control to Drive Improvement in Neonatal Care: A Practical Introduction to Control Charts.

    PubMed

    Gupta, Munish; Kaplan, Heather C

    2017-09-01

    Quality improvement (QI) is based on measuring performance over time, and variation in data measured over time must be understood to guide change and make optimal improvements. Common cause variation is natural variation owing to factors inherent to any process; special cause variation is unnatural variation owing to external factors. Statistical process control methods, and particularly control charts, are robust tools for understanding data over time and identifying common and special cause variation. This review provides a practical introduction to the use of control charts in health care QI, with a focus on neonatology. Copyright © 2017 Elsevier Inc. All rights reserved.
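
    A minimal sketch of an individuals control chart for such time-ordered QI data (the constant is standard, the measurements are made up): the centre line is the mean, the limits sit roughly three estimated sigmas away, and points outside them are flagged as possible special cause variation.

        # Individuals (I) chart limits from the average moving range.
        import numpy as np

        def control_limits(values, sigma_mult=3.0):
            values = np.asarray(values, dtype=float)
            moving_range = np.abs(np.diff(values))
            sigma_hat = moving_range.mean() / 1.128        # d2 constant for n = 2
            centre = values.mean()
            ucl, lcl = centre + sigma_mult * sigma_hat, centre - sigma_mult * sigma_hat
            flagged = np.where((values > ucl) | (values < lcl))[0]
            return centre, lcl, ucl, flagged

        monthly_rate = [4.1, 3.8, 4.3, 4.0, 3.9, 4.2, 6.5, 4.0, 3.7, 4.1]
        print(control_limits(monthly_rate))    # the 6.5 value (index 6) falls outside the limits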

  2. QSAR models for anti-malarial activity of 4-aminoquinolines.

    PubMed

    Masand, Vijay H; Toropov, Andrey A; Toropova, Alla P; Mahajan, Devidas T

    2014-03-01

    In the present study, predictive quantitative structure-activity relationship (QSAR) models for the anti-malarial activity of 4-aminoquinolines have been developed. CORAL, which is freely available on the internet (http://www.insilico.eu/coral), has been used as a tool for QSAR analysis to establish a statistically robust QSAR model of the anti-malarial activity of 4-aminoquinolines. Six random splits into a visible sub-system for training and an invisible sub-system for validation were examined. Statistical qualities for these splits vary, but in all cases the statistical quality of prediction for anti-malarial activity was quite good. The optimal SMILES-based descriptor was used to derive the single-descriptor QSAR model for a data set of 112 aminoquinolines. All the splits had r(2) > 0.85 and r(2) > 0.78 for the subtraining and validation sets, respectively. The three-parameter multilinear regression (MLR) QSAR model has Q(2) = 0.83, R(2) = 0.84 and F = 190.39. The anti-malarial activity has a strong correlation with the presence/absence of nitrogen and oxygen at a topological distance of six.

  3. High Agreement and High Prevalence: The Paradox of Cohen's Kappa.

    PubMed

    Zec, Slavica; Soriani, Nicola; Comoretto, Rosanna; Baldi, Ileana

    2017-01-01

    Cohen's Kappa is the most used agreement statistic in literature. However, under certain conditions, it is affected by a paradox which returns biased estimates of the statistic itself. The aim of the study is to provide sufficient information which allows the reader to make an informed choice of the correct agreement measure, by underlining some optimal properties of Gwet's AC1 in comparison to Cohen's Kappa, using a real data example. During the process of literature review, we have asked a panel of three evaluators to come up with a judgment on the quality of 57 randomized controlled trials assigning a score to each trial using the Jadad scale. The quality was evaluated according to the following dimensions: adopted design, randomization unit, type of primary endpoint. With respect to each of the above described features, the agreement between the three evaluators has been calculated using Cohen's Kappa statistic and Gwet's AC1 statistic and, finally, the values have been compared with the observed agreement. The values of the Cohen's Kappa statistic would lead to believe that the agreement levels for the variables Unit, Design and Primary Endpoints are totally unsatisfactory. The AC1 statistic, on the contrary, shows plausible values which are in line with the respective values of the observed concordance. We conclude that it would always be appropriate to adopt the AC1 statistic, thus bypassing any risk of incurring the paradox and drawing wrong conclusions about the results of agreement analysis.
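
    An illustrative two-rater, binary-category computation of both statistics on made-up ratings (the study itself used three raters and Jadad-scale items), showing the paradox: with near-perfect observed agreement but extreme prevalence, Kappa collapses while AC1 does not.

        # Cohen's Kappa vs. Gwet's AC1 for two raters.
        import numpy as np

        def kappa_and_ac1(r1, r2):
            r1, r2 = np.asarray(r1), np.asarray(r2)
            cats = np.union1d(r1, r2)
            po = np.mean(r1 == r2)                                   # observed agreement
            pe_kappa = sum(np.mean(r1 == c) * np.mean(r2 == c) for c in cats)
            kappa = (po - pe_kappa) / (1 - pe_kappa)
            pi = np.array([(np.mean(r1 == c) + np.mean(r2 == c)) / 2 for c in cats])
            pe_ac1 = np.sum(pi * (1 - pi)) / (len(cats) - 1)         # Gwet's chance agreement
            ac1 = (po - pe_ac1) / (1 - pe_ac1)
            return kappa, ac1

        rater1 = [1] * 100                       # 95% agreement, category 0 is very rare
        rater2 = [1] * 95 + [0] * 5
        print(kappa_and_ac1(rater1, rater2))     # Kappa = 0.0, AC1 close to 0.95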

  4. ASSESSMENT OF CLINICAL IMAGE QUALITY IN PAEDIATRIC ABDOMINAL CT EXAMINATIONS: DEPENDENCY ON THE LEVEL OF ADAPTIVE STATISTICAL ITERATIVE RECONSTRUCTION (ASiR) AND THE TYPE OF CONVOLUTION KERNEL.

    PubMed

    Larsson, Joel; Båth, Magnus; Ledenius, Kerstin; Caisander, Håkan; Thilander-Klang, Anne

    2016-06-01

    The purpose of this study was to investigate the effect of different combinations of convolution kernel and the level of Adaptive Statistical iterative Reconstruction (ASiR™) on diagnostic image quality as well as visualisation of anatomical structures in paediatric abdominal computed tomography (CT) examinations. Thirty-five paediatric patients with abdominal pain with non-specified pathology undergoing abdominal CT were included in the study. Transaxial stacks of 5-mm-thick images were retrospectively reconstructed at various ASiR levels, in combination with three convolution kernels. Four paediatric radiologists rated the diagnostic image quality and the delineation of six anatomical structures in a blinded randomised visual grading study. Image quality at a given ASiR level was found to be dependent on the kernel, and a more edge-enhancing kernel benefitted from a higher ASiR level. An ASiR level of 70 % together with the Soft™ or Standard™ kernel was suggested to be the optimal combination for paediatric abdominal CT examinations. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  5. Seasonal assessment and apportionment of surface water pollution using multivariate statistical methods: Sinos River, southern Brazil.

    PubMed

    Alves, Darlan Daniel; Riegel, Roberta Plangg; de Quevedo, Daniela Müller; Osório, Daniela Montanari Migliavacca; da Costa, Gustavo Marques; do Nascimento, Carlos Augusto; Telöken, Franko

    2018-06-08

    Assessment of surface water quality is an issue of high current importance, especially in polluted rivers which provide water for treatment and distribution as drinking water, as is the case of the Sinos River, southern Brazil. Multivariate statistical techniques allow a better understanding of the seasonal variations in water quality, as well as the source identification and source apportionment of water pollution. In this study, the multivariate statistical techniques of cluster analysis (CA), principal component analysis (PCA), and positive matrix factorization (PMF) were used, along with the Kruskal-Wallis test and Spearman's correlation analysis, in order to interpret a water quality data set resulting from a monitoring program conducted over a period of almost two years (May 2013 to April 2015). The water samples were collected from the raw water inlet of the municipal water treatment plant (WTP) operated by the Water and Sewage Services of Novo Hamburgo (COMUSA). CA allowed the data to be grouped into three periods (autumn and summer (AUT-SUM); winter (WIN); spring (SPR)). Through the PCA, it was possible to identify the most important parameters contributing to water quality variations: total coliforms (TCOLI) in AUT-SUM; water level (WL), water temperature (WT), and electrical conductivity (EC) in WIN; and color (COLOR) and turbidity (TURB) in SPR. PMF was applied to the complete data set and enabled the source apportionment of water pollution through three factors, which are related to anthropogenic sources, such as the discharge of domestic sewage (mostly represented by Escherichia coli (ECOLI)), industrial wastewaters, and agriculture runoff. The results of this study demonstrate the contribution of integrated statistical techniques to the interpretation and understanding of large water quality data sets, and show that this approach can be used as an efficient methodology to optimize indicators for water quality assessment.

  6. Combining local search with co-evolution in a remarkably simple way

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boettcher, S.; Percus, A.

    2000-05-01

    The authors explore a new general-purpose heuristic for finding high-quality solutions to hard optimization problems. The method, called extremal optimization, is inspired by self-organized criticality, a concept introduced to describe emergent complexity in physical systems. In contrast to genetic algorithms, which operate on an entire gene-pool of possible solutions, extremal optimization successively replaces extremely undesirable elements of a single sub-optimal solution with new, random ones. Large fluctuations, or avalanches, ensue that efficiently explore many local optima. Drawing upon models used to simulate far-from-equilibrium dynamics, extremal optimization complements heuristics inspired by equilibrium statistical physics, such as simulated annealing. With only one adjustable parameter, its performance has proved competitive with more elaborate methods, especially near phase transitions. Phase transitions are found in many combinatorial optimization problems, and have been conjectured to occur in the region of parameter space containing the hardest instances. We demonstrate how extremal optimization can be implemented for a variety of hard optimization problems. We believe that this will be a useful tool in the investigation of phase transitions in combinatorial optimization, thereby helping to elucidate the origin of computational complexity.

  7. Make measurable what is not so: national monitoring of the status of persons with intellectual disability.

    PubMed

    Fujiura, Glenn T; Rutkowski-Kmitta, Violet; Owen, Randall

    2010-12-01

    Statistics are critical in holding governments accountable for the well-being of citizens with disability. International initiatives are underway to improve the quality of disability statistics, but meaningful ID data is exceptionally rare. The status of ID data was evaluated in a review of 12 national statistical systems. Recurring data collection by national ministries was identified and the availability of measures of poverty, exclusion, and disadvantage was assessed. A total of 131 recurring systems coordinated by 50 different ministries were identified. The majority included general disability but less than 25% of the systems screened ID. Of these, few provided policy-relevant data. The scope of ID data was dismal at best, though a significant statistical infrastructure exists for the integration of ID data. Advocacy will be necessary. There is no optimal form of data monitoring, and decisions regarding priorities in purpose, targeted audiences, and the goals for surveillance must be resolved.

  8. Design optimization for cost and quality: The robust design approach

    NASA Technical Reports Server (NTRS)

    Unal, Resit

    1990-01-01

    Designing reliable, low cost, and operable space systems has become the key to future space operations. Designing high quality space systems at low cost is an economic and technological challenge to the designer. A systematic and efficient way to meet this challenge is a new method of design optimization for performance, quality, and cost, called Robust Design. Robust Design is an approach for design optimization. It consists of: making system performance insensitive to material and subsystem variation, thus allowing the use of less costly materials and components; making designs less sensitive to the variations in the operating environment, thus improving reliability and reducing operating costs; and using a new structured development process so that engineering time is used most productively. The objective in Robust Design is to select the best combination of controllable design parameters so that the system is most robust to uncontrollable noise factors. The robust design methodology uses a mathematical tool called an orthogonal array, from design of experiments theory, to study a large number of decision variables with a significantly small number of experiments. Robust design also uses a statistical measure of performance, called a signal-to-noise ratio, from electrical control theory, to evaluate the level of performance and the effect of noise factors. The purpose is to investigate the Robust Design methodology for improving quality and cost, demonstrate its application by the use of an example, and suggest its use as an integral part of space system design process.
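
    A hedged sketch of two of the ingredients named above, with invented response data: an L4 orthogonal array for three two-level factors, and Taguchi's larger-is-better signal-to-noise ratio used to pick the most robust factor setting across replicated noise conditions.

        # Orthogonal-array runs scored by a larger-is-better S/N ratio.
        import numpy as np

        L4 = np.array([[0, 0, 0],     # 3 two-level factors in 4 runs
                       [0, 1, 1],
                       [1, 0, 1],
                       [1, 1, 0]])

        def sn_larger_is_better(replicates):
            y = np.asarray(replicates, dtype=float)
            return -10.0 * np.log10(np.mean(1.0 / y ** 2))

        responses = [[22.1, 20.5, 21.8],          # replicated responses under noise
                     [25.3, 24.7, 26.0],
                     [19.8, 18.9, 20.2],
                     [27.5, 28.1, 26.9]]
        sn = [sn_larger_is_better(r) for r in responses]
        print("S/N per run:", np.round(sn, 2), "-> best setting:", L4[int(np.argmax(sn))])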

  9. Ranked solutions to a class of combinatorial optimizations - with applications in mass spectrometry based peptide sequencing

    NASA Astrophysics Data System (ADS)

    Doerr, Timothy; Alves, Gelio; Yu, Yi-Kuo

    2006-03-01

    Typical combinatorial optimizations are NP-hard; however, for a particular class of cost functions the corresponding combinatorial optimizations can be solved in polynomial time. This suggests a way to efficiently find approximate solutions: find a transformation that makes the cost function as similar as possible to that of the solvable class. After keeping many high-ranking solutions using the approximate cost function, one may then re-assess these solutions with the full cost function to find the best approximate solution. Under this approach, it is important to be able to assess the quality of the solutions obtained, e.g., by finding the true ranking of the kth best approximate solution when all possible solutions are considered exhaustively. To tackle this statistical issue, we provide a systematic method starting with a scaling function generated from the finite number of high-ranking solutions, followed by a convergent iterative mapping. This method, useful in a variant of the directed paths in random media problem proposed here, can also provide a statistical significance assessment for one of the most important proteomic tasks: peptide sequencing using tandem mass spectrometry data.

  10. Recovery and purification of chitosanase produced by Bacillus cereus using expanded bed adsorption and central composite design.

    PubMed

    de Araújo, Nathália Kelly; Pimentel, Vanessa Carvalho; da Silva, Nayane Macedo Portela; de Araújo Padilha, Carlos Eduardo; de Macedo, Gorete Ribeiro; Dos Santos, Everaldo Silvino

    2016-02-01

    This study presents a system for expanded bed adsorption for the purification of chitosanase from broth extract in a single step. A chitosanase-producing strain was isolated and identified as Bacillus cereus C-01 and used to produce chitosanases. The expanded bed adsorption conditions for chitosanase purification were optimized statistically using STREAMLINE(TM) DEAE and a homemade column (2.6 × 30.0 cm). Dependent variables were defined by the quality criteria purification factor (P) and enzyme yield to optimize the chromatographic process. Statistical analyses showed that the optimum conditions for the maximum P were 150 cm/h load flow velocity, 6.0 cm settled bed height, and 7.36 cm distributor height. Distributor height had a strong influence on the process, considerably affecting both the P and enzyme yield. Optimizing the purification variables resulted in an approximately 3.66-fold increase in the P compared with the value under nonoptimized conditions. This system is promising for the recovery of chitosanase from B. cereus C-01 and is economically viable because it reduces the number of process steps. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. Integrative image segmentation optimization and machine learning approach for high quality land-use and land-cover mapping using multisource remote sensing data

    NASA Astrophysics Data System (ADS)

    Gibril, Mohamed Barakat A.; Idrees, Mohammed Oludare; Yao, Kouame; Shafri, Helmi Zulhaidi Mohd

    2018-01-01

    The growing use of optimization for geographic object-based image analysis and the possibility of deriving a wide range of information about the image in textual form make machine learning (data mining) a versatile tool for information extraction from multiple data sources. This paper presents an application of data mining for land-cover classification by fusing SPOT-6, RADARSAT-2, and derived datasets. First, the images and other derived indices (normalized difference vegetation index, normalized difference water index, and soil adjusted vegetation index) were combined and subjected to a segmentation process, with optimal segmentation parameters obtained using a combination of spatial and Taguchi statistical optimization. The image objects, which carry all the attributes of the input datasets, were extracted and related to the target land-cover classes through data mining algorithms (decision tree) for classification. To evaluate the performance, the result was compared with two nonparametric classifiers: support vector machine (SVM) and random forest (RF). Furthermore, the decision tree classification result was evaluated against six unoptimized trials segmented using arbitrary parameter combinations. The results show that the optimized process produces better land-use and land-cover classification, with an overall classification accuracy of 91.79% for the decision tree, compared with 87.25% and 88.69% for SVM and RF, respectively, while the six unoptimized classifications yield overall accuracies between 84.44% and 88.08%. The higher accuracy of the optimized data mining classification approach compared with the unoptimized results indicates that the optimization process has a significant impact on classification quality.

  12. A tutorial in displaying mass spectrometry-based proteomic data using heat maps.

    PubMed

    Key, Melissa

    2012-01-01

    Data visualization plays a critical role in interpreting experimental results of proteomic experiments. Heat maps are particularly useful for this task, as they allow us to find quantitative patterns across proteins and biological samples simultaneously. The quality of a heat map can be vastly improved by understanding the options available to display and organize the data in the heat map. This tutorial illustrates how to optimize heat maps for proteomics data by incorporating known characteristics of the data into the image. First, the concepts used to guide the creation of heat maps are demonstrated. Then, these concepts are applied to two types of analysis: visualizing spectral features across biological samples, and presenting the results of tests of statistical significance. For all examples we provide details of computer code in the open-source statistical programming language R, which can be used by biologists and clinicians with little statistical background. Heat maps are a useful tool for presenting quantitative proteomic data organized in a matrix format. Understanding and optimizing the parameters used to create the heat map can vastly improve both the appearance and the interpretation of heat map data.
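
    The tutorial's examples are written in R; the following analogous Python sketch (synthetic data, our own parameter choices) shows two of the options it discusses: per-protein standardization so that patterns rather than absolute abundances drive the colours, and reordering rows by hierarchical clustering.

        # A clustered, row-standardized heat map of proteins x samples.
        import numpy as np
        import matplotlib.pyplot as plt
        from scipy.cluster.hierarchy import linkage, leaves_list

        rng = np.random.default_rng(0)
        data = rng.lognormal(mean=2.0, sigma=0.6, size=(30, 8))     # proteins x samples

        log_data = np.log2(data)
        z = ((log_data - log_data.mean(axis=1, keepdims=True))
             / log_data.std(axis=1, keepdims=True))                 # per-protein z-scores
        order = leaves_list(linkage(z, method="average"))           # cluster-based row order

        plt.imshow(z[order], aspect="auto", cmap="RdBu_r", vmin=-2, vmax=2)
        plt.xlabel("sample")
        plt.ylabel("protein (clustered order)")
        plt.colorbar(label="row z-score (log2 intensity)")
        plt.savefig("heatmap.png", dpi=150)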

  13. Monitoring the quality consistency of Weibizhi tablets by micellar electrokinetic chromatography fingerprints combined with multivariate statistical analyses, the simple quantified ratio fingerprint method, and the fingerprint-efficacy relationship.

    PubMed

    Liu, Yingchun; Sun, Guoxiang; Wang, Yan; Yang, Lanping; Yang, Fangliang

    2015-06-01

    Micellar electrokinetic chromatography fingerprinting combined with quantification was successfully developed and applied to monitor the quality consistency of Weibizhi tablets, which is a classical compound preparation used to treat gastric ulcers. A background electrolyte composed of 57 mmol/L sodium borate, 21 mmol/L sodium dodecylsulfate and 100 mmol/L sodium hydroxide was used to separate compounds. To optimize capillary electrophoresis conditions, multivariate statistical analyses were applied. First, the most important factors influencing sample electrophoretic behavior were identified as background electrolyte concentrations. Then, a Box-Benhnken design response surface strategy using resolution index RF as an integrated response was set up to correlate factors with response. RF reflects the effective signal amount, resolution, and signal homogenization in an electropherogram, thus, it was regarded as an excellent indicator. In fingerprint assessments, simple quantified ratio fingerprint method was established for comprehensive quality discrimination of traditional Chinese medicines/herbal medicines from qualitative and quantitative perspectives, by which the quality of 27 samples from the same manufacturer were well differentiated. In addition, the fingerprint-efficacy relationship between fingerprints and antioxidant activities was established using partial least squares regression, which provided important medicinal efficacy information for quality control. The present study offered an efficient means for monitoring Weibizhi tablet quality consistency. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. Optimization of image quality in pulmonary CT angiography with low dose of contrast material

    NASA Astrophysics Data System (ADS)

    Assi, Abed Al Nasser; Abu Arra, Ali

    2017-06-01

    Aim: The aim of this study was to compare objective image quality data for pulmonary embolism between a conventional pulmonary CTA protocol and a novel acquisition protocol performed with an optimized radiation dose and a smaller amount of iodinated contrast medium injected during PE scanning. Materials and Methods: Sixty-four patients with suspected pulmonary embolism (PE) were examined using an angio-CT protocol. Patients were randomly assigned to two groups: group A (16 women and 16 men; mean age, 62 years with standard deviation 16; range, 19-89 years), injected contrast agent 35-40 ml, and group B (16 women and 16 men; age range, 28-86 years), injected contrast agent 70-80 ml. Other scanning parameters were kept constant. Pulmonary vessel enhancement and image noise were quantified; signal-to-noise ratio (SNR) and contrast-to-noise ratio (CNR) were calculated. Subjective vessel contrast was assessed by two radiologists in consensus. Results: A total of 14 cases of PE (22%) were found in the evaluated subjects (nine in group A and five in group B). All PE cases were detected by the two readers. There was no significant difference in the size or location of the PEs between the two groups; the average image noise was 14 HU for group A and 19 HU for group B. The difference was not statistically significant (p = 0.09). Overall, the SNR and CNR were slightly higher in group B (24.4 and 22.5, respectively) compared with group A (19.4 and 16.4, respectively), but these differences were not statistically significant (p = 0.71 and p = 0.35, respectively). Conclusion and Discussion: The two pulmonary CTA protocols achieved similar image quality, while the new protocol used an optimized care dose and reduced the contrast volume by 50% compared with the conventional protocol.

  15. Correlation of the clinical and physical image quality in chest radiography for average adults with a computed radiography imaging system.

    PubMed

    Moore, C S; Wood, T J; Beavis, A W; Saunderson, J R

    2013-07-01

    The purpose of this study was to examine the correlation between the quality of visually graded patient (clinical) chest images and a quantitative assessment of chest phantom (physical) images acquired with a computed radiography (CR) imaging system. The results of a previously published study, in which four experienced image evaluators graded computer-simulated postero-anterior chest images using a visual grading analysis scoring (VGAS) scheme, were used for the clinical image quality measurement. Contrast-to-noise ratio (CNR) and effective dose efficiency (eDE) were used as physical image quality metrics measured in a uniform chest phantom. Although optimal values of these physical metrics for chest radiography were not derived in this work, their correlation with VGAS in images acquired without an antiscatter grid across the diagnostic range of X-ray tube voltages was determined using Pearson's correlation coefficient. Clinical and physical image quality metrics increased with decreasing tube voltage. Statistically significant correlations between VGAS and CNR (R=0.87, p<0.033) and eDE (R=0.77, p<0.008) were observed. Medical physics experts may use the physical image quality metrics described here in quality assurance programmes and optimisation studies with a degree of confidence that they reflect the clinical image quality in chest CR images acquired without an antiscatter grid. A statistically significant correlation has been found between the clinical and physical image quality in CR chest imaging. The results support the value of using CNR and eDE in the evaluation of quality in clinical thorax radiography.
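
    As a hedged illustration (not the authors' code), the contrast-to-noise ratio used above can be computed from region-of-interest pixel statistics, and its correlation with visual grading scores checked with Pearson's r; the ROI pixels and per-voltage values below are synthetic.

        # CNR from two regions of interest, and Pearson's r against VGAS.
        import numpy as np

        def cnr(signal_roi, background_roi):
            signal_roi = np.asarray(signal_roi, float)
            background_roi = np.asarray(background_roi, float)
            # contrast between mean ROI signals, normalised by background noise
            return (signal_roi.mean() - background_roi.mean()) / background_roi.std()

        rng = np.random.default_rng(0)
        print("CNR:", round(cnr(rng.normal(120, 8, 400), rng.normal(100, 8, 400)), 2))

        cnr_by_kv = [4.8, 4.3, 3.9, 3.4, 3.0, 2.8]        # illustrative values, 90-150 kV
        vgas_by_kv = [0.62, 0.55, 0.49, 0.41, 0.35, 0.31]
        print("Pearson r:", round(float(np.corrcoef(cnr_by_kv, vgas_by_kv)[0, 1]), 2))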

  16. [Statistical process control applied to intensity modulated radiotherapy pretreatment controls with portal dosimetry].

    PubMed

    Villani, N; Gérard, K; Marchesi, V; Huger, S; François, P; Noël, A

    2010-06-01

    The first purpose of this study was to illustrate the contribution of statistical process control to better security in intensity modulated radiotherapy (IMRT) treatments. This improvement is possible by controlling the dose delivery process, which is characterized by the pretreatment quality control results; it is therefore necessary to bring portal dosimetry measurements under statistical control (the ionisation chamber measurements were already monitored with statistical process control tools). The second objective was to determine whether the ionisation chamber can be replaced by portal dosimetry in order to optimize the time devoted to pretreatment quality control. At the Alexis-Vautrin center, pretreatment quality controls in IMRT for prostate and head and neck treatments were performed for each beam of each patient. These controls were made with an ionisation chamber, which is the reference detector for absolute dose measurement, and with portal dosimetry for the verification of the dose distribution. Statistical process control is a statistical analysis method, originating in industry, used to control and improve the quality of the studied process. It uses graphic tools, such as control charts, to follow the process and warn the operator in case of failure, and quantitative tools to evaluate the ability of the process to respect guidelines: the capability study. The study was performed on 450 head and neck beams and on 100 prostate beams. Control charts of the mean and standard deviation were established; they revealed drifts, both slow and weak and strong and fast, and identified an introduced special cause (a manual shift of the leaf gap of the multileaf collimator). The correlation between the dose measured at one point with the EPID and with the ionisation chamber was evaluated at more than 97%, and cases of disagreement between the two measurements were identified. The study demonstrated the feasibility of reducing the time devoted to pretreatment controls by substituting the ionisation chamber measurements with those performed with the EPID, and showed that statistical process control monitoring of the data brings an additional guarantee of security. 2010 Société française de radiothérapie oncologique (SFRO). Published by Elsevier SAS. All rights reserved.
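
    A hedged sketch of the two tools named above on invented per-beam dose deviations (in %): three-sigma control limits for the follow-up chart, and the capability indices Cp and Cpk against tolerance limits, which here are illustrative rather than the centre's clinical values.

        # Control limits and capability indices for pretreatment QA deviations.
        import numpy as np

        def capability(values, lower_tol, upper_tol):
            values = np.asarray(values, dtype=float)
            mu, sigma = values.mean(), values.std(ddof=1)
            cp = (upper_tol - lower_tol) / (6 * sigma)                # potential capability
            cpk = min(upper_tol - mu, mu - lower_tol) / (3 * sigma)   # accounts for centring
            return cp, cpk

        deviations = [0.6, -0.4, 1.1, 0.2, -0.8, 0.9, 0.3, -0.2, 1.4, 0.1]
        mu, sigma = np.mean(deviations), np.std(deviations, ddof=1)
        print("control limits:", round(mu - 3 * sigma, 2), round(mu + 3 * sigma, 2))
        print("Cp, Cpk vs +/-4% tolerance:", capability(deviations, -4.0, 4.0))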

  17. Characterization and optimization of cell seeding in scaffolds by factorial design: quality by design approach for skeletal tissue engineering.

    PubMed

    Chen, Yantian; Bloemen, Veerle; Impens, Saartje; Moesen, Maarten; Luyten, Frank P; Schrooten, Jan

    2011-12-01

    Cell seeding into scaffolds plays a crucial role in the development of efficient bone tissue engineering constructs. Hence, it becomes imperative to identify the key factors that quantitatively predict reproducible and efficient seeding protocols. In this study, the optimization of a cell seeding process was investigated using design of experiments (DOE) statistical methods. Five seeding factors (cell type, scaffold type, seeding volume, seeding density, and seeding time) were selected and investigated by means of two response parameters, critically related to the cell seeding process: cell seeding efficiency (CSE) and cell-specific viability (CSV). In addition, cell spatial distribution (CSD) was analyzed by Live/Dead staining assays. Analysis identified a number of statistically significant main factor effects and interactions. Among the five seeding factors, only seeding volume and seeding time significantly affected CSE and CSV. Also, cell and scaffold type were involved in the interactions with other seeding factors. Within the investigated ranges, optimal conditions in terms of CSV and CSD were obtained when seeding cells in a regular scaffold with an excess of medium. The results of this case study contribute to a better understanding and definition of optimal process parameters for cell seeding. A DOE strategy can identify and optimize critical process variables to reduce the variability and assists in determining which variables should be carefully controlled during good manufacturing practice production to enable a clinically relevant implant.

  18. Benefits of Outsourcing Strategy and IT Technology in Clinical Trials.

    PubMed

    Stamenovic, Milorad; Dobraca, Amra

    2017-09-01

    The aim of this paper is to describe some models of outsourcing (they are numerous, and the response to different types of risk and the improvement of quality depend on the individual problem and situation). Deciding whether or not to outsource, and whether to build or buy new information technology (IT), is a question for contract research organizations (CROs) and pharmaceutical companies dealing with clinical trials, so the aim of this paper is also to show a business model that could make the decision-making process less time consuming, less segmented and more efficient. This paper has a descriptive character and represents a review of the literature that deals with the described issues. Outsourcing should enable optimal capacity flexibility (outsourcing of technology should be done only to the optimal extent, not entirely). The goal with CRO partners is to establish equivalent levels of global quality, as extensions of other research and development activities (by unifying the performance standards of alliance partners with the best standards of the industry). IT is gaining greater significance at each stage of a clinical study and represents an indispensable element of the quality of a clinical study (for monitoring of clinical site activities, data collection and management, medical monitoring, statistical programming, statistical analysis, and clinical study reporting). CROs are able to maximize work within CRO global development, to support the notion of a fully integrated outsourced company, facilitate the use of similar business processes and norms, reuse established CRO standards, and improve CRO operational decision making within outsourced studies by providing consistent and current information across outsourced and in-house activities.

  19. Benefits of Outsourcing Strategy and IT Technology in Clinical Trials

    PubMed Central

    Stamenovic, Milorad; Dobraca, Amra

    2017-01-01

    Introduction: The aim of this paper is to describe some models of outsourcing (they are numerous, and the response to different types of risk and the improvement of quality depend on the individual problem and situation). Deciding whether or not to outsource, and whether to build or buy new information technology (IT), is a question for contract research organizations (CROs) and pharmaceutical companies dealing with clinical trials, so the aim of this paper is also to show a business model that could make the decision-making process less time consuming, less segmented and more efficient. Material and methods: This paper has a descriptive character and represents a review of the literature that deals with the described issues. Results: Outsourcing should enable optimal capacity flexibility (outsourcing of technology should be done only to the optimal extent, not entirely). The goal with CRO partners is to establish equivalent levels of global quality, as extensions of other research and development activities (by unifying the performance standards of alliance partners with the best standards of the industry). IT is gaining greater significance at each stage of a clinical study and represents an indispensable element of the quality of a clinical study (for monitoring of clinical site activities, data collection and management, medical monitoring, statistical programming, statistical analysis, and clinical study reporting). Conclusion: CROs are able to maximize work within CRO global development, to support the notion of a fully integrated outsourced company, facilitate the use of similar business processes and norms, reuse established CRO standards, and improve CRO operational decision making within outsourced studies by providing consistent and current information across outsourced and in-house activities. PMID:29114116

  20. A memetic optimization algorithm for multi-constrained multicast routing in ad hoc networks

    PubMed Central

    Hammad, Karim; El Bakly, Ahmed M.

    2018-01-01

    A mobile ad hoc network is a conventional self-configuring network where the routing optimization problem—subject to various Quality-of-Service (QoS) constraints—represents a major challenge. Unlike previously proposed solutions, in this paper, we propose a memetic algorithm (MA) employing an adaptive mutation parameter, to solve the multicast routing problem with higher search ability and computational efficiency. The proposed algorithm utilizes an updated scheme, based on statistical analysis, to estimate the best values for all MA parameters and enhance MA performance. The numerical results show that the proposed MA improved the delay and jitter of the network, while reducing computational complexity as compared to existing algorithms. PMID:29509760

  1. A memetic optimization algorithm for multi-constrained multicast routing in ad hoc networks.

    PubMed

    Ramadan, Rahab M; Gasser, Safa M; El-Mahallawy, Mohamed S; Hammad, Karim; El Bakly, Ahmed M

    2018-01-01

    A mobile ad hoc network is a conventional self-configuring network where the routing optimization problem-subject to various Quality-of-Service (QoS) constraints-represents a major challenge. Unlike previously proposed solutions, in this paper, we propose a memetic algorithm (MA) employing an adaptive mutation parameter, to solve the multicast routing problem with higher search ability and computational efficiency. The proposed algorithm utilizes an updated scheme, based on statistical analysis, to estimate the best values for all MA parameters and enhance MA performance. The numerical results show that the proposed MA improved the delay and jitter of the network, while reducing computational complexity as compared to existing algorithms.
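
    A compact, illustrative memetic algorithm (genetic operators plus a hill-climbing local search) with a simple adaptive mutation rate, applied to a toy binary objective rather than the paper's QoS multicast routing problem; the parameter values and adaptation rule are our own.

        # Memetic algorithm sketch: GA + local search + adaptive mutation.
        import random

        def memetic(objective, n_bits=40, pop_size=20, generations=60, seed=0):
            rng = random.Random(seed)
            pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
            mut_rate = 1.0 / n_bits
            best = max(pop, key=objective)
            for _ in range(generations):
                new_pop = []
                for _ in range(pop_size):
                    a, b = rng.sample(pop, 2)
                    p1, p2 = max(a, b, key=objective), min(a, b, key=objective)
                    cut = rng.randrange(1, n_bits)
                    child = p1[:cut] + p2[cut:]                      # one-point crossover
                    child = [bit ^ (rng.random() < mut_rate) for bit in child]
                    for i in range(n_bits):                          # memetic step: local search
                        flipped = child[:]
                        flipped[i] ^= 1
                        if objective(flipped) > objective(child):
                            child = flipped
                    new_pop.append(child)
                improved = max(new_pop, key=objective)
                mut_rate = (min(0.2, mut_rate * 1.5)                 # explore more when stuck
                            if objective(improved) <= objective(best)
                            else max(1.0 / n_bits, mut_rate / 1.5))
                best = max(best, improved, key=objective)
                pop = new_pop
            return best, objective(best)

        score = lambda bits: sum(bits[i] != bits[i + 1] for i in range(len(bits) - 1))
        print(memetic(score)[1])           # best alternation score found (maximum is 39)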

  2. Space-filling designs for computer experiments: A review

    DOE PAGES

    Joseph, V. Roshan

    2016-01-29

    Improving the quality of a product/process using a computer simulator is a much less expensive option than the real physical testing. However, simulation using computationally intensive computer models can be time consuming and therefore, directly doing the optimization on the computer simulator can be infeasible. Experimental design and statistical modeling techniques can be used for overcoming this problem. This article reviews experimental designs known as space-filling designs that are suitable for computer simulations. In the review, a special emphasis is given to a recently developed space-filling design called maximum projection design. Furthermore, its advantages are illustrated using a simulation conducted for optimizing a milling process.
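
    A hedged sketch of one simple space-filling construction (a maximin Latin hypercube chosen from random candidates); the maximum projection design emphasized in the review optimizes a different criterion, so this is only meant to illustrate the kind of design being reviewed.

        # Maximin Latin hypercube design by random search.
        import numpy as np

        def maximin_lhs(n_runs, n_factors, n_candidates=200, seed=0):
            rng = np.random.default_rng(seed)

            def lhs():
                # one stratified, independently permuted sample per factor
                ranks = rng.permuted(np.tile(np.arange(n_runs), (n_factors, 1)), axis=1).T
                return (ranks + rng.uniform(size=(n_runs, n_factors))) / n_runs

            def min_pairwise_distance(x):
                d = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
                return d[np.triu_indices(n_runs, k=1)].min()

            return max((lhs() for _ in range(n_candidates)), key=min_pairwise_distance)

        design = maximin_lhs(n_runs=10, n_factors=3)
        print(np.round(design, 3))           # 10 runs x 3 factors on [0, 1]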

  3. Space-filling designs for computer experiments: A review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joseph, V. Roshan

    Improving the quality of a product/process using a computer simulator is a much less expensive option than the real physical testing. However, simulation using computationally intensive computer models can be time consuming and therefore, directly doing the optimization on the computer simulator can be infeasible. Experimental design and statistical modeling techniques can be used for overcoming this problem. This article reviews experimental designs known as space-filling designs that are suitable for computer simulations. In the review, a special emphasis is given to a recently developed space-filling design called maximum projection design. Furthermore, its advantages are illustrated using a simulation conducted for optimizing a milling process.

  4. Derivative Free Optimization of Complex Systems with the Use of Statistical Machine Learning Models

    DTIC Science & Technology

    2015-09-12

    Report AFRL-AFOSR-VA-TR-2015-0278: Derivative Free Optimization of Complex Systems with the Use of Statistical Machine Learning Models. Principal investigator: Katya Scheinberg. Grant FA9550-11-1-0239. Subject terms: optimization, derivative-free optimization, statistical machine learning.

  5. Optimization of T2-weighted imaging for shoulder magnetic resonance arthrography by synthetic magnetic resonance imaging.

    PubMed

    Lee, Seung Hyun; Lee, Young Han; Hahn, Seok; Yang, Jaemoon; Song, Ho-Taek; Suh, Jin-Suck

    2017-01-01

    Background: Synthetic magnetic resonance imaging (MRI) allows reformatting of various synthetic images by adjustment of scanning parameters such as repetition time (TR) and echo time (TE). Optimized MR images can be reformatted from T1, T2, and proton density (PD) values to achieve maximum tissue contrast between joint fluid and adjacent soft tissue. Purpose: To demonstrate the method for optimization of TR and TE by synthetic MRI and to validate the optimized images by comparison with conventional shoulder MR arthrography (MRA) images. Material and Methods: Thirty-seven shoulder MRA images acquired by synthetic MRI were retrospectively evaluated for PD, T1, and T2 values at the joint fluid and glenoid labrum. Differences in signal intensity between the fluid and labrum were observed between TR of 500-6000 ms and TE of 80-300 ms in T2-weighted (T2W) images. Conventional T2W and synthetic images were analyzed for diagnostic agreement of supraspinatus tendon abnormalities (kappa statistics) and image quality scores (one-way analysis of variance with post-hoc analysis). Results: Optimized mean values of TR and TE were 2724.7 ± 1634.7 ms and 80.1 ± 0.4 ms, respectively. Diagnostic agreement for supraspinatus tendon abnormalities between conventional and synthetic MR images was excellent (κ = 0.882). The mean image quality score of the joint space in optimized synthetic images was significantly higher compared with those in conventional and synthetic images (2.861 ± 0.351 vs. 2.556 ± 0.607 vs. 2.750 ± 0.439; P < 0.05). Conclusion: Synthetic MRI with optimized TR and TE for shoulder MRA enables optimization of soft-tissue contrast.

  6. Quantification of the effects of quality investment on the Cost of Poor Quality: A quasi-experimental study

    NASA Astrophysics Data System (ADS)

    Tamimi, Abdallah Ibrahim

    Quality management is a fundamental challenge facing businesses. This research attempted to quantify the effect of quality investment on the Cost of Poor Quality (COPQ) in an aerospace company utilizing 3 years of quality data at United Launch Alliance, a Boeing -- Lockheed Martin Joint Venture Company. Statistical analysis tools, like multiple regressions, were used to quantify the relationship between quality investments and COPQ. Strong correlations were evident by the high correlation coefficient R2 and very small p-values in multiple regression analysis. The models in the study helped produce an Excel macro that based on preset constraints, optimized the level of quality spending to minimize COPQ. The study confirmed that as quality investments were increased, the COPQ decreased steadily until a point of diminishing return was reached. The findings may be used to develop an approach to reduce the COPQ and enhance product performance. Achieving superior quality in rocket launching enhances the accuracy, reliability, and mission success of delivering satellites to their precise orbits in pursuit of knowledge, peace, and freedom while assuring safety for the end user.

  7. Investigating the impact of design characteristics on statistical efficiency within discrete choice experiments: A systematic survey.

    PubMed

    Vanniyasingam, Thuva; Daly, Caitlin; Jin, Xuejing; Zhang, Yuan; Foster, Gary; Cunningham, Charles; Thabane, Lehana

    2018-06-01

    This study reviews simulation studies of discrete choice experiments to determine (i) how survey design features affect statistical efficiency and (ii) to appraise their reporting quality. Statistical efficiency was measured using relative design (D-) efficiency, D-optimality, or D-error. For this systematic survey, we searched Journal Storage (JSTOR), ScienceDirect, PubMed, and OVID (which included a search within EMBASE). Searches were conducted up to the year 2016 for simulation studies investigating the impact of DCE design features on statistical efficiency. Studies were screened and data were extracted independently and in duplicate. Results for each included study were summarized by design characteristic. Previously developed criteria for reporting quality of simulation studies were also adapted and applied to each included study. Of 371 potentially relevant studies, 9 were found to be eligible, with several varying in study objectives. Statistical efficiency improved when increasing the number of choice tasks or alternatives; decreasing the number of attributes and attribute levels; using an unrestricted continuous "manipulator" attribute; using model-based approaches with covariates incorporating response behaviour; using sampling approaches that incorporate previous knowledge of response behaviour; incorporating heterogeneity in a model-based design; correctly specifying Bayesian priors; minimizing parameter prior variances; and using an appropriate method to create the DCE design for the research question. The simulation studies performed well in terms of reporting quality. Improvement is needed with regard to clearly specifying study objectives, the number of failures, random number generators, starting seeds, and the software used. These results identify the best approaches to structure a DCE. An investigator can manipulate design characteristics to help reduce response burden and increase statistical efficiency. Since studies varied in their objectives, conclusions were made on several design characteristics; however, the validity of each conclusion was limited. Further research should be conducted to explore all conclusions in various design settings and scenarios. Additional reviews to explore other statistical efficiency outcomes and databases can also be performed to enhance the conclusions identified from this review.
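
    As a hedged illustration of the efficiency measure itself (the design, coding and priors below are invented), the D-error of a multinomial-logit DCE design can be computed from the information matrix accumulated over choice sets; lower values indicate a more statistically efficient design.

        # D-error of a multinomial logit DCE design for assumed prior coefficients.
        import numpy as np

        def d_error(design, beta):
            """design: array of shape (choice_sets, alternatives, attributes)."""
            beta = np.asarray(beta, dtype=float)
            info = np.zeros((len(beta), len(beta)))
            for x in design:                                    # one choice set at a time
                utility = x @ beta
                p = np.exp(utility - utility.max())
                p /= p.sum()                                    # MNL choice probabilities
                info += x.T @ (np.diag(p) - np.outer(p, p)) @ x
            return np.linalg.det(info) ** (-1.0 / len(beta))

        rng = np.random.default_rng(0)
        design = rng.integers(0, 2, size=(8, 3, 4)).astype(float)   # 8 sets, 3 alts, 4 attrs
        print(d_error(design, beta=[0.5, -0.3, 0.8, 0.2]))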

  8. Advances for the Topographic Characterisation of SMC Materials

    PubMed Central

    Calvimontes, Alfredo; Grundke, Karina; Müller, Anett; Stamm, Manfred

    2009-01-01

    For a comprehensive study of Sheet Moulding Compound (SMC) surfaces, topographical data obtained by a contact-free optical method (chromatic aberration confocal imaging) were systematically acquired to characterise these surfaces with regard to their statistical, functional and volumetrical properties. Optimal sampling conditions (cut-off length and resolution) were obtained by a topographical-statistical procedure proposed in the present work. By using different length scales specific morphologies due to the influence of moulding conditions, metallic mould topography, glass fibre content and glass fibre orientation can be characterized. The aim of this study is to suggest a systematic topographical characterization procedure for composite materials in order to study and recognize the influence of production conditions on their surface quality.

  9. Assay optimization: a statistical design of experiments approach.

    PubMed

    Altekar, Maneesha; Homon, Carol A; Kashem, Mohammed A; Mason, Steven W; Nelson, Richard M; Patnaude, Lori A; Yingling, Jeffrey; Taylor, Paul B

    2007-03-01

    With the transition from manual to robotic HTS in the last several years, assay optimization has become a significant bottleneck. Recent advances in robotic liquid handling have made it feasible to reduce assay optimization timelines with the application of statistically designed experiments. When implemented, they can efficiently optimize assays by rapidly identifying significant factors, complex interactions, and nonlinear responses. This article focuses on the use of statistically designed experiments in assay optimization.

  10. Determination of optimal imaging settings for urolithiasis CT using filtered back projection (FBP), statistical iterative reconstruction (IR) and knowledge-based iterative model reconstruction (IMR): a physical human phantom study

    PubMed Central

    Choi, Se Y; Ahn, Seung H; Choi, Jae D; Kim, Jung H; Lee, Byoung-Il; Kim, Jeong-In

    2016-01-01

    Objective: The purpose of this study was to compare CT image quality for evaluating urolithiasis using filtered back projection (FBP), statistical iterative reconstruction (IR) and knowledge-based iterative model reconstruction (IMR) according to various scan parameters and radiation doses. Methods: A 5 × 5 × 5 mm3 uric acid stone was placed in a physical human phantom at the level of the pelvis. 3 tube voltages (120, 100 and 80 kV) and 4 current–time products (100, 70, 30 and 15 mAs) were implemented in 12 scans. Each scan was reconstructed with FBP, statistical IR (Levels 5–7) and knowledge-based IMR (soft-tissue Levels 1–3). The radiation dose, objective image quality and signal-to-noise ratio (SNR) were evaluated, and subjective assessments were performed. Results: The effective doses ranged from 0.095 to 2.621 mSv. Knowledge-based IMR showed better objective image noise and SNR than did FBP and statistical IR. The subjective image noise of FBP was worse than that of statistical IR and knowledge-based IMR. The subjective assessment scores deteriorated after a break point of 100 kV and 30 mAs. Conclusion: At the setting of 100 kV and 30 mAs, the radiation dose can be decreased by approximately 84% while keeping the subjective image assessment. Advances in knowledge: Patients with urolithiasis can be evaluated with ultralow-dose non-enhanced CT using a knowledge-based IMR algorithm at a substantially reduced radiation dose with the imaging quality preserved, thereby minimizing the risks of radiation exposure while providing clinically relevant diagnostic benefits for patients. PMID:26577542

  11. Comparison of the progressive resolution optimizer and photon optimizer in VMAT optimization for stereotactic treatments.

    PubMed

    Liu, Han; Sintay, Benjamin; Pearman, Keith; Shang, Qingyang; Hayes, Lane; Maurer, Jacqueline; Vanderstraeten, Caroline; Wiant, David

    2018-05-20

    The photon optimization (PO) algorithm was recently released by Varian Medical Systems to improve volumetric modulated arc therapy (VMAT) optimization within Eclipse (Version 13.5). The purpose of this study is to compare the PO algorithm with its predecessor, progressive resolution optimizer (PRO) for lung SBRT and brain SRS treatments. A total of 30 patients were selected retrospectively. Previously, all the plans were generated with the PRO algorithm within Eclipse Version 13.6. In the new version of PO algorithm (Version 15), dynamic conformal arcs (DCA) were first conformed to the target, then VMAT inverse planning was performed to achieve the desired dose distributions. PTV coverages were forced to be identical for the same patient for a fair comparison. SBRT plan quality was assessed based on selected dose-volume parameters, including the conformity index, V 20 for lung, V 30 Gy for chest wall, and D 0.035 cc for other critical organs. SRS plan quality was evaluated based on the conformity index and normal tissue volumes encompassed by the 12 and 6 Gy isodose lines (V 12 and V 6 ). The modulation complexity score (MCS) was used to compare plan complexity of two algorithms. No statistically significant differences between the PRO and PO algorithms were found for any of the dosimetric parameters studied, which indicates both algorithms produce comparable plan quality. Significant improvements in the gamma passing rate (increased from 97.0% to 99.2% for SBRT and 96.1% to 98.4% for SRS), MCS (average increase of 0.15 for SBRT and 0.10 for SRS), and delivery efficiency (MU reduction of 29.8% for SBRT and 28.3% for SRS) were found for the PO algorithm. MCS showed a strong correlation with the gamma passing rate, and an inverse correlation with total MUs used. The PO algorithm offers comparable plan quality to the PRO, while minimizing MLC complexity, thereby improving the delivery efficiency and accuracy. © 2018 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.

  12. Identifying Optimal Temporal Scale for the Correlation of AOD and Ground Measurements of PM2.5 to Improve the Model Performance in a Real-time Air Quality Estimation System

    NASA Technical Reports Server (NTRS)

    Li, Hui; Faruque, Fazlay; Williams, Worth; Al-Hamdan, Mohammad; Luvall, Jeffrey C.; Crosson, William; Rickman, Douglas; Limaye, Ashutosh

    2009-01-01

    Aerosol optical depth (AOD), an indirect estimate of particulate matter from satellite observations, has shown great promise for improving estimates of the PM2.5 air quality surface. Currently, few studies have been conducted to explore the optimal way to apply AOD data to improve the accuracy of PM2.5 surface estimation in a real-time air quality system. We believe that two major aspects are worthy of consideration in that area: 1) the approach used to integrate satellite measurements with ground measurements in the pollution estimate, and 2) the identification of an optimal temporal scale for calculating the correlation of AOD and ground measurements. This paper focuses on the second aspect: identifying the optimal temporal scale for correlating AOD with PM2.5. The following five temporal scales were chosen to evaluate their impact on model performance: 1) within the last 3 days, 2) within the last 10 days, 3) within the last 30 days, 4) within the last 90 days, and 5) the time period with the highest correlation in a year. The model performance is evaluated for its accuracy, bias, and errors based on the following selected statistics: the Mean Bias, the Normalized Mean Bias, the Root Mean Square Error, the Normalized Mean Error, and the Index of Agreement. This research shows that the model using the temporal scale of the last 30 days displays the best performance in this study area using the 2004 and 2005 data sets.
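
    A minimal sketch of the performance statistics listed above (Mean Bias, Normalized Mean Bias, Root Mean Square Error, Normalized Mean Error, Index of Agreement), as they might be applied to predicted versus observed PM2.5; the numbers are illustrative only.

        # Standard model-evaluation statistics for paired predictions and observations.
        import numpy as np

        def performance_stats(pred, obs):
            pred, obs = np.asarray(pred, float), np.asarray(obs, float)
            diff = pred - obs
            mb = diff.mean()
            nmb = diff.sum() / obs.sum()
            rmse = np.sqrt((diff ** 2).mean())
            nme = np.abs(diff).sum() / obs.sum()
            ioa = 1 - (diff ** 2).sum() / (
                (np.abs(pred - obs.mean()) + np.abs(obs - obs.mean())) ** 2).sum()
            return {"MB": mb, "NMB": nmb, "RMSE": rmse, "NME": nme, "IOA": ioa}

        obs = [12.1, 18.4, 25.0, 9.7, 14.2, 30.5]
        pred = [13.0, 16.9, 27.2, 11.1, 13.0, 28.8]
        print(performance_stats(pred, obs))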

  13. Modeling the BOD of Danube River in Serbia using spatial, temporal, and input variables optimized artificial neural network models.

    PubMed

    Šiljić Tomić, Aleksandra N; Antanasijević, Davor Z; Ristić, Mirjana Đ; Perić-Grujić, Aleksandra A; Pocajt, Viktor V

    2016-05-01

    This paper describes the application of artificial neural network models for the prediction of biological oxygen demand (BOD) levels in the Danube River. Eighteen regularly monitored water quality parameters at 17 stations on the river stretch passing through Serbia were used as input variables. The optimization of the model was performed in three consecutive steps: firstly, the spatial influence of a monitoring station was examined; secondly, the monitoring period necessary to reach satisfactory performance was determined; and lastly, correlation analysis was applied to evaluate the relationship among water quality parameters. Root-mean-square error (RMSE) was used to evaluate model performance in the first two steps, whereas in the last step, multiple statistical indicators of performance were utilized. As a result, two optimized models were developed, a general regression neural network model (labeled GRNN-1) that covers the monitoring stations from the Danube inflow to the city of Novi Sad and a GRNN model (labeled GRNN-2) that covers the stations from the city of Novi Sad to the border with Romania. Both models demonstrated good agreement between the predicted and actually observed BOD values.
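
    A general regression neural network is, at its core, a kernel-weighted average of training targets. The sketch below is a minimal GRNN predictor, not the authors' implementation; the water-quality inputs, BOD targets, and the smoothing parameter sigma are illustrative assumptions.

```python
import numpy as np

def grnn_predict(X_train, y_train, X_query, sigma=0.5):
    """Minimal general regression neural network (GRNN): a Nadaraya-Watson
    kernel-weighted average of training targets with a Gaussian kernel."""
    X_train = np.asarray(X_train, float)
    y_train = np.asarray(y_train, float)
    preds = []
    for x in np.atleast_2d(X_query):
        d2 = ((X_train - x) ** 2).sum(axis=1)      # squared distances to patterns
        w = np.exp(-d2 / (2.0 * sigma ** 2))       # pattern-layer activations
        preds.append((w @ y_train) / w.sum())      # summation / output layer
    return np.array(preds)

# Hypothetical example: two water-quality inputs predicting BOD (mg/L)
X = [[7.1, 8.5], [7.4, 6.9], [6.9, 9.1], [7.8, 5.2]]
y = [3.2, 4.1, 2.9, 5.0]
print(grnn_predict(X, y, [[7.3, 7.0]], sigma=0.8))
```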

  14. A System-Oriented Approach for the Optimal Control of Process Chains under Stochastic Influences

    NASA Astrophysics Data System (ADS)

    Senn, Melanie; Schäfer, Julian; Pollak, Jürgen; Link, Norbert

    2011-09-01

    Process chains in manufacturing consist of multiple connected processes modeled as dynamic systems. The properties of a product passing through such a process chain are influenced by the transformation of each single process. There exist various methods for the control of individual processes, such as classical state controllers from cybernetics or function mapping approaches realized by statistical learning. These controllers ensure that a desired state is obtained at process end despite variations in the input and disturbances. The interactions between the single processes are thereby neglected, but they play an important role in the optimization of the entire process chain. We divide the overall optimization into two phases: (1) the solution of the optimization problem by Dynamic Programming to find the optimal control variable values for each process for any encountered end state of its predecessor and (2) the application of the optimal control variables at runtime for the detected initial process state. The optimization problem is solved by selecting adequate control variables for each process in the chain backwards based on predefined quality requirements for the final product. For the demonstration of the proposed concept, we have chosen a process chain from sheet metal manufacturing with simplified transformation functions.
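
    The two-phase idea can be illustrated with a toy backward dynamic program. This is only a sketch under assumed state discretization, a made-up process model, and simple quadratic cost terms, not the authors' formulation.

```python
import numpy as np

# Phase 1: backward dynamic programming tabulates, for every discretized
# intermediate state of each process, the control minimizing control effort
# plus downstream cost. Phase 2 is a runtime table lookup.
states = np.linspace(0.0, 1.0, 21)        # discretized product state
controls = np.linspace(-0.2, 0.2, 9)      # candidate control values
n_proc = 3                                 # processes in the chain
target = 0.8                               # required final product state

def transform(x, u):                       # assumed process model: state + control, clipped
    return float(np.clip(x + u, 0.0, 1.0))

cost_to_go = {s: (s - target) ** 2 for s in states}     # terminal quality penalty
policy = []                                              # policy[k][state] -> best control
for k in reversed(range(n_proc)):                        # backward recursion
    best_u, new_ctg = {}, {}
    for s in states:
        costs = [0.1 * u ** 2
                 + cost_to_go[min(states, key=lambda t: abs(t - transform(s, u)))]
                 for u in controls]                      # effort + downstream cost
        i = int(np.argmin(costs))
        best_u[s], new_ctg[s] = controls[i], costs[i]
    policy.insert(0, best_u)
    cost_to_go = new_ctg

# Phase 2: at runtime, look up the optimal control for the detected state
x = 0.35
for k in range(n_proc):
    u = policy[k][min(states, key=lambda t: abs(t - x))]
    x = transform(x, u)
print("final state:", round(x, 3))
```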

  15. Optimization image of magnetic resonance imaging (MRI) T2 fast spin echo (FSE) with variation echo train length (ETL) on the rupture tendon achilles case

    NASA Astrophysics Data System (ADS)

    Muzamil, Akhmad; Haries Firmansyah, Achmad

    2017-05-01

    This research optimized Magnetic Resonance Imaging (MRI) T2 Fast Spin Echo (FSE) images with varying Echo Train Length (ETL) for the Achilles tendon rupture case. The study aims to determine how varying the ETL affects ankle MRI images and which ETL value produces the optimal image. ETL values from 12 to 20 in steps of 2 were used with sagittal T2 FSE weighting. The influence of ETL on the quality of sagittal ankle MRI images was assessed in 25 images from five patients. Data analysis was performed quantitatively with Region of Interest (ROI) measurements taken directly on the MRI image planes, followed by statistical tests of the Signal to Noise Ratio (SNR) and Contrast to Noise Ratio (CNR). SNR was highest in fat tissue, while CNR was highest in tendon-fat tissue with ETL 12 in two patients. The statistical tests showed SNR significance values of 0.007 (p<0.05) for tendon tissue, 0.364 (p>0.05) for fat, 0.912 (p>0.05) for the fibula, and 0.436 (p>0.05) for the heel bone. For CNR of tendon-fat tissue, the value was about 0.041 (p<0.05). The results showed that ETL variation with sagittal T2 FSE weighting produced differences in tendon tissue and tendon-fat tissue for MRI image quality. SNR and CNR are important aspects of the imaging optimization process for providing diagnostic information.
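
    SNR and CNR from ROI statistics are commonly computed along the following lines; the definitions used here (mean tissue signal over the standard deviation of a background/noise ROI) and the pixel values are assumptions for illustration, not values from the study.

```python
import numpy as np

def snr(roi_signal, roi_background):
    """SNR as mean signal in a tissue ROI divided by the standard deviation
    of a background (noise) ROI -- one common definition among several."""
    return np.mean(roi_signal) / np.std(roi_background)

def cnr(roi_a, roi_b, roi_background):
    """CNR between two tissue ROIs (e.g., tendon vs. fat) relative to
    background noise."""
    return abs(np.mean(roi_a) - np.mean(roi_b)) / np.std(roi_background)

# Hypothetical pixel values extracted from ROIs on a T2 FSE image
tendon = np.array([112, 108, 115, 110])
fat = np.array([860, 845, 872, 858])
noise = np.array([4.1, -3.2, 2.5, -1.8, 3.0])
print("SNR(fat):", snr(fat, noise), "CNR(tendon-fat):", cnr(tendon, fat, noise))
```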

  16. An optimal merging technique for high-resolution precipitation products: OPTIMAL MERGING OF PRECIPITATION METHOD

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shrestha, Roshan; Houser, Paul R.; Anantharaj, Valentine G.

    2011-04-01

    Precipitation products are currently available from various sources at higher spatial and temporal resolution than any time in the past. Each of the precipitation products has its strengths and weaknesses in availability, accuracy, resolution, retrieval techniques and quality control. By merging the precipitation data obtained from multiple sources, one can improve its information content by minimizing these issues. However, precipitation data merging poses challenges of scale-mismatch, and accurate error and bias assessment. In this paper we present Optimal Merging of Precipitation (OMP), a new method to merge precipitation data from multiple sources that are of different spatial and temporal resolutions and accuracies. This method is a combination of scale conversion and merging weight optimization, involving performance-tracing based on Bayesian statistics and trend-analysis, which yields merging weights for each precipitation data source. The weights are optimized at multiple scales to facilitate multiscale merging and better precipitation downscaling. Precipitation data used in the experiment include products from the 12-km resolution North American Land Data Assimilation (NLDAS) system, the 8-km resolution CMORPH and the 4-km resolution National Stage-IV QPE. The test cases demonstrate that the OMP method is capable of identifying better data sources and allocating them higher priority in the merging procedure, dynamically over the region and time period. This method is also effective in filtering out poor-quality data introduced into the merging process.
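
    OMP derives its merging weights from Bayesian performance tracing and trend analysis; as a much simpler stand-in, the sketch below merges co-registered fields with weights inversely proportional to an assumed error variance per source, just to show how per-source weights enter the merge. The function name, grids, and error variances are illustrative assumptions.

```python
import numpy as np

def merge_precip(fields, error_vars):
    """Merge co-registered precipitation fields (same grid, same time step)
    with weights inversely proportional to each source's error variance.
    A simplified stand-in for OMP's Bayesian performance tracing."""
    fields = np.asarray(fields, float)            # shape: (n_sources, ny, nx)
    w = 1.0 / np.asarray(error_vars, float)       # higher weight = lower error
    w = w / w.sum()
    return np.tensordot(w, fields, axes=1)        # weighted average per grid cell

# Hypothetical 2x2 grids from NLDAS, CMORPH and Stage-IV, already rescaled
nldas  = [[2.0, 1.5], [0.0, 3.2]]
cmorph = [[2.4, 1.1], [0.2, 2.8]]
stage4 = [[2.1, 1.4], [0.1, 3.0]]
print(merge_precip([nldas, cmorph, stage4], error_vars=[1.0, 1.6, 0.4]))
```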

  17. The use of IRMS, (1)H NMR and chemical analysis to characterise Italian and imported Tunisian olive oils.

    PubMed

    Camin, Federica; Pavone, Anita; Bontempo, Luana; Wehrens, Ron; Paolini, Mauro; Faberi, Angelo; Marianella, Rosa Maria; Capitani, Donatella; Vista, Silvia; Mannina, Luisa

    2016-04-01

    Isotope Ratio Mass Spectrometry (IRMS), (1)H Nuclear Magnetic Resonance ((1)H NMR), conventional chemical analysis and chemometric elaboration were used to assess quality and to define and confirm the geographical origin of 177 Italian PDO (Protected Denomination of Origin) olive oils and 86 samples imported from Tunisia. Italian olive oils were richer in squalene and unsaturated fatty acids, whereas Tunisian olive oils showed higher values of δ(18)O, δ(2)H, linoleic acid, saturated fatty acids, β-sitosterol, and sn-1 and sn-3 diglycerides. Furthermore, all the imported Tunisian samples were of poor quality, with K232 and/or acidity values above the limits established for extra virgin olive oils. By combining isotopic composition with (1)H NMR data using a multivariate statistical approach, a statistical model able to discriminate olive oils from Italy from those imported from Tunisia was obtained, achieving a differentiation ability of around 98%. Copyright © 2015 Elsevier Ltd. All rights reserved.

  18. Optimization of laser butt welding parameters with multiple performance characteristics

    NASA Astrophysics Data System (ADS)

    Sathiya, P.; Abdul Jaleel, M. Y.; Katherasan, D.; Shanmugarajan, B.

    2011-04-01

    This paper presents a study carried out on 3.5 kW cooled slab laser welding of 904 L super austenitic stainless steel. The joints were butt welded with different shielding gases, namely argon, helium and nitrogen, at a constant flow rate. Super austenitic stainless steel (SASS) normally contains high amounts of Mo, Cr, Ni, N and Mn. The mechanical properties are controlled to obtain good welded joints. The quality of the joint is evaluated by studying the features of weld bead geometry, such as bead width (BW) and depth of penetration (DOP). In this paper, the tensile strength and bead profiles (BW and DOP) of laser welded butt joints made of AISI 904 L SASS are investigated. The Taguchi approach is used as a statistical design of experiment (DOE) technique for optimizing the selected welding parameters. Grey relational analysis and the desirability approach are applied to optimize the input parameters by considering multiple output variables simultaneously. Confirmation experiments have also been conducted for both analyses to validate the optimized parameters.
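
    Grey relational analysis reduces several weld responses to a single grade per experimental run. The sketch below shows the usual normalization, grey relational coefficient, and grade computation; the response values and the distinguishing coefficient zeta = 0.5 are illustrative assumptions, not data from the study.

```python
import numpy as np

def grey_relational_grade(responses, larger_better, zeta=0.5):
    """Grey relational analysis for multi-response optimization: normalize
    each response, compute grey relational coefficients against the ideal
    sequence, and average them into a single grade per experimental run."""
    X = np.asarray(responses, float)              # rows: runs, cols: responses
    norm = np.empty_like(X)
    for j, lb in enumerate(larger_better):
        lo, hi = X[:, j].min(), X[:, j].max()
        norm[:, j] = (X[:, j] - lo) / (hi - lo) if lb else (hi - X[:, j]) / (hi - lo)
    delta = 1.0 - norm                            # deviation from the ideal (= 1)
    coeff = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
    return coeff.mean(axis=1)                     # grey relational grade per run

# Hypothetical runs: [tensile strength MPa (max), bead width mm (min), penetration mm (max)]
runs = [[540, 1.8, 2.1], [565, 1.6, 2.4], [520, 2.0, 1.9]]
print(grey_relational_grade(runs, larger_better=[True, False, True]))
```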

  19. A data-driven approach to quality risk management.

    PubMed

    Alemayehu, Demissie; Alvir, Jose; Levenstein, Marcia; Nickerson, David

    2013-10-01

    An effective clinical trial strategy to ensure patient safety as well as trial quality and efficiency involves an integrated approach, including prospective identification of risk factors, mitigation of the risks through proper study design and execution, and assessment of quality metrics in real time. Such an integrated quality management plan may also be enhanced by using data-driven techniques to identify the risk factors that are most relevant in predicting quality issues associated with a trial. In this paper, we illustrate such an approach using data collected from actual clinical trials. Several statistical methods were employed, including the Wilcoxon rank-sum test and logistic regression, to identify associations between risk factors and the occurrence of quality issues, applied to data on the quality of clinical trials sponsored by Pfizer. Only a subset of the risk factors had a significant association with quality issues; these included whether the study used placebo, whether the agent was a biologic, unusual packaging label, complex dosing, and more than 25 planned procedures. Proper implementation of the strategy can help to optimize resource utilization without compromising trial integrity and patient safety.
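
    The two statistical methods named can be combined in a few lines using standard scientific Python libraries; the trial-level data below are entirely hypothetical and only illustrate testing a candidate risk factor against a binary quality-issue outcome.

```python
import numpy as np
from scipy.stats import ranksums
from sklearn.linear_model import LogisticRegression

# Hypothetical trial-level data: a continuous risk factor (planned procedures)
# and a binary outcome (quality issue observed in that trial)
procedures = np.array([12, 30, 18, 27, 35, 10, 22, 40, 15, 28])
issue      = np.array([ 0,  0,  0,  1,  1,  0,  1,  1,  0,  1])

# Wilcoxon rank-sum test: does the factor differ between trials with/without issues?
stat, p = ranksums(procedures[issue == 1], procedures[issue == 0])
print("rank-sum p-value:", round(p, 4))

# Logistic regression of quality issue on planned procedures (L2-regularized by default)
clf = LogisticRegression().fit(procedures.reshape(-1, 1), issue)
print("log-odds change per additional procedure:", clf.coef_[0][0])
```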

  20. Evaluation of image quality and radiation dose by adaptive statistical iterative reconstruction technique level for chest CT examination.

    PubMed

    Hong, Sun Suk; Lee, Jong-Woong; Seo, Jeong Beom; Jung, Jae-Eun; Choi, Jiwon; Kweon, Dae Cheol

    2013-12-01

    The purpose of this research is to determine the adaptive statistical iterative reconstruction (ASIR) level that enables optimal image quality and dose reduction in a chest computed tomography (CT) protocol with ASIR. A chest phantom was scanned at ASIR levels of 0-50 %, and the noise power spectrum (NPS), signal and noise, and the degree of distortion [peak signal-to-noise ratio (PSNR) and root-mean-square error (RMSE)] were measured. In addition, the objectivity of the experiment was verified using the American College of Radiology (ACR) phantom. Moreover, on a qualitative basis, the resolution, latitude and degree of distortion of five lesions in the chest phantom were evaluated and the results statistically compiled. The NPS value decreased as the frequency increased. The lowest noise and deviation were found at the 20 % ASIR level (mean 126.15 ± 22.21). In the distortion analysis, the signal-to-noise ratio and PSNR were highest at the 20 % ASIR level, at 31.0 and 41.52, respectively, while the maximum absolute error and RMSE showed the lowest values, 11.2 and 16. In the ACR phantom study, all ASIR levels were within the acceptance limits of the guidelines. The 20 % ASIR level also performed best in the qualitative evaluation of the five chest phantom lesions, with a resolution score of 4.3, latitude of 3.47 and degree of distortion of 4.25. The 20 % ASIR level proved to be the best in all experiments: noise, distortion evaluation using ImageJ and qualitative evaluation of the five chest phantom lesions. Therefore, optimal images as well as a reduced radiation dose can be achieved when a 20 % ASIR level is applied in thoracic CT.
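
    RMSE and PSNR between a test reconstruction and a reference image are straightforward to compute; the sketch below uses simulated images and one common PSNR convention (peak taken from the reference), which may differ from the exact definitions used in the study.

```python
import numpy as np

def rmse(img, ref):
    """Root-mean-square error between a test image and a reference image."""
    diff = np.asarray(img, float) - np.asarray(ref, float)
    return np.sqrt((diff ** 2).mean())

def psnr(img, ref, peak=None):
    """Peak signal-to-noise ratio in dB; peak defaults to the reference maximum."""
    ref = np.asarray(ref, float)
    peak = ref.max() if peak is None else peak
    return 20.0 * np.log10(peak / rmse(img, ref))

# Hypothetical: compare a noisier reconstruction against a reference slice
rng = np.random.default_rng(0)
reference = rng.uniform(0, 255, size=(64, 64))
test = reference + rng.normal(0, 5, size=(64, 64))   # simulated reconstruction noise
print(round(rmse(test, reference), 2), round(psnr(test, reference), 2), "dB")
```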

  1. Whole dietary patterns to optimize cognitive function for military mission-readiness: a systematic review and recommendations for the field.

    PubMed

    Teo, Lynn; Crawford, Cindy; Yehuda, Rachel; Jaghab, Danny; Bingham, John J; Gallon, Matthew D; O'Connell, Meghan L; Chittum, Holly K; Arzola, Sonya M; Berry, Kevin

    2017-06-01

    Optimizing cognitive performance, particularly during times of high stress, is a prerequisite to mission-readiness among military personnel. It has been of interest to determine whether such performance could be enhanced through diet. This systematic review assesses the quality of the evidence for whole dietary patterns across various outcomes related to cognitive function in healthy adult populations to develop research recommendations for the military. PubMed, CINAHL, Embase, PsycInfo, and the Cochrane Library were searched. Peer-reviewed randomized controlled trials published in the English language were eligible. Fifteen included trials were assessed for methodological quality, and descriptive data were extracted. Of the 6 acceptable-quality studies, 1 demonstrated statistically nonsignificant results, whereas the other 5 showed conflicting results across the cognitive outcomes assessed. Due to the heterogeneity across the included studies, no recommendations could be reached concerning whether certain whole dietary patterns have an effect on cognitive outcomes in healthy populations. Specific recommendations for future research are offered. © The Author(s) 2017. Published by Oxford University Press on behalf of the International Life Sciences Institute. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  2. Correlation of the clinical and physical image quality in chest radiography for average adults with a computed radiography imaging system

    PubMed Central

    Wood, T J; Beavis, A W; Saunderson, J R

    2013-01-01

    Objective: The purpose of this study was to examine the correlation between the quality of visually graded patient (clinical) chest images and a quantitative assessment of chest phantom (physical) images acquired with a computed radiography (CR) imaging system. Methods: The results of a previously published study, in which four experienced image evaluators graded computer-simulated postero-anterior chest images using a visual grading analysis scoring (VGAS) scheme, were used for the clinical image quality measurement. Contrast-to-noise ratio (CNR) and effective dose efficiency (eDE) were used as physical image quality metrics measured in a uniform chest phantom. Although optimal values of these physical metrics for chest radiography were not derived in this work, their correlation with VGAS in images acquired without an antiscatter grid across the diagnostic range of X-ray tube voltages was determined using Pearson’s correlation coefficient. Results: Clinical and physical image quality metrics increased with decreasing tube voltage. Statistically significant correlations between VGAS and CNR (R=0.87, p<0.033) and eDE (R=0.77, p<0.008) were observed. Conclusion: Medical physics experts may use the physical image quality metrics described here in quality assurance programmes and optimisation studies with a degree of confidence that they reflect the clinical image quality in chest CR images acquired without an antiscatter grid. Advances in knowledge: A statistically significant correlation has been found between the clinical and physical image quality in CR chest imaging. The results support the value of using CNR and eDE in the evaluation of quality in clinical thorax radiography. PMID:23568362

  3. Experimental design approach to the process parameter optimization for laser welding of martensitic stainless steels in a constrained overlap configuration

    NASA Astrophysics Data System (ADS)

    Khan, M. M. A.; Romoli, L.; Fiaschi, M.; Dini, G.; Sarri, F.

    2011-02-01

    This paper presents an experimental design approach to process parameter optimization for the laser welding of martensitic AISI 416 and AISI 440FSe stainless steels in a constrained overlap configuration in which the outer shell was 0.55 mm thick. To determine the optimal laser-welding parameters, a set of mathematical models was developed relating the welding parameters to each of the weld characteristics. These were validated both statistically and experimentally. The quality criteria set for the weld to determine the optimal parameters were the minimization of weld width and the maximization of weld penetration depth, resistance length and shearing force. Laser power and welding speed in the ranges 855-930 W and 4.50-4.65 m/min, respectively, with a fiber diameter of 300 μm were identified as the optimal set of process parameters. However, the laser power and welding speed can be reduced to 800-840 W and increased to 4.75-5.37 m/min, respectively, to obtain stronger and better welds.

  4. "Big Data" in Rheumatology: Intelligent Data Modeling Improves the Quality of Imaging Data.

    PubMed

    Landewé, Robert B M; van der Heijde, Désirée

    2018-05-01

    Analysis of imaging data in rheumatology is a challenge. Reliability of scores is an issue for several reasons. Signal-to-noise ratio of most imaging techniques is rather unfavorable (too little signal in relation to too much noise). Optimal use of all available data may help to increase credibility of imaging data, but knowledge of complicated statistical methodology and the help of skilled statisticians are required. Clinicians should appreciate the merits of sophisticated data modeling and liaise with statisticians to increase the quality of imaging results, as proper imaging studies in rheumatology imply more than a supersensitive imaging technique alone. Copyright © 2018 Elsevier Inc. All rights reserved.

  5. Evaluating the quality of a cell counting measurement process via a dilution series experimental design.

    PubMed

    Sarkar, Sumona; Lund, Steven P; Vyzasatya, Ravi; Vanguri, Padmavathy; Elliott, John T; Plant, Anne L; Lin-Gibson, Sheng

    2017-12-01

    Cell counting measurements are critical in the research, development and manufacturing of cell-based products, yet determining cell quantity with accuracy and precision remains a challenge. Validating and evaluating a cell counting measurement process can be difficult because of the lack of appropriate reference material. Here we describe an experimental design and statistical analysis approach to evaluate the quality of a cell counting measurement process in the absence of appropriate reference materials or reference methods. The experimental design is based on a dilution series study with replicate samples and observations as well as measurement process controls. The statistical analysis evaluates the precision and proportionality of the cell counting measurement process and can be used to compare the quality of two or more counting methods. As an illustration of this approach, cell counting measurement processes (automated and manual methods) were compared for a human mesenchymal stromal cell (hMSC) preparation. For the hMSC preparation investigated, results indicated that the automated method performed better than the manual counting methods in terms of precision and proportionality. By conducting well controlled dilution series experimental designs coupled with appropriate statistical analysis, quantitative indicators of repeatability and proportionality can be calculated to provide an assessment of cell counting measurement quality. This approach does not rely on the use of a reference material or comparison to "gold standard" methods known to have limited assurance of accuracy and precision. The approach presented here may help the selection, optimization, and/or validation of a cell counting measurement process. Published by Elsevier Inc.

  6. Data Acquisition and Preprocessing in Studies on Humans: What Is Not Taught in Statistics Classes?

    PubMed

    Zhu, Yeyi; Hernandez, Ladia M; Mueller, Peter; Dong, Yongquan; Forman, Michele R

    2013-01-01

    The aim of this paper is to address issues in research that may be missing from statistics classes and important for (bio-)statistics students. In the context of a case study, we discuss data acquisition and preprocessing steps that fill the gap between research questions posed by subject matter scientists and statistical methodology for formal inference. Issues include participant recruitment, data collection training and standardization, variable coding, data review and verification, data cleaning and editing, and documentation. Despite the critical importance of these details in research, most of these issues are rarely discussed in an applied statistics program. One reason for the lack of more formal training is the difficulty in addressing the many challenges that can possibly arise in the course of a study in a systematic way. This article can help to bridge this gap between research questions and formal statistical inference by using an illustrative case study for a discussion. We hope that reading and discussing this paper and practicing data preprocessing exercises will sensitize statistics students to these important issues and achieve optimal conduct, quality control, analysis, and interpretation of a study.

  7. Correlated Uncertainties in Radiation Shielding Effectiveness

    NASA Technical Reports Server (NTRS)

    Werneth, Charles M.; Maung, Khin Maung; Blattnig, Steve R.; Clowdsley, Martha S.; Townsend, Lawrence W.

    2013-01-01

    The space radiation environment is composed of energetic particles which can deliver harmful doses of radiation that may lead to acute radiation sickness, cancer, and even death for insufficiently shielded crew members. Spacecraft shielding must provide structural integrity and minimize the risk associated with radiation exposure. The risk of radiation exposure induced death (REID) is a measure of the risk of dying from cancer induced by radiation exposure. Uncertainties in the risk projection model, quality factor, and spectral fluence are folded into the calculation of the REID by sampling from probability distribution functions. Consequently, determining optimal shielding materials that reduce the REID in a statistically significant manner has been found to be difficult. In this work, the difference of the REID distributions for different materials is used to study the effect of composition on shielding effectiveness. It is shown that the use of correlated uncertainties allows for the determination of statistically significant differences between materials despite the large uncertainties in the quality factor. This is in contrast to previous methods where uncertainties have been generally treated as uncorrelated. It is concluded that the use of correlated quality factor uncertainties greatly reduces the uncertainty in the assessment of shielding effectiveness for the mitigation of radiation exposure.

  8. The Effect of Personalized Guideline-Concordant Treatment on Quality of Life and Functional Impairment in Bipolar Disorder

    PubMed Central

    Sylvia, Louisa G.; Rabideau, Dustin J.; Nierenberg, Andrew A.; Bowden, Charles L.; Friedman, Edward S.; Iosifescu, Dan V.; Thase, Michael E.; Ketter, Terence; Greiter, Elizabeth A.; Calabrese, Joseph R.; Leon, Andrew C.; Ostacher, Michael J.; Reilly-Harrington, Noreen

    2014-01-01

    Objectives The aims of this study were to evaluate correlates and predictors of life functioning and quality of life in bipolar disorder during a comparative effectiveness trial of moderate doses of lithium. Methods In the Lithium Treatment Moderate-dose Use Study (LiTMUS), 283 symptomatic outpatients with bipolar disorder type I or II were randomized to receive lithium plus optimal personalized treatment (OPT), or OPT alone. Participants were assessed using structured diagnostic interviews, clinician-rated blinded assessments, and questionnaires. We employed linear mixed effects models to test the effect of treatment overall and adjunct lithium specifically on quality of life or functioning. Similar models were used to examine the association of baseline demographics and clinical features with quality of life and life functioning. Results Quality of life and impaired functioning at baseline were associated with lower income, higher depressive severity, and more psychiatric comorbid conditions. Over six months, patients in both treatment groups improved in quality of life and life functioning (p-values < 0.0001), without a statistically significant difference between the two treatment groups (p-values > 0.05). Within the lithium group, improvements in quality of life and functioning were not associated with concurrent lithium levels at week 12 or week 24 (p-values > 0.05). Lower baseline depressive severity and younger age of onset predicted less improvement in functioning over six months. Conclusions Optimized care for bipolar disorder improves overall quality of life and life functioning, with no additional benefit from adjunct moderate doses of lithium. Illness burden and psychosocial stressors were associated with worse quality of life and lower functioning in individuals with bipolar disorder. PMID:25194782

  9. Modeling and optimization of dough recipe for breadsticks

    NASA Astrophysics Data System (ADS)

    Krivosheev, A. Yu; Ponomareva, E. I.; Zhuravlev, A. A.; Lukina, S. I.; Alekhina, N. N.

    2018-05-01

    In this work, the authors studied the combined effect of non-traditional raw materials on quality indicators of breadsticks, applying mathematical methods of experiment planning. The main factors chosen were the dosages of flaxseed flour and grape seed oil. The output parameters were the swelling factor of the products and their strength. Optimization of the formulation composition of the dough for breadsticks was carried out by experimental-statistical methods. As a result of the experiment, mathematical models were constructed in the form of regression equations adequately describing the process under study. The statistical processing of the experimental data was carried out using the Student, Cochran and Fisher criteria (with a confidence probability of 0.95). A mathematical interpretation of the regression equations was given. Optimization of the dough formulation was carried out by the method of undetermined Lagrange multipliers. The rational values of the factors were determined: a flaxseed flour dosage of 14.22% and a grape seed oil dosage of 7.8%, ensuring the production of products with the best combination of swelling ratio and strength. On the basis of the data obtained, a recipe and a method for the production of the breadsticks "Idea" were proposed (TU (Russian Technical Specifications) 9117-443-02068106-2017).

  10. Consistent integration of experimental and ab initio data into molecular and coarse-grained models

    NASA Astrophysics Data System (ADS)

    Vlcek, Lukas

    As computer simulations are increasingly used to complement or replace experiments, highly accurate descriptions of physical systems at different time and length scales are required to achieve realistic predictions. The questions of how to objectively measure model quality in relation to reference experimental or ab initio data, and how to transition seamlessly between different levels of resolution are therefore of prime interest. To address these issues, we use the concept of statistical distance to define a measure of similarity between statistical mechanical systems, i.e., a model and its target, and show that its minimization leads to general convergence of the systems' measurable properties. Through systematic coarse-graining, we arrive at appropriate expressions for optimization loss functions consistently incorporating microscopic ab initio data as well as macroscopic experimental data. The design of coarse-grained and multiscale models is then based on factoring the model system partition function into terms describing the system at different resolution levels. The optimization algorithm takes advantage of thermodynamic perturbation expressions for fast exploration of the model parameter space, enabling us to scan millions of parameter combinations per hour on a single CPU. The robustness and generality of the new model optimization framework and its efficient implementation are illustrated on selected examples including aqueous solutions, magnetic systems, and metal alloys.

  11. Can purchasing information be used to predict adherence to cardiovascular medications? An analysis of linked retail pharmacy and insurance claims data

    PubMed Central

    Krumme, Alexis A; Sanfélix-Gimeno, Gabriel; Franklin, Jessica M; Isaman, Danielle L; Mahesri, Mufaddal; Matlin, Olga S; Shrank, William H; Brennan, Troyen A; Brill, Gregory; Choudhry, Niteesh K

    2016-01-01

    Objective The use of retail purchasing data may improve adherence prediction over approaches using healthcare insurance claims alone. Design Retrospective. Setting and participants A cohort of patients who received prescription medication benefits through CVS Caremark, used a CVS Pharmacy ExtraCare Health Care (ECHC) loyalty card, and initiated a statin medication in 2011. Outcome We evaluated associations between retail purchasing patterns and optimal adherence to statins in the 12 subsequent months. Results Among 11 010 statin initiators, 43% were optimally adherent at 12 months of follow-up. Greater numbers of store visits per month and dollar amount per visit were positively associated with optimal adherence, as was making a purchase on the same day as filling a prescription (p<0.0001 for all). Models to predict adherence using retail purchase variables had low discriminative ability (C-statistic: 0.563), while models with both clinical and retail purchase variables achieved a C-statistic of 0.617. Conclusions While the use of retail purchases may improve the discriminative ability of claims-based approaches, these data alone appear inadequate for adherence prediction, even with the addition of more complex analytical approaches. Nevertheless, associations between retail purchasing behaviours and adherence could inform the development of quality improvement interventions. PMID:28186924

  12. An effective and optimal quality control approach for green energy manufacturing using design of experiments framework and evolutionary algorithm

    NASA Astrophysics Data System (ADS)

    Saavedra, Juan Alejandro

    Quality Control (QC) and Quality Assurance (QA) strategies vary significantly across industries in the manufacturing sector depending on the product being built. Such strategies range from simple statistical analysis and process controls to decision-making processes for reworking, repairing, or scrapping defective product. This study proposes an optimal QC methodology in order to include rework stations during the manufacturing process by identifying the number and location of these workstations. The factors that are considered to optimize these stations are cost, cycle time, reworkability and rework benefit. The goal is to minimize the cost and cycle time of the process, but increase the reworkability and rework benefit. The specific objectives of this study are: (1) to propose a cost estimation model that includes energy consumption, and (2) to propose an optimal QC methodology to identify the quantity and location of rework workstations. The cost estimation model includes energy consumption as part of the product direct cost. The cost estimation model developed allows the user to calculate product direct cost as the quality sigma level of the process changes. This provides a benefit because a complete cost estimation calculation does not need to be performed every time the process yield changes. This cost estimation model is then used for the QC strategy optimization process. In order to propose a methodology that provides an optimal QC strategy, the possible factors that affect QC were evaluated. A screening Design of Experiments (DOE) was performed on seven initial factors and identified three significant factors. It also indicated that one response variable was not required for the optimization process. A full factorial DOE was then performed to verify the significant factors obtained previously. The QC strategy optimization is performed through a Genetic Algorithm (GA), which allows the evaluation of several solutions in order to obtain feasible optimal solutions. The GA evaluates possible solutions based on cost, cycle time, reworkability and rework benefit. Finally, it provides several possible solutions because this is a multi-objective optimization problem. The solutions are presented as chromosomes that clearly state the number and location of the rework stations. The user analyzes these solutions in order to select one by deciding which of the four factors considered is most important depending on the product being manufactured or the company's objective. The major contribution of this study is to provide the user with a methodology to identify an effective and optimal QC strategy that incorporates the number and location of rework substations in order to minimize direct product cost and cycle time, and maximize reworkability and rework benefit.

  13. Welded joints integrity analysis and optimization for fiber laser welding of dissimilar materials

    NASA Astrophysics Data System (ADS)

    Ai, Yuewei; Shao, Xinyu; Jiang, Ping; Li, Peigen; Liu, Yang; Liu, Wei

    2016-11-01

    Dissimilar materials welded joints provide many advantages in the power, automotive, chemical, and spacecraft industries. The weld bead integrity, which is determined by the process parameters, plays a significant role in the welding quality during fiber laser welding (FLW) of dissimilar materials. In this paper, an optimization method that takes the integrity of the weld bead and the weld area into consideration is proposed for FLW of dissimilar materials, low carbon steel and stainless steel. The relationships between the weld bead integrity and the process parameters are developed by a genetic algorithm optimized back propagation neural network (GA-BPNN). The particle swarm optimization (PSO) algorithm is used to optimize the predicted outputs from the GA-BPNN for the objective. Through the optimization process, the desired weld bead with good integrity and minimum weld area is obtained, and the corresponding microstructure and microhardness are excellent. The mechanical properties of the optimized joints are greatly improved compared with those of the un-optimized welded joints. Moreover, the effects of the significant factors are analyzed based on a statistical approach, and the laser power (LP) is identified as the most significant factor for the weld bead integrity and weld area. The results indicate that the proposed method is effective for improving the reliability and stability of welded joints in practical production.

  14. Granulocyte-colony stimulating factor in the prevention of postoperative infectious complications and sub-optimal recovery from operation in patients with colorectal cancer and increased preoperative risk (ASA 3 and 4). Protocol for a controlled clinical trial developed by consensus of an international study group. Part two: design of the study.

    PubMed

    Bauhofer, A; Lorenz, W; Stinner, B; Rothmund, M; Koller, M; Sitter, H; Celik, I; Farndon, J R; Fingerhut, A; Hay, J M; Lefering, R; Lorijn, R; Nyström, P O; Schäfer, H; Schein, M; Solomkin, J; Troidl, H; Volk, H D; Wittmann, D H; Wyatt, J

    2001-04-01

    Presentation of a new type of study protocol for evaluation of the effectiveness of an immune modifier (rhG-CSF, filgrastim): prevention of postoperative infectious complications and of sub-optimal recovery from operation in patients with colorectal cancer and increased preoperative risk (ASA 3 and 4). This part describes the design of the randomised, placebo-controlled, double-blinded, single-centre study performed at a university hospital (n = 40 patients for each group). The trial design includes the following elements for a prototype protocol: * The study population is restricted to patients with colorectal cancer, including a left-sided resection and an increased perioperative risk (ASA 3 and 4). * Patients are allocated at random to the control or treatment group. * The double blinding strategy of the trial is assessed by psychometric indices. * An endpoint construct with quality of life (EORTC QLQ-C30) and a recovery index (modified Mc Peek index) are used as primary endpoints. Qualitative analysis of clinical relevance of the endpoints is performed by both patients and doctors. * Statistical analysis uses an area under the curve (AUC) model for improvement of quality of life on leaving hospital and two and six months after operation. A confirmatory statistical model with quality of life as the first primary endpoint in the hierarchic test procedure is used. Expectations of patients and surgeons and the negative affect are analysed by social psychological scales. This study design differs from other trials on preoperative prophylaxis and postoperative recovery, and has been developed to try a new concept and avoid previous failures.

  15. The effect of foot reflexology and back massage on hemodialysis patients' fatigue and sleep quality.

    PubMed

    Unal, Kevser Sevgi; Balci Akpinar, Reva

    2016-08-01

    The aim of this study is to examine the effectiveness of foot reflexology and back massage on optimizing the sleep quality and reducing the fatigue of hemodialysis patients. The study includes 105 volunteer patients who were registered at a private dialysis clinic and were receiving hemodialysis treatment. Foot reflexology and back massage were administered to the patients two times a week for four weeks. The Visual Analogue Scale for Fatigue and the Pittsburg Sleep Quality Index were used to collect data. The differences between the pretest and posttest score averages of the patients on the Visual Analogue Scale for Fatigue and the Pittsburg Sleep Quality Index were statistically significant (p < 0.001). Foot reflexology and back massage were shown to improve the sleep quality and reduce the fatigue of hemodialysis patients. Compared to back massage, foot reflexology was determined to be more effective. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. Single-case research design in pediatric psychology: considerations regarding data analysis.

    PubMed

    Cohen, Lindsey L; Feinstein, Amanda; Masuda, Akihiko; Vowles, Kevin E

    2014-03-01

    Single-case research allows for an examination of behavior and can demonstrate the functional relation between intervention and outcome in pediatric psychology. This review highlights key assumptions, methodological and design considerations, and options for data analysis. Single-case methodology and guidelines are reviewed with an in-depth focus on visual and statistical analyses. Guidelines allow for the careful evaluation of design quality and visual analysis. A number of statistical techniques have been introduced to supplement visual analysis, but to date, there is no consensus on their recommended use in single-case research design. Single-case methodology is invaluable for advancing pediatric psychology science and practice, and guidelines have been introduced to enhance the consistency, validity, and reliability of these studies. Experts generally agree that visual inspection is the optimal method of analysis in single-case design; however, statistical approaches are becoming increasingly evaluated and used to augment data interpretation.

  17. [Development and application of emergency medical information management system].

    PubMed

    Wang, Fang; Zhu, Baofeng; Chen, Jianrong; Wang, Jian; Gu, Chaoli; Liu, Buyun

    2011-03-01

    To meet the needs of clinical practice in rescuing critically ill patients, an information management system for emergency medicine was developed. Microsoft Visual FoxPro, one of Microsoft's visual programming tools, was used to develop the computer-aided system, including the information management system for emergency medicine. The system mainly consists of modules for statistical analysis, quality control of emergency rescue, emergency rescue workflow, nursing care in emergency rescue, and rescue training. It supports the systematic management of emergency medicine and the processing and analysis of emergency statistical data. The system is practical: it can optimize the emergency clinical pathway and meet the needs of clinical rescue.

  18. Compact disk error measurements

    NASA Technical Reports Server (NTRS)

    Howe, D.; Harriman, K.; Tehranchi, B.

    1993-01-01

    The objectives of this project are as follows: provide hardware and software that will perform simple, real-time, high-resolution (single-byte) measurement of the error burst and good data gap statistics seen by a photoCD player read channel when recorded CD write-once discs of variable quality (i.e., condition) are being read; extend the above system to enable measurement of the hard decision (i.e., 1-bit error flags) and soft decision (i.e., 2-bit error flags) decoding information that is produced/used by the Cross-Interleaved Reed-Solomon Code (CIRC) block decoder employed in the photoCD player read channel; construct a model that uses data obtained via the systems described above to produce meaningful estimates of output error rates (due to both uncorrected ECC words and misdecoded ECC words) when a CD disc having specific (measured) error statistics is read (completion date to be determined); and check the hypothesis that current adaptive CIRC block decoders are optimized for pressed (DAD/ROM) CD discs. If warranted, do a conceptual design of an adaptive CIRC decoder that is optimized for write-once CD discs.

  19. Optimizing fixed observational assets in a coastal observatory

    NASA Astrophysics Data System (ADS)

    Frolov, Sergey; Baptista, António; Wilkin, Michael

    2008-11-01

    Proliferation of coastal observatories necessitates an objective approach to the management of observational assets. In this article, we used our experience in the coastal observatory for the Columbia River estuary and plume to identify and address common problems in managing fixed observational assets, such as salinity, temperature, and water level sensors attached to pilings and moorings. Specifically, we addressed the following problems: assessing the quality of an existing array, adding stations to an existing array, removing stations from an existing array, validating an array design, and targeting an array toward data assimilation or monitoring. Our analysis was based on a combination of methods from the oceanographic and statistical literature, mainly on the statistical machinery of the best linear unbiased estimator. The key information required for our analysis was the covariance structure for a field of interest, which was computed from the output of assimilated and non-assimilated models of the Columbia River estuary and plume. The network optimization experiments in the Columbia River estuary and plume proved to be successful, largely withstanding the scrutiny of sensitivity and validation studies, and hence providing valuable insight into optimization and operation of the existing observational network. Our success in the Columbia River estuary and plume suggests that algorithms for optimal placement of sensors are reaching maturity and are likely to play a significant role in the design of emerging ocean observatories, such as the United States' Ocean Observatories Initiative (OOI) and Integrated Ocean Observing System (IOOS) observatories, and smaller regional observatories.
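
    The best linear unbiased estimator underlying such network-design analyses can be sketched as simple kriging from a model-derived covariance matrix; the covariance values and station indices below are illustrative assumptions, and array quality is summarized here by the remaining error variance at an unobserved location.

```python
import numpy as np

def blue_error_variance(C, obs_idx, target_idx):
    """Posterior error variance at a target location when a zero-mean field
    with covariance C is estimated from observations at obs_idx by the best
    linear unbiased estimator (simple kriging form)."""
    Coo = C[np.ix_(obs_idx, obs_idx)]              # covariance among observed stations
    Cot = C[np.ix_(obs_idx, [target_idx])]         # covariance observed-to-target
    w = np.linalg.solve(Coo, Cot)                  # BLUE weights
    return (C[target_idx, target_idx] - Cot.T @ w).item()

# Hypothetical covariance of salinity at 4 candidate stations (from model output)
C = np.array([[1.0, 0.7, 0.4, 0.2],
              [0.7, 1.0, 0.6, 0.3],
              [0.4, 0.6, 1.0, 0.5],
              [0.2, 0.3, 0.5, 1.0]])

# Compare two candidate designs by how well they explain unobserved station 3
print(blue_error_variance(C, obs_idx=[0, 1], target_idx=3))
print(blue_error_variance(C, obs_idx=[1, 2], target_idx=3))
```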

  20. The Taguchi Method Application to Improve the Quality of a Sustainable Process

    NASA Astrophysics Data System (ADS)

    Titu, A. M.; Sandu, A. V.; Pop, A. B.; Titu, S.; Ciungu, T. C.

    2018-06-01

    Taguchi’s method has long been used to improve the quality of the analyzed processes and products. This research addresses an unusual situation, namely the modeling of technical parameters in a process intended to be sustainable, improving process quality and ensuring quality by means of an experimental research method. Modern experimental techniques can be applied in any field, and this study reflects the benefits of the interaction between agricultural sustainability principles and the application of Taguchi’s method. The experimental method used in this practical study combines engineering techniques with experimental statistical modeling to achieve rapid improvement of quality costs, in effect seeking optimization of existing processes and their main technical parameters. The paper is a purely technical study that demonstrates a technical experiment using the Taguchi method, considered an effective method since it allows rapid achievement of 70 to 90% of the desired optimization of the technical parameters. The missing 10 to 30 percent can be obtained with one or two complementary experiments, limited to 2 to 4 technical parameters that are considered to be the most influential. Applying the Taguchi method allowed the simultaneous study, in the same experiment, of the influence factors considered most important in different combinations, while at the same time determining each factor’s contribution.
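
    Taguchi analysis typically converts replicated responses into signal-to-noise ratios before ranking factor effects; the sketch below shows the standard larger-the-better and smaller-the-better formulas with made-up response values.

```python
import numpy as np

def sn_larger_the_better(y):
    """Taguchi signal-to-noise ratio (dB) for a larger-the-better response."""
    y = np.asarray(y, float)
    return -10.0 * np.log10(np.mean(1.0 / y ** 2))

def sn_smaller_the_better(y):
    """Taguchi signal-to-noise ratio (dB) for a smaller-the-better response."""
    y = np.asarray(y, float)
    return -10.0 * np.log10(np.mean(y ** 2))

# Hypothetical replicated responses for two factor-level combinations
print(sn_larger_the_better([78.2, 80.1, 79.5]))
print(sn_smaller_the_better([0.12, 0.09, 0.15]))
```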

  1. Implementing clinical protocols in oncology: quality gaps and the learning curve phenomenon.

    PubMed

    Kedikoglou, Simos; Syrigos, Konstantinos; Skalkidis, Yannis; Ploiarchopoulou, Fani; Dessypris, Nick; Petridou, Eleni

    2005-08-01

    The quality improvement effort in clinical practice has focused mostly on 'performance quality', i.e. on the development of comprehensive, evidence-based guidelines. This study aimed to assess the 'conformance quality', i.e. the extent to which guidelines once developed are correctly and consistently applied. It also aimed to assess the existence of quality gaps in the treatment of certain patient segments as defined by age or gender and to investigate methods to improve overall conformance quality. A retrospective audit of clinical practice in a well-defined oncology setting was undertaken and the results compared to those obtained from prospectively applying an internally developed clinical protocol in the same setting and using specific tools to increase conformance quality. All indicators showed improvement after the implementation of the protocol that in many cases reached statistical significance, while in the entire cohort advanced age was associated (although not significantly) with sub-optimal delivery of care. A 'learning curve' phenomenon in the implementation of quality initiatives was detected, with all indicators improving substantially in the second part of the prospective study. Clinicians should pay separate attention to the implementation of chosen protocols and employ specific tools to increase conformance quality in patient care.

  2. Omega-3 polyunsaturated fatty acids to optimize cognitive function for military mission-readiness: a systematic review and recommendations for the field.

    PubMed

    Teo, Lynn; Crawford, Cindy; Yehuda, Rachel; Jaghab, Danny; Bingham, John J; Chittum, Holly K; Gallon, Matthew D; O'Connell, Meghan L; Arzola, Sonya M; Berry, Kevin

    2017-06-01

    There has been interest in identifying whether nutrients might help optimize cognitive performance, especially for the military tasked with ensuring mission-readiness. This systematic review assesses the quality of the evidence for n-3 polyunsaturated fatty acids (PUFAs) across various outcomes related to cognitive function in healthy adult populations in order to develop research recommendations concerning n-3 PUFAs for mission-readiness. PubMed, CINAHL, Embase, PsycInfo, and the Cochrane Library were searched. Peer-reviewed randomized controlled trials published in the English language were eligible. Thirteen included trials were assessed for methodological quality, and descriptive data were extracted. Of the acceptable-quality (n = 8) and high-quality (n = 1) studies, 2 produced no statistically significant results, 5 produced mixed results, and 2 did not report between-group results. Results indicate that ingestion of n-3 PUFAs does not significantly alter cognitive performance in cognitively healthy persons. Studies exposing subjects to adverse circumstances that would be most relevant for drawing conclusions specifically for the military population are lacking. Several research recommendations are offered to enhance understanding of the role of fatty acids on cognitive functioning. © The Author(s) 2017. Published by Oxford University Press on behalf of the International Life Sciences Institute. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  3. Iterative metal artifact reduction: evaluation and optimization of technique.

    PubMed

    Subhas, Naveen; Primak, Andrew N; Obuchowski, Nancy A; Gupta, Amit; Polster, Joshua M; Krauss, Andreas; Iannotti, Joseph P

    2014-12-01

    Iterative metal artifact reduction (IMAR) is a sinogram inpainting technique that incorporates high-frequency data from standard weighted filtered back projection (WFBP) reconstructions to reduce metal artifact on computed tomography (CT). This study was designed to compare the image quality of IMAR and WFBP in total shoulder arthroplasties (TSA); determine the optimal amount of WFBP high-frequency data needed for IMAR; and compare image quality of the standard 3D technique with that of a faster 2D technique. Eight patients with nine TSA underwent CT with standardized parameters: 140 kVp, 300 mAs, 0.6 mm collimation and slice thickness, and B30 kernel. WFBP, three 3D IMAR algorithms with different amounts of WFBP high-frequency data (IMARlo, lowest; IMARmod, moderate; IMARhi, highest), and one 2D IMAR algorithm were reconstructed. Differences in attenuation near hardware and away from hardware were measured and compared using repeated measures ANOVA. Five readers independently graded image quality; scores were compared using Friedman's test. Attenuation differences were smaller with all 3D IMAR techniques than with WFBP (p < 0.0063). With increasing high-frequency data, the attenuation difference increased slightly (differences not statistically significant). All readers ranked IMARmod and IMARhi more favorably than WFBP (p < 0.05), with IMARmod ranked highest for most structures. The attenuation difference was slightly higher with 2D than with 3D IMAR, with no significant reader preference for 3D over 2D. IMAR significantly decreases metal artifact compared to WFBP both objectively and subjectively in TSA. The incorporation of a moderate amount of WFBP high-frequency data and use of a 2D reconstruction technique optimize image quality and allow for relatively short reconstruction times.

  4. SU-E-T-503: IMRT Optimization Using Monte Carlo Dose Engine: The Effect of Statistical Uncertainty.

    PubMed

    Tian, Z; Jia, X; Graves, Y; Uribe-Sanchez, A; Jiang, S

    2012-06-01

    With the development of ultra-fast GPU-based Monte Carlo (MC) dose engines, it has become clinically realistic to compute the dose-deposition coefficients (DDC) for IMRT optimization using MC simulation. However, it is still time-consuming to compute the DDC with small statistical uncertainty. This work studies the effects of the statistical error in the DDC matrix on IMRT optimization. The MC-computed DDC matrices are simulated here by adding statistical uncertainties at a desired level to the ones generated with a finite-size pencil beam algorithm. A statistical uncertainty model for MC dose calculation is employed. We adopt a penalty-based quadratic optimization model and a gradient descent method to optimize the fluence map and then recalculate the corresponding actual dose distribution using the noise-free DDC matrix. The impacts of DDC noise are assessed in terms of the deviation of the resulting dose distributions. We have also used stochastic perturbation theory to theoretically estimate the statistical errors of dose distributions on a simplified optimization model. A head-and-neck case is used to investigate the perturbation to the IMRT plan due to the MC statistical uncertainty. The relative errors of the final dose distributions of the optimized IMRT are found to be much smaller than those in the DDC matrix, which is consistent with our theoretical estimation. When the history number is decreased from 10⁸ to 10⁶, the dose-volume histograms are still very similar to the error-free DVHs, while the error in the DDC is about 3.8%. The results illustrate that the statistical errors in the DDC matrix have a relatively small effect on IMRT optimization in the dose domain. This indicates that we can use a relatively small number of histories to obtain the DDC matrix with MC simulation within a reasonable amount of time, without considerably compromising the accuracy of the optimized treatment plan. This work is supported by Varian Medical Systems through a Master Research Agreement. © 2012 American Association of Physicists in Medicine.
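
    The penalty-based quadratic optimization with gradient descent described above can be sketched as follows; the problem size, noise level, prescription, non-negativity projection, and step-size choice are illustrative assumptions rather than the authors' implementation.

```python
import numpy as np

# Sketch: optimize a fluence map against a noisy DDC matrix, then evaluate
# the result with the noise-free DDC, as in the abstract.
rng = np.random.default_rng(1)
n_vox, n_beamlets = 50, 20
D_true = rng.uniform(0.0, 1.0, size=(n_vox, n_beamlets))
D_noisy = D_true * (1.0 + 0.038 * rng.standard_normal(D_true.shape))  # ~3.8% MC-like noise
d_presc = np.full(n_vox, 60.0)                  # prescribed dose per voxel

x = np.ones(n_beamlets)                         # fluence map (non-negative)
step = 1.0 / np.linalg.norm(D_noisy, 2) ** 2    # safe step size for the quadratic objective
for _ in range(500):
    grad = D_noisy.T @ (D_noisy @ x - d_presc)  # gradient of 0.5 * ||D x - d||^2
    x = np.maximum(x - step * grad, 0.0)        # project onto x >= 0

dose = D_true @ x                               # actual dose with noise-free DDC
print("mean relative dose error: %.3f%%" % (100 * np.abs(dose - d_presc).mean() / 60.0))
```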

  5. Statistical iterative reconstruction to improve image quality for digital breast tomosynthesis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, Shiyu, E-mail: shiyu.xu@gmail.com; Chen, Ying, E-mail: adachen@siu.edu; Lu, Jianping

    2015-09-15

    Purpose: Digital breast tomosynthesis (DBT) is a novel modality with the potential to improve early detection of breast cancer by providing three-dimensional (3D) imaging with a low radiation dose. 3D image reconstruction presents some challenges: cone-beam and flat-panel geometry, and highly incomplete sampling. A promising means to overcome these challenges is statistical iterative reconstruction (IR), since it provides the flexibility of accurate physics modeling and a general description of system geometry. The authors’ goal was to develop techniques for applying statistical IR to tomosynthesis imaging data. Methods: These techniques include the following: a physics model with a local voxel-pair based prior with flexible parameters to fine-tune image quality; a precomputed parameter λ in the prior, to remove data dependence and to achieve a uniform resolution property; an effective ray-driven technique to compute the forward and backprojection; and an oversampled, ray-driven method to perform high resolution reconstruction with a practical region-of-interest technique. To assess the performance of these techniques, the authors acquired phantom data on the stationary DBT prototype system. To solve the estimation problem, the authors proposed an optimization-transfer based algorithm framework that potentially allows fewer iterations to achieve an acceptably converged reconstruction. Results: IR improved the detectability of low-contrast and small microcalcifications, reduced cross-plane artifacts, improved spatial resolution, and lowered noise in reconstructed images. Conclusions: Although the computational load remains a significant challenge for practical development, the superior image quality provided by statistical IR, combined with advancing computational techniques, may bring benefits to screening, diagnostics, and intraoperative imaging in clinical applications.

  6. Generalized massive optimal data compression

    NASA Astrophysics Data System (ADS)

    Alsing, Justin; Wandelt, Benjamin

    2018-05-01

    In this paper, we provide a general procedure for optimally compressing N data down to n summary statistics, where n is equal to the number of parameters of interest. We show that compression to the score function - the gradient of the log-likelihood with respect to the parameters - yields n compressed statistics that are optimal in the sense that they preserve the Fisher information content of the data. Our method generalizes earlier work on linear Karhunen-Loève compression for Gaussian data whilst recovering both lossless linear compression and quadratic estimation as special cases when they are optimal. We give a unified treatment that also includes the general non-Gaussian case as long as mild regularity conditions are satisfied, producing optimal non-linear summary statistics when appropriate. As a worked example, we derive explicitly the n optimal compressed statistics for Gaussian data in the general case where both the mean and covariance depend on the parameters.
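
    For the simpler special case where only the mean depends on the parameters (covariance fixed), the score compression reduces to a linear projection of the residual; the sketch below illustrates that special case with a hypothetical two-parameter linear model, not the paper's general worked example.

```python
import numpy as np

def score_compression(data, mu, dmu_dtheta, cov):
    """Compress a data vector to n summaries (one per parameter) using the
    score function for a Gaussian likelihood with parameter-independent
    covariance: t_a = (dmu/dtheta_a)^T C^{-1} (d - mu)."""
    Cinv_resid = np.linalg.solve(cov, data - mu)      # C^{-1} (d - mu)
    return dmu_dtheta @ Cinv_resid                     # shape: (n_params,)

# Hypothetical toy model: 100 data points, mean = a + b*x, known noise covariance
x = np.linspace(0.0, 1.0, 100)
a_fid, b_fid = 1.0, 2.0
cov = np.diag(np.full(100, 0.1 ** 2))
rng = np.random.default_rng(3)
data = a_fid + b_fid * x + rng.normal(0.0, 0.1, size=100)

mu = a_fid + b_fid * x                                 # fiducial mean
dmu = np.vstack([np.ones_like(x), x])                  # derivatives wrt (a, b)
print(score_compression(data, mu, dmu, cov))           # two compressed statistics
```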

  7. SU-D-BRB-02: Combining a Commercial Autoplanning Engine with Database Dose Predictions to Further Improve Plan Quality

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robertson, SP; Moore, JA; Hui, X

    Purpose: Database dose predictions and a commercial autoplanning engine both improve treatment plan quality in different but complementary ways. The combination of these planning techniques is hypothesized to further improve plan quality. Methods: Four treatment plans were generated for each of 10 head and neck (HN) and 10 prostate cancer patients, including Plan-A: traditional IMRT optimization using clinically relevant default objectives; Plan-B: traditional IMRT optimization using database dose predictions; Plan-C: autoplanning using default objectives; and Plan-D: autoplanning using database dose predictions. One optimization was used for each planning method. Dose distributions were normalized to 95% of the planning target volume (prostate: 8000 cGy; HN: 7000 cGy). Objectives used in plan optimization and analysis were the larynx (25%, 50%, 90%), left and right parotid glands (50%, 85%), spinal cord (0%, 50%), rectum and bladder (0%, 20%, 50%, 80%), and left and right femoral heads (0%, 70%). Results: All objectives except larynx 25% and 50% resulted in statistically significant differences between plans (Friedman's χ² ≥ 11.2; p ≤ 0.011). Maximum doses to the rectum (Plans A-D: 8328, 8395, 8489, 8537 cGy) and bladder (Plans A-D: 8403, 8448, 8527, 8569 cGy) were significantly increased. All other significant differences reflected a decrease in dose. Plans B-D were significantly different from Plan-A for 3, 17, and 19 objectives, respectively. Plans C-D were also significantly different from Plan-B for 8 and 13 objectives, respectively. In one case (cord 50%), Plan-D provided significantly lower dose than Plan-C (p = 0.003). Conclusion: Combining database dose predictions with a commercial autoplanning engine resulted in significant plan quality differences for the greatest number of objectives. This translated to plan quality improvements in most cases, although special care may be needed for maximum dose constraints. Further evaluation is warranted in a larger cohort across HN, prostate, and other treatment sites. This work is supported by Philips Radiation Oncology Systems.
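
    As a small illustration of the statistical comparison used above, the following Python sketch applies Friedman's chi-squared test to one dose objective across the four plan types; the patient counts match the abstract but the dose values are invented.

```python
# Hedged sketch: comparing four planning approaches (Plans A-D) on one dose
# objective across patients with Friedman's chi-squared test. The dose values
# below are made-up illustrative numbers, not the study data.
import numpy as np
from scipy.stats import friedmanchisquare

rng = np.random.default_rng(2)
n_patients = 10
plan_a = rng.normal(3300, 150, n_patients)          # e.g. a larynx dose objective (cGy), Plan-A
plan_b = plan_a - rng.normal(50, 30, n_patients)    # Plan-B: slight improvement
plan_c = plan_a - rng.normal(200, 40, n_patients)   # Plan-C: larger improvement
plan_d = plan_a - rng.normal(250, 40, n_patients)   # Plan-D: largest improvement

stat, p = friedmanchisquare(plan_a, plan_b, plan_c, plan_d)
print(f"Friedman chi-squared = {stat:.1f}, p = {p:.4f}")
```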

  8. Metrology: Calibration and measurement processes guidelines

    NASA Technical Reports Server (NTRS)

    Castrup, Howard T.; Eicke, Woodward G.; Hayes, Jerry L.; Mark, Alexander; Martin, Robert E.; Taylor, James L.

    1994-01-01

    The guide is intended as a resource to aid engineers and systems contractors in the design, implementation, and operation of metrology, calibration, and measurement systems, and to assist NASA personnel in the uniform evaluation of such systems supplied or operated by contractors. Methodologies and techniques acceptable in fulfilling metrology quality requirements for NASA programs are outlined. The measurement process is covered from a high level through more detailed discussions of key elements within the process. Emphasis is given to the flowdown of project requirements to measurement system requirements, and then to the activities that will provide measurements with defined quality. In addition, innovations and techniques for error analysis, development of statistical measurement process control, optimization of calibration recall systems, and evaluation of measurement uncertainty are presented.

  9. Reporting quality of randomised controlled trials published in prosthodontic and implantology journals.

    PubMed

    Kloukos, D; Papageorgiou, S N; Doulis, I; Petridis, H; Pandis, N

    2015-12-01

    The purpose of this study was to examine the reporting quality of randomised controlled trials (RCTs) published in prosthodontic and implantology journals. Thirty issues of nine journals in prosthodontics and implant dentistry were searched for RCTs, covering the years 2005-2012: The Journal of Prosthetic Dentistry, Journal of Oral Rehabilitation, The International Journal of Prosthodontics, The International Journal of Periodontics & Restorative Dentistry, Clinical Oral Implants Research, Clinical Implant Dentistry & Related Research, The International Journal of Oral & Maxillofacial Implants, Implant Dentistry and Journal of Dentistry. The reporting quality was assessed using a modified Consolidated Standards of Reporting Trials (CONSORT) statement checklist. Data were analysed using descriptive statistics followed by univariable and multivariable examination of statistical associations (α = 0·05). A total of 147 RCTs were identified with a mean CONSORT score of 69·4 (s.d. = 9·7). Significant differences were found among journals with the Journal of Oral Rehabilitation achieving the highest score (80·6, s.d. = 5·5) followed by Clinical Oral Implants Research (73·7, s.d. = 8·3). Involvement of a statistician/methodologist was significantly associated with increased CONSORT scores. Overall, the reporting quality of RCTs in major prosthodontic and implantology journals requires improvement. This is of paramount importance considering that optimal reporting of RCTs is an important prerequisite for clinical decision-making. © 2015 John Wiley & Sons Ltd.

  10. Clustering and Flow Conservation Monitoring Tool for Software Defined Networks.

    PubMed

    Puente Fernández, Jesús Antonio; García Villalba, Luis Javier; Kim, Tai-Hoon

    2018-04-03

    Prediction systems face challenges on two fronts: the relation between video quality and observed session features, and dynamic changes in video quality. Software Defined Networks (SDN) is a new network architecture concept that separates the control plane (controller) from the data plane (switches) in network devices. Through the southbound interface, it is possible to deploy monitoring tools that obtain the network status and retrieve collected statistics. Obtaining accurate statistics therefore depends on the strategy used to monitor and request information from network devices. In this paper, we propose an enhanced algorithm for requesting statistics to measure traffic flow in SDN networks. The algorithm groups network switches into clusters according to their number of ports so that different monitoring techniques can be applied to each cluster. This grouping avoids monitoring queries to switches with common characteristics and thereby omits redundant information. In this way, the proposal decreases the number of monitoring queries to switches, improving network traffic and preventing switch overload. We tested our optimization in a video streaming simulation using different types of videos. The experiments and the comparison with traditional monitoring techniques demonstrate the feasibility of our proposal, which maintains similar measurement values while decreasing the number of queries to the switches.

  11. Optimization of Robotic Spray Painting process Parameters using Taguchi Method

    NASA Astrophysics Data System (ADS)

    Chidhambara, K. V.; Latha Shankar, B.; Vijaykumar

    2018-02-01

    Automated spray painting is gaining interest in industry and research due to the extensive application of spray painting in the automobile industry. Automating the spray painting process offers improved quality, productivity, reduced labor, a clean environment, and, in particular, cost effectiveness. This study investigates the performance characteristics of an industrial robot, the Fanuc 250ib, for an automated painting process using Taguchi's Design of Experiments technique. The experiment is designed using a Taguchi L25 orthogonal array with three factors, each at five levels. The objective of this work is to identify the major control parameters and to optimize them for improved quality of the paint coating, measured in terms of Dry Film Thickness (DFT), which also results in reduced rejection. Analysis of Variance (ANOVA) is then performed to determine the influence of individual factors on DFT. It is observed that shaping air and paint flow are the most influential parameters. A multiple regression model is formulated for estimating predicted DFT values. A confirmation test is then conducted, and the comparison shows that the error is within an acceptable level.
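
    To illustrate the Taguchi analysis step, the Python sketch below computes signal-to-noise ratios and factor main effects from a small orthogonal-array layout; the runs, the DFT values, the third factor name, and the larger-the-better S/N formulation are assumptions made purely for illustration.

```python
# Illustrative sketch of the Taguchi analysis step: signal-to-noise (S/N) ratios
# and factor main effects from an orthogonal-array experiment. The L25 layout is
# abbreviated to a few runs and the DFT values are invented; a "larger-the-better"
# S/N is assumed only for illustration.
import numpy as np
import pandas as pd

# factor levels (1-5) for three factors and the measured response per run
runs = pd.DataFrame({
    "shaping_air": [1, 1, 2, 2, 3, 3, 4, 4, 5, 5],
    "paint_flow":  [1, 2, 3, 4, 5, 1, 2, 3, 4, 5],
    "atomizing":   [1, 3, 5, 2, 4, 2, 4, 1, 3, 5],
    "dft_um":      [38.0, 41.5, 45.2, 44.0, 47.3, 39.8, 43.1, 40.2, 46.0, 48.5],
})

def sn_larger_better(y):
    """Taguchi S/N ratio for a larger-the-better quality characteristic."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y ** 2))

runs["sn"] = runs["dft_um"].apply(lambda y: sn_larger_better([y]))

# main effect of each factor = range of mean S/N across its levels
for factor in ["shaping_air", "paint_flow", "atomizing"]:
    effects = runs.groupby(factor)["sn"].mean()
    print(factor, "delta =", round(effects.max() - effects.min(), 3))
```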

  12. A data-driven approach to quality risk management

    PubMed Central

    Alemayehu, Demissie; Alvir, Jose; Levenstein, Marcia; Nickerson, David

    2013-01-01

    Aim: An effective clinical trial strategy to ensure patient safety as well as trial quality and efficiency involves an integrated approach, including prospective identification of risk factors, mitigation of the risks through proper study design and execution, and assessment of quality metrics in real time. Such an integrated quality management plan may also be enhanced by using data-driven techniques to identify the risk factors that are most relevant in predicting quality issues associated with a trial. In this paper, we illustrate such an approach using data collected from actual clinical trials. Materials and Methods: Several statistical methods were employed, including the Wilcoxon rank-sum test and logistic regression, to identify associations between risk factors and the occurrence of quality issues, applied to data on the quality of clinical trials sponsored by Pfizer. Results: Only a subset of the risk factors had a significant association with quality issues; these included whether the study used a placebo, whether the agent was a biologic, unusual packaging labels, complex dosing, and more than 25 planned procedures. Conclusion: Proper implementation of the strategy can help to optimize resource utilization without compromising trial integrity and patient safety. PMID:24312890
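
    The two association analyses named in the abstract can be sketched as follows in Python; the trial-level data are simulated rather than Pfizer's, and the variable names are placeholders.

```python
# Hedged sketch of the two association analyses named above: a Wilcoxon rank-sum
# test for a continuous risk factor and logistic regression for candidate risk
# factors. All trial-level data here are simulated for illustration.
import numpy as np
from scipy.stats import ranksums
import statsmodels.api as sm

rng = np.random.default_rng(3)
n_trials = 200
planned_procedures = rng.integers(5, 60, n_trials)
uses_placebo = rng.integers(0, 2, n_trials)
# simulate quality issues that are more likely with many procedures or placebo use
logit = -2.0 + 0.04 * planned_procedures + 0.8 * uses_placebo
quality_issue = rng.random(n_trials) < 1.0 / (1.0 + np.exp(-logit))

# Wilcoxon rank-sum: do trials with quality issues have more planned procedures?
stat, p = ranksums(planned_procedures[quality_issue],
                   planned_procedures[~quality_issue])
print(f"rank-sum p = {p:.4f}")

# logistic regression of quality issues on the candidate risk factors
X = sm.add_constant(np.column_stack([planned_procedures, uses_placebo]))
fit = sm.Logit(quality_issue.astype(int), X).fit(disp=False)
print(fit.params)
```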

  13. Parameters optimization of laser brazing in crimping butt using Taguchi and BPNN-GA

    NASA Astrophysics Data System (ADS)

    Rong, Youmin; Zhang, Zhen; Zhang, Guojun; Yue, Chen; Gu, Yafei; Huang, Yu; Wang, Chunming; Shao, Xinyu

    2015-04-01

    Laser brazing (LB) is widely used in the automotive industry because of its high speed, small heat-affected zone, high weld seam quality, and low heat input. Welding parameters play a significant role in determining the bead geometry and hence the quality of the weld joint. This paper addresses the optimization of the seam shape in the LB process for a crimping butt joint of 0.8 mm thickness using a back propagation neural network (BPNN) and a genetic algorithm (GA). A 3-factor, 5-level welding experiment is conducted using a Taguchi L25 orthogonal array as the statistical design method. The input parameters are welding speed, wire feed rate, and gap, each with 5 levels. The output responses are the efficient connection lengths of the left and right sides, and the top width (WT) and bottom width (WB) of the weld bead. The experimental results are used to train the BPNN, which establishes the relationship between the input and output variables. The predictions of the BPNN are fed to the GA, which optimizes the process parameters with respect to the objectives. The effects of welding speed (WS), wire feed rate (WF), and gap (GAP) on the combined bead geometry values are then discussed. Finally, confirmation experiments are carried out, demonstrating that the optimal values are effective and reliable. Overall, the proposed hybrid method, BPNN-GA, can be used to guide production and improve the efficiency and stability of the LB process.
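
    A minimal sketch of the BPNN-GA idea is given below: a small neural network is fitted to the experimental data, and a basic genetic algorithm then searches the input space for parameters whose predicted seam geometry is closest to a target; the training data, network size, target value, and GA settings are invented placeholders rather than the authors' configuration.

```python
# Illustrative BPNN-GA sketch: fit a small neural network surrogate to experiment
# data, then let a simple genetic algorithm search the inputs that optimize the
# predicted response. Data, bounds, and fitness are placeholders.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(4)
# columns: welding speed, wire feed rate, gap  ->  response: top width WT (mm)
X = rng.uniform([2.0, 1.0, 0.0], [6.0, 3.0, 0.4], size=(25, 3))
y = 1.5 + 0.2 * X[:, 1] - 0.1 * X[:, 0] + 0.5 * X[:, 2] + rng.normal(0, 0.02, 25)

net = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0).fit(X, y)

lo, hi = np.array([2.0, 1.0, 0.0]), np.array([6.0, 3.0, 0.4])
target_wt = 1.8                                      # desired top width (assumed)
fitness = lambda pop: -np.abs(net.predict(pop) - target_wt)

pop = rng.uniform(lo, hi, size=(40, 3))
for _ in range(100):                                 # basic GA: select, crossover, mutate
    f = fitness(pop)
    parents = pop[np.argsort(f)[-20:]]               # keep the fitter half
    mates = parents[rng.integers(0, 20, size=(40, 2))]
    alpha = rng.random((40, 1))
    pop = alpha * mates[:, 0] + (1 - alpha) * mates[:, 1]     # blend crossover
    pop += rng.normal(0, 0.05, pop.shape) * (hi - lo)         # Gaussian mutation
    pop = np.clip(pop, lo, hi)

best = pop[np.argmax(fitness(pop))]
print("suggested (speed, feed rate, gap):", np.round(best, 3))
```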

  14. Optimization of multi-environment trials for genomic selection based on crop models.

    PubMed

    Rincent, R; Kuhn, E; Monod, H; Oury, F-X; Rousset, M; Allard, V; Le Gouis, J

    2017-08-01

    We propose a statistical criterion to optimize multi-environment trials to predict genotype × environment interactions more efficiently, by combining crop growth models and genomic selection models. Genotype × environment interactions (GEI) are common in plant multi-environment trials (METs). In this context, models developed for genomic selection (GS) that refers to the use of genome-wide information for predicting breeding values of selection candidates need to be adapted. One promising way to increase prediction accuracy in various environments is to combine ecophysiological and genetic modelling thanks to crop growth models (CGM) incorporating genetic parameters. The efficiency of this approach relies on the quality of the parameter estimates, which depends on the environments composing this MET used for calibration. The objective of this study was to determine a method to optimize the set of environments composing the MET for estimating genetic parameters in this context. A criterion called OptiMET was defined to this aim, and was evaluated on simulated and real data, with the example of wheat phenology. The MET defined with OptiMET allowed estimating the genetic parameters with lower error, leading to higher QTL detection power and higher prediction accuracies. MET defined with OptiMET was on average more efficient than random MET composed of twice as many environments, in terms of quality of the parameter estimates. OptiMET is thus a valuable tool to determine optimal experimental conditions to best exploit MET and the phenotyping tools that are currently developed.

  15. Sleep and optimism: A longitudinal study of bidirectional causal relationship and its mediating and moderating variables in a Chinese student sample.

    PubMed

    Lau, Esther Yuet Ying; Hui, C Harry; Lam, Jasmine; Cheung, Shu-Fai

    2017-01-01

    While both sleep and optimism have been found to be predictive of well-being, few studies have examined their relationship with each other. Neither do we know much about the mediators and moderators of the relationship. This study investigated (1) the causal relationship between sleep quality and optimism in a college student sample, (2) the role of symptoms of depression, anxiety, and stress as mediators, and (3) how circadian preference might moderate the relationship. Internet survey data were collected from 1,684 full-time university students (67.6% female, mean age = 20.9 years, SD = 2.66) at three time-points, spanning about 19 months. Measures included the Attributional Style Questionnaire, the Pittsburgh Sleep Quality Index, the Composite Scale of Morningness, and the Depression Anxiety Stress Scale-21. Moderate correlations were found among sleep quality, depressive mood, stress symptoms, anxiety symptoms, and optimism. Cross-lagged analyses showed a bidirectional effect between optimism and sleep quality. Moreover, path analyses demonstrated that anxiety and stress symptoms partially mediated the influence of optimism on sleep quality, while depressive mood partially mediated the influence of sleep quality on optimism. In support of our hypothesis, sleep quality affects mood symptoms and optimism differently for different circadian preferences. Poor sleep results in depressive mood and thus pessimism in non-morning persons only. In contrast, the aggregated (direct and indirect) effects of optimism on sleep quality were invariant of circadian preference. Taken together, people who are pessimistic generally have more anxious mood and stress symptoms, which adversely affect sleep while morningness seems to have a specific protective effect countering the potential damage poor sleep has on optimism. In conclusion, optimism and sleep quality were both cause and effect of each other. Depressive mood partially explained the effect of sleep quality on optimism, whereas anxiety and stress symptoms were mechanisms bridging optimism to sleep quality. This was the first study examining the complex relationships among sleep quality, optimism, and mood symptoms altogether longitudinally in a student sample. Implications on prevention and intervention for sleep problems and mood disorders are discussed.

  16. Cross-layer Joint Relay Selection and Power Allocation Scheme for Cooperative Relaying System

    NASA Astrophysics Data System (ADS)

    Zhi, Hui; He, Mengmeng; Wang, Feiyue; Huang, Ziju

    2018-03-01

    A novel cross-layer joint relay selection and power allocation (CL-JRSPA) scheme spanning the physical layer and the data-link layer is proposed for cooperative relaying systems in this paper. Our goal is to find the optimal relay selection and power allocation scheme that maximizes the system achievable rate while satisfying the total transmit power constraint in the physical layer and the statistical delay quality-of-service (QoS) requirement in the data-link layer. Using the concept of effective capacity (EC), this goal can be formulated as a joint relay selection and power allocation (JRSPA) problem that maximizes the EC subject to the total transmit power limitation. We first solve the optimal power allocation (PA) problem with a Lagrange multiplier approach and then solve the optimal relay selection (RS) problem. Simulation results demonstrate that the CL-JRSPA scheme achieves a larger EC than other schemes while satisfying the delay QoS requirement. In addition, the proposed CL-JRSPA scheme achieves the maximal EC when the relay is located approximately halfway between the source and destination, and the EC becomes smaller as the QoS exponent becomes larger.

  17. A Comparative Analysis of Taguchi Methodology and Shainin System DoE in the Optimization of Injection Molding Process Parameters

    NASA Astrophysics Data System (ADS)

    Khavekar, Rajendra; Vasudevan, Hari, Dr.; Modi, Bhavik

    2017-08-01

    Two well-known Design of Experiments (DoE) methodologies, Taguchi Methods (TM) and the Shainin System (SS), are compared and analyzed in this study through their implementation in a plastic injection molding unit. Experiments were performed at a company manufacturing perfume bottle caps (made of acrylic) using TM and SS to find the root cause of defects and to optimize the process parameters for minimum rejection. The experiments reduced the rejection rate from approximately 40% during trial runs to 8.57%, representing a successful implementation of these DoE methods. The comparison showed that both methodologies identified the same set of variables as critical for defect reduction, but with a change in their order of significance. Taguchi Methods also require more experiments and consume more time than the Shainin System. The Shainin System is less complicated and easier to implement, whereas Taguchi Methods are statistically more reliable for optimization of process parameters. Finally, the experiments showed that DoE methods are robust and reliable in implementation as organizations attempt to improve quality through optimization.

  18. Application of quality-by-design approach to optimize diallyl disulfide-loaded solid lipid nanoparticles.

    PubMed

    Talluri, Siddhartha Venkata; Kuppusamy, Gowthamarajan; Karri, Veera Venkata Satyanarayana Reddy; Yamjala, Karthik; Wadhwani, Ashish; Madhunapantula, SubbaRao V; Pindiprolu, Saikiran S S

    2017-05-01

    The current work was carried out following the principles of the quality-by-design approach to develop an optimized solid lipid nanoparticle (SLN) formulation of diallyl disulfide (DADS) through a systematic statistical study. The antitumor activity of DADS was also evaluated in breast cancer cell lines. To understand the effect of the formulation variables (critical parameters) on the responses (critical quality attributes) of the SLN, a 3-factor, 3-level Box-Behnken design was used to predict the responses of particle size (Y1) and % entrapment efficiency (EE) (Y2), with concentration of surfactant (X1), amount of lipid (X2), and volume of solvent (X3) selected as independent variables. Particle size analysis revealed that all batches were within the nanometer range. DADS was released from the SLN much more rapidly at pH 4.5 than at pH 7.4, which is a desirable characteristic for tumor-targeted drug delivery. The cytotoxicity and reactive oxygen species (ROS) determinations revealed that the antitumor activity of DADS is enhanced by the SLN compared with free DADS, and that apoptosis is the mechanism underlying the cytotoxicity. The present study indicates the remarkable potential of DADS-SLN in enhancing the anticancer effect of DADS in breast cancer cells in vitro.
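
    For readers unfamiliar with the design, the Python sketch below constructs the 15-run, 3-factor Box-Behnken layout in coded units and maps it to example ranges for the three formulation variables; the ranges and the number of centre points are assumptions, not the study's actual values.

```python
# Minimal sketch of a 3-factor, 3-level Box-Behnken design in coded units
# (-1, 0, +1), mapped to assumed ranges for the formulation variables.
import itertools
import numpy as np
import pandas as pd

# Box-Behnken: +/-1 combinations for each pair of factors, third factor held at 0
coded = []
for i, j in itertools.combinations(range(3), 2):
    for a, b in itertools.product((-1, 1), repeat=2):
        point = [0, 0, 0]
        point[i], point[j] = a, b
        coded.append(point)
coded += [[0, 0, 0]] * 3                      # centre-point replicates (assumed: 3)
coded = np.array(coded, dtype=float)

# map coded levels to assumed real ranges: X1 surfactant %, X2 lipid mg, X3 solvent mL
lows = np.array([0.5, 50.0, 5.0])
highs = np.array([1.5, 150.0, 15.0])
real = lows + (coded + 1.0) / 2.0 * (highs - lows)

design = pd.DataFrame(real, columns=["surfactant_pct", "lipid_mg", "solvent_mL"])
print(design)                                  # 15 runs: 12 edge points + 3 centres
```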

  19. A Novel Quantum-Behaved Bat Algorithm with Mean Best Position Directed for Numerical Optimization

    PubMed Central

    Zhu, Wenyong; Liu, Zijuan; Duan, Qingyan; Cao, Long

    2016-01-01

    This paper proposes a novel quantum-behaved bat algorithm directed by the mean best position (QMBA). In QMBA, the position of each bat is mainly updated by the current optimal solution in the early stage of the search, while in the late stage it also depends on the mean best position, which enhances the convergence speed of the algorithm. During the search, quantum behavior of the bats is introduced, which helps them jump out of local optima, prevents them from easily falling back into local optima, and gives the algorithm a better ability to adapt to complex environments. Meanwhile, QMBA makes good use of the statistical information of the best positions the bats have experienced to generate better quality solutions. This approach not only inherits the quick convergence, simplicity, and easy implementation of the original bat algorithm, but also increases the diversity of the population and improves the accuracy of the solution. Twenty-four benchmark test functions are tested and compared with other variant bat algorithms for numerical optimization; the simulation results show that this approach is simple and efficient and can achieve a more accurate solution. PMID:27293424
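
    The flavor of a quantum-behaved, mean-best-directed update can be sketched as follows; this is a simplified illustration of the idea rather than the authors' exact QMBA update rules, and the benchmark function, population size, and contraction schedule are assumptions.

```python
# Simplified, hedged sketch of a quantum-behaved update: positions are drawn
# around an attractor built from personal and global bests, with a spread
# controlled by the mean best position. Not the authors' exact QMBA rules.
import numpy as np

def sphere(x):                                # benchmark objective to minimise
    return np.sum(x ** 2, axis=-1)

rng = np.random.default_rng(5)
n_bats, dim, iters = 30, 10, 500
x = rng.uniform(-5, 5, (n_bats, dim))
pbest = x.copy()
pbest_f = sphere(pbest)

for t in range(iters):
    gbest = pbest[np.argmin(pbest_f)]
    mbest = pbest.mean(axis=0)                        # mean best position
    phi = rng.random((n_bats, dim))
    attractor = phi * pbest + (1 - phi) * gbest       # between personal and global best
    beta = 1.0 - 0.5 * t / iters                      # contraction coefficient
    u = rng.random((n_bats, dim))
    sign = np.where(rng.random((n_bats, dim)) < 0.5, 1.0, -1.0)
    x = attractor + sign * beta * np.abs(mbest - x) * np.log(1.0 / (u + 1e-12))
    f = sphere(x)
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = x[improved], f[improved]

print("best value found:", pbest_f.min())
```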

  20. Applications of spatial statistical network models to stream data

    USGS Publications Warehouse

    Isaak, Daniel J.; Peterson, Erin E.; Ver Hoef, Jay M.; Wenger, Seth J.; Falke, Jeffrey A.; Torgersen, Christian E.; Sowder, Colin; Steel, E. Ashley; Fortin, Marie-Josée; Jordan, Chris E.; Ruesch, Aaron S.; Som, Nicholas; Monestiez, Pascal

    2014-01-01

    Streams and rivers host a significant portion of Earth's biodiversity and provide important ecosystem services for human populations. Accurate information regarding the status and trends of stream resources is vital for their effective conservation and management. Most statistical techniques applied to data measured on stream networks were developed for terrestrial applications and are not optimized for streams. A new class of spatial statistical model, based on valid covariance structures for stream networks, can be used with many common types of stream data (e.g., water quality attributes, habitat conditions, biological surveys) through application of appropriate distributions (e.g., Gaussian, binomial, Poisson). The spatial statistical network models account for spatial autocorrelation (i.e., nonindependence) among measurements, which allows their application to databases with clustered measurement locations. Large amounts of stream data exist in many areas where spatial statistical analyses could be used to develop novel insights, improve predictions at unsampled sites, and aid in the design of efficient monitoring strategies at relatively low cost. We review the topic of spatial autocorrelation and its effects on statistical inference, demonstrate the use of spatial statistics with stream datasets relevant to common research and management questions, and discuss additional applications and development potential for spatial statistics on stream networks. Free software for implementing the spatial statistical network models has been developed that enables custom applications with many stream databases.

  1. Can a single-shot black-blood T2-weighted spin-echo echo-planar imaging sequence with sensitivity encoding replace the respiratory-triggered turbo spin-echo sequence for the liver? An optimization and feasibility study.

    PubMed

    Hussain, Shahid M; De Becker, Jan; Hop, Wim C J; Dwarkasing, Soendersing; Wielopolski, Piotr A

    2005-03-01

    To optimize and assess the feasibility of a single-shot black-blood T2-weighted spin-echo echo-planar imaging (SSBB-EPI) sequence for MRI of the liver using sensitivity encoding (SENSE), and compare the results with those obtained with a T2-weighted turbo spin-echo (TSE) sequence. Six volunteers and 16 patients were scanned at 1.5T (Philips Intera). In the volunteer study, we optimized the SSBB-EPI sequence by interactively changing the parameters (i.e., the resolution, echo time (TE), diffusion weighting with low b-values, and polarity of the phase-encoding gradient) with regard to distortion, suppression of the blood signal, and sensitivity to motion. The influence of each change was assessed. The optimized SSBB-EPI sequence was applied in patients (N = 16). A number of items, including the overall image quality (on a scale of 1-5), were used for graded evaluation. In addition, the signal-to-noise ratio (SNR) of the liver was calculated. Statistical analysis was carried out with the use of Wilcoxon's signed rank test for comparison of the SSBB-EPI and TSE sequences, with P = 0.05 considered the limit for significance. The SSBB-EPI sequence was improved by the following steps: 1) fewer frequency points than phase-encoding steps, 2) a b-factor of 20, and 3) a reversed polarity of the phase-encoding gradient. In patients, the mean overall image quality score for the optimized SSBB-EPI (3.5 (range: 1-4)) and TSE (3.6 (range: 3-4)), and the SNR of the liver on SSBB-EPI (mean +/- SD = 7.6 +/- 4.0) and TSE (8.9 +/- 4.6) were not significantly different (P > .05). Optimized SSBB-EPI with SENSE proved to be feasible in patients, and the overall image quality and SNR of the liver were comparable to those achieved with the standard respiratory-triggered T2-weighted TSE sequence. (c) 2005 Wiley-Liss, Inc.

  2. ParseCNV integrative copy number variation association software with quality tracking

    PubMed Central

    Glessner, Joseph T.; Li, Jin; Hakonarson, Hakon

    2013-01-01

    A number of copy number variation (CNV) calling algorithms exist; however, comprehensive software tools for CNV association studies are lacking. We describe ParseCNV, unique software that takes CNV calls and creates probe-based statistics for CNV occurrence in both case–control design and in family based studies addressing both de novo and inheritance events, which are then summarized based on CNV regions (CNVRs). CNVRs are defined in a dynamic manner to allow for a complex CNV overlap while maintaining precise association region. Using this approach, we avoid failure to converge and non-monotonic curve fitting weaknesses of programs, such as CNVtools and CNVassoc, and although Plink is easy to use, it only provides combined CNV state probe-based statistics, not state-specific CNVRs. Existing CNV association methods do not provide any quality tracking information to filter confident associations, a key issue which is fully addressed by ParseCNV. In addition, uncertainty in CNV calls underlying CNV associations is evaluated to verify significant results, including CNV overlap profiles, genomic context, number of probes supporting the CNV and single-probe intensities. When optimal quality control parameters are followed using ParseCNV, 90% of CNVs validate by polymerase chain reaction, an often problematic stage because of inadequate significant association review. ParseCNV is freely available at http://parsecnv.sourceforge.net. PMID:23293001

  4. Statistically Optimized Inversion Algorithm for Enhanced Retrieval of Aerosol Properties from Spectral Multi-Angle Polarimetric Satellite Observations

    NASA Technical Reports Server (NTRS)

    Dubovik, O; Herman, M.; Holdak, A.; Lapyonok, T.; Taure, D.; Deuze, J. L.; Ducos, F.; Sinyuk, A.

    2011-01-01

    The proposed development is an attempt to enhance aerosol retrieval by emphasizing statistical optimization in inversion of advanced satellite observations. This optimization concept improves retrieval accuracy relying on the knowledge of measurement error distribution. Efficient application of such optimization requires pronounced data redundancy (an excess of the number of measurements over the number of unknowns) that is not common in satellite observations. The POLDER imager on board the PARASOL microsatellite registers spectral polarimetric characteristics of the reflected atmospheric radiation at up to 16 viewing directions over each observed pixel. The completeness of such observations is notably higher than for most currently operating passive satellite aerosol sensors. This provides an opportunity for profound utilization of statistical optimization principles in satellite data inversion. The proposed retrieval scheme is designed as statistically optimized multi-variable fitting of all available angular observations obtained by the POLDER sensor in the window spectral channels where absorption by gas is minimal. The total number of such observations by PARASOL always exceeds a hundred over each pixel, and the statistical optimization concept promises to be efficient even if the algorithm retrieves several tens of aerosol parameters. Based on this idea, the proposed algorithm uses a large number of unknowns and is aimed at retrieval of an extended set of parameters affecting the measured radiation.

  5. Combining near infrared spectra of feces and geostatistics to generate forage nutritional quality maps across landscapes

    NASA Astrophysics Data System (ADS)

    Jean, Pierre-Olivier; Bradley, Robert; Tremblay, Jean-Pierre

    2015-04-01

    An important asset for the management of wild ungulates is the ability to recognize the spatial distribution of forage quality across heterogeneous landscapes. Doing so typically requires knowledge of which plant species are eaten, in what abundance they are eaten, and what their nutritional quality might be. Acquiring such data may, however, be difficult and time consuming. Here, we propose a rapid and cost-effective forage quality monitoring tool that combines near infrared (NIR) spectra of fecal samples and easily obtained data on plant community composition. Our approach rests on the premise that NIR spectra of fecal samples collected within low population density exclosures reflect the optimal forage quality of a given landscape. Forage quality can thus be based on the Mahalanobis distance of fecal spectral scans across the landscape relative to fecal spectral scans inside exclosures (referred to as DISTEX). The Gi* spatial autocorrelation statistic can then be applied among neighbouring DISTEX values to detect and map 'hot-spots' and 'cold-spots' of nutritional quality over the landscape. We tested our approach in a heterogeneous boreal landscape on Anticosti Island (Québec, Canada).
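
    The DISTEX scoring step can be sketched as a Mahalanobis distance from the cloud of exclosure (reference) spectra; the simulated spectra, the PCA reduction, and the component count below are illustrative assumptions rather than the study's actual processing chain.

```python
# Hedged sketch of the DISTEX idea: score each landscape fecal NIR spectrum by its
# Mahalanobis distance from spectra collected inside the low-density exclosures
# (the assumed optimal-quality reference). Spectra are reduced to a few principal
# components first; all data here are simulated.
import numpy as np

rng = np.random.default_rng(6)
n_ref, n_field, n_bands, n_pc = 40, 100, 200, 5
ref = rng.normal(0.5, 0.05, (n_ref, n_bands))              # exclosure spectra
field = rng.normal(0.5, 0.05, (n_field, n_bands)) + 0.02   # landscape spectra

# PCA fitted on the reference spectra
mean_ref = ref.mean(axis=0)
_, _, vt = np.linalg.svd(ref - mean_ref, full_matrices=False)
to_pc = lambda s: (s - mean_ref) @ vt[:n_pc].T

ref_pc, field_pc = to_pc(ref), to_pc(field)
cov_inv = np.linalg.inv(np.cov(ref_pc, rowvar=False))
centre = ref_pc.mean(axis=0)

def distex(scores):
    """Mahalanobis distance of each sample to the exclosure reference cloud."""
    d = scores - centre
    return np.sqrt(np.einsum("ij,jk,ik->i", d, cov_inv, d))

print("median DISTEX across the landscape:", np.median(distex(field_pc)))
```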

  6. An empirical comparison of key statistical attributes among potential ICU quality indicators.

    PubMed

    Brown, Sydney E S; Ratcliffe, Sarah J; Halpern, Scott D

    2014-08-01

    Good quality indicators should have face validity, relevance to patients, and be able to be measured reliably. Beyond these general requirements, good quality indicators should also have certain statistical properties, including sufficient variability to identify poor performers, relative insensitivity to severity adjustment, and the ability to capture what providers do rather than patients' characteristics. We assessed the performance of candidate indicators of ICU quality on these criteria. Indicators included ICU readmission, mortality, several length of stay outcomes, and the processes of venous-thromboembolism and stress ulcer prophylaxis provision. Retrospective cohort study. One hundred thirty-eight U.S. ICUs from 2001-2008 in the Project IMPACT database. Two hundred sixty-eight thousand eight hundred twenty-four patients discharged from U.S. ICUs. None. We assessed indicators' (1) variability across ICU-years; (2) degree of influence by patient vs. ICU and hospital characteristics using the Omega statistic; (3) sensitivity to severity adjustment by comparing the area under the receiver operating characteristic curve (AUC) between models including vs. excluding patient variables, and (4) correlation between risk adjusted quality indicators using a Spearman correlation. Large ranges of among-ICU variability were noted for all quality indicators, particularly for prolonged length of stay (4.7-71.3%) and the proportion of patients discharged home (30.6-82.0%), and ICU and hospital characteristics outweighed patient characteristics for stress ulcer prophylaxis (ω, 0.43; 95% CI, 0.34-0.54), venous thromboembolism prophylaxis (ω, 0.57; 95% CI, 0.53-0.61), and ICU readmissions (ω, 0.69; 95% CI, 0.52-0.90). Mortality measures were the most sensitive to severity adjustment (area under the receiver operating characteristic curve % difference, 29.6%); process measures were the least sensitive (area under the receiver operating characteristic curve % differences: venous thromboembolism prophylaxis, 3.4%; stress ulcer prophylaxis, 2.1%). None of the 10 indicators was clearly and consistently correlated with a majority of the other nine indicators. No indicator performed optimally across assessments. Future research should seek to define and operationalize quality in a way that is relevant to both patients and providers.

  7. Objective quality assessment of tone-mapped images.

    PubMed

    Yeganeh, Hojatollah; Wang, Zhou

    2013-02-01

    Tone-mapping operators (TMOs) that convert high dynamic range (HDR) to low dynamic range (LDR) images provide practically useful tools for the visualization of HDR images on standard LDR displays. Different TMOs create different tone-mapped images, and a natural question is which one has the best quality. Without an appropriate quality measure, different TMOs cannot be compared, and further improvement is directionless. Subjective rating may be a reliable evaluation method, but it is expensive and time consuming, and, more importantly, is difficult to embed into optimization frameworks. Here we propose an objective quality assessment algorithm for tone-mapped images by combining: 1) a multiscale signal fidelity measure based on a modified structural similarity index and 2) a naturalness measure based on the intensity statistics of natural images. Validations using independent subject-rated image databases show good correlations between subjective ranking scores and the proposed tone-mapped image quality index (TMQI). Furthermore, we demonstrate extended applications of TMQI using two examples: parameter tuning for TMOs and adaptive fusion of multiple tone-mapped images.

  8. Impact of a New Adaptive Statistical Iterative Reconstruction (ASIR)-V Algorithm on Image Quality in Coronary Computed Tomography Angiography.

    PubMed

    Pontone, Gianluca; Muscogiuri, Giuseppe; Andreini, Daniele; Guaricci, Andrea I; Guglielmo, Marco; Baggiano, Andrea; Fazzari, Fabio; Mushtaq, Saima; Conte, Edoardo; Annoni, Andrea; Formenti, Alberto; Mancini, Elisabetta; Verdecchia, Massimo; Campari, Alessandro; Martini, Chiara; Gatti, Marco; Fusini, Laura; Bonfanti, Lorenzo; Consiglio, Elisa; Rabbat, Mark G; Bartorelli, Antonio L; Pepi, Mauro

    2018-03-27

    A new postprocessing algorithm named adaptive statistical iterative reconstruction (ASIR)-V has been recently introduced. The aim of this article was to analyze the impact of ASIR-V algorithm on signal, noise, and image quality of coronary computed tomography angiography. Fifty consecutive patients underwent clinically indicated coronary computed tomography angiography (Revolution CT; GE Healthcare, Milwaukee, WI). Images were reconstructed using filtered back projection and ASIR-V 0%, and a combination of filtered back projection and ASIR-V 20%-80% and ASIR-V 100%. Image noise, signal-to-noise ratio (SNR), and contrast-to-noise ratio (CNR) were calculated for left main coronary artery (LM), left anterior descending artery (LAD), left circumflex artery (LCX), and right coronary artery (RCA) and were compared between the different postprocessing algorithms used. Similarly a four-point Likert image quality score of coronary segments was graded for each dataset and compared. A cutoff value of P < .05 was considered statistically significant. Compared to ASIR-V 0%, ASIR-V 100% demonstrated a significant reduction of image noise in all coronaries (P < .01). Compared to ASIR-V 0%, SNR was significantly higher with ASIR-V 60% in LM (P < .01), LAD (P < .05), LCX (P < .05), and RCA (P < .01). Compared to ASIR-V 0%, CNR for ASIR-V ≥60% was significantly improved in LM (P < .01), LAD (P < .05), and RCA (P < .01), whereas LCX demonstrated a significant improvement with ASIR-V ≥80%. ASIR-V 60% had significantly better Likert image quality scores compared to ASIR-V 0% in segment-, vessel-, and patient-based analyses (P < .01). Reconstruction with ASIR-V 60% provides the optimal balance between image noise, SNR, CNR, and image quality. Copyright © 2018 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.

  9. Bidirectional relationship between sleep and optimism with depressive mood as a mediator: A longitudinal study of Chinese working adults.

    PubMed

    Lau, Esther Yuet Ying; Harry Hui, C; Cheung, Shu-Fai; Lam, Jasmine

    2015-11-01

    Sleep and optimism are important psycho-biological and personality constructs, respectively. However, very little work has examined the causal relationship between them, and none has examined the potential mechanisms operating in the relationship. This study aimed to understand whether sleep quality was a cause or an effect of optimism, and whether depressive mood could explain the relationship. Internet survey data were collected from 987 Chinese working adults (63.4% female, 92.4% full-time workers, 27.0% married, 90.2% Hong Kong residents, mean age = 32.59) at three time-points, spanning about 19 months. Measures included a Chinese attributional style questionnaire, the Pittsburgh Sleep Quality Index, and the Depression Anxiety Stress Scale. Cross-sectional analyses revealed moderate correlations among sleep quality, depressive mood, and optimism. Cross-lagged analyses showed a bidirectional causality between optimism and sleep. Path analysis demonstrated that depressive mood fully mediated the influence of optimism on sleep quality, and partially mediated the influence of sleep quality on optimism. Optimism improves sleep. Poor sleep makes a pessimist. The effects of sleep quality on optimism could not be fully explained by depressive mood, highlighting the unique role of sleep on optimism. Understanding the mechanisms of the feedback loop of sleep quality, mood, and optimism may provide insights for clinical interventions for individuals presenting with mood-related problems. Copyright © 2015 Elsevier Inc. All rights reserved.

  10. Approach for Input Uncertainty Propagation and Robust Design in CFD Using Sensitivity Derivatives

    NASA Technical Reports Server (NTRS)

    Putko, Michele M.; Taylor, Arthur C., III; Newman, Perry A.; Green, Lawrence L.

    2002-01-01

    An implementation of the approximate statistical moment method for uncertainty propagation and robust optimization for a quasi-3D Euler CFD code is presented. Given uncertainties in statistically independent, random, normally distributed input variables, first- and second-order statistical moment procedures are performed to approximate the uncertainty in the CFD output. Efficient calculation of both first- and second-order sensitivity derivatives is required. In order to assess the validity of the approximations, these moments are compared with statistical moments generated through Monte Carlo simulations. The uncertainties in the CFD input variables are also incorporated into a robust optimization procedure. For this optimization, statistical moments involving first-order sensitivity derivatives appear in the objective function and system constraints. Second-order sensitivity derivatives are used in a gradient-based search to successfully execute a robust optimization. The approximate methods used throughout the analyses are found to be valid when considering robustness about input parameter mean values.
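
    A minimal sketch of the first-order moment propagation, checked against Monte Carlo, is shown below; the output function stands in for the Euler CFD code, and the input means and standard deviations are arbitrary.

```python
# Minimal sketch of the approximate first-order statistical moment method: for an
# output f(x) with independent normal inputs, mean ~ f(mu) and
# variance ~ sum_i (df/dx_i)^2 * sigma_i^2, checked against Monte Carlo.
import numpy as np

def f(x):                                       # placeholder "CFD output", not the Euler code
    return x[0] ** 2 * np.sin(x[1]) + 3.0 * x[2]

mu = np.array([1.2, 0.8, 2.0])                  # input means
sigma = np.array([0.05, 0.02, 0.10])            # input standard deviations

# first-order sensitivity derivatives by central differences
eps = 1e-6
grad = np.array([(f(mu + eps * np.eye(3)[i]) - f(mu - eps * np.eye(3)[i])) / (2 * eps)
                 for i in range(3)])

mean_approx = f(mu)                             # first-order mean estimate
var_approx = np.sum(grad ** 2 * sigma ** 2)     # first-order variance estimate

rng = np.random.default_rng(7)
samples = rng.normal(mu, sigma, size=(100000, 3))
mc = np.apply_along_axis(f, 1, samples)
print(f"approx: mean={mean_approx:.4f}, std={np.sqrt(var_approx):.4f}")
print(f"MC:     mean={mc.mean():.4f}, std={mc.std():.4f}")
```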

  11. [Study on the optimization of monitoring indicators of drinking water quality during health supervision].

    PubMed

    Ye, Bixiong; E, Xueli; Zhang, Lan

    2015-01-01

    To optimize the non-regular drinking water quality indices (except Giardia and Cryptosporidium) monitored for urban drinking water. Several methods were applied, including the rate at which water quality exceeded the standard, the risk of exceeding the standard, the frequency of concentrations below the detection limit, a comprehensive water quality index evaluation, and the attribute reduction algorithm of rough set theory; redundant water quality indicators were eliminated, and the control factors that play a leading role in drinking water safety were identified. The results showed that, of the 62 non-regular water quality monitoring indicators for urban drinking water, 42 could be removed through comprehensive evaluation combined with rough-set attribute reduction. Optimizing the water quality monitoring indicators and reducing the number of indicators and the monitoring frequency can ensure the safety of drinking water quality while lowering monitoring costs and reducing the monitoring burden on sanitation supervision departments.

  12. A statistical inference for concentrations of benzo[a]pyrene partially measured in the ambient air of an industrial city in Korea

    NASA Astrophysics Data System (ADS)

    Kim, Yongku; Seo, Young-Kyo; Baek, Sung-Ok

    2013-12-01

    Although large quantities of air pollutants are released into the atmosphere, they are partially monitored and routinely assessed for their health implications. This paper proposes a statistical model describing the temporal behavior of hazardous air pollutants (HAPs), which can have negative effects on human health. Benzo[a]pyrene (BaP) is selected for statistical modeling. The proposed model incorporates the linkage between BaP and meteorology and is specifically formulated to identify meteorological effects and allow for seasonal trends. The model is used to estimate and forecast temporal fields of BaP conditional on observed (or forecasted) meteorological conditions, including temperature, precipitation, wind speed, and air quality. The effects of BaP on human health are examined by characterizing health indicators, namely the cancer risk and the hazard quotient. The model provides useful information for the optimal monitoring period and projection of future BaP concentrations for both industrial and residential areas in Korea.

  13. Seasonal Drought Prediction: Advances, Challenges, and Future Prospects

    NASA Astrophysics Data System (ADS)

    Hao, Zengchao; Singh, Vijay P.; Xia, Youlong

    2018-03-01

    Drought prediction is of critical importance to early warning for drought managements. This review provides a synthesis of drought prediction based on statistical, dynamical, and hybrid methods. Statistical drought prediction is achieved by modeling the relationship between drought indices of interest and a suite of potential predictors, including large-scale climate indices, local climate variables, and land initial conditions. Dynamical meteorological drought prediction relies on seasonal climate forecast from general circulation models (GCMs), which can be employed to drive hydrological models for agricultural and hydrological drought prediction with the predictability determined by both climate forcings and initial conditions. Challenges still exist in drought prediction at long lead time and under a changing environment resulting from natural and anthropogenic factors. Future research prospects to improve drought prediction include, but are not limited to, high-quality data assimilation, improved model development with key processes related to drought occurrence, optimal ensemble forecast to select or weight ensembles, and hybrid drought prediction to merge statistical and dynamical forecasts.

  14. Systematic review of the effects of chronic disease management on quality-of-life in people with chronic obstructive pulmonary disease.

    PubMed

    Niesink, A; Trappenburg, J C A; de Weert-van Oene, G H; Lammers, J W J; Verheij, T J M; Schrijvers, A J P

    2007-11-01

    Chronic disease management for patients with chronic obstructive pulmonary disease (COPD) may improve quality, outcomes and access to care. To investigate effectiveness of chronic disease management programmes on the quality-of-life of people with COPD. Medline and Embase (1995-2005) were searched for relevant articles, and reference lists and abstracts were searched for controlled trials of chronic disease management programmes for patients with COPD. Quality-of-life was assessed as an outcome parameter. Two reviewers independently reviewed each paper for methodological quality and extracted the data. We found 10 randomized-controlled trials comparing chronic disease management with routine care. Patient populations, health-care professionals, intensity, and content of the intervention were heterogeneous. Different instruments were used to assess quality of life. Five out of 10 studies showed statistically significant positive outcomes on one or more domains of the quality of life instruments. Three studies, partly located in primary care, showed positive results. All chronic disease management projects for people with COPD involving primary care improved quality of life. In most of the studies, aspects of chronic disease management were applied to a limited extent. Quality of randomized-controlled trials was not optimal. More research is needed on chronic disease management programmes in patients with COPD across primary and secondary care.

  15. A critique of Rasch residual fit statistics.

    PubMed

    Karabatsos, G

    2000-01-01

    In test analysis involving the Rasch model, a large degree of importance is placed on the "objective" measurement of individual abilities and item difficulties. The degree to which the objectivity properties are attained, of course, depends on the degree to which the data fit the Rasch model. It is therefore important to utilize fit statistics that accurately and reliably detect the person-item response inconsistencies that threaten the measurement objectivity of persons and items. Given this argument, it is somewhat surprising that there is far more emphasis placed in the objective measurement of person and items than there is in the measurement quality of Rasch fit statistics. This paper provides a critical analysis of the residual fit statistics of the Rasch model, arguably the most often used fit statistics, in an effort to illustrate that the task of Rasch fit analysis is not as simple and straightforward as it appears to be. The faulty statistical properties of the residual fit statistics do not allow either a convenient or a straightforward approach to Rasch fit analysis. For instance, given a residual fit statistic, the use of a single minimum critical value for misfit diagnosis across different testing situations, where the situations vary in sample and test properties, leads to both the overdetection and underdetection of misfit. To improve this situation, it is argued that psychometricians need to implement residual-free Rasch fit statistics that are based on the number of Guttman response errors, or use indices that are statistically optimal in detecting measurement disturbances.
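
    To make the objects of the critique concrete, the Python sketch below computes Rasch model probabilities, the residual-based outfit and infit mean squares, and a count of Guttman response errors per person; abilities, difficulties, and responses are simulated, and this is an illustration of the standard formulas rather than an endorsement of them.

```python
# Hedged sketch of the residual fit statistics discussed above: Rasch probabilities,
# squared standardized residuals, item outfit/infit mean squares, and a simple count
# of Guttman response errors per person. All data are simulated.
import numpy as np

rng = np.random.default_rng(8)
n_persons, n_items = 300, 20
theta = rng.normal(0, 1, n_persons)             # person abilities
b = np.linspace(-2, 2, n_items)                 # item difficulties

P = 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))   # Rasch probabilities
X = (rng.random((n_persons, n_items)) < P).astype(float)   # simulated responses

var = P * (1.0 - P)
z2 = (X - P) ** 2 / var                         # squared standardized residuals

outfit = z2.mean(axis=0)                        # unweighted (outfit) mean square per item
infit = ((X - P) ** 2).sum(axis=0) / var.sum(axis=0)       # information-weighted (infit)

# Guttman errors: pairs where an easier item is failed while a harder item is
# passed (items ordered from easy to hard by difficulty b)
order = np.argsort(b)
Xo = X[:, order]
guttman_errors = np.array([
    sum(Xo[p, i] < Xo[p, j] for i in range(n_items) for j in range(i + 1, n_items))
    for p in range(n_persons)
])

print("item outfit range:", outfit.min().round(2), "-", outfit.max().round(2))
print("mean Guttman errors per person:", guttman_errors.mean().round(2))
```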

  16. Using statistical models to explore ensemble uncertainty in climate impact studies: the example of air pollution in Europe

    NASA Astrophysics Data System (ADS)

    Lemaire, Vincent E. P.; Colette, Augustin; Menut, Laurent

    2016-03-01

    Because of its sensitivity to unfavorable weather patterns, air pollution is sensitive to climate change so that, in the future, a climate penalty could jeopardize the expected efficiency of air pollution mitigation measures. A common method to assess the impact of climate on air quality consists in implementing chemistry-transport models forced by climate projections. However, the computing cost of such methods requires optimizing ensemble exploration techniques. By using a training data set from a deterministic projection of climate and air quality over Europe, we identified the main meteorological drivers of air quality for eight regions in Europe and developed statistical models that could be used to predict air pollutant concentrations. The evolution of the key climate variables driving either particulate or gaseous pollution allows selecting the members of the EuroCordex ensemble of regional climate projections that should be used in priority for future air quality projections (CanESM2/RCA4; CNRM-CM5-LR/RCA4 and CSIRO-Mk3-6-0/RCA4 and MPI-ESM-LR/CCLM following the EuroCordex terminology). After having tested the validity of the statistical model in predictive mode, we can provide ranges of uncertainty attributed to the spread of the regional climate projection ensemble by the end of the century (2071-2100) for the RCP8.5. In the three regions where the statistical model of the impact of climate change on PM2.5 offers satisfactory performances, we find a climate benefit (a decrease of PM2.5 concentrations under future climate) of -1.08 (±0.21), -1.03 (±0.32), -0.83 (±0.14) µg m-3, for respectively Eastern Europe, Mid-Europe and Northern Italy. In the British-Irish Isles, Scandinavia, France, the Iberian Peninsula and the Mediterranean, the statistical model is not considered skillful enough to draw any conclusion for PM2.5. In Eastern Europe, France, the Iberian Peninsula, Mid-Europe and Northern Italy, the statistical model of the impact of climate change on ozone was considered satisfactory and it confirms the climate penalty bearing upon ozone of 10.51 (±3.06), 11.70 (±3.63), 11.53 (±1.55), 9.86 (±4.41), 4.82 (±1.79) µg m-3, respectively. In the British-Irish Isles, Scandinavia and the Mediterranean, the skill of the statistical model was not considered robust enough to draw any conclusion for ozone pollution.

  17. The development and evaluation of the Australian child and adolescent recommended food score: a cross-sectional study

    PubMed Central

    2012-01-01

    Background Diet quality tools have been developed to assess the adequacy of dietary patterns for predicting future morbidity and mortality. This study describes the development and evaluation of a brief food-based diet quality index for use with children at the individual or population level. The Australian Child and Adolescent Recommended Food Score (ACARFS) was developed to reflect adherence to the Dietary Guidelines for Children and Adolescents in Australia and modelled on the approach of the US Recommended Food Score. Methods The ACARFS has eight sub-scales and is scored from zero to 73. The diet quality score was evaluated by assessing correlation (Spearman's correlations) and agreement (weighted κ statistics) between ACARFS scores and nutrient intakes, derived from a food frequency questionnaire in 691 children (mean age 11.0, SD 1.1) in New South Wales, Australia. Nutrient intakes for ACARFS quartiles were compared with the relevant Australian nutrient reference values. Results ACARFS showed slight to substantial agreement (κ 0.13-0.64) with nutrient intakes, with statistically significant moderate to strong positive correlations with all vitamins, minerals and energy intake (r = 0.42-0.70). ACARFS was not related to BMI. Participants who scored less than the median ACARFS were more likely to have sub-optimal intakes of fibre, folic acid and calcium. Conclusion ACARFS demonstrated sufficient accuracy for use in future studies evaluating diet quality. Future research on its utility in targeting improvements in the nutritional quality of usual eating habits of children and adolescents is warranted. PMID:23164095

  18. Optimism on quality of life in Portuguese chronic patients: moderator/mediator?

    PubMed

    Vilhena, Estela; Pais-Ribeiro, José; Silva, Isabel; Pedro, Luísa; Meneses, Rute F; Cardoso, Helena; Silva, António Martins da; Mendonça, Denisa

    2014-07-01

    Optimism is an important variable that has consistently been shown to affect adjustment to quality of life in chronic diseases. This study aims to clarify whether dispositional optimism exerts a moderating or a mediating influence on the personality traits-quality of life association in Portuguese chronic patients. Multiple regression models were used to test the moderation and mediation effects of dispositional optimism on quality of life. A sample of 729 patients was recruited in Portugal's main hospitals and completed self-reported questionnaires assessing socio-demographic and clinical variables, personality, dispositional optimism, quality of life (QoL) and subjective well-being (SWB). The results of the regression models showed that dispositional optimism did not moderate the relationships between personality traits and quality of life. After controlling for gender, age, education level and severity of disease perception, the effects of personality traits on QoL and SWB were mediated by dispositional optimism (partially and completely), except for the links between neuroticism/openness to experience and physical health. Dispositional optimism is more likely to play a mediating, rather than a moderating, role in the personality traits-quality of life pathway in Portuguese chronic patients, suggesting that "the expectation that good things will happen" contributes to a better quality of life and subjective well-being.

  19. Planning Risk-Based SQC Schedules for Bracketed Operation of Continuous Production Analyzers.

    PubMed

    Westgard, James O; Bayat, Hassan; Westgard, Sten A

    2018-02-01

    To minimize patient risk, "bracketed" statistical quality control (SQC) is recommended in the new CLSI guidelines for SQC (C24-Ed4). Bracketed SQC requires that a QC event both precedes and follows (brackets) a group of patient samples. In optimizing a QC schedule, the frequency of QC or run size becomes an important planning consideration to maintain quality and also facilitate responsive reporting of results from continuous operation of high production analytic systems. Different plans for optimizing a bracketed SQC schedule were investigated on the basis of Parvin's model for patient risk and CLSI C24-Ed4's recommendations for establishing QC schedules. A Sigma-metric run size nomogram was used to evaluate different QC schedules for processes of different sigma performance. For high Sigma performance, an effective SQC approach is to employ a multistage QC procedure utilizing a "startup" design at the beginning of production and a "monitor" design periodically throughout production. Example QC schedules are illustrated for applications with measurement procedures having 6-σ, 5-σ, and 4-σ performance. Continuous production analyzers that demonstrate high σ performance can be effectively controlled with multistage SQC designs that employ a startup QC event followed by periodic monitoring or bracketing QC events. Such designs can be optimized to minimize the risk of harm to patients. © 2017 American Association for Clinical Chemistry.
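
    The Sigma-metric that drives run-size planning can be computed directly, as sketched below; the TEa, bias, and CV values are illustrative, and the mapping from a Sigma value to a specific QC rule and run size is a rough placeholder for the nomogram discussed in the paper.

```python
# Hedged sketch of the Sigma-metric calculation underlying bracketed SQC planning.
# The allowable total error (TEa), bias, and CV values are illustrative; the rule
# suggestions are crude placeholders for the paper's nomogram and tables.
def sigma_metric(tea_pct: float, bias_pct: float, cv_pct: float) -> float:
    """Sigma = (TEa - |bias|) / CV, all expressed in percent."""
    return (tea_pct - abs(bias_pct)) / cv_pct

examples = {
    "assay A": dict(tea_pct=10.0, bias_pct=1.0, cv_pct=1.5),   # ~6-sigma
    "assay B": dict(tea_pct=10.0, bias_pct=1.5, cv_pct=1.7),   # ~5-sigma
    "assay C": dict(tea_pct=10.0, bias_pct=2.0, cv_pct=2.0),   # ~4-sigma
}
for name, kw in examples.items():
    s = sigma_metric(**kw)
    # crude guidance consistent with the multistage idea: higher sigma allows
    # simpler monitor designs and longer runs between bracketing QC events
    design = "simple startup rule, infrequent monitor QC" if s >= 6 else \
             "multirule startup, periodic monitor QC" if s >= 5 else \
             "multirule with more frequent bracketing QC"
    print(f"{name}: sigma = {s:.1f} -> {design}")
```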

  20. Evaluation of automatic exposure control system chamber for the dose optimization when examining pelvic in digital radiography.

    PubMed

    Kim, Sung-Chul; Lee, Hae-Kag; Lee, Yang-Sub; Cho, Jae-Hwan

    2015-01-01

    We sought the activation combination of automatic exposure control (AEC) chambers that optimizes image quality and reduces patient exposure dose for anteroposterior pelvic examinations, using a standard anthropomorphic phantom. Seven activation combinations of the AEC chambers were defined. For each combination, the effective dose was derived from five repeated measurements. Five radiologists with more than five years of experience evaluated the images through a picture archiving and communication system in a double-blind test, rating six anatomical sites on a 3-point scale (improper, proper, perfect). The effective dose was highest, 0.287 mSv, when only the central chamber was activated, and lowest, 0.165 mSv, when only the top-left chamber was used. The subjective evaluation by the five panel members showed no statistically meaningful difference among the seven chamber combinations, and all yielded good image quality. For anteroposterior pelvic examinations with digital radiography, patient exposure dose can therefore be reduced by activating the top-right chamber alone or the two top chambers.

  1. Protein structure modeling for CASP10 by multiple layers of global optimization.

    PubMed

    Joo, Keehyoung; Lee, Juyong; Sim, Sangjin; Lee, Sun Young; Lee, Kiho; Heo, Seungryong; Lee, In-Ho; Lee, Sung Jong; Lee, Jooyoung

    2014-02-01

    In the template-based modeling (TBM) category of CASP10 experiment, we introduced a new protocol called protein modeling system (PMS) to generate accurate protein structures in terms of side-chains as well as backbone trace. In the new protocol, a global optimization algorithm, called conformational space annealing (CSA), is applied to the three layers of TBM procedure: multiple sequence-structure alignment, 3D chain building, and side-chain re-modeling. For 3D chain building, we developed a new energy function which includes new distance restraint terms of Lorentzian type (derived from multiple templates), and new energy terms that combine (physical) energy terms such as dynamic fragment assembly (DFA) energy, DFIRE statistical potential energy, hydrogen bonding term, etc. These physical energy terms are expected to guide the structure modeling especially for loop regions where no template structures are available. In addition, we developed a new quality assessment method based on random forest machine learning algorithm to screen templates, multiple alignments, and final models. For TBM targets of CASP10, we find that, due to the combination of three stages of CSA global optimizations and quality assessment, the modeling accuracy of PMS improves at each additional stage of the protocol. It is especially noteworthy that the side-chains of the final PMS models are far more accurate than the models in the intermediate steps. Copyright © 2013 Wiley Periodicals, Inc.

  2. The Effect of Hydration on the Voice Quality of Future Professional Vocal Performers.

    PubMed

    van Wyk, Liezl; Cloete, Mariaan; Hattingh, Danel; van der Linde, Jeannie; Geertsema, Salome

    2017-01-01

    The application of systemic hydration as an instrument for optimal voice quality has been a common practice among professional voice users over the years. Although the physiological action has been determined, the benefits for acoustic and perceptual characteristics are relatively unknown. The present study aimed to determine whether systemic hydration has beneficial outcomes on the voice quality of future professional voice users. A within-subject, pretest-posttest design was applied to obtain quantitative results from female singing students between 18 and 32 years of age without a history of voice pathology. Acoustic and perceptual data were collected before and after a 2-hour singing rehearsal. The difference between the hypohydrated condition (control) and the hydrated condition (experimental), and the relationship between adequate hydration and acoustic and perceptual parameters of voice, were then investigated. A statistically significant (P = 0.041) increase in jitter values was obtained for the hypohydrated condition. Increased maximum phonation time (MPT /z/) and a higher maximum frequency under hydration indicated further statistically significant changes in voice quality (P = 0.028 and P = 0.015, respectively). Systemic hydration has positive outcomes on perceptual and acoustic parameters of voice quality for future professional singers. The singer's ability to sustain notes for longer and reach higher frequencies may reflect well in performances. Any positive change in voice quality may benefit the singer's occupational success and subsequently their social, emotional, and vocational well-being. More research evidence is needed to determine the parameters for implementing adequate hydration in vocal hygiene programs. Copyright © 2017 The Voice Foundation. Published by Elsevier Inc. All rights reserved.

  3. Clinical Considerations of Adapted Drilling Protocol by Bone Quality Perception.

    PubMed

    Toia, Marco; Stocchero, Michele; Cecchinato, Francesca; Corrà, Enrico; Jimbo, Ryo; Cecchinato, Denis

    To evaluate insertion torque value (ITV) and marginal bone loss (MBL) of an implant system after a clinically perceived bone quality-adapted drilling. This multicenter retrospective study included patients treated with implants, conventionally loaded, in completely healed sites. Operators customized the osteotomy preparation according to radiographic assessment and their perception of bone quality. Drilling sequence, bone quality, and ITV were recorded at the time of surgery. Radiographs were taken at the time of implant placement and permanent restoration. MBL between implant placement and permanent restoration was calculated. The implant was used as the statistical unit. Demographic and implant characteristics were shown by means of descriptive statistics. Outcome values were compared using analysis of variance (ANOVA) and Kruskal-Wallis tests. Multiple regression models were used to test the effect of independent variables on ITV and MBL. One hundred eighty-eight implants placed in 87 patients were included in the analysis. The mean observation period was 144 ± 59 days. The mean ITV was 30.8 ± 15.1 Ncm. ITV differed significantly based on arches (mandible/maxilla) (P = .001), bone quality (P < .001), implant diameter (P = .032), and drilling protocol (P = .019). Median MBL was 0.05 mm (0.00; 0.24). A significant difference was found between the mandible and maxilla (P = .008) and between drilling protocols (P = .011). In particular, significantly higher MBL was found in the undersized drilling protocol. Multiple regression analysis showed that ITV was influenced by bone quality and implant diameter. MBL was influenced by bone quality, implant diameter, ITV, and the interaction between bone quality and ITV. It was estimated that MBL was greater with increased bone density and ITV. Excessive ITV in dense bone can cause negative marginal bone responses. A presurgical radiographic assessment and the perception of bone quality are necessary to select an optimal drilling protocol and to minimize surgical trauma.

  4. Essential elements of professional nursing environments in Primary Care and their influence on the quality of care.

    PubMed

    Gea-Caballero, Vicente; Castro-Sánchez, Enrique; Júarez-Vela, Raúl; Díaz-Herrera, Miguel Ángel; de Miguel-Montoya, Isabel; Martínez-Riera, José Ramón

    Nursing work environments are key determinants of care quality. Our study aimed to evaluate the characteristics of nursing environments in primary care settings in the Canary Islands, and identify crucial components of such environments to improve quality. We conducted a cross-sectional study in primary care organisations using the Practice Environment Scale - Nursing Work Index tool. We collected sociodemographic variables, scores, and selected the essential items conducive to optimal care. Appropriate parametric and non-parametric statistical tests were used to analyse relations between variables (CI = 95%, error = 5%). One hundred and forty-four nurses participated. The mean total score was 81.6. The results for the five dimensions included in the Practice Environment Scale - Nursing Work Index ranged from 2.25 - 2.92 (Mean). Twelve key items for quality of care were selected; six were positive in the Canary Islands, two were mixed, and four negative. 7/12 items were included in Dimension 2 (fundamentals of nursing). Being a manager was statistically associated with higher scores (p<.000). Years of experience was inversely associated with scores in the 12 items (p<.021). Nursing work environments in primary care settings in the Canary Islands are comparable to others previously studied in Spain. Areas to improve were human resources and participation of nurses in management decisions. Nurse managers must be knowledgeable about their working environments so they can focus on improvements in key dimensions. Copyright © 2017 Elsevier España, S.L.U. All rights reserved.

  5. Optimization of cold-adapted lysozyme production from the psychrophilic yeast Debaryomyces hansenii using statistical experimental methods.

    PubMed

    Wang, Quanfu; Hou, Yanhua; Yan, Peisheng

    2012-06-01

    Statistical experimental designs were employed to optimize culture conditions for cold-adapted lysozyme production by the psychrophilic yeast Debaryomyces hansenii. In the first step of optimization, using a Plackett-Burman design (PBD), peptone, glucose, temperature, and NaCl were identified as significant variables affecting lysozyme production; the formulation was then further optimized using a four-factor central composite design (CCD) to understand their interactions and to determine their optimal levels. A quadratic model was developed and validated. Compared with the initial level (18.8 U/mL), the maximum lysozyme production observed under the optimized conditions (65.8 U/mL) represented an approximately 3.5-fold increase. This is the first time cold-adapted lysozyme production has been optimized using statistical experimental methods. A 3.5-fold enhancement of microbial lysozyme was gained after optimization. Such improved production will facilitate the application of microbial lysozyme. Thus, D. hansenii lysozyme may be a good new resource for the industrial production of cold-adapted lysozymes. © 2012 Institute of Food Technologists®
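
    To illustrate the response-surface step that follows such a Plackett-Burman screen, the sketch below fits a second-order (quadratic) model to central composite design data by least squares and locates its stationary point. The two coded factors and the activity values are synthetic stand-ins, not the study's data:

    ```python
    # Minimal sketch of response-surface modelling after a screening design:
    # fit y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2 to
    # central-composite-design data and locate the predicted optimum.
    import numpy as np

    # Coded settings of a two-factor face-centred CCD plus centre points.
    X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1],
                  [-1, 0], [1, 0], [0, -1], [0, 1],
                  [0, 0], [0, 0], [0, 0]], dtype=float)
    # Hypothetical lysozyme activities (U/mL) at each design point.
    y = np.array([42.0, 55.0, 48.0, 50.0, 52.0, 58.0, 47.0, 51.0, 64.0, 63.0, 65.0])

    x1, x2 = X[:, 0], X[:, 1]
    A = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1 ** 2, x2 ** 2])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    b0, b1, b2, b12, b11, b22 = coef

    # Stationary point of the fitted quadratic surface (candidate optimum).
    H = np.array([[2 * b11, b12], [b12, 2 * b22]])
    x_opt = np.linalg.solve(H, -np.array([b1, b2]))
    print("fitted coefficients:", np.round(coef, 2))
    print("stationary point (coded units):", np.round(x_opt, 2))
    ```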

  6. How much to trust the senses: Likelihood learning

    PubMed Central

    Sato, Yoshiyuki; Kording, Konrad P.

    2014-01-01

    Our brain often needs to estimate unknown variables from imperfect information. Our knowledge about the statistical distributions of quantities in our environment (called priors) and currently available information from sensory inputs (called likelihood) are the basis of all Bayesian models of perception and action. While we know that priors are learned, most studies of prior-likelihood integration simply assume that subjects know about the likelihood. However, as the quality of sensory inputs changes over time, we also need to learn about new likelihoods. Here, we show that human subjects readily learn the distribution of visual cues (likelihood function) in a way that can be predicted by models of statistically optimal learning. Using a likelihood that depended on color context, we found that a learned likelihood generalized to new priors. Thus, we conclude that subjects learn about likelihood. PMID:25398975
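
    For Gaussian priors and likelihoods, the statistically optimal combination assumed by such models is a reliability-weighted average. A minimal worked sketch, with made-up numbers, is:

    ```python
    # Bayes-optimal integration of a Gaussian prior with a Gaussian likelihood:
    # the posterior mean is an inverse-variance-weighted average, and the
    # posterior variance is smaller than either source alone. Numbers are
    # illustrative only.

    def combine(prior_mean, prior_var, sensed_value, likelihood_var):
        w_prior = 1.0 / prior_var
        w_like = 1.0 / likelihood_var
        post_mean = (w_prior * prior_mean + w_like * sensed_value) / (w_prior + w_like)
        post_var = 1.0 / (w_prior + w_like)
        return post_mean, post_var

    # A reliable cue (small likelihood variance) pulls the estimate toward the
    # sensory measurement; an unreliable cue leaves it near the prior.
    print(combine(prior_mean=0.0, prior_var=1.0, sensed_value=2.0, likelihood_var=0.25))
    print(combine(prior_mean=0.0, prior_var=1.0, sensed_value=2.0, likelihood_var=4.0))
    ```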

  7. Statistically optimal perception and learning: from behavior to neural representations

    PubMed Central

    Fiser, József; Berkes, Pietro; Orbán, Gergő; Lengyel, Máté

    2010-01-01

    Human perception has recently been characterized as statistical inference based on noisy and ambiguous sensory inputs. Moreover, suitable neural representations of uncertainty have been identified that could underlie such probabilistic computations. In this review, we argue that learning an internal model of the sensory environment is another key aspect of the same statistical inference procedure and thus perception and learning need to be treated jointly. We review evidence for statistically optimal learning in humans and animals, and reevaluate possible neural representations of uncertainty based on their potential to support statistically optimal learning. We propose that spontaneous activity can have a functional role in such representations leading to a new, sampling-based, framework of how the cortex represents information and uncertainty. PMID:20153683

  8. Preparing systems engineering and computing science students in disciplined methods, quantitative, and advanced statistical techniques to improve process performance

    NASA Astrophysics Data System (ADS)

    McCray, Wilmon Wil L., Jr.

    The research was prompted by a need to assess the process improvement, quality management, and analytical techniques taught to undergraduate and graduate students in U.S. college and university systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods, and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks to improve business process performance. High maturity process areas in the CMMI model imply the use of analytical, statistical, and quantitative management techniques and process performance modeling to identify and eliminate sources of variation, continually improve process performance, reduce cost, and predict future outcomes. The research study identifies and provides a detailed discussion of the gap analysis findings on process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison analysis identifying the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. The research also includes a Monte Carlo simulation optimization model and dashboard that demonstrates the use of statistical methods, statistical process control, sensitivity analysis, quantitative and optimization techniques to establish a baseline and predict future customer satisfaction index scores (outcomes). The American Customer Satisfaction Index (ACSI) model and industry benchmarks were used as a framework for the simulation model.

  9. Relationship between environmental management with quality of kampong space room (Case study: RW 3 of Sukun Sub District, Malang City)

    NASA Astrophysics Data System (ADS)

    Wardhani, D. K.; Azmi, D. S.; Purnamasari, W. D.

    2017-06-01

    RW 3 Sukun, Malang, is a kampong that won a kampong environment competition and has managed to preserve its character. The community of RW 3 Sukun undertakes various activities to manage the environment by optimizing the use of kampong space. Although RW 3 Sukun has conducted environmental management activities, several locations in the kampong space remain poorly maintained. The purpose of this research was to determine the relation between environmental management and the quality of kampong space in RW 3 Sukun. This research used qualitative and quantitative approaches. The quantitative research assessed the quality of kampong space using descriptive statistical analysis with weighting, scoring, and map overlays, and analysed the relation between environmental management and the quality of kampong space using typology analysis and Pearson correlation analysis. The qualitative research addressed the analysis of environmental management and its relation to the quality of kampong space. The results of this research indicate that environmental management in RW 3 Sukun is related to the quality of kampong space.

  10. The Impact of Optimal Respiratory Gating and Image Noise on Evaluation of Intratumor Heterogeneity on 18F-FDG PET Imaging of Lung Cancer.

    PubMed

    Grootjans, Willem; Tixier, Florent; van der Vos, Charlotte S; Vriens, Dennis; Le Rest, Catherine C; Bussink, Johan; Oyen, Wim J G; de Geus-Oei, Lioe-Fee; Visvikis, Dimitris; Visser, Eric P

    2016-11-01

    Accurate measurement of intratumor heterogeneity using parameters of texture on PET images is essential for precise characterization of cancer lesions. In this study, we investigated the influence of respiratory motion and varying noise levels on quantification of textural parameters in patients with lung cancer. We used an optimal-respiratory-gating algorithm on the list-mode data of 60 lung cancer patients who underwent 18 F-FDG PET. The images were reconstructed using a duty cycle of 35% (percentage of the total acquired PET data). In addition, nongated images of varying statistical quality (using 35% and 100% of the PET data) were reconstructed to investigate the effects of image noise. Several global image-derived indices and textural parameters (entropy, high-intensity emphasis, zone percentage, and dissimilarity) that have been associated with patient outcome were calculated. The clinical impact of optimal respiratory gating and image noise on assessment of intratumor heterogeneity was evaluated using Cox regression models, with overall survival as the outcome measure. The threshold for statistical significance was adjusted for multiple comparisons using Bonferroni correction. In the lower lung lobes, respiratory motion significantly affected quantification of intratumor heterogeneity for all textural parameters (P < 0.007) except entropy (P > 0.007). The mean increase in entropy, dissimilarity, zone percentage, and high-intensity emphasis was 1.3% ± 1.5% (P = 0.02), 11.6% ± 11.8% (P = 0.006), 2.3% ± 2.2% (P = 0.002), and 16.8% ± 17.2% (P = 0.006), respectively. No significant differences were observed for lesions in the upper lung lobes (P > 0.007). Differences in the statistical quality of the PET images affected the textural parameters less than respiratory motion, with no significant difference observed. The median follow-up time was 35 mo (range, 7-39 mo). In multivariate analysis for overall survival, total lesion glycolysis and high-intensity emphasis were the two most relevant image-derived indices and were considered to be independent significant covariates for the model regardless of the image type considered. The tested textural parameters are robust in the presence of respiratory motion artifacts and varying levels of image noise. © 2016 by the Society of Nuclear Medicine and Molecular Imaging, Inc.

  11. Clustering and Flow Conservation Monitoring Tool for Software Defined Networks

    PubMed Central

    Puente Fernández, Jesús Antonio

    2018-01-01

    Prediction systems face challenges on two fronts: relating video quality to observed session features, and coping with dynamic changes in video quality. Software Defined Networking (SDN) is a new network architecture concept that separates the control plane (controller) from the data plane (switches) in network devices. Through the southbound interface, monitoring tools can be deployed to obtain the network status and retrieve a collection of statistics. Achieving the most accurate statistics therefore depends on the strategy used to monitor and request information from network devices. In this paper, we propose an enhanced algorithm for requesting statistics to measure traffic flow in SDN networks. The algorithm groups network switches into clusters according to their number of ports so that different monitoring techniques can be applied to each cluster. By avoiding monitoring queries to switches with common characteristics, the grouping omits redundant information. In this way, the proposal decreases the number of monitoring queries to switches, improving network traffic and preventing switch overload. We tested our optimization in a video streaming simulation using different types of videos. The experiments and the comparison with traditional monitoring techniques demonstrate the feasibility of our proposal, which maintains similar measurement values while decreasing the number of queries to the switches. PMID:29614049
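
    A minimal sketch of the grouping idea follows; the switch inventory and the polling stub are hypothetical placeholders, not the authors' implementation:

    ```python
    # Group switches by their number of ports and poll one representative per
    # group instead of querying every switch, reducing monitoring traffic.
    from collections import defaultdict

    switches = [
        {"id": "s1", "ports": 4}, {"id": "s2", "ports": 4},
        {"id": "s3", "ports": 8}, {"id": "s4", "ports": 8},
        {"id": "s5", "ports": 8}, {"id": "s6", "ports": 24},
    ]

    def poll_statistics(switch_id):
        # Placeholder for a southbound (e.g., OpenFlow) statistics request.
        print(f"requesting flow statistics from {switch_id}")

    clusters = defaultdict(list)
    for sw in switches:
        clusters[sw["ports"]].append(sw["id"])

    for ports, members in clusters.items():
        poll_statistics(members[0])  # query a single representative per cluster
        print(f"  reused for {len(members)} switch(es) with {ports} ports")
    ```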

  12. Pressure Injury Prevention: Knowledge and Attitudes of Iranian Intensive Care Nurses.

    PubMed

    Tirgari, Batool; Mirshekari, Leili; Forouzi, Mansooreh Azzizadeh

    2018-04-01

    Pressure injuries are the third most expensive condition after cancer and cardiovascular disease. Nurses are responsible for the direct and continuous care, treatment, and prevention of pressure injuries. To achieve optimal quality care, nursing knowledge and attitudes must be based on the best scientific evidence. This study aimed to examine the knowledge and attitudes of nurses working in intensive care units of hospitals affiliated with Zahedan Medical Sciences University regarding the prevention of pressure injuries. This was a descriptive analytic study involving 89 critical care nurses. Data were collected using a 3-part questionnaire covering demographic data, knowledge, and attitudes of intensive care nurses toward the prevention of pressure injuries. Collected data were analyzed with SPSS version 19 (IBM, Armonk, New York), using descriptive and inferential statistics (such as Pearson correlation coefficient, independent t test, and analysis of variance). The results showed that the mean ± SD score of pressure injury knowledge was 0.44 ± 0.12, and the attitude of participants toward pressure injury prevention was 2.69 ± 0.47. Scores varied widely; "nutrition" showed the highest mean score (0.71 ± 0.45), but "etiology and development" (0.42 ± 0.21) and "classification and observation" (0.42 ± 0.24) showed the lowest mean scores. Of the different aspects of attitudes toward pressure injury prevention, "the impact of pressure injuries" showed the highest mean score (2.95 ± 0.56), and "confidence in the effectiveness of prevention" showed the lowest mean score (2.56 ± 0.46). A statistically significant relationship was observed between pressure injury knowledge and attitudes toward pressure injury prevention (P < .001). Pressure injury prevention is one of many nursing care priorities and is a key indicator of the quality of nursing care. In order to achieve optimal quality care in this area, nurse managers and other administrators should make efforts to improve nursing knowledge and attitudes based on the latest scientific evidence for pressure injury prevention.

  13. Risk Assessment Integrated QbD Approach for Development of Optimized Bicontinuous Mucoadhesive Limicubes for Oral Delivery of Rosuvastatin.

    PubMed

    Javed, Md Noushad; Kohli, Kanchan; Amin, Saima

    2018-04-01

    Statins are widely prescribed for hyperlipidemia, cancer, and Alzheimer's disease but face inherent challenges such as low solubility and drug loading, high hepatic metabolism, and instability at gastric pH. Consequently, the relatively high circulating doses required to exert therapeutic benefit lead to severe dose-mediated toxicity. Furthermore, given the low biocompatibility, high toxicity, and regulatory caveats of conventional formulations, such as product conformity, reproducibility, and stability, and the preferentially higher bioabsorption of lipids in a favorable cuboidal geometry, the in vivo biopharmaceutical performance of Rosuvastatin could be improved in Quality by Design (QbD)-integrated cuboidal-shaped mucoadhesive microcrystalline delivery systems (Limicubes). Here, the quality target product profile (QTPP), critical quality attributes (CQAs), an Ishikawa fishbone diagram, and risk management through a risk assessment matrix for failure mode and effects analysis (FMEA), followed by statistical processing of a Plackett-Burman design matrix, established for the first time an approach to substantiate that controlling the levels of only three screened independent process variables, i.e., Monoolein (B = 800-1100 μL), Poloxamer (C = 150-200 mg), and stirring speed (F = 700-1000 rpm), was statistically significant for modulating and improving biopharmaceutical performance through the key attributes of average particle size (Y1 = 1.40-2.70 μm), entrapment efficiency (Y2 = 62.60-88.80%), and drug loading (Y3 = 0.817-1.15%) in a QbD-enabled process. The optimized Limicubes exhibited an average particle size of 1.8 ± 0.2 μm, entrapment efficiency of 80.32 ± 2.88%, and drug loading of 0.93 ± 0.08% at levels of 1100 μL (+1), 200 mg (+1), and 700 rpm (-1) for Monoolein, Poloxamer, and stirring speed, respectively.

  14. Humans make efficient use of natural image statistics when performing spatial interpolation.

    PubMed

    D'Antona, Anthony D; Perry, Jeffrey S; Geisler, Wilson S

    2013-12-16

    Visual systems learn through evolution and experience over the lifespan to exploit the statistical structure of natural images when performing visual tasks. Understanding which aspects of this statistical structure are incorporated into the human nervous system is a fundamental goal in vision science. To address this goal, we measured human ability to estimate the intensity of missing image pixels in natural images. Human estimation accuracy is compared with various simple heuristics (e.g., local mean) and with optimal observers that have nearly complete knowledge of the local statistical structure of natural images. Human estimates are more accurate than those of simple heuristics, and they match the performance of an optimal observer that knows the local statistical structure of relative intensities (contrasts). This optimal observer predicts the detailed pattern of human estimation errors and hence the results place strong constraints on the underlying neural mechanisms. However, humans do not reach the performance of an optimal observer that knows the local statistical structure of the absolute intensities, which reflect both local relative intensities and local mean intensity. As predicted from a statistical analysis of natural images, human estimation accuracy is negligibly improved by expanding the context from a local patch to the whole image. Our results demonstrate that the human visual system exploits efficiently the statistical structure of natural images.
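
    For illustration, the sketch below implements two of the simple heuristics that such studies compare humans against, a local mean and an inverse-distance-weighted average, on a synthetic patch; it is not the optimal observer itself:

    ```python
    # Estimate a missing centre pixel from its surrounding patch using the
    # local mean and inverse-distance weighting. The patch is synthetic.
    import numpy as np

    rng = np.random.default_rng(0)
    patch = rng.integers(0, 256, size=(5, 5)).astype(float)
    cy = cx = 2                      # location of the "missing" pixel
    true_value = patch[cy, cx]
    patch[cy, cx] = np.nan

    mask = ~np.isnan(patch)
    ys, xs = np.nonzero(mask)
    values = patch[ys, xs]

    local_mean = values.mean()                        # heuristic 1: local mean
    dist = np.hypot(ys - cy, xs - cx)
    idw = np.sum(values / dist) / np.sum(1.0 / dist)  # heuristic 2: IDW average

    print(f"true {true_value:.0f}  local mean {local_mean:.1f}  IDW {idw:.1f}")
    ```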

  15. Approach for Uncertainty Propagation and Robust Design in CFD Using Sensitivity Derivatives

    NASA Technical Reports Server (NTRS)

    Putko, Michele M.; Newman, Perry A.; Taylor, Arthur C., III; Green, Lawrence L.

    2001-01-01

    This paper presents an implementation of the approximate statistical moment method for uncertainty propagation and robust optimization for a quasi 1-D Euler CFD (computational fluid dynamics) code. Given uncertainties in statistically independent, random, normally distributed input variables, a first- and second-order statistical moment matching procedure is performed to approximate the uncertainty in the CFD output. Efficient calculation of both first- and second-order sensitivity derivatives is required. In order to assess the validity of the approximations, the moments are compared with statistical moments generated through Monte Carlo simulations. The uncertainties in the CFD input variables are also incorporated into a robust optimization procedure. For this optimization, statistical moments involving first-order sensitivity derivatives appear in the objective function and system constraints. Second-order sensitivity derivatives are used in a gradient-based search to successfully execute a robust optimization. The approximate methods used throughout the analyses are found to be valid when considering robustness about input parameter mean values.
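
    A minimal sketch of the first-order moment-matching idea, using finite-difference sensitivity derivatives on a toy function rather than the Euler CFD code, is:

    ```python
    # First-order statistical moment propagation: for independent normal inputs,
    # mean(f) ~ f(mu) and var(f) ~ sum_i (df/dx_i)^2 * sigma_i^2, with the
    # sensitivity derivatives taken here by central finite differences.
    import numpy as np

    def f(x):
        # Toy response standing in for a CFD output functional.
        return x[0] ** 2 + 3.0 * np.sin(x[1]) + x[0] * x[1]

    mu = np.array([1.0, 0.5])       # input means
    sigma = np.array([0.1, 0.05])   # input standard deviations

    h = 1e-6
    grad = np.array([(f(mu + h * e) - f(mu - h * e)) / (2 * h)
                     for e in np.eye(len(mu))])

    mean_f = f(mu)
    var_f = np.sum(grad ** 2 * sigma ** 2)
    print(f"approx mean = {mean_f:.4f}, approx std = {np.sqrt(var_f):.4f}")

    # Monte Carlo check of the approximation, as done for validation.
    samples = np.random.default_rng(1).normal(mu, sigma, size=(200_000, 2))
    mc = np.apply_along_axis(f, 1, samples)
    print(f"MC mean     = {mc.mean():.4f}, MC std     = {mc.std():.4f}")
    ```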

  16. Phytochemicals to optimize cognitive function for military mission-readiness: a systematic review and recommendations for the field.

    PubMed

    Teo, Lynn; Crawford, Cindy; Snow, James; Deuster, Patricia A; Bingham, John J; Gallon, Matthew D; O'Connell, Meghan L; Chittum, Holly K; Arzola, Sonya M; Berry, Kevin

    2017-06-01

    Optimizing cognitive performance and preventing cognitive impairments that result from exposure to high-stress situations are important to ensure mission-readiness for military personnel. This systematic review assesses the quality of the evidence for plant-based foods and beverages, or their phytochemical constituents, across various outcomes related to cognitive function in healthy adult populations to develop research recommendations for the military. PubMed, CINAHL, Embase, PsycInfo, and the Cochrane Library were searched. Peer-reviewed randomized controlled trials published in the English language were eligible. Twenty-five trials were included and assessed for methodological quality, and descriptive data were extracted. The acceptable (n = 16) to high-quality (n = 4) studies produced either no statistically significant effect or mixed results for enhancing cognitive function. The evidence suggested that healthy populations do not experience significant changes in cognitive performance when consuming soy- and non-soy-sourced isoflavones or cocoa. Heterogeneity among other interventions precluded reaching formal conclusions surrounding the evidence. Research recommendations are offered, including conducting more studies on the effect of plant-based interventions on populations reflective of military populations when exposed to military-like situations. © The Author(s) 2017. Published by Oxford University Press on behalf of the International Life Sciences Institute. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  17. C-SPECT - a Clinical Cardiac SPECT/Tct Platform: Design Concepts and Performance Potential

    PubMed Central

    Chang, Wei; Ordonez, Caesar E.; Liang, Haoning; Li, Yusheng; Liu, Jingai

    2013-01-01

    Because of scarcity of photons emitted from the heart, clinical cardiac SPECT imaging is mainly limited by photon statistics. The sub-optimal detection efficiency of current SPECT systems not only limits the quality of clinical cardiac SPECT imaging but also makes more advanced potential applications difficult to be realized. We propose a high-performance system platform - C-SPECT, which has its sampling geometry optimized for detection of emitted photons in quality and quantity. The C-SPECT has a stationary C-shaped gantry that surrounds the left-front side of a patient’s thorax. The stationary C-shaped collimator and detector systems in the gantry provide effective and efficient detection and sampling of photon emission. For cardiac imaging, the C-SPECT platform could achieve 2 to 4 times the system geometric efficiency of conventional SPECT systems at the same sampling resolution. This platform also includes an integrated transmission CT for attenuation correction. The ability of C-SPECT systems to perform sequential high-quality emission and transmission imaging could bring cost-effective high-performance to clinical imaging. In addition, a C-SPECT system could provide high detection efficiency to accommodate fast acquisition rate for gated and dynamic cardiac imaging. This paper describes the design concepts and performance potential of C-SPECT, and illustrates how these concepts can be implemented in a basic system. PMID:23885129

  18. Comparison of the clinical outcomes between unattended home APAP and polysomnography manual titration in obstructive sleep apnea patients.

    PubMed

    Wongsritrang, Krongthong; Fueangkamloon, Sumet

    2013-09-01

    To compare the clinical outcomes and determine the difference in therapeutic pressure between Automatic positive airway pressure (APAP) and polysomnography manual titration. Fifty patients of obstructive sleep apnea (OSA), moderate to severe cases, were randomized into two groups of intervention: 95-percentile pressure derived from APAP titration and an optimal pressure derived from manual titration. Clinical outcomes were assessed before and after four weeks. The average 95-percentile pressure derived from APAP titration was 11.7 +/- 0.3 cmH2O with median mask leak 1.3 L/min. The average optimal pressure derived from manual titration was 8.2 +/- 0.3 cmH2O. Pearson correlation analysis showed weak positive correlation (r = 0.336, p = 0.017). The Epworth Sleepiness Score (ESS), Quality of life tests: PSQI (Pittsburg Sleep Quality Index), and SF-36 (Medical Outcomes Study 36-Item Short-Form Health Survey) were improved significantly in both groups, but there were no statistical significant differences between groups. An APAP titration is an effective method of pressure determination for conventional CPAP therapy and shows no difference in clinical outcomes comparing the standard titration.

  19. Structural Model for the Effects of Environmental Elements on the Psychological Characteristics and Performance of the Employees of Manufacturing Systems

    PubMed Central

    Realyvásquez, Arturo; Maldonado-Macías, Aidé Aracely; García-Alcaraz, Jorge; Cortés-Robles, Guillermo; Blanco-Fernández, Julio

    2016-01-01

    This paper analyzes the effects of environmental elements on the psychological characteristics and performance of employees in manufacturing systems using structural equation modeling. Increasing the comprehension of these effects may help optimize manufacturing systems regarding their employees’ psychological characteristics and performance from a macroergonomic perspective. As the method, a new macroergonomic compatibility questionnaire (MCQ) was developed and statistically validated, and 158 respondents at four manufacture companies were considered. Noise, lighting and temperature, humidity and air quality (THAQ) were used as independent variables and psychological characteristics and employees’ performance as dependent variables. To propose and test the hypothetical causal model of significant relationships among the variables, a data analysis was deployed. Results found that the macroergonomic compatibility of environmental elements presents significant direct effects on employees’ psychological characteristics and either direct or indirect effects on the employees’ performance. THAQ had the highest direct and total effects on psychological characteristics. Regarding the direct and total effects on employees’ performance, the psychological characteristics presented the highest effects, followed by THAQ conditions. These results may help measure and optimize manufacturing systems’ performance by enhancing their macroergonomic compatibility and quality of life at work of the employees. PMID:26742054

  20. Structural Model for the Effects of Environmental Elements on the Psychological Characteristics and Performance of the Employees of Manufacturing Systems.

    PubMed

    Realyvásquez, Arturo; Maldonado-Macías, Aidé Aracely; García-Alcaraz, Jorge; Cortés-Robles, Guillermo; Blanco-Fernández, Julio

    2016-01-05

    This paper analyzes the effects of environmental elements on the psychological characteristics and performance of employees in manufacturing systems using structural equation modeling. Increasing the comprehension of these effects may help optimize manufacturing systems regarding their employees' psychological characteristics and performance from a macroergonomic perspective. As the method, a new macroergonomic compatibility questionnaire (MCQ) was developed and statistically validated, and 158 respondents at four manufacture companies were considered. Noise, lighting and temperature, humidity and air quality (THAQ) were used as independent variables and psychological characteristics and employees' performance as dependent variables. To propose and test the hypothetical causal model of significant relationships among the variables, a data analysis was deployed. Results found that the macroergonomic compatibility of environmental elements presents significant direct effects on employees' psychological characteristics and either direct or indirect effects on the employees' performance. THAQ had the highest direct and total effects on psychological characteristics. Regarding the direct and total effects on employees' performance, the psychological characteristics presented the highest effects, followed by THAQ conditions. These results may help measure and optimize manufacturing systems' performance by enhancing their macroergonomic compatibility and quality of life at work of the employees.

  1. History matching through dynamic decision-making

    PubMed Central

    Maschio, Célio; Santos, Antonio Alberto; Schiozer, Denis; Rocha, Anderson

    2017-01-01

    History matching is the process of modifying the uncertain attributes of a reservoir model to reproduce the real reservoir performance. It is a classical reservoir engineering problem and plays an important role in reservoir management since the resulting models are used to support decisions in other tasks such as economic analysis and production strategy. This work introduces a dynamic decision-making optimization framework for history matching problems in which new models are generated based on, and guided by, the dynamic analysis of the data of available solutions. The optimization framework follows a ‘learning-from-data’ approach, and includes two optimizer components that use machine learning techniques, such as unsupervised learning and statistical analysis, to uncover patterns of input attributes that lead to good output responses. These patterns are used to support the decision-making process while generating new, and better, history matched solutions. The proposed framework is applied to a benchmark model (UNISIM-I-H) based on the Namorado field in Brazil. Results show the potential the dynamic decision-making optimization framework has for improving the quality of history matching solutions using a substantial smaller number of simulations when compared with a previous work on the same benchmark. PMID:28582413
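
    A 'learning-from-data' loop of this kind can be sketched as follows, with a synthetic misfit function and generic k-means clustering standing in for the authors' optimizer components and benchmark model:

    ```python
    # Cluster the attribute vectors of the best-matching models found so far and
    # propose new candidates around the most promising cluster.
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)

    def misfit(x):
        # Placeholder objective standing in for the history-matching misfit.
        return np.sum((x - np.array([0.3, 0.7, 0.5])) ** 2, axis=-1)

    # Evaluate an initial ensemble of candidate attribute vectors in [0, 1]^3.
    candidates = rng.uniform(0.0, 1.0, size=(200, 3))
    scores = misfit(candidates)

    # Keep the best 20% and look for patterns by clustering them.
    best = candidates[np.argsort(scores)[:40]]
    km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(best)

    # Generate new candidates by perturbing around the best cluster centre.
    centre = km.cluster_centers_[np.argmin(misfit(km.cluster_centers_))]
    new_candidates = np.clip(centre + rng.normal(0.0, 0.05, size=(50, 3)), 0.0, 1.0)
    print("best cluster centre:", np.round(centre, 3))
    print("mean misfit, old vs new:", round(float(scores.mean()), 3),
          round(float(misfit(new_candidates).mean()), 3))
    ```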

  2. Optimization of damping in the passive automotive suspension system with using two quarter-car models

    NASA Astrophysics Data System (ADS)

    Lozia, Z.; Zdanowicz, P.

    2016-09-01

    The paper presents the optimization of damping in the passive suspension system of a motor vehicle moving rectilinearly with a constant speed on a road with rough surface of random irregularities, described according to the ISO classification. Two quarter-car 2DoF models, linear and non-linear, were used; in the latter, nonlinearities of spring characteristics of the suspension system and pneumatic tyres, sliding friction in the suspension system, and wheel lift-off were taken into account. The smoothing properties of vehicle tyres were represented in both models. The calculations were carried out for three roads of different quality, with simulating four vehicle speeds. Statistical measures of vertical vehicle body vibrations and of changes in the vertical tyre/road contact force were used as the criteria of system optimization and model comparison. The design suspension displacement limit was also taken into account. The optimum suspension damping coefficient was determined and the impact of undesirable sliding friction in the suspension system on the calculation results was estimated. The results obtained make it possible to evaluate the impact of the structure and complexity of the model used on the results of the optimization.
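
    A hedged sketch of the optimization idea is shown below: a linear 2DoF quarter-car with generic parameter values (not those of the paper) is driven over a filtered-noise road, the damping coefficient is swept, and the two statistical measures of interest are reported:

    ```python
    # Linear 2-DoF quarter-car: sweep the suspension damping coefficient and
    # compare RMS body acceleration (comfort) with RMS tire-force variation
    # (road holding). Semi-implicit Euler integration; all values are generic.
    import numpy as np

    ms, mu = 300.0, 40.0           # sprung / unsprung mass [kg]
    ks, kt = 20_000.0, 180_000.0   # suspension / tire stiffness [N/m]

    def simulate(cs, dt=1e-3, t_end=20.0, seed=0):
        rng = np.random.default_rng(seed)
        n = int(t_end / dt)
        zr = np.zeros(n)           # random road: low-pass filtered white noise
        for i in range(1, n):
            zr[i] = 0.995 * zr[i - 1] + 0.002 * rng.normal()
        zs = zu = vs = vu = 0.0
        acc = np.empty(n)
        f_tire = np.empty(n)
        for i in range(n):
            a_s = (-ks * (zs - zu) - cs * (vs - vu)) / ms
            a_u = (ks * (zs - zu) + cs * (vs - vu) - kt * (zu - zr[i])) / mu
            acc[i], f_tire[i] = a_s, kt * (zu - zr[i])
            vs += a_s * dt; vu += a_u * dt      # semi-implicit Euler step
            zs += vs * dt;  zu += vu * dt
        return np.sqrt(np.mean(acc ** 2)), np.sqrt(np.mean(f_tire ** 2))

    for cs in (500.0, 1000.0, 2000.0, 4000.0):
        rms_acc, rms_force = simulate(cs)
        print(f"c = {cs:6.0f} N*s/m  RMS body acc = {rms_acc:6.3f} m/s^2  "
              f"RMS tire force = {rms_force:8.1f} N")
    ```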

  3. Statistical-QoS Guaranteed Energy Efficiency Optimization for Energy Harvesting Wireless Sensor Networks

    PubMed Central

    Cheng, Wenchi; Zhang, Hailin

    2017-01-01

    Energy harvesting, which offers a never-ending energy supply, has emerged as a prominent technology to prolong the lifetime and reduce costs for the battery-powered wireless sensor networks. However, how to improve the energy efficiency while guaranteeing the quality of service (QoS) for energy harvesting based wireless sensor networks is still an open problem. In this paper, we develop statistical delay-bounded QoS-driven power control policies to maximize the effective energy efficiency (EEE), which is defined as the spectrum efficiency under given specified QoS constraints per unit harvested energy, for energy harvesting based wireless sensor networks. For the battery-infinite wireless sensor networks, our developed QoS-driven power control policy converges to the Energy harvesting Water Filling (E-WF) scheme and the Energy harvesting Channel Inversion (E-CI) scheme under the very loose and stringent QoS constraints, respectively. For the battery-finite wireless sensor networks, our developed QoS-driven power control policy becomes the Truncated energy harvesting Water Filling (T-WF) scheme and the Truncated energy harvesting Channel Inversion (T-CI) scheme under the very loose and stringent QoS constraints, respectively. Furthermore, we evaluate the outage probabilities to theoretically analyze the performance of our developed QoS-driven power control policies. The obtained numerical results validate our analysis and show that our developed optimal power control policies can optimize the EEE over energy harvesting based wireless sensor networks. PMID:28832509

  4. Statistical-QoS Guaranteed Energy Efficiency Optimization for Energy Harvesting Wireless Sensor Networks.

    PubMed

    Gao, Ya; Cheng, Wenchi; Zhang, Hailin

    2017-08-23

    Energy harvesting, which offers a never-ending energy supply, has emerged as a prominent technology to prolong the lifetime and reduce costs for the battery-powered wireless sensor networks. However, how to improve the energy efficiency while guaranteeing the quality of service (QoS) for energy harvesting based wireless sensor networks is still an open problem. In this paper, we develop statistical delay-bounded QoS-driven power control policies to maximize the effective energy efficiency (EEE), which is defined as the spectrum efficiency under given specified QoS constraints per unit harvested energy, for energy harvesting based wireless sensor networks. For the battery-infinite wireless sensor networks, our developed QoS-driven power control policy converges to the Energy harvesting Water Filling (E-WF) scheme and the Energy harvesting Channel Inversion (E-CI) scheme under the very loose and stringent QoS constraints, respectively. For the battery-finite wireless sensor networks, our developed QoS-driven power control policy becomes the Truncated energy harvesting Water Filling (T-WF) scheme and the Truncated energy harvesting Channel Inversion (T-CI) scheme under the very loose and stringent QoS constraints, respectively. Furthermore, we evaluate the outage probabilities to theoretically analyze the performance of our developed QoS-driven power control policies. The obtained numerical results validate our analysis and show that our developed optimal power control policies can optimize the EEE over energy harvesting based wireless sensor networks.
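
    Under very loose QoS constraints the developed policies reduce to water filling. A minimal sketch of classic water-filling power allocation, without the QoS or battery refinements described in the papers and with illustrative channel gains, is:

    ```python
    # Classic water filling: allocate p_i = max(0, mu - 1/g_i) subject to a
    # total power budget, choosing the water level mu by bisection.
    import numpy as np

    def water_filling(gains, total_power, tol=1e-9):
        gains = np.asarray(gains, dtype=float)
        lo, hi = 0.0, total_power + 1.0 / gains.min()
        while hi - lo > tol:
            mu = 0.5 * (lo + hi)
            if np.maximum(0.0, mu - 1.0 / gains).sum() > total_power:
                hi = mu
            else:
                lo = mu
        return np.maximum(0.0, 0.5 * (lo + hi) - 1.0 / gains)

    gains = np.array([2.0, 1.0, 0.5, 0.1])   # normalized channel power gains
    p = water_filling(gains, total_power=2.0)
    print("allocated powers:", np.round(p, 3), " sum =", round(float(p.sum()), 3))
    print("sum rate [bit/s/Hz]:", round(float(np.sum(np.log2(1 + gains * p))), 3))
    ```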

  5. Can purchasing information be used to predict adherence to cardiovascular medications? An analysis of linked retail pharmacy and insurance claims data.

    PubMed

    Krumme, Alexis A; Sanfélix-Gimeno, Gabriel; Franklin, Jessica M; Isaman, Danielle L; Mahesri, Mufaddal; Matlin, Olga S; Shrank, William H; Brennan, Troyen A; Brill, Gregory; Choudhry, Niteesh K

    2016-11-09

    The use of retail purchasing data may improve adherence prediction over approaches using healthcare insurance claims alone. Retrospective. A cohort of patients who received prescription medication benefits through CVS Caremark, used a CVS Pharmacy ExtraCare Health Care (ECHC) loyalty card, and initiated a statin medication in 2011. We evaluated associations between retail purchasing patterns and optimal adherence to statins in the 12 subsequent months. Among 11 010 statin initiators, 43% were optimally adherent at 12 months of follow-up. Greater numbers of store visits per month and dollar amount per visit were positively associated with optimal adherence, as was making a purchase on the same day as filling a prescription (p<0.0001 for all). Models to predict adherence using retail purchase variables had low discriminative ability (C-statistic: 0.563), while models with both clinical and retail purchase variables achieved a C-statistic of 0.617. While the use of retail purchases may improve the discriminative ability of claims-based approaches, these data alone appear inadequate for adherence prediction, even with the addition of more complex analytical approaches. Nevertheless, associations between retail purchasing behaviours and adherence could inform the development of quality improvement interventions. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
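
    The modelling comparison can be sketched on synthetic data as follows: a logistic model with claims-style predictors only versus one that adds retail-purchase predictors, with discrimination summarized by the C-statistic (ROC AUC). The variables and effect sizes are invented for illustration:

    ```python
    # Compare the C-statistic of a claims-only model against a model that also
    # uses retail-purchase features, on simulated data.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n = 5000
    claims = rng.normal(size=(n, 3))   # e.g., age, comorbidity score, prior fills
    retail = rng.normal(size=(n, 2))   # e.g., store visits/month, spend per visit
    logit = 0.5 * claims[:, 0] + 0.3 * claims[:, 1] + 0.25 * retail[:, 0] - 0.2
    adherent = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

    idx_train, idx_test = train_test_split(np.arange(n), test_size=0.3,
                                           random_state=0)
    for name, X in [("claims only", claims),
                    ("claims + retail", np.hstack([claims, retail]))]:
        model = LogisticRegression().fit(X[idx_train], adherent[idx_train])
        auc = roc_auc_score(adherent[idx_test],
                            model.predict_proba(X[idx_test])[:, 1])
        print(f"{name:16s} C-statistic = {auc:.3f}")
    ```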

  6. Evaluating statistical consistency in the ocean model component of the Community Earth System Model (pyCECT v2.0)

    NASA Astrophysics Data System (ADS)

    Baker, Allison H.; Hu, Yong; Hammerling, Dorit M.; Tseng, Yu-heng; Xu, Haiying; Huang, Xiaomeng; Bryan, Frank O.; Yang, Guangwen

    2016-07-01

    The Parallel Ocean Program (POP), the ocean model component of the Community Earth System Model (CESM), is widely used in climate research. Most current work in CESM-POP focuses on improving the model's efficiency or accuracy, such as improving numerical methods, advancing parameterization, porting to new architectures, or increasing parallelism. Since ocean dynamics are chaotic in nature, achieving bit-for-bit (BFB) identical results in ocean solutions cannot be guaranteed for even tiny code modifications, and determining whether modifications are admissible (i.e., statistically consistent with the original results) is non-trivial. In recent work, an ensemble-based statistical approach was shown to work well for software verification (i.e., quality assurance) on atmospheric model data. The general idea of the ensemble-based statistical consistency testing is to use a qualitative measurement of the variability of the ensemble of simulations as a metric with which to compare future simulations and make a determination of statistical distinguishability. The capability to determine consistency without BFB results boosts model confidence and provides the flexibility needed, for example, for more aggressive code optimizations and the use of heterogeneous execution environments. Since ocean and atmosphere models have differing characteristics in term of dynamics, spatial variability, and timescales, we present a new statistical method to evaluate ocean model simulation data that requires the evaluation of ensemble means and deviations in a spatial manner. In particular, the statistical distribution from an ensemble of CESM-POP simulations is used to determine the standard score of any new model solution at each grid point. Then the percentage of points that have scores greater than a specified threshold indicates whether the new model simulation is statistically distinguishable from the ensemble simulations. Both ensemble size and composition are important. Our experiments indicate that the new POP ensemble consistency test (POP-ECT) tool is capable of distinguishing cases that should be statistically consistent with the ensemble and those that should not, as well as providing a simple, subjective and systematic way to detect errors in CESM-POP due to the hardware or software stack, positively contributing to quality assurance for the CESM-POP code.
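
    The per-grid-point consistency check can be sketched as follows on synthetic fields; the actual POP-ECT tool applies the same idea to CESM-POP output:

    ```python
    # Standard-score consistency test: flag grid points of a new solution whose
    # |z| against the ensemble mean and standard deviation exceeds a threshold,
    # then report the fraction of flagged points.
    import numpy as np

    rng = np.random.default_rng(42)
    n_ens, ny, nx = 30, 40, 60
    ensemble = rng.normal(loc=15.0, scale=1.0, size=(n_ens, ny, nx))  # e.g., SST

    ens_mean = ensemble.mean(axis=0)
    ens_std = ensemble.std(axis=0, ddof=1)

    def fail_fraction(field, threshold=3.0):
        z = (field - ens_mean) / ens_std
        return np.mean(np.abs(z) > threshold)

    consistent_run = rng.normal(loc=15.0, scale=1.0, size=(ny, nx))
    biased_run = consistent_run + 0.9     # e.g., the effect of an erroneous change

    print(f"consistent run: {fail_fraction(consistent_run):.3%} of points flagged")
    print(f"biased run:     {fail_fraction(biased_run):.3%} of points flagged")
    ```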

  7. Multicriteria plan optimization in the hands of physicians: a pilot study in prostate cancer and brain tumors.

    PubMed

    Müller, Birgit S; Shih, Helen A; Efstathiou, Jason A; Bortfeld, Thomas; Craft, David

    2017-11-06

    The purpose of this study was to demonstrate the feasibility of physician driven planning in intensity modulated radiotherapy (IMRT) with a multicriteria optimization (MCO) treatment planning system and template based plan optimization. Exploiting the full planning potential of MCO navigation, this alternative planning approach intends to improve planning efficiency and individual plan quality. Planning was retrospectively performed on 12 brain tumor and 10 post-prostatectomy prostate patients previously treated with MCO-IMRT. For each patient, physicians were provided with a template-based generated Pareto surface of optimal plans to navigate, using the beam angles from the original clinical plans. We compared physician generated plans to clinically delivered plans (created by dosimetrists) in terms of dosimetric differences, physician preferences and planning times. Plan qualities were similar, however physician generated and clinical plans differed in the prioritization of clinical goals. Physician derived prostate plans showed significantly better sparing of the high dose rectum and bladder regions (p(D1) < 0.05; D1: dose received by 1% of the corresponding structure). Physicians' brain tumor plans indicated higher doses for targets and brainstem (p(D1) < 0.05). Within blinded plan comparisons physicians preferred the clinical plans more often (brain: 6:3 out of 12, prostate: 2:6 out of 10) (not statistically significant). While times of physician involvement were comparable for prostate planning, the new workflow reduced the average involved time for brain cases by 30%. Planner times were reduced for all cases. Subjective benefits, such as a better understanding of planning situations, were observed by clinicians through the insight into plan optimization and experiencing dosimetric trade-offs. We introduce physician driven planning with MCO for brain and prostate tumors as a feasible planning workflow. The proposed approach standardizes the planning process by utilizing site specific templates and integrates physicians more tightly into treatment planning. Physicians' navigated plan qualities were comparable to the clinical plans. Given the reduction of planning time of the planner and the equal or lower planning time of physicians, this approach has the potential to improve departmental efficiencies.

  8. Optimizing Quality of Care and Patient Safety in Malaysia: The Current Global Initiatives, Gaps and Suggested Solutions.

    PubMed

    Jarrar, Mu'taman; Abdul Rahman, Hamzah; Don, Mohammad Sobri

    2015-10-20

    Demand for health care services has significantly increased, while the quality of healthcare and patient safety have become national and international priorities. This paper aims to identify the gaps and the current initiatives for optimizing the quality of care and patient safety in Malaysia. Review of the current literature. Highly cited articles were used as the basis to retrieve and review the current initiatives for optimizing the quality of care and patient safety. The country health plan of Ministry of Health (MOH) Malaysia and the MOH Malaysia Annual Reports were reviewed. The MOH has set four strategies for optimizing quality and sustaining quality of life. The 10th Malaysia Health Plan promotes the theme "1 Care for 1 Malaysia" in order to sustain the quality of care. Despite these efforts, the total number of complaints received by the medico-legal section of the MOH Malaysia is increasing. The current global initiatives indicate that quality performance generally relates to three main categories of factors: patient-related, staffing-related, and working environment-related. There is no single intervention for optimizing quality of care to maintain patient safety. Multidimensional efforts and interventions are recommended in order to optimize the quality of care and patient safety in Malaysia.

  9. Optimizing Quality of Care and Patient Safety in Malaysia: The Current Global Initiatives, Gaps and Suggested Solutions

    PubMed Central

    Jarrar, Mu’taman; Rahman, Hamzah Abdul; Don, Mohammad Sobri

    2016-01-01

    Background and Objective: Demand for health care services has significantly increased, while the quality of healthcare and patient safety have become national and international priorities. This paper aims to identify the gaps and the current initiatives for optimizing the quality of care and patient safety in Malaysia. Design: Review of the current literature. Highly cited articles were used as the basis to retrieve and review the current initiatives for optimizing the quality of care and patient safety. The country health plan of Ministry of Health (MOH) Malaysia and the MOH Malaysia Annual Reports were reviewed. Results: The MOH has set four strategies for optimizing quality and sustaining quality of life. The 10th Malaysia Health Plan promotes the theme “1 Care for 1 Malaysia” in order to sustain the quality of care. Despite these efforts, the total number of complaints received by the medico-legal section of the MOH Malaysia is increasing. The current global initiatives indicate that quality performance generally relates to three main categories of factors: patient-related, staffing-related, and working environment-related. Conclusions: There is no single intervention for optimizing quality of care to maintain patient safety. Multidimensional efforts and interventions are recommended in order to optimize the quality of care and patient safety in Malaysia. PMID:26755459

  10. Corpus-based Statistical Screening for Phrase Identification

    PubMed Central

    Kim, Won; Wilbur, W. John

    2000-01-01

    Purpose: The authors study the extraction of useful phrases from a natural language database by statistical methods. The aim is to leverage human effort by providing preprocessed phrase lists with a high percentage of useful material. Method: The approach is to develop six different scoring methods that are based on different aspects of phrase occurrence. The emphasis here is not on lexical information or syntactic structure but rather on the statistical properties of word pairs and triples that can be obtained from a large database. Measurements: The Unified Medical Language System (UMLS) incorporates a large list of humanly acceptable phrases in the medical field as a part of its structure. The authors use this list of phrases as a gold standard for validating their methods. A good method is one that ranks the UMLS phrases high among all phrases studied. Measurements are 11-point average precision values and precision-recall curves based on the rankings. Result: The authors find that each of the six scoring methods proves effective in identifying UMLS quality phrases in a large subset of MEDLINE. These methods are applicable both to word pairs and word triples. All six methods are optimally combined to produce composite scoring methods that are more effective than any single method. The quality of the composite methods appears sufficient to support the automatic placement of hyperlinks in text at the site of highly ranked phrases. Conclusion: Statistical scoring methods provide a promising approach to the extraction of useful phrases from a natural language database for the purpose of indexing or providing hyperlinks in text. PMID:10984469
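
    As an illustration of corpus-statistical pair scoring in the same spirit (though not one of the authors' six methods), the sketch below ranks adjacent word pairs in a toy corpus by pointwise mutual information:

    ```python
    # Rank candidate two-word phrases by pointwise mutual information (PMI),
    # computed from unigram and adjacent-bigram counts in a toy corpus.
    import math
    from collections import Counter

    corpus = [
        "myocardial infarction risk increases with age",
        "acute myocardial infarction requires rapid treatment",
        "blood pressure and heart rate were recorded",
        "high blood pressure is a risk factor for myocardial infarction",
    ]

    unigrams, bigrams = Counter(), Counter()
    total_words = 0
    for sentence in corpus:
        words = sentence.split()
        unigrams.update(words)
        bigrams.update(zip(words, words[1:]))
        total_words += len(words)
    total_pairs = total_words - len(corpus)

    def pmi(pair):
        w1, w2 = pair
        p_pair = bigrams[pair] / total_pairs
        return math.log2(p_pair / ((unigrams[w1] / total_words) *
                                   (unigrams[w2] / total_words)))

    for pair in sorted(bigrams, key=pmi, reverse=True)[:5]:
        print(f"{' '.join(pair):24s} PMI = {pmi(pair):.2f}")
    ```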

  11. Guidelines 13 and 14—Prediction uncertainty

    USGS Publications Warehouse

    Hill, Mary C.; Tiedeman, Claire

    2005-01-01

    An advantage of using optimization for model development and calibration is that optimization provides methods for evaluating and quantifying prediction uncertainty. Both deterministic and statistical methods can be used. Guideline 13 discusses using regression and post-audits, which we classify as deterministic methods. Guideline 14 discusses inferential statistics and Monte Carlo methods, which we classify as statistical methods.

  12. Parameter estimation techniques based on optimizing goodness-of-fit statistics for structural reliability

    NASA Technical Reports Server (NTRS)

    Starlinger, Alois; Duffy, Stephen F.; Palko, Joseph L.

    1993-01-01

    New methods are presented that utilize the optimization of goodness-of-fit statistics in order to estimate Weibull parameters from failure data. It is assumed that the underlying population is characterized by a three-parameter Weibull distribution. Goodness-of-fit tests are based on the empirical distribution function (EDF). The EDF is a step function, calculated using failure data, and represents an approximation of the cumulative distribution function for the underlying population. Statistics (such as the Kolmogorov-Smirnov statistic and the Anderson-Darling statistic) measure the discrepancy between the EDF and the cumulative distribution function (CDF). These statistics are minimized with respect to the three Weibull parameters. Due to nonlinearities encountered in the minimization process, Powell's numerical optimization procedure is applied to obtain the minimum value of the EDF statistic. Numerical examples show the applicability of these new estimation methods. The results are compared to the estimates obtained with Cooper's nonlinear regression algorithm.
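
    A minimal sketch of this estimation idea, using SciPy's Powell method to minimize the Kolmogorov-Smirnov distance for a three-parameter Weibull fit; the failure data and starting values are invented, and the Anderson-Darling variant and the comparison with Cooper's regression are not reproduced.

    ```python
    import numpy as np
    from scipy import stats
    from scipy.optimize import minimize

    # Estimate a three-parameter Weibull fit by minimizing the Kolmogorov-Smirnov
    # distance between the EDF and the candidate CDF with Powell's method.
    # The failure data and starting guesses are invented for illustration.
    failures = np.array([212.0, 245.0, 268.0, 281.0, 299.0, 312.0, 330.0, 355.0, 380.0, 412.0])

    def ks_distance(params, data):
        shape, loc, scale = params
        if shape <= 0 or scale <= 0 or loc >= data.min():
            return 1.0  # worst possible KS distance for infeasible parameters
        cdf = stats.weibull_min(shape, loc=loc, scale=scale).cdf
        return stats.kstest(data, cdf).statistic

    x0 = np.array([2.0, 100.0, 200.0])  # rough initial shape, location, scale
    result = minimize(ks_distance, x0, args=(failures,), method="Powell")
    print("shape, location, scale:", result.x, "minimum KS distance:", result.fun)
    ```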

  13. Evaluating quantity and quality of literature focusing on health economics and pharmacoeconomics in Gulf Cooperation Council countries.

    PubMed

    Eljilany, Islam; El-Dahiyat, Faris; Curley, Louise Elizabeth; Babar, Zaheer-Ud-Din

    2018-05-30

    The importance of pharmacoeconomics and health economics has grown. It has the potential to provide evidence to aid in optimal decision-making in the funding of cost-effective medicines and services in Gulf Cooperation Council (G.C.C.) countries. To evaluate the quality and quantity of health economics research published up to the end of 2017 in the G.C.C. and to identify the factors that affect the quality of studies. Studies were included according to predefined inclusion and exclusion criteria. The quantity was recorded, and the quality was assessed using the Quality of Health Economic Studies (QHES) instrument. Forty-nine studies were included. The mean (SD) quality score of all studies was 57.83 (25.05), and a high proportion of the reviewed studies (47%) were evaluated as either poor or extremely poor quality. The factors that affected the quality of studies with statistical significance were the type and method of economic evaluation, whether the economic outcome was the objective of the research, the authors' background, the perspective of the study, the health intervention, and the source of funding. The use of economic evaluation studies in the G.C.C. was limited. Different factors that affect the quality of articles, such as performing a full economic evaluation and choosing a societal perspective, were identified. Strategies to improve the quality of future studies were recommended.

  14. Randomly iterated search and statistical competency as powerful inversion tools for deformation source modeling: Application to volcano interferometric synthetic aperture radar data

    NASA Astrophysics Data System (ADS)

    Shirzaei, M.; Walter, T. R.

    2009-10-01

    Modern geodetic techniques provide valuable and near real-time observations of volcanic activity. Characterizing the source of deformation based on these observations has become of major importance in related monitoring efforts. We investigate two random search approaches, simulated annealing (SA) and genetic algorithm (GA), and utilize them in an iterated manner. The iterated approach helps to prevent GA in general and SA in particular from getting trapped in local minima, and it also increases redundancy for exploring the search space. We apply a statistical competency test for estimating the confidence interval of the inversion source parameters, considering their internal interaction through the model, the effect of the model deficiency, and the observational error. Here, we present and test this new randomly iterated search and statistical competency (RISC) optimization method together with GA and SA for the modeling of data associated with volcanic deformations. Following synthetic and sensitivity tests, we apply the improved inversion techniques to two episodes of activity in the Campi Flegrei volcanic region in Italy, observed by the interferometric synthetic aperture radar technique. Inversion of these data allows derivation of deformation source parameters and their associated quality so that we can compare the two inversion methods. The RISC approach was found to be an efficient method in terms of computation time and search results and may be applied to other optimization problems in volcanic and tectonic environments.
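
    The essence of the iterated random search can be sketched with restarted annealing runs that keep the best solution; the misfit function below is a toy stand-in rather than the paper's volcanic deformation model, and SciPy's dual_annealing substitutes for the authors' SA/GA implementations.

    ```python
    import numpy as np
    from scipy.optimize import dual_annealing

    # Iterated (restarted) stochastic search: run an annealing search from several
    # seeds and keep the best run, reducing the chance of a single run getting
    # trapped in a local minimum. The misfit below is a toy function, not the
    # paper's deformation-source forward model.
    def misfit(params):
        x, y, depth, strength = params
        return ((x - 2.0) ** 2 + (y + 1.0) ** 2 + (depth - 3.0) ** 2
                + 10.0 * np.sin(strength) ** 2 + strength ** 2)

    bounds = [(-10, 10), (-10, 10), (0.1, 10), (-5, 5)]
    best = None
    for seed in range(5):                      # the "iterated" restarts
        run = dual_annealing(misfit, bounds, seed=seed, maxiter=200)
        if best is None or run.fun < best.fun:
            best = run
    print("best-fit source parameters:", best.x.round(3), "misfit:", round(best.fun, 6))
    ```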

  15. Comparison of the Cellient(™) automated cell block system and agar cell block method.

    PubMed

    Kruger, A M; Stevens, M W; Kerley, K J; Carter, C D

    2014-12-01

    To compare the Cellient(TM) automated cell block system with the agar cell block method in terms of quantity and quality of diagnostic material and morphological, histochemical and immunocytochemical features. Cell blocks were prepared from 100 effusion samples using the agar method and Cellient system, and routinely sectioned and stained for haematoxylin and eosin and periodic acid-Schiff with diastase (PASD). A preliminary immunocytochemical study was performed on selected cases (27/100 cases). Sections were evaluated using a three-point grading system to compare a set of morphological parameters. Statistical analysis was performed using Fisher's exact test. Parameters assessing cellularity, presence of single cells and definition of nuclear membrane, nucleoli, chromatin and cytoplasm showed a statistically significant improvement on Cellient cell blocks compared with agar cell blocks (P < 0.05). No significant difference was seen for definition of cell groups, PASD staining or the intensity or clarity of immunocytochemical staining. A discrepant immunocytochemistry (ICC) result was seen in 21% (13/63) of immunostains. The Cellient technique is comparable with the agar method, with statistically significant results achieved for important morphological features. It demonstrates potential as an alternative cell block preparation method which is relevant for the rapid processing of fine needle aspiration samples, malignant effusions and low-cellularity specimens, where optimal cell morphology and architecture are essential. Further investigation is required to optimize immunocytochemical staining using the Cellient method. © 2014 John Wiley & Sons Ltd.

  16. Optimizing ACS NSQIP modeling for evaluation of surgical quality and risk: patient risk adjustment, procedure mix adjustment, shrinkage adjustment, and surgical focus.

    PubMed

    Cohen, Mark E; Ko, Clifford Y; Bilimoria, Karl Y; Zhou, Lynn; Huffman, Kristopher; Wang, Xue; Liu, Yaoming; Kraemer, Kari; Meng, Xiangju; Merkow, Ryan; Chow, Warren; Matel, Brian; Richards, Karen; Hart, Amy J; Dimick, Justin B; Hall, Bruce L

    2013-08-01

    The American College of Surgeons National Surgical Quality Improvement Program (ACS NSQIP) collects detailed clinical data from participating hospitals using standardized data definitions, analyzes these data, and provides participating hospitals with reports that permit risk-adjusted comparisons with a surgical quality standard. Since its inception, the ACS NSQIP has worked to refine surgical outcomes measurements and enhance statistical methods to improve the reliability and validity of this hospital profiling. From an original focus on controlling for between-hospital differences in patient risk factors with logistic regression, ACS NSQIP has added a variable to better adjust for the complexity and risk profile of surgical procedures (procedure mix adjustment) and stabilized estimates derived from small samples by using a hierarchical model with shrinkage adjustment. New models have been developed focusing on specific surgical procedures (eg, "Procedure Targeted" models), which provide opportunities to incorporate indication and other procedure-specific variables and outcomes to improve risk adjustment. In addition, comparative benchmark reports given to participating hospitals have been expanded considerably to allow more detailed evaluations of performance. Finally, procedures have been developed to estimate surgical risk for individual patients. This article describes the development of, and justification for, these new statistical methods and reporting strategies in ACS NSQIP. Copyright © 2013 American College of Surgeons. Published by Elsevier Inc. All rights reserved.

  17. Nonparametric rank regression for analyzing water quality concentration data with multiple detection limits.

    PubMed

    Fu, Liya; Wang, You-Gan

    2011-02-15

    Environmental data usually include measurements, such as water quality data, which fall below detection limits, because of limitations of the instruments or of certain analytical methods used. The fact that some responses are not detected needs to be properly taken into account in statistical analysis of such data. However, it is well-known that it is challenging to analyze a data set with detection limits, and we often have to rely on the traditional parametric methods or simple imputation methods. Distributional assumptions can lead to biased inference and justification of distributions is often not possible when the data are correlated and there is a large proportion of data below detection limits. The extent of bias is usually unknown. To draw valid conclusions and hence provide useful advice for environmental management authorities, it is essential to develop and apply an appropriate statistical methodology. This paper proposes rank-based procedures for analyzing non-normally distributed data collected at different sites over a period of time in the presence of multiple detection limits. To take account of temporal correlations within each site, we propose an optimal linear combination of estimating functions and apply the induced smoothing method to reduce the computational burden. Finally, we apply the proposed method to the water quality data collected in the Susquehanna River Basin in the United States of America, which clearly demonstrates the advantages of the rank regression models.

  18. CIDR

    Science.gov Websites

    Quality Control Statistics: CIDR is dedicated to producing the highest quality data for our investigators. These cumulative quality control statistics are based on data from 419 released CIDR Program

  19. SU-F-T-352: Development of a Knowledge Based Automatic Lung IMRT Planning Algorithm with Non-Coplanar Beams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhu, W; Wu, Q; Yuan, L

    Purpose: To improve the robustness of a knowledge based automatic lung IMRT planning method and to further validate the reliability of this algorithm by utilizing it for the planning of clinical cases with non-coplanar beams. Methods: A lung IMRT planning method which automatically determines both plan optimization objectives and beam configurations with non-coplanar beams has been reported previously. A beam efficiency index map is constructed to guide beam angle selection in this algorithm. This index takes into account both the dose contributions from individual beams and the combined effect of multiple beams which is represented by a beam separation score. We studied the effect of this beam separation score on plan quality and determined the optimal weight for this score. 14 clinical plans were re-planned with the knowledge-based algorithm. Significant dosimetric metrics for the PTV and OARs in the automatic plans are compared with those in the clinical plans by the two-sample t-test. In addition, a composite dosimetric quality index was defined to obtain the relationship between the plan quality and the beam separation score. Results: On average, we observed more than 15% reduction on conformity index and homogeneity index for PTV and V40, V60 for heart while an 8% and 3% increase on V5, V20 for lungs, respectively. The variation curve of the composite index as a function of angle spread score shows that 0.6 is the best value for the weight of the beam separation score. Conclusion: The optimal value for the beam angle spread score in automatic lung IMRT planning is obtained. With this value, the model can result in statistically the “best” achievable plans. This method can potentially improve the quality and planning efficiency for IMRT plans with non-coplanar angles.

  20. A near-infrared reflectance spectroscopic method for the direct analysis of several fodder-related chemical components in drumstick (Moringa oleifera Lam.) leaves.

    PubMed

    Zhang, Junjie; Li, Shuqi; Lin, Mengfei; Yang, Endian; Chen, Xiaoyang

    2018-05-01

    The drumstick tree has traditionally been used as foodstuff and fodder in several countries. Due to its high nutritional value and good biomass production, interest in this plant has increased in recent years. It has therefore become important to rapidly and accurately evaluate drumstick quality. In this study, we addressed the optimization of near-infrared spectroscopy (NIRS) to analyze crude protein, crude fat, crude fiber, iron (Fe), and potassium (K) in a variety of drumstick accessions (N = 111) representing different populations, cultivation programs, and climates. Partial least-squares regression with internal cross-validation was used to evaluate the models and identify possible spectral outliers. The calibration statistics for these fodder-related chemical components suggest that NIRS can predict these parameters in a wide range of drumstick types with high accuracy. The NIRS calibration models developed in this study will be useful in predicting drumstick forage quality for these five quality parameters.
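
    A hedged sketch of the calibration step, partial least-squares regression with internal cross-validation, on simulated spectra; the wavelength grid, reference values, and number of latent components are assumptions, not the study's settings.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict

    # Partial least-squares calibration with internal cross-validation on
    # simulated spectra; all numbers are illustrative assumptions.
    rng = np.random.default_rng(0)
    spectra = rng.normal(size=(111, 700))                     # 111 samples x 700 wavelengths
    protein = 20.0 + 3.0 * spectra[:, 100] + rng.normal(scale=0.5, size=111)

    pls = PLSRegression(n_components=5)
    predicted = cross_val_predict(pls, spectra, protein, cv=10).ravel()
    ss_res = np.sum((protein - predicted) ** 2)
    ss_tot = np.sum((protein - protein.mean()) ** 2)
    print("cross-validated R^2:", round(1.0 - ss_res / ss_tot, 3))
    ```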

  1. Texture and haptic cues in slant discrimination: reliability-based cue weighting without statistically optimal cue combination

    NASA Astrophysics Data System (ADS)

    Rosas, Pedro; Wagemans, Johan; Ernst, Marc O.; Wichmann, Felix A.

    2005-05-01

    A number of models of depth-cue combination suggest that the final depth percept results from a weighted average of independent depth estimates based on the different cues available. The weight of each cue in such an average is thought to depend on the reliability of each cue. In principle, such a depth estimation could be statistically optimal in the sense of producing the minimum-variance unbiased estimator that can be constructed from the available information. Here we test such models by using visual and haptic depth information. Different texture types produce differences in slant-discrimination performance, thus providing a means for testing a reliability-sensitive cue-combination model with texture as one of the cues to slant. Our results show that the weights for the cues were generally sensitive to their reliability but fell short of statistically optimal combination - we find reliability-based reweighting but not statistically optimal cue combination.
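
    The statistically optimal combination referred to here weights each cue in inverse proportion to its variance; the numbers below are purely illustrative.

    ```python
    # Illustrative reliability-based cue combination: the minimum-variance weights
    # are inversely proportional to each cue's variance. All numbers are invented.
    sigma_texture = 4.0                     # slant SD from the texture cue (degrees)
    sigma_haptic = 2.0                      # slant SD from the haptic cue (degrees)

    w_texture = (1 / sigma_texture**2) / (1 / sigma_texture**2 + 1 / sigma_haptic**2)
    w_haptic = 1.0 - w_texture

    slant_texture, slant_haptic = 30.0, 34.0
    combined_slant = w_texture * slant_texture + w_haptic * slant_haptic
    combined_sd = (1.0 / (1 / sigma_texture**2 + 1 / sigma_haptic**2)) ** 0.5

    print(f"weights: texture {w_texture:.2f}, haptic {w_haptic:.2f}")
    print(f"combined estimate {combined_slant:.1f} deg, SD {combined_sd:.2f} deg")
    ```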

  2. Evaluation of hybrid inverse planning and optimization (HIPO) algorithm for optimization in real-time, high-dose-rate (HDR) brachytherapy for prostate.

    PubMed

    Pokharel, Shyam; Rana, Suresh; Blikenstaff, Joseph; Sadeghi, Amir; Prestidge, Bradley

    2013-07-08

    The purpose of this study is to investigate the effectiveness of the HIPO planning and optimization algorithm for real-time prostate HDR brachytherapy. This study consists of 20 patients who underwent ultrasound-based real-time HDR brachytherapy of the prostate using the treatment planning system called Oncentra Prostate (SWIFT version 3.0). The treatment plans for all patients were optimized using inverse dose-volume histogram-based optimization followed by graphical optimization (GRO) in real time. The GRO is manual manipulation of isodose lines slice by slice. The quality of the plan heavily depends on planner expertise and experience. The data for all patients were retrieved later, and treatment plans were created and optimized using the HIPO algorithm with the same set of dose constraints, number of catheters, and set of contours as in the real-time optimization algorithm. The HIPO algorithm is a hybrid because it combines both stochastic and deterministic algorithms. The stochastic algorithm, called simulated annealing, searches for the optimal catheter distributions for a given set of dose objectives. The deterministic algorithm, called dose-volume histogram-based optimization (DVHO), optimizes the three-dimensional dose distribution quickly by moving straight downhill once it is in the advantageous region of the search space given by the stochastic algorithm. The PTV receiving 100% of the prescription dose (V100) was 97.56% and 95.38% with GRO and HIPO, respectively. The mean dose (D(mean)) and minimum dose to 10% volume (D10) for the urethra, rectum, and bladder were all statistically lower with HIPO compared to GRO using the paired Student's t-test at the 5% significance level. HIPO can provide treatment plans with comparable target coverage to that of GRO with a reduction in dose to the critical structures.

  3. Optimal descriptor as a translator of eclectic data into endpoint prediction: mutagenicity of fullerene as a mathematical function of conditions.

    PubMed

    Toropov, Andrey A; Toropova, Alla P

    2014-06-01

    The experimental data on the bacterial reverse mutation test on C60 nanoparticles (TA100) is examined as an endpoint. By means of the optimal descriptors calculated with the Monte Carlo method a mathematical model of the endpoint has been built up. The model is the mathematical function of (i) dose (g/plate); (ii) metabolic activation (i.e. with S9 mix or without S9 mix); and (iii) illumination (i.e. dark or irradiation). The statistical quality of the model is the following: n=10, r(2)=0.7549, q(2)=0.5709, s=7.67, F=25 (Training set); n=5, r(2)=0.8987, s=18.4 (Calibration set); and n=5, r(2)=0.6968, s=10.9 (Validation set). Copyright © 2013 Elsevier Ltd. All rights reserved.

  4. Water Quality Statistics

    ERIC Educational Resources Information Center

    Hodgson, Ted; Andersen, Lyle; Robison-Cox, Jim; Jones, Clain

    2004-01-01

    Water quality experiments, especially the use of macroinvertebrates as indicators of water quality, offer an ideal context for connecting statistics and science. In the STAR program for secondary students and teachers, water quality experiments were also used as a context for teaching statistics. In this article, we trace one activity that uses…

  5. Use of plan quality degradation to evaluate tradeoffs in delivery efficiency and clinical plan metrics arising from IMRT optimizer and sequencer compromises

    PubMed Central

    Wilkie, Joel R.; Matuszak, Martha M.; Feng, Mary; Moran, Jean M.; Fraass, Benedick A.

    2013-01-01

    Purpose: Plan degradation resulting from compromises made to enhance delivery efficiency is an important consideration for intensity modulated radiation therapy (IMRT) treatment plans. IMRT optimization and/or multileaf collimator (MLC) sequencing schemes can be modified to generate more efficient treatment delivery, but the effect those modifications have on plan quality is often difficult to quantify. In this work, the authors present a method for quantitative assessment of overall plan quality degradation due to tradeoffs between delivery efficiency and treatment plan quality, illustrated using comparisons between plans developed allowing different numbers of intensity levels in IMRT optimization and/or MLC sequencing for static segmental MLC IMRT plans. Methods: A plan quality degradation method to evaluate delivery efficiency and plan quality tradeoffs was developed and used to assess planning for 14 prostate and 12 head and neck patients treated with static IMRT. Plan quality was evaluated using a physician's predetermined “quality degradation” factors for relevant clinical plan metrics associated with the plan optimization strategy. Delivery efficiency and plan quality were assessed for a range of optimization and sequencing limitations. The “optimal” (baseline) plan for each case was derived using a clinical cost function with an unlimited number of intensity levels. These plans were sequenced with a clinical MLC leaf sequencer which uses >100 segments, assuring delivered intensities to be within 1% of the optimized intensity pattern. Each patient's optimal plan was also sequenced limiting the number of intensity levels (20, 10, and 5), and then separately optimized with these same numbers of intensity levels. Delivery time was measured for all plans, and direct evaluation of the tradeoffs between delivery time and plan degradation was performed. Results: When considering tradeoffs, the optimal number of intensity levels depends on the treatment site and on the stage in the process at which the levels are limited. The cost of improved delivery efficiency, in terms of plan quality degradation, increased as the number of intensity levels in the sequencer or optimizer decreased. The degradation was more substantial for the head and neck cases relative to the prostate cases, particularly when fewer than 20 intensity levels were used. Plan quality degradation was less severe when the number of intensity levels was limited in the optimizer rather than the sequencer. Conclusions: Analysis of plan quality degradation allows for a quantitative assessment of the compromises in clinical plan quality as delivery efficiency is improved, in order to determine the optimal delivery settings. The technique is based on physician-determined quality degradation factors and can be extended to other clinical situations where investigation of various tradeoffs is warranted. PMID:23822412

  6. Current approaches used in epidemiologic studies to examine short-term multipollutant air pollution exposures.

    PubMed

    Davalos, Angel D; Luben, Thomas J; Herring, Amy H; Sacks, Jason D

    2017-02-01

    Air pollution epidemiology traditionally focuses on the relationship between individual air pollutants and health outcomes (e.g., mortality). To account for potential copollutant confounding, individual pollutant associations are often estimated by adjusting or controlling for other pollutants in the mixture. Recently, the need to characterize the relationship between health outcomes and the larger multipollutant mixture has been emphasized in an attempt to better protect public health and inform more sustainable air quality management decisions. New and innovative statistical methods to examine multipollutant exposures were identified through a broad literature search, with a specific focus on those statistical approaches currently used in epidemiologic studies of short-term exposures to criteria air pollutants (i.e., particulate matter, carbon monoxide, sulfur dioxide, nitrogen dioxide, and ozone). Five broad classes of statistical approaches were identified for examining associations between short-term multipollutant exposures and health outcomes, specifically additive main effects, effect measure modification, unsupervised dimension reduction, supervised dimension reduction, and nonparametric methods. These approaches are characterized including advantages and limitations in different epidemiologic scenarios. By highlighting the characteristics of various studies in which multipollutant statistical methods have been used, this review provides epidemiologists and biostatisticians with a resource to aid in the selection of the most optimal statistical method to use when examining multipollutant exposures. Published by Elsevier Inc.

  7. Patient-specific dosimetric endpoints based treatment plan quality control in radiotherapy.

    PubMed

    Song, Ting; Staub, David; Chen, Mingli; Lu, Weiguo; Tian, Zhen; Jia, Xun; Li, Yongbao; Zhou, Linghong; Jiang, Steve B; Gu, Xuejun

    2015-11-07

    In intensity modulated radiotherapy (IMRT), the optimal plan for each patient is specific due to unique patient anatomy. To achieve such a plan, patient-specific dosimetric goals reflecting each patient's unique anatomy should be defined and adopted in the treatment planning procedure for plan quality control. This study aims to develop such a personalized treatment plan quality control tool by predicting patient-specific dosimetric endpoints (DEs). The incorporation of patient-specific DEs is realized by a multi-OAR geometry-dosimetry model, capable of predicting optimal DEs based on the individual patient's geometry. The overall quality of a treatment plan is then judged with a numerical treatment plan quality indicator and characterized as optimal or suboptimal. Taking advantage of clinically available prostate volumetric modulated arc therapy (VMAT) treatment plans, we built and evaluated our proposed plan quality control tool. Using our developed tool, six of twenty evaluated plans were identified as suboptimal plans. After plan re-optimization, these suboptimal plans achieved better OAR dose sparing without sacrificing the PTV coverage, and the dosimetric endpoints of the re-optimized plans agreed well with the model predicted values, which validates the predictability of the proposed tool. In conclusion, the developed tool is able to accurately predict optimally achievable DEs of multiple OARs, identify suboptimal plans, and guide plan optimization. It is a useful tool for achieving patient-specific treatment plan quality control.

  8. Optimization and comparison of simultaneous and separate acquisition protocols for dual isotope myocardial perfusion SPECT.

    PubMed

    Ghaly, Michael; Links, Jonathan M; Frey, Eric C

    2015-07-07

    Dual-isotope simultaneous-acquisition (DISA) rest-stress myocardial perfusion SPECT (MPS) protocols offer a number of advantages over separate acquisition. However, crosstalk contamination due to scatter in the patient and interactions in the collimator degrade image quality. Compensation can reduce the effects of crosstalk, but does not entirely eliminate image degradations. Optimizing acquisition parameters could further reduce the impact of crosstalk. In this paper we investigate the optimization of the rest Tl-201 energy window width and relative injected activities using the ideal observer (IO), a realistic digital phantom population and Monte Carlo (MC) simulated Tc-99m and Tl-201 projections as a means to improve image quality. We compared performance on a perfusion defect detection task for Tl-201 acquisition energy window widths varying from 4 to 40 keV centered at 72 keV for a camera with a 9% energy resolution. We also investigated 7 different relative injected activities, defined as the ratio of Tc-99m and Tl-201 activities, while keeping the total effective dose constant at 13.5 mSv. For each energy window and relative injected activity, we computed the IO test statistics using a Markov chain Monte Carlo (MCMC) method for an ensemble of 1,620 triplets of fixed and reversible defect-present, and defect-absent noisy images modeling realistic background variations. The volume under the 3-class receiver operating characteristic (ROC) surface (VUS) was estimated and served as the figure of merit. For simultaneous acquisition, the IO suggested that relative Tc-to-Tl injected activity ratios of 2.6-5 and acquisition energy window widths of 16-22% were optimal. For separate acquisition, we observed a broad range of optimal relative injected activities from 2.6 to 12.1 and acquisition energy window of widths 16-22%. A negative correlation between Tl-201 injected activity and the width of the Tl-201 energy window was observed in these ranges. The results also suggested that DISA methods could potentially provide image quality as good as that obtained with separate acquisition protocols. We compared observer performance for the optimized protocols and the current clinical protocol using separate acquisition. The current clinical protocols provided better performance at a cost of injecting the patient with approximately double the injected activity of Tc-99m and Tl-201, resulting in substantially increased radiation dose.

  9. Upgrade to iterative image reconstruction (IR) in abdominal MDCT imaging: a clinical study for detailed parameter optimization beyond vendor recommendations using the adaptive statistical iterative reconstruction environment (ASIR).

    PubMed

    Mueck, F G; Körner, M; Scherr, M K; Geyer, L L; Deak, Z; Linsenmaier, U; Reiser, M; Wirth, S

    2012-03-01

    To compare the image quality of dose-reduced 64-row abdominal CT reconstructed at different levels of adaptive statistical iterative reconstruction (ASIR) to full-dose baseline examinations reconstructed with filtered back-projection (FBP) in a clinical setting and upgrade situation. Abdominal baseline examinations (noise index NI = 29; LightSpeed VCT XT, GE) were intra-individually compared to follow-up studies on a CT with an ASIR option (NI = 43; Discovery HD750, GE), n = 42. Standard-kernel images were calculated with ASIR blendings of 0 - 100 % in slice and volume mode, respectively. Three experienced radiologists compared the image quality of these 567 sets to their corresponding full-dose baseline examination (-2: diagnostically inferior, -1: inferior, 0: equal, +1: superior, +2: diagnostically superior). Furthermore, a phantom was scanned. Statistical analysis used the Wilcoxon test, the Mann-Whitney U-test, and the intra-class correlation (ICC). The mean CTDIvol decreased from 19.7 ± 5.5 to 12.2 ± 4.7 mGy (p < 0.001). The ICC was 0.861. The total image quality of the dose-reduced ASIR studies was comparable to the baseline at ASIR 50 % in slice (p = 0.18) and ASIR 50 - 100 % in volume mode (p > 0.10). Volume mode performed 73 % slower than slice mode (p < 0.01). After the system upgrade, the vendor recommendation of ASIR 50 % in slice mode allowed for a dose reduction of 38 % in abdominal CT with comparable image quality and time expenditure. However, there is still further dose reduction potential for more complex reconstruction settings. © Georg Thieme Verlag KG Stuttgart · New York.

  10. Upgrade to iterative image reconstruction (IR) in MDCT imaging: a clinical study for detailed parameter optimization beyond vendor recommendations using the adaptive statistical iterative reconstruction environment (ASIR) Part2: The chest.

    PubMed

    Mueck, F G; Michael, L; Deak, Z; Scherr, M K; Maxien, D; Geyer, L L; Reiser, M; Wirth, S

    2013-07-01

    To compare the image quality in dose-reduced 64-row CT of the chest at different levels of adaptive statistical iterative reconstruction (ASIR) to full-dose baseline examinations reconstructed solely with filtered back projection (FBP) in a realistic upgrade scenario. A waiver of consent was granted by the institutional review board (IRB). The noise index (NI) relates to the standard deviation of Hounsfield units in a water phantom. Baseline exams of the chest (NI = 29; LightSpeed VCT XT, GE Healthcare) were intra-individually compared to follow-up studies on a CT with ASIR after system upgrade (NI = 45; Discovery HD750, GE Healthcare), n = 46. Images were calculated in slice and volume mode with ASIR levels of 0 - 100 % in the standard and lung kernel. Three radiologists independently compared the image quality to the corresponding full-dose baseline examinations (-2: diagnostically inferior, -1: inferior, 0: equal, + 1: superior, + 2: diagnostically superior). Statistical analysis used Wilcoxon's test, Mann-Whitney U test and the intraclass correlation coefficient (ICC). The mean CTDIvol decreased by 53 % from the FBP baseline to 8.0 ± 2.3 mGy for ASIR follow-ups; p < 0.001. The ICC was 0.70. Regarding the standard kernel, the image quality in dose-reduced studies was comparable to the baseline at ASIR 70 % in volume mode (-0.07 ± 0.29, p = 0.29). Concerning the lung kernel, every ASIR level outperformed the baseline image quality (p < 0.001), with ASIR 30 % rated best (slice: 0.70 ± 0.6, volume: 0.74 ± 0.61). Vendors' recommendation of 50 % ASIR is fair. In detail, the ASIR 70 % in volume mode for the standard kernel and ASIR 30 % for the lung kernel performed best, allowing for a dose reduction of approximately 50 %. © Georg Thieme Verlag KG Stuttgart · New York.

  11. Divergent pathways to influence: Cognition and behavior differentially mediate the effects of optimism on physical and mental quality of life in Chinese university students.

    PubMed

    Ramsay, Jonathan E; Yang, Fang; Pang, Joyce S; Lai, Ching-Man; Ho, Roger Cm; Mak, Kwok-Kei

    2015-07-01

    Previous research has indicated that both cognitive and behavioral variables mediate the positive effect of optimism on quality of life; yet few attempts have been made to accommodate these constructs into a single explanatory framework. Adopting Fredrickson's broaden-and-build perspective, we examined the relationships between optimism, self-rated health, resilience, exercise, and quality of life in 365 Chinese university students using path analysis. For physical quality of life, a two-stage model, in which the effects of optimism were sequentially mediated by cognitive and behavioral variables, provided the best fit. A one-stage model, with full mediation by cognitive variables, provided the best fit for mental quality of life. This suggests that optimism influences physical and mental quality of life via different pathways. © The Author(s) 2013.

  12. Adaptive distributed source coding.

    PubMed

    Varodayan, David; Lin, Yao-Chung; Girod, Bernd

    2012-05-01

    We consider distributed source coding in the presence of hidden variables that parameterize the statistical dependence among sources. We derive the Slepian-Wolf bound and devise coding algorithms for a block-candidate model of this problem. The encoder sends, in addition to syndrome bits, a portion of the source to the decoder uncoded as doping bits. The decoder uses the sum-product algorithm to simultaneously recover the source symbols and the hidden statistical dependence variables. We also develop novel techniques based on density evolution (DE) to analyze the coding algorithms. We experimentally confirm that our DE analysis closely approximates practical performance. This result allows us to efficiently optimize parameters of the algorithms. In particular, we show that the system performs close to the Slepian-Wolf bound when an appropriate doping rate is selected. We then apply our coding and analysis techniques to a reduced-reference video quality monitoring system and show a bit rate saving of about 75% compared with fixed-length coding.

  13. Fast Synthesis of Gibbsite Nanoplates and Process Optimization using Box-Behnken Experimental Design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Xin; Zhang, Xianwen; Graham, Trent R.

    Developing the ability to synthesize compositionally and morphologically well-defined gibbsite particles at the nanoscale with high yield is an ongoing need that has not yet achieved the level of rational design. Here we report optimization of a clean inorganic synthesis route based on statistical experimental design examining the influence of Al(OH)3 gel precursor concentration, pH, and aging time at temperature. At 80 °C, the optimum synthesis conditions of gel concentration at 0.5 M, pH at 9.2, and time at 72 h maximized the reaction yield up to ~87%. The resulting gibbsite product is composed of highly uniform euhedral hexagonal nanoplates within a basal plane diameter range of 200-400 nm. The independent roles of key system variables in the growth mechanism are considered. On the basis of these optimized experimental conditions, the synthesis procedure, which is both cost-effective and environmentally friendly, has the potential for mass production scale-up of high quality gibbsite material for various fundamental research and industrial applications.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boutilier, Justin J., E-mail: j.boutilier@mail.utoronto.ca; Lee, Taewoo; Craig, Tim

    Purpose: To develop and evaluate the clinical applicability of advanced machine learning models that simultaneously predict multiple optimization objective function weights from patient geometry for intensity-modulated radiation therapy of prostate cancer. Methods: A previously developed inverse optimization method was applied retrospectively to determine optimal objective function weights for 315 treated patients. The authors used an overlap volume ratio (OV) of bladder and rectum for different PTV expansions and overlap volume histogram slopes (OVSR and OVSB for the rectum and bladder, respectively) as explanatory variables that quantify patient geometry. Using the optimal weights as ground truth, the authors trained and applied three prediction models: logistic regression (LR), multinomial logistic regression (MLR), and weighted K-nearest neighbor (KNN). The population average of the optimal objective function weights was also calculated. Results: The OV at 0.4 cm and OVSR at 0.1 cm features were found to be the most predictive of the weights. The authors observed comparable performance (i.e., no statistically significant difference) between LR, MLR, and KNN methodologies, with LR appearing to perform the best. All three machine learning models outperformed the population average by a statistically significant amount over a range of clinical metrics including bladder/rectum V53Gy, bladder/rectum V70Gy, and dose to the bladder, rectum, CTV, and PTV. When comparing the weights directly, the LR model predicted bladder and rectum weights that had, on average, a 73% and 74% relative improvement over the population average weights, respectively. The treatment plans resulting from the LR weights had, on average, a rectum V70Gy that was 35% closer to the clinical plan and a bladder V70Gy that was 29% closer, compared to the population average weights. Similar results were observed for all other clinical metrics. Conclusions: The authors demonstrated that the KNN and MLR weight prediction methodologies perform comparably to the LR model and can produce clinical quality treatment plans by simultaneously predicting multiple weights that capture trade-offs associated with sparing multiple OARs.
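
    A hedged sketch of the weight-prediction step: a distance-weighted K-nearest-neighbor regressor mapping geometry features to an objective-function weight, trained on simulated data; the feature construction, weight values, and model settings are placeholders rather than the authors' exact formulation.

    ```python
    import numpy as np
    from sklearn.neighbors import KNeighborsRegressor

    # Map patient-geometry features (overlap-volume style metrics) to an
    # optimization weight with a distance-weighted KNN regressor. The simulated
    # features, weights, and settings are illustrative placeholders.
    rng = np.random.default_rng(1)
    geometry = rng.uniform(0.0, 1.0, size=(315, 3))                      # e.g., OV, OVSR, OVSB
    rectum_weight = 0.2 + 0.6 * geometry[:, 0] + 0.1 * rng.normal(size=315)

    knn = KNeighborsRegressor(n_neighbors=10, weights="distance")
    knn.fit(geometry[:300], rectum_weight[:300])                         # "training" patients
    print("predicted weights for held-out patients:", knn.predict(geometry[300:305]).round(2))
    ```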

  15. Volumetric‐modulated arc therapy planning using multicriteria optimization for localized prostate cancer

    PubMed Central

    Ghandour, Sarah; Matzinger, Oscar

    2015-01-01

    The purpose of this work is to evaluate the volumetric‐modulated arc therapy (VMAT) multicriteria optimization (MCO) algorithm clinically available in the RayStation treatment planning system (TPS) and its ability to reduce treatment planning time while providing high dosimetric plan quality. Nine patients with localized prostate cancer who were previously treated with 78 Gy in 39 fractions using VMAT plans and rayArc system based on the direct machine parameter optimization (DMPO) algorithm were selected and replanned using the VMAT‐MCO system. First, the dosimetric quality of the plans was evaluated using multiple conformity metrics that account for target coverage and sparing of healthy tissue, used in our departmental clinical protocols. The conformity and homogeneity index, number of monitor units, and treatment planning time for both modalities were assessed. Next, the effects of the technical plan parameters, such as constraint leaf motion CLM (cm/°) and maximum arc delivery time T (s), on the accuracy of delivered dose were evaluated using quality assurance passing rates (QAs) measured using the Delta4 phantom from ScandiDos. For the dosimetric plan's quality analysis, the results show that the VMAT‐MCO system provides plans comparable to the rayArc system with no statistical difference for V95% (p<0.01), D1% (p<0.01), CI (p<0.01), and HI (p<0.01) of the PTV, bladder (p<0.01), and rectum (p<0.01) constraints, except for the femoral heads and healthy tissues, for which a dose reduction was observed using MCO compared with rayArc (p<0.01). The technical parameter study showed that a combination of CLM equal to 0.5 cm/degree and a maximum delivery time of 72 s allowed the accurate delivery of the VMAT‐MCO plan on the Elekta Versa HD linear accelerator. Planning evaluation and dosimetric measurements showed that VMAT‐MCO can be used clinically with the advantage of enhanced planning process efficiency by reducing the treatment planning time without impairing dosimetric quality. PACS numbers: 87.55.D, 87.55.de, 87.55.Qr PMID:26103500

  16. The effects on anxiety and quality of life of breast cancer patients following completion of the first cycle of chemotherapy

    PubMed Central

    Charalambous, Andreas; Kaite, Charis P; Charalambous, Melanie; Tistsi, Theologia; Kouta, Christiana

    2017-01-01

    Objectives: Breast cancer patients as part of their treatment need to undergo various forms of chemotherapy. This is considered as a burdensome experience for many patients often leading to significant levels of anxiety. The aim of the study was to explore the anxiety levels and any correlations to the quality of life of women with breast cancer that were undergoing chemotherapy. Methods: This was a cross-sectional study utilizing an explanatory sequential design. Data were collected from 355 women with breast cancer with the Self Anxiety Scale, the EORTC QLQ-C30, the EORTC QLQ-BR23 and sociodemographic questionnaires. Further insight to patients’ experiences was given through 12 in-depth interviews. Results: Anxiety scores ranged between 24 and 75 (45.7 ± 10.11), with 44% reporting serious or/and intense anxiety. The results revealed statistically significant differences on patients’ anxiety levels depending on their source of support. Overall, patients’ global health-related quality of life was found to be low to average 55.91 ± 17.94. The results showed low emotional functioning (49.30 ± 29.12), low role functions (56.34 ± 27.50) and low sexual functioning (24.93 ± 20.75). Patients also reported experiencing problems with fatigue (49.04 ± 29.12), insomnia (44.32 ± 32.97), hair loss (48.25 ± 38.32) and arm symptoms (36.53 ± 23.71). Patients being solely supported by the family experienced higher anxiety levels (p < 0.001) and lower quality of life (p < 0.001). There was a statistically significant negative correlation between anxiety and quality of life (r = −0.623, p < 0.001). Statistically significant differences were also found in relation to demographics, anxiety and quality of life. The interviews provided further evidence on the impact of anxiety on patients’ lives. Conclusion: The time following the completion of the first cycle of chemotherapy is associated with anxiety and lower quality of life levels in breast cancer patients. Healthcare providers should consider the supportive healthcare needs from the beginning of chemotherapy in patients to optimize their conventional and supportive healthcare outcomes. PMID:28694967

  17. Collimator optimization in myocardial perfusion SPECT using the ideal observer and realistic background variability for lesion detection and joint detection and localization tasks

    NASA Astrophysics Data System (ADS)

    Ghaly, Michael; Du, Yong; Links, Jonathan M.; Frey, Eric C.

    2016-03-01

    In SPECT imaging, collimators are a major factor limiting image quality and largely determine the noise and resolution of SPECT images. In this paper, we seek the collimator with the optimal tradeoff between image noise and resolution with respect to performance on two tasks related to myocardial perfusion SPECT: perfusion defect detection and joint detection and localization. We used the Ideal Observer (IO) operating on realistic background-known-statistically (BKS) and signal-known-exactly (SKE) data. The areas under the receiver operating characteristic (ROC) and localization ROC (LROC) curves (AUCd, AUCd+l), respectively, were used as the figures of merit for both tasks. We used a previously developed population of 54 phantoms based on the eXtended Cardiac Torso Phantom (XCAT) that included variations in gender, body size, heart size and subcutaneous adipose tissue level. For each phantom, organ uptakes were varied randomly based on distributions observed in patient data. We simulated perfusion defects at six different locations with extents and severities of 10% and 25%, respectively, which represented challenging but clinically relevant defects. The extent and severity are, respectively, the perfusion defect’s fraction of the myocardial volume and reduction of uptake relative to the normal myocardium. Projection data were generated using an analytical projector that modeled attenuation, scatter, and collimator-detector response effects, a 9% energy resolution at 140 keV, and a 4 mm full-width at half maximum (FWHM) intrinsic spatial resolution. We investigated a family of eight parallel-hole collimators that spanned a large range of sensitivity-resolution tradeoffs. For each collimator and defect location, the IO test statistics were computed using a Markov Chain Monte Carlo (MCMC) method for an ensemble of 540 pairs of defect-present and -absent images that included the aforementioned anatomical and uptake variability. Sets of test statistics were computed for both tasks and analyzed using ROC and LROC analysis methodologies. The results of this study suggest that collimators with somewhat poorer resolution and higher sensitivity than those of a typical low-energy high-resolution (LEHR) collimator were optimal for both defect detection and joint detection and localization tasks in myocardial perfusion SPECT for the range of defect sizes investigated. This study also indicates that optimizing instrumentation for a detection task may provide near-optimal performance on the more challenging detection-localization task.

  18. The impact of child's severity on quality-of-life among parents of children with autism spectrum disorder: the mediating role of optimism.

    PubMed

    Wisessathorn, Manika; Chanuantong, Tanasugarn; Fisher, Edwin B

    2013-10-01

    To investigate the impact of child severity and optimism on quality-of-life in parents of children with Autism Spectrum Disorder (ASD). Additionally, the role of optimism as a mediator between child's severity and parental quality-of-life was also evaluated. Three hundred and three parents of children with ASD were recruited from the local autistic centers and schools in Bangkok, Thailand. A demographic information sheet, the Childhood Autism Rating Scale (CARS), the Life Orientation Test-Revised (LOT-R), and the WHOQOL-BREF were used to collect parental information. Using Pearson correlation, a significant negative association was found between child's severity and parental quality-of-life, while optimism was found to correlate positively with parental outcomes. The finding from path analysis confirmed that impairment of language and repetitive behavior of an ASD child were associated with optimism that, in turn, predicted the level of parental quality-of-life in all domains. The current findings support the role of optimism as a mediator between child's severity and parental quality-of-life. Implications for the development of interventions focused on enhancing parents' optimism were recommended.

  19. Reduction of product-related species during the fermentation and purification of a recombinant IL-1 receptor antagonist at the laboratory and pilot scale.

    PubMed

    Schirmer, Emily B; Golden, Kathryn; Xu, Jin; Milling, Jesse; Murillo, Alec; Lowden, Patricia; Mulagapati, Srihariraju; Hou, Jinzhao; Kovalchin, Joseph T; Masci, Allyson; Collins, Kathryn; Zarbis-Papastoitsis, Gregory

    2013-08-01

    Through a parallel approach of tracking product quality through fermentation and purification development, a robust process was designed to reduce the levels of product-related species. Three biochemically similar product-related species were identified as byproducts of host-cell enzymatic activity. To modulate intracellular proteolytic activity, key fermentation parameters (temperature, pH, trace metals, EDTA levels, and carbon source) were evaluated through bioreactor optimization, while balancing negative effects on growth, productivity, and oxygen demand. The purification process was based on three non-affinity steps and resolved product-related species by exploiting small charge differences. Using statistical design of experiments for elution conditions, a high-resolution cation exchange capture column was optimized for resolution and recovery. Further reduction of product-related species was achieved by evaluating a matrix of conditions for a ceramic hydroxyapatite column. The optimized fermentation process was transferred from the 2-L laboratory scale to the 100-L pilot scale and the purification process was scaled accordingly to process the fermentation harvest. The laboratory- and pilot-scale processes resulted in similar process recoveries of 60 and 65%, respectively, and in a product that was of equal quality and purity to that of small-scale development preparations. The parallel approach for up- and downstream development was paramount in achieving a robust and scalable clinical process. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  20. Analysis of deep inferior epigastric perforator (DIEP) arteries by using MDCTA: Comparison between 2 post-processing techniques.

    PubMed

    Saba, Luca; Atzeni, Matteo; Ribuffo, Diego; Mallarini, Giorgio; Suri, Jasjit S

    2012-08-01

    Our purpose was to compare two post-processing techniques, Maximum-Intensity-Projection (MIP) and Volume Rendering (VR), for the study of perforator arteries. Thirty patients who underwent Multi-Detector-Row CT Angiography (MDCTA) between February 2010 and May 2010 were retrospectively analyzed. For each patient and for each reconstruction method, the image quality was evaluated and the inter- and intra-observer agreement was calculated according to Cohen's kappa statistic. The Hounsfield Unit (HU) value in the common femoral artery was quantified and the correlation (Pearson statistic) between image quality and HU value was explored. The Pearson r between the right and left common femoral artery was excellent (r=0.955). The highest image quality score was obtained using MIP for both observers (total value 75, with a mean value 2.67 for observer 1 and total value of 79 and a mean value of 2.82 for observer 2). The highest agreement between the two observers was detected using the MIP protocol with a Cohen's kappa value of 0.856. The ROC area under the curve (Az) for the VR is 0.786 (0.086 SD; p value=0.0009) whereas the ROC area under the curve (Az) for the MIP is 0.928 (0.051 SD; p value=0.0001). MIP showed the optimal inter- and intra-observer agreement and the highest quality scores and therefore should be used as the post-processing technique in the analysis of perforating arteries. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  1. A multiple objective optimization approach to quality control

    NASA Technical Reports Server (NTRS)

    Seaman, Christopher Michael

    1991-01-01

    The use of product quality as the performance criterion for manufacturing system control is explored. The goal in manufacturing, for economic reasons, is to optimize product quality. The problem is that since quality is a rather nebulous product characteristic, there is seldom an analytic function that can be used as a measure. Therefore standard control approaches, such as optimal control, cannot readily be applied. A second problem with optimizing product quality is that it is typically measured along many dimensions: there are many aspects of quality which must be optimized simultaneously. Very often these different aspects are incommensurate and competing. The concept of optimality must now include accepting tradeoffs among the different quality characteristics. These problems are addressed using multiple objective optimization. It is shown that the quality control problem can be defined as a multiple objective optimization problem. A controller structure is defined using this as the basis. Then, an algorithm is presented which can be used by an operator to interactively find the best operating point. Essentially, the algorithm uses process data to provide the operator with two pieces of information: (1) if it is possible to simultaneously improve all quality criteria, then determine what changes to the process input or controller parameters should be made to do this; and (2) if it is not possible to improve all criteria, and the current operating point is not a desirable one, select a criterion in which a tradeoff should be made, and make input changes to improve all other criteria. The process is not operating at an optimal point in any sense if no tradeoff has to be made to move to a new operating point. This algorithm ensures that operating points are optimal in some sense and provides the operator with information about tradeoffs when seeking the best operating point. The multiobjective algorithm was implemented in two different injection molding scenarios: tuning of process controllers to meet specified performance objectives and tuning of process inputs to meet specified quality objectives. Five case studies are presented.
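
    One building block of such a scheme is checking whether a candidate operating point can be improved on all quality criteria at once (Pareto dominance); the sketch below illustrates that check on invented two-criterion data, not the injection-molding case studies.

    ```python
    import numpy as np

    # A candidate operating point is worth keeping only if no other candidate
    # improves every quality criterion at once (Pareto dominance). Criteria here
    # are "lower is better" deviations, and the data are invented.
    def is_pareto_optimal(point, candidates):
        return not any(np.all(c <= point) and np.any(c < point) for c in candidates)

    operating_points = np.array([
        [0.10, 0.30],   # [dimensional error, surface-defect score]
        [0.20, 0.15],
        [0.12, 0.28],
        [0.25, 0.35],   # dominated: the first point is better on both criteria
    ])
    for point in operating_points:
        status = "Pareto-optimal" if is_pareto_optimal(point, operating_points) else "dominated"
        print(point, status)
    ```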

  2. Selected Bibliography on Optimizing Techniques in Statistics

    DTIC Science & Technology

    1981-08-01

    problems in business, industry and government are formulated as optimization problems. Topics in optimization constitute an essential area of study in... numerical, (iii) mathematical programming, and (iv) variational. We provide pertinent references with statistical applications in the above areas in Part I... TMS Advanced Studies in Management Sciences, North-Holland Publishing Company, Amsterdam. (To appear.) Spang, H. A. (1962). A review of minimization

  3. Evaluation of dynamic row-action maximum likelihood algorithm reconstruction for quantitative 15O brain PET.

    PubMed

    Ibaraki, Masanobu; Sato, Kaoru; Mizuta, Tetsuro; Kitamura, Keishi; Miura, Shuichi; Sugawara, Shigeki; Shinohara, Yuki; Kinoshita, Toshibumi

    2009-09-01

    A modified version of row-action maximum likelihood algorithm (RAMLA) using a 'subset-dependent' relaxation parameter for noise suppression, or dynamic RAMLA (DRAMA), has been proposed. The aim of this study was to assess the capability of DRAMA reconstruction for quantitative ¹⁵O brain positron emission tomography (PET). Seventeen healthy volunteers were studied using a 3D PET scanner. The PET study included 3 sequential PET scans for C¹⁵O, ¹⁵O₂ and H₂¹⁵O. First, the number of main iterations (N_it) in DRAMA was optimized in relation to image convergence and statistical image noise. To estimate the statistical variance of reconstructed images on a pixel-by-pixel basis, a sinogram bootstrap method was applied using list-mode PET data. Once the optimal N_it was determined, statistical image noise and quantitative parameters, i.e., cerebral blood flow (CBF), cerebral blood volume (CBV), cerebral metabolic rate of oxygen (CMRO₂) and oxygen extraction fraction (OEF) were compared between DRAMA and conventional FBP. DRAMA images were post-filtered so that their spatial resolutions were matched with FBP images with a 6-mm FWHM Gaussian filter. Based on the count recovery data, N_it = 3 was determined as an optimal parameter for ¹⁵O PET data. The sinogram bootstrap analysis revealed that DRAMA reconstruction resulted in less statistical noise, especially in a low-activity region compared to FBP. Agreement of quantitative values between FBP and DRAMA was excellent. For DRAMA images, average gray matter values of CBF, CBV, CMRO₂ and OEF were 46.1 ± 4.5 (mL/100 mL/min), 3.35 ± 0.40 (mL/100 mL), 3.42 ± 0.35 (mL/100 mL/min) and 42.1 ± 3.8 (%), respectively. These values were comparable to corresponding values with FBP images: 46.6 ± 4.6 (mL/100 mL/min), 3.34 ± 0.39 (mL/100 mL), 3.48 ± 0.34 (mL/100 mL/min) and 42.4 ± 3.8 (%), respectively. DRAMA reconstruction is applicable to quantitative ¹⁵O PET study and is superior to conventional FBP in terms of image quality.
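
    The sinogram bootstrap idea, resampling recorded events with replacement and reconstructing each replicate to estimate pixel-wise variance, can be sketched as follows; the event data and the trivial binning "reconstruction" are stand-ins for list-mode PET data and DRAMA/FBP reconstruction.

    ```python
    import numpy as np

    # Resample recorded events with replacement, "reconstruct" each replicate,
    # and take the variance across replicates to estimate pixel-wise noise.
    rng = np.random.default_rng(7)
    events = rng.integers(0, 64, size=5_000)          # detector-bin index of each event

    def reconstruct(evts):
        return np.bincount(evts, minlength=64).astype(float)

    replicates = np.array([
        reconstruct(rng.choice(events, size=events.size, replace=True))
        for _ in range(200)
    ])
    pixel_sd = replicates.std(axis=0)
    print("mean bootstrap SD per bin:", round(pixel_sd.mean(), 2))
    ```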

  4. A statistical, task-based evaluation method for three-dimensional x-ray breast imaging systems using variable-background phantoms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, Subok; Jennings, Robert; Liu Haimo

    Purpose: For the last few years, development and optimization of three-dimensional (3D) x-ray breast imaging systems, such as digital breast tomosynthesis (DBT) and computed tomography, have drawn much attention from the medical imaging community, both academia and industry. However, there is still much room for understanding how to best optimize and evaluate the devices over a large space of many different system parameters and geometries. Current evaluation methods, which work well for 2D systems, do not incorporate the depth information from the 3D imaging systems. Therefore, it is critical to develop a statistically sound evaluation method to investigate the usefulness of inclusion of depth and background-variability information into the assessment and optimization of the 3D systems. Methods: In this paper, we present a mathematical framework for a statistical assessment of planar and 3D x-ray breast imaging systems. Our method is based on statistical decision theory, in particular, making use of the ideal linear observer called the Hotelling observer. We also present a physical phantom that consists of spheres of different sizes and materials for producing an ensemble of randomly varying backgrounds to be imaged for a given patient class. Lastly, we demonstrate our evaluation method in comparing laboratory mammography and three-angle DBT systems for signal detection tasks using the phantom's projection data. We compare the variable phantom case to that of a phantom of the same dimensions filled with water, which we call the uniform phantom, based on the performance of the Hotelling observer as a function of signal size and intensity. Results: Detectability trends calculated using the variable and uniform phantom methods are different from each other for both mammography and DBT systems. Conclusions: Our results indicate that measuring the system's detection performance with consideration of background variability may lead to differences in system performance estimates and comparisons. For the assessment of 3D systems, to accurately determine trade-offs between image quality and radiation dose, it is critical to incorporate randomness arising from the imaging chain including background variability into system performance calculations.
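
    A minimal sketch of the Hotelling-observer figure of merit that the framework above is built on, assuming flattened regions of interest with and without a signal drawn from an ensemble of variable backgrounds; the toy data and the function name are illustrative, not the authors' phantom data.

```python
import numpy as np

def hotelling_snr(signal_rois, background_rois):
    """Hotelling-observer detectability from two ensembles of flattened ROIs (rows = samples)."""
    g1 = np.asarray(signal_rois, float)
    g0 = np.asarray(background_rois, float)
    dg = g1.mean(axis=0) - g0.mean(axis=0)                              # mean signal difference
    S = 0.5 * (np.cov(g1, rowvar=False) + np.cov(g0, rowvar=False))     # average covariance
    w = np.linalg.pinv(S) @ dg                                          # Hotelling template
    return float(np.sqrt(max(dg @ w, 0.0)))

rng = np.random.default_rng(0)
background = rng.normal(0.0, 1.0, size=(200, 64))   # 200 variable-background ROIs, 64 pixels each
signal = rng.normal(0.3, 1.0, size=(200, 64))       # same statistics plus a weak uniform signal
print(hotelling_snr(signal, background))
```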

  5. Optimizing ELISAs for precision and robustness using laboratory automation and statistical design of experiments.

    PubMed

    Joelsson, Daniel; Moravec, Phil; Troutman, Matthew; Pigeon, Joseph; DePhillips, Pete

    2008-08-20

    Transferring manual ELISAs to automated platforms requires optimizing the assays for each particular robotic platform. These optimization experiments are often time consuming and difficult to perform using a traditional one-factor-at-a-time strategy. In this manuscript we describe the development of an automated process using statistical design of experiments (DOE) to quickly optimize immunoassays for precision and robustness on the Tecan EVO liquid handler. By using fractional factorials and a split-plot design, five incubation time variables and four reagent concentration variables can be optimized in a short period of time.
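
    For readers unfamiliar with the designs mentioned above, a hedged sketch of a two-level fractional factorial follows: the fifth factor is deliberately aliased with the four-way interaction (defining relation E = ABCD), so five variables can be screened in 16 runs rather than 32. The factor names are placeholders, not the assay's incubation-time or reagent-concentration variables, and the split-plot structure used in the paper is not reproduced.

```python
from itertools import product

factors = ["A", "B", "C", "D"]
runs = []
for levels in product([-1, 1], repeat=len(factors)):
    e = levels[0] * levels[1] * levels[2] * levels[3]   # defining relation E = ABCD
    runs.append(dict(zip(factors, levels), E=e))

for run in runs:
    print(run)
print(len(runs), "runs instead of", 2 ** 5)             # 16 instead of 32
```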

  6. A randomized controlled trial of expressive writing in breast cancer survivors with lymphedema.

    PubMed

    Sohl, Stephanie J; Dietrich, Mary S; Wallston, Kenneth A; Ridner, Sheila H

    2017-07-01

    Breast cancer survivors who develop lymphedema report poorer quality of life (QoL) than those without lymphedema. Expressive writing is a potential intervention to address QoL. Adult women (N = 107) with breast cancer and chronic Stage II lymphedema were randomised to writing about thoughts and feelings specific to lymphedema and its treatment (intervention) or about daily activities (control) for four 20-min sessions. Outcome measures were several indicators of QoL assessed at baseline, one, three, and six months post-intervention (total scores and subscales of Upper Limb Lymphedema 27 and Functional Assessment of Cancer Therapy-Breast). Hypothesised moderators of change in QoL were dispositional optimism, avoidant behaviours, and time since lymphedema diagnosis. There were no statistically significant intent-to-treat main effects of expressive writing on QoL. Statistically significant moderating effects on change in different indicators of QoL were observed for all three moderators. Expressive writing was more effective for improving QoL in women who were higher on optimism, lower on avoidance and had less time since a lymphedema diagnosis. These results provide further evidence that there are subsets of individuals for whom expressive writing is more effective. Future research may investigate targeting expressive writing based on identified moderators.

  7. Characterization of interfade duration for satellite communication systems design and optimization in a temperate climate

    NASA Astrophysics Data System (ADS)

    Jorge, Flávio; Riva, Carlo; Rocha, Armando

    2016-03-01

    The characterization of the fade dynamics on Earth-satellite links is an important subject when designing the so-called fade mitigation techniques that contribute to the proper reliability of the satellite communication systems and the customers' quality of service (QoS). The interfade duration, defined as the period between two consecutive fade events, has been only poorly analyzed using limited data sets, but its complete characterization would enable the design and optimization of the satellite communication systems by estimating the system requirements to recover in time before the next propagation impairment. Depending on this analysis, several actions can be taken ensuring the service maintenance. In this paper we present for the first time a detailed and comprehensive analysis of the statistical properties of interfade events based on 9 years of in-excess attenuation measurements at Ka band (19.7 GHz) with the very high availability that is required to build a reliable data set, mainly for the longer interfade duration events. The number of years necessary to reach the statistical stability of interfade duration is also evaluated for the first time, providing a reference when assessing the relevance of the results published in the past. The study is carried out in Aveiro, Portugal, which experiences a temperate Mediterranean climate with oceanic influences.

  8. 3Drefine: an interactive web server for efficient protein structure refinement

    PubMed Central

    Bhattacharya, Debswapna; Nowotny, Jackson; Cao, Renzhi; Cheng, Jianlin

    2016-01-01

    3Drefine is an interactive web server for consistent and computationally efficient protein structure refinement with the capability to perform web-based statistical and visual analysis. The 3Drefine refinement protocol utilizes iterative optimization of the hydrogen bonding network combined with atomic-level energy minimization of the optimized model using composite physics- and knowledge-based force fields for efficient protein structure refinement. The method has been extensively evaluated on blind CASP experiments as well as on large-scale and diverse benchmark datasets and exhibits consistent improvement over the initial structure in both global and local structural quality measures. The 3Drefine web server allows for convenient protein structure refinement through a text or file input submission, email notification, provided example submission and is freely available without any registration requirement. The server also provides comprehensive analysis of submissions through various energy and statistical feedback and interactive visualization of multiple refined models through the JSmol applet that is equipped with numerous protein model analysis tools. The web server has been extensively tested and used by many users. As a result, the 3Drefine web server conveniently provides a useful tool easily accessible to the community. The 3Drefine web server has been made publicly available at the URL: http://sysbio.rnet.missouri.edu/3Drefine/. PMID:27131371

  9. System for verifiable CT radiation dose optimization based on image quality. part II. process control system.

    PubMed

    Larson, David B; Malarik, Remo J; Hall, Seth M; Podberesky, Daniel J

    2013-10-01

    To evaluate the effect of an automated computed tomography (CT) radiation dose optimization and process control system on the consistency of estimated image noise and size-specific dose estimates (SSDEs) of radiation in CT examinations of the chest, abdomen, and pelvis. This quality improvement project was determined not to constitute human subject research. An automated system was developed to analyze each examination immediately after completion, and to report individual axial-image-level and study-level summary data for patient size, image noise, and SSDE. The system acquired data for 4 months beginning October 1, 2011. Protocol changes were made by using parameters recommended by the prediction application, and 3 months of additional data were acquired. Preimplementation and postimplementation mean image noise and SSDE were compared by using unpaired t tests and F tests. Common-cause variation was differentiated from special-cause variation by using a statistical process control individual chart. A total of 817 CT examinations, 490 acquired before and 327 acquired after the initial protocol changes, were included in the study. Mean patient age and water-equivalent diameter were 12.0 years and 23.0 cm, respectively. The difference between actual and target noise increased from -1.4 to 0.3 HU (P < .01) and the standard deviation decreased from 3.9 to 1.6 HU (P < .01). Mean SSDE decreased from 11.9 to 7.5 mGy, a 37% reduction (P < .01). The process control chart identified several special causes of variation. Implementation of an automated CT radiation dose optimization system led to verifiable simultaneous decrease in image noise variation and SSDE. The automated nature of the system provides the opportunity for consistent CT radiation dose optimization on a broad scale. © RSNA, 2013.
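
    A minimal sketch of the individuals (I) control chart logic used to separate common-cause from special-cause variation, assuming the monitored quantity is the per-examination difference between actual and target noise; the data values are invented, and 2.66 is the standard moving-range chart factor, not a value from the paper.

```python
import numpy as np

def individuals_chart_limits(x):
    """Center line and control limits for an individuals (I) chart,
    estimated from the average moving range of consecutive points."""
    x = np.asarray(x, float)
    mr_bar = np.mean(np.abs(np.diff(x)))   # average moving range
    center = x.mean()
    ucl = center + 2.66 * mr_bar           # 2.66 = 3 / d2, with d2 = 1.128 for moving ranges of size 2
    lcl = center - 2.66 * mr_bar
    return lcl, center, ucl

# hypothetical per-examination "actual minus target noise" values in HU
deltas = [1.2, -0.4, 0.8, 2.1, -1.0, 0.3, 0.9, -0.2, 1.5, 0.1]
lcl, cl, ucl = individuals_chart_limits(deltas)
out_of_control = [d for d in deltas if d < lcl or d > ucl]   # special-cause signals
print(lcl, cl, ucl, out_of_control)
```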

  10. Quality by design approach for developing chitosan-Ca-alginate microspheres for colon delivery of celecoxib-hydroxypropyl-β-cyclodextrin-PVP complex.

    PubMed

    Mennini, N; Furlanetto, S; Cirri, M; Mura, P

    2012-01-01

    The aim of the present work was to develop a new multiparticulate system, designed for colon-specific delivery of celecoxib for both systemic (in chronotherapeutic treatment of arthritis) and local (in prophylaxis of colon carcinogenesis) therapy. The system simultaneously benefits from ternary complexation with hydroxypropyl-β-cyclodextrin and PVP (polyvinylpyrrolidone), to increase drug solubility, and vectorization in chitosan-Ca-alginate microspheres, to exploit the colon-specific carrier properties of these polymers. Statistical experimental design was employed to investigate the combined effect of four formulation variables, i.e., the percentages of alginate, CaCl₂ and chitosan and the time of cross-linking, on microsphere entrapment efficiency (EE%) and the drug amount released after 4 h in colonic medium, considered as the responses to be optimized. Design of experiments was used in the context of Quality by Design, which requires a multivariate approach for understanding the multifactorial relationships among formulation parameters. A Doehlert design allowed for defining a design space, which revealed that variations of the considered factors had in most cases an opposite influence on the responses. A desirability function was used to attain simultaneous optimization of both responses. The desired goals were achieved for both systemic and local use of celecoxib. Experimental values obtained from the optimized formulations were in both cases very close to the predicted values, thus confirming the validity of the generated mathematical model. These results demonstrated the effectiveness of the proposed joint use of drug-cyclodextrin complexation and chitosan-Ca-alginate microsphere vectorization, as well as the usefulness of the multivariate approach for the preparation of colon-targeted celecoxib microspheres with optimized properties. Copyright © 2011 Elsevier B.V. All rights reserved.
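
    A hedged sketch of the desirability-function idea used for the simultaneous optimization of the two responses: each response is mapped onto a [0, 1] desirability scale and the overall desirability is their geometric mean. The target ranges and response values below are invented, not the paper's data.

```python
import numpy as np

def desirability_larger_is_better(y, low, high, s=1.0):
    """Desirability rises from 0 at `low` to 1 at `high` (e.g., entrapment efficiency %)."""
    d = (np.asarray(y, float) - low) / (high - low)
    return np.clip(d, 0.0, 1.0) ** s

def overall_desirability(*ds):
    """Geometric mean of the individual desirabilities."""
    ds = np.asarray(ds, float)
    return float(np.prod(ds) ** (1.0 / len(ds)))

# hypothetical candidate formulation: EE% = 72, drug released at 4 h in colonic medium = 55%
d1 = desirability_larger_is_better(72.0, low=40.0, high=90.0)
d2 = desirability_larger_is_better(55.0, low=20.0, high=80.0)
print(overall_desirability(d1, d2))   # higher is better; compare candidates on this single score
```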

  11. Optimization of soymilk, mango nectar and sucrose solution mixes for better quality of soymilk based beverage.

    PubMed

    Getu, Rahel; Tola, Yetenayet B; Neela, Satheesh

    2017-01-01

    Soy milk-based beverages play an important role as a healthy food alternative for human consumption. However, the ‘beany’ flavor and chalky mouth feel of soy milk often make it unpalatable to consumers. The objective of the present study was to optimize a blend of soy milk, mango nectar and sucrose solution for the best quality soy milk-based beverage. This study was designed to develop a soy milk blended beverage, with mango nectar and sucrose solutions, with the best physicochemical and sensory properties. Fourteen combinations of formulations were determined by a D-optimal mixture simplex lattice design, using Design-Expert. The blended beverages were prepared by mixing the three basic ingredients within the ranges of 60−100% soy milk, 0–25% mango nectar and 0–15% sucrose solution. The prepared blended beverage was analyzed for selected physicochemical and sensory properties. The statistical significance of the terms in the regression equations was examined by Analysis of Variance (ANOVA) for each response, and the significance test level was set at 5% (p < 0.05). The results showed that, as the proportion of mango nectar and sucrose solution increased, total color change, total soluble solids, gross energy, titratable acidity, and beta-carotene contents increased, while a decrease in moisture, ash, protein, ether extract, mineral and phytic acid contents was observed. Finally, numerical optimization determined that 81% soy milk, 16% mango nectar and 3% sugar solution will give a soy milk blended beverage with the best physicochemical and sensory properties, with a desirability of 0.564. Blending soy milk with fruit juice such as mango is beneficial, as it improves sensory as well as selected nutritional parameters.

  12. Optimization of Price and Quality in Service Systems,

    DTIC Science & Technology

    Price and service quality are important variables in the design of optimal service systems. Price is important because of the strong consumption...priorities of service offered. The paper takes a systematic view of this problem, and presents techniques for quantitative determination of the optimal prices and service quality in a wide class of systems. (Author)

  13. Vitamin B12 production from crude glycerol by Propionibacterium freudenreichii ssp. shermanii: optimization of medium composition through statistical experimental designs.

    PubMed

    Kośmider, Alicja; Białas, Wojciech; Kubiak, Piotr; Drożdżyńska, Agnieszka; Czaczyk, Katarzyna

    2012-02-01

    A two-step statistical experimental design was employed to optimize the medium for vitamin B(12) production from crude glycerol by Propionibacterium freudenreichii ssp. shermanii. In the first step, using Plackett-Burman design, five of 13 tested medium components (calcium pantothenate, NaH(2)PO(4)·2H(2)O, casein hydrolysate, glycerol and FeSO(4)·7H(2)O) were identified as factors having significant influence on vitamin production. In the second step, a central composite design was used to optimize levels of medium components selected in the first step. Valid statistical models describing the influence of significant factors on vitamin B(12) production were established for each optimization phase. The optimized medium provided a 93% increase in final vitamin concentration compared to the original medium. Copyright © 2011 Elsevier Ltd. All rights reserved.
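
    A minimal sketch of how a 12-run Plackett-Burman screening design, of the kind used in the first optimization step, can be constructed: cyclic shifts of a commonly tabulated generating row plus a final all-low run give an orthogonal two-level design for up to 11 factors. The 13 medium components screened in the paper would need a larger design (e.g., 20 runs); the 12-run case is shown only to illustrate the construction.

```python
import numpy as np

# commonly tabulated generating row for the 12-run Plackett-Burman design
gen = np.array([1, 1, -1, 1, 1, 1, -1, -1, -1, 1, -1])
design = np.vstack([np.roll(gen, k) for k in range(11)] + [np.full(11, -1)])

print(design.shape)                                      # (12, 11): 12 runs, up to 11 two-level factors
print(np.allclose(design.T @ design, 12 * np.eye(11)))   # True: columns are mutually orthogonal
```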

  14. Comparison of diffusion-weighted MRI acquisition techniques for normal pancreas at 3.0 Tesla.

    PubMed

    Yao, Xiu-Zhong; Kuang, Tiantao; Wu, Li; Feng, Hao; Liu, Hao; Cheng, Wei-Zhong; Rao, Sheng-Xiang; Wang, He; Zeng, Meng-Su

    2014-01-01

    We aimed to optimize diffusion-weighted imaging (DWI) acquisitions for normal pancreas at 3.0 Tesla. Thirty healthy volunteers were examined using four DWI acquisition techniques with b values of 0 and 600 s/mm2 at 3.0 Tesla, including breath-hold DWI, respiratory-triggered DWI, respiratory-triggered DWI with inversion recovery (IR), and free-breathing DWI with IR. Artifacts, signal-to-noise ratio (SNR) and apparent diffusion coefficient (ADC) of normal pancreas were statistically evaluated among different DWI acquisitions. Statistical differences were noticed in artifacts, SNR, and ADC values of normal pancreas among different DWI acquisitions by ANOVA (P <0.001). Normal pancreas imaging had the lowest artifact in respiratory-triggered DWI with IR, the highest SNR in respiratory-triggered DWI, and the highest ADC value in free-breathing DWI with IR. The head, body, and tail of normal pancreas had statistically different ADC values on each DWI acquisition by ANOVA (P < 0.05). The highest image quality for normal pancreas was obtained using respiratory-triggered DWI with IR. Normal pancreas displayed inhomogeneous ADC values along the head, body, and tail structures.

  15. Water quality modelling of Jadro spring.

    PubMed

    Margeta, J; Fistanic, I

    2004-01-01

    Management of water quality in karst is a specific problem. Water generally moves very fast by infiltration processes but far more by concentrated flows through fissures and openings in karst. This enables the entire surface pollution to be transferred fast and without filtration into groundwater springs. A typical example is the Jadro spring. Changes in water quality at the spring are sudden, but short. Turbidity, as a major water quality problem for karst springs, regularly exceeds allowable standards. Former practice in problem solving has been reduced to intensive water disinfection in periods of great turbidity, without analysis of the disinfection by-product risks for water users. The main prerequisite for water quality control and optimization of water disinfection is knowledge of the raw water quality and the nature of its occurrence. The analysis of monitoring data and their functional relationship with hydrological parameters enables establishment of a stochastic model that helps obtain better information on turbidity in different periods of the year. Using the model, a great number of average monthly and extreme daily values are generated. Statistical analysis of these data yields the probability of occurrence of high turbidity in particular months. This information can be used for designing an expert system for water quality management of karst springs. Thus, the time series model becomes a valuable tool in the management of the drinking water quality of the Jadro spring.

  16. Is it possible for knowledge-based planning to improve intensity modulated radiation therapy plan quality for planners with different planning experiences in left-sided breast cancer patients?

    PubMed

    Wang, Juanqi; Hu, Weigang; Yang, Zhaozhi; Chen, Xiaohui; Wu, Zhiqiang; Yu, Xiaoli; Guo, Xiaomao; Lu, Saiquan; Li, Kaixuan; Yu, Gongyi

    2017-05-22

    Knowledge-based planning (KBP) is a promising technique that can improve plan quality and increase planning efficiency. However, no attempts have been made so far to extend the domain of KBP to planners with different planning experiences. The purpose of this study was to quantify the potential gains for planners with different planning experiences after implementing KBP in intensity modulated radiation therapy (IMRT) plans for left-sided breast cancer patients. The model libraries were populated with 80 expert clinical plans from treated patients who previously received left-sided breast-conserving surgery and IMRT with simultaneously integrated boost. The libraries were created in RapidPlan™. Six planners with different planning experiences (2 beginner planners, 2 junior planners and 2 senior planners) generated manual and KBP optimized plans for an additional 10 patients, similar to those included in the model libraries. The plan qualities were compared between manual and KBP plans. All plans were capable of achieving the prescription requirement. There were almost no statistically significant differences in terms of the planning target volume (PTV) coverage and dose conformality. The doses for most organs-at-risk (OARs) were on average lower than or equal to those of the manual plans in the KBP plans, except for the senior planners, where the very small differences were not statistically significant. KBP data showed a systematic trend toward superior dose sparing for most parameters for the heart and ipsilateral lung. The observed decrease in the doses to these OARs could be achieved particularly for the beginner and junior planners. Many differences were statistically significant. It is feasible to generate acceptable IMRT plans after implementing KBP for left-sided breast cancer. KBP helps to effectively improve the quality of IMRT plans against the benchmark of manual plans for less experienced planners without any manual intervention. KBP showed promise for homogenizing plan quality by transferring planning expertise from more experienced to less experienced planners.

  17. Factors associated with frailty in chronically ill older adults.

    PubMed

    Hackstaff, Lynn

    2009-01-01

    An ex post facto analysis of a secondary dataset examined relationships between physical frailty, depression, and the self-perceived domains of health status and quality-of-life in older adults. The randomized sample included 992 community-dwelling, chronically ill, and functionally impaired adults age 65 and older who received care from a Southern California Kaiser Permanente medical center between 1998 and 2002. Physical frailty represents a level of physiologic vulnerability and functional loss that results in dependence on others for basic, daily living needs (Fried et al., 2001). The purpose of the study was to identify possible intervention junctures related to self-efficacy of older adults in order to help optimize their functionality. Multivariate correlation analyses showed statistically significant positive correlations between frailty level and depression (r = .18; p < .05), number of medical conditions (r = .09; p < .05), and self-rated quality-of-life (r = .24; p < .05). Frailty level showed a statistically significant negative correlation with self-perceived health status (r = -.25; p < .05). Notably, no statistically significant correlation was found between age and frailty level (r = -.03). In linear regression, self-perceived health status had a partial variance with frailty level (part r = -.18). The significant correlations found support further research to identify interventions to help vulnerable, older adults challenge self-perceived capabilities so that they may achieve optimum functionality through increased physical activity earlier on, and increased self-efficacy to support successful adaptation to aging-related losses.

  18. Correcting for Optimistic Prediction in Small Data Sets

    PubMed Central

    Smith, Gordon C. S.; Seaman, Shaun R.; Wood, Angela M.; Royston, Patrick; White, Ian R.

    2014-01-01

    The C statistic is a commonly reported measure of screening test performance. Optimistic estimation of the C statistic is a frequent problem because of overfitting of statistical models in small data sets, and methods exist to correct for this issue. However, many studies do not use such methods, and those that do correct for optimism use diverse methods, some of which are known to be biased. We used clinical data sets (United Kingdom Down syndrome screening data from Glasgow (1991–2003), Edinburgh (1999–2003), and Cambridge (1990–2006), as well as Scottish national pregnancy discharge data (2004–2007)) to evaluate different approaches to adjustment for optimism. We found that sample splitting, cross-validation without replication, and leave-1-out cross-validation produced optimism-adjusted estimates of the C statistic that were biased and/or associated with greater absolute error than other available methods. Cross-validation with replication, bootstrapping, and a new method (leave-pair-out cross-validation) all generated unbiased optimism-adjusted estimates of the C statistic and had similar absolute errors in the clinical data set. Larger simulation studies confirmed that all 3 methods performed similarly with 10 or more events per variable, or when the C statistic was 0.9 or greater. However, with lower events per variable or lower C statistics, bootstrapping tended to be optimistic but with lower absolute and mean squared errors than both methods of cross-validation. PMID:24966219
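
    A minimal sketch of bootstrap optimism correction for the C statistic, one of the approaches evaluated above: the model is refit on each bootstrap resample, and the average gap between its apparent performance on the resample and its performance on the original data estimates the optimism to subtract. The logistic model, data, and sample sizes are illustrative only, not the screening data from the paper.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

def optimism_corrected_c(X, y, n_boot=200, seed=0):
    rng = np.random.default_rng(seed)
    model = LogisticRegression(max_iter=1000).fit(X, y)
    apparent = roc_auc_score(y, model.predict_proba(X)[:, 1])   # apparent (optimistic) C statistic
    optimism = []
    n = len(y)
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)                             # resample with replacement
        m = LogisticRegression(max_iter=1000).fit(X[idx], y[idx])
        c_boot = roc_auc_score(y[idx], m.predict_proba(X[idx])[:, 1])
        c_test = roc_auc_score(y, m.predict_proba(X)[:, 1])
        optimism.append(c_boot - c_test)
    return apparent - float(np.mean(optimism))

rng = np.random.default_rng(1)
X = rng.normal(size=(80, 10))                                   # small data set, many predictors
y = (X[:, 0] + rng.normal(size=80) > 0).astype(int)
print(optimism_corrected_c(X, y))
```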

  19. AMERICAN-SOVIET SYMPOSIUM ON USE OF MATHEMATICAL MODELS TO OPTIMIZE WATER QUALITY MANAGEMENT HELD AT KHARKOV AND ROSTOV-ON-DON, USSR ON DECEMBER 9-16, 1975

    EPA Science Inventory

    The American-Soviet Symposium on Use of Mathematical Models to Optimize Water Quality Management examines methodological questions related to simulation and optimization modeling of processes that determine water quality of river basins. Discussants describe the general state of ...

  20. A Sparse Representation-Based Deployment Method for Optimizing the Observation Quality of Camera Networks

    PubMed Central

    Wang, Chang; Qi, Fei; Shi, Guangming; Wang, Xiaotian

    2013-01-01

    Deployment is a critical issue affecting the quality of service of camera networks. The deployment aims at adopting the least number of cameras to cover the whole scene, which may have obstacles to occlude the line of sight, with expected observation quality. This is generally formulated as a non-convex optimization problem, which is hard to solve in polynomial time. In this paper, we propose an efficient convex solution for deployment optimizing the observation quality based on a novel anisotropic sensing model of cameras, which provides a reliable measurement of the observation quality. The deployment is formulated as the selection of a subset of nodes from a redundant initial deployment with numerous cameras, which is an ℓ0 minimization problem. Then, we relax this non-convex optimization to a convex ℓ1 minimization employing the sparse representation. Therefore, the high quality deployment is efficiently obtained via convex optimization. Simulation results confirm the effectiveness of the proposed camera deployment algorithms. PMID:23989826
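
    A hedged sketch of the ℓ1 relaxation described above, not the authors' formulation: camera selection is relaxed from binary indicators to weights in [0, 1], the ℓ1 norm (here simply the sum of non-negative weights) is minimized subject to per-target observation-quality constraints, and the relaxed solution is thresholded back to a camera subset. The coverage matrix and threshold are invented.

```python
import numpy as np
from scipy.optimize import linprog

def select_cameras(A, q, thresh=0.5):
    """Relaxed selection: minimize sum(x) s.t. A @ x >= q, 0 <= x <= 1, then threshold."""
    n_targets, n_cams = A.shape
    res = linprog(c=np.ones(n_cams),
                  A_ub=-A, b_ub=-np.asarray(q, float),   # A @ x >= q  <=>  -A @ x <= -q
                  bounds=[(0.0, 1.0)] * n_cams, method="highs")
    if res.status != 0:
        raise ValueError("required observation quality cannot be met by any camera subset")
    return np.where(res.x >= thresh)[0], res.x

rng = np.random.default_rng(2)
# per-target observation quality contributed by each candidate camera (zero if occluded / out of view)
quality = rng.uniform(0.2, 1.0, size=(6, 12)) * (rng.random((6, 12)) < 0.5)
chosen, weights = select_cameras(quality, q=np.full(6, 0.6))
print(chosen)
```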

  1. Design and optimization of self-nanoemulsifying drug delivery systems (SNEDDS) for enhanced dissolution of gemfibrozil.

    PubMed

    Villar, Ana Maria Sierra; Naveros, Beatriz Clares; Campmany, Ana Cristina Calpena; Trenchs, Monserrat Aróztegui; Rocabert, Coloma Barbé; Bellowa, Lyda Halbaut

    2012-07-15

    Self-nanoemulsifying drug delivery systems of gemfibrozil were developed under a Quality by Design approach for improvement of dissolution and oral absorption. Preliminary screening was performed to select the proper combination of components. A Box-Behnken experimental design was employed as the statistical tool to optimize the formulation variables, X(1) (Cremophor(®) EL), X(2) (Capmul(®) MCM-C8), and X(3) (lemon essential oil). Systems were assessed for visual characteristics (emulsification efficacy), turbidity, droplet size, polydispersity index and drug release. Different pH media were also assayed for optimization. Following optimization, the values of the formulation components (X(1), X(2), and X(3)) were 32.43%, 29.73% and 21.62%, respectively (16.22% of gemfibrozil). Transmission electron microscopy demonstrated spherical droplet morphology. The SNEDDS release study was compared to commercial tablets. The optimized SNEDDS formulation of gemfibrozil showed a significant increase in dissolution rate compared to conventional tablets. Both formulations followed the Weibull mathematical release model, with a significant difference in the t(d) parameter in favor of the SNEDDS. Model-independent parameters were also calculated, with the dissolution efficiency being significantly higher for the SNEDDS, confirming that the developed SNEDDS formulation was superior to the commercial formulation with respect to the in vitro dissolution profile. This paper provides an overview of the SNEDDS of gemfibrozil as a promising alternative to improve oral absorption. Copyright © 2012 Elsevier B.V. All rights reserved.

  2. Portfolio optimization problem with nonidentical variances of asset returns using statistical mechanical informatics.

    PubMed

    Shinzato, Takashi

    2016-12-01

    The portfolio optimization problem in which the variances of the return rates of assets are not identical is analyzed in this paper using the methodology of statistical mechanical informatics, specifically, replica analysis. We defined two characteristic quantities of an optimal portfolio, namely, minimal investment risk and investment concentration, in order to solve the portfolio optimization problem and analytically determined their asymptotical behaviors using replica analysis. Numerical experiments were also performed, and a comparison between the results of our simulation and those obtained via replica analysis validated our proposed method.

  3. Portfolio optimization problem with nonidentical variances of asset returns using statistical mechanical informatics

    NASA Astrophysics Data System (ADS)

    Shinzato, Takashi

    2016-12-01

    The portfolio optimization problem in which the variances of the return rates of assets are not identical is analyzed in this paper using the methodology of statistical mechanical informatics, specifically, replica analysis. We defined two characteristic quantities of an optimal portfolio, namely, minimal investment risk and investment concentration, in order to solve the portfolio optimization problem and analytically determined their asymptotical behaviors using replica analysis. Numerical experiments were also performed, and a comparison between the results of our simulation and those obtained via replica analysis validated our proposed method.

  4. Efficient greedy algorithms for economic manpower shift planning

    NASA Astrophysics Data System (ADS)

    Nearchou, A. C.; Giannikos, I. C.; Lagodimos, A. G.

    2015-01-01

    Consideration is given to the economic manpower shift planning (EMSP) problem, an NP-hard capacity planning problem appearing in various industrial settings including the packing stage of production in process industries and maintenance operations. EMSP aims to determine the manpower needed in each available workday shift of a given planning horizon so as to complete a set of independent jobs at minimum cost. Three greedy heuristics are presented for the EMSP solution. These practically constitute adaptations of an existing algorithm for a simplified version of EMSP which had shown excellent performance in terms of solution quality and speed. Experimentation shows that the new algorithms perform very well in comparison to the results obtained by both the CPLEX optimizer and an existing metaheuristic. Statistical analysis is deployed to rank the algorithms in terms of their solution quality and to identify the effects that critical planning factors may have on their relative efficiency.

  5. A Methodology for Anatomic Ultrasound Image Diagnostic Quality Assessment.

    PubMed

    Hemmsen, Martin Christian; Lange, Theis; Brandt, Andreas Hjelm; Nielsen, Michael Bachmann; Jensen, Jorgen Arendt

    2017-01-01

    This paper discusses methods for the assessment of ultrasound image quality based on our experiences with evaluating new methods for anatomic imaging. It presents a methodology to ensure a fair assessment between competing imaging methods using clinically relevant evaluations. The methodology is valuable in the continuing process of method optimization and guided development of new imaging methods. It includes a three-phase study plan covering initial prototype development through clinical assessment. Recommendations for the clinical assessment protocol, software, and statistical analysis are presented. Earlier uses of the methodology have shown that it ensures validity of the assessment, as it separates the influences between developer, investigator, and assessor once a research protocol has been established. This separation reduces confounding influences on the result from the developer to properly reveal the clinical value. This paper exemplifies the methodology using recent studies of synthetic aperture sequential beamforming tissue harmonic imaging.

  6. A systematic review of optimal treatment strategies for localized Ewing's sarcoma of bone after neo-adjuvant chemotherapy.

    PubMed

    Werier, Joel; Yao, Xiaomei; Caudrelier, Jean-Michel; Di Primio, Gina; Ghert, Michelle; Gupta, Abha A; Kandel, Rita; Verma, Shailendra

    2016-03-01

    To perform a systematic review to investigate the optimal treatment strategy among the options of surgery alone, radiotherapy (RT) alone, and the combination of RT plus surgery in the management of localized Ewing's sarcoma of bone following neo-adjuvant chemotherapy. MEDLINE and EMBASE (1999 to February 2015), the Cochrane Library, and relevant conferences were searched. Two systematic reviews and eight full texts met the pre-planned study selection criteria. When RT was compared with surgery, a meta-analysis combining two papers showed that surgery resulted in a higher event-free survival (EFS) than RT in any location (HR = 1.50, 95% CI 1.12-2.00; p = 0.007). However another paper did not find a statistically significant difference in patients with pelvic disease, and no papers identified a significant difference in overall survival. When surgery plus RT was compared with surgery alone, a meta-analysis did not demonstrate a statistically significant difference for EFS between the two groups (HR = 1.21, 95% CI 0.90-1.63). Both surgical morbidities and radiation toxicities were reported. The existing evidence is based on very low aggregate quality as assessed by the GRADE approach. In patients with localized Ewing's sarcoma, either surgery alone (if complete surgical excision with clear margin can be achieved) or RT alone may be a reasonable treatment option. The optimal local treatment for an individual patient should be decided through consideration of patient characteristics, the potential benefit and harm of the treatment options, and patient preference. Crown Copyright © 2015. Published by Elsevier Ltd. All rights reserved.

  7. The assessment of data sources for influenza virologic surveillance in New York State.

    PubMed

    Escuyer, Kay L; Waters, Christine L; Gowie, Donna L; Maxted, Angie M; Farrell, Gregory M; Fuschino, Meghan E; St George, Kirsten

    2017-03-01

    Following the 2013 USA release of the Influenza Virologic Surveillance Right Size Roadmap, the New York State Department of Health (NYSDOH) embarked on an evaluation of data sources for influenza virologic surveillance. To assess NYS data sources, additional to data generated by the state public health laboratory (PHL), which could enhance influenza surveillance at the state and national level. Potential sources of laboratory test data for influenza were analyzed for quantity and quality. Computer models, designed to assess sample sizes and the confidence of data for statistical representation of influenza activity, were used to compare PHL test data to results from clinical and commercial laboratories, reported between June 8, 2013 and May 31, 2014. Sample sizes tested for influenza at the state PHL were sufficient for situational awareness surveillance with optimal confidence levels, only during peak weeks of the influenza season. Influenza data pooled from NYS PHLs and clinical laboratories generated optimal confidence levels for situational awareness throughout the influenza season. For novel influenza virus detection in NYS, combined real-time (rt) RT-PCR data from state and regional PHLs achieved ≥85% confidence during peak influenza activity, and ≥95% confidence for most of low season and all of off-season. In NYS, combined data from clinical, commercial, and public health laboratories generated optimal influenza surveillance for situational awareness throughout the season. Statistical confidence for novel virus detection, which is reliant on only PHL data, was achieved for most of the year. © 2016 The Authors. Influenza and Other Respiratory Viruses Published by John Wiley & Sons Ltd.

  8. Dependency of image quality on acquisition protocol and image processing in chest tomosynthesis-a visual grading study based on clinical data.

    PubMed

    Jadidi, Masoud; Båth, Magnus; Nyrén, Sven

    2018-04-09

    To compare the quality of images obtained with two protocols with different acquisition times, and the influence of image post-processing, in a chest digital tomosynthesis (DTS) system. Twenty patients with suspected lung cancer were imaged with chest X-ray equipment with a tomosynthesis option. Two examination protocols with different acquisition times (6.3 and 12 s) were performed on each patient. Both protocols were presented with two different image post-processing options (standard DTS processing and more advanced processing optimised for chest radiography). Thus, 4 series from each patient, altogether 80 series, were presented anonymously and in a random order. Five observers rated the quality of the reconstructed section images according to predefined quality criteria in three different classes. Visual grading characteristics (VGC) was used to analyse the data and the area under the VGC curve (AUC_VGC) was used as figure-of-merit. The 12 s protocol and the standard DTS processing were used as references in the analyses. The protocol with 6.3 s acquisition time had a statistically significant advantage over the vendor-recommended protocol with 12 s acquisition time for the classes of criteria Demarcation (AUC_VGC = 0.56, p = 0.009) and Disturbance (AUC_VGC = 0.58, p < 0.001). A similar value of AUC_VGC was found for the class Structure (definition of bone structures in the spine) (0.56), but it could not be statistically separated from 0.5 (p = 0.21). For the image processing, the VGC analysis showed a small but statistically significant advantage for the standard DTS processing over the more advanced processing for the classes of criteria Demarcation (AUC_VGC = 0.45, p = 0.017) and Disturbance (AUC_VGC = 0.43, p = 0.005). A similar value of AUC_VGC was found for the class Structure (0.46), but it could not be statistically separated from 0.5 (p = 0.31). The study indicates that the protocol with 6.3 s acquisition time yields slightly better image quality than the vendor-recommended protocol with an acquisition time of 12 s for several anatomical structures. Furthermore, the standard gradation processing (the vendor-recommended post-processing for DTS) yields to some extent an advantage over the gradation processing/multiobjective frequency processing/flexible noise control processing in terms of image quality for all classes of criteria. Advances in knowledge: The study shows that the image quality may be strongly affected by the selection of DTS protocol and that the vendor-recommended protocol may not always be the optimal choice.
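
    One common way to estimate an AUC_VGC value like those above from paired ordinal ratings is the Mann-Whitney form of the area under the visual grading characteristics curve; the sketch below uses invented 5-point ratings and ignores the resampling needed for confidence intervals on clustered, multi-observer data.

```python
import numpy as np

def auc_vgc(ratings_test, ratings_ref):
    """Area under the VGC curve estimated as the Mann-Whitney statistic on ordinal ratings."""
    t = np.asarray(ratings_test, float)[:, None]
    r = np.asarray(ratings_ref, float)[None, :]
    return float((t > r).mean() + 0.5 * (t == r).mean())

ref = [3, 3, 4, 2, 3, 4, 3]    # invented 5-point ratings for the reference (12 s) protocol
test = [4, 3, 4, 3, 3, 4, 4]   # invented ratings for the 6.3 s protocol
print(auc_vgc(test, ref))      # > 0.5 favours the test protocol
```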

  9. A gEUD-based inverse planning technique for HDR prostate brachytherapy: Feasibility study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Giantsoudi, D.; Department of Radiation Oncology, Francis H. Burr Proton Therapy Center, Boston, Massachusetts 02114; Baltas, D.

    2013-04-15

    Purpose: The purpose of this work was to study the feasibility of a new inverse planning technique based on the generalized equivalent uniform dose for image-guided high dose rate (HDR) prostate cancer brachytherapy in comparison to conventional dose-volume based optimization. Methods: The quality of 12 clinical HDR brachytherapy implants for prostate utilizing HIPO (Hybrid Inverse Planning Optimization) is compared with alternative plans, which were produced through inverse planning using the generalized equivalent uniform dose (gEUD). All the common dose-volume indices for the prostate and the organs at risk were considered together with radiobiological measures. The clinical effectiveness of the different dose distributions was investigated by comparing dose volume histogram and gEUD evaluators. Results: Our results demonstrate the feasibility of gEUD-based inverse planning in HDR brachytherapy implants for prostate. A statistically significant decrease in D10 or/and final gEUD values for the organs at risk (urethra, bladder, and rectum) was found while improving dose homogeneity or dose conformity of the target volume. Conclusions: Following the promising results of gEUD-based optimization in intensity modulated radiation therapy treatment optimization, as reported in the literature, the implementation of a similar model in HDR brachytherapy treatment plan optimization is suggested by this study. The potential of improved sparing of organs at risk was shown for various gEUD-based optimization parameter protocols, which indicates the ability of this method to adapt to the user's preferences.
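
    For reference, the generalized equivalent uniform dose used in the optimization above reduces a dose distribution to a single value, gEUD = (sum_i v_i d_i^a)^(1/a), where v_i are fractional volumes and the parameter a controls how strongly hot spots are weighted. A minimal sketch with invented voxel doses follows.

```python
import numpy as np

def geud(doses, a, volumes=None):
    """Generalized equivalent uniform dose of a structure from its voxel doses (Gy)."""
    d = np.asarray(doses, float)
    v = np.full(d.shape, 1.0 / d.size) if volumes is None else np.asarray(volumes, float)
    return float(np.sum(v * d ** a) ** (1.0 / a))

rectum_doses = np.array([5.0, 8.0, 12.0, 20.0, 35.0])   # hypothetical voxel doses in Gy
print(geud(rectum_doses, a=8))    # large positive a weights the hottest voxels (serial organ at risk)
print(geud(rectum_doses, a=1))    # a = 1 reduces to the mean dose
```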

  10. Optimization of transmission scan duration for 15O PET study with sequential dual tracer administration using N-index.

    PubMed

    Kudomi, Nobuyuki; Watabe, Hiroshi; Hayashi, Takuya; Oka, Hisashi; Miyake, Yoshinori; Iida, Hidehiro

    2010-06-01

    Cerebral blood flow (CBF), oxygen extraction fraction (OEF) and cerebral metabolic rate of O(2) (CMRO(2)) can be quantified by PET with the administration of H(2)(15)O and (15)O(2). Recently, a shortening of the duration of these measurements was achieved by the sequential administration of the dual tracers (15)O(2) and H(2)(15)O with a PET acquisition and integration method (DARG method). A transmission scan is generally required for correcting photon attenuation in advance of the PET scan. Although the DARG method can shorten the total study duration to around 30 min, the transmission scan duration has not been optimized and could potentially be shortened. The aim of this study was to determine the optimal duration for the transmission scan. We introduced the 'N-index', which estimates the noise level on an image obtained by subtracting two statistically independent and physiologically equivalent images. The relationship between noise on functional images and the duration of the transmission scan was investigated using the N-index. We performed phantom studies to test whether the N-index reflects the pixel noise in a PET image. We also estimated the noise level by the N-index on CBF, OEF and CMRO(2) images from the DARG method in clinical patients, and investigated an optimal true count of the transmission scan. We found a tight correlation between pixel noise and the N-index in the phantom study. By investigating the relationship between the transmission scan duration and the N-index value for the functional images from the DARG method, we found that transmission data with true counts of more than 40 Mcounts result in CBF, OEF, and CMRO(2) images of reasonable quantitative accuracy and quality. The present study suggests that further shortening of the DARG measurement is possible by abridging the transmission scan. The N-index could be used to determine the optimal measurement condition when examining the quality of an image.
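
    A minimal sketch in the spirit of the N-index described above: when two statistically independent but physiologically equivalent images are available, the noise level can be estimated from the standard deviation of their difference divided by sqrt(2). The exact definition used in the paper may differ, and the image values below are simulated.

```python
import numpy as np

def difference_noise(img1, img2, mask=None):
    """Noise estimate from two independent, equivalent images: std of their difference / sqrt(2)."""
    d = np.asarray(img1, float) - np.asarray(img2, float)
    if mask is not None:
        d = d[mask]
    return float(d.std(ddof=1) / np.sqrt(2.0))

rng = np.random.default_rng(3)
truth = np.full((64, 64), 50.0)                        # same underlying activity in both images
i1 = truth + rng.normal(0, 4.0, truth.shape)
i2 = truth + rng.normal(0, 4.0, truth.shape)
print(difference_noise(i1, i2))                        # recovers roughly 4.0
```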

  11. Optimization of the transmission of observable expectation values and observable statistics in continuous-variable teleportation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Albano Farias, L.; Stephany, J.

    2010-12-15

    We analyze the statistics of observables in continuous-variable (CV) quantum teleportation in the formalism of the characteristic function. We derive expressions for average values of output-state observables, in particular, cumulants which are additive in terms of the input state and the resource of teleportation. Working with a general class of teleportation resources, the squeezed-bell-like states, which may be optimized in a free parameter for better teleportation performance, we discuss the relation between resources optimal for fidelity and those optimal for different observable averages. We obtain the values of the free parameter of the squeezed-bell-like states which optimize the central momenta and cumulants up to fourth order. For the cumulants the distortion between in and out states due to teleportation depends only on the resource. We obtain optimal parameters Δ_(2)^opt and Δ_(4)^opt for the second- and fourth-order cumulants, which do not depend on the squeezing of the resource. The second-order central momenta, which are equal to the second-order cumulants, and the photon number average are also optimized by the resource with Δ_(2)^opt. We show that the optimal fidelity resource, which has been found previously to depend on the characteristics of input, approaches for high squeezing to the resource that optimizes the second-order momenta. A similar behavior is obtained for the resource that optimizes the photon statistics, which is treated here using the sum of the squared differences in photon probabilities of input versus output states as the distortion measure. This is interpreted naturally to mean that the distortions associated with second-order momenta dominate the behavior of the output state for large squeezing of the resource. Optimal fidelity resources and optimal photon statistics resources are compared, and it is shown that for mixtures of Fock states both resources are equivalent.

  12. Digital radiography: optimization of image quality and dose using multi-frequency software.

    PubMed

    Precht, H; Gerke, O; Rosendahl, K; Tingberg, A; Waaler, D

    2012-09-01

    New developments in the processing of digital radiographs (DR), including multi-frequency processing (MFP), allow optimization of image quality and radiation dose. This is particularly promising in children, as they are believed to be more sensitive to ionizing radiation than adults. To examine whether the use of MFP software reduces the radiation dose without compromising quality at DR of the femur in 5-year-old-equivalent anthropomorphic and technical phantoms. A total of 110 images of an anthropomorphic phantom were acquired on a DR system (Canon DR with CXDI-50 C detector and MLT[S] software) and analyzed by three pediatric radiologists using Visual Grading Analysis. In addition, 3,500 images taken of a technical contrast-detail phantom (CDRAD 2.0) provided an objective image-quality assessment. Optimal image quality was maintained at a dose reduction of 61% with MLT(S)-optimized images. Even for images of diagnostic quality, MLT(S) provided a dose reduction of 88% as compared to the reference image. The software's impact on image quality was found to be significant for dose (mAs), dynamic range dark region and frequency band. By optimizing image processing parameters, a significant dose reduction is possible without significant loss of image quality.

  13. Optimal Weights Mixed Filter for removing mixture of Gaussian and impulse noises

    PubMed Central

    Grama, Ion; Liu, Quansheng

    2017-01-01

    In this paper we consider the problem of restoration of an image contaminated by a mixture of Gaussian and impulse noises. We propose a new statistic called ROADGI which improves the well-known Rank-Ordered Absolute Differences (ROAD) statistic for detecting points contaminated with the impulse noise in this context. Combining the ROADGI statistic with the method of weights optimization we obtain a new algorithm called Optimal Weights Mixed Filter (OWMF) to deal with the mixed noise. Our simulation results show that the proposed filter is effective for mixed noises, as well as for single impulse noise and for single Gaussian noise. PMID:28692667
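
    For reference, a minimal sketch of the original ROAD statistic that ROADGI builds on: for each pixel, the absolute differences to its eight neighbours are sorted and the smallest few are summed, so isolated impulses stand out with large values. This is the textbook ROAD definition, not the authors' ROADGI extension, and the test image is synthetic.

```python
import numpy as np

def road(image, m=4):
    """Rank-Ordered Absolute Differences: sum of the m smallest |center - neighbor| values."""
    img = np.asarray(image, float)
    padded = np.pad(img, 1, mode="reflect")
    diffs = []
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            if dy == 0 and dx == 0:
                continue
            shifted = padded[1 + dy:1 + dy + img.shape[0], 1 + dx:1 + dx + img.shape[1]]
            diffs.append(np.abs(img - shifted))
    diffs = np.sort(np.stack(diffs), axis=0)           # 8 neighbor differences per pixel, sorted
    return diffs[:m].sum(axis=0)

rng = np.random.default_rng(4)
img = rng.normal(128, 5, (32, 32))
img[10, 10] = 255.0                                    # an isolated impulse
scores = road(img)
print(scores[10, 10], scores.mean())                   # the impulse pixel stands out clearly
```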

  14. Optimal Weights Mixed Filter for removing mixture of Gaussian and impulse noises.

    PubMed

    Jin, Qiyu; Grama, Ion; Liu, Quansheng

    2017-01-01

    In this paper we consider the problem of restoration of an image contaminated by a mixture of Gaussian and impulse noises. We propose a new statistic called ROADGI which improves the well-known Rank-Ordered Absolute Differences (ROAD) statistic for detecting points contaminated with the impulse noise in this context. Combining the ROADGI statistic with the method of weights optimization we obtain a new algorithm called Optimal Weights Mixed Filter (OWMF) to deal with the mixed noise. Our simulation results show that the proposed filter is effective for mixed noises, as well as for single impulse noise and for single Gaussian noise.

  15. Taguchi experimental design to determine the taste quality characteristic of candied carrot

    NASA Astrophysics Data System (ADS)

    Ekawati, Y.; Hapsari, A. A.

    2018-03-01

    Robust parameter design is used to design a product that is robust to noise factors so that the product's performance fits the target and delivers better quality. In the process of designing and developing the innovative product of candied carrot, robust parameter design is carried out using the Taguchi Method. The method is used to determine an optimal quality design. The optimal quality design is based on the process and the composition of product ingredients that are in accordance with consumer needs and requirements. According to the identification of consumer needs from the previous research, the quality dimensions that need to be assessed are the taste and texture of the product. The quality dimension assessed in this research is limited to the taste dimension. Organoleptic testing is used for this assessment, specifically hedonic testing, which makes the assessment based on consumer preferences. The data processing uses mean and signal-to-noise ratio calculations and optimal level setting to determine the optimal process/composition of product ingredients. The optimal value is analyzed using confirmation experiments to verify that the proposed product matches consumer needs and requirements. The result of this research is the identification of factors that affect the product taste and the optimal quality of the product according to the Taguchi Method.
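
    A minimal sketch of the Taguchi signal-to-noise ratio calculation mentioned above, assuming a 'larger is better' response such as a hedonic taste score; the replicate values are invented, and choosing the factor levels that maximize the per-level mean S/N is the 'optimal level setting' step.

```python
import numpy as np

def sn_larger_is_better(y):
    """Taguchi signal-to-noise ratio for a 'larger is better' response such as a hedonic score."""
    y = np.asarray(y, float)
    return float(-10.0 * np.log10(np.mean(1.0 / y ** 2)))

# hypothetical replicated taste scores (1-7 scale) for two runs of an orthogonal array
print(sn_larger_is_better([5.8, 6.1, 5.5]))   # the run with the higher S/N is preferred
print(sn_larger_is_better([4.2, 4.9, 3.8]))
```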

  16. CPR methodology with new steady-state criterion and more accurate statistical treatment of channel bow

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baumgartner, S.; Bieli, R.; Bergmann, U. C.

    2012-07-01

    An overview is given of existing CPR design criteria and the methods used in BWR reload analysis to evaluate the impact of channel bow on CPR margins. Potential weaknesses in today's methodologies are discussed. Westinghouse in collaboration with KKL and Axpo - operator and owner of the Leibstadt NPP - has developed an optimized CPR methodology based on a new criterion to protect against dryout during normal operation and with a more rigorous treatment of channel bow. The new steady-state criterion is expressed in terms of an upper limit of 0.01 for the dryout failure probability per year. This is considered a meaningful and appropriate criterion that can be directly related to the probabilistic criteria set-up for the analyses of Anticipated Operation Occurrences (AOOs) and accidents. In the Monte Carlo approach a statistical modeling of channel bow and an accurate evaluation of CPR response functions allow the associated CPR penalties to be included directly in the plant SLMCPR and OLMCPR in a best-estimate manner. In this way, the treatment of channel bow is equivalent to all other uncertainties affecting CPR. Emphasis is put on quantifying the statistical distribution of channel bow throughout the core using measurement data. The optimized CPR methodology has been implemented in the Westinghouse Monte Carlo code, McSLAP. The methodology improves the quality of dryout safety assessments by supplying more valuable information and better control of conservatisms in establishing operational limits for CPR. The methodology is demonstrated with application examples from the introduction at KKL. (authors)

  17. SU-F-T-187: Quantifying Normal Tissue Sparing with 4D Robust Optimization of Intensity Modulated Proton Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Newpower, M; Ge, S; Mohan, R

    Purpose: To report an approach to quantify the normal tissue sparing for 4D robustly-optimized versus PTV-optimized IMPT plans. Methods: We generated two sets of 90 DVHs from a patient's 10-phase 4D CT set; one by conventional PTV-based optimization done in the Eclipse treatment planning system, and the other by an in-house robust optimization algorithm. The 90 DVHs were created for the following scenarios in each of the ten phases of the 4DCT: ± 5mm shift along x, y, z; ± 3.5% range uncertainty and a nominal scenario. A Matlab function written by Gay and Niemierko was modified to calculate EUD for each DVH for the following structures: esophagus, heart, ipsilateral lung and spinal cord. An F-test determined whether or not the variances of each structure's DVHs were statistically different. Then a t-test determined if the average EUDs for each optimization algorithm were statistically significantly different. Results: T-test results showed each structure had a statistically significant difference in average EUD when comparing robust optimization versus PTV-based optimization. Under robust optimization all structures except the spinal cord received lower EUDs than PTV-based optimization. Using robust optimization the average EUDs decreased 1.45% for the esophagus, 1.54% for the heart and 5.45% for the ipsilateral lung. The average EUD to the spinal cord increased 24.86% but was still well below tolerance. Conclusion: This work has helped quantify a qualitative relationship noted earlier in our work: that robust optimization leads to plans with greater normal tissue sparing compared to PTV-based optimization. Except in the case of the spinal cord all structures received a lower EUD under robust optimization and these results are statistically significant. While the average EUD to the spinal cord increased to 25.06 Gy under robust optimization it is still well under the TD50 value of 66.5 Gy from Emami et al. Supported in part by the NCI U19 CA021239.

  18. Coupling Matched Molecular Pairs with Machine Learning for Virtual Compound Optimization.

    PubMed

    Turk, Samo; Merget, Benjamin; Rippmann, Friedrich; Fulle, Simone

    2017-12-26

    Matched molecular pair (MMP) analyses are widely used in compound optimization projects to gain insights into structure-activity relationships (SAR). The analysis is traditionally done via statistical methods but can also be employed together with machine learning (ML) approaches to extrapolate to novel compounds. The here introduced MMP/ML method combines a fragment-based MMP implementation with different machine learning methods to obtain automated SAR decomposition and prediction. To test the prediction capabilities and model transferability, two different compound optimization scenarios were designed: (1) "new fragments" which occurs when exploring new fragments for a defined compound series and (2) "new static core and transformations" which resembles for instance the identification of a new compound series. Very good results were achieved by all employed machine learning methods especially for the new fragments case, but overall deep neural network models performed best, allowing reliable predictions also for the new static core and transformations scenario, where comprehensive SAR knowledge of the compound series is missing. Furthermore, we show that models trained on all available data have a higher generalizability compared to models trained on focused series and can extend beyond chemical space covered in the training data. Thus, coupling MMP with deep neural networks provides a promising approach to make high quality predictions on various data sets and in different compound optimization scenarios.

  19. An Efficient Framework Model for Optimizing Routing Performance in VANETs.

    PubMed

    Al-Kharasani, Nori M; Zulkarnain, Zuriati Ahmad; Subramaniam, Shamala; Hanapi, Zurina Mohd

    2018-02-15

    Routing in Vehicular Ad hoc Networks (VANET) is complicated because of the highly dynamic mobility of nodes. The efficiency of a routing protocol is influenced by a number of factors such as network density, bandwidth constraints, traffic load, and mobility patterns resulting in frequent changes in network topology. Therefore, Quality of Service (QoS) is strongly needed to enhance the capability of the routing protocol and improve the overall network performance. In this paper, we introduce a statistical framework model to address the problem of optimizing routing configuration parameters in Vehicle-to-Vehicle (V2V) communication. Our framework solution is based on the utilization of the network resources to further reflect the current state of the network and to balance the trade-off between frequent changes in network topology and the QoS requirements. It consists of three stages: a network simulation stage used to execute different urban scenarios, a function stage used as a competitive approach to aggregate the weighted costs of the factors into a single value, and an optimization stage used to evaluate the communication cost and to obtain the optimal configuration based on the competitive cost. The simulation results show significant performance improvement in terms of the Packet Delivery Ratio (PDR), Normalized Routing Load (NRL), Packet Loss (PL), and End-to-End Delay (E2ED).

  20. Threshold matrix for digital halftoning by genetic algorithm optimization

    NASA Astrophysics Data System (ADS)

    Alander, Jarmo T.; Mantere, Timo J.; Pyylampi, Tero

    1998-10-01

    Digital halftoning is used both in low and high resolution high quality printing technologies. Our method is designed to be mainly used for low resolution ink jet marking machines to produce both gray tone and color images. The main problem with digital halftoning is pink noise caused by the human eye's visual transfer function. To compensate for this the random dot patterns used are optimized to contain more blue than pink noise. Several such dot pattern generator threshold matrices have been created automatically by using genetic algorithm optimization, a non-deterministic global optimization method imitating natural evolution and genetics. A hybrid of genetic algorithm with a search method based on local backtracking was developed together with several fitness functions evaluating dot patterns for rectangular grids. By modifying the fitness function, a family of dot generators results, each with its particular statistical features. Several versions of genetic algorithms, backtracking and fitness functions were tested to find a reasonable combination. The generated threshold matrices have been tested by simulating a set of test images using the Khoros image processing system. Even though the work was focused on developing low resolution marking technology, the resulting family of dot generators can be applied also in other halftoning application areas including high resolution printing technology.
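
    The sketch below shows how a threshold matrix is applied once it has been generated, by tiling it over the image and thresholding each pixel; the 4x4 Bayer-style matrix is only a stand-in for the GA-optimized blue-noise matrices discussed above.

        import numpy as np

        def halftone(gray, threshold_matrix):
            """Tile the threshold matrix over the image and binarize.
            gray: 2-D array in [0, 1]; threshold_matrix: values in [0, 1]."""
            h, w = gray.shape
            th, tw = threshold_matrix.shape
            reps = (int(np.ceil(h / th)), int(np.ceil(w / tw)))
            tiled = np.tile(threshold_matrix, reps)[:h, :w]
            return (gray > tiled).astype(np.uint8)

        # Toy 4x4 Bayer-like matrix; a GA-optimized matrix would replace it.
        bayer4 = (np.array([[ 0,  8,  2, 10],
                            [12,  4, 14,  6],
                            [ 3, 11,  1,  9],
                            [15,  7, 13,  5]]) + 0.5) / 16.0

        ramp = np.tile(np.linspace(0, 1, 64), (16, 1))   # simple gray ramp
        print(halftone(ramp, bayer4).mean())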

  1. Dynamic evaluation of two decades of WRF-CMAQ ozone ...

    EPA Pesticide Factsheets

    Dynamic evaluation of the fully coupled Weather Research and Forecasting (WRF)– Community Multi-scale Air Quality (CMAQ) model ozone simulations over the contiguous United States (CONUS) using two decades of simulations covering the period from 1990 to 2010 is conducted to assess how well the changes in observed ozone air quality are simulated by the model. The changes induced by variations in meteorology and/or emissions are also evaluated during the same timeframe using spectral decomposition of observed and modeled ozone time series with the aim of identifying the underlying forcing mechanisms that control ozone exceedances and making informed recommendations for the optimal use of regional-scale air quality models. The evaluation is focused on the warm season's (i.e., May–September) daily maximum 8-hr (DM8HR) ozone concentrations, the 4th highest (4th) and average of top 10 DM8HR ozone values (top10), as well as the spectrally-decomposed components of the DM8HR ozone time series using the Kolmogorov-Zurbenko (KZ) filter. Results of the dynamic evaluation are presented for six regions in the U.S., consistent with the National Oceanic and Atmospheric Administration (NOAA) climatic regions. During the earlier 11-yr period (1990–2000), the simulated and observed trends are not statistically significant. During the more recent 2000–2010 period, all trends are statistically significant and WRF-CMAQ captures the observed trend in most regions. Given large n
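
    A minimal sketch of the Kolmogorov-Zurbenko (KZ) filter used for the spectral decomposition mentioned above: an m-point moving average applied k times, with the residual taken as the short-term component. The KZ(15,5) choice and the synthetic ozone series are assumptions for illustration.

        import numpy as np

        def kz_filter(x, window, iterations):
            """Kolmogorov-Zurbenko filter: a moving average of length `window`
            applied `iterations` times."""
            y = np.asarray(x, dtype=float)
            kernel = np.ones(window) / window
            for _ in range(iterations):
                y = np.convolve(y, kernel, mode="same")
            return y

        # Illustrative decomposition of a daily ozone series into a baseline and
        # a short-term (weather-driven) component.
        rng = np.random.default_rng(0)
        days = np.arange(365)
        ozone = 45 + 15 * np.sin(2 * np.pi * days / 365) + rng.normal(0, 8, days.size)

        baseline = kz_filter(ozone, window=15, iterations=5)
        short_term = ozone - baseline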

  2. Optimal Adaptive Statistical Iterative Reconstruction Percentage in Dual-energy Monochromatic CT Portal Venography.

    PubMed

    Zhao, Liqin; Winklhofer, Sebastian; Yang, Zhenghan; Wang, Keyang; He, Wen

    2016-03-01

    The aim of this article was to study the influence of different adaptive statistical iterative reconstruction (ASIR) percentages on the image quality of dual-energy computed tomography (DECT) portal venography in portal hypertension patients. DECT scans of 40 patients with cirrhosis (mean age, 56 years) at the portal venous phase were retrospectively analyzed. Monochromatic images at 60 and 70 keV were reconstructed with four ASIR percentages: 0%, 30%, 50%, and 70%. Computed tomography (CT) numbers of the portal veins (PVs), liver parenchyma, and subcutaneous fat tissue in the abdomen were measured. The standard deviation from the region of interest of the liver parenchyma was interpreted as the objective image noise (IN). The contrast-noise ratio (CNR) between PV and liver parenchyma was calculated. The diagnostic acceptability (DA) and sharpness of PV margins were obtained using a 5-point score. The IN, CNR, DA, and sharpness of PV were compared among the eight groups with different keV + ASIR level combinations. The IN, CNR, DA, and sharpness of PV of different keV + ASIR groups were all statistically different (P < 0.05). In the eight groups, the best and worst CNR were obtained in the 60 keV + 70% ASIR and 70 keV + 0% ASIR (filtered back-projection [FBP]) combination, respectively, whereas the largest and smallest objective IN were obtained in the 60 keV + 0% ASIR (FBP) and 70 keV + 70% combination. The highest DA and sharpness values of PV were obtained at 50% ASIR for 60 keV. An optimal ASIR percentage (50%) combined with an appropriate monochromatic energy level (60 keV) provides the highest DA in portal venography imaging, whereas for the higher monochromatic energy (70 keV) images, 30% ASIR provides the highest image quality, with less IN than 60 keV with 50% ASIR. Copyright © 2015 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
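
    A small sketch of the objective image-noise and CNR definitions used above (liver ROI standard deviation as the noise term, portal-vein-minus-liver contrast divided by that noise); the image, ROI masks, and HU values are synthetic placeholders.

        import numpy as np

        def roi_stats(image, mask):
            vals = image[mask]
            return vals.mean(), vals.std(ddof=1)

        def portal_vein_cnr(image, pv_mask, liver_mask):
            """CNR between portal vein and liver parenchyma, with the liver ROI
            standard deviation taken as the objective image noise."""
            pv_mean, _ = roi_stats(image, pv_mask)
            liver_mean, liver_sd = roi_stats(image, liver_mask)
            return (pv_mean - liver_mean) / liver_sd

        # Synthetic example (values are placeholders, not patient data).
        rng = np.random.default_rng(1)
        img = rng.normal(120, 12, (64, 64))          # liver-like background
        img[20:28, 20:28] += 60                      # brighter portal vein patch
        pv = np.zeros_like(img, dtype=bool); pv[20:28, 20:28] = True
        liver = ~pv
        print(portal_vein_cnr(img, pv, liver))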

  3. Statistical Optimization of Pharmacogenomics Association Studies: Key Considerations from Study Design to Analysis

    PubMed Central

    Grady, Benjamin J.; Ritchie, Marylyn D.

    2011-01-01

    Research in human genetics and genetic epidemiology has grown significantly over the previous decade, particularly in the field of pharmacogenomics. Pharmacogenomics presents an opportunity for rapid translation of associated genetic polymorphisms into diagnostic measures or tests to guide therapy as part of a move towards personalized medicine. Expansion in genotyping technology has cleared the way for widespread use of whole-genome genotyping in the effort to identify novel biology and new genetic markers associated with pharmacokinetic and pharmacodynamic endpoints. With new technology and methodology regularly becoming available for use in genetic studies, a discussion on the application of such tools becomes necessary. In particular, quality control criteria have evolved with the use of GWAS as we have come to understand potential systematic errors which can be introduced into the data during genotyping. There have been several replicated pharmacogenomic associations, some of which have moved to the clinic to enact change in treatment decisions. These examples of translation illustrate the strength of evidence necessary to successfully and effectively translate a genetic discovery. In this review, the design of pharmacogenomic association studies is examined with the goal of optimizing the impact and utility of this research. Issues of ascertainment, genotyping, quality control, analysis and interpretation are considered. PMID:21887206

  4. Structured Set Intra Prediction With Discriminative Learning in a Max-Margin Markov Network for High Efficiency Video Coding

    PubMed Central

    Dai, Wenrui; Xiong, Hongkai; Jiang, Xiaoqian; Chen, Chang Wen

    2014-01-01

    This paper proposes a novel model on intra coding for High Efficiency Video Coding (HEVC), which simultaneously predicts blocks of pixels with optimal rate distortion. It utilizes the spatial statistical correlation for the optimal prediction based on 2-D contexts, in addition to formulating the data-driven structural interdependences to make the prediction error coherent with the probability distribution, which is desirable for successful transform and coding. The structured set prediction model incorporates a max-margin Markov network (M3N) to regulate and optimize multiple block predictions. The model parameters are learned by discriminating the actual pixel value from other possible estimates to maximize the margin (i.e., decision boundary bandwidth). Compared to existing methods that focus on minimizing prediction error, the M3N-based model adaptively maintains the coherence for a set of predictions. Specifically, the proposed model concurrently optimizes a set of predictions by associating the loss for individual blocks to the joint distribution of succeeding discrete cosine transform coefficients. When the sample size grows, the prediction error is asymptotically upper bounded by the training error under the decomposable loss function. As an internal step, we optimize the underlying Markov network structure to find states that achieve the maximal energy using expectation propagation. For validation, we integrate the proposed model into HEVC for optimal mode selection on rate-distortion optimization. The proposed prediction model obtains up to 2.85% bit rate reduction and achieves better visual quality in comparison to the HEVC intra coding. PMID:25505829

  5. Nonlinear Curve-Fitting Program

    NASA Technical Reports Server (NTRS)

    Everhart, Joel L.; Badavi, Forooz F.

    1989-01-01

    Nonlinear optimization algorithm helps in finding best-fit curve. Nonlinear Curve Fitting Program, NLINEAR, interactive curve-fitting routine based on description of quadratic expansion of chi-square (χ²) statistic. Utilizes nonlinear optimization algorithm calculating best statistically weighted values of parameters of fitting function and χ² minimized. Provides user with such statistical information as goodness of fit and estimated values of parameters producing highest degree of correlation between experimental data and mathematical model. Written in FORTRAN 77.
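
    NLINEAR itself is a FORTRAN 77 routine; the sketch below reproduces the same idea in SciPy, statistically weighted nonlinear least squares with a reduced chi-square goodness-of-fit, using an assumed exponential model and synthetic data.

        import numpy as np
        from scipy.optimize import curve_fit

        def model(x, a, b, c):
            return a * np.exp(-b * x) + c

        # Synthetic data with per-point uncertainties (sigma) for weighted fitting.
        x = np.linspace(0, 4, 25)
        rng = np.random.default_rng(2)
        sigma = np.full_like(x, 0.05)
        y = model(x, 2.5, 1.3, 0.4) + rng.normal(0, sigma)

        popt, pcov = curve_fit(model, x, y, p0=[1, 1, 0], sigma=sigma, absolute_sigma=True)
        residuals = (y - model(x, *popt)) / sigma
        chi2 = np.sum(residuals ** 2)
        dof = x.size - len(popt)
        print(popt, np.sqrt(np.diag(pcov)), chi2 / dof)   # parameters, errors, goodness of fit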

  6. Development and Validation of an Agency for Healthcare Research and Quality Indicator for Mortality After Congenital Heart Surgery Harmonized With Risk Adjustment for Congenital Heart Surgery (RACHS-1) Methodology.

    PubMed

    Jenkins, Kathy J; Koch Kupiec, Jennifer; Owens, Pamela L; Romano, Patrick S; Geppert, Jeffrey J; Gauvreau, Kimberlee

    2016-05-20

    The National Quality Forum previously approved a quality indicator for mortality after congenital heart surgery developed by the Agency for Healthcare Research and Quality (AHRQ). Several parameters of the validated Risk Adjustment for Congenital Heart Surgery (RACHS-1) method were included, but others differed. As part of the National Quality Forum endorsement maintenance process, developers were asked to harmonize the 2 methodologies. Parameters that were identical between the 2 methods were retained. AHRQ's Healthcare Cost and Utilization Project State Inpatient Databases (SID) 2008 were used to select optimal parameters where differences existed, with a goal to maximize model performance and face validity. Inclusion criteria were not changed and included all discharges for patients <18 years with International Classification of Diseases, Ninth Revision, Clinical Modification procedure codes for congenital heart surgery or nonspecific heart surgery combined with congenital heart disease diagnosis codes. The final model includes procedure risk group, age (0-28 days, 29-90 days, 91-364 days, 1-17 years), low birth weight (500-2499 g), other congenital anomalies (Clinical Classifications Software 217, except for 758.xx), multiple procedures, and transfer-in status. Among 17 945 eligible cases in the SID 2008, the c statistic for model performance was 0.82. In the SID 2013 validation data set, the c statistic was 0.82. Risk-adjusted mortality rates by center ranged from 0.9% to 4.1% (5th-95th percentile). Congenital heart surgery programs can now obtain national benchmarking reports by applying AHRQ Quality Indicator software to hospital administrative data, based on the harmonized RACHS-1 method, with high discrimination and face validity. © 2016 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley Blackwell.
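
    A hedged sketch of fitting a risk-adjustment model and computing its c statistic (area under the ROC curve); the covariates and simulated outcome model are placeholders loosely patterned on the parameters listed above, not the AHRQ/RACHS-1 model itself.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(3)
        n = 5000
        # Hypothetical covariates: procedure risk group, age category, low birth
        # weight, other congenital anomalies, multiple procedures, transfer-in.
        X = np.column_stack([
            rng.integers(1, 7, n),
            rng.integers(0, 4, n),
            rng.integers(0, 2, n),
            rng.integers(0, 2, n),
            rng.integers(0, 2, n),
            rng.integers(0, 2, n),
        ])
        logit = -5.0 + 0.6 * X[:, 0] + 0.8 * X[:, 2] + 0.5 * X[:, 5]
        y = rng.random(n) < 1 / (1 + np.exp(-logit))      # simulated mortality outcome

        clf = LogisticRegression(max_iter=1000).fit(X, y)
        c_statistic = roc_auc_score(y, clf.predict_proba(X)[:, 1])
        print(round(c_statistic, 2))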

  7. Evaluation of MLC leaf transmission on IMRT treatment plan quality of patients with advanced lung cancer.

    PubMed

    Chen, Jiayun; Fu, Guishan; Li, Minghui; Song, Yixin; Dai, Jianrong; Miao, Junjie; Liu, Zhiqiang; Li, Yexiong

    2017-12-14

    The purpose of this paper was to evaluate the impact of multileaf collimator (MLC) leaf transmission on the plan quality of intensity-modulated radiotherapy (IMRT) for patients with advanced lung cancer. Five MLCs with different leaf transmissions (0.01%, 0.5%, 1.2%, 1.8%, and 3%) were configured for an accelerator in a treatment planning system. Correspondingly, 5 treatment plans with the same optimization settings were created and evaluated quantitatively for each patient (11 patients total) who was diagnosed with advanced lung cancer. All of the 5 plans for each patient met the dose requirement for the planning treatment volumes (PTVs) and had similar target dose homogeneity and conformity. On average, the doses to selected organs, namely (1) V5, V20, and the mean dose of total lung; (2) the maximum and mean dose to the spinal cord planning organ-at-risk volume (PRV); and (3) V30 and V40 of the heart, decreased slightly when MLC transmission was decreased, but with no statistical differences. The plans fell into clear groups according to the total quality score (SD) used to evaluate plan quality: (1) more than 1 (patient nos. 1 to 3, 5, and 8), and more than 2.5 (patient no. 6); (2) less than 1 (patient nos. 7 and 10); (3) around 1 (patient nos. 4, 9, and 11). As MLC transmission increased, overall SD values increased as well and the plan dose requirements were harder to meet. The clinical requirements were violated increasingly as MLC transmission became larger. Total SD with and without normal tissue (NT) showed similar results, with no statistically significant differences. Therefore, decreasing MLC transmission had minimal impact on the plans; it improved target coverage and reduced normal tissue dose slightly, without statistical significance. Plan quality could not be significantly improved by MLC transmission reduction. However, lower MLC transmission may have advantages in sparing the lung from low- and intermediate-dose exposure. Whether conventional fractionation, hyperfractionation, or stereotactic body radiotherapy (SBRT) is used, sparing of the lung remains essential because it is highly relevant to radiation pneumonitis (RP). Lowered MLC transmission therefore has the potential to diminish the incidence of RP and improve patients' quality of life after irradiation. Copyright © 2017 American Association of Medical Dosimetrists. Published by Elsevier Inc. All rights reserved.

  8. The impact on midlevel vision of statistically optimal divisive normalization in V1.

    PubMed

    Coen-Cagli, Ruben; Schwartz, Odelia

    2013-07-15

    The first two areas of the primate visual cortex (V1, V2) provide a paradigmatic example of hierarchical computation in the brain. However, neither the functional properties of V2 nor the interactions between the two areas are well understood. One key aspect is that the statistics of the inputs received by V2 depend on the nonlinear response properties of V1. Here, we focused on divisive normalization, a canonical nonlinear computation that is observed in many neural areas and modalities. We simulated V1 responses with (and without) different forms of surround normalization derived from statistical models of natural scenes, including canonical normalization and a statistically optimal extension that accounted for image nonhomogeneities. The statistics of the V1 population responses differed markedly across models. We then addressed how V2 receptive fields pool the responses of V1 model units with different tuning. We assumed this is achieved by learning without supervision a linear representation that removes correlations, which could be accomplished with principal component analysis. This approach revealed V2-like feature selectivity when we used the optimal normalization and, to a lesser extent, the canonical one but not in the absence of both. We compared the resulting two-stage models on two perceptual tasks; while models encompassing V1 surround normalization performed better at object recognition, only statistically optimal normalization provided systematic advantages in a task more closely matched to midlevel vision, namely figure/ground judgment. Our results suggest that experiments probing midlevel areas might benefit from using stimuli designed to engage the computations that characterize V1 optimality.
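
    A minimal sketch of canonical divisive normalization for a small population of units; the statistically optimal, image-adaptive variant studied in the paper (which gates the surround pool by inferred homogeneity) is not reproduced here, and the drives and pool weights are illustrative.

        import numpy as np

        def divisive_normalization(drive, weights, sigma=1.0):
            """Canonical surround normalization: each unit's squared drive is divided
            by sigma^2 plus a weighted sum of the squared drives of its pool."""
            d2 = np.asarray(drive) ** 2
            pool = weights @ d2                  # weights[i, j]: influence of unit j on unit i
            return d2 / (sigma ** 2 + pool)

        # Toy population of 5 V1-like units with a uniform normalization pool.
        drive = np.array([0.2, 1.0, 3.0, 0.5, 2.0])
        w = np.full((5, 5), 0.2)
        print(divisive_normalization(drive, w))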

  9. Design of order statistics filters using feedforward neural networks

    NASA Astrophysics Data System (ADS)

    Maslennikova, Yu. S.; Bochkarev, V. V.

    2016-08-01

    In recent years, significant progress has been made in the development of nonlinear data processing techniques. Such techniques are widely used in digital data filtering and image enhancement. Many of the most effective nonlinear filters are based on order statistics. The widely used median filter is the best known order statistic filter. A generalized form of these filters can be derived from Lloyd's statistics. Filters based on order statistics have excellent robustness properties in the presence of impulsive noise. In this paper, we present a special approach for the synthesis of order statistics filters using artificial neural networks. Optimal Lloyd's statistics are used for selecting the initial weights of the neural network. The adaptive properties of neural networks provide opportunities to optimize order statistics filters for data with asymmetric distribution functions. Different examples demonstrate the properties and performance of the presented approach.
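
    A sketch of an order-statistics (L-)filter: each window is sorted and combined with a weight vector, so the median filter is the special case with a single unit weight at the middle order statistic. Initializing and then adapting those weights is the role the paper assigns to the neural network, which is not shown here; the signal and weights are illustrative.

        import numpy as np

        def l_filter(signal, weights):
            """Order-statistics (L-)filter: sort each window and take a weighted
            sum of the order statistics."""
            w = np.asarray(weights, dtype=float)
            n = w.size
            pad = n // 2
            x = np.pad(np.asarray(signal, dtype=float), pad, mode="edge")
            windows = np.lib.stride_tricks.sliding_window_view(x, n)
            return np.sort(windows, axis=1) @ w

        rng = np.random.default_rng(4)
        clean = np.sin(np.linspace(0, 6, 200))
        noisy = clean + np.where(rng.random(200) < 0.05, 3.0, 0.0)   # impulsive noise

        median_w = np.zeros(5); median_w[2] = 1.0    # median filter as a special case
        print(np.mean((l_filter(noisy, median_w) - clean) ** 2))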

  10. Statistical Optimality in Multipartite Ranking and Ordinal Regression.

    PubMed

    Uematsu, Kazuki; Lee, Yoonkyung

    2015-05-01

    Statistical optimality in multipartite ranking is investigated as an extension of bipartite ranking. We consider the optimality of ranking algorithms through minimization of the theoretical risk which combines pairwise ranking errors of ordinal categories with differential ranking costs. The extension shows that for a certain class of convex loss functions including exponential loss, the optimal ranking function can be represented as a ratio of weighted conditional probability of upper categories to lower categories, where the weights are given by the misranking costs. This result also bridges traditional ranking methods such as proportional odds model in statistics with various ranking algorithms in machine learning. Further, the analysis of multipartite ranking with different costs provides a new perspective on non-smooth list-wise ranking measures such as the discounted cumulative gain and preference learning. We illustrate our findings with simulation study and real data analysis.

  11. Taguchi's off line method and Multivariate loss function approach for quality management and optimization of process parameters -A review

    NASA Astrophysics Data System (ADS)

    Bharti, P. K.; Khan, M. I.; Singh, Harbinder

    2010-10-01

    Off-line quality control is considered to be an effective approach to improve product quality at a relatively low cost. The Taguchi method is one of the conventional approaches for this purpose. Through this approach, engineers can determine a feasible combination of design parameters such that the variability of a product's response can be reduced and the mean is close to the desired target. The traditional Taguchi method was focused on ensuring good performance at the parameter design stage with one quality characteristic, but most products and processes have multiple quality characteristics. The optimal parameter design minimizes the total quality loss for multiple quality characteristics. Several studies have presented approaches addressing multiple quality characteristics. Most of these papers were concerned with finding the parameter combination that maximizes the signal-to-noise (SN) ratios. The results reveal that the advantages of this approach are that the optimal parameter design is the same as that of the traditional Taguchi method for a single quality characteristic, and that the optimal design maximizes the reduction in total quality loss for multiple quality characteristics. This paper presents a literature review on solving multi-response problems in the Taguchi method and its successful implementation in various industries.
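
    For reference, a sketch of the three standard Taguchi signal-to-noise ratios that such studies typically maximize; the replicated responses in the example are invented.

        import numpy as np

        def sn_larger_is_better(y):
            y = np.asarray(y, dtype=float)
            return -10.0 * np.log10(np.mean(1.0 / y ** 2))

        def sn_smaller_is_better(y):
            y = np.asarray(y, dtype=float)
            return -10.0 * np.log10(np.mean(y ** 2))

        def sn_nominal_is_best(y):
            y = np.asarray(y, dtype=float)
            return 10.0 * np.log10(y.mean() ** 2 / y.var(ddof=1))

        # Replicated responses for one factor-level combination (illustrative).
        print(sn_larger_is_better([52.1, 50.3, 53.0]),
              sn_smaller_is_better([0.12, 0.15, 0.11]),
              sn_nominal_is_best([9.8, 10.1, 10.0]))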

  12. Cost-effectiveness of prostate cancer screening: a simulation study based on ERSPC data.

    PubMed

    Heijnsdijk, E A M; de Carvalho, T M; Auvinen, A; Zappa, M; Nelen, V; Kwiatkowski, M; Villers, A; Páez, A; Moss, S M; Tammela, T L J; Recker, F; Denis, L; Carlsson, S V; Wever, E M; Bangma, C H; Schröder, F H; Roobol, M J; Hugosson, J; de Koning, H J

    2015-01-01

    The results of the European Randomized Study of Screening for Prostate Cancer (ERSPC) trial showed a statistically significant 29% prostate cancer mortality reduction for the men screened in the intervention arm and a 23% negative impact on the life-years gained because of quality of life. However, alternative prostate-specific antigen (PSA) screening strategies for the population may exist, optimizing the effects on mortality reduction, quality of life, overdiagnosis, and costs. Based on data of the ERSPC trial, we predicted the numbers of prostate cancers diagnosed, prostate cancer deaths averted, life-years and quality-adjusted life-years (QALY) gained, and cost-effectiveness of 68 screening strategies starting at age 55 years, with a PSA threshold of 3, using microsimulation modeling. The screening strategies varied by age to stop screening and screening interval (one to 14 years or once in a lifetime screens), and therefore number of tests. Screening at short intervals of three years or less was more cost-effective than using longer intervals. Screening at ages 55 to 59 years with two-year intervals had an incremental cost-effectiveness ratio of $73000 per QALY gained and was considered optimal. With this strategy, lifetime prostate cancer mortality reduction was predicted as 13%, and 33% of the screen-detected cancers were overdiagnosed. When better quality of life for the post-treatment period could be achieved, an older age of 65 to 72 years for ending screening was obtained. Prostate cancer screening can be cost-effective when it is limited to two or three screens between ages 55 to 59 years. Screening above age 63 years is less cost-effective because of loss of QALYs because of overdiagnosis. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
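
    A one-line sketch of the incremental cost-effectiveness ratio underlying figures such as the $73,000 per QALY quoted above; the costs and QALYs in the example are invented.

        def icer(cost_new, qaly_new, cost_ref, qaly_ref):
            """Incremental cost-effectiveness ratio: extra cost per QALY gained."""
            return (cost_new - cost_ref) / (qaly_new - qaly_ref)

        # Hypothetical totals for a screening strategy vs. no screening.
        print(icer(cost_new=4.1e6, qaly_new=56.0, cost_ref=2.5e6, qaly_ref=34.0))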

  13. Optimal colour quality of LED clusters based on memory colours.

    PubMed

    Smet, Kevin; Ryckaert, Wouter R; Pointer, Michael R; Deconinck, Geert; Hanselaer, Peter

    2011-03-28

    The spectral power distributions of tri- and tetrachromatic clusters of Light-Emitting-Diodes, composed of simulated and commercially available LEDs, were optimized with a genetic algorithm to maximize the luminous efficacy of radiation and the colour quality as assessed by the memory colour quality metric developed by the authors. The trade-off of the colour quality as assessed by the memory colour metric and the luminous efficacy of radiation was investigated by calculating the Pareto optimal front using the NSGA-II genetic algorithm. Optimal peak wavelengths and spectral widths of the LEDs were derived, and over half of them were found to be close to Thornton's prime colours. The Pareto optimal fronts of real LED clusters were always found to be smaller than those of the simulated clusters. The effect of binning on designing a real LED cluster was investigated and was found to be quite large. Finally, a real LED cluster of commercially available AlGaInP, InGaN and phosphor white LEDs was optimized to obtain a higher score on memory colour quality scale than its corresponding CIE reference illuminant.

  14. Taguchi Based Performance and Reliability Improvement of an Ion Chamber Amplifier for Enhanced Nuclear Reactor Safety

    NASA Astrophysics Data System (ADS)

    Kulkarni, R. D.; Agarwal, Vivek

    2008-08-01

    An ion chamber amplifier (ICA) is used as a safety device for neutronic power (flux) measurement in regulation and protection systems of nuclear reactors. Therefore, performance reliability of an ICA is an important issue. Appropriate quality engineering is essential to achieve a robust design and performance of the ICA circuit. It is observed that the low input bias current operational amplifiers used in the input stage of the ICA circuit are the most critical devices for proper functioning of the ICA. They are very sensitive to the gamma radiation present in their close vicinity. Therefore, the response of the ICA deteriorates with exposure to gamma radiation resulting in a decrease in the overall reliability, unless desired performance is ensured under all conditions. This paper presents a performance enhancement scheme for an ICA operated in the nuclear environment. The Taguchi method, which is a proven technique for reliability enhancement, has been used in this work. It is demonstrated that if a statistical, optimal design approach, like the Taguchi method is used, the cost of high quality and reliability may be brought down drastically. The complete methodology and statistical calculations involved are presented, as are the experimental and simulation results to arrive at a robust design of the ICA.

  15. Obtaining high resolution XUV coronal images

    NASA Technical Reports Server (NTRS)

    Golub, L.; Spiller, E.

    1992-01-01

    Photographs obtained during three flights of an 11 inch diameter normal incidence soft X-ray (wavelength 63.5 Å) telescope are analyzed and the data are compared to the results expected from tests of the mirror surfaces. Multilayer coated X-ray telescopes have the potential for 0.01 arcsec resolution, and there is optimism that such high quality mirrors can be built. Some of the factors which enter into the performance actually achieved in practice are as follows: quality of the mirror substrate, quality of the multilayer coating, and number of photons collected. Measurements of multilayer mirrors show that the actual performance achieved in the solar X-ray images demonstrates a reduction in the scattering compared to that calculated from the topography of the top surface of the multilayer. In the brief duration of a rocket flight, the resolution is also limited by counting statistics from the number of photons collected. At X-ray Ultraviolet (XUV) wavelengths from 171 to 335 Å the photon flux should be greater than 10^10 ph/sec, so that a resolution better than 0.1 arcsec might be achieved, if mirror quality does not provide a limit first. In a satellite, a large collecting area will be needed for the highest resolution.

  16. Social support mediates the association between benefit finding and quality of life in caregivers.

    PubMed

    Brand, Charles; Barry, Lorna; Gallagher, Stephen

    2016-06-01

    The psychosocial pathways underlying associations between benefit finding and quality of life are poorly understood. Here, we examined associations between benefit finding, social support, optimism and quality of life in a sample of 84 caregivers. Results revealed that quality of life was predicted by benefit finding, optimism and social support. Moreover, the association between benefit finding and quality of life was explained by social support, but not optimism; caregivers who reported greater benefit finding perceived their social support to be higher and this, in turn, had a positive effect on their overall quality of life. These results underscore the importance of harnessing benefit finding to enhance caregiver quality of life. © The Author(s) 2014.

  17. On optimal current patterns for electrical impedance tomography.

    PubMed

    Demidenko, Eugene; Hartov, Alex; Soni, Nirmal; Paulsen, Keith D

    2005-02-01

    We develop a statistical criterion for optimal patterns in planar circular electrical impedance tomography. These patterns minimize the total variance of the estimation for the resistance or conductance matrix. It is shown that trigonometric patterns (Isaacson, 1986), originally derived from the concept of distinguishability, are a special case of our optimal statistical patterns. New optimal random patterns are introduced. Recovering the electrical properties of the measured body is greatly simplified when optimal patterns are used. The Neumann-to-Dirichlet map and the optimal patterns are derived for a homogeneous medium with an arbitrary distribution of the electrodes on the periphery. As a special case, optimal patterns are developed for a practical EIT system with a finite number of electrodes. For a general nonhomogeneous medium, with no a priori restriction, the optimal patterns for the resistance and conductance matrix are the same. However, for a homogeneous medium, the best current pattern is the worst voltage pattern and vice versa. We study the effect of the number and the width of the electrodes on the estimate of resistivity and conductivity in a homogeneous medium. We confirm experimentally that the optimal patterns produce minimum conductivity variance in a homogeneous medium. Our statistical model is able to discriminate between a homogenous agar phantom and one with a 2 mm air hole with error probability (p-value) 1/1000.
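
    A small sketch generating trigonometric current patterns for equally spaced electrodes (the patterns the authors recover as a special case of their optimal statistical patterns); the choice of 16 electrodes is arbitrary.

        import numpy as np

        def trigonometric_patterns(n_electrodes):
            """Trigonometric current patterns for L electrodes at angles theta_k:
            cos(m*theta) for m = 1..L/2 and sin(m*theta) for m = 1..L/2-1.
            Each pattern sums to ~0, as injected currents must."""
            theta = 2 * np.pi * np.arange(n_electrodes) / n_electrodes
            patterns = []
            for m in range(1, n_electrodes // 2 + 1):
                patterns.append(np.cos(m * theta))
                if m < n_electrodes // 2:
                    patterns.append(np.sin(m * theta))
            return np.array(patterns)        # (L-1) linearly independent patterns

        P = trigonometric_patterns(16)
        print(P.shape, np.abs(P.sum(axis=1)).max())   # 15 patterns, each ~zero-sum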

  18. Influence on Visual Quality of Intraoperative Orientation of Asymmetric Intraocular Lenses.

    PubMed

    Bonaque-González, Sergio; Ríos, Susana; Amigó, Alfredo; López-Gil, Norberto

    2015-10-01

    To evaluate visual quality when changing the intraocular orientation of the Lentis Mplus LS-312MF nonrotational symmetric +3.00 diopters aspheric multifocal intraocular lens ([IOL] Oculentis GmbH, Berlin, Germany) in normal eyes. An artificial eye was used to measure the in vitro wavefront of the IOL. The corneal topography of 20 healthy patients was obtained. For each eye, a computational analysis simulated the implantation of the IOL. The modulation transfer function (MTF) and an image quality parameter (visually modulated transfer function [VSMTF] metric) were calculated for a 5.0-mm pupil and for three conditions: distance, intermediate, and near vision. The procedure was repeated for each eye after a rotation of the IOL with respect to the cornea from 0° to 360° in 1° steps. Statistical analysis showed significant differences in mean VSMTF values between orientations for distance vision. Optimal orientation of the IOL (different for each eye) showed a mean improvement of 58% ± 19% (range: 20% to 121%) in VSMTF values with respect to the worst possible orientation. For these orientations, intermediate and near vision quality were statistically indistinguishable. The MTFs were different between orientations, showing a mean difference of approximately 5 cycles per degree in the maximum spatial frequencies that can be transferred between the best and the worst orientations for distance vision. The results suggest that implantation of this nonrotational symmetric IOL should improve visual outcomes if it is oriented to coincide with a customized meridian. A simple, practical method is proposed to find an approximation to the angle that an Mplus IOL should be inserted. Copyright 2015, SLACK Incorporated.

  19. HIV quality report cards: impact of case-mix adjustment and statistical methods.

    PubMed

    Ohl, Michael E; Richardson, Kelly K; Goto, Michihiko; Vaughan-Sarrazin, Mary; Schweizer, Marin L; Perencevich, Eli N

    2014-10-15

    There will be increasing pressure to publicly report and rank the performance of healthcare systems on human immunodeficiency virus (HIV) quality measures. To inform discussion of public reporting, we evaluated the influence of case-mix adjustment when ranking individual care systems on the viral control quality measure. We used data from the Veterans Health Administration (VHA) HIV Clinical Case Registry and administrative databases to estimate case-mix adjusted viral control for 91 local systems caring for 12 368 patients. We compared results using 2 adjustment methods, the observed-to-expected estimator and the risk-standardized ratio. Overall, 10 913 patients (88.2%) achieved viral control (viral load ≤400 copies/mL). Prior to case-mix adjustment, system-level viral control ranged from 51% to 100%. Seventeen (19%) systems were labeled as low outliers (performance significantly below the overall mean) and 11 (12%) as high outliers. Adjustment for case mix (patient demographics, comorbidity, CD4 nadir, time on therapy, and income from VHA administrative databases) reduced the number of low outliers by approximately one-third, but results differed by method. The adjustment model had moderate discrimination (c statistic = 0.66), suggesting potential for unadjusted risk when using administrative data to measure case mix. Case-mix adjustment affects rankings of care systems on the viral control quality measure. Given the sensitivity of rankings to selection of case-mix adjustment methods-and potential for unadjusted risk when using variables limited to current administrative databases-the HIV care community should explore optimal methods for case-mix adjustment before moving forward with public reporting. Published by Oxford University Press on behalf of the Infectious Diseases Society of America 2014. This work is written by (a) US Government employee(s) and is in the public domain in the US.
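
    A minimal sketch of the observed-to-expected estimator for case-mix-adjusted viral control by care system; the patient-level expected probabilities would come from a fitted case-mix model (invented here), and the risk-standardized ratio, which requires a hierarchical model, is not shown.

        import pandas as pd

        # Hypothetical patient-level data: care system, model-predicted probability
        # of viral control, and the observed outcome.
        df = pd.DataFrame({
            "system":   ["A"] * 4 + ["B"] * 4,
            "expected": [0.90, 0.85, 0.70, 0.95, 0.80, 0.60, 0.75, 0.88],
            "observed": [1, 1, 0, 1, 1, 0, 1, 1],
        })

        # Observed-to-expected estimator: adjusted rate = (O / E) * overall rate.
        overall = df["observed"].mean()
        oe = (df.groupby("system")["observed"].sum()
              / df.groupby("system")["expected"].sum())
        print((oe * overall).round(3))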

  20. [Male sterility and its association with genital disease and environmental factors].

    PubMed

    Merino Ruiz, M C; De León Cervantes, M G; García Flores, R F

    1995-10-01

    Semen quality may be affected by many factors: there is evidence that conditions such as varicocele, cryptorchidism, orchitis and bacterial infections, as well as exposure to physical agents such as heat, exposure to chemical substances, and ingestion of alcohol and drugs, can impair semen quality. The objective of this study was to investigate the risk that exposure to these factors poses to semen quality. The study was carried out prospectively in a group of males at the Clínica de Infertilidad, Unidad de Biología de la Reproducción del Hospital Universitario Dr. J.E. González. Ninety-nine males were studied; they completed a targeted questionnaire on prior exposure to environmental factors and previously treated urologic pathology. Semen analysis (spermatobioscopy) was performed and classified according to WHO criteria. Two groups were formed: individuals with normal semen analysis (n = 25) and those with abnormal results (n = 74). Incidence ratios, chi-square tests and attributable risk were applied to determine the impact that the different factors may have on semen quality. The alterations found in semen were asthenozoospermia (n = 58), hypospermia (n = 22), oligozoospermia (n = 18), teratozoospermia (n = 7), polyzoospermia (n = 7) and azoospermia (n = 6). The results of these statistical tests show that these alterations are associated with tobacco use, exposure to chemical substances and physical aggressors, and previously corrected anatomic anomalies. Obtaining this information is considered valuable because, once the unfavorable factors are eliminated, conditions are improved for spermatogenesis to occur under optimal conditions.

  1. A comparison of video review and feedback device measurement of chest compressions quality during pediatric cardiopulmonary resuscitation.

    PubMed

    Hsieh, Ting-Chang; Wolfe, Heather; Sutton, Robert; Myers, Sage; Nadkarni, Vinay; Donoghue, Aaron

    2015-08-01

    To describe chest compression (CC) rate, depth, and leaning during pediatric cardiopulmonary resuscitation (CPR) as measured by two simultaneous methods, and to assess the accuracy and reliability of video review in measuring CC quality. Resuscitations in a pediatric emergency department are videorecorded for quality improvement. Patients aged 8-18 years receiving CPR under videorecording were eligible for inclusion. CPR was recorded by a pressure/accelerometer feedback device and tabulated in 30-s epochs of uninterrupted CC. Investigators reviewed videorecorded CPR and measured rate, depth, and release by observation. Raters categorized epochs as 'meeting criteria' if 80% of CCs in an epoch were done with appropriate depth (>45 mm) and/or release (<2.5 kg leaning). Comparison between device measurement and video was made by Spearman's ρ for rate and by κ statistic for depth and release. Interrater reliability for depth and release was measured by κ statistic. Five patients underwent videorecorded CPR using the feedback device. 97 30-s epochs of CCs were analyzed. CCs met criteria for rate in 74/97 (76%) of epochs; depth in 38/97 (39%); release in 82/97 (84%). Agreement between video and feedback device for rate was good (ρ = 0.77); agreement was poor for depth and release (κ 0.04-0.41). Interrater reliability for depth and release measured by video was poor (κ 0.04-0.49). Video review measured CC rate accurately; depth and release were not reliably or accurately assessed by video. Future research should focus on the optimal combination of methods for measuring CPR quality. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
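
    A hedged sketch of the two agreement measures used above, Spearman's rho for the rate comparison and Cohen's kappa for the binary depth/release criteria; the per-epoch values are invented.

        import numpy as np
        from scipy.stats import spearmanr
        from sklearn.metrics import cohen_kappa_score

        # Hypothetical per-epoch measurements: CC rate from video vs. device,
        # and binary "meets depth criteria" judgments from the two methods.
        rate_video  = np.array([108, 115, 122, 98, 130, 112, 119, 105])
        rate_device = np.array([110, 118, 120, 101, 128, 115, 117, 108])

        depth_video  = np.array([1, 0, 1, 1, 0, 0, 1, 0])
        depth_device = np.array([1, 0, 0, 1, 0, 1, 1, 0])

        rho, _ = spearmanr(rate_video, rate_device)
        kappa = cohen_kappa_score(depth_video, depth_device)
        print(round(rho, 2), round(kappa, 2))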

  2. Artificial Intelligence Approach to Support Statistical Quality Control Teaching

    ERIC Educational Resources Information Center

    Reis, Marcelo Menezes; Paladini, Edson Pacheco; Khator, Suresh; Sommer, Willy Arno

    2006-01-01

    Statistical quality control--SQC (consisting of Statistical Process Control, Process Capability Studies, Acceptance Sampling and Design of Experiments) is a very important tool to obtain, maintain and improve the Quality level of goods and services produced by an organization. Despite its importance, and the fact that it is taught in technical and…

  3. The potential of coordinated reservoir operation for flood mitigation in large basins - A case study on the Bavarian Danube using coupled hydrological-hydrodynamic models

    NASA Astrophysics Data System (ADS)

    Seibert, S. P.; Skublics, D.; Ehret, U.

    2014-09-01

    The coordinated operation of reservoirs in large-scale river basins has great potential to improve flood mitigation. However, this requires large scale hydrological models to translate the effect of reservoir operation to downstream points of interest, in a quality sufficient for the iterative development of optimized operation strategies. And, of course, it requires reservoirs large enough to make a noticeable impact. In this paper, we present and discuss several methods dealing with these prerequisites for reservoir operation using the example of three major floods in the Bavarian Danube basin (45,000 km2) and nine reservoirs therein: We start by presenting an approach for multi-criteria evaluation of model performance during floods, including aspects of local sensitivity to simulation quality. Then we investigate the potential of joint hydrologic-2d-hydrodynamic modeling to improve model performance. Based on this, we evaluate upper limits of reservoir impact under idealized conditions (perfect knowledge of future rainfall) with two methods: Detailed simulations and statistical analysis of the reservoirs' specific retention volume. Finally, we investigate to what degree reservoir operation strategies optimized for local (downstream vicinity to the reservoir) and regional (at the Danube) points of interest are compatible. With respect to model evaluation, we found that the consideration of local sensitivities to simulation quality added valuable information not included in the other evaluation criteria (Nash-Sutcliffe efficiency and Peak timing). With respect to the second question, adding hydrodynamic models to the model chain did, contrary to our expectations, not improve simulations, despite the fact that under idealized conditions (using observed instead of simulated lateral inflow) the hydrodynamic models clearly outperformed the routing schemes of the hydrological models. Apparently, the advantages of hydrodynamic models could not be fully exploited when fed by output from hydrological models afflicted with systematic errors in volume and timing. This effect could potentially be reduced by joint calibration of the hydrological-hydrodynamic model chain. Finally, based on the combination of the simulation-based and statistical impact assessment, we identified one reservoir potentially useful for coordinated, regional flood mitigation for the Danube. While this finding is specific to our test basin, the more interesting and generally valid finding is that operation strategies optimized for local and regional flood mitigation are not necessarily mutually exclusive, sometimes they are identical, sometimes they can, due to temporal offsets, be pursued simultaneously.
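
    Two of the evaluation criteria mentioned above, Nash-Sutcliffe efficiency and peak timing error, in a minimal form; the hydrographs in the example are synthetic.

        import numpy as np

        def nse(observed, simulated):
            """Nash-Sutcliffe efficiency: 1 - SSE / variance of the observations
            (1 is perfect, 0 is no better than the observed mean)."""
            o = np.asarray(observed, dtype=float)
            s = np.asarray(simulated, dtype=float)
            return 1.0 - np.sum((o - s) ** 2) / np.sum((o - o.mean()) ** 2)

        def peak_timing_error(observed, simulated, dt_hours=1.0):
            """Timing offset of the simulated flood peak, in hours."""
            return (np.argmax(simulated) - np.argmax(observed)) * dt_hours

        obs = np.array([50, 80, 200, 450, 700, 620, 400, 250, 150, 90], float)
        sim = np.array([55, 90, 180, 400, 680, 650, 430, 270, 160, 95], float)
        print(round(nse(obs, sim), 3), peak_timing_error(obs, sim))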

  4. A case study: application of statistical process control tool for determining process capability and sigma level.

    PubMed

    Chopra, Vikram; Bairagi, Mukesh; Trivedi, P; Nagar, Mona

    2012-01-01

    Statistical process control is the application of statistical methods to the measurement and analysis of process variation. Various regulatory documents, such as the Validation Guidance for Industry (2011), International Conference on Harmonisation ICH Q10 (2009), the Health Canada guidelines (2009), the Health Science Authority, Singapore: Guidance for Product Quality Review (2008), and International Organization for Standardization ISO-9000:2005, provide regulatory support for the application of statistical process control for better process control and understanding. In this study, risk assessments, normal probability distributions, control charts, and capability charts are employed for selection of critical quality attributes, determination of normal probability distribution, statistical stability, and capability of production processes, respectively. The objective of this study is to determine tablet production process quality in the form of sigma process capability. By interpreting data and graph trends, forecasting of critical quality attributes, sigma process capability, and stability of the process were studied. The overall study contributes to an assessment of the process at the sigma level with respect to out-of-specification attributes produced. Finally, the study points to areas where the application of quality improvement and quality risk assessment principles can achieve six sigma-capable processes. Statistical process control is the most advantageous tool for determining the quality of any production process. This tool is new for the pharmaceutical tablet production process. In the case of pharmaceutical tablet production processes, the quality control parameters act as quality assessment parameters. Application of risk assessment enables selection of critical quality attributes among the quality control parameters. Sequential application of normality distributions, control charts, and capability analyses provides a valid statistical process control study of the process. Interpretation of such a study provides information about stability, process variability, changing trends, and quantification of process ability against defective production. Comparative evaluation of critical quality attributes by Pareto charts identifies the least capable and most variable process, which is the candidate for improvement. Statistical process control thus proves to be an important tool for six sigma-capable process development and continuous quality improvement.
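
    A compact sketch of the capability side of such a study: Cp, Cpk and a short-term sigma level for one critical quality attribute; the tablet-weight data and specification limits are invented, and control-chart construction is not shown.

        import numpy as np

        def process_capability(samples, lsl, usl):
            """Cp, Cpk and an approximate sigma level for a critical quality attribute."""
            x = np.asarray(samples, dtype=float)
            mu, sigma = x.mean(), x.std(ddof=1)
            cp = (usl - lsl) / (6 * sigma)
            cpk = min(usl - mu, mu - lsl) / (3 * sigma)
            sigma_level = 3 * cpk          # short-term sigma level (no 1.5-sigma shift)
            return cp, cpk, sigma_level

        # Hypothetical tablet weights (mg) against specification limits of 95-105 mg.
        rng = np.random.default_rng(5)
        weights = rng.normal(100.2, 1.1, 200)
        print([round(v, 2) for v in process_capability(weights, 95, 105)])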

  5. Effects of moist- and dry-heat cooking on the meat quality, microstructure and sensory characteristics of native chicken meat.

    PubMed

    Chumngoen, Wanwisa; Chen, Chih-Feng; Tan, Fa-Jui

    2018-01-01

    This study investigates the effects of moist- (water-cooking; WC) and dry-heat (oven-cooking; OC) on the quality, microstructure and sensory characteristics of native chicken breast meat. The results revealed that OC meat had a significantly higher cooking time, cooking loss, and shear force values and lower L* values. Protein solubility decreased after cooking in both cooking methods; however, no statistical difference was observed between WC and OC meats, whereas collagen solubility and myofibrillar fragmentation index (MFI) increased after cooking and WC meat exhibited higher collagen solubility and MFI (P < 0.05). The fiber diameter and sarcomere length decreased substantially after cooking, and fibril shrinkage was noticeable in OC meat (P < 0.05). Descriptive sensory analysis revealed that WC meat exhibited a significantly higher moisture release and lower initial hardness, chewdown hardness and residual loose particles. A darker color and enhanced chickeny flavor were observed for OC meat. Based on the unique sensory and physicochemical characteristics in demand, producers should employ appropriate cooking methods to optimize native chicken meat quality. © 2017 Japanese Society of Animal Science.

  6. Optimization protocol for the extraction of 6-gingerol and 6-shogaol from Zingiber officinale var. rubrum Theilade and improving antioxidant and anticancer activity using response surface methodology.

    PubMed

    Ghasemzadeh, Ali; Jaafar, Hawa Z E; Rahmat, Asmah

    2015-07-30

    Analysis and extraction of plant matrices are important processes for the development, modernization, and quality control of herbal formulations. Response surface methodology is a collection of statistical and mathematical techniques that are used to optimize the range of variables in various experimental processes to reduce the number of experimental runs, cost, and time, compared to other methods. Response surface methodology was applied for optimizing reflux extraction conditions for achieving high 6-gingerol and 6-shogaol contents, and high antioxidant activity in Zingiber officinale var. rubrum Theilade. The two-factor central composite design was employed to determine the effects of two independent variables, namely extraction temperature (X1: 50-80 °C) and time (X2: 2-4 h), on the properties of the extracts. The 6-gingerol and 6-shogaol contents were measured using ultra-performance liquid chromatography. The antioxidant activity of the rhizome extracts was determined by means of the 1,1-diphenyl-2-picrylhydrazyl assay. Anticancer activity of optimized extracts against HeLa cancer cell lines was measured using the MTT (3-(4,5-dimethylthiazol-2-yl)-2,5-diphenyltetrazolium bromide) assay. Increasing the extraction temperature and time induced significant responses in the variables. The optimum extraction condition for all responses was at 76.9 °C for 3.4 h. Under the optimum condition, the corresponding predicted response values for 6-gingerol, 6-shogaol, and the antioxidant activity were 2.89 mg/g DW, 1.85 mg/g DW, and 84.3%, respectively. 6-gingerol and 6-shogaol were extracted under the optimized condition to check the validity of the models. The values were 2.92 mg/g DW for 6-gingerol, 1.88 mg/g DW for 6-shogaol, and 84.0% for the antioxidant activity. The experimental values agreed with those predicted, thus indicating suitability of the models employed and the success of RSM in optimizing the extraction condition. By optimizing the reflux extraction, the anticancer activity of the extracts against HeLa cancer cells was enhanced by about 16.8%. The half-maximal inhibitory concentration (IC50) values of the optimized and unoptimized extracts were 20.9 and 38.4 μg/mL, respectively. The optimized extract showed more distinct anticancer activity against HeLa cancer cells at a concentration of 40 μg/mL (P < 0.01), without toxicity to normal cells. The results indicated that the pharmaceutical quality of ginger could be improved significantly by optimizing the extraction process using response surface methodology.
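
    A sketch of a two-factor central composite design and quadratic response-surface fit of the kind described above; the coded-to-natural mapping, axial distance, and synthetic responses are assumptions for illustration, not the study's data.

        import numpy as np
        from itertools import product

        # Two-factor central composite design in coded units: factorial points,
        # axial points (alpha = sqrt(2) for rotatability), and center replicates.
        alpha = np.sqrt(2)
        factorial = list(product([-1, 1], repeat=2))
        axial = [(-alpha, 0), (alpha, 0), (0, -alpha), (0, alpha)]
        center = [(0.0, 0.0)] * 5
        design = np.array(factorial + axial + center)

        def to_natural(coded, center_val, half_range):
            return center_val + coded * half_range

        # Map coded levels to extraction temperature (about 50-80 C) and time (about 2-4 h);
        # these would be the settings actually run in the lab.
        temps = to_natural(design[:, 0], 65.0, 15.0)
        times = to_natural(design[:, 1], 3.0, 1.0)

        # Fit a full quadratic response surface
        # y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
        # to synthetic responses by least squares.
        x1, x2 = design[:, 0], design[:, 1]
        X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1 ** 2, x2 ** 2])
        rng = np.random.default_rng(6)
        y = 2.5 + 0.3 * x1 + 0.2 * x2 - 0.15 * x1 ** 2 + rng.normal(0, 0.05, x1.size)
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        print(np.round(beta, 3))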

  7. Statistical and engineering methods for model enhancement

    NASA Astrophysics Data System (ADS)

    Chang, Chia-Jung

    Models which describe the performance of physical process are essential for quality prediction, experimental planning, process control and optimization. Engineering models developed based on the underlying physics/mechanics of the process such as analytic models or finite element models are widely used to capture the deterministic trend of the process. However, there usually exists stochastic randomness in the system which may introduce the discrepancy between physics-based model predictions and observations in reality. Alternatively, statistical models can be used to develop models to obtain predictions purely based on the data generated from the process. However, such models tend to perform poorly when predictions are made away from the observed data points. This dissertation contributes to model enhancement research by integrating physics-based model and statistical model to mitigate the individual drawbacks and provide models with better accuracy by combining the strengths of both models. The proposed model enhancement methodologies including the following two streams: (1) data-driven enhancement approach and (2) engineering-driven enhancement approach. Through these efforts, more adequate models are obtained, which leads to better performance in system forecasting, process monitoring and decision optimization. Among different data-driven enhancement approaches, Gaussian Process (GP) model provides a powerful methodology for calibrating a physical model in the presence of model uncertainties. However, if the data contain systematic experimental errors, the GP model can lead to an unnecessarily complex adjustment of the physical model. In Chapter 2, we proposed a novel enhancement procedure, named as “Minimal Adjustment”, which brings the physical model closer to the data by making minimal changes to it. This is achieved by approximating the GP model by a linear regression model and then applying a simultaneous variable selection of the model and experimental bias terms. Two real examples and simulations are presented to demonstrate the advantages of the proposed approach. Different from enhancing the model based on data-driven perspective, an alternative approach is to focus on adjusting the model by incorporating the additional domain or engineering knowledge when available. This often leads to models that are very simple and easy to interpret. The concepts of engineering-driven enhancement are carried out through two applications to demonstrate the proposed methodologies. In the first application where polymer composite quality is focused, nanoparticle dispersion has been identified as a crucial factor affecting the mechanical properties. Transmission Electron Microscopy (TEM) images are commonly used to represent nanoparticle dispersion without further quantifications on its characteristics. In Chapter 3, we developed the engineering-driven nonhomogeneous Poisson random field modeling strategy to characterize nanoparticle dispersion status of nanocomposite polymer, which quantitatively represents the nanomaterial quality presented through image data. The model parameters are estimated through the Bayesian MCMC technique to overcome the challenge of limited amount of accessible data due to the time consuming sampling schemes. The second application is to calibrate the engineering-driven force models of laser-assisted micro milling (LAMM) process statistically, which facilitates a systematic understanding and optimization of targeted processes. 
In Chapter 4, the force prediction interval has been derived by incorporating the variability in the runout parameters as well as the variability in the measured cutting forces. The experimental results indicate that the model predicts the cutting force profile with good accuracy using a 95% confidence interval. To conclude, this dissertation is the research drawing attention to model enhancement, which has considerable impacts on modeling, design, and optimization of various processes and systems. The fundamental methodologies of model enhancement are developed and further applied to various applications. These research activities developed engineering compliant models for adequate system predictions based on observational data with complex variable relationships and uncertainty, which facilitate process planning, monitoring, and real-time control.
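
    A hedged sketch of the data-driven enhancement idea: model the discrepancy between a physics-based prediction and observations with a Gaussian process and add it back to the physics model, in the spirit of Kennedy-O'Hagan style calibration. The physics stand-in, kernel choices, and data are invented, and the "Minimal Adjustment" variable-selection step described above is not implemented.

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, WhiteKernel

        def physics_model(x):
            """Stand-in engineering model with a systematic deficiency."""
            return 2.0 * x

        rng = np.random.default_rng(7)
        x = np.linspace(0, 1, 20)[:, None]
        y_obs = 2.0 * x.ravel() + 0.5 * np.sin(6 * x.ravel()) + rng.normal(0, 0.05, 20)

        # Model the discrepancy (observations minus physics prediction) with a GP.
        discrepancy = y_obs - physics_model(x.ravel())
        gp = GaussianProcessRegressor(kernel=RBF(0.2) + WhiteKernel(0.01),
                                      normalize_y=True).fit(x, discrepancy)

        # Enhanced prediction = physics model + learned discrepancy.
        x_new = np.array([[0.33]])
        y_enhanced = physics_model(x_new.ravel()) + gp.predict(x_new)
        print(y_enhanced)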

  8. Risk alignment in health care quality and financing: optimizing value.

    PubMed

    Granata, A V

    1998-01-01

    How should health care best consolidate rational cost control while preserving and enhancing quality? That is, how can a system best optimize value? A limitation of many current health management modalities may be that the power to control health spending has been expropriated from physician providers, while they are still fully responsible for quality. Assigning responsibility without authority is a significant predicament. There are growing indications that well-organized, well-managed groups of high quality physicians may be able to directly manage both types of risk-quality and financial. The best way to optimize responsibility and authority, and to control financial and quality risks, is to place such responsibility and authority within the same entity.

  9. Optimization and comparison of simultaneous and separate acquisition protocols for dual isotope myocardial perfusion SPECT

    PubMed Central

    Ghaly, Michael; Links, Jonathan M; Frey, Eric C

    2015-01-01

    Dual-isotope simultaneous-acquisition (DISA) rest-stress myocardial perfusion SPECT (MPS) protocols offer a number of advantages over separate acquisition. However, crosstalk contamination due to scatter in the patient and interactions in the collimator degrade image quality. Compensation can reduce the effects of crosstalk, but does not entirely eliminate image degradations. Optimizing acquisition parameters could further reduce the impact of crosstalk. In this paper we investigate the optimization of the rest Tl-201 energy window width and relative injected activities using the ideal observer (IO), a realistic digital phantom population and Monte Carlo (MC) simulated Tc-99m and Tl-201 projections as a means to improve image quality. We compared performance on a perfusion defect detection task for Tl-201 acquisition energy window widths varying from 4 to 40 keV centered at 72 keV for a camera with a 9% energy resolution. We also investigated 7 different relative injected activities, defined as the ratio of Tc-99m and Tl-201 activities, while keeping the total effective dose constant at 13.5 mSv. For each energy window and relative injected activity, we computed the IO test statistics using a Markov chain Monte Carlo (MCMC) method for an ensemble of 1,620 triplets of fixed and reversible defect-present, and defect-absent noisy images modeling realistic background variations. The volume under the 3-class receiver operating characteristic (ROC) surface (VUS) was estimated and served as the figure of merit. For simultaneous acquisition, the IO suggested that relative Tc-to-Tl injected activity ratios of 2.6–5 and acquisition energy window widths of 16–22% were optimal. For separate acquisition, we observed a broad range of optimal relative injected activities from 2.6 to 12.1 and acquisition energy window of widths 16–22%. A negative correlation between Tl-201 injected activity and the width of the Tl-201 energy window was observed in these ranges. The results also suggested that DISA methods could potentially provide image quality as good as that obtained with separate acquisition protocols. We compared observer performance for the optimized protocols and the current clinical protocol using separate acquisition. The current clinical protocols provided better performance at a cost of injecting the patient with approximately double the injected activity of Tc-99m and Tl-201, resulting in substantially increased radiation dose. PMID:26083239

  10. Cascaded systems analysis of noise and detectability in dual-energy cone-beam CT

    PubMed Central

    Gang, Grace J.; Zbijewski, Wojciech; Webster Stayman, J.; Siewerdsen, Jeffrey H.

    2012-01-01

    Purpose: Dual-energy computed tomography and dual-energy cone-beam computed tomography (DE-CBCT) are promising modalities for applications ranging from vascular to breast, renal, hepatic, and musculoskeletal imaging. Accordingly, the optimization of imaging techniques for such applications would benefit significantly from a general theoretical description of image quality that properly incorporates factors of acquisition, reconstruction, and tissue decomposition in DE tomography. This work reports a cascaded systems analysis model that includes the Poisson statistics of x rays (quantum noise), detector model (flat-panel detectors), anatomical background, image reconstruction (filtered backprojection), DE decomposition (weighted subtraction), and simple observer models to yield a task-based framework for DE technique optimization. Methods: The theoretical framework extends previous modeling of DE projection radiography and CBCT. Signal and noise transfer characteristics are propagated through physical and mathematical stages of image formation and reconstruction. Dual-energy decomposition was modeled according to weighted subtraction of low- and high-energy images to yield the 3D DE noise-power spectrum (NPS) and noise-equivalent quanta (NEQ), which, in combination with observer models and the imaging task, yields the dual-energy detectability index (d′). Model calculations were validated with NPS and NEQ measurements from an experimental imaging bench simulating the geometry of a dedicated musculoskeletal extremities scanner. Imaging techniques, including kVp pair and dose allocation, were optimized using d′ as an objective function for three example imaging tasks: (1) kidney stone discrimination; (2) iodine vs bone in a uniform, soft-tissue background; and (3) soft tissue tumor detection on power-law anatomical background. Results: Theoretical calculations of DE NPS and NEQ demonstrated good agreement with experimental measurements over a broad range of imaging conditions. Optimization results suggest a lower fraction of total dose imparted by the low-energy acquisition, a finding consistent with previous literature. The selection of optimal kVp pair reveals the combined effect of both quantum noise and contrast in the kidney stone discrimination and soft-tissue tumor detection tasks, whereas the K-edge effect of iodine was the dominant factor in determining kVp pairs in the iodine vs bone task. The soft-tissue tumor task illustrated the benefit of dual-energy imaging in eliminating anatomical background noise and improving detectability beyond that achievable by single-energy scans. Conclusions: This work established a task-based theoretical framework that is predictive of DE image quality. The model can be utilized in optimizing a broad range of parameters in image acquisition, reconstruction, and decomposition, providing a useful tool for maximizing DE-CBCT image quality and reducing dose. PMID:22894440

  11. A Unified Statistical Rain-Attenuation Model for Communication Link Fade Predictions and Optimal Stochastic Fade Control Design Using a Location-Dependent Rain-Statistic Database

    NASA Technical Reports Server (NTRS)

    Manning, Robert M.

    1990-01-01

    A static and dynamic rain-attenuation model is presented which describes the statistics of attenuation on an arbitrarily specified satellite link for any location for which there are long-term rainfall statistics. The model may be used in the design of optimal stochastic control algorithms to mitigate the effects of attenuation and maintain link reliability. A rain-statistics database is compiled, which makes it possible to apply the model to any location in the continental U.S. with a resolution of 0.5 degrees in latitude and longitude. The model predictions are compared with experimental observations, showing good agreement.

  12. Batch and Continuous Ultrasound Assisted Extraction of Boldo Leaves (Peumus boldus Mol.).

    PubMed

    Petigny, Loïc; Périno-Issartier, Sandrine; Wajsman, Joël; Chemat, Farid

    2013-03-12

    Vegetal extracts are widely used as primary ingredients for various products, from creams to perfumes, in the pharmaceutical, nutraceutical and cosmetic industries. A concentrated and active extract is essential, as the process must extract as much soluble material as possible in a minimum time, using the least possible volume of solvent. Boldo leaf extract is of great interest to industry because it has high antioxidant activity due to high levels of flavonoids and alkaloids such as boldine. Ultrasound Assisted Extraction (UAE) has been used to improve the efficiency of plant extraction, reducing extraction time and increasing the concentration of the extract for the same amount of solvent and plant material. After a preliminary study, a response surface method was used to optimize the extraction of soluble material from the plant. The statistical analysis revealed that the optimized conditions were a sonication power of 23 W/cm2 for 40 min at a temperature of 36 °C. Compared with conventional maceration, the optimized UAE parameters provide better extraction in terms of process time (30 min instead of 120 min), yield, energy savings, cleanliness, safety and product quality.
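
    The response-surface step described above can be illustrated with a minimal sketch: fit a full quadratic model to the design points and locate the maximum of the predicted response inside the experimental region. The design matrix, yields, and factor ranges below are invented placeholders, not the authors' data.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical design points (power W/cm2, time min, temperature degC) and yields (%).
X = np.array([[10, 20, 25], [10, 60, 25], [30, 20, 25], [30, 60, 25],
              [10, 40, 45], [30, 40, 45], [20, 20, 45], [20, 60, 45],
              [20, 40, 35], [20, 40, 35], [20, 40, 35]], dtype=float)
y = np.array([12.1, 14.0, 15.2, 15.8, 13.5, 16.0, 14.2, 15.1, 16.5, 16.3, 16.6])

def quad_terms(x):
    """Full quadratic model: intercept, linear, squared, and interaction terms."""
    x = np.atleast_2d(x)
    cols = [np.ones(len(x))]
    cols += [x[:, i] for i in range(3)]
    cols += [x[:, i] ** 2 for i in range(3)]
    cols += [x[:, i] * x[:, j] for i in range(3) for j in range(i + 1, 3)]
    return np.column_stack(cols)

beta, *_ = np.linalg.lstsq(quad_terms(X), y, rcond=None)   # least-squares fit
predict = lambda x: (quad_terms(x) @ beta).item()

# Maximize the predicted yield within the experimental region.
res = minimize(lambda x: -predict(x), x0=[20.0, 40.0, 35.0],
               bounds=[(10, 30), (20, 60), (25, 45)])
print("predicted optimum (power, time, temperature):", np.round(res.x, 1))
print("predicted yield at optimum:", round(predict(res.x), 2))
```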

  13. Batch and Continuous Ultrasound Assisted Extraction of Boldo Leaves (Peumus boldus Mol.)

    PubMed Central

    Petigny, Loïc; Périno-Issartier, Sandrine; Wajsman, Joël; Chemat, Farid

    2013-01-01

    Vegetal extracts are widely used as primary ingredients for various products, from creams to perfumes, in the pharmaceutical, nutraceutical and cosmetic industries. A concentrated and active extract is essential, as the process must extract as much soluble material as possible in a minimum time, using the least possible volume of solvent. Boldo leaf extract is of great interest to industry because it has high antioxidant activity due to high levels of flavonoids and alkaloids such as boldine. Ultrasound Assisted Extraction (UAE) has been used to improve the efficiency of plant extraction, reducing extraction time and increasing the concentration of the extract for the same amount of solvent and plant material. After a preliminary study, a response surface method was used to optimize the extraction of soluble material from the plant. The statistical analysis revealed that the optimized conditions were a sonication power of 23 W/cm2 for 40 min at a temperature of 36 °C. Compared with conventional maceration, the optimized UAE parameters provide better extraction in terms of process time (30 min instead of 120 min), yield, energy savings, cleanliness, safety and product quality. PMID:23481637

  14. Rapid development of xylanase assay conditions using Taguchi methodology.

    PubMed

    Prasad Uday, Uma Shankar; Bandyopadhyay, Tarun Kanti; Bhunia, Biswanath

    2016-11-01

    The present investigation is mainly concerned with the rapid development of extracellular xylanase assay conditions by using Taguchi methodology. The extracellular xylanase was produced from Aspergillus niger (KP874102.1), a new strain isolated from a soil sample of the Baramura forest, Tripura West, India. Four physical parameters, including temperature, pH, buffer concentration and incubation time, were considered as key factors for xylanase activity and were optimized using Taguchi robust design methodology for enhanced xylanase activity. The main effects, interaction effects and optimal levels of the process factors were determined using the signal-to-noise (S/N) ratio, which the Taguchi method recommends for measuring quality characteristics. Analysis of variance (ANOVA) was performed to identify statistically significant process factors. ANOVA results showed that temperature had the largest contribution (62.58%) to xylanase activity, followed by pH (22.69%), buffer concentration (9.55%) and incubation time (5.16%). Predicted results showed that enhanced xylanase activity (81.47%) can be achieved with pH 2, temperature 50°C, buffer concentration 50 mM and incubation time 10 min.
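
    As a small illustration of the S/N analysis described above, the sketch below computes the Taguchi larger-is-better signal-to-noise ratio for replicated activity measurements at a few runs of an L9 design; the run names and response values are hypothetical placeholders.

```python
import numpy as np

def sn_larger_is_better(y):
    """Taguchi larger-is-better S/N ratio: -10 * log10(mean(1 / y^2))."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y ** 2))

# Hypothetical replicated xylanase activities (U/mL) for three of the nine L9 runs.
runs = {"run1": [41.2, 39.8, 40.5],
        "run2": [55.0, 57.3, 56.1],
        "run3": [48.4, 47.9, 49.0]}
for name, values in runs.items():
    print(name, round(sn_larger_is_better(values), 2), "dB")
```

    Averaging these S/N ratios over the runs at which each factor level appears gives the main effects, and the level with the highest mean S/N is selected as optimal.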

  15. Factors associated with quality of life in patients undergoing coronary angioplasty

    PubMed Central

    Darvishpour, Azar; Javadi-Pashaki, Nazila; Salari, Arsalan; Sadeghi, Tahere; Taleshan-Nejad, Marayam

    2017-01-01

    Objective: Percutaneous coronary intervention has been effective in increasing the longevity of patients with cardiovascular disease. However, the evidence shows that the quality of life after the intervention is still lower than the optimal level. The quality of life can be affected by various factors. The aim of this study is to determine the quality of life and its related factors in patients undergoing coronary angioplasty. Methods: This cross-sectional study was performed on 106 patients undergoing coronary angioplasty during 2015-2016. The study population included all patients referred to a cardiac clinic in Rasht, Iran, who were at least 3 months past their angioplasty. Patients who met the inclusion criteria and were willing to participate in the study were selected consecutively. The research tools were a self-structured questionnaire on factors associated with the quality of life and the MacNew quality of life questionnaire. Data were collected by interviewing patients and reviewing their medical records. Data analysis was conducted using descriptive and inferential statistics. Results: The results of multivariate linear regression analysis showed that the independent variables of age (P = 0.0001), the number of diseased vessels (P = 0.0001), and the number of comorbidities (P < 0.05) were the most important factors associated with the quality of life. Conclusion: Health-care professionals can play an effective role in promoting the quality of life of patients undergoing coronary angioplasty by modifying lifestyle based on the related factors and providing comprehensive care programs, especially for the elderly. PMID:29085266

  16. Effect of optimal daily fertigation on migration of water and salt in soil, root growth and fruit yield of cucumber (Cucumis sativus L.) in solar-greenhouse.

    PubMed

    Liang, Xinshu; Gao, Yinan; Zhang, Xiaoying; Tian, Yongqiang; Zhang, Zhenxian; Gao, Lihong

    2014-01-01

    Inappropriate and excessive irrigation and fertilization have led to declines in crop yields and in water and fertilizer use efficiency in intensive vegetable production systems in China. For many vegetables, fertigation can be applied daily according to the actual water and nutrient requirements of crops. A greenhouse study was therefore conducted to investigate the effect of daily fertigation on the migration of water and salt in soil, and on the root growth and fruit yield of cucumber. The treatments included conventional interval fertigation, optimal interval fertigation and optimal daily fertigation. Generally, although soil under the optimal interval fertigation treatment received much less fertilizer than soil under conventional interval fertigation, optimal interval fertigation did not statistically decrease the economic yield and fruit nutrition quality of cucumber when compared to conventional interval fertigation. In addition, the optimal interval fertigation treatment effectively avoided inorganic nitrogen accumulation in soil and significantly (P<0.05) increased the partial factor productivity of applied nitrogen by 88% and 209% in the early-spring and autumn-winter seasons, respectively, when compared to conventional interval fertigation. Although soils under the optimal interval fertigation and optimal daily fertigation treatments received the same amount of fertilizers, the optimal daily fertigation treatment maintained relatively stable water, electrical conductivity and mineral nitrogen levels in surface soils, promoted fine root (<1.5 mm diameter) growth of cucumber, and eventually increased cucumber economic yield by 6.2% and 8.3% and the partial factor productivity of applied nitrogen by 55% and 75% in the early-spring and autumn-winter seasons, respectively, when compared to the optimal interval fertigation treatment. These results suggest that optimal daily fertigation is a beneficial practice for improving crop yield and water and fertilizer use efficiency in solar greenhouses.

  17. Effect of Optimal Daily Fertigation on Migration of Water and Salt in Soil, Root Growth and Fruit Yield of Cucumber (Cucumis sativus L.) in Solar-Greenhouse

    PubMed Central

    Liang, Xinshu; Gao, Yinan; Zhang, Xiaoying; Tian, Yongqiang; Zhang, Zhenxian; Gao, Lihong

    2014-01-01

    Inappropriate and excessive irrigation and fertilization have led to declines in crop yields and in water and fertilizer use efficiency in intensive vegetable production systems in China. For many vegetables, fertigation can be applied daily according to the actual water and nutrient requirements of crops. A greenhouse study was therefore conducted to investigate the effect of daily fertigation on the migration of water and salt in soil, and on the root growth and fruit yield of cucumber. The treatments included conventional interval fertigation, optimal interval fertigation and optimal daily fertigation. Generally, although soil under the optimal interval fertigation treatment received much less fertilizer than soil under conventional interval fertigation, optimal interval fertigation did not statistically decrease the economic yield and fruit nutrition quality of cucumber when compared to conventional interval fertigation. In addition, the optimal interval fertigation treatment effectively avoided inorganic nitrogen accumulation in soil and significantly (P<0.05) increased the partial factor productivity of applied nitrogen by 88% and 209% in the early-spring and autumn-winter seasons, respectively, when compared to conventional interval fertigation. Although soils under the optimal interval fertigation and optimal daily fertigation treatments received the same amount of fertilizers, the optimal daily fertigation treatment maintained relatively stable water, electrical conductivity and mineral nitrogen levels in surface soils, promoted fine root (<1.5 mm diameter) growth of cucumber, and eventually increased cucumber economic yield by 6.2% and 8.3% and the partial factor productivity of applied nitrogen by 55% and 75% in the early-spring and autumn-winter seasons, respectively, when compared to the optimal interval fertigation treatment. These results suggest that optimal daily fertigation is a beneficial practice for improving crop yield and water and fertilizer use efficiency in solar greenhouses. PMID:24475204

  18. Optimal allocation of testing resources for statistical simulations

    NASA Astrophysics Data System (ADS)

    Quintana, Carolina; Millwater, Harry R.; Singh, Gulshan; Golden, Patrick

    2015-07-01

    Statistical estimates from simulation involve uncertainty caused by the variability in the input random variables due to limited data. Allocating resources to obtain more experimental data of the input variables to better characterize their probability distributions can reduce the variance of statistical estimates. The methodology proposed determines the optimal number of additional experiments required to minimize the variance of the output moments given single or multiple constraints. The method uses multivariate t-distribution and Wishart distribution to generate realizations of the population mean and covariance of the input variables, respectively, given an amount of available data. This method handles independent and correlated random variables. A particle swarm method is used for the optimization. The optimal number of additional experiments per variable depends on the number and variance of the initial data, the influence of the variable in the output function and the cost of each additional experiment. The methodology is demonstrated using a fretting fatigue example.
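
    A minimal sketch of the resampling idea described above, under simplifying assumptions: given n observations of the input variables, realizations of the population mean are drawn from a multivariate t-distribution centred on the sample mean and realizations of the covariance from a Wishart distribution scaled by the sample covariance. The data, dimensionality, and distribution parameterization below are illustrative placeholders rather than the paper's exact formulation.

```python
import numpy as np
from scipy.stats import multivariate_t, wishart

rng = np.random.default_rng(1)

# Hypothetical initial data: n observations of two correlated input variables.
n = 15
data = rng.multivariate_normal([10.0, 0.5], [[4.0, 0.6], [0.6, 0.09]], size=n)
xbar = data.mean(axis=0)
S = np.cov(data, rowvar=False)

# Realizations of the population mean: multivariate t with shape S/n and n-1 dof.
mean_draws = multivariate_t(loc=xbar, shape=S / n, df=n - 1).rvs(
    size=1000, random_state=rng)

# Realizations of the population covariance: Wishart with n-1 dof and scale S/(n-1).
cov_draws = wishart(df=n - 1, scale=S / (n - 1)).rvs(size=1000, random_state=rng)

print("spread of mean realizations:", mean_draws.std(axis=0))
print("average covariance realization:\n", cov_draws.mean(axis=0))
```

    Repeating the output-moment calculation over such realizations shows how much of the output variance is due to the limited input data, which is the quantity the allocation scheme seeks to reduce.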

  19. [The application of operating room quality backward system in instrument place management].

    PubMed

    Du, Hui; He, Anjie; Zeng, Leilei

    2010-09-01

    The aim was to improve the cleaning quality of surgical instruments, optimize the preparation workflow, arrange work groups reasonably, and raise working efficiency. We applied the operating room quality backward (traceability) system to the problems of instrument cleaning, packing, and preparation, carried out analysis and optimization, and evaluated the effect after a six-month trial. After the workflow was optimized, instrument cleaning quality improved markedly; defects in instrument cleaning and packing, as well as the total operating time, were reduced; conflicts between nurses and cleaning staff arising from unclear hand-overs decreased; and the satisfaction of nurses and doctors with the instruments increased. Applying the operating room quality backward system to the management of instrument cleaning, packing, and preparation can reduce defects in the work and the waste of human resources, and raise working efficiency.

  20. The marriage problem and the fate of bachelors

    NASA Astrophysics Data System (ADS)

    Nieuwenhuizen, Th. M.

    In the marriage problem, a variant of the bipartite matching problem, each member has a "wish-list" expressing his/her preference for all possible partners; this list consists of random, positive real numbers drawn from a certain distribution. One searches for the lowest cost to society, at the risk of breaking up pairs in the course of time. Minimization of a global cost function (Hamiltonian) is performed with statistical mechanics techniques at a finite fictitious temperature. The problem is generalized to include bachelors, needed in particular when the groups have different sizes, and polygamy. Exact solutions are found for the optimal solution (T = 0). The entropy is found to vanish quadratically in T. Also, other evidence is found that the replica symmetric solution is exact, implying at most a polynomial degeneracy of the optimal solution. Whether bachelors occur or not depends not only on their intrinsic qualities, or lack thereof, but also on global aspects of the chance for pair formation in society.
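
    The zero-temperature (optimal) matching discussed above can be illustrated with a short sketch: draw random wish-list costs for N members of each group and find the pairing that minimizes the total cost with the Hungarian algorithm. The uniform cost distribution, problem size, and absence of bachelors are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(2)
N = 200
# cost[i, j]: combined "unhappiness" of pairing member i of one group with
# member j of the other, drawn i.i.d. from a uniform distribution.
cost = rng.uniform(0.0, 1.0, size=(N, N))

rows, cols = linear_sum_assignment(cost)     # optimal bipartite matching (T = 0)
optimal_cost = cost[rows, cols].sum()

# For i.i.d. uniform(0, 1) costs, the expected optimal total cost of the random
# assignment problem approaches pi^2 / 6 ~ 1.645 as N grows.
print(f"optimal total cost: {optimal_cost:.3f}")
```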

  1. Information-Theoretic Assessment of Sampled Imaging Systems

    NASA Technical Reports Server (NTRS)

    Huck, Friedrich O.; Alter-Gartenberg, Rachel; Park, Stephen K.; Rahman, Zia-ur

    1999-01-01

    By rigorously extending modern communication theory to the assessment of sampled imaging systems, we develop the formulations that are required to optimize the performance of these systems within the critical constraints of image gathering, data transmission, and image display. The goal of this optimization is to produce images with the best possible visual quality for the wide range of statistical properties of the radiance field of natural scenes that one normally encounters. Extensive computational results are presented to assess the performance of sampled imaging systems in terms of information rate, theoretical minimum data rate, and fidelity. Comparisons of this assessment with perceptual and measurable performance demonstrate that (1) the information rate that a sampled imaging system conveys from the captured radiance field to the observer is closely correlated with the fidelity, sharpness and clarity with which the observed images can be restored and (2) the associated theoretical minimum data rate is closely correlated with the lowest data rate with which the acquired signal can be encoded for efficient transmission.

  2. A Novel Flexible Inertia Weight Particle Swarm Optimization Algorithm.

    PubMed

    Amoshahy, Mohammad Javad; Shamsi, Mousa; Sedaaghi, Mohammad Hossein

    2016-01-01

    Particle swarm optimization (PSO) is an evolutionary computing method based on the intelligent collective behavior of some animals. It is easy to implement and has few parameters to adjust. The performance of the PSO algorithm depends greatly on appropriate strategies for fine-tuning its parameters. The inertia weight (IW) is one of the PSO parameters used to balance the exploration and exploitation characteristics of PSO. This paper proposes a new nonlinear strategy for selecting the inertia weight, named the Flexible Exponential Inertia Weight (FEIW) strategy, because for each problem an increasing or decreasing inertia weight schedule can be constructed with a suitable choice of parameters. The efficacy and efficiency of the PSO algorithm with the FEIW strategy (FEPSO) are validated on a suite of benchmark problems with different dimensions. FEIW is also compared with the best time-varying, adaptive, constant and random inertia weights. Experimental results and statistical analysis show that FEIW improves the search performance in terms of solution quality as well as convergence rate.
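
    A minimal PSO sketch with an exponentially varying inertia weight, illustrating the kind of flexible schedule discussed above. The schedule w(t) = w_end + (w_start - w_end) * exp(-a * t / T), its parameters, and the sphere benchmark are illustrative assumptions, not the exact FEIW formulation from the paper.

```python
import numpy as np

def sphere(x):
    """Benchmark objective: f(x) = sum(x_i^2), minimum 0 at the origin."""
    return np.sum(x ** 2, axis=-1)

def pso(f, dim=10, n_particles=30, iters=200, w_start=0.9, w_end=0.4, a=3.0,
        c1=2.0, c2=2.0, bounds=(-5.0, 5.0), seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))
    v = np.zeros_like(x)
    pbest, pbest_val = x.copy(), f(x)
    gbest = pbest[np.argmin(pbest_val)].copy()
    for t in range(iters):
        # Exponentially decreasing inertia weight (illustrative schedule).
        w = w_end + (w_start - w_end) * np.exp(-a * t / iters)
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        val = f(x)
        improved = val < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], val[improved]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest, pbest_val.min()

best_x, best_val = pso(sphere)
print("best objective value found:", best_val)
```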

  3. A Novel Flexible Inertia Weight Particle Swarm Optimization Algorithm

    PubMed Central

    Shamsi, Mousa; Sedaaghi, Mohammad Hossein

    2016-01-01

    Particle swarm optimization (PSO) is an evolutionary computing method based on the intelligent collective behavior of some animals. It is easy to implement and has few parameters to adjust. The performance of the PSO algorithm depends greatly on appropriate strategies for fine-tuning its parameters. The inertia weight (IW) is one of the PSO parameters used to balance the exploration and exploitation characteristics of PSO. This paper proposes a new nonlinear strategy for selecting the inertia weight, named the Flexible Exponential Inertia Weight (FEIW) strategy, because for each problem an increasing or decreasing inertia weight schedule can be constructed with a suitable choice of parameters. The efficacy and efficiency of the PSO algorithm with the FEIW strategy (FEPSO) are validated on a suite of benchmark problems with different dimensions. FEIW is also compared with the best time-varying, adaptive, constant and random inertia weights. Experimental results and statistical analysis show that FEIW improves the search performance in terms of solution quality as well as convergence rate. PMID:27560945

  4. Optimized water vapor permeability of sodium alginate films using response surface methodology

    NASA Astrophysics Data System (ADS)

    Zhang, Qing; Xu, Jiachao; Gao, Xin; Fu, Xiaoting

    2013-11-01

    The water vapor permeability (WVP) of films is important when developing pharmaceutical applications. Films are frequently used as coatings, and as such directly influence the quality of the medicine. The optimization of processing conditions for sodium alginate films was investigated using response surface methodology. Single-factor tests and a Box-Behnken experimental design were employed. WVP was selected as the response variable, and the operating parameters for the single-factor tests were sodium alginate concentration, carboxymethyl cellulose (CMC) concentration and CaCl2 solution immersion time. The coefficient of determination (R2) was 0.97, indicating statistical significance. A minimal WVP of 0.3898 g·mm/(m2·h·kPa) was achieved under the optimum conditions. These were a sodium alginate concentration of 8.04%, a CMC concentration of 0.13%, and a CaCl2 solution immersion time of 12 min. This provides a reference for potential applications in manufacturing film-coated hard capsule shells.

  5. Receptor arrays optimized for natural odor statistics.

    PubMed

    Zwicker, David; Murugan, Arvind; Brenner, Michael P

    2016-05-17

    Natural odors typically consist of many molecules at different concentrations. It is unclear how the numerous odorant molecules and their possible mixtures are discriminated by relatively few olfactory receptors. Using an information theoretic model, we show that a receptor array is optimal for this task if it achieves two possibly conflicting goals: (i) Each receptor should respond to half of all odors and (ii) the response of different receptors should be uncorrelated when averaged over odors presented with natural statistics. We use these design principles to predict statistics of the affinities between receptors and odorant molecules for a broad class of odor statistics. We also show that optimal receptor arrays can be tuned to either resolve concentrations well or distinguish mixtures reliably. Finally, we use our results to predict properties of experimentally measured receptor arrays. Our work can thus be used to better understand natural olfaction, and it also suggests ways to improve artificial sensor arrays.
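
    The two design principles above can be checked numerically with a small sketch: draw a random receptor-odorant sensitivity matrix, simulate sparse odor mixtures, and measure how often each receptor is active and how correlated the receptor responses are. The log-normal sensitivities, sparse mixture statistics, and binary threshold response model are illustrative assumptions, not the paper's exact model.

```python
import numpy as np

rng = np.random.default_rng(3)
n_receptors, n_odorants, n_odors = 20, 200, 5000

# Illustrative log-normal receptor-odorant sensitivities.
S = rng.lognormal(mean=0.0, sigma=2.0, size=(n_receptors, n_odorants))

# Sparse odor mixtures: on average 5 odorants present, at random concentrations.
present = rng.random((n_odors, n_odorants)) < 5 / n_odorants
conc = rng.lognormal(0.0, 1.0, size=(n_odors, n_odorants)) * present

# Binary response: a receptor is active if its total excitation exceeds a threshold.
excitation = conc @ S.T
active = excitation > 1.0

activity = active.mean(axis=0)              # design goal (i): close to 0.5 per receptor
corr = np.corrcoef(active, rowvar=False)    # design goal (ii): small off-diagonal values
off_diag = corr[~np.eye(n_receptors, dtype=bool)]
print("mean activation probability:", round(float(activity.mean()), 2))
print("mean |pairwise response correlation|:", round(float(np.abs(off_diag).mean()), 3))
```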

  6. Study of the Effect of Lubricant Emulsion Percentage and Tool Material on Surface Roughness in Machining of EN-AC 48000 Alloy

    NASA Astrophysics Data System (ADS)

    Soltani, E.; Shahali, H.; Zarepour, H.

    2011-01-01

    In this paper, the effect of machining parameters, namely lubricant emulsion percentage and tool material, on surface roughness has been studied in the machining of EN-AC 48000 aluminum alloy. EN-AC 48000 is an industrially important alloy, and its machining is challenging due to built-up edge and tool wear. An L9 Taguchi standard orthogonal array has been applied as the experimental design to investigate the effect of the factors and their interaction. Nine machining tests have been carried out with three random replications, resulting in 27 experiments. Three types of cutting tools, including coated carbide (CD1810), uncoated carbide (H10), and polycrystalline diamond (CD10), have been used in this research. The lubricant emulsion percentage was selected at three levels: 3%, 5% and 10%. Statistical analysis has been employed to study the effect of the factors and their interactions using the ANOVA method. Moreover, the optimal factor levels have been obtained through signal-to-noise (S/N) ratio analysis. Also, a regression model has been provided to predict the surface roughness. Finally, the results of confirmation tests have been presented to verify the adequacy of the predictive model. In this research, surface quality was improved by 9% using lubricant and the statistical optimization method.

  7. 3Drefine: an interactive web server for efficient protein structure refinement.

    PubMed

    Bhattacharya, Debswapna; Nowotny, Jackson; Cao, Renzhi; Cheng, Jianlin

    2016-07-08

    3Drefine is an interactive web server for consistent and computationally efficient protein structure refinement with the capability to perform web-based statistical and visual analysis. The 3Drefine refinement protocol utilizes iterative optimization of the hydrogen-bonding network combined with atomic-level energy minimization of the optimized model using composite physics- and knowledge-based force fields for efficient protein structure refinement. The method has been extensively evaluated on blind CASP experiments as well as on large-scale and diverse benchmark datasets and exhibits consistent improvement over the initial structure in both global and local structural quality measures. The 3Drefine web server allows for convenient protein structure refinement through text or file input submission, email notification, and a provided example submission, and is freely available without any registration requirement. The server also provides comprehensive analysis of submissions through various energy and statistical feedback and interactive visualization of multiple refined models through the JSmol applet that is equipped with numerous protein model analysis tools. The web server has been extensively tested and used by many users. As a result, the 3Drefine web server conveniently provides a useful tool easily accessible to the community. The 3Drefine web server has been made publicly available at the URL: http://sysbio.rnet.missouri.edu/3Drefine/. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  8. Evaluating the quality of NMR structures by local density of protons.

    PubMed

    Ban, Yih-En Andrew; Rudolph, Johannes; Zhou, Pei; Edelsbrunner, Herbert

    2006-03-01

    Evaluating the quality of experimentally determined protein structural models is an essential step toward identifying potential errors and guiding further structural refinement. Herein, we report the use of proton local density as a sensitive measure to assess the quality of nuclear magnetic resonance (NMR) structures. Using 256 high-resolution crystal structures with protons added and optimized, we show that the local density of different proton types display distinct distributions. These distributions can be characterized by statistical moments and are used to establish local density Z-scores for evaluating both global and local packing for individual protons. Analysis of 546 crystal structures at various resolutions shows that the local density Z-scores increase as the structural resolution decreases and correlate well with the ClashScore (Word et al. J Mol Biol 1999;285(4):1711-1733) generated by all atom contact analysis. Local density Z-scores for NMR structures exhibit a significantly wider range of values than for X-ray structures and demonstrate a combination of potentially problematic inflation and compression. Water-refined NMR structures show improved packing quality. Our analysis of a high-quality structural ensemble of ubiquitin refined against order parameters shows proton density distributions that correlate nearly perfectly with our standards derived from crystal structures, further validating our approach. We present an automated analysis and visualization tool for proton packing to evaluate the quality of NMR structures. 2005 Wiley-Liss, Inc.

  9. Interactions of donor sources and media influence the histo-morphological quality of full-thickness skin models.

    PubMed

    Lange, Julia; Weil, Frederik; Riegler, Christoph; Groeber, Florian; Rebhan, Silke; Kurdyn, Szymon; Alb, Miriam; Kneitz, Hermann; Gelbrich, Götz; Walles, Heike; Mielke, Stephan

    2016-10-01

    Human artificial skin models are increasingly employed as non-animal test platforms for research and medical purposes. However, the overall histopathological quality of such models may vary significantly. Therefore, the effects of manufacturing protocols and donor sources on the quality of skin models built-up from fibroblasts and keratinocytes derived from juvenile foreskins is studied. Histo-morphological parameters such as epidermal thickness, number of epidermal cell layers, dermal thickness, dermo-epidermal adhesion and absence of cellular nuclei in the corneal layer are obtained and scored accordingly. In total, 144 full-thickness skin models derived from 16 different donors, built-up in triplicates using three different culture conditions were successfully generated. In univariate analysis both media and donor age affected the quality of skin models significantly. Both parameters remained statistically significant in multivariate analyses. Performing general linear model analyses we could show that individual medium-donor-interactions influence the quality. These observations suggest that the optimal choice of media may differ from donor to donor and coincides with findings where significant inter-individual variations of growth rates in keratinocytes and fibroblasts have been described. Thus, the consideration of individual medium-donor-interactions may improve the overall quality of human organ models thereby forming a reproducible test platform for sophisticated clinical research. Copyright © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Near infrared spectrometric technique for testing fruit quality: optimisation of regression models using genetic algorithms

    NASA Astrophysics Data System (ADS)

    Isingizwe Nturambirwe, J. Frédéric; Perold, Willem J.; Opara, Umezuruike L.

    2016-02-01

    Near infrared (NIR) spectroscopy has gained extensive use in quality evaluation. It is arguably one of the most advanced spectroscopic tools in non-destructive quality testing of foodstuffs, from measurement to data analysis and interpretation. NIR spectral data are interpreted through means often involving multivariate statistical analysis, sometimes associated with optimisation techniques for model improvement. The objective of this research was to explore the extent to which genetic algorithms (GA) can be used to enhance model development for predicting fruit quality. Apple fruits were used, and NIR spectra in the range from 12000 to 4000 cm-1 were acquired on both bruised and healthy tissues, with different degrees of mechanical damage. GAs were used in combination with partial least squares regression methods to develop bruise severity prediction models, and compared to PLS models developed using the full NIR spectrum. A classification model was developed, which clearly separated bruised from unbruised apple tissue. GAs helped improve prediction models by over 10%, in comparison with full spectrum-based models, as evaluated in terms of error of prediction (Root Mean Square Error of Cross-validation). PLS models to predict internal quality, such as sugar content and acidity, were developed and compared to the versions optimized by genetic algorithm. Overall, the results highlighted the potential of the GA method to improve the speed and accuracy of fruit quality prediction.

  11. The impact on midlevel vision of statistically optimal divisive normalization in V1

    PubMed Central

    Coen-Cagli, Ruben; Schwartz, Odelia

    2013-01-01

    The first two areas of the primate visual cortex (V1, V2) provide a paradigmatic example of hierarchical computation in the brain. However, neither the functional properties of V2 nor the interactions between the two areas are well understood. One key aspect is that the statistics of the inputs received by V2 depend on the nonlinear response properties of V1. Here, we focused on divisive normalization, a canonical nonlinear computation that is observed in many neural areas and modalities. We simulated V1 responses with (and without) different forms of surround normalization derived from statistical models of natural scenes, including canonical normalization and a statistically optimal extension that accounted for image nonhomogeneities. The statistics of the V1 population responses differed markedly across models. We then addressed how V2 receptive fields pool the responses of V1 model units with different tuning. We assumed this is achieved by learning without supervision a linear representation that removes correlations, which could be accomplished with principal component analysis. This approach revealed V2-like feature selectivity when we used the optimal normalization and, to a lesser extent, the canonical one but not in the absence of both. We compared the resulting two-stage models on two perceptual tasks; while models encompassing V1 surround normalization performed better at object recognition, only statistically optimal normalization provided systematic advantages in a task more closely matched to midlevel vision, namely figure/ground judgment. Our results suggest that experiments probing midlevel areas might benefit from using stimuli designed to engage the computations that characterize V1 optimality. PMID:23857950
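
    A minimal sketch of the canonical divisive normalization referred to above: each unit's squared linear response is divided by a semisaturation constant plus a weighted sum of squared responses in its normalization pool. The filter outputs, uniform pool weights, and sigma value are illustrative placeholders; the statistically optimal variant in the paper additionally gates the surround contribution by inferred image homogeneity, which is not modelled here.

```python
import numpy as np

def divisive_normalization(linear_responses, pool_weights, sigma=0.1):
    """Canonical normalization: r_i = L_i^2 / (sigma^2 + sum_j w_ij * L_j^2)."""
    L2 = linear_responses ** 2
    return L2 / (sigma ** 2 + pool_weights @ L2)

rng = np.random.default_rng(6)
n_units = 12
L = rng.normal(0.0, 1.0, n_units)           # hypothetical linear filter outputs

# Normalization pool: uniform weights over all other units (illustrative choice).
W = (np.ones((n_units, n_units)) - np.eye(n_units)) / (n_units - 1)

print("normalized responses:", np.round(divisive_normalization(L, W), 3))
```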

  12. Automation method to identify the geological structure of seabed using spatial statistic analysis of echo sounding data

    NASA Astrophysics Data System (ADS)

    Kwon, O.; Kim, W.; Kim, J.

    2017-12-01

    Recently, construction of subsea tunnels has increased globally. For the safe construction of a subsea tunnel, identifying geological structures, including faults, at the design and construction stages is critically important. Unlike tunnels on land, it is very difficult to obtain data on the geological structure because of the limits of geological surveys at sea. This study addresses these difficulties by developing a technology to identify the geological structure of the seabed automatically using echo sounding data. When investigating a potential site for a deep subsea tunnel, boreholes and geophysical investigations face technical and economic limits. In contrast, echo sounding data are easily obtainable, and their reliability is higher compared with the above approaches. This study is aimed at developing an algorithm that identifies large-scale geological structures of the seabed using a geostatistical approach, based on the structural-geology principle that topographic features indicate geological structure. The basic concept of the algorithm is outlined as follows: (1) convert the seabed topography to grid data using echo sounding data, (2) apply a moving window of optimal size to the grid data, (3) estimate the spatial statistics of the grid data in the window area, (4) set a percentile standard for the spatial statistics, (5) display the values satisfying the standard on the map, (6) visualize the geological structure on the map. The important elements in this study include the optimal size of the moving window, the choice of optimal spatial statistics and the determination of the optimal percentile standard. To determine these optimal elements, numerous simulations were performed. Finally, a user program based on R was developed using the optimal analysis algorithm. The user program was designed to show the variations of various spatial statistics, allowing easy analysis of the geological structure as the spatial statistics vary, by letting the user designate the type of spatial statistic and the percentile standard. This research was supported by the Korea Agency for Infrastructure Technology Advancement under the Ministry of Land, Infrastructure and Transport of the Korean government. (Project Number: 13 Construction Research T01)
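
    The workflow sketched in steps (1)-(6) above can be illustrated compactly; the paper describes an R implementation, so the Python sketch below is only an analogue, and the synthetic bathymetry grid, 7-cell window, choice of local standard deviation as the spatial statistic, and 95th-percentile threshold are all assumptions.

```python
import numpy as np
from scipy.ndimage import generic_filter

rng = np.random.default_rng(4)

# Synthetic bathymetry grid (m): smooth trend, a fault-like step, and noise.
ny, nx = 200, 200
yy, xx = np.mgrid[0:ny, 0:nx]
depth = -50.0 - 0.05 * xx + rng.normal(0.0, 0.2, (ny, nx))
depth -= 5.0 * (xx > yy)                        # fault-like linear offset

# Steps (2)-(3): moving-window local standard deviation as the spatial statistic.
window = 7                                      # assumed window size (grid cells)
local_std = generic_filter(depth, np.std, size=window)

# Steps (4)-(6): flag and map cells above a percentile threshold of the statistic.
threshold = np.percentile(local_std, 95)        # assumed percentile standard
structure_mask = local_std > threshold
print("fraction of grid flagged as structure:", round(float(structure_mask.mean()), 3))
```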

  13. Spatially-Distributed Cost–Effectiveness Analysis Framework to Control Phosphorus from Agricultural Diffuse Pollution

    PubMed Central

    Geng, Runzhe; Wang, Xiaoyan; Sharpley, Andrew N.; Meng, Fande

    2015-01-01

    Best management practices (BMPs) for agricultural diffuse pollution control are implemented at the field or small-watershed scale. However, quantifying the benefits of BMP implementation for receiving water quality at multiple spatial scales is an ongoing challenge. In this paper, we introduce an integrated approach that combines risk assessment (i.e., the Phosphorus (P) index), model simulation techniques (Hydrological Simulation Program–FORTRAN), and a BMP placement tool at various scales to identify the optimal locations for implementing multiple BMPs and to estimate BMP effectiveness after implementation. A statistically significant decrease in nutrient discharge from watersheds is proposed to evaluate the effectiveness of BMPs strategically targeted within watersheds. Specifically, we estimate two types of cost-effectiveness curves (total pollution reduction and proportion of watersheds improved) for four allocation approaches. Selection of a "best approach" depends on the relative importance of the two types of effectiveness, which involves a value judgment based on the random/aggregated degree of BMP distribution among and within sub-watersheds. A statistical optimization framework is developed and evaluated in the Chaohe River Watershed, located in the northern mountain area of Beijing. Results show that BMP implementation significantly (p < 0.001) decreases P loss from the watershed. Remedial strategies in which BMPs were targeted to areas at high risk of P loss decreased P loads compared with strategies in which BMPs were randomly located across watersheds. Sensitivity analysis indicated that aggregated BMP placement within particular watersheds is the most cost-effective scenario for decreasing P loss. The optimization approach outlined in this paper is a spatially hierarchical method for targeting nonpoint source controls across a range of scales from field to farm, to watersheds, to regions. Further, model estimates showed that targeting at multiple scales is necessary to optimize program efficiency. The integrated model approach described here, which selects and places BMPs at varying levels of implementation, provides a new theoretical basis and technical guidance for diffuse pollution management in agricultural watersheds. PMID:26313561

  14. Spatially-Distributed Cost-Effectiveness Analysis Framework to Control Phosphorus from Agricultural Diffuse Pollution.

    PubMed

    Geng, Runzhe; Wang, Xiaoyan; Sharpley, Andrew N; Meng, Fande

    2015-01-01

    Best management practices (BMPs) for agricultural diffuse pollution control are implemented at the field or small-watershed scale. However, quantifying the benefits of BMP implementation for receiving water quality at multiple spatial scales is an ongoing challenge. In this paper, we introduce an integrated approach that combines risk assessment (i.e., the Phosphorus (P) index), model simulation techniques (Hydrological Simulation Program-FORTRAN), and a BMP placement tool at various scales to identify the optimal locations for implementing multiple BMPs and to estimate BMP effectiveness after implementation. A statistically significant decrease in nutrient discharge from watersheds is proposed to evaluate the effectiveness of BMPs strategically targeted within watersheds. Specifically, we estimate two types of cost-effectiveness curves (total pollution reduction and proportion of watersheds improved) for four allocation approaches. Selection of a "best approach" depends on the relative importance of the two types of effectiveness, which involves a value judgment based on the random/aggregated degree of BMP distribution among and within sub-watersheds. A statistical optimization framework is developed and evaluated in the Chaohe River Watershed, located in the northern mountain area of Beijing. Results show that BMP implementation significantly (p < 0.001) decreases P loss from the watershed. Remedial strategies in which BMPs were targeted to areas at high risk of P loss decreased P loads compared with strategies in which BMPs were randomly located across watersheds. Sensitivity analysis indicated that aggregated BMP placement within particular watersheds is the most cost-effective scenario for decreasing P loss. The optimization approach outlined in this paper is a spatially hierarchical method for targeting nonpoint source controls across a range of scales from field to farm, to watersheds, to regions. Further, model estimates showed that targeting at multiple scales is necessary to optimize program efficiency. The integrated model approach described here, which selects and places BMPs at varying levels of implementation, provides a new theoretical basis and technical guidance for diffuse pollution management in agricultural watersheds.

  15. Evaluating performances of simplified physically based landslide susceptibility models.

    NASA Astrophysics Data System (ADS)

    Capparelli, Giovanna; Formetta, Giuseppe; Versace, Pasquale

    2015-04-01

    Rainfall-induced shallow landslides cause significant damage involving loss of life and property. Predicting locations susceptible to shallow landslides is a complex task that involves many disciplines: hydrology, geotechnical science, geomorphology, and statistics. Usually, two main approaches are used to accomplish this task: statistical or physically based models. This paper presents a package of GIS-based models for landslide susceptibility analysis. It was integrated into the NewAge-JGrass hydrological model using the Object Modeling System (OMS) modeling framework. The package includes three simplified physically based models for landslide susceptibility analysis (M1, M2, and M3) and a component for model verification. It computes eight goodness-of-fit (GOF) indices by comparing model results and measurement data pixel by pixel. Moreover, the package integration in NewAge-JGrass allows the use of other components such as geographic information system tools to manage input-output processes, and automatic calibration algorithms to estimate model parameters. The system offers the possibility to investigate and fairly compare the quality and robustness of models and model parameters according to a procedure that includes: i) model parameter estimation by optimizing each GOF index separately, ii) model evaluation in the ROC plane using each optimal parameter set, and iii) GOF robustness evaluation by assessing sensitivity to input parameter variation. This procedure was repeated for all three models. The system was applied to a case study in Calabria (Italy) along the Salerno-Reggio Calabria highway, between Cosenza and the Altilia municipality. The analysis showed that, among all the optimized indices and all three models, Average Index (AI) optimization coupled with model M3 is the best modeling solution for our test case. This research was funded by PON Project No. 01_01503 "Integrated Systems for Hydrogeological Risk Monitoring, Early Warning and Mitigation Along the Main Lifelines", CUP B31H11000370005, in the framework of the National Operational Program for "Research and Competitiveness" 2007-2013.

  16. Main Quality Attributes of Monoclonal Antibodies and Effect of Cell Culture Components

    PubMed

    Torkashvand, Fatemeh; Vaziri, Behrouz

    2017-05-01

    Culture media optimization is an inevitable part of upstream process development in the production of therapeutic monoclonal antibodies (mAbs). The quality by design (QbD) approach defines the assured quality of the final product through the development stage. An important step in QbD is the determination of the main quality attributes. During media optimization, some of the main quality attributes, such as glycosylation pattern, charge variants, aggregates, and low-molecular-weight species, could be significantly altered. Here, we provide an overview of how cell culture medium components affect the main quality attributes of mAbs. Knowledge of the relationship between the culture media components and the main quality attributes can be utilized for a rational optimization of mammalian cell culture media for industrial mAb production.

  17. An innovative time-cost-quality tradeoff modeling of building construction project based on resource allocation.

    PubMed

    Hu, Wenfa; He, Xinhua

    2014-01-01

    Time, quality, and cost are three important but conflicting objectives in a building construction project. It is a tough challenge for project managers to optimize them since they are different parameters. This paper presents a time-cost-quality optimization model that enables managers to optimize multiple objectives. The model is based on the project breakdown structure method, in which task resources in a construction project are divided into a series of activities and further into construction labor, materials, equipment, and administration. The resources utilized in a construction activity eventually determine its construction time, cost, and quality, and a complex time-cost-quality trade-off model is generated based on correlations between construction activities. A genetic algorithm tool is applied in the model to solve the comprehensive nonlinear time-cost-quality problems. The building of a three-storey house is used as an example to illustrate the implementation of the model, demonstrate its advantages in optimizing the trade-off of construction time, cost, and quality, and help make a winning decision in construction practice. The computational time-cost-quality curves from the case study show that traditional cost-time assumptions are reasonable and that the time-cost-quality trade-off model is effective.

  18. Finite-size effect on optimal efficiency of heat engines.

    PubMed

    Tajima, Hiroyasu; Hayashi, Masahito

    2017-07-01

    The optimal efficiency of quantum (or classical) heat engines whose heat baths are n-particle systems is given by the strong large deviation. We give the optimal work extraction process as a concrete energy-preserving unitary time evolution among the heat baths and the work storage. We show that our optimal work extraction turns the disordered energy of the heat baths to the ordered energy of the work storage, by evaluating the ratio of the entropy difference to the energy difference in the heat baths and the work storage, respectively. By comparing the statistical mechanical optimal efficiency with the macroscopic thermodynamic bound, we evaluate the accuracy of the macroscopic thermodynamics with finite-size heat baths from the statistical mechanical viewpoint. We also evaluate the quantum coherence effect on the optimal efficiency of the cycle processes without restricting their cycle time by comparing the classical and quantum optimal efficiencies.

  19. Reporting quality of statistical methods in surgical observational studies: protocol for systematic review.

    PubMed

    Wu, Robert; Glen, Peter; Ramsay, Tim; Martel, Guillaume

    2014-06-28

    Observational studies dominate the surgical literature. Statistical adjustment is an important strategy to account for confounders in observational studies. Research has shown that published articles are often poor in statistical quality, which may jeopardize their conclusions. The Statistical Analyses and Methods in the Published Literature (SAMPL) guidelines have been published to help establish standards for statistical reporting. This study will seek to determine whether the quality of statistical adjustment and the reporting of these methods are adequate in surgical observational studies. We hypothesize that incomplete reporting will be found in all surgical observational studies, and that the quality and reporting of these methods will be lower in surgical journals when compared with medical journals. Finally, this work will seek to identify predictors of high-quality reporting. This work will examine the top five general surgical and medical journals, based on a 5-year impact factor (2007-2012). All observational studies investigating an intervention related to an essential component area of general surgery (defined by the American Board of Surgery), with an exposure, outcome, and comparator, will be included in this systematic review. Essential elements related to statistical reporting and quality were extracted from the SAMPL guidelines and include domains such as intent of analysis, primary analysis, multiple comparisons, numbers and descriptive statistics, association and correlation analyses, linear regression, logistic regression, Cox proportional hazard analysis, analysis of variance, survival analysis, propensity analysis, and independent and correlated analyses. Each article will be scored as a proportion based on fulfilling criteria in relevant analyses used in the study. A logistic regression model will be built to identify variables associated with high-quality reporting. A comparison will be made between the scores of surgical observational studies published in medical versus surgical journals. Secondary outcomes will pertain to individual domains of analysis. Sensitivity analyses will be conducted. This study will explore the reporting and quality of statistical analyses in surgical observational studies published in the most referenced surgical and medical journals in 2013 and examine whether variables (including the type of journal) can predict high-quality reporting.

  20. Software Analytical Instrument for Assessment of the Process of Casting Slabs

    NASA Astrophysics Data System (ADS)

    Franěk, Zdeněk; Kavička, František; Štětina, Josef; Masarik, Miloš

    2010-06-01

    The paper describes the original proposal of ways of solution and function of the program equipment for assessment of the process of casting slabs. The program system LITIOS was developed and implemented in EVRAZ Vitkovice Steel Ostrava on the equipment of continuous casting of steel (further only ECC). This program system works on the data warehouse of technological parameters of casting and quality parameters of slabs. It enables an ECC technologist to analyze the course of casting melt and with using statistics methods to set the influence of single technological parameters on the duality of final slabs. The system also enables long term monitoring and optimization of the production.

  1. Biotherapeutic formulation factors affecting metal leachables from stainless steel studied by design of experiments.

    PubMed

    Zhou, Shuxia; Evans, Brad; Schöneich, Christian; Singh, Satish K

    2012-03-01

    Trace amounts of metals are inevitably present in biotherapeutic products. They can arise from various sources. The impact of common formulation factors such as protein concentration, antioxidant, metal chelator concentration and type, surfactant, pH, and contact time with stainless steel on metal leachables was investigated by a design of experiments approach. Three major metal leachables, iron, chromium, and nickel were monitored by inductively coupled plasma-mass spectrometry. It was observed that among all the tested factors, contact time, metal chelator concentration, and protein concentration were statistically significant factors with higher temperature resulting in higher levels of leached metals. Within a pH range of 5.5-6.5, solution pH played a minor role for chromium leaching at 25°C. No statistically significant difference was observed due to type of chelator, presence of antioxidant, or surfactant. In order to optimize a biotherapeutic formulation to achieve a target drug product shelf life with acceptable quality, each formulation component must be evaluated for its impact.

  2. Learning the ideal observer for SKE detection tasks by use of convolutional neural networks (Cum Laude Poster Award)

    NASA Astrophysics Data System (ADS)

    Zhou, Weimin; Anastasio, Mark A.

    2018-03-01

    It has been advocated that task-based measures of image quality (IQ) should be employed to evaluate and optimize imaging systems. Task-based measures of IQ quantify the performance of an observer on a medically relevant task. The Bayesian Ideal Observer (IO), which employs complete statistical information of the object and noise, achieves the upper limit of the performance for a binary signal classification task. However, computing the IO performance is generally analytically intractable and can be computationally burdensome when Markov-chain Monte Carlo (MCMC) techniques are employed. In this paper, supervised learning with convolutional neural networks (CNNs) is employed to approximate the IO test statistics for a signal-known-exactly and background-known-exactly (SKE/BKE) binary detection task. The receiver operating characteristic (ROC) curve and the area under the ROC curve (AUC) are compared to those produced by the analytically computed IO. The advantages of the proposed supervised learning approach for approximating the IO are demonstrated.
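
    A minimal sketch of the supervised-learning idea described above: train a small CNN to output a scalar decision statistic for a signal-known-exactly detection task in white Gaussian noise. The architecture, image size, Gaussian signal, noise model, and training schedule are illustrative assumptions; for this simple SKE/BKE Gaussian case the ideal observer reduces to a matched filter, which makes the setup a convenient sanity check rather than a reproduction of the paper's experiments.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# SKE/BKE task: a known Gaussian signal in white Gaussian noise (illustrative assumptions).
H = W = 32
yy, xx = torch.meshgrid(torch.arange(H), torch.arange(W), indexing="ij")
signal = 0.8 * torch.exp(-((xx - W / 2) ** 2 + (yy - H / 2) ** 2) / (2 * 3.0 ** 2))

def make_batch(n):
    labels = torch.randint(0, 2, (n,))
    noise = torch.randn(n, 1, H, W)
    images = noise + labels.view(-1, 1, 1, 1) * signal
    return images, labels.float()

# Small CNN producing a scalar test statistic per image.
net = nn.Sequential(
    nn.Conv2d(1, 8, 5, padding=2), nn.ReLU(),
    nn.Conv2d(8, 8, 5, padding=2), nn.ReLU(),
    nn.Flatten(), nn.Linear(8 * H * W, 1),
)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

for step in range(500):
    x, y = make_batch(64)
    opt.zero_grad()
    loss = loss_fn(net(x).squeeze(1), y)
    loss.backward()
    opt.step()

# Empirical AUC of the learned test statistic on a fresh test set.
x, y = make_batch(2000)
with torch.no_grad():
    t = net(x).squeeze(1)
pos, neg = t[y == 1], t[y == 0]
auc = (pos.view(-1, 1) > neg.view(1, -1)).float().mean()
print("empirical AUC:", round(auc.item(), 3))
```

    With a binary cross-entropy loss and equal class priors, the sigmoid of the network output approximates the posterior probability of signal presence, which is a monotonic transformation of the likelihood ratio used by the ideal observer.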

  3. Investigation of statistical iterative reconstruction for dedicated breast CT

    PubMed Central

    Makeev, Andrey; Glick, Stephen J.

    2013-01-01

    Purpose: Dedicated breast CT has great potential for improving the detection and diagnosis of breast cancer. Statistical iterative reconstruction (SIR) in dedicated breast CT is a promising alternative to traditional filtered backprojection (FBP). One of the difficulties in using SIR is the presence of free parameters in the algorithm that control the appearance of the resulting image. These parameters require tuning in order to achieve high quality reconstructions. In this study, the authors investigated the penalized maximum likelihood (PML) method with two commonly used types of roughness penalty functions: hyperbolic potential and anisotropic total variation (TV) norm. Reconstructed images were compared with images obtained using standard FBP. Optimal parameters for PML with the hyperbolic prior are reported for the task of detecting microcalcifications embedded in breast tissue. Methods: Computer simulations were used to acquire projections in a half-cone beam geometry. The modeled setup describes a realistic breast CT benchtop system, with an x-ray spectra produced by a point source and an a-Si, CsI:Tl flat-panel detector. A voxelized anthropomorphic breast phantom with 280 μm microcalcification spheres embedded in it was used to model attenuation properties of the uncompressed woman's breast in a pendant position. The reconstruction of 3D images was performed using the separable paraboloidal surrogates algorithm with ordered subsets. Task performance was assessed with the ideal observer detectability index to determine optimal PML parameters. Results: The authors' findings suggest that there is a preferred range of values of the roughness penalty weight and the edge preservation threshold in the penalized objective function with the hyperbolic potential, which resulted in low noise images with high contrast microcalcifications preserved. In terms of numerical observer detectability index, the PML method with optimal parameters yielded substantially improved performance (by a factor of greater than 10) compared to FBP. The hyperbolic prior was also observed to be superior to the TV norm. A few of the best-performing parameter pairs for the PML method also demonstrated superior performance for various radiation doses. In fact, using PML with certain parameter values results in better images, acquired using 2 mGy dose, than FBP-reconstructed images acquired using 6 mGy dose. Conclusions: A range of optimal free parameters for the PML algorithm with hyperbolic and TV norm-based potentials is presented for the microcalcification detection task, in dedicated breast CT. The reported values can be used as starting values of the free parameters, when SIR techniques are used for image reconstruction. Significant improvement in image quality can be achieved by using PML with optimal combination of parameters, as compared to FBP. Importantly, these results suggest improved detection of microcalcifications can be obtained by using PML with lower radiation dose to the patient, than using FBP with higher dose. PMID:23927318
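
    The edge-preserving hyperbolic potential referred to above is commonly written as psi(t; delta) = delta^2 * (sqrt(1 + (t/delta)^2) - 1). The sketch below evaluates a roughness penalty of this form over nearest-neighbour voxel differences; the simplified neighbourhood and the beta and delta values are illustrative placeholders, not the optimal settings reported in the study.

```python
import numpy as np

def hyperbolic_potential(t, delta):
    """Edge-preserving hyperbolic potential: approximately quadratic for |t| << delta
    and approximately linear for |t| >> delta (delta is the edge-preservation threshold)."""
    return delta ** 2 * (np.sqrt(1.0 + (t / delta) ** 2) - 1.0)

def roughness_penalty(volume, beta=0.01, delta=1e-3):
    """Sum of the potential over nearest-neighbour differences along each axis
    (a simplified neighbourhood; beta is the roughness penalty weight)."""
    total = 0.0
    for axis in range(volume.ndim):
        diffs = np.diff(volume, axis=axis)
        total += hyperbolic_potential(diffs, delta).sum()
    return beta * total

# Hypothetical reconstructed attenuation volume (1/cm).
vol = np.random.default_rng(5).normal(0.2, 0.01, size=(32, 32, 32))
print("penalty value:", roughness_penalty(vol))
```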

  4. Conditional optimal spacing in exponential distribution.

    PubMed

    Park, Sangun

    2006-12-01

    In this paper, we propose the conditional optimal spacing defined as the optimal spacing after specifying a predetermined order statistic. If we specify a censoring time, then the optimal inspection times for grouped inspection can be determined from this conditional optimal spacing. We take an example of exponential distribution, and provide a simple method of finding the conditional optimal spacing.

  5. Optimizing α for better statistical decisions: a case study involving the pace-of-life syndrome hypothesis: optimal α levels set to minimize Type I and II errors frequently result in different conclusions from those using α = 0.05.

    PubMed

    Mudge, Joseph F; Penny, Faith M; Houlahan, Jeff E

    2012-12-01

    Setting optimal significance levels that minimize Type I and Type II errors allows for more transparent and well-considered statistical decision making compared to the traditional α = 0.05 significance level. We use the optimal α approach to re-assess conclusions reached by three recently published tests of the pace-of-life syndrome hypothesis, which attempts to unify occurrences of different physiological, behavioral, and life history characteristics under one theory, over different scales of biological organization. While some of the conclusions reached using optimal α were consistent to those previously reported using the traditional α = 0.05 threshold, opposing conclusions were also frequently reached. The optimal α approach reduced probabilities of Type I and Type II errors, and ensured statistical significance was associated with biological relevance. Biologists should seriously consider their choice of α when conducting null hypothesis significance tests, as there are serious disadvantages with consistent reliance on the traditional but arbitrary α = 0.05 significance level. Copyright © 2012 WILEY Periodicals, Inc.
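
    A minimal sketch of the optimal-alpha idea for a concrete design: for a two-sample t-test with an assumed effect size and sample size, compute the Type II error at each candidate alpha and choose the alpha that minimizes the average of the two error rates. The effect size, sample size, and equal weighting of Type I and Type II errors are illustrative assumptions.

```python
import numpy as np
from statsmodels.stats.power import TTestIndPower

# Illustrative design: two-sample t-test, medium effect size, n = 30 per group.
effect_size, n_per_group = 0.5, 30
analysis = TTestIndPower()

alphas = np.linspace(0.001, 0.3, 300)
power = np.array([analysis.power(effect_size=effect_size, nobs1=n_per_group,
                                 alpha=a, ratio=1.0, alternative="two-sided")
                  for a in alphas])
beta = 1.0 - power                      # Type II error rate at each candidate alpha
mean_error = (alphas + beta) / 2.0      # equal weighting of Type I and Type II errors

opt = np.argmin(mean_error)
print(f"optimal alpha = {alphas[opt]:.3f}  (beta = {beta[opt]:.3f})")
print(f"at alpha = 0.05: beta = {1 - analysis.power(effect_size, n_per_group, 0.05):.3f}")
```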

  6. Blind image quality assessment based on aesthetic and statistical quality-aware features

    NASA Astrophysics Data System (ADS)

    Jenadeleh, Mohsen; Masaeli, Mohammad Masood; Moghaddam, Mohsen Ebrahimi

    2017-07-01

    The main goal of image quality assessment (IQA) methods is the emulation of human perceptual image quality judgments. Therefore, the correlation between the objective scores of these methods and human perceptual scores is considered as their performance metric. Human judgment of image quality implicitly includes many factors, such as aesthetics, semantics, context, and various types of visual distortions. The main idea of this paper is to use a host of features that are commonly employed in image aesthetics assessment in order to improve the accuracy of blind image quality assessment (BIQA) methods. We propose an approach that enriches the features of BIQA methods by integrating a host of image aesthetics features with natural image statistics features derived from multiple domains. The proposed features have been used for augmenting five different state-of-the-art BIQA methods, which use natural scene statistics features. Experiments were performed on seven benchmark image quality databases. The experimental results showed a significant improvement in the accuracy of the methods.

  7. Networking—a statistical physics perspective

    NASA Astrophysics Data System (ADS)

    Yeung, Chi Ho; Saad, David

    2013-03-01

    Networking encompasses a variety of tasks related to the communication of information on networks; it has a substantial economic and societal impact on a broad range of areas including transportation systems, wired and wireless communications and a range of Internet applications. As transportation and communication networks become increasingly more complex, the ever-increasing demand for congestion control, higher traffic capacity, quality of service, robustness and reduced energy consumption requires new tools and methods to meet these conflicting requirements. The new methodology should serve to gain a better understanding of the properties of networking systems at the macroscopic level, as well as to support the development of new principled optimization and management algorithms at the microscopic level. Methods of statistical physics seem best placed to provide new approaches as they have been developed specifically to deal with nonlinear large-scale systems. This review aims at presenting an overview of tools and methods that have been developed within the statistical physics community and that can be readily applied to address the emerging problems in networking. These include diffusion processes, methods from disordered systems and polymer physics, and probabilistic inference, which have direct relevance to network routing, file and frequency distribution, the exploration of network structures and vulnerability, and various other practical networking applications.

  8. Contrast research of CDMA and GSM network optimization

    NASA Astrophysics Data System (ADS)

    Wu, Yanwen; Liu, Zehong; Zhou, Guangyue

    2004-03-01

    With the development of mobile telecommunication networks, CDMA users have raised their expectations of network service quality, while operators have shifted their network management focus from signal coverage to performance improvement. Consequently, a reasonable layout and optimization of the mobile telecommunication network, a reasonable configuration of network resources, improvement of service quality, and strengthening of the enterprise's core competitiveness have all become concerns of the operator companies. This paper first looks into the workflow of CDMA network optimization. It then discusses some key points in CDMA network optimization, such as PN code assignment and the calculation of soft handover. As GSM is a similar cellular mobile telecommunication system, the paper also presents a detailed comparative study of CDMA and GSM network optimization, covering both their similarities and their differences. In conclusion, network optimization is a long-term task that runs through the whole process of network construction. Through the adjustment of network hardware (such as BTS equipment and RF systems) and network software (such as parameter, configuration and capacity optimization), network optimization can improve the performance and service quality of the network.

  9. Phase Transitions in Combinatorial Optimization Problems: Basics, Algorithms and Statistical Mechanics

    NASA Astrophysics Data System (ADS)

    Hartmann, Alexander K.; Weigt, Martin

    2005-10-01

    A concise, comprehensive introduction to the topic of statistical physics of combinatorial optimization, bringing together theoretical concepts and algorithms from computer science with analytical methods from physics. The result bridges the gap between statistical physics and combinatorial optimization, investigating problems taken from theoretical computing, such as the vertex-cover problem, with the concepts and methods of theoretical physics. The authors cover rapid developments and analytical methods that are both extremely complex and spread by word-of-mouth, providing all the necessary basics in required detail. Throughout, the algorithms are shown with examples and calculations, while the proofs are given in a way suitable for graduate students, post-docs, and researchers. Ideal for newcomers to this young, multidisciplinary field.

  10. Use of multiresponse statistical techniques to optimize the separation of diosmin, hesperidin, diosmetin and hesperitin in different pharmaceutical preparations by high performance liquid chromatography with UV-DAD.

    PubMed

    Sammani, Mohamad Subhi; Clavijo, Sabrina; Portugal, Lindomar; Suárez, Ruth; Seddik, Hassan; Cerdà, Víctor

    2017-05-15

    A new method for the separation and determination of four flavonoids: hesperidin (HES), diosmin (DIO), hesperitin (HTIN), and diosmetin (DTIN) in pure form and pharmaceutical formulations has been developed by using high performance liquid chromatography (HPLC) with UV-DAD detection. Multivariate statistics (2^k full factorial and Box-Behnken designs) were used for the multiresponse optimization of the chromatographic separation, which was completed in 22 min and carried out on a Symmetry® C18 column (250 × 3 mm; 5 µm) as stationary phase. Separation was conducted in gradient elution mode using a mixture of methanol, acetonitrile and water at pH 2.5 (CH3COOH) as mobile phase. Analytes were separated with the column at 22 °C, with a flow rate of 0.58 mL/min, and detected at 285 nm. Under the optimized conditions, the flavonoids showed retention times of 8.62, 11.53, 18.55 and 19.94 min for HES, DIO, HTIN and DTIN, respectively. Limits of detection and quantification were <0.0156 µg/mL and <0.100 µg/mL, respectively. Linearity was achieved with good correlation coefficient values (r² = 0.999; n = 5). Intra-day and inter-day precision were found to be less than 3.44% (n = 7). Finally, the proposed method was successfully applied to determine the target flavonoids in pharmaceutical preparations with satisfactory recoveries (between 95.2% and 107.9%), demonstrating that it should also find application in quality control as well as in pharmacokinetic studies of these drugs. Copyright © 2017 Elsevier B.V. All rights reserved.

  11. Satisfaction with the local service point for care: results of an evaluation study

    PubMed Central

    Esslinger, Adelheid Susanne; Macco, Katrin; Schmidt, Katharina

    2009-01-01

    Purpose The market for care is growing and is characterized by complexity. Therefore, service points, such as the ‘Zentrale Anlaufstelle Pflege (ZAPf)’ in Nuremberg, are helpful for clients to get orientation. The purpose of the presentation is to show the results of an evaluation study of clients' satisfaction with the offers of the ZAPf. Study Satisfaction with a service may be measured with the SERVQUAL concept introduced by Parasuraman et al. (1988), who identified five dimensions of quality (tangibles, reliability, responsiveness, assurance and empathy). We adopted these dimensions in our study. The study focuses on the quality of service and the benefits recognized by clients. In spring 2007, we conducted 67 interviews by phone, based on a semi-standardized questionnaire. Statistical analysis was conducted using SPSS. Results The clients want information about care in general, financial and legal aspects, alternative care arrangements (e.g. ambulatory and long-term care) and typical age-related diseases. They report high satisfaction with the service provided. The benefits they perceive are obtaining information and advice, strengthening their ability to make decisions, coping with changing situations in life, and developing solutions. Conclusions The results show that the quality of service is at a high level. Critical success factors are the interdisciplinary cooperation at the service point, based on a regular and open exchange of information. Every member focuses on an optimal individual solution for the client. Local professional service points act as networkers and brokers. They serve not only the clients' needs but also support the effective and efficient provision of optimized care.

  12. [Feedforward control strategy and its application in quality improvement of ethanol precipitation process of danhong injection].

    PubMed

    Yan, Bin-Jun; Guo, Zheng-Tai; Qu, Hai-Bin; Zhao, Bu-Chang; Zhao, Tao

    2013-06-01

    In this work, a feedforward control strategy based on the concept of quality by design was established for the manufacturing process of traditional Chinese medicine to reduce the impact of the quality variation of raw materials on drug quality. In the research, the ethanol precipitation process of Danhong injection was taken as an application case of the established method. Box-Behnken design of experiments was conducted. Mathematical models relating the attributes of the concentrate, the process parameters and the quality of the supernatants produced were established. Then an optimization model for calculating the best process parameters based on the attributes of the concentrate was built. The quality of the supernatants produced by ethanol precipitation with optimized and non-optimized process parameters was compared. The results showed that using the feedforward control strategy for process parameter optimization can control the quality of the supernatants effectively. The proposed feedforward control strategy can enhance the batch-to-batch consistency of the supernatants produced by ethanol precipitation.
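
    A minimal sketch of the feedforward idea, under assumptions: fit a quadratic (Box-Behnken-style) response-surface model mapping a concentrate attribute and two process parameters to a quality response, then, for a new batch's measured attribute, choose the process parameters that maximize the predicted quality. The variable names, the single quality response, and the synthetic data are placeholders, not the study's models.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(0)

    def features(attr, params):
        """Quadratic response-surface terms in one concentrate attribute (attr)
        and two process parameters (ethanol amount, settling time)."""
        e, t = params
        return np.array([1.0, attr, e, t, attr*e, attr*t, e*t, attr**2, e**2, t**2])

    # "Historical" designed experiments (synthetic stand-in for Box-Behnken data).
    X = np.array([features(a, (e, t))
                  for a in (0.8, 1.0, 1.2)
                  for e in (1.5, 2.0, 2.5)
                  for t in (12.0, 18.0, 24.0)])
    true_coef = np.array([5, -1.0, 2.0, 0.1, 0.5, 0.0, 0.0, -0.3, -0.6, -0.002])
    y = X @ true_coef + rng.normal(0, 0.05, len(X))     # measured quality index

    coef, *_ = np.linalg.lstsq(X, y, rcond=None)        # fitted response-surface model

    def best_params(attr_new):
        """Feedforward step: given the new batch's attribute, pick process parameters
        (within their studied ranges) that maximize the predicted quality."""
        obj = lambda p: -(features(attr_new, p) @ coef)
        res = minimize(obj, x0=[2.0, 18.0], bounds=[(1.5, 2.5), (12.0, 24.0)])
        return res.x, -res.fun

    params, quality = best_params(attr_new=1.1)
    print("recommended ethanol amount, settling time:", np.round(params, 3))
    print("predicted quality:", round(quality, 3))
    ```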

  13. Parameterization of Shape and Compactness in Object-based Image Classification Using Quickbird-2 Imagery

    NASA Astrophysics Data System (ADS)

    Tonbul, H.; Kavzoglu, T.

    2016-12-01

    In recent years, object-based image analysis (OBIA) has become a widely accepted technique for the analysis of remotely sensed data. OBIA deals with grouping pixels into homogeneous objects based on the spectral, spatial and textural features of contiguous pixels in an image. The first stage of OBIA, known as image segmentation, is the most prominent part of object recognition. In this study, multiresolution segmentation, which is a region-based approach, was employed to construct image objects. In the application of multiresolution segmentation, three parameters, namely shape, compactness and scale, must be set by the analyst. Segmentation quality markedly influences the fidelity of the thematic maps and, accordingly, the classification accuracy. Therefore, it is of great importance to search for and set optimal values for the segmentation parameters. In the literature, the main focus has been on the definition of the scale parameter, assuming that the effect of the shape and compactness parameters on the achieved classification accuracy is limited. The aim of this study is to analyze in depth the influence of the shape/compactness parameters by varying their values while using the optimal scale parameter determined with the Estimation of Scale Parameter (ESP-2) approach. A pansharpened Quickbird-2 image covering Trabzon, Turkey, was employed to investigate the objectives of the study. For this purpose, six different shape/compactness combinations were utilized to draw conclusions about the behavior of the shape and compactness parameters and the optimal setting of all parameters as a whole. Objects were assigned to classes using the nearest neighbor classifier in all segmentation runs, and an equal number of pixels was randomly selected to calculate accuracy metrics. The highest overall accuracy (92.3%) was achieved by setting the shape/compactness criteria to 0.3/0.3. The results of this study indicate that the shape/compactness parameters can have a significant effect on classification accuracy, with a 4% change in overall accuracy. Also, the statistical significance of the differences in accuracy was tested using McNemar's test, and the difference between the poor and optimal settings of the shape/compactness parameters was found to be statistically significant, suggesting a search for an optimal parameterization instead of the default setting.

  14. Statistical auditing of toxicology reports.

    PubMed

    Deaton, R R; Obenchain, R L

    1994-06-01

    Statistical auditing is a new report review process used by the quality assurance unit at Eli Lilly and Co. Statistical auditing allows the auditor to review the process by which the report was generated, as opposed to the process by which the data were generated. We have the flexibility to use different sampling techniques and still obtain thorough coverage of the report data. By properly implementing our auditing process, we can work smarter rather than harder and continue to help our customers increase the quality of their products (reports). Statistical auditing is helping our quality assurance unit meet our customers' needs while maintaining or increasing the quality with which we fulfill our regulatory obligations.

  15. Reproducibility-optimized test statistic for ranking genes in microarray studies.

    PubMed

    Elo, Laura L; Filén, Sanna; Lahesmaa, Riitta; Aittokallio, Tero

    2008-01-01

    A principal goal of microarray studies is to identify the genes showing differential expression under distinct conditions. In such studies, the selection of an optimal test statistic is a crucial challenge, which depends on the type and amount of data under analysis. While previous studies on simulated or spike-in datasets do not provide practical guidance on how to choose the best method for a given real dataset, we introduce an enhanced reproducibility-optimization procedure, which enables the selection of a suitable gene-ranking statistic directly from the data. In comparison with existing ranking methods, the reproducibility-optimized statistic shows good performance consistently under various simulated conditions and on the Affymetrix spike-in dataset. Further, the feasibility of the novel statistic is confirmed in a practical research setting using data from an in-house cDNA microarray study of asthma-related gene expression changes. These results suggest that the procedure facilitates the selection of an appropriate test statistic for a given dataset without relying on a priori assumptions, which may bias the findings and their interpretation. Moreover, the general reproducibility-optimization procedure is not limited to detecting differential expression only but could be extended to a wide range of other applications as well.

  16. Experimental design in chemistry: A tutorial.

    PubMed

    Leardi, Riccardo

    2009-10-12

    In this tutorial the main concepts and applications of experimental design in chemistry will be explained. Unfortunately, nowadays experimental design is not as well known and applied as it should be, and many papers can be found in which the "optimization" of a procedure is performed one variable at a time. The goal of this paper is to show the real advantages in terms of reduced experimental effort and of increased quality of information that can be obtained if this approach is followed. To do that, three real examples will be shown. Rather than on the mathematical aspects, this paper will focus on the mental attitude required by experimental design. Readers interested in deepening their knowledge of the mathematical and algorithmic aspects can find very good books and tutorials in the references [G.E.P. Box, W.G. Hunter, J.S. Hunter, Statistics for Experimenters: An Introduction to Design, Data Analysis, and Model Building, John Wiley & Sons, New York, 1978; R. Brereton, Chemometrics: Data Analysis for the Laboratory and Chemical Plant, John Wiley & Sons, New York, 1978; R. Carlson, J.E. Carlson, Design and Optimization in Organic Synthesis: Second Revised and Enlarged Edition, in: Data Handling in Science and Technology, vol. 24, Elsevier, Amsterdam, 2005; J.A. Cornell, Experiments with Mixtures: Designs, Models and the Analysis of Mixture Data, in: Series in Probability and Statistics, John Wiley & Sons, New York, 1991; R.E. Bruns, I.S. Scarminio, B. de Barros Neto, Statistical Design-Chemometrics, in: Data Handling in Science and Technology, vol. 25, Elsevier, Amsterdam, 2006; D.C. Montgomery, Design and Analysis of Experiments, 7th edition, John Wiley & Sons, Inc., 2009; T. Lundstedt, E. Seifert, L. Abramo, B. Thelin, A. Nyström, J. Pettersen, R. Bergman, Chemolab 42 (1998) 3; Y. Vander Heyden, LC-GC Europe 19 (9) (2006) 469].

  17. Constructing a Reward-Related Quality of Life Statistic in Daily Life—a Proof of Concept Study Using Positive Affect

    PubMed Central

    Verhagen, Simone J. W.; Simons, Claudia J. P.; van Zelst, Catherine; Delespaul, Philippe A. E. G.

    2017-01-01

    Background: Mental healthcare needs person-tailored interventions. Experience Sampling Method (ESM) can provide daily life monitoring of personal experiences. This study aims to operationalize and test a measure of momentary reward-related Quality of Life (rQoL). Intuitively, quality of life improves by spending more time on rewarding experiences. ESM clinical interventions can use this information to coach patients to find a realistic, optimal balance of positive experiences (maximize reward) in daily life. rQoL combines the frequency of engaging in a relevant context (a ‘behavior setting’) with concurrent (positive) affect. High rQoL occurs when the most frequent behavior settings are combined with positive affect or infrequent behavior settings co-occur with low positive affect. Methods: Resampling procedures (Monte Carlo experiments) were applied to assess the reliability of rQoL using various behavior setting definitions under different sampling circumstances, for real or virtual subjects with low, average and high contextual variability. Furthermore, resampling was used to assess whether rQoL is a distinct concept from positive affect. Virtual ESM beep datasets were extracted from 1,058 valid ESM observations for virtual and real subjects. Results: Behavior settings defined by Who-What contextual information were most informative. Simulations of at least 100 ESM observations are needed for reliable assessment. Virtual ESM beep datasets of a real subject can be defined by Who-What-Where behavior setting combinations. Large sample sizes are necessary for reliable rQoL assessments, except for subjects with low contextual variability. rQoL is distinct from positive affect. Conclusion: rQoL is a feasible concept. Monte Carlo experiments should be used to assess the reliable implementation of an ESM statistic. Future research in ESM should assess the behavior of summary statistics under different sampling situations. This exploration is especially relevant in clinical implementation, where often only small datasets are available. PMID:29163294

  18. Constructing a Reward-Related Quality of Life Statistic in Daily Life-a Proof of Concept Study Using Positive Affect.

    PubMed

    Verhagen, Simone J W; Simons, Claudia J P; van Zelst, Catherine; Delespaul, Philippe A E G

    2017-01-01

    Background: Mental healthcare needs person-tailored interventions. Experience Sampling Method (ESM) can provide daily life monitoring of personal experiences. This study aims to operationalize and test a measure of momentary reward-related Quality of Life (rQoL). Intuitively, quality of life improves by spending more time on rewarding experiences. ESM clinical interventions can use this information to coach patients to find a realistic, optimal balance of positive experiences (maximize reward) in daily life. rQoL combines the frequency of engaging in a relevant context (a 'behavior setting') with concurrent (positive) affect. High rQoL occurs when the most frequent behavior settings are combined with positive affect or infrequent behavior settings co-occur with low positive affect. Methods: Resampling procedures (Monte Carlo experiments) were applied to assess the reliability of rQoL using various behavior setting definitions under different sampling circumstances, for real or virtual subjects with low, average and high contextual variability. Furthermore, resampling was used to assess whether rQoL is a distinct concept from positive affect. Virtual ESM beep datasets were extracted from 1,058 valid ESM observations for virtual and real subjects. Results: Behavior settings defined by Who-What contextual information were most informative. Simulations of at least 100 ESM observations are needed for reliable assessment. Virtual ESM beep datasets of a real subject can be defined by Who-What-Where behavior setting combinations. Large sample sizes are necessary for reliable rQoL assessments, except for subjects with low contextual variability. rQoL is distinct from positive affect. Conclusion: rQoL is a feasible concept. Monte Carlo experiments should be used to assess the reliable implementation of an ESM statistic. Future research in ESM should assess the behavior of summary statistics under different sampling situations. This exploration is especially relevant in clinical implementation, where often only small datasets are available.
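
    The abstract defines rQoL verbally; one plausible operationalization (not necessarily the authors' exact statistic) is the concordance between how often each Who-What behavior setting occurs and the mean positive affect reported in it, sketched below on a toy ESM dataset.

    ```python
    import numpy as np
    from collections import defaultdict

    # Toy ESM beep records: (who, what, positive affect on a 1-7 scale).
    beeps = [
        ("partner", "leisure", 6), ("partner", "leisure", 5), ("alone", "work", 3),
        ("alone", "work", 4), ("colleagues", "work", 5), ("alone", "household", 2),
        ("partner", "household", 4), ("alone", "leisure", 5), ("alone", "work", 3),
    ]

    def rqol(beeps):
        """Concordance between the relative frequency of each Who-What behavior
        setting and the mean positive affect reported in it: high when frequent
        settings carry high positive affect and rare settings carry low affect."""
        by_setting = defaultdict(list)
        for who, what, pa in beeps:
            by_setting[(who, what)].append(pa)
        freqs = np.array([len(v) / len(beeps) for v in by_setting.values()])
        mean_pa = np.array([np.mean(v) for v in by_setting.values()])
        return np.corrcoef(freqs, mean_pa)[0, 1]

    print(f"rQoL (toy data) = {rqol(beeps):.2f}")
    ```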

  19. [Psychological determinants of quality of life in women diagnosed with depressive disorders].

    PubMed

    Michalska-Leśniewicz, Magdalena; Gruszczyński, Wojciech

    2010-01-01

    The assessment of quality of life and its determinants is an essential element in evaluating patients' functioning. Given the course of depression and its specific nature, this seems to be of special importance. The aim of this paper was to assess the importance of psychological determinants of quality of life in women with depressive disorders. The tests were carried out on the basis of an analysis of medical documentation, including the psychiatric records. The following criteria were measured: depression level (Beck Hopelessness Scale), quality of life (The Life Satisfaction Questionnaire FLZ according to Fahrenberg), personality model (NEO Five-Factor Inventory), optimism (The Life Orientation Test-Revised LOT-R by M. Scheier, ChS. Carver and M. Bridges, adapted by R. Poprawa and Z. Juczyński), purpose in life (The Purpose-in-Life Test developed by Crumbaugh and Maholick, according to the authorised translation by Z. Płuzek), social support (The Social Support Questionnaire by Sommer G, Fydrich T, 1989, adapted by Z. Juczyński), and health satisfaction (General Health Questionnaire GHQ 28 by David Goldberg). Women diagnosed with depressive disorders were qualified for testing. The study group included 80 patients aged 40 to 60 years from the Outpatient Department of Mental Health, Regional Specialised Hospital in Zgierz. The reference group consisted of 30 women showing no symptoms of depressive disorders. The statistical analysis of the variables included in the tests showed statistically significant differences between the compared groups with regard to almost all parameters. Significant differences were found in respect of life satisfaction, personality variables, social support, health satisfaction and purpose in life. The obtained results showed significant differences in the assessment of quality of life, with regard to the tested psychological determinants, between the group of women with depressive disorders and the group of women without any symptoms of such disorders. The only exception was "parent-child relationship satisfaction", where no differences were found.

  20. DCTune Perceptual Optimization of Compressed Dental X-Rays

    NASA Technical Reports Server (NTRS)

    Watson, Andrew B.; Null, Cynthia H. (Technical Monitor)

    1996-01-01

    In current dental practice, x-rays of completed dental work are often sent to the insurer for verification. It is faster and cheaper to transmit digital scans of the x-rays instead. Further economies result if the images are sent in compressed form. DCTune is a technology for optimizing DCT (discrete cosine transform) quantization matrices to yield maximum perceptual quality for a given bit-rate, or minimum bit-rate for a given perceptual quality; it also supports perceptual optimization of DCT color quantization matrices. In addition, the technology provides a means of setting the perceptual quality of compressed imagery in a systematic way. The purpose of this research was, with respect to dental x-rays, 1) to verify the advantage of DCTune over standard JPEG (Joint Photographic Experts Group) compression, 2) to verify the quality control feature of DCTune, and 3) to discover regularities in the optimized matrices of a set of images. We optimized matrices for a total of 20 images at two resolutions (150 and 300 dpi) and four bit-rates (0.25, 0.5, 0.75, 1.0 bits/pixel), and examined structural regularities in the resulting matrices. We also conducted psychophysical studies (1) to discover the DCTune quality level at which the images became 'visually lossless,' and (2) to rate the relative quality of DCTune and standard JPEG images at various bit-rates. Results include: (1) At both resolutions, DCTune quality is a linear function of bit-rate. (2) DCTune quantization matrices for all images at all bit-rates and resolutions are modeled well by an inverse Gaussian, with parameters of amplitude and width. (3) As bit-rate is varied, optimal values of both amplitude and width covary in an approximately linear fashion. (4) Both amplitude and width vary in a systematic and orderly fashion with either bit-rate or DCTune quality; simple mathematical functions serve to describe these relationships. (5) In going from 150 to 300 dpi, amplitude parameters are substantially lower and widths larger at corresponding bit-rates or qualities. (6) Visually lossless compression occurs at a DCTune quality value of about 1. (7) At 0.25 bits/pixel, comparative ratings give DCTune a substantial advantage over standard JPEG. As visually lossless bit-rates are approached, this advantage of necessity diminishes. We have concluded that DCTune-optimized quantization matrices provide better visual quality than standard JPEG. Meaningful quality levels may be specified by means of the DCTune metric. Optimized matrices are very similar across the class of dental x-rays, suggesting the possibility of a 'class-optimal' matrix. DCTune technology appears to provide some value in the context of compressed dental x-rays.

  1. Optimization of the Hartmann-Shack microlens array

    NASA Astrophysics Data System (ADS)

    de Oliveira, Otávio Gomes; de Lima Monteiro, Davies William

    2011-04-01

    In this work we propose to optimize the microlens-array geometry for a Hartmann-Shack wavefront sensor. The optimization makes it possible to replace regular microlens arrays with a larger number of microlenses by arrays with fewer microlenses located at optimal sampling positions, with no increase in the reconstruction error. The goal is to propose a straightforward and widely accessible numerical method to calculate an optimized microlens array for known aberration statistics. The optimization comprises the minimization of the wavefront reconstruction error and/or of the number of necessary microlenses in the array. We numerically generate, sample and reconstruct the wavefront, and use a genetic algorithm to discover the optimal array geometry. Within an ophthalmological context, as a case study, we demonstrate that an array with only 10 suitably located microlenses can be used to produce reconstruction errors as small as those of a 36-microlens regular array. The same optimization procedure can be employed for any application where the wavefront statistics are known.

  2. Use of a quality trait index to increase the reliability of phenotypic evaluations in broccoli

    USDA-ARS?s Scientific Manuscript database

    Selection of superior broccoli hybrids involves multiple considerations, including optimization of head quality traits. Quality assessment of broccoli heads is often confounded by relatively subjective human preferences for optimal appearance of heads. To assist the selection process, we assessed fi...

  3. Longitudinal analysis of reporting and quality of systematic reviews in high-impact surgical journals.

    PubMed

    Chapman, S J; Drake, T M; Bolton, W S; Barnard, J; Bhangu, A

    2017-02-01

    The PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) Statement aims to optimize the reporting of systematic reviews. The performance of the PRISMA Statement in improving the reporting and quality of surgical systematic reviews remains unclear. Systematic reviews published in five high-impact surgical journals between 2007 and 2015 were identified from online archives. Manuscripts blinded to journal, publication year and authorship were assessed according to 27 reporting criteria described by the PRISMA Statement and scored using a validated quality appraisal tool (AMSTAR, Assessing the Methodological Quality of Systematic Reviews). Comparisons were made between studies published before (2007-2009) and after (2011-2015) its introduction. The relationship between reporting and study quality was measured using Spearman's rank test. Of 281 eligible manuscripts, 80 were published before the PRISMA Statement and 201 afterwards. Most manuscripts (208) included a meta-analysis, with the remainder comprising a systematic review only. There was no meaningful change in median compliance with the PRISMA Statement (19 (i.q.r. 16-21) of 27 items before versus 19 (17-22) of 27 after introduction of PRISMA) despite achieving statistical significance (P = 0·042). Better reporting compliance was associated with higher methodological quality (r s  = 0·70, P < 0·001). The PRISMA Statement has had minimal impact on the reporting of surgical systematic reviews. Better compliance was associated with higher-quality methodology. © 2016 BJS Society Ltd Published by John Wiley & Sons Ltd.

  4. Quality and Dose Optimized CT Trauma Protocol - Recommendation from a University Level-I Trauma Center.

    PubMed

    Kahn, Johannes; Kaul, David; Böning, Georg; Rotzinger, Roman; Freyhardt, Patrick; Schwabe, Philipp; Maurer, Martin H; Renz, Diane Miriam; Streitparth, Florian

    2017-09-01

    Purpose  As a supra-regional level-I trauma center, we evaluated computed tomography (CT) acquisitions of polytraumatized patients for quality and dose optimization purposes. Adapted statistical iterative reconstruction [(AS)IR] levels, tube voltage reduction as well as a split-bolus contrast agent (CA) protocol were applied. Materials and Methods  61 patients were split into 3 different groups that differed with respect to tube voltage (120 - 140 kVp) and level of applied ASIR reconstruction (ASIR 20 - 50 %). The CT protocol included a native acquisition of the head followed by a single contrast-enhanced acquisition of the whole body (64-MSCT). CA (350 mg/ml iodine) was administered as a split bolus injection of 100 ml (2 ml/s), 20 ml NaCl (1 ml/s), 60 ml (4 ml/s), 40 ml NaCl (4 ml/s) with a scan delay of 85 s to detect injuries of both the arterial system and parenchymal organs in a single acquisition. Both the quantitative (SNR/CNR) and qualitative (5-point Likert scale) image quality were evaluated in parenchymal organs that are often injured in trauma patients. Radiation exposure was assessed. Results  The use of IR combined with a reduction of tube voltage resulted in good qualitative and quantitative image quality and a significant reduction in radiation exposure of more than 40 % (DLP 1087 vs. 647 mGy·cm). Image quality could be improved due to a dedicated protocol that included different levels of IR adapted to different slice thicknesses, kernels and the examined area for the evaluation of head, lung, body and bone injury patterns. In light of our results, we recommend the implementation of a polytrauma protocol with a tube voltage of 120 kVp and the following IR levels: cCT 5 mm: ASIR 20; cCT 0.625 mm: ASIR 40; lung 2.5 mm: ASIR 30; body 5 mm: ASIR 40; body 1.25 mm: ASIR 50; body 0.625 mm: ASIR 0. Conclusion  A dedicated adaptation of the CT trauma protocol (level of reduction of tube voltage and of IR) according to the examined body region (head, lung, body, bone) combined with a split bolus CA injection protocol allows for a high-quality CT examination and a relevant reduction of radiation exposure in the examination of polytraumatized patients. Key Points  · Dedicated adaptation of the CT trauma protocol allows for an optimized examination. · Different levels of iterative reconstruction, tube voltage and the CA injection protocol are crucial. · A reduction of radiation exposure of more than 40 % with good image quality is possible. Citation Format · Kahn J, Kaul D, Böning G et al. Quality and Dose Optimized CT Trauma Protocol - Recommendation from a University Level-I Trauma Center. Fortschr Röntgenstr 2017; 189: 844 - 854. © Georg Thieme Verlag KG Stuttgart · New York.

  5. Image quality improvements using adaptive statistical iterative reconstruction for evaluating chronic myocardial infarction using iodine density images with spectral CT.

    PubMed

    Kishimoto, Junichi; Ohta, Yasutoshi; Kitao, Shinichiro; Watanabe, Tomomi; Ogawa, Toshihide

    2018-04-01

    Single-source dual-energy CT (ssDECT) allows the reconstruction of iodine density images (IDIs) from projection-based computing. We hypothesized that adding adaptive statistical iterative reconstruction (ASiR) could improve image quality. The aim of our study was to evaluate the effect and determine the optimal blend percentages of ASiR for IDI of myocardial late iodine enhancement (LIE) in the evaluation of chronic myocardial infarction using ssDECT. A total of 28 patients underwent cardiac LIE using an ssDECT scanner. IDIs between 0 and 100% of ASiR contributions in 10% increments were reconstructed. The signal-to-noise ratio (SNR) of the remote myocardium and the contrast-to-noise ratio (CNR) of the infarcted myocardium were measured. Transmural extent of infarction was graded using a 5-point scale. The SNR, CNR, and transmural extent were assessed for each ASiR contribution ratio. The transmural extents were compared with MRI as a reference standard. Compared to 0% ASiR, the use of 20-100% ASiR resulted in a reduction of image noise (p < 0.01) without significant differences in the signal. Compared with 0% ASiR images, reconstruction with the 100% ASiR image showed the highest improvement in SNR (229%; p < 0.001) and CNR (199%; p < 0.001). ASiR above 80% showed the highest ratio (73.7%) of accurate transmural extent classification. In conclusion, an ASiR intensity of 80-100% in IDIs can improve image quality without changes in signal and maximizes the accuracy of transmural extent in infarcted myocardium.
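
    A minimal sketch of the ROI-based figures of merit mentioned in the abstract, assuming the common convention that noise is taken as the standard deviation inside the remote-myocardium ROI; the ROI values are synthetic and purely illustrative, not the study's measurements.

    ```python
    import numpy as np

    def snr_cnr(infarct_roi, remote_roi):
        """SNR of the remote myocardium and CNR of the infarct, with noise taken as
        the standard deviation inside the remote ROI (one common convention)."""
        noise = remote_roi.std(ddof=1)
        snr = remote_roi.mean() / noise
        cnr = (infarct_roi.mean() - remote_roi.mean()) / noise
        return snr, cnr

    rng = np.random.default_rng(1)
    # Toy iodine-density ROIs (mg/mL); higher ASiR blending mainly lowers the noise.
    for label, sigma in [("0% ASiR (noisier)", 0.30), ("80% ASiR (smoother)", 0.15)]:
        remote_px = rng.normal(1.0, sigma, 500)
        infarct_px = rng.normal(2.0, sigma, 500)
        snr, cnr = snr_cnr(infarct_px, remote_px)
        print(f"{label}: SNR = {snr:.1f}, CNR = {cnr:.1f}")
    ```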

  6. Quality assurance for high dose rate brachytherapy treatment planning optimization: using a simple optimization to verify a complex optimization

    NASA Astrophysics Data System (ADS)

    Deufel, Christopher L.; Furutani, Keith M.

    2014-02-01

    As dose optimization for high dose rate brachytherapy becomes more complex, it becomes increasingly important to have a means of verifying that optimization results are reasonable. A method is presented for using a simple optimization as quality assurance for the more complex optimization algorithms typically found in commercial brachytherapy treatment planning systems. Quality assurance tests may be performed during commissioning, at regular intervals, and/or on a patient specific basis. A simple optimization method is provided that optimizes conformal target coverage using an exact, variance-based, algebraic approach. Metrics such as dose volume histogram, conformality index, and total reference air kerma agree closely between simple and complex optimizations for breast, cervix, prostate, and planar applicators. The simple optimization is shown to be a sensitive measure for identifying failures in a commercial treatment planning system that are possibly due to operator error or weaknesses in planning system optimization algorithms. Results from the simple optimization are surprisingly similar to the results from a more complex, commercial optimization for several clinical applications. This suggests that there are only modest gains to be made from making brachytherapy optimization more complex. The improvements expected from sophisticated linear optimizations, such as PARETO methods, will largely be in making systems more user friendly and efficient, rather than in finding dramatically better source strength distributions.
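
    The paper's exact algebra is not reproduced in the abstract, so the following is a hedged stand-in with the same flavor: a small non-negative least-squares problem that chooses dwell times making the dose at target surface points as uniform as possible around the prescription, whose resulting metrics could then be compared against the commercial optimizer's plan. The dose-rate kernel and geometry are toy assumptions.

    ```python
    import numpy as np
    from scipy.optimize import nnls

    rng = np.random.default_rng(2)

    # Toy geometry: 8 dwell positions along a line, 60 target surface points on a cylinder.
    dwell_pos = np.linspace(-3.5, 3.5, 8)
    angles = np.linspace(0, 2 * np.pi, 60, endpoint=False)
    target_pts = np.c_[2.0 * np.cos(angles), 2.0 * np.sin(angles), rng.uniform(-3.5, 3.5, 60)]

    def dose_matrix(dwell_pos, points):
        """Simplified dose-rate kernel: inverse-square falloff from each dwell position."""
        src = np.c_[np.zeros_like(dwell_pos), np.zeros_like(dwell_pos), dwell_pos]
        d2 = ((points[:, None, :] - src[None, :, :]) ** 2).sum(axis=2)
        return 1.0 / np.maximum(d2, 1e-6)

    A = dose_matrix(dwell_pos, target_pts)
    prescription = 1.0                                   # relative prescription dose

    # Simple verification plan: non-negative dwell times minimizing the squared
    # deviation of the surface dose from the prescription (a variance-style criterion).
    t_simple, _ = nnls(A, np.full(len(target_pts), prescription))

    dose = A @ t_simple
    print("surface dose mean/std:", dose.mean().round(3), dose.std().round(3))
    print("total dwell time (relative):", t_simple.sum().round(3))
    # In QA use, metrics like these (plus DVH, conformality index, and TRAK) would be
    # compared against the commercial optimizer's plan; large disagreements flag a problem.
    ```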

  7. An Efficient Framework Model for Optimizing Routing Performance in VANETs

    PubMed Central

    Zulkarnain, Zuriati Ahmad; Subramaniam, Shamala

    2018-01-01

    Routing in Vehicular Ad hoc Networks (VANETs) is complicated by the highly dynamic mobility of the nodes. The efficiency of a routing protocol is influenced by a number of factors, such as network density, bandwidth constraints, traffic load, and mobility patterns, resulting in frequent changes in network topology. Therefore, Quality of Service (QoS) support is strongly needed to enhance the capability of the routing protocol and improve the overall network performance. In this paper, we introduce a statistical framework model to address the problem of optimizing routing configuration parameters in Vehicle-to-Vehicle (V2V) communication. Our framework solution is based on the utilization of the network resources to reflect the current state of the network and to balance the trade-off between frequent changes in network topology and the QoS requirements. It consists of three stages: the network simulation stage, used to execute different urban scenarios; the function stage, used as a competitive approach to aggregate the weighted cost of the factors into a single value; and the optimization stage, used to evaluate the communication cost and to obtain the optimal configuration based on the competitive cost. The simulation results show significant performance improvement in terms of the Packet Delivery Ratio (PDR), Normalized Routing Load (NRL), Packet Loss (PL), and End-to-End Delay (E2ED). PMID:29462884

  8. Fault detection and isolation in GPS receiver autonomous integrity monitoring based on chaos particle swarm optimization-particle filter algorithm

    NASA Astrophysics Data System (ADS)

    Wang, Ershen; Jia, Chaoying; Tong, Gang; Qu, Pingping; Lan, Xiaoyu; Pang, Tao

    2018-03-01

    The receiver autonomous integrity monitoring (RAIM) is one of the most important parts of an avionic navigation system. Two problems of the standard particle filter (PF) need to be addressed to improve this system: the degeneracy phenomenon and sample impoverishment, in which the available samples cannot adequately express the real distribution of the probability density function. This study presents a GPS RAIM method based on a chaos particle swarm optimization particle filter (CPSO-PF) algorithm with a log likelihood ratio. The chaos sequence generates a set of chaotic variables, which are mapped to the interval of the optimization variables to improve particle quality. This chaos perturbation overcomes the tendency of the search to become trapped in a local optimum in the particle swarm optimization (PSO) algorithm. Test statistics are configured based on a likelihood ratio, and satellite fault detection is then conducted by checking the consistency between the state estimate of the main PF and those of the auxiliary PFs. Based on GPS data, the experimental results demonstrate that the proposed algorithm can effectively detect and isolate satellite faults under conditions of non-Gaussian measurement noise. Moreover, the performance of the proposed novel method is better than that of RAIM based on the PF or PSO-PF algorithm.
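
    A minimal sketch of the chaos ingredient described above: a logistic-map sequence (a common choice, assumed here) generates values in (0, 1) that are mapped onto the range of the optimization variables and used to perturb the search; the map parameter, bounds, and toy fitness function are assumptions, not details from the paper.

    ```python
    import numpy as np

    def chaotic_candidates(n, bounds, r=4.0):
        """Generate n candidate points with the logistic map x_{k+1} = r*x_k*(1-x_k),
        then map the chaotic values from (0, 1) onto each variable's interval."""
        bounds = np.asarray(bounds, dtype=float)          # shape (dim, 2): [low, high]
        dim = len(bounds)
        x = np.linspace(0.31, 0.73, dim)                  # distinct seeds, away from fixed points
        out = np.empty((n, dim))
        for k in range(n):
            x = r * x * (1.0 - x)                         # chaotic iteration in (0, 1)
            out[k] = bounds[:, 0] + x * (bounds[:, 1] - bounds[:, 0])
        return out

    # Toy use: perturb a swarm's best position with chaotic candidates and keep any
    # candidate that improves a simple fitness, helping the search leave a local region.
    fitness = lambda p: np.sum((p - 3.0) ** 2)            # toy objective, minimum at (3, 3)
    best = np.array([0.5, 0.5])
    for cand in chaotic_candidates(50, bounds=[(0.0, 6.0), (0.0, 6.0)]):
        if fitness(cand) < fitness(best):
            best = cand
    print("best after chaotic perturbation:", np.round(best, 3))
    ```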

  9. Analysis of Sting Balance Calibration Data Using Optimized Regression Models

    NASA Technical Reports Server (NTRS)

    Ulbrich, N.; Bader, Jon B.

    2010-01-01

    Calibration data of a wind tunnel sting balance were processed using a candidate math model search algorithm that recommends an optimized regression model for the data analysis. During the calibration the normal force and the moment at the balance moment center were selected as independent calibration variables. The sting balance itself had two moment gages. Therefore, after analyzing the connection between calibration loads and gage outputs, it was decided to choose the difference and the sum of the gage outputs as the two responses that best describe the behavior of the balance. The math model search algorithm was applied to these two responses. An optimized regression model was obtained for each response. Classical strain gage balance load transformations and the equations of the deflection of a cantilever beam under load are used to show that the search algorithm's two optimized regression models are supported by a theoretical analysis of the relationship between the applied calibration loads and the measured gage outputs. The analysis of the sting balance calibration data set is a rare example of a situation in which the terms of a regression model of a balance can be derived directly from first principles of physics. In addition, it is interesting to note that the search algorithm recommended the correct regression model term combinations using only a set of statistical quality metrics that were applied to the experimental data during the algorithm's term selection process.

  10. Cocaine profiling for strategic intelligence, a cross-border project between France and Switzerland: part II. Validation of the statistical methodology for the profiling of cocaine.

    PubMed

    Lociciro, S; Esseiva, P; Hayoz, P; Dujourdy, L; Besacier, F; Margot, P

    2008-05-20

    Harmonisation and optimization of analytical and statistical methodologies were carried out between two forensic laboratories (Lausanne, Switzerland and Lyon, France) in order to provide drug intelligence for cross-border cocaine seizures. Part I dealt with the optimization of the analytical method and its robustness. This second part investigates statistical methodologies that will provide reliable comparison of cocaine seizures analysed on two different gas chromatographs interfaced with flame ionisation detectors (GC-FID) in two distinct laboratories. Sixty-six statistical combinations (ten data pre-treatments followed by six different distance measurements and correlation coefficients) were applied. One pre-treatment (N+S: the area of each peak is divided by its standard deviation calculated from the whole data set) followed by the Cosine or Pearson correlation coefficient was found to be the best statistical compromise for optimal discrimination of linked and non-linked samples. Centralising the analyses in a single laboratory is therefore no longer a required condition for comparing samples seized in different countries. This allows collaboration, but also jurisdictional control over the data.
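
    A minimal sketch of the comparison scheme the abstract singles out: apply the N+S pre-treatment (divide each peak area by that peak's standard deviation over the whole dataset), then score pairs of profiles with the Pearson and cosine correlation measures. The peak table here is synthetic and purely illustrative.

    ```python
    import numpy as np

    # Toy peak-area table: rows = seizures, columns = alkaloid/impurity peaks.
    rng = np.random.default_rng(3)
    profiles = rng.lognormal(mean=2.0, sigma=0.6, size=(20, 8))
    profiles[1] = profiles[0] * rng.normal(1.0, 0.03, 8)   # seizure 1 "linked" to seizure 0

    # N+S pre-treatment: divide each peak area by that peak's standard deviation
    # computed over the whole data set.
    pretreated = profiles / profiles.std(axis=0, ddof=1)

    def pearson(a, b):
        return np.corrcoef(a, b)[0, 1]

    def cosine(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

    print("linked pair  : Pearson %.4f, cosine %.4f"
          % (pearson(pretreated[0], pretreated[1]), cosine(pretreated[0], pretreated[1])))
    print("unlinked pair: Pearson %.4f, cosine %.4f"
          % (pearson(pretreated[0], pretreated[5]), cosine(pretreated[0], pretreated[5])))
    ```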

  11. Strategy Developed for Selecting Optimal Sensors for Monitoring Engine Health

    NASA Technical Reports Server (NTRS)

    2004-01-01

    Sensor indications during rocket engine operation are the primary means of assessing engine performance and health. Effective selection and location of sensors in the operating engine environment enables accurate real-time condition monitoring and rapid engine controller response to mitigate critical fault conditions. These capabilities are crucial to ensure crew safety and mission success. Effective sensor selection also facilitates postflight condition assessment, which contributes to efficient engine maintenance and reduced operating costs. Under the Next Generation Launch Technology program, the NASA Glenn Research Center, in partnership with Rocketdyne Propulsion and Power, has developed a model-based procedure for systematically selecting an optimal sensor suite for assessing rocket engine system health. This optimization process is termed the systematic sensor selection strategy. Engine health management (EHM) systems generally employ multiple diagnostic procedures, including data validation, anomaly detection, fault isolation, and information fusion. The effectiveness of each diagnostic component is affected by the quality, availability, and compatibility of sensor data. Therefore, systematic sensor selection is an enabling technology for EHM. Information in three categories is required by the systematic sensor selection strategy. The first category consists of targeted engine fault information, including the description and estimated risk-reduction factor for each identified fault. Risk-reduction factors are used to define and rank the potential merit of timely fault diagnoses. The second category is composed of candidate sensor information, including type, location, and estimated variance in normal operation. The final category includes the definition of fault scenarios characteristic of each targeted engine fault. These scenarios are defined in terms of engine model hardware parameters. Values of these parameters define engine simulations that generate expected sensor values for targeted fault scenarios. Taken together, this information provides an efficient condensation of the engineering experience and engine flow physics needed for sensor selection. The systematic sensor selection strategy is composed of three primary algorithms. The core of the selection process is a genetic algorithm that iteratively improves a defined quality measure of selected sensor suites. A merit algorithm is employed to compute the quality measure for each test sensor suite presented by the selection process. The quality measure is based on the fidelity of fault detection and the level of fault source discrimination provided by the test sensor suite. An inverse engine model, whose function is to derive hardware performance parameters from sensor data, is an integral part of the merit algorithm. The final component is a statistical evaluation algorithm that characterizes the impact of interference effects, such as control-induced sensor variation and sensor noise, on the probability of fault detection and isolation for optimal and near-optimal sensor suites.

  12. A Multivariate Quality Loss Function Approach for Optimization of Spinning Processes

    NASA Astrophysics Data System (ADS)

    Chakraborty, Shankar; Mitra, Ankan

    2018-05-01

    Recent advancements in the textile industry have given rise to several spinning techniques, such as ring spinning, rotor spinning, etc., which can be used to produce a wide variety of textile apparel so as to fulfil the end requirements of the customers. To achieve the best out of these processes, they should be utilized at their optimal parametric settings. However, in the presence of multiple yarn characteristics, which are often conflicting in nature, it becomes a challenging task for spinning industry personnel to identify the best parametric mix that would simultaneously optimize all the responses. Hence, in this paper, the applicability of a new systematic approach in the form of the multivariate quality loss function technique is explored for optimizing multiple quality characteristics of yarns while identifying the ideal settings of two spinning processes. It is observed that this approach performs well against other multi-objective optimization techniques, such as the desirability function, distance function and mean squared error methods. With slight modifications in the upper and lower specification limits of the considered quality characteristics, and in the constraints of the non-linear optimization problem, it can be successfully applied to other processes in the textile industry to determine their optimal parametric settings.
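
    A hedged sketch of a multivariate (Taguchi-style) quality loss: normalize each yarn response by its specification range, sum the weighted squared deviations from target, and minimize the total loss over the process parameters using fitted response models. The response models, weights, targets, and limits below are synthetic stand-ins, not the paper's data.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def responses(p):
        """Hypothetical fitted models: yarn strength and unevenness (U%) as quadratic
        functions of two process parameters (twist factor, spindle speed)."""
        twist, speed = p
        strength = 14 + 2.0 * twist - 0.12 * twist**2 + 0.01 * speed      # cN/tex
        unevenness = 11 - 0.4 * twist + 0.02 * twist**2 + 0.004 * speed   # U%
        return np.array([strength, unevenness])

    targets = np.array([16.0, 10.0])          # desired values of the two characteristics
    spec_range = np.array([4.0, 3.0])         # USL - LSL used to normalize each response
    weights = np.array([0.6, 0.4])            # relative importance (assumed)

    def multivariate_loss(p):
        """Weighted sum of squared, range-normalized deviations from target."""
        dev = (responses(p) - targets) / spec_range
        return float(weights @ dev**2)

    res = minimize(multivariate_loss, x0=[8.0, 80.0],
                   bounds=[(6.0, 12.0), (60.0, 100.0)])
    print("optimal (twist, speed):", np.round(res.x, 2), "loss:", round(res.fun, 4))
    print("predicted responses   :", np.round(responses(res.x), 2))
    ```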

  13. Task-based data-acquisition optimization for sparse image reconstruction systems

    NASA Astrophysics Data System (ADS)

    Chen, Yujia; Lou, Yang; Kupinski, Matthew A.; Anastasio, Mark A.

    2017-03-01

    Conventional wisdom dictates that imaging hardware should be optimized by use of an ideal observer (IO) that exploits full statistical knowledge of the class of objects to be imaged, without consideration of the reconstruction method to be employed. However, accurate and tractable models of the complete object statistics are often difficult to determine in practice. Moreover, in imaging systems that employ compressive sensing concepts, imaging hardware and (sparse) image reconstruction are innately coupled technologies. We have previously proposed a sparsity-driven ideal observer (SDIO) that can be employed to optimize hardware by use of a stochastic object model that describes object sparsity. The SDIO and sparse reconstruction method can therefore be "matched" in the sense that they both utilize the same statistical information regarding the class of objects to be imaged. To efficiently compute SDIO performance, the posterior distribution is estimated by use of computational tools developed recently for variational Bayesian inference. Subsequently, the SDIO test statistic can be computed semi-analytically. The advantages of employing the SDIO instead of a Hotelling observer are systematically demonstrated in case studies in which magnetic resonance imaging (MRI) data acquisition schemes are optimized for signal detection tasks.

  14. Wavefront-guided versus wavefront-optimized laser in situ keratomileusis: contralateral comparative study.

    PubMed

    Padmanabhan, Prema; Mrochen, Michael; Basuthkar, Subam; Viswanathan, Deepa; Joseph, Roy

    2008-03-01

    To compare the outcomes of wavefront-guided and wavefront-optimized treatment in fellow eyes of patients having laser in situ keratomileusis (LASIK) for myopia. Medical and Vision Research Foundation, Tamil Nadu, India. This prospective comparative study comprised 27 patients who had wavefront-guided LASIK in 1 eye and wavefront-optimized LASIK in the fellow eye. The Hansatome (Bausch & Lomb) was used to create a superior-hinged flap and the Allegretto laser (WaveLight Laser Technologie AG), for photoablation. The Allegretto wave analyzer was used to measure ocular wavefront aberrations and the Functional Acuity Contrast Test chart, to measure contrast sensitivity before and 1 month after LASIK. The refractive and visual outcomes and the changes in aberrations and contrast sensitivity were compared between the 2 treatment modalities. One month postoperatively, 92% of eyes in the wavefront-guided group and 85% in the wavefront-optimized group had uncorrected visual acuity of 20/20 or better; 93% and 89%, respectively, had a postoperative spherical equivalent refraction of +/-0.50 diopter. The differences between groups were not statistically significant. Wavefront-guided LASIK induced less change in 18 of 22 higher-order Zernike terms than wavefront-optimized LASIK, with the change in positive spherical aberration the only statistically significant one (P= .01). Contrast sensitivity improved at the low and middle spatial frequencies (not statistically significant) and worsened significantly at high spatial frequencies after wavefront-guided LASIK; there was a statistically significant worsening at all spatial frequencies after wavefront-optimized LASIK. Although both wavefront-guided and wavefront-optimized LASIK gave excellent refractive correction results, the former induced less higher-order aberrations and was associated with better contrast sensitivity.

  15. Hydropower Optimization Using Artificial Neural Network Surrogate Models of a High-Fidelity Hydrodynamics and Water Quality Model

    NASA Astrophysics Data System (ADS)

    Shaw, Amelia R.; Smith Sawyer, Heather; LeBoeuf, Eugene J.; McDonald, Mark P.; Hadjerioua, Boualem

    2017-11-01

    Hydropower operations optimization subject to environmental constraints is limited by challenges associated with dimensionality and spatial and temporal resolution. The need for high-fidelity hydrodynamic and water quality models within optimization schemes is driven by improved computational capabilities, increased requirements to meet specific points of compliance with greater resolution, and the need to optimize operations of not just single reservoirs but systems of reservoirs. This study describes an important advancement for computing hourly power generation schemes for a hydropower reservoir using high-fidelity models, surrogate modeling techniques, and optimization methods. The predictive power of the high-fidelity hydrodynamic and water quality model CE-QUAL-W2 is successfully emulated by an artificial neural network, then integrated into a genetic algorithm optimization approach to maximize hydropower generation subject to constraints on dam operations and water quality. This methodology is applied to a multipurpose reservoir near Nashville, Tennessee, USA. The model successfully reproduced high-fidelity reservoir information while enabling 6.8% and 6.6% increases in hydropower production value relative to actual operations for dissolved oxygen (DO) limits of 5 and 6 mg/L, respectively, while witnessing an expected decrease in power generation at more restrictive DO constraints. Exploration of simultaneous temperature and DO constraints revealed capability to address multiple water quality constraints at specified locations. The reduced computational requirements of the new modeling approach demonstrated an ability to provide decision support for reservoir operations scheduling while maintaining high-fidelity hydrodynamic and water quality information as part of the optimization decision support routines.
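
    A hedged sketch of the surrogate-plus-optimizer pattern described above, with stand-ins where the study's components are unavailable: a scikit-learn neural network emulates a synthetic water quality function in place of CE-QUAL-W2, and scipy's differential evolution serves as an evolutionary stand-in for the genetic algorithm, with the dissolved oxygen (DO) limit handled as a penalty. All functions, prices, and limits are illustrative assumptions.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from scipy.optimize import differential_evolution

    rng = np.random.default_rng(4)

    def highfi_do(release):
        """Synthetic stand-in for the high-fidelity model: downstream DO (mg/L) as a
        function of 24 hourly release fractions. In practice these samples would come
        from CE-QUAL-W2 runs, too slow to call inside the optimizer directly."""
        return 7.5 - 3.0 * release.mean() - 1.0 * release[8:20].mean() + 0.3 * np.sin(release.sum())

    X = rng.uniform(0.0, 1.0, size=(400, 24))               # sampled hourly release schedules
    y = np.array([highfi_do(x) for x in X])

    surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=3000, random_state=0)
    surrogate.fit(X, y)                                      # ANN emulator of the water quality model

    price = 1.0 + 0.5 * np.sin(np.linspace(0, 2 * np.pi, 24))  # toy hourly value of generation
    do_limit = 5.0                                              # DO constraint (mg/L)

    def neg_value(release):
        """Negative revenue plus a penalty whenever the surrogate predicts a DO violation."""
        revenue = float(price @ release)
        do_pred = float(surrogate.predict(release.reshape(1, -1))[0])
        penalty = 100.0 * max(0.0, do_limit - do_pred) ** 2
        return -revenue + penalty

    result = differential_evolution(neg_value, bounds=[(0.0, 1.0)] * 24, seed=0, maxiter=40)
    best = result.x
    print("optimized daily revenue:", round(float(price @ best), 2))
    print("surrogate DO at optimum:", round(float(surrogate.predict(best.reshape(1, -1))[0]), 2))
    ```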

  16. Hydropower Optimization Using Artificial Neural Network Surrogate Models of a High-Fidelity Hydrodynamics and Water Quality Model

    DOE PAGES

    Shaw, Amelia R.; Sawyer, Heather Smith; LeBoeuf, Eugene J.; ...

    2017-10-24

    Hydropower operations optimization subject to environmental constraints is limited by challenges associated with dimensionality and spatial and temporal resolution. The need for high-fidelity hydrodynamic and water quality models within optimization schemes is driven by improved computational capabilities, increased requirements to meet specific points of compliance with greater resolution, and the need to optimize operations of not just single reservoirs but systems of reservoirs. This study describes an important advancement for computing hourly power generation schemes for a hydropower reservoir using high-fidelity models, surrogate modeling techniques, and optimization methods. The predictive power of the high-fidelity hydrodynamic and water quality model CE-QUAL-W2 is successfully emulated by an artificial neural network, then integrated into a genetic algorithm optimization approach to maximize hydropower generation subject to constraints on dam operations and water quality. This methodology is applied to a multipurpose reservoir near Nashville, Tennessee, USA. The model successfully reproduced high-fidelity reservoir information while enabling 6.8% and 6.6% increases in hydropower production value relative to actual operations for dissolved oxygen (DO) limits of 5 and 6 mg/L, respectively, while witnessing an expected decrease in power generation at more restrictive DO constraints. Exploration of simultaneous temperature and DO constraints revealed capability to address multiple water quality constraints at specified locations. Here, the reduced computational requirements of the new modeling approach demonstrated an ability to provide decision support for reservoir operations scheduling while maintaining high-fidelity hydrodynamic and water quality information as part of the optimization decision support routines.

  17. Hydropower Optimization Using Artificial Neural Network Surrogate Models of a High-Fidelity Hydrodynamics and Water Quality Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shaw, Amelia R.; Sawyer, Heather Smith; LeBoeuf, Eugene J.

    Hydropower operations optimization subject to environmental constraints is limited by challenges associated with dimensionality and spatial and temporal resolution. The need for high-fidelity hydrodynamic and water quality models within optimization schemes is driven by improved computational capabilities, increased requirements to meet specific points of compliance with greater resolution, and the need to optimize operations of not just single reservoirs but systems of reservoirs. This study describes an important advancement for computing hourly power generation schemes for a hydropower reservoir using high-fidelity models, surrogate modeling techniques, and optimization methods. The predictive power of the high-fidelity hydrodynamic and water quality model CE-QUAL-W2 is successfully emulated by an artificial neural network, then integrated into a genetic algorithm optimization approach to maximize hydropower generation subject to constraints on dam operations and water quality. This methodology is applied to a multipurpose reservoir near Nashville, Tennessee, USA. The model successfully reproduced high-fidelity reservoir information while enabling 6.8% and 6.6% increases in hydropower production value relative to actual operations for dissolved oxygen (DO) limits of 5 and 6 mg/L, respectively, while witnessing an expected decrease in power generation at more restrictive DO constraints. Exploration of simultaneous temperature and DO constraints revealed capability to address multiple water quality constraints at specified locations. Here, the reduced computational requirements of the new modeling approach demonstrated an ability to provide decision support for reservoir operations scheduling while maintaining high-fidelity hydrodynamic and water quality information as part of the optimization decision support routines.
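
    A minimal Python sketch of the surrogate-assisted optimization idea described above: an artificial neural network emulates the simulator, and a genetic-algorithm-style loop searches release schedules under a dissolved oxygen (DO) constraint. The surrogate training data, schedule length, penalty handling, and all numbers are illustrative stand-ins, not the study's CE-QUAL-W2 setup.

        # Sketch only: synthetic data stand in for CE-QUAL-W2 runs.
        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(0)
        H = 24                                    # hourly release decisions for one day
        X = rng.uniform(0.0, 1.0, size=(500, H))  # normalized turbine releases
        power = X.sum(axis=1)                     # toy stand-in for simulated power
        min_do = 8.0 - 3.0 * X.mean(axis=1)       # toy stand-in for simulated minimum DO (mg/L)
        surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000,
                                 random_state=0).fit(X, np.c_[power, min_do])

        def fitness(pop, do_limit=5.0, penalty=100.0):
            pred = surrogate.predict(pop)
            p, do = pred[:, 0], pred[:, 1]
            return p - penalty * np.maximum(0.0, do_limit - do)   # penalize DO violations

        pop = rng.uniform(0.0, 1.0, size=(60, H))
        for gen in range(100):                    # simple mutation-and-selection loop (crossover omitted)
            f = fitness(pop)
            parents = pop[np.argsort(f)[-30:]]                    # truncation selection
            kids = parents[rng.integers(0, 30, 60)] + rng.normal(0, 0.05, (60, H))
            pop = np.clip(kids, 0.0, 1.0)
        best = pop[np.argmax(fitness(pop))]
        print("best predicted power:", surrogate.predict(best[None])[0, 0])

    In the study itself the surrogate is trained on hydrodynamic and water quality model output and the optimizer is a full genetic algorithm; the sketch only shows how an emulator replaces the expensive simulator inside the fitness evaluation.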

  18. In vivo evaluation of the effect of stimulus distribution on FIR statistical efficiency in event-related fMRI

    PubMed Central

    Jansma, J Martijn; de Zwart, Jacco A; van Gelderen, Peter; Duyn, Jeff H; Drevets, Wayne C; Furey, Maura L

    2013-01-01

    Technical developments in MRI have improved signal-to-noise ratios, allowing the use of analysis methods such as finite impulse response (FIR) analysis of rapid event-related functional MRI (er-fMRI). FIR is one of the most informative analysis methods, as it determines the onset and full shape of the hemodynamic response function (HRF) without any a priori assumptions. FIR is, however, vulnerable to multicollinearity, which is directly related to the distribution of stimuli over time. Efficiency can be optimized by simplifying a design and restricting the stimulus distribution to specific sequences, while greater design flexibility necessarily reduces efficiency. However, the actual effect of efficiency on fMRI results has never been tested in vivo. Thus, it is currently difficult to make an informed choice between protocol flexibility and statistical efficiency. The main goal of this study was to assign concrete fMRI signal-to-noise values to the abstract scale of FIR statistical efficiency. Ten subjects repeated a perception task with five random and m-sequence-based protocols with varying but, according to the literature, acceptable levels of multicollinearity. Results indicated substantial differences in signal standard deviation, with the level a function of multicollinearity. Experimental protocols varied by up to 55.4% in standard deviation. The results confirm that the quality of fMRI in an FIR analysis can vary significantly and substantially with statistical efficiency. Our in vivo measurements can be used to aid in making an informed decision between freedom in protocol design and statistical efficiency. PMID:23473798
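
    The trade-off the abstract describes can be made concrete by computing the classic estimation efficiency of an FIR design matrix, 1 / trace((X'X)^-1), for different stimulus distributions. The sketch below is illustrative only; the onsets, TR grid, and FIR window are invented and are not the protocols used in the study.

        import numpy as np

        def fir_design(onsets_tr, n_scans, fir_len):
            """Binary FIR design matrix: one column per post-stimulus lag (in TRs)."""
            X = np.zeros((n_scans, fir_len))
            for t in onsets_tr:
                for lag in range(fir_len):
                    if t + lag < n_scans:
                        X[t + lag, lag] = 1.0
            return X

        def efficiency(X):
            """Estimation efficiency: higher means less multicollinearity."""
            return 1.0 / np.trace(np.linalg.inv(X.T @ X))

        rng = np.random.default_rng(1)
        n_scans, fir_len, n_events = 400, 12, 60
        rand_onsets = np.sort(rng.choice(np.arange(n_scans - fir_len), n_events, replace=False))
        fixed_onsets = np.arange(0, n_events * 6, 6)          # evenly spaced every 6 TRs
        for name, onsets in [("random", rand_onsets), ("fixed ISI", fixed_onsets)]:
            print(name, "efficiency:", round(efficiency(fir_design(onsets, n_scans, fir_len)), 3))

    The evenly spaced design is nearly collinear across FIR lags and scores far lower; differences of this kind are what the in vivo measurements translate into signal standard deviation.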

  19. Statistical iterative material image reconstruction for spectral CT using a semi-empirical forward model

    NASA Astrophysics Data System (ADS)

    Mechlem, Korbinian; Ehn, Sebastian; Sellerer, Thorsten; Pfeiffer, Franz; Noël, Peter B.

    2017-03-01

    In spectral computed tomography (spectral CT), the additional information about the energy dependence of attenuation coefficients can be exploited to generate material selective images. These images have found applications in various areas such as artifact reduction, quantitative imaging or clinical diagnosis. However, significant noise amplification on material decomposed images remains a fundamental problem of spectral CT. Most spectral CT algorithms separate the process of material decomposition and image reconstruction. Separating these steps is suboptimal because the full statistical information contained in the spectral tomographic measurements cannot be exploited. Statistical iterative reconstruction (SIR) techniques provide an alternative, mathematically elegant approach to obtaining material selective images with improved tradeoffs between noise and resolution. Furthermore, image reconstruction and material decomposition can be performed jointly. This is accomplished by a forward model which directly connects the (expected) spectral projection measurements and the material selective images. To obtain this forward model, detailed knowledge of the different photon energy spectra and the detector response was assumed in previous work. However, accurately determining the spectrum is often difficult in practice. In this work, a new algorithm for statistical iterative material decomposition is presented. It uses a semi-empirical forward model which relies on simple calibration measurements. Furthermore, an efficient optimization algorithm based on separable surrogate functions is employed. This partially negates one of the major shortcomings of SIR, namely high computational cost and long reconstruction times. Numerical simulations and real experiments show strongly improved image quality and reduced statistical bias compared to projection-based material decomposition.

  20. Fabrication and optimization of a conducting polymer sensor array using stored grain model volatiles.

    PubMed

    Hossain, Md Eftekhar; Rahman, G M Aminur; Freund, Michael S; Jayas, Digvir S; White, Noel D G; Shafai, Cyrus; Thomson, Douglas J

    2012-03-21

    During storage, grain can experience significant degradation in quality due to a variety of physical, chemical, and biological interactions. Most commonly, these losses are associated with insects or fungi. Continuous monitoring and an ability to differentiate between sources of spoilage are critical for rapid and effective intervention to minimize deterioration or losses. Therefore, there is keen interest in developing a straightforward, cost-effective, and efficient method for monitoring stored grain. Sensor arrays are currently used for classifying liquors, perfumes, and the quality of food products by mimicking the mammalian olfactory system. The use of this technology for monitoring stored grain and identifying the source of spoilage is a new application with the potential for broad impact. The main focus of the work described herein is on the fabrication and optimization of a carbon black (CB) polymer sensor array to monitor stored-grain model volatiles associated with insect secretions (benzene derivatives) and fungi (aliphatic hydrocarbon derivatives). Various methods of statistical analysis (RSD, PCA, LDA, t test) were used to select polymers for the array that were optimal for distinguishing between important compound classes (quinones, alcohols) and to minimize sensitivity to other parameters such as humidity. The performance of the developed sensor array was sufficient to demonstrate identification and separation of stored-grain model volatiles under ambient conditions.
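
    A small sketch of the statistical screening step mentioned above (PCA for unsupervised structure, LDA for class separation, relative standard deviation per sensor); the simulated eight-sensor responses and the two compound classes are invented, not the measured CB-polymer data.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        rng = np.random.default_rng(2)
        n_per_class, n_sensors = 30, 8
        quinones = rng.normal(1.0, 0.2, (n_per_class, n_sensors)) * np.linspace(1, 2, n_sensors)
        alcohols = rng.normal(1.0, 0.2, (n_per_class, n_sensors)) * np.linspace(2, 1, n_sensors)
        X = np.vstack([quinones, alcohols])
        y = np.array([0] * n_per_class + [1] * n_per_class)

        pca = PCA(n_components=2).fit(X)
        lda = LinearDiscriminantAnalysis().fit(X, y)
        print("variance explained by 2 PCs:", round(pca.explained_variance_ratio_.sum(), 2))
        print("LDA training accuracy:", lda.score(X, y))
        print("per-sensor RSD (%):", np.round(100 * X.std(axis=0) / X.mean(axis=0), 1))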

  1. Additive Manufacturing of Single-Crystal Superalloy CMSX-4 Through Scanning Laser Epitaxy: Computational Modeling, Experimental Process Development, and Process Parameter Optimization

    NASA Astrophysics Data System (ADS)

    Basak, Amrita; Acharya, Ranadip; Das, Suman

    2016-08-01

    This paper focuses on additive manufacturing (AM) of the single-crystal (SX) nickel-based superalloy CMSX-4 through scanning laser epitaxy (SLE). SLE, a powder bed fusion-based AM process, was explored for the purpose of producing crack-free, dense deposits of CMSX-4 on top of similar-chemistry investment-cast substrates. Optical microscopy and scanning electron microscopy (SEM) investigations revealed the presence of dendritic microstructures that consisted of fine γ' precipitates within the γ matrix in the deposit region. Computational fluid dynamics (CFD)-based process modeling, statistical design of experiments (DoE), and microstructural characterization techniques were combined to produce metallurgically bonded single-crystal deposits of more than 500 μm height in a single pass along the entire length of the substrate. A customized quantitative-metallography-based image analysis technique was employed for automatic extraction of various deposit quality metrics from the digital cross-sectional micrographs. The processing parameters were varied, and optimal processing windows were identified to obtain good quality deposits. The results reported here represent one of the few successes obtained in producing single-crystal epitaxial deposits through a powder bed fusion-based metal AM process and thus demonstrate the potential of SLE to repair and manufacture single-crystal hot section components of gas turbine systems from nickel-based superalloy powders.

  2. An Innovative Time-Cost-Quality Tradeoff Modeling of Building Construction Project Based on Resource Allocation

    PubMed Central

    2014-01-01

    Time, quality, and cost are three important but conflicting objectives in a building construction project, and optimizing them jointly is a tough challenge for project managers because they are measured in different terms. This paper presents a time-cost-quality optimization model that enables managers to optimize multiple objectives. The model follows the project breakdown structure method, in which the task resources of a construction project are divided into a series of activities and further into construction labor, materials, equipment, and administration. The resources utilized in a construction activity ultimately determine its construction time, cost, and quality, and a time-cost-quality trade-off model is generated from the correlations between construction activities. A genetic algorithm is applied in the model to solve the resulting nonlinear time-cost-quality problems. Construction of a three-storey house is used as an example to illustrate the implementation of the model, demonstrate its advantages in optimizing the trade-off between construction time, cost, and quality, and help make winning decisions in construction practice. The computed time-cost-quality curves from the case study support traditional cost-time assumptions and demonstrate the capability of this time-cost-quality trade-off model. PMID:24672351

  3. SU-E-I-33: Establishment of CT Diagnostic Reference Levels in Province Nova Scotia

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tonkopi, E; Abdolell, M; Duffy, S

    2015-06-15

    Purpose: To evaluate patient radiation dose from the most frequently performed CT examinations and to establish provincial diagnostic reference levels (DRLs) as a tool for protocol optimization. Methods: The study investigated the following CT examinations: head, chest, abdomen/pelvis, and chest/abdomen/pelvis (CAP). Dose data, volume CT dose index (CTDIvol) and dose-length product (DLP), were collected from 15 CT scanners installed during 2004–2014 in 11 hospital sites of Nova Scotia. All scanners had dose modulation options and multislice capability (16–128 detector rows). The sample for each protocol included 15 average size patients (70±20 kg). Provincial DRLs were calculated as the 75th percentile of patient dose distributions. The differences in dose between hospitals were evaluated with a single factor ANOVA statistical test. Generalized linear modeling was used to determine the factors associated with higher radiation dose. A sample of 36 abdominal studies performed on three different scanners was blinded and randomized for an assessment by an experienced radiologist who graded the imaging quality of anatomic structures. Results: Data for 900 patients were collected. The DRLs were proposed using CTDIvol (mGy) and DLP (mGy*cm) values for CT head (67 and 1049, respectively), chest (12 and 393), abdomen/pelvis (16 and 717), and CAP (14 and 1034). These DRLs were lower than the published national data except for the head CTDIvol. The differences between the means of the dose distributions from each scanner were statistically significant (p<0.05) for all examinations. A very weak correlation was found between the dose and the scanner age or the number of slices with Pearson’s correlation coefficients of 0.011–0.315. The blinded analysis of image quality demonstrated no clinically significant difference except for the noise category. Conclusion: Provincial DRLs were established for typical CT examinations. The variations in dose between the hospitals suggested a large potential for optimization of examinations. Radiology Research Foundation grant.
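
    The DRL computation itself is simple: pool the per-patient dose indices for a protocol, take the 75th percentile, and test between-scanner differences. The sketch below uses synthetic CTDIvol values, not the Nova Scotia survey data.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(3)
        ctdivol_by_scanner = [rng.normal(loc, 2.0, 15) for loc in (11.0, 13.0, 12.0)]  # 15 patients/scanner
        drl = np.percentile(np.concatenate(ctdivol_by_scanner), 75)   # proposed DRL (mGy)
        f_stat, p_value = stats.f_oneway(*ctdivol_by_scanner)         # single-factor ANOVA across scanners
        print(f"DRL (CTDIvol, mGy): {drl:.1f}   ANOVA p = {p_value:.3f}")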

  4. Analyses of global sea surface temperature 1856-1991

    NASA Astrophysics Data System (ADS)

    Kaplan, Alexey; Cane, Mark A.; Kushnir, Yochanan; Clement, Amy C.; Blumenthal, M. Benno; Rajagopalan, Balaji

    1998-08-01

    Global analyses of monthly sea surface temperature (SST) anomalies from 1856 to 1991 are produced using three statistically based methods: optimal smoothing (OS), the Kalman filter (KF) and optimal interpolation (OI). Each of these is accompanied by estimates of the error covariance of the analyzed fields. The spatial covariance function these methods require is estimated from the available data; the time-marching model is a first-order autoregressive model, again estimated from data. The data input for the analyses are monthly anomalies from the United Kingdom Meteorological Office historical sea surface temperature data set (MOHSST5) [Parker et al., 1994] of the Global Ocean Surface Temperature Atlas (GOSTA) [Bottomley et al., 1990]. These analyses are compared with each other, with GOSTA, and with an analysis generated by projection (P) onto a set of empirical orthogonal functions (as in Smith et al. [1996]). In theory, the quality of the analyses should rank in the order OS, KF, OI, P, and GOSTA. It is found that the first four give comparable results in the data-rich periods (1951-1991), but at times when data are sparse the first three differ significantly from P and GOSTA. At these times the latter two often have extreme and fluctuating values, prima facie evidence of error. The statistical schemes are also verified against data not used in any of the analyses (proxy records derived from corals and air temperature records from coastal and island stations). We also present evidence that the analysis error estimates are indeed indicative of the quality of the products. At most times the OS and KF products are close to the OI product, but at times of especially poor coverage their use of information from other times is advantageous. The methods appear to reconstruct the major features of the global SST field from very sparse data. Comparison with other indications of the El Niño-Southern Oscillation cycle shows that the analyses provide usable information on interannual variability as far back as the 1860s.
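
    Of the three schemes compared, optimal interpolation is the simplest to write down: the analysis is the background plus a gain times the observation misfit. The toy grid, covariances, and observations below are illustrative, not the MOHSST5 configuration.

        import numpy as np

        n = 5                                          # grid points
        x_b = np.zeros(n)                              # background (first-guess) anomaly
        dist = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
        B = 0.5 * np.exp(-dist / 2.0)                  # background-error covariance
        H = np.array([[1, 0, 0, 0, 0],                 # observe grid points 0 and 3
                      [0, 0, 0, 1, 0]], dtype=float)
        R = 0.1 * np.eye(2)                            # observation-error covariance
        y = np.array([0.8, -0.4])                      # observed SST anomalies (K)

        K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)   # OI gain
        x_a = x_b + K @ (y - H @ x_b)                  # analysis
        P_a = (np.eye(n) - K @ H) @ B                  # analysis-error covariance
        print("analysis:", np.round(x_a, 2))
        print("analysis error std:", np.round(np.sqrt(np.diag(P_a)), 2))

    The Kalman filter adds the autoregressive time-marching of this update, and optimal smoothing additionally uses later observations; analysis-error covariances of this kind are what the verification against withheld proxy records checks.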

  5. Statistical principle and methodology in the NISAN system.

    PubMed Central

    Asano, C

    1979-01-01

    The NISAN system is a new interactive statistical analysis program package constructed by an organization of Japanese statisticians. The package is widely applicable to both statistical situations, confirmatory analysis and exploratory analysis, and is designed to capture statistical wisdom and to help senior statisticians choose an optimal process of statistical analysis. PMID:540594

  6. Statistical physics of hard combinatorial optimization: Vertex cover problem

    NASA Astrophysics Data System (ADS)

    Zhao, Jin-Hua; Zhou, Hai-Jun

    2014-07-01

    Typical-case computational complexity is a research topic at the boundary of computer science, applied mathematics, and statistical physics. In the last twenty years, the replica-symmetry-breaking mean field theory of spin glasses and the associated message-passing algorithms have greatly deepened our understanding of typical-case computational complexity. In this paper, we use the vertex cover problem, a basic nondeterministic-polynomial (NP)-complete combinatorial optimization problem of wide application, as an example to introduce the statistical physical methods and algorithms. We do not go into the technical details but emphasize mainly the intuitive physical meanings of the message-passing equations. An unfamiliar reader should be able to understand, to a large extent, the physics behind the mean field approaches and to adapt the mean field methods to solving other optimization problems.

  7. Spontaneous cortical activity reveals hallmarks of an optimal internal model of the environment.

    PubMed

    Berkes, Pietro; Orbán, Gergo; Lengyel, Máté; Fiser, József

    2011-01-07

    The brain maintains internal models of its environment to interpret sensory inputs and to prepare actions. Although behavioral studies have demonstrated that these internal models are optimally adapted to the statistics of the environment, the neural underpinning of this adaptation is unknown. Using a Bayesian model of sensory cortical processing, we related stimulus-evoked and spontaneous neural activities to inferences and prior expectations in an internal model and predicted that they should match if the model is statistically optimal. To test this prediction, we analyzed visual cortical activity of awake ferrets during development. Similarity between spontaneous and evoked activities increased with age and was specific to responses evoked by natural scenes. This demonstrates the progressive adaptation of internal models to the statistics of natural stimuli at the neural level.

  8. Parameter optimization in biased decoy-state quantum key distribution with both source errors and statistical fluctuations

    NASA Astrophysics Data System (ADS)

    Zhu, Jian-Rong; Li, Jian; Zhang, Chun-Mei; Wang, Qin

    2017-10-01

    The decoy-state method has been widely used in commercial quantum key distribution (QKD) systems. In view of practical decoy-state QKD with both source errors and statistical fluctuations, we propose a universal model of full parameter optimization in biased decoy-state QKD with phase-randomized sources. In addition, we adopt this model to carry out simulations of two widely used sources: the weak coherent source (WCS) and the heralded single-photon source (HSPS). Results show that full parameter optimization can significantly improve not only the secure transmission distance but also the final key generation rate. Moreover, when source errors and statistical fluctuations are taken into account, the performance of decoy-state QKD using an HSPS suffers less than that of decoy-state QKD using a WCS.

  9. Risk management and statistical multivariate analysis approach for design and optimization of satranidazole nanoparticles.

    PubMed

    Dhat, Shalaka; Pund, Swati; Kokare, Chandrakant; Sharma, Pankaj; Shrivastava, Birendra

    2017-01-01

    The rapidly evolving technical and regulatory landscapes of pharmaceutical product development necessitate risk management with the application of multivariate analysis using Process Analytical Technology (PAT) and Quality by Design (QbD). The poorly soluble, high-dose drug satranidazole was optimally nanoprecipitated (SAT-NP) employing principles of Formulation by Design (FbD). The potential risk factors influencing the critical quality attributes (CQA) of SAT-NP were identified using an Ishikawa diagram. A Plackett-Burman screening design was adopted to screen the eight critical formulation and process parameters influencing the mean particle size, zeta potential, and dissolution efficiency at 30 min in pH 7.4 dissolution medium. Pareto charts (individual and cumulative) revealed the three most critical factors influencing the CQAs of SAT-NP, viz. the aqueous stabilizer (polyvinyl alcohol), the release modifier (Eudragit® S 100), and the volume of the aqueous phase. The levels of these three critical formulation attributes were optimized by FbD within the established design space to minimize the mean particle size and polydispersity index and maximize the encapsulation efficiency of SAT-NP. Lenth's and Bayesian analysis, along with mathematical modeling of the results, allowed identification and quantification of the critical formulation attributes significantly active on the selected CQAs. The optimized SAT-NP exhibited a mean particle size of 216 nm, polydispersity index of 0.250, zeta potential of -3.75 mV, and encapsulation efficiency of 78.3%. The product was lyophilized using mannitol to form a readily redispersible powder. X-ray diffraction analysis confirmed the conversion of crystalline SAT to the amorphous form. In vitro release of SAT-NP in gradually pH-changing media showed <20% release at pH 1.2 and pH 6.8 within 5 h, and complete release (>95%) at pH 7.4 within the next 3 h, indicative of burst release after a lag time. This investigation demonstrated the effective application of risk management and QbD tools in developing site-specific-release SAT-NP by nanoprecipitation.
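
    A sketch of the Plackett-Burman screening step described above: build the standard 12-run, 11-column design, use 8 columns for the factors, and rank the main effects as in a Pareto chart. The response model and effect sizes are invented; only the design construction follows the standard recipe.

        import numpy as np

        gen = np.array([+1, +1, -1, +1, +1, +1, -1, -1, -1, +1, -1])   # 12-run PB generator row
        design = np.array([np.roll(gen, i) for i in range(11)])
        design = np.vstack([design, -np.ones(11)])                      # 12 runs x 11 columns
        X = design[:, :8]                                               # 8 real factors

        rng = np.random.default_rng(4)
        true_effects = np.array([5.0, 0.2, 0.1, 3.0, 0.1, 0.2, 2.0, 0.1])   # 3 active factors
        y = 50 + X @ true_effects + rng.normal(0, 0.5, 12)                  # simulated screening response

        main_effects = 2 * X.T @ y / 12       # mean(+1 runs) minus mean(-1 runs) for each factor
        for i in np.argsort(np.abs(main_effects))[::-1]:
            print(f"factor {i + 1}: effect = {main_effects[i]:+.2f}")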

  10. Automatic CT simulation optimization for radiation therapy: A general strategy.

    PubMed

    Li, Hua; Yu, Lifeng; Anastasio, Mark A; Chen, Hsin-Chen; Tan, Jun; Gay, Hiram; Michalski, Jeff M; Low, Daniel A; Mutic, Sasa

    2014-03-01

    In radiation therapy, x-ray computed tomography (CT) simulation protocol specifications should be driven by the treatment planning requirements in lieu of duplicating diagnostic CT screening protocols. The purpose of this study was to develop a general strategy that allows for automatically, prospectively, and objectively determining the optimal patient-specific CT simulation protocols based on radiation-therapy goals, namely, maintenance of contouring quality and integrity while minimizing patient CT simulation dose. The authors proposed a general prediction strategy that provides automatic optimal CT simulation protocol selection as a function of patient size and treatment planning task. The optimal protocol is the one that delivers the minimum dose required to provide a CT simulation scan that yields accurate contours. Accurate treatment plans depend on accurate contours in order to conform the dose to actual tumor and normal organ positions. An image quality index, defined to characterize how simulation scan quality affects contour delineation, was developed and used to benchmark the contouring accuracy and treatment plan quality within the prediction strategy. A clinical workflow was developed to select the optimal CT simulation protocols incorporating patient size, target delineation, and radiation dose efficiency. An experimental study using an anthropomorphic pelvis phantom with added-bolus layers was used to demonstrate how the proposed prediction strategy could be implemented and how the optimal CT simulation protocols could be selected for prostate cancer patients based on patient size and treatment planning task. Clinical IMRT prostate treatment plans for seven CT scans with varied image quality indices were separately optimized and compared to verify the trace of target and organ dosimetry coverage. Based on the phantom study, the optimal image quality index for accurate manual prostate contouring was 4.4. The optimal tube potentials for patient sizes of 38, 43, 48, 53, and 58 cm were 120, 140, 140, 140, and 140 kVp, respectively, and the corresponding minimum CTDIvol for achieving the optimal image quality index 4.4 were 9.8, 32.2, 100.9, 241.4, and 274.1 mGy, respectively. For patients with lateral sizes of 43-58 cm, 120-kVp scan protocols yielded up to 165% greater radiation dose relative to 140-kVp protocols, and 140-kVp protocols always yielded a greater image quality index compared to the same dose-level 120-kVp protocols. The trace of target and organ dosimetry coverage and the γ passing rates of seven IMRT dose distribution pairs indicated the feasibility of the proposed image quality index for the prediction strategy. A general strategy to predict the optimal CT simulation protocols in a flexible and quantitative way was developed that takes into account patient size, treatment planning task, and radiation dose. The experimental study indicated that the optimal CT simulation protocol and the corresponding radiation dose varied significantly for different patient sizes, contouring accuracy, and radiation treatment planning tasks.

  11. Feasibility Study: Colombian Caribbean Folk Dances to Increase Physical Fitness and Health-Related Quality of Life in Older Women.

    PubMed

    Pacheco, Ernesto; Hoyos, Diana P; Watt, Willinton J; Lema, Lucía; Arango, Carlos M

    2016-04-01

    The objectives of the study were to describe the feasibility of an intervention in older women based on folk dances of the Colombian Caribbean region, and to analyze the effects of the intervention on physical fitness and health-related quality of life (HRQoL). A pilot study was conducted in a sample of 27 participants, 15 in the intervention group (IG) and 12 in the comparison group (CG). Caribbean Colombian dance rhythms were introduced as an intervention that lasted 12 weeks. Recruitment and retention were not optimal. Treatment fidelity components indicated that the intervention was administered as intended. IG participants showed positive and statistically significant changes in some components of physical fitness. No significant changes were observed in HRQoL indicators for either group. In conclusion, the intervention was feasible, but recruitment and retention were challenging. Folk dances of the Colombian Caribbean region yielded significant effects on physical fitness but not on HRQoL.

  12. Study of the combined effects of ripeness and production area on Bosana oil's quality.

    PubMed

    Morrone, Lucia; Neri, Luisa; Cantini, Claudio; Alfei, Barbara; Rotondi, Annalisa

    2018-04-15

    The effects of olive ripeness, production area, and their interaction on the chemical and sensory characteristics of cv. Bosana oil were assessed. The study was carried out in three areas of the Sassari province, Sardinia (Italy), at three stages of maturation. The results indicated the independence of the two factors: ripeness influenced saturated fatty acids, pigment content, and deacetoxy oleuropein aglycone (DAOA) content and did not affect the sensory characteristics, while production area influenced unsaturated fatty acids, vanillic acid content, and some sensory characters. To verify the interdependence of the two factors, statistical analyses (two-way ANOVA) were performed. Our study showed that thoughtful planning of harvest times and production areas would make it possible to obtain Bosana virgin olive oil of the highest quality. Furthermore, by utilizing cultivars whose oils maintain their properties even at late harvest dates, it would be possible to optimize harvest times.
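
    The interaction test described above is a standard two-way ANOVA; a minimal sketch with synthetic oil-composition data follows (the factor levels and response values are invented, not the Bosana measurements).

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm
        from statsmodels.formula.api import ols

        rng = np.random.default_rng(5)
        ripeness = np.repeat(["early", "mid", "late"], 9)
        area = np.tile(np.repeat(["A", "B", "C"], 3), 3)
        oleic = 70 + (ripeness == "late") * 1.5 + (area == "B") * 2.0 + rng.normal(0, 0.8, 27)
        df = pd.DataFrame({"ripeness": ripeness, "area": area, "oleic": oleic})

        model = ols("oleic ~ C(ripeness) * C(area)", data=df).fit()
        print(sm.stats.anova_lm(model, typ=2))   # main effects and interaction term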

  13. Multiscale analysis of replication technique efficiency for 3D roughness characterization of manufactured surfaces

    NASA Astrophysics Data System (ADS)

    Jolivet, S.; Mezghani, S.; El Mansori, M.

    2016-09-01

    The replication of topography has been generally restricted to optimizing material processing technologies in terms of statistical and single-scale features such as roughness. By contrast, manufactured surface topography is highly complex, irregular, and multiscale. In this work, we have demonstrated the use of multiscale analysis on replicates of surface finish to assess the precise control of the finished replica. Five commercial resins used for surface replication were compared. The topography of five standard surfaces representative of common finishing processes were acquired both directly and by a replication technique. Then, they were characterized using the ISO 25178 standard and multiscale decomposition based on a continuous wavelet transform, to compare the roughness transfer quality at different scales. Additionally, atomic force microscope force modulation mode was used in order to compare the resins’ stiffness properties. The results showed that less stiff resins are able to replicate the surface finish along a larger wavelength band. The method was then tested for non-destructive quality control of automotive gear tooth surfaces.

  14. Design of Clinical Support Systems Using Integrated Genetic Algorithm and Support Vector Machine

    NASA Astrophysics Data System (ADS)

    Chen, Yung-Fu; Huang, Yung-Fa; Jiang, Xiaoyi; Hsu, Yuan-Nian; Lin, Hsuan-Hung

    A clinical decision support system (CDSS) provides knowledge and specific information for clinicians to enhance diagnostic efficiency and improve healthcare quality. An appropriate CDSS can greatly improve patient safety, improve healthcare quality, and increase cost-effectiveness. The support vector machine (SVM) is believed to be superior to traditional statistical and neural network classifiers. However, determining a suitable combination of SVM parameters is critical to classification performance. A genetic algorithm (GA) can find an optimal solution within an acceptable time and is faster than a greedy algorithm with an exhaustive search strategy. By taking advantage of the GA's ability to quickly select salient features and adjust SVM parameters, a method using an integrated GA and SVM (IGS), which differs from the traditional method in which the GA is used for feature selection and the SVM for classification, was used to design CDSSs for prediction of successful ventilation weaning, diagnosis of patients with severe obstructive sleep apnea, and discrimination of different cell types from Pap smears. The results show that IGS is better than methods using the SVM alone or a linear discriminator.
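
    A hedged sketch of the integrated GA + SVM idea: each chromosome encodes a feature mask together with (C, gamma), and the GA's fitness is cross-validated SVM accuracy. The dataset, encoding, and GA settings below are illustrative stand-ins, not the clinical data or configuration used in the paper.

        import numpy as np
        from sklearn.datasets import load_breast_cancer
        from sklearn.model_selection import cross_val_score
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        X, y = load_breast_cancer(return_X_y=True)
        rng = np.random.default_rng(6)
        n_feat = X.shape[1]

        def decode(ch):
            mask = ch[:n_feat] > 0.5
            C = 10 ** (ch[n_feat] * 4 - 2)           # C in [1e-2, 1e2]
            gamma = 10 ** (ch[n_feat + 1] * 4 - 4)   # gamma in [1e-4, 1]
            return mask, C, gamma

        def fitness(ch):
            mask, C, gamma = decode(ch)
            if mask.sum() == 0:
                return 0.0
            clf = make_pipeline(StandardScaler(), SVC(C=C, gamma=gamma))
            return cross_val_score(clf, X[:, mask], y, cv=3).mean()

        pop = rng.uniform(0, 1, (20, n_feat + 2))
        for gen in range(15):                                    # small GA budget
            scores = np.array([fitness(ch) for ch in pop])
            parents = pop[np.argsort(scores)[-10:]]
            children = []
            for _ in range(20):
                a, b = parents[rng.integers(0, 10, 2)]
                cut = rng.integers(1, n_feat + 1)
                child = np.concatenate([a[:cut], b[cut:]])       # one-point crossover
                mut = rng.uniform(size=child.size) < 0.2
                child = np.clip(child + mut * rng.normal(0, 0.1, child.size), 0, 1)
                children.append(child)
            pop = np.array(children)

        best = pop[np.argmax([fitness(ch) for ch in pop])]
        mask, C, gamma = decode(best)
        print(f"selected {mask.sum()} features, C = {C:.3g}, gamma = {gamma:.3g}")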

  15. Pulmonary Infiltrates in Immunosuppressed Patients: Analysis of a Diagnostic Protocol

    PubMed Central

    Danés, Cristina; González-Martín, Julián; Pumarola, Tomàs; Rañó, Ana; Benito, Natividad; Torres, Antoni; Moreno, Asunción; Rovira, Montserrat; Puig de la Bellacasa, Jorge

    2002-01-01

    A diagnostic protocol was started to study the etiology of pulmonary infiltrates in immunosuppressed patients. The diagnostic yields of the different techniques were analyzed, with special emphasis on the importance of the sample quality and the role of rapid techniques in the diagnostic strategy. In total, 241 patients with newly developed pulmonary infiltrates within a period of 19 months were included. Noninvasive or invasive evaluation was performed according to the characteristics of the infiltrates. Diagnosis was achieved in 202 patients (84%); 173 patients (72%) had pneumonia, and specific etiologic agents were found in 114 (66%). Bronchoaspirate and bronchoalveolar lavage showed the highest yields, either on global analysis (23 of 35 specimens [66%] and 70 of 134 specimens [52%], respectively) or on analysis of each type of pneumonia. A tendency toward better results with optimal-quality samples was observed, and a statistically significant difference was found in sputum bacterial culture. Rapid diagnostic tests yielded results in 71 of 114 (62.2%) diagnoses of etiological pneumonia. PMID:12037077

  16. Phenex: ontological annotation of phenotypic diversity.

    PubMed

    Balhoff, James P; Dahdul, Wasila M; Kothari, Cartik R; Lapp, Hilmar; Lundberg, John G; Mabee, Paula; Midford, Peter E; Westerfield, Monte; Vision, Todd J

    2010-05-05

    Phenotypic differences among species have long been systematically itemized and described by biologists in the process of investigating phylogenetic relationships and trait evolution. Traditionally, these descriptions have been expressed in natural language within the context of individual journal publications or monographs. As such, this rich store of phenotype data has been largely unavailable for statistical and computational comparisons across studies or integration with other biological knowledge. Here we describe Phenex, a platform-independent desktop application designed to facilitate efficient and consistent annotation of phenotypic similarities and differences using Entity-Quality syntax, drawing on terms from community ontologies for anatomical entities, phenotypic qualities, and taxonomic names. Phenex can be configured to load only those ontologies pertinent to a taxonomic group of interest. The graphical user interface was optimized for evolutionary biologists accustomed to working with lists of taxa, characters, character states, and character-by-taxon matrices. Annotation of phenotypic data using ontologies and globally unique taxonomic identifiers will allow biologists to integrate phenotypic data from different organisms and studies, leveraging decades of work in systematics and comparative morphology.

  17. Optimal spatio-temporal design of water quality monitoring networks for reservoirs: Application of the concept of value of information

    NASA Astrophysics Data System (ADS)

    Maymandi, Nahal; Kerachian, Reza; Nikoo, Mohammad Reza

    2018-03-01

    This paper presents a new methodology for optimizing Water Quality Monitoring (WQM) networks of reservoirs and lakes using the concept of the value of information (VOI) and utilizing results of a calibrated numerical water quality simulation model. In the value of information framework, the water quality at every checkpoint has a specific prior probability that varies in time. After analyzing water quality samples taken from potential monitoring points, the posterior probabilities are updated using Bayes' theorem, and the VOI of the samples is calculated. In the next step, the stations with the maximum VOI are selected as optimal stations. This process is repeated for each sampling interval to obtain optimal monitoring network locations for each interval. The results of the proposed VOI-based methodology are compared with those obtained using an entropy-theoretic approach. Because the results of the two methodologies are partially different, they are then combined using a weighting method. Finally, the optimal sampling interval and locations of WQM stations are chosen using the Evidential Reasoning (ER) decision-making method. The efficiency and applicability of the methodology are evaluated using available water quantity and quality data for the Karkheh Reservoir in southwestern Iran.
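
    The Bayesian core of a VOI calculation for one candidate station can be written compactly: update the violation probability with Bayes' theorem for each possible reading, and take VOI as the expected reduction in decision loss. The priors, likelihoods, and losses below are invented for illustration and are not the values used in the study.

        import numpy as np

        prior = np.array([0.7, 0.3])                 # P(compliant), P(violation)
        likelihood = np.array([[0.9, 0.1],           # P(reading | state); cols: "clean", "exceeds limit"
                               [0.2, 0.8]])
        loss = np.array([[0.0, 10.0],                # loss[action, state]; action 0 = do nothing
                         [2.0,  0.0]])               # action 1 = intervene

        def expected_loss(p_state):
            return (loss * p_state).sum(axis=1)      # expected loss of each action

        prior_loss = expected_loss(prior).min()      # best action without sampling
        p_reading = prior @ likelihood               # marginal probability of each reading
        post_loss = 0.0
        for r in range(2):
            posterior = prior * likelihood[:, r] / p_reading[r]   # Bayes' theorem
            post_loss += p_reading[r] * expected_loss(posterior).min()

        print("value of information of this station:", round(prior_loss - post_loss, 2))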

  18. TH-B-207B-00: Pediatric Image Quality Optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    This imaging educational program will focus on solutions to common pediatric image quality optimization challenges. The speakers will present collective knowledge on best practices in pediatric imaging from their experience at dedicated children’s hospitals. One of the most commonly encountered pediatric imaging requirements for the non-specialist hospital is pediatric CT in the emergency room setting. Thus, this educational program will begin with optimization of pediatric CT in the emergency department. Though pediatric cardiovascular MRI may be less common in the non-specialist hospitals, low pediatric volumes and unique cardiovascular anatomy make optimization of these techniques difficult. Therefore, our second speaker will review best practices in pediatric cardiovascular MRI based on experiences from a children’s hospital with a large volume of cardiac patients. Learning Objectives: To learn techniques for optimizing radiation dose and image quality for CT of children in the emergency room setting. To learn solutions for consistently high quality cardiovascular MRI of children.

  19. An Optimization Principle for Deriving Nonequilibrium Statistical Models of Hamiltonian Dynamics

    NASA Astrophysics Data System (ADS)

    Turkington, Bruce

    2013-08-01

    A general method for deriving closed reduced models of Hamiltonian dynamical systems is developed using techniques from optimization and statistical estimation. Given a vector of resolved variables, selected to describe the macroscopic state of the system, a family of quasi-equilibrium probability densities on phase space corresponding to the resolved variables is employed as a statistical model, and the evolution of the mean resolved vector is estimated by optimizing over paths of these densities. Specifically, a cost function is constructed to quantify the lack-of-fit to the microscopic dynamics of any feasible path of densities from the statistical model; it is an ensemble-averaged, weighted, squared-norm of the residual that results from submitting the path of densities to the Liouville equation. The path that minimizes the time integral of the cost function determines the best-fit evolution of the mean resolved vector. The closed reduced equations satisfied by the optimal path are derived by Hamilton-Jacobi theory. When expressed in terms of the macroscopic variables, these equations have the generic structure of governing equations for nonequilibrium thermodynamics. In particular, the value function for the optimization principle coincides with the dissipation potential that defines the relation between thermodynamic forces and fluxes. The adjustable closure parameters in the best-fit reduced equations depend explicitly on the arbitrary weights that enter into the lack-of-fit cost function. Two particular model reductions are outlined to illustrate the general method. In each example the set of weights in the optimization principle contracts into a single effective closure parameter.

  20. [Quality analysis of the statistical resources used (material and methods section) in thesis projects of a university department].

    PubMed

    Regojo Zapata, O; Lamata Hernández, F; Sánchez Zalabardo, J M; Elizalde Benito, A; Navarro Gil, J; Valdivia Uría, J G

    2004-09-01

    Studies of the quality of thesis and research projects in the biomedical sciences are unusual but very important in university teaching, because the quality of thesis preparation needs to improve. The objectives of the study were to determine the quality of the thesis projects in our department, according to their fulfillment of scientific methodology, and to establish whether a relation exists between the global quality of a project and the statistical resources used. This was a descriptive study of 273 thesis projects performed between 1995 and 2002 in the surgery department of the University of Zaragoza. The review was performed by 15 observers who analyzed 28 indicators for every project. By assigning a value to each of the indicators, the projects were scored on a scale from 1 to 10 according to their fulfillment of scientific methodology. The mean project quality score was 5.53 (SD: 1.77). In 13.9% of cases the thesis project was concluded with the reading (defense) of the work. The three indicators of statistical resources used showed significant differences with respect to project quality. The quality of the statistical resources is very important when a thesis project is to be carried out with good methodology, because it ensures that sound conclusions can be reached. In our study, we estimate that more than a third of the variability in thesis project quality is explained by the three statistical items mentioned above.

  1. The Interdependence of Adult Relationship Quality and Parenting Behaviours among African American and European American Couples in Rural, Low-Income Communities

    PubMed Central

    Zvara, Bharathi J.; Mills-Koonce, W. Roger; Heilbron, Nicole; Clincy, Amanda; Cox, Martha J.

    2015-01-01

    The present study extends the spillover and crossover hypotheses to more carefully model the potential interdependence between parent–parent interaction quality and parent–child interaction quality in family systems. Using propensity score matching, the present study attempted to isolate family processes that are unique across African American and European American couples that are independent of other socio-demographic factors to further clarify how interparental relationships may be related to parenting in a rural, low-income sample. The Actor–Partner Interdependence Model (APIM), a statistical analysis technique that accounts for the interdependence of relationship data, was used with a sample of married and non-married cohabiting African American and European American couples (n = 82 dyads) to evaluate whether mothers' and fathers' observed parenting behaviours are related to their behaviours and their partner's behaviours observed in a couple problem-solving interaction. Findings revealed that interparental withdrawal behaviour, but not conflict behaviour, was associated with less optimal parenting for fathers but not mothers, and specifically so for African American fathers. Our findings support the notion of interdependence across subsystems within the family and suggest that African American fathers may be specifically responsive to variations in interparental relationship quality. PMID:26430390

  2. Motion artifact detection in four-dimensional computed tomography images

    NASA Astrophysics Data System (ADS)

    Bouilhol, G.; Ayadi, M.; Pinho, R.; Rit, S.; Sarrut, D.

    2014-03-01

    Motion artifacts appear in four-dimensional computed tomography (4DCT) images because of suboptimal acquisition parameters or patient breathing irregularities. Frequency of motion artifacts is high and they may introduce errors in radiation therapy treatment planning. Motion artifact detection can be useful for image quality assessment and 4D reconstruction improvement but manual detection in many images is a tedious process. We propose a novel method to evaluate the quality of 4DCT images by automatic detection of motion artifacts. The method was used to evaluate the impact of the optimization of acquisition parameters on image quality at our institute. 4DCT images of 114 lung cancer patients were analyzed. Acquisitions were performed with a rotation period of 0.5 seconds and a pitch of 0.1 (74 patients) or 0.081 (40 patients). A sensitivity of 0.70 and a specificity of 0.97 were observed. End-exhale phases were less prone to motion artifacts. In phases where motion speed is high, the number of detected artifacts was systematically reduced with a pitch of 0.081 instead of 0.1 and the mean reduction was 0.79. The increase of the number of patients with no artifact detected was statistically significant for the 10%, 70% and 80% respiratory phases, indicating a substantial image quality improvement.

  3. A Novel Consensus-Based Particle Swarm Optimization-Assisted Trust-Tech Methodology for Large-Scale Global Optimization.

    PubMed

    Zhang, Yong-Feng; Chiang, Hsiao-Dong

    2017-09-01

    A novel three-stage methodology, termed the "consensus-based particle swarm optimization (PSO)-assisted Trust-Tech methodology," to find global optimal solutions for nonlinear optimization problems is presented. It is composed of Trust-Tech methods, consensus-based PSO, and local optimization methods that are integrated to compute a set of high-quality local optimal solutions that can contain the global optimal solution. The proposed methodology compares very favorably with several recently developed PSO algorithms based on a set of small-dimension benchmark optimization problems and 20 large-dimension test functions from the CEC 2010 competition. The analytical basis for the proposed methodology is also provided. Experimental results demonstrate that the proposed methodology can rapidly obtain high-quality optimal solutions that can contain the global optimal solution. The scalability of the proposed methodology is promising.
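
    For readers unfamiliar with the PSO building block, a plain global-best PSO on a standard test function looks like the sketch below; it illustrates the particle update only, not the consensus-based, Trust-Tech-assisted three-stage method of the paper.

        import numpy as np

        def rastrigin(x):
            return 10 * x.shape[-1] + np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x), axis=-1)

        rng = np.random.default_rng(7)
        dim, n_particles, iters = 10, 40, 300
        w, c1, c2 = 0.72, 1.49, 1.49                    # commonly used PSO coefficients

        x = rng.uniform(-5.12, 5.12, (n_particles, dim))
        v = np.zeros_like(x)
        pbest, pbest_f = x.copy(), rastrigin(x)
        gbest = pbest[np.argmin(pbest_f)]

        for _ in range(iters):
            r1, r2 = rng.uniform(size=(2, n_particles, dim))
            v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
            x = x + v
            f = rastrigin(x)
            better = f < pbest_f
            pbest[better], pbest_f[better] = x[better], f[better]
            gbest = pbest[np.argmin(pbest_f)]

        print("best value found:", round(float(rastrigin(gbest)), 3))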

  4. Can we do better? Economic analysis of human resource investment to improve home care service for the elderly in Serbia.

    PubMed

    Mihic, Marko M; Todorovic, Marija Lj; Obradovic, Vladimir Lj; Mitrovic, Zorica M

    2016-01-01

    Social services aimed at the elderly are facing great challenges caused by the progressive aging of the global population and by constant pressure to spend funds in a rational manner. This paper focuses on analyzing investments in human resources aimed at enhancing home care for the elderly, since many countries have recorded progress in this area over the past years. The goal of this paper is to stress the significance of performing an economic analysis of the investment. The paper combines statistical analysis methods such as correlation and regression analysis, methods of economic analysis, and the scenario method. The economic analysis of investing in human resources for home care service in Serbia showed that both scenarios, investing in additional home care hours or in more beneficiaries, are cost-efficient. However, the optimal solution, with the highest positive value of the economic net present value criterion, is to invest in human resources to increase home care from 6 to 8 hours per week and to increase the number of beneficiaries to 33%. This paper shows how the statistical and economic analysis results can be used to evaluate different scenarios and enable quality decision-making based on exact data, in order to improve the health and quality of life of the elderly and spend funds in a rational manner.
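
    The scenario comparison rests on a net present value calculation; a minimal sketch with invented cash flows and discount rate follows (the study's actual figures are not reproduced here).

        import numpy as np

        def npv(rate, cashflows):
            """Net present value of yearly cash flows, cashflows[0] at year 0."""
            years = np.arange(len(cashflows))
            return float(np.sum(np.asarray(cashflows) / (1 + rate) ** years))

        rate = 0.05
        scenarios = {
            "baseline (6 h/week)":          [0, 20_000, 20_000, 20_000, 20_000],
            "8 h/week, same beneficiaries": [-60_000, 45_000, 45_000, 45_000, 45_000],
            "8 h/week, more beneficiaries": [-90_000, 60_000, 60_000, 60_000, 60_000],
        }
        for name, flows in scenarios.items():
            print(f"{name:32s} ENPV = {npv(rate, flows):>10,.0f}")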

  5. An integrated logit model for contamination event detection in water distribution systems.

    PubMed

    Housh, Mashor; Ostfeld, Avi

    2015-05-15

    The problem of contamination event detection in water distribution systems has become one of the most challenging research topics in water distribution systems analysis. Current attempts at event detection utilize a variety of approaches, including statistical, heuristic, machine learning, and optimization methods. Several existing event detection systems share a common feature in which alarms are obtained separately for each of the water quality indicators, and unifying those single alarms from different indicators is usually performed by means of simple heuristics. A salient feature of the approach developed here is the use of a statistically oriented model for discrete choice prediction, estimated using the maximum likelihood method, for integrating the single alarms. The discrete choice model is jointly calibrated with the other components of the event detection framework on a training data set using genetic algorithms. The process of fusing the individual indicator probabilities, which is neglected in many existing event detection system models, is confirmed to be a crucial part of the system that can be modelled with a discrete choice model to improve performance. The developed methodology is tested on real water quality data, showing improved performance in decreasing the number of false positive alarms and in the ability to detect events with higher probabilities, compared to previous studies.
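
    A sketch of the fusion idea: instead of combining per-indicator alarms with ad hoc rules, fit a logistic (logit) model that maps the indicators' deviations to an event probability. The simulated chlorine/pH/turbidity residuals and event labels below are invented, not real utility data.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(8)
        n = 2000
        event = rng.uniform(size=n) < 0.05                      # rare contamination events
        residuals = rng.normal(0, 1, (n, 3))                    # chlorine, pH, turbidity deviations
        residuals[event] += np.array([1.5, 0.8, 2.0])           # events shift the indicators

        model = LogisticRegression().fit(residuals, event)
        alarm = model.predict_proba(residuals)[:, 1] > 0.5
        tp, fp = np.sum(alarm & event), np.sum(alarm & ~event)
        print(f"detected {tp}/{event.sum()} events with {fp} false alarms")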

  6. Building a knowledge-based statistical potential by capturing high-order inter-residue interactions and its applications in protein secondary structure assessment.

    PubMed

    Li, Yaohang; Liu, Hui; Rata, Ionel; Jakobsson, Eric

    2013-02-25

    The rapidly increasing number of protein crystal structures available in the Protein Data Bank (PDB) has naturally made statistical analyses feasible in studying complex high-order inter-residue correlations. In this paper, we report a context-based secondary structure potential (CSSP) for assessing the quality of predicted protein secondary structures generated by various prediction servers. CSSP is a sequence-position-specific knowledge-based potential generated based on the potentials of mean force approach, where high-order inter-residue interactions are taken into consideration. The CSSP potential is effective in identifying secondary structure predictions with good quality. In 56% of the targets in the CB513 benchmark, the optimal CSSP potential is able to recognize the native secondary structure or a prediction with Q3 accuracy higher than 90% as best scored in the predicted secondary structures generated by 10 popularly used secondary structure prediction servers. In more than 80% of the CB513 targets, the predicted secondary structures with the lowest CSSP potential values yield higher than 80% Q3 accuracy. Similar performance of CSSP is found on the CASP9 targets as well. Moreover, our computational results also show that the CSSP potential using triplets outperforms the CSSP potential using doublets and is currently better than the CSSP potential using quartets.
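
    The potentials-of-mean-force construction underlying such knowledge-based scores is compact: a pseudo-energy is the negative log ratio of observed to reference frequencies. The toy counts below are invented; the actual CSSP uses sequence-position-specific doublet and triplet context statistics.

        import numpy as np

        # toy counts of (residue context, secondary-structure state) pairs; columns: H, E, C
        observed = np.array([[120,  30,  50],
                             [ 40, 100,  60],
                             [ 30,  20, 150]], dtype=float)

        p_obs = observed / observed.sum()
        p_ref = np.outer(observed.sum(axis=1), observed.sum(axis=0)) / observed.sum() ** 2
        potential = -np.log(p_obs / p_ref)             # pseudo-energy; favourable pairs < 0

        assignment = [(0, 0), (1, 1), (2, 2)]           # (context, state) pairs of a candidate prediction
        score = sum(potential[c, s] for c, s in assignment)
        print(np.round(potential, 2))
        print("score of candidate assignment:", round(score, 2))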

  7. Delta-doping optimization for high quality p-type GaN

    NASA Astrophysics Data System (ADS)

    Bayram, C.; Pau, J. L.; McClintock, R.; Razeghi, M.

    2008-10-01

    Delta (δ-) doping is studied in order to achieve high quality p-type GaN. Atomic force microscopy, x-ray diffraction, photoluminescence, and Hall measurements are performed on the samples to optimize the δ-doping characteristics. The effect of annealing on the electrical, optical, and structural quality is also investigated for different δ-doping parameters. Optimized pulsing conditions result in layers with hole concentrations near 10(18) cm(-3) and superior crystal quality compared to conventional p-GaN. This material improvement is achieved thanks to the reduction in the Mg activation energy and self-compensation effects in δ-doped p-GaN.

  8. Environmental statistics and optimal regulation

    NASA Astrophysics Data System (ADS)

    Sivak, David; Thomson, Matt

    2015-03-01

    The precision with which an organism can detect its environment, and the timescale for and statistics of environmental change, will affect the suitability of different strategies for regulating protein levels in response to environmental inputs. We propose a general framework--here applied to the enzymatic regulation of metabolism in response to changing nutrient concentrations--to predict the optimal regulatory strategy given the statistics of fluctuations in the environment and measurement apparatus, and the costs associated with enzyme production. We find: (i) relative convexity of enzyme expression cost and benefit influences the fitness of thresholding or graded responses; (ii) intermediate levels of measurement uncertainty call for a sophisticated Bayesian decision rule; and (iii) in dynamic contexts, intermediate levels of uncertainty call for retaining memory of the past. Statistical properties of the environment, such as variability and correlation times, set optimal biochemical parameters, such as thresholds and decay rates in signaling pathways. Our framework provides a theoretical basis for interpreting molecular signal processing algorithms and a classification scheme that organizes known regulatory strategies and may help conceptualize heretofore unknown ones.

  9. Sleep quantity, quality and optimism in children

    PubMed Central

    Lemola, Sakari; Räikkönen, Katri; Scheier, Michael F.; Matthews, Karen A.; Pesonen, Anu-Katriina; Heinonen, Kati; Lahti, Jari; Komsi, Niina; Paavonen, E. Juulia; Kajantie, Eero

    2014-01-01

    We tested the relationship of objectively measured sleep quantity and quality with positive characteristics of the child. Sleep duration, sleep latency, and sleep efficiency were measured by an actigraph for an average of seven (range = 3 to 14) consecutive nights in 291 eight-year-old children (SD = 0.3 years). Children's optimism, self-esteem, and social competence were rated by parents and/or teachers. Sleep duration showed a non-linear, reverse J-shaped relationship with optimism (P = 0.02) such that children with sleep duration in the middle of the distribution scored higher in optimism compared with children who slept relatively little. Shorter sleep latency was related to higher optimism (P = 0.01). The associations remained when adjusting for the child's age, sex, body mass index, and parental level of education; nor did the effects of sleep on optimism change when the parents' own optimism was controlled for. In conclusion, sufficient sleep quantity and good sleep quality are associated with positive characteristics of the child, further underlining their importance in promoting well-being in children. PMID:20561178

  10. Comparative one-factor-at-a-time, response surface (statistical) and bench-scale bioreactor level optimization of thermoalkaline protease production from a psychrotrophic Pseudomonas putida SKG-1 isolate.

    PubMed

    Singh, Santosh K; Singh, Sanjay K; Tripathi, Vinayak R; Khare, Sunil K; Garg, Satyendra K

    2011-12-28

    Production of alkaline protease from various bacterial strains using statistical methods is customary nowadays. The present work is the first attempt at production optimization of a solvent-stable thermoalkaline protease from a psychrotrophic Pseudomonas putida isolate using conventional methods, response surface methods, and fermentor-level optimization. The pre-screening medium, amended with optimized (w/v) 1.0% glucose, 2.0% gelatin, and 0.5% yeast extract, produced 278 U protease ml(-1) at 72 h of incubation. Enzyme production increased to 431 U ml(-1) when Mg2+ (0.01%, w/v) was supplemented. Optimization of physical factors further enhanced protease production to 514 U ml(-1) at pH 9.0, 25°C, and 200 rpm within 60 h. The combined effect of the conventionally optimized variables (glucose, yeast extract, MgSO4, and pH), thereafter predicted by response surface methodology, yielded 617 U protease ml(-1) at glucose 1.25% (w/v), yeast extract 0.5% (w/v), MgSO4 0.01% (w/v), and pH 8.8. Bench-scale bioreactor-level optimization resulted in enhanced production of 882 U protease ml(-1) at 0.8 vvm aeration and 150 rpm agitation during only 48 h of incubation. The optimization of fermentation variables using conventional and statistical approaches and aeration/agitation at the fermentor level resulted in an ~13.5-fold increase (882 U ml(-1)) in protease production compared with un-optimized conditions (65 U ml(-1)).

  11. Optimization of fermentation medium for the production of atrazine degrading strain Acinetobacter sp. DNS(32) by statistical analysis system.

    PubMed

    Zhang, Ying; Wang, Yang; Wang, Zhi-Gang; Wang, Xi; Guo, Huo-Sheng; Meng, Dong-Fang; Wong, Po-Keung

    2012-01-01

    Statistical experimental designs provided by the statistical analysis system (SAS) software were applied to optimize the fermentation medium composition for the production of the atrazine-degrading Acinetobacter sp. DNS(32) in shake-flask cultures. A Plackett-Burman design was employed to evaluate the effects of the different components of the medium. The concentrations of corn flour, soybean flour, and K(2)HPO(4) were found to significantly influence Acinetobacter sp. DNS(32) production. The steepest ascent method was employed to determine the optimal regions of these three significant factors. These three factors were then optimized using the central composite design of response surface methodology. The optimized fermentation medium was composed as follows (g/L): corn flour 39.49, soybean flour 25.64, CaCO(3) 3, K(2)HPO(4) 3.27, MgSO(4)·7H(2)O 0.2, and NaCl 0.2. The predicted and verified values in the medium with the optimized component concentrations in shake-flask experiments were 7.079 × 10(8) CFU/mL and 7.194 × 10(8) CFU/mL, respectively. The validated model can precisely predict the growth of the atrazine-degrading bacterium Acinetobacter sp. DNS(32).

  12. Changes in quality of life after elective surgery: an observational study comparing two measures.

    PubMed

    Kronzer, Vanessa L; Jerry, Michelle R; Ben Abdallah, Arbi; Wildes, Troy S; McKinnon, Sherry L; Sharma, Anshuman; Avidan, Michael S

    2017-08-01

    Our main objective was to compare the change in a validated quality of life measure to a global assessment measure. The secondary objectives were to estimate the minimum clinically important difference (MCID) and to describe the change in quality of life by surgical specialty. This prospective cohort study included 7902 adult patients undergoing elective surgery. Changes in the Veterans RAND 12-Item Health Survey (VR-12), composed of a physical component summary (PCS) and a mental component summary (MCS), were calculated using preoperative and postoperative questionnaires. The latter also contained a global assessment question for quality of life. We compared PCS and MCS to the global assessment using descriptive statistics and weighted kappa. MCID was calculated using an anchor-based approach. Analyses were pre-specified and registered (NCT02771964). By the change in VR-12 scores, an equal proportion of patients experienced improvement and deterioration in quality of life (28% for PCS, 25% for MCS). In contrast, by the global assessment measure, 61% reported improvement, while only 10% reported deterioration. Agreement with the global assessment was slight for both PCS (kappa = 0.20, 57% matched) and MCS (kappa = 0.10, 54% matched). The MCID for the overall VR-12 score was approximately 2.5 points. Patients undergoing orthopedic surgery showed the most improvement in quality of life measures, while patients undergoing gastrointestinal/hepatobiliary or urologic surgery showed the most deterioration. Subjective global quality of life report does not agree well with a validated quality of life instrument, perhaps due to patient over-optimism.
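
    The two analyses named above are standard and short to express: linearly weighted kappa for agreement between a categorized score change and the global rating, and an anchor-based MCID from group mean differences. All values below are simulated, not the study data.

        import numpy as np
        from sklearn.metrics import cohen_kappa_score

        rng = np.random.default_rng(9)
        n = 500
        vr12_change = rng.normal(0, 8, n)                                 # change in summary score
        global_rating = np.digitize(vr12_change + rng.normal(0, 10, n), [-5, 5])  # 0 worse, 1 same, 2 better
        vr12_cat = np.digitize(vr12_change, [-5, 5])

        kappa = cohen_kappa_score(vr12_cat, global_rating, weights="linear")
        # anchor-based MCID: difference in mean change between "better" and "unchanged" groups
        mcid = vr12_change[global_rating == 2].mean() - vr12_change[global_rating == 1].mean()
        print(f"weighted kappa = {kappa:.2f}, anchor-based MCID ~ {mcid:.1f} points")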

  13. [Correlation of the DNA fragmentation index and malformation rate of optimized sperm with embryonic development and early spontaneous abortion in IVF-ET].

    PubMed

    Jiang, Wei-Jie; Jin, Fan; Zhou, Li-Ming

    2016-06-01

    To investigate the effects of the DNA fragmentation index (DFI) and malformation rate (SMR) of optimized sperm on embryonic development and early spontaneous abortion in conventional in vitro fertilization and embryo transfer (IVF-ET). We selected 602 cycles of conventional IVF-ET for pure oviductal infertility that had achieved clinical pregnancies, including 505 cycles with ongoing pregnancy and 97 cycles with early spontaneous abortion. On the day of ovum retrieval, we examined the DNA integrity and morphology of the remaining optimized sperm using the SCD and Diff-Quik methods, established the joint predictor (JP) using a logistic equation, and assessed the value of the DFI and SMR in predicting early spontaneous abortion using the ROC curve. The DFI, SMR, and high-quality embryo rate were (15.91±3.69)%, (82.85±10.24)%, and 46.53% (342/735) in the early spontaneous abortion group and (9.30±4.22)%, (77.32±9.19)%, and 56.43% (2263/4010), respectively, in the ongoing pregnancy group, all with statistically significant differences between the two groups (P < 0.05). Both the DFI and SMR were risk factors for early spontaneous abortion (OR = 5.96 and 1.66; both P < 0.01). The areas under the ROC curve for the DFI, SMR and JP were 0.893±0.019, 0.685±0.028, and 0.898±0.018, respectively. According to the Youden index, the optimal cut-off values of the DFI and SMR for the prediction of early spontaneous abortion were approximately 15% and 80%. The DFI was correlated positively with the SMR (r = 0.31, P < 0.01), while the high-quality embryo rate was correlated negatively with both the DFI (r = -0.45, P < 0.01) and the SMR (r = -0.22, P < 0.01). The DFI and SMR of optimized sperm are closely associated with embryonic development in IVF. The DFI has a certain value for predicting early spontaneous abortion with a threshold of approximately 15%, whereas the SMR may have a lower predictive value.
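    The Youden-index step mentioned above is simple to reproduce: scan candidate cut-offs and keep the one maximizing sensitivity + specificity - 1. The following Python sketch does this for a made-up set of DFI values and outcomes; the numbers are stand-ins, not the study's measurements.

      import numpy as np

      # Hypothetical DFI values (%) and outcomes (1 = early spontaneous abortion).
      dfi = np.array([6.2, 8.5, 9.1, 10.4, 11.0, 12.3, 13.8, 14.9, 15.6, 16.4, 17.2, 19.0])
      abortion = np.array([0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1])

      def youden_cutoff(score, outcome):
          """Return (best_threshold, best_J) over thresholds taken from the data."""
          best_t, best_j = None, -np.inf
          for t in np.unique(score):
              pred = score >= t                       # "positive" = at or above threshold
              tp = np.sum(pred & (outcome == 1))
              fn = np.sum(~pred & (outcome == 1))
              tn = np.sum(~pred & (outcome == 0))
              fp = np.sum(pred & (outcome == 0))
              sens = tp / (tp + fn)
              spec = tn / (tn + fp)
              j = sens + spec - 1.0                   # Youden index
              if j > best_j:
                  best_t, best_j = t, j
          return best_t, best_j

      print(youden_cutoff(dfi, abortion))             # ~14.9 for this toy data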

  14. Proper Image Subtraction—Optimal Transient Detection, Photometry, and Hypothesis Testing

    NASA Astrophysics Data System (ADS)

    Zackay, Barak; Ofek, Eran O.; Gal-Yam, Avishay

    2016-10-01

    Transient detection and flux measurement via image subtraction stand at the base of time domain astronomy. Due to the varying seeing conditions, the image subtraction process is non-trivial, and existing solutions suffer from a variety of problems. Starting from basic statistical principles, we develop the optimal statistic for transient detection, flux measurement, and any image-difference hypothesis testing. We derive a closed-form statistic that: (1) is mathematically proven to be the optimal transient detection statistic in the limit of background-dominated noise, (2) is numerically stable, (3) for accurately registered, adequately sampled images, does not leave subtraction or deconvolution artifacts, (4) allows automatic transient detection to the theoretical sensitivity limit by providing credible detection significance, (5) has uncorrelated white noise, (6) is a sufficient statistic for any further statistical test on the difference image, and, in particular, allows us to distinguish particle hits and other image artifacts from real transients, (7) is symmetric to the exchange of the new and reference images, (8) is at least an order of magnitude faster to compute than some popular methods, and (9) is straightforward to implement. Furthermore, we present extensions of this method that make it resilient to registration errors, color-refraction errors, and any noise source that can be modeled. In addition, we show that the optimal way to prepare a reference image is the proper image coaddition presented in Zackay & Ofek. We demonstrate this method on simulated data and real observations from the PTF data release 2. We provide an implementation of this algorithm in MATLAB and Python.
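    For intuition about detection "to the theoretical sensitivity limit" in the background-dominated regime, the generic matched-filter step can be sketched in Python: cross-correlate a difference image with its PSF and normalize by the noise so that each pixel reads in units of detection significance. This is only the textbook matched filter applied to simulated data, not the authors' full closed-form statistic.

      import numpy as np

      rng = np.random.default_rng(0)

      def gaussian_psf(size=15, sigma=1.8):
          y, x = np.mgrid[:size, :size] - (size - 1) / 2.0
          p = np.exp(-(x**2 + y**2) / (2 * sigma**2))
          return p / p.sum()

      # Synthetic difference image: white background noise plus one faint PSF-shaped transient.
      psf = gaussian_psf()
      bg_sigma = 1.0
      diff = rng.normal(0.0, bg_sigma, size=(128, 128))
      diff[60:75, 40:55] += 40.0 * psf            # inject a transient of total flux 40

      # Matched filter in Fourier space: correlate with the PSF, then normalize so the
      # output is in units of standard deviations (background-dominated limit).
      pad = np.zeros_like(diff)
      pad[:psf.shape[0], :psf.shape[1]] = psf
      pad = np.roll(pad, (-(psf.shape[0] // 2), -(psf.shape[1] // 2)), axis=(0, 1))
      score = np.real(np.fft.ifft2(np.fft.fft2(diff) * np.conj(np.fft.fft2(pad))))
      score /= bg_sigma * np.sqrt((psf**2).sum())

      print(f"peak detection significance: {score.max():.1f} sigma")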

  15. Image-analysis library

    NASA Technical Reports Server (NTRS)

    1980-01-01

    The MATHPAC image-analysis library is a collection of general-purpose mathematical and statistical routines and special-purpose data-analysis and pattern-recognition routines for image analysis. The library consists of Linear Algebra, Optimization, Statistical-Summary, Densities and Distribution, Regression, and Statistical-Test packages.

  16. Umbilical vein injection for management of retained placenta.

    PubMed

    Nardin, Juan Manuel; Weeks, Andrew; Carroli, Guillermo

    2011-05-11

    If a retained placenta is left untreated, there is a high risk of maternal death. However, manual removal of the placenta is an invasive procedure with serious complications of haemorrhage, infection or genital tract trauma. To assess the use of umbilical vein injection (UVI) of saline solution alone or with oxytocin in comparison either with expectant management or with an alternative solution or other uterotonic agent for retained placenta. We searched the Cochrane Pregnancy and Childbirth Group's Trials Register (28 February 2011). Randomized trials comparing UVI of saline or other fluids, with or without oxytocics, either with expectant management or with an alternative solution or other uterotonic agent, in the management of retained placenta. Two review authors assessed the methodological quality of the studies and extracted the data. We included 15 trials (1704 women). The trials were of variable quality. Compared with expectant management, UVI of saline solution alone did not show any significant difference in the incidence of manual removal of the placenta (risk ratio (RR) 0.99; 95% confidence interval (CI) 0.84 to 1.16). UVI of oxytocin solution compared with expectant management showed no reduction in the need for manual removal (RR 0.87; 95% CI 0.74 to 1.03). Oxytocin solution compared with saline solution alone showed a reduction in manual removal of the placenta, but this was not statistically significant (RR 0.91; 95% CI 0.82 to 1.00). When only high-quality studies were assessed, there was no statistical difference (RR 0.92; 95% CI 0.83 to 1.01). We detected no differences in any of the other outcomes. UVI of oxytocin solution compared with UVI of plasma expander showed no statistically significant difference in the outcomes assessed in the single small trial included. Prostaglandin solution compared with saline solution alone was associated with a statistically significantly lower incidence of manual removal of the placenta (RR 0.42; 95% CI 0.22 to 0.82), but we observed no difference in the other outcomes evaluated. Prostaglandin plus saline solution showed a statistically significant reduction in manual removal of the placenta when compared with oxytocin plus saline solution (RR 0.43; 95% CI 0.25 to 0.75), and we also observed a small reduction in time from injection to placental delivery (mean difference -6.00; 95% CI -8.78 to -3.22). However, only two small trials contributed to this meta-analysis. UVI of oxytocin solution is an inexpensive and simple intervention that could be performed while placental delivery is awaited. However, high-quality randomized trials show that the use of oxytocin has little or no effect. Further research into the optimal timing of manual removal and into UVI of prostaglandins or plasma expander is warranted.

  17. Multi Objective Optimization of Yarn Quality and Fibre Quality Using Evolutionary Algorithm

    NASA Astrophysics Data System (ADS)

    Ghosh, Anindya; Das, Subhasis; Banerjee, Debamalya

    2013-03-01

    The quality and cost of the resulting yarn play a significant role in determining its end application. The challenge for any spinner lies in producing a good-quality yarn with an added cost benefit. The present work performs a multi-objective optimization of two objectives, viz. maximization of cotton yarn strength and minimization of raw material quality. The first objective function was formulated from an artificial neural network input-output relation between cotton fibre properties and yarn strength. The second objective function was formulated with the well-known regression equation of the spinning consistency index. These two objectives are conflicting in nature, i.e., no single combination of cotton fibre parameters exists that simultaneously produces maximum yarn strength and minimum cotton fibre quality. The problem therefore has several Pareto-optimal solutions, from which a trade-off must be chosen depending on the user's requirements. In this work, the optimal solutions are obtained with an elitist multi-objective evolutionary algorithm based on the Non-dominated Sorting Genetic Algorithm II (NSGA-II). These optimum solutions may lead to the efficient exploitation of raw materials to produce better quality yarns at low cost.
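    The trade-off described above is the Pareto front that NSGA-II approximates. As a minimal Python illustration of the underlying non-domination idea (not of NSGA-II itself), the sketch below extracts the non-dominated set from a list of candidate solutions scored on the two conflicting objectives; all candidate values are invented for the example.

      import numpy as np

      # Hypothetical candidates: (yarn strength to maximize, fibre-quality index to minimize).
      candidates = np.array([
          [14.2, 0.82], [15.1, 0.90], [13.8, 0.70], [15.6, 0.95],
          [14.9, 0.78], [13.0, 0.65], [15.3, 0.88], [14.5, 0.72],
      ])

      def pareto_front(points):
          """Indices of non-dominated points: maximize column 0, minimize column 1."""
          front = []
          for i, (s_i, q_i) in enumerate(points):
              dominated = any(
                  (s_j >= s_i and q_j <= q_i) and (s_j > s_i or q_j < q_i)
                  for j, (s_j, q_j) in enumerate(points) if j != i
              )
              if not dominated:
                  front.append(i)
          return front

      idx = pareto_front(candidates)
      print("non-dominated candidates:", candidates[idx])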

  18. Using dental care resources optimally: quality-efficiency trade-offs in a competitive private market.

    PubMed

    Prasad, Banuru Muralidhara; Varatharajan, D

    2011-01-01

    Modern lifestyle changes have led to increased dental care needs in India. Consequently, there has been a sharp rise in dentist numbers. Karnataka state alone produces 2,500 dentists annually, who are engaged in the non-government sector owing to inadequate public sector opportunities. This article aims to assess the quality and efficiency of private dental clinics in Karnataka. Dentists were interviewed using a close-ended, structured interview schedule and their clinics were assessed using a checklist adopted from guidelines for providing machinery and equipment under the National Oral Health Care Programme (NOHCP). Dental "hotel" and clinical quality were scored based on this checklist. Clinical quality was "excellent" in 12 per cent of clinics and poor in 49 per cent. Clinics with better infrastructure charged higher prices (p < 0.05). The proportion of multi-chair clinics charging fixed rates was high (81 per cent). According to 59.5 per cent of dentists, competition did not improve quality, while 27 per cent felt that competition increased price, not quality. About 30.9 per cent of the poor-quality clinics, 41 per cent of average-quality clinics and 26 per cent of good-quality clinics were technically efficient. The multi-chair clinics offered better quality at higher prices and single-chair clinics provided poorer quality at lower prices. In other words, they had a sub-optimal price-quality mix. Therefore, there is a need to regulate price and quality in all clinics to arrive at an optimal price-quality mix so that clients are not financially overburdened even while receiving good-quality dental care. The article advocates using resources optimally as a way to achieve value for money and to reach break-even points, thereby providing quality care in a competitive market. Factors that influence dental practitioner behaviour are also evaluated.

  19. Integration of Large-Scale Optimization and Game Theory for Sustainable Water Quality Management

    NASA Astrophysics Data System (ADS)

    Tsao, J.; Li, J.; Chou, C.; Tung, C.

    2009-12-01

    Sustainable water quality management requires total mass control of pollutant discharge, based both on the principle of not exceeding the assimilative capacity of a river and on equity among generations. The stream assimilative capacity is the carrying capacity of a river for the maximum waste load that does not violate the water quality standard, and the spirit of total mass control is to optimize the waste load allocation among subregions. Toward the goal of sustainable watershed development, this study uses large-scale optimization theory to maximize profit, takes the marginal values of the loadings as a reference for a fair price, and then determines how an equilibrium can be reached through water quality trading across the whole watershed. Game theory, in turn, plays an important role in maximizing both individual and overall profits. This study shows that a water quality trading market is viable in certain situations and can leave all participants better off.
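    A common concrete form of such a total-mass-control problem is a linear waste-load-allocation model: maximize the dischargers' total profit subject to assimilative-capacity constraints, with the constraints' dual values read as marginal prices that could anchor trading. The Python sketch below solves a toy instance with SciPy; the profit coefficients, transfer coefficients and capacity limits are all invented, and the game-theoretic part of the study is not represented.

      import numpy as np
      from scipy.optimize import linprog

      # Three dischargers; decision variable x_i = allowed pollutant load (kg/day).
      profit_per_kg = np.array([120.0, 90.0, 150.0])      # hypothetical marginal profits
      # Each row: contribution of a unit load to the concentration at one downstream
      # checkpoint, which must stay within the river's assimilative capacity.
      A_ub = np.array([[0.8, 0.5, 0.2],
                       [0.3, 0.6, 0.9]])
      capacity = np.array([400.0, 350.0])                 # hypothetical capacity limits
      bounds = [(0.0, 300.0)] * 3                         # per-discharger load limits

      # linprog minimizes, so negate the profits to maximize them.
      res = linprog(-profit_per_kg, A_ub=A_ub, b_ub=capacity, bounds=bounds, method="highs")

      print("optimal loads (kg/day):", np.round(res.x, 1))
      print("total profit:", round(-res.fun, 1))
      # Dual values of the capacity constraints: the marginal profit of one extra unit
      # of assimilative capacity, a natural reference price for load trading.
      # (Negated because the solver reports marginals for the minimization problem.)
      print("shadow prices:", np.round(-res.ineqlin.marginals, 2))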

  20. Perceived Sleep Quality, Mood States, and Their Relationship With Performance Among Brazilian Elite Athletes During a Competitive Period.

    PubMed

    Brandt, Ricardo; Bevilacqua, Guilherme G; Andrade, Alexandro

    2017-04-01

    Brandt, R, Bevilacqua, GG, and Andrade, A. Perceived sleep quality, mood states, and their relationship with performance among Brazilian elite athletes during a competitive period. J Strength Cond Res 31(4): 1033-1039, 2017-We described the perceived sleep quality and mood states of elite athletes during a competitive period, and clarified their relationship to athletes' sport performance. Participants were 576 Brazilian elite athletes (404 men and 172 women) of individual and team sports. Mood states were evaluated using the Brunel Mood Scale, whereas perceived sleep quality was evaluated using a single question ("How would you evaluate the quality of your sleep in the last few days?"). Evaluations of mood state and sleep quality were performed up to 60 minutes before national and international sports competitions began. Descriptive and inferential statistics (including logistic regression) were used to evaluate the relationship of sleep quality and mood states with performance (i.e., winning or losing). Athletes typically had good sleep quality and mood states similar to the Iceberg profile (i.e., high vigor and low tension, depression, anger, fatigue, and mental confusion). The Wald test revealed that sleep, anger, tension, and vigor predicted athletes' performance. Specifically, poor sleep quality and low vigor and anger decreased the odds of winning, whereas higher tension increased these odds. The Hosmer-Lemeshow test indicated that the results were sufficiently generalizable. Overall, we observed a significant relationship between sleep and mood states, which in turn both significantly influenced athletes' sports performance. Thus, coaching staff and athletes should monitor athletes' sleep quality before competitions to ensure athletes are in the optimal condition for performance.

  1. Development and application of a statistical quality assessment method for dense-graded mixes.

    DOT National Transportation Integrated Search

    2004-08-01

    This report describes the development of the statistical quality assessment method and the procedure for mapping the measures obtained from the quality assessment method to a composite pay factor. The application to dense-graded mixes is demonstrated...

  2. Optimized protocols for cardiac magnetic resonance imaging in patients with thoracic metallic implants.

    PubMed

    Olivieri, Laura J; Cross, Russell R; O'Brien, Kendall E; Ratnayaka, Kanishka; Hansen, Michael S

    2015-09-01

    Cardiac magnetic resonance (MR) imaging is a valuable tool in congenital heart disease; however, patients frequently have metal devices in the chest from the treatment of their disease that complicate imaging. Methods are needed to improve imaging around metal implants near the heart. Basic sequence parameter manipulations have the potential to minimize artifact while limiting effects on image resolution and quality. Our objective was to design cine and static cardiac imaging sequences to minimize metal artifact while maintaining image quality. Using systematic variation of standard imaging parameters on a fluid-filled phantom containing commonly used metal cardiac devices, we developed optimized sequences for steady-state free precession (SSFP), gradient recalled echo (GRE) cine imaging, and turbo spin-echo (TSE) black-blood imaging. We imaged 17 consecutive patients undergoing routine cardiac MR with 25 metal implants of various origins using both standard and optimized imaging protocols for a given slice position. We rated images for quality and metal artifact size by measuring metal artifact in two orthogonal planes within the image. All metal artifacts were reduced with optimized imaging. The average metal artifact reduction for the optimized SSFP cine was 1.5 ± 1.8 mm, and for the optimized GRE cine the reduction was 4.6 ± 4.5 mm (P < 0.05). Quality ratings favored the optimized GRE cine. Similarly, the average metal artifact reduction for the optimized TSE images was 1.6 ± 1.7 mm (P < 0.05), and quality ratings favored the optimized TSE imaging. Imaging sequences tailored to minimize metal artifact are easily created by modifying basic sequence parameters, and images are superior to standard imaging sequences in both quality and artifact size. Specifically, for optimized cine imaging a GRE sequence should be used with settings that favor short echo time, i.e., flow compensation off, weak asymmetrical echo and a relatively high receiver bandwidth. For static black-blood imaging, a TSE sequence should be used with fat saturation turned off and high receiver bandwidth.

  3. Improving plan quality for prostate volumetric-modulated arc therapy.

    PubMed

    Wright, Katrina; Ferrari-Anderson, Janet; Barry, Tamara; Bernard, Anne; Brown, Elizabeth; Lehman, Margot; Pryor, David

    2017-01-01

    We critically evaluated the quality and consistency of volumetric-modulated arc therapy (VMAT) prostate planning at a single institution to quantify objective measures for plan quality and establish clear guidelines for plan evaluation and quality assurance. A retrospective analysis was conducted on 34 plans generated on the Pinnacle 3 version 9.4 and 9.8 treatment planning system to deliver 78 Gy in 39 fractions to the prostate only using VMAT. Data were collected on contoured structure volumes, overlaps and expansions, planning target volume (PTV) and organ-at-risk volumes and their relationship, dose-volume histograms, plan conformity, plan homogeneity, low-dose wash, and beam parameters. Standard descriptive statistics were used to describe the data. Despite a standardized planning protocol, we found that variability was present in all steps of the planning process. Deviations from protocol contours by radiation oncologists and radiation therapists occurred in 12% and 50% of cases, respectively, and the number of optimization parameters ranged from 12 to 27 (median 17). This contributed to conflicts within the optimization process reflected by the mean composite objective value of 0.07 (range 0.01 to 0.44). Methods used to control low-intermediate dose wash were inconsistent. At the PTV-rectum interface, the dose-gradient distance from the 74.1 Gy to 40 Gy isodose ranged from 0.6 cm to 2.0 cm (median 1.0 cm). Increasing collimator angle was associated with a decrease in monitor units, and a single full 6 MV arc was sufficient for the majority of plans. A significant relationship was found between clinical target volume-rectum distance and rectal tolerances achieved. A linear relationship was determined between the PTV volume and the volume of the 40 Gy isodose. Objective values and composite objective values were useful in determining plan quality. Anatomic geometry and overlap of structures have a measurable impact on the plan quality achieved for prostate patients being treated with VMAT. By evaluating multiple planning variables, we have been able to determine important factors influencing plan quality and develop predictive models for quality metrics that have been incorporated into our new protocol and will be tested and refined in future studies.

  4. An Automated Treatment Plan Quality Control Tool for Intensity-Modulated Radiation Therapy Using a Voxel-Weighting Factor-Based Re-Optimization Algorithm.

    PubMed

    Song, Ting; Li, Nan; Zarepisheh, Masoud; Li, Yongbao; Gautier, Quentin; Zhou, Linghong; Mell, Loren; Jiang, Steve; Cerviño, Laura

    2016-01-01

    Intensity-modulated radiation therapy (IMRT) currently plays an important role in radiotherapy, but its treatment plan quality can vary significantly among institutions and planners. Treatment plan quality control (QC) is a necessary component for individual clinics to ensure that patients receive treatments with high therapeutic gain ratios. The voxel-weighting factor-based plan re-optimization mechanism has been proven able to explore a larger Pareto surface (solution domain) and therefore increase the possibility of finding an optimal treatment plan. In this study, we incorporated additional modules into an in-house-developed voxel-weighting factor-based re-optimization algorithm, enhancing it into a highly automated and accurate IMRT plan QC tool (TPS-QC tool). After importing an under-assessment plan, the TPS-QC tool was able to generate a QC report within 2 minutes. This QC report contains the plan quality determination as well as information supporting the determination. Finally, the IMRT plan quality can be controlled by approving quality-passed plans and replacing quality-failed plans using the TPS-QC tool. The feasibility and accuracy of the proposed TPS-QC tool were evaluated using 25 clinically approved cervical cancer patient IMRT plans and 5 manually created poor-quality IMRT plans. The results showed high consistency between the QC report quality determinations and the actual plan quality. In the 25 clinically approved cases that the TPS-QC tool identified as passed, a greater difference could be observed for dosimetric endpoints for organs at risk (OAR) than for planning target volume (PTV), implying that better dose sparing could be achieved in OAR than in PTV. Moreover, the dose-volume histogram (DVH) curves of the plans re-optimized by the TPS-QC tool satisfied the dosimetric criteria more frequently than did the under-assessment plans, and the criteria for unsatisfied dosimetric endpoints in the 5 poor-quality plans could typically be met when the TPS-QC tool generated re-optimized plans without sacrificing other dosimetric endpoints. In addition to its feasibility and accuracy, the proposed TPS-QC tool is user-friendly and easy to operate, both of which are necessary characteristics for clinical use.

  5. Optimal design of monitoring networks for multiple groundwater quality parameters using a Kalman filter: application to the Irapuato-Valle aquifer.

    PubMed

    Júnez-Ferreira, H E; Herrera, G S; González-Hita, L; Cardona, A; Mora-Rodríguez, J

    2016-01-01

    A new method for the optimal design of groundwater quality monitoring networks is introduced in this paper. Various indicator parameters were considered simultaneously and tested for the Irapuato-Valle aquifer in Mexico. The steps followed in the design were (1) establishment of the monitoring network objectives, (2) definition of a groundwater quality conceptual model for the study area, (3) selection of the parameters to be sampled, and (4) selection of a monitoring network by choosing the well positions that minimize the estimate error variance of the selected indicator parameters. Equal weight for each parameter was given to most of the aquifer positions and a higher weight to priority zones. The objective for the monitoring network in the specific application was to obtain a general reconnaissance of the water quality, including water types, water origin, and first indications of contamination. Water quality indicator parameters were chosen in accordance with this objective, and for the selection of the optimal monitoring sites, the aim was to obtain a low-uncertainty estimate of these parameters for the entire aquifer, with more certainty in priority zones. The optimal monitoring network was selected using a combination of geostatistical methods, a Kalman filter and a heuristic optimization method. Results show that when monitoring the 69 locations with higher priority order (the optimal monitoring network), the joint average standard error in the study area for all the groundwater quality parameters was approximately 90% of that obtained with the 140 available sampling locations (the set of pilot wells). This demonstrates that an optimal design can help to reduce monitoring costs by avoiding redundancy in data acquisition.
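    The design criterion in step (4), choosing well positions that minimize the estimate error variance, can be illustrated with a greedy Python sketch under a kriging-style covariance model: at each step, add the candidate well whose measurement most reduces the average posterior variance over the aquifer grid. The exponential covariance, grid and candidate locations below are invented, and this simplification stands in for the paper's Kalman-filter and heuristic-optimization machinery.

      import numpy as np

      rng = np.random.default_rng(1)

      # Hypothetical 2-D aquifer: estimation grid and candidate well locations.
      grid = np.column_stack([g.ravel() for g in np.meshgrid(np.linspace(0, 10, 15),
                                                              np.linspace(0, 10, 15))])
      wells = rng.uniform(0, 10, size=(40, 2))

      def expcov(a, b, sill=1.0, corr_len=3.0):
          d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
          return sill * np.exp(-d / corr_len)

      noise = 0.05                                  # measurement error variance
      C_gg = expcov(grid, grid)                     # prior covariance on the grid
      C_gw = expcov(grid, wells)
      C_ww = expcov(wells, wells)

      def avg_variance(selected):
          """Average posterior variance on the grid after observing the selected wells."""
          if not selected:
              return np.mean(np.diag(C_gg))
          S = list(selected)
          K = C_ww[np.ix_(S, S)] + noise * np.eye(len(S))
          post = C_gg - C_gw[:, S] @ np.linalg.solve(K, C_gw[:, S].T)
          return np.mean(np.diag(post))

      selected = []
      for _ in range(8):                            # pick 8 monitoring wells greedily
          best = min((w for w in range(len(wells)) if w not in selected),
                     key=lambda w: avg_variance(selected + [w]))
          selected.append(best)
          print(f"added well {best:2d}, avg variance = {avg_variance(selected):.3f}")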

  6. Optimality, stochasticity, and variability in motor behavior

    PubMed Central

    Guigon, Emmanuel; Baraduc, Pierre; Desmurget, Michel

    2008-01-01

    Recent theories of motor control have proposed that the nervous system acts as a stochastically optimal controller, i.e. it plans and executes motor behaviors taking into account the nature and statistics of noise. Detrimental effects of noise are converted into a principled way of controlling movements. Attractive aspects of such theories are their ability to explain not only characteristic features of single motor acts, but also statistical properties of repeated actions. Here, we present a critical analysis of stochastic optimality in motor control which reveals several difficulties with this hypothesis. We show that stochastic control may not be necessary to explain the stochastic nature of motor behavior, and we propose an alternative framework, based on the action of a deterministic controller coupled with an optimal state estimator, which relieves drawbacks of stochastic optimality and appropriately explains movement variability. PMID:18202922

  7. A Hybrid Interval-Robust Optimization Model for Water Quality Management.

    PubMed

    Xu, Jieyu; Li, Yongping; Huang, Guohe

    2013-05-01

    In water quality management problems, uncertainties may exist in many system components and pollution-related processes (i.e., random nature of hydrodynamic conditions, variability in physicochemical processes, dynamic interactions between pollutant loading and receiving water bodies, and indeterminacy of available water and treated wastewater). These complexities lead to difficulties in formulating and solving the resulting nonlinear optimization problems. In this study, a hybrid interval-robust optimization (HIRO) method was developed through coupling stochastic robust optimization and interval linear programming. HIRO can effectively reflect the complex system features under uncertainty, where implications of water quality/quantity restrictions for achieving regional economic development objectives are studied. By delimiting the uncertain decision space through dimensional enlargement of the original chemical oxygen demand (COD) discharge constraints, HIRO enhances the robustness of the optimization processes and resulting solutions. This method was applied to planning of industry development in association with river-water pollution concern in New Binhai District of Tianjin, China. Results demonstrated that the proposed optimization model can effectively communicate uncertainties into the optimization process and generate a spectrum of potential inexact solutions supporting local decision makers in managing benefit-effective water quality management schemes. HIRO is helpful for analysis of policy scenarios related to different levels of economic penalties, while also providing insight into the tradeoff between system benefits and environmental requirements.
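    The interval-programming ingredient of HIRO can be pictured with a small exercise: when objective coefficients are only known as intervals, solving the model once at the optimistic bounds and once at the pessimistic bounds brackets the attainable system benefit. The Python sketch below does this for a toy two-activity allocation with SciPy; the benefit intervals, COD-discharge coefficients and limits are invented, and the stochastic robust-optimization component is not represented.

      import numpy as np
      from scipy.optimize import linprog

      # Two industrial activities; net benefit per unit lies in an interval [lo, hi].
      benefit_lo = np.array([30.0, 45.0])
      benefit_hi = np.array([50.0, 70.0])

      # One COD-discharge constraint and one resource constraint (hypothetical numbers).
      A_ub = np.array([[2.0, 3.5],      # COD load per unit of activity
                       [1.0, 1.2]])     # water use per unit of activity
      b_ub = np.array([800.0, 400.0])
      bounds = [(0.0, None), (0.0, None)]

      def maximize(benefit):
          res = linprog(-benefit, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
          return -res.fun, res.x

      best_obj, best_x = maximize(benefit_hi)       # optimistic extreme of the interval model
      worst_obj, worst_x = maximize(benefit_lo)     # pessimistic extreme

      print(f"system benefit interval: [{worst_obj:.0f}, {best_obj:.0f}]")
      print("activity levels (pessimistic / optimistic):",
            np.round(worst_x, 1), np.round(best_x, 1))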

  8. Numerical and Qualitative Contrasts of Two Statistical Models for Water Quality Change in Tidal Waters

    EPA Science Inventory

    Two statistical approaches, weighted regression on time, discharge, and season and generalized additive models, have recently been used to evaluate water quality trends in estuaries. Both models have been used in similar contexts despite differences in statistical foundations and...

  9. Medial-based deformable models in nonconvex shape-spaces for medical image segmentation.

    PubMed

    McIntosh, Chris; Hamarneh, Ghassan

    2012-01-01

    We explore the application of genetic algorithms (GA) to deformable models through the proposition of a novel method for medical image segmentation that combines GA with nonconvex, localized, medial-based shape statistics. We replace the more typical gradient descent optimizer used in deformable models with GA, and the convex, implicit, global shape statistics with nonconvex, explicit, localized ones. Specifically, we propose GA to reduce typical deformable model weaknesses pertaining to model initialization, pose estimation and local minima, through the simultaneous evolution of a large number of models. Furthermore, we constrain the evolution, and thus reduce the size of the search-space, by using statistically-based deformable models whose deformations are intuitive (stretch, bulge, bend) and are driven in terms of localized principal modes of variation, instead of modes of variation across the entire shape that often fail to capture localized shape changes. Although GA are not guaranteed to achieve the global optima, our method compares favorably to the prevalent optimization techniques, convex/nonconvex gradient-based optimizers and to globally optimal graph-theoretic combinatorial optimization techniques, when applied to the task of corpus callosum segmentation in 50 mid-sagittal brain magnetic resonance images.

  10. Identifying optimal threshold statistics for elimination of hookworm using a stochastic simulation model.

    PubMed

    Truscott, James E; Werkman, Marleen; Wright, James E; Farrell, Sam H; Sarkar, Rajiv; Ásbjörnsdóttir, Kristjana; Anderson, Roy M

    2017-06-30

    There is an increased focus on whether mass drug administration (MDA) programmes alone can interrupt the transmission of soil-transmitted helminths (STH). Mathematical models can be used to model these interventions and are increasingly being implemented to inform investigators about the expected trial outcome and the choice of an optimum study design. One key factor is the choice of threshold for detecting elimination. However, there are currently no thresholds defined for STH regarding breaking transmission. We develop a simulation of an elimination study, based on the DeWorm3 project, using an individual-based stochastic disease transmission model in conjunction with models of MDA, sampling, diagnostics and the construction of study clusters. The simulation is then used to analyse the relationship between the study end-point elimination threshold and whether elimination is achieved in the long term within the model. We analyse the quality of a range of statistics in terms of their positive predictive value (PPV) and how they depend on a range of covariates, including threshold values, baseline prevalence, measurement time point and how clusters are constructed. End-point infection prevalence performs well in discriminating between villages that achieve interruption of transmission and those that do not, although the quality of the threshold is sensitive to baseline prevalence and threshold value. The optimal post-treatment prevalence threshold value for determining elimination is 2% or less when the baseline prevalence range is broad. For multiple clusters of communities, both the probability of elimination and the ability of thresholds to detect it are strongly dependent on the size of the cluster and the size distribution of the constituent communities. The number of communities in a cluster is a key indicator of probability of elimination and PPV. Extending the time after the study end-point at which the threshold statistic is measured improves the PPV in discriminating between eliminating clusters and those that bounce back. The probability of elimination and PPV are very sensitive to baseline prevalence for individual communities. However, most studies and programmes are constructed on the basis of clusters. Since elimination occurs within smaller population sub-units, the construction of clusters introduces new sensitivities for elimination threshold values to cluster size and the underlying population structure. Study simulation offers an opportunity to investigate key sources of sensitivity for elimination studies and programme designs in advance and to tailor interventions to prevailing local or national conditions.
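    The key summary in this analysis, the positive predictive value of an end-point prevalence threshold, is straightforward to compute from simulation output: the fraction of clusters flagged as eliminated at the end-point that truly remain eliminated in the long run. The Python sketch below evaluates PPV over a grid of thresholds for made-up simulated clusters; the prevalence distributions are placeholders, not output of the DeWorm3 transmission model.

      import numpy as np

      rng = np.random.default_rng(2)

      # Hypothetical simulation output for 500 clusters:
      # end-point prevalence (%) and whether transmission was truly interrupted long-term.
      n = 500
      truly_eliminated = rng.random(n) < 0.4
      endpoint_prev = np.where(truly_eliminated,
                               rng.gamma(shape=1.2, scale=0.8, size=n),    # low prevalence
                               rng.gamma(shape=4.0, scale=2.5, size=n))    # bounce-back clusters

      def ppv(threshold):
          """P(true elimination | end-point prevalence below threshold)."""
          flagged = endpoint_prev < threshold
          if flagged.sum() == 0:
              return np.nan
          return np.mean(truly_eliminated[flagged])

      for t in [0.5, 1.0, 2.0, 5.0]:
          print(f"threshold {t:>4} %  ->  PPV = {ppv(t):.2f}  "
                f"(flags {np.mean(endpoint_prev < t):.0%} of clusters)")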

  11. Integrated modeling environment for systems-level performance analysis of the Next-Generation Space Telescope

    NASA Astrophysics Data System (ADS)

    Mosier, Gary E.; Femiano, Michael; Ha, Kong; Bely, Pierre Y.; Burg, Richard; Redding, David C.; Kissil, Andrew; Rakoczy, John; Craig, Larry

    1998-08-01

    All current concepts for the NGST are innovative designs which present unique systems-level challenges. The goals are to outperform existing observatories at a fraction of the current price/performance ratio. Standard practices for developing systems error budgets, such as the 'root-sum-of-squares' error tree, are insufficient for designs of this complexity. Simulation and optimization are the tools needed for this project; in particular, tools that integrate controls, optics, thermal and structural analysis, and design optimization. This paper describes such an environment, which allows sub-system performance specifications to be analyzed parametrically, and includes optimizing metrics that capture the science requirements. The resulting systems-level design trades are greatly facilitated, and significant cost savings can be realized. This modeling environment, built around a tightly integrated combination of commercial off-the-shelf and in-house-developed codes, provides the foundation for linear and non-linear analysis in both the time and frequency domains, statistical analysis, and design optimization. It features an interactive user interface and integrated graphics that allow highly effective, real-time work to be done by multidisciplinary design teams. For the NGST, it has been applied to issues such as pointing control, dynamic isolation of spacecraft disturbances, wavefront sensing and control, on-orbit thermal stability of the optics, and development of systems-level error budgets. In this paper, results are presented from parametric trade studies that assess requirements for pointing control, structural dynamics, reaction wheel dynamic disturbances, and vibration isolation. These studies attempt to define requirements bounds such that the resulting design is optimized at the systems level, without attempting to optimize each subsystem individually. The performance metrics are defined in terms of image quality, specifically centroiding error and RMS wavefront error, which directly link to science requirements.

  12. Expanding the definition of addiction: DSM-5 vs. ICD-11.

    PubMed

    Grant, Jon E; Chamberlain, Samuel R

    2016-08-01

    While considerable efforts have been made to understand the neurobiological basis of substance addiction, the potentially "addictive" qualities of repetitive behaviors, and whether such behaviors constitute "behavioral addictions," is relatively neglected. It has been suggested that some conditions, such as gambling disorder, compulsive stealing, compulsive buying, compulsive sexual behavior, and problem Internet use, have phenomenological and neurobiological parallels with substance use disorders. This review considers how the issue of "behavioral addictions" has been handled by latest revisions of the Diagnostic and Statistical Manual of Mental Disorders (DSM) and the International Classification of Diseases (ICD), leading to somewhat divergent approaches. We also consider key areas for future research in order to address optimal diagnostic classification and treatments for such repetitive, debilitating behaviors.

  13. Dispositional and Explanatory Style Optimism as Potential Moderators of the Relationship between Hopelessness and Suicidal Ideation

    ERIC Educational Resources Information Center

    Hirsch, Jameson K.; Conner, Kenneth R.

    2006-01-01

    To test the hypothesis that higher levels of optimism reduce the association between hopelessness and suicidal ideation, 284 college students completed self-report measures of optimism and Beck scales for hopelessness, suicidal ideation, and depression. A statistically significant interaction between hopelessness and one measure of optimism was…

  14. Preceptor Perceptions of Virtual Quality Assurance Experiential Site Visits.

    PubMed

    Clarke, Cheryl L; Schott, Kathryn A; Arnold, Austin D

    2018-05-01

    Objective. To determine preceptor perceptions of the value of experiential quality assurance site visits between virtual and onsite visits, and to gauge preceptor opinions of the optimal method of site visits based on the type of visit received. Methods. Site visits (12 virtual and 17 onsite) were conducted with 29 APPE sites located at least 200 miles from campus. Participating preceptors were invited to complete an online post-visit survey adapted from a previously validated and published survey tool measuring preceptor perceptions of the value of traditional onsite visits. Results. Likert-type score averages for survey questions ranged from 4.2 to 4.6 in the virtual group and from 4.3 to 4.7 in the onsite group. No statistically significant difference was found between the two groups. Preceptors were more inclined to prefer the type of visit they received. Preceptors receiving onsite visits were also more likely to indicate no visit type preference. Conclusion. Preceptors perceived value from both onsite and virtual site visits. Preceptors who experienced virtual site visits highly preferred that methodology. This study suggests that virtual site visits may be a viable alternative for providing experiential quality assurance site visits from a preceptor's perspective.

  15. Quality evaluation of official accident reports conducted by Labour Authorities in Andalusia (Spain).

    PubMed

    Salguero-Caparros, Francisco; Suarez-Cebador, Manuel; Carrillo-Castrillo, Jesús A; Rubio-Romero, Juan Carlos

    2018-01-01

    A public accident investigation is carried out when the consequences of the incident are significant or the accident has occurred in unusual circumstances. We evaluated the quality of the official accident investigations being conducted by Safety Specialists of the Labour Authorities in Andalusia. To achieve this objective, we analysed 98 occupational accident investigations conducted by the Labour Authorities in Andalusia in the last quarter of 2014. Various phases in the accident investigation process were examined, such as the use of the Eurostat variables within European Statistics on Accidents at Work (ESAW), detection of causes, determination of preventive measures, cost analysis of the accidents, identification of noncompliance with legal requirements or the investigation method used. The results of this study show that 77% of the official occupational accident investigation reports analysed were conducted in accordance with all the quality criteria recommended in the literature. To enhance global learning and optimize the allocation of resources, we propose the development of a harmonized European model for the public investigation of occupational accidents. Further, it would be advisable to create a common classification and coding system for the causes of accidents for all European Union Member States.

  16. ECG compression using non-recursive wavelet transform with quality control

    NASA Astrophysics Data System (ADS)

    Liu, Je-Hung; Hung, King-Chu; Wu, Tsung-Ching

    2016-09-01

    While wavelet-based electrocardiogram (ECG) data compression using scalar quantisation (SQ) yields excellent compression performance, an SQ scheme must select a set of multilevel quantisers for each quantisation process. Because of its multiple-to-one mapping, however, such a scheme is not conducive to reconstruction error control. To address this problem, this paper presents a single-variable-control SQ scheme able to guarantee the reconstruction quality of wavelet-based ECG data compression. Based on the reversible round-off non-recursive discrete periodised wavelet transform (RRO-NRDPWT), the SQ scheme is derived with a three-stage design process: the first stage uses a genetic algorithm (GA) for high compression ratio (CR), the second uses quadratic curve fitting for linear distortion control, and the third uses fuzzy decision-making to minimise data-dependency effects and select the optimal SQ. Two databases, the Physikalisch-Technische Bundesanstalt (PTB) and the Massachusetts Institute of Technology (MIT) arrhythmia databases, are used to evaluate quality-control performance. Experimental results show that the design method guarantees a high-compression-performance SQ scheme with statistically linear distortion. This property can be independent of the training data and can facilitate rapid error control.

  17. Improving tablet coating robustness by selecting critical process parameters from retrospective data.

    PubMed

    Galí, A; García-Montoya, E; Ascaso, M; Pérez-Lozano, P; Ticó, J R; Miñarro, M; Suñé-Negre, J M

    2016-09-01

    Although tablet coating processes are widely used in the pharmaceutical industry, they often lack adequate robustness. Up-scaling can be challenging as minor changes in parameters can lead to varying quality results. To select critical process parameters (CPP) using retrospective data of a commercial product and to establish a design of experiments (DoE) that would improve the robustness of the coating process. A retrospective analysis of data from 36 commercial batches. Batches were selected based on the quality results generated during batch release, some of which revealed quality deviations concerning the appearance of the coated tablets. The product is already marketed and belongs to the portfolio of a multinational pharmaceutical company. The Statgraphics 5.1 software was used for data processing to determine critical process parameters in order to propose new working ranges. This study confirms that it is possible to determine the critical process parameters and create design spaces based on retrospective data of commercial batches. This type of analysis is thus converted into a tool to optimize the robustness of existing processes. Our results show that a design space can be established with minimum investment in experiments, since current commercial batch data are processed statistically.

  18. The Profile of Creativity and Proposing Statistical Problem Quality Level Reviewed From Cognitive Style

    NASA Astrophysics Data System (ADS)

    Awi; Ahmar, A. S.; Rahman, A.; Minggi, I.; Mulbar, U.; Asdar; Ruslan; Upu, H.; Alimuddin; Hamda; Rosidah; Sutamrin; Tiro, M. A.; Rusli

    2018-01-01

    This research aims to reveal the profile of the level of creativity and the statistical problem-posing ability of students of the 2014 batch of Mathematics Education at the State University of Makassar in terms of their cognitive style. The research uses an explorative qualitative method, providing meta-cognitive scaffolding at the time of the research. The research hypothesis is that students with a field-independent (FI) cognitive style, when posing statistical problems from the provided information, are already able to propose problems that can be solved and that introduce new data, and such problems qualify as high-quality statistical problems, whereas students with a field-dependent (FD) cognitive style are generally still limited to posing statistical problems that can be solved but do not introduce new data, and such problems qualify as medium-quality statistical problems.

  19. Compressive sampling of polynomial chaos expansions: Convergence analysis and sampling strategies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hampton, Jerrad; Doostan, Alireza, E-mail: alireza.doostan@colorado.edu

    2015-01-01

    Sampling orthogonal polynomial bases via Monte Carlo is of interest for uncertainty quantification of models with random inputs, using Polynomial Chaos (PC) expansions. It is known that bounding a probabilistic parameter, referred to as coherence, yields a bound on the number of samples necessary to identify coefficients in a sparse PC expansion via solution to an ℓ1-minimization problem. Utilizing results for orthogonal polynomials, we bound the coherence parameter for polynomials of Hermite and Legendre type under their respective natural sampling distribution. In both polynomial bases we identify an importance sampling distribution which yields a bound with weaker dependence on the order of the approximation. For more general orthonormal bases, we propose the coherence-optimal sampling: a Markov Chain Monte Carlo sampling, which directly uses the basis functions under consideration to achieve a statistical optimality among all sampling schemes with identical support. We demonstrate these different sampling strategies numerically in both high-order and high-dimensional, manufactured PC expansions. In addition, the quality of each sampling method is compared in the identification of solutions to two differential equations, one with a high-dimensional random input and the other with a high-order PC expansion. In both cases, the coherence-optimal sampling scheme leads to similar or considerably improved accuracy.
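    A small piece of this setup can be made concrete in Python: evaluate a probabilists' Hermite polynomial basis at Monte Carlo samples of a standard-normal input and recover the coefficients of a sparse expansion, here simply by least squares. The target coefficients and truncation order are invented, and neither the coherence-optimal MCMC sampling nor the ℓ1-minimization solver of the paper is implemented in this sketch.

      import numpy as np
      from numpy.polynomial import hermite_e as He

      rng = np.random.default_rng(4)

      # Model with a sparse Hermite-PC representation in one standard-normal input:
      # u(x) = 1.0*He_0(x) + 0.5*He_2(x) + 0.1*He_5(x)   (coefficients invented).
      true_c = np.zeros(9)
      true_c[[0, 2, 5]] = [1.0, 0.5, 0.1]

      # Monte Carlo samples under the natural (standard normal) sampling distribution.
      n_samples, order = 80, 8
      x = rng.standard_normal(n_samples)
      y = He.hermeval(x, true_c)

      # Measurement matrix: column j holds He_j evaluated at the samples
      # (orthonormalization and importance weighting are omitted for brevity).
      Psi = np.column_stack([He.hermeval(x, np.eye(order + 1)[j]) for j in range(order + 1)])

      c_hat, *_ = np.linalg.lstsq(Psi, y, rcond=None)
      print("recovered coefficients:", np.round(c_hat, 3))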

  20. Optimization of Synthesis Conditions of Carbon Nanotubes via Ultrasonic-Assisted Floating Catalyst Deposition Using Response Surface Methodology

    PubMed Central

    Mohammadian, Narges; Ghoreishi, Seyyed M.; Hafeziyeh, Samira; Saeidi, Samrand; Dionysiou, Dionysios D.

    2018-01-01

    The growing use of carbon nanotubes (CNTs) in a plethora of applications has motivated us to investigate CNT synthesis by new methods. In this study, an ultrasonic-assisted chemical vapor deposition (CVD) method was employed to synthesize CNTs. The difficulty of controlling cluster size and achieving a uniform distribution, the major problem with previous methods, was solved by using an ultrasonic bath and dissolving ferrocene in xylene outside the reactor. The operating conditions were optimized using a rotatable central composite design (CCD), and response surface methodology (RSM) was used to analyze these experiments. The use of statistical software was effective in that it decreased the number of experiments needed to reach the optimum conditions. Synthesis of CNTs was studied as a function of three independent parameters, viz. hydrogen flow rate (120–280 cm3/min), catalyst concentration (2–6 wt %), and synthesis temperature (800–1200 °C). Optimum conditions for the synthesis of CNTs were found to be 3.78 wt %, 184 cm3/min, and 976 °C for catalyst concentration, hydrogen flow rate, and synthesis temperature, respectively. Under these conditions, the Raman spectrum indicates high values of (IG/ID), indicating high-quality CNTs. PMID:29747451

  1. Predicting the Effects of Powder Feeding Rates on Particle Impact Conditions and Cold Spray Deposited Coatings

    NASA Astrophysics Data System (ADS)

    Ozdemir, Ozan C.; Widener, Christian A.; Carter, Michael J.; Johnson, Kyle W.

    2017-10-01

    As the industrial application of the cold spray technology grows, the need to optimize both the cost and the quality of the process grows with it. Parameter selection techniques available today require solving a coupled system of equations to account for the losses due to particle loading in the gas stream. Such analyses cause a significant increase in computational time compared with calculations under isentropic flow assumptions. In cold spray operations, engineers and operators may, therefore, neglect the effects of particle loading to simplify the multiparameter optimization process. In this study, two-way coupled (particle-fluid) quasi-one-dimensional fluid dynamics simulations are used to test the particle loading effects under many potential cold spray scenarios. The output of the simulations is statistically analyzed to build regression models that estimate the changes in particle impact velocity and temperature due to particle loading. This approach eases particle-loading optimization and enables a more complete analysis of deposition cost and time. The model was validated both numerically and experimentally. Further numerical analyses were completed to test the particle loading capacity and limitations of a nozzle with a commonly used throat size. Additional experimentation helped document the physical limitations to high-rate deposition.

  2. Fold assessment for comparative protein structure modeling.

    PubMed

    Melo, Francisco; Sali, Andrej

    2007-11-01

    Accurate and automated assessment of both geometrical errors and incompleteness of comparative protein structure models is necessary for an adequate use of the models. Here, we describe a composite score for discriminating between models with the correct and incorrect fold. To find an accurate composite score, we designed and applied a genetic algorithm method that searched for a most informative subset of 21 input model features as well as their optimized nonlinear transformation into the composite score. The 21 input features included various statistical potential scores, stereochemistry quality descriptors, sequence alignment scores, geometrical descriptors, and measures of protein packing. The optimized composite score was found to depend on (1) a statistical potential z-score for residue accessibilities and distances, (2) model compactness, and (3) percentage sequence identity of the alignment used to build the model. The accuracy of the composite score was compared with the accuracy of assessment by single and combined features as well as by other commonly used assessment methods. The testing set was representative of models produced by automated comparative modeling on a genomic scale. The composite score performed better than any other tested score in terms of the maximum correct classification rate (i.e., 3.3% false positives and 2.5% false negatives) as well as the sensitivity and specificity across the whole range of thresholds. The composite score was implemented in our program MODELLER-8 and was used to assess models in the MODBASE database that contains comparative models for domains in approximately 1.3 million protein sequences.

  3. Ranked solutions to a class of combinatorial optimizations—with applications in mass spectrometry based peptide sequencing and a variant of directed paths in random media

    NASA Astrophysics Data System (ADS)

    Doerr, Timothy P.; Alves, Gelio; Yu, Yi-Kuo

    2005-08-01

    Typical combinatorial optimizations are NP-hard; however, for a particular class of cost functions the corresponding combinatorial optimizations can be solved in polynomial time using the transfer matrix technique or, equivalently, the dynamic programming approach. This suggests a way to efficiently find approximate solutions: find a transformation that makes the cost function as similar as possible to that of the solvable class. After keeping many high-ranking solutions using the approximate cost function, one may then re-assess these solutions with the full cost function to find the best approximate solution. Under this approach, it is important to be able to assess the quality of the solutions obtained, e.g., by finding the true ranking of the kth best approximate solution when all possible solutions are considered exhaustively. To tackle this statistical issue, we provide a systematic method starting with a scaling function generated from the finite number of high-ranking solutions followed by a convergent iterative mapping. This method, useful in a variant of the directed paths in random media problem proposed here, can also provide a statistical significance assessment for one of the most important proteomic tasks: peptide sequencing using tandem mass spectrometry data. For directed paths in random media, the scaling function depends on the particular realization of randomness; in the mass spectrometry case, the scaling function is spectrum-specific.
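    The dynamic-programming idea of keeping many high-ranking solutions can be shown on a toy layered-lattice path problem in Python: instead of propagating a single best score per node, propagate the k best partial scores. The lattice size, edge costs and k below are arbitrary, and the re-assessment against a full cost function described above is omitted.

      import heapq
      import numpy as np

      rng = np.random.default_rng(3)

      # A layered lattice: L stages, W nodes per stage, additive edge costs (to be minimized).
      L, W, K = 6, 4, 5
      cost = rng.integers(1, 10, size=(L - 1, W, W))      # cost[t, i, j]: stage t, node i -> node j

      # best[j] holds up to K smallest partial costs of paths ending at node j of the
      # current stage; extend stage by stage and keep only the K best per node.
      best = [[0.0] for _ in range(W)]
      for t in range(L - 1):
          nxt = [[] for _ in range(W)]
          for i in range(W):
              for c in best[i]:
                  for j in range(W):
                      nxt[j].append(c + cost[t, i, j])
          best = [heapq.nsmallest(K, scores) for scores in nxt]

      # The K best complete path costs over all terminal nodes.
      top_k = heapq.nsmallest(K, (c for scores in best for c in scores))
      print("K best path costs:", top_k)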

  4. Integration of Mesh Optimization with 3D All-Hex Mesh Generation, LDRD Subcase 3504340000, Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    KNUPP,PATRICK; MITCHELL,SCOTT A.

    1999-11-01

    In an attempt to automatically produce high-quality all-hex meshes, we investigated a mesh improvement strategy: given an initial poor-quality all-hex mesh, we iteratively changed the element connectivity, adding and deleting elements and nodes, and optimized the node positions. We found a set of hex reconnection primitives. We improved the optimization algorithms so they can untangle a negative-Jacobian mesh, even considering Jacobians on the boundary, and subsequently optimize the condition number of elements in an untangled mesh. However, even after applying both the primitives and optimization we were unable to produce high-quality meshes in certain regions. Our experiences suggest that many boundary configurations of quadrilaterals admit no hexahedral mesh with positive Jacobians, although we have no proof of this.

  5. Researches of fruit quality prediction model based on near infrared spectrum

    NASA Astrophysics Data System (ADS)

    Shen, Yulin; Li, Lian

    2018-04-01

    With the improvement in standards for food quality and safety, people pay more attention to the internal quality of fruits, so the measurement of fruit internal quality is increasingly imperative. In general, nondestructive analysis of soluble solid content (SSC) and total acid content (TAC) is vital and effective for quality measurement in global fresh-produce markets, so in this paper we aim to establish a novel fruit internal quality prediction model based on SSC and TAC for near infrared spectra. First, prediction models based on PCA + BP neural network, PCA + GRNN network, PCA + BP adaboost strong classifier, PCA + ELM and PCA + LS_SVM classifiers are designed and implemented. Then, in the NSCT domain, the median filter and the Savitzky-Golay filter are used to preprocess the spectral signal, and the Kennard-Stone algorithm is used to automatically select the training and test samples. Third, we obtain the optimal models by comparing 15 kinds of prediction models based on a multi-classifier competition mechanism; specifically, nonparametric estimation is introduced to measure the effectiveness of the proposed models, with the reliability and variance of the nonparametric estimate used to evaluate each model's predictions and the estimated value and confidence interval serving as references. The experimental results demonstrate that this approach achieves a good evaluation of the internal quality of fruit. Finally, we employ cat swarm optimization to optimize the two best models obtained from the nonparametric estimation; empirical testing indicates that the proposed method provides more accurate and effective results than other forecasting methods.

  6. Application of Taguchi Design and Response Surface Methodology for Improving Conversion of Isoeugenol into Vanillin by Resting Cells of Psychrobacter sp. CSW4.

    PubMed

    Ashengroph, Morahem; Nahvi, Iraj; Amini, Jahanshir

    2013-01-01

    For all industrial processes, modelling, optimisation and control are the keys to enhancing productivity and ensuring product quality. In the current study, the optimization of process parameters for improving the conversion of isoeugenol to vanillin by Psychrobacter sp. CSW4 was investigated by means of the Taguchi approach and Box-Behnken statistical design under resting cell conditions. Taguchi design was employed for screening the significant variables in the bioconversion medium. Box-Behnken design experiments under response surface methodology (RSM) were then used for further optimization. Four factors (isoeugenol, NaCl, biomass and tween 80 initial concentrations), which have significant effects on vanillin yield, were selected from ten variables by Taguchi experimental design. With the regression coefficient analysis in the Box-Behnken design, a relationship between vanillin production and the four significant variables was obtained, and the optimum levels of the four variables were as follows: initial isoeugenol concentration 6.5 g/L, initial tween 80 concentration 0.89 g/L, initial NaCl concentration 113.2 g/L and initial biomass concentration 6.27 g/L. Under these optimized conditions, the maximum predicted concentration of vanillin was 2.25 g/L. These optimized values of the factors were validated in a triplicate shaking-flask study, and an average vanillin concentration of 2.19 g/L, corresponding to a molar yield of 36.3%, was obtained after a 24 h bioconversion. The present work is the first to report the application of Taguchi design and response surface methodology for optimizing the bioconversion of isoeugenol into vanillin under resting cell conditions.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Piepel, Gregory F.; Pasquini, Benedetta; Cooley, Scott K.

    In recent years, multivariate optimization has played an increasing role in analytical method development. ICH guidelines recommend using statistical design of experiments to identify the design space, in which multivariate combinations of composition variables and process variables have been demonstrated to provide quality results. Considering a microemulsion electrokinetic chromatography method (MEEKC), the performance of the electrophoretic run depends on the proportions of mixture components (MCs) of the microemulsion and on the values of process variables (PVs). In the present work, for the first time in the literature, a mixture-process variable (MPV) approach was applied to optimize a MEEKC method for the analysis of coenzyme Q10 (Q10), ascorbic acid (AA), and folic acid (FA) contained in nutraceuticals. The MCs (buffer, surfactant-cosurfactant, oil) and the PVs (voltage, buffer concentration, buffer pH) were simultaneously changed according to a MPV experimental design. A 62-run MPV design was generated using the I-optimality criterion, assuming a 46-term MPV model allowing for special-cubic blending of the MCs, quadratic effects of the PVs, and some MC-PV interactions. The obtained data were used to develop MPV models that express the performance of an electrophoretic run (measured as peak efficiencies of Q10, AA, and FA) in terms of the MCs and PVs. Contour and perturbation plots were drawn for each of the responses. Finally, the MPV models and criteria for the peak efficiencies were used to develop the design space and an optimal subregion (i.e., the settings of the mixture MCs and PVs that satisfy the respective criteria), as well as a unique optimal combination of MCs and PVs.

  8. Optimism bias leads to inconclusive results - an empirical study

    PubMed Central

    Djulbegovic, Benjamin; Kumar, Ambuj; Magazin, Anja; Schroen, Anneke T.; Soares, Heloisa; Hozo, Iztok; Clarke, Mike; Sargent, Daniel; Schell, Michael J.

    2010-01-01

    Objective: Optimism bias refers to unwarranted belief in the efficacy of new therapies. We assessed the impact of optimism bias on the proportion of trials that did not answer their research question successfully, and explored whether poor accrual or optimism bias is responsible for inconclusive results. Study Design: Systematic review. Setting: Retrospective analysis of a consecutive series of phase III randomized controlled trials (RCTs) performed under the aegis of National Cancer Institute Cooperative groups. Results: 359 trials (374 comparisons) enrolling 150,232 patients were analyzed. 70% (262/374) of the trials generated conclusive results according to the statistical criteria. Investigators made definitive statements related to the treatment preference in 73% (273/374) of studies. Investigators’ judgments and statistical inferences were concordant in 75% (279/374) of trials. Investigators consistently overestimated their expected treatment effects, but to a significantly larger extent for inconclusive trials. The median ratio of expected over observed hazard ratio or odds ratio was 1.34 (range 0.19 – 15.40) in conclusive trials compared to 1.86 (range 1.09 – 12.00) in inconclusive studies (p<0.0001). Only 17% of the trials had treatment effects that matched original researchers’ expectations. Conclusion: Formal statistical inference is sufficient to answer the research question in 75% of RCTs. The answers to the other 25% depend mostly on subjective judgments, which at times are in conflict with statistical inference. Optimism bias significantly contributes to inconclusive results. PMID:21163620

  9. Optimism bias leads to inconclusive results-an empirical study.

    PubMed

    Djulbegovic, Benjamin; Kumar, Ambuj; Magazin, Anja; Schroen, Anneke T; Soares, Heloisa; Hozo, Iztok; Clarke, Mike; Sargent, Daniel; Schell, Michael J

    2011-06-01

    Optimism bias refers to unwarranted belief in the efficacy of new therapies. We assessed the impact of optimism bias on a proportion of trials that did not answer their research question successfully and explored whether poor accrual or optimism bias is responsible for inconclusive results. Systematic review. Retrospective analysis of a consecutive-series phase III randomized controlled trials (RCTs) performed under the aegis of National Cancer Institute Cooperative groups. Three hundred fifty-nine trials (374 comparisons) enrolling 150,232 patients were analyzed. Seventy percent (262 of 374) of the trials generated conclusive results according to the statistical criteria. Investigators made definitive statements related to the treatment preference in 73% (273 of 374) of studies. Investigators' judgments and statistical inferences were concordant in 75% (279 of 374) of trials. Investigators consistently overestimated their expected treatment effects but to a significantly larger extent for inconclusive trials. The median ratio of expected and observed hazard ratio or odds ratio was 1.34 (range: 0.19-15.40) in conclusive trials compared with 1.86 (range: 1.09-12.00) in inconclusive studies (P<0.0001). Only 17% of the trials had treatment effects that matched original researchers' expectations. Formal statistical inference is sufficient to answer the research question in 75% of RCTs. The answers to the other 25% depend mostly on subjective judgments, which at times are in conflict with statistical inference. Optimism bias significantly contributes to inconclusive results. Copyright © 2011 Elsevier Inc. All rights reserved.

  10. Optimization of light quality from color mixing light-emitting diode systems for general lighting

    NASA Astrophysics Data System (ADS)

    Thorseth, Anders

    2012-03-01

    Given the problem of metamerism inherent in color mixing in light-emitting diode (LED) systems with more than three distinct colors, a method for optimizing the spectral output of multicolor LED systems with regard to standardized light quality parameters has been developed. The composite spectral power distribution from the LEDs is simulated using spectral radiometric measurements of single commercially available LEDs at varying input power, to account for the efficiency droop and other nonlinear effects in electrical power versus light output. The method uses the electrical input powers as input parameters in a randomized steepest descent optimization. The resulting spectral power distributions are evaluated with regard to light quality using the standard characteristics: CIE color rendering index, correlated color temperature and chromaticity distance. The results indicate Pareto optimal boundaries for each system, mapping the capabilities of the simulated lighting systems with regard to the light quality characteristics.
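
    Evaluating the real light-quality metrics (CRI, CCT, chromaticity distance) requires colorimetric code beyond a short example, so the simplified sketch below only illustrates the structure of a randomized descent over LED channel input powers, with a placeholder objective standing in for those metrics; it is not the published algorithm.

        # Simplified randomized-descent loop over LED channel input powers. The objective
        # below is a placeholder; a real implementation would compute CRI, CCT, and
        # chromaticity distance from the mixed spectral power distribution.
        import numpy as np

        rng = np.random.default_rng(0)
        target_mix = np.array([0.4, 0.3, 0.2, 0.1])       # hypothetical "ideal" power split

        def quality_penalty(powers):
            p = powers / powers.sum()
            return float(np.sum((p - target_mix) ** 2))   # stand-in for light-quality metrics

        powers = rng.uniform(0.1, 1.0, size=4)            # input powers of four LED channels
        for _ in range(2000):
            candidate = np.clip(powers + rng.normal(0.0, 0.05, size=4), 1e-3, 1.0)
            if quality_penalty(candidate) < quality_penalty(powers):
                powers = candidate                        # accept only improving random steps
        print("optimized channel power fractions:", np.round(powers / powers.sum(), 3))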

  11. Optimal Cost Avoidance Investment and Pricing Strategies for Performance-Based Post-Production Service Contracts

    DTIC Science & Technology

    2011-04-30

    a BS degree in Mathematics and an MS degree in Statistics and Financial and Actuarial Mathematics from Kiev National Taras Shevchenko University...degrees from Rutgers University in Industrial Engineering (PhD and MS) and Statistics (MS) and from Universidad Nacional Autonoma de Mexico in Actuarial ...Science. His research efforts focus on developing mathematical models for the analysis, computation, and optimization of system performance with

  12. Optimization of Collision Detection in Surgical Simulations

    NASA Astrophysics Data System (ADS)

    Custură-Crăciun, Dan; Cochior, Daniel; Neagu, Corneliu

    2014-11-01

    Just as flight and spacecraft simulators already represent a standard, we expect that surgical simulators will soon become a standard in medical applications. A simulation's quality is strongly related to the image quality as well as to the degree of realism of the simulation. Increased quality requires increased resolution and increased rendering speed but, more importantly, a larger number of mathematical computations. To make this possible, we need not only more efficient computers but especially more optimization of the calculation process. A simulator executes one of its most complex sets of calculations each time it detects contact between virtual objects; therefore, optimizing collision detection is critical for the working speed of a simulator and hence for its quality.
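
    A common way to speed up collision detection, and one plausible optimization in this setting, is a cheap broad phase that discards object pairs whose axis-aligned bounding boxes (AABB) do not overlap before running the expensive exact test. The sketch below is illustrative only and is not taken from the paper.

        # Illustrative broad-phase filter using axis-aligned bounding boxes (AABB):
        # only pairs whose boxes overlap are passed to the expensive exact collision test.
        from dataclasses import dataclass
        from itertools import combinations

        @dataclass
        class AABB:
            min_pt: tuple   # (x, y, z)
            max_pt: tuple

        def overlaps(a: AABB, b: AABB) -> bool:
            return all(a.min_pt[k] <= b.max_pt[k] and b.min_pt[k] <= a.max_pt[k]
                       for k in range(3))

        boxes = [AABB((0, 0, 0), (1, 1, 1)),
                 AABB((0.5, 0.5, 0.5), (2, 2, 2)),
                 AABB((5, 5, 5), (6, 6, 6))]
        candidates = [(i, j) for i, j in combinations(range(len(boxes)), 2)
                      if overlaps(boxes[i], boxes[j])]
        print(candidates)   # only these pairs need the detailed (costly) collision check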

  13. Generalized t-statistic for two-group classification.

    PubMed

    Komori, Osamu; Eguchi, Shinto; Copas, John B

    2015-06-01

    In the classic discriminant model of two multivariate normal distributions with equal variance matrices, the linear discriminant function is optimal both in terms of the log likelihood ratio and in terms of maximizing the standardized difference (the t-statistic) between the means of the two distributions. In a typical case-control study, normality may be sensible for the control sample but heterogeneity and uncertainty in diagnosis may suggest that a more flexible model is needed for the cases. We generalize the t-statistic approach by finding the linear function which maximizes a standardized difference but with data from one of the groups (the cases) filtered by a possibly nonlinear function U. We study conditions for consistency of the method and find the function U which is optimal in the sense of asymptotic efficiency. Optimality may also extend to other measures of discriminatory efficiency such as the area under the receiver operating characteristic curve. The optimal function U depends on a scalar probability density function which can be estimated non-parametrically using a standard numerical algorithm. A lasso-like version for variable selection is implemented by adding L1-regularization to the generalized t-statistic. Two microarray data sets in the study of asthma and various cancers are used as motivating examples. © 2014, The International Biometric Society.
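
    For reference, the classical result that this work generalizes can be stated compactly: the linear score maximizing the two-group t-statistic coincides, up to scale, with the Fisher discriminant direction,

        t(w) = \frac{w^\top (\bar{x}_1 - \bar{x}_0)}{\sqrt{w^\top S \, w}},
        \qquad
        \arg\max_{w} \, t(w) \;\propto\; S^{-1} (\bar{x}_1 - \bar{x}_0),

    where S is the pooled sample covariance matrix; in the generalized version, the case-group observations entering the comparison are first passed through the filter function U.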

  14. Quality effort decision in service supply chain with quality preference based on quantum game

    NASA Astrophysics Data System (ADS)

    Zhang, Cuihua; Xing, Peng; Wang, Jianwei

    2015-04-01

    Service quality preference behaviors of both members are considered in a service supply chain (SSC) comprising a service integrator and a service provider under stochastic demand. Through analysis of service quality cost and revenue, utility functions are established in terms of service quality effort degree and service quality preference level for the integrated and decentralized SSC. Nash equilibrium and quantum game theory are used to optimize the models. By comparing the different solutions, the optimal strategies are obtained for an SSC with quality preference. Some numerical examples are then studied, and the changing trend of service quality effort is further analyzed under the influence of the entanglement operator and quality preferences.

  15. Development of the IBSAL-SimMOpt Method for the Optimization of Quality in a Corn Stover Supply Chain

    DOE PAGES

    Chavez, Hernan; Castillo-Villar, Krystel; Webb, Erin

    2017-08-01

    Variability in the physical characteristics of feedstock has a relevant effect on the reactor's reliability and operating cost. Most of the models developed to optimize biomass supply chains have failed to quantify the effect of biomass quality and of the preprocessing operations required to meet biomass specifications on overall cost and performance. The Integrated Biomass Supply Analysis and Logistics (IBSAL) model estimates the harvesting, collection, transportation, and storage cost while considering the stochastic behavior of the field-to-biorefinery supply chain. This paper proposes an IBSAL-SimMOpt (Simulation-based Multi-Objective Optimization) method for optimizing the biomass quality and the costs associated with the efforts needed to meet conversion technology specifications. The method is developed in two phases. For the first phase, a SimMOpt tool that interacts with the extended IBSAL is developed. For the second phase, the baseline IBSAL model is extended so that the cost of meeting specifications and/or the penalization for failing to meet them are considered. The IBSAL-SimMOpt method is designed to optimize quality characteristics of biomass, the cost related to activities intended to improve the quality of feedstock, and the penalization cost. A case study based on 1916 farms in Ontario, Canada, is considered for testing the proposed method. Analysis of the results demonstrates that this method is able to find a high-quality set of non-dominated solutions.

  16. Development of the IBSAL-SimMOpt Method for the Optimization of Quality in a Corn Stover Supply Chain

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chavez, Hernan; Castillo-Villar, Krystel; Webb, Erin

    Variability in the physical characteristics of feedstock has a relevant effect on the reactor's reliability and operating cost. Most of the models developed to optimize biomass supply chains have failed to quantify the effect of biomass quality and of the preprocessing operations required to meet biomass specifications on overall cost and performance. The Integrated Biomass Supply Analysis and Logistics (IBSAL) model estimates the harvesting, collection, transportation, and storage cost while considering the stochastic behavior of the field-to-biorefinery supply chain. This paper proposes an IBSAL-SimMOpt (Simulation-based Multi-Objective Optimization) method for optimizing the biomass quality and the costs associated with the efforts needed to meet conversion technology specifications. The method is developed in two phases. For the first phase, a SimMOpt tool that interacts with the extended IBSAL is developed. For the second phase, the baseline IBSAL model is extended so that the cost of meeting specifications and/or the penalization for failing to meet them are considered. The IBSAL-SimMOpt method is designed to optimize quality characteristics of biomass, the cost related to activities intended to improve the quality of feedstock, and the penalization cost. A case study based on 1916 farms in Ontario, Canada, is considered for testing the proposed method. Analysis of the results demonstrates that this method is able to find a high-quality set of non-dominated solutions.

  17. Statistical estimation via convex optimization for trending and performance monitoring

    NASA Astrophysics Data System (ADS)

    Samar, Sikandar

    This thesis presents an optimization-based statistical estimation approach to find unknown trends in noisy data. A Bayesian framework is used to explicitly take into account prior information about the trends via trend models and constraints. The main focus is on convex formulation of the Bayesian estimation problem, which allows efficient computation of (globally) optimal estimates. There are two main parts of this thesis. The first part formulates trend estimation in systems described by known detailed models as a convex optimization problem. Statistically optimal estimates are then obtained by maximizing a concave log-likelihood function subject to convex constraints. We consider the problem of increasing problem dimension as more measurements become available, and introduce a moving horizon framework to enable recursive estimation of the unknown trend by solving a fixed size convex optimization problem at each horizon. We also present a distributed estimation framework, based on the dual decomposition method, for a system formed by a network of complex sensors with local (convex) estimation. Two specific applications of the convex optimization-based Bayesian estimation approach are described in the second part of the thesis. Batch estimation for parametric diagnostics in a flight control simulation of a space launch vehicle is shown to detect incipient fault trends despite the natural masking properties of feedback in the guidance and control loops. Moving horizon approach is used to estimate time varying fault parameters in a detailed nonlinear simulation model of an unmanned aerial vehicle. An excellent performance is demonstrated in the presence of winds and turbulence.
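
    A compact way to see trend estimation as a convex program is the sketch below: a least-squares data-fit term plus an l1 penalty on second differences (a piecewise-linear trend prior) under simple bound constraints, solved with cvxpy on synthetic data. It illustrates the problem class only and is not the thesis implementation.

        # Trend estimation as a convex program: least-squares data fit plus an l1 penalty
        # on second differences (a piecewise-linear trend prior), with bound constraints.
        # The data are synthetic; this is not the thesis implementation.
        import cvxpy as cp
        import numpy as np

        rng = np.random.default_rng(0)
        n = 200
        true_trend = np.concatenate([np.linspace(0.0, 1.0, 100), np.linspace(1.0, 0.2, 100)])
        y = true_trend + 0.1 * rng.normal(size=n)

        D = np.diff(np.eye(n), n=2, axis=0)     # second-difference operator, shape (n-2, n)
        x = cp.Variable(n)
        objective = cp.Minimize(cp.sum_squares(y - x) + 10.0 * cp.norm1(D @ x))
        problem = cp.Problem(objective, [x >= -1.0, x <= 2.0])   # example convex constraints
        problem.solve()
        print("estimated trend, first 5 points:", np.round(x.value[:5], 3))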

  18. Debating Curricular Strategies for Teaching Statistics and Research Methods: What Does the Current Evidence Suggest?

    ERIC Educational Resources Information Center

    Barron, Kenneth E.; Apple, Kevin J.

    2014-01-01

    Coursework in statistics and research methods is a core requirement in most undergraduate psychology programs. However, is there an optimal way to structure and sequence methodology courses to facilitate student learning? For example, should statistics be required before research methods, should research methods be required before statistics, or…

  19. Assessment of groundwater quality: a fusion of geochemical and geophysical information via Bayesian neural networks.

    PubMed

    Maiti, Saumen; Erram, V C; Gupta, Gautam; Tiwari, Ram Krishna; Kulkarni, U D; Sangpal, R R

    2013-04-01

    Deplorable quality of groundwater arising from saltwater intrusion, natural leaching and anthropogenic activities is one of the major concerns for society. Assessment of groundwater quality is, therefore, a primary objective of scientific research. Here, we propose an artificial neural network-based method set in a Bayesian neural network (BNN) framework and employ it to assess groundwater quality. The approach is based on analyzing 36 water samples and inverting up to 85 Schlumberger vertical electrical sounding data. We constructed an a priori model by suitably parameterizing geochemical and geophysical data collected from the western part of India. The posterior model (post-inversion) was estimated using the BNN learning procedure and a global hybrid Monte Carlo/Markov chain Monte Carlo optimization scheme. By suitable parameterization of geochemical and geophysical parameters, we simulated 1,500 training samples, of which 50% were used for training and the remaining 50% for validation and testing. We show that the trained model is able to classify validation and test samples with 85% and 80% accuracy, respectively. Based on cross-correlation analysis and the Gibbs diagram of geochemical attributes, the groundwater of the study area was classified into the following three categories: "Very good", "Good", and "Unsuitable". The BNN model-based results suggest that groundwater quality falls mostly in the range of "Good" to "Very good", except for some places near the Arabian Sea. The new modeling results, supported by uncertainty and statistical analyses, provide useful constraints that could be utilized in monitoring and assessment of groundwater quality.

  20. Identification of Long Bone Fractures in Radiology Reports Using Natural Language Processing to support Healthcare Quality Improvement.

    PubMed

    Grundmeier, Robert W; Masino, Aaron J; Casper, T Charles; Dean, Jonathan M; Bell, Jamie; Enriquez, Rene; Deakyne, Sara; Chamberlain, James M; Alpern, Elizabeth R

    2016-11-09

    Important information to support healthcare quality improvement is often recorded in free text documents such as radiology reports. Natural language processing (NLP) methods may help extract this information, but these methods have rarely been applied outside the research laboratories where they were developed. To implement and validate NLP tools to identify long bone fractures for pediatric emergency medicine quality improvement. Using freely available statistical software packages, we implemented NLP methods to identify long bone fractures from radiology reports. A sample of 1,000 radiology reports was used to construct three candidate classification models. A test set of 500 reports was used to validate the model performance. Blinded manual review of radiology reports by two independent physicians provided the reference standard. Each radiology report was segmented and word stem and bigram features were constructed. Common English "stop words" and rare features were excluded. We used 10-fold cross-validation to select optimal configuration parameters for each model. Accuracy, recall, precision and the F1 score were calculated. The final model was compared to the use of diagnosis codes for the identification of patients with long bone fractures. There were 329 unique word stems and 344 bigrams in the training documents. A support vector machine classifier with Gaussian kernel performed best on the test set with accuracy=0.958, recall=0.969, precision=0.940, and F1 score=0.954. Optimal parameters for this model were cost=4 and gamma=0.005. The three classification models that we tested all performed better than diagnosis codes in terms of accuracy, precision, and F1 score (diagnosis code accuracy=0.932, recall=0.960, precision=0.896, and F1 score=0.927). NLP methods using a corpus of 1,000 training documents accurately identified acute long bone fractures from radiology reports. Strategic use of straightforward NLP methods, implemented with freely available software, offers quality improvement teams new opportunities to extract information from narrative documents.
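
    A pipeline in the spirit of the one described (unigram/bigram counts, English stop-word removal, an RBF-kernel SVM tuned by cross-validation) can be assembled with scikit-learn as sketched below. The reports and labels are placeholders, the stemming step is omitted, and only a 2-fold split is used so the toy example runs; the study used 10-fold cross-validation on 1,000 training reports.

        # Report classifier in the spirit of the one described: unigram/bigram counts,
        # English stop-word removal, RBF-kernel SVM tuned by cross-validation.
        # The reports and labels below are placeholders; the stemming step is omitted.
        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.model_selection import GridSearchCV
        from sklearn.pipeline import Pipeline
        from sklearn.svm import SVC

        reports = ["acute transverse fracture of the femoral shaft",
                   "no acute fracture or dislocation identified",
                   "spiral fracture of the distal tibia",
                   "normal radiographic appearance of the forearm"]
        labels = [1, 0, 1, 0]    # 1 = long bone fracture present (placeholder labels)

        pipeline = Pipeline([
            ("features", CountVectorizer(ngram_range=(1, 2), stop_words="english")),
            ("svm", SVC(kernel="rbf")),
        ])
        grid = GridSearchCV(pipeline,
                            {"svm__C": [1, 4, 16], "svm__gamma": [0.005, 0.05]},
                            cv=2, scoring="f1")   # the study used 10-fold CV on 1,000 reports
        grid.fit(reports, labels)
        print("best parameters:", grid.best_params_)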

  1. TU-H-CAMPUS-JeP1-04: Deformable Image Registration Performances in Pelvis Patients: Impact of CBCT Image Quality

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fusella, M; Loi, G; Fiandra, C

    Purpose: To investigate the accuracy and robustness, against image noise and artifacts (typical of CBCT images), of a commercial algorithm for deformable image registration (DIR), to propagate regions of interest (ROIs) in computational phantoms based on real prostate patient images. Methods: The Anaconda DIR algorithm, implemented in RayStation, was tested. Two specific deformation vector fields (DVFs) were applied to the reference data set (CTref) using the ImSimQA software, obtaining two deformed CTs. For each dataset, twenty-four different levels of noise and/or capping artifacts were applied to simulate CBCT images. DIR was performed between CTref and each of the deformed CTs and CBCTs. In order to investigate the relationship between image quality parameters and the DIR results (expressed by a logit transform of the Dice index), a bilinear regression was defined. Results: More than 550 DIR-mapped ROIs were analyzed. The statistical analysis showed that deformation strength and artifacts were significant prognostic factors of DIR performance, while noise appeared to have a minor role in the DIR process as implemented in RayStation, as expected from the image similarity metric built into the registration algorithm. Capping artifacts played a determinant role in the accuracy of DIR results, and two threshold values for capping artifacts were found that yield acceptable DIR results (Dice > 0.75/0.85). Various clinical CBCT acquisition protocols were examined to evaluate the significance of the study. Conclusion: This work illustrates the impact of image quality on DIR performance. Clinical issues like adaptive radiation therapy (ART) and dose accumulation need accurate and robust DIR software. The RayStation DIR algorithm proved robust against noise but sensitive to image artifacts. This result highlights the need for robustness quality assurance against image noise and artifacts in the commissioning of a commercial DIR system and underlines the importance of adopting optimized protocols for CBCT image acquisition in clinical ART implementation.

  2. Solving TSP problem with improved genetic algorithm

    NASA Astrophysics Data System (ADS)

    Fu, Chunhua; Zhang, Lijun; Wang, Xiaojing; Qiao, Liying

    2018-05-01

    The TSP is a typical NP-hard problem. Vehicle routing problems (VRP) and city pipeline optimization can be cast as TSP instances, so improving methods for solving the TSP is very important. The genetic algorithm (GA) is one of the ideal methods for solving it, but the standard genetic algorithm has some limitations. Improving the selection operator and introducing an elite retention strategy ensure the quality of the selection operation. In the mutation operation, adaptive operator selection improves the quality of the search and of the variation introduced. After chromosome evolution, a one-way reverse-evolution operation is added, which gives offspring more opportunities to inherit high-quality parental genes and improves the algorithm's ability to find the optimal solution.
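
    A minimal version of such an improved GA for the TSP, with elite retention and a segment-reversal (reverse-evolution-style) mutation, might look as follows; it is an illustrative sketch on random cities, not the authors' implementation.

        # Minimal genetic algorithm for the TSP with elite retention and a segment-reversal
        # mutation. Cities are random points; this is an illustration, not the authors' code.
        import math
        import random

        random.seed(0)
        cities = [(random.random(), random.random()) for _ in range(20)]

        def tour_length(tour):
            return sum(math.dist(cities[tour[i]], cities[tour[(i + 1) % len(tour)]])
                       for i in range(len(tour)))

        def order_crossover(p1, p2):
            a, b = sorted(random.sample(range(len(p1)), 2))
            child = [None] * len(p1)
            child[a:b] = p1[a:b]
            fill = [c for c in p2 if c not in child]
            return [fill.pop(0) if gene is None else gene for gene in child]

        def reverse_mutation(tour):
            a, b = sorted(random.sample(range(len(tour)), 2))
            return tour[:a] + tour[a:b][::-1] + tour[b:]

        population = [random.sample(range(len(cities)), len(cities)) for _ in range(50)]
        for _ in range(200):
            population.sort(key=tour_length)
            elite = population[:5]                          # elite retention strategy
            offspring = []
            while len(offspring) < len(population) - len(elite):
                p1, p2 = random.sample(population[:25], 2)  # select parents from better half
                child = order_crossover(p1, p2)
                if random.random() < 0.3:
                    child = reverse_mutation(child)
                offspring.append(child)
            population = elite + offspring
        print("best tour length:", round(tour_length(min(population, key=tour_length)), 3))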

  3. A Bayesian-based two-stage inexact optimization method for supporting stream water quality management in the Three Gorges Reservoir region.

    PubMed

    Hu, X H; Li, Y P; Huang, G H; Zhuang, X W; Ding, X W

    2016-05-01

    In this study, a Bayesian-based two-stage inexact optimization (BTIO) method is developed for supporting water quality management through coupling Bayesian analysis with interval two-stage stochastic programming (ITSP). The BTIO method is capable of addressing uncertainties caused by insufficient inputs in water quality model as well as uncertainties expressed as probabilistic distributions and interval numbers. The BTIO method is applied to a real case of water quality management for the Xiangxi River basin in the Three Gorges Reservoir region to seek optimal water quality management schemes under various uncertainties. Interval solutions for production patterns under a range of probabilistic water quality constraints have been generated. Results obtained demonstrate compromises between the system benefit and the system failure risk due to inherent uncertainties that exist in various system components. Moreover, information about pollutant emission is accomplished, which would help managers to adjust production patterns of regional industry and local policies considering interactions of water quality requirement, economic benefit, and industry structure.

  4. Spatial multiobjective optimization of agricultural conservation practices using a SWAT model and an evolutionary algorithm.

    PubMed

    Rabotyagov, Sergey; Campbell, Todd; Valcu, Adriana; Gassman, Philip; Jha, Manoj; Schilling, Keith; Wolter, Calvin; Kling, Catherine

    2012-12-09

    Finding the cost-efficient (i.e., lowest-cost) ways of targeting conservation practice investments for the achievement of specific water quality goals across the landscape is of primary importance in watershed management. Traditional economics methods of finding the lowest-cost solution in the watershed context (e.g.,(5,12,20)) assume that off-site impacts can be accurately described as a proportion of on-site pollution generated. Such approaches are unlikely to be representative of the actual pollution process in a watershed, where the impacts of polluting sources are often determined by complex biophysical processes. The use of modern physically-based, spatially distributed hydrologic simulation models allows for a greater degree of realism in terms of process representation but requires a development of a simulation-optimization framework where the model becomes an integral part of optimization. Evolutionary algorithms appear to be a particularly useful optimization tool, able to deal with the combinatorial nature of a watershed simulation-optimization problem and allowing the use of the full water quality model. Evolutionary algorithms treat a particular spatial allocation of conservation practices in a watershed as a candidate solution and utilize sets (populations) of candidate solutions iteratively applying stochastic operators of selection, recombination, and mutation to find improvements with respect to the optimization objectives. The optimization objectives in this case are to minimize nonpoint-source pollution in the watershed, simultaneously minimizing the cost of conservation practices. A recent and expanding set of research is attempting to use similar methods and integrates water quality models with broadly defined evolutionary optimization methods(3,4,9,10,13-15,17-19,22,23,25). In this application, we demonstrate a program which follows Rabotyagov et al.'s approach and integrates a modern and commonly used SWAT water quality model(7) with a multiobjective evolutionary algorithm SPEA2(26), and user-specified set of conservation practices and their costs to search for the complete tradeoff frontiers between costs of conservation practices and user-specified water quality objectives. The frontiers quantify the tradeoffs faced by the watershed managers by presenting the full range of costs associated with various water quality improvement goals. The program allows for a selection of watershed configurations achieving specified water quality improvement goals and a production of maps of optimized placement of conservation practices.

  5. Elaborate ligand-based modeling coupled with QSAR analysis and in silico screening reveal new potent acetylcholinesterase inhibitors.

    PubMed

    Abuhamdah, Sawsan; Habash, Maha; Taha, Mutasem O

    2013-12-01

    Inhibition of the enzyme acetylcholinesterase (AChE) has been shown to alleviate neurodegenerative diseases prompting several attempts to discover and optimize new AChE inhibitors. In this direction, we explored the pharmacophoric space of 85 AChE inhibitors to identify high quality pharmacophores. Subsequently, we implemented genetic algorithm-based quantitative structure-activity relationship (QSAR) modeling to select optimal combination of pharmacophoric models and 2D physicochemical descriptors capable of explaining bioactivity variation among training compounds (r2(68)=0.94, F-statistic=125.8, r2 LOO=0.92, r2 PRESS against 17 external test inhibitors = 0.84). Two orthogonal pharmacophores emerged in the QSAR equation suggesting the existence of at least two binding modes accessible to ligands within AChE binding pocket. The successful pharmacophores were comparable with crystallographically resolved AChE binding pocket. We employed the pharmacophoric models and associated QSAR equation to screen the national cancer institute list of compounds. Twenty-four low micromolar AChE inhibitors were identified. The most potent gave IC50 value of 1.0 μM.

  6. Modeling an aquatic ecosystem: application of an evolutionary algorithm with genetic doping to reduce prediction uncertainty

    NASA Astrophysics Data System (ADS)

    Friedel, Michael; Buscema, Massimo

    2016-04-01

    Aquatic ecosystem models can potentially be used to understand the influence of stresses on catchment resource quality. Given that catchment responses are functions of natural and anthropogenic stresses reflected in sparse and spatiotemporal biological, physical, and chemical measurements, an ecosystem is difficult to model using statistical or numerical methods. We propose an artificial adaptive systems approach to model ecosystems. First, an unsupervised machine-learning (ML) network is trained using the set of available sparse and disparate data variables. Second, an evolutionary algorithm with genetic doping is applied to reduce the number of ecosystem variables to an optimal set. Third, the optimal set of ecosystem variables is used to retrain the ML network. Fourth, a stochastic cross-validation approach is applied to quantify and compare the nonlinear uncertainty in selected predictions of the original and reduced models. Results are presented for aquatic ecosystems (tens of thousands of square kilometers) undergoing landscape change in the USA: Upper Illinois River Basin and Central Colorado Assessment Project Area, and Southland region, NZ.

  7. Additive Manufacturing in Production: A Study Case Applying Technical Requirements

    NASA Astrophysics Data System (ADS)

    Ituarte, Iñigo Flores; Coatanea, Eric; Salmi, Mika; Tuomi, Jukka; Partanen, Jouni

    Additive manufacturing (AM) is expanding the manufacturing capabilities. However, quality of AM produced parts is dependent on a number of machine, geometry and process parameters. The variability of these parameters affects the manufacturing drastically and therefore standardized processes and harmonized methodologies need to be developed to characterize the technology for end use applications and enable the technology for manufacturing. This research proposes a composite methodology integrating Taguchi Design of Experiments, multi-objective optimization and statistical process control, to optimize the manufacturing process and fulfil multiple requirements imposed to an arbitrary geometry. The proposed methodology aims to characterize AM technology depending upon manufacturing process variables as well as to perform a comparative assessment of three AM technologies (Selective Laser Sintering, Laser Stereolithography and Polyjet). Results indicate that only one machine, laser-based Stereolithography, was feasible to fulfil simultaneously macro and micro level geometrical requirements but mechanical properties were not at required level. Future research will study a single AM system at the time to characterize AM machine technical capabilities and stimulate pre-normative initiatives of the technology for end use applications.

  8. Compositional Models of Glass/Melt Properties and their Use for Glass Formulation

    DOE PAGES

    Vienna, John D.; USA, Richland Washington

    2014-12-18

    Nuclear waste glasses must simultaneously meet a number of criteria related to their processability, product quality, and cost factors. The properties that must be controlled in glass formulation and waste vitrification plant operation tend to vary smoothly with composition allowing for glass property-composition models to be developed and used. Models have been fit to the key glass properties. The properties are transformed so that simple functions of composition (e.g., linear, polynomial, or component ratios) can be used as model forms. The model forms are fit to experimental data designed statistically to efficiently cover the composition space of interest. Examples of these models are found in literature. The glass property-composition models, their uncertainty definitions, property constraints, and optimality criteria are combined to formulate optimal glass compositions, control composition in vitrification plants, and to qualify waste glasses for disposal. An overview of current glass property-composition modeling techniques is summarized in this paper along with an example of how those models are applied to glass formulation and product qualification at the planned Hanford high-level waste vitrification plant.

  9. Tolerance allocation for an electronic system using neural network/Monte Carlo approach

    NASA Astrophysics Data System (ADS)

    Al-Mohammed, Mohammed; Esteve, Daniel; Boucher, Jaque

    2001-12-01

    The intense global competition to produce quality products at low cost has led many industrial nations to consider tolerances a key factor in reducing cost as well as in remaining competitive. In practice, tolerance allocation is still applied mainly to mechanical systems. To study tolerances in the electronic domain, the Monte Carlo method is typically used, but it is computationally time-consuming. This paper reviews several methods (worst-case, statistical, and least-cost allocation by optimization) that can be used to treat the tolerancing problem for an electronic system and explains their advantages and limitations. It then proposes an efficient method based on neural networks, with the Monte Carlo method providing the underlying data. The network is trained using the error back-propagation algorithm to predict the individual part tolerances, minimizing the total cost of the system through an optimization method. The proposed approach has been applied to a small-signal amplifier circuit as an example and can easily be extended to a complex system of n components.
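
    The Monte Carlo step that supplies the underlying data can be illustrated on a simple circuit quantity. The sketch below propagates hypothetical resistor tolerances through the gain of a non-inverting amplifier, G = 1 + R2/R1; the component values, tolerances, and yield criterion are placeholders, not those of the paper's circuit.

        # Monte Carlo tolerance analysis sketch: spread of the gain G = 1 + R2/R1 of a
        # non-inverting amplifier when both resistors vary within their tolerance bands.
        # Component values, tolerances, and the yield criterion are hypothetical.
        import numpy as np

        rng = np.random.default_rng(0)
        n = 100_000
        R1 = rng.uniform(9_900, 10_100, n)       # 10 kOhm, +/- 1 % tolerance
        R2 = rng.uniform(98_000, 102_000, n)     # 100 kOhm, +/- 2 % tolerance
        gain = 1.0 + R2 / R1

        print("mean gain:", round(gain.mean(), 3))
        print("std dev:  ", round(gain.std(), 4))
        print("yield within +/- 2 % of nominal gain 11:",
              round(float(np.mean(np.abs(gain - 11.0) / 11.0 <= 0.02)), 3))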

  10. Optimal and fast E/B separation with a dual messenger field

    NASA Astrophysics Data System (ADS)

    Kodi Ramanah, Doogesh; Lavaux, Guilhem; Wandelt, Benjamin D.

    2018-05-01

    We adapt our recently proposed dual messenger algorithm for spin field reconstruction and showcase its efficiency and effectiveness in Wiener filtering polarized cosmic microwave background (CMB) maps. Unlike conventional preconditioned conjugate gradient (PCG) solvers, our preconditioner-free technique can deal with high-resolution joint temperature and polarization maps with inhomogeneous noise distributions and arbitrary mask geometries with relative ease. Various convergence diagnostics illustrate the high quality of the dual messenger reconstruction. In contrast, the PCG implementation fails to converge to a reasonable solution for the specific problem considered. The implementation of the dual messenger method is straightforward and guarantees numerical stability and convergence. We show how the algorithm can be modified to generate fluctuation maps, which, combined with the Wiener filter solution, yield unbiased constrained signal realizations, consistent with observed data. This algorithm presents a pathway to exact global analyses of high-resolution and high-sensitivity CMB data for a statistically optimal separation of E and B modes. It is therefore relevant for current and next-generation CMB experiments, in the quest for the elusive primordial B-mode signal.

  11. A novel non-uniform control vector parameterization approach with time grid refinement for flight level tracking optimal control problems.

    PubMed

    Liu, Ping; Li, Guodong; Liu, Xinggao; Xiao, Long; Wang, Yalin; Yang, Chunhua; Gui, Weihua

    2018-02-01

    High quality control method is essential for the implementation of aircraft autopilot system. An optimal control problem model considering the safe aerodynamic envelop is therefore established to improve the control quality of aircraft flight level tracking. A novel non-uniform control vector parameterization (CVP) method with time grid refinement is then proposed for solving the optimal control problem. By introducing the Hilbert-Huang transform (HHT) analysis, an efficient time grid refinement approach is presented and an adaptive time grid is automatically obtained. With this refinement, the proposed method needs fewer optimization parameters to achieve better control quality when compared with uniform refinement CVP method, whereas the computational cost is lower. Two well-known flight level altitude tracking problems and one minimum time cost problem are tested as illustrations and the uniform refinement control vector parameterization method is adopted as the comparative base. Numerical results show that the proposed method achieves better performances in terms of optimization accuracy and computation cost; meanwhile, the control quality is efficiently improved. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.

  12. Nonlinear Multiobjective MPC-Based Optimal Operation of a High Consistency Refining System in Papermaking

    DOE PAGES

    Li, Mingjie; Zhou, Ping; Wang, Hong; ...

    2017-09-19

    As one of the most important units in the papermaking industry, the high consistency (HC) refining system is confronted with challenges such as improving pulp quality, energy saving, and emissions reduction in its operation processes. In this correspondence, optimal operation of the HC refining system is presented using nonlinear multiobjective model predictive control strategies that aim at the set-point tracking objective of pulp quality, the economic objective, and the specific energy (SE) consumption objective, respectively. First, a set of input and output data at different times is employed to construct the subprocess model of the state process model for the HC refining system, and then the Wiener-type model is obtained by combining the mechanism model of Canadian Standard Freeness with the state process model, whose structures are determined using the Akaike information criterion. Second, a multiobjective optimization strategy that simultaneously optimizes both the set-point tracking objective of pulp quality and SE consumption is proposed, which uses the NSGA-II approach to obtain the Pareto optimal set. Furthermore, targeting the set-point tracking objective of pulp quality, the economic objective, and the SE consumption objective, the sequential quadratic programming method is utilized to produce the optimal predictive controllers. The simulation results demonstrate that the proposed methods enable the HC refining system to provide better set-point tracking of pulp quality when these predictive controllers are employed. In addition, when the optimal predictive controllers are oriented toward the comprehensive economic objective and the SE consumption objective, they are shown to significantly reduce energy consumption.

  13. Nonlinear Multiobjective MPC-Based Optimal Operation of a High Consistency Refining System in Papermaking

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Mingjie; Zhou, Ping; Wang, Hong

    As one of the most important units in the papermaking industry, the high consistency (HC) refining system is confronted with challenges such as improving pulp quality, energy saving, and emissions reduction in its operation processes. In this correspondence, optimal operation of the HC refining system is presented using nonlinear multiobjective model predictive control strategies that aim at the set-point tracking objective of pulp quality, the economic objective, and the specific energy (SE) consumption objective, respectively. First, a set of input and output data at different times is employed to construct the subprocess model of the state process model for the HC refining system, and then the Wiener-type model is obtained by combining the mechanism model of Canadian Standard Freeness with the state process model, whose structures are determined using the Akaike information criterion. Second, a multiobjective optimization strategy that simultaneously optimizes both the set-point tracking objective of pulp quality and SE consumption is proposed, which uses the NSGA-II approach to obtain the Pareto optimal set. Furthermore, targeting the set-point tracking objective of pulp quality, the economic objective, and the SE consumption objective, the sequential quadratic programming method is utilized to produce the optimal predictive controllers. The simulation results demonstrate that the proposed methods enable the HC refining system to provide better set-point tracking of pulp quality when these predictive controllers are employed. In addition, when the optimal predictive controllers are oriented toward the comprehensive economic objective and the SE consumption objective, they are shown to significantly reduce energy consumption.

  14. Experimental Optimization of Exposure Index and Quality of Service in Wlan Networks.

    PubMed

    Plets, David; Vermeeren, Günter; Poorter, Eli De; Moerman, Ingrid; Goudos, Sotirios K; Luc, Martens; Wout, Joseph

    2017-07-01

    This paper presents the first real-life optimization of the Exposure Index (EI). A genetic optimization algorithm is developed and applied to three real-life Wireless Local Area Network scenarios in an experimental testbed. The optimization accounts for downlink, uplink and uplink of other users, for realistic duty cycles, and ensures a sufficient Quality of Service to all users. EI reductions up to 97.5% compared to a reference configuration can be achieved in a downlink-only scenario, in combination with an improved Quality of Service. Due to the dominance of uplink exposure and the lack of WiFi power control, no optimizations are possible in scenarios that also consider uplink traffic. However, future deployments that do implement WiFi power control can be successfully optimized, with EI reductions up to 86% compared to a reference configuration and an EI that is 278 times lower than optimized configurations under the absence of power control. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  15. Spatiotemporal variability of drinking water quality and the associated health risks in southwestern towns of Ethiopia.

    PubMed

    Sisay, Tadesse; Beyene, Abebe; Alemayehu, Esayas

    2017-10-18

    The failure to provide safe drinking water services to all people is the greatest development setback of the twenty-first century, including in Ethiopia. Potential pollutants from various sources are deteriorating drinking water quality in different seasons, and the associated health risks are not clearly known. We determined seasonal and spatial variations of urban drinking water characteristics and the associated health risks in Agaro, Jimma, and Metu towns, Southwest Ethiopia. Seventy-two samples were collected during the dry and rainy seasons of 2014 and 2015. The majority (87.4%) of physicochemical parameters were found within the recommended limits. However, free residual chlorine in the Jimma and Agaro town water sources was lower than the recommended limit and negatively correlated with total and fecal coliform counts (r = -0.585 and -0.638). Statistically significant differences were observed in pH, turbidity, and total coliform between the dry and rainy seasons (p < 0.05). A Kruskal-Wallis H test revealed a statistically significant difference in electrical conductivity, total hardness, fluoride, iron, and fecal coliform across the study towns (p < 0.05). The Agaro town water source had the highest fluoride concentration (3.15 mg/l). The daily exposure level for the high fluoride concentration in Agaro town was estimated at between 0.19 and 0.41 mg/kg/day, and the average cumulative hazard index of fluoride was > 3.13 for all age groups. Water quality variations were observed in all conventional water treatment systems in the rainy season, and further research should focus on optimizing these systems to safeguard the public.
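
    The exposure figures quoted above follow the standard chronic-daily-intake and hazard-quotient bookkeeping, CDI = C x IR / BW and HQ = CDI / RfD. The sketch below applies these textbook formulas to the reported Agaro fluoride concentration with placeholder intake rate, body weight, and reference dose; it does not reproduce the study's age-group calculation.

        # Standard exposure bookkeeping behind a fluoride hazard index:
        # chronic daily intake CDI = C * IR / BW and hazard quotient HQ = CDI / RfD.
        # The intake rate, body weight, and reference dose are placeholders,
        # not the values used in the study.
        concentration = 3.15      # mg/L fluoride (reported for the Agaro source)
        intake_rate = 2.0         # L/day (placeholder)
        body_weight = 60.0        # kg (placeholder, adult)
        reference_dose = 0.06     # mg/kg/day (placeholder RfD)

        cdi = concentration * intake_rate / body_weight
        hq = cdi / reference_dose
        print(f"chronic daily intake = {cdi:.3f} mg/kg/day, hazard quotient = {hq:.2f}")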

  16. A Simple and Practical Index to Measure Dementia-Related Quality of Life.

    PubMed

    Arons, Alexander M M; Schölzel-Dorenbos, Carla J M; Olde Rikkert, Marcel G M; Krabbe, Paul F M

    2016-01-01

    Research on new treatments for dementia is gaining pace worldwide in an effort to alleviate this growing health care problem. The optimal evaluation of such interventions, however, calls for a practical and credible patient-reported outcome measure. To describe the refinement of the Dementia Quality-of-life Instrument (DQI) and present its revised version. A prototype of the DQI was adapted to cover a broader range of health-related quality of life (HRQOL) and to improve consistency in the descriptions of its domains. A valuation study was then conducted to assign meaningful numbers to all DQI health states. Pairs of DQI states were presented to a sample of professionals working with people with dementia and a representative sample of the Dutch population. They had to repeatedly select the best DQI state, and their responses were statistically modeled to obtain values for each health state. In total, 207 professionals working with people with dementia and 631 members of the general population completed the paired comparison tasks. Statistically significant differences between the two samples were found for the domains of social functioning, mood, and memory. Severe problems with physical health and severe memory problems were deemed most important by the general population. In contrast, severe mood problems were considered most important by professionals working with people with dementia. The DQI is a simple and feasible measurement instrument that expresses the overall HRQOL of people suffering from dementia in a single meaningful number. Current results suggest that revisiting the discussion of using values from the general population might be warranted in the dementia context. Copyright © 2016 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  17. Computed tomography imaging with the Adaptive Statistical Iterative Reconstruction (ASIR) algorithm: dependence of image quality on the blending level of reconstruction.

    PubMed

    Barca, Patrizio; Giannelli, Marco; Fantacci, Maria Evelina; Caramella, Davide

    2018-06-01

    Computed tomography (CT) is a useful and widely employed imaging technique, which represents the largest source of population exposure to ionizing radiation in industrialized countries. Adaptive Statistical Iterative Reconstruction (ASIR) is an iterative reconstruction algorithm with the potential to allow reduction of radiation exposure while preserving diagnostic information. The aim of this phantom study was to assess the performance of ASIR, in terms of a number of image quality indices, when different reconstruction blending levels are employed. CT images of the Catphan-504 phantom were reconstructed using conventional filtered back-projection (FBP) and ASIR with reconstruction blending levels of 20, 40, 60, 80, and 100%. Noise, noise power spectrum (NPS), contrast-to-noise ratio (CNR) and modulation transfer function (MTF) were estimated for different scanning parameters and contrast objects. Noise decreased and CNR increased non-linearly up to 50 and 100%, respectively, with increasing blending level of reconstruction. Also, ASIR has proven to modify the NPS curve shape. The MTF of ASIR reconstructed images depended on tube load/contrast and decreased with increasing blending level of reconstruction. In particular, for low radiation exposure and low contrast acquisitions, ASIR showed lower performance than FBP, in terms of spatial resolution for all blending levels of reconstruction. CT image quality varies substantially with the blending level of reconstruction. ASIR has the potential to reduce noise whilst maintaining diagnostic information in low radiation exposure CT imaging. Given the opposite variation of CNR and spatial resolution with the blending level of reconstruction, it is recommended to use an optimal value of this parameter for each specific clinical application.
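
    One common way to quantify CNR from such phantom images uses two regions of interest, CNR = |mean(object) - mean(background)| / std(background); the sketch below applies this definition to synthetic pixel arrays and is not the authors' analysis code.

        # One common contrast-to-noise ratio definition from two regions of interest:
        # CNR = |mean(object ROI) - mean(background ROI)| / std(background ROI).
        # The pixel arrays are synthetic placeholders.
        import numpy as np

        rng = np.random.default_rng(0)
        object_roi = rng.normal(loc=120.0, scale=8.0, size=(30, 30))       # contrast insert
        background_roi = rng.normal(loc=100.0, scale=8.0, size=(30, 30))   # uniform background

        cnr = abs(object_roi.mean() - background_roi.mean()) / background_roi.std()
        print("CNR:", round(float(cnr), 2))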

  18. Positive factors associated with quality of life among Chinese patients with renal carcinoma: a cross-sectional study.

    PubMed

    Liu, Jiao; Gong, Da-Xin; Zeng, Yu; Li, Zhen-Hua; Kong, Chui-Ze

    2018-01-01

    Quality of life and positive psychological variables have become a focus of concern in patients with renal carcinoma. However, the integrative effects of positive psychological variables on the illness have seldom been reported. The aims of this study were to evaluate the quality of life and the integrative effects of hope, resilience and optimism on the quality of life among Chinese renal carcinoma patients. A cross-sectional study was conducted at the First Hospital of China Medical University. A total of 284 participants completed questionnaires consisting of demographic and clinical characteristics, the EORTC QLQ-C30, the Adult Hope Scale, the Resilience Scale-14 and the Life Orientation Scale-Revised from July 2013 to July 2014. Pearson's correlation and hierarchical regression analyses were performed to explore the effects of related factors. Hope, resilience and optimism were significantly associated with quality of life. Hierarchical regression analyses indicated that hope, resilience and optimism as a whole accounted for 9.8, 24.4 and 21.9% of the variance in global health status, functioning status and symptom status, respectively. The low level of quality of life of Chinese renal carcinoma patients should receive more attention from Chinese medical institutions. Psychological interventions to increase hope, resilience and optimism may be essential to enhancing the quality of life of Chinese cancer patients.

  19. Horsetail matching: a flexible approach to optimization under uncertainty

    NASA Astrophysics Data System (ADS)

    Cook, L. W.; Jarrett, J. P.

    2018-04-01

    It is important to design engineering systems to be robust with respect to uncertainties in the design process. Often, this is done by considering statistical moments, but over-reliance on statistical moments when formulating a robust optimization can produce designs that are stochastically dominated by other feasible designs. This article instead proposes a formulation for optimization under uncertainty that minimizes the difference between a design's cumulative distribution function and a target. A standard target is proposed that produces stochastically non-dominated designs, but the formulation also offers enough flexibility to recover existing approaches for robust optimization. A numerical implementation is developed that employs kernels to give a differentiable objective function. The method is applied to algebraic test problems and a robust transonic airfoil design problem where it is compared to multi-objective, weighted-sum and density matching approaches to robust optimization; several advantages over these existing methods are demonstrated.
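
    The core idea, minimizing the mismatch between a design's cumulative distribution function and a target, can be illustrated very roughly by comparing sample quantiles of a quantity of interest with a target at fixed probability levels. The toy sketch below does exactly that for a hypothetical one-variable design problem and omits the kernel smoothing the article uses to make the metric differentiable.

        # Toy illustration of the idea behind horsetail matching: measure how far a design's
        # empirical CDF (represented by its sample quantiles) lies from a target CDF.
        # The kernel smoothing used in the article to make this differentiable is omitted.
        import numpy as np

        rng = np.random.default_rng(0)
        noise = rng.normal(size=2000)                 # one common sample of the uncertain input

        def performance(design, noise):
            return (design - 1.0) ** 2 + 0.5 * noise  # hypothetical quantity of interest

        def cdf_mismatch(design, n_quantiles=50):
            q = np.quantile(performance(design, noise), np.linspace(0.01, 0.99, n_quantiles))
            return float(np.sum(q ** 2))              # squared distance to an all-zero target

        designs = np.linspace(0.0, 2.0, 21)
        best = min(designs, key=cdf_mismatch)
        print("design minimizing the CDF mismatch:", round(float(best), 2))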

  20. Scenario based optimization of a container vessel with respect to its projected operating conditions

    NASA Astrophysics Data System (ADS)

    Wagner, Jonas; Binkowski, Eva; Bronsart, Robert

    2014-06-01

    In this paper the scenario-based optimization of the bulbous bow of the KRISO Container Ship (KCS) is presented. The optimization of the parametrically modeled vessel is based on a statistically developed operational profile generated from noon-to-noon reports of a comparable 3600 TEU container vessel and specific development functions representing the growth of the global economy during the vessel's service time. In order to consider uncertainties, statistical fluctuations are added. An analysis of these data leads to a number of the most probable operating conditions (OC) the vessel will encounter in the future. According to their respective likelihood, an objective function for the evaluation of the optimal design variant of the vessel is derived and implemented within the parametric optimization workbench FRIENDSHIP Framework. This evaluation is carried out with respect to the vessel's calculated effective power based on a potential flow code. The evaluation shows that the use of scenarios within the optimization process has a strong influence on the hull form.

  1. Statistics Report on TEQSA Registered Higher Education Providers, 2016

    ERIC Educational Resources Information Center

    Australian Government Tertiary Education Quality and Standards Agency, 2016

    2016-01-01

    This Statistics Report is the third release of selected higher education sector data held by the Australian Government Tertiary Education Quality and Standards Agency (TEQSA) for its quality assurance activities. It provides a snapshot of national statistics on all parts of the sector by bringing together data collected directly by TEQSA with data…

  2. Investigation of statistical iterative reconstruction for dedicated breast CT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Makeev, Andrey; Glick, Stephen J.

    2013-08-15

    Purpose: Dedicated breast CT has great potential for improving the detection and diagnosis of breast cancer. Statistical iterative reconstruction (SIR) in dedicated breast CT is a promising alternative to traditional filtered backprojection (FBP). One of the difficulties in using SIR is the presence of free parameters in the algorithm that control the appearance of the resulting image. These parameters require tuning in order to achieve high quality reconstructions. In this study, the authors investigated the penalized maximum likelihood (PML) method with two commonly used types of roughness penalty functions: hyperbolic potential and anisotropic total variation (TV) norm. Reconstructed images were compared with images obtained using standard FBP. Optimal parameters for PML with the hyperbolic prior are reported for the task of detecting microcalcifications embedded in breast tissue. Methods: Computer simulations were used to acquire projections in a half-cone beam geometry. The modeled setup describes a realistic breast CT benchtop system, with an x-ray spectrum produced by a point source and an a-Si, CsI:Tl flat-panel detector. A voxelized anthropomorphic breast phantom with 280 μm microcalcification spheres embedded in it was used to model attenuation properties of the uncompressed woman's breast in a pendant position. The reconstruction of 3D images was performed using the separable paraboloidal surrogates algorithm with ordered subsets. Task performance was assessed with the ideal observer detectability index to determine optimal PML parameters. Results: The authors' findings suggest that there is a preferred range of values of the roughness penalty weight and the edge preservation threshold in the penalized objective function with the hyperbolic potential, which resulted in low noise images with high contrast microcalcifications preserved. In terms of numerical observer detectability index, the PML method with optimal parameters yielded substantially improved performance (by a factor of greater than 10) compared to FBP. The hyperbolic prior was also observed to be superior to the TV norm. A few of the best-performing parameter pairs for the PML method also demonstrated superior performance for various radiation doses. In fact, using PML with certain parameter values results in better images, acquired using 2 mGy dose, than FBP-reconstructed images acquired using 6 mGy dose. Conclusions: A range of optimal free parameters for the PML algorithm with hyperbolic and TV norm-based potentials is presented for the microcalcification detection task in dedicated breast CT. The reported values can be used as starting values of the free parameters when SIR techniques are used for image reconstruction. Significant improvement in image quality can be achieved by using PML with an optimal combination of parameters, as compared to FBP. Importantly, these results suggest improved detection of microcalcifications can be obtained by using PML with lower radiation dose to the patient than using FBP with higher dose.
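
    A typical form of the penalized maximum likelihood objective with a hyperbolic (edge-preserving) roughness potential, of the kind tuned in this study, is

        \hat{x} = \arg\max_{x} \; L(y \mid x) - \beta \sum_{j} \sum_{k \in N_j} \psi_\delta(x_j - x_k),
        \qquad
        \psi_\delta(t) = \delta^2 \left( \sqrt{1 + (t/\delta)^2} - 1 \right),

    where \beta is the roughness penalty weight, \delta the edge-preservation threshold, and N_j the neighborhood of voxel j. This is a standard formulation and may differ in detail from the exact parameterization used by the authors.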

  3. Interrupted Time Series Versus Statistical Process Control in Quality Improvement Projects.

    PubMed

    Andersson Hagiwara, Magnus; Andersson Gäre, Boel; Elg, Mattias

    2016-01-01

    To measure the effect of quality improvement interventions, it is appropriate to use analysis methods that measure data over time. Examples of such methods include statistical process control analysis and interrupted time series with segmented regression analysis. This article compares the use of statistical process control analysis and interrupted time series with segmented regression analysis for evaluating the longitudinal effects of quality improvement interventions, using as an example a study evaluating a computerized decision support system.
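
    As a rough illustration of the interrupted-time-series side of that comparison, the Python sketch below (simulated data only, not from the example study) fits the usual segmented regression with a level-change term at the intervention and a slope-change term afterwards.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated monthly quality indicator: 24 months before and 24 after an intervention
rng = np.random.default_rng(1)
n_pre, n_post = 24, 24
t = np.arange(n_pre + n_post)
intervention = (t >= n_pre).astype(int)          # 0 before, 1 after
time_since = np.where(intervention == 1, t - n_pre, 0)
y = 50 + 0.1 * t - 5 * intervention - 0.3 * time_since + rng.normal(0, 2, t.size)

df = pd.DataFrame({"y": y, "t": t, "intervention": intervention, "time_since": time_since})

# Segmented regression: baseline trend, level change, and slope change
model = smf.ols("y ~ t + intervention + time_since", data=df).fit()
print(model.params)
```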

  4. Multilayer perceptron architecture optimization using parallel computing techniques.

    PubMed

    Castro, Wilson; Oblitas, Jimy; Santa-Cruz, Roberto; Avila-George, Himer

    2017-01-01

    The objective of this research was to develop a methodology for optimizing multilayer-perceptron-type neural networks by evaluating the effects of three neural architecture parameters, namely, number of hidden layers (HL), neurons per hidden layer (NHL), and activation function type (AF), on the sum of squares error (SSE). The data for the study were obtained from quality parameters (physicochemical and microbiological) of milk samples. Architectures or combinations were organized in groups (G1, G2, and G3) generated by interspersing one, two, and three hidden layers. Within each group, the networks had three neurons in the input layer, six neurons in the output layer, three to twenty-seven NHL, and three AF (tan-sig, log-sig, and linear) types. The number of architectures was determined using three factorial-type experimental designs, which reached 63, 2 187, and 50 049 combinations for G1, G2 and G3, respectively. Using MATLAB 2015a, a logical sequence was designed and implemented for constructing, training, and evaluating multilayer-perceptron-type neural networks using parallel computing techniques. The results show that HL and NHL have a statistically relevant effect on SSE and that, from two hidden layers onward, AF also has a significant effect; thus, both AF and NHL can be evaluated to determine the optimal combination per group. Moreover, in the three study groups, it is observed that there is an inverse relationship between the number of processors and the total optimization time.
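
    The original work was implemented in MATLAB 2015a; the hedged Python sketch below mimics the same idea on placeholder data, evaluating a grid of hidden-layer counts, neurons per layer, and activation functions in parallel and ranking architectures by SSE (the data, parameter ranges, and subsampled NHL values are illustrative, not the study's milk-quality dataset).

```python
from itertools import product
import numpy as np
from joblib import Parallel, delayed
from sklearn.neural_network import MLPRegressor

# Placeholder data standing in for the milk-quality measurements (3 inputs, 6 outputs)
rng = np.random.default_rng(2)
X = rng.random((200, 3))
Y = rng.random((200, 6))

hidden_layers = [1, 2, 3]                        # HL
neurons = range(3, 28, 6)                        # NHL, subsampled for brevity
activations = ["tanh", "logistic", "identity"]   # analogues of tan-sig, log-sig, linear

def sse_for(hl, nhl, act):
    """Train one architecture and return its sum of squares error on the data."""
    net = MLPRegressor(hidden_layer_sizes=(nhl,) * hl, activation=act,
                       max_iter=500, random_state=0)
    net.fit(X, Y)
    return hl, nhl, act, float(((net.predict(X) - Y) ** 2).sum())

# Evaluate the architecture grid in parallel, mirroring the factorial-design idea
results = Parallel(n_jobs=-1)(
    delayed(sse_for)(hl, nhl, act)
    for hl, nhl, act in product(hidden_layers, neurons, activations))
print(min(results, key=lambda r: r[-1]))
```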

  5. Detection of proximal caries using digital radiographic systems with different resolutions.

    PubMed

    Nikneshan, Sima; Abbas, Fatemeh Mashhadi; Sabbagh, Sedigheh

    2015-01-01

    Dental radiography is an important tool for detection of caries, and digital radiography is the latest advancement in this regard. Spatial resolution is a characteristic of digital receptors used for describing image quality. This diagnostic accuracy study aimed to compare two digital radiographic systems at three different resolutions for detection of noncavitated proximal caries. Seventy premolar teeth were mounted in 14 gypsum blocks. Digora Optime and RVG Access systems were used for obtaining digital radiographs. Six observers evaluated the proximal surfaces in the radiographs at each resolution in order to determine the depth of caries based on a 4-point scale. The teeth were then histologically sectioned, and the results of the histologic analysis were considered the gold standard. Data were analyzed using SPSS version 18 software with the Kruskal-Wallis test; P < 0.05 was considered statistically significant. No significant difference was found between the resolutions for detection of proximal caries (P > 0.05). The RVG Access system had the highest specificity (87.7%), and Digora Optime at high resolution had the lowest specificity (84.2%). Furthermore, Digora Optime had higher sensitivity for detection of caries exceeding the outer half of enamel. Judgment of oral radiologists regarding the depth of caries had higher reliability than that of restorative dentistry specialists. The three resolutions of Digora Optime and RVG Access had similar accuracy in detection of noncavitated proximal caries.
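
    The sensitivity and specificity figures quoted above come from comparing observer scores against the histologic gold standard; the short Python sketch below shows one way such figures are computed once the 4-point scores are dichotomized (the threshold and the toy arrays are illustrative, not the study data).

```python
import numpy as np

def sensitivity_specificity(observer_scores, histology, threshold=1):
    """Dichotomize a 4-point caries-depth scale against the histologic gold standard.

    Scores >= threshold are treated as 'caries present'.
    """
    pred = np.asarray(observer_scores) >= threshold
    truth = np.asarray(histology) >= threshold
    tp = np.sum(pred & truth)
    tn = np.sum(~pred & ~truth)
    fp = np.sum(pred & ~truth)
    fn = np.sum(~pred & truth)
    return tp / (tp + fn), tn / (tn + fp)

# Toy example: six surfaces, observer scores vs. histologic depths
print(sensitivity_specificity([0, 1, 2, 0, 3, 0], [0, 1, 1, 1, 2, 0]))
```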

  6. Multilayer perceptron architecture optimization using parallel computing techniques

    PubMed Central

    Castro, Wilson; Oblitas, Jimy; Santa-Cruz, Roberto; Avila-George, Himer

    2017-01-01

    The objective of this research was to develop a methodology for optimizing multilayer-perceptron-type neural networks by evaluating the effects of three neural architecture parameters, namely, number of hidden layers (HL), neurons per hidden layer (NHL), and activation function type (AF), on the sum of squares error (SSE). The data for the study were obtained from quality parameters (physicochemical and microbiological) of milk samples. Architectures or combinations were organized in groups (G1, G2, and G3) generated by interspersing one, two, and three hidden layers. Within each group, the networks had three neurons in the input layer, six neurons in the output layer, three to twenty-seven NHL, and three AF (tan-sig, log-sig, and linear) types. The number of architectures was determined using three factorial-type experimental designs, which reached 63, 2 187, and 50 049 combinations for G1, G2 and G3, respectively. Using MATLAB 2015a, a logical sequence was designed and implemented for constructing, training, and evaluating multilayer-perceptron-type neural networks using parallel computing techniques. The results show that HL and NHL have a statistically relevant effect on SSE and that, from two hidden layers onward, AF also has a significant effect; thus, both AF and NHL can be evaluated to determine the optimal combination per group. Moreover, in the three study groups, it is observed that there is an inverse relationship between the number of processors and the total optimization time. PMID:29236744

  7. An integrated prediction and optimization model of biogas production system at a wastewater treatment facility.

    PubMed

    Akbaş, Halil; Bilgen, Bilge; Turhan, Aykut Melih

    2015-11-01

    This study proposes an integrated prediction and optimization model using multi-layer perceptron neural network and particle swarm optimization techniques. Three different objective functions are formulated. The first is the maximization of methane percentage with a single output. The second is the maximization of biogas production with a single output. The last is the maximization of biogas quality and biogas production with two outputs. Methane percentage, carbon dioxide percentage, and the percentage of other components are used as the biogas quality criteria. Based on the formulated models and data from a wastewater treatment facility, optimal values of the input variables and their corresponding maximum output values are found for each model. It is expected that applying the integrated prediction and optimization models will increase biogas production and biogas quality and contribute to the quantity of electricity produced at the wastewater treatment facility. Copyright © 2015 Elsevier Ltd. All rights reserved.
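
    The abstract does not give implementation details; the Python sketch below shows a minimal particle swarm optimizer of the kind that could be run over a trained neural-network surrogate, with the objective function and input bounds left as placeholders rather than the facility's actual model.

```python
import numpy as np

def particle_swarm_maximize(f, bounds, n_particles=30, iters=100,
                            w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimizer (maximization) over box bounds.

    In the integrated approach, f would be a trained multilayer-perceptron
    surrogate predicting, e.g., methane percentage or biogas volume from the
    controllable plant inputs; here f and bounds are placeholders.
    """
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds).T
    x = rng.uniform(lo, hi, (n_particles, lo.size))
    v = np.zeros_like(x)
    pbest, pbest_val = x.copy(), np.array([f(p) for p in x])
    gbest = pbest[pbest_val.argmax()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        vals = np.array([f(p) for p in x])
        better = vals > pbest_val
        pbest[better], pbest_val[better] = x[better], vals[better]
        gbest = pbest[pbest_val.argmax()].copy()
    return gbest, float(pbest_val.max())

# Toy surrogate standing in for the trained neural network
best_x, best_val = particle_swarm_maximize(lambda p: -np.sum((p - 0.3) ** 2),
                                            bounds=[(0, 1)] * 4)
print(best_x, best_val)
```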

  8. Optimal implementation of best management practices to improve agricultural hydrology and water quality

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Engel, B.; Collingsworth, P.; Pijanowski, B. C.

    2017-12-01

    Nutrient loading from the Maumee River watershed is a significant reason for the harmful algal blooms (HABs) problem in Lake Erie. Strategies to reduce nutrient loading from agricultural areas in the Maumee River watershed need to be explored. Best management practices (BMPs) are popular approaches for improving hydrology and water quality. Various scenarios of BMP implementation were simulated in the AXL watershed (an agricultural watershed within the Maumee River watershed) using the Soil and Water Assessment Tool (SWAT) and a new BMP cost tool to explore the cost-effectiveness of the practices. BMPs of interest included vegetative filter strips, grassed waterways, blind inlets, grade stabilization structures, wetlands, no-till, nutrient management, residue management, and cover crops. The following environmental concerns were considered: streamflow, Total Phosphorus (TP), Dissolved Reactive Phosphorus (DRP), Total Kjeldahl Nitrogen (TKN), and Nitrate+Nitrite (NOx). To obtain maximum hydrological and water quality benefits with minimum cost, an optimization tool was developed to optimally select and place BMPs by connecting SWAT, the BMP cost tool, and optimization algorithms. The optimization tool was then applied in the AXL watershed to explore optimization focusing on critical areas (top 25% of areas with the highest runoff volume/pollutant loads per area) vs. all areas of the watershed, optimization using weather data for spring (March to July, due to the goal of reducing spring phosphorus in the watershed management plan) vs. the full year, and optimization results of implementing BMPs to achieve the watershed management plan goal (reducing 2008 TP levels by 40%). The optimization tool and BMP optimization results can be used by watershed groups and communities to solve hydrology and water quality problems.
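
    The actual tool couples SWAT simulations, the BMP cost tool, and optimization algorithms; as a deliberately simplified stand-in, the Python sketch below selects at most one BMP per field by cost-effectiveness under a budget. All field names, costs, and load reductions are invented for illustration, and the greedy rule is not the paper's algorithm.

```python
# Hypothetical per-field BMP options: (name, annual cost in $, TP reduction in kg/yr)
bmp_options = {
    "field_1": [("no-till", 120.0, 3.5), ("cover crop", 300.0, 5.0), ("filter strip", 450.0, 6.2)],
    "field_2": [("no-till", 100.0, 2.1), ("grassed waterway", 500.0, 7.8)],
    "field_3": [("nutrient mgmt", 80.0, 1.9), ("wetland", 900.0, 10.4)],
}

def greedy_bmp_selection(options, budget):
    """Pick at most one BMP per field by descending reduction-per-dollar until the budget is spent.

    A stand-in for the SWAT-coupled optimization: real cost and load-reduction
    numbers would come from the BMP cost tool and SWAT simulations.
    """
    candidates = sorted(
        ((field, name, cost, red) for field, opts in options.items() for name, cost, red in opts),
        key=lambda c: c[3] / c[2], reverse=True)
    chosen, spent, used_fields = [], 0.0, set()
    for field, name, cost, red in candidates:
        if field not in used_fields and spent + cost <= budget:
            chosen.append((field, name))
            spent += cost
            used_fields.add(field)
    return chosen, spent

print(greedy_bmp_selection(bmp_options, budget=1000.0))
```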

  9. Optimizing a Sensor Network with Data from Hazard Mapping Demonstrated in a Heavy-Vehicle Manufacturing Facility.

    PubMed

    Berman, Jesse D; Peters, Thomas M; Koehler, Kirsten A

    2018-05-28

    The objective was to design a method that uses preliminary hazard mapping data to optimize the number and location of sensors within a network for long-term assessment of occupational concentrations, while preserving the temporal variability, accuracy, and precision of predicted hazards. Particle number concentrations (PNCs) and respirable mass concentrations (RMCs) were measured with direct-reading instruments in a large heavy-vehicle manufacturing facility at 80-82 locations during 7 mapping events, stratified by day and season. Using kriged hazard mapping, a statistical approach identified optimal orders for removing locations so as to retain the temporal variability and high prediction precision of PNC and RMC concentrations. We compared optimal-removal, random-removal, and least-optimal-removal orders to bound prediction performance. The temporal variability of PNC was found to be higher than that of RMC, with low correlation between the two particulate metrics (ρ = 0.30). Optimal-removal orders resulted in more accurate kriged PNC estimates (root mean square error [RMSE] = 49.2) at sample locations compared with the random-removal order (RMSE = 55.7). For estimates at locations having concentrations in the upper 10th percentile, the optimal-removal order preserved average estimated concentrations better than the random- or least-optimal-removal orders (P < 0.01). However, estimated average concentrations using optimal removal were not statistically different from those using random removal when averaged over the entire facility. No statistical difference was observed between the optimal- and random-removal methods for RMCs, which were less variable in time and space than PNCs. Optimal removal performed better than random removal in preserving the high temporal variability and accuracy of the hazard map for PNC, but not for the more spatially homogeneous RMC. These results can be used to reduce the number of locations in a network of static sensors for long-term monitoring of hazards in the workplace, without sacrificing prediction performance.
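
    A hedged Python sketch of the underlying idea follows: krige the mapped concentrations with a Gaussian-process surrogate and score each candidate sensor by how much its removal degrades the reconstructed map. The coordinates, concentrations, and kernel settings are all illustrative, not the facility data or the study's kriging model.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Illustrative stand-in for mapped particle-number concentrations at fixed locations
rng = np.random.default_rng(3)
coords = rng.uniform(0, 100, (40, 2))
conc = np.sin(coords[:, 0] / 15) + np.cos(coords[:, 1] / 20) + 0.1 * rng.standard_normal(40)

def rmse_without(idx):
    """Krige from the remaining locations and score prediction error at every site."""
    keep = np.delete(np.arange(len(conc)), idx)
    gp = GaussianProcessRegressor(RBF(20.0) + WhiteKernel(0.01), normalize_y=True)
    gp.fit(coords[keep], conc[keep])
    return float(np.sqrt(np.mean((gp.predict(coords) - conc) ** 2)))

# Greedy "optimal removal": the best sensors to drop are those whose removal hurts the map least
scores = sorted((rmse_without([i]), i) for i in range(len(conc)))
print(scores[:3])
```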

  10. Theoretic aspects of the identification of the parameters in the optimal control model

    NASA Technical Reports Server (NTRS)

    Vanwijk, R. A.; Kok, J. J.

    1977-01-01

    The identification of the parameters of the optimal control model from input-output data of the human operator is considered. Accepting the basic structure of the model as a cascade of a full-order observer and a feedback law, and suppressing the inherent optimality of the human controller, the parameters to be identified are the feedback matrix, the observer gain matrix, and the intensity matrices of the observation noise and the motor noise. The identification of the parameters is a statistical problem, because the system and output are corrupted by noise, and therefore the solution must be based on the statistics (probability density function) of the input and output data of the human operator. However, based on the statistics of the input-output data of the human operator, no distinction can be made between the observation and the motor noise, which shows that the model suffers from overparameterization.

  11. A Hybrid Interval–Robust Optimization Model for Water Quality Management

    PubMed Central

    Xu, Jieyu; Li, Yongping; Huang, Guohe

    2013-01-01

    In water quality management problems, uncertainties may exist in many system components and pollution-related processes (i.e., random nature of hydrodynamic conditions, variability in physicochemical processes, dynamic interactions between pollutant loading and receiving water bodies, and indeterminacy of available water and treated wastewater). These complexities lead to difficulties in formulating and solving the resulting nonlinear optimization problems. In this study, a hybrid interval–robust optimization (HIRO) method was developed through coupling stochastic robust optimization and interval linear programming. HIRO can effectively reflect the complex system features under uncertainty, where implications of water quality/quantity restrictions for achieving regional economic development objectives are studied. By delimiting the uncertain decision space through dimensional enlargement of the original chemical oxygen demand (COD) discharge constraints, HIRO enhances the robustness of the optimization processes and resulting solutions. This method was applied to planning of industry development in association with river-water pollution concern in New Binhai District of Tianjin, China. Results demonstrated that the proposed optimization model can effectively communicate uncertainties into the optimization process and generate a spectrum of potential inexact solutions supporting local decision makers in managing benefit-effective water quality management schemes. HIRO is helpful for analysis of policy scenarios related to different levels of economic penalties, while also providing insight into the tradeoff between system benefits and environmental requirements. PMID:23922495

  12. In vivo evaluation of the effect of stimulus distribution on FIR statistical efficiency in event-related fMRI.

    PubMed

    Jansma, J Martijn; de Zwart, Jacco A; van Gelderen, Peter; Duyn, Jeff H; Drevets, Wayne C; Furey, Maura L

    2013-05-15

    Technical developments in MRI have improved signal to noise, allowing the use of analysis methods such as finite impulse response (FIR) modeling of rapid event-related functional MRI (er-fMRI). FIR is one of the most informative analysis methods, as it determines the onset and full shape of the hemodynamic response function (HRF) without any a priori assumptions. FIR is, however, vulnerable to multicollinearity, which is directly related to the distribution of stimuli over time. Efficiency can be optimized by simplifying a design and restricting the stimulus distribution to specific sequences, while more design flexibility necessarily reduces efficiency. However, the actual effect of efficiency on fMRI results has never been tested in vivo. Thus, it is currently difficult to make an informed choice between protocol flexibility and statistical efficiency. The main goal of this study was to assign concrete fMRI signal-to-noise values to the abstract scale of FIR statistical efficiency. Ten subjects repeated a perception task with five random and m-sequence-based protocols, with varying but, according to the literature, acceptable levels of multicollinearity. Results indicated substantial differences in signal standard deviation, with the level being a function of multicollinearity. Experiment protocols varied by up to 55.4% in standard deviation. The results confirm that the quality of fMRI in an FIR analysis can significantly and substantially vary with statistical efficiency. Our in vivo measurements can be used to aid in making an informed decision between freedom in protocol design and statistical efficiency. Published by Elsevier B.V.
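
    A compact way to see how stimulus distribution drives FIR efficiency is sketched below in Python: build the FIR design matrix from an onset sequence and use the common estimation-efficiency measure 1/trace[(X'X)^-1]. The scan counts, FIR length, and onset sequences are illustrative, not the study's protocols; a fixed inter-stimulus interval is shown only as an example of a highly collinear design.

```python
import numpy as np

def fir_design_matrix(onsets, n_scans, fir_length):
    """Build a finite-impulse-response design matrix: one column per post-stimulus lag."""
    X = np.zeros((n_scans, fir_length))
    for onset in onsets:
        for lag in range(fir_length):
            if onset + lag < n_scans:
                X[onset + lag, lag] = 1.0
    return X

def fir_efficiency(onsets, n_scans, fir_length=12):
    """Estimation efficiency 1 / trace[(X'X)^-1]; lower multicollinearity gives higher efficiency."""
    X = fir_design_matrix(onsets, n_scans, fir_length)
    return 1.0 / np.trace(np.linalg.inv(X.T @ X))

rng = np.random.default_rng(4)
n_scans = 400
random_onsets = np.sort(rng.choice(np.arange(0, n_scans - 12), size=60, replace=False))
fixed_onsets = np.arange(0, n_scans - 12, 6)   # evenly spaced: highly collinear FIR regressors
print(fir_efficiency(random_onsets, n_scans), fir_efficiency(fixed_onsets, n_scans))
```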

  13. A random optimization approach for inherent optic properties of nearshore waters

    NASA Astrophysics Data System (ADS)

    Zhou, Aijun; Hao, Yongshuai; Xu, Kuo; Zhou, Heng

    2016-10-01

    Traditional water quality sampling is time-consuming and costly and cannot meet the needs of social development. Hyperspectral remote sensing offers good temporal resolution, wide spatial coverage, and rich spectral information, giving it strong potential for water quality monitoring. Via a semi-analytical method, remote sensing information can be related to water quality. The inherent optical properties are used to quantify water quality, and an optical model of the water column is established to analyze its features. Using the stochastic optimization algorithm Threshold Acceptance, a global optimum of the unknown model parameters can be determined to obtain the distribution of chlorophyll, dissolved organic matter, and suspended particles in the water. By improving the search step of the optimization algorithm, the processing time is markedly reduced, which creates room to increase the number of parameters. A more targeted definition of the optimization steps and acceptance criterion makes the whole inversion process more focused, thus improving the accuracy of the inversion. Evaluated against simulated data provided by IOCCG and field data provided by NASA, the approach shows continuous improvement and refinement. The result is a low-cost, effective model for retrieving water quality from hyperspectral remote sensing.
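
    The Python sketch below shows the bare Threshold Acceptance loop referred to above: candidate parameter vectors are accepted whenever they worsen the misfit by less than a shrinking threshold. The quadratic misfit standing in for the comparison between simulated and measured spectra is a placeholder, as are the parameter bounds and step size.

```python
import numpy as np

def threshold_accepting(cost, x0, step, thresholds, iters_per_threshold=200, seed=0):
    """Minimal Threshold Acceptance: accept any move that worsens the cost by less
    than the current threshold; thresholds shrink toward zero over the run.

    In the inversion described above, `cost` would compare remote-sensing
    reflectance simulated from candidate inherent optical properties
    (chlorophyll, dissolved organic matter, suspended particles) with the
    measured spectrum; here it is a placeholder quadratic misfit.
    """
    rng = np.random.default_rng(seed)
    x = np.array(x0, float)
    best = x.copy()
    for thr in thresholds:
        for _ in range(iters_per_threshold):
            cand = x + rng.normal(0.0, step, size=x.shape)
            if cost(cand) - cost(x) < thr:
                x = cand
                if cost(x) < cost(best):
                    best = x.copy()
    return best

target = np.array([0.8, 0.05, 2.1])       # "true" parameter vector in the toy misfit
misfit = lambda p: float(np.sum((p - target) ** 2))
print(threshold_accepting(misfit, x0=[0.1, 0.5, 0.5], step=0.05,
                          thresholds=np.linspace(0.5, 0.0, 10)))
```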

  14. Comprehensive optimization process of paranasal sinus radiography.

    PubMed

    Saarakkala, S; Nironen, K; Hermunen, H; Aarnio, J; Heikkinen, J O

    2009-04-01

    The optimization of radiological examinations is important in order to reduce unnecessary patient radiation exposure. To perform a comprehensive optimization process for paranasal sinus radiography at Mikkeli Central Hospital, Finland. Patients with suspicion of acute sinusitis were imaged with a Kodak computed radiography (CR) system (n=20) and with a Philips digital radiography (DR) system (n=30) using focus-detector distances (FDDs) of 110 cm, 150 cm, or 200 cm. Patients' radiation exposure was determined in terms of entrance surface dose and dose-area product. Furthermore, an anatomical phantom was used for the estimation of point doses inside the head. Clinical image quality was evaluated by an experienced radiologist, and physical image quality was evaluated from the digital radiography phantom. Patient doses were significantly lower and image quality better with the DR system compared to the CR system. The differences in patient dose and physical image quality were small with varying FDD. Clinical image quality of the DR system was lowest with FDD of 200 cm. Further, imaging with FDD of 150 cm was technically easier for the technologist to perform than with FDD of 110 cm. After optimization, it was recommended that the DR system with FDD of 150 cm should always be used at Mikkeli Central Hospital. We recommend this kind of comprehensive approach in all optimization processes of radiological examinations.

  15. To the theory of high-power gyrotrons with uptapered resonators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dumbrajs, O.; Nusinovich, G. S.

    In high-power gyrotrons it is desirable to combine an optimal resonator length with the optimal value of the resonator quality factor. In resonators with a constant radius of the central part, the possibilities for this combination are limited because the quality factor of the resonator sharply increases with its length. Therefore, attempts to increase the length in order to maximize the efficiency lead to such an increase in the quality factor that the optimal current becomes too small. Resonators with slightly uptapered profiles offer more flexibility in this regard. In such resonators, one can separate optimization of the interaction length from optimization of the quality factor, because the quality factor determined by diffractive losses can be reduced by increasing the angle of uptapering. In the present paper, these issues are analyzed by studying, as a typical example, the high-power 170 GHz gyrotron currently under development in Europe for ITER (http://en.wikipedia.org/wiki/ITER). The effect of a slight uptapering of the resonator wall on the efficiency enhancement and the purity of the radiation spectrum during gyrotron start-up and power modulation is studied. Results show that an optimal modification of the shape of a slightly uptapered resonator may increase the gyrotron power from 1052 to 1360 kW.
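
    The sharp growth of the quality factor with resonator length mentioned above reflects the standard textbook scaling of the minimum diffractive quality factor of an open gyrotron resonator (a commonly quoted estimate, not a formula taken from this paper; L is the effective resonator length and λ the operating wavelength):

```latex
Q_{\mathrm{dif,\,min}} \approx 4\pi \left(\frac{L}{\lambda}\right)^{2}
```

    Because L enters quadratically, lengthening a constant-radius resonator quickly raises the diffractive Q; a slight uptapering of the wall increases the diffractive losses and thereby keeps the quality factor near its optimum at the longer interaction length.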

  16. Fatigue in Arthritis: A Multidimensional Phenomenon with Impact on Quality of Life : Fatigue and Quality of Life in Arthritis.

    PubMed

    Alikari, Victoria; Sachlas, Athanasios; Giatrakou, Stavroula; Stathoulis, John; Fradelos, Evagelos; Theofilou, Paraskevi; Lavdaniti, Maria; Zyga, Sofia

    2017-01-01

    An important factor influencing the quality of life of patients with arthritis is the fatigue they experience. The purpose of this study was to assess the relationship between fatigue and quality of life among patients with osteoarthritis and rheumatoid arthritis. Between January 2015 and March 2015, 179 patients with osteoarthritis and rheumatoid arthritis completed the Fatigue Assessment Scale and the Missoula-VITAS Quality of Life Index-15 (MVQoLI-15). The study was conducted in rehabilitation centers located in the area of Peloponnese, Greece. Data related to sociodemographic characteristics and individual medical histories were recorded. Statistical analysis was performed using IBM SPSS Statistics version 19. The analysis did not reveal a statistically significant correlation between fatigue and quality of life, either in the total sample or among patients with osteoarthritis (r = -0.159; p = 0.126) or rheumatoid arthritis. However, there were statistically significant relationships between some aspects of fatigue and dimensions of quality of life. Osteoarthritis patients had a statistically significantly lower MVQoLI-15 score than rheumatoid arthritis patients (13.73 ± 1.811 vs 14.61 ± 1.734) and a lower FAS score than rheumatoid arthritis patients (26.14 ± 3.668 vs 29.94 ± 3.377) (p < 0.001). The finding that different aspects of fatigue may affect dimensions of quality of life may help health care professionals by supporting early treatment of fatigue in order to gain benefits for quality of life.

  17. A statistical approach to optimizing concrete mixture design.

    PubMed

    Ahmad, Shamsad; Alghamdi, Saeid A

    2014-01-01

    A step-by-step statistical approach is proposed to obtain optimum proportioning of concrete mixtures using the data obtained through a statistically planned experimental program. The utility of the proposed approach for optimizing the design of concrete mixture is illustrated considering a typical case in which trial mixtures were considered according to a full factorial experiment design involving three factors and their three levels (3³). A total of 27 concrete mixtures with three replicates (81 specimens) were considered by varying the levels of key factors affecting compressive strength of concrete, namely, water/cementitious materials ratio (0.38, 0.43, and 0.48), cementitious materials content (350, 375, and 400 kg/m³), and fine/total aggregate ratio (0.35, 0.40, and 0.45). The experimental data were utilized to carry out analysis of variance (ANOVA) and to develop a polynomial regression model for compressive strength in terms of the three design factors considered in this study. The developed statistical model was used to show how optimization of concrete mixtures can be carried out with different possible options.
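
    A hedged Python sketch of that workflow follows: lay out the 3³ factorial over the three factors at the levels quoted above, fit a second-order polynomial regression as the response-surface model, and read off a predicted-optimal mixture. The strength values are simulated placeholders, not the measured data, and the fitted coefficients are purely illustrative.

```python
from itertools import product
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

# 3^3 full factorial over the three mixture factors at the levels used in the study
wc = [0.38, 0.43, 0.48]        # water/cementitious materials ratio
cm = [350, 375, 400]           # cementitious materials content (kg/m^3)
fa = [0.35, 0.40, 0.45]        # fine/total aggregate ratio
X = np.array(list(product(wc, cm, fa)))

# Placeholder strengths: real values would be the measured compressive strengths
rng = np.random.default_rng(5)
y = 60 - 55 * X[:, 0] + 0.04 * X[:, 1] + 8 * X[:, 2] + rng.normal(0, 1, len(X))

# Second-order polynomial response-surface model, mirroring the regression step
poly = PolynomialFeatures(degree=2, include_bias=False)
model = LinearRegression().fit(poly.fit_transform(X), y)
pred = model.predict(poly.transform(X))
print("predicted-optimal mixture (w/cm, content, fine/total):", X[pred.argmax()])
```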

  18. A Statistical Approach to Optimizing Concrete Mixture Design

    PubMed Central

    Alghamdi, Saeid A.

    2014-01-01

    A step-by-step statistical approach is proposed to obtain optimum proportioning of concrete mixtures using the data obtained through a statistically planned experimental program. The utility of the proposed approach for optimizing the design of concrete mixture is illustrated considering a typical case in which trial mixtures were considered according to a full factorial experiment design involving three factors and their three levels (3³). A total of 27 concrete mixtures with three replicates (81 specimens) were considered by varying the levels of key factors affecting compressive strength of concrete, namely, water/cementitious materials ratio (0.38, 0.43, and 0.48), cementitious materials content (350, 375, and 400 kg/m³), and fine/total aggregate ratio (0.35, 0.40, and 0.45). The experimental data were utilized to carry out analysis of variance (ANOVA) and to develop a polynomial regression model for compressive strength in terms of the three design factors considered in this study. The developed statistical model was used to show how optimization of concrete mixtures can be carried out with different possible options. PMID:24688405

  19. Automatic initialization and quality control of large-scale cardiac MRI segmentations.

    PubMed

    Albà, Xènia; Lekadir, Karim; Pereañez, Marco; Medrano-Gracia, Pau; Young, Alistair A; Frangi, Alejandro F

    2018-01-01

    Continuous advances in imaging technologies enable ever more comprehensive phenotyping of human anatomy and physiology. Concomitant reduction of imaging costs has resulted in widespread use of imaging in large clinical trials and population imaging studies. Magnetic Resonance Imaging (MRI), in particular, offers one-stop-shop multidimensional biomarkers of cardiovascular physiology and pathology. A wide range of analysis methods offer sophisticated cardiac image assessment and quantification for clinical and research studies. However, most methods have only been evaluated on relatively small databases often not accessible for open and fair benchmarking. Consequently, published performance indices are not directly comparable across studies and their translation and scalability to large clinical trials or population imaging cohorts is uncertain. Most existing techniques still rely on considerable manual intervention for the initialization and quality control of the segmentation process, becoming prohibitive when dealing with thousands of images. The contributions of this paper are three-fold. First, we propose a fully automatic method for initializing cardiac MRI segmentation, by using image features and random forests regression to predict an initial position of the heart and key anatomical landmarks in an MRI volume. In processing a full imaging database, the technique predicts the optimal corrective displacements and positions in relation to the initial rough intersections of the long and short axis images. Second, we introduce for the first time a quality control measure capable of identifying incorrect cardiac segmentations with no visual assessment. The method uses statistical, pattern and fractal descriptors in a random forest classifier to detect failures to be corrected or removed from subsequent statistical analysis. Finally, we validate these new techniques within a full pipeline for cardiac segmentation applicable to large-scale cardiac MRI databases. The results obtained based on over 1200 cases from the Cardiac Atlas Project show the promise of fully automatic initialization and quality control for population studies. Copyright © 2017 Elsevier B.V. All rights reserved.
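
    A minimal sketch of the quality-control step is given below in Python: a random-forest classifier trained on per-segmentation descriptors flags likely failures without visual assessment. The descriptor vectors and class labels are synthetic placeholders, not the statistical, pattern, and fractal features computed from the Cardiac Atlas Project data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Placeholder descriptors for segmentations (stand-ins for statistical/pattern/fractal features)
rng = np.random.default_rng(6)
features_ok = rng.normal(0.0, 1.0, (200, 12))
features_failed = rng.normal(1.5, 1.0, (40, 12))
X = np.vstack([features_ok, features_failed])
y = np.r_[np.zeros(len(features_ok)), np.ones(len(features_failed))]  # 1 = failed segmentation

# Random-forest quality-control classifier flagging likely segmentation failures
qc = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
new_case = rng.normal(1.4, 1.0, (1, 12))
print("failure probability:", qc.predict_proba(new_case)[0, 1])
```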

  20. Simultaneous assessment of phase chemistry, phase abundance and bulk chemistry with statistical electron probe micro-analyses: Application to cement clinkers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilson, William; Krakowiak, Konrad J.; Ulm, Franz-Josef, E-mail: ulm@mit.edu

    2014-01-15

    According to recent developments in cement clinker engineering, the optimization of chemical substitutions in the main clinker phases offers a promising approach to improve both reactivity and grindability of clinkers. Thus, monitoring the chemistry of the phases may become part of the quality control at the cement plants, along with the usual measurements of the abundance of the mineralogical phases (quantitative X-ray diffraction) and the bulk chemistry (X-ray fluorescence). This paper presents a new method to assess these three complementary quantities with a single experiment. The method is based on electron microprobe spot analyses, performed over a grid located on a representative surface of the sample and interpreted with advanced statistical tools. This paper describes the method and the experimental program performed on industrial clinkers to establish the accuracy in comparison to conventional methods. Highlights: • A new method of clinker characterization. • Combination of electron probe technique with cluster analysis. • Simultaneous assessment of phase abundance, composition and bulk chemistry. • Experimental validation performed on industrial clinkers.
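
    A hedged Python sketch of the statistical interpretation step follows: cluster gridded microprobe spot analyses so that cluster means give phase chemistry, cluster fractions give phase abundance, and the grand mean gives bulk chemistry. The compositions are invented, and k-means is used only as an illustrative clustering choice, since the abstract does not specify the paper's own statistical tools.

```python
import numpy as np
from sklearn.cluster import KMeans

# Placeholder grid of microprobe spot analyses (wt% CaO, SiO2, Al2O3, Fe2O3)
rng = np.random.default_rng(7)
alite = rng.normal([71, 25, 1, 1], 0.8, (300, 4))     # illustrative phase compositions
belite = rng.normal([63, 31, 2, 1], 0.8, (120, 4))
spots = np.vstack([alite, belite])

# Cluster the spot analyses; clusters play the role of the clinker phases
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(spots)
for k in range(2):
    members = spots[km.labels_ == k]
    print(f"phase {k}: abundance {len(members) / len(spots):.2f}, "
          f"mean composition {members.mean(axis=0).round(1)}")
print("bulk chemistry:", spots.mean(axis=0).round(1))
```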
