Sample records for error margin analysis

  1. Model selection for marginal regression analysis of longitudinal data with missing observations and covariate measurement error.

    PubMed

    Shen, Chung-Wei; Chen, Yi-Hau

    2015-10-01

    Missing observations and covariate measurement error commonly arise in longitudinal data. However, existing methods for model selection in marginal regression analysis of longitudinal data fail to address the potential bias resulting from these issues. To tackle this problem, we propose a new model selection criterion, the Generalized Longitudinal Information Criterion, which is based on an approximately unbiased estimator for the expected quadratic error of a considered marginal model accounting for both data missingness and covariate measurement error. The simulation results reveal that the proposed method performs quite well in the presence of missing data and covariate measurement error. In contrast, naive procedures that ignore such complexity in the data may perform quite poorly. The proposed method is applied to data from the Taiwan Longitudinal Study on Aging to assess the relationship of depression with health and social status in the elderly, accommodating measurement error in the covariate as well as missing observations.

  2. Robust Flutter Margin Analysis that Incorporates Flight Data

    NASA Technical Reports Server (NTRS)

    Lind, Rick; Brenner, Martin J.

    1998-01-01

    An approach for computing worst-case flutter margins has been formulated in a robust stability framework. Uncertainty operators are included with a linear model to describe modeling errors and flight variations. The structured singular value, mu, computes a stability margin that directly accounts for these uncertainties. This approach introduces a new method of computing flutter margins and an associated new parameter for describing these margins. The mu margins are robust margins that indicate worst-case stability estimates with respect to the defined uncertainty. Worst-case flutter margins are computed for the F/A-18 Systems Research Aircraft using uncertainty sets generated by flight data analysis. The robust margins demonstrate that flight conditions for flutter may lie closer to the flight envelope than previously estimated by p-k analysis.
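
    For a single unstructured uncertainty block, the mu-based margin described above reduces to the classical small-gain bound 1/||M||_inf. Below is a minimal sketch of that reduced computation on a hypothetical lightly damped second-order mode (an assumed toy model, not the F/A-18 analysis); the structured case would replace the peak gain with the peak structured singular value, which this sketch does not compute.

```python
import numpy as np

def hinf_norm(num, den, w=np.logspace(-2, 3, 20000)):
    """Approximate peak gain of a SISO transfer function via a frequency sweep."""
    s = 1j * w
    g = np.polyval(num, s) / np.polyval(den, s)
    return float(np.max(np.abs(g)))

# Toy aeroelastic-style mode: G(s) = 1 / (s^2 + 0.2 s + 1)
# (assumed example parameters, not flight data)
peak = hinf_norm([1.0], [1.0, 0.2, 1.0])

# Small-gain robust stability margin: the largest uncertainty norm tolerated
margin = 1.0 / peak
```

    For damping ratio 0.1 the analytic peak is 1/(2*0.1*sqrt(1-0.01)) ≈ 5.03, so the margin is roughly 0.2.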

  3. A multiloop generalization of the circle criterion for stability margin analysis

    NASA Technical Reports Server (NTRS)

    Safonov, M. G.; Athans, M.

    1979-01-01

    In order to provide a theoretical tool suited for characterizing the stability margins of multiloop feedback systems, multiloop input-output stability results generalizing the circle stability criterion are considered. Generalized conic sectors with 'centers' and 'radii' determined by linear dynamical operators are employed to specify the stability margins as a frequency dependent convex set of modeling errors (including nonlinearities, gain variations and phase variations) which the system must be able to tolerate in each feedback loop without instability. The resulting stability criterion gives sufficient conditions for closed loop stability in the presence of frequency dependent modeling errors, even when the modeling errors occur simultaneously in all loops. The stability conditions yield an easily interpreted scalar measure of the amount by which a multiloop system exceeds, or falls short of, its stability margin specifications.
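
    In the classical SISO case that this record generalizes, the circle criterion asks the Nyquist locus of a stable G(jw) to avoid the disk whose diameter is the real segment [-1/k1, -1/k2] for a sector nonlinearity confined to [k1, k2] with 0 < k1 < k2. A hedged numeric sketch of that single-loop check on toy plants (a frequency-sweep approximation, not the multiloop result of the abstract):

```python
import numpy as np

def circle_criterion_ok(num, den, k1, k2, w=np.logspace(-3, 3, 100000)):
    """Check that the Nyquist locus of a stable SISO plant avoids the
    critical disk with diameter [-1/k1, -1/k2] (0 < k1 < k2)."""
    s = 1j * w
    g = np.polyval(num, s) / np.polyval(den, s)
    center = -(1.0 / k1 + 1.0 / k2) / 2.0
    radius = (1.0 / k1 - 1.0 / k2) / 2.0
    return bool(np.min(np.abs(g - center)) > radius)

# G(s) = 1/(s+1): locus stays in the right half-plane, away from the disk
ok = circle_criterion_ok([1.0], [1.0, 1.0], k1=0.5, k2=2.0)
```

    A failing example: G(s) = 1/(s+1)^3 crosses the negative real axis at -1/8, which lies inside the disk for the sector [0.5, 10], so the same check returns False there.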

  4. Worst-Case Flutter Margins from F/A-18 Aircraft Aeroelastic Data

    NASA Technical Reports Server (NTRS)

    Lind, Rick; Brenner, Marty

    1997-01-01

    An approach for computing worst-case flutter margins has been formulated in a robust stability framework. Uncertainty operators are included with a linear model to describe modeling errors and flight variations. The structured singular value, micron, computes a stability margin which directly accounts for these uncertainties. This approach introduces a new method of computing flutter margins and an associated new parameter for describing these margins. The micron margins are robust margins which indicate worst-case stability estimates with respect to the defined uncertainty. Worst-case flutter margins are computed for the F/A-18 SRA using uncertainty sets generated by flight data analysis. The robust margins demonstrate flight conditions for flutter may lie closer to the flight envelope than previously estimated by p-k analysis.

  5. Set-up uncertainties: online correction with X-ray volume imaging.

    PubMed

    Kataria, Tejinder; Abhishek, Ashu; Chadha, Pranav; Nandigam, Janardhan

    2011-01-01

    To determine interfractional three-dimensional set-up errors using X-ray volumetric imaging (XVI). Between December 2007 and August 2009, 125 patients underwent image-guided radiotherapy using online XVI. After matching of reference and acquired volume view images, set-up errors in three translation directions were recorded and corrected online before treatment each day. Mean displacements, population systematic (Σ), and random (σ) errors were calculated and analyzed using SPSS (v16) software. Optimum clinical target volume (CTV) to planning target volume (PTV) margin was calculated using Van Herk's (2.5Σ + 0.7σ) and Stroom's (2Σ + 0.7σ) formulas. Patients were grouped in 4 cohorts, namely brain, head and neck, thorax, and abdomen-pelvis. The mean vector displacements recorded were 0.18 cm, 0.15 cm, 0.36 cm, and 0.35 cm for brain, head and neck, thorax, and abdomen-pelvis, respectively. Analysis of individual mean set-up errors revealed good agreement with the proposed 0.3 cm isotropic margins for brain and 0.5 cm isotropic margins for head-neck. Similarly, 0.5 cm circumferential and 1 cm craniocaudal proposed margins were in agreement with thorax and abdomen-pelvic cases. The calculated mean displacements were well within CTV-PTV margin estimates of Van Herk (90% population coverage to minimum 95% prescribed dose) and Stroom (99% target volume coverage by 95% prescribed dose). Employing these individualized margins in a particular cohort ensures comparable target coverage as described in the literature, which is further improved if XVI-aided set-up error detection and correction is used before treatment.
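
    The two margin recipes quoted above are simple linear combinations of the population systematic error Σ and random error σ. A direct transcription (the input values in the example are hypothetical, not taken from this study):

```python
def van_herk_margin(sigma_sys, sigma_rand):
    """Van Herk CTV-to-PTV margin: 2.5*Sigma + 0.7*sigma,
    targeting 90% population coverage at >= 95% of prescribed dose."""
    return 2.5 * sigma_sys + 0.7 * sigma_rand

def stroom_margin(sigma_sys, sigma_rand):
    """Stroom CTV-to-PTV margin: 2*Sigma + 0.7*sigma."""
    return 2.0 * sigma_sys + 0.7 * sigma_rand

# Hypothetical example values in cm: Sigma = 0.2, sigma = 0.3
m_vh = van_herk_margin(0.2, 0.3)  # 2.5*0.2 + 0.7*0.3 = 0.71 cm
m_st = stroom_margin(0.2, 0.3)    # 2.0*0.2 + 0.7*0.3 = 0.61 cm
```

    The Van Herk recipe is always the more conservative of the two for the same error estimates, since its systematic-error coefficient is larger.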

  6. Tutorial on Biostatistics: Linear Regression Analysis of Continuous Correlated Eye Data.

    PubMed

    Ying, Gui-Shuang; Maguire, Maureen G; Glynn, Robert; Rosner, Bernard

    2017-04-01

    To describe and demonstrate appropriate linear regression methods for analyzing correlated continuous eye data. We describe several approaches to regression analysis involving both eyes, including mixed effects and marginal models under various covariance structures to account for inter-eye correlation. We demonstrate, with SAS statistical software, applications in a study comparing baseline refractive error between one eye with choroidal neovascularization (CNV) and the unaffected fellow eye, and in a study determining factors associated with visual field in the elderly. When refractive error from both eyes was analyzed with standard linear regression without accounting for inter-eye correlation (adjusting for demographic and ocular covariates), the difference between eyes with CNV and fellow eyes was 0.15 diopters (D; 95% confidence interval, CI -0.03 to 0.32D, p = 0.10). Using a mixed effects model or a marginal model, the estimated difference was the same but with narrower 95% CI (0.01 to 0.28D, p = 0.03). Standard regression for visual field data from both eyes provided biased estimates of standard error (generally underestimated) and smaller p-values, while analysis of the worse eye provided larger p-values than mixed effects models and marginal models. In research involving both eyes, ignoring inter-eye correlation can lead to invalid inferences. Analysis using only right or left eyes is valid, but decreases power. Worse-eye analysis can provide less power and biased estimates of effect. Mixed effects or marginal models using the eye as the unit of analysis should be used to appropriately account for inter-eye correlation and maximize power and precision.
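
    The abstract's central warning, that naive regression on two eyes per subject understates standard errors, can be reproduced with a cluster-robust (sandwich) variance estimate. A self-contained simulation sketch with synthetic data (the study itself used SAS mixed and marginal models; all names and values here are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n_subj = 200
age = rng.normal(70, 5, n_subj)        # subject-level covariate
subj_eff = rng.normal(0, 1.0, n_subj)  # shared between-eye (subject) effect

# Two eyes per subject: outcome = 0.1*age + subject effect + eye-level noise
y = np.repeat(0.1 * age + subj_eff, 2) + rng.normal(0, 0.5, 2 * n_subj)
X = np.column_stack([np.ones(2 * n_subj), np.repeat(age, 2)])
cluster = np.repeat(np.arange(n_subj), 2)

beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
XtX_inv = np.linalg.inv(X.T @ X)

# Naive OLS variance: treats the two eyes as independent observations
naive_var = resid @ resid / (len(y) - X.shape[1]) * XtX_inv

# Cluster-robust sandwich variance: eyes nested within subjects
meat = np.zeros((2, 2))
for c in range(n_subj):
    idx = cluster == c
    s = X[idx].T @ resid[idx]
    meat += np.outer(s, s)
robust_var = XtX_inv @ meat @ XtX_inv

se_naive = np.sqrt(naive_var[1, 1])
se_robust = np.sqrt(robust_var[1, 1])
```

    With a strong between-eye correlation the robust standard error exceeds the naive one, mirroring the abstract's "generally underestimated" finding.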

  7. The Role of Margin in Link Design and Optimization

    NASA Technical Reports Server (NTRS)

    Cheung, K.

    2015-01-01

    Link analysis is a system engineering process in the design, development, and operation of communication systems and networks. Link models are mathematical abstractions representing the useful signal power and the undesirable noise and attenuation effects (including weather effects if the signal path traverses the atmosphere); they are integrated into the link budget calculation, which provides estimates of signal power and noise power at the receiver. A link margin is then applied to counteract fluctuations of the signal and noise power and ensure reliable data delivery from transmitter to receiver. (The link margin is dictated by link margin policy or requirements.) A simple link budgeting approach assumes link parameters to be deterministic values and typically adopts a rule-of-thumb policy of 3 dB link margin. This policy works for most S- and X-band links due to their insensitivity to weather effects. But for higher-frequency links such as Ka-band, Ku-band, and optical communication links, it is unclear whether a 3 dB link margin would guarantee link closure. Statistical link analysis, which adopts a 2-sigma or 3-sigma link margin, incorporates link uncertainties in the sigma calculation. (The Deep Space Network (DSN) link margin policies are 2-sigma for downlink and 3-sigma for uplink.) Link reliability can therefore be quantified statistically even for higher-frequency links. However, in the current statistical link analysis approach, link reliability is only expressed as the likelihood of exceeding the signal-to-noise ratio (SNR) threshold that corresponds to a given bit-error-rate (BER) or frame-error-rate (FER) requirement. The method does not provide the true BER or FER estimate of the link with margin, or the required SNR that would meet the BER or FER requirement in the statistical sense.
In this paper, we perform an in-depth analysis of the relationship between the BER/FER requirement, operating SNR, and coding performance curve, in the case when the channel coherence time of link fluctuation is comparable to or larger than the time duration of a codeword. We compute the "true" SNR design point that would meet the BER/FER requirement by taking into account the fluctuation of signal power and noise power at the receiver, and the shape of the coding performance curve. This analysis yields a number of valuable insights on the design choices of coding scheme and link margin for the reliable data delivery of a communication system, space and ground. We illustrate the aforementioned analysis using a number of standard NASA error-correcting codes.
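
    A k-sigma statistical link budget of the kind described sums the mean contributions in dB, combines the individual uncertainties in quadrature, and subtracts k standard deviations to get the design point. A minimal sketch with hypothetical Ka-band terms (the parameter names and values are illustrative assumptions, not DSN or mission numbers):

```python
import math

# Hypothetical link terms as (mean dB, sigma dB); values are assumed
terms = {
    "tx_eirp":     (65.0, 0.5),
    "space_loss": (-210.0, 0.0),
    "atmosphere":   (-2.0, 1.0),
    "rx_g_over_t":  (40.0, 0.7),
}

# Mean link value: sum of mean contributions (dB domain)
mean_link = sum(m for m, s in terms.values())

# Combined uncertainty: root-sum-square of the individual sigmas
sigma = math.sqrt(sum(s * s for m, s in terms.values()))

k = 2.0  # e.g., a 2-sigma downlink margin policy
design_link = mean_link - k * sigma  # value carried into the BER/FER check
```

    The paper's point is that this design point alone does not give the true BER/FER: that requires folding the fluctuation statistics through the coding performance curve, which this sketch does not attempt.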

  8. Verifiable Adaptive Control with Analytical Stability Margins by Optimal Control Modification

    NASA Technical Reports Server (NTRS)

    Nguyen, Nhan T.

    2010-01-01

    This paper presents a verifiable model-reference adaptive control method based on an optimal control formulation for linear uncertain systems. A predictor model is formulated to enable parameter estimation of the system parametric uncertainty. The adaptation is based on both the tracking error and the predictor error. Using a singular perturbation argument, it can be shown that the closed-loop system tends to a linear time-invariant model asymptotically under an assumption of fast adaptation. A stability margin analysis is given to estimate a lower bound of the time-delay margin using a matrix measure method. Using this analytical method, the free design parameter nu of the optimal control modification adaptive law can be determined to meet a specification of stability margin for verification purposes.
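
    The matrix measure used in such time-delay margin bounds is the logarithmic norm; for the 2-norm it is the largest eigenvalue of the symmetric part of A, and a negative value certifies exponential decay of the transition matrix. A small sketch of just that computation on a toy Hurwitz matrix (an assumed example, not the paper's adaptive-control model):

```python
import numpy as np

def matrix_measure_2(A):
    """Matrix measure (logarithmic norm) induced by the 2-norm:
    mu2(A) = largest eigenvalue of (A + A^T)/2."""
    return float(np.max(np.linalg.eigvalsh((A + A.T) / 2.0)))

# Toy Hurwitz matrix: mu2 < 0 certifies ||exp(A t)|| <= exp(mu2 t)
A = np.array([[-2.0, 1.0],
              [0.0, -3.0]])
mu2 = matrix_measure_2(A)
```

    Here the symmetric part is [[-2, 0.5], [0.5, -3]], whose largest eigenvalue is (-5 + sqrt(2))/2 ≈ -1.793.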

  9. Tutorial on Biostatistics: Linear Regression Analysis of Continuous Correlated Eye Data

    PubMed Central

    Ying, Gui-shuang; Maguire, Maureen G; Glynn, Robert; Rosner, Bernard

    2017-01-01

    Purpose: To describe and demonstrate appropriate linear regression methods for analyzing correlated continuous eye data. Methods: We describe several approaches to regression analysis involving both eyes, including mixed effects and marginal models under various covariance structures to account for inter-eye correlation. We demonstrate, with SAS statistical software, applications in a study comparing baseline refractive error between one eye with choroidal neovascularization (CNV) and the unaffected fellow eye, and in a study determining factors associated with visual field data in the elderly. Results: When refractive error from both eyes was analyzed with standard linear regression without accounting for inter-eye correlation (adjusting for demographic and ocular covariates), the difference between eyes with CNV and fellow eyes was 0.15 diopters (D; 95% confidence interval, CI −0.03 to 0.32D, P=0.10). Using a mixed effects model or a marginal model, the estimated difference was the same but with narrower 95% CI (0.01 to 0.28D, P=0.03). Standard regression for visual field data from both eyes provided biased estimates of standard error (generally underestimated) and smaller P-values, while analysis of the worse eye provided larger P-values than mixed effects models and marginal models. Conclusion: In research involving both eyes, ignoring inter-eye correlation can lead to invalid inferences. Analysis using only right or left eyes is valid, but decreases power. Worse-eye analysis can provide less power and biased estimates of effect. Mixed effects or marginal models using the eye as the unit of analysis should be used to appropriately account for inter-eye correlation and maximize power and precision. PMID:28102741

  10. Utilizing Flight Data to Update Aeroelastic Stability Estimates

    NASA Technical Reports Server (NTRS)

    Lind, Rick; Brenner, Marty

    1997-01-01

    Stability analysis of high performance aircraft must account for errors in the system model. A method for computing flutter margins that incorporates flight data has been developed using robust stability theory. This paper considers applying this method to update flutter margins during a post-flight or on-line analysis. Areas of modeling uncertainty that arise when using flight data with this method are investigated. The amount of conservatism in the resulting flutter margins depends on the flight data sets used to update the model. Post-flight updates of flutter margins for an F/A-18 are presented along with a simulation of on-line updates during a flight test.

  11. Multiple imputation to account for measurement error in marginal structural models

    PubMed Central

    Edwards, Jessie K.; Cole, Stephen R.; Westreich, Daniel; Crane, Heidi; Eron, Joseph J.; Mathews, W. Christopher; Moore, Richard; Boswell, Stephen L.; Lesko, Catherine R.; Mugavero, Michael J.

    2015-01-01

    Background: Marginal structural models are an important tool for observational studies. These models typically assume that variables are measured without error. We describe a method to account for differential and non-differential measurement error in a marginal structural model. Methods: We illustrate the method estimating the joint effects of antiretroviral therapy initiation and current smoking on all-cause mortality in a United States cohort of 12,290 patients with HIV followed for up to 5 years between 1998 and 2011. Smoking status was likely measured with error, but a subset of 3686 patients who reported smoking status on separate questionnaires composed an internal validation subgroup. We compared a standard joint marginal structural model fit using inverse probability weights to a model that also accounted for misclassification of smoking status using multiple imputation. Results: In the standard analysis, current smoking was not associated with increased risk of mortality. After accounting for misclassification, current smoking without therapy was associated with increased mortality [hazard ratio (HR): 1.2; 95% CI: 0.6, 2.3]. The HR for current smoking and therapy (0.4; 95% CI: 0.2, 0.7) was similar to the HR for no smoking and therapy (0.4; 95% CI: 0.2, 0.6). Conclusions: Multiple imputation can be used to account for measurement error in concert with methods for causal inference to strengthen results from observational studies. PMID:26214338
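
    After multiply imputing the misclassified exposure, the per-imputation estimates are conventionally combined with Rubin's rules: average the estimates, then add the within-imputation variance to an inflated between-imputation variance. A sketch of just the pooling step on hypothetical log hazard ratios (all numbers invented for illustration, not the study's results):

```python
import math

def rubin_pool(estimates, variances):
    """Combine M multiply-imputed estimates with Rubin's rules.
    Returns (pooled estimate, total variance)."""
    m = len(estimates)
    q_bar = sum(estimates) / m
    within = sum(variances) / m
    between = sum((q - q_bar) ** 2 for q in estimates) / (m - 1)
    total = within + (1 + 1 / m) * between
    return q_bar, total

# Hypothetical log hazard ratios and standard errors from M=5 imputations
log_hrs = [0.21, 0.17, 0.25, 0.19, 0.23]
ses = [0.30, 0.31, 0.29, 0.30, 0.32]
est, var = rubin_pool(log_hrs, [s * s for s in ses])
hr = math.exp(est)  # pooled hazard ratio
```

    The between-imputation term is what carries the extra uncertainty introduced by the measurement-error correction into the final confidence interval.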

  12. SIRTF Focal Plane Survey: A Pre-flight Error Analysis

    NASA Technical Reports Server (NTRS)

    Bayard, David S.; Brugarolas, Paul B.; Boussalis, Dhemetrios; Kang, Bryan H.

    2003-01-01

    This report contains a pre-flight error analysis of the calibration accuracies expected from implementing the currently planned SIRTF focal plane survey strategy. The main purpose of this study is to verify that the planned strategy will meet focal plane survey calibration requirements (as put forth in the SIRTF IOC-SV Mission Plan [4]), and to quantify the actual accuracies expected. The error analysis was performed by running the Instrument Pointing Frame (IPF) Kalman filter on a complete set of simulated IOC-SV survey data, and studying the resulting propagated covariances. The main conclusion of this study is that all focal plane calibration requirements can be met with the currently planned survey strategy. The associated margins range from 3 to 95 percent, and tend to be smallest for frames having a 0.14" requirement, and largest for frames having a more generous 0.28" (or larger) requirement. The smallest margin of 3 percent is associated with the IRAC 3.6 and 5.8 micron array centers (frames 068 and 069), and the largest margin of 95 percent is associated with the MIPS 160 micron array center (frame 087). For pointing purposes, the most critical calibrations are for the IRS Peakup sweet spots and short wavelength slit centers (frames 019, 023, 052, 028, 034). Results show that these frames are meeting their 0.14" requirements with an expected accuracy of approximately 0.1", which corresponds to a 28 percent margin.
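
    The quoted margin percentages are consistent with the simple ratio (requirement − expected accuracy) / requirement: a 0.14 arcsec requirement met at roughly 0.1 arcsec gives about 28-29 percent. A one-function check (the interpretation of "margin" as this ratio is an assumption inferred from the numbers, not stated in the abstract):

```python
def calibration_margin(requirement, expected):
    """Fractional margin between an accuracy requirement and the
    expected (predicted) accuracy: (req - expected) / req."""
    return (requirement - expected) / requirement

# IRS Peakup frames: 0.14 arcsec requirement, ~0.1 arcsec expected accuracy
m = calibration_margin(0.14, 0.10)  # roughly 0.286, i.e. ~28-29 percent
```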

  13. Effect of Body Mass Index on Magnitude of Setup Errors in Patients Treated With Adjuvant Radiotherapy for Endometrial Cancer With Daily Image Guidance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, Lilie L., E-mail: lin@uphs.upenn.edu; Hertan, Lauren; Rengan, Ramesh

    2012-06-01

    Purpose: To determine the impact of body mass index (BMI) on daily setup variations and frequency of imaging necessary for patients with endometrial cancer treated with adjuvant intensity-modulated radiotherapy (IMRT) with daily image guidance. Methods and Materials: The daily shifts from a total of 782 orthogonal kilovoltage images from 30 patients who received pelvic IMRT between July 2008 and August 2010 were analyzed. The BMI, mean daily shifts, and random and systematic errors in each translational and rotational direction were calculated for each patient. Margin recipes were generated based on BMI. Linear regression and Spearman rank correlation analysis were performed. To simulate a less-than-daily IGRT protocol, the average shift of the first five fractions was applied to subsequent setups without IGRT for assessing the impact on setup error and margin requirements. Results: Median BMI was 32.9 (range, 23-62). Of the 30 patients, 16.7% (n = 5) were normal weight (BMI <25); 23.3% (n = 7) were overweight (BMI ≥25 to <30); 26.7% (n = 8) were mildly obese (BMI ≥30 to <35); and 33.3% (n = 10) were moderately to severely obese (BMI ≥35). On linear regression, mean absolute vertical, longitudinal, and lateral shifts positively correlated with BMI (p = 0.0127, p = 0.0037, and p < 0.0001, respectively). Systematic errors in the longitudinal and vertical directions were positively correlated with BMI category (p < 0.0001 for both). IGRT for the first five fractions, followed by correction of the mean error for all subsequent fractions, led to a substantial reduction in setup error and resultant margin requirement overall compared with no IGRT. Conclusions: Daily shifts, systematic errors, and margin requirements were greatest in obese patients. For women who are normal weight or overweight, a planning target volume margin of 7 to 10 mm may be sufficient without IGRT, but for patients who are moderately or severely obese, this is insufficient.
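
    The population statistics behind such margin recipes are conventionally Σ = the standard deviation of per-patient mean shifts, and σ = the root-mean-square of per-patient standard deviations. A sketch computing both from daily shift logs and feeding a Van Herk margin (toy data for three patients, not the study's 782 images):

```python
import numpy as np

def population_errors(shifts_by_patient):
    """Population systematic error (Sigma = SD of per-patient means) and
    random error (sigma = RMS of per-patient SDs) from daily setup shifts."""
    means = [np.mean(s) for s in shifts_by_patient]
    sds = [np.std(s, ddof=1) for s in shifts_by_patient]
    Sigma = float(np.std(means, ddof=1))
    sigma = float(np.sqrt(np.mean(np.square(sds))))
    return Sigma, sigma

# Toy vertical shifts (cm) for three patients (hypothetical, not study data)
shifts = [[0.1, 0.2, 0.15], [-0.1, 0.0, -0.05], [0.3, 0.25, 0.35]]
Sigma, sigma = population_errors(shifts)
ptv_margin = 2.5 * Sigma + 0.7 * sigma  # Van Herk recipe
```

    A real analysis would use far more patients and fractions per patient; with only a handful of each, both estimates are very noisy.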

  14. Design of a Torque Current Generator for Strapdown Gyroscopes. Ph.D. Thesis [and performance prediction]

    NASA Technical Reports Server (NTRS)

    Mcknight, R. D.; Blalock, T. V.; Kennedy, E. J.

    1974-01-01

    The design, analysis, and experimental evaluation of an optimum-performance torque current generator for use with strapdown gyroscopes is presented. Among the criteria used to evaluate the design were the following: (1) steady-state accuracy; (2) margins of stability against self-oscillation; (3) temperature variations; (4) aging; (5) static errors, drift errors, and transient errors; (6) classical frequency and time domain characteristics; and (7) the equivalent noise at the input of the comparator operational amplifier. The DC feedback loop of the torque current generator was approximated as a second-order system. Stability calculations for gain margins are discussed. Circuit diagrams are shown, and block diagrams showing the implementation of the torque current generator are discussed.

  15. Frozen section analysis of margins for head and neck tumor resections: reduction of sampling errors with a third histologic level.

    PubMed

    Olson, Stephen M; Hussaini, Mohammad; Lewis, James S

    2011-05-01

    Frozen section analysis is an essential tool for assessing margins intra-operatively to assure complete resection. Many institutions evaluate surgical defect edge tissue provided by the surgeon after the main lesion has been removed. With the increasing use of transoral laser microsurgery, this method is becoming even more prevalent. We sought to evaluate error rates at our large academic institution and to see if sampling errors could be reduced by the simple method change of taking an additional third section on these specimens. All head and neck tumor resection cases from January 2005 through August 2008 with margins evaluated by frozen section were identified by database search. These cases were analyzed by cutting two levels during frozen section and a third permanent section later. All resection cases from August 2008 through July 2009 were identified as well. These were analyzed by cutting three levels during frozen section (the third a 'much deeper' level) and a fourth permanent section later. Error rates for both of these periods were determined. Errors were separated into sampling and interpretation types. There were 4976 total frozen section specimens from 848 patients. The overall error rate was 2.4% for all frozen sections where just two levels were evaluated and was 2.5% when three levels were evaluated (P=0.67). The sampling error rate was 1.6% for two-level sectioning and 1.2% for three-level sectioning (P=0.42). However, when considering only the frozen section cases where tumor was ultimately identified (either at the time of frozen section or on permanent sections), the sampling error rate for two-level sectioning was 15.3% versus 7.4% for three-level sectioning. This difference was statistically significant (P=0.006). Cutting a single additional 'deeper' level at the time of frozen section identifies more tumor-bearing specimens and may reduce the number of sampling errors.

  16. Inter- and Intrafraction Uncertainty in Prostate Bed Image-Guided Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Kitty; Palma, David A.; Department of Oncology, University of Western Ontario, London

    2012-10-01

    Purpose: The goals of this study were to measure inter- and intrafraction setup error and prostate bed motion (PBM) in patients undergoing post-prostatectomy image-guided radiotherapy (IGRT) and to propose appropriate population-based three-dimensional clinical target volume to planning target volume (CTV-PTV) margins in both non-IGRT and IGRT scenarios. Methods and Materials: In this prospective study, 14 patients underwent adjuvant or salvage radiotherapy to the prostate bed under image guidance using linac-based kilovoltage cone-beam CT (kV-CBCT). Inter- and intrafraction uncertainty/motion was assessed by offline analysis of three consecutive daily kV-CBCT images of each patient: (1) after initial setup to skin marks, (2) after correction for positional error/immediately before radiation treatment, and (3) immediately after treatment. Results: The magnitude of interfraction PBM was 2.1 mm, and intrafraction PBM was 0.4 mm. The maximum inter- and intrafraction prostate bed motion was primarily in the anterior-posterior direction. Margins of at least 3-5 mm with IGRT and 4-7 mm without IGRT (aligning to skin marks) will ensure 95% of the prescribed dose to the clinical target volume in 90% of patients. Conclusions: PBM is a predominant source of intrafraction error compared with setup error and has implications for appropriate PTV margins. Based on inter- and estimated intrafraction motion of the prostate bed using pre- and post-treatment kV-CBCT images, CBCT IGRT to correct for day-to-day variances can potentially reduce CTV-PTV margins by 1-2 mm. CTV-PTV margins for prostate bed treatment in the IGRT and non-IGRT scenarios are proposed; however, in cases with more uncertainty of target delineation and image guidance accuracy, larger margins are recommended.

  17. Using the Sampling Margin of Error to Assess the Interpretative Validity of Student Evaluations of Teaching

    ERIC Educational Resources Information Center

    James, David E.; Schraw, Gregory; Kuch, Fred

    2015-01-01

    We present an equation, derived from standard statistical theory, that can be used to estimate sampling margin of error for student evaluations of teaching (SETs). We use the equation to examine the effect of sample size, response rates and sample variability on the estimated sampling margin of error, and present results in four tables that allow…
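
    The abstract does not reproduce the equation itself. A standard form for the sampling margin of error of a mean with finite-population correction, which matches the quantities discussed (sample size, variability, and response rate via n relative to class size N), is z * s/sqrt(n) * sqrt((N - n)/(N - 1)); the sketch below assumes this form rather than the authors' exact derivation, with hypothetical numbers:

```python
import math

def sampling_moe(s, n, N, z=1.96):
    """Sampling margin of error for a mean with finite-population
    correction: z * s/sqrt(n) * sqrt((N - n)/(N - 1))."""
    fpc = math.sqrt((N - n) / (N - 1))
    return z * s / math.sqrt(n) * fpc

# Class of N=40 students, n=10 SET responses, rating SD s=0.8 (hypothetical)
moe = sampling_moe(0.8, 10, 40)
```

    Note that a 100% response rate (n = N) drives the correction, and hence the margin, to zero, which is the intuition behind using response rates to judge interpretative validity.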

  18. Controller certification: The generalized stability margin inference for a large number of MIMO controllers

    NASA Astrophysics Data System (ADS)

    Park, Jisang

    In this dissertation, we investigate MIMO stability margin inference of a large number of controllers using pre-established stability margins of a small number of nu-gap-wise adjacent controllers. The generalized stability margin and the nu-gap metric are inherently able to handle MIMO system analysis without the necessity of repeating multiple channel-by-channel SISO analyses. This research consists of three parts: (i) development of a decision support tool for inference of the stability margin, (ii) computational considerations for yielding the maximal stability margin with the minimal nu-gap metric in a less conservative manner, and (iii) experiment design for estimating the generalized stability margin with an assured error bound. A modern problem from aerospace control involves the certification of a large set of potential controllers with either a single plant or a fleet of potential plant systems, with both plants and controllers being MIMO and, for the moment, linear. Experiments on a limited number of controller/plant pairs should establish the stability and a certain level of margin of the complete set. We consider this certification problem for a set of controllers and provide algorithms for selecting an efficient subset for testing. This is done for a finite set of candidate controllers and, at least for SISO plants, for an infinite set. In doing this, the nu-gap metric will be the main tool. We provide a theorem restricting a radius of a ball in the parameter space so that the controller can guarantee a prescribed level of stability and performance if parameters of the controllers are contained in the ball. Computational examples are given, including one of certification of an aircraft engine controller. The overarching aim is to introduce truly MIMO margin calculations and to understand their efficacy in certifying stability over a set of controllers and in replacing legacy single-loop gain and phase margin calculations. 
We consider methods for the computation of maximal MIMO stability margins b_{P,C}, minimal nu-gap metrics delta_nu, and the maximal difference between these two values, through the use of scaling and weighting functions. We propose simultaneous scaling selections that attempt to maximize the generalized stability margin and minimize the nu-gap. The minimization of the nu-gap by scaling involves a non-convex optimization. We modify the XY-centering algorithm to handle this non-convexity. This is done for applications in controller certification. Estimating the generalized stability margin with an accurate error bound has significant impact on controller certification. We analyze an error bound of the generalized stability margin as the infinity norm of the MIMO empirical transfer function estimate (ETFE). Input signal design to reduce the error on the estimate is also studied. We suggest running the system for a certain amount of time prior to recording of each output data set. The assured upper bound of estimation error can be tuned by the amount of the pre-experiment.
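
    For SISO plants, the nu-gap equals, when a winding-number condition holds, the sup over frequency of the pointwise chordal distance |P1 - P2| / (sqrt(1 + |P1|^2) * sqrt(1 + |P2|^2)). A hedged numeric sketch that computes only this pointwise sup and skips the winding-number check (so it is an upper-level approximation, not a full nu-gap computation):

```python
import numpy as np

def chordal_sup(num1, den1, num2, den2, w=np.logspace(-3, 3, 100000)):
    """Sup over a frequency grid of the pointwise chordal distance between
    two SISO plants; equals the nu-gap when the winding-number condition
    holds (not verified here)."""
    s = 1j * w
    p1 = np.polyval(num1, s) / np.polyval(den1, s)
    p2 = np.polyval(num2, s) / np.polyval(den2, s)
    k = np.abs(p1 - p2) / (np.sqrt(1 + np.abs(p1) ** 2) *
                           np.sqrt(1 + np.abs(p2) ** 2))
    return float(np.max(k))

# Nearby first-order plants: 1/(s+1) vs 1/(s+1.2) -> small gap
gap = chordal_sup([1.0], [1.0, 1.0], [1.0], [1.0, 1.2])
```

    A small gap relative to an established margin b_{P,C} is exactly what licenses inferring stability of the neighboring controller/plant pair without a new test.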

  19. TH-AB-202-04: Auto-Adaptive Margin Generation for MLC-Tracked Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Glitzner, M; Lagendijk, J; Raaymakers, B

    Purpose: To develop an auto-adaptive margin generator for MLC tracking. The generator is able to estimate errors arising in image-guided radiotherapy, particularly on an MR-Linac, which depend on the latencies of machine and image processing, as well as on patient motion characteristics. From the estimated error distribution, a segment margin is generated, able to compensate errors up to a user-defined confidence. Method: In every tracking control cycle (TCC, 40ms), the desired aperture D(t) is compared to the actual aperture A(t), a delayed and imperfect representation of D(t). Thus an error e(t) = A(t) - D(t) is measured every TCC. Applying kernel density estimation (KDE), the cumulative distribution (CDF) of e(t) is estimated. With CDF confidence limits, upper and lower error limits are extracted for motion axes along and perpendicular to the leaf-travel direction and applied as margins. To test the dosimetric impact, two representative motion traces were extracted from fast liver-MRI (10Hz). The traces were applied onto a 4D-motion platform and continuously tracked by an Elekta Agility 160 MLC using an artificially imposed tracking delay. Gafchromic film was used to detect dose exposition for static, tracked, and error-compensated tracking cases. The margin generator was parameterized to cover 90% of all tracking errors. Dosimetric impact was rated by calculating the ratio of underexposed points (>5% underdosage) to the total number of points inside the FWHM of the static exposure. Results: Without imposing adaptive margins, tracking experiments showed a ratio of underexposed points of 17.5% and 14.3% for two motion cases with imaging delays of 200ms and 300ms, respectively. Activating the margin generator yielded total suppression (<1%) of underdosed points. Conclusion: We showed that auto-adaptive error compensation using machine error statistics is possible for MLC tracking.
    The error compensation margins are calculated on-line, without the need of assuming motion or machine models. Further strategies to reduce consequential overdosages are currently under investigation. This work was funded by the SoRTS consortium, which includes the industry partners Elekta, Philips and Technolution.
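
    The margin generator described (KDE of the per-cycle error e(t), then confidence limits on its CDF) can be sketched directly: estimate the CDF with Gaussian kernels and invert it by bisection at the lower and upper quantiles. A simplified sketch with simulated errors; the bandwidth, sample size, and error distribution are assumptions, not the paper's parameters:

```python
import numpy as np
from math import erf, sqrt

def kde_cdf(x, errors, h):
    """Gaussian kernel-density estimate of the tracking-error CDF at x."""
    return sum(0.5 * (1.0 + erf((x - e) / (h * sqrt(2.0))))
               for e in errors) / len(errors)

def margin_limits(errors, conf=0.90, h=0.1):
    """Lower/upper margins covering `conf` of the estimated error
    distribution, found by bisection on the KDE CDF."""
    lo_p = (1.0 - conf) / 2.0

    def quantile(p):
        a, b = min(errors) - 5 * h, max(errors) + 5 * h
        for _ in range(60):
            mid = 0.5 * (a + b)
            if kde_cdf(mid, errors, h) < p:
                a = mid
            else:
                b = mid
        return 0.5 * (a + b)

    return quantile(lo_p), quantile(1.0 - lo_p)

# Simulated per-cycle tracking errors e(t) = A(t) - D(t), in mm (hypothetical)
rng = np.random.default_rng(1)
errors = rng.normal(0.05, 0.2, 500)
lower, upper = margin_limits(errors, conf=0.90)
```

    The asymmetric limits (the error distribution need not be centered on zero, e.g. under a systematic tracking lag) are what get applied per axis as aperture margins.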

  20. Safety margins in older adults increase with improved control of a dynamic object

    PubMed Central

    Hasson, Christopher J.; Sternad, Dagmar

    2014-01-01

    Older adults face decreasing motor capabilities due to pervasive neuromuscular degradations. As a consequence, errors in movement control increase. Thus, older individuals should maintain larger safety margins than younger adults. While this has been shown for object manipulation tasks, several reports on whole-body activities, such as posture and locomotion, demonstrate age-related reductions in safety margins, despite the increased costs of control errors, such as a fall. We posit that this paradox could be explained by the dynamic challenge presented by the body or by an external object, and that age-related reductions in safety margins are in part due to a decreased ability to control dynamics. To test this conjecture we used a virtual ball-in-cup task that had challenging dynamics, yet afforded an explicit rendering of the physics and safety margin. The hypotheses were: (1) when manipulating an object with challenging dynamics, older adults have smaller safety margins than younger adults; (2) older adults increase their safety margins with practice. Nine young and ten healthy older adults practiced moving the virtual ball-in-cup to a target location in exactly 2 s. The accuracy and precision of the timing error quantified skill, and the ball energy relative to an escape threshold quantified the safety margin. Compared to the young adults, older adults had increased timing errors, greater variability, and decreased safety margins. With practice, both young and older adults improved their ability to control the object, with decreased timing errors and variability, and increased their safety margins. These results suggest that safety margins are related to the ability to control dynamics, and may explain why older adults use adequate safety margins in tasks with simple dynamics but inadequate ones in more complex tasks. Further, the results indicate that task-specific training may improve safety margins in older adults. PMID:25071566

  1. Data Assimilation in the Presence of Forecast Bias: The GEOS Moisture Analysis

    NASA Technical Reports Server (NTRS)

    Dee, Dick P.; Todling, Ricardo

    1999-01-01

    We describe the application of the unbiased sequential analysis algorithm developed by Dee and da Silva (1998) to the GEOS DAS moisture analysis. The algorithm estimates the persistent component of model error using rawinsonde observations and adjusts the first-guess moisture field accordingly. Results of two seasonal data assimilation cycles show that moisture analysis bias is almost completely eliminated in all observed regions. The improved analyses cause a sizable reduction in the 6h-forecast bias and a marginal improvement in the error standard deviations.

  2. Calculating radiotherapy margins based on Bayesian modelling of patient specific random errors

    NASA Astrophysics Data System (ADS)

    Herschtal, A.; te Marvelde, L.; Mengersen, K.; Hosseinifard, Z.; Foroudi, F.; Devereux, T.; Pham, D.; Ball, D.; Greer, P. B.; Pichler, P.; Eade, T.; Kneebone, A.; Bell, L.; Caine, H.; Hindson, B.; Kron, T.

    2015-02-01

    Collected real-life clinical target volume (CTV) displacement data show that some patients undergoing external beam radiotherapy (EBRT) demonstrate significantly more fraction-to-fraction variability in their displacement (‘random error’) than others. This contrasts with the common assumption made by historical recipes for margin estimation for EBRT, that the random error is constant across patients. In this work we present statistical models of CTV displacements in which random errors are characterised by an inverse gamma (IG) distribution in order to assess the impact of random error variability on CTV-to-PTV margin widths, for eight real world patient cohorts from four institutions, and for different sites of malignancy. We considered a variety of clinical treatment requirements and penumbral widths. The eight cohorts consisted of a total of 874 patients and 27 391 treatment sessions. Compared to a traditional margin recipe that assumes constant random errors across patients, for a typical 4 mm penumbral width, the IG based margin model mandates that in order to satisfy the common clinical requirement that 90% of patients receive at least 95% of prescribed RT dose to the entire CTV, margins be increased by a median of 10% (range over the eight cohorts -19% to +35%). This substantially reduces the proportion of patients for whom margins are too small to satisfy clinical requirements.
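    The effect of patient-to-patient variability in the random error can be illustrated with a small simulation; the inverse gamma parameters, the 1.64 margin factor, and the 90%-of-patients criterion below are illustrative assumptions, not the fitted values from the study:

```python
import numpy as np
from scipy.stats import invgamma

rng = np.random.default_rng(1)
# Hypothetical population: per-patient random-error SDs drawn from an
# inverse gamma distribution (shape/scale chosen for illustration only)
sigmas = invgamma.rvs(5.0, scale=8.0, size=10_000, random_state=rng)  # mm

z = 1.64  # ~95% one-sided normal quantile, a common margin factor
# Recipe assuming one constant sigma for every patient:
constant_margin = z * sigmas.mean()
# Margin sized so the clinically required 90% of patients are covered:
ig_margin = z * np.quantile(sigmas, 0.90)

increase = ig_margin / constant_margin - 1.0  # relative margin increase
```

    Because the inverse gamma is right-skewed, sizing the margin for the 90th-percentile patient rather than the average patient enlarges it, which is the qualitative effect the abstract reports.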

  3. [Risk Management: concepts and chances for public health].

    PubMed

    Palm, Stefan; Cardeneo, Margareta; Halber, Marco; Schrappe, Matthias

    2002-01-15

    Errors are a common problem in medicine and occur as a result of a complex process involving many contributing factors. Medical errors significantly reduce the safety margin for the patient and contribute additional costs in health care delivery. In most cases adverse events cannot be attributed to a single underlying cause. Therefore an effective risk management strategy must follow a system approach, which is based on counting and analysis of near misses. The development of defenses against the undesired effects of errors should be the main focus rather than asking the question "Who blundered?". Analysis of near misses (which in this context can be compared to indicators) offers several methodological advantages as compared to the analysis of errors and adverse events. Risk management is an integral element of quality management.

  4. Negative Stress Margins - Are They Real?

    NASA Technical Reports Server (NTRS)

    Raju, Ivatury S.; Lee, Darlene S.; Mohaghegh, Michael

    2011-01-01

    Advances in modeling and simulation, new finite element software, modeling engines, and powerful computers are providing opportunities to interrogate designs in a very different manner and in more detail than ever before. Once analysis models are defined and developed, margins of safety for various design concepts and design parameters are often evaluated quickly using local stresses. This paper suggests that not all of the negative margins of safety so evaluated are real. Negative margins are frequently encountered near stress concentrations, point loads and load discontinuities, and stress singularities; in areas having large gradients but insufficient mesh density; in areas with modeling issues and modeling errors; at connections and interfaces; in two-dimensional (2D) to three-dimensional (3D) transitions; in bolts and bolt modeling; and at boundary conditions. Now, more than ever, structural analysts need to examine and interrogate their analysis results and perform basic sanity checks to determine if these negative margins are real.

  5. A general method for the definition of margin recipes depending on the treatment technique applied in helical tomotherapy prostate plans.

    PubMed

    Sevillano, David; Mínguez, Cristina; Sánchez, Alicia; Sánchez-Reyes, Alberto

    2016-01-01

    To obtain specific margin recipes that take into account the dosimetric characteristics of the treatment plans used in a single institution. We obtained dose-population histograms (DPHs) of 20 helical tomotherapy treatment plans for prostate cancer by simulating the effects of different systematic errors (Σ) and random errors (σ) on these plans. We obtained dosimetric margins and margin reductions due to random errors (random margins) by fitting the theoretical coverages for Gaussian distributions to the coverages of the planned D99% obtained from the DPHs. The dosimetric margins obtained for helical tomotherapy prostate treatments were 3.3 mm, 3 mm, and 1 mm in the lateral (Lat), anterior-posterior (AP), and superior-inferior (SI) directions. Random margins showed parabolic dependencies, yielding expressions of 0.16σ², 0.13σ², and 0.15σ² for the Lat, AP, and SI directions, respectively. When focusing on values up to σ = 5 mm, random margins could be fitted considering Gaussian penumbras with standard deviations (σp) equal to 4.5 mm Lat, 6 mm AP, and 5.5 mm SI. Despite complex dose distributions in helical tomotherapy treatment plans, we were able to simplify the behaviour of our plans against treatment errors to single values of dosimetric and random margins for each direction. These margins allowed us to develop specific margin recipes for the respective treatment technique. The method is general and could be used for any treatment technique provided that DPHs can be obtained. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
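    The parabolic dependence of the random margin on σ is consistent with a van Herk-style Gaussian penumbra-broadening term; the sketch below uses that standard term and its small-σ expansion, which approximately reproduces the quoted coefficients for the stated penumbral widths. This reconstruction is our illustration, not the authors' fitting procedure:

```python
import numpy as np

def random_margin(sigma, sigma_p):
    """van Herk-style random-error margin term (mm): broadening of a Gaussian
    penumbra with SD sigma_p by a random error with SD sigma, at the 1.64
    (95%) level."""
    return 1.64 * (np.sqrt(sigma_p**2 + sigma**2) - sigma_p)

# Small-sigma Taylor expansion:
#   1.64 * sigma**2 / (2 * sigma_p) = 0.82 * sigma**2 / sigma_p,
# a parabola in sigma, as observed for the fitted random margins
coeffs = {axis: 0.82 / sp for axis, sp in
          {"Lat": 4.5, "AP": 6.0, "SI": 5.5}.items()}
```

    For the quoted penumbral widths this gives roughly 0.18σ², 0.14σ², and 0.15σ² for Lat, AP, and SI, in approximate agreement with the fitted 0.16σ², 0.13σ², and 0.15σ².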

  6. Real-time auto-adaptive margin generation for MLC-tracked radiotherapy

    NASA Astrophysics Data System (ADS)

    Glitzner, M.; Fast, M. F.; de Senneville, B. Denis; Nill, S.; Oelfke, U.; Lagendijk, J. J. W.; Raaymakers, B. W.; Crijns, S. P. M.

    2017-01-01

    In radiotherapy, abdominal and thoracic sites are candidates for performing motion tracking. With real-time control it is possible to adjust the multileaf collimator (MLC) position to the target position. However, positions are not perfectly matched and position errors arise from system delays and complicated response of the electromechanic MLC system. Although, it is possible to compensate parts of these errors by using predictors, residual errors remain and need to be compensated to retain target coverage. This work presents a method to statistically describe tracking errors and to automatically derive a patient-specific, per-segment margin to compensate the arising underdosage on-line, i.e. during plan delivery. The statistics of the geometric error between intended and actual machine position are derived using kernel density estimators. Subsequently a margin is calculated on-line according to a selected coverage parameter, which determines the amount of accepted underdosage. The margin is then applied onto the actual segment to accommodate the positioning errors in the enlarged segment. The proof-of-concept was tested in an on-line tracking experiment and showed the ability to recover underdosages for two test cases, increasing {{V}90 %} in the underdosed area about 47 % and 41 % , respectively. The used dose model was able to predict the loss of dose due to tracking errors and could be used to infer the necessary margins. The implementation had a running time of 23 ms which is compatible with real-time requirements of MLC tracking systems. The auto-adaptivity to machine and patient characteristics makes the technique a generic yet intuitive candidate to avoid underdosages due to MLC tracking errors.

  7. SU-F-J-24: Setup Uncertainty and Margin of the ExacTrac 6D Image Guide System for Patients with Brain Tumors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, S; Oh, S; Yea, J

    Purpose: This study evaluated the setup uncertainties for brain sites when using BrainLAB’s ExacTrac X-ray 6D system for daily pretreatment imaging to determine the optimal planning target volume (PTV) margin. Methods: Between August 2012 and April 2015, 28 patients with brain tumors were treated by daily image-guided radiotherapy using the BrainLAB ExacTrac 6D image guidance system of the Novalis-Tx linear accelerator. DUON™ (Orfit Industries, Wijnegem, Belgium) masks were used to fix the head. The radiotherapy was fractionated into 27–33 treatments. In total, 844 image verifications were performed for the 28 patients and used for the analysis. The setup corrections along with the systematic and random errors were analyzed for six degrees of freedom in the translational (lateral, longitudinal, and vertical) and rotational (pitch, roll, and yaw) dimensions. Results: Optimal PTV margins were calculated based on van Herk et al.’s [margin recipe = 2.5∑ + 0.7σ − 3 mm] and Stroom et al.’s [margin recipe = 2∑ + 0.7σ] formulas. The systematic errors (∑) were 0.72, 1.57, and 0.97 mm in the lateral, longitudinal, and vertical translational dimensions, respectively, and 0.72°, 0.87°, and 0.83° in the pitch, roll, and yaw rotational dimensions, respectively. The random errors (σ) were 0.31, 0.46, and 0.54 mm in the lateral, longitudinal, and vertical translational dimensions, respectively, and 0.28°, 0.24°, and 0.31° in the pitch, roll, and yaw rotational dimensions, respectively. According to van Herk et al.’s and Stroom et al.’s recipes, the recommended lateral PTV margins were 0.97 and 1.66 mm, respectively; the longitudinal margins were 1.26 and 3.47 mm, respectively; and the vertical margins were 0.21 and 2.31 mm, respectively. Conclusion: Daily setup verification using the BrainLAB ExacTrac 6D image guidance system is very useful for evaluating the setup uncertainties and determining the setup margin.
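    The two quoted margin recipes can be applied directly to the reported translational errors. Note that for errors this small the quoted 2.5∑ + 0.7σ − 3 mm variant evaluates to a negative number on some axes, which in practice would be clipped to zero:

```python
# The two margin recipes quoted in the abstract, applied to its reported
# translational systematic (Sigma) and random (sigma) errors, in mm
def van_herk(Sigma, sigma):
    return 2.5 * Sigma + 0.7 * sigma - 3.0  # variant quoted in the abstract

def stroom(Sigma, sigma):
    return 2.0 * Sigma + 0.7 * sigma

errors = {
    "lateral":      (0.72, 0.31),
    "longitudinal": (1.57, 0.46),
    "vertical":     (0.97, 0.54),
}
margins = {axis: (van_herk(S, s), stroom(S, s)) for axis, (S, s) in errors.items()}
# Longitudinal: van Herk ~1.25 mm, Stroom ~3.46 mm; on the lateral and
# vertical axes the van Herk variant is negative and would be clipped to zero.
```

    The small differences from the abstract's quoted 1.26 and 3.47 mm presumably come from rounding of the reported ∑ and σ.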

  8. Three independent one-dimensional margins for single-fraction frameless stereotactic radiosurgery brain cases using CBCT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Qinghui; Chan, Maria F.; Burman, Chandra

    2013-12-15

    Purpose: Setting a proper margin is crucial for not only delivering the required radiation dose to a target volume, but also reducing the unnecessary radiation to the adjacent organs at risk. This study investigated the independent one-dimensional symmetric and asymmetric margins between the clinical target volume (CTV) and the planning target volume (PTV) for linac-based single-fraction frameless stereotactic radiosurgery (SRS). Methods: The authors assumed a Dirac delta function for the systematic error of a specific machine and a Gaussian function for the residual setup errors. Margin formulas were then derived in detail to arrive at a suitable CTV-to-PTV margin for single-fraction frameless SRS. Such a margin ensured that the CTV would receive the prescribed dose in 95% of the patients. To validate our margin formalism, the authors retrospectively analyzed nine patients who were previously treated with noncoplanar conformal beams. Cone-beam computed tomography (CBCT) was used in the patient setup. The isocenter shifts between the CBCT and linac were measured for a Varian Trilogy linear accelerator for three months. For each plan, the authors shifted the isocenter of the plan in each direction by ±3 mm simultaneously to simulate the worst setup scenario. Subsequently, the asymptotic behavior of the CTV V80% for each patient was studied as the setup error approached the CTV-PTV margin. Results: The authors found that the proper margin for single-fraction frameless SRS cases with brain cancer was about 3 mm for the machine investigated in this study. The isocenter shifts between the CBCT and the linac remained almost constant over a period of three months for this specific machine. This confirmed our assumption that the machine systematic error distribution could be approximated as a delta function. This definition is especially relevant to a single-fraction treatment. The prescribed dose coverage for all the patients investigated was 96.1% ± 5.5% with an extreme 3-mm setup error in all three directions simultaneously. It was found that the effect of the setup error on dose coverage was tumor location dependent. It mostly affected the tumors located in the posterior part of the brain, resulting in a minimum coverage of approximately 72%. This was entirely due to the unique geometry of the posterior head. Conclusions: Margin expansion formulas were derived for single-fraction frameless SRS such that the CTV would receive the prescribed dose in 95% of the patients treated for brain cancer. The margins defined in this study are machine-specific and account for nonzero mean systematic error. The margin for single-fraction SRS for a group of machines was also derived in this paper.

  9. SU-E-J-30: Benchmark Image-Based TCP Calculation for Evaluation of PTV Margins for Lung SBRT Patients

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, M; Chetty, I; Zhong, H

    2014-06-01

    Purpose: Tumor control probability (TCP) calculated with accumulated radiation doses may help design appropriate treatment margins. Image registration errors, however, may compromise the calculated TCP. The purpose of this study is to develop benchmark CT images to quantify registration-induced errors in the accumulated doses and their corresponding TCP. Methods: 4DCT images were registered from end-inhale (EI) to end-exhale (EE) using a “demons” algorithm. The demons DVFs were corrected by an FEM model to get realistic deformation fields. The FEM DVFs were used to warp the EI images to create the FEM-simulated images. The two images combined with the FEM DVF formed a benchmark model. Maximum intensity projection (MIP) images, created from the EI and simulated images, were used to develop IMRT plans. Two plans, with 3 and 5 mm margins, were developed for each patient. With these plans, radiation doses were recalculated on the simulated images and warped back to the EI images using the FEM DVFs to get the accumulated doses. The Elastix software was used to register the FEM-simulated images to the EI images. TCPs calculated with the Elastix-accumulated doses were compared with those generated by the FEM to get the TCP error of the Elastix registrations. Results: For six lung patients, the mean Elastix registration error ranged from 0.93 to 1.98 mm. Their relative dose errors in the PTV were between 0.28% and 6.8% for the 3 mm margin plans, and between 0.29% and 6.3% for the 5 mm margin plans. As the PTV margin was reduced from 5 to 3 mm, the mean TCP error of the Elastix-reconstructed doses increased from 2.0% to 2.9%, and the mean NTCP error decreased from 1.2% to 1.1%. Conclusion: Patient-specific benchmark images can be used to evaluate the impact of registration errors on the computed TCPs, and may help select appropriate PTV margins for lung SBRT patients.

  10. Increasing Safety of a Robotic System for Inner Ear Surgery Using Probabilistic Error Modeling Near Vital Anatomy

    PubMed Central

    Dillon, Neal P.; Siebold, Michael A.; Mitchell, Jason E.; Blachon, Gregoire S.; Balachandran, Ramya; Fitzpatrick, J. Michael; Webster, Robert J.

    2017-01-01

    Safe and effective planning for robotic surgery that involves cutting or ablation of tissue must consider all potential sources of error when determining how close the tool may come to vital anatomy. A pre-operative plan that does not adequately consider potential deviations from ideal system behavior may lead to patient injury. Conversely, a plan that is overly conservative may result in ineffective or incomplete performance of the task. Thus, enforcing simple, uniform-thickness safety margins around vital anatomy is insufficient in the presence of spatially varying, anisotropic error. Prior work has used registration error to determine a variable-thickness safety margin around vital structures that must be approached during mastoidectomy but ultimately preserved. In this paper, these methods are extended to incorporate image distortion and physical robot errors, including kinematic errors and deflections of the robot. These additional sources of error are discussed and stochastic models for a bone-attached robot for otologic surgery are developed. An algorithm for generating appropriate safety margins based on a desired probability of preserving the underlying anatomical structure is presented. Simulations are performed on a CT scan of a cadaver head and safety margins are calculated around several critical structures for planning of a robotic mastoidectomy. PMID:29200595
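    The variable-thickness margin idea can be illustrated with a one-directional Gaussian model: project a pooled error covariance onto the structure's surface normal and take a high quantile. The covariance values and preservation probability below are hypothetical, and this one-sided Gaussian tail is a simplification of the paper's stochastic models:

```python
import numpy as np
from scipy.stats import norm

def margin_thickness(cov, normal, p_preserve=0.9999):
    """Margin (mm) along a structure's surface normal such that a Gaussian
    error with covariance `cov` (mm^2) crosses it with probability
    1 - p_preserve. `cov` would pool registration, image distortion, and
    robot (kinematic/deflection) error contributions."""
    n = np.asarray(normal, dtype=float)
    n = n / np.linalg.norm(n)
    sigma_n = np.sqrt(n @ cov @ n)  # error SD projected onto the normal
    return norm.ppf(p_preserve) * sigma_n

# Anisotropic illustration: error larger along z than along x/y
cov = np.diag([0.2**2, 0.2**2, 0.5**2])
m_x = margin_thickness(cov, [1.0, 0.0, 0.0])
m_z = margin_thickness(cov, [0.0, 0.0, 1.0])  # thicker margin where error is larger
```

    Because the projected SD varies with the surface normal, the resulting margin is thicker where the pooled error is larger, which is exactly why a uniform-thickness margin is insufficient under anisotropic error.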

  11. Increasing safety of a robotic system for inner ear surgery using probabilistic error modeling near vital anatomy

    NASA Astrophysics Data System (ADS)

    Dillon, Neal P.; Siebold, Michael A.; Mitchell, Jason E.; Blachon, Gregoire S.; Balachandran, Ramya; Fitzpatrick, J. Michael; Webster, Robert J.

    2016-03-01

    Safe and effective planning for robotic surgery that involves cutting or ablation of tissue must consider all potential sources of error when determining how close the tool may come to vital anatomy. A pre-operative plan that does not adequately consider potential deviations from ideal system behavior may lead to patient injury. Conversely, a plan that is overly conservative may result in ineffective or incomplete performance of the task. Thus, enforcing simple, uniform-thickness safety margins around vital anatomy is insufficient in the presence of spatially varying, anisotropic error. Prior work has used registration error to determine a variable-thickness safety margin around vital structures that must be approached during mastoidectomy but ultimately preserved. In this paper, these methods are extended to incorporate image distortion and physical robot errors, including kinematic errors and deflections of the robot. These additional sources of error are discussed and stochastic models for a bone-attached robot for otologic surgery are developed. An algorithm for generating appropriate safety margins based on a desired probability of preserving the underlying anatomical structure is presented. Simulations are performed on a CT scan of a cadaver head and safety margins are calculated around several critical structures for planning of a robotic mastoidectomy.

  12. Evaluation of overall setup accuracy and adequate setup margins in pelvic image-guided radiotherapy: Comparison of the male and female patients

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Laaksomaa, Marko, E-mail: marko.laaksomaa@pshp.fi; Kapanen, Mika; Department of Medical Physics, Tampere University Hospital

    We evaluated adequate setup margins for the radiotherapy (RT) of pelvic tumors based on overall position errors of bony landmarks. We also estimated the difference in setup accuracy between the male and female patients. Finally, we compared the patient rotation for 2 immobilization devices. The study cohort included 64 consecutive male and 64 consecutive female patients. Altogether, 1794 orthogonal setup images were analyzed. Observer-related deviation in image matching and the effect of patient rotation were explicitly determined. Overall systematic and random errors were calculated in 3 orthogonal directions. Anisotropic setup margins were evaluated based on residual errors after weekly image guidance. The van Herk formula was used to calculate the margins. Overall, 100 patients were immobilized with an in-house device. The patient rotation was compared against 28 patients immobilized with CIVCO's Kneefix and Feetfix. We found that the usually applied isotropic setup margin of 8 mm covered all the uncertainties related to patient setup for most RT treatments of the pelvis. However, margins of up to 10.3 mm were needed for the female patients with very large pelvic target volumes centered either in the symphysis or in the sacrum and containing both of these structures. This was because the effect of rotation (p ≤ 0.02) and the observer variation in image matching (p ≤ 0.04) were significantly larger for the female patients than for the male patients. Even with daily image guidance, the required margins remained larger for the women. Patient rotations were largest about the lateral axes. The difference between the required margins was only 1 mm for the 2 immobilization devices. The largest component of the overall systematic position error came from patient rotation. This emphasizes the need for rotation correction. Overall, larger position errors and setup margins were observed for the female patients with pelvic cancer than for the male patients.

  13. MO-FG-CAMPUS-JeP3-01: A Statistical Model for Analyzing the Rotational Error of Single Iso-Center Technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chang, J; Dept of Radiation Oncology, New York Weill Cornell Medical Ctr, New York, NY

    Purpose: To develop a generalized statistical model that incorporates the treatment uncertainty from the rotational error of the single iso-center technique, and to calculate the additional PTV (planning target volume) margin required to compensate for this error. Methods: The random vectors for setup and additional rotation errors in the three-dimensional (3D) patient coordinate system were assumed to follow the 3D independent normal distribution with zero mean, and standard deviations σx, σy, σz for the setup error and a uniform σR for the rotational error. Both random vectors were summed, normalized, and transformed to spherical coordinates to derive the chi distribution with 3 degrees of freedom for the radial distance ρ. The PTV margin was determined using the critical value of this distribution at the 0.05 significance level, so that 95% of the time the treatment target would be covered by ρ. The additional PTV margin required to compensate for the rotational error was calculated as a function of σx, σy, σz, and σR. Results: The effect of the rotational error is more pronounced for treatments that require high accuracy/precision, like stereotactic radiosurgery (SRS) or stereotactic body radiotherapy (SBRT). With a uniform 2 mm PTV margin (or σx = σy = σz = 0.7 mm), a σR = 0.32 mm will decrease the PTV coverage from 95% to 90% of the time; equivalently, an additional 0.2 mm PTV margin is needed to prevent this loss of coverage. If we choose 0.2 mm as the threshold, any σR > 0.3 mm will lead to an additional PTV margin that cannot be ignored, and the maximal σR that can be ignored is 0.0064 rad (or 0.37°) for an iso-to-target distance of 5 cm, or 0.0032 rad (or 0.18°) for an iso-to-target distance of 10 cm. Conclusions: The rotational error cannot be ignored for high-accuracy/-precision treatments like SRS/SBRT, particularly when the distance between the iso-center and target is large.
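    A minimal numeric check of the model, assuming isotropic setup error so that the radial distance is σ times a chi variate with 3 degrees of freedom:

```python
import numpy as np
from scipy.stats import chi2

def ptv_margin(sigma, sigma_r=0.0, coverage=0.95):
    """Radial PTV margin (mm) for isotropic setup-error SD `sigma` plus a
    uniform rotational-error SD `sigma_r`. The radial distance of a 3-D
    zero-mean normal error is sigma_total times a chi variate with 3 dof,
    so the quantile is sqrt(chi2.ppf(coverage, 3)) * sigma_total."""
    sigma_total = np.sqrt(sigma**2 + sigma_r**2)
    return np.sqrt(chi2.ppf(coverage, df=3)) * sigma_total

base = ptv_margin(0.7)            # ~2 mm, the abstract's uniform-margin example
with_rot = ptv_margin(0.7, 0.32)  # rotational error adds ~0.2 mm
```

    The sketch reproduces the abstract's example: σ = 0.7 mm gives a margin near 2 mm, and adding σR = 0.32 mm requires roughly 0.2 mm more.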

  14. Efficacy and workload analysis of a fixed vertical couch position technique and a fixed‐action–level protocol in whole‐breast radiotherapy

    PubMed Central

    Verhoeven, Karolien; Weltens, Caroline; Van den Heuvel, Frank

    2015-01-01

    Quantification of the setup errors is vital to define appropriate setup margins preventing geographical misses. The no‐action‐level (NAL) correction protocol reduces the systematic setup errors and, hence, the setup margins. The manual entry of the setup corrections in the record‐and‐verify software, however, increases the susceptibility of the NAL protocol to human errors. Moreover, the impact of skin mobility on the anteroposterior patient setup reproducibility in whole‐breast radiotherapy (WBRT) is unknown. In this study, we therefore investigated the potential of fixed vertical couch position‐based patient setup in WBRT. The possibility to introduce a threshold for correction of the systematic setup errors was also explored. We measured the anteroposterior, mediolateral, and superior–inferior setup errors during fractions 1–12 and weekly thereafter with tangential angled single modality paired imaging. These setup data were used to simulate the residual setup errors of the NAL protocol, the fixed vertical couch position protocol, and the fixed‐action–level protocol with different correction thresholds. Population statistics of the setup errors of 20 breast cancer patients and 20 breast cancer patients with additional regional lymph node (LN) irradiation were calculated to determine the setup margins of each off‐line correction protocol. Our data showed the potential of the fixed vertical couch position protocol to restrict the systematic and random anteroposterior residual setup errors to 1.8 mm and 2.2 mm, respectively. Compared to the NAL protocol, a correction threshold of 2.5 mm reduced the frequency of mediolateral and superior–inferior setup corrections by 40% and 63%, respectively. The implementation of the correction threshold did not deteriorate the accuracy of the off‐line setup correction compared to the NAL protocol. The combination of the fixed vertical couch position protocol, for correction of the anteroposterior setup error, and the fixed‐action–level protocol with a 2.5 mm correction threshold, for correction of the mediolateral and superior–inferior setup errors, provided adequate and comparable patient setup accuracy in WBRT and WBRT with additional LN irradiation. PACS numbers: 87.53.Kn, 87.57.‐s
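    The difference between the NAL and fixed‐action–level protocols can be sketched with a toy one-dimensional simulation of the systematic-error correction; the error magnitudes and number of measured fractions are illustrative (only the 2.5 mm threshold matches the study's choice):

```python
import numpy as np

rng = np.random.default_rng(2)

def residual_systematic(protocol, n_patients=2000, n_meas=5,
                        Sigma=2.0, sigma=2.0, threshold=2.5):
    """SD (mm) of the residual systematic setup error after an off-line
    correction protocol. Sigma/sigma are illustrative population values.

    'nal': always correct by the mean of the first n_meas measurements.
    'fal': fixed-action-level, correct only when that mean exceeds threshold.
    """
    mu = rng.normal(0.0, Sigma, n_patients)  # true systematic error per patient
    meas = mu[:, None] + rng.normal(0.0, sigma, (n_patients, n_meas))
    estimate = meas.mean(axis=1)
    if protocol == "nal":
        correction = estimate
    else:
        correction = np.where(np.abs(estimate) > threshold, estimate, 0.0)
    return (mu - correction).std()

nal = residual_systematic("nal")
fal = residual_systematic("fal")  # larger residual, but far fewer corrections
```

    The threshold trades a modest increase in residual systematic error for a large reduction in the number of manual corrections, which is the workload argument made above.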

  15. PTV margin determination in conformal SRT of intracranial lesions

    PubMed Central

    Parker, Brent C.; Shiu, Almon S.; Maor, Moshe H.; Lang, Frederick F.; Liu, H. Helen; White, R. Allen; Antolak, John A.

    2002-01-01

    The planning target volume (PTV) includes the clinical target volume (CTV) to be irradiated and a margin to account for uncertainties in the treatment process. Uncertainties in miniature multileaf collimator (mMLC) leaf positioning, CT scanner spatial localization, CT‐MRI image fusion spatial localization, and Gill‐Thomas‐Cosman (GTC) relocatable head frame repositioning were quantified for the purpose of determining a minimum PTV margin that still delivers a satisfactory CTV dose. The measured uncertainties were then incorporated into a simple Monte Carlo calculation for evaluation of various margin and fraction combinations. Satisfactory CTV dosimetric criteria were selected to be a minimum CTV dose of 95% of the PTV dose and at least 95% of the CTV receiving 100% of the PTV dose. The measured uncertainties were assumed to be Gaussian distributions. Systematic errors were added linearly and random errors were added in quadrature assuming no correlation to arrive at the total combined error. The Monte Carlo simulation written for this work examined the distribution of cumulative dose volume histograms for a large patient population using various margin and fraction combinations to determine the smallest margin required to meet the established criteria. The program examined 5 and 30 fraction treatments, since those are the only fractionation schemes currently used at our institution. The fractionation schemes were evaluated using no margin, a margin of just the systematic component of the total uncertainty, and a margin of the systematic component plus one standard deviation of the total uncertainty. 
It was concluded that (i) a margin of the systematic error plus one standard deviation of the total uncertainty is the smallest PTV margin necessary to achieve the established CTV dose criteria, and (ii) it is necessary to determine the uncertainties introduced by the specific equipment and procedures used at each institution since the uncertainties may vary among locations. PACS number(s): 87.53.Kn, 87.53.Ly PMID:12132939

  16. Estimation of daily interfractional larynx residual setup error after isocentric alignment for head and neck radiotherapy: Quality-assurance implications for target volume and organ-at-risk margination using daily CT-on-rails imaging

    PubMed Central

    Baron, Charles A.; Awan, Musaddiq J.; Mohamed, Abdallah S. R.; Akel, Imad; Rosenthal, David I.; Gunn, G. Brandon; Garden, Adam S.; Dyer, Brandon A.; Court, Laurence; Sevak, Parag R; Kocak-Uzel, Esengul; Fuller, Clifton D.

    2016-01-01

    Larynx may alternatively serve as a target or organ-at-risk (OAR) in head and neck cancer (HNC) image-guided radiotherapy (IGRT). The objective of this study was to estimate IGRT parameters required for larynx positional error independent of isocentric alignment and suggest population–based compensatory margins. Ten HNC patients receiving radiotherapy (RT) with daily CT-on-rails imaging were assessed. Seven landmark points were placed on each daily scan. Taking the most superior anterior point of the C5 vertebra as a reference isocenter for each scan, residual displacement vectors to the other 6 points were calculated post-isocentric alignment. Subsequently, using the first scan as a reference, the magnitude of vector differences for all 6 points for all scans over the course of treatment were calculated. Residual systematic and random error, and the necessary compensatory CTV-to-PTV and OAR-to-PRV margins were calculated, using both observational cohort data and a bootstrap-resampled population estimator. The grand mean displacements for all anatomical points was 5.07mm, with mean systematic error of 1.1mm and mean random setup error of 2.63mm, while bootstrapped POIs grand mean displacement was 5.09mm, with mean systematic error of 1.23mm and mean random setup error of 2.61mm. Required margin for CTV-PTV expansion was 4.6mm for all cohort points, while the bootstrap estimator of the equivalent margin was 4.9mm. The calculated OAR-to-PRV expansion for the observed residual set-up error was 2.7mm, and bootstrap estimated expansion of 2.9mm. We conclude that the interfractional larynx setup error is a significant source of RT set-up/delivery error in HNC both when the larynx is considered as a CTV or OAR. We estimate the need for a uniform expansion of 5mm to compensate for set up error if the larynx is a target or 3mm if the larynx is an OAR when using a non-laryngeal bony isocenter. PMID:25679151

  17. Estimation of daily interfractional larynx residual setup error after isocentric alignment for head and neck radiotherapy: quality assurance implications for target volume and organs‐at‐risk margination using daily CT on‐rails imaging

    PubMed Central

    Baron, Charles A.; Awan, Musaddiq J.; Mohamed, Abdallah S.R.; Akel, Imad; Rosenthal, David I.; Gunn, G. Brandon; Garden, Adam S.; Dyer, Brandon A.; Court, Laurence; Sevak, Parag R.; Kocak‐Uzel, Esengul

    2014-01-01

    Larynx may alternatively serve as a target or organs at risk (OAR) in head and neck cancer (HNC) image‐guided radiotherapy (IGRT). The objective of this study was to estimate IGRT parameters required for larynx positional error independent of isocentric alignment and suggest population‐based compensatory margins. Ten HNC patients receiving radiotherapy (RT) with daily CT on‐rails imaging were assessed. Seven landmark points were placed on each daily scan. Taking the most superior‐anterior point of the C5 vertebra as a reference isocenter for each scan, residual displacement vectors to the other six points were calculated postisocentric alignment. Subsequently, using the first scan as a reference, the magnitude of vector differences for all six points for all scans over the course of treatment was calculated. Residual systematic and random error and the necessary compensatory CTV‐to‐PTV and OAR‐to‐PRV margins were calculated, using both observational cohort data and a bootstrap‐resampled population estimator. The grand mean displacements for all anatomical points was 5.07 mm, with mean systematic error of 1.1 mm and mean random setup error of 2.63 mm, while bootstrapped POIs grand mean displacement was 5.09 mm, with mean systematic error of 1.23 mm and mean random setup error of 2.61 mm. Required margin for CTV‐PTV expansion was 4.6 mm for all cohort points, while the bootstrap estimator of the equivalent margin was 4.9 mm. The calculated OAR‐to‐PRV expansion for the observed residual setup error was 2.7 mm and bootstrap estimated expansion of 2.9 mm. We conclude that the interfractional larynx setup error is a significant source of RT setup/delivery error in HNC, both when the larynx is considered as a CTV or OAR. We estimate the need for a uniform expansion of 5 mm to compensate for setup error if the larynx is a target, or 3 mm if the larynx is an OAR, when using a nonlaryngeal bony isocenter. PACS numbers: 87.55.D‐, 87.55.Qr
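    The reported margins are consistent with the standard population recipes (van Herk's 2.5Σ + 0.7σ for the CTV-to-PTV margin and McKenzie's 1.3Σ + 0.5σ for the OAR-to-PRV margin), although the abstract does not name its formulas. A minimal sketch under that assumption:

```python
def ctv_to_ptv_margin(big_sigma, small_sigma):
    """van Herk population margin recipe: 2.5*Sigma + 0.7*sigma (mm)."""
    return 2.5 * big_sigma + 0.7 * small_sigma

def oar_to_prv_margin(big_sigma, small_sigma):
    """McKenzie margin recipe for organs at risk: 1.3*Sigma + 0.5*sigma (mm)."""
    return 1.3 * big_sigma + 0.5 * small_sigma

# Observed cohort errors from the abstract (mm): Sigma = 1.1, sigma = 2.63
ptv_cohort = ctv_to_ptv_margin(1.1, 2.63)   # approx. 4.6 mm, as reported
prv_cohort = oar_to_prv_margin(1.1, 2.63)   # approx. 2.7 mm, as reported
# Bootstrap-resampled errors (mm): Sigma = 1.23, sigma = 2.61
ptv_boot = ctv_to_ptv_margin(1.23, 2.61)    # approx. 4.9 mm
prv_boot = oar_to_prv_margin(1.23, 2.61)    # approx. 2.9 mm
```

Plugging the abstract's systematic and random errors into these recipes reproduces all four reported margins, which supports the assumption.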

  18. A model for predicting propagation-related DSCS (Defense Communications Engineering Center) margin requirements

    NASA Astrophysics Data System (ADS)

    Shultheis, C. F.

    1985-02-01

    This technical report describes an analysis of the performance allocations for a satellite link, focusing specifically on a single-hop 7 to 8 GHz link of the Defense Satellite Communications System (DSCS). The analysis is performed for three primary reasons: (1) to reevaluate link power margin requirements for DSCS links based on digital signalling; (2) to analyze the implications of satellite availability and error rate allocations contained in proposed MIL-STD-188-323, system design and engineering standards for long haul digital transmission system performance; and (3) to standardize a methodology for determination of rain-related propagation constraints. The aforementioned methodology is then used to calculate the link margin requirements of typical DSCS binary/quaternary phase shift keying (BPSK/QPSK) links at 7 to 8 GHz for several different Earth terminal locations.
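    The link-margin arithmetic such an analysis rests on can be sketched for uncoded BPSK in AWGN; the 1e-5 BER target and the Eb/N0 numbers below are illustrative assumptions, not values from the report:

```python
import math

def bpsk_ber(ebn0_db):
    """Theoretical BPSK bit error rate in AWGN: 0.5 * erfc(sqrt(Eb/N0))."""
    ebn0 = 10.0 ** (ebn0_db / 10.0)
    return 0.5 * math.erfc(math.sqrt(ebn0))

def link_margin_db(available_ebn0_db, target_ber):
    """Excess Eb/N0 (dB) above what the target BER requires,
    found by a coarse 0.01 dB search (illustrative, uncoded BPSK)."""
    required = 0.0
    while bpsk_ber(required) > target_ber:
        required += 0.01
    return available_ebn0_db - required

# A hypothetical clear-sky Eb/N0 of 14 dB against a 1e-5 BER target
# leaves roughly 4.4 dB of margin to absorb rain attenuation.
margin = link_margin_db(14.0, 1e-5)
```

The rain-related part of the methodology then reduces to asking how often the attenuation statistics at a given Earth terminal location exceed that excess.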

  19. Improved Margin of Error Estimates for Proportions in Business: An Educational Example

    ERIC Educational Resources Information Center

    Arzumanyan, George; Halcoussis, Dennis; Phillips, G. Michael

    2015-01-01

    This paper presents the Agresti & Coull "Adjusted Wald" method for computing confidence intervals and margins of error for common proportion estimates. The presented method is easily implementable by business students and practitioners and provides more accurate estimates of proportions particularly in extreme samples and small…
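    The Agresti & Coull adjusted Wald interval itself is simple to state: add z² pseudo-observations, half successes and half failures, then apply the usual Wald formula. A minimal sketch:

```python
import math
from statistics import NormalDist

def adjusted_wald(successes, n, conf_level=0.95):
    """Agresti & Coull adjusted Wald interval: add z^2 pseudo-observations
    (half successes, half failures), then use the Wald formula."""
    z = NormalDist().inv_cdf(0.5 + conf_level / 2.0)
    n_adj = n + z ** 2
    p_adj = (successes + z ** 2 / 2.0) / n_adj
    half = z * math.sqrt(p_adj * (1.0 - p_adj) / n_adj)
    return max(0.0, p_adj - half), min(1.0, p_adj + half)

# Extreme sample: 0 successes in 20 trials. The classical Wald interval
# degenerates to (0, 0); the adjusted interval stays informative.
lo, hi = adjusted_wald(0, 20)
```

The extreme-sample case is exactly where the paper argues the adjustment matters most for business applications.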

  20. Precision diet formulation to improve performance and profitability across various climates: Modeling the implications of increasing the formulation frequency of dairy cattle diets.

    PubMed

    White, Robin R; Capper, Judith L

    2014-03-01

    The objective of this study was to use a precision nutrition model to simulate the relationship between diet formulation frequency and dairy cattle performance across various climates. Agricultural Modeling and Training Systems (AMTS) CattlePro diet-balancing software (Cornell Research Foundation, Ithaca, NY) was used to compare 3 diet formulation frequencies (weekly, monthly, or seasonal) and 3 levels of climate variability (hot, cold, or variable). Predicted daily milk yield (MY), metabolizable energy (ME) balance, and dry matter intake (DMI) were recorded for each frequency-variability combination. Economic analysis was conducted to calculate the predicted revenue over feed and labor costs. Diet formulation frequency affected ME balance and MY but did not affect DMI. Climate variability affected ME balance and DMI but not MY. The interaction between climate variability and formulation frequency did not affect ME balance, MY, or DMI. Formulating diets more frequently increased MY, DMI, and ME balance. Economic analysis showed that formulating diets weekly rather than seasonally could improve returns over variable costs by $25,000 per year for a moderate-sized (300-cow) operation. To achieve this increase in returns, an entire feeding system margin of error of <1% was required. Formulating monthly, rather than seasonally, may be a more feasible alternative as this requires a margin of error of only 2.5% for the entire feeding system. Feeding systems with a low margin of error must be developed to better take advantage of the benefits of precision nutrition. Copyright © 2014 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  1. Correcting for Measurement Error in Time-Varying Covariates in Marginal Structural Models.

    PubMed

    Kyle, Ryan P; Moodie, Erica E M; Klein, Marina B; Abrahamowicz, Michał

    2016-08-01

    Unbiased estimation of causal parameters from marginal structural models (MSMs) requires a fundamental assumption of no unmeasured confounding. Unfortunately, the time-varying covariates used to obtain inverse probability weights are often error-prone. Although substantial measurement error in important confounders is known to undermine control of confounders in conventional unweighted regression models, this issue has received comparatively limited attention in the MSM literature. Here we propose a novel application of the simulation-extrapolation (SIMEX) procedure to address measurement error in time-varying covariates, and we compare 2 approaches. The direct approach to SIMEX-based correction targets outcome model parameters, while the indirect approach corrects the weights estimated using the exposure model. We assess the performance of the proposed methods in simulations under different clinically plausible assumptions. The simulations demonstrate that measurement errors in time-dependent covariates may induce substantial bias in MSM estimators of causal effects of time-varying exposures, and that both proposed SIMEX approaches yield practically unbiased estimates in scenarios featuring low-to-moderate degrees of error. We illustrate the proposed approach in a simple analysis of the relationship between sustained virological response and liver fibrosis progression among persons infected with hepatitis C virus, while accounting for measurement error in γ-glutamyltransferase, using data collected in the Canadian Co-infection Cohort Study from 2003 to 2014. © The Author 2016. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
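    The SIMEX idea can be illustrated in a toy linear regression (not the authors' MSM implementation): deliberately add extra measurement error at several multiples λ of the known error variance, watch the estimate attenuate, and extrapolate the trend back to λ = -1, i.e., no measurement error. A sketch, assuming the error variance is known:

```python
import random

def slope(xs, ys):
    """OLS slope: cov(x, y) / var(x)."""
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    return cov / var

def simex_slope(w, y, err_var, b=50, seed=1):
    """SIMEX sketch: add extra error at lambda = 0, 1, 2, average the
    attenuated slopes, then extrapolate the exact quadratic through the
    three points back to lambda = -1 (the no-measurement-error case)."""
    rng = random.Random(seed)
    means = []
    for lam in (0.0, 1.0, 2.0):
        sd = (lam * err_var) ** 0.5
        reps = [slope([wi + rng.gauss(0.0, sd) for wi in w], y)
                for _ in range(b)]
        means.append(sum(reps) / b)
    s0, s1, s2 = means
    return 3 * s0 - 3 * s1 + s2  # quadratic through lambda = 0, 1, 2 at -1

# Simulated data: true slope 2, covariate observed with error variance 0.5
rng = random.Random(0)
x = [rng.gauss(0.0, 1.0) for _ in range(2000)]
y = [2.0 * xi + rng.gauss(0.0, 0.5) for xi in x]
w = [xi + rng.gauss(0.0, 0.5 ** 0.5) for xi in x]
naive = slope(w, y)                    # attenuated toward 2 / 1.5 = 1.33
corrected = simex_slope(w, y, err_var=0.5)
```

A quadratic extrapolant recovers only part of the attenuation here; the paper's point is that even such partial corrections remove most of the bias at low-to-moderate error levels.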

  2. Face recognition using total margin-based adaptive fuzzy support vector machines.

    PubMed

    Liu, Yi-Hung; Chen, Yen-Ting

    2007-01-01

    This paper presents a new classifier, the total margin-based adaptive fuzzy support vector machine (TAF-SVM), that deals with several problems that may occur when support vector machines (SVMs) are applied to face recognition. The proposed TAF-SVM not only solves the overfitting problem resulting from outliers by fuzzifying the penalty, but also corrects the skew of the optimal separating hyperplane caused by highly imbalanced data sets by using a different-cost algorithm. In addition, by replacing the conventional soft-margin algorithm with a total-margin algorithm, a lower generalization error bound can be obtained. These three capabilities are embodied in the traditional SVM, and the TAF-SVM is formulated for both the linear and nonlinear cases. Using two databases, the Chung Yuan Christian University (CYCU) multiview and the Facial Recognition Technology (FERET) face databases, and using the kernel Fisher discriminant analysis (KFDA) algorithm to extract discriminating face features, experimental results show that the proposed TAF-SVM is superior to the SVM in terms of face-recognition accuracy. The results also indicate that the proposed TAF-SVM achieves smaller error variances than the SVM over a number of tests, such that better recognition stability can be obtained.

  3. Exact test-based approach for equivalence test with parameter margin.

    PubMed

    Cassie Dong, Xiaoyu; Bian, Yuanyuan; Tsong, Yi; Wang, Tianhua

    2017-01-01

    The equivalence test has a wide range of applications in pharmaceutical statistics, wherever we need to test for similarity between two groups. In recent years, it has been used to assess analytical similarity between a proposed biosimilar product and a reference product. More specifically, the mean values of the two products for a given quality attribute are compared against an equivalence margin of the form ±f × σR, a function of the reference variability. In practice, this margin is unknown and is estimated from the sample as ±f × SR. If we use this estimated margin with the classic t-test statistic in the equivalence test for the means, both the Type I and Type II error rates may inflate. To resolve this issue, we develop an exact test-based method and compare it with other proposed methods, such as the Wald test, the constrained Wald test, and the Generalized Pivotal Quantity (GPQ), in terms of Type I error rate and power. Application of these methods to data analysis is also provided in this paper. This work focuses on the development and discussion of the general statistical methodology and is not limited to the application of analytical similarity.
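    The naive plug-in approach the abstract warns about is easy to state. A minimal sketch of the two one-sided test statistics against an estimated margin ±f × SR, with f = 1.5 as an illustrative choice (the exact-based correction itself is beyond this sketch):

```python
import math
from statistics import mean, stdev

def tost_stats(test_vals, ref_vals, f=1.5):
    """Two one-sided test statistics for mean equivalence against the
    plug-in margin +/- f * S_R. This is the naive approach whose
    error-rate inflation motivates the exact-based method above."""
    d = mean(test_vals) - mean(ref_vals)
    s_r = stdev(ref_vals)
    margin = f * s_r
    se = math.sqrt(stdev(test_vals) ** 2 / len(test_vals)
                   + s_r ** 2 / len(ref_vals))
    # equivalence is concluded when t_lower > c and t_upper < -c
    return (d + margin) / se, (d - margin) / se

ref = [10.0, 11.0, 12.0, 13.0, 14.0]   # hypothetical reference lots
test = [11.0, 12.0, 13.0, 12.0, 12.0]  # hypothetical biosimilar lots
t_lower, t_upper = tost_stats(test, ref)
```

Because SR enters both the margin and the standard error, the two statistics are correlated with the estimated margin, which is the source of the Type I/II error inflation the paper addresses.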

  4. The Development of Dynamic Human Reliability Analysis Simulations for Inclusion in Risk Informed Safety Margin Characterization Frameworks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jeffrey C. Joe; Diego Mandelli; Ronald L. Boring

    2015-07-01

    The United States Department of Energy is sponsoring the Light Water Reactor Sustainability program, which has the overall objective of supporting the near-term and extended operation of commercial nuclear power plants. One key research and development (R&D) area in this program is the Risk-Informed Safety Margin Characterization pathway, which combines probabilistic risk simulation with thermohydraulic simulation codes to define and manage safety margins. The R&D efforts to date, however, have not included robust simulations of human operators or of how the reliability of human performance, or lack thereof (i.e., human errors), can affect risk margins and plant performance. This paper describes current and planned research efforts to address the absence of robust human reliability simulations and thereby increase the fidelity of simulated accident scenarios.

  5. SU-F-T-394: Impact of PTV Margins With Taking Into Account Shape Variation On IMRT Plans For Prostate Cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hirose, T; Arimura, H; Oga, S

    2016-06-15

    Purpose: The purpose of this study was to investigate the impact of planning target volume (PTV) margins that take into account clinical target volume (CTV) shape variations on treatment plans of intensity modulated radiation therapy (IMRT) for prostate cancer. Methods: The systematic and random patient setup errors in the right-left (RL), anterior-posterior (AP), and superior-inferior (SI) directions were obtained from data of 20 patients, and those for CTV shape variations were calculated from 10 patients, who were scanned weekly using cone beam computed tomography (CBCT). The setup error was defined as the difference in prostate centers between planning CT and CBCT images after bone-based registration. CTV shape variations of high-, intermediate-, and low-risk CTVs were calculated for each patient from variances of interfractional shape variations on each vertex of three-dimensional CTV point distributions, which were manually obtained from CTV contours on the CBCT images. PTV margins were calculated using the setup errors with and without CTV shape variations for each risk CTV. Six treatment plans were retrospectively made using the PTV margins with and without CTV shape variations for the three risk CTVs of 5 test patients. Furthermore, the treatment plans were applied to CBCT images to investigate the impact of shape variations on PTV margins. Results: The percentages of the population covered by the PTV, satisfying a CTV D98 of 95%, with and without the shape variations were 89.7% and 74.4% for high risk, 89.7% and 76.9% for intermediate risk, and 84.6% and 76.9% for low risk, respectively. Conclusion: PTV margins that take CTV shape variation into account provide a significant improvement in the covered percentage of the population (P < 0.05). This study suggests that CTV shape variation should be taken into consideration in determining PTV margins.

  6. Multidimensional Analysis of Nuclear Detonations

    DTIC Science & Technology

    2015-09-17

    Features on the nuclear weapons testing films because of the expanding and emissive nature of the nuclear fireball. The use of these techniques to produce...Treaty (New Start Treaty) have reduced the acceptable margins of error. Multidimensional analysis provides the modern approach to nuclear weapon ...scientific community access to the information necessary to expand upon the knowledge of nuclear weapon effects. This data set has the potential to provide

  7. The potential failure risk of the cone-beam computed tomography-based planning target volume margin definition for prostate image-guided radiotherapy based on a prospective single-institutional hybrid analysis.

    PubMed

    Hirose, Katsumi; Sato, Mariko; Hatayama, Yoshiomi; Kawaguchi, Hideo; Komai, Fumio; Sohma, Makoto; Obara, Hideki; Suzuki, Masashi; Tanaka, Mitsuki; Fujioka, Ichitaro; Ichise, Koji; Takai, Yoshihiro; Aoki, Masahiko

    2018-06-07

    The purpose of this study was to evaluate the impact of markerless on-board kilovoltage (kV) cone-beam computed tomography (CBCT)-based positioning uncertainty on determination of the planning target volume (PTV) margin by comparison with kV on-board imaging (OBI) with gold fiducial markers (FMs), and to validate a methodology for the evaluation of PTV margins for markerless kV-CBCT in prostate image-guided radiotherapy (IGRT). A total of 1177 pre- and 1177 post-treatment kV-OBI and 1177 pre- and 206 post-treatment kV-CBCT images were analyzed in 25 patients who received prostate IGRT with daily localization by implanted FMs. Intrafractional motion of the prostate was evaluated between each pre- and post-treatment image with these two different techniques. The differences in prostate deviations and intrafractional motions between matching by FM in kV-OBI (OBI-FM) and matching by soft tissue in kV-CBCT (CBCT-ST) were compared by Bland-Altman limits of agreement. Compensated PTV margins were then determined with reference to the FM-based values. Mean differences between OBI-FM and CBCT-ST in the anterior-posterior (AP), superior-inferior (SI), and left-right (LR) directions were −0.43 ± 1.45, −0.09 ± 1.65, and −0.12 ± 0.80 mm, respectively, with R² = 0.85, 0.88, and 0.83. Intrafractional motions obtained from CBCT-ST were 0.00 ± 1.46, 0.02 ± 1.49, and 0.15 ± 0.64 mm, respectively, smaller than the results from OBI-FM of 0.43 ± 1.90, 0.12 ± 1.98, and 0.26 ± 0.80 mm, with R² = 0.42, 0.33, and 0.16. Bland-Altman analysis showed a significant proportional bias. PTV margins of 1.5 mm, 1.4 mm, and 0.9 mm were calculated from the CBCT-ST values, which were also smaller than the values of 3.15 mm, 3.66 mm, and 1.60 mm from OBI-FM. The practical PTV margin for CBCT-ST was compensated with the values from OBI-FM as 4.1 mm, 4.8 mm, and 2.2 mm. PTV margins calculated from CBCT-ST alone may therefore be underestimated compared with the true PTV margins. To determine a reliable CBCT-ST-based PTV margin, at least the systematic error Σ and the random error σ for online matching need to be investigated by a supportive preliminary FM evaluation at least once.

  8. Analysis of Prostate Patient Setup and Tracking Data: Potential Intervention Strategies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Su Zhong, E-mail: zsu@floridaproton.org; Zhang Lisha; Murphy, Martin

    Purpose: To evaluate the setup, interfraction, and intrafraction organ motion error distributions and to simulate intrafraction intervention strategies for prostate radiotherapy. Methods and Materials: A total of 17 patients underwent treatment setup and were monitored using the Calypso system during radiotherapy. On average, prostate tracking measurements were performed for 8 min/fraction over 28 fractions for each patient. For both the patient couch shift data and the intrafraction organ motion data, the systematic and random errors were obtained from the patient population. The planning target volume margins were calculated using the van Herk formula. Two intervention strategies were simulated using the tracking data: the deviation threshold and the periodic correction. The related planning target volume margins, time costs, and prostate position 'fluctuation' were presented. Results: The required treatment margin for the left-right, superoinferior, and anteroposterior axes was 8.4, 10.8, and 14.7 mm for skin mark-only setup and 1.3, 2.3, and 2.8 mm using the on-line setup correction, respectively. Prostate motion correlated significantly between the superoinferior and anteroposterior directions. Of the 17 patients, 14 had prostate motion within 5 mm of the initial setup position for ≥91.6% of the total tracking time. The treatment margin decreased to 1.1, 1.8, and 2.3 mm with a 3-mm threshold correction and to 0.5, 1.0, and 1.5 mm with an every-2-min correction in the left-right, superoinferior, and anteroposterior directions, respectively. The periodic corrections significantly increased the treatment time and increased the number of instances when the setup correction was made during transient excursions. Conclusions: The residual systematic and random error due to intrafraction prostate motion is small after on-line setup correction. Threshold-based and time-based intervention strategies both reduced the planning target volume margins. The time-based strategies increased the treatment time and the intrafraction position fluctuation.
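    The threshold-based intervention can be illustrated with a toy random-walk motion trace; all parameters below are illustrative assumptions, not the study's data:

```python
import random
from statistics import pstdev

def simulate_fraction(n_steps=480, drift=0.01, jitter=0.05,
                      threshold=None, seed=0):
    """Toy prostate-motion trace (mm), one sample per second for 8 min:
    a slow drift plus random jitter. With a threshold, the position is
    re-zeroed (a couch correction) whenever the excursion exceeds it."""
    rng = random.Random(seed)
    pos, trace = 0.0, []
    for _ in range(n_steps):
        pos += drift + rng.gauss(0.0, jitter)
        if threshold is not None and abs(pos) > threshold:
            pos = 0.0  # intervention: re-align to the setup position
        trace.append(pos)
    return trace

free = simulate_fraction()                    # no intervention
corrected = simulate_fraction(threshold=3.0)  # 3 mm threshold strategy
# The corrected trace is bounded by the threshold and has a smaller
# spread, which is what shrinks the random error and hence the margin.
```

A periodic strategy would instead re-zero every fixed interval, trading extra treatment time for a tighter bound, which mirrors the trade-off reported above.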

  9. A Reassessment of the Precision of Carbonate Clumped Isotope Measurements: Implications for Calibrations and Paleoclimate Reconstructions

    NASA Astrophysics Data System (ADS)

    Fernandez, Alvaro; Müller, Inigo A.; Rodríguez-Sanz, Laura; van Dijk, Joep; Looser, Nathan; Bernasconi, Stefano M.

    2017-12-01

    Carbonate clumped isotopes offer a potentially transformational tool to interpret Earth's history, but the proxy is still limited by poor interlaboratory reproducibility. Here, we focus on the uncertainties that result from the analysis of only a few replicate measurements to understand the extent to which unconstrained errors affect calibration relationships and paleoclimate reconstructions. We find that highly precise data can be routinely obtained with multiple replicate analyses, but this is not always done in many laboratories. For instance, using published estimates of external reproducibilities we find that typical clumped isotope measurements (three replicate analyses) have margins of error at the 95% confidence level (CL) that are too large for many applications. These errors, however, can be systematically reduced with more replicate measurements. Second, using a Monte Carlo-type simulation we demonstrate that the degree of disagreement on published calibration slopes is about what we should expect considering the precision of Δ47 data, the number of samples and replicate analyses, and the temperature range covered in published calibrations. Finally, we show that the way errors are typically reported in clumped isotope data can be problematic and lead to the impression that data are more precise than warranted. We recommend that uncertainties in Δ47 data should no longer be reported as the standard error of a few replicate measurements. Instead, uncertainties should be reported as margins of error at a specified confidence level (e.g., 68% or 95% CL). These error bars are a more realistic indication of the reliability of a measurement.
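    The replicate-count effect on the margin of error follows directly from the t-based confidence interval, MOE = t(0.975, n-1) × SD / √n. A sketch with a hypothetical external reproducibility of 0.030 per mil:

```python
import math

# Two-sided 95% CL t quantiles (0.975 point) for small degrees of freedom
T975 = {2: 4.303, 4: 2.776, 9: 2.262, 19: 2.093}

def margin_of_error(external_sd, n):
    """95% CL margin of error for the mean of n replicate analyses."""
    return T975[n - 1] * external_sd / math.sqrt(n)

# With a hypothetical external reproducibility of 0.030 per mil:
moe3 = margin_of_error(0.030, 3)    # wide: roughly 0.075 per mil
moe10 = margin_of_error(0.030, 10)  # roughly 0.021 per mil, about 3.5x tighter
```

The shrinkage comes from both the √n factor and the rapidly falling t quantile at small n, which is why three replicates are so much worse than ten.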

  10. 76 FR 52937 - Ball Bearings and Parts Thereof From France, Germany, and Italy: Final Results of Antidumping...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-24

    ... programming and other errors in the margin calculations. Therefore, the final results are different from the... certain companies. We have corrected programming and other errors in the margins we included in the... Technologies GmbH; Italy--Schaeffler Italia S.r.l./WPB Water Pump Bearing GmbH & Co. KG/The Schaeffler Group...

  11. A Novel TRM Calculation Method by Probabilistic Concept

    NASA Astrophysics Data System (ADS)

    Audomvongseree, Kulyos; Yokoyama, Akihiko; Verma, Suresh Chand; Nakachi, Yoshiki

    In the new competitive environment, it becomes possible for third parties to access a transmission facility. To efficiently manage the utilization of the transmission network under this structure, a new definition of Available Transfer Capability (ATC) has been proposed. According to the North American Electric Reliability Council (NERC) definition, ATC depends on several parameters, i.e., Total Transfer Capability (TTC), Transmission Reliability Margin (TRM), and Capacity Benefit Margin (CBM). This paper focuses on the calculation of TRM, a security margin reserved for uncertainty in system conditions. A probabilistic TRM calculation method is proposed. Based on models of load forecast error and error in the transmission line limit, various cases of transmission transfer capability and their related probabilistic nature can be calculated. By applying the proposed risk-analysis concept, the appropriate required amount of TRM can be obtained. The objective of this research is to provide realistic information on the actual capability of the network, which may offer system operators an alternative basis for making appropriate decisions in the competitive market. The advantages of the proposed method are illustrated by application to the IEEJ-WEST10 model system.
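    The probabilistic TRM idea can be sketched as a Monte Carlo over the two uncertainty sources named above; the distributions and numbers are illustrative assumptions, not the paper's model:

```python
import random

def trm_by_risk(load_sd_mw, limit_sd_mw, risk=0.05, n=100_000, seed=42):
    """Monte Carlo TRM sketch: sample the load-forecast error and the
    transmission-limit error, and take the combined shortfall that is
    exceeded only with probability `risk` as the required margin."""
    rng = random.Random(seed)
    shortfalls = sorted(
        rng.gauss(0.0, load_sd_mw) + rng.gauss(0.0, limit_sd_mw)
        for _ in range(n)
    )
    return shortfalls[int((1.0 - risk) * n)]  # the (1 - risk) quantile

# Hypothetical numbers: 100 MW load-forecast sd, 150 MW limit-model sd,
# 5% acceptable risk. Analytically this is about 1.645 * sqrt(100^2 + 150^2),
# i.e. roughly 297 MW.
trm = trm_by_risk(100.0, 150.0)
```

Tightening the acceptable risk raises the required TRM and lowers the ATC offered to the market, which is exactly the operator trade-off the paper quantifies.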

  12. Residual position errors of lymph node surrogates in breast cancer adjuvant radiotherapy: Comparison of two arm fixation devices and the effect of arm position correction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kapanen, Mika; Department of Medical Physics, Tampere University Hospital; Laaksomaa, Marko, E-mail: Marko.Laaksomaa@pshp.fi

    2016-04-01

    Residual position errors of the lymph node (LN) surrogates and the humeral head (HH) were determined for 2 different arm fixation devices in radiotherapy (RT) of breast cancer: a standard wrist-hold (WH) and a house-made rod-hold (RH). The effect of arm position correction (APC) based on setup images was also investigated. A total of 113 consecutive patients with early-stage breast cancer receiving LN irradiation were retrospectively analyzed (53 and 60 using the WH and RH, respectively). Residual position errors of the LN surrogates (Th1-2 and clavicle) and the HH were investigated to compare the 2 fixation devices. The position errors and setup margins were determined before and after the APC to investigate the efficacy of the APC in the treatment situation. A threshold of 5 mm was used for the residual errors of the clavicle and Th1-2 to perform the APC, and a threshold of 7 mm was used for the HH. The setup margins were calculated with the van Herk formula. Irradiated volumes of the HH were determined from RT treatment plans. With the WH and the RH, setup margins up to 8.1 and 6.7 mm should be used for the LN surrogates, and margins up to 4.6 and 3.6 mm should be used to spare the HH, respectively, without the APC. After the APC, the margins of the LN surrogates were equal to or less than 7.5/6.0 mm with the WH/RH, but margins up to 4.2/2.9 mm were required for the HH. The APC was needed at least once with both devices for approximately 60% of the patients. With the RH, the irradiated volume of the HH was approximately 2 times larger than with the WH, without any dose constraints. Use of the RH together with the APC resulted in minimal residual position errors and setup margins for all the investigated bony landmarks. Based on these results, we prefer the house-made RH. However, more attention should be given to minimizing irradiation of the HH with the RH than with the WH.

  13. SU-E-J-88: The Study of Setup Error Measured by CBCT in Postoperative Radiotherapy for Cervical Carcinoma

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Runxiao, L; Aikun, W; Xiaomei, F

    2015-06-15

    Purpose: To compare two registration methods in CBCT-guided radiotherapy for cervical carcinoma, analyze the setup errors, and determine the margin required for extending the clinical target volume (CTV) to the planning target volume (PTV). Methods: Twenty patients with cervical carcinoma were enrolled. All patients underwent CT simulation in the supine position. The CT images were transferred to the treatment planning system; the CTV, PTV, and organs at risk (OAR) were defined and transmitted to the XVI workstation. CBCT scans were performed before radiotherapy and registered to the planning CT images using bone and grey-value registration methods. The two methods were compared to obtain the left-right (X), superior-inferior (Y), and anterior-posterior (Z) setup errors, and the margins required for CTV-to-PTV expansion were calculated. Results: Setup errors were unavoidable in postoperative cervical carcinoma irradiation. The setup errors measured by the bone method (systematic ± random) in the X (left-right), Y (superior-inferior), and Z (anterior-posterior) directions were (0.24 ± 3.62), (0.77 ± 5.05), and (0.13 ± 3.89) mm, respectively; the setup errors measured by the grey-value method were (0.31 ± 3.93), (0.85 ± 5.16), and (0.21 ± 4.12) mm, respectively. The spatial distribution of setup error was largest in the Y direction. The margins were 4 mm on the X axis, 6 mm on the Y axis, and 4 mm on the Z axis. The two registration methods gave similar results, and both are recommended. Conclusion: Both bone and grey-value registration methods offer an accurate estimate of setup error. PTV margins of 4 mm, 6 mm, and 4 mm in the X, Y, and Z directions are suggested for postoperative radiotherapy of cervical carcinoma.

  14. Improved cosmological constraints on the curvature and equation of state of dark energy

    NASA Astrophysics Data System (ADS)

    Pan, Nana; Gong, Yungui; Chen, Yun; Zhu, Zong-Hong

    2010-08-01

    We apply the Constitution compilation of 397 Type Ia supernovae, the baryon acoustic oscillation measurements (including the A parameter, the distance ratio, and the radial data), the five-year Wilkinson Microwave Anisotropy Probe data, and the Hubble parameter data to study the geometry of the Universe and the properties of dark energy using the popular Chevallier-Polarski-Linder (CPL) and Jassal-Bagla-Padmanabhan (JBP) parameterizations. We compare the simple χ² method of joint contour estimation with the Markov chain Monte Carlo method, and find that marginalized analysis is necessary for the error estimation. The posterior distributions of Ωk and wa in the CPL model are skewed, and the marginalized 1σ results are Ωm = 0.279 (+0.015, −0.008), Ωk = 0.005 (+0.006, −0.011), w0 = −1.05 (+0.23, −0.06), and wa = 0.5 (+0.3, −1.5). For the JBP model, the marginalized 1σ results are Ωm = 0.281 (+0.015, −0.01), Ωk = 0.000 (+0.007, −0.006), w0 = −0.96 (+0.25, −0.18), and wa = −0.6 (+1.9, −1.6). The equation-of-state parameter w(z) of dark energy is negative in the redshift range 0 ≤ z ≤ 2 at more than the 3σ level. The flat ΛCDM model is consistent with the current observational data at the 1σ level.

  16. Bayesian generalized least squares regression with application to log Pearson type 3 regional skew estimation

    NASA Astrophysics Data System (ADS)

    Reis, D. S.; Stedinger, J. R.; Martins, E. S.

    2005-10-01

    This paper develops a Bayesian approach to analysis of a generalized least squares (GLS) regression model for regional analyses of hydrologic data. The new approach allows computation of the posterior distributions of the parameters and the model error variance using a quasi-analytic approach. Two regional skew estimation studies illustrate the value of the Bayesian GLS approach for regional statistical analysis of a shape parameter and demonstrate that regional skew models can be relatively precise with effective record lengths in excess of 60 years. With Bayesian GLS the marginal posterior distribution of the model error variance and the corresponding mean and variance of the parameters can be computed directly, thereby providing a simple but important extension of the regional GLS regression procedures popularized by Tasker and Stedinger (1989), which is sensitive to the likely values of the model error variance when it is small relative to the sampling error in the at-site estimator.

  17. On-Line Mu Method for Robust Flutter Prediction in Expanding a Safe Flight Envelope for an Aircraft Model Under Flight Test

    NASA Technical Reports Server (NTRS)

    Lind, Richard C. (Inventor); Brenner, Martin J.

    2001-01-01

    A structured singular value (mu) analysis method computes flutter margins from the robust stability of a linear aeroelastic model with uncertainty operators (Delta). Flight data are used to update the uncertainty operators to accurately account for errors in the computed model and for the observed range of aircraft dynamics caused by time-varying aircraft parameters, nonlinearities, and flight anomalies such as test nonrepeatability. This mu-based approach computes predicted flutter margins that are worst case with respect to the modeling uncertainty, for use in determining when the aircraft is approaching a flutter condition and in defining an expanded safe flight envelope that can be accepted with more confidence than with traditional methods, which do not update the analysis with flight data. Introducing mu as a flutter margin parameter also presents several advantages over tracking damping trends as a measure of the tendency toward instability.

  18. Dopamine Reward Prediction Error Responses Reflect Marginal Utility

    PubMed Central

    Stauffer, William R.; Lak, Armin; Schultz, Wolfram

    2014-01-01

    Background: Optimal choices require an accurate neuronal representation of economic value. In economics, utility functions are mathematical representations of subjective value that can be constructed from choices under risk. Utility usually exhibits a nonlinear relationship to physical reward value that corresponds to risk attitudes and reflects the increasing or decreasing marginal utility obtained with each additional unit of reward. Accordingly, neuronal reward responses coding utility should robustly reflect this nonlinearity. Results: In two monkeys, we measured utility as a function of physical reward value from meaningful choices under risk (that adhered to first- and second-order stochastic dominance). The resulting nonlinear utility functions predicted the certainty equivalents for new gambles, indicating that the functions’ shapes were meaningful. The monkeys were risk seeking (convex utility function) for low reward and risk avoiding (concave utility function) with higher amounts. Critically, the dopamine prediction error responses at the time of reward itself reflected the nonlinear utility functions measured at the time of choices. In particular, the reward response magnitude depended on the first derivative of the utility function and thus reflected the marginal utility. Furthermore, dopamine responses recorded outside of the task reflected the marginal utility of unpredicted reward. Accordingly, these responses were sufficient to train reinforcement learning models to predict the behaviorally defined expected utility of gambles. Conclusions: These data suggest a neuronal manifestation of marginal utility in dopamine neurons and indicate a common neuronal basis for fundamental explanatory constructs in animal learning theory (prediction error) and economic decision theory (marginal utility). PMID:25283778

  19. Cone-Beam Computed Tomography–Guided Positioning of Laryngeal Cancer Patients with Large Interfraction Time Trends in Setup and Nonrigid Anatomy Variations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gangsaas, Anne, E-mail: a.gangsaas@erasmusmc.nl; Astreinidou, Eleftheria; Quint, Sandra

    2013-10-01

    Purpose: To investigate interfraction setup variations of the primary tumor, elective nodes, and vertebrae in laryngeal cancer patients and to validate protocols for cone-beam computed tomography (CBCT)-guided correction. Methods and Materials: For 30 patients, CBCT-measured displacements in fractionated treatments were used to investigate population setup errors and to simulate residual setup errors for the no action level (NAL) offline protocol, the extended NAL (eNAL) protocol, and daily CBCT acquisition with online analysis and repositioning. Results: Without corrections, 12 of 26 patients treated with radical radiation therapy would have experienced a gradual change (time trend) in primary tumor setup ≥4 mm in the craniocaudal (CC) direction during the fractionated treatment (11/12 in the caudal direction, maximum 11 mm). Due to these trends, correction of primary tumor displacements with NAL resulted in large residual CC errors (required margin 6.7 mm). With the weekly correction vector adjustments in eNAL, the trends could be largely compensated (CC margin 3.5 mm). Correlation between movements of the primary and nodal clinical target volumes (CTVs) in the CC direction was poor (r² = 0.15). Therefore, even with online setup corrections of the primary CTV, the required CC margin for the nodal CTV was as large as 6.8 mm. Also for the vertebrae, large time trends were observed for some patients. Because of poor CC correlation (r² = 0.19) between displacements of the primary CTV and the vertebrae, even with daily online repositioning of the vertebrae, the required CC margin around the primary CTV was 6.9 mm. Conclusions: Laryngeal cancer patients showed substantial interfraction setup variations, including large time trends, and poor CC correlation between primary tumor displacements and motion of the nodes and vertebrae (internal tumor motion). 
These trends and nonrigid anatomy variations have to be considered in the choice of setup verification protocol and planning target volume margins. eNAL could largely compensate time trends with minor prolongation of fraction time.
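The NAL and eNAL correction strategies compared above can be sketched as follows. This is an illustrative simplification: the published eNAL fits a linear trend to weekly check measurements, whereas this variant re-estimates the correction from a running weekly mean; function names and parameters are assumptions.

```python
import numpy as np

def residuals_nal(errors, n_first=3):
    """No-action-level (NAL): one correction, the mean of the first
    n_first measured errors, is applied to every later fraction."""
    errors = np.asarray(errors, dtype=float)
    out = errors.copy()
    out[n_first:] -= errors[:n_first].mean()
    return out

def residuals_enal(errors, n_first=3, week=5):
    """Extended NAL (eNAL), simplified: the correction is re-estimated each
    week from the mean of the previous week's measurements, so gradual time
    trends are progressively compensated."""
    errors = np.asarray(errors, dtype=float)
    corr, out = errors[:n_first].mean(), []
    for i, e in enumerate(errors):
        if i >= n_first and i % week == 0:
            corr = errors[max(0, i - week):i].mean()  # weekly update
        out.append(e - corr if i >= n_first else e)
    return np.array(out)
```

Feeding a pure drift (e.g. a steady caudal shift per fraction) into both functions shows the effect reported in the abstract: late-course NAL residuals grow with the trend, while eNAL keeps them bounded.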

  20. An Image-Guided Study of Setup Reproducibility of Postmastectomy Breast Cancer Patients Treated With Inverse-Planned Intensity Modulated Radiation Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Feng, Christine H.; Gerry, Emily; Chmura, Steven J.

    2015-01-01

    Purpose: To calculate planning target volume (PTV) margins for chest wall and regional nodal targets using daily orthogonal kilovolt (kV) imaging and to study residual setup error after kV alignment using volumetric cone-beam computed tomography (CBCT). Methods and Materials: Twenty-one postmastectomy patients were treated with intensity modulated radiation therapy with 7-mm PTV margins. Population-based PTV margins were calculated from translational shifts after daily kV positioning and/or weekly CBCT data for each of 8 patients, whose surgical clips were used as surrogates for target volumes. Errors from kV and CBCT data were mathematically combined to generate PTV margins for 3 simulated alignment workflows: (1) skin marks alone; (2) weekly kV imaging; and (3) daily kV imaging. Results: The kV data from 613 treatment fractions indicated that a 7-mm uniform margin would account for 95% of daily shifts if patients were positioned using only skin marks. Total setup errors incorporating both kV and CBCT data were larger than those from kV alone, yielding PTV expansions of 7 mm anterior–posterior, 9 mm left–right, and 9 mm superior–inferior. Required PTV margins after weekly kV imaging were similar in magnitude to those for alignment to skin marks, but rotational adjustments of patients were required in 32% ± 17% of treatments. These rotations would have remained uncorrected without the use of daily kV imaging. Despite the use of daily kV imaging, CBCT data taken at the treatment position indicate that an anisotropic PTV margin of 6 mm anterior–posterior, 4 mm left–right, and 8 mm superior–inferior must be retained to account for residual errors. Conclusions: Cone-beam CT provides additional information on 3-dimensional reproducibility of treatment setup for chest wall targets. Three-dimensional data indicate that a uniform 7-mm PTV margin is insufficient in the absence of daily IGRT. 
Interfraction movement is greater than suggested by 2-dimensional imaging; thus, a margin of at least 4 to 8 mm must be retained despite the use of daily IGRT.

  1. Sample sizes needed for specified margins of relative error in the estimates of the repeatability and reproducibility standard deviations.

    PubMed

    McClure, Foster D; Lee, Jung K

    2005-01-01

    Sample size formulas are developed to estimate the repeatability and reproducibility standard deviations (S_r and S_R) such that the actual errors in S_r and S_R relative to their respective true values, σ_r and σ_R, are at predefined levels. The statistical consequences associated with the AOAC INTERNATIONAL required sample size for validating an analytical method are discussed. In addition, formulas to estimate the uncertainties of S_r and S_R were derived and are provided as supporting documentation, together with a formula for the number of replicates required for a specified margin of relative error in the estimate of the repeatability standard deviation.
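The chi-square reasoning behind such sample-size requirements can be sketched generically. The code below finds the smallest degrees of freedom for which the sample standard deviation stays within a relative margin d of the true σ with confidence γ; it is a generic sketch of the principle, not the paper's specific formulas for S_r and S_R.

```python
from scipy.stats import chi2

def reps_for_relative_error(d, gamma=0.95, max_nu=100_000):
    """Smallest degrees of freedom nu such that a sample SD S, with
    nu*S**2/sigma**2 ~ chi-square(nu), satisfies
    P(|S/sigma - 1| <= d) >= gamma."""
    for nu in range(1, max_nu):
        # P(nu*(1-d)^2 <= chi2_nu <= nu*(1+d)^2)
        cover = chi2.cdf(nu * (1 + d) ** 2, nu) - chi2.cdf(nu * (1 - d) ** 2, nu)
        if cover >= gamma:
            return nu
    return None
```

Halving the permitted relative error roughly quadruples the required degrees of freedom, which is the practical force behind the sample-size discussion above.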

  2. A statistical model for analyzing the rotational error of single isocenter for multiple targets technique.

    PubMed

    Chang, Jenghwa

    2017-06-01

    To develop a statistical model that incorporates the treatment uncertainty from the rotational error of the single-isocenter-for-multiple-targets technique, and to calculate the extra PTV (planning target volume) margin required to compensate for this error. The random vector for modeling the setup (S) error in the three-dimensional (3D) patient coordinate system was assumed to follow a 3D normal distribution with a zero mean and standard deviations σ_x, σ_y, σ_z. It was further assumed that the rotation of the clinical target volume (CTV) about the isocenter happens randomly and follows a 3D independent normal distribution with a zero mean and a uniform standard deviation of σ_δ. This rotation leads to a rotational random error (R), which also has a 3D independent normal distribution with a zero mean and a uniform standard deviation σ_R equal to the product of σ_δ·(π/180) and d_I↔T, the distance between the isocenter and the CTV. The two random vectors S and R were summed, normalized, and transformed to spherical coordinates to derive the chi distribution with three degrees of freedom for the radial coordinate of S + R. The PTV margin was determined using the critical value of this distribution at a 0.05 significance level, so that 95% of the time the treatment target would be covered by the prescription dose. The additional PTV margin required to compensate for the rotational error was calculated as a function of σ_R and d_I↔T. The effect of the rotational error is more pronounced for treatments that require high accuracy/precision, such as stereotactic radiosurgery (SRS) or stereotactic body radiotherapy (SBRT). With a uniform 2-mm PTV margin (or σ_x = σ_y = σ_z = 0.715 mm), a σ_R = 0.328 mm will decrease the CTV coverage probability from 95.0% to 90.9%; equivalently, an additional 0.2-mm PTV margin is needed to prevent this loss of coverage. 
If we choose 0.2 mm as the threshold, any σ_R > 0.328 mm will lead to an extra PTV margin that cannot be ignored, and the maximal σ_δ that can be ignored is 0.45° (or 0.0079 rad) for d_I↔T = 50 mm or 0.23° (or 0.004 rad) for d_I↔T = 100 mm. The rotational error cannot be ignored for high-accuracy/-precision treatments like SRS/SBRT, particularly when the distance between the isocenter and the target is large. © 2017 American Association of Physicists in Medicine.
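The margin computation described above can be sketched directly: combine the per-axis setup and rotational SDs in quadrature and scale by the 95% quantile of a chi distribution with three degrees of freedom. Function names are assumptions; the numbers reproduce the abstract's 2-mm example.

```python
import numpy as np
from scipy.stats import chi

def rotational_sd(sigma_delta_deg, d_mm):
    """sigma_R = (sigma_delta * pi/180) * d, with sigma_delta in degrees
    and d the isocenter-to-target distance in mm."""
    return np.deg2rad(sigma_delta_deg) * d_mm

def ptv_margin(sigma_setup, sigma_rot=0.0, coverage=0.95):
    """Margin = per-axis effective SD (setup and rotation combined in
    quadrature) times the 95% quantile of chi(3), per the abstract's model."""
    sigma_eff = np.sqrt(sigma_setup**2 + sigma_rot**2)
    return sigma_eff * chi.ppf(coverage, df=3)

# sigma = 0.715 mm per axis gives ~2.0 mm; adding sigma_R = 0.328 mm
# raises the margin to ~2.2 mm, i.e. the extra 0.2 mm quoted above.
```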

  3. Performance analysis of an integrated GPS/inertial attitude determination system. M.S. Thesis - MIT

    NASA Technical Reports Server (NTRS)

    Sullivan, Wendy I.

    1994-01-01

    The performance of an integrated GPS/inertial attitude determination system is investigated using a linear covariance analysis. The principles of GPS interferometry are reviewed, and the major error sources of both interferometers and gyroscopes are discussed and modeled. A new figure of merit, attitude dilution of precision (ADOP), is defined for two possible GPS attitude determination methods, namely single difference and double difference interferometry. Based on this figure of merit, a satellite selection scheme is proposed. The performance of the integrated GPS/inertial attitude determination system is determined using a linear covariance analysis. Based on this analysis, it is concluded that the baseline errors (i.e., knowledge of the GPS interferometer baseline relative to the vehicle coordinate system) are the limiting factor in system performance. By reducing baseline errors, it should be possible to use lower quality gyroscopes without significantly reducing performance. For the cases considered, single difference interferometry is only marginally better than double difference interferometry. Finally, the performance of the system is found to be relatively insensitive to the satellite selection technique.

  4. Losses from effluent taxes and quotas under uncertainty

    USGS Publications Warehouse

    Watson, W.D.; Ridker, R.G.

    1984-01-01

    Recent theoretical papers by Adar and Griffin (J. Environ. Econ. Manag. 3, 178–188 (1976)), Fishelson (J. Environ. Econ. Manag. 3, 189–197 (1976)), and Weitzman (Rev. Econ. Studies 41, 477–491 (1974)) show that different expected social losses arise from using effluent taxes and quotas as alternative control instruments when marginal control costs are uncertain. Key assumptions in these analyses are linear marginal cost and benefit functions and an additive error for the marginal cost function (to reflect uncertainty). In this paper, empirically derived nonlinear functions and more realistic multiplicative error terms are used to estimate expected control and damage costs and to identify (empirically) the mix of control instruments that minimizes expected losses. © 1984.
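The tax-versus-quota loss comparison can be illustrated with a toy Monte Carlo. The functional forms and parameters below are illustrative assumptions (linear marginal functions with a multiplicative cost error), not the paper's empirically derived nonlinear functions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Abatement q with marginal benefit MB(q) = b0 - b1*q and uncertain marginal
# cost MC(q) = theta*(c0 + c1*q), theta a multiplicative error, E[theta] ~ 1.
b0, b1 = 10.0, 1.0
c0, c1 = 2.0, 2.0
theta = rng.lognormal(mean=-0.02, sigma=0.2, size=100_000)

q_quota = (b0 - c0) / (b1 + c1)   # quota set where expected MC = MB
tax = b0 - b1 * q_quota           # tax set at the same expected optimum

q_tax = (tax / theta - c0) / c1                # firms equate realized MC to tax
q_opt = (b0 - theta * c0) / (b1 + theta * c1)  # ex-post optimal abatement

def net_cost(q, th):
    """Total cost minus total benefit (what the regulator would minimize)."""
    return th * (c0 * q + c1 * q**2 / 2) - (b0 * q - b1 * q**2 / 2)

loss_tax = net_cost(q_tax, theta) - net_cost(q_opt, theta)      # >= 0 by convexity
loss_quota = net_cost(q_quota, theta) - net_cost(q_opt, theta)  # >= 0 by convexity
```

Comparing the two mean losses under different slope and error assumptions reproduces the qualitative point of this literature: which instrument wins depends on the curvature of costs and benefits and on how the error enters.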

  5. Analysis of Geometric Shifts and Proper Setup-Margin in Prostate Cancer Patients Treated With Pelvic Intensity-Modulated Radiotherapy Using Endorectal Ballooning and Daily Enema for Prostate Immobilization.

    PubMed

    Jeong, Songmi; Lee, Jong Hoon; Chung, Mi Joo; Lee, Sea Won; Lee, Jeong Won; Kang, Dae Gyu; Kim, Sung Hwan

    2016-01-01

    We evaluated geometric shifts of daily setup to assess the appropriateness of treatment and to determine proper margins for the planning target volume (PTV) in prostate cancer patients. We analyzed 1200 sets of pretreatment megavoltage-CT scans that were acquired from 40 patients with intermediate- to high-risk prostate cancer. They received whole pelvic intensity-modulated radiotherapy (IMRT) and underwent daily endorectal ballooning and enema to limit intrapelvic organ movement. The mean and standard deviation (SD) of daily translational shifts in the right-to-left (X), anterior-to-posterior (Y), and superior-to-inferior (Z) directions were evaluated for systematic and random error. The mean ± SD of systematic error (Σ) in X, Y, Z, and roll was 2.21 ± 3.42 mm, -0.67 ± 2.27 mm, 1.05 ± 2.87 mm, and -0.43 ± 0.89°, respectively. The mean ± SD of random error (δ) was 1.95 ± 1.60 mm in X, 1.02 ± 0.50 mm in Y, 1.01 ± 0.48 mm in Z, and 0.37 ± 0.15° in roll. The calculated proper PTV margins that cover >95% of the target on average were 8.20 (X), 5.25 (Y), and 6.45 (Z) mm. Mean systematic geometrical shifts of IMRT were not statistically different, in all translational and three-dimensional shifts, from early to late weeks. There was no grade 3 or higher gastrointestinal or genitourinary toxicity. The whole pelvic IMRT technique is a feasible and effective modality that limits intrapelvic organ motion and reduces setup uncertainties. Proper margins for the PTV can be determined using these geometric shift data.

  6. Analysis of Geometric Shifts and Proper Setup-Margin in Prostate Cancer Patients Treated With Pelvic Intensity-Modulated Radiotherapy Using Endorectal Ballooning and Daily Enema for Prostate Immobilization

    PubMed Central

    Jeong, Songmi; Lee, Jong Hoon; Chung, Mi Joo; Lee, Sea Won; Lee, Jeong Won; Kang, Dae Gyu; Kim, Sung Hwan

    2016-01-01

    We evaluated geometric shifts of daily setup to assess the appropriateness of treatment and to determine proper margins for the planning target volume (PTV) in prostate cancer patients. We analyzed 1200 sets of pretreatment megavoltage-CT scans that were acquired from 40 patients with intermediate- to high-risk prostate cancer. They received whole pelvic intensity-modulated radiotherapy (IMRT) and underwent daily endorectal ballooning and enema to limit intrapelvic organ movement. The mean and standard deviation (SD) of daily translational shifts in the right-to-left (X), anterior-to-posterior (Y), and superior-to-inferior (Z) directions were evaluated for systematic and random error. The mean ± SD of systematic error (Σ) in X, Y, Z, and roll was 2.21 ± 3.42 mm, −0.67 ± 2.27 mm, 1.05 ± 2.87 mm, and −0.43 ± 0.89°, respectively. The mean ± SD of random error (δ) was 1.95 ± 1.60 mm in X, 1.02 ± 0.50 mm in Y, 1.01 ± 0.48 mm in Z, and 0.37 ± 0.15° in roll. The calculated proper PTV margins that cover >95% of the target on average were 8.20 (X), 5.25 (Y), and 6.45 (Z) mm. Mean systematic geometrical shifts of IMRT were not statistically different, in all translational and three-dimensional shifts, from early to late weeks. There was no grade 3 or higher gastrointestinal or genitourinary toxicity. The whole pelvic IMRT technique is a feasible and effective modality that limits intrapelvic organ motion and reduces setup uncertainties. Proper margins for the PTV can be determined using these geometric shift data. PMID:26765418

  7. Limitations of the planning organ at risk volume (PRV) concept.

    PubMed

    Stroom, Joep C; Heijmen, Ben J M

    2006-09-01

    Previously, we determined a planning target volume (PTV) margin recipe for geometrical errors in radiotherapy equal to M_T = 2Σ + 0.7σ, with Σ and σ the standard deviations describing systematic and random errors, respectively. In this paper, we investigated margins for organs at risk (OAR), yielding the so-called planning organ at risk volume (PRV). For critical organs with a maximum dose (D_max) constraint, we calculated margins such that D_max in the PRV is equal to the motion-averaged D_max in the (moving) clinical target volume (CTV). We studied margins for the spinal cord in 10 head-and-neck cases and 10 lung cases, each with two different clinical plans. For critical organs with a dose-volume constraint, we also investigated whether a margin recipe was feasible. For the 20 spinal cords considered, the average margin recipe found was M_R = 1.6Σ + 0.2σ, with variations for systematic and random errors of 1.2Σ to 1.8Σ and −0.2σ to 0.6σ, respectively. The variations were due to differences in shape and position of the dose distributions with respect to the cords. The recipe also depended significantly on the volume definition of D_max. For critical organs with a dose-volume constraint, the PRV concept appears even less useful, because a margin around, e.g., the rectum changes the volume in such a manner that dose-volume constraints stop making sense. The concept of the PRV for planning of radiotherapy is of limited use. Therefore, alternative ways should be developed to include geometric uncertainties of OARs in radiotherapy planning.
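The two recipes contrasted in this abstract can be written down directly. A trivial sketch; note the paper's own caveat that the OAR coefficients varied by case, so the PRV recipe is a study average, not a universal rule.

```python
def ptv_margin(Sigma, sigma):
    """Target (CTV-to-PTV) recipe from the authors' earlier work:
    M_T = 2*Sigma + 0.7*sigma."""
    return 2.0 * Sigma + 0.7 * sigma

def prv_margin(Sigma, sigma):
    """Average spinal-cord (OAR-to-PRV) recipe found in this study:
    M_R = 1.6*Sigma + 0.2*sigma.  Reported case-dependent spreads were
    1.2-1.8 on Sigma and -0.2-0.6 on sigma."""
    return 1.6 * Sigma + 0.2 * sigma
```

For the same Σ and σ, the PRV margin is smaller than the PTV margin, chiefly because random error blurs the dose to an OAR rather than displacing a coverage requirement.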

  8. Resolution requirements for aero-optical simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mani, Ali; Wang Meng; Moin, Parviz

    2008-11-10

    Analytical criteria are developed to estimate the error of aero-optical computations due to inadequate spatial resolution of refractive index fields in high Reynolds number flow simulations. The unresolved turbulence structures are assumed to be locally isotropic and at low turbulent Mach number. Based on the Kolmogorov spectrum for the unresolved structures, the computational error of the optical path length is estimated and linked to the resulting error in the computed far-field optical irradiance. It is shown that in the high Reynolds number limit, for a given geometry and Mach number, the spatial resolution required to capture aero-optics within a pre-specified error margin does not scale with Reynolds number. In typical aero-optical applications this resolution requirement is much lower than the resolution required for direct numerical simulation, and therefore, a typical large-eddy simulation can capture the aero-optical effects. The analysis is extended to complex turbulent flow simulations in which non-uniform grid spacings are used to better resolve the local turbulence structures. As a demonstration, the analysis is used to estimate the error of aero-optical computation for an optical beam passing through the turbulent wake of flow over a cylinder.

  9. A Statistical Method for Synthesizing Mediation Analyses Using the Product of Coefficient Approach Across Multiple Trials

    PubMed Central

    Huang, Shi; MacKinnon, David P.; Perrino, Tatiana; Gallo, Carlos; Cruden, Gracelyn; Brown, C Hendricks

    2016-01-01

    Mediation analysis often requires larger sample sizes than main effect analysis to achieve the same statistical power. Combining results across similar trials may be the only practical option for increasing statistical power for mediation analysis in some situations. In this paper, we propose a method to estimate: 1) marginal means for mediation path a, the relation of the independent variable to the mediator; 2) marginal means for path b, the relation of the mediator to the outcome, across multiple trials; and 3) the between-trial level variance-covariance matrix based on a bivariate normal distribution. We present the statistical theory and an R computer program to combine regression coefficients from multiple trials to estimate a combined mediated effect and confidence interval under a random effects model. Values of coefficients a and b, along with their standard errors from each trial are the input for the method. This marginal likelihood based approach with Monte Carlo confidence intervals provides more accurate inference than the standard meta-analytic approach. We discuss computational issues, apply the method to two real-data examples and make recommendations for the use of the method in different settings. PMID:28239330
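A sketch of the product-of-coefficients synthesis with a Monte Carlo confidence interval. For simplicity this uses inverse-variance (fixed-effect) weights rather than the paper's random-effects model with a between-trial covariance matrix; the function name is an assumption.

```python
import numpy as np

def combined_mediated_effect(a, se_a, b, se_b, n_draws=200_000, seed=0):
    """Combine per-trial coefficients a (X -> M) and b (M -> Y) into a pooled
    mediated effect a*b with a Monte Carlo percentile confidence interval.
    Inputs are sequences of trial estimates and their standard errors."""
    a, se_a, b, se_b = (np.asarray(v, dtype=float) for v in (a, se_a, b, se_b))
    wa, wb = 1.0 / se_a**2, 1.0 / se_b**2          # inverse-variance weights
    a_bar = (wa * a).sum() / wa.sum()
    b_bar = (wb * b).sum() / wb.sum()
    se_a_bar, se_b_bar = wa.sum() ** -0.5, wb.sum() ** -0.5
    rng = np.random.default_rng(seed)
    prod = rng.normal(a_bar, se_a_bar, n_draws) * rng.normal(b_bar, se_b_bar, n_draws)
    lo, hi = np.percentile(prod, [2.5, 97.5])
    return a_bar * b_bar, (lo, hi)
```

The Monte Carlo interval respects the skewed sampling distribution of the product a*b, which is why the paper prefers it over a normal-theory standard error for the mediated effect.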

  10. Dopamine reward prediction error responses reflect marginal utility.

    PubMed

    Stauffer, William R; Lak, Armin; Schultz, Wolfram

    2014-11-03

    Optimal choices require an accurate neuronal representation of economic value. In economics, utility functions are mathematical representations of subjective value that can be constructed from choices under risk. Utility usually exhibits a nonlinear relationship to physical reward value that corresponds to risk attitudes and reflects the increasing or decreasing marginal utility obtained with each additional unit of reward. Accordingly, neuronal reward responses coding utility should robustly reflect this nonlinearity. In two monkeys, we measured utility as a function of physical reward value from meaningful choices under risk (that adhered to first- and second-order stochastic dominance). The resulting nonlinear utility functions predicted the certainty equivalents for new gambles, indicating that the functions' shapes were meaningful. The monkeys were risk seeking (convex utility function) for low reward and risk avoiding (concave utility function) with higher amounts. Critically, the dopamine prediction error responses at the time of reward itself reflected the nonlinear utility functions measured at the time of choices. In particular, the reward response magnitude depended on the first derivative of the utility function and thus reflected the marginal utility. Furthermore, dopamine responses recorded outside of the task reflected the marginal utility of unpredicted reward. Accordingly, these responses were sufficient to train reinforcement learning models to predict the behaviorally defined expected utility of gambles. These data suggest a neuronal manifestation of marginal utility in dopamine neurons and indicate a common neuronal basis for fundamental explanatory constructs in animal learning theory (prediction error) and economic decision theory (marginal utility). Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.

  11. Generalized Fisher matrices

    NASA Astrophysics Data System (ADS)

    Heavens, A. F.; Seikel, M.; Nord, B. D.; Aich, M.; Bouffanais, Y.; Bassett, B. A.; Hobson, M. P.

    2014-12-01

    The Fisher Information Matrix formalism (Fisher 1935) is extended to cases where the data are divided into two parts (X, Y), where the expectation value of Y depends on X according to some theoretical model, and X and Y both have errors with arbitrary covariance. In the simplest case, (X, Y) represent data pairs of abscissa and ordinate, in which case the analysis deals with the case of data pairs with errors in both coordinates, but X can be any measured quantities on which Y depends. The analysis applies for arbitrary covariance, provided all errors are Gaussian, and provided the errors in X are small, both in comparison with the scale over which the expected signal Y changes, and with the width of the prior distribution. This generalizes the Fisher Matrix approach, which normally only considers errors in the `ordinate' Y. In this work, we include errors in X by marginalizing over latent variables, effectively employing a Bayesian hierarchical model, and deriving the Fisher Matrix for this more general case. The methods here also extend to likelihood surfaces which are not Gaussian in the parameter space, and so techniques such as DALI (Derivative Approximation for Likelihoods) can be generalized straightforwardly to include arbitrary Gaussian data error covariances. For simple mock data and theoretical models, we compare to Markov Chain Monte Carlo experiments, illustrating the method with cosmological supernova data. We also include the new method in the FISHER4CAST software.
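The baseline that this work generalizes is the ordinary Fisher matrix for Gaussian data with errors only in the ordinate Y. A minimal sketch of that baseline (the paper's marginalization over errors in X is not reproduced here):

```python
import numpy as np

def fisher_matrix(jacobian, cov):
    """Standard Fisher matrix for Gaussian data with a parameter-independent
    covariance C: F = J^T C^{-1} J, where J[i, a] = d mu_i / d theta_a."""
    return jacobian.T @ np.linalg.inv(cov) @ jacobian

# Example: straight line y = m*x + c sampled at x = 0, 1, 2 with unit errors.
J = np.array([[0.0, 1.0], [1.0, 1.0], [2.0, 1.0]])
F = fisher_matrix(J, np.eye(3))            # [[5, 3], [3, 3]]
sigma_m = np.sqrt(np.linalg.inv(F)[0, 0])  # marginalized error on the slope
```

The generalized formalism replaces the fixed abscissae with latent variables carrying their own Gaussian covariance, which inflates F^{-1} relative to this errors-in-Y-only case.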

  12. MIXREG: a computer program for mixed-effects regression analysis with autocorrelated errors.

    PubMed

    Hedeker, D; Gibbons, R D

    1996-05-01

    MIXREG is a program that provides estimates for a mixed-effects regression model (MRM) for normally-distributed response data including autocorrelated errors. This model can be used for analysis of unbalanced longitudinal data, where individuals may be measured at a different number of timepoints, or even at different timepoints. Autocorrelated errors of a general form or following an AR(1), MA(1), or ARMA(1,1) form are allowable. This model can also be used for analysis of clustered data, where the mixed-effects model assumes data within clusters are dependent. The degree of dependency is estimated jointly with estimates of the usual model parameters, thus adjusting for clustering. MIXREG uses maximum marginal likelihood estimation, utilizing both the EM algorithm and a Fisher-scoring solution. For the scoring solution, the covariance matrix of the random effects is expressed in its Gaussian decomposition, and the diagonal matrix reparameterized using the exponential transformation. Estimation of the individual random effects is accomplished using an empirical Bayes approach. Examples illustrating usage and features of MIXREG are provided.
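The AR(1) error structure that MIXREG allows can be sketched as a covariance builder. This is a generic illustration of the stationary AR(1) form, not MIXREG's internal implementation.

```python
import numpy as np

def ar1_error_cov(n, sigma2, rho):
    """Within-subject error covariance for the AR(1) option in a
    mixed-effects regression model:
    Cov(e_i, e_j) = sigma2 * rho**|i - j|."""
    idx = np.arange(n)
    return sigma2 * rho ** np.abs(idx[:, None] - idx[None, :])
```

In the marginal model, this matrix is added to the random-effects contribution Z G Z' to give the total covariance of a subject's repeated measurements.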

  13. Distribution of the two-sample t-test statistic following blinded sample size re-estimation.

    PubMed

    Lu, Kaifeng

    2016-05-01

    We consider blinded sample size re-estimation based on the simple one-sample variance estimator at an interim analysis. We characterize the exact distribution of the standard two-sample t-test statistic at the final analysis. We describe a simulation algorithm for evaluating the probability of rejecting the null hypothesis at a given treatment effect. We compare the blinded sample size re-estimation method with two unblinded methods with respect to the empirical type I error, the empirical power, and the empirical distribution of the standard deviation estimator and final sample size. We characterize the type I error inflation across the range of standardized non-inferiority margins for non-inferiority trials, and derive the adjusted significance level to ensure type I error control for a given sample size of the internal pilot study. We show that the adjusted significance level increases as the sample size of the internal pilot study increases. Copyright © 2016 John Wiley & Sons, Ltd.
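A minimal sketch of blinded re-estimation with the one-sample variance estimator, plugged into the standard normal-approximation sample-size formula. This is a simplification: the exact t-statistic distribution and the adjusted significance levels derived in the paper are not reproduced.

```python
import numpy as np
from scipy.stats import norm

def reestimated_n_per_arm(pooled_blinded, delta, alpha=0.025, power=0.9):
    """Re-estimated sample size per arm from blinded interim data.
    The lumped (one-sample) variance of the pooled data, with treatment
    labels unknown, replaces sigma**2 in the usual two-sample formula.
    Under the alternative the lumped variance overstates sigma**2 by
    delta**2/4; refinements in the literature adjust for this."""
    s2 = np.var(pooled_blinded, ddof=1)            # blinded variance estimate
    z = norm.ppf(1 - alpha) + norm.ppf(power)
    return int(np.ceil(2.0 * s2 * z**2 / delta**2))
```

Because the re-estimation uses only pooled data, the treatment blind is preserved, which is the property whose type I error consequences the paper quantifies.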

  14. SU-E-J-103: Setup Errors Analysis by Cone-Beam CT (CBCT)-Based Imaged-Guided Intensity Modulated Radiotherapy for Esophageal Cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, H; Wang, W; Hu, W

    2014-06-01

    Purpose: To quantify setup errors using pretreatment kilovolt cone-beam computed tomography (kV-CBCT) scans for middle or distal esophageal carcinoma patients. Methods: Fifty-two consecutive middle or distal esophageal carcinoma patients who underwent IMRT were included in this study. A planning CT scan using a big-bore CT simulator was performed in the treatment position and was used as the reference scan for image registration with CBCT. CBCT scans (On-Board Imaging v1.5 system, Varian Medical Systems) were acquired daily during the first treatment week. A total of 260 CBCT scans were assessed, with a registration clip box defined around the PTV-thorax in the reference scan, based on bony anatomy, using Offline Review software v10.0 (Varian Medical Systems). The anterior-posterior (AP), left-right (LR), and superior-inferior (SI) corrections were recorded, and the systematic and random errors were calculated. The CTV-to-PTV margin in each direction was based on the van Herk formula (2.5Σ + 0.7σ). Results: The SD of the systematic error (Σ) was 2.0 mm, 2.3 mm, and 3.8 mm in the AP, LR, and SI directions, respectively. The average random error (σ) was 1.6 mm, 2.4 mm, and 4.1 mm in the AP, LR, and SI directions, respectively. The resulting CTV-to-PTV safety margin was 6.1 mm, 7.5 mm, and 12.3 mm in the AP, LR, and SI directions. Conclusion: Our data recommend margins of 6 mm, 8 mm, and 12 mm for esophageal carcinoma patient setup in the AP, LR, and SI directions, respectively.
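The reported margins can be reproduced from the quoted SDs with the van Herk formula; small differences from the reported 7.5 and 12.3 mm values presumably reflect rounding of the inputs.

```python
import numpy as np

def van_herk_margin(Sigma, sigma):
    """CTV-to-PTV margin per the van Herk recipe used in the abstract:
    M = 2.5*Sigma + 0.7*sigma (all values in mm)."""
    return 2.5 * np.asarray(Sigma) + 0.7 * np.asarray(sigma)

# SDs reported for the AP / LR / SI directions (mm):
Sigma = np.array([2.0, 2.3, 3.8])   # systematic error SD
sigma = np.array([1.6, 2.4, 4.1])   # random error SD
margins = van_herk_margin(Sigma, sigma)  # 6.12, 7.43, 12.37 mm
```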

  15. Clinical application of a light-pen computer system for quantitative angiography

    NASA Technical Reports Server (NTRS)

    Alderman, E. L.

    1975-01-01

    The paper describes an angiographic analysis system which uses a video disk for recording and playback, a light-pen for data input, minicomputer processing, and an electrostatic printer/plotter for hardcopy output. The method is applied to quantitative analysis of ventricular volumes, sequential ventriculography for assessment of physiologic and pharmacologic interventions, analysis of instantaneous time sequence of ventricular systolic and diastolic events, and quantitation of segmental abnormalities. The system is shown to provide the capability for computation of ventricular volumes and other measurements from operator-defined margins by greatly reducing the tedium and errors associated with manual planimetry.

  16. Quality of Impressions and Work Authorizations Submitted by Dental Students Supervised by Prosthodontists and General Dentists.

    PubMed

    Imbery, Terence A; Diaz, Nicholas; Greenfield, Kristy; Janus, Charles; Best, Al M

    2016-10-01

    Preclinical fixed prosthodontics is taught by Department of Prosthodontics faculty members at Virginia Commonwealth University School of Dentistry; however, 86% of all clinical cases in academic year 2012 were staffed by faculty members from the Department of General Practice. The aims of this retrospective study were to quantify the quality of impressions, accuracy of laboratory work authorizations, and most common errors and to determine if there were differences between the rate of errors in cases supervised by the prosthodontists and the general dentists. A total of 346 Fixed Prosthodontic Laboratory Tracking Sheets for the 2012 academic year were reviewed. The results showed that, overall, 73% of submitted impressions were acceptable at initial evaluation, 16% had to be poured first and re-evaluated for quality prior to pindexing, 7% had multiple impressions submitted for transfer dies, and 4% were rejected for poor quality. There were higher acceptance rates for impressions and work authorizations for cases staffed by prosthodontists than by general dentists, but the differences were not statistically significant (p=0.0584 and p=0.0666, respectively). Regarding the work authorizations, 43% overall did not provide sufficient information or had technical errors that delayed prosthesis fabrication. The most common errors were incorrect mountings, absence of solid casts, inadequate description of margins for porcelain fused to metal crowns, inaccurate die trimming, and margin marking. The percentages of errors in cases supervised by general dentists and prosthodontists were similar for 17 of the 18 types of errors identified; only for margin description was the percentage of errors statistically significantly higher for general dentist-supervised than prosthodontist-supervised cases. 
These results highlighted the ongoing need for faculty development and calibration to ensure students receive the highest quality education from all faculty members teaching fixed prosthodontics.

  17. Image-Guided Radiotherapy for Left-Sided Breast Cancer Patients: Geometrical Uncertainty of the Heart

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Topolnjak, Rajko; Borst, Gerben R.; Nijkamp, Jasper

Purpose: To quantify the geometrical uncertainties of the heart during radiotherapy treatment of left-sided breast cancer patients and to determine and validate planning organ at risk volume (PRV) margins. Methods and Materials: Twenty-two patients treated in supine position in 28 fractions with regularly acquired cone-beam computed tomography (CBCT) scans for offline setup correction were included. Retrospectively, the CBCT scans were reconstructed into 10-phase respiration-correlated four-dimensional scans. The heart was registered in each breathing phase to the planning CT scan to establish the respiratory heart motion during the CBCT scan (σ_resp). The average of the respiratory motion was calculated as the heart displacement error for a fraction. Subsequently, the systematic (Σ), random (σ), and total random (σ_tot = √(σ² + σ_resp²)) errors of the heart position were calculated. Based on these errors, a PRV margin for the heart was calculated to ensure that the maximum heart dose (D_max) is not underestimated in at least 90% of cases (M_heart = 1.3Σ - 0.5σ_tot). All analyses were performed in the left-right (LR), craniocaudal (CC), and anteroposterior (AP) directions with respect to both online and offline bony anatomy setup corrections. The PRV margin was validated by accumulating the dose to the heart based on the heart registrations and comparing the planned PRV D_max to the accumulated heart D_max. Results: For online setup correction, the cardiac geometrical uncertainties and PRV margins were Σ = 2.2/3.2/2.1 mm, σ = 2.1/2.9/1.4 mm, and M_heart = 1.6/2.3/1.3 mm for LR/CC/AP, respectively. For offline setup correction these were Σ = 2.4/3.7/2.2 mm, σ = 2.9/4.1/2.7 mm, and M_heart = 1.6/2.1/1.4 mm. Cardiac motion induced by breathing was σ_resp = 1.4/2.9/1.4 mm for LR/CC/AP.
The PRV D_max underestimated the accumulated heart D_max for 9.1% of patients using online and 13.6% of patients using offline bony anatomy setup correction, which validated that the PRV margin size was adequate. Conclusion: Considerable cardiac position variability relative to the bony anatomy was observed in breast cancer patients. A PRV margin can be used during treatment planning to take these uncertainties into account.
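
    The error combination and margin recipe quoted in this abstract can be sketched directly. Below is a minimal Python illustration using the abstract's online left-right values; since the published inputs are rounded, not every direction reproduces its published margin exactly, but the LR case does:

    ```python
    import math

    def prv_margin(sigma_sys, sigma_rand, sigma_resp):
        """PRV margin as quoted in the abstract: M = 1.3*Sigma - 0.5*sigma_tot,
        where sigma_tot = sqrt(sigma_rand**2 + sigma_resp**2)."""
        sigma_tot = math.sqrt(sigma_rand**2 + sigma_resp**2)
        return 1.3 * sigma_sys - 0.5 * sigma_tot

    # Online setup correction, left-right direction (values in mm from the abstract)
    m_lr = prv_margin(sigma_sys=2.2, sigma_rand=2.1, sigma_resp=1.4)
    print(round(m_lr, 1))  # 1.6, matching the reported LR margin
    ```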

  18. Dosimetric Implications of Residual Tracking Errors During Robotic SBRT of Liver Metastases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chan, Mark; Tuen Mun Hospital, Hong Kong; Grehn, Melanie

Purpose: Although the metric precision of robotic stereotactic body radiation therapy in the presence of breathing motion is widely known, we investigated the dosimetric implications of breathing phase–related residual tracking errors. Methods and Materials: In 24 patients (28 liver metastases) treated with the CyberKnife, we recorded the residual correlation, prediction, and rotational tracking errors from 90 fractions and binned them into 10 breathing phases. The average breathing phase errors were used to shift and rotate the clinical tumor volume (CTV) and planning target volume (PTV) for each phase to calculate a pseudo 4-dimensional error dose distribution for comparison with the original planned dose distribution. Results: The median systematic directional correlation, prediction, and absolute aggregate rotation errors were 0.3 mm (range, 0.1-1.3 mm), 0.01 mm (range, 0.00-0.05 mm), and 1.5° (range, 0.4°-2.7°), respectively. Dosimetrically, 44%, 81%, and 92% of all voxels differed by less than 1%, 3%, and 5% of the planned local dose, respectively. The median coverage reduction for the PTV was 1.1% (range in coverage difference, −7.8% to +0.8%), significantly depending on correlation (P=.026) and rotational (P=.005) error. With a 3-mm PTV margin, the median coverage change for the CTV was 0.0% (range, −1.0% to +5.4%), not significantly depending on any investigated parameter. In 42% of patients, the 3-mm margin did not fully compensate for the residual tracking errors, resulting in a CTV coverage reduction of 0.1% to 1.0%. Conclusions: For liver tumors treated with robotic stereotactic body radiation therapy, a safety margin of 3 mm is not always sufficient to cover all residual tracking errors. Dosimetrically, this translates into only small CTV coverage reductions.

  19. Objectified quantification of uncertainties in Bayesian atmospheric inversions

    NASA Astrophysics Data System (ADS)

    Berchet, A.; Pison, I.; Chevallier, F.; Bousquet, P.; Bonne, J.-L.; Paris, J.-D.

    2015-05-01

Classical Bayesian atmospheric inversions process atmospheric observations and prior emissions, the two being connected by an observation operator picturing mainly the atmospheric transport. These inversions rely on prescribed errors in the observations, the prior emissions and the observation operator. When data are sparse, inversion results are very sensitive to the prescribed error distributions, which are not accurately known. The classical Bayesian framework experiences difficulties in quantifying the impact of mis-specified error distributions on the optimized fluxes. In order to cope with this issue, we rely on recent research results to enhance the classical Bayesian inversion framework through a marginalization on a large set of plausible errors that can be prescribed in the system. The marginalization consists in computing inversions for all possible error distributions weighted by the probability of occurrence of the error distributions. The posterior distribution of the fluxes calculated by the marginalization is not explicitly describable. As a consequence, we carry out a Monte Carlo sampling based on an approximation of the probability of occurrence of the error distributions. This approximation is deduced from the well-tested method of the maximum likelihood estimation. Thus, the marginalized inversion relies on an automatic objectified diagnosis of the error statistics, without any prior knowledge about the matrices. It robustly accounts for the uncertainties on the error distributions, contrary to what is classically done with frozen expert-knowledge error statistics. Some expert knowledge is still used in the method for the choice of an emission aggregation pattern and of a sampling protocol in order to reduce the computation cost. The relevance and the robustness of the method are tested on a case study: the inversion of methane surface fluxes at the mesoscale with virtual observations on a realistic network in Eurasia.
Observing system simulation experiments are carried out with different transport patterns, flux distributions and total prior amounts of emitted methane. The method proves to consistently reproduce the known "truth" in most cases, with satisfactory tolerance intervals. Additionally, the method explicitly provides influence scores and posterior correlation matrices. An in-depth interpretation of the inversion results is then possible. The more objective quantification of the influence of the observations on the fluxes proposed here allows us to evaluate the impact of the observation network on the characterization of the surface fluxes. The explicit correlations between emission aggregates reveal the mis-separated regions, hence the typical temporal and spatial scales the inversion can analyse. These scales are consistent with the chosen aggregation patterns.
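
    The marginalization idea, running one inversion per plausible error setting and weighting each by the likelihood of that setting, can be illustrated in a deliberately minimal scalar sketch. This is a hypothetical toy (one flux, an identity observation operator, and a hand-picked list of candidate observation-error magnitudes), not the authors' mesoscale system:

    ```python
    import math

    def marginalized_inversion(x_b, sigma_b, y, candidate_sigma_o):
        """Scalar sketch of the marginalization: run one Gaussian inversion per
        candidate observation-error sigma, weight each by the marginal likelihood
        of the observed innovation, and average the resulting posterior means."""
        innov = y - x_b
        posts, weights = [], []
        for so in candidate_sigma_o:
            s2 = sigma_b**2 + so**2                  # innovation variance
            k = sigma_b**2 / s2                      # Kalman-type gain
            posts.append(x_b + k * innov)
            # N(0, s2) density of the innovation = likelihood of this error setting
            weights.append(math.exp(-0.5 * innov**2 / s2) / math.sqrt(2 * math.pi * s2))
        z = sum(weights)
        return sum(w * p for w, p in zip(weights, posts)) / z

    # Prior flux 10, prior sd 2, observation 14, four candidate error magnitudes
    print(marginalized_inversion(10.0, 2.0, 14.0, [0.5, 1.0, 2.0, 4.0]))
    ```

    The result lies between the prior (10) and the observation (14), pulled most strongly by the error settings that best explain the innovation.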

  20. Cost effectiveness of a pharmacist-led information technology intervention for reducing rates of clinically important errors in medicines management in general practices (PINCER).

    PubMed

    Elliott, Rachel A; Putman, Koen D; Franklin, Matthew; Annemans, Lieven; Verhaeghe, Nick; Eden, Martin; Hayre, Jasdeep; Rodgers, Sarah; Sheikh, Aziz; Avery, Anthony J

    2014-06-01

    We recently showed that a pharmacist-led information technology-based intervention (PINCER) was significantly more effective in reducing medication errors in general practices than providing simple feedback on errors, with cost per error avoided at £79 (US$131). We aimed to estimate cost effectiveness of the PINCER intervention by combining effectiveness in error reduction and intervention costs with the effect of the individual errors on patient outcomes and healthcare costs, to estimate the effect on costs and QALYs. We developed Markov models for each of six medication errors targeted by PINCER. Clinical event probability, treatment pathway, resource use and costs were extracted from literature and costing tariffs. A composite probabilistic model combined patient-level error models with practice-level error rates and intervention costs from the trial. Cost per extra QALY and cost-effectiveness acceptability curves were generated from the perspective of NHS England, with a 5-year time horizon. The PINCER intervention generated £2,679 less cost and 0.81 more QALYs per practice [incremental cost-effectiveness ratio (ICER): -£3,037 per QALY] in the deterministic analysis. In the probabilistic analysis, PINCER generated 0.001 extra QALYs per practice compared with simple feedback, at £4.20 less per practice. Despite this extremely small set of differences in costs and outcomes, PINCER dominated simple feedback with a mean ICER of -£3,936 (standard error £2,970). At a ceiling 'willingness-to-pay' of £20,000/QALY, PINCER reaches 59 % probability of being cost effective. PINCER produced marginal health gain at slightly reduced overall cost. Results are uncertain due to the poor quality of data to inform the effect of avoiding errors.
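
    The acceptability-curve logic reported here, the probability that the intervention's net monetary benefit is positive at a given willingness-to-pay, is easy to sketch. The draws below are hypothetical probabilistic-sensitivity-analysis samples shaped loosely like the abstract's result (tiny mean QALY gain, small mean saving, both very noisy); they do not reproduce the paper's 59% figure:

    ```python
    import random

    def prob_cost_effective(delta_costs, delta_qalys, wtp):
        """Probability (over PSA samples) that the intervention is cost-effective:
        the net monetary benefit wtp*dQALY - dCost exceeds zero."""
        wins = sum(1 for dc, dq in zip(delta_costs, delta_qalys) if wtp * dq - dc > 0)
        return wins / len(delta_costs)

    # Hypothetical PSA draws: small mean saving, tiny mean QALY gain, large noise
    rng = random.Random(1)
    d_cost = [rng.gauss(-4.20, 3000) for _ in range(5000)]
    d_qaly = [rng.gauss(0.001, 0.02) for _ in range(5000)]
    print(prob_cost_effective(d_cost, d_qaly, 20_000))  # a probability near 0.5 for these draws
    ```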

  1. SU-E-T-36: An Investigation of the Margin From CTV to PTV Using Retraction Method for Cervical Carcinoma

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, D; Chen, J; Hao, Y

Purpose: This work uses a retraction method to compute and evaluate the CTV-to-PTV margin based on the dosimetric influence of setup errors during treatment of cervical carcinoma patients. Methods: Sixteen patients with cervical cancer were treated on an Elekta Synergy accelerator, and a total of 305 kV-CBCT images were acquired. The isocenters of the initial plans were shifted according to the measured setup errors to simulate treatment, and the dose distributions were then recalculated using the leaf sequences and MUs of the individual plans. The CTV-to-PTV margin was derived both by the retraction method (with the PTV of the original plan fixed, the PTV is retracted by a given distance to define a simulated structure, CTVnx; the minimum PTV-to-CTVnx distance at which the specified dose is achieved, namely at least 99% of the CTV volume receiving 95% of the prescription dose, is the CTV-to-PTV margin) and by the conventional formula method. Results: (1) The setup errors of the 16 patients in the X, Y, and Z directions were (1.13 ± 2.94) mm, (-1.63 ± 7.13) mm, and (-0.65 ± 2.25) mm. (2) The distance between CTVx and PTV was 5, 9, and 3 mm in the X, Y, and Z directions according to the 2.5Σ + 0.7σ formula. (3) The transplanted plans showed that 99% of CTVx10-CTVx7 received 95% of the prescription dose, but CTVx6-CTVx3 departed from the clinical standard. To protect normal tissues, we selected 7 mm as the minimum CTV-to-PTV margin. Conclusion: We have tested a retraction method for evaluating the CTV-to-PTV margin. The retraction method is more reliable than the formula method for calculating this margin because it represents the actual treatment delivery, and it introduces a new approach in this field.
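
    As a sketch, the margin recipe this record applies (the van Herk formula M = 2.5Σ + 0.7σ) reproduces the reported 5/9/3 mm margins if the |mean| setup errors are read as the systematic component Σ and the SDs as the random component σ; that reading is our interpretation of the abstract's numbers, not a statement from the paper:

    ```python
    def van_herk_margin(Sigma, sigma):
        """CTV-to-PTV margin recipe M = 2.5*Sigma + 0.7*sigma."""
        return 2.5 * Sigma + 0.7 * sigma

    # |mean| and SD of the reported setup errors (mm), per direction
    errors = {"X": (1.13, 2.94), "Y": (1.63, 7.13), "Z": (0.65, 2.25)}
    for axis, (Sigma, sigma) in errors.items():
        print(axis, round(van_herk_margin(Sigma, sigma)))  # X 5, Y 9, Z 3
    ```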

  2. Reducing Uncertainty in the American Community Survey through Data-Driven Regionalization

    PubMed Central

    Spielman, Seth E.; Folch, David C.

    2015-01-01

The American Community Survey (ACS) is the largest survey of US households and is the principal source for neighborhood-scale information about the US population and economy. The ACS is used to allocate billions in federal spending and is a critical input to social scientific research in the US. However, estimates from the ACS can be highly unreliable. For example, in over 72% of census tracts, the estimated number of children under 5 in poverty has a margin of error greater than the estimate. Uncertainty of this magnitude complicates the use of social data in policy making, research, and governance. This article presents a heuristic spatial optimization algorithm that is capable of reducing the margins of error in survey data via the creation of new composite geographies, a process called regionalization. Regionalization is a complex combinatorial problem. Here, rather than focusing on the technical aspects of regionalization, we demonstrate how to use a purpose-built open-source regionalization algorithm to process survey data in order to reduce the margins of error to a user-specified threshold. PMID:25723176
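
    The mechanics behind regionalization's benefit follow from how margins of error combine: the Census Bureau's standard approximation for the MOE of a sum of ACS estimates is the root-sum-of-squares of the component MOEs, so a merged region's relative MOE shrinks. A small sketch with hypothetical tract values:

    ```python
    import math

    def combined_moe(moes):
        """Census Bureau approximation for the MOE of a sum of ACS estimates:
        square root of the sum of squared component MOEs."""
        return math.sqrt(sum(m * m for m in moes))

    # Two hypothetical tracts whose MOEs exceed their estimates
    estimates = [100, 150]
    moes = [110, 160]

    region_estimate = sum(estimates)     # 250
    region_moe = combined_moe(moes)      # ~194.2, well below 110 + 160
    print(region_moe / region_estimate)  # relative MOE ~0.78, below either tract's
    ```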

  4. Helical tomotherapy setup variations in canine nasal tumor patients immobilized with a bite block.

    PubMed

    Kubicek, Lyndsay N; Seo, Songwon; Chappell, Richard J; Jeraj, Robert; Forrest, Lisa J

    2012-01-01

The purpose of our study was to compare setup variation in four degrees of freedom (vertical, longitudinal, lateral, and roll) between canine nasal tumor patients immobilized with a mattress and bite block, versus a mattress alone. Our secondary aim was to define a clinical target volume (CTV) to planning target volume (PTV) expansion margin based on our mean systematic error values associated with nasal tumor patients immobilized by a mattress and bite block. We evaluated six parameters for setup corrections: systematic error, random error, patient-to-patient variation in systematic errors, the magnitude of patient-specific random errors (root mean square [RMS]), distance error, and the variation of setup corrections from zero shift. The variations in all parameters were statistically smaller in the group immobilized by a mattress and bite block. The mean setup corrections in the mattress and bite block group ranged from 0.91 mm to 1.59 mm for the translational errors, with 0.5° for roll. Although most veterinary radiation facilities do not have access to image-guided radiotherapy (IGRT), we identified a need for more rigid fixation, established the value of adding IGRT to veterinary radiation therapy, and defined the CTV-PTV setup error margin for canine nasal tumor patients immobilized in a mattress and bite block. © 2012 Veterinary Radiology & Ultrasound.

  5. Exploring diversity in ensemble classification: Applications in large area land cover mapping

    NASA Astrophysics Data System (ADS)

    Mellor, Andrew; Boukir, Samia

    2017-07-01

Ensemble classifiers, such as random forests, are now commonly applied in the field of remote sensing, and have been shown to perform better than single classifier systems, resulting in reduced generalisation error. Diversity across the members of ensemble classifiers is known to have a strong influence on classification performance: performance improves when classifier errors are uncorrelated and more uniformly distributed across ensemble members. The relationship between ensemble diversity and classification performance has not yet been fully explored in the fields of information science and machine learning and has never been examined in the field of remote sensing. This study is a novel exploration of ensemble diversity and its link to classification performance, applied to a multi-class canopy cover classification problem using random forests and multisource remote sensing and ancillary GIS data, across seven million hectares of diverse dry-sclerophyll dominated public forests in Victoria, Australia. A particular emphasis is placed on analysing the relationship between ensemble diversity and ensemble margin - two key concepts in ensemble learning. The main novelty of our work lies in boosting diversity by emphasizing the contribution of lower margin instances used in the learning process. Exploring the influence of tree pruning on diversity is also a new empirical analysis that contributes to a better understanding of ensemble performance. Results reveal insights into the trade-off between ensemble classification accuracy and diversity, and, through the ensemble margin, demonstrate how inducing diversity by targeting lower margin training samples is a means of achieving better classifier performance for more difficult or rarer classes and reducing information redundancy in classification problems. Our findings inform strategies for collecting training data and designing and parameterising ensemble classifiers, such as random forests.
This is particularly important in large area remote sensing applications, for which training data is costly and resource intensive to collect.
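
    The ensemble margin central to this study can be written down compactly. One common definition (the fraction of votes for the true class minus the largest fraction for any other class) is sketched below; the paper may use a variant, and the class names here are invented:

    ```python
    def ensemble_margin(votes, true_label):
        """Margin of one instance: fraction of ensemble votes for the true class
        minus the largest fraction for any other class; ranges in [-1, 1]."""
        total = sum(votes.values())
        correct = votes.get(true_label, 0)
        best_other = max((v for c, v in votes.items() if c != true_label), default=0)
        return (correct - best_other) / total

    # A 10-tree forest voting on one training sample of class "grass"
    print(ensemble_margin({"grass": 7, "shrub": 2, "forest": 1}, "grass"))  # 0.5
    ```

    Instances with small or negative margins are the "harder" training samples whose contribution the diversity-boosting strategy described above emphasizes.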

  6. Accounting for response misclassification and covariate measurement error improves power and reduces bias in epidemiologic studies.

    PubMed

    Cheng, Dunlei; Branscum, Adam J; Stamey, James D

    2010-07-01

To quantify the impact of ignoring misclassification of a response variable and measurement error in a covariate on statistical power, and to develop software for sample size and power analysis that accounts for these flaws in epidemiologic data. A Monte Carlo simulation-based procedure is developed to illustrate the differences in design requirements and inferences between analytic methods that properly account for misclassification and measurement error and those that do not in regression models for cross-sectional and cohort data. We found that failure to account for these flaws in epidemiologic data can lead to a substantial reduction in statistical power, over 25% in some cases. The proposed method substantially reduced bias by up to a ten-fold margin compared to naive estimates obtained by ignoring misclassification and mismeasurement. We recommend as routine practice that researchers account for errors in measurement of both response and covariate data when determining sample size, performing power calculations, or analyzing data from epidemiological studies. © 2010 Elsevier Inc. All rights reserved.
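
    The direction of the bias is easy to demonstrate with a small Monte Carlo sketch. This is not the authors' adjustment method, just a pure-Python illustration of why nondifferential outcome misclassification attenuates the odds ratio toward the null (and hence erodes power); all parameter values are invented:

    ```python
    import math
    import random

    def simulate_log_or(n, p0, true_or, sens, spec, rng):
        """Simulate a cohort with a binary exposure (prevalence 0.5) and binary
        outcome, record the outcome with imperfect sensitivity/specificity, and
        return the naive log odds ratio from the observed 2x2 table."""
        counts = [[0, 0], [0, 0]]  # counts[exposed][observed_outcome]
        odds0 = p0 / (1 - p0)
        p1 = true_or * odds0 / (1 + true_or * odds0)  # outcome risk in the exposed
        for _ in range(n):
            x = rng.random() < 0.5
            y = rng.random() < (p1 if x else p0)
            # true positives kept with prob sens; false positives arise with prob 1-spec
            y_obs = (rng.random() < sens) if y else (rng.random() > spec)
            counts[x][y_obs] += 1
        a, b = counts[1][1], counts[1][0]  # exposed: cases, non-cases
        c, d = counts[0][1], counts[0][0]  # unexposed: cases, non-cases
        return math.log((a * d) / (b * c))

    rng = random.Random(0)
    or_true = math.exp(simulate_log_or(200_000, 0.2, 2.0, 1.0, 1.0, rng))
    or_naive = math.exp(simulate_log_or(200_000, 0.2, 2.0, 0.8, 0.9, rng))
    print(or_true, or_naive)  # roughly 2.0 vs roughly 1.6: attenuated toward 1
    ```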

  7. Topological analysis of polymeric melts: chain-length effects and fast-converging estimators for entanglement length.

    PubMed

    Hoy, Robert S; Foteinopoulou, Katerina; Kröger, Martin

    2009-09-01

    Primitive path analyses of entanglements are performed over a wide range of chain lengths for both bead spring and atomistic polyethylene polymer melts. Estimators for the entanglement length N_{e} which operate on results for a single chain length N are shown to produce systematic O(1/N) errors. The mathematical roots of these errors are identified as (a) treating chain ends as entanglements and (b) neglecting non-Gaussian corrections to chain and primitive path dimensions. The prefactors for the O(1/N) errors may be large; in general their magnitude depends both on the polymer model and the method used to obtain primitive paths. We propose, derive, and test new estimators which eliminate these systematic errors using information obtainable from the variation in entanglement characteristics with chain length. The new estimators produce accurate results for N_{e} from marginally entangled systems. Formulas based on direct enumeration of entanglements appear to converge faster and are simpler to apply.
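
    The extrapolation idea, removing a systematic O(1/N) bias by combining results at several chain lengths, can be sketched as a least-squares fit in 1/N whose intercept is the bias-free estimate. The data below are synthetic with a built-in 1/N bias, not values from the paper:

    ```python
    def extrapolate_ne(chain_lengths, ne_apparent):
        """Fit Ne_app(N) = Ne_inf + c/N by least squares on x = 1/N and return
        the intercept Ne_inf (the 1/N -> 0 extrapolation)."""
        xs = [1.0 / n for n in chain_lengths]
        ys = ne_apparent
        n = len(xs)
        xbar = sum(xs) / n
        ybar = sum(ys) / n
        slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
                sum((x - xbar) ** 2 for x in xs)
        return ybar - slope * xbar

    # Synthetic single-N estimates with a built-in O(1/N) bias; Ne_true = 85
    data_N = [100, 200, 350, 700]
    data_ne = [85 - 3000 / n for n in data_N]
    print(extrapolate_ne(data_N, data_ne))  # recovers 85.0 for exactly linear data
    ```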

  8. Robustness study of the pseudo open-loop controller for multiconjugate adaptive optics.

    PubMed

    Piatrou, Piotr; Gilles, Luc

    2005-02-20

Robustness of the recently proposed "pseudo open-loop control" (POLC) algorithm against various system errors has been investigated for the representative example of the Gemini-South 8-m telescope multiconjugate adaptive-optics system. The existing model to represent the adaptive-optics system with pseudo open-loop control has been modified to account for misalignments, noise and calibration errors in deformable mirrors, and wave-front sensors. Comparison with the conventional least-squares control model has been done. We show with the aid of both transfer-function pole-placement analysis and Monte Carlo simulations that POLC remains remarkably stable and robust against very large levels of system errors and outperforms least-squares control in this respect. Approximate stability margins as well as performance metrics such as Strehl ratios and rms wave-front residuals averaged over a 1-arcmin field of view have been computed for different types and levels of system errors to quantify the expected performance degradation.

  9. Prospect theory does not describe the feedback-related negativity value function.

    PubMed

    Sambrook, Thomas D; Roser, Matthew; Goslin, Jeremy

    2012-12-01

    Humans handle uncertainty poorly. Prospect theory accounts for this with a value function in which possible losses are overweighted compared to possible gains, and the marginal utility of rewards decreases with size. fMRI studies have explored the neural basis of this value function. A separate body of research claims that prediction errors are calculated by midbrain dopamine neurons. We investigated whether the prospect theoretic effects shown in behavioral and fMRI studies were present in midbrain prediction error coding by using the feedback-related negativity, an ERP component believed to reflect midbrain prediction errors. Participants' stated satisfaction with outcomes followed prospect theory but their feedback-related negativity did not, instead showing no effect of marginal utility and greater sensitivity to potential gains than losses. Copyright © 2012 Society for Psychophysiological Research.

  10. Adaptive optimization by 6 DOF robotic couch in prostate volumetric IMRT treatment: rototranslational shift and dosimetric consequences

    PubMed Central

    Placidi, Lorenzo; Azario, Luigi; Mattiucci, Gian Carlo; Greco, Francesca; Damiani, Andrea; Mantini, Giovanna; Frascino, Vincenzo; Piermattei, Angelo; Valentini, Vincenzo; Balducci, Mario

    2015-01-01

The purpose of this study was to investigate the magnitude and dosimetric relevance of translational and rotational shifts in IGRT prostate volumetric-modulated arc therapy (VMAT) using the Protura six degrees of freedom (DOF) Robotic Patient Positioning System. Patients with cT3aN0M0 prostate cancer, treated with VMAT simultaneous integrated boost (VMAT-SIB), were enrolled. PTV2 was obtained by adding a 0.7 cm margin to the seminal vesicles base (CTV2), while PTV1 was obtained by adding to the prostate (CTV1) a 0.7 cm margin in all directions, except for a 1.2 cm caudal margin. A daily CBCT was acquired before dose delivery. The translational and rotational displacements were corrected through the Protura Robotic Couch, collected, and applied to the simulation CT to obtain a translated CT (tCT) and a rototranslated CT (rtCT) on which we recalculated the initial treatment plan (TP). We analyzed the correlation between dosimetric coverage, organs at risk (OAR) sparing, and translational or rotational displacements. The dosimetric impact of a rototranslational correction was calculated. From October 2012 to September 2013, a total of 263 CBCT scans from 12 patients were collected. Translational shifts were <5 mm in 81% of patient scans and rotational shifts were <2° in 93% of patient scans. The dosimetric analysis was performed on 172 CBCT scans, calculating 344 VMAT-TPs. Two significant linear correlations were observed, between yaw and the V20 of the femoral heads and between pitch rotation and the V50 of the rectum (p<0.001); rototranslational correction seems to have more impact on PTV2 than on PTV1, especially when margins are reduced. Rotational errors are of dosimetric significance for OAR sparing and target coverage. This is relevant for the femoral heads and rectum because of their greater distance from the isocenter, and for the seminal vesicles because of their irregular shape. No correlation was observed between translational and rotational errors. A study considering the intrafractional error and deformable registration is ongoing.
PACS number: 87.55.de PMID:26699314

  11. Flight assessment of the onboard propulsion system model for the Performance Seeking Control algorithm on an F-15 aircraft

    NASA Technical Reports Server (NTRS)

    Orme, John S.; Schkolnik, Gerard S.

    1995-01-01

    Performance Seeking Control (PSC), an onboard, adaptive, real-time optimization algorithm, relies upon an onboard propulsion system model. Flight results illustrated propulsion system performance improvements as calculated by the model. These improvements were subject to uncertainty arising from modeling error. Thus to quantify uncertainty in the PSC performance improvements, modeling accuracy must be assessed. A flight test approach to verify PSC-predicted increases in thrust (FNP) and absolute levels of fan stall margin is developed and applied to flight test data. Application of the excess thrust technique shows that increases of FNP agree to within 3 percent of full-scale measurements for most conditions. Accuracy to these levels is significant because uncertainty bands may now be applied to the performance improvements provided by PSC. Assessment of PSC fan stall margin modeling accuracy was completed with analysis of in-flight stall tests. Results indicate that the model overestimates the stall margin by between 5 to 10 percent. Because PSC achieves performance gains by using available stall margin, this overestimation may represent performance improvements to be recovered with increased modeling accuracy. Assessment of thrust and stall margin modeling accuracy provides a critical piece for a comprehensive understanding of PSC's capabilities and limitations.

  12. WE-AB-207B-03: A Computational Methodology for Determination of CTV-To-PTV Margins with Inter Fractional Shape Variations Based On a Statistical Point Distribution Model for Prostate Cancer Radiation Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shibayama, Y; Umezu, Y; Nakamura, Y

    2016-06-15

Purpose: Our assumption was that interfractional shape variations of target volumes may not be negligible for the determination of clinical target volume (CTV)-to-planning target volume (PTV) margins. The aim of this study was to investigate this assumption in a simulation study by developing a computational framework for CTV-to-PTV margins that takes interfractional shape variations into account based on a point distribution model (PDM). Methods: The systematic and random errors for interfractional shape variations and translations of target volumes were evaluated for four types of CTV regions (only a prostate, a prostate plus proximal 1-cm seminal vesicles, a prostate plus proximal 2-cm seminal vesicles, and a prostate plus whole seminal vesicles). The CTV regions were delineated depending on prostate cancer risk groups on planning computed tomography (CT) and cone beam CT (CBCT) images of 73 fractions of 10 patients. The random and systematic errors for shape variations of CTV regions were derived from PDMs of CTV surfaces for all fractions of each patient. Systematic errors for shape variations of CTV regions were derived by comparing PDMs between planning CTV surfaces and average CTV surfaces. Finally, anisotropic CTV-to-PTV margins with shape variations in 6 directions (anterior, posterior, superior, inferior, right, and left) were computed by using the van Herk margin formula. Results: Differences between CTV-to-PTV margins with and without shape variations ranged from 0.7 to 1.7 mm in the anterior direction, 1.0 to 2.8 mm in the posterior direction, 0.8 to 2.8 mm in the superior direction, 0.6 to 1.6 mm in the inferior direction, 1.4 to 4.4 mm in the right direction, and 1.3 to 5.2 mm in the left direction. Conclusion: Additional margins of more than 1.0 mm were needed in at least 3 directions to guarantee CTV coverage under shape variations. Therefore, shape variations should be taken into account in the determination of CTV-to-PTV margins.

  13. Dosimetric consequences of translational and rotational errors in frame-less image-guided radiosurgery

    PubMed Central

    2012-01-01

    Background To investigate geometric and dosimetric accuracy of frame-less image-guided radiosurgery (IG-RS) for brain metastases. Methods and materials Single fraction IG-RS was practiced in 72 patients with 98 brain metastases. Patient positioning and immobilization used either double- (n = 71) or single-layer (n = 27) thermoplastic masks. Pre-treatment set-up errors (n = 98) were evaluated with cone-beam CT (CBCT) based image-guidance (IG) and were corrected in six degrees of freedom without an action level. CBCT imaging after treatment measured intra-fractional errors (n = 64). Pre- and post-treatment errors were simulated in the treatment planning system and target coverage and dose conformity were evaluated. Three scenarios of 0 mm, 1 mm and 2 mm GTV-to-PTV (gross tumor volume, planning target volume) safety margins (SM) were simulated. Results Errors prior to IG were 3.9 mm ± 1.7 mm (3D vector) and the maximum rotational error was 1.7° ± 0.8° on average. The post-treatment 3D error was 0.9 mm ± 0.6 mm. No differences between double- and single-layer masks were observed. Intra-fractional errors were significantly correlated with the total treatment time with 0.7mm±0.5mm and 1.2mm±0.7mm for treatment times ≤23 minutes and >23 minutes (p<0.01), respectively. Simulation of RS without image-guidance reduced target coverage and conformity to 75% ± 19% and 60% ± 25% of planned values. Each 3D set-up error of 1 mm decreased target coverage and dose conformity by 6% and 10% on average, respectively, with a large inter-patient variability. Pre-treatment correction of translations only but not rotations did not affect target coverage and conformity. Post-treatment errors reduced target coverage by >5% in 14% of the patients. A 1 mm safety margin fully compensated intra-fractional patient motion. Conclusions IG-RS with online correction of translational errors achieves high geometric and dosimetric accuracy. 
Intra-fractional errors decrease target coverage and conformity unless compensated with appropriate safety margins. PMID:22531060

  14. Experimental validation of the van Herk margin formula for lung radiation therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ecclestone, Gillian; Heath, Emily; Bissonnette, Jean-Pierre

    2013-11-15

    Purpose: To validate the van Herk margin formula for lung radiation therapy using realistic dose calculation algorithms and respiratory motion modeling. The robustness of the margin formula against variations in lesion size, peak-to-peak motion amplitude, tissue density, treatment technique, and plan conformity was assessed, along with the margin formula assumption of a homogeneous dose distribution with perfect plan conformity. Methods: 3DCRT and IMRT lung treatment plans were generated within the ORBIT treatment planning platform (RaySearch Laboratories, Sweden) on 4DCT datasets of virtual phantoms. Random and systematic respiratory motion-induced errors were simulated using deformable registration and dose accumulation tools available within ORBIT for simulated cases of varying lesion sizes, peak-to-peak motion amplitudes, tissue densities, and plan conformities. A detailed comparison between the margin formula dose profile model, the planned dose profiles, and penumbra widths was also conducted to test the assumptions of the margin formula. Finally, a correction to account for imperfect plan conformity was tested, as well as a novel application of the margin formula that accounts for the patient-specific motion trajectory. Results: The van Herk margin formula ensured full clinical target volume coverage for all 3DCRT and IMRT plans of all conformities, with the exception of small lesions in soft tissue. No dosimetric trends with respect to plan technique or lesion size were observed for the systematic and random error simulations. However, accumulated plans showed that plan conformity decreased with increasing tumor motion amplitude. When comparing dose profiles assumed in the margin formula model to the treatment plans, discrepancies in the low-dose regions were observed for the random and systematic error simulations.
However, the margin formula respected, in all experiments, the 95% dose coverage required for planning target volume (PTV) margin derivation, as defined by the ICRU; thus, suitable PTV margins were estimated. The penumbra widths calculated in lung tissue for each plan were found to be very similar to the 6.4 mm value assumed by the margin formula model. The plan conformity correction yielded inconsistent results, which were largely affected by image and dose grid resolution, while the trajectory-modified PTV plans yielded a dosimetric benefit over the standard internal target volume approach, with up to a 5% decrease in the V20 value. Conclusions: The margin formula was shown to be robust against variations in tumor size and motion, treatment technique, plan conformity, and low tissue density. This was validated by maintaining coverage of all of the derived PTVs by the 95% dose level, as required by the formal definition of the PTV. However, the assumption of perfect plan conformity in the margin formula derivation yields conservative margin estimation. Future modifications to the margin formula will require a correction for plan conformity. Plan conformity can also be improved by using the proposed trajectory-modified PTV planning approach. This proves especially beneficial for tumors with a large anterior–posterior component of respiratory motion.
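The recipe under test is the van Herk population margin formula, M = 2.5Σ + 1.64(σ′ − σp) with σ′ = √(σ² + σp²), which reduces to the familiar M ≈ 2.5Σ + 0.7σ when the penumbra SD σp is near 3.2 mm (the 6.4 mm penumbra width mentioned above). A minimal sketch of evaluating both forms, with illustrative inputs rather than the paper's data:

```python
import math

def van_herk_margin(sigma_sys, sigma_rand, sigma_p=3.2):
    """Full van Herk PTV margin (mm): 2.5*Sigma + 1.64*(sqrt(sigma^2 + sigma_p^2) - sigma_p).

    sigma_sys  -- SD of systematic errors (Sigma, mm)
    sigma_rand -- SD of random errors (sigma, mm)
    sigma_p    -- Gaussian SD describing the beam penumbra (mm);
                  3.2 mm is the value used in van Herk's derivation
    """
    sigma_total = math.sqrt(sigma_rand ** 2 + sigma_p ** 2)
    return 2.5 * sigma_sys + 1.64 * (sigma_total - sigma_p)

def van_herk_margin_linearized(sigma_sys, sigma_rand):
    """Common linearized form, valid when sigma_p is close to 3.2 mm."""
    return 2.5 * sigma_sys + 0.7 * sigma_rand

# Illustrative example: Sigma = 2 mm, sigma = 3 mm
print(round(van_herk_margin(2.0, 3.0), 2))             # full form, ~6.95 mm
print(round(van_herk_margin_linearized(2.0, 3.0), 2))  # linearized, 7.1 mm
```

The two forms diverge when the penumbra in the medium of interest differs appreciably from 3.2 mm, which is one reason the paper checks the penumbra widths realized in lung tissue.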

  15. Digital Filters for Digital Phase-locked Loops

    NASA Technical Reports Server (NTRS)

    Simon, M.; Mileant, A.

    1985-01-01

    An s/z hybrid model for a general phase-locked loop is proposed. The impact of the loop filter on the stability, gain margin, noise-equivalent bandwidth, steady-state error and time response is investigated. A specific digital filter is selected which maximizes the overall gain margin of the loop. This filter can have any desired number of integrators. Three integrators are sufficient to track a phase jerk with zero steady-state error at loop update instants. This filter has one zero near z = 1.0 for each integrator. The total number of poles of the filter is equal to the number of integrators plus two.
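As a toy illustration of the loop dynamics discussed here (not the paper's s/z hybrid filter), a second-order digital PLL whose loop filter contains a single integrator tracks a frequency offset, i.e. a phase ramp, with zero steady-state phase error at the update instants; the gains below are illustrative stable choices:

```python
# Toy second-order digital PLL: proportional-plus-integrator loop filter
# driving an NCO. One integrator in the filter makes the loop type 2,
# so a phase ramp is tracked with zero steady-state phase error.
Kp, Ki = 0.1, 0.01      # illustrative proportional and integral gains
d_omega = 0.01          # input frequency offset (rad per loop update)

theta = 0.0             # NCO phase estimate
integ = 0.0             # loop-filter integrator state
err = 0.0
for n in range(2000):
    phase_in = d_omega * n          # input phase ramp
    err = phase_in - theta          # linearized phase detector
    integ += Ki * err               # integrator update
    theta += Kp * err + integ       # NCO phase update

print(abs(err))  # essentially zero once the transient has decayed
```

Tracking a phase jerk (quadratically growing frequency) with zero steady-state error would require the additional integrators described in the abstract.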

  16. Portal imaging based definition of the planning target volume during pelvic irradiation for gynecological malignancies.

    PubMed

    Mock, U; Dieckmann, K; Wolff, U; Knocke, T H; Pötter, R

    1999-08-01

    Geometrical accuracy in patient positioning can vary substantially during external radiotherapy. This study estimated the set-up accuracy during pelvic irradiation for gynecological malignancies for determination of safety margins (planning target volume, PTV). Based on electronic portal imaging devices (EPID), 25 patients undergoing 4-field pelvic irradiation for gynecological malignancies were analyzed with regard to set-up accuracy during the treatment course. Regularly performed EPID images were used in order to systematically assess the systematic and random component of set-up displacements. Anatomical matching of verification and simulation images was followed by measuring corresponding distances between the central axis and anatomical features. Data analysis of set-up errors referred to the x-, y-, and z-axes. Additionally, cumulative frequencies were evaluated. A total of 50 simulation films and 313 verification images were analyzed. For the anterior-posterior (AP) beam direction mean deviations along the x- and z-axes were 1.5 mm and -1.9 mm, respectively. Moreover, random errors of 4.8 mm (x-axis) and 3.0 mm (z-axis) were determined. Concerning the latero-lateral treatment fields, the systematic errors along the two axes were calculated to 2.9 mm (y-axis) and -2.0 mm (z-axis) and random errors of 3.8 mm and 3.5 mm were found, respectively. The cumulative frequency of misalignments ≤5 mm showed values of 75% (AP fields) and 72% (latero-lateral fields). With regard to cumulative frequencies ≤10 mm quantification revealed values of 97% for both beam directions. During external pelvic irradiation therapy for gynecological malignancies, EPID images on a regular basis revealed acceptable set-up inaccuracies. Safety margins (PTV) of 1 cm appear to be sufficient, accounting for more than 95% of all deviations.
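The systematic and random components reported in studies like this are conventionally estimated as, respectively, the spread of the per-patient mean displacements and the pooled within-patient spread. A sketch of that decomposition on hypothetical displacement data (the shifts below are made up for illustration):

```python
import math
from statistics import mean, stdev

def setup_error_components(per_patient_shifts):
    """Split set-up displacements (mm, one list per patient) into components.

    Systematic component (Sigma): SD over patients of the per-patient means.
    Random component (sigma): RMS over patients of the per-patient SDs.
    """
    patient_means = [mean(shifts) for shifts in per_patient_shifts]
    patient_sds = [stdev(shifts) for shifts in per_patient_shifts]
    sigma_sys = stdev(patient_means)
    sigma_rand = math.sqrt(mean(s ** 2 for s in patient_sds))
    return sigma_sys, sigma_rand

# Hypothetical AP shifts (mm) for three patients over three fractions each
shifts = [[1.0, 2.0, 3.0], [-1.0, 0.0, 1.0], [4.0, 5.0, 6.0]]
sigma_sys, sigma_rand = setup_error_components(shifts)
print(round(sigma_sys, 2), round(sigma_rand, 2))  # 2.52 1.0
```

In practice many more patients and fractions are needed for stable estimates than in this toy example.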

  17. Frozen Section Evaluation of Margin Status in Primary Squamous Cell Carcinomas of the Head and Neck: A Correlation Study of Frozen Section and Final Diagnoses.

    PubMed

    Layfield, Eleanor M; Schmidt, Robert L; Esebua, Magda; Layfield, Lester J

    2018-06-01

    Frozen section is routinely used for intraoperative margin evaluation in carcinomas of the head and neck. We studied a series of frozen sections performed for margin status of head and neck tumors to determine diagnostic accuracy. All frozen sections for margin control of squamous carcinomas of the head and neck were studied from a 66-month period. Frozen and permanent section diagnoses were classified as negative or malignant. Correlation of diagnoses was performed to determine accuracy. One thousand seven hundred and ninety-six pairs of frozen section and corresponding permanent section diagnoses were obtained. Discordances were found in 55 (3.1%) pairs. In 35 pairs (1.9%), frozen section was reported as benign, but permanent sections disclosed carcinoma. In 21 cases, the discrepancy was due to sampling and in the remaining 14 cases it was an interpretive error. In 20 cases (1.1%), frozen section was malignant, but the permanent section was interpreted as negative. Frozen section is an accurate method for evaluation of operative margins for head and neck carcinomas with concordance between frozen and permanent results of 97%. Most errors are false negative results with the majority of these being due to sampling issues.
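The headline concordance and error rates follow directly from the reported counts; a quick arithmetic check:

```python
# Reported counts from the study: 1796 frozen/permanent pairs, 55 discordant
total = 1796
false_neg = 35   # frozen section benign, permanent section malignant
false_pos = 20   # frozen section malignant, permanent section negative
discordant = false_neg + false_pos

concordance = (total - discordant) / total
print(f"discordant: {discordant}/{total} = {discordant / total:.1%}")  # 3.1%
print(f"false-negative rate: {false_neg / total:.1%}")                 # 1.9%
print(f"false-positive rate: {false_pos / total:.1%}")                 # 1.1%
print(f"concordance: {concordance:.0%}")                               # 97%
```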

  18. Improved inference in Bayesian segmentation using Monte Carlo sampling: application to hippocampal subfield volumetry.

    PubMed

    Iglesias, Juan Eugenio; Sabuncu, Mert Rory; Van Leemput, Koen

    2013-10-01

    Many segmentation algorithms in medical image analysis use Bayesian modeling to augment local image appearance with prior anatomical knowledge. Such methods often contain a large number of free parameters that are first estimated and then kept fixed during the actual segmentation process. However, a faithful Bayesian analysis would marginalize over such parameters, accounting for their uncertainty by considering all possible values they may take. Here we propose to incorporate this uncertainty into Bayesian segmentation methods in order to improve the inference process. In particular, we approximate the required marginalization over model parameters using computationally efficient Markov chain Monte Carlo techniques. We illustrate the proposed approach using a recently developed Bayesian method for the segmentation of hippocampal subfields in brain MRI scans, showing a significant improvement in an Alzheimer's disease classification task. As an additional benefit, the technique also allows one to compute informative "error bars" on the volume estimates of individual structures. Copyright © 2013 Elsevier B.V. All rights reserved.

  19. Improved Inference in Bayesian Segmentation Using Monte Carlo Sampling: Application to Hippocampal Subfield Volumetry

    PubMed Central

    Iglesias, Juan Eugenio; Sabuncu, Mert Rory; Leemput, Koen Van

    2013-01-01

    Many segmentation algorithms in medical image analysis use Bayesian modeling to augment local image appearance with prior anatomical knowledge. Such methods often contain a large number of free parameters that are first estimated and then kept fixed during the actual segmentation process. However, a faithful Bayesian analysis would marginalize over such parameters, accounting for their uncertainty by considering all possible values they may take. Here we propose to incorporate this uncertainty into Bayesian segmentation methods in order to improve the inference process. In particular, we approximate the required marginalization over model parameters using computationally efficient Markov chain Monte Carlo techniques. We illustrate the proposed approach using a recently developed Bayesian method for the segmentation of hippocampal subfields in brain MRI scans, showing a significant improvement in an Alzheimer’s disease classification task. As an additional benefit, the technique also allows one to compute informative “error bars” on the volume estimates of individual structures. PMID:23773521

  20. A hybrid strategy of offline adaptive planning and online image guidance for prostate cancer radiotherapy.

    PubMed

    Lei, Yu; Wu, Qiuwen

    2010-04-21

    Offline adaptive radiotherapy (ART) has been used to effectively correct and compensate for prostate motion and reduce the required margin. The efficacy depends on the characteristics of the patient setup error and interfraction motion through the whole treatment; specifically, systematic errors are corrected and random errors are compensated for through the margins. In online image-guided radiation therapy (IGRT) of prostate cancer, the translational setup error and inter-fractional prostate motion are corrected through pre-treatment imaging and couch correction at each fraction. However, the rotation and deformation of the target are not corrected and only accounted for with margins in treatment planning. The purpose of this study was to investigate whether the offline ART strategy is necessary for an online IGRT protocol and to evaluate the benefit of the hybrid strategy. First, to investigate the rationale of the hybrid strategy, 592 cone-beam-computed tomography (CBCT) images taken before and after each fraction for an online IGRT protocol from 16 patients were analyzed. Specifically, the characteristics of prostate rotation were analyzed. It was found that there exist systematic inter-fractional prostate rotations, and they are patient specific. These rotations, if not corrected, are persistent through the treatment fraction, and rotations detected in early fractions are representative of those in later fractions. These findings suggest that the offline adaptive replanning strategy is beneficial to the online IGRT protocol with further margin reductions. Second, to quantitatively evaluate the benefit of the hybrid strategy, 412 repeated helical CT scans from 25 patients during the course of treatment were included in the replanning study. Both low-risk patients (LRP, clinical target volume, CTV = prostate) and intermediate-risk patients (IRP, CTV = prostate + seminal vesicles) were included in the simulation. 
The contours of prostate and seminal vesicles were delineated on each CT. The benefit of margin reduction to compensate for both rotation and deformation in the hybrid strategy was evaluated geometrically. With the hybrid strategy, the planning margins can be reduced by 1.4 mm for LRP and 2.0 mm for IRP compared with online IGRT alone, while maintaining the same 99% target volume coverage. The average relative reduction of the ITV-based planning target volume (PTV) compared with the CTV-based PTV was 19% for LRP and 27% for IRP.

  1. Evaluating deviations in prostatectomy patients treated with IMRT.

    PubMed

    Sá, Ana Cravo; Peres, Ana; Pereira, Mónica; Coelho, Carina Marques; Monsanto, Fátima; Macedo, Ana; Lamas, Adrian

    2016-01-01

    To evaluate the deviations in prostatectomy patients treated with IMRT in order to calculate appropriate margins to create the PTV. Defining inappropriate margins can lead to underdosing in target volumes and also overdosing in healthy tissues, increasing morbidity. 223 CBCT images used for alignment with the CT planning scan based on bony anatomy were analyzed in 12 patients treated with IMRT following prostatectomy. Shifts of CBCT images were recorded in three directions to calculate the required margin to create the PTV. The mean and standard deviation (SD) values in millimetres were -0.05 ± 1.35 in the LR direction, -0.03 ± 0.65 in the SI direction and -0.02 ± 2.05 in the AP direction. The systematic errors measured in the LR, SI and AP directions were 1.35 mm, 0.65 mm and 2.05 mm, with random errors of 2.07 mm, 1.45 mm and 3.16 mm, resulting in PTV margins of 4.82 mm, 2.64 mm and 7.33 mm, respectively. With IGRT we suggest a margin of 5 mm, 3 mm and 8 mm in the LR, SI and AP directions, respectively, for PTV1 and PTV2. Therefore, this study supports an anisotropic margin expansion to the PTV, with the largest expansion in the AP direction and the smallest in the SI direction.
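The quoted margins follow the van Herk recipe M = 2.5Σ + 0.7σ, and can be reproduced from the per-axis systematic (Σ) and random (σ) errors reported above:

```python
def ptv_margin(sigma_sys, sigma_rand):
    """van Herk margin recipe: M = 2.5*Sigma + 0.7*sigma (all values in mm)."""
    return 2.5 * sigma_sys + 0.7 * sigma_rand

# Reported per-axis systematic (Sigma) and random (sigma) errors, in mm
systematic = {"LR": 1.35, "SI": 0.65, "AP": 2.05}
random_err = {"LR": 2.07, "SI": 1.45, "AP": 3.16}

for axis in ("LR", "SI", "AP"):
    margin = ptv_margin(systematic[axis], random_err[axis])
    print(f"{axis}: {margin:.2f} mm")
# Reproduces the published margins of 4.82, 2.64 and 7.33 mm
# (the AP value differs by <0.01 mm, presumably from rounding of the inputs)
```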

  2. Using Marginal Structural Measurement-Error Models to Estimate the Long-term Effect of Antiretroviral Therapy on Incident AIDS or Death

    PubMed Central

    Cole, Stephen R.; Jacobson, Lisa P.; Tien, Phyllis C.; Kingsley, Lawrence; Chmiel, Joan S.; Anastos, Kathryn

    2010-01-01

    To estimate the net effect of imperfectly measured highly active antiretroviral therapy on incident acquired immunodeficiency syndrome or death, the authors combined inverse probability-of-treatment-and-censoring weighted estimation of a marginal structural Cox model with regression-calibration methods. Between 1995 and 2007, 950 human immunodeficiency virus–positive men and women were followed in 2 US cohort studies. During 4,054 person-years, 374 initiated highly active antiretroviral therapy, 211 developed acquired immunodeficiency syndrome or died, and 173 dropped out. Accounting for measured confounders and determinants of dropout, the weighted hazard ratio for acquired immunodeficiency syndrome or death comparing use of highly active antiretroviral therapy in the prior 2 years with no therapy was 0.36 (95% confidence limits: 0.21, 0.61). This association was relatively constant over follow-up (P = 0.19) and stronger than crude or adjusted hazard ratios of 0.75 and 0.95, respectively. Accounting for measurement error in reported exposure using external validation data on 331 men and women provided a hazard ratio of 0.17, with bias shifted from the hazard ratio to the estimate of precision as seen by the 2.5-fold wider confidence limits (95% confidence limits: 0.06, 0.43). Marginal structural measurement-error models can simultaneously account for 3 major sources of bias in epidemiologic research: validated exposure measurement error, measured selection bias, and measured time-fixed and time-varying confounding. PMID:19934191
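The inverse-probability weights at the core of a marginal structural model can be sketched with stabilized weights, the ratio of a marginal to a conditional treatment probability. The probabilities below are hypothetical placeholders for fitted propensity-model outputs, not values from this study:

```python
def stabilized_iptw(treated, p_marginal, p_conditional):
    """Stabilized inverse-probability-of-treatment weight for one subject.

    treated       -- 1 if the subject received treatment, else 0
    p_marginal    -- P(treatment) from a model with baseline covariates only
    p_conditional -- P(treatment | baseline and time-varying confounders)
    """
    if treated:
        return p_marginal / p_conditional
    return (1 - p_marginal) / (1 - p_conditional)

# Hypothetical subjects: (treated, marginal prob, conditional prob)
subjects = [(1, 0.40, 0.80), (0, 0.40, 0.20), (1, 0.40, 0.30)]
weights = [stabilized_iptw(*s) for s in subjects]
print([round(w, 2) for w in weights])  # [0.5, 0.75, 1.33]
```

In a longitudinal analysis such as this one, per-visit weights are multiplied over follow-up and combined with analogous censoring weights before fitting the weighted Cox model; the regression-calibration correction for exposure measurement error is a separate step.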

  3. A class of optimum digital phase locked loops

    NASA Technical Reports Server (NTRS)

    Kumar, R.; Hurd, W. J.

    1986-01-01

    This paper presents a class of optimum digital filters for digital phase locked loops, for the important case in which the maximum update rate of the loop filter and numerically controlled oscillator (NCO) is limited. This case is typical when the loop filter is implemented in a microprocessor. In these situations, pure delay is encountered in the loop transfer function and thus the stability and gain margin of the loop are of crucial interest. The optimum filters designed for such situations are evaluated in terms of their gain margin for stability, dynamic error, and steady-state error performance. For situations involving considerably high phase dynamics an adaptive and programmable implementation is also proposed to obtain an overall optimum strategy.

  4. Positioning accuracy during VMAT of gynecologic malignancies and the resulting dosimetric impact by a 6-degree-of-freedom couch in combination with daily kilovoltage cone beam computed tomography.

    PubMed

    Yao, Lihong; Zhu, Lihong; Wang, Junjie; Liu, Lu; Zhou, Shun; Jiang, ShuKun; Cao, Qianqian; Qu, Ang; Tian, Suqing

    2015-04-26

    To improve the delivery of radiotherapy in gynecologic malignancies and to minimize the irradiation of unaffected tissues by using daily kilovoltage cone beam computed tomography (kV-CBCT) to reduce setup errors. Thirteen patients with gynecologic cancers were treated with postoperative volumetric-modulated arc therapy (VMAT). All patients had a planning CT scan and daily CBCT during treatment. Automatic bone anatomy matching was used to determine initial inter-fraction positioning error. Positional correction on a six-degrees-of-freedom (6DoF) couch was followed by a second scan to calculate the residual inter-fraction error, and a post-treatment scan assessed intra-fraction motion. The margins of the planning target volume (MPTV) were calculated from these setup variations and the effect of margin size on normal tissue sparing was evaluated. In total, 573 CBCT scans were acquired. Mean absolute pre-/post-correction errors were obtained in all six planes. With 6DoF couch correction, the MPTV accounting for intra-fraction errors was reduced by 3.8-5.6 mm. This permitted a reduction in the maximum dose to the small intestine, bladder and femoral head (P=0.001, 0.035 and 0.032, respectively), the average dose to the rectum, small intestine, bladder and pelvic marrow (P=0.003, 0.000, 0.001 and 0.000, respectively) and markedly reduced irradiated normal tissue volumes. A 6DoF couch in combination with daily kV-CBCT can considerably improve positioning accuracy during VMAT treatment in gynecologic malignancies, reducing the MPTV. The reduced margin size permits improved normal tissue sparing and a smaller total irradiated volume.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Balderson, Michael, E-mail: michael.balderson@rmp.uhn.ca; Brown, Derek; Johnson, Patricia

    The purpose of this work was to compare static gantry intensity-modulated radiation therapy (IMRT) with volume-modulated arc therapy (VMAT) in terms of tumor control probability (TCP) under scenarios involving large geometric misses, i.e., those beyond what are accounted for when margin expansion is determined. Using a planning approach typical for these treatments, a linear-quadratic–based model for TCP was used to compare mean TCP values for a population of patients that experiences a geometric miss (i.e., systematic and random shifts of the clinical target volume within the planning target dose distribution). A Monte Carlo approach was used to account for the different biological sensitivities of a population of patients. Interestingly, for errors consisting of coplanar systematic target volume offsets and three-dimensional random offsets, static gantry IMRT appears to offer an advantage over VMAT in that larger shift errors are tolerated for the same mean TCP. For example, under the conditions simulated, erroneous systematic shifts of 15 mm directly between or directly into static gantry IMRT fields result in mean TCP values between 96% and 98%, whereas the same errors on VMAT plans result in mean TCP values between 45% and 74%. Random geometric shifts of the target volume were characterized using normal distributions in each Cartesian dimension. When the standard deviations were doubled from those values assumed in the derivation of the treatment margins, our model showed a 7% drop in mean TCP for the static gantry IMRT plans but a 20% drop in TCP for the VMAT plans. Although adding a margin for error to a clinical target volume is perhaps the best approach to account for expected geometric misses, this work suggests that static gantry IMRT may offer a treatment that is more tolerant to geometric miss errors than VMAT.
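The core of such a simulation, a Poisson TCP model on linear-quadratic cell survival with Monte Carlo sampling of patient radiosensitivity, can be sketched as follows. All parameter values here are illustrative textbook-style choices, not the paper's:

```python
import math
import random

def tcp(alpha, total_dose, n_clonogens=1e7, alpha_beta=10.0, n_fractions=35):
    """Poisson TCP with linear-quadratic cell survival: TCP = exp(-N * SF)."""
    d = total_dose / n_fractions                  # dose per fraction (Gy)
    beta = alpha / alpha_beta
    surviving_fraction = math.exp(-n_fractions * (alpha * d + beta * d ** 2))
    return math.exp(-n_clonogens * surviving_fraction)

def population_mean_tcp(total_dose, n_patients=5000,
                        alpha_mean=0.3, alpha_sd=0.07, seed=1):
    """Mean TCP over a population with Monte Carlo-sampled radiosensitivity."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_patients):
        alpha = max(rng.gauss(alpha_mean, alpha_sd), 0.01)  # clip unphysical values
        total += tcp(alpha, total_dose)
    return total / n_patients
```

A geometric miss would then be modeled by recomputing each patient's TCP with the dose the shifted target volume actually receives from the planned dose distribution.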

  6. Estimation of sensible and latent heat flux from natural sparse vegetation surfaces using surface renewal

    NASA Astrophysics Data System (ADS)

    Zapata, N.; Martínez-Cob, A.

    2001-12-01

    This paper reports a study undertaken to evaluate the feasibility of the surface renewal method to accurately estimate long-term evaporation from the playa and margins of an endorheic salty lagoon (Gallocanta lagoon, Spain) under semiarid conditions. High-frequency temperature readings were taken for two time lags (r) and three measurement heights (z) to obtain surface renewal sensible heat flux (HSR) values. These values were compared against eddy covariance sensible heat flux (HEC) values for a calibration period (25-30 July 2000). Error analysis statistics (index of agreement, IA; root mean square error, RMSE; and systematic mean square error, MSEs) showed that the agreement between HSR and HEC improved as measurement height decreased and time lag increased. Calibration factors α were obtained for all analyzed cases. The best results were obtained for the z=0.9 m (r=0.75 s) case, for which α=1.0 was observed. In this case, uncertainty was about 10% in terms of relative error (RE). Latent heat flux values were obtained by solving the energy balance equation for both the surface renewal (LESR) and the eddy covariance (LEEC) methods, using HSR and HEC, respectively, and measurements of net radiation and soil heat flux. For the calibration period, error analysis statistics for LESR were quite similar to those for HSR, although errors were mostly at random. LESR uncertainty was less than 9%. Calibration factors were applied for a validation data subset (30 July-4 August 2000) for which meteorological conditions were somewhat different (higher temperatures and wind speed and lower solar and net radiation). Error analysis statistics for both HSR and LESR were quite good for all cases, showing the goodness of the calibration factors. Nevertheless, the results obtained for the z=0.9 m (r=0.75 s) case were still the best ones.
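The agreement statistics used here are standard: RMSE and Willmott's index of agreement (IA). A sketch of both on toy paired flux values (the numbers are illustrative, not the study's data):

```python
import math

def rmse(pred, obs):
    """Root mean square error between paired predictions and observations."""
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs))

def index_of_agreement(pred, obs):
    """Willmott's index of agreement, IA in [0, 1] (1 = perfect agreement)."""
    o_mean = sum(obs) / len(obs)
    num = sum((p - o) ** 2 for p, o in zip(pred, obs))
    den = sum((abs(p - o_mean) + abs(o - o_mean)) ** 2 for p, o in zip(pred, obs))
    return 1 - num / den

h_sr = [120.0, 180.0, 90.0, 200.0]   # toy surface renewal H values (W m^-2)
h_ec = [110.0, 190.0, 100.0, 210.0]  # toy eddy covariance H values (W m^-2)
print(round(rmse(h_sr, h_ec), 1), round(index_of_agreement(h_sr, h_ec), 3))
```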

  7. A comparison of marginal fit between press-fabricated and CAD/CAM lithium disilicate crowns.

    PubMed

    Carlile, Richard S; Owens, Wade H; Greenwood, William J; Guevara, Peter H

    2018-01-01

    The purpose of this study was to compare the marginal fit of press-fabricated lithium disilicate crowns with that of computer-aided design/computer-aided manufacturing (CAD/CAM) lithium disilicate crowns to determine if the fabrication method has an influence on marginal fit. The marginal fit of 25 pressed and 25 CAD/CAM crowns was measured using the replica technique. The sites measured were the mesial, distal, facial, and lingual margins. A microscope at 10× magnification was used to obtain the measurements. Each site was measured 4 times, and intraclass correlation coefficients were used to assess measurement errors. An unpaired t test was used to evaluate the differences between the 2 groups. Mean marginal gap measurements were greater for CAD/CAM crowns than for pressed crowns at all sites. Only the difference in mean gap at the facial margin was statistically significant (P < 0.001). Press-fabricated lithium disilicate crowns provided a better marginal fit than those fabricated by CAD/CAM, but both fabrication methods provided crowns with a clinically acceptable marginal fit.

  8. Analysis of automatic repeat request methods for deep-space downlinks

    NASA Technical Reports Server (NTRS)

    Pollara, F.; Ekroot, L.

    1995-01-01

    Automatic repeat request (ARQ) methods cannot increase the capacity of a memoryless channel. However, they can be used to decrease the complexity of the channel-coding system to achieve essentially error-free transmission and to reduce link margins when the channel characteristics are poorly predictable. This article considers ARQ methods on a power-limited channel (e.g., the deep-space channel), where it is important to minimize the total power needed to transmit the data, as opposed to a bandwidth-limited channel (e.g., terrestrial data links), where the spectral efficiency or the total required transmission time is the most relevant performance measure. In the analysis, we compare the performance of three reference concatenated coded systems used in actual deep-space missions to that obtainable by ARQ methods using the same codes, in terms of required power, time to transmit with a given number of retransmissions, and achievable probability of word error. The ultimate limits of ARQ with an arbitrary number of retransmissions are also derived.
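The trade-off described, residual word-error probability versus transmission cost, can be illustrated for the simplest case of independent attempts with per-attempt word-error probability p; this is a textbook sketch, not the article's concatenated-code analysis:

```python
def residual_word_error(p_word, max_transmissions):
    """P(word still in error after up to m transmissions), independent attempts."""
    return p_word ** max_transmissions

def expected_transmissions(p_word, max_transmissions):
    """Expected number of transmissions of a word, truncated at m attempts.

    A word is retransmitted until it is received correctly or until the
    attempt budget m is exhausted: P(K=k) = (1-p)p^(k-1) for k < m,
    and P(K=m) = p^(m-1).
    """
    m = max_transmissions
    partial = sum(k * (1 - p_word) * p_word ** (k - 1) for k in range(1, m))
    return partial + m * p_word ** (m - 1)

# e.g. p = 0.1: one retransmission cuts the word-error rate from 1e-1 to 1e-2
print(residual_word_error(0.1, 2), round(expected_transmissions(0.1, 2), 2))
```

As the attempt budget grows, the expected number of transmissions approaches the familiar 1/(1 − p), while the residual error probability vanishes geometrically.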

  9. How Accurate Is a Test Score?

    ERIC Educational Resources Information Center

    Doppelt, Jerome E.

    1956-01-01

    The standard error of measurement as a means for estimating the margin of error that should be allowed for in test scores is discussed. The true score measures the performance that is characteristic of the person tested; the variations, plus and minus, around the true score describe a characteristic of the test. When the standard deviation is used…
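The standard error of measurement described here follows from the test's standard deviation and its reliability coefficient, SEM = SD·√(1 − r). A sketch with an illustrative IQ-style scale (the SD and reliability values are examples, not from this article):

```python
import math

def sem(sd, reliability):
    """Standard error of measurement: SEM = SD * sqrt(1 - reliability)."""
    return sd * math.sqrt(1 - reliability)

def score_band(observed, sd, reliability, z=1.0):
    """Band of +/- z SEMs around an observed score (z=1 covers about 68%)."""
    half_width = z * sem(sd, reliability)
    return observed - half_width, observed + half_width

# Illustrative scale: SD = 15, reliability = 0.91, so SEM = 15 * 0.3 = 4.5
print(round(sem(15, 0.91), 2))           # 4.5
print(score_band(100, 15, 0.91))         # roughly (95.5, 104.5)
```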

  10. 2010 Workplace and Gender Relations Survey of Active Duty Members. Overview Report on Sexual Harassment

    DTIC Science & Technology

    2011-04-01

    [Survey questionnaire residue omitted.] Margins of error range from ±1 to ±2 (WGRA 2010). The report covers members' experiences in the months prior to taking the survey and the details of incidents they have experienced. The report also includes an analysis of the effectiveness of

  11. Structured Set Intra Prediction With Discriminative Learning in a Max-Margin Markov Network for High Efficiency Video Coding

    PubMed Central

    Dai, Wenrui; Xiong, Hongkai; Jiang, Xiaoqian; Chen, Chang Wen

    2014-01-01

    This paper proposes a novel model on intra coding for High Efficiency Video Coding (HEVC), which simultaneously predicts blocks of pixels with optimal rate distortion. It utilizes the spatial statistical correlation for the optimal prediction based on 2-D contexts, in addition to formulating the data-driven structural interdependences to make the prediction error coherent with the probability distribution, which is desirable for successful transform and coding. The structured set prediction model incorporates a max-margin Markov network (M3N) to regulate and optimize multiple block predictions. The model parameters are learned by discriminating the actual pixel value from other possible estimates to maximize the margin (i.e., decision boundary bandwidth). Compared to existing methods that focus on minimizing prediction error, the M3N-based model adaptively maintains the coherence for a set of predictions. Specifically, the proposed model concurrently optimizes a set of predictions by associating the loss for individual blocks to the joint distribution of succeeding discrete cosine transform coefficients. When the sample size grows, the prediction error is asymptotically upper bounded by the training error under the decomposable loss function. As an internal step, we optimize the underlying Markov network structure to find states that achieve the maximal energy using expectation propagation. For validation, we integrate the proposed model into HEVC for optimal mode selection on rate-distortion optimization. The proposed prediction model obtains up to 2.85% bit rate reduction and achieves better visual quality in comparison to the HEVC intra coding. PMID:25505829

  12. A critical look at spatial scale choices in satellite-based aerosol indirect effect studies

    NASA Astrophysics Data System (ADS)

    Grandey, B. S.; Stier, P.

    2010-06-01

    Analysing satellite datasets over large regions may introduce spurious relationships between aerosol and cloud properties due to spatial variations in aerosol type, cloud regime and synoptic regime climatologies. Using MODerate resolution Imaging Spectroradiometer data, we calculate relationships between aerosol optical depth τa, derived liquid cloud droplet effective number concentration Ne and liquid cloud droplet effective radius re at different spatial scales. Generally, positive values of dlnNe/dlnτa are found for ocean regions, whilst negative values occur for many land regions. The spatial distribution of dlnre/dlnτa shows approximately the opposite pattern, with generally positive values for land regions and negative values for ocean regions. We find that for region sizes larger than 4°×4°, spurious spatial variations in retrieved cloud and aerosol properties can introduce widespread significant errors to calculations of dlnNe/dlnτa and dlnre/dlnτa. For regions on the scale of 60°×60°, these methodological errors may lead to an overestimate in global cloud albedo effect radiative forcing of order 80%.
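The dlnNe/dlnτa statistics are slopes of a least-squares fit in log-log space; a minimal sketch with synthetic data constructed to follow an exact power law:

```python
import math

def loglog_slope(tau, ne):
    """Least-squares slope of ln(ne) vs ln(tau): an estimate of dlnNe/dlntau."""
    x = [math.log(t) for t in tau]
    y = [math.log(n) for n in ne]
    x_mean = sum(x) / len(x)
    y_mean = sum(y) / len(y)
    s_xy = sum((xi - x_mean) * (yi - y_mean) for xi, yi in zip(x, y))
    s_xx = sum((xi - x_mean) ** 2 for xi in x)
    return s_xy / s_xx

# Synthetic aerosol-cloud data obeying Ne proportional to tau^0.4 exactly
tau = [0.05, 0.1, 0.2, 0.4]
ne = [50.0 * t ** 0.4 for t in tau]
print(round(loglog_slope(tau, ne), 6))  # 0.4
```

The paper's point is that when such a fit pools data across heterogeneous regimes, the recovered slope can reflect spatial climatology rather than the aerosol-cloud interaction of interest.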

  13. Dynamic diagnostics of the error fields in tokamaks

    NASA Astrophysics Data System (ADS)

    Pustovitov, V. D.

    2007-07-01

    The error field diagnostics based on magnetic measurements outside the plasma is discussed. The analysed methods rely on measuring the plasma dynamic response to the finite-amplitude external magnetic perturbations, which are the error fields and the pre-programmed probing pulses. Such pulses can be created by the coils designed for static error field correction and for stabilization of the resistive wall modes, the technique developed and applied in several tokamaks, including DIII-D and JET. Here analysis is based on the theory predictions for the resonant field amplification (RFA). To achieve the desired level of the error field correction in tokamaks, the diagnostics must be sensitive to signals of several Gauss. Therefore, part of the measurements should be performed near the plasma stability boundary, where the RFA effect is stronger. While the proximity to the marginal stability is important, the absolute values of plasma parameters are not. This means that the necessary measurements can be done in the diagnostic discharges with parameters below the nominal operating regimes, with the stability boundary intentionally lowered. The estimates for ITER are presented. The discussed diagnostics can be tested in dedicated experiments in existing tokamaks. The diagnostics can be considered as an extension of the 'active MHD spectroscopy' used recently in the DIII-D tokamak and the EXTRAP T2R reversed field pinch.

  14. The cross-correlation between 3D cosmic shear and the integrated Sachs-Wolfe effect

    NASA Astrophysics Data System (ADS)

    Zieser, Britta; Merkel, Philipp M.

    2016-06-01

    We present the first calculation of the cross-correlation between 3D cosmic shear and the integrated Sachs-Wolfe (iSW) effect. Both signals are combined in a single formalism, which permits the computation of the full covariance matrix. In order to avoid the uncertainties presented by the non-linear evolution of the matter power spectrum and intrinsic alignments of galaxies, our analysis is restricted to large scales, i.e. multipoles below ℓ = 1000. We demonstrate in a Fisher analysis that this reduction compared to other studies of 3D weak lensing extending to smaller scales is compensated by the information that is gained if the additional iSW signal and in particular its cross-correlation with lensing data are considered. Given the observational standards of upcoming weak-lensing surveys like Euclid, marginal errors on cosmological parameters decrease by 10 per cent compared to a cosmic shear experiment if both types of information are combined without a cosmic microwave background (CMB) prior. Once the constraining power of CMB data is added, the improvement becomes marginal.

  15. Local Setup Reproducibility of the Spinal Column When Using Intensity-Modulated Radiation Therapy for Craniospinal Irradiation With Patient in Supine Position

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stoiber, Eva Maria, E-mail: eva.stoiber@med.uni-heidelberg.de; Department of Medical Physics, German Cancer Research Center, Heidelberg; Giske, Kristina

    Purpose: To evaluate local positioning errors of the lumbar spine during fractionated intensity-modulated radiotherapy of patients treated with craniospinal irradiation and to assess the impact of rotational error correction on these uncertainties for one patient setup correction strategy. Methods and Materials: 8 patients (6 adults, 2 children) treated with helical tomotherapy for craniospinal irradiation were retrospectively chosen for this analysis. Patients were immobilized with a deep-drawn Aquaplast head mask. In addition to daily megavoltage control computed tomography scans of the skull, once-a-week positioning of the lumbar spine was assessed. For this purpose, patient setup was corrected by a target point correction derived from a registration of the patient's skull. The residual positioning variations of the lumbar spine were evaluated by applying a rigid-registration algorithm. The impact of different rotational error corrections was simulated. Results: After target point correction, residual local positioning errors of the lumbar spine varied considerably. Rotational error correction about the craniocaudal axis neither improved nor deteriorated these translational errors, whereas simulated rotational error correction about the right-left and anterior-posterior axes increased these errors by a factor of 2 to 3. Conclusion: The patient fixation used allows for deformations between the patient's skull and spine. Therefore, for the setup correction strategy evaluated in this study, generous margins for the lumbar spinal target volume are needed to prevent a local geographic miss. With any applied correction strategy, it needs to be evaluated whether or not a rotational error correction is beneficial.

  16. Enhancement of Speed Margins for 16× Digital Versatile Disc-Random Access Memory

    NASA Astrophysics Data System (ADS)

    Watanabe, Koichi; Minemura, Hiroyuki; Miyamoto, Makoto; Iimura, Makoto

    2006-02-01

    We have evaluated the speed margins of write/read 16× digital versatile disc-random access memory (DVD-RAM) test discs using write strategies for 6-16× constant angular velocity (CAV) control. Our approach is to determine the writing parameters for the middle zones by interpolating over the zone numbers. Using this interpolation strategy, we successfully obtained overwrite jitter values of less than 8% and bit error rates of less than 10^-5 in 6-16× DVD-RAM. Moreover, we confirmed that the speed margins were ±20% for 6-16× CAV.

  17. Comparing risk in conventional and organic dairy farming in the Netherlands: an empirical analysis.

    PubMed

    Berentsen, P B M; Kovacs, K; van Asseldonk, M A P M

    2012-07-01

    This study was undertaken to contribute to the understanding of why most dairy farmers do not convert to organic farming. Therefore, the objective of this research was to assess and compare risks for conventional and organic farming in the Netherlands with respect to gross margin and the underlying price and production variables. To investigate the risk factors a farm accountancy database was used containing panel data from both conventional and organic representative Dutch dairy farms (2001-2007). Variables with regard to price and production risk were identified using a gross margin analysis scheme. Price risk variables were milk price and concentrate price. The main production risk variables were milk yield per cow, roughage yield per hectare, and veterinary costs per cow. To assess risk, an error component implicit detrending method was applied and the resulting detrended standard deviations were compared between conventional and organic farms. Results indicate that the risk included in the gross margin per cow is significantly higher in organic farming. This is caused by both higher price and production risks. Price risks are significantly higher in organic farming for both milk price and concentrate price. With regard to production risk, only milk yield per cow poses a significantly higher risk in organic farming. Copyright © 2012 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
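    The risk measure above is a detrended standard deviation. As a rough illustration, the sketch below removes a linear time trend from a price or yield series and takes the standard deviation of the residuals (a simplification: the study's error component implicit detrending method is not fully specified in the abstract):

```python
import math

def detrended_std(series):
    # Fit a linear time trend by ordinary least squares, then return the
    # standard deviation of the residuals (n - 2 degrees of freedom).
    n = len(series)
    t = list(range(n))
    mt = sum(t) / n
    my = sum(series) / n
    sxx = sum((ti - mt) ** 2 for ti in t)
    sxy = sum((ti - mt) * (yi - my) for ti, yi in zip(t, series))
    beta = sxy / sxx
    alpha = my - beta * mt
    resid = [yi - (alpha + beta * ti) for ti, yi in zip(t, series)]
    return math.sqrt(sum(r * r for r in resid) / (n - 2))
```

A perfectly linear series has zero detrended risk; year-to-year fluctuations around the trend show up as positive values, which can then be compared between farm types.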

  18. On solid ground. Revenue gains continue to outpace growth in expenses, allowing U.S. hospitals to enjoy record profit and margin.

    PubMed

    Evans, Melanie

    2007-10-29

    Hospitals enjoyed a surge in profits last year, reporting an aggregate profit margin of 6%. Executives at financially strong systems credit long-term efforts to improve performance for the results. Elizabeth Concordia of the University of Pittsburgh Medical Center system says its efforts stressed ongoing consolidation and integration to wipe out waste and errors.

  19. On land-use modeling: A treatise of satellite imagery data and misclassification error

    NASA Astrophysics Data System (ADS)

    Sandler, Austin M.

    Recent availability of satellite-based land-use data sets, including data sets with contiguous spatial coverage over large areas, relatively long temporal coverage, and fine-scale land cover classifications, is providing new opportunities for land-use research. However, care must be used when working with these datasets due to misclassification error, which causes inconsistent parameter estimates in the discrete choice models typically used to model land-use. I therefore adapt the empirical correction methods developed for other contexts (e.g., epidemiology) so that they can be applied to land-use modeling. I then use a Monte Carlo simulation, and an empirical application using actual satellite imagery data from the Northern Great Plains, to compare the results of a traditional model ignoring misclassification to those from models accounting for misclassification. Results from both the simulation and application indicate that ignoring misclassification will lead to biased results. Even seemingly insignificant levels of misclassification error (e.g., 1%) result in biased parameter estimates, which alter marginal effects enough to affect policy inference. At the levels of misclassification typical in current satellite imagery datasets (e.g., as high as 35%), ignoring misclassification can lead to systematically erroneous land-use probabilities and substantially biased marginal effects. The correction methods I propose, however, generate consistent parameter estimates and therefore consistent estimates of marginal effects and predicted land-use probabilities.
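    The kind of correction adapted above can be illustrated, in the simplest possible setting, by inverting known misclassification rates for a single observed land-use proportion (a hypothetical simplification of the discrete-choice correction the dissertation develops):

```python
def corrected_proportion(observed_p, false_pos_rate, false_neg_rate):
    # An observed proportion mixes correctly classified positives with
    # misclassified negatives:
    #   observed = p * (1 - fnr) + (1 - p) * fpr
    # Inverting this map recovers the true proportion, assuming the
    # error rates are known (e.g., from a validation sample).
    denom = 1.0 - false_pos_rate - false_neg_rate
    if denom <= 0:
        raise ValueError("error rates too large to identify the true proportion")
    return (observed_p - false_pos_rate) / denom
```

For example, with a true cropland share of 0.40, a 10% false-positive rate and a 35% false-negative rate, the naive observed share is 0.32; the correction recovers 0.40, illustrating how even moderate misclassification biases uncorrected estimates.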

  20. Estimation of health effects of prenatal methylmercury exposure using structural equation models.

    PubMed

    Budtz-Jørgensen, Esben; Keiding, Niels; Grandjean, Philippe; Weihe, Pal

    2002-10-14

    Observational studies in epidemiology always involve concerns regarding validity, especially measurement error, confounding, missing data, and other problems that may affect the study outcomes. Widely used standard statistical techniques, such as multiple regression analysis, may to some extent adjust for these shortcomings. However, structural equations may incorporate most of these considerations, thereby providing overall adjusted estimations of associations. This approach was used in a large epidemiological data set from a prospective study of developmental methylmercury toxicity. Structural equation models were developed for assessment of the association between biomarkers of prenatal mercury exposure and neuropsychological test scores in 7-year-old children. Eleven neurobehavioral outcomes were grouped into motor function and verbally mediated function. Adjustment for local dependence and item bias was necessary for a satisfactory fit of the model, but had little impact on the estimated mercury effects. The mercury effect on the two latent neurobehavioral functions was similar to the strongest effects seen for individual test scores of motor function and verbal skills. Adjustment for contaminant exposure to polychlorinated biphenyls (PCBs) changed the estimates only marginally, but the mercury effect could be reduced to non-significance by assuming a large measurement error for the PCB biomarker. The structural equation analysis allows correction for measurement error in exposure variables, incorporation of multiple outcomes and incomplete cases. This approach therefore deserves to be applied more frequently in the analysis of complex epidemiological data sets.

  1. Stimulating Parenting Practices in Indigenous and Non-Indigenous Mexican Communities

    PubMed Central

    Ozer, Emily J.; Dow, William

    2017-01-01

    Parenting may be influenced by ethnicity, marginalization, education, and poverty. A critical but unexamined question is how these factors may interact to compromise or support parenting practices in ethnic minority communities. This analysis examined associations between mothers’ stimulating parenting practices and a range of child-level (age, sex, and cognitive and socio-emotional development), household-level (indigenous ethnicity, poverty, and parental education), and community-level (economic marginalization and majority indigenous population) variables among 1893 children ages 4–18 months in poor, rural communities in Mexico. We also explored modifiers of associations between living in an indigenous community and parenting. Key findings were that stimulating parenting was negatively associated with living in an indigenous community or family self-identification as indigenous (β = −4.25, SE (Standard Error) = 0.98; β = −1.58, SE = 0.83, respectively). However, living in an indigenous community was associated with significantly more stimulating parenting among indigenous families than living in a non-indigenous community (β = 2.96, SE = 1.25). Maternal education was positively associated with stimulating parenting only in indigenous communities, and household crowding was negatively associated with stimulating parenting only in non-indigenous communities. Mothers’ parenting practices were not associated with child sex, father’s residential status, education, or community marginalization. Our findings demonstrate that despite greater community marginalization, living in an indigenous community is protective for stimulating parenting practices of indigenous mothers. PMID:29295595

  2. Stimulating Parenting Practices in Indigenous and Non-Indigenous Mexican Communities.

    PubMed

    Knauer, Heather A; Ozer, Emily J; Dow, William; Fernald, Lia C H

    2017-12-25

    Parenting may be influenced by ethnicity, marginalization, education, and poverty. A critical but unexamined question is how these factors may interact to compromise or support parenting practices in ethnic minority communities. This analysis examined associations between mothers' stimulating parenting practices and a range of child-level (age, sex, and cognitive and socio-emotional development), household-level (indigenous ethnicity, poverty, and parental education), and community-level (economic marginalization and majority indigenous population) variables among 1893 children ages 4-18 months in poor, rural communities in Mexico. We also explored modifiers of associations between living in an indigenous community and parenting. Key findings were that stimulating parenting was negatively associated with living in an indigenous community or family self-identification as indigenous (β = -4.25, SE (Standard Error) = 0.98; β = -1.58, SE = 0.83, respectively). However, living in an indigenous community was associated with significantly more stimulating parenting among indigenous families than living in a non-indigenous community (β = 2.96, SE = 1.25). Maternal education was positively associated with stimulating parenting only in indigenous communities, and household crowding was negatively associated with stimulating parenting only in non-indigenous communities. Mothers' parenting practices were not associated with child sex, father's residential status, education, or community marginalization. Our findings demonstrate that despite greater community marginalization, living in an indigenous community is protective for stimulating parenting practices of indigenous mothers.

  3. Technical Note: Millimeter precision in ultrasound based patient positioning: Experimental quantification of inherent technical limitations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ballhausen, Hendrik, E-mail: hendrik.ballhausen@med.uni-muenchen.de; Hieber, Sheila; Li, Minglun

    2014-08-15

    Purpose: To identify the relevant technical sources of error of a system based on three-dimensional ultrasound (3D US) for patient positioning in external beam radiotherapy. To quantify these sources of error in a controlled laboratory setting. To estimate the resulting end-to-end geometric precision of the intramodality protocol. Methods: Two identical free-hand 3D US systems at both the planning-CT and the treatment room were calibrated to the laboratory frame of reference. Every step of the calibration chain was repeated multiple times to estimate its contribution to overall systematic and random error. Optimal margins were computed given the identified and quantified systematic and random errors. Results: In descending order of magnitude, the identified and quantified sources of error were: alignment of calibration phantom to laser marks 0.78 mm, alignment of lasers in treatment vs planning room 0.51 mm, calibration and tracking of 3D US probe 0.49 mm, alignment of stereoscopic infrared camera to calibration phantom 0.03 mm. Under ideal laboratory conditions, these errors are expected to limit ultrasound-based positioning to an accuracy of 1.05 mm radially. Conclusions: The investigated 3D ultrasound system achieves an intramodal accuracy of about 1 mm radially in a controlled laboratory setting. The identified systematic and random errors require an optimal clinical tumor volume to planning target volume margin of about 3 mm. These inherent technical limitations do not prevent clinical use, including hypofractionation or stereotactic body radiation therapy.
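    The quoted 1.05 mm radial accuracy is consistent with combining the four component errors in quadrature; the margin step can then be sketched with the common van Herk population recipe (an assumption for illustration only: the abstract does not state which margin formula was used):

```python
import math

# Component errors identified in the laboratory study (mm)
components_mm = [0.78, 0.51, 0.49, 0.03]

# Combined in quadrature, these reproduce the quoted ~1.05 mm radial accuracy
total_mm = math.sqrt(sum(c * c for c in components_mm))

def ctv_to_ptv_margin(big_sigma_mm, small_sigma_mm):
    # van Herk population recipe: margin = 2.5*Sigma + 0.7*sigma,
    # with Sigma the systematic and sigma the random error (both in mm).
    return 2.5 * big_sigma_mm + 0.7 * small_sigma_mm
```

Treating the ~1 mm figure as both the systematic and random component yields a margin in the low-3 mm range, in line with the "about 3 mm" stated in the conclusions.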

  4. Under conditions of large geometric miss, tumor control probability can be higher for static gantry intensity-modulated radiation therapy compared to volume-modulated arc therapy for prostate cancer.

    PubMed

    Balderson, Michael; Brown, Derek; Johnson, Patricia; Kirkby, Charles

    2016-01-01

    The purpose of this work was to compare static gantry intensity-modulated radiation therapy (IMRT) with volume-modulated arc therapy (VMAT) in terms of tumor control probability (TCP) under scenarios involving large geometric misses, i.e., those beyond what are accounted for when margin expansion is determined. Using a planning approach typical for these treatments, a linear-quadratic-based model for TCP was used to compare mean TCP values for a population of patients who experiences a geometric miss (i.e., systematic and random shifts of the clinical target volume within the planning target dose distribution). A Monte Carlo approach was used to account for the different biological sensitivities of a population of patients. Interestingly, for errors consisting of coplanar systematic target volume offsets and three-dimensional random offsets, static gantry IMRT appears to offer an advantage over VMAT in that larger shift errors are tolerated for the same mean TCP. For example, under the conditions simulated, erroneous systematic shifts of 15 mm directly between or directly into static gantry IMRT fields result in mean TCP values between 96% and 98%, whereas the same errors on VMAT plans result in mean TCP values between 45% and 74%. Random geometric shifts of the target volume were characterized using normal distributions in each Cartesian dimension. When the standard deviations were doubled from those values assumed in the derivation of the treatment margins, our model showed a 7% drop in mean TCP for the static gantry IMRT plans but a 20% drop in TCP for the VMAT plans. Although adding a margin for error to a clinical target volume is perhaps the best approach to account for expected geometric misses, this work suggests that static gantry IMRT may offer a treatment that is more tolerant to geometric miss errors than VMAT. Copyright © 2016 American Association of Medical Dosimetrists. Published by Elsevier Inc. All rights reserved.
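    A minimal sketch of a Poisson TCP model with linear-quadratic cell kill, in the spirit of the linear-quadratic-based model described above (the clonogen number, dose and radiosensitivity values used below are hypothetical, not those of the study):

```python
import math

def tcp_poisson_lq(n_clonogens, dose_per_fraction, n_fractions, alpha, beta):
    # Linear-quadratic surviving fraction accumulated over all fractions:
    #   SF = exp(-n * (alpha*d + beta*d^2))
    log_sf = -n_fractions * (alpha * dose_per_fraction
                             + beta * dose_per_fraction ** 2)
    expected_survivors = n_clonogens * math.exp(log_sf)
    # Poisson probability that no clonogen survives the course
    return math.exp(-expected_survivors)
```

A geometric miss enters such a model by reducing the dose delivered to part of the clonogen population, which drives the expected number of survivors, and hence TCP, down; averaging over sampled (alpha, beta) values gives the population-mean TCP used in the comparison.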

  5. Prostate Localization on Daily Cone-Beam Computed Tomography Images: Accuracy Assessment of Similarity Metrics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Jinkoo, E-mail: jkim3@hfhs.or; Hammoud, Rabih; Pradhan, Deepak

    2010-07-15

    Purpose: To evaluate different similarity metrics (SM) using natural calcifications and observation-based measures to determine the most accurate prostate and seminal vesicle localization on daily cone-beam CT (CBCT) images. Methods and Materials: CBCT images of 29 patients were retrospectively analyzed; 14 patients with prostate calcifications (calcification data set) and 15 patients without calcifications (no-calcification data set). Three groups of test registrations were performed. Test 1: 70 CT/CBCT pairs from the calcification data set were registered using 17 SMs (6,580 registrations) and compared using the calcification mismatch error as an endpoint. Test 2: Using the four best SMs from Test 1, 75 CT/CBCT pairs in the no-calcification data set were registered (300 registrations). Accuracy of contour overlays was ranked visually. Test 3: For the best SM from Tests 1 and 2, accuracy was estimated using 356 CT/CBCT registrations. Additionally, target expansion margins were investigated for generating registration regions of interest. Results: Test 1: Incremental sign correlation (ISC), gradient correlation (GC), gradient difference (GD), and normalized cross correlation (NCC) showed the smallest errors (μ ± σ: 1.6 ± 0.9 to 2.9 ± 2.1 mm). Test 2: Two of the three reviewers ranked GC higher. Test 3: Using GC, 96% of registrations showed <3-mm error when calcifications were filtered. Errors were left/right: 0.1 ± 0.5 mm, anterior/posterior: 0.8 ± 1.0 mm, and superior/inferior: 0.5 ± 1.1 mm. The existence of calcifications increased the success rate to 97%. Expansion margins of 4-10 mm were equally successful. Conclusion: Gradient-based SMs were most accurate. Estimated error was found to be <3 mm (1.1 mm SD) in 96% of the registrations. Results suggest that the contour expansion margin should be no less than 4 mm.

  6. Mental Depreciation and Marginal Decision Making

    PubMed

    Heath; Fennema

    1996-11-01

    We propose that individuals practice "mental depreciation," that is, they implicitly spread the fixed costs of their expenses over time or use. Two studies explore how people spread fixed costs on durable goods. A third study shows that depreciation can lead to two distinct errors in marginal decisions: First, people sometimes invest too much effort to get their money's worth from an expense (e.g., they may use a product a lot to spread the fixed expense across more uses). Second, people sometimes invest too little effort to get their money's worth: When people add a portion of the fixed cost to the current costs, their perceived marginal (i.e., incremental) costs exceed their true marginal costs. In response, they may stop investing because their perceived costs surpass the marginal benefits they are receiving. The latter effect is supported by two field studies that explore real board plan decisions by university students.

  7. Reduced change blindness suggests enhanced attention to detail in individuals with autism.

    PubMed

    Smith, Hayley; Milne, Elizabeth

    2009-03-01

    The phenomenon of change blindness illustrates that a limited number of items within the visual scene are attended to at any one time. It has been suggested that individuals with autism focus attention on less contextually relevant aspects of the visual scene, show superior perceptual discrimination and notice details which are often ignored by typical observers. In this study we investigated change blindness in autism by asking participants to detect continuity errors deliberately introduced into a short film. Whether the continuity errors involved central/marginal or social/non-social aspects of the visual scene was varied. Thirty adolescent participants, 15 with autistic spectrum disorder (ASD) and 15 typically developing (TD) controls participated. The participants with ASD detected significantly more errors than the TD participants. Both groups identified more errors involving central rather than marginal aspects of the scene, although this effect was larger in the TD participants. There was no difference in the number of social or non-social errors detected by either group of participants. In line with previous data suggesting an abnormally broad attentional spotlight and enhanced perceptual function in individuals with ASD, the results of this study suggest enhanced awareness of the visual scene in ASD. The results of this study could reflect superior top-down control of visual search in autism, enhanced perceptual function, or inefficient filtering of visual information in ASD.

  8. A quantitative microscopic approach to predict local recurrence based on in vivo intraoperative imaging of sarcoma tumor margins

    PubMed Central

    Mueller, Jenna L.; Fu, Henry L.; Mito, Jeffrey K.; Whitley, Melodi J.; Chitalia, Rhea; Erkanli, Alaattin; Dodd, Leslie; Cardona, Diana M.; Geradts, Joseph; Willett, Rebecca M.; Kirsch, David G.; Ramanujam, Nimmi

    2015-01-01

    The goal of resection of soft tissue sarcomas located in the extremity is to preserve limb function while completely excising the tumor with a margin of normal tissue. With surgery alone, one-third of patients with soft tissue sarcoma of the extremity will have local recurrence due to microscopic residual disease in the tumor bed. Currently, a limited number of intraoperative pathology-based techniques are used to assess margin status; however, few have been widely adopted due to sampling error and time constraints. To aid in intraoperative diagnosis, we developed a quantitative optical microscopy toolbox, which includes acriflavine staining, fluorescence microscopy, and analytic techniques called sparse component analysis and circle transform to yield quantitative diagnosis of tumor margins. A series of variables were quantified from images of resected primary sarcomas and used to optimize a multivariate model. The sensitivity and specificity for differentiating positive from negative ex vivo resected tumor margins were 82% and 75%, respectively. The utility of this approach was tested by imaging the in vivo tumor cavities of 34 mice after resection of a sarcoma, with local recurrence as a benchmark. When applied prospectively to images from the tumor cavity, the sensitivity and specificity for differentiating local recurrence were 78% and 82%. For comparison, if pathology were used to predict local recurrence in this data set, it would achieve a sensitivity of 29% and a specificity of 71%. These results indicate a robust approach for detecting microscopic residual disease, which is an effective predictor of local recurrence. PMID:25994353

  9. On the Performance of the Marginal Homogeneity Test to Detect Rater Drift.

    PubMed

    Sgammato, Adrienne; Donoghue, John R

    2018-06-01

    When constructed response items are administered repeatedly, "trend scoring" can be used to test for rater drift. In trend scoring, raters rescore responses from the previous administration. Two simulation studies evaluated the utility of Stuart's Q measure of marginal homogeneity for detecting rater drift when monitoring trend scoring. In the first study, data were generated based on trend scoring tables obtained from an operational assessment. The second study tightly controlled table margins to disentangle certain features present in the empirical data. In addition to Q, the paired t test was included as a comparison because of its widespread use in monitoring trend scoring. Sample size, number of score categories, interrater agreement, and symmetry/asymmetry of the margins were manipulated. For identical margins, both statistics had good Type I error control. For a unidirectional shift in margins, both statistics had good power. As expected, when shifts in the margins were balanced across categories, the t test had little power. Q demonstrated good power for all conditions and identified almost all items identified by the t test. Q shows substantial promise for monitoring of trend scoring.
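    Stuart's Q (the Stuart-Maxwell statistic) tests whether the row and column margins of a square rater-agreement table are equal. A sketch of the computation follows; a useful sanity check, used in the test below, is that in the 2×2 case it reduces to McNemar's statistic (b − c)²/(b + c):

```python
def _solve(a, b):
    # Gaussian elimination with partial pivoting for a small dense system.
    n = len(b)
    m = [row[:] + [bi] for row, bi in zip(a, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, n):
            f = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= f * m[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (m[r][n] - sum(m[r][c] * x[c] for c in range(r + 1, n))) / m[r][r]
    return x

def stuart_q(table):
    # Stuart-Maxwell statistic Q = d' S^{-1} d, where d holds the first
    # K-1 row-minus-column margin differences of a KxK agreement table.
    # Under marginal homogeneity, Q is chi-square with K-1 df.
    k = len(table)
    rows = [sum(row) for row in table]
    cols = [sum(table[i][j] for i in range(k)) for j in range(k)]
    d = [float(rows[i] - cols[i]) for i in range(k - 1)]
    s = [[float(rows[i] + cols[i] - 2 * table[i][i]) if i == j
          else float(-(table[i][j] + table[j][i]))
          for j in range(k - 1)] for i in range(k - 1)]
    x = _solve(s, d)
    return sum(di * xi for di, xi in zip(d, x))
```

A perfectly symmetric table (no drift between administrations) gives Q = 0; drift toward higher or lower score categories shows up as a large Q relative to the chi-square reference distribution.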

  10. Dynamic Modeling of the SMAP Rotating Flexible Antenna

    NASA Technical Reports Server (NTRS)

    Nayeri, Reza D.

    2012-01-01

    Dynamic model development in ADAMS for the SMAP project is explained. The main objectives of the dynamic models are pointing error assessment and verification of the control/stability margin requirements.

  11. VCSEL-based fiber optic link for avionics: implementation and performance analyses

    NASA Astrophysics Data System (ADS)

    Shi, Jieqin; Zhang, Chunxi; Duan, Jingyuan; Wen, Huaitao

    2006-11-01

    A Gb/s fiber optic link with built-in test (BIT) capability, based on vertical-cavity surface-emitting laser (VCSEL) sources and intended for next-generation military avionics buses, is presented in this paper. To accurately predict link performance, statistical methods and bit error rate (BER) measurements have been examined. The results show that the 1 Gb/s fiber optic link meets the BER requirement, with link margin values of up to 13 dB. Analysis shows that the suggested photonic network may provide a high-performance, low-cost interconnection alternative for future military avionics.
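    At the level of a power budget, link margin is simply transmit power minus total losses minus receiver sensitivity, all in dB; the sketch below uses illustrative numbers, not the measured values from the paper:

```python
def link_margin_db(p_tx_dbm, total_losses_db, rx_sensitivity_dbm):
    # Link margin (dB) = launched power (dBm) - connector/fiber losses (dB)
    #                  - receiver sensitivity at the target BER (dBm).
    return p_tx_dbm - total_losses_db - rx_sensitivity_dbm

# Hypothetical budget: -3 dBm VCSEL launch power, 10 dB of path loss,
# and a receiver sensitivity of -26 dBm at the target BER.
margin = link_margin_db(-3.0, 10.0, -26.0)  # 13 dB of headroom
```

Positive margin means the received power exceeds the sensitivity needed for the target BER, leaving headroom for aging, temperature drift, and repair splices.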

  12. Experiments in monthly mean simulation of the atmosphere with a coarse-mesh general circulation model

    NASA Technical Reports Server (NTRS)

    Lutz, R. J.; Spar, J.

    1978-01-01

    The Hansen atmospheric model was used to compute five monthly forecasts (October 1976 through February 1977). The comparison with observations is based on an energetics analysis, meridional and vertical profiles, error statistics, and prognostic and observed mean maps. The monthly mean model simulations suffer from several defects. There is, in general, no skill in the simulation of the monthly mean sea-level pressure field, and only marginal skill is indicated for the 850 mb temperatures and 500 mb heights. The coarse-mesh model appears to generate a less satisfactory monthly mean simulation than the finer-mesh GISS model.

  13. Non-inferiority trials: are they inferior? A systematic review of reporting in major medical journals

    PubMed Central

    Morris, Tim P; Fielding, Katherine; Carpenter, James R; Phillips, Patrick P J

    2016-01-01

    Objective To assess the adequacy of reporting of non-inferiority trials alongside the consistency and utility of current recommended analyses and guidelines. Design Review of randomised clinical trials that used a non-inferiority design published between January 2010 and May 2015 in medical journals that had an impact factor >10 (JAMA Internal Medicine, Archives of Internal Medicine, PLOS Medicine, Annals of Internal Medicine, BMJ, JAMA, Lancet and New England Journal of Medicine). Data sources Ovid (MEDLINE). Methods We searched for non-inferiority trials and assessed the following: choice of non-inferiority margin and justification of margin; power and significance level for sample size; patient population used and how this was defined; any missing data methods used and assumptions declared and any sensitivity analyses used. Results A total of 168 trial publications were included. Most trials concluded non-inferiority (132; 79%). The non-inferiority margin was reported for 98% (164), but less than half reported any justification for the margin (77; 46%). While most reported two different analyses (91; 54%), most commonly intention-to-treat (ITT) or modified ITT and per-protocol, a large number of articles conducted and reported only one analysis (65; 39%), most commonly the ITT analysis. There was lack of clarity or inconsistency between the type I error rate and corresponding CIs for 73 (43%) articles. Missing data were rarely considered, with 99 (59%) articles not declaring whether imputation techniques were used. Conclusions Reporting and conduct of non-inferiority trials is inconsistent and does not follow the recommendations in available statistical guidelines, which are not wholly consistent themselves. Authors should clearly describe the methods used and provide clear descriptions of and justifications for their design and primary analysis. Failure to do this risks misleading conclusions being drawn, with consequent effects on clinical practice.
PMID:27855102
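    The core non-inferiority decision rule checks the lower confidence bound of the treatment difference against the pre-specified margin. A generic sketch for a risk difference with a Wald interval follows (illustrative only, not the method of any particular trial reviewed; the proportions and sample sizes below are hypothetical):

```python
import math

def non_inferior(p_new, p_ctrl, n_new, n_ctrl, margin, z=1.96):
    # Declare non-inferiority if the lower bound of the two-sided 95% CI
    # for (p_new - p_ctrl) lies above -margin (higher proportion = better).
    diff = p_new - p_ctrl
    se = math.sqrt(p_new * (1 - p_new) / n_new
                   + p_ctrl * (1 - p_ctrl) / n_ctrl)
    lower = diff - z * se
    return lower > -margin
```

The same data can pass or fail depending on the margin chosen, which is why the review stresses reporting and justifying the margin: with success rates of 80% vs 82% in 500 patients per arm, a 10-percentage-point margin admits non-inferiority while a 5-point margin does not.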

  14. Resistive wall mode feedback control in EXTRAP T2R with improved steady-state error and transient response

    NASA Astrophysics Data System (ADS)

    Brunsell, P. R.; Olofsson, K. E. J.; Frassinetti, L.; Drake, J. R.

    2007-10-01

    Experiments in the EXTRAP T2R reversed field pinch [P. R. Brunsell, H. Bergsåker, M. Cecconello et al., Plasma Phys. Control. Fusion 43, 1457 (2001)] on feedback control of m =1 resistive wall modes (RWMs) are compared with simulations using the cylindrical linear magnetohydrodynamic model, including the dynamics of the active coils and power amplifiers. Stabilization of the main RWMs (n=-11,-10,-9,-8,+5,+6) is shown using modest loop gains of the order G ~ 1. However, other marginally unstable RWMs (n=-2,-1,+1,+2) driven by external field errors are only partially canceled at these gains. The experimental system stability limit is confirmed by simulations showing that the latency of the digital controller, ~50 μs, is degrading the system gain margin. The transient response is improved with a proportional-plus-derivative controller, and steady-state error is improved with a proportional-plus-integral controller. Suppression of all modes is obtained at high gain G ~ 10 using a proportional-plus-integral-plus-derivative controller.
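    The controller variants compared above differ only in which PID terms are active: the derivative term shapes the transient, while the integral term removes steady-state error. A minimal discrete-time sketch (gains and time step hypothetical, unrelated to the EXTRAP T2R hardware):

```python
class PID:
    # Minimal discrete PID controller: u = Kp*e + Ki*integral(e) + Kd*de/dt.
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def update(self, error):
        self.integral += error * self.dt
        deriv = (error - self.prev_err) / self.dt
        self.prev_err = error
        return (self.kp * error
                + self.ki * self.integral
                + self.kd * deriv)
```

Setting ki = kd = 0 gives the proportional controller, whose output, and hence whose cancellation of a constant error field, saturates; a nonzero ki keeps integrating a persistent error, which is why the PI variant improves the steady-state error.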

  15. A critical look at spatial scale choices in satellite-based aerosol indirect effect studies

    NASA Astrophysics Data System (ADS)

    Grandey, B. S.; Stier, P.

    2010-12-01

    Analysing satellite datasets over large regions may introduce spurious relationships between aerosol and cloud properties due to spatial variations in aerosol type, cloud regime and synoptic regime climatologies. Using MODerate resolution Imaging Spectroradiometer data, we calculate relationships between aerosol optical depth τa, derived liquid cloud droplet effective number concentration Ne, and liquid cloud droplet effective radius re at different spatial scales. Generally, positive values of dln Ne/dln τa are found for ocean regions, whilst negative values occur for many land regions. The spatial distribution of dln re/dln τa shows approximately the opposite pattern, with generally positive values for land regions and negative values for ocean regions. We find that for region sizes larger than 4° × 4°, spurious spatial variations in retrieved cloud and aerosol properties can introduce widespread significant errors to calculations of dln Ne/dln τa and dln re/dln τa. For regions on the scale of 60° × 60°, these methodological errors may lead to an overestimate in global cloud albedo effect radiative forcing of order 80% relative to that calculated for regions on the scale of 1° × 1°.
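    Sensitivities such as dln Ne/dln τa are slopes of a straight-line fit in log-log space. A sketch of that estimator follows (an assumption for illustration: the abstract does not specify the authors' exact fitting procedure):

```python
import math

def log_log_slope(tau_a, n_e):
    # Least-squares slope of ln(Ne) against ln(tau_a), i.e. an estimate
    # of the sensitivity d ln(Ne) / d ln(tau_a).
    xs = [math.log(t) for t in tau_a]
    ys = [math.log(n) for n in n_e]
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    return cov / var
```

Because the slope is computed from whatever pixels fall inside the chosen region, spatial gradients in aerosol type or cloud regime across a large region can masquerade as an aerosol-cloud relationship, which is the scale artefact the study quantifies.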

  16. Paradigms for parasite conservation.

    PubMed

    Dougherty, Eric R; Carlson, Colin J; Bueno, Veronica M; Burgio, Kevin R; Cizauskas, Carrie A; Clements, Christopher F; Seidel, Dana P; Harris, Nyeema C

    2016-08-01

    Parasitic species, which depend directly on host species for their survival, represent a major regulatory force in ecosystems and a significant component of Earth's biodiversity. Yet the negative impacts of parasites observed at the host level have motivated a conservation paradigm of eradication, moving us farther from attainment of taxonomically unbiased conservation goals. Despite a growing body of literature highlighting the importance of parasite-inclusive conservation, most parasite species remain understudied, underfunded, and underappreciated. We argue the protection of parasitic biodiversity requires a paradigm shift in the perception and valuation of their role as consumer species, similar to that of apex predators in the mid-20th century. Beyond recognizing parasites as vital trophic regulators, existing tools available to conservation practitioners should explicitly account for the unique threats facing dependent species. We built upon concepts from epidemiology and economics (e.g., host-density threshold and cost-benefit analysis) to devise novel metrics of margin of error and minimum investment for parasite conservation. We define margin of error as the risk of accidental host extinction from misestimating equilibrium population sizes and predicted oscillations, while minimum investment represents the cost associated with conserving the additional hosts required to maintain viable parasite populations. This framework will aid in the identification of readily conserved parasites that present minimal health risks. To establish parasite conservation, we propose an extension of population viability analysis for host-parasite assemblages to assess extinction risk. In the direst cases, ex situ breeding programs for parasites should be evaluated to maximize success without undermining host protection. 
Though parasitic species pose a considerable conservation challenge, adaptations to conservation tools will help protect parasite biodiversity in the face of an uncertain environmental future. © 2015 Society for Conservation Biology.

  17. Toward Prostate Cancer Contouring Guidelines on Magnetic Resonance Imaging: Dominant Lesion Gross and Clinical Target Volume Coverage Via Accurate Histology Fusion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gibson, Eli; Biomedical Engineering, University of Western Ontario, London, Ontario; Centre for Medical Image Computing, University College London, London

    Purpose: Defining prostate cancer (PCa) lesion clinical target volumes (CTVs) for multiparametric magnetic resonance imaging (mpMRI) could support focal boosting or treatment to improve outcomes or lower morbidity, necessitating appropriate CTV margins for mpMRI-defined gross tumor volumes (GTVs). This study aimed to identify CTV margins yielding 95% coverage of PCa tumors for prospective cases with high likelihood. Methods and Materials: Twenty-five men with biopsy-confirmed clinical stage T1 or T2 PCa underwent pre-prostatectomy mpMRI, yielding T2-weighted, dynamic contrast-enhanced, and apparent diffusion coefficient images. Digitized whole-mount histology was contoured and registered to mpMRI scans (error ≤2 mm). Four observers contoured lesion GTVs on each mpMRI scan. CTVs were defined by isotropic and anisotropic expansion from these GTVs and from multiparametric (unioned) GTVs from 2 to 3 scans. Histologic coverage (proportions of tumor area on co-registered histology inside the CTV, measured for Gleason scores [GSs] ≥6 and ≥7) and prostate sparing (proportions of prostate volume outside the CTV) were measured. Nonparametric histologic-coverage prediction intervals defined minimal margins yielding 95% coverage for prospective cases with 78% to 92% likelihood. Results: On analysis of 72 true-positive tumor detections, 95% coverage margins were 9 to 11 mm (GS ≥ 6) and 8 to 10 mm (GS ≥ 7) for single-sequence GTVs and were 8 mm (GS ≥ 6) and 6 mm (GS ≥ 7) for 3-sequence GTVs, yielding CTVs that spared 47% to 81% of prostate tissue for the majority of tumors. Inclusion of T2-weighted contours increased sparing for multiparametric CTVs with 95% coverage margins for GS ≥6, and inclusion of dynamic contrast-enhanced contours increased sparing for GS ≥7. Anisotropic 95% coverage margins increased the sparing proportions to 71% to 86%. 
Conclusions: Multiparametric magnetic resonance imaging–defined GTVs expanded by appropriate margins may support focal boosting or treatment of PCa; however, these margins, accounting for interobserver and intertumoral variability, may preclude highly conformal CTVs. Multiparametric GTVs and anisotropic margins may reduce the required margins and improve prostate sparing.

  18. Marine Corps Body Composition Program: The Flawed Measurement System

    DTIC Science & Technology

    2006-02-07

    fitness expert and writer for ABC Bodybuilding, an error of 3% in a body fat evaluation is extreme and methods that have this margin of error should not...most other methods. In fact, bodybuilders use a seven to nine point skin fold measurement weekly during their training to monitor body fat...19.95 and recommended and endorsed by “Body-For-Life” and the World Natural Bodybuilding Federation. The caliper comes with detailed instructions

  19. Sci—Fri AM: Mountain — 06: Optimizing planning target volume in lung radiotherapy using deformable registration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoang, P; Wierzbicki, M; Juravinski Cancer Centre, Medical Physics Department, Hamilton, Ontario

    A four dimensional computed tomography (4DCT) image is acquired for all radically treated, lung cancer patients to define the internal target volume (ITV), which encompasses tumour motion due to breathing and subclinical disease. Patient set-up error and anatomical motion that is not due to breathing is addressed through an additional 1 cm margin around the ITV to obtain the planning target volume (PTV). The objective of this retrospective study is to find the minimum PTV margin that provides an acceptable probability of delivering the prescribed dose to the ITV. Acquisition of a kV cone beam computed tomography (CBCT) image at each fraction was used to shift the treatment couch to accurately align the spinal cord and carina. Our method utilized deformable image registration to automatically position the planning ITV on each CBCT. We evaluated the percentage of the ITV surface that fell within various PTVs for 79 fractions across 18 patients. Treatment success was defined as a situation where at least 99% of the ITV is covered by the PTV. Overall, this is to be achieved in at least 90% of the treatment fractions. The current approach with a 1 cm PTV margin was successful ∼96% of the time. This analysis revealed that the current margin can be reduced to 0.8 cm isotropic or 0.6 × 0.6 × 1 cm³ non-isotropic, which were successful 92% and 91% of the time, respectively. Moreover, we have shown that these margins maintain accuracy, despite intrafractional variation, and maximize CBCT image guidance capabilities.

  20. SU-E-J-140: Availability of Using Diaphragm Matching in Stereotactic Body Radiotherapy (SBRT) at the Time in Breath-Holding SBRT for Liver Cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kawahara, D; Tsuda, S.; Section of Radiation Therapy, Department of Clinical Support, Hiroshima University Hospital Health

    2014-06-01

    Purpose: IGRT based on bone matching may produce a larger target positioning error in terms of the reproducibility of the expiration breath hold. Therefore, the feasibility of 3D image matching between the planning CT image and the pretreatment CBCT image based on diaphragm matching was investigated. Methods: In fifteen-nine liver SBRT cases, Lipiodol uptake after TACE was outlined as the marker of the tumor. The relative coordinate of the isocenter obtained by the contrast matching was defined as the reference coordinate. The target positioning difference between diaphragm matching and bone matching was evaluated by the relative coordinate of the isocenter from the reference coordinate obtained by each matching technique. In addition, we evaluated PTV margins by the van Herk setup margin formula. Results: The target positioning error by the diaphragm matching and the bone matching was 1.31±0.83 and 3.10±2.80 mm in the cranial-caudal (C-C) direction, 1.04±0.95 and 1.62±1.02 mm in the anterior-posterior (A-P) direction, and 0.93±1.19 and 1.12±0.94 mm in the left-right (L-R) direction, respectively. The positioning error by the diaphragm matching was significantly smaller than that by the bone matching in the C-C direction (p<0.05). The setup margins of diaphragm matching and bone matching calculated with the van Herk margin formula were 4.5 mm and 6.2 mm (C-C), 3.6 mm and 6.3 mm (A-P), and 2.6 mm and 4.5 mm (L-R), respectively. Conclusion: IGRT based on diaphragm matching could be one alternative image matching technique for the positioning of patients with liver tumor.
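    The van Herk setup margin referred to above is commonly written M = 2.5Σ + 0.7σ, where Σ is the standard deviation of the systematic error and σ that of the random error. A minimal sketch (the component values are illustrative, not this study's measurements):

```python
def van_herk_margin(sigma_systematic, sigma_random):
    """PTV margin (mm) from the van Herk recipe M = 2.5*Sigma + 0.7*sigma."""
    return 2.5 * sigma_systematic + 0.7 * sigma_random

# Illustrative component errors in mm, not the abstract's measured values.
margin_cc = van_herk_margin(1.5, 1.0)   # 2.5*1.5 + 0.7*1.0 = 4.45 mm
assert abs(margin_cc - 4.45) < 1e-9
```

The 2.5 weighting on Σ reflects that systematic errors shift the whole dose distribution for every fraction, whereas random errors only blur it.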

  1. Tumor resection at the pelvis using three-dimensional planning and patient-specific instruments: a case series.

    PubMed

    Jentzsch, Thorsten; Vlachopoulos, Lazaros; Fürnstahl, Philipp; Müller, Daniel A; Fuchs, Bruno

    2016-09-21

    Sarcomas are associated with a relatively high local recurrence rate of around 30 % in the pelvis. Inadequate surgical margins are the most important reason. However, obtaining adequate margins is particularly difficult in this anatomically demanding region. Recently, three-dimensional (3-D) planning, printed models, and patient-specific instruments (PSI) with cutting blocks have been introduced to improve the precision during surgical tumor resection. This case series illustrates these modern 3-D tools in pelvic tumor surgery. The first consecutive patients with 3-D-planned tumor resection around the pelvis were included in this retrospective study at a University Hospital in 2015. Detailed information about the clinical presentation, imaging techniques, preoperative planning, intraoperative surgical procedures, and postoperative evaluation is provided for each case. The primary outcome was tumor-free resection margins as assessed by a postoperative computed tomography (CT) scan of the specimen. The secondary outcomes were precision of preoperative planning and complications. Four patients with pelvic sarcomas were included in this study. The mean follow-up was 7.8 (range, 6.0-9.0) months. The combined use of preoperative planning with 3-D techniques, 3-D-printed models, and PSI for osteotomies led to higher precision (maximal (max) error of 0.4 centimeters (cm)) than conventional 3-D planning and freehand osteotomies (max error of 2.8 cm). Tumor-free margins were obtained where measurable (n = 3; margins were not assessable in a patient with curettage). Two insufficiency fractures were noted postoperatively. Three-dimensional planning as well as the intraoperative use of 3-D-printed models and PSI are valuable for complex sarcoma resection at the pelvis. Three-dimensionally printed models of the patient anatomy may help visualization and precision. PSI with cutting blocks help perform very precise osteotomies for adequate resection margins.

  2. Error in the Honeybee Waggle Dance Improves Foraging Flexibility

    PubMed Central

    Okada, Ryuichi; Ikeno, Hidetoshi; Kimura, Toshifumi; Ohashi, Mizue; Aonuma, Hitoshi; Ito, Etsuro

    2014-01-01

    The honeybee waggle dance communicates the location of profitable food sources, usually with a certain degree of error in the directional information ranging from 10–15° at the lower margin. We simulated one-day colonial foraging to address the biological significance of information error in the waggle dance. When the error was 30° or larger, the waggle dance was not beneficial. If the error was 15°, the waggle dance was beneficial when the food sources were scarce. When the error was 10° or smaller, the waggle dance was beneficial under all the conditions tested. Our simulation also showed that precise information (0–5° error) yielded great success in finding feeders, but also caused failures at finding new feeders, i.e., a high-risk high-return strategy. The observation that actual bees perform the waggle dance with an error of 10–15° might reflect, at least in part, the maintenance of a successful yet risky foraging trade-off. PMID:24569525
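    The precision-versus-exploration trade-off described above can be illustrated with a toy Monte Carlo: recruits fly in the advertised direction plus Gaussian angular error, and more error means fewer arrivals at the advertised feeder but more chance encounters with an undiscovered feeder elsewhere. All numbers are illustrative, not the paper's simulation parameters:

```python
import numpy as np

# Toy Monte Carlo of the waggle-dance error trade-off (illustrative values).
rng = np.random.default_rng(4)

def hit_rates(error_deg, n=100_000, feeder_halfwidth=5.0, new_feeder_at=25.0):
    """Fraction of recruits landing on the advertised feeder (at 0 degrees)
    and on a hypothetical undiscovered feeder (at 25 degrees)."""
    headings = rng.normal(0.0, error_deg, n)
    known = np.mean(np.abs(headings) < feeder_halfwidth)
    new = np.mean(np.abs(headings - new_feeder_at) < feeder_halfwidth)
    return known, new

known_precise, new_precise = hit_rates(2.0)    # high-precision dance
known_noisy, new_noisy = hit_rates(15.0)       # error at the observed margin

assert known_precise > known_noisy   # precision: more hits on the known feeder
assert new_noisy > new_precise       # error: more discoveries of the new feeder
```

This mirrors the abstract's "high-risk high-return" reading of very precise dances.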

  3. PAM-4 Signaling over VCSELs with 0.13µm CMOS Chip Technology

    NASA Astrophysics Data System (ADS)

    Cunningham, J. E.; Beckman, D.; Zheng, Xuezhe; Huang, Dawei; Sze, T.; Krishnamoorthy, A. V.

    2006-12-01

    We present results for VCSEL-based links operating with PAM-4 signaling using a commercial 0.13µm CMOS technology. We perform a complete link analysis of the bit error rate, Q factor, and random and deterministic jitter by measuring waterfall curves versus margins in time and amplitude. We demonstrate that VCSEL-based PAM-4 can match or even improve on the performance of binary signaling under the conditions of a bandwidth-limited, 100-meter multi-mode optical link at 5 Gbps. We present the first sensitivity measurements for optical PAM-4 and compare it with binary signaling. Measured benefits are reconciled with information theory predictions.

  4. PAM-4 Signaling over VCSELs with 0.13µm CMOS Chip Technology.

    PubMed

    Cunningham, J E; Beckman, D; Zheng, Xuezhe; Huang, Dawei; Sze, T; Krishnamoorthy, A V

    2006-12-11

    We present results for VCSEL-based links operating with PAM-4 signaling using a commercial 0.13µm CMOS technology. We perform a complete link analysis of the bit error rate, Q factor, and random and deterministic jitter by measuring waterfall curves versus margins in time and amplitude. We demonstrate that VCSEL-based PAM-4 can match or even improve on the performance of binary signaling under the conditions of a bandwidth-limited, 100-meter multi-mode optical link at 5 Gbps. We present the first sensitivity measurements for optical PAM-4 and compare it with binary signaling. Measured benefits are reconciled with information theory predictions.
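    The Q factor and bit error rate analyzed above are linked, under the usual Gaussian-noise approximation, by BER = ½ erfc(Q/√2). A minimal sketch:

```python
import math

def ber_from_q(q):
    """Gaussian-noise approximation: BER = 0.5 * erfc(Q / sqrt(2))."""
    return 0.5 * math.erfc(q / math.sqrt(2))

# Q = 7 corresponds to a BER of roughly 1e-12, a common link-budget target.
assert 1e-13 < ber_from_q(7.0) < 1e-11
```

For PAM-4 each of the three eyes is roughly one third of the full signal amplitude, so matching binary BER requires correspondingly higher SNR, which is why the bandwidth-limited regime in the abstract is where PAM-4 pays off.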

  5. Microprocessor-based cardiopulmonary monitoring system

    NASA Technical Reports Server (NTRS)

    1978-01-01

    The system uses a dedicated microprocessor for transducer control and data acquisition and analysis. No data will be stored in this system, but the data will be transmitted to the onboard data system. The data system will require approximately 12 inches of rack space and will consume only 100 watts of power. An experiment specific control panel, through a series of lighted buttons, will guide the operator through the test series providing a smaller margin of error. The experimental validity of the system was verified, and the reproducibility of data and reliability of the system checked. In addition, ease of training, ease of operator interaction, and crew acceptance were evaluated in actual flight conditions.

  6. Background CO2 levels and error analysis from ground-based solar absorption IR measurements in central Mexico

    NASA Astrophysics Data System (ADS)

    Baylon, Jorge L.; Stremme, Wolfgang; Grutter, Michel; Hase, Frank; Blumenstock, Thomas

    2017-07-01

    In this investigation we analyze two common optical configurations to retrieve CO2 total column amounts from solar absorption infrared spectra. The noise errors using either a KBr or a CaF2 beam splitter, a main component of a Fourier transform infrared spectrometer (FTIR), are quantified in order to assess the relative precisions of the measurements. The configuration using a CaF2 beam splitter, as deployed by the instruments which contribute to the Total Carbon Column Observing Network (TCCON), shows a slightly better precision. However, we show that the precisions in XCO2 (= 0.2095 · total column CO2 / total column O2) retrieved from > 96 % of the spectra measured with a KBr beam splitter fall well below 0.2 %. A bias in XCO2 (KBr − CaF2) of +0.56 ± 0.25 ppm was found when using an independent data set as reference. This value, which corresponds to +0.14 ± 0.064 %, is slightly larger than the mean precisions obtained. A 3-year XCO2 time series from FTIR measurements at the high-altitude site of Altzomoni in central Mexico presents clear annual and diurnal cycles, and a trend of +2.2 ppm yr−1 could be determined.
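    XCO2 is the column-averaged dry-air mole fraction obtained by ratioing the retrieved CO2 column to the O2 column and scaling by the atmospheric O2 mole fraction 0.2095, which cancels common-mode retrieval errors. A sketch with illustrative column amounts (not the Altzomoni data):

```python
def xco2(column_co2, column_o2, o2_mole_fraction=0.2095):
    """XCO2 in ppm: 0.2095 * (total column CO2 / total column O2).
    Columns must share units, e.g. molecules/cm^2."""
    return o2_mole_fraction * column_co2 / column_o2 * 1e6

# Illustrative column amounts in molecules/cm^2, not measured values.
value = xco2(8.0e21, 4.2e24)
assert 395 < value < 405   # roughly 399 ppm
```

Because O2 is well mixed, dividing by its column removes air-mass and instrument-throughput effects that would otherwise dominate the CO2 column alone.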

  7. Methods for determining time of death.

    PubMed

    Madea, Burkhard

    2016-12-01

    Medicolegal death time estimation must estimate the time since death reliably. Reliability can only be provided empirically by statistical analysis of errors in field studies. Determining the time since death requires the calculation of measurable data along a time-dependent curve back to the starting point. Various methods are used to estimate the time since death. The current gold standard for death time estimation is a previously established nomogram method based on the two-exponential model of body cooling. Great experimental and practical achievements have been realized using this nomogram method. To reduce the margin of error of the nomogram method, a compound method was developed based on electrical and mechanical excitability of skeletal muscle, pharmacological excitability of the iris, rigor mortis, and postmortem lividity. Further increasing the accuracy of death time estimation involves the development of conditional probability distributions for death time estimation based on the compound method. Although many studies have evaluated chemical methods of death time estimation, such methods play a marginal role in daily forensic practice. However, increased precision of death time estimation has recently been achieved by considering various influencing factors (i.e., preexisting diseases, duration of terminal episode, and ambient temperature). Putrefactive changes may be used for death time estimation in water-immersed bodies. Furthermore, recently developed technologies, such as ¹H magnetic resonance spectroscopy, can be used to quantitatively study decompositional changes. This review addresses the gold standard method of death time estimation in forensic practice and promising technological and scientific developments in the field.
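    The two-exponential cooling model behind the nomogram method back-calculates time since death by inverting a curve of the normalized temperature ratio Q = (T_rectal − T_ambient)/(37.2 − T_ambient). The sketch below uses the common parameterization Q(t) = 1.25·exp(Bt) − 0.25·exp(5Bt) with a hypothetical rate constant B; real casework uses the published nomogram with its corrective factors, not this toy inversion:

```python
import math

# Illustrative back-calculation of time since death from a two-exponential
# cooling curve. B here is a hypothetical rate constant (1/h), not a
# published forensic value.
def q_model(t, b):
    return 1.25 * math.exp(b * t) - 0.25 * math.exp(5 * b * t)

def time_since_death(t_rectal, t_ambient, b, t_lo=0.1, t_hi=48.0):
    """Bisection root-find for the t (hours) where the model matches the
    measured temperature ratio Q (Q decreases monotonically for b < 0)."""
    q_meas = (t_rectal - t_ambient) / (37.2 - t_ambient)
    for _ in range(100):
        mid = 0.5 * (t_lo + t_hi)
        if q_model(mid, b) > q_meas:   # model still too warm: death was earlier
            t_lo = mid
        else:
            t_hi = mid
    return 0.5 * (t_lo + t_hi)

b = -0.08
t = time_since_death(t_rectal=30.0, t_ambient=18.0, b=b)
assert abs(q_model(t, b) - (30.0 - 18.0) / (37.2 - 18.0)) < 1e-6
```

The compound method described in the abstract narrows the margin of error around such a single-curve estimate by adding independent observations (muscle excitability, lividity, rigor).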

  8. Improved setup and positioning accuracy using a three‐point customized cushion/mask/bite‐block immobilization system for stereotactic reirradiation of head and neck cancer

    PubMed Central

    Wang, He; Wang, Congjun; Tung, Samuel; Dimmitt, Andrew Wilson; Wong, Pei Fong; Edson, Mark A.; Garden, Adam S.; Rosenthal, David I.; Fuller, Clifton D.; Gunn, Gary B.; Takiar, Vinita; Wang, Xin A.; Luo, Dershan; Yang, James N.; Wong, Jennifer

    2016-01-01

    The purpose of this study was to investigate the setup and positioning uncertainty of a custom cushion/mask/bite‐block (CMB) immobilization system and determine PTV margin for image‐guided head and neck stereotactic ablative radiotherapy (HN‐SABR). We analyzed 105 treatment sessions among 21 patients treated with HN‐SABR for recurrent head and neck cancers using a custom CMB immobilization system. Initial patient setup was performed using the ExacTrac infrared (IR) tracking system, and initial setup errors were based on comparison of the ExacTrac IR tracking system to corrected online ExacTrac X‐ray images registered to treatment plans. Residual setup errors were determined using repeat verification X‐ray. The online ExacTrac corrections were compared to cone‐beam CT (CBCT) before treatment to assess agreement. Intrafractional positioning errors were determined using prebeam X‐rays. The systematic and random errors were analyzed. The initial translational setup errors were −0.8±1.3 mm, −0.8±1.6 mm, and 0.3±1.9 mm in AP, CC, and LR directions, respectively, with a three‐dimensional (3D) vector of 2.7±1.4 mm. The initial rotational errors were up to 2.4° if a 6D couch is not available. CBCT agreed with ExacTrac X‐ray images to within 2 mm and 2.5°. The intrafractional uncertainties were 0.1±0.6 mm, 0.1±0.6 mm, and 0.2±0.5 mm in AP, CC, and LR directions, respectively, and 0.0°±0.5°, 0.0°±0.6°, and −0.1°±0.4° in the yaw, roll, and pitch directions, respectively. The translational vector was 0.9±0.6 mm. The calculated PTV margins mPTV(90,95) were within 1.6 mm when using image guidance for online setup correction. The use of image guidance for online setup correction, in combination with our customized CMB device, highly restricted target motion during treatments and provided robust immobilization to ensure a minimum dose of 95% to the target volume with a 2.0 mm PTV margin for HN‐SABR. PACS number(s): 87.55.ne PMID:27167275
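    The systematic and random errors analyzed above are conventionally estimated from per-fraction setup data as, respectively, the standard deviation of each patient's mean error (Σ) and the root-mean-square of each patient's per-fraction standard deviation (σ). A sketch on synthetic shifts (not this study's data; the spreads are assumed values):

```python
import numpy as np

# Synthetic per-fraction setup shifts (mm) standing in for measured data.
rng = np.random.default_rng(1)
n_patients, n_fractions = 21, 5
patient_means = rng.normal(0.0, 1.0, n_patients)        # true per-patient offsets
shifts = patient_means[:, None] + rng.normal(0.0, 0.6, (n_patients, n_fractions))

# Sigma: SD of per-patient mean errors; sigma: RMS of per-patient SDs.
sigma_sys = np.std(shifts.mean(axis=1), ddof=1)
sigma_rand = np.sqrt(np.mean(shifts.std(axis=1, ddof=1) ** 2))

ptv_margin = 2.5 * sigma_sys + 0.7 * sigma_rand         # van Herk-style recipe
assert 0.4 < sigma_rand < 0.9                           # near the simulated 0.6 mm
```

With image-guided online correction the systematic component shrinks toward zero, which is why the study's residual margins come out so small.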

  9. SU-D-201-01: A Multi-Institutional Study Quantifying the Impact of Simulated Linear Accelerator VMAT Errors for Nasopharynx

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pogson, E; Liverpool and Macarthur Cancer Therapy Centres, Liverpool, NSW; Ingham Institute for Applied Medical Research, Sydney, NSW

    Purpose: To quantify the impact of differing magnitudes of simulated linear accelerator errors on the dose to the target volume and organs at risk for nasopharynx VMAT. Methods: Ten nasopharynx cancer patients were retrospectively replanned twice with one full-arc VMAT by two institutions. Treatment uncertainties (gantry angle and collimator in degrees, MLC field size and MLC shifts in mm) were introduced into these plans at increments of 5, 2, 1, −1, −2, and −5. This was completed using an in-house Python script within Pinnacle3 and analysed using 3DVH and MatLab. The mean and maximum dose were calculated for the Planning Target Volume (PTV1), parotids, brainstem, and spinal cord and then compared to the original baseline plan. The D1cc was also calculated for the spinal cord and brainstem. Patient average results were compared across institutions. Results: Introduced gantry angle errors had the smallest effect on dose: no tolerances were exceeded for one institution, and the second institution's VMAT plans exceeded tolerances only for a gantry angle error of ±5°, affecting different-sided parotids by 14–18%. PTV1, brainstem, and spinal cord tolerances were exceeded for collimator angles of ±5 degrees and for MLC shifts and MLC field-size errors of ±1 and beyond at the first institution. At the second institution, sensitivity was marginally higher for some errors, with collimator errors producing doses exceeding tolerances above ±2 degrees, and marginally lower for others, with tolerances exceeded only above MLC shifts of ±2. The largest differences occur with MLC field sizes, with both institutions reporting exceeded tolerances for all introduced errors (±1 and beyond). Conclusion: The plan robustness for VMAT nasopharynx plans has been demonstrated. Gantry errors have the least impact on patient doses; however, MLC field-size errors exceed tolerances even with relatively low introduced errors and also produce the largest errors. This was consistent across both departments.
The authors acknowledge funding support from the NSW Cancer Council.

  10. Image guidance in prostate cancer - can offline corrections be an effective substitute for daily online imaging?

    PubMed

    Prasad, Devleena; Das, Pinaki; Saha, Niladri S; Chatterjee, Sanjoy; Achari, Rimpa; Mallick, Indranil

    2014-01-01

    The aim of this study was to determine whether a less resource-intensive, established offline correction protocol, the No Action Level (NAL) protocol, was as effective as daily online correction of setup deviations in curative high-dose radiotherapy of prostate cancer. A total of 683 daily megavoltage CT (MVCT) or kilovoltage cone-beam CT (kV-CBCT) images of 30 patients with localized prostate cancer treated with intensity-modulated radiotherapy were evaluated. Daily image guidance was performed and setup errors in three translational axes were recorded. The NAL protocol was simulated by using the mean shift calculated from the first five fractions and implementing it on all subsequent treatments. Using the imaging data from the remaining fractions, the daily residual error (RE) was determined. The proportion of fractions where the RE was greater than 3, 5, and 7 mm was calculated, as was the actual PTV margin that would be required if the offline protocol were followed. Using the NAL protocol reduced the systematic but not the random errors. Corrections made using the NAL protocol resulted in small and acceptable RE in the mediolateral (ML) and superoinferior (SI) directions, with 46/533 (8.1%) and 48/533 (5%) residual shifts above 5 mm. However, residual errors greater than 5 mm in the anteroposterior (AP) direction remained in 181/533 (34%) of fractions. The PTV margins calculated based on residual errors were 5 mm, 5 mm, and 13 mm in the ML, SI, and AP directions, respectively. Offline correction using the NAL protocol resulted in unacceptably high residual errors in the AP direction, due to random uncertainties of rectal and bladder filling. Daily online imaging and correction remain the standard image guidance policy for highly conformal radiotherapy of prostate cancer.
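    The NAL protocol described above applies the mean of the first few measured shifts as a fixed correction for all later fractions, which removes the systematic component but leaves the random one untouched. A sketch with synthetic shifts (the offset and spread are illustrative, not the study's measurements):

```python
import numpy as np

# Sketch of the No Action Level (NAL) offline correction on synthetic
# per-fraction setup shifts (mm); values are illustrative only.
rng = np.random.default_rng(2)
systematic, random_sd = 4.0, 2.0                # assumed AP-like error components
shifts = systematic + rng.normal(0.0, random_sd, 35)

correction = shifts[:5].mean()                  # mean of the first 5 fractions
residual = shifts[5:] - correction              # residual error on the rest

# NAL removes most of the systematic component but none of the random one.
assert abs(residual.mean()) < abs(shifts[5:].mean())
assert np.isclose(residual.std(), shifts[5:].std())
```

The unchanged spread of the residuals is exactly the study's finding: where random day-to-day variation (rectal and bladder filling) dominates, an offline correction cannot substitute for daily online imaging.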

  11. Linking Observations of Dynamic Topography from Oceanic and Continental Realms around Australia

    NASA Astrophysics Data System (ADS)

    Czarnota, K.; Hoggard, M. J.; White, N.; Winterbourne, J.

    2012-04-01

    In the last decade, there has been growing interest in predicting the spatial and temporal evolution of dynamic topography (i.e. the surface manifestation of mantle convection). By directly measuring Neogene and Quaternary dynamic topography around Australia's passive margins we assess the veracity of these predictions and the interplay between mantle convection and plate motion. We mapped the present dynamic topography by carefully measuring residual topography of oceanic lithosphere adjacent to passive margins. This map provides a reference with respect to which the relative record of vertical motions, preserved within the stratigraphic architecture of the margins, can be interpreted. We carefully constrained the temporal record of vertical motions along Australia's Northwest Shelf by backstripping Neogene carbonate clinoform rollover trajectories in order to minimise paleobathymetric errors. Elsewhere, we compile temporal constraints from published literature. Three principal insights emerge from our analysis. First, the present-day drawn-down residual topography of Australia cannot be approximated by a regional tilt down towards the northeast, as previously hypothesised. The south-western and south-eastern corners of Australia are at negligible to slightly positive residual topography which slopes down towards Australia's northern margin and the Great Australian Bight. Secondly, the record of passive margin subsidence suggests drawdown across northern Australia commenced synchronously at 8±2 Ma. The amplitude of this synchronous drawdown corresponds to the amplitude of oceanic residual topography, indicating northern Australia was at an unperturbed dynamic elevation until drawdown commenced. The synchronicity of this subsidence suggests that the Australian plate has not been affected by a southward propagating wave of drawdown, despite Australia's rapid northward motion towards the subduction realm in south-east Asia. 
In contrast, it appears the mantle anomaly responsible for this drawdown is a relatively young, long-wavelength feature. Thirdly, there is an apparent mismatch between the current drawdown of oceanic lithosphere observed along Australia's southern margin and the onshore record of Cenozoic uplift. We attribute this disparity to the region undergoing recent uplift from a position of dynamic drawdown.

  12. Preparing and Restoring Composite Resin Restorations. The Advantage of High Magnification Loupes or the Dental Surgical Operating Microscope.

    PubMed

    Mamoun, John

    2015-01-01

    Use of magnification, such as 6x to 8x binocular surgical loupes or the surgical operating microscope, combined with co-axial illumination, may facilitate the creation of stable composite resin restorations that are less likely to develop caries, cracks or margin stains over years of service. Microscopes facilitate observation of clinically relevant microscopic visual details, such as microscopic amounts of demineralization or caries at preparation margins; microscopic areas of soft, decayed tooth structure; microscopic amounts of moisture contamination of the preparation during bonding; or microscopic marginal gaps in the composite. Preventing microscope-level errors in composite fabrication can result in a composite restoration that, at initial placement, appears perfect when viewed under 6x to 8x magnification and which also is free of secondary caries, marginal staining or cracks at multi-year follow-up visits.

  13. Accounting for Berkson and Classical Measurement Error in Radon Exposure Using a Bayesian Structural Approach in the Analysis of Lung Cancer Mortality in the French Cohort of Uranium Miners.

    PubMed

    Hoffmann, Sabine; Rage, Estelle; Laurier, Dominique; Laroche, Pierre; Guihenneuc, Chantal; Ancelet, Sophie

    2017-02-01

    Many occupational cohort studies on underground miners have demonstrated that radon exposure is associated with an increased risk of lung cancer mortality. However, despite the deleterious consequences of exposure measurement error on statistical inference, these analyses traditionally do not account for exposure uncertainty. This might be due to the challenging nature of measurement error resulting from imperfect surrogate measures of radon exposure. Indeed, we are typically faced with exposure uncertainty in a time-varying exposure variable where both the type and the magnitude of error may depend on period of exposure. To address the challenge of accounting for multiplicative and heteroscedastic measurement error that may be of Berkson or classical nature, depending on the year of exposure, we opted for a Bayesian structural approach, which is arguably the most flexible method to account for uncertainty in exposure assessment. We assessed the association between occupational radon exposure and lung cancer mortality in the French cohort of uranium miners and found the impact of uncorrelated multiplicative measurement error to be of marginal importance. However, our findings indicate that the retrospective nature of exposure assessment that occurred in the earliest years of mining of this cohort as well as many other cohorts of underground miners might lead to an attenuation of the exposure-risk relationship. More research is needed to address further uncertainties in the calculation of lung dose, since this step will likely introduce important sources of shared uncertainty.
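    The classical-versus-Berkson distinction above matters because classical error (noise added to the true exposure) attenuates the estimated exposure-risk slope, the regression-dilution effect the abstract alludes to, whereas Berkson error does not bias it. A simulation sketch with synthetic data (not the uranium-miner cohort):

```python
import numpy as np

# Regression-dilution sketch: classical measurement error in the exposure
# attenuates the OLS slope by roughly var(X) / (var(X) + var(error)).
# Entirely synthetic data with assumed variances.
rng = np.random.default_rng(3)
n = 20_000
x_true = rng.normal(0.0, 1.0, n)                 # true exposure
y = 2.0 * x_true + rng.normal(0.0, 0.5, n)       # outcome; true slope = 2

x_classical = x_true + rng.normal(0.0, 1.0, n)   # classical error: noise on X
slope_att = np.polyfit(x_classical, y, 1)[0]

# Expected attenuation factor 1 / (1 + 1) = 0.5, so the slope lands near 1.0.
assert abs(slope_att - 1.0) < 0.1
```

Under Berkson error the roles reverse (the true exposure scatters around the assigned value), so the slope stays unbiased but its variance grows; structural Bayesian approaches like the one above can mix both error types by exposure period.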

  14. Investigation of Error Patterns in Geographical Databases

    NASA Technical Reports Server (NTRS)

    Dryer, David; Jacobs, Derya A.; Karayaz, Gamze; Gronbech, Chris; Jones, Denise R. (Technical Monitor)

    2002-01-01

    The objective of the research conducted in this project is to develop a methodology to investigate the accuracy of Airport Safety Modeling Data (ASMD) using statistical, visualization, and Artificial Neural Network (ANN) techniques. Such a methodology can contribute to answering the following research questions: Over a representative sampling of ASMD databases, can statistical error analysis techniques be accurately learned and replicated by ANN modeling techniques? This representative ASMD sample should include numerous airports and a variety of terrain characterizations. Is it possible to identify and automate the recognition of patterns of error related to geographical features? Do such patterns of error relate to specific geographical features, such as elevation or terrain slope? Is it possible to combine the errors in small regions into an error prediction for a larger region? What are the data density reduction implications of this work? ASMD may be used as the source of terrain data for a synthetic visual system to be used in the cockpit of aircraft when visual reference to ground features is not possible during conditions of marginal weather or reduced visibility. In this research, United States Geologic Survey (USGS) digital elevation model (DEM) data has been selected as the benchmark. Artificial Neural Networks (ANNs) have been used and tested as alternate methods in place of the statistical methods in similar problems. They often perform better in pattern recognition, prediction, classification, and categorization problems. Many studies show that when the data is complex and noisy, the accuracy of ANN models is generally higher than that of comparable traditional methods.

  15. "Simulated molecular evolution" or computer-generated artifacts?

    PubMed

    Darius, F; Rojas, R

    1994-11-01

    1. The authors define a function with value 1 for the positive examples and 0 for the negative ones. They fit a continuous function but do not deal at all with the error margin of the fit, which is almost as large as the function values they compute. 2. The term "quality" for the value of the fitted function gives the impression that some biological significance is associated with values of the fitted function strictly between 0 and 1, but there is no justification for this kind of interpretation, and finding the point where the fit achieves its maximum does not make sense. 3. By neglecting the error margin, the authors try to optimize the fitted function using differences in the second, third, fourth, and even fifth decimal place, which have no statistical significance. 4. Even if such a fit could profit from more data points, the authors should first prove that the region of interest has some kind of smoothness, that is, that a continuous fit makes any sense at all. 5. "Simulated molecular evolution" is a misnomer. We are dealing here with random search. Since the margin of error is so large, the fitted function does not provide statistically significant information about the points in search space where strings with cleavage sites could be found. This implies that the method is a highly unreliable stochastic search in the space of strings, even if the neural network is capable of learning some simple correlations. 6. For this kind of problem with so few data points, classical statistical methods are clearly superior to the neural networks used as a "black box" by the authors, which, in the way they are structured, provide a model with an error margin as large as the numbers being computed. 7. Finally, even if someone provided us with a function that perfectly separates strings with cleavage sites from strings without them, so-called simulated molecular evolution would be no better than random selection. Since a perfect fit would only produce exactly ones or zeros, starting a search in a region of space where all strings in the neighborhood get the value zero would not provide any directional information for new iterations. We would just skip from one point to another in a typical random-walk manner.

  16. New-Sum: A Novel Online ABFT Scheme For General Iterative Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tao, Dingwen; Song, Shuaiwen; Krishnamoorthy, Sriram

    Emerging high-performance computing platforms, with large component counts and lower power margins, are anticipated to be more susceptible to soft errors in both logic circuits and memory subsystems. We present an online algorithm-based fault tolerance (ABFT) approach to efficiently detect and recover soft errors for general iterative methods. We design a novel checksum-based encoding scheme for matrix-vector multiplication that is resilient to both arithmetic and memory errors. Our design decouples the checksum updating process from the actual computation, and allows adaptive checksum overhead control. Building on this new encoding mechanism, we propose two online ABFT designs that can effectively recover from errors when combined with a checkpoint/rollback scheme.
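The core idea of checksum-based ABFT for matrix-vector multiplication can be sketched in a few lines. This is the classic Huang-Abraham-style checksum check, not the paper's specific "New-Sum" encoding: a checksum row of column sums is kept alongside the matrix, and any soft error in the product breaks the invariant sum(y) = checksum_row @ x. All names and the fault-injection setup are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(8, 8))
x = rng.normal(size=8)

# Encode: the checksum row holds the column sums of A.
checksum_row = A.sum(axis=0)

y = A @ x
# For an error-free product, sum(y) must equal checksum_row @ x
# (up to floating-point rounding).
assert abs(y.sum() - checksum_row @ x) < 1e-9

# Inject a soft error into one element of the result vector.
y_faulty = y.copy()
y_faulty[3] += 0.5
detected = abs(y_faulty.sum() - checksum_row @ x) > 1e-9
```

Once a violation is detected, an online ABFT scheme such as the one described would trigger recovery, e.g. rollback to a checkpoint, rather than silently propagating the corrupted iterate.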

  17. The role of the medical laboratory technologist in drinking and driving cases. Part 2: The use of hospital alcohol results as evidence and providing testimony in court.

    PubMed

    Westenbrink, W

    1992-01-01

    Medical laboratory technologists routinely conduct alcohol analyses for medical purposes; however, in certain circumstances the results are used legally to determine a driver's Blood Alcohol Concentration. The technologist who performed the analysis may subsequently be required to provide evidence in court. The primary areas of interest to the court in these cases are: the type of swab utilized, the continuity of the blood sample, the method of analysis, the margin of error of the results, and the conversion of a serum alcohol concentration to a blood alcohol concentration in units as per the Criminal Code. Pre-trial preparation, courtroom procedures, and suggestions to enhance the technologist's credibility as a professional witness are outlined.

  18. Modeling error in assessment of mammographic image features for improved computer-aided mammography training: initial experience

    NASA Astrophysics Data System (ADS)

    Mazurowski, Maciej A.; Tourassi, Georgia D.

    2011-03-01

    In this study we investigate the hypothesis that there exist patterns in erroneous assessment of BI-RADS image features among radiology trainees when performing diagnostic interpretation of mammograms. We also investigate whether these error-making patterns can be captured by individual user models. To test our hypothesis we propose a user modeling algorithm that uses the previous readings of a trainee to identify whether certain BI-RADS feature values (e.g. "spiculated" value for "margin" feature) are associated with higher than usual likelihood that the feature will be assessed incorrectly. In our experiments we used readings of 3 radiology residents and 7 breast imaging experts for 33 breast masses for the following BI-RADS features: parenchyma density, mass margin, mass shape and mass density. The expert readings were considered as the gold standard. Rule-based individual user models were developed and tested using the leave-one-out cross-validation scheme. Our experimental evaluation showed that the individual user models are accurate in identifying cases for which errors are more likely to be made. The user models captured regularities in error making for all 3 residents. This finding supports our hypothesis about the existence of individual error-making patterns in assessment of mammographic image features using the BI-RADS lexicon. Explicit user models identifying the weaknesses of each resident could be of great use when developing and adapting a personalized training plan to meet the resident's individual needs. Such an approach fits well with the framework of adaptive computer-aided educational systems in mammography we have proposed before.

  19. 75 FR 13610 - Office of New Reactors; Interim Staff Guidance on Implementation of a Seismic Margin Analysis for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-22

    ... Staff Guidance on Implementation of a Seismic Margin Analysis for New Reactors Based on Probabilistic... Seismic Margin Analysis for New Reactors Based on Probabilistic Risk Assessment,'' (Agencywide Documents.../COL-ISG-020 ``Implementation of a Seismic Margin Analysis for New Reactors Based on Probabilistic Risk...

  20. Problems and pitfalls in cardiac drug therapy.

    PubMed

    Stone, S M; Rai, N; Nei, J

    2001-01-01

    Medical errors in the care of patients may account for 44,000 to 98,000 deaths per year, and 7,000 deaths per year are attributed to medication errors alone. Increasing awareness among health care providers of potential errors is a critical step toward improving the safety of medical care. Because today's medications are increasingly complex, approved at an accelerated rate, and often have a narrow therapeutic window with only a small margin of safety, patient and provider education is critical in assuring optimal therapeutic outcomes. Providers can use electronic resources such as Web sites to keep informed on drug-drug, drug-food, and drug-nutritional supplement interactions.

  1. Visual cues and perceived reachability.

    PubMed

    Gabbard, Carl; Ammar, Diala

    2005-12-01

    A rather consistent finding in studies of perceived (imagined) compared to actual movement in a reaching paradigm is the tendency to overestimate at midline. Explanations of such behavior have focused primarily on perceptions of postural constraints and the notion that individuals calibrate reachability in reference to multiple degrees of freedom, also known as the whole-body explanation. The present study examined the role of visual information in the form of binocular and monocular cues in perceived reachability. Right-handed participants judged the reachability of visual targets at midline with both eyes open, dominant eye occluded, and the non-dominant eye covered. Results indicated that participants were relatively accurate with condition responses not being significantly different in regard to total error. Analysis of the direction of error (mean bias) revealed effective accuracy across conditions with only a marginal distinction between monocular and binocular conditions. Therefore, within the task conditions of this experiment, it appears that binocular and monocular cues provide sufficient visual information for effective judgments of perceived reach at midline.

  2. Experimental investigation of optimum beam size for FSO uplink

    NASA Astrophysics Data System (ADS)

    Kaushal, Hemani; Kaddoum, Georges; Jain, Virander Kumar; Kar, Subrat

    2017-10-01

    In this paper, the effect of transmitter beam size on the performance of free space optical (FSO) communication has been determined experimentally. The irradiance profile for varying turbulence strength is obtained using an optical turbulence generating (OTG) chamber inside a laboratory environment. Based on the results, an optimum beam size is investigated using the semi-analytical method. Moreover, the combined effects of atmospheric scintillation and beam wander induced pointing errors are considered in order to determine the optimum beam size that minimizes the bit error rate (BER) of the system for a fixed transmitter power and link length. The results show that the optimum beam size for FSO uplink depends upon the Fried parameter and the outer scale of the turbulence. Further, it is observed that the optimum beam size increases with the zenith angle, but is negligibly affected by an increase in the fade threshold level at low turbulence levels and only marginally affected at high turbulence levels. Finally, the obtained outcome is useful for FSO system design and BER performance analysis.

  3. Dependence of the bit error rate on the signal power and length of a single-channel coherent single-span communication line (100 Gbit s{sup -1}) with polarisation division multiplexing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gurkin, N V; Konyshev, V A; Novikov, A G

    2015-01-31

    We have studied experimentally and using numerical simulations and a phenomenological analytical model the dependences of the bit error rate (BER) on the signal power and length of a coherent single-span communication line with transponders employing polarisation division multiplexing and four-level phase modulation (100 Gbit s{sup -1} DP-QPSK format). In comparing the data of the experiment, numerical simulations and theoretical analysis, we have found two optimal powers: the power at which the BER is minimal and the power at which the fade margin in the line is maximal. We have derived and analysed the dependences of the BER on the optical signal power at the fibre line input and the dependence of the admissible input signal power range for implementation of communication lines with lengths from 30-50 km up to a maximum length of 250 km. (optical transmission of information)

  4. 19 CFR 351.224 - Disclosure of calculations and procedures for the correction of ministerial errors.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... least five absolute percentage points in, but not less than 25 percent of, the weighted-average dumping... margin or countervailable subsidy rate (whichever is applicable) of zero (or de minimis) and a weighted...

  5. 19 CFR 351.224 - Disclosure of calculations and procedures for the correction of ministerial errors.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... least five absolute percentage points in, but not less than 25 percent of, the weighted-average dumping... margin or countervailable subsidy rate (whichever is applicable) of zero (or de minimis) and a weighted...

  6. 19 CFR 351.224 - Disclosure of calculations and procedures for the correction of ministerial errors.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... least five absolute percentage points in, but not less than 25 percent of, the weighted-average dumping... margin or countervailable subsidy rate (whichever is applicable) of zero (or de minimis) and a weighted...

  7. 19 CFR 351.224 - Disclosure of calculations and procedures for the correction of ministerial errors.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... least five absolute percentage points in, but not less than 25 percent of, the weighted-average dumping... margin or countervailable subsidy rate (whichever is applicable) of zero (or de minimis) and a weighted...

  8. 19 CFR 351.224 - Disclosure of calculations and procedures for the correction of ministerial errors.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... least five absolute percentage points in, but not less than 25 percent of, the weighted-average dumping... margin or countervailable subsidy rate (whichever is applicable) of zero (or de minimis) and a weighted...

  9. Marginal Fit of Lithium Disilicate Crowns Fabricated Using Conventional and Digital Methodology: A Three-Dimensional Analysis.

    PubMed

    Mostafa, Nezrine Z; Ruse, N Dorin; Ford, Nancy L; Carvalho, Ricardo M; Wyatt, Chris C L

    2018-02-01

    To compare the marginal fit of lithium disilicate (LD) crowns fabricated with digital impression and manufacturing (DD), digital impression and traditional pressed manufacturing (DP), and traditional impression and manufacturing (TP). Tooth #15 was prepared for all-ceramic crowns on an ivorine typodont. Forty-five LD crowns were fabricated using the three techniques: DD, DP, and TP. Microcomputed tomography (micro-CT) was used to assess the 2D and 3D marginal fit of crowns in all three groups. The 2D vertical marginal gap (MG) measurements were made at 20 systematically selected points per crown, while the 3D measurements represented the 3D volume of the gap measured circumferentially at the crown margin. Frequencies of different marginal discrepancies were also recorded, including overextension (OE), underextension (UE), and marginal chipping. Crowns with vertical MG > 120 μm at more than five points were considered unacceptable and were rejected. The results were analyzed by one-way ANOVA with the Scheffe post hoc test (α = 0.05). DD crowns demonstrated a significantly smaller mean vertical MG (33.3 ± 19.99 μm) compared to DP (54.08 ± 32.34 μm) and TP (51.88 ± 35.34 μm) crowns. Similarly, MG volume was significantly lower in the DD group (3.32 ± 0.58 mm³) compared to the TP group (4.16 ± 0.59 mm³). The mean MG volume for the DP group (3.55 ± 0.78 mm³) was not significantly different from the other groups. The occurrence of underextension error was higher in the DP (6.25%) and TP (5.4%) groups than in the DD (0.33%) group, while overextension was more frequent in the DD (37.67%) than in the TP (28.85%) and DP (18.75%) groups. Overall, 4 of the 45 crowns fabricated were deemed unacceptable based on the vertical MG measurements (three in the TP group and one in the DP group; all crowns in the DD group were deemed acceptable). The results suggested that digital impression and CAD/CAM technology is a suitable, better alternative to traditional impression and manufacturing.
© 2017 by the American College of Prosthodontists.
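The one-way ANOVA step the study describes can be sketched with `scipy.stats.f_oneway`. The samples below are synthetic draws around the reported group means and standard deviations, not the study's data; group sizes and the seed are assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Illustrative (not the study's) vertical marginal-gap samples in microns,
# drawn around the reported means/SDs for the DD, DP, and TP groups.
dd = rng.normal(33.3, 20.0, 15)
dp = rng.normal(54.1, 32.3, 15)
tp = rng.normal(51.9, 35.3, 15)

f_stat, p_value = stats.f_oneway(dd, dp, tp)
# A small p_value motivates a post hoc comparison (Scheffe in the paper)
# to locate which pairwise group differences drive the effect.
```

The omnibus F-test only says that some group means differ; the Scheffe post hoc test then controls the family-wise error rate across the pairwise comparisons.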

  10. Quantitative Segmentation of Fluorescence Microscopy Images of Heterogeneous Tissue: Application to the Detection of Residual Disease in Tumor Margins

    PubMed Central

    Mueller, Jenna L.; Harmany, Zachary T.; Mito, Jeffrey K.; Kennedy, Stephanie A.; Kim, Yongbaek; Dodd, Leslie; Geradts, Joseph; Kirsch, David G.; Willett, Rebecca M.; Brown, J. Quincy; Ramanujam, Nimmi

    2013-01-01

    Purpose To develop a robust tool for quantitative in situ pathology that allows visualization of heterogeneous tissue morphology and segmentation and quantification of image features. Materials and Methods Tissue excised from a genetically engineered mouse model of sarcoma was imaged using a subcellular resolution microendoscope after topical application of a fluorescent anatomical contrast agent: acriflavine. An algorithm based on sparse component analysis (SCA) and the circle transform (CT) was developed for image segmentation and quantification of distinct tissue types. The accuracy of our approach was quantified through simulations of tumor and muscle images. Specifically, tumor, muscle, and tumor+muscle tissue images were simulated because these tissue types were most commonly observed in sarcoma margins. Simulations were based on tissue characteristics observed in pathology slides. The potential clinical utility of our approach was evaluated by imaging excised margins and the tumor bed in a cohort of mice after surgical resection of sarcoma. Results Simulation experiments revealed that SCA+CT achieved the lowest errors for larger nuclear sizes and for higher contrast ratios (nuclei intensity/background intensity). For imaging of tumor margins, SCA+CT effectively isolated nuclei from tumor, muscle, adipose, and tumor+muscle tissue types. Differences in density were correctly identified with SCA+CT in a cohort of ex vivo and in vivo images, thus illustrating the diagnostic potential of our approach. Conclusion The combination of a subcellular-resolution microendoscope, acriflavine staining, and SCA+CT can be used to accurately isolate nuclei and quantify their density in anatomical images of heterogeneous tissue. PMID:23824589

  11. Quantitative Segmentation of Fluorescence Microscopy Images of Heterogeneous Tissue: Application to the Detection of Residual Disease in Tumor Margins.

    PubMed

    Mueller, Jenna L; Harmany, Zachary T; Mito, Jeffrey K; Kennedy, Stephanie A; Kim, Yongbaek; Dodd, Leslie; Geradts, Joseph; Kirsch, David G; Willett, Rebecca M; Brown, J Quincy; Ramanujam, Nimmi

    2013-01-01

    To develop a robust tool for quantitative in situ pathology that allows visualization of heterogeneous tissue morphology and segmentation and quantification of image features. Tissue excised from a genetically engineered mouse model of sarcoma was imaged using a subcellular resolution microendoscope after topical application of a fluorescent anatomical contrast agent: acriflavine. An algorithm based on sparse component analysis (SCA) and the circle transform (CT) was developed for image segmentation and quantification of distinct tissue types. The accuracy of our approach was quantified through simulations of tumor and muscle images. Specifically, tumor, muscle, and tumor+muscle tissue images were simulated because these tissue types were most commonly observed in sarcoma margins. Simulations were based on tissue characteristics observed in pathology slides. The potential clinical utility of our approach was evaluated by imaging excised margins and the tumor bed in a cohort of mice after surgical resection of sarcoma. Simulation experiments revealed that SCA+CT achieved the lowest errors for larger nuclear sizes and for higher contrast ratios (nuclei intensity/background intensity). For imaging of tumor margins, SCA+CT effectively isolated nuclei from tumor, muscle, adipose, and tumor+muscle tissue types. Differences in density were correctly identified with SCA+CT in a cohort of ex vivo and in vivo images, thus illustrating the diagnostic potential of our approach. The combination of a subcellular-resolution microendoscope, acriflavine staining, and SCA+CT can be used to accurately isolate nuclei and quantify their density in anatomical images of heterogeneous tissue.

  12. The contribution of low-energy protons to the total on-orbit SEU rate

    DOE PAGES

    Dodds, Nathaniel Anson; Martinez, Marino J.; Dodd, Paul E.; ...

    2015-11-10

    Low- and high-energy proton experimental data and error rate predictions are presented for many bulk Si and SOI circuits from the 20-90 nm technology nodes to quantify how much low-energy protons (LEPs) can contribute to the total on-orbit single-event upset (SEU) rate. Every effort was made to predict LEP error rates that are conservatively high; even secondary protons generated in the spacecraft shielding have been included in the analysis. Across all the environments and circuits investigated, and when operating within 10% of the nominal operating voltage, LEPs were found to increase the total SEU rate to up to 4.3 times as high as it would have been in the absence of LEPs. Therefore, the best approach to account for LEP effects may be to calculate the total error rate from high-energy protons and heavy ions, and then multiply it by a safety margin of 5. If that error rate can be tolerated then our findings suggest that it is justified to waive LEP tests in certain situations. Trends were observed in the LEP angular responses of the circuits tested. As a result, grazing angles were the worst case for the SOI circuits, whereas the worst-case angle was at or near normal incidence for the bulk circuits.

  13. Multi-reader ROC studies with split-plot designs: a comparison of statistical methods.

    PubMed

    Obuchowski, Nancy A; Gallas, Brandon D; Hillis, Stephen L

    2012-12-01

    Multireader imaging trials often use a factorial design, in which study patients undergo testing with all imaging modalities and readers interpret the results of all tests for all patients. A drawback of this design is the large number of interpretations required of each reader. Split-plot designs have been proposed as an alternative, in which one or a subset of readers interprets all images of a sample of patients, while other readers interpret the images of other samples of patients. In this paper, the authors compare three methods of analysis for the split-plot design. Three statistical methods are presented: the Obuchowski-Rockette method modified for the split-plot design, a newly proposed marginal-mean analysis-of-variance approach, and an extension of the three-sample U-statistic method. A simulation study using the Roe-Metz model was performed to compare the type I error rate, power, and confidence interval coverage of the three test statistics. The type I error rates for all three methods are close to the nominal level but tend to be slightly conservative. The statistical power is nearly identical for the three methods. The coverage of 95% confidence intervals falls close to the nominal coverage for small and large sample sizes. The split-plot multireader, multicase study design can be statistically efficient compared to the factorial design, reducing the number of interpretations required per reader. Three methods of analysis, shown to have nominal type I error rates, similar power, and nominal confidence interval coverage, are available for this study design. Copyright © 2012 AUR. All rights reserved.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yan, H; Chen, Z; Nath, R

    Purpose: kV fluoroscopic imaging combined with MV treatment beam imaging has been investigated for intrafractional motion monitoring and correction. It is, however, subject to additional kV imaging dose to normal tissue. To balance tracking accuracy and imaging dose, we previously proposed an adaptive imaging strategy to dynamically decide future imaging type and moments based on motion tracking uncertainty. kV imaging may be used continuously for maximal accuracy or only when the position uncertainty (probability of out of threshold) is high if a preset imaging dose limit is considered. In this work, we propose more accurate methods to estimate tracking uncertainty through analyzing acquired data in real-time. Methods: We simulated the motion tracking process based on a previously developed imaging framework (MV + initial seconds of kV imaging) using real-time breathing data from 42 patients. Motion tracking errors for each time point were collected together with the time point’s corresponding features, such as tumor motion speed and 2D tracking error of previous time points, etc. We tested three methods for error uncertainty estimation based on the features: conditional probability distribution, logistic regression modeling, and support vector machine (SVM) classification to detect errors exceeding a threshold. Results: For conditional probability distribution, polynomial regressions on three features (previous tracking error, prediction quality, and cosine of the angle between the trajectory and the treatment beam) showed strong correlation with the variation (uncertainty) of the mean 3D tracking error and its standard deviation: R-square = 0.94 and 0.90, respectively. The logistic regression and SVM classification successfully identified about 95% of tracking errors exceeding the 2.5 mm threshold. 
Conclusion: The proposed methods can reliably estimate the motion tracking uncertainty in real-time, which can be used to guide adaptive additional imaging to confirm the tumor is within the margin or initialize motion compensation if it is out of the margin.
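The classification step the abstract describes (flagging time points whose tracking error will exceed a threshold, from features such as previous error and prediction quality) can be sketched with a plain logistic regression. Everything below is synthetic: the feature generator, coefficients, and learning rate are assumptions for illustration, not the study's data or fitted model.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 2000
# Hypothetical features echoing the abstract: previous tracking error,
# prediction quality, cosine of the trajectory/beam angle (standardized).
X = rng.normal(size=(n, 3))
# Synthetic labeling rule: large previous error and poor prediction quality
# raise the chance the 3D error exceeds the 2.5 mm threshold.
logits = 2.0 * X[:, 0] - 1.5 * X[:, 1] + 0.5 * X[:, 2]
y = (rng.uniform(size=n) < 1 / (1 + np.exp(-logits))).astype(float)

# Plain full-batch gradient-descent logistic regression (no regularization).
w = np.zeros(3)
b = 0.0
for _ in range(500):
    p = 1 / (1 + np.exp(-(X @ w + b)))
    w -= 1.0 * (X.T @ (p - y) / n)
    b -= 1.0 * (p - y).mean()

pred = (1 / (1 + np.exp(-(X @ w + b)))) > 0.5
accuracy = (pred == (y > 0.5)).mean()
```

In practice the fitted probability, rather than the hard 0.5 cut, would drive the adaptive decision of when to fire an extra kV image.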

  15. Re-use of pilot data and interim analysis of pivotal data in MRMC studies: a simulation study

    NASA Astrophysics Data System (ADS)

    Chen, Weijie; Samuelson, Frank; Sahiner, Berkman; Petrick, Nicholas

    2017-03-01

    Novel medical imaging devices are often evaluated with multi-reader multi-case (MRMC) studies in which radiologists read images of patient cases for a specified clinical task (e.g., cancer detection). A pilot study is often used to measure the effect size and variance parameters that are necessary for sizing a pivotal study (including sizing readers, non-diseased and diseased cases). Due to the practical difficulty of collecting patient cases or recruiting clinical readers, some investigators attempt to include the pilot data as part of their pivotal study. In other situations, some investigators attempt to perform an interim analysis of their pivotal study data based upon which the sample sizes may be re-estimated. Re-use of the pilot data or interim analyses of the pivotal data may inflate the type I error of the pivotal study. In this work, we use the Roe and Metz model to simulate MRMC data under the null hypothesis (i.e., two devices have equal diagnostic performance) and investigate the type I error rate for several practical designs involving re-use of pilot data or interim analysis of pivotal data. Our preliminary simulation results indicate that, under the simulation conditions we investigated, the inflation of the type I error is zero or only marginal for some design strategies (e.g., re-use of patient data without re-using readers, and size re-estimation without using the effect size estimated in the interim analysis). Pending further verification, these are potentially useful design methods in that they may help make a study less burdensome and give it a better chance to succeed without substantial loss of statistical rigor.

  16. Examining the margins: a concept analysis of marginalization.

    PubMed

    Vasas, Elyssa B

    2005-01-01

    The aim of this analysis is to explore the concept of social marginalization for the purpose of concept development. Specifically, the article intends to clarify the relationship between health disparities and marginalization and generate knowledge about working with people who are socially marginalized. Concept development evolved from the critical analysis of relevant literature generated through searches of nursing and social science databases. Literature was organized thematically and themes related to marginalization as a social process were included and analyzed. The article explores the challenges of using marginalization as an independent concept and suggests areas for future inquiry and research.

  17. Stability Assessment and Tuning of an Adaptively Augmented Classical Controller for Launch Vehicle Flight Control

    NASA Technical Reports Server (NTRS)

    VanZwieten, Tannen; Zhu, J. Jim; Adami, Tony; Berry, Kyle; Grammar, Alex; Orr, Jeb S.; Best, Eric A.

    2014-01-01

    Recently, a robust and practical adaptive control scheme for launch vehicles [1] has been introduced. It augments a classical controller with a real-time loop-gain adaptation, and it is therefore called Adaptive Augmentation Control (AAC). The loop-gain will be increased from the nominal design when the tracking error between the (filtered) output and the (filtered) command trajectory is large; whereas it will be decreased when excitation of flex or sloshing modes is detected. There is a need to determine the range and rate of the loop-gain adaptation in order to retain (exponential) stability, which is critical in vehicle operation, and to develop some theoretically based heuristic tuning methods for the adaptive law gain parameters. Classical launch vehicle flight controller design techniques are based on gain-scheduling, whereby the launch vehicle dynamics model is linearized at selected operating points along the nominal tracking command trajectory, and Linear Time-Invariant (LTI) controller design techniques are employed to ensure asymptotic stability of the tracking error dynamics, typically by meeting some prescribed Gain Margin (GM) and Phase Margin (PM) specifications. The controller gains at the design points are then scheduled, tuned and sometimes interpolated to achieve good performance and stability robustness under external disturbances (e.g. winds) and structural perturbations (e.g. vehicle modeling errors). While the GM does give a bound for loop-gain variation without losing stability, it applies only to constant dispersions of the loop-gain, because the GM is based on frequency-domain analysis, which is applicable only for LTI systems. The real-time adaptive loop-gain variation of the AAC effectively renders the closed-loop system a time-varying system, for which it is well-known that the LTI system stability criterion is neither necessary nor sufficient when applied to a Linear Time-Varying (LTV) system in a frozen-time fashion. 
Therefore, a generalized stability metric for time-varying loop-gain perturbations is needed for the AAC.

  18. 75 FR 20813 - Certain Magnesia Carbon Bricks from the People's Republic of China: Amended Preliminary...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-21

    ... other errors, would result in (1) a change of at least five absolute percentage points in, but not less...) preliminary determination, or (2) a difference between a weighted-average dumping margin of zero or de minimis...

  19. Teaching Tolerance Magazine, 2003.

    ERIC Educational Resources Information Center

    Carnes, Jim, Ed.

    2003-01-01

    This magazine provides teachers with classroom learning materials to help children learn to be tolerant with others. Articles in the magazine are: "A Standard to Sustain" (Mary M. Harrison); "Let's Just Play" (Janet Schmidt); "Who's Helen Keller?" (Ruth Shagoury Hubbard); "Margins of Error" (Joe Parsons);…

  20. A service evaluation of on-line image-guided radiotherapy to lower extremity sarcoma: Investigating the workload implications of a 3 mm action level for image assessment and correction prior to delivery.

    PubMed

    Taylor, C; Parker, J; Stratford, J; Warren, M

    2018-05-01

    Although all systematic and random positional setup errors can be corrected for in entirety during on-line image-guided radiotherapy, the use of a specified action level, below which no correction occurs, is also an option. The following service evaluation aimed to investigate the use of this 3 mm action level for on-line image assessment and correction (online, systematic set-up error and weekly evaluation) for lower extremity sarcoma, and understand the impact on imaging frequency and patient positioning error within one cancer centre. All patients were immobilised using a thermoplastic shell attached to a plastic base and an individual moulded footrest. A retrospective analysis of 30 patients was performed. Patient setup and correctional data derived from cone beam CT analysis was retrieved. The timing, frequency and magnitude of corrections were evaluated. The population systematic and random error was derived. 20% of patients had no systematic corrections over the duration of treatment, and 47% had one. The maximum number of systematic corrections per course of radiotherapy was 4, which occurred for 2 patients. 34% of episodes occurred within the first 5 fractions. All patients had at least one observed translational error during their treatment greater than 0.3 cm, and 80% of patients had at least one observed translational error during their treatment greater than 0.5 cm. The population systematic error was 0.14 cm, 0.10 cm, 0.14 cm and random error was 0.27 cm, 0.22 cm, 0.23 cm in the lateral, caudocranial and anteroposterial directions. The required Planning Target Volume margin for the study population was 0.55 cm, 0.41 cm and 0.50 cm in the lateral, caudocranial and anteroposterial directions. The 3 mm action level for image assessment and correction prior to delivery reduced the imaging burden and focussed intervention on patients that exhibited greater positional variability. 
This strategy could be an efficient deployment of departmental resources if full daily correction of positional setup error is not possible. Copyright © 2017. Published by Elsevier Ltd.
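
The reported margins can be reproduced from the quoted population errors using the widely used van Herk recipe, M = 2.5Σ + 0.7σ; the abstract does not name its margin formula, so treating it as this recipe is an inference. A minimal sketch:

```python
# Van Herk-style PTV margin: M = 2.5 * Sigma (systematic SD) + 0.7 * sigma (random SD).
# Values (in cm) are the population errors reported in the abstract above;
# the choice of this particular recipe is an assumption.

def ptv_margin(systematic_sd, random_sd):
    """Population margin recipe (van Herk): 2.5 * Sigma + 0.7 * sigma."""
    return 2.5 * systematic_sd + 0.7 * random_sd

systematic = {"lateral": 0.14, "caudocranial": 0.10, "anteroposterior": 0.14}
random_err = {"lateral": 0.27, "caudocranial": 0.22, "anteroposterior": 0.23}

margins = {axis: round(ptv_margin(systematic[axis], random_err[axis]), 2)
           for axis in systematic}
print(margins)  # close to the reported 0.55, 0.41 and 0.50 cm
```

The computed values agree with the abstract's 0.55, 0.41 and 0.50 cm to within rounding, which supports the inference.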

  1. Systems and methods for circuit lifetime evaluation

    NASA Technical Reports Server (NTRS)

    Heaps, Timothy L. (Inventor); Sheldon, Douglas J. (Inventor); Bowerman, Paul N. (Inventor); Everline, Chester J. (Inventor); Shalom, Eddy (Inventor); Rasmussen, Robert D. (Inventor)

    2013-01-01

Systems and methods for estimating the lifetime of an electrical system in accordance with embodiments of the invention are disclosed. One embodiment of the invention includes iteratively performing Worst Case Analysis (WCA) on a system design with respect to different system lifetimes using a computer to determine the lifetime at which the worst case performance of the system indicates the system will pass with zero margin or fail within a predetermined margin for error, given the environment experienced by the system during its lifetime. In addition, performing WCA on a system with respect to a specific system lifetime includes: identifying subcircuits within the system; performing Extreme Value Analysis (EVA) with respect to each subcircuit to determine whether the subcircuit fails EVA for the specific system lifetime; when the subcircuit passes EVA, determining that the subcircuit does not fail WCA for the specified system lifetime; when a subcircuit fails EVA, performing at least one additional WCA process that provides a tighter bound than EVA to determine whether the subcircuit fails WCA for the specified system lifetime; determining that the system passes WCA with respect to the specific system lifetime when all subcircuits pass WCA; and determining that the system fails WCA when at least one subcircuit fails WCA.
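
The iterative lifetime search described above can be sketched as a loop that applies a conservative EVA check first and falls back to a tighter refined check only on EVA failure. The margin models and degradation rates below are toy stand-ins, not the patented method's actual circuit analyses:

```python
# Hedged sketch of the lifetime-search loop: iterate WCA over candidate
# lifetimes and report the longest lifetime that still passes. The subcircuit
# "margins" and linear degradation rates are illustrative assumptions.

def eva_passes(subcircuit_margin, lifetime):
    # Extreme Value Analysis: a conservative bound; passes if the worst-case
    # margin at this lifetime stays positive.
    return subcircuit_margin - 0.01 * lifetime > 0

def refined_wca_passes(subcircuit_margin, lifetime):
    # Tighter (less conservative) bound, applied only when EVA fails.
    return subcircuit_margin - 0.008 * lifetime > 0

def system_passes_wca(subcircuit_margins, lifetime):
    for margin in subcircuit_margins:
        if eva_passes(margin, lifetime):
            continue                      # an EVA pass implies a WCA pass
        if not refined_wca_passes(margin, lifetime):
            return False                  # any failing subcircuit fails the system
    return True

margins = [0.5, 0.8, 1.2]                 # toy worst-case margins per subcircuit
lifetime = 0
while system_passes_wca(margins, lifetime + 1):
    lifetime += 1
print(lifetime)  # longest lifetime (arbitrary units) at which the system passes WCA
```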

  2. Parameter constraints from weak-lensing tomography of galaxy shapes and cosmic microwave background fluctuations

    NASA Astrophysics Data System (ADS)

    Merkel, Philipp M.; Schäfer, Björn Malte

    2017-08-01

Recently, it has been shown that cross-correlating cosmic microwave background (CMB) lensing and three-dimensional (3D) cosmic shear allows cosmological parameter constraints to be tightened considerably. We investigate whether a similar improvement can be achieved in a conventional tomographic setup. We present Fisher parameter forecasts for a Euclid-like galaxy survey in combination with different ongoing and forthcoming CMB experiments. In contrast to a fully 3D analysis, we find only marginal improvement. Assuming Planck-like CMB data, we show that including the full covariance of the combined CMB and cosmic shear data improves the dark energy figure of merit (FOM) by only 3 per cent. The marginalized error on the sum of neutrino masses is reduced by a similar amount. For a next-generation CMB satellite mission such as Prism, the predicted improvement of the dark energy FOM amounts to approximately 25 per cent. Furthermore, we show that this small improvement is contrasted by an increased bias in the dark energy parameters when the intrinsic alignment of galaxies is not correctly accounted for in the full covariance matrix.

  3. Aircraft Fault Detection Using Real-Time Frequency Response Estimation

    NASA Technical Reports Server (NTRS)

    Grauer, Jared A.

    2016-01-01

    A real-time method for estimating time-varying aircraft frequency responses from input and output measurements was demonstrated. The Bat-4 subscale airplane was used with NASA Langley Research Center's AirSTAR unmanned aerial flight test facility to conduct flight tests and collect data for dynamic modeling. Orthogonal phase-optimized multisine inputs, summed with pilot stick and pedal inputs, were used to excite the responses. The aircraft was tested in its normal configuration and with emulated failures, which included a stuck left ruddervator and an increased command path latency. No prior knowledge of a dynamic model was used or available for the estimation. The longitudinal short period dynamics were investigated in this work. Time-varying frequency responses and stability margins were tracked well using a 20 second sliding window of data, as compared to a post-flight analysis using output error parameter estimation and a low-order equivalent system model. This method could be used in a real-time fault detection system, or for other applications of dynamic modeling such as real-time verification of stability margins during envelope expansion tests.

  4. Comparison of in-situ delay monitors for use in Adaptive Voltage Scaling

    NASA Astrophysics Data System (ADS)

    Pour Aryan, N.; Heiß, L.; Schmitt-Landsiedel, D.; Georgakos, G.; Wirnshofer, M.

    2012-09-01

In Adaptive Voltage Scaling (AVS) the supply voltage of digital circuits is tuned according to the circuit's actual operating condition, which enables dynamic compensation of PVTA (process, voltage, temperature, and aging) variations. By exploiting the excessive safety margins added in state-of-the-art worst-case designs, considerable power savings are achieved. In our approach, the operating condition of the circuit is monitored by in-situ delay monitors. This paper presents different designs to implement the in-situ delay monitors capable of detecting late but still non-erroneous transitions, called Pre-Errors. The developed Pre-Error monitors are integrated in a 16 bit multiplier test circuit, and the resulting Pre-Error AVS system is modeled by a Markov chain in order to determine the power saving potential of each Pre-Error detection approach.

  5. Independent Predictors of Prognosis Based on Oral Cavity Squamous Cell Carcinoma Surgical Margins.

    PubMed

    Buchakjian, Marisa R; Ginader, Timothy; Tasche, Kendall K; Pagedar, Nitin A; Smith, Brian J; Sperry, Steven M

    2018-05-01

    Objective To conduct a multivariate analysis of a large cohort of oral cavity squamous cell carcinoma (OCSCC) cases for independent predictors of local recurrence (LR) and overall survival (OS), with emphasis on the relationship between (1) prognosis and (2) main specimen permanent margins and intraoperative tumor bed frozen margins. Study Design Retrospective cohort study. Setting Tertiary academic head and neck cancer program. Subjects and Methods This study included 426 patients treated with OCSCC resection between 2005 and 2014 at University of Iowa Hospitals and Clinics. Patients underwent excision of OCSCC with intraoperative tumor bed frozen margin sampling and main specimen permanent margin assessment. Multivariate analysis of the data set to predict LR and OS was performed. Results Independent predictors of LR included nodal involvement, histologic grade, and main specimen permanent margin status. Specifically, the presence of a positive margin (odds ratio, 6.21; 95% CI, 3.3-11.9) or <1-mm/carcinoma in situ margin (odds ratio, 2.41; 95% CI, 1.19-4.87) on the main specimen was an independent predictor of LR, whereas intraoperative tumor bed margins were not predictive of LR on multivariate analysis. Similarly, independent predictors of OS on multivariate analysis included nodal involvement, extracapsular extension, and a positive main specimen margin. Tumor bed margins did not independently predict OS. Conclusion The main specimen margin is a strong independent predictor of LR and OS on multivariate analysis. Intraoperative tumor bed frozen margins do not independently predict prognosis. We conclude that emphasis should be placed on evaluating the main specimen margins when estimating prognosis after OCSCC resection.

  6. Demonstrating the robustness of population surveillance data: implications of error rates on demographic and mortality estimates.

    PubMed

    Fottrell, Edward; Byass, Peter; Berhane, Yemane

    2008-03-25

    As in any measurement process, a certain amount of error may be expected in routine population surveillance operations such as those in demographic surveillance sites (DSSs). Vital events are likely to be missed and errors made no matter what method of data capture is used or what quality control procedures are in place. The extent to which random errors in large, longitudinal datasets affect overall health and demographic profiles has important implications for the role of DSSs as platforms for public health research and clinical trials. Such knowledge is also of particular importance if the outputs of DSSs are to be extrapolated and aggregated with realistic margins of error and validity. This study uses the first 10-year dataset from the Butajira Rural Health Project (BRHP) DSS, Ethiopia, covering approximately 336,000 person-years of data. Simple programmes were written to introduce random errors and omissions into new versions of the definitive 10-year Butajira dataset. Key parameters of sex, age, death, literacy and roof material (an indicator of poverty) were selected for the introduction of errors based on their obvious importance in demographic and health surveillance and their established significant associations with mortality. Defining the original 10-year dataset as the 'gold standard' for the purposes of this investigation, population, age and sex compositions and Poisson regression models of mortality rate ratios were compared between each of the intentionally erroneous datasets and the original 'gold standard' 10-year data. The composition of the Butajira population was well represented despite introducing random errors, and differences between population pyramids based on the derived datasets were subtle. Regression analyses of well-established mortality risk factors were largely unaffected even by relatively high levels of random errors in the data. 
The low sensitivity of parameter estimates and regression analyses to significant amounts of randomly introduced errors indicates a high level of robustness of the dataset. This apparent inertia of population parameter estimates to simulated errors is largely due to the size of the dataset. Tolerable margins of random error in DSS data may exceed 20%. While this is not an argument in favour of poor quality data, reducing the time and valuable resources spent on detecting and correcting random errors in routine DSS operations may be justifiable as the returns from such procedures diminish with increasing overall accuracy. The money and effort currently spent on endlessly correcting DSS datasets would perhaps be better spent on increasing the surveillance population size and geographic spread of DSSs and analysing and disseminating research findings.
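
The error-injection approach above can be illustrated with a small simulation that is not the BRHP data: generate a population with a binary risk factor, corrupt a fraction of the covariate at random, and compare the crude mortality rate ratio before and after. All rates and sample sizes below are assumptions for illustration:

```python
import random

random.seed(1)

# Illustrative simulation (not the Butajira dataset): a binary risk factor
# doubles the mortality rate; we flip a fraction of the covariate labels at
# random and observe how the crude rate ratio is attenuated.
N = 100_000
exposure = [random.random() < 0.3 for _ in range(N)]      # 30% exposed
deaths = [int(random.random() < (0.02 if e else 0.01)) for e in exposure]

def rate_ratio(exposure, deaths):
    exp_deaths = sum(d for e, d in zip(exposure, deaths) if e)
    exp_n = sum(exposure)
    unexp_deaths = sum(d for e, d in zip(exposure, deaths) if not e)
    unexp_n = len(exposure) - exp_n
    return (exp_deaths / exp_n) / (unexp_deaths / unexp_n)

rr_clean = rate_ratio(exposure, deaths)

# Flip 20% of exposure labels to mimic random recording errors.
noisy = [(not e) if random.random() < 0.20 else e for e in exposure]
rr_noisy = rate_ratio(noisy, deaths)
print(rr_clean, rr_noisy)  # the noisy ratio is attenuated toward 1 but remains above it
```

Random misclassification biases the estimate toward the null without erasing it, which mirrors the robustness the study reports.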

  7. A randomized trial comparing INR monitoring devices in patients with anticoagulation self-management: evaluation of a novel error-grid approach.

    PubMed

    Hemkens, Lars G; Hilden, Kristian M; Hartschen, Stephan; Kaiser, Thomas; Didjurgeit, Ulrike; Hansen, Roland; Bender, Ralf; Sawicki, Peter T

    2008-08-01

In addition to the metrological quality of international normalized ratio (INR) monitoring devices used in patients' self-management of long-term anticoagulation, the effectiveness of self-monitoring with such devices has to be evaluated under real-life conditions with a focus on clinical implications. One approach to evaluating the clinical significance of inaccuracies is error-grid analysis, as already established in self-monitoring of blood glucose. Two anticoagulation monitors were compared in a real-life setting and a novel error-grid instrument for oral anticoagulation was evaluated. In a randomized crossover study, 16 patients performed self-management of anticoagulation using the INRatio and the CoaguChek S systems. Main outcome measures were clinically relevant INR differences according to established criteria and to the error-grid approach. A lower rate of clinically relevant disagreements according to Anderson's criteria was found with CoaguChek S than with INRatio, although the difference was not statistically significant (10.77% vs. 12.90%; P = 0.787). Using the error-grid we found broadly consistent results: more measurement pairs with discrepancies of no or low clinical relevance were found with CoaguChek S, whereas with INRatio we found more differences of moderate clinical relevance. A high rate of patient satisfaction with both point-of-care devices was found, with only marginal differences. The principal appropriateness of the investigated point-of-care devices to adequately monitor the INR is shown. The error-grid is useful for comparing monitoring methods with a focus on clinical relevance under real-life conditions, beyond assessing pure metrological quality, but we emphasize that additional trials using this instrument with larger patient populations are needed to detect differences in clinically relevant disagreements.

  8. An algorithm of improving speech emotional perception for hearing aid

    NASA Astrophysics Data System (ADS)

    Xi, Ji; Liang, Ruiyu; Fei, Xianju

    2017-07-01

In this paper, a speech emotion recognition (SER) algorithm was proposed to improve the emotional perception of hearing-impaired people. The algorithm utilizes multiple kernel technology to overcome a drawback of the SVM: slow training speed. Firstly, in order to improve the adaptive performance of the Gaussian Radial Basis Function (RBF) kernel, the parameter determining the nonlinear mapping was optimized on the basis of kernel target alignment. Then, the obtained kernel function was used as the basis kernel of Multiple Kernel Learning (MKL) with a slack variable that could solve the over-fitting problem. However, the slack variable also brings error into the result. Therefore, a soft-margin MKL was proposed to balance the margin against the error. Moreover, an iterative algorithm was used to solve for the combination coefficients and hyperplane equations. Experimental results show that the proposed algorithm can acquire an accuracy of 90% for five kinds of emotions, including happiness, sadness, anger, fear and neutral. Compared with KPCA+CCA and PIM-FSVM, the proposed algorithm has the highest accuracy.

  9. A comparison of exact tests for trend with binary endpoints using Bartholomew's statistic.

    PubMed

    Consiglio, J D; Shan, G; Wilding, G E

    2014-01-01

    Tests for trend are important in a number of scientific fields when trends associated with binary variables are of interest. Implementing the standard Cochran-Armitage trend test requires an arbitrary choice of scores assigned to represent the grouping variable. Bartholomew proposed a test for qualitatively ordered samples using asymptotic critical values, but type I error control can be problematic in finite samples. To our knowledge, use of the exact probability distribution has not been explored, and we study its use in the present paper. Specifically we consider an approach based on conditioning on both sets of marginal totals and three unconditional approaches where only the marginal totals corresponding to the group sample sizes are treated as fixed. While slightly conservative, all four tests are guaranteed to have actual type I error rates below the nominal level. The unconditional tests are found to exhibit far less conservatism than the conditional test and thereby gain a power advantage.
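
For reference, the standard (asymptotic) Cochran-Armitage trend test mentioned above, with its arbitrary score choice, can be sketched as follows; this is the baseline test the paper contrasts with exact approaches, not the paper's exact tests themselves:

```python
import math

def cochran_armitage(successes, totals, scores):
    """Asymptotic Cochran-Armitage trend test for binary outcomes across
    ordered groups; the choice of scores is arbitrary, as the abstract notes."""
    N = sum(totals)
    p_bar = sum(successes) / N
    # Statistic: score-weighted deviation of group success counts from expectation.
    t = sum(s * (x - n * p_bar) for s, x, n in zip(scores, successes, totals))
    s_mean = sum(s * n for s, n in zip(scores, totals)) / N
    var = p_bar * (1 - p_bar) * sum(n * (s - s_mean) ** 2
                                    for s, n in zip(scores, totals))
    z = t / math.sqrt(var)
    p_two_sided = math.erfc(abs(z) / math.sqrt(2))  # 2 * (1 - Phi(|z|))
    return z, p_two_sided

# Monotonically increasing success proportions with equally spaced scores.
z, p = cochran_armitage([5, 10, 20], [50, 50, 50], [1, 2, 3])
print(z, p)  # a clearly significant increasing trend
```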

  10. Reliability of computer designed surgical guides in six implant rehabilitations with two years follow-up.

    PubMed

    Giordano, Mauro; Ausiello, Pietro; Martorelli, Massimo; Sorrentino, Roberto

    2012-09-01

To evaluate the reliability and accuracy of computer-designed surgical guides in osseointegrated oral implant rehabilitation. Six implant rehabilitations, with a total of 17 implants, were completed with computer-designed surgical guides, performed with the master model developed from muco-compressive and muco-static impressions. In the first case, the surgical guide had exclusively mucosal support; in the second case, exclusively dental support. For all six cases, computer-aided surgical planning was performed by virtual analyses with 3D models obtained from dental-scan DICOM data. The accuracy and stability of implant osseointegration over two years post surgery was then evaluated with clinical and radiographic examinations. Radiographic examination, performed with digital acquisitions (RVG - Radio Video graph) and parallel techniques, allowed two-dimensional feedback with a margin of linear error of 10%. Implant osseointegration was recorded for all the examined rehabilitations. During the clinical and radiographic post-surgical assessments over the following two years, the peri-implant bone level was found to be stable and without the appearance of any complications. The margin of error recorded between pre-operative positions assigned by virtual analysis and the post-surgical digital radiographic observations was as low as 0.2 mm. Computer-guided implant surgery can be very effective in oral rehabilitations, providing an opportunity for the surgeon (a) to avoid the need for muco-periosteal detachments and (b) to perform minimally invasive interventions, whenever appropriate, with a flapless approach. Copyright © 2012 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.

  11. Influence of conservative corrections on parameter estimation for extreme-mass-ratio inspirals

    NASA Astrophysics Data System (ADS)

    Huerta, E. A.; Gair, Jonathan R.

    2009-04-01

We present an improved numerical kludge waveform model for circular, equatorial extreme-mass-ratio inspirals (EMRIs). The model is based on true Kerr geodesics, augmented by radiative self-force corrections derived from perturbative calculations, and in this paper for the first time we include conservative self-force corrections that we derive by comparison to post-Newtonian results. We present results of a Monte Carlo simulation of parameter estimation errors computed using the Fisher matrix and also assess the theoretical errors that would arise from omitting the conservative correction terms we include here. We present results for three different types of system, namely, the inspirals of black holes, neutron stars, or white dwarfs into a supermassive black hole (SMBH). The analysis shows that for a typical source (a 10 M⊙ compact object captured by a 10^6 M⊙ SMBH at a signal-to-noise ratio of 30) we expect to determine the two masses to within a fractional error of ~10^-4, measure the spin parameter q to ~10^-4.5, and determine the location of the source on the sky and the spin orientation to within 10^-3 steradians. We show that, for this kludge model, omitting the conservative corrections leads to a small error over much of the parameter space, i.e., the ratio R of the theoretical model error to the Fisher matrix error is R < 1 for all ten parameters in the model. For the few systems with larger errors, typically R < 3, and hence the conservative corrections can be marginally ignored. In addition, we use our model and first-order self-force results for Schwarzschild black holes to estimate the error that arises from omitting the second-order radiative piece of the self-force. This indicates that it may not be necessary to go beyond first order to recover accurate parameter estimates.
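
The Fisher-matrix error forecast used above follows a generic recipe: build F_ij from the derivatives of the signal with respect to each parameter, and read off the forecast 1-sigma errors as sqrt((F^-1)_ii). A toy two-parameter version with a sinusoidal signal, not the EMRI waveform model, looks like this:

```python
import math

# Toy Fisher-matrix forecast for h(t; A, f) = A * sin(2*pi*f*t) in white noise
# (an illustrative stand-in for the EMRI kludge waveform):
#   F_ij = sum_t (dh/dtheta_i)(dh/dtheta_j) / noise_var,
# and the forecast 1-sigma error on theta_i is sqrt((F^-1)_ii).

A, f, noise_sd = 1.0, 0.1, 0.5
ts = [0.01 * k for k in range(10_000)]

dh_dA = [math.sin(2 * math.pi * f * t) for t in ts]
dh_df = [A * 2 * math.pi * t * math.cos(2 * math.pi * f * t) for t in ts]

F = [[sum(a * b for a, b in zip(x, y)) / noise_sd ** 2
      for y in (dh_dA, dh_df)] for x in (dh_dA, dh_df)]

# Invert the 2x2 Fisher matrix directly.
det = F[0][0] * F[1][1] - F[0][1] * F[1][0]
cov = [[F[1][1] / det, -F[0][1] / det], [-F[1][0] / det, F[0][0] / det]]
sigma_A, sigma_f = math.sqrt(cov[0][0]), math.sqrt(cov[1][1])
print(sigma_A, sigma_f)  # frequency is far better constrained than amplitude
```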

  12. Replication and Pedagogy in the History of Psychology VI: Egon Brunswik on Perception and Explicit Reasoning

    NASA Astrophysics Data System (ADS)

    Athy, Jeremy; Friedrich, Jeff; Delany, Eileen

    2008-05-01

Egon Brunswik (1903-1955) made an interesting distinction between perception and explicit reasoning, arguing that perception yields quick estimates of an object's size that nearly always result in good approximations in uncertain environments, whereas explicit reasoning, while better at achieving exact estimates, can often fail by wide margins. An experiment conducted by Brunswik to investigate these ideas was never published, and the only available information is a figure of the results presented in a posthumous book in 1956. We replicated and extended his study to gain insight into the procedures Brunswik used in obtaining his results. Explicit reasoning resulted in fewer errors, yet more extreme ones, than perception. Brunswik's graphical analysis of the results, however, led to different conclusions than did a modern statistically based analysis.

  13. Investigation on synchronization of the offset printing process for fine patterning and precision overlay

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kang, Dongwoo; Lee, Eonseok; Kim, Hyunchang

    2014-06-21

Offset printing processes are promising candidates for producing printed electronics due to their capacity for fine patterning and suitability for mass production. To print high-resolution patterns with good overlay using offset printing, the velocities of the two contact surfaces between which ink is transferred should be synchronized perfectly. However, the exact velocity of the contact surfaces is unknown due to several imperfections, including tolerances, blanket swelling, and velocity ripple, which prevents the system from being operated in the synchronized condition. In this paper, a novel measurement method based on the sticking model of friction force was proposed to determine the best synchronized condition, i.e., the condition in which the rate of synchronization error is minimized. It was verified by experiment that the friction force can accurately represent the rate of synchronization error. Based on the measurement results of the synchronization error, the allowable margin of synchronization error when printing high-resolution patterns was investigated experimentally using reverse offset printing. There is a region where the patterning performance is unchanged even though the synchronization error is varied, and this may be viewed as indirect evidence that printability is secured when there is no slip at the contact interface. To understand what happens at the contact surfaces during ink transfer, a deformation model of the blanket's surface was developed. The model estimates how much deformation on the blanket's surface can be borne by the synchronization error when there is no slip at the contact interface. In addition, the model shows that the synchronization error results in scale variation in the machine direction (MD), which means that the printing registration in the MD can be adjusted actively by controlling the synchronization if there is a sufficient margin of synchronization error to guarantee printability.
The effect of synchronization on the printing registration was verified experimentally using gravure offset printing. The variations in synchronization result in differences in the MD scale, and the measured MD scale matches the modeled MD scale exactly.

  14. Ensemble-marginalized Kalman filter for linear time-dependent PDEs with noisy boundary conditions: application to heat transfer in building walls

    NASA Astrophysics Data System (ADS)

    Iglesias, Marco; Sawlan, Zaid; Scavino, Marco; Tempone, Raúl; Wood, Christopher

    2018-07-01

    In this work, we present the ensemble-marginalized Kalman filter (EnMKF), a sequential algorithm analogous to our previously proposed approach (Ruggeri et al 2017 Bayesian Anal. 12 407–33, Iglesias et al 2018 Int. J. Heat Mass Transfer 116 417–31), for estimating the state and parameters of linear parabolic partial differential equations in initial-boundary value problems when the boundary data are noisy. We apply EnMKF to infer the thermal properties of building walls and to estimate the corresponding heat flux from real and synthetic data. Compared with a modified ensemble Kalman filter (EnKF) that is not marginalized, EnMKF reduces the bias error, avoids the collapse of the ensemble without needing to add inflation, and converges to the mean field posterior using or less of the ensemble size required by EnKF. According to our results, the marginalization technique in EnMKF is key to performance improvement with smaller ensembles at any fixed time.

  15. Maximum entropy approach to statistical inference for an ocean acoustic waveguide.

    PubMed

    Knobles, D P; Sagers, J D; Koch, R A

    2012-02-01

    A conditional probability distribution suitable for estimating the statistical properties of ocean seabed parameter values inferred from acoustic measurements is derived from a maximum entropy principle. The specification of the expectation value for an error function constrains the maximization of an entropy functional. This constraint determines the sensitivity factor (β) to the error function of the resulting probability distribution, which is a canonical form that provides a conservative estimate of the uncertainty of the parameter values. From the conditional distribution, marginal distributions for individual parameters can be determined from integration over the other parameters. The approach is an alternative to obtaining the posterior probability distribution without an intermediary determination of the likelihood function followed by an application of Bayes' rule. In this paper the expectation value that specifies the constraint is determined from the values of the error function for the model solutions obtained from a sparse number of data samples. The method is applied to ocean acoustic measurements taken on the New Jersey continental shelf. The marginal probability distribution for the values of the sound speed ratio at the surface of the seabed and the source levels of a towed source are examined for different geoacoustic model representations. © 2012 Acoustical Society of America
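
The canonical distribution described above, p(θ) ∝ exp(-βE(θ)), and the marginalization over nuisance parameters can be sketched on a discrete grid. The error function and β below are toy stand-ins for the paper's geoacoustic misfit and sensitivity factor:

```python
import math

# Canonical (maximum-entropy) distribution over a two-parameter grid:
# p(theta) proportional to exp(-beta * E(theta)); the marginal for one
# parameter follows by summing out the other. E and beta are illustrative
# assumptions, not the paper's geoacoustic model.

beta = 4.0
grid = [0.05 * i for i in range(41)]       # candidate values for each parameter

def error_fn(a, b):
    return (a - 1.2) ** 2 + (b - 0.6) ** 2  # toy misfit, minimum at (1.2, 0.6)

weights = [[math.exp(-beta * error_fn(a, b)) for b in grid] for a in grid]
Z = sum(sum(row) for row in weights)        # partition function (normalizer)
joint = [[w / Z for w in row] for row in weights]

marginal_a = [sum(row) for row in joint]    # integrate (sum) over b
best_a = grid[max(range(len(grid)), key=lambda i: marginal_a[i])]
print(best_a)  # the marginal peaks at the error-function minimum
```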

  16. Investigation of interfractional shape variations based on statistical point distribution model for prostate cancer radiation therapy.

    PubMed

    Shibayama, Yusuke; Arimura, Hidetaka; Hirose, Taka-Aki; Nakamoto, Takahiro; Sasaki, Tomonari; Ohga, Saiji; Matsushita, Norimasa; Umezu, Yoshiyuki; Nakamura, Yasuhiko; Honda, Hiroshi

    2017-05-01

The setup errors and organ motion errors pertaining to clinical target volume (CTV) have been considered as two major causes of uncertainties in the determination of the CTV-to-planning target volume (PTV) margins for prostate cancer radiation treatment planning. We based our study on the assumption that interfractional target shape variations are not negligible as another source of uncertainty for the determination of precise CTV-to-PTV margins. Thus, we investigated the interfractional shape variations of CTVs based on a point distribution model (PDM) for prostate cancer radiation therapy. To quantitate the shape variations of CTVs, the PDM was applied to the contours of 4 types of CTV regions (low-risk, intermediate-risk, and high-risk CTVs, and prostate plus entire seminal vesicles), which were delineated by considering prostate cancer risk groups on planning computed tomography (CT) and cone beam CT (CBCT) images of 73 fractions of 10 patients. The standard deviations (SDs) of the interfractional random errors for shape variations were obtained from covariance matrices based on the PDMs, which were generated from the vertices of triangulated CTV surfaces. The correspondences between CTV surface vertices were determined based on a thin-plate spline robust point matching algorithm. The systematic error for shape variations was defined as the average deviation between the surfaces of an average CTV and the planning CTVs, and the random error as the average deviation of CTV surface vertices for fractions from an average CTV surface. The means of the SDs of the systematic errors for the four types of CTVs ranged from 1.0 to 2.0 mm along the anterior direction, 1.2 to 2.6 mm along the posterior direction, 1.0 to 2.5 mm along the superior direction, 0.9 to 1.9 mm along the inferior direction, 0.9 to 2.6 mm along the right direction, and 1.0 to 3.0 mm along the left direction.
Concerning the random errors, the means of the SDs ranged from 0.9 to 1.2 mm along the anterior direction, 1.0 to 1.4 mm along the posterior direction, 0.9 to 1.3 mm along the superior direction, 0.8 to 1.0 mm along the inferior direction, 0.8 to 0.9 mm along the right direction, and 0.8 to 1.0 mm along the left direction. Since the shape variations were not negligible for intermediate and high-risk CTVs, they should be taken into account for the determination of the CTV-to-PTV margins in radiation treatment planning of prostate cancer. © 2017 American Association of Physicists in Medicine.
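
The systematic/random decomposition used above is standard: for a given surface point, the systematic error is the deviation of its mean treatment position from the planning position, and the random error is the spread of per-fraction positions about that mean. A one-vertex sketch with illustrative numbers, not patient data:

```python
import math

# Toy version of the decomposition: for one surface vertex, systematic error =
# (mean over fractions) - (planning position); random error = SD of the
# per-fraction positions about that mean. All numbers are illustrative.

planning_position = 10.0                           # mm, from the planning CT
fraction_positions = [10.8, 9.6, 10.4, 11.0, 9.9]  # per-fraction CBCT positions

mean_pos = sum(fraction_positions) / len(fraction_positions)
systematic = mean_pos - planning_position
random_sd = math.sqrt(sum((p - mean_pos) ** 2 for p in fraction_positions)
                      / (len(fraction_positions) - 1))
print(systematic, random_sd)
```

Pooling these per-vertex quantities over patients and directions yields the population SDs quoted in the abstract.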

  17. Forecast models for suicide: Time-series analysis with data from Italy.

    PubMed

    Preti, Antonio; Lentini, Gianluca

    2016-01-01

The prediction of suicidal behavior is a complex task. To fine-tune targeted preventative interventions, predictive analytics (i.e. forecasting future risk of suicide) is more important than exploratory data analysis (pattern recognition, e.g. detection of seasonality in suicide time series). This study sets out to investigate the accuracy of forecasting models of suicide for men and women. A total of 101,499 male suicides and 39,681 female suicides, which occurred in Italy from 1969 to 2003, were investigated. In order to apply the forecasting model and test its accuracy, the time series were split into a training set (1969 to 1996; 336 months) and a test set (1997 to 2003; 84 months). The main outcome was the accuracy of forecasting models on the monthly number of suicides. The following measures of accuracy were used: mean absolute error; root mean squared error; mean absolute percentage error; mean absolute scaled error. In both male and female suicides a change in the trend pattern was observed, with an increase from 1969 onwards to a maximum around 1990 and a decrease thereafter. The variances attributable to the seasonal and trend components were, respectively, 24% and 64% in male suicides, and 28% and 41% in female ones. Both annual and seasonal historical trends of monthly data contributed to forecasting future trends of suicide with a margin of error of around 10%. The finding is clearer in the male than in the female time series of suicide. The main conclusion of the study is that models taking seasonality into account seem to be able to derive information on deviation from the mean when this occurs as a zenith, but they fail to reproduce it when it occurs as a nadir. Preventative efforts should concentrate on the factors that influence the occurrence of increases above the main trend in both seasonal and cyclic patterns of suicides.
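
The four accuracy measures named above have standard definitions and can be computed on any train/test split; the synthetic monthly series below is an illustration, not the Italian data:

```python
import math

# The four accuracy measures from the abstract, applied to a synthetic monthly
# series (not the Italian suicide series).

def accuracy(actual, forecast, train):
    errors = [a - f for a, f in zip(actual, forecast)]
    mae = sum(abs(e) for e in errors) / len(errors)
    rmse = math.sqrt(sum(e * e for e in errors) / len(errors))
    mape = 100 * sum(abs(e / a) for e, a in zip(errors, actual)) / len(errors)
    # MASE scales MAE by the in-sample MAE of the naive one-step-ahead forecast.
    naive_mae = sum(abs(train[i] - train[i - 1])
                    for i in range(1, len(train))) / (len(train) - 1)
    return {"MAE": mae, "RMSE": rmse, "MAPE": mape, "MASE": mae / naive_mae}

# Seasonal monthly series: 336 training months, 84 test months (as in the study design).
train = [100 + 10 * math.sin(2 * math.pi * m / 12) for m in range(336)]
test = [100 + 10 * math.sin(2 * math.pi * m / 12) for m in range(336, 420)]
forecast = [t + 2.0 for t in test]            # a forecast biased high by 2 units

print(accuracy(test, forecast, train))
```

A MASE below 1 indicates the forecast beats the naive one-step benchmark on this series.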

  18. Two-part models with stochastic processes for modelling longitudinal semicontinuous data: Computationally efficient inference and modelling the overall marginal mean.

    PubMed

    Yiu, Sean; Tom, Brian Dm

    2017-01-01

    Several researchers have described two-part models with patient-specific stochastic processes for analysing longitudinal semicontinuous data. In theory, such models can offer greater flexibility than the standard two-part model with patient-specific random effects. However, in practice, the high dimensional integrations involved in the marginal likelihood (i.e. integrated over the stochastic processes) significantly complicates model fitting. Thus, non-standard computationally intensive procedures based on simulating the marginal likelihood have so far only been proposed. In this paper, we describe an efficient method of implementation by demonstrating how the high dimensional integrations involved in the marginal likelihood can be computed efficiently. Specifically, by using a property of the multivariate normal distribution and the standard marginal cumulative distribution function identity, we transform the marginal likelihood so that the high dimensional integrations are contained in the cumulative distribution function of a multivariate normal distribution, which can then be efficiently evaluated. Hence, maximum likelihood estimation can be used to obtain parameter estimates and asymptotic standard errors (from the observed information matrix) of model parameters. We describe our proposed efficient implementation procedure for the standard two-part model parameterisation and when it is of interest to directly model the overall marginal mean. The methodology is applied on a psoriatic arthritis data set concerning functional disability.
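
The key computational idea above is that a high-dimensional integral can be rewritten as a multivariate normal probability, which is cheap to evaluate. As a toy check of that idea (not the paper's two-part model), the orthant probability of a correlated bivariate normal can be estimated by Monte Carlo and compared with its closed form:

```python
import math
import random

# Toy illustration: estimate P(X < 0, Y < 0) for a bivariate normal with
# correlation rho by Monte Carlo, and compare with the closed form
# 1/4 + arcsin(rho) / (2*pi). This stands in for evaluating a
# multivariate-normal CDF in place of a high-dimensional integral.

random.seed(7)
rho = 0.5
n = 200_000
hits = 0
for _ in range(n):
    z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
    x = z1
    y = rho * z1 + math.sqrt(1 - rho ** 2) * z2   # correlated draw
    hits += (x < 0) and (y < 0)

mc_estimate = hits / n
closed_form = 0.25 + math.asin(rho) / (2 * math.pi)
print(mc_estimate, closed_form)  # both near 1/3 for rho = 0.5
```

In practice one would call an efficient MVN CDF routine rather than Monte Carlo; the point is that the integral reduces to a single CDF evaluation.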

  19. Effect of patient setup errors on simultaneously integrated boost head and neck IMRT treatment plans

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Siebers, Jeffrey V.; Keall, Paul J.; Wu, Qiuwen

    2005-10-01

    Purpose: The purpose of this study is to determine dose delivery errors that could result from random and systematic setup errors for head-and-neck patients treated using the simultaneous integrated boost (SIB)-intensity-modulated radiation therapy (IMRT) technique. Methods and Materials: Twenty-four patients who participated in an intramural Phase I/II parotid-sparing IMRT dose-escalation protocol using the SIB treatment technique had their dose distributions reevaluated to assess the impact of random and systematic setup errors. The dosimetric effect of random setup error was simulated by convolving the two-dimensional fluence distribution of each beam with the random setup error probability density distribution. Random setup errors of σ = 1, 3, and 5 mm were simulated. Systematic setup errors were simulated by randomly shifting the patient isocenter along each of the three Cartesian axes, with each shift selected from a normal distribution. Systematic setup error distributions with Σ = 1.5 and 3.0 mm along each axis were simulated. Combined systematic and random setup errors were simulated for Σ = σ = 1.5 and 3.0 mm along each axis. For each dose calculation, the gross tumor volume (GTV) dose received by 98% of the volume (D98), clinical target volume (CTV) D90, nodes D90, cord D2, parotid D50, and parotid mean dose were evaluated with respect to the plan used for treatment, both for the structure dose and for an effective planning target volume (PTV) with a 3-mm margin. Results: Simultaneous integrated boost-IMRT head-and-neck treatment plans were found to be less sensitive to random setup errors than to systematic setup errors. For random-only errors, dose errors exceeded 3% only when the random setup error σ exceeded 3 mm.
    Simulated systematic setup errors with Σ = 1.5 mm resulted in approximately 10% of plans having more than a 3% dose error, whereas Σ = 3.0 mm resulted in half of the plans having more than a 3% dose error and 28% having a 5% dose error. Combined random and systematic dose errors with Σ = σ = 3.0 mm resulted in more than 50% of plans having at least a 3% dose error and 38% of the plans having at least a 5% dose error. Evaluation with respect to a 3-mm expanded PTV reduced the observed dose deviations greater than 5% for the Σ = σ = 3.0 mm simulations to 5.4% of the plans simulated. Conclusions: Head-and-neck SIB-IMRT dosimetric accuracy would benefit from methods to reduce patient systematic setup errors. When GTV, CTV, or nodal volumes are used for dose evaluation, plans simulated including the effects of random and systematic errors deviate substantially from the nominal plan. The use of PTVs for dose evaluation in the nominal plan improves agreement with evaluated GTV, CTV, and nodal dose values under simulated setup errors. PTV concepts should be used for SIB-IMRT head-and-neck squamous cell carcinoma patients, although the size of the margins may be less than those used with three-dimensional conformal radiation therapy.
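
    The random-error simulation described above, convolving each beam's fluence with the setup-error probability density, can be sketched in one dimension. The field size, grid spacing, and σ below are illustrative stand-ins, not the study's beams.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

# Hypothetical 1-D fluence profile on a 1 mm grid: a flat 40 mm open field.
dx = 1.0                   # grid spacing [mm]
fluence = np.zeros(200)
fluence[80:120] = 1.0      # open field between positions 80 mm and 120 mm

# Random setup error with sigma = 3 mm: convolving the fluence with the
# Gaussian probability density of the error blurs the field penumbra.
sigma_mm = 3.0
blurred = gaussian_filter1d(fluence, sigma=sigma_mm / dx)
```

    The convolution conserves total fluence and leaves the field center essentially unchanged while softening the edges, which is why random errors perturb the plan less than systematic shifts of the same magnitude.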

  20. Craniocaudal Safety Margin Calculation Based on Interfractional Changes in Tumor Motion in Lung SBRT Assessed With an EPID in Cine Mode

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ueda, Yoshihiro, E-mail: ueda-yo@mc.pref.osaka.jp; Miyazaki, Masayoshi; Nishiyama, Kinji

    2012-07-01

    Purpose: To evaluate setup error and interfractional changes in tumor motion magnitude using an electronic portal imaging device in cine mode (EPID cine) during the course of stereotactic body radiation therapy (SBRT) for non-small-cell lung cancer (NSCLC), and to calculate margins to compensate for these variations. Materials and Methods: Subjects were 28 patients with Stage I NSCLC who underwent SBRT. Respiratory-correlated four-dimensional computed tomography (4D-CT) at simulation was binned into 10 respiratory phases, which provided average intensity projection CT data sets (AIP). On 4D-CT, peak-to-peak motion of the tumor (M-4DCT) in the craniocaudal direction was assessed and the tumor center (mean tumor position [MTP]) of the AIP (MTP-4DCT) was determined. At treatment, the tumor on cone beam CT was registered to that on AIP for patient setup. During three sessions of irradiation, peak-to-peak motion of the tumor (M-cine) and the mean tumor position (MTP-cine) were obtained using EPID cine and in-house software. Based on changes in tumor motion magnitude (ΔM) and patient setup error (ΔMTP), defined as the differences between M-4DCT and M-cine and between MTP-4DCT and MTP-cine, a margin to compensate for these variations was calculated with Stroom's formula. Results: The means (±standard deviation: SD) of M-4DCT and M-cine were 3.1 (±3.4) and 4.0 (±3.6) mm, respectively. The means (±SD) of ΔM and ΔMTP were 0.9 (±1.3) and 0.2 (±2.4) mm, respectively. Internal target volume-planning target volume (ITV-PTV) margins to compensate for ΔM, ΔMTP, and both combined were 3.7, 5.2, and 6.4 mm, respectively. Conclusion: EPID cine is a useful modality for assessing interfractional variations of tumor motion. The ITV-PTV margins to compensate for these variations can be calculated.
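
    As a sketch of the margin recipe named above: Stroom's formula combines a systematic SD Σ and a random SD σ as 2Σ + 0.7σ. The example inputs are hypothetical, but the quadrature sum below is consistent with the abstract's combined 6.4 mm margin arising from its two reported components.

```python
import math

def stroom_margin(Sigma, sigma):
    """CTV-to-PTV margin from Stroom's formula: 2*Sigma + 0.7*sigma (mm)."""
    return 2.0 * Sigma + 0.7 * sigma

# Hypothetical error components (mm), purely for illustration:
margin = stroom_margin(Sigma=2.0, sigma=3.0)  # 2*2.0 + 0.7*3.0 = 6.1 mm

# Independent margin contributions are commonly combined in quadrature;
# doing so with the abstract's two margins (3.7 and 5.2 mm) reproduces
# its combined value of ~6.4 mm.
combined = math.hypot(3.7, 5.2)
```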

  1. Deformable Dose Reconstruction to Optimize the Planning and Delivery of Liver Cancer Radiotherapy

    NASA Astrophysics Data System (ADS)

    Velec, Michael

    The precise delivery of radiation to liver cancer patients results in improved control with higher tumor doses and minimized normal tissue doses. A margin of normal tissue around the tumor must nevertheless be irradiated to account for treatment delivery uncertainties. Daily image-guidance allows targeting of the liver, a surrogate for the tumor, to reduce geometric errors. However, poor direct tumor visualization, anatomical deformation, and breathing motion introduce uncertainties between the planned dose, calculated on a single pre-treatment computed tomography image, and the dose that is delivered. A novel deformable image registration algorithm based on tissue biomechanics was applied to previous liver cancer patients to track targets and surrounding organs during radiotherapy. Modeling these daily anatomic variations permitted dose accumulation, thereby improving calculations of the delivered doses. The accuracy of the algorithm in tracking dose was validated using imaging from a deformable, 3-dimensional dosimeter able to optically track absorbed dose. Reconstructing the delivered dose revealed that 70% of patients had substantial deviations from the initial planned dose. An alternative image-guidance technique using respiratory-correlated imaging was simulated, which reduced both the residual tumor targeting errors and the magnitude of the delivered dose deviations. A planning and delivery strategy for liver radiotherapy was then developed that minimizes the impact of breathing motion and applies a margin to account for the impact of liver deformation during treatment. This margin is 38% smaller on average than the margin used clinically, and permitted an average dose escalation to liver tumors of 9% for the same risk of toxicity. Simulating the delivered dose with deformable dose reconstruction demonstrated that the plans with smaller margins were robust, as 90% of patients' tumors received the intended dose.
This strategy can be readily implemented with widely available technologies and thus can potentially improve local control for liver cancer patients receiving radiotherapy.

  2. 75 FR 29972 - Certain Seamless Carbon and Alloy Steel Standard, Line, and Pressure Pipe from the People's...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-28

    ... errors, (1) would result in a change of at least five absolute percentage points in, but not less than 25... determination; or (2) would result in a difference between a weighted-average dumping margin of zero or de...

  3. Statistics Using Just One Formula

    ERIC Educational Resources Information Center

    Rosenthal, Jeffrey S.

    2018-01-01

    This article advocates that introductory statistics be taught by basing all calculations on a single simple margin-of-error formula and deriving all of the standard introductory statistical concepts (confidence intervals, significance tests, comparisons of means and proportions, etc.) from that one formula. It is argued that this approach will…

  4. Wide-area mapping of small-scale features in agricultural landscapes using airborne remote sensing

    NASA Astrophysics Data System (ADS)

    O'Connell, Jerome; Bradter, Ute; Benton, Tim G.

    2015-11-01

    Natural and semi-natural habitats in agricultural landscapes are likely to come under increasing pressure with the global population set to exceed 9 billion by 2050. These non-cropped habitats are primarily made up of trees, hedgerows and grassy margins, and their amount, quality and spatial configuration can have strong implications for the delivery and sustainability of various ecosystem services. In this study, high spatial resolution (0.5 m) colour infrared aerial photography (CIR) was used in object based image analysis for the classification of non-cropped habitat in a 10,029 ha area of southeast England. Three classification scenarios were devised using 4 and 9 class scenarios. The machine learning algorithm Random Forest (RF) was used to reduce the number of variables used for each classification scenario by 25.5% ± 2.7%. Proportion of votes from the 4 class hierarchy was made available to the 9 class scenarios and these were the highest-ranked variables in all cases. This approach allowed for misclassified parent objects to be correctly classified at a lower level. A single object hierarchy with 4 class proportion of votes produced the best result (kappa 0.909). Validation of the optimum training sample size in RF showed no significant difference between mean internal out-of-bag error and external validation. As an example of the utility of this data, we assessed habitat suitability for a declining farmland bird, the yellowhammer (Emberiza citrinella), which requires hedgerows associated with grassy margins. We found that ~22% of hedgerows were within 200 m of margins with an area >183.31 m². The results from this analysis can form a key information source at the environmental and policy level in landscape optimisation for food production and ecosystem service sustainability.
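
    A minimal sketch of Random-Forest-based variable reduction with internal out-of-bag validation, as used in the study. The synthetic data, feature counts, and hyperparameters are illustrative stand-ins for the study's image-object variables, and the importance-ranking cut is one plausible reduction rule.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Toy stand-in for the image-object feature table (spectral/texture variables).
X, y = make_classification(n_samples=500, n_features=40, n_informative=8,
                           random_state=0)

rf = RandomForestClassifier(n_estimators=300, oob_score=True, random_state=0)
rf.fit(X, y)
oob_error_full = 1.0 - rf.oob_score_   # internal out-of-bag error, all variables

# Keep only the top-ranked variables by RF importance (the study reduced
# variable counts by ~25%; keeping 10 of 40 here is purely illustrative).
top = np.argsort(rf.feature_importances_)[::-1][:10]
rf_small = RandomForestClassifier(n_estimators=300, oob_score=True,
                                  random_state=0).fit(X[:, top], y)
oob_error_small = 1.0 - rf_small.oob_score_
```

    The OOB error plays the role of the internal validation the study compared against external validation samples.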

  5. Wide-area mapping of small-scale features in agricultural landscapes using airborne remote sensing.

    PubMed

    O'Connell, Jerome; Bradter, Ute; Benton, Tim G

    2015-11-01

    Natural and semi-natural habitats in agricultural landscapes are likely to come under increasing pressure with the global population set to exceed 9 billion by 2050. These non-cropped habitats are primarily made up of trees, hedgerows and grassy margins, and their amount, quality and spatial configuration can have strong implications for the delivery and sustainability of various ecosystem services. In this study, high spatial resolution (0.5 m) colour infrared aerial photography (CIR) was used in object based image analysis for the classification of non-cropped habitat in a 10,029 ha area of southeast England. Three classification scenarios were devised using 4 and 9 class scenarios. The machine learning algorithm Random Forest (RF) was used to reduce the number of variables used for each classification scenario by 25.5% ± 2.7%. Proportion of votes from the 4 class hierarchy was made available to the 9 class scenarios and these were the highest-ranked variables in all cases. This approach allowed for misclassified parent objects to be correctly classified at a lower level. A single object hierarchy with 4 class proportion of votes produced the best result (kappa 0.909). Validation of the optimum training sample size in RF showed no significant difference between mean internal out-of-bag error and external validation. As an example of the utility of this data, we assessed habitat suitability for a declining farmland bird, the yellowhammer (Emberiza citrinella), which requires hedgerows associated with grassy margins. We found that ~22% of hedgerows were within 200 m of margins with an area >183.31 m². The results from this analysis can form a key information source at the environmental and policy level in landscape optimisation for food production and ecosystem service sustainability.

  6. Quick-low-density parity check and dynamic threshold voltage optimization in 1X nm triple-level cell NAND flash memory with comprehensive analysis of endurance, retention-time, and temperature variation

    NASA Astrophysics Data System (ADS)

    Doi, Masafumi; Tokutomi, Tsukasa; Hachiya, Shogo; Kobayashi, Atsuro; Tanakamaru, Shuhei; Ning, Sheyang; Ogura Iwasaki, Tomoko; Takeuchi, Ken

    2016-08-01

    NAND flash memory’s reliability degrades with increasing endurance, retention-time and/or temperature. After a comprehensive evaluation of 1X nm triple-level cell (TLC) NAND flash, two highly reliable techniques are proposed. The first proposal, quick low-density parity check (Quick-LDPC), requires only one cell read in order to accurately estimate a bit-error rate (BER) that includes the effects of temperature, write and erase (W/E) cycles and retention-time. As a result, 83% read latency reduction is achieved compared to conventional AEP-LDPC. Also, W/E cycling is extended by 100% compared with conventional Bose-Chaudhuri-Hocquenghem (BCH) error-correcting code (ECC). The second proposal, dynamic threshold voltage optimization (DVO), has two parts: adaptive VRef shift (AVS) and VTH space control (VSC). AVS reduces read error and latency by adaptively optimizing the reference voltage (VRef) based on temperature, W/E cycles and retention-time. AVS stores the optimal VRef values in a table in order to enable one cell read. VSC further improves AVS by optimizing the voltage margins between VTH states. DVO reduces BER by 80%.

  7. Group-sequential three-arm noninferiority clinical trial designs

    PubMed Central

    Ochiai, Toshimitsu; Hamasaki, Toshimitsu; Evans, Scott R.; Asakura, Koko; Ohno, Yuko

    2016-01-01

    We discuss group-sequential three-arm noninferiority clinical trial designs that include active and placebo controls for evaluating both assay sensitivity and noninferiority. We extend two existing approaches, the fixed margin and fraction approaches, into a group-sequential setting with two decision-making frameworks. We investigate the operating characteristics including power, Type I error rate, and maximum and expected sample sizes, as design factors vary. In addition, we discuss sample size recalculation and its impact on the power and Type I error rate via a simulation study. PMID:26892481

  8. Quantum-state comparison and discrimination

    NASA Astrophysics Data System (ADS)

    Hayashi, A.; Hashimoto, T.; Horibe, M.

    2018-05-01

    We investigate the performance of the discrimination strategy in the task of comparing known quantum states. In the discrimination strategy, one infers whether or not two quantum systems are in the same state on the basis of the outcomes of separate discrimination measurements on each system. In some cases with more than two possible states, the optimal strategy in minimum-error comparison is to infer that the two systems are in different states without any measurement, implying that the discrimination strategy performs worse than the trivial "no-measurement" strategy. We present a sufficient condition for this phenomenon to happen. For two pure states with equal prior probabilities, we determine the optimal comparison success probability with an error margin, which interpolates between minimum-error and unambiguous comparison. We find that the discrimination strategy is not optimal except in the minimum-error case.
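
    The separate discrimination measurements in this strategy are governed by the standard Helstrom bound for minimum-error discrimination of two pure states. The sketch below computes that single-system bound only, not the paper's error-margin interpolation or the comparison success probability.

```python
import numpy as np

def helstrom_success(psi1, psi2, p1=0.5, p2=0.5):
    """Optimal success probability for minimum-error discrimination of two
    pure states with prior probabilities p1, p2 (Helstrom bound):
    P = (1 + sqrt(1 - 4*p1*p2*|<psi1|psi2>|^2)) / 2.
    """
    overlap = abs(np.vdot(psi1, psi2)) ** 2
    return 0.5 * (1.0 + np.sqrt(1.0 - 4.0 * p1 * p2 * overlap))

# Two equiprobable qubit states with overlap |<psi1|psi2>|^2 = 1/2:
psi1 = np.array([1.0, 0.0])
psi2 = np.array([1.0, 1.0]) / np.sqrt(2.0)
p_disc = helstrom_success(psi1, psi2)  # (1 + 1/sqrt(2)) / 2
```

    Orthogonal states give success probability 1, identical states give 1/2, matching the intuition that comparison by separate measurements degrades as the states become less distinguishable.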

  9. Effect of Patient Set-up and Respiration motion on Defining Biological Targets for Image-Guided Targeted Radiotherapy

    NASA Astrophysics Data System (ADS)

    McCall, Keisha C.

    Identification and monitoring of sub-tumor targets will be a critical step for optimal design and evaluation of cancer therapies in general and biologically targeted radiotherapy (dose-painting) in particular. Quantitative PET imaging may be an important tool for these applications. Currently, radiotherapy planning accounts for tumor motion by applying geometric margins. These margins create a motion envelope to encompass the most probable positions of the tumor, while also maintaining the appropriate tumor control and normal tissue complication probabilities. This motion envelope is effective for uniform dose prescriptions where the therapeutic dose is conformed to the external margins of the tumor. However, much research is needed to establish the equivalent margins for non-uniform fields, where multiple biological targets are present and each target is prescribed its own dose level. Additionally, the size of the biological targets and their close proximity make it impractical to apply planning margins on the sub-tumor level. Also, the extent of high dose regions must be limited to avoid excessive dose to the surrounding tissue. As such, this research project is an investigation of the uncertainty within quantitative PET images of moving and displaced dose-painting targets, and of the residual errors that remain after motion management. This included characterization of the changes in PET voxel-values as objects are moved relative to the discrete sampling interval of PET imaging systems (SPECIFIC AIM 1). Additionally, the repeatability of PET distributions and of delineated dose-painting targets was measured (SPECIFIC AIM 2). The effect of imaging uncertainty on the dose distributions designed using these images (SPECIFIC AIM 3) has also been investigated. This project also included analysis of methods to minimize motion during PET imaging and reduce the dosimetric impact of motion/position-induced imaging uncertainty (SPECIFIC AIM 4).

  10. SU-E-T-318: The Effect of Patient Positioning Errors On Target Coverage and Cochlear Dose in Stereotactic Radiosurgery Treatment of Acoustic Neuromas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dellamonica, D.; Luo, G.; Ding, G.

    Purpose: Setup errors on the order of millimeters may cause under-dosing of targets and significant changes in dose to critical structures, especially when planning with tight margins in stereotactic radiosurgery. This study evaluates the effects of these types of patient positioning uncertainties on planning target volume (PTV) coverage and cochlear dose for stereotactic treatments of acoustic neuromas. Methods: Twelve acoustic neuroma patient treatment plans were retrospectively evaluated in Brainlab iPlan RT Dose 4.1.3. All treatment beams were shaped by HDMLC from a Varian TX machine. Seven patients had planning margins of 2 mm, five had 1–1.5 mm. Six treatment plans were created for each patient simulating a 1 mm setup error in six possible directions: anterior-posterior, lateral, and superior-inferior. The arcs and HDMLC shapes were kept the same for each plan. Change in PTV coverage and mean dose to the cochlea was evaluated for each plan. Results: The average change in PTV coverage for the 72 simulated plans was -1.7% (range: -5 to +1.1%). The largest average change in coverage was observed for shifts in the patient's superior direction (-2.9%). The change in mean cochlear dose was highly dependent upon the direction of the shift. Shifts in the anterior and superior directions resulted in average increases in dose of 13.5% and 3.8%, respectively, while shifts in the posterior and inferior directions resulted in average decreases in dose of 17.9% and 10.2%. The average absolute change in dose to the cochlea was 13.9% (range: 1.4 to 48.6%). No difference was observed based on the size of the planning margin. Conclusion: This study indicates that if the positioning uncertainty is kept within 1 mm, the setup errors may not result in significant under-dosing of the acoustic neuroma target volumes. However, the change in mean cochlear dose is highly dependent upon the direction of the shift.

  11. Evaluating the potential for remote bathymetric mapping of a turbid, sand-bed river: 2. Application to hyperspectral image data from the Platte River

    USGS Publications Warehouse

    Legleiter, C.J.; Kinzel, P.J.; Overstreet, B.T.

    2011-01-01

    This study examined the possibility of mapping depth from optical image data in turbid, sediment-laden channels. Analysis of hyperspectral images from the Platte River indicated that depth retrieval in these environments is feasible, but might not be highly accurate. Four methods of calibrating image-derived depth estimates were evaluated. The first involved extracting image spectra at survey point locations throughout the reach. These paired observations of depth and reflectance were subjected to optimal band ratio analysis (OBRA) to relate (R2 = 0.596) a spectrally based quantity to flow depth. Two other methods were based on OBRA of data from individual cross sections. A fourth strategy used ground-based reflectance measurements to derive an OBRA relation (R2 = 0.944) that was then applied to the image. Depth retrieval accuracy was assessed by visually inspecting cross sections and calculating various error metrics. Calibration via field spectroscopy resulted in a shallow bias but provided relative accuracies similar to image-based methods. Reach-aggregated OBRA was marginally superior to calibrations based on individual cross sections, and depth retrieval accuracy varied considerably along each reach. Errors were lower and observed versus predicted regression R2 values higher for a relatively simple, deeper site than a shallower, braided reach; errors were 1/3 and 1/2 the mean depth for the two reaches. Bathymetric maps were coherent and hydraulically reasonable, however, and might be more reliable than implied by numerical metrics. As an example application, linear discriminant analysis was used to produce a series of depth threshold maps for characterizing shallow-water habitat for roosting cranes. © 2011 by the American Geophysical Union.
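
    A minimal sketch of the OBRA idea as described above: for every pair of bands, regress depth against the log of the band ratio and keep the pair with the highest R². The brute-force search and synthetic spectra are illustrative; the study's band set and regression details may differ.

```python
import numpy as np

def obra(depth, reflectance):
    """Optimal band ratio analysis (sketch).

    For every ordered band pair (i, j), form X = ln(R_i / R_j) at the
    survey points and keep the pair whose linear relation with depth
    has the highest R^2.

    reflectance: (n_points, n_bands) array of spectra at survey points.
    Returns (best_r2, (band_i, band_j)).
    """
    n_bands = reflectance.shape[1]
    best = (0.0, None)
    for i in range(n_bands):
        for j in range(n_bands):
            if i == j:
                continue
            x = np.log(reflectance[:, i] / reflectance[:, j])
            r2 = np.corrcoef(x, depth)[0, 1] ** 2
            if r2 > best[0]:
                best = (r2, (i, j))
    return best
```

    The winning relation would then be inverted to map depth across the whole image, which is how the reach-aggregated and cross-section calibrations in the study differ only in which points feed the regression.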

  12. Evaluating the potential for remote bathymetric mapping of a turbid, sand-bed river: 2. application to hyperspectral image data from the Platte River

    USGS Publications Warehouse

    Legleiter, Carl J.; Kinzel, Paul J.; Overstreet, Brandon T.

    2011-01-01

    This study examined the possibility of mapping depth from optical image data in turbid, sediment-laden channels. Analysis of hyperspectral images from the Platte River indicated that depth retrieval in these environments is feasible, but might not be highly accurate. Four methods of calibrating image-derived depth estimates were evaluated. The first involved extracting image spectra at survey point locations throughout the reach. These paired observations of depth and reflectance were subjected to optimal band ratio analysis (OBRA) to relate (R2 = 0.596) a spectrally based quantity to flow depth. Two other methods were based on OBRA of data from individual cross sections. A fourth strategy used ground-based reflectance measurements to derive an OBRA relation (R2 = 0.944) that was then applied to the image. Depth retrieval accuracy was assessed by visually inspecting cross sections and calculating various error metrics. Calibration via field spectroscopy resulted in a shallow bias but provided relative accuracies similar to image-based methods. Reach-aggregated OBRA was marginally superior to calibrations based on individual cross sections, and depth retrieval accuracy varied considerably along each reach. Errors were lower and observed versus predicted regression R2 values higher for a relatively simple, deeper site than a shallower, braided reach; errors were 1/3 and 1/2 the mean depth for the two reaches. Bathymetric maps were coherent and hydraulically reasonable, however, and might be more reliable than implied by numerical metrics. As an example application, linear discriminant analysis was used to produce a series of depth threshold maps for characterizing shallow-water habitat for roosting cranes.

  13. Detection of gene-environment interactions in the presence of linkage disequilibrium and noise by using genetic risk scores with internal weights from elastic net regression.

    PubMed

    Hüls, Anke; Ickstadt, Katja; Schikowski, Tamara; Krämer, Ursula

    2017-06-12

    For the analysis of gene-environment (GxE) interactions commonly single nucleotide polymorphisms (SNPs) are used to characterize genetic susceptibility, an approach that mostly lacks power and has poor reproducibility. One promising approach to overcome this problem might be the use of weighted genetic risk scores (GRS), which are defined as weighted sums of risk alleles of gene variants. The gold-standard is to use external weights from published meta-analyses. In this study, we used internal weights from the marginal genetic effects of the SNPs estimated by a multivariate elastic net regression and thereby provided a method that can be used if there are no external weights available. We conducted a simulation study for the detection of GxE interactions and compared power and type I error of single SNPs analyses with Bonferroni correction and corresponding analysis with unweighted and our weighted GRS approach in scenarios with six risk SNPs and an increasing number of highly correlated (up to 210) and noise SNPs (up to 840). Applying weighted GRS increased the power enormously in comparison to the common single SNPs approach (e.g. 94.2% vs. 35.4%, respectively, to detect a weak interaction with an OR ≈ 1.04 for six uncorrelated risk SNPs and n = 700 with a well-controlled type I error). Furthermore, weighted GRS outperformed the unweighted GRS, in particular in the presence of SNPs without any effect on the phenotype (e.g. 90.1% vs. 43.9%, respectively, when 20 noise SNPs were added to the six risk SNPs). This outperforming of the weighted GRS was confirmed in a real data application on lung inflammation in the SALIA cohort (n = 402). However, in scenarios with a high number of noise SNPs (>200 vs. 6 risk SNPs), larger sample sizes are needed to avoid an increased type I error, whereas a high number of correlated SNPs can be handled even in small samples (e.g. n = 400). 
    In conclusion, weighted GRS with weights from the marginal genetic effects of the SNPs estimated by a multivariate elastic net regression were shown to be a powerful tool to detect gene-environment interactions in scenarios of high linkage disequilibrium and noise.
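
    A hedged sketch of the weighted-GRS construction: simulated genotypes stand in for real SNP data, scikit-learn's elastic-net-penalized logistic regression stands in for the authors' elastic net, and all sample sizes and effect values are illustrative.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

# Hypothetical genotype matrix: 400 subjects, 26 SNPs coded 0/1/2, with the
# first 6 carrying a true effect (mimicking 6 risk SNPs plus 20 noise SNPs).
n, p = 400, 26
G = rng.integers(0, 3, size=(n, p)).astype(float)
beta = np.zeros(p)
beta[:6] = 0.4
logit = G @ beta - 2.0
y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

# Internal weights from an elastic-net (L1 + L2) penalized logistic model.
enet = LogisticRegression(penalty="elasticnet", l1_ratio=0.5,
                          solver="saga", C=1.0, max_iter=5000).fit(G, y)
weights = enet.coef_.ravel()

# Weighted genetic risk score: weighted sum of risk alleles per subject.
grs = G @ weights
# In the GxE analysis, this single GRS column replaces the p SNP columns,
# e.g. logit(y) ~ GRS + E + GRS:E.
```

    Collapsing many correlated or noisy SNPs into one weighted score is what restores power relative to Bonferroni-corrected single-SNP tests.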

  14. Sensitivity of Fit Indices to Misspecification in Growth Curve Models

    ERIC Educational Resources Information Center

    Wu, Wei; West, Stephen G.

    2010-01-01

    This study investigated the sensitivity of fit indices to model misspecification in within-individual covariance structure, between-individual covariance structure, and marginal mean structure in growth curve models. Five commonly used fit indices were examined, including the likelihood ratio test statistic, root mean square error of…

  15. Moral Philosophy, Disability, and Inclusive Education

    ERIC Educational Resources Information Center

    Fitch, E. Frank

    2009-01-01

    Disability and dependence are integral to the human experience and yet have been largely marginalized or denigrated within Western philosophy. Joining a growing counter narrative from the disability studies movement, several mainstream moral philosophers are helping to redress this error. In this essay, the author discusses ideas from four such…

  16. Infrared thermometry for deficit irrigation of peach trees

    USDA-ARS?s Scientific Manuscript database

    Water shortage has been a major concern for crop production in the western states of the USA and other arid regions in the world. Deficit irrigation can be used in some cropping systems as a potential water saving strategy to alleviate water shortage, however, the margin of error in irrigation manag...

  17. The CO2 laser frequency stability measurements

    NASA Technical Reports Server (NTRS)

    Johnson, E. H., Jr.

    1973-01-01

    Carbon dioxide laser frequency stability data are considered for a receiver design that relates to maximum Doppler frequency and its rate of change. Results show that an adequate margin exists in terms of data acquisition, Doppler tracking, and bit error rate as they relate to laser stability and transmitter power.

  18. ON MODEL SELECTION STRATEGIES TO IDENTIFY GENES UNDERLYING BINARY TRAITS USING GENOME-WIDE ASSOCIATION DATA.

    PubMed

    Wu, Zheyang; Zhao, Hongyu

    2012-01-01

    For more fruitful discoveries of genetic variants associated with diseases in genome-wide association studies, it is important to know whether joint analysis of multiple markers is more powerful than the commonly used single-marker analysis, especially in the presence of gene-gene interactions. This article provides a statistical framework to rigorously address this question through analytical power calculations for common model search strategies to detect binary trait loci: marginal search, exhaustive search, forward search, and two-stage screening search. Our approach incorporates linkage disequilibrium, random genotypes, and correlations among score test statistics of logistic regressions. We derive analytical results under two power definitions: the power of finding all the associated markers and the power of finding at least one associated marker. We also consider two types of error controls: the discovery number control and the Bonferroni type I error rate control. After demonstrating the accuracy of our analytical results by simulations, we apply them to consider a broad genetic model space to investigate the relative performances of different model search strategies. Our analytical study provides rapid computation as well as insights into the statistical mechanism of capturing genetic signals under different genetic models including gene-gene interactions. Even though we focus on genetic association analysis, our results on the power of model selection procedures are clearly very general and applicable to other studies.

  19. ON MODEL SELECTION STRATEGIES TO IDENTIFY GENES UNDERLYING BINARY TRAITS USING GENOME-WIDE ASSOCIATION DATA

    PubMed Central

    Wu, Zheyang; Zhao, Hongyu

    2013-01-01

    For more fruitful discoveries of genetic variants associated with diseases in genome-wide association studies, it is important to know whether joint analysis of multiple markers is more powerful than the commonly used single-marker analysis, especially in the presence of gene-gene interactions. This article provides a statistical framework to rigorously address this question through analytical power calculations for common model search strategies to detect binary trait loci: marginal search, exhaustive search, forward search, and two-stage screening search. Our approach incorporates linkage disequilibrium, random genotypes, and correlations among score test statistics of logistic regressions. We derive analytical results under two power definitions: the power of finding all the associated markers and the power of finding at least one associated marker. We also consider two types of error controls: the discovery number control and the Bonferroni type I error rate control. After demonstrating the accuracy of our analytical results by simulations, we apply them to consider a broad genetic model space to investigate the relative performances of different model search strategies. Our analytical study provides rapid computation as well as insights into the statistical mechanism of capturing genetic signals under different genetic models including gene-gene interactions. Even though we focus on genetic association analysis, our results on the power of model selection procedures are clearly very general and applicable to other studies. PMID:23956610

  20. Reduction of prostate intrafraction motion using gas-release rectal balloons

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Su Zhong; Zhao Tianyu; Li Zuofeng

    2012-10-15

    Purpose: To analyze prostate intrafraction motion using both non-gas-release (NGR) and gas-release (GR) rectal balloons and to evaluate the ability of GR rectal balloons to reduce prostate intrafraction motion. Methods: Twenty-nine patients with NGR rectal balloons and 29 patients with GR balloons were randomly selected from prostate patients treated with proton therapy at University of Florida Proton Therapy Institute (Jacksonville, FL). Their pretreatment and post-treatment orthogonal radiographs were analyzed, and both pretreatment setup residual error and intrafraction-motion data were obtained. Population histograms of intrafraction motion were plotted for both types of balloons. Population planning target-volume (PTV) margins were calculated with the van Herk formula of 2.5Σ + 0.7σ to account for setup residual errors and intrafraction motion errors. Results: Pretreatment and post-treatment radiographs indicated that the use of gas-release rectal balloons reduced prostate intrafraction motion along superior-inferior (SI) and anterior-posterior (AP) directions. Similar patient setup residual errors were exhibited for both types of balloons. Gas-release rectal balloons resulted in PTV margin reductions from 3.9 to 2.8 mm in the SI direction, 3.1 to 1.8 mm in the AP direction, and an increase from 1.9 to 2.1 mm in the left-right direction. Conclusions: Prostate intrafraction motion is an important uncertainty source in radiotherapy after image-guided patient setup with online corrections. Compared to non-gas-release rectal balloons, gas-release balloons can reduce prostate intrafraction motion in the SI and AP directions caused by gas buildup.
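
    The van Herk recipe quoted above can be applied directly. A minimal sketch (the Σ and σ values below are illustrative only, not the study's measured errors):

```python
# Van Herk population margin recipe: margin = 2.5*Sigma + 0.7*sigma, where
# Sigma is the systematic (preparation) SD and sigma the random (execution)
# SD, both in mm.
def van_herk_margin(systematic_sd_mm, random_sd_mm):
    return 2.5 * systematic_sd_mm + 0.7 * random_sd_mm

# Illustrative inputs only (the abstract reports the resulting margins,
# not the underlying SDs): Sigma = 1.0 mm, sigma = 2.0 mm.
print(round(van_herk_margin(1.0, 2.0), 1))  # 3.9
```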

  1. Studies in Astronomical Time Series Analysis. VI. Bayesian Block Representations

    NASA Technical Reports Server (NTRS)

    Scargle, Jeffrey D.; Norris, Jay P.; Jackson, Brad; Chiang, James

    2013-01-01

    This paper addresses the problem of detecting and characterizing local variability in time series and other forms of sequential data. The goal is to identify and characterize statistically significant variations, at the same time suppressing the inevitable corrupting observational errors. We present a simple nonparametric modeling technique and an algorithm implementing it, an improved and generalized version of Bayesian Blocks [Scargle 1998], that finds the optimal segmentation of the data in the observation interval. The structure of the algorithm allows it to be used in either a real-time trigger mode or a retrospective mode. Maximum likelihood or marginal posterior functions to measure model fitness are presented for events, binned counts, and measurements at arbitrary times with known error distributions. Problems addressed include those connected with data gaps, variable exposure, extension to piecewise linear and piecewise exponential representations, multivariate time series data, analysis of variance, data on the circle, other data modes, and dispersed data. Simulations provide evidence that the detection efficiency for weak signals is close to a theoretical asymptotic limit derived by [Arias-Castro, Donoho and Huo 2003]. In the spirit of Reproducible Research [Donoho et al. (2008)] all of the code and data necessary to reproduce all of the figures in this paper are included as auxiliary material.
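
    The core of Bayesian Blocks is a dynamic program over possible change points. The following is a simplified sketch for the binned-count case with a constant per-block prior; the full algorithm in the paper also handles event data, variable exposure, and the other data modes listed above:

```python
import numpy as np

def bayesian_blocks_binned(counts, widths, prior=4.0):
    """Optimal piecewise-constant segmentation of binned count data.

    Dynamic programming over all candidate start points of the last block;
    block fitness is the Poisson profile log-likelihood N*log(N/T),
    penalized by a constant per-block prior. Returns the indices of the
    bins that start each block (the change points).
    """
    counts = np.asarray(counts, dtype=float)
    widths = np.asarray(widths, dtype=float)
    n = len(counts)
    cum_c = np.concatenate(([0.0], np.cumsum(counts)))
    cum_w = np.concatenate(([0.0], np.cumsum(widths)))
    best = np.zeros(n)              # best total fitness for data up to bin k
    last = np.zeros(n, dtype=int)   # start index of the optimal last block
    for k in range(n):
        N = cum_c[k + 1] - cum_c[:k + 1]   # counts in candidate block r..k
        T = cum_w[k + 1] - cum_w[:k + 1]   # total width of that block
        fit = np.where(N > 0, N * np.log(np.maximum(N, 1.0) / T), 0.0) - prior
        total = fit + np.concatenate(([0.0], best[:k]))
        last[k] = int(np.argmax(total))
        best[k] = total[last[k]]
    # Backtrack to recover the change points.
    cps, i = [], n
    while i > 0:
        cps.append(int(last[i - 1]))
        i = last[i - 1]
    return sorted(cps)

# A step from rate 1 to rate 10 at bin 4 is recovered exactly:
print(bayesian_blocks_binned([1, 1, 1, 1, 10, 10, 10, 10], [1] * 8))  # [0, 4]
```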

  2. Identifying Pleiotropic Genes in Genome-Wide Association Studies for Multivariate Phenotypes with Mixed Measurement Scales

    PubMed Central

    Williams, L. Keoki; Buu, Anne

    2017-01-01

    We propose a multivariate genome-wide association test for mixed continuous, binary, and ordinal phenotypes. A latent response model is used to estimate the correlation between phenotypes with different measurement scales so that the empirical distribution of the Fisher’s combination statistic under the null hypothesis is estimated efficiently. The simulation study shows that our proposed correlation estimation methods have high levels of accuracy. More importantly, our approach conservatively estimates the variance of the test statistic so that the type I error rate is controlled. The simulation also shows that the proposed test maintains the power at the level very close to that of the ideal analysis based on known latent phenotypes while controlling the type I error. In contrast, conventional approaches–dichotomizing all observed phenotypes or treating them as continuous variables–could either reduce the power or employ a linear regression model unfit for the data. Furthermore, the statistical analysis on the database of the Study of Addiction: Genetics and Environment (SAGE) demonstrates that conducting a multivariate test on multiple phenotypes can increase the power of identifying markers that may not be, otherwise, chosen using marginal tests. The proposed method also offers a new approach to analyzing the Fagerström Test for Nicotine Dependence as multivariate phenotypes in genome-wide association studies. PMID:28081206

  3. A bias correction for covariance estimators to improve inference with generalized estimating equations that use an unstructured correlation matrix.

    PubMed

    Westgate, Philip M

    2013-07-20

    Generalized estimating equations (GEEs) are routinely used for the marginal analysis of correlated data. The efficiency of GEE depends on how closely the working covariance structure resembles the true structure, and therefore accurate modeling of the working correlation of the data is important. A popular approach is the use of an unstructured working correlation matrix, as it is not as restrictive as simpler structures such as exchangeable and AR-1 and thus can theoretically improve efficiency. However, because of the potential for having to estimate a large number of correlation parameters, variances of regression parameter estimates can be larger than theoretically expected when utilizing the unstructured working correlation matrix. Therefore, standard error estimates can be negatively biased. To account for this additional finite-sample variability, we derive a bias correction that can be applied to typical estimators of the covariance matrix of parameter estimates. Via simulation and in application to a longitudinal study, we show that our proposed correction improves standard error estimation and statistical inference. Copyright © 2012 John Wiley & Sons, Ltd.

  4. Measuring Starlight Deflection during the 2017 Eclipse: Repeating the Experiment that made Einstein Famous

    NASA Astrophysics Data System (ADS)

    Bruns, Donald

    2016-05-01

    In 1919, astronomers performed an experiment during a solar eclipse, attempting to measure the deflection of stars near the sun, in order to verify Einstein's theory of general relativity. The experiment was very difficult and the results were marginal, but the success made Albert Einstein famous around the world. Astronomers last repeated the experiment in 1973, achieving an error of 11%. In 2017, using amateur equipment and modern technology, I plan to repeat the experiment and achieve a 1% error. The best available star catalog will be used for star positions. Corrections for optical distortion and atmospheric refraction are better than 0.01 arcsec. During totality, I expect 7 or 8 measurable stars down to magnitude 9.5, based on analysis of previous eclipse measurements taken by amateurs. Reference images, taken near the sun during totality, will be used for precise calibration. Preliminary test runs performed during twilight in April 2016 and April 2017 can accurately simulate the sky conditions during totality, providing an accurate estimate of the final uncertainty.
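
    The general-relativistic deflection at the solar limb that these experiments target follows from α = 4GM/(c²R), which gives the famous 1.75-arcsecond figure; a 1% measurement therefore requires astrometry at the ~0.02-arcsecond level. A quick check with standard constants:

```python
import math

# Limb deflection alpha = 4*G*M_sun / (c^2 * R_sun): the GR prediction
# the eclipse experiments set out to measure.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30     # solar mass, kg
C = 2.998e8          # speed of light, m/s
R_SUN = 6.957e8      # solar radius, m

alpha_rad = 4 * G * M_SUN / (C**2 * R_SUN)
alpha_arcsec = math.degrees(alpha_rad) * 3600
print(round(alpha_arcsec, 2))  # 1.75
```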

  5. Synergies in Astrometry: Predicting Navigational Error of Visual Binary Stars

    NASA Astrophysics Data System (ADS)

    Gessner Stewart, Susan

    2015-08-01

    Celestial navigation can employ a number of bright stars which are in binary systems. Often these are unresolved, appearing as a single, center-of-light object. A number of these systems are, however, in wide systems which could introduce a margin of error in the navigation solution if not handled properly. To illustrate the importance of good orbital solutions for binary systems - as well as good astrometry in general - the relationship between the center-of-light versus individual catalog position of celestial bodies and the error in terrestrial position derived via celestial navigation is demonstrated. From the list of navigational binary stars, fourteen such binary systems with at least 3.0 arcseconds apparent separation are explored. Maximum navigational error is estimated under the assumption that the bright star in the pair is observed at maximum separation, but the center-of-light is employed in the navigational solution. The relationships between navigational error and separation, orbital periods, and observers' latitude are discussed.
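
    The scale of such errors follows from the standard navigation rule that one arcminute of altitude error corresponds to roughly one nautical mile of position error. A minimal sketch, using the paper's 3.0-arcsecond separation threshold as the input:

```python
# One arcminute of altitude error displaces the celestial line of position
# by about one nautical mile (1852 m), so an unmodeled center-of-light
# offset of a few arcseconds maps to tens of metres on the ground.
METRES_PER_ARCMIN = 1852.0

def position_error_m(offset_arcsec):
    return offset_arcsec / 60.0 * METRES_PER_ARCMIN

print(round(position_error_m(3.0), 1))  # 92.6
```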

  6. A margin model to account for respiration-induced tumour motion and its variability

    NASA Astrophysics Data System (ADS)

    Coolens, Catherine; Webb, Steve; Shirato, H.; Nishioka, K.; Evans, Phil M.

    2008-08-01

    In order to reduce the sensitivity of radiotherapy treatments to organ motion, compensation methods are being investigated such as gating of treatment delivery, tracking of tumour position, 4D scanning and planning of the treatment, etc. An outstanding problem that would occur with all these methods is the assumption that breathing motion is reproducible throughout the planning and delivery process of treatment. This is obviously not a realistic assumption and is one that will introduce errors. A dynamic internal margin model (DIM) is presented that is designed to follow the tumour trajectory and account for the variability in respiratory motion. The model statistically describes the variation of the breathing cycle over time, i.e. the uncertainty in motion amplitude and phase reproducibility, in a polar coordinate system from which margins can be derived. This allows accounting for an additional gating window parameter for gated treatment delivery as well as minimizing the area of normal tissue irradiated. The model was illustrated with abdominal motion for a patient with liver cancer and tested with internal 3D lung tumour trajectories. The results confirm that the respiratory phases around exhale are most reproducible and have the smallest variation in motion amplitude and phase (approximately 2 mm). More importantly, the margin area covering normal tissue is significantly reduced by using trajectory-specific margins (as opposed to conventional margins) as the angular component is by far the largest contributor to the margin area. The statistical approach to margin calculation, in addition, offers the possibility for advanced online verification and updating of breathing variation as more data become available.

  7. A mixed-effects regression model for longitudinal multivariate ordinal data.

    PubMed

    Liu, Li C; Hedeker, Donald

    2006-03-01

    A mixed-effects item response theory model that allows for three-level multivariate ordinal outcomes and accommodates multiple random subject effects is proposed for analysis of multivariate ordinal outcomes in longitudinal studies. This model allows for the estimation of different item factor loadings (item discrimination parameters) for the multiple outcomes. The covariates in the model do not have to follow the proportional odds assumption and can be at any level. Assuming either a probit or logistic response function, maximum marginal likelihood estimation is proposed utilizing multidimensional Gauss-Hermite quadrature for integration of the random effects. An iterative Fisher scoring solution, which provides standard errors for all model parameters, is used. An analysis of a longitudinal substance use data set, where four items of substance use behavior (cigarette use, alcohol use, marijuana use, and getting drunk or high) are repeatedly measured over time, is used to illustrate application of the proposed model.

  8. Evaluating elements of trust: Race and class in risk communication in post-Katrina New Orleans.

    PubMed

    Battistoli, B F

    2016-05-01

    This study seeks to determine the relative influence of race and class on trust in sources of messages of environmental risk in post-Katrina New Orleans. It poses two hypotheses to test that influence: H1, that African-Americans ("Blacks") trust risk message sources less than European Americans ("Whites") do; and H2, that the higher the socioeconomic class, the lower the trust in risk message sources. A 37-question telephone survey (landlines and cellphones) was conducted in Orleans Parish in 2012 (n = 414). The overall margin of error was ±4.8% at a 95% confidence interval. A hierarchical regression analysis revealed that the first hypothesis was rejected, while the second was supported. Additional data analysis revealed that frequency of use of sources of risk information appears to be a positive factor in building trust. © The Author(s) 2015.
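
    The quoted ±4.8% is consistent with the standard worst-case margin-of-error formula for a simple random sample. A quick check, assuming p = 0.5 and z = 1.96 (the survey's actual weighting scheme is not described in the abstract):

```python
import math

def margin_of_error(n, z=1.96, p=0.5):
    """Worst-case (p = 0.5) margin of error for a simple random sample
    at the confidence level implied by z (1.96 for 95%)."""
    return z * math.sqrt(p * (1 - p) / n)

# n = 414 reproduces the reported +/-4.8% at 95% confidence:
print(round(100 * margin_of_error(414), 1))  # 4.8
```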

  9. Bounded Linear Stability Margin Analysis of Nonlinear Hybrid Adaptive Control

    NASA Technical Reports Server (NTRS)

    Nguyen, Nhan T.; Boskovic, Jovan D.

    2008-01-01

    This paper presents a bounded linear stability analysis for a hybrid adaptive control that blends both direct and indirect adaptive control. Stability and convergence of nonlinear adaptive control are analyzed using an approximate linear equivalent system. A stability margin analysis shows that a large adaptive gain can lead to a reduced phase margin. This method can enable metrics-driven adaptive control whereby the adaptive gain is adjusted to meet stability margin requirements.
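
    The gain/phase-margin trade-off reported here can be illustrated on a toy loop (a generic example, not the paper's aircraft model): for L(s) = k/(s(s+1)), raising the gain k pushes the gain-crossover frequency up and reduces the phase margin.

```python
import math

def phase_margin_deg(k):
    """Phase margin of the open loop L(s) = k / (s*(s+1)).

    Gain crossover: |L(jw)| = 1  =>  w*sqrt(w^2+1) = k (solved by bisection);
    phase margin = 180 + angle(L(jw)) = 90 - atan(w) degrees.
    """
    lo, hi = 0.0, k + 1.0
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if mid * math.sqrt(mid * mid + 1.0) < k:
            lo = mid
        else:
            hi = mid
    w = 0.5 * (lo + hi)
    return 90.0 - math.degrees(math.atan(w))

# Larger loop gain => higher crossover frequency => smaller phase margin:
for k in (1.0, 5.0, 10.0):
    print(k, round(phase_margin_deg(k), 1))
```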

  10. Chromatic dispersive confocal technology for intra-oral scanning: first in-vitro results

    NASA Astrophysics Data System (ADS)

    Ertl, T.; Zint, M.; Konz, A.; Brauer, E.; Hörhold, H.; Hibst, R.

    2015-02-01

    Various test objects, plaster models, partially equipped with extracted teeth and pig jaws representing various clinical situations of tooth preparations were used for in-vitro scanning tests with an experimental intra-oral scanning system based on chromatic-dispersive confocal technology. Scanning results were compared against data sets of the same object captured by an industrial μCT measuring system. Compared to μCT data an average error of 18 - 30 μm was achieved for a single tooth scan area and less than 40 to 60 μm error measured over the restoration + the neighbor teeth and pontic areas up to 7 units. Mean error for a full jaw is within 100 - 140 μm. The length error for a 3 - 4 unit bridge situation form contact point to contact point is below 100 μm and excellent interproximal surface coverage and prep margin clarity was achieved.

  11. Seasonal to interannual Arctic sea ice predictability in current global climate models

    NASA Astrophysics Data System (ADS)

    Tietsche, S.; Day, J. J.; Guemas, V.; Hurlin, W. J.; Keeley, S. P. E.; Matei, D.; Msadek, R.; Collins, M.; Hawkins, E.

    2014-02-01

    We establish the first intermodel comparison of seasonal to interannual predictability of present-day Arctic climate by performing coordinated sets of idealized ensemble predictions with four state-of-the-art global climate models. For Arctic sea ice extent and volume, there is potential predictive skill for lead times of up to 3 years, and potential prediction errors have similar growth rates and magnitudes across the models. Spatial patterns of potential prediction errors differ substantially between the models, but some features are robust. Sea ice concentration errors are largest in the marginal ice zone, and in winter they are almost zero away from the ice edge. Sea ice thickness errors are amplified along the coasts of the Arctic Ocean, an effect that is dominated by sea ice advection. These results give an upper bound on the ability of current global climate models to predict important aspects of Arctic climate.

  12. Multi-Reader ROC studies with Split-Plot Designs: A Comparison of Statistical Methods

    PubMed Central

    Obuchowski, Nancy A.; Gallas, Brandon D.; Hillis, Stephen L.

    2012-01-01

    Rationale and Objectives Multi-reader imaging trials often use a factorial design, where study patients undergo testing with all imaging modalities and readers interpret the results of all tests for all patients. A drawback of the design is the large number of interpretations required of each reader. Split-plot designs have been proposed as an alternative, in which one or a subset of readers interprets all images of a sample of patients, while other readers interpret the images of other samples of patients. In this paper we compare three methods of analysis for the split-plot design. Materials and Methods Three statistical methods are presented: Obuchowski-Rockette method modified for the split-plot design, a newly proposed marginal-mean ANOVA approach, and an extension of the three-sample U-statistic method. A simulation study using the Roe-Metz model was performed to compare the type I error rate, power and confidence interval coverage of the three test statistics. Results The type I error rates for all three methods are close to the nominal level but tend to be slightly conservative. The statistical power is nearly identical for the three methods. The coverage of 95% CIs falls close to the nominal coverage for small and large sample sizes. Conclusions The split-plot MRMC study design can be statistically efficient compared with the factorial design, reducing the number of interpretations required per reader. Three methods of analysis, shown to have nominal type I error rate, similar power, and nominal CI coverage, are available for this study design. PMID:23122570

  13. Augmented reality in bone tumour resection: An experimental study.

    PubMed

    Cho, H S; Park, Y K; Gupta, S; Yoon, C; Han, I; Kim, H-S; Choi, H; Hong, J

    2017-03-01

    We evaluated the accuracy of augmented reality (AR)-based navigation assistance through simulation of bone tumours in a pig femur model. We developed an AR-based navigation system for bone tumour resection, which could be used on a tablet PC. To simulate a bone tumour in the pig femur, a cortical window was made in the diaphysis and bone cement was inserted. A total of 133 pig femurs were used and tumour resection was simulated with AR-assisted resection (164 resections in 82 femurs, half by an orthopaedic oncology expert and half by an orthopaedic resident) and resection with the conventional method (82 resections in 41 femurs). In the conventional group, resection was performed after measuring the distance from the edge of the condyle to the expected resection margin with a ruler as per routine clinical practice. The mean error of 164 resections in 82 femurs in the AR group was 1.71 mm (0 to 6). The mean error of 82 resections in 41 femurs in the conventional resection group was 2.64 mm (0 to 11) (p < 0.05, one-way analysis of variance). The probabilities of a surgeon obtaining a 10 mm surgical margin with a 3 mm tolerance were 90.2% in AR-assisted resections, and 70.7% in conventional resections. We demonstrated that the accuracy of tumour resection was satisfactory with the help of the AR navigation system, with the tumour shown as a virtual template. In addition, this concept made the navigation system simple and available without additional cost or time. Cite this article: H. S. Cho, Y. K. Park, S. Gupta, C. Yoon, I. Han, H-S. Kim, H. Choi, J. Hong. Augmented reality in bone tumour resection: An experimental study. Bone Joint Res 2017;6:137-143. © 2017 Cho et al.

  14. Augmented reality in bone tumour resection

    PubMed Central

    Park, Y. K.; Gupta, S.; Yoon, C.; Han, I.; Kim, H-S.; Choi, H.; Hong, J.

    2017-01-01

    Objectives We evaluated the accuracy of augmented reality (AR)-based navigation assistance through simulation of bone tumours in a pig femur model. Methods We developed an AR-based navigation system for bone tumour resection, which could be used on a tablet PC. To simulate a bone tumour in the pig femur, a cortical window was made in the diaphysis and bone cement was inserted. A total of 133 pig femurs were used and tumour resection was simulated with AR-assisted resection (164 resections in 82 femurs, half by an orthopaedic oncology expert and half by an orthopaedic resident) and resection with the conventional method (82 resections in 41 femurs). In the conventional group, resection was performed after measuring the distance from the edge of the condyle to the expected resection margin with a ruler as per routine clinical practice. Results The mean error of 164 resections in 82 femurs in the AR group was 1.71 mm (0 to 6). The mean error of 82 resections in 41 femurs in the conventional resection group was 2.64 mm (0 to 11) (p < 0.05, one-way analysis of variance). The probabilities of a surgeon obtaining a 10 mm surgical margin with a 3 mm tolerance were 90.2% in AR-assisted resections, and 70.7% in conventional resections. Conclusion We demonstrated that the accuracy of tumour resection was satisfactory with the help of the AR navigation system, with the tumour shown as a virtual template. In addition, this concept made the navigation system simple and available without additional cost or time. Cite this article: H. S. Cho, Y. K. Park, S. Gupta, C. Yoon, I. Han, H-S. Kim, H. Choi, J. Hong. Augmented reality in bone tumour resection: An experimental study. Bone Joint Res 2017;6:137–143. PMID:28258117

  15. Testing non-inferiority of a new treatment in three-arm clinical trials with binary endpoints.

    PubMed

    Tang, Nian-Sheng; Yu, Bin; Tang, Man-Lai

    2014-12-18

    A two-arm non-inferiority trial without a placebo is usually adopted to demonstrate that an experimental treatment is not worse than a reference treatment by a small pre-specified non-inferiority margin due to ethical concerns. Selection of the non-inferiority margin and establishment of assay sensitivity are two major issues in the design, analysis and interpretation for two-arm non-inferiority trials. Alternatively, a three-arm non-inferiority clinical trial including a placebo is usually conducted to assess the assay sensitivity and internal validity of a trial. Recently, some large-sample approaches have been developed to assess the non-inferiority of a new treatment based on the three-arm trial design. However, these methods behave badly with small sample sizes in the three arms. This manuscript aims to develop some reliable small-sample methods to test three-arm non-inferiority. Saddlepoint approximation, exact and approximate unconditional, and bootstrap-resampling methods are developed to calculate p-values of the Wald-type, score and likelihood ratio tests. Simulation studies are conducted to evaluate their performance in terms of type I error rate and power. Our empirical results show that the saddlepoint approximation method generally behaves better than the asymptotic method based on the Wald-type test statistic. For small sample sizes, approximate unconditional and bootstrap-resampling methods based on the score test statistic perform better in the sense that their corresponding type I error rates are generally closer to the prespecified nominal level than those of other test procedures. Both approximate unconditional and bootstrap-resampling test procedures based on the score test statistic are generally recommended for three-arm non-inferiority trials with binary outcomes.
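
    For orientation, the large-sample Wald-type statistic for the retention-fraction formulation of three-arm non-inferiority, whose small-sample behaviour the paper's methods aim to improve, can be sketched as follows (the counts below are illustrative, not trial data):

```python
import math

def wald_noninferiority_z(x_e, n_e, x_r, n_r, x_p, n_p, theta=0.8):
    """Large-sample Wald statistic for three-arm non-inferiority with binary
    outcomes, retention-fraction formulation:
        H0: p_E - p_P <= theta * (p_R - p_P)
    Reject H0 (conclude non-inferiority) for large positive z.
    """
    pe, pr, pp = x_e / n_e, x_r / n_r, x_p / n_p
    est = pe - pp - theta * (pr - pp)
    var = (pe * (1 - pe) / n_e
           + theta**2 * pr * (1 - pr) / n_r
           + (1 - theta)**2 * pp * (1 - pp) / n_p)
    return est / math.sqrt(var)

# Illustrative counts: 80/100 experimental, 80/100 reference, 50/100 placebo.
z = wald_noninferiority_z(80, 100, 80, 100, 50, 100)
print(round(z, 2))  # 1.15
```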

  16. Toward real-time endoscopically-guided robotic navigation based on a 3D virtual surgical field model

    NASA Astrophysics Data System (ADS)

    Gong, Yuanzheng; Hu, Danying; Hannaford, Blake; Seibel, Eric J.

    2015-03-01

    The challenge is to accurately guide the surgical tool within the three-dimensional (3D) surgical field for robotically assisted operations such as tumor margin removal from a debulked brain tumor cavity. The proposed technique is 3D image-guided surgical navigation based on matching intraoperative video frames to a 3D virtual model of the surgical field. A small laser-scanning endoscopic camera was attached to a mock minimally-invasive surgical tool that was manipulated toward a region of interest (residual tumor) within a phantom of a debulked brain tumor. Video frames from the endoscope provided features that were matched to the 3D virtual model, which were reconstructed earlier by raster scanning over the surgical field. Camera pose (position and orientation) is recovered by implementing a constrained bundle adjustment algorithm. Navigational error during the approach to the fluorescence target (residual tumor) is determined by comparing the calculated camera pose to the measured camera pose using a micro-positioning stage. From these preliminary results, computation efficiency of the algorithm in MATLAB code is near real-time (2.5 sec for each estimation of pose), which can be improved by implementation in C++. Error analysis produced 3-mm distance error and 2.5 degrees of orientation error on average. The sources of these errors come from 1) inaccuracy of the 3D virtual model, generated on a calibrated RAVEN robotic platform with stereo tracking; 2) inaccuracy of endoscope intrinsic parameters, such as focal length; and 3) any endoscopic image distortion from scanning irregularities. This work demonstrates feasibility of micro-camera 3D guidance of a robotic surgical tool.

  17. Identifying pleiotropic genes in genome-wide association studies from related subjects using the linear mixed model and Fisher combination function.

    PubMed

    Yang, James J; Williams, L Keoki; Buu, Anne

    2017-08-24

    A multivariate genome-wide association test is proposed for analyzing data on multivariate quantitative phenotypes collected from related subjects. The proposed method is a two-step approach. The first step models the association between the genotype and marginal phenotype using a linear mixed model. The second step uses the correlation between residuals of the linear mixed model to estimate the null distribution of the Fisher combination test statistic. The simulation results show that the proposed method controls the type I error rate and is more powerful than the marginal tests across different population structures (admixed or non-admixed) and relatedness (related or independent). The statistical analysis on the database of the Study of Addiction: Genetics and Environment (SAGE) demonstrates that applying the multivariate association test may facilitate identification of the pleiotropic genes contributing to the risk for alcohol dependence commonly expressed by four correlated phenotypes. This study proposes a multivariate method for identifying pleiotropic genes while adjusting for cryptic relatedness and population structure between subjects. The two-step approach is not only powerful but also computationally efficient even when the number of subjects and the number of phenotypes are both very large.
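
    The Fisher combination function referenced here is the classical one; a minimal sketch under the textbook independence assumption (the paper's contribution is precisely to replace this null with one estimated from residual correlations):

```python
import math

def fisher_combination_p(pvalues):
    """Fisher's method: X = -2 * sum(log p_i) follows a chi-square
    distribution with 2k degrees of freedom under the null, assuming
    independent tests."""
    x = -2.0 * sum(math.log(p) for p in pvalues)
    k = len(pvalues)
    # Chi-square survival function with even df = 2k has a closed form:
    # P(X > x) = exp(-x/2) * sum_{i=0}^{k-1} (x/2)^i / i!
    half = x / 2.0
    term, total = 1.0, 1.0
    for i in range(1, k):
        term *= half / i
        total += term
    return math.exp(-half) * total

# Four marginal p-values combine to a single test of the global null:
print(round(fisher_combination_p([0.01, 0.04, 0.5, 0.8]), 3))  # 0.025
```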

  18. Rules based process window OPC

    NASA Astrophysics Data System (ADS)

    O'Brien, Sean; Soper, Robert; Best, Shane; Mason, Mark

    2008-03-01

    As a preliminary step towards Model-Based Process Window OPC we have analyzed the impact of correcting post-OPC layouts using rules based methods. Image processing on the Brion Tachyon was used to identify sites where the OPC model/recipe failed to generate an acceptable solution. A set of rules for 65nm active and poly were generated by classifying these failure sites. The rules were based upon segment runlengths, figure spaces, and adjacent figure widths. 2.1 million sites for active were corrected in a small chip (comparing the pre and post rules-based operations), and 59 million were found at poly. Tachyon analysis of the final reticle layout found weak margin sites distinct from those sites repaired by rules-based corrections. For the active layer more than 75% of the sites corrected by rules would have printed without a defect, indicating that most rules-based cleanups degrade the lithographic pattern. Some sites were missed by the rules-based cleanups due to either bugs in the DRC software or gaps in the rules table. In the end, dramatic changes to the reticle prevented catastrophic lithography errors, but this method is far too blunt. A more subtle model-based procedure is needed, changing only those sites that have unsatisfactory lithographic margin.

  19. Optimization of Coronal Mass Ejection Ensemble Forecasting Using WSA-ENLIL with Coned Model

    DTIC Science & Technology

    2013-03-01

    …previous versions by a large margin. The mean absolute forecast error of the median ensemble results was improved by over 43% over the original Coned model.

  20. Checked Out: Ohioans' Views on Education 2009

    ERIC Educational Resources Information Center

    Thomas B. Fordham Institute, 2009

    2009-01-01

    In collaboration with the Thomas B. Fordham Institute and Catalyst Ohio, the FDR Group conducted a telephone survey of 1,002 randomly selected Ohio residents between April 1 and April 9, 2009 (margin of error +/- 3 percentage points). The survey--the third in a series--reports Ohioans' views on critical education issues, including school funding,…

  1. Bivariate drought frequency analysis using the copula method

    NASA Astrophysics Data System (ADS)

    Mirabbasi, Rasoul; Fakheri-Fard, Ahmad; Dinpashoh, Yagob

    2012-04-01

    Droughts are major natural hazards with significant environmental and economic impacts. In this study, two-dimensional copulas were applied to the analysis of the meteorological drought characteristics of the Sharafkhaneh gauge station, located in the northwest of Iran. Two major drought characteristics, duration and severity, as defined by the standardized precipitation index, were abstracted from observed drought events. Since drought duration and severity exhibited a significant correlation and since they were modeled using different distributions, copulas were used to construct the joint distribution function of the drought characteristics. The copula parameter was estimated using the Inference Functions for Margins (IFM) method. Several copulas were tested in order to determine the best data fit. According to the error analysis and the tail dependence coefficient, the Galambos copula provided the best fit for the observed drought data. Some bivariate probabilistic properties of droughts, based on the derived copula-based joint distribution, were also investigated. These probabilistic properties can provide useful information for water resource planning and management.
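
    For illustration, an extreme-value copula with a closed form makes the benefit over assuming independence concrete. The sketch below uses the Gumbel copula with an illustrative θ = 2 (the study itself selected the closely related Galambos family):

```python
import math

def gumbel_copula(u, v, theta):
    """Gumbel extreme-value copula:
    C(u, v) = exp(-((-ln u)^theta + (-ln v)^theta)^(1/theta)).
    theta = 1 gives independence; larger theta gives stronger upper-tail
    dependence."""
    s = (-math.log(u)) ** theta + (-math.log(v)) ** theta
    return math.exp(-s ** (1.0 / theta))

def joint_exceedance(u, v, theta):
    # P(U > u, V > v) = 1 - u - v + C(u, v)
    return 1.0 - u - v + gumbel_copula(u, v, theta)

# With positive dependence (theta = 2, illustrative), the chance that both
# drought duration and severity exceed their 90th percentiles is ~6%,
# versus 1% under independence:
print(round(joint_exceedance(0.9, 0.9, 2.0), 3))
print(round(0.1 * 0.1, 3))
```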

  2. Trans-dimensional inversion of microtremor array dispersion data with hierarchical autoregressive error models

    NASA Astrophysics Data System (ADS)

    Dettmer, Jan; Molnar, Sheri; Steininger, Gavin; Dosso, Stan E.; Cassidy, John F.

    2012-02-01

    This paper applies a general trans-dimensional Bayesian inference methodology and hierarchical autoregressive data-error models to the inversion of microtremor array dispersion data for shear wave velocity (vs) structure. This approach accounts for the limited knowledge of the optimal earth model parametrization (e.g. the number of layers in the vs profile) and of the data-error statistics in the resulting vs parameter uncertainty estimates. The assumed earth model parametrization influences estimates of parameter values and uncertainties due to different parametrizations leading to different ranges of data predictions. The support of the data for a particular model is often non-unique and several parametrizations may be supported. A trans-dimensional formulation accounts for this non-uniqueness by including a model-indexing parameter as an unknown so that groups of models (identified by the indexing parameter) are considered in the results. The earth model is parametrized in terms of a partition model with interfaces given over a depth-range of interest. In this work, the number of interfaces (layers) in the partition model represents the trans-dimensional model indexing. In addition, serial data-error correlations are addressed by augmenting the geophysical forward model with a hierarchical autoregressive error model that can account for a wide range of error processes with a small number of parameters. Hence, the limited knowledge about the true statistical distribution of data errors is also accounted for in the earth model parameter estimates, resulting in more realistic uncertainties and parameter values. Hierarchical autoregressive error models do not rely on point estimates of the model vector to estimate data-error statistics, and have no requirement for computing the inverse or determinant of a data-error covariance matrix. 
This approach is particularly useful for trans-dimensional inverse problems, as point estimates may not be representative of the state space that spans multiple subspaces of different dimensionalities. The order of the autoregressive process required to fit the data is determined here by posterior residual-sample examination and statistical tests. Inference for earth model parameters is carried out on the trans-dimensional posterior probability distribution by considering ensembles of parameter vectors. In particular, vs uncertainty estimates are obtained by marginalizing the trans-dimensional posterior distribution in terms of vs-profile marginal distributions. The methodology is applied to microtremor array dispersion data collected at two sites with significantly different geology in British Columbia, Canada. At both sites, results show excellent agreement with estimates from invasive measurements.
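
    The autoregressive error model described above can be illustrated with a minimal sketch (not the authors' implementation; the coefficient, sample size, and seed are invented). An AR(1) process with coefficient a1 implies innovations e_t = r_t - a1*r_{t-1} that are serially independent, so whitening the residuals with the correct coefficient removes the lag-1 correlation without ever forming, inverting, or taking the determinant of a covariance matrix:

```python
import numpy as np

def ar1_whiten(residuals, a1):
    """Map serially correlated residuals r_t to innovations
    e_t = r_t - a1 * r_{t-1}, which are independent under an AR(1) model."""
    r = np.asarray(residuals, dtype=float)
    e = r.copy()
    e[1:] = r[1:] - a1 * r[:-1]
    return e

def lag1_corr(x):
    """Lag-1 sample autocorrelation."""
    return np.corrcoef(x[:-1], x[1:])[0, 1]

# Simulate AR(1)-correlated data errors (illustrative parameters)
rng = np.random.default_rng(0)
a1_true, n = 0.8, 5000
r = np.zeros(n)
for t in range(1, n):
    r[t] = a1_true * r[t - 1] + rng.normal()

e = ar1_whiten(r, a1_true)
print(lag1_corr(r), lag1_corr(e))  # strong lag-1 correlation vs. near zero
```

    Because the whitened innovations are independent, the likelihood factorizes over samples, which is why no covariance determinant or inverse is needed.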

  3. Computerized margin and texture analyses for differentiating bacterial pneumonia and invasive mucinous adenocarcinoma presenting as consolidation.

    PubMed

    Koo, Hyun Jung; Kim, Mi Young; Koo, Ja Hwan; Sung, Yu Sub; Jung, Jiwon; Kim, Sung-Han; Choi, Chang-Min; Kim, Hwa Jung

    2017-01-01

    Radiologists have used margin characteristics based on routine visual analysis; however, the attenuation changes at the margin of the lesion on CT images have not been quantitatively assessed. We established a CT-based margin analysis method by comparing a target lesion with normal lung attenuation, drawing a slope to represent the attenuation changes. This approach was applied to patients with invasive mucinous adenocarcinoma (n = 40) or bacterial pneumonia (n = 30). Correlations among multiple regions of interest (ROIs) were obtained using intraclass correlation coefficient (ICC) values. CT visual assessment, margin, and texture parameters were compared for differentiating the two disease entities. The attenuation and margin parameters in multiple ROIs showed excellent ICC values. Attenuation slopes obtained at the margins revealed a difference between invasive mucinous adenocarcinoma and pneumonia (P<0.001), and mucinous adenocarcinoma produced a sharply declining attenuation slope. On multivariable logistic regression analysis, pneumonia had an ill-defined margin (odds ratio (OR), 4.84; 95% confidence interval (CI), 1.26-18.52; P = 0.02), ground-glass opacity (OR, 8.55; 95% CI, 2.09-34.95; P = 0.003), and gradually declining attenuation at the margin (OR, 12.63; 95% CI, 2.77-57.51, P = 0.001). The CT-based margin analysis method thus has potential as an imaging parameter for differentiating invasive mucinous adenocarcinoma from bacterial pneumonia.
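
    The margin-slope idea can be sketched as a simple line fit of mean CT attenuation against distance from inside the lesion outward into normal lung; the ROI distances and Hounsfield values below are invented for illustration and are not taken from the study:

```python
import numpy as np

def margin_slope(distances_mm, attenuations_hu):
    """Fit attenuation (HU) vs. distance from the lesion margin and
    return the slope as a 'margin sharpness' feature (HU per mm)."""
    slope, _intercept = np.polyfit(distances_mm, attenuations_hu, 1)
    return slope

# Hypothetical ROI samples moving from inside the lesion into normal lung
sharp = margin_slope([0, 2, 4, 6], [40, -300, -700, -820])    # steep decline
gradual = margin_slope([0, 2, 4, 6], [40, -150, -350, -550])  # gentle decline
print(sharp, gradual)  # the sharper margin has the more negative slope
```

    Under this convention, a sharply declining slope (large negative value) would point toward mucinous adenocarcinoma and a gradual decline toward pneumonia, mirroring the study's finding.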

  4. What is the best surgical margin for basal cell carcinoma: a meta-analysis of the literature.

    PubMed

    Gulleth, Yusuf; Goldberg, Nelson; Silverman, Ronald P; Gastman, Brian R

    2010-10-01

    Current management of basal cell carcinoma is surgical excision. Most resections use predetermined surgical margins. The basis for ideal resection margins comes almost entirely from retrospective data, mainly from small case series. This article presents a systematic analysis of a large pool of data to provide a better basis for determining the ideal surgical margin. A systematic analysis was performed on data from 89 articles, drawn from a larger group of 973 articles selected from the PubMed database. Relevant inclusion and exclusion criteria were applied to all articles reviewed, and the data were entered into a database for statistical analysis. The total number of lesions analyzed was 16,066; size ranged from 3 to 30 mm (mean, 11.7 ± 5.9 mm). Surgical margins ranged from 1 to 10 mm (mean, 3.9 ± 1.4 mm). Negative surgical margin rates ranged from 45 to 100 percent (mean, 86 ± 12 percent). Recurrence rates for 5-, 4-, 3-, and 2-mm surgical margins were 0.39, 1.62, 2.56, and 3.96 percent, respectively. Pooled data for incompletely excised margins show an average recurrence rate of 27 percent. A 3-mm surgical margin can be safely used for nonmorpheaform basal cell carcinoma to attain 95 percent cure rates for lesions 2 cm or smaller. A positive pathologic margin has an average recurrence rate of 27 percent.

  5. Comment on "Hydrogen Balmer beta: The separation between line peaks for plasma electron density diagnostics and self-absorption test"

    NASA Astrophysics Data System (ADS)

    Gautam, Ghaneshwar; Surmick, David M.; Parigger, Christian G.

    2015-07-01

    In this letter, we present a brief comment regarding the recently published paper by Ivković et al., J Quant Spectrosc Radiat Transf 2015;154:1-8. Reference is made to previous experimental results to indicate that self-absorption must have occurred; however, when error propagation is carefully considered, both the widths and the peak separation predict electron densities within the error margins. Nevertheless, the diagnostic method and the presented details on the use of the hydrogen beta peak separation are viewed as a welcome contribution to studies of laser-induced plasma.

  6. Prospective Analysis of Surgical Bone Margins After Partial Foot Amputation in Diabetic Patients Admitted With Moderate to Severe Foot Infections.

    PubMed

    Schmidt, Brian M; McHugh, Jonathan B; Patel, Rajiv M; Wrobel, James S

    2018-04-01

    Osteomyelitis is common in diabetic foot infections, and medical management can lead to poor outcomes. Surgical management involves sending histopathologic and microbiologic specimens, which guide future intervention. We examined the effect of obtaining surgical margins in patients undergoing forefoot amputations to identify patient characteristics associated with outcomes. Secondary aims included evaluating the interobserver reliability of histopathologic data at both the distal and proximal surgical bone margins. Data were prospectively collected on 72 individuals and pooled for analysis. A standardized method to retrieve intraoperative bone margins was established. A univariate analysis was performed. Negative outcomes, including major lower extremity amputation, wound dehiscence, reulceration, reamputation, and death, were recorded. Viable proximal margins were obtained in 63 of 72 cases (87.5%). Strong interobserver reliability of histopathology was recorded. Univariate analysis demonstrated that preoperative platelets, albumin, probe-to-bone testing, absolute toe pressures, and smaller wound surface area were associated with obtaining viable margins. Residual osteomyelitis resulted in readmission 2.6 times more often and in more postoperative complications. Patient characteristics differed significantly between the viable margin and dirty margin groups. High interobserver reliability was demonstrated. Obtaining viable margins resulted in reduced rates of readmission and negative outcomes. Prognostic, Level I: Prospective.

  7. Advancing Optical Imaging for Breast Margin Assessment: An Analysis of Excisional Time, Cautery, and Patent Blue Dye on Underlying Sources of Contrast

    PubMed Central

    Bydlon, Torre M.; Barry, William T.; Kennedy, Stephanie A.; Brown, J. Quincy; Gallagher, Jennifer E.; Wilke, Lee G.; Geradts, Joseph; Ramanujam, Nimmi

    2012-01-01

    Breast conserving surgery (BCS) is a recommended treatment for breast cancer patients where the goal is to remove the tumor and a surrounding rim of normal tissue. Unfortunately, a high percentage of patients return for additional surgeries to remove all of the cancer. Post-operative pathology is the gold standard for evaluating BCS margins but is limited due to the amount of tissue that can be sampled. Frozen section analysis and touch-preparation cytology have been proposed to address the surgical needs but also have sampling limitations. These issues represent an unmet clinical need for guidance in resecting malignant tissue intra-operatively and for pathological sampling. We have developed a quantitative spectral imaging device to examine margins intra-operatively. The context in which this technology is applied (intra-operative or post-operative setting) is influenced by time after excision and surgical factors including cautery and the presence of patent blue dye (specifically Lymphazurin™, used for sentinel lymph node mapping). Optical endpoints of hemoglobin ([THb]), fat ([β-carotene]), and fibroglandular content via light scattering (<µs’>) measurements were quantified from diffuse reflectance spectra of lumpectomy and mastectomy specimens using a Monte Carlo model. A linear longitudinal mixed-effects model was used to fit the optical endpoints for the cautery and kinetics studies. Monte Carlo simulations and tissue mimicking phantoms were used for the patent blue dye experiments. [THb], [β-carotene], and <µs’> were affected by <3.3% error with <80 µM of patent blue dye. The percent change in [β-carotene], <µs’>, and [β-carotene]/<µs’> was <14% in 30 minutes, while percent change in [THb] was >40%. [β-carotene] and [β-carotene]/<µs’> were the only parameters not affected by cautery. 
This work demonstrates the importance of understanding the post-excision kinetics of ex-vivo tissue and the effects of cautery and patent blue dye for breast tumor margin assessment, in order to accurately interpret data and exploit underlying sources of contrast. PMID:23251526

  8. Real-time prediction and gating of respiratory motion in 3D space using extended Kalman filters and Gaussian process regression network

    NASA Astrophysics Data System (ADS)

    Bukhari, W.; Hong, S.-M.

    2016-03-01

    The prediction and gating of respiratory motion have received much attention over the last two decades as means of reducing the targeting error of the radiation treatment beam due to respiratory motion. In this article, we present a real-time algorithm for predicting respiratory motion in 3D space and realizing a gating function without pre-specifying a particular phase of the patient’s breathing cycle. The algorithm, named EKF-GPRN+, first employs an extended Kalman filter (EKF) independently along each coordinate to predict the respiratory motion and then uses a Gaussian process regression network (GPRN) to correct the prediction error of the EKF in 3D space. The GPRN is a nonparametric Bayesian algorithm for modeling input-dependent correlations between the output variables in multi-output regression. Inference in GPRN is intractable, and we employ variational inference with a mean-field approximation to compute an approximate predictive mean and predictive covariance matrix. The approximate predictive mean is used to correct the prediction error of the EKF. The trace of the approximate predictive covariance matrix is utilized to capture the uncertainty in the EKF-GPRN+ prediction error and to systematically identify, in advance, breathing points with a higher probability of large prediction error. This identification enables us to pause the treatment beam over such instances. EKF-GPRN+ implements a gating function by using simple calculations based on the trace of the predictive covariance matrix. Extensive numerical experiments are performed on a large database of 304 respiratory motion traces to evaluate EKF-GPRN+. The experimental results show that the EKF-GPRN+ algorithm reduces the patient-wise prediction error to 38%, 40% and 40% in root-mean-square, compared to no prediction, at lookahead lengths of 192 ms, 384 ms and 576 ms, respectively.
The EKF-GPRN+ algorithm can further reduce the prediction error by employing the gating function, albeit at the cost of reduced duty cycle. The error reduction allows the clinical target volume to planning target volume (CTV-PTV) margin to be reduced, leading to decreased normal-tissue toxicity and possible dose escalation. The CTV-PTV margin is also evaluated to quantify clinical benefits of EKF-GPRN+ prediction.
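
    The full EKF-GPRN+ algorithm is not reproduced here, but the EKF stage can be sketched with a plain linear constant-velocity Kalman filter applied to a 1D trace and extrapolated forward to realize a lookahead prediction (the paper's nonlinear breathing-state model and GPRN correction are omitted, and all parameter values are illustrative):

```python
import numpy as np

def kalman_predict_1d(z, dt=0.1, q=1e-2, r=1e-2, lookahead_steps=2):
    """Constant-velocity Kalman filter over a 1D trace z; returns the
    position predicted lookahead_steps ahead at each time step."""
    F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition (pos, vel)
    H = np.array([[1.0, 0.0]])              # observe position only
    Q, R = q * np.eye(2), np.array([[r]])
    x, P = np.array([[z[0]], [0.0]]), np.eye(2)
    preds = []
    for zk in z:
        x, P = F @ x, F @ P @ F.T + Q                        # predict
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)         # Kalman gain
        x = x + K @ (np.array([[zk]]) - H @ x)               # update
        P = (np.eye(2) - K @ H) @ P
        xp = np.linalg.matrix_power(F, lookahead_steps) @ x  # extrapolate
        preds.append(float(xp[0, 0]))
    return np.array(preds)

# Illustrative breathing-like trace: 0.25 Hz sinusoid sampled at 10 Hz
t = np.arange(0, 20, 0.1)
z = np.sin(2 * np.pi * 0.25 * t)
preds = kalman_predict_1d(z)
```

    A gating function in the spirit of the paper would pause the beam whenever the filter's forward-propagated predictive variance exceeds a threshold.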

  9. Analysis of System Margins on Missions Utilizing Solar Electric Propulsion

    NASA Technical Reports Server (NTRS)

    Oh, David Y.; Landau, Damon; Randolph, Thomas; Timmerman, Paul; Chase, James; Sims, Jon; Kowalkowski, Theresa

    2008-01-01

    NASA's Jet Propulsion Laboratory has conducted a study focused on the analysis of appropriate margins for deep space missions using solar electric propulsion (SEP). The purpose of this study is to understand the links between disparate system margins (power, mass, thermal, etc.) and their impact on overall mission performance and robustness. It is determined that the various sources of uncertainty and risk associated with electric propulsion mission design can be summarized into three relatively independent parameters: 1) EP Power Margin, 2) Propellant Margin, and 3) Duty Cycle Margin. The overall relationship between these parameters and other major sources of uncertainty is presented. A detailed trajectory analysis is conducted to examine the impact that various assumptions related to power, duty cycle, destination, and thruster performance, including missed thrust periods, have on overall performance. Recommendations are presented for system margins for deep space missions utilizing solar electric propulsion.

  10. Residual Seminal Vesicle Displacement in Marker-Based Image-Guided Radiotherapy for Prostate Cancer and the Impact on Margin Design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smitsmans, Monique H.P.; Bois, Josien de; Sonke, Jan-Jakob

    Purpose: The objectives of this study were to quantify residual interfraction displacement of seminal vesicles (SV) and investigate the efficacy of rotation correction on SV displacement in marker-based prostate image-guided radiotherapy (IGRT). We also determined the effect of marker registration on the measured SV displacement and its impact on margin design. Methods and Materials: SV displacement was determined relative to marker registration by using 296 cone beam computed tomography scans of 13 prostate cancer patients with implanted markers. SV were individually registered in the transverse plane, based on gray-value information. The target registration error (TRE) for the SV due to marker registration inaccuracies was estimated. Correlations between prostate gland rotations and SV displacement and between individual SV displacements were determined. Results: The SV registration success rate was 99%. Displacement amounts of both SVs were comparable. Systematic and random residual SV displacements were 1.6 mm and 2.0 mm in the left-right direction, respectively, and 2.8 mm and 3.1 mm in the anteroposterior (AP) direction, respectively. Rotation correction did not reduce residual SV displacement. Prostate gland rotation around the left-right axis correlated with SV AP displacement (R{sup 2} = 42%); a correlation existed between both SVs for AP displacement (R{sup 2} = 62%); considerable correlation existed between random errors of SV displacement and TRE (R{sup 2} = 34%). Conclusions: Considerable residual SV displacement exists in marker-based IGRT. Rotation correction barely reduced SV displacement; rather, a larger SV displacement was shown relative to the prostate gland that was not captured by the marker position. Marker registration error partly explains SV displacement when correcting for rotations. Correcting for rotations, therefore, is not advisable when SV are part of the target volume. 
Margin design for SVs should take these uncertainties into account.

  11. Immobilisation precision in VMAT for oral cancer patients

    NASA Astrophysics Data System (ADS)

    Norfadilah, M. N.; Ahmad, R.; Heng, S. P.; Lam, K. S.; Radzi, A. B. Ahmad; John, L. S. H.

    2017-05-01

    A study was conducted to evaluate and quantify the precision of interfraction setup with different immobilisation devices throughout the treatment course. Local setup accuracy was analysed for 8 oral cancer patients receiving radiotherapy: 4 with a HeadFIX® mouthpiece moulded with wax (HFW) and 4 with a 10 ml/cc syringe barrel (SYR). Each patient underwent Image Guided Radiotherapy (IGRT), with a total of 209 cone-beam computed tomography (CBCT) data sets used to measure setup errors. The setup variations in the mediolateral (ML), craniocaudal (CC), and anteroposterior (AP) dimensions were measured. The overall mean displacement (M), the population systematic (Σ) and random (σ) errors, and the 3D vector length were calculated. Clinical target volume to planning target volume (CTV-PTV) margins were calculated according to the van Herk formula (2.5Σ + 0.7σ). The M values for both groups were < 1 mm and < 1° in all translational and rotational directions, indicating no significant imprecision in the equipment (lasers) or the procedure. The interfraction translational 3D vectors for HFW and SYR were 1.93 ± 0.66 mm and 3.84 ± 1.34 mm, respectively; the average interfraction rotational errors were 0.00° ± 0.65° and 0.34° ± 0.59°, respectively. The CTV-PTV margins calculated along the three translational axes (right-left, superior-inferior, anterior-posterior) were 3.08, 2.22 and 0.81 mm for HFW and 3.76, 6.24 and 5.06 mm for SYR. These results demonstrate that HFW reproduces patient position more precisely than the conventionally used SYR (p < 0.001). All calculated margins were within the hospital protocol (5 mm) except along the S-I and A-P axes with the syringe. For this reason, daily IGRT is highly recommended to improve immobilisation precision.
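
    The van Herk recipe used above, M = 2.5Σ + 0.7σ, is straightforward to apply per axis; the systematic and random error values in this sketch are invented for illustration, not the study's:

```python
def van_herk_margin(systematic_mm: float, random_mm: float) -> float:
    """CTV-PTV margin (mm) per the van Herk recipe M = 2.5*Sigma + 0.7*sigma,
    which aims to deliver at least 95% of the dose to the CTV for 90% of patients."""
    return 2.5 * systematic_mm + 0.7 * random_mm

# Illustrative per-axis population errors (Sigma, sigma) in mm
for axis, (Sigma, sigma) in {"ML": (1.0, 0.9), "CC": (0.7, 0.6), "AP": (0.2, 0.4)}.items():
    print(f"{axis}: margin = {van_herk_margin(Sigma, sigma):.2f} mm")
```

    Note how the systematic component dominates: halving Σ shrinks the margin far more than halving σ, which is why systematic errors are the priority for image guidance.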

  12. Close Margins Less Than 2 mm Are Not Associated With Higher Risks of 10-Year Local Recurrence and Breast Cancer Mortality Compared With Negative Margins in Women Treated With Breast-Conserving Therapy.

    PubMed

    Tyler, Susan; Truong, Pauline T; Lesperance, Mary; Nichol, Alan; Baliski, Chris; Warburton, Rebecca; Tyldesley, Scott

    2018-03-13

    The 2014 Society of Surgical Oncology-American Society for Radiation Oncology consensus suggested "no ink on tumor" is a sufficient surgical margin for invasive breast cancer treated with breast-conserving surgery (BCS). Whether close margins <2 mm are associated with inferior outcomes remains controversial. This study evaluated 10-year outcomes by margin status in a population-based cohort treated with BCS and adjuvant radiation therapy (RT). The subjects were 10,863 women with invasive cancer categorized as pT1 to T3, any N, and M0 referred from 2001 to 2011, an era in which the institutional policy was to re-excise close or positive margins, except in select cases. All women underwent BCS and whole-breast RT with or without boost RT. Local recurrence (LR) and breast cancer-specific survival (BCSS) were examined using competing-risk analysis in cohorts with negative (≥2 mm; n = 9241, 85%), close (<2 mm; n = 1310, 12%), or positive (tumor touching ink; n = 312, 3%) margins. Multivariable analysis and matched-pair analysis were performed. The median follow-up period was 8 years. Systemic therapy was used in 87% of patients. Boost RT was used in 34.1%, 76.9%, and 79.5% of patients with negative, close, and positive margins, respectively. In the negative, close, and positive margin cohorts, the 10-year cumulative incidence of LR was 1.8%, 2.0%, and 1.1%, respectively (P = .759). Corresponding BCSS estimates were 93.9%, 91.8%, and 87.9%, respectively (P < .001). On multivariable analysis, close margins were not associated with increased LR (hazard ratio, 1.25; 95% confidence interval 0.79-1.97; P = .350) or reduced BCSS (hazard ratio, 1.25; 95% confidence interval 0.98-1.58, P = .071) relative to negative margins. On matched-pair analysis, close margin cases had similar LR (P = .114) and BCSS (P = .100) to negative margin controls. Select cases with close or positive margins in this population-based analysis had similar LR and BCSS to cases with negative margins. 
While these findings do not endorse omitting re-excision for all cases, the data support a policy of accepting carefully selected cases with close margins for adjuvant RT without re-excision. Copyright © 2018 Elsevier Inc. All rights reserved.

  13. A semi-Markov model for mitosis segmentation in time-lapse phase contrast microscopy image sequences of stem cell populations.

    PubMed

    Liu, An-An; Li, Kang; Kanade, Takeo

    2012-02-01

    We propose a semi-Markov model trained in a max-margin learning framework for mitosis event segmentation in large-scale time-lapse phase contrast microscopy image sequences of stem cell populations. Our method consists of three steps. First, we apply a constrained optimization based microscopy image segmentation method that exploits phase contrast optics to extract candidate subsequences in the input image sequence that contains mitosis events. Then, we apply a max-margin hidden conditional random field (MM-HCRF) classifier learned from human-annotated mitotic and nonmitotic sequences to classify each candidate subsequence as a mitosis or not. Finally, a max-margin semi-Markov model (MM-SMM) trained on manually-segmented mitotic sequences is utilized to reinforce the mitosis classification results, and to further segment each mitosis into four predefined temporal stages. The proposed method outperforms the event-detection CRF model recently reported by Huh as well as several other competing methods in very challenging image sequences of multipolar-shaped C3H10T1/2 mesenchymal stem cells. For mitosis detection, an overall precision of 95.8% and a recall of 88.1% were achieved. For mitosis segmentation, the mean and standard deviation for the localization errors of the start and end points of all mitosis stages were well below 1 and 2 frames, respectively. In particular, an overall temporal location error of 0.73 ± 1.29 frames was achieved for locating daughter cell birth events.

  14. The Prognostic Value of Varying Definitions of Positive Resection Margin in Patients with Colorectal Cancer Liver Metastases.

    PubMed

    Wang, Jane; Margonis, Georgios Antonios; Amini, Neda; Andreatos, Nikolaos; Yuan, Chunhui; Damaskos, Christos; Antoniou, Efstathios; Garmpis, Nikolaos; Buettner, Stefan; Barbon, Carlotta; Deshwar, Amar; He, Jin; Burkhart, Richard; Pawlik, Timothy M; Wolfgang, Christopher L; Weiss, Matthew J

    2018-04-09

    Varying definitions of resection margin clearance are currently employed among patients with colorectal cancer liver metastases (CRLM). Specifically, a microscopically positive margin (R1) has alternatively been equated with an involved margin (margin width = 0 mm) or a margin width < 1 mm. Consequently, patients with a margin width of 0-1 mm (sub-mm) are inconsistently classified in either the R0 or R1 categories, thus obscuring the prognostic implications of sub-mm margins. Six hundred thirty-three patients who underwent resection of CRLM were identified. Both R1 definitions were alternatively employed and multivariable analysis was used to determine the predictive power of each definition, as well as the prognostic implications of a sub-mm margin. Five hundred thirty-nine (85.2%) patients had a margin width ≥ 1 mm, 42 had a sub-mm margin width, and 52 had an involved margin (0 mm). A margin width ≥ 1 mm was associated with improved survival vs. a sub-mm margin (65 vs. 36 months; P = 0.03) or an involved margin (65 vs. 33 months; P < 0.001). No significant difference in survival was detected between patients with involved vs. sub-mm margins (P = 0.31). A sub-mm margin and an involved margin were both independent predictors of worse OS (HR 1.66, 1.04-2.67; P = 0.04, and HR 2.14, 1.46-3.16; P < 0.001, respectively) in multivariable analysis. Importantly, after combining the two definitions, patients with either an involved margin or a sub-mm margin were associated with worse OS in multivariable analysis (HR 1.94, 1.41-2.65; P < 0.001). Patients with involved or sub-mm margins demonstrated a similar inferior OS vs. patients with a margin width > 1 mm. Consequently, a uniform definition of R1 as a margin width < 1 mm should perhaps be employed by future studies.

  15. The Effects of Combining Videogame Dancing and Pelvic Floor Training to Improve Dual-Task Gait and Cognition in Women with Mixed-Urinary Incontinence.

    PubMed

    Fraser, Sarah A; Elliott, Valerie; de Bruin, Eling D; Bherer, Louis; Dumoulin, Chantal

    2014-06-01

    Many women over 65 years of age suffer from mixed urinary incontinence (MUI) and executive function (EF) deficits. Both incontinence and EF declines increase fall risk. The current study assessed EF and dual-task gait after a multicomponent intervention that combined pelvic floor muscle (PFM) training and videogame dancing (VGD). Baseline (Pre1), pretraining (Pre2), and post-training (Post) neuropsychological and dual-task gait assessments were completed by 23 women (mean age, 70.4 years) with MUI. During the dual-task, participants walked and performed an auditory n-back task. From Pre2 to Post, all women completed 12 weeks of combined PFM and VGD training. After training (Pre2 to Post), the number of errors in the Inhibition/Switch Stroop condition decreased significantly, the Trail Making Test difference score improved marginally, and the number of n-back errors during dual-task gait significantly decreased. A subgroup analysis based on continence improvements (pad test) revealed that only those subjects who improved in the pad test had significantly reduced numbers of n-back errors during dual-task gait. The results of this study suggest that a multicomponent intervention can improve EFs and the dual-task gait of older women with MUI. Future research is needed to determine if the training-induced improvements in these factors reduce fall risk.

  16. Accuracy of lesion boundary tracking in navigated breast tumor excision

    NASA Astrophysics Data System (ADS)

    Heffernan, Emily; Ungi, Tamas; Vaughan, Thomas; Pezeshki, Padina; Lasso, Andras; Gauvin, Gabrielle; Rudan, John; Engel, C. Jay; Morin, Evelyn; Fichtinger, Gabor

    2016-03-01

    PURPOSE: An electromagnetic navigation system for tumor excision in breast conserving surgery has recently been developed. Preoperatively, a hooked needle is positioned in the tumor and the tumor boundaries are defined in the needle coordinate system. The needle is tracked electromagnetically throughout the procedure to localize the tumor. However, the needle may move and the tissue may deform, leading to errors in maintaining a correct excision boundary. It is imperative to quantify these errors so the surgeon can choose an appropriate resection margin. METHODS: A commercial breast biopsy phantom with several inclusions was used. Location and shape of a lesion before and after mechanical deformation were determined using 3D ultrasound volumes. Tumor location and shape were estimated from initial contours and tracking data. The difference in estimated and actual location and shape of the lesion after deformation was quantified using the Hausdorff distance. Data collection and analysis were done using our 3D Slicer software application and PLUS toolkit. RESULTS: The deformation of the breast resulted in 3.72 mm (STD 0.67 mm) average boundary displacement for an isoelastic lesion and 3.88 mm (STD 0.43 mm) for a hyperelastic lesion. The difference between the actual and estimated tracked tumor boundary was 0.88 mm (STD 0.20 mm) for the isoelastic and 1.78 mm (STD 0.18 mm) for the hyperelastic lesion. CONCLUSION: The average lesion boundary tracking error was below 2 mm, which is clinically acceptable. We suspect that stiffness of the phantom tissue affected the error measurements. Results will be validated in patient studies.
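
    The Hausdorff distance used here to score boundary error can be written in a few lines of NumPy (an illustrative version, not the authors' 3D Slicer/PLUS pipeline; scipy.spatial.distance.directed_hausdorff provides an optimized equivalent):

```python
import numpy as np

def directed_hausdorff(A, B):
    """For each point in A, distance to the nearest point in B; return the max."""
    d = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=2)  # pairwise distances
    return d.min(axis=1).max()

def hausdorff(A, B):
    """Symmetric Hausdorff distance between point sets A (N x 3) and B (M x 3)."""
    return max(directed_hausdorff(A, B), directed_hausdorff(B, A))

# Toy boundaries: a single point displaced 2 units dominates the distance
A = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
B = np.array([[0.0, 0.0, 0.0], [3.0, 0.0, 0.0]])
print(hausdorff(A, B))  # 2.0
```

    Because the metric is a maximum over nearest-point distances, it captures the single worst boundary-tracking error rather than an average, which suits a safety-margin question.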

  17. Abutment Disconnection/Reconnection Affects Peri-implant Marginal Bone Levels: A Meta-Analysis.

    PubMed

    Koutouzis, Theofilos; Gholami, Fatemeh; Reynolds, John; Lundgren, Tord; Kotsakis, Georgios A

    Preclinical and clinical studies have shown that marginal bone loss can be secondary to repeated disconnection and reconnection of abutments that affect the peri-implant mucosal seal. The aim of this systematic review and meta-analysis was to evaluate the impact of abutment disconnections/reconnections on peri-implant marginal bone level changes. To address this question, two reviewers independently performed an electronic search of three major databases up to October 2015 complemented by manual searches. Eligible articles were selected on the basis of prespecified inclusion and exclusion criteria after a two-phase search strategy and assessed for risk of bias. A random-effects meta-analysis was performed for marginal bone loss. The authors initially identified 392 titles and abstracts. After evaluation, seven controlled clinical studies were included. Qualitative assessment of the articles revealed a trend toward protective marginal bone level preservation for implants with final abutment placement (FAP) at the time of implant placement compared with implants for which there were multiple abutment placements (MAP). The FAP group exhibited a marginal bone level change ranging from 0.08 to 0.34 mm, whereas the MAP group exhibited a marginal bone level change ranging from 0.09 to 0.55 mm. Meta-analysis of the seven studies reporting on 396 implants showed significantly greater bone loss in cases of multiple abutment disconnections/reconnections. The weighted mean difference in marginal bone loss was 0.19 mm (95% confidence interval, 0.06-0.32 mm), favoring bone preservation in the FAP group. Within the limitations of this meta-analysis, abutment disconnection and reconnection significantly affected peri-implant marginal bone levels. These findings pave the way for revisiting current restorative protocols at the restorative treatment planning stage to prevent incipient marginal bone loss.
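
    The pooling step can be illustrated with the fixed-effect special case of inverse-variance weighting (the review used a random-effects model, which additionally inflates the variances by a between-study term; the per-study effect sizes and standard errors below are invented, not the review's data):

```python
import numpy as np

def pooled_mean_difference(effects_mm, ses_mm):
    """Inverse-variance pooled mean difference with a 95% Wald CI."""
    w = 1.0 / np.square(ses_mm)              # weights = 1 / variance
    est = np.sum(w * effects_mm) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    return est, (est - 1.96 * se, est + 1.96 * se)

# Hypothetical per-study mean differences in marginal bone loss (mm) and SEs
est, ci = pooled_mean_difference(np.array([0.15, 0.25, 0.20]),
                                 np.array([0.05, 0.08, 0.06]))
print(f"pooled MD = {est:.2f} mm, 95% CI ({ci[0]:.2f}, {ci[1]:.2f})")
```

    A CI excluding zero, as in the review's 0.19 mm (0.06-0.32 mm) result, indicates a statistically significant difference favoring the final-abutment-placement group.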

  18. Not to put too fine a point on it - does increasing precision of geographic referencing improve species distribution models for a wide-ranging migratory bat?

    USGS Publications Warehouse

    Hayes, Mark A.; Ozenberger, Katharine; Cryan, Paul M.; Wunder, Michael B.

    2015-01-01

    Bat specimens held in natural history museum collections can provide insights into the distribution of species. However, there are several important sources of spatial error associated with natural history specimens that may influence the analysis and mapping of bat species distributions. We analyzed the importance of geographic referencing and error correction in species distribution modeling (SDM) using occurrence records of hoary bats (Lasiurus cinereus). This species is known to migrate long distances and is a species of increasing concern due to fatalities documented at wind energy facilities in North America. We used 3,215 museum occurrence records collected from 1950–2000 for hoary bats in North America. We compared SDM performance using five approaches: generalized linear models, multivariate adaptive regression splines, boosted regression trees, random forest, and maximum entropy models. We evaluated results using three SDM performance metrics (AUC, sensitivity, and specificity) and two data sets: one comprised of the original occurrence data, and a second data set consisting of these same records after the locations were adjusted to correct for identifiable spatial errors. The increase in precision improved the mean estimated spatial error associated with hoary bat records from 5.11 km to 1.58 km, and this reduction in error resulted in a slight increase in all three SDM performance metrics. These results provide insights into the importance of geographic referencing and the value of correcting spatial errors in modeling the distribution of a wide-ranging bat species. We conclude that the considerable time and effort invested in carefully increasing the precision of the occurrence locations in this data set was not worth the marginal gains in improved SDM performance, and it seems likely that gains would be similar for other bat species that range across large areas of the continent, migrate, and are habitat generalists.

  19. [Planning of esthetic oral rehabilitation according to correlative analysis of clinical and morphological features of the marginal gingiva].

    PubMed

    Stafeev, A A; Zinov'ev, G I; Drozdov, D D

    2015-01-01

    Orthopedic restoration and its related clinical stages (preparation, gingival retraction, impression) are often associated with complications arising from the marginal gingiva. The technology of indirect ceramic restoration requires an assessment of the clinical and morphological parameters of periodontal tissues. Analysis of the morphofunctional state of periodontal tissue established a correlation between the periodontal histology type and inflammatory and degenerative complications. Results of clinical studies and correlation analysis of clinical and morphological parameters of the marginal gingiva have shown that the important parameters influencing the choice of manufacturing technology are the position of the restoration margin relative to the marginal gingiva and the periodontal morphotype.

  20. What's the Value of VAM (Value-Added Modeling)?

    ERIC Educational Resources Information Center

    Scherrer, Jimmy

    2012-01-01

    The use of value-added modeling (VAM) in school accountability is expanding, but deciding how to embrace VAM is difficult. Various experts say it's too unreliable, causes more harm than good, and has a big margin for error. Others assert VAM is imperfect but useful, and provides valuable feedback. A closer look at the models, and their use,…

  1. Global Trends 2025: A Transformed World

    DTIC Science & Technology

    2008-11-01

    Remain Strong, Capacities Will Shrink New Relationships and Recalibrated Old Partnerships Less Financial Margin of Error More Limited Military...making reforms that create a “European President” and “European Foreign Minister” and develops greater institutional capacity for crisis management...domestic issues and sustaining their economic development, increasingly, as outlined in this chapter, they will have the capacity to be global

  2. A flexible wearable sensor for knee flexion assessment during gait.

    PubMed

    Papi, Enrica; Bo, Yen Nee; McGregor, Alison H

    2018-05-01

    Gait analysis plays an important role in the diagnosis and management of patients with movement disorders but it is usually performed within a laboratory. Recently interest has shifted towards the possibility of conducting gait assessments in everyday environments thus facilitating long-term monitoring. This is possible by using wearable technologies rather than laboratory based equipment. This study aims to validate a novel wearable sensor system's ability to measure peak knee sagittal angles during gait. The proposed system comprises a flexible conductive polymer unit interfaced with a wireless acquisition node attached over the knee on a pair of leggings. Sixteen healthy volunteers participated in two gait assessments on separate occasions. Data were simultaneously collected from the novel sensor and a gold-standard 10-camera motion capture system. The relationship between sensor signal and reference knee flexion angles was defined for each subject to allow the transformation of sensor voltage outputs to angular measures (degrees). The knee peak flexion angle from the sensor and reference system were compared by means of root mean square error (RMSE), absolute error, Bland-Altman plots and intra-class correlation coefficients (ICCs) to assess test-retest reliability. Comparisons of knee peak flexion angles calculated from the sensor and gold standard yielded an absolute error of 0.35° (±2.9°) and an RMSE of 1.2° (±0.4°). Good agreement was found between the two systems with the majority of data lying within the limits of agreement. The sensor demonstrated high test-retest reliability (ICCs > 0.8). These results show the ability of the sensor to monitor knee peak sagittal angles with small margins of error and in agreement with the gold standard system. The sensor has potential to be used in clinical settings as a discreet, unobtrusive wearable device allowing for long-term gait analysis. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.
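    The reported agreement metrics (RMSE, bias, and Bland-Altman limits of agreement) can be sketched as follows; the angle values below are illustrative, not data from the study.

```python
import numpy as np

def agreement_metrics(sensor_deg, reference_deg):
    """RMSE, mean bias, and Bland-Altman 95% limits of agreement
    between two series of peak knee flexion angles."""
    sensor = np.asarray(sensor_deg, dtype=float)
    ref = np.asarray(reference_deg, dtype=float)
    diff = sensor - ref
    rmse = float(np.sqrt(np.mean(diff ** 2)))
    bias = float(np.mean(diff))
    half_width = 1.96 * float(np.std(diff, ddof=1))  # limits: bias +/- 1.96 SD
    return rmse, bias, (bias - half_width, bias + half_width)

# Illustrative peak knee flexion angles (degrees), not trial data
sensor = [61.2, 58.9, 63.4, 60.1, 59.7]
motion_capture = [60.0, 59.5, 62.8, 61.0, 58.9]
rmse, bias, limits = agreement_metrics(sensor, motion_capture)
```

    Agreement is judged by whether most paired differences fall between `limits[0]` and `limits[1]`, as in the study's Bland-Altman plots.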

  3. An estimator-predictor approach to PLL loop filter design

    NASA Technical Reports Server (NTRS)

    Statman, J. I.; Hurd, W. J.

    1986-01-01

    An approach to the design of digital phase locked loops (DPLLs), using estimation theory concepts in the selection of a loop filter, is presented. The key concept is that the DPLL closed-loop transfer function is decomposed into an estimator and a predictor. The estimator provides recursive estimates of phase, frequency, and higher order derivatives, while the predictor compensates for the transport lag inherent in the loop. This decomposition results in a straightforward loop filter design procedure, enabling use of techniques from optimal and sub-optimal estimation theory. A design example for a particular choice of estimator is presented, followed by analysis of the associated bandwidth, gain margin, and steady state errors caused by unmodeled dynamics. This approach is under consideration for the design of the Deep Space Network (DSN) Advanced Receiver Carrier DPLL.
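    A minimal sketch of the estimator-predictor decomposition, assuming an alpha-beta (phase/frequency) estimator and a one-sample transport lag; the gains and lag are illustrative choices, not the DSN receiver's parameters.

```python
def run_loop(phase_meas, T=1.0, lag=1, alpha=0.5, beta=0.2):
    """Estimator-predictor loop filter sketch: recursive phase/frequency
    estimates plus extrapolation to compensate the loop transport lag."""
    phase_est, freq_est = 0.0, 0.0
    predictions = []
    for z in phase_meas:
        # Estimator: update phase and frequency from the phase residual
        resid = z - phase_est
        phase_est += alpha * resid
        freq_est += (beta / T) * resid
        # Predictor: extrapolate ahead by the transport lag
        predictions.append(phase_est + lag * T * freq_est)
        phase_est += T * freq_est  # propagate estimate to the next sample
    return predictions
```

    For a frequency-ramp input the loop settles with zero steady-state phase error, mirroring the type-II behavior of the receiver loops discussed above.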

  4. Follow-up on hang gliding injuries in Colorado.

    PubMed

    Krissoff, W B

    1976-01-01

    In a period extending from July 1973 to December 1975, seven fatal hang glider accidents were recorded in Colorado, all among experienced pilots. In addition, 11 serious nonfatal injuries were reported, which may represent only a fraction of those occurring. Accidents were noted to be multifactorial, caused by (1) pilot error, (2) equipment failure, (3) terrain hazards, and (4) possible design shortcomings. Accidents can be expected to decline in frequency with improved pilot training programs, grading and regulation of sites, and standardized safety clothing. No doubt over time, the less safe standard Rogallo wing will be replaced by the more stable Superkites and controlled collapsibles, which offer a higher safety margin. In the last analysis, this sport will remain a popular yet high risk endeavor (Figs. 2 through 5).

  5. Economic Implications of Widespread Expansion of Frozen Section Margin Analysis to Guide Surgical Resection in Women With Breast Cancer Undergoing Breast-Conserving Surgery.

    PubMed

    Boughey, Judy C; Keeney, Gary L; Radensky, Paul; Song, Christine P; Habermann, Elizabeth B

    2016-04-01

    In the current health care environment, cost effectiveness is critically important in policy setting and care of patients. This study performed a health economic analysis to assess the implications to providers and payers of expanding the use of frozen section margin analysis to minimize reoperations for patients undergoing breast cancer lumpectomy. A health care economic impact model was built to assess annual costs associated with breast lumpectomy procedures with and without frozen section margin analysis to avoid reoperation. If frozen section margin analysis is used in 20% of breast lumpectomies and under a baseline assumption that 35% of initial lumpectomies without frozen section analysis result in reoperations, the potential annual cost savings are $18.2 million to payers and $0.4 million to providers. Under the same baseline assumption, if 100% of all health care facilities adopted the use of frozen section margin analysis for breast lumpectomy procedures, the potential annual cost savings are $90.9 million to payers and $1.8 million to providers. On the basis of 10,000 simulations, use of intraoperative frozen section margin analysis yields cost saving for payers and is cost neutral to slightly cost saving for providers. This economic analysis indicates that widespread use of frozen section margin evaluation intraoperatively to guide surgical resection in breast lumpectomy cases and minimize reoperations would be beneficial to cost savings not only for the patient but also for payers and, in most cases, for providers. Copyright © 2016 by American Society of Clinical Oncology.
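    The structure of such a reoperation-avoidance cost model can be sketched as below; every input (case volume, adoption rate, reoperation rates, costs) is a hypothetical placeholder, not a figure from the study.

```python
def annual_savings(n_lumpectomies, adoption, reop_rate_without,
                   reop_rate_with, cost_per_reop, frozen_section_cost):
    """Payer-perspective savings sketch: reoperations avoided by
    frozen section margin analysis, net of the added per-case
    frozen-section cost."""
    cases = n_lumpectomies * adoption
    reops_avoided = cases * (reop_rate_without - reop_rate_with)
    return reops_avoided * cost_per_reop - cases * frozen_section_cost

# All inputs are illustrative placeholders, not the study's figures
savings = annual_savings(n_lumpectomies=170_000, adoption=0.20,
                         reop_rate_without=0.35, reop_rate_with=0.04,
                         cost_per_reop=12_000, frozen_section_cost=900)
```

    Monte Carlo runs like the study's 10,000 simulations would draw these inputs from distributions rather than fixing them.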

  6. A method for identifying EMI critical circuits during development of a large C3

    NASA Astrophysics Data System (ADS)

    Barr, Douglas H.

    The circuit analysis methods and process Boeing Aerospace used on a large, ground-based military command, control, and communications (C3) system are described. This analysis was designed to help identify electromagnetic interference (EMI) critical circuits. The methodology used the MIL-E-6051 equipment criticality categories as the basis for defining critical circuits, relational database technology to help sort through and account for all of the approximately 5000 system signal cables, and Macintosh Plus personal computers to predict critical circuits based on safety margin analysis. The EMI circuit analysis process systematically examined all system circuits to identify which ones were likely to be EMI critical. The process used two separate, sequential safety margin analyses to identify critical circuits (conservative safety margin analysis, and detailed safety margin analysis). These analyses used field-to-wire and wire-to-wire coupling models using both worst-case and detailed circuit parameters (physical and electrical) to predict circuit safety margins. This process identified the predicted critical circuits that could then be verified by test.
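    The safety-margin test at the heart of such an analysis reduces to a decibel comparison between each circuit's susceptibility threshold and its predicted coupled level; the 6 dB requirement shown is a common MIL-E-6051 figure for non-ordnance circuits, used here only as an illustrative default.

```python
from math import log10

def safety_margin_db(threshold_v, predicted_v):
    """EMI safety margin in dB: susceptibility threshold relative to the
    predicted coupled interference voltage."""
    return 20.0 * log10(threshold_v / predicted_v)

def is_critical(threshold_v, predicted_v, required_db=6.0):
    # Flag a circuit as EMI critical when its predicted margin falls below
    # the required margin (6 dB is a common MIL-E-6051 figure; the actual
    # category requirements vary, and ordnance circuits need much more).
    return safety_margin_db(threshold_v, predicted_v) < required_db
```

    The conservative pass would apply this test with worst-case coupled levels; only circuits failing it proceed to the detailed safety margin analysis.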

  7. Applied Time Domain Stability Margin Assessment for Nonlinear Time-Varying Systems

    NASA Technical Reports Server (NTRS)

    Kiefer, J. M.; Johnson, M. D.; Wall, J. H.; Dominguez, A.

    2016-01-01

    The baseline stability margins for NASA's Space Launch System (SLS) launch vehicle were generated via the classical approach of linearizing the system equations of motion and determining the gain and phase margins from the resulting frequency domain model. To improve the fidelity of the classical methods, the linear frequency domain approach can be extended by replacing static, memoryless nonlinearities with describing functions. This technique, however, does not address the time-varying nature of the dynamics of a launch vehicle in flight. An alternative technique for evaluating the stability of the nonlinear launch vehicle dynamics along its trajectory is to incrementally adjust the gain and/or time delay in the time domain simulation until the system exhibits unstable behavior. This technique has the added benefit of providing a direct comparison between the time domain and frequency domain tools in support of simulation validation. This technique was implemented by using the Stability Aerospace Vehicle Analysis Tool (SAVANT) computer simulation to evaluate the stability of the SLS system with the Adaptive Augmenting Control (AAC) active and inactive along its ascent trajectory. The gains for which the vehicle maintains apparent time-domain stability define the gain margins, and the time delay similarly defines the phase margin. This method of extracting the control stability margins from the time-domain simulation is relatively straightforward, and the resultant margins can be compared to the linearized system results. The sections herein describe the techniques employed to extract the time-domain margins, compare the results between the nonlinear and linear methods, and provide explanations for observed discrepancies. The SLS ascent trajectory was simulated with SAVANT and the classical linear stability margins were evaluated at one-second intervals. 
The linear analysis was performed with the AAC algorithm disabled to attain baseline stability margins. At each time point, the system was linearized about the current operating point using Simulink's built-in solver. Each linearized system in time was evaluated for its rigid-body gain margin (high frequency gain margin), rigid-body phase margin, and aero gain margin (low frequency gain margin) for each control axis. Using the stability margins derived from the baseline linearization approach, the time domain derived stability margins were determined by executing time domain simulations in which axis-specific incremental gain and phase adjustments were made to the nominal system about the expected neutral stability point at specific flight times. The baseline stability margin time histories were used to shift the system gain to various values around the zero margin point such that a precise amount of expected gain margin was maintained throughout flight. When assessing the gain margins, the gain was applied starting at the time point under consideration, thereafter following the variation in the margin found in the linear analysis. When assessing the rigid-body phase margin, a constant time delay was applied to the system starting at the time point under consideration. If the baseline stability margins were correctly determined via the linear analysis, the time domain simulation results should contain unstable behavior at certain gain and phase values. Examples will be shown from repeated simulations with variable added gain and phase lag. Faithfulness of margins calculated from the linear analysis to the nonlinear system will be demonstrated.

  8. Margin reduction from image guided radiation therapy for soft tissue sarcoma: Secondary analysis of Radiation Therapy Oncology Group 0630 results.

    PubMed

    Li, X Allen; Chen, Xiaojian; Zhang, Qiang; Kirsch, David G; Petersen, Ivy; DeLaney, Thomas F; Freeman, Carolyn R; Trotti, Andy; Hitchcock, Ying; Bedi, Meena; Haddock, Michael; Salerno, Kilian; Dundas, George; Wang, Dian

    2016-01-01

    Six imaging modalities were used in Radiation Therapy Oncology Group (RTOG) 0630, a study of image guided radiation therapy (IGRT) for primary soft tissue sarcomas of the extremity. We analyzed all daily patient-repositioning data collected in this trial to determine the impact of daily IGRT on clinical target volume-to-planning target volume (CTV-to-PTV) margin. Daily repositioning data, including shifts in right-left (RL), superior-inferior (SI), and anterior-posterior (AP) directions and rotations for 98 patients enrolled in RTOG 0630 from 18 institutions were analyzed. Patients were repositioned daily on the basis of bone anatomy by using pretreatment images, including kilovoltage orthogonal images (KVorth), megavoltage orthogonal images (MVorth), KV fan-beam computed tomography (KVCT), KV cone beam CT (KVCB), MV fan-beam CT (MVCT), and MV cone beam CT (MVCB). Means and standard deviations (SDs) for each shift and rotation were calculated for each patient and for each IGRT modality. The Student's t tests and F-tests were performed to analyze the differences in the means and SDs. Necessary CTV-to-PTV margins were estimated. The repositioning shifts and day-to-day variations were large and generally similar for the 6 imaging modalities. Of the 2 most commonly used modalities, MVCT and KVorth, there were no statistically significant differences in the shifts and rotations (P = .15 and .59 for the RL and SI shifts, respectively; and P = .22 for rotation), except for shifts in AP direction (P = .002). The estimated CTV-to-PTV margins in the RL, SI, and AP directions would be 13.0, 10.4, and 11.7 mm from MVCT data, respectively, and 13.1, 8.6, and 10.8 mm from KVorth data, respectively, indicating that margins substantially larger than 5 mm used with daily IGRT would be required in the absence of IGRT. 
The observed large daily repositioning errors and the large variations among institutions imply that daily IGRT is necessary for this tumor site, particularly in multi-institutional trials. Otherwise, a CTV-to-PTV margin of 1.5 cm is required to account for daily setup variations. Copyright © 2016 American Society for Radiation Oncology. Published by Elsevier Inc. All rights reserved.
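    One way to turn daily shift logs into a CTV-to-PTV margin is the widely used van Herk population recipe M = 2.5Σ + 0.7σ; the sketch below assumes that recipe and invented per-patient shifts, and may differ from the exact formula used in the RTOG analysis.

```python
import numpy as np

def ctv_to_ptv_margin(shifts_mm):
    """Population margin from daily setup shifts along one axis:
    M = 2.5 * Sigma + 0.7 * sigma, where Sigma is the SD of per-patient
    mean errors (systematic component) and sigma is the RMS of
    per-patient SDs (random component)."""
    shifts = [np.asarray(s, dtype=float) for s in shifts_mm]
    systematic = np.std([s.mean() for s in shifts], ddof=1)
    random_ = np.sqrt(np.mean([s.std(ddof=1) ** 2 for s in shifts]))
    return 2.5 * systematic + 0.7 * random_

# Hypothetical per-patient daily AP shifts (mm) for three patients
margin = ctv_to_ptv_margin([
    [2.0, 3.5, 1.0, 4.0, 2.5],
    [-1.0, 0.5, -2.0, 0.0, -0.5],
    [5.0, 4.0, 6.5, 5.5, 4.5],
])
```

    Because the recipe weights the systematic component 2.5-fold, the large between-patient variation reported in the trial drives the margin estimates above 1 cm.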

  9. Novel maximum-margin training algorithms for supervised neural networks.

    PubMed

    Ludwig, Oswaldo; Nunes, Urbano

    2010-06-01

    This paper proposes three novel training methods, two of them based on the backpropagation approach and a third based on information theory, for multilayer perceptron (MLP) binary classifiers. Both backpropagation methods are based on the maximal-margin (MM) principle. The first, based on the gradient descent with adaptive learning rate algorithm (GDX) and named maximum-margin GDX (MMGDX), directly increases the margin of the MLP output-layer hyperplane. The proposed method jointly optimizes both MLP layers in a single process, backpropagating the gradient of an MM-based objective function through the output and hidden layers in order to create a hidden-layer space that enables a higher margin for the output-layer hyperplane, avoiding the testing of many arbitrary kernels, as occurs in the case of support vector machine (SVM) training. The proposed MM-based objective function aims to stretch out the margin to its limit. An objective function based on the Lp-norm is also proposed in order to take into account the idea of support vectors, while overcoming the complexity involved in solving a constrained optimization problem, as is usual in SVM training. In fact, all the training methods proposed in this paper have time and space complexities O(N), while usual SVM training methods have time complexity O(N³) and space complexity O(N²), where N is the training-data-set size. The second approach, named minimization of interclass interference (MICI), has an objective function inspired by Fisher discriminant analysis. This algorithm aims to create an MLP hidden output where the patterns have a desirable statistical distribution. In both training methods, the maximum area under the ROC curve (AUC) is applied as the stopping criterion. The third approach offers a robust training framework able to take the best of each proposed training method. 
The main idea is to compose a neural model by using neurons extracted from three other neural networks, each one previously trained by MICI, MMGDX, and Levenberg-Marquardt (LM), respectively. The resulting neural network was named the assembled neural network (ASNN). Benchmark data sets of real-world problems have been used in experiments that enable a comparison with other state-of-the-art classifiers. The results provide evidence of the effectiveness of our methods regarding accuracy, AUC, and balanced error rate.
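    The maximal-margin idea behind MMGDX can be illustrated with a simplified single-hyperplane stand-in (the actual method backpropagates the MM objective through both MLP layers; the gains and data below are invented):

```python
import numpy as np

def margin_objective(w, b, X, y):
    """Smallest geometric margin y*(Xw+b)/||w||: the MM principle trains
    the hyperplane to push this quantity outward."""
    geom = y * (X @ w + b) / np.linalg.norm(w)
    return float(geom.min())

def train_step(w, b, X, y, lr=0.1):
    # Hinge-style ascent on points violating a unit functional margin
    f = y * (X @ w + b)
    viol = f < 1.0
    if viol.any():
        w = w + lr * (X[viol] * y[viol, None]).mean(axis=0)
        b = b + lr * y[viol].mean()
    return w, b
```

    On a separable toy set, iterating `train_step` increases the minimum geometric margin, which is the quantity the MM-based objective stretches out.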

  10. Schedule for CT image guidance in treating prostate cancer with helical tomotherapy

    PubMed Central

    Beldjoudi, G; Yartsev, S; Bauman, G; Battista, J; Van Dyk, J

    2010-01-01

    The aim of this study was to determine the effect of reducing the number of image guidance sessions and patient-specific target margins on the dose distribution in the treatment of prostate cancer with helical tomotherapy. 20 patients with prostate cancer who were treated with helical tomotherapy using daily megavoltage CT (MVCT) imaging before treatment served as the study population. The average geometric shifts applied for set-up corrections, as a result of co-registration of MVCT and planning kilovoltage CT studies over an increasing number of image guidance sessions, were determined. Simulation of the consequences of various imaging scenarios on the dose distribution was performed for two patients with different patterns of interfraction changes in anatomy. Our analysis of the daily set-up correction shifts for 20 prostate cancer patients suggests that the use of four fractions would result in a population average shift that was within 1 mm of the average obtained from the data accumulated over all daily MVCT sessions. Simulation of a scenario in which imaging sessions are performed at a reduced frequency and the planning target volume margin is adapted provided significantly better sparing of organs at risk, with acceptable reproducibility of dose delivery to the clinical target volume. Our results indicate that four MVCT sessions on helical tomotherapy are sufficient to provide information for the creation of personalised target margins and the establishment of the new reference position that accounts for the systematic error. This simplified approach reduces overall treatment session time and decreases the imaging dose to the patient. PMID:19505966
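    The running-mean convergence check behind such a recommendation can be sketched as below; the daily shift values are invented for illustration.

```python
def fractions_to_converge(shifts_mm, tol_mm=1.0):
    """Smallest number of initial imaging sessions after which the running
    mean of daily setup shifts stays within tol_mm of the full-course mean."""
    full_mean = sum(shifts_mm) / len(shifts_mm)
    means = [sum(shifts_mm[:n]) / n for n in range(1, len(shifts_mm) + 1)]
    for n in range(1, len(means) + 1):
        if all(abs(m - full_mean) <= tol_mm for m in means[n - 1:]):
            return n
    return len(means)

# Hypothetical daily lateral shifts (mm) over an eight-fraction sample
n_sessions = fractions_to_converge([4.0, 1.0, 2.5, 2.0, 2.2, 1.8, 2.4, 2.1])
```

    The mean over the converged sessions then sets the new reference position (correcting the systematic error), and the residual day-to-day spread informs the personalised margin.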

  11. Effect of Processing Conditions on the Anelastic Behavior of Plasma Sprayed Thermal Barrier Coatings

    NASA Astrophysics Data System (ADS)

    Viswanathan, Vaishak

    2011-12-01

    Plasma sprayed ceramic materials contain an assortment of micro-structural defects, including pores, cracks, and interfaces arising from the droplet-based assemblage of the spray deposition technique. The defective architecture of the deposits introduces a novel "anelastic" response in the coatings, comprising their non-linear and hysteretic stress-strain relationship under mechanical loading. It has been established that this anelasticity can be attributed to the relative movement of the embedded defects under varying stresses. While the non-linear response of the coatings arises from the opening/closure of defects, hysteresis is produced by frictional sliding among defect surfaces. Recent studies have indicated that the anelastic behavior of coatings can be a unique descriptor of their mechanical behavior and can be related to the defect configuration. In this dissertation, a multi-variable study employing systematic processing strategies was conducted to augment the understanding of various aspects of the reported anelastic behavior. A bi-layer curvature measurement technique was adapted to measure the anelastic properties of plasma sprayed ceramics. The quantification of anelastic parameters was done using a non-linear model proposed by Nakamura et al. An error analysis was conducted on the technique to determine the available margins for both experimental and computational errors. The error analysis was extended to evaluate its sensitivity to different coating microstructures. For this purpose, three coatings with significantly different microstructures were fabricated via tuning of process parameters. The three coatings were then systematically subjected to different strain ranges in order to understand the origin and evolution of anelasticity in different microstructures. 
The last segment of this thesis examines the intricacies of the processing conditions and evaluates the correlation between them and the anelastic parameters.

  12. Lateral control system design for VTOL landing on a DD963 in high sea states. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Bodson, M.

    1982-01-01

    The problem of designing lateral control systems for the safe landing of VTOL aircraft on small ships is addressed. A ship model is derived. The issues of estimation and prediction of ship motions are discussed, using optimal linear estimation techniques. The roll motion is the most important of the lateral motions, and it is found that it can be predicted for up to 10 seconds under perfect conditions. The automatic landing of the VTOL aircraft is considered, and a lateral controller, defined as a ship motion tracker, is designed using optimal control techniques. The tradeoffs between the tracking errors and the control authority are obtained. The important couplings between the lateral motions and controls are demonstrated, and it is shown that the adverse couplings between the sway and the roll motion at the landing pad are significant constraints in the tracking of the lateral ship motions. The robustness of the control system, including the optimal estimator, is studied using singular value analysis. Through a robustification procedure, a robust control system is obtained, and the usefulness of singular values in defining stability margins that take into account general types of unstructured modelling errors is demonstrated. The minimal destabilizing perturbations indicated by the singular value analysis are interpreted and related to the multivariable Nyquist diagrams.

  13. Assessment and quantification of patient set-up errors in nasopharyngeal cancer patients and their biological and dosimetric impact in terms of generalized equivalent uniform dose (gEUD), tumour control probability (TCP) and normal tissue complication probability (NTCP).

    PubMed

    Boughalia, A; Marcie, S; Fellah, M; Chami, S; Mekki, F

    2015-06-01

    The aim of this study is to assess and quantify patients' set-up errors using an electronic portal imaging device and to evaluate their dosimetric and biological impact in terms of generalized equivalent uniform dose (gEUD) on predictive models, such as the tumour control probability (TCP) and the normal tissue complication probability (NTCP). 20 patients treated for nasopharyngeal cancer were enrolled in the radiotherapy-oncology department of HCA. Systematic and random errors were quantified. The dosimetric and biological impact of these set-up errors on the target volume and organ-at-risk (OAR) coverage was assessed using calculation of dose-volume histograms, gEUD, TCP and NTCP. For this purpose, an in-house software was developed and used. The standard deviations (1 SD) of the systematic and random set-up errors were calculated for the lateral and subclavicular fields and gave the following results: Σ = 0.63 (±0.42) mm and σ = 3.75 (±0.79) mm, respectively. Thus a planning organ at risk volume (PRV) margin of 3 mm was defined around the OARs, and a 5-mm margin was used around the clinical target volume. The gEUD, TCP and NTCP calculations obtained with and without set-up errors showed increased values for the tumour, where ΔgEUD (tumour) = 1.94% Gy (p = 0.00721) and ΔTCP = 2.03%. The toxicity of OARs was quantified using gEUD and NTCP. The values of ΔgEUD (OARs) vary from 0.78% to 5.95% in the case of the brainstem and the optic chiasm, respectively. The corresponding ΔNTCP varies from 0.15% to 0.53%, respectively. The quantification of set-up errors has a dosimetric and biological impact on the tumour and on the OARs. The developed in-house software using the gEUD, TCP and NTCP biological models has been successfully used in this study. It can also be used to optimize the treatment plans established for our patients. The gEUD, TCP and NTCP may be more suitable tools for assessing treatment plans before treating patients.
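    With the reported Σ = 0.63 mm and σ = 3.75 mm, a common population margin recipe (van Herk's M = 2.5Σ + 0.7σ; the abstract does not state which formula the authors used) is consistent with the 5-mm CTV margin:

```python
# Systematic (Sigma) and random (sigma) set-up error SDs from the abstract, mm
Sigma, sigma = 0.63, 3.75
# van Herk population recipe -- an assumed choice shown for illustration
margin = 2.5 * Sigma + 0.7 * sigma
# 2.5*0.63 + 0.7*3.75 = 1.575 + 2.625 = 4.2 mm, below the 5-mm margin used
```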

  14. A class of optimum digital phase locked loops for the DSN advanced receiver

    NASA Technical Reports Server (NTRS)

    Hurd, W. J.; Kumar, R.

    1985-01-01

    A class of optimum digital filters for digital phase locked loop of the deep space network advanced receiver is discussed. The filter minimizes a weighted combination of the variance of the random component of the phase error and the sum square of the deterministic dynamic component of phase error at the output of the numerically controlled oscillator (NCO). By varying the weighting coefficient over a suitable range of values, a wide set of filters are obtained such that, for any specified value of the equivalent loop-noise bandwidth, there corresponds a unique filter in this class. This filter thus has the property of having the best transient response over all possible filters of the same bandwidth and type. The optimum filters are also evaluated in terms of their gain margin for stability and their steady-state error performance.

  15. Ceramic inlays and partial ceramic crowns: influence of remaining cusp wall thickness on the marginal integrity and enamel crack formation in vitro.

    PubMed

    Krifka, Stephanie; Anthofer, Thomas; Fritzsch, Marcus; Hiller, Karl-Anton; Schmalz, Gottfried; Federlin, Marianne

    2009-01-01

    Little information is currently available about the critical cavity wall thickness and its influence upon 1) the marginal integrity of ceramic inlays (CI) and partial ceramic crowns (PCC) and 2) crack formation in dental tissues. This in vitro study of CI and PCC tested the effects of different remaining cusp wall thicknesses on marginal integrity and enamel crack formation. CI (n = 25) and PCC (n = 26) preparations were performed in extracted human molars. Functional cusps of CI and PCC were adjusted to a 2.5-mm thickness; for PCC, the functional cusps were reduced to a thickness of 2.0 mm. Non-functional cusps were adjusted to wall thicknesses of 1) 1.0 mm and 2) 2.0 mm. Ceramic restorations (Vita Mark II, Cerec3 System) were fabricated and adhesively luted to the cavities with Excite/Variolink II. The specimens were exposed to thermocycling and central mechanical loading (TCML: 5,000 thermal cycles between 5°C and 55°C at 30 seconds/cycle; 500,000 mechanical cycles at 72.5 N, 1.6 Hz). Marginal integrity was assessed by evaluating a) dye penetration (fuchsin) on multiple sections after TCML and b) quantitative margin analysis in the scanning electron microscope (SEM) before and after TCML. Ceramic- and tooth-luting agent interfaces (LA) were evaluated separately. Enamel cracks were documented under a reflective light microscope. The data were statistically analyzed with the Mann-Whitney U-test (alpha = 0.05) and the Error Rates Method (ERM). Crack formation was analyzed with the chi-square test (alpha = 0.05) and ERM. In general, the remaining cusp wall thickness, interface, cavity design and TCML had no statistically significant influence on marginal integrity for either CI or PCC (ERM). Single pairwise comparisons showed that the CI and PCC of Group 2 had a tendency towards less microleakage along the dentin/LA interface than Group 1. 
Cavity design and location had no statistically significant influence on crack formation, but the specimens with 1.0 mm of remaining wall thickness had statistically significantly more crack formation after TCML than the group with 2.0 mm of remaining cusp wall thickness for CI. The remaining cusp wall thickness of non-functional cusps of adhesively bonded restorations (especially for CI) should have a thickness of at least 2.0 mm to avoid cracks and marginal deficiency at the dentin/LA interface.

  16. A Research Roadmap for Computation-Based Human Reliability Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boring, Ronald; Mandelli, Diego; Joe, Jeffrey

    2015-08-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermal-hydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing the modeling uncertainty found in current plant risk models.

  17. The role of adjuvant treatment in early-stage oral cavity squamous cell carcinoma: An international collaborative study.

    PubMed

    Fridman, Eran; Na'ara, Shorook; Agarwal, Jaiprakash; Amit, Moran; Bachar, Gideon; Villaret, Andrea Bolzoni; Brandao, Jose; Cernea, Claudio R; Chaturvedi, Pankaj; Clark, Jonathan; Ebrahimi, Ardalan; Fliss, Dan M; Jonnalagadda, Sashikanth; Kohler, Hugo F; Kowalski, Luiz P; Kreppel, Matthias; Liao, Chun-Ta; Patel, Snehal G; Patel, Raj P; Robbins, K Thomas; Shah, Jatin P; Shpitzer, Thomas; Yen, Tzu-Chen; Zöller, Joachim E; Gil, Ziv

    2018-05-14

    Up to half of patients with oral cavity squamous cell carcinoma (OCSCC) have stage I to II disease. When adequate resection is attained, no further treatment is needed; however, re-resection or radiotherapy may be indicated for patients with positive or close margins. This multicenter study evaluated the outcomes and role of adjuvant treatment in patients with stage I to II OCSCC. Overall survival (OS), disease-specific survival, local recurrence-free survival, and disease-free survival rates were calculated with Kaplan-Meier analysis. Of 1257 patients with T1-2N0M0 disease, 33 (2.6%) had positive margins, and 205 (16.3%) had close margins. The 5-year OS rate was 80% for patients with clear margins, 52% for patients with close margins, and 63% for patients with positive margins (P < .0001). In a multivariate analysis, age, depth of invasion, and margins were independent predictors of outcome. Close margins were associated with a >2-fold increase in the risk of recurrence (P < .0001). The multivariate analysis revealed that adjuvant treatment significantly improved the outcomes of patients with close/positive margins (P = .002 to .03). Patients with stage I to II OCSCC and positive/close margins have poor long-term outcomes. For this population, adjuvant treatment may be associated with improved survival. Cancer 2018. © 2018 American Cancer Society.

  18. Robot-assisted radical prostatectomy: Multiparametric MR imaging-directed intraoperative frozen-section analysis to reduce the rate of positive surgical margins.

    PubMed

    Petralia, Giuseppe; Musi, Gennaro; Padhani, Anwar R; Summers, Paul; Renne, Giuseppe; Alessi, Sarah; Raimondi, Sara; Matei, Deliu V; Renne, Salvatore L; Jereczek-Fossa, Barbara A; De Cobelli, Ottavio; Bellomi, Massimo

    2015-02-01

    To investigate whether use of multiparametric magnetic resonance (MR) imaging-directed intraoperative frozen-section (IFS) analysis during nerve-sparing robot-assisted radical prostatectomy reduces the rate of positive surgical margins. This retrospective analysis of prospectively acquired data was approved by an institutional ethics committee, and the requirement for informed consent was waived. Data were reviewed for 134 patients who underwent preoperative multiparametric MR imaging (T2 weighted, diffusion weighted, and dynamic contrast-material enhanced) and nerve-sparing robot-assisted radical prostatectomy, during which IFS analysis was used, and secondary resections were performed when IFS results were positive for cancer. Control patients (n = 134) matched for age, prostate-specific antigen level, and stage were selected from a pool of 322 patients who underwent nerve-sparing robot-assisted radical prostatectomy without multiparametric MR imaging and IFS analysis. Rates of positive surgical margins were compared by means of the McNemar test, and a multivariate conditional logistic regression model was used to estimate the odds ratio of positive surgical margins for patients who underwent MR imaging and IFS analysis compared with control subjects. Eighteen patients who underwent MR imaging and IFS analysis underwent secondary resections, and 13 of these patients were found to have negative surgical margins at final pathologic examination. Positive surgical margins were found less frequently in the patients who underwent MR imaging and IFS analysis than in control patients (7.5% vs 18.7%, P = .01). When the differences in risk factors are taken into account, patients who underwent MR imaging and IFS had one-seventh the risk of having positive surgical margins relative to control patients (adjusted odds ratio: 0.15; 95% confidence interval: 0.04, 0.61). 
The significantly lower rate of positive surgical margins compared with that in control patients provides preliminary evidence of the positive clinical effect of multiparametric MR imaging-directed IFS analysis for patients who undergo prostatectomy. © RSNA, 2014.
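The matched-pairs comparison above relies on the McNemar test, which judges a paired 2x2 table from its discordant pairs only. A minimal sketch of the exact version, using hypothetical discordant counts (not the study's data), built on scipy's binomial test:

```python
from scipy.stats import binomtest

def mcnemar_exact(b, c):
    """Exact McNemar test from the discordant-pair counts of a paired 2x2 table.

    b: pairs where only the first member of the pair is positive
    c: pairs where only the second member is positive
    Under the null, the b discordant pairs follow Binomial(b + c, 0.5).
    """
    return binomtest(b, n=b + c, p=0.5, alternative="two-sided").pvalue

# Hypothetical counts: 10 pairs positive only in the MR/IFS arm,
# 25 pairs positive only in the control arm
p = mcnemar_exact(10, 25)
print(f"exact McNemar p = {p:.4f}")
```

With balanced discordant counts the test is, as expected, entirely non-significant; the asymmetry of the counts is all that drives the p-value.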

  19. CCD Camera Lens Interface for Real-Time Theodolite Alignment

    NASA Technical Reports Server (NTRS)

    Wake, Shane; Scott, V. Stanley, III

    2012-01-01

    Theodolites are a common instrument in the testing, alignment, and building of various systems ranging from a single optical component to an entire instrument. They provide a precise way to measure horizontal and vertical angles. They can be used to align multiple objects in a desired way at specific angles, or to reference a specific location or orientation of an object that has moved. Some systems allow only a small margin of error in the position of components; a theodolite can assist with accurately measuring and/or minimizing that error. The technology is an adapter that attaches a CCD camera with lens to the eyepiece of a Leica Wild T3000 Theodolite, enabling viewing on a connected monitor, and thus can be utilized with multiple theodolites simultaneously. This technology removes a substantial part of human error by relying on the CCD camera and monitors. It also allows image recording of the alignment, and therefore provides a quantitative means to measure such error.

  20. Does McRuer's Law Hold for Heart Rate Control via Biofeedback Display?

    NASA Technical Reports Server (NTRS)

    Courter, B. J.; Jex, H. R.

    1984-01-01

    Some persons can control their pulse rate with the aid of a biofeedback display. If the biofeedback display is modified to show the error between a command pulse-rate and the measured rate, a compensatory (error-correcting) heart-rate tracking control loop can be created. The dynamic response characteristics of this control loop when subjected to step and quasi-random disturbances were measured. The control loop includes a beat-to-beat cardiotachometer differenced with a forcing function from a quasi-random input generator; the resulting pulse-rate error is displayed as feedback. The subject acts to null the displayed pulse-rate error, thereby closing a compensatory control loop. McRuer's Law should hold for this case. A few subjects already skilled in voluntary pulse-rate control were tested for heart-rate control response. Control-law properties are derived, such as crossover frequency, stability margins, and closed-loop bandwidth. These are evaluated for a range of forcing functions and for step as well as random disturbances.

  1. Analytical quality goals derived from the total deviation from patients' homeostatic set points, with a margin for analytical errors.

    PubMed

    Bolann, B J; Asberg, A

    2004-01-01

    The deviation of test results from patients' homeostatic set points in steady-state conditions may complicate interpretation of the results and their comparison with clinical decision limits. In this study, the total deviation from the homeostatic set point is defined as the maximum absolute deviation for 95% of measurements, and we present analytical quality requirements that prevent analytical error from increasing this deviation to more than about 12% above the value caused by biology alone. These quality requirements are: 1) the stable systematic error should be approximately 0, and 2) a systematic error that will be detected by the control program with 90% probability should not be larger than half the value of the combined analytical and intra-individual standard deviation. As a result, when the most common control rules are used, the analytical standard deviation may be up to 0.15 times the intra-individual standard deviation. Analytical improvements beyond these requirements have little impact on the interpretability of measurement results.
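As an illustrative numeric check (not the authors' derivation), the imprecision component of the stated rule can be quantified by combining the analytical and intra-individual standard deviations in quadrature. Note this sketch covers only random imprecision; the rest of the abstract's ~12% total allowance comes from the tolerated undetected systematic error, which is not modelled here:

```python
import math

# Rule from the abstract: the analytical SD may be up to 0.15 times the
# intra-individual (biological) SD.
s_intra = 1.0                                # intra-individual SD (arbitrary units)
s_anal = 0.15 * s_intra                      # maximum allowed analytical SD
s_comb = math.sqrt(s_anal**2 + s_intra**2)   # combined SD, added in quadrature

# Relative widening of the 95% deviation interval (1.96 SD scales linearly)
inflation = s_comb / s_intra - 1.0
print(f"combined SD inflation from imprecision alone: {inflation:.1%}")
```

The quadrature sum shows why such a seemingly generous imprecision allowance is cheap: at the 0.15 limit, random analytical error widens the 95% deviation interval by only about 1%.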

  2. Evaluation of the Marginal Fit of CAD/CAM Crowns Fabricated Using Two Different Chairside CAD/CAM Systems on Preparations of Varying Quality.

    PubMed

    Renne, Walter; Wolf, Bethany; Kessler, Raymond; McPherson, Karen; Mennito, Anthony S

    2015-01-01

    This study evaluated the marginal gap of crowns fabricated using two new chairside computer-aided design/computer-aided manufacturing (CAD/CAM) systems on preparations completed by clinicians with varying levels of expertise, to identify whether common preparation errors affect marginal fit. The null hypothesis is that there is no difference in the mean marginal gaps of restorations of varying qualities and no difference in the mean marginal gap size between restorations fabricated using the PlanScan (D4D, Richardson, TX, USA) and the CEREC Omnicam (Sirona, Bensheim, Germany). The fit of 80 lithium disilicate crowns fabricated with the E4D PlanScan or CEREC Omnicam systems on preparations of varying quality was examined using the replica technique. These same preparations were then visually examined against common criteria for anterior all-ceramic restorations and placed in one of four categories: excellent, good, fair, and poor. Linear mixed modeling was used to evaluate associations between marginal gap, tooth preparation rating, and fabrication machine. The fit was not significantly different between the two systems across all qualities of preparation. The average fit was 104 μm for poor-quality preparations, 87.6 μm for fair preparations, 67.2 μm for good preparations, and 36.6 μm for excellent preparations. The null hypothesis is rejected: preparation quality has a significant impact on marginal gap regardless of which system is used, whereas no significant difference was found when comparing the systems to each other. Within the limitations of this in vitro study, it can be concluded that crown preparation quality has a significant effect on the marginal gap of the restoration when the clinician uses either the CEREC Omnicam or the E4D PlanScan. © 2015 Wiley Periodicals, Inc.

  3. Intrafraction Variability and Deformation Quantification in the Breast

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Glide-Hurst, Carri K., E-mail: churst2@hfhs.org; Shah, Mira M.; Price, Ryan G.

    2015-03-01

    Purpose: To evaluate intrafraction variability and deformation of the lumpectomy cavity (LC), breast, and nearby organs. Methods and Materials: Sixteen left-sided postlumpectomy and 1 bilateral breast cancer cases underwent free-breathing CT (FBCT) and 10-phase 4-dimensional CT (4DCT). Deformable image registration was used for deformation analysis and contour propagation of breast, heart, lungs, and LC between end-exhale and end-inhale 4DCT phases. Respiration-induced motion was calculated via centroid analysis. Two planning target volumes (PTVs) were compared: PTV_FBCT, from the FBCT volume with an isotropic 10 mm expansion (5 mm excursion and 5 mm setup error), and PTV_4DCT, generated from the union of 4DCT contours with an isotropic 5 mm margin for setup error. Volume and geometry were evaluated via percent difference and bounding box analysis, respectively. Deformation correlations between breast/cavity, breast/lung, and breast/heart were evaluated. Associations were tested between cavity deformation and proximity to chest wall and breast surface. Results: Population-based 3-dimensional vector excursions were 2.5 ± 1.0 mm (range, 0.8-3.8 mm) for the cavity and 2.0 ± 0.8 mm (range, 0.7-3.0 mm) for the ipsilateral breast. Cavity excursion was predominantly in the anterior and superior directions (1.0 ± 0.8 mm and −1.8 ± 1.2 mm, respectively). Similarly, for all cases, LCs and ipsilateral breasts yielded median deformation values in the superior direction. For 14 of 17 patients, the LC and breast interquartile ranges tended toward the anterior direction. PTV_FBCT was 51.5% ± 10.8% larger (P<.01) than PTV_4DCT. Bounding box analysis revealed that PTV_FBCT was 9.8 ± 1.2 (lateral), 9.0 ± 2.2 (anterior–posterior), and 3.9 ± 1.8 (superior–inferior) mm larger than PTV_4DCT. Significant associations between breast and cavity deformation were found for 6 of 9 axes.
No dependency was found between cavity deformation and proximity to chest wall or breast surface. Conclusions: Lumpectomy cavity and breast deformation and motion demonstrated large variability. A PTV_4DCT approach showed value in patient-specific margins, particularly if robust interfraction setup analysis can be performed.

  4. H. Pylori as a predictor of marginal ulceration: A nationwide analysis.

    PubMed

    Schulman, Allison R; Abougergi, Marwan S; Thompson, Christopher C

    2017-03-01

    Helicobacter pylori has been implicated as a risk factor for development of marginal ulceration following gastric bypass, although studies have been small and yielded conflicting results. This study sought to determine the relationship between H. pylori infection and development of marginal ulceration following bariatric surgery in a nationwide analysis. This was a retrospective cohort study using the 2012 Nationwide Inpatient Sample (NIS) database. Discharges with an ICD-9-CM code indicating marginal ulceration and a secondary ICD-9-CM code for bariatric surgery were included. The primary outcome was incidence of marginal ulceration. A stepwise forward selection model was used to build the multivariate logistic regression model based on known risk factors. A P value of less than 0.05 was considered significant. There were 253,765 patients who met inclusion criteria. Prevalence of marginal ulceration was 3.90%. Of those patients found to have marginal ulceration, 31.20% were H. pylori-positive. Final multivariate regression analysis revealed that H. pylori was the strongest independent predictor of marginal ulceration. H. pylori is an independent predictor of marginal ulceration using a large national database. Preoperative testing for and eradication of H. pylori prior to bariatric surgery may be an important preventive measure to reduce the incidence of ulcer development. © 2017 The Obesity Society.

  5. Physician response to financial incentives when choosing drugs to treat breast cancer.

    PubMed

    Epstein, Andrew J; Johnson, Scott J

    2012-12-01

    This paper considers physician agency in choosing drugs to treat metastatic breast cancer, a clinical setting in which patients have few protections from physicians' rent seeking. Physicians have explicit financial incentives attached to each potential drug treatment, with profit margins ranging more than a hundredfold. SEER-Medicare claims and Medispan pricing data were formed into a panel of 4,503 patients who were diagnosed with metastatic breast cancer and treated with anti-cancer drugs from 1992 to 2002. We analyzed the effects of product attributes, including profit margin, randomized controlled trial citations, FDA label, generic status, and other covariates on therapy choice. Instruments and drug fixed effects were used to control for omitted variables and possible measurement error associated with margin. We find that increasing physician margin by 10% yields between an 11% and a 177% increase in the likelihood of drug choice on average across drugs. Physicians were more likely to use drugs with which they had experience, that had more citations, and that were FDA-approved to treat breast cancer. Oncologists are susceptible to financial incentives when choosing drugs, though other factors play a large role in their choice of drug.

  6. Wide-area mapping of small-scale features in agricultural landscapes using airborne remote sensing

    PubMed Central

    O’Connell, Jerome; Bradter, Ute; Benton, Tim G.

    2015-01-01

    Natural and semi-natural habitats in agricultural landscapes are likely to come under increasing pressure with the global population set to exceed 9 billion by 2050. These non-cropped habitats are primarily made up of trees, hedgerows and grassy margins, and their amount, quality and spatial configuration can have strong implications for the delivery and sustainability of various ecosystem services. In this study high spatial resolution (0.5 m) colour infrared aerial photography (CIR) was used in object based image analysis for the classification of non-cropped habitat in a 10,029 ha area of southeast England. Three classification scenarios were devised using 4- and 9-class schemes. The machine learning algorithm Random Forest (RF) was used to reduce the number of variables used for each classification scenario by 25.5% ± 2.7%. Proportion of votes from the 4-class hierarchy was made available to the 9-class scenarios and was the highest-ranked variable in all cases. This approach allowed misclassified parent objects to be correctly classified at a lower level. A single object hierarchy with 4-class proportion of votes produced the best result (kappa 0.909). Validation of the optimum training sample size in RF showed no significant difference between mean internal out-of-bag error and external validation. As an example of the utility of these data, we assessed habitat suitability for a declining farmland bird, the yellowhammer (Emberiza citrinella), which requires hedgerows associated with grassy margins. We found that ∼22% of hedgerows were within 200 m of margins with an area >183.31 m2. The results from this analysis can form a key information source at the environmental and policy level in landscape optimisation for food production and ecosystem service sustainability. PMID:26664131
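The RF variable-reduction and out-of-bag validation steps described above can be sketched with scikit-learn; the features here are synthetic stand-ins for the study's image-object variables, and the 25% reduction fraction is taken from the abstract:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Synthetic stand-in for the CIR image-object features (not the study's data)
X, y = make_classification(n_samples=500, n_features=20, n_informative=6,
                           random_state=0)

rf = RandomForestClassifier(n_estimators=200, oob_score=True, random_state=0)
rf.fit(X, y)
print(f"internal out-of-bag error: {1 - rf.oob_score_:.3f}")

# Drop the least important ~25% of variables, mirroring the reduction step
order = np.argsort(rf.feature_importances_)[::-1]
keep = order[: int(round(0.75 * X.shape[1]))]
rf_reduced = RandomForestClassifier(n_estimators=200, oob_score=True,
                                    random_state=0).fit(X[:, keep], y)
print(f"OOB error with {len(keep)} of {X.shape[1]} variables: "
      f"{1 - rf_reduced.oob_score_:.3f}")
```

The out-of-bag error plays the role of the study's internal validation: each tree is scored only on the samples it never saw during bootstrap training, so no separate hold-out set is needed for the comparison.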

  7. Wafer hot spot identification through advanced photomask characterization techniques

    NASA Astrophysics Data System (ADS)

    Choi, Yohan; Green, Michael; McMurran, Jeff; Ham, Young; Lin, Howard; Lan, Andy; Yang, Richer; Lung, Mike

    2016-10-01

    As device manufacturers progress through advanced technology nodes, limitations in standard 1-dimensional (1D) mask Critical Dimension (CD) metrics are becoming apparent. Historically, 1D metrics such as Mean to Target (MTT) and CD Uniformity (CDU) have been adequate for end users to evaluate and predict the mask impact on the wafer process. However, the wafer lithographer's process margin is shrinking at advanced nodes to a point that the classical mask CD metrics are no longer adequate to gauge the mask contribution to wafer process error. For example, wafer CDU error at advanced nodes is impacted by mask factors such as 3-dimensional (3D) effects and mask pattern fidelity on subresolution assist features (SRAFs) used in Optical Proximity Correction (OPC) models of ever-increasing complexity. These items are not quantifiable with the 1D metrology techniques of today. Likewise, the mask maker needs advanced characterization methods in order to optimize the mask process to meet the wafer lithographer's needs. These advanced characterization metrics are what is needed to harmonize mask and wafer processes for enhanced wafer hot spot analysis. In this paper, we study advanced mask pattern characterization techniques and their correlation with modeled wafer performance.

  8. Cultural background shapes spatial reference frame proclivity

    PubMed Central

    Goeke, Caspar; Kornpetpanee, Suchada; Köster, Moritz; Fernández-Revelles, Andrés B.; Gramann, Klaus; König, Peter

    2015-01-01

    Spatial navigation is an essential human skill that is influenced by several factors. The present study investigates how gender, age, and cultural background account for differences in reference frame proclivity and performance in a virtual navigation task. Using an online navigation study, we recorded reaction times, error rates (confusion of turning axis), and reference frame proclivity (egocentric vs. allocentric reference frame) of 1823 participants. Reaction times significantly varied with gender and age, but were only marginally influenced by the cultural background of participants. Error rates were in line with these results and exhibited a significant influence of gender and culture, but not age. Participants’ cultural background significantly influenced reference frame selection; the majority of North-Americans preferred an allocentric strategy, while Latin-Americans preferred an egocentric navigation strategy. European and Asian groups were in between these two extremes. Neither the factor of age nor the factor of gender had a direct impact on participants’ navigation strategies. The strong effects of cultural background on navigation strategies without the influence of gender or age underlines the importance of socialized spatial cognitive processes and argues for socio-economic analysis in studies investigating human navigation. PMID:26073656

  9. Tests of Independence in Contingency Tables with Small Samples: A Comparison of Statistical Power.

    ERIC Educational Resources Information Center

    Parshall, Cynthia G.; Kromrey, Jeffrey D.

    1996-01-01

    Power and Type I error rates were estimated for contingency tables with small sample sizes for the following four types of tests: (1) Pearson's chi-square; (2) chi-square with Yates's continuity correction; (3) the likelihood ratio test; and (4) Fisher's Exact Test. Various marginal distributions, sample sizes, and effect sizes were examined. (SLD)
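All four tests compared in this record are available in scipy; a minimal sketch on a hypothetical small-sample 2x2 table (not data from the study):

```python
from scipy.stats import chi2_contingency, fisher_exact

table = [[8, 2],   # hypothetical small-sample 2x2 contingency table
         [3, 7]]

chi2_p = chi2_contingency(table, correction=False)[1]   # Pearson's chi-square
yates_p = chi2_contingency(table, correction=True)[1]   # with Yates's correction
lr_p = chi2_contingency(table, correction=False,
                        lambda_="log-likelihood")[1]    # likelihood ratio (G) test
fisher_p = fisher_exact(table)[1]                       # Fisher's Exact Test

for name, p in [("Pearson", chi2_p), ("Yates", yates_p),
                ("likelihood ratio", lr_p), ("Fisher exact", fisher_p)]:
    print(f"{name:>16}: p = {p:.4f}")
```

Yates's continuity correction shrinks the chi-square statistic, so its p-value is never smaller than Pearson's; that conservatism at small sample sizes is exactly the power trade-off the record examines.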

  10. A Computer Based Cognitive Simulation of Cataract Surgery

    DTIC Science & Technology

    2011-12-01

    The Virtual Mentor Cataract Surgery Trainer is a computer-based, cognitive simulation of phacoemulsification cataract surgery (phaco), a difficult procedure to learn with little margin for error. Expert-level decision prompts in the simulation include assessing for zonular absence and notable lenticular astigmatism, and deciding how and when to use a capsular tension ring.

  11. SU-F-J-130: Margin Determination for Hypofractionated Partial Breast Irradiation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Geady, C; Keller, B; Hahn, E

    2016-06-15

    Purpose: To determine the Planning Target Volume (PTV) margin for Hypofractionated Partial Breast Irradiation (HPBI) using the van Herk formalism (M = 2.5Σ + 0.7σ). HPBI is a novel technique intended to provide local control in breast cancer patients not eligible for surgical resection, using 40 Gy in 5 fractions prescribed to the gross disease. Methods: Setup uncertainties were quantified through retrospective analysis of cone-beam computed tomography (CBCT) data sets, collected prior to (prefraction) and after (postfraction) treatment delivery. During simulation and treatment, patients were immobilized using a wing board and an evacuated bag. Prefraction CBCT was rigidly registered to planning 4-dimensional computed tomography (4DCT) using the chest wall and tumor, and translational couch shifts were applied as needed. This clinical workflow was faithfully reproduced in Pinnacle (Philips Medical Systems) to yield residual setup and intrafractional error through translational shifts and rigid registrations (ribs and sternum) of prefraction CBCT to 4DCT and postfraction CBCT to prefraction CBCT, respectively. All ten patients included in this investigation were medically inoperable; the median age was 84 (range, 52–100) years. Results: Systematic (and random) setup uncertainties (in mm) detected for the left-right, craniocaudal and anteroposterior directions were 0.4 (1.5), 0.8 (1.8) and 0.4 (1.0); net uncertainty was determined to be 0.7 (1.5). Rotations >2° in any axis occurred on 8/72 (11.1%) registrations. Conclusion: Preliminary results suggest a non-uniform setup margin (in mm) of 2.2, 3.3 and 1.7 for the left-right, craniocaudal and anteroposterior directions is required for HPBI, given its immobilization techniques and online setup verification protocol. This investigation is ongoing, though published results from similar studies are consistent with the above findings.
Determination of margins in breast radiotherapy is a paradigm shift, but a necessary step in moving towards hypofractionated regimens, which may ultimately redefine the standard of care for this select patient population.
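The van Herk recipe quoted in the Purpose section is a one-line formula; applying it to the systematic and random uncertainties reported above reproduces the stated margins to within rounding of the underlying estimates:

```python
def van_herk_margin(sigma_sys, sigma_rand):
    """PTV margin (mm) from the van Herk formalism, M = 2.5*Sigma + 0.7*sigma."""
    return 2.5 * sigma_sys + 0.7 * sigma_rand

# Systematic (random) setup uncertainties reported in the abstract, in mm
for axis, s, r in [("left-right", 0.4, 1.5),
                   ("craniocaudal", 0.8, 1.8),
                   ("anteroposterior", 0.4, 1.0)]:
    print(f"{axis:>15}: M = {van_herk_margin(s, r):.1f} mm")
```

The systematic component dominates (coefficient 2.5 vs 0.7), which is why the craniocaudal direction, with twice the systematic uncertainty, requires the largest margin.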

  12. Assessment of three-dimensional setup errors in image-guided pelvic radiotherapy for uterine and cervical cancer using kilovoltage cone-beam computed tomography and its effect on planning target volume margins.

    PubMed

    Patni, Nidhi; Burela, Nagarjuna; Pasricha, Rajesh; Goyal, Jaishree; Soni, Tej Prakash; Kumar, T Senthil; Natarajan, T

    2017-01-01

    To achieve the best possible therapeutic ratio using high-precision techniques (image-guided radiation therapy/volumetric modulated arc therapy [IGRT/VMAT]) of external beam radiation therapy in cases of carcinoma cervix using kilovoltage cone-beam computed tomography (kV-CBCT). One hundred and five patients with gynecological malignancies who were treated with high-precision techniques (IGRT/VMAT) were included in the study. CBCT was done once a week for intensity-modulated radiation therapy and daily in IGRT/VMAT. These images were registered with the planning CT scan images, and translational errors were applied and recorded. In all, 2078 CBCT images were studied. The margins of the planning target volume were calculated from the variations in the setup. The setup variation was 5.8, 10.3, and 5.6 mm in the anteroposterior, superoinferior, and mediolateral directions, respectively. This allowed adequate dose delivery to the clinical target volume and the sparing of organs at risk. Daily kV-CBCT is a satisfactory method of accurate patient positioning in treating gynecological cancers with high-precision techniques, and resulted in avoiding geographic miss.

  13. The role of three-dimensional high-definition laparoscopic surgery for gynaecology.

    PubMed

    Usta, Taner A; Gundogdu, Elif C

    2015-08-01

    This article reviews the potential benefits and disadvantages of new three-dimensional (3D) high-definition laparoscopic surgery for gynaecology. With the new-generation 3D high-definition laparoscopic vision systems (LVSs), operation time and learning period are reduced and the procedural error margin is decreased. New-generation 3D high-definition LVSs reduce operation time for both novice and experienced surgeons. The headache, eye fatigue and nausea reported with first-generation systems are no more frequent than with two-dimensional (2D) LVSs. Negative aspects that need improvement include the system's higher cost, the obligation to wear glasses, and the big, heavy camera probe of some devices. The loss of depth perception in tissues with 2D LVSs, and its associated adverse events, can be eliminated with 3D high-definition LVSs. By virtue of a faster learning curve, shorter operation time, a reduced error margin, and the absence of the side effects reported by surgeons with first-generation systems, 3D LVSs appear to be a strong competitor to classical laparoscopic imaging systems. Thanks to technological advancements, the use of lighter, smaller cameras and glasses-free monitors is in the near future.

  14. Animation and radiobiological analysis of 3D motion in conformal radiotherapy.

    PubMed

    MacKay, R I; Graham, P A; Moore, C J; Logue, J P; Sharrock, P J

    1999-07-01

    To allow treatment plans to be evaluated against the range of expected organ motion and setup error anticipated during treatment, planning tools have been developed that allow concurrent animation and radiobiological analysis of three-dimensional (3D) target and organ motion in conformal radiotherapy. Surfaces fitted to structures outlined on CT studies are projected onto pre-treatment images or onto megavoltage images collected during the patient treatment. Visual simulation of tumour and normal tissue movement is then performed by applying three-dimensional affine transformations to the selected surface. Concurrent registration of the surface motion with the 3D dose distribution allows calculation of the change in dose to the volume. Realistic patterns of motion can be applied to the structure to simulate inter-fraction motion and setup error. The biologically effective dose for the structure is calculated for each fraction as the surface moves over the course of the treatment and is used to calculate the normal tissue complication probability (NTCP) or tumour control probability (TCP) for the moving structure. The tool has been used to evaluate conformal therapy plans against setup measurements recorded during patient treatments. NTCP and TCP were calculated for a patient whose setup had been corrected after systematic deviations from plan geometry were measured during treatment; the effect of not making the correction was also assessed. TCP for the moving tumour was reduced if inadequate margins were set for the treatment. Modelling suggests that smaller margins could have been set for the setup corrected during the course of the treatment. The NTCP for the rectum was also higher for the uncorrected setup, due to more rectal tissue falling in the high-dose region. This approach provides a simple way for clinical users to utilise information incrementally collected throughout the whole of a patient's treatment.
In particular, it is possible to test the robustness of a patient plan against a range of possible motion patterns. The methods described represent a move from the inspection of static pre-treatment plans to a review of the dynamic treatment.
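The abstract does not give its dose-response formulas, but the per-fraction biologically effective dose it mentions is conventionally computed with the standard linear-quadratic model, BED = n·d·(1 + d/(α/β)). A hedged sketch of that standard formula (not the authors' implementation):

```python
def bed(n_fractions, dose_per_fraction, alpha_beta):
    """Biologically effective dose (Gy) under the standard linear-quadratic model.

    alpha_beta: the tissue's alpha/beta ratio in Gy (low for late-responding
    normal tissues, higher for most tumours).
    """
    d = dose_per_fraction
    return n_fractions * d * (1.0 + d / alpha_beta)

# Conventional 30 x 2 Gy course, evaluated for a late-responding tissue
# with an assumed alpha/beta of 3 Gy
print(f"BED = {bed(30, 2.0, 3.0):.1f} Gy")  # 100.0 Gy
```

Because BED is summed fraction by fraction, a tool like the one described can accumulate it for a moving structure by evaluating the dose each surface position actually receives per fraction.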

  15. A Singular Perturbation Approach for Time-Domain Assessment of Phase Margin

    NASA Technical Reports Server (NTRS)

    Zhu, J. Jim; Yang, Xiaojing; Hodel, A Scottedward

    2010-01-01

    This paper considers the problem of time-domain assessment of the Phase Margin (PM) of a Single Input Single Output (SISO) Linear Time-Invariant (LTI) system using a singular perturbation approach, where a SISO LTI fast loop system, whose phase lag increases monotonically with frequency, is introduced into the loop as a singular perturbation with a singular perturbation (time-scale separation) parameter Epsilon. First, a bijective relationship between the Singular Perturbation Margin (SPM) max and the PM of the nominal (slow) system is established with an approximation error on the order of Epsilon(exp 2). In proving this result, relationships between the singular perturbation parameter Epsilon, PM of the perturbed system, PM and SPM of the nominal system, and the (monotonically increasing) phase of the fast system are also revealed. These results make it possible to assess the PM of the nominal system in the time-domain for SISO LTI systems using the SPM with a standardized testing system called "PM-gauge," as demonstrated by examples. PM is a widely used stability margin for LTI control system design and certification. Unfortunately, it is not applicable to Linear Time-Varying (LTV) and Nonlinear Time-Varying (NLTV) systems. The approach developed here can be used to establish a theoretical as well as practical metric of stability margin for LTV and NLTV systems using a standardized SPM that is backward compatible with PM.
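The phase margin being assessed is defined at the gain-crossover frequency of the open-loop response; a minimal numerical sketch for the textbook loop L(s) = 1/(s(s+1)), which is an illustrative example and not a system from the paper:

```python
import numpy as np

def phase_margin(L):
    """Phase margin (deg) of an open-loop frequency response sampled on a grid.

    Finds the gain-crossover sample, where |L(jw)| is closest to 1, and
    returns 180 degrees plus the open-loop phase there.
    """
    i = np.argmin(np.abs(np.abs(L) - 1.0))
    return 180.0 + np.degrees(np.angle(L[i]))

w = np.linspace(0.01, 10.0, 100_000)        # frequency grid (rad/s)
L = 1.0 / (1j * w * (1j * w + 1.0))         # classic L(s) = 1 / (s(s+1))
pm = phase_margin(L)
print(f"PM ~ {pm:.1f} deg")                 # ~51.8 deg for this loop
```

This frequency-domain computation is exactly what is unavailable for time-varying systems, which is the motivation for the paper's time-domain Singular Perturbation Margin surrogate.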

  16. Influences of microgap and micromotion of implant-abutment interface on marginal bone loss around implant neck.

    PubMed

    Liu, Yang; Wang, Jiawei

    2017-11-01

    To review the influences and clinical implications of microgap and micromotion at the implant-abutment interface on marginal bone loss around the neck of the implant. The literature was searched using the following keywords: implant-abutment interface/implant-abutment connection/implant-abutment conjunction, microgap, micromotion/micromovement, microleakage, and current control methods available. The papers were then screened by title, abstract, and full text. A total of 83 studies were included in the literature review. Two-piece implant systems are widely used in clinics. However, production error and masticatory load result in the presence of microgap and micromotion between the implant and the abutment, which directly or indirectly cause microleakage and mechanical damage. Consequently, the degrees of microgap and micromotion further increase, and marginal bone absorption finally occurs. We summarize the influences of microgap and micromotion at the implant-abutment interface on marginal bone loss around the neck of the implant. We also recommend some feasible methods to reduce their effect. Clinicians and patients should pay more attention to the mechanisms as well as the control methods of microgap and micromotion. To reduce the corresponding detriment to the implant marginal bone, suitable Morse taper or hybrid connection implants and platform-switching abutments should be selected, as well as other potential methods. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. Application of fermionic marginal constraints to hybrid quantum algorithms

    NASA Astrophysics Data System (ADS)

    Rubin, Nicholas C.; Babbush, Ryan; McClean, Jarrod

    2018-05-01

    Many quantum algorithms, including recently proposed hybrid classical/quantum algorithms, make use of restricted tomography of the quantum state that measures the reduced density matrices, or marginals, of the full state. The most straightforward approach to this algorithmic step estimates each component of the marginal independently without making use of the algebraic and geometric structure of the marginals. Within the field of quantum chemistry, this structure is termed the fermionic n-representability conditions, and is supported by a vast amount of literature on both theoretical and practical results related to their approximations. In this work, we introduce these conditions in the language of quantum computation, and utilize them to develop several techniques to accelerate and improve practical applications for quantum chemistry on quantum computers. As a general result, we demonstrate how these marginals concentrate to diagonal quantities when measured on random quantum states. We also show that one can use fermionic n-representability conditions to reduce the total number of measurements required by more than an order of magnitude for medium-sized systems in chemistry. As a practical demonstration, we simulate an efficient restoration of the physicality of energy curves for the dilation of a four-qubit diatomic hydrogen system in the presence of three distinct one-qubit error channels, providing evidence that these techniques are useful for pre-fault-tolerant quantum chemistry experiments.

  18. Will kinematic Sunyaev-Zel'dovich measurements enhance the science return from galaxy redshift surveys?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sugiyama, Naonori S.; Okumura, Teppei; Spergel, David N., E-mail: nao.s.sugiyama@gmail.com, E-mail: tokumura@asiaa.sinica.edu.tw, E-mail: dns@astro.princeton.edu

    2017-01-01

    Yes. Future CMB experiments such as Advanced ACTPol and CMB-S4 should achieve measurements with S/N of > 0.1 for the typical host halo of galaxies in redshift surveys. These measurements will provide complementary measurements of the growth rate of large scale structure f and the expansion rate of the Universe H to galaxy clustering measurements. This paper emphasizes that there is significant information in the anisotropy of the relative pairwise kSZ measurements. We expand the relative pairwise kSZ power spectrum in Legendre polynomials and consider up to its octopole. Assuming that the noise in the filtered maps is uncorrelated between the positions of galaxies in the survey, we derive a simple analytic form for the power spectrum covariance of the relative pairwise kSZ temperature in redshift space. While many previous studies have assumed optimistically that the optical depth of the galaxies τ_T in the survey is known, we marginalize over τ_T to compute constraints on the growth rate f and the expansion rate H. For realistic survey parameters, we find that combining kSZ and galaxy redshift survey data reduces the marginalized 1-σ errors on H and f to ∼50-70% compared to the galaxy-only analysis.

  19. Will kinematic Sunyaev-Zel'dovich measurements enhance the science return from galaxy redshift surveys?

    NASA Astrophysics Data System (ADS)

    Sugiyama, Naonori S.; Okumura, Teppei; Spergel, David N.

    2017-01-01

    Yes. Future CMB experiments such as Advanced ACTPol and CMB-S4 should achieve measurements with S/N of > 0.1 for the typical host halo of galaxies in redshift surveys. These measurements will provide complementary measurements of the growth rate of large scale structure f and the expansion rate of the Universe H to galaxy clustering measurements. This paper emphasizes that there is significant information in the anisotropy of the relative pairwise kSZ measurements. We expand the relative pairwise kSZ power spectrum in Legendre polynomials and consider up to its octopole. Assuming that the noise in the filtered maps is uncorrelated between the positions of galaxies in the survey, we derive a simple analytic form for the power spectrum covariance of the relative pairwise kSZ temperature in redshift space. While many previous studies have assumed optimistically that the optical depth of the galaxies τT in the survey is known, we marginalize over τT, to compute constraints on the growth rate f and the expansion rate H. For realistic survey parameters, we find that combining kSZ and galaxy redshift survey data reduces the marginalized 1-σ errors on H and f to ~50-70% compared to the galaxy-only analysis.
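The Legendre-multipole expansion this record applies to the pairwise kSZ power spectrum can be illustrated with a short numerical sketch. The anisotropic function P(mu) below is a toy polynomial, not the kSZ spectrum; only the projection formula P_l = (2l+1)/2 * integral of P(mu) L_l(mu) over mu in [-1, 1] is from the standard definition.

```python
# Sketch: Legendre multipoles of an anisotropic spectrum P(mu),
# up to the octopole (l = 0..3), via Bonnet's recurrence and a
# midpoint-rule integral.
def legendre(l, x):
    if l == 0: return 1.0
    if l == 1: return x
    p0, p1 = 1.0, x
    for n in range(1, l):
        p0, p1 = p1, ((2 * n + 1) * x * p1 - n * p0) / (n + 1)
    return p1

def multipole(P, l, n=20000):
    h = 2.0 / n   # midpoint rule on mu in [-1, 1]
    return (2 * l + 1) / 2.0 * sum(
        P(-1 + (i + 0.5) * h) * legendre(l, -1 + (i + 0.5) * h) * h
        for i in range(n))

P = lambda mu: 1.0 + 0.5 * mu + 0.3 * mu ** 2   # toy anisotropy
```

For this toy P(mu) the monopole, dipole, and quadrupole work out analytically to 1.1, 0.5, and 0.2, with a vanishing octopole, which makes the sketch easy to check by hand.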

  20. High-Density Near-Field Optical Disc Recording

    NASA Astrophysics Data System (ADS)

    Shinoda, Masataka; Saito, Kimihiro; Ishimoto, Tsutomu; Kondo, Takao; Nakaoki, Ariyoshi; Ide, Naoki; Furuki, Motohiro; Takeda, Minoru; Akiyama, Yuji; Shimouma, Takashi; Yamamoto, Masanobu

    2005-05-01

    We developed a high-density near-field optical recording disc system using a solid immersion lens. The near-field optical pick-up consists of a solid immersion lens with a numerical aperture of 1.84. The laser wavelength for recording is 405 nm. In order to realize the near-field optical recording disc, we used a phase-change recording medium and a molded polycarbonate substrate. A clear eye pattern of 112 GB capacity with 160 nm track pitch and 50 nm bit length was observed. The equivalent areal density is 80.6 Gbit/in^2. The bottom bit error rate of a three-track write was 4.5 × 10^-5. The readout power margin and the recording power margin were ± 30.4% and ± 11.2%, respectively.
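The quoted areal density follows directly from the track pitch and bit length; the arithmetic can be checked in a few lines (values from the abstract, conversion constant exact).

```python
# Areal density implied by a 160 nm track pitch and 50 nm bit length:
# one bit occupies pitch * length of disc area.
TRACK_PITCH_M = 160e-9   # 160 nm
BIT_LENGTH_M = 50e-9     # 50 nm
INCH_M = 0.0254          # metres per inch (exact)

bits_per_m2 = 1.0 / (TRACK_PITCH_M * BIT_LENGTH_M)
bits_per_in2 = bits_per_m2 * INCH_M ** 2
print(f"{bits_per_in2 / 1e9:.1f} Gbit/in^2")  # → 80.6 Gbit/in^2
```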

  1. Estimation of Aerodynamic Stability Derivatives for Space Launch System and Impact on Stability Margins

    NASA Technical Reports Server (NTRS)

    Pei, Jing; Wall, John

    2013-01-01

    This paper describes the techniques involved in determining the aerodynamic stability derivatives for the frequency domain analysis of the Space Launch System (SLS) vehicle. Generally for launch vehicles, determination of the derivatives is fairly straightforward since the aerodynamic data is usually linear through a moderate range of angle of attack. However, if the wind tunnel data lacks proper corrections, then nonlinearities and asymmetric behavior may appear in the aerodynamic database coefficients. In this case, computing the derivatives becomes a non-trivial task. Errors in computing the nominal derivatives could lead to improper interpretation regarding the natural stability of the system and tuning of the controller parameters, which would impact both stability and performance. The aerodynamic derivatives are also provided at off-nominal operating conditions used for dispersed frequency domain Monte Carlo analysis. Finally, results are shown to illustrate that the effects of aerodynamic cross-axis coupling can be neglected for the SLS configuration studied.

  2. One-millimeter cancer-free margin is curative for colorectal liver metastases: a propensity score case-match approach.

    PubMed

    Hamady, Zaed Z R; Lodge, J Peter A; Welsh, Fenella K; Toogood, Giles J; White, Alan; John, Timothy; Rees, Myrddin

    2014-03-01

    To investigate the influence of clear surgical resection margin width on disease recurrence rate after intentionally curative resection of colorectal liver metastases. There is consensus that a histologically positive resection margin is a predictor of disease recurrence after resection of colorectal liver metastases. The dispute, however, over the width of cancer-free resection margin required is ongoing. Analysis of observational prospectively collected data for 2715 patients who underwent primary resection of colorectal liver metastases from 2 major hepatobiliary units in the United Kingdom. Histological cancer-free resection margin was classified as positive (if cancer cells were present at less than 1 mm from the resection margin) or negative (if the distance between the cancer and the margin was 1 mm or more). The negative margin was further classified according to the distance from the tumor in millimeters. Predictors of disease-free survival were analyzed in univariate and multivariate analyses. A case-match analysis by a propensity score method was undertaken to reduce bias. A 1-mm cancer-free resection margin was sufficient to achieve 33% 5-year overall disease-free survival. Extra margin width did not add disease-free survival advantage (P > 0.05). After the propensity case-match analysis, there is no statistical difference in disease-free survival between patients with negative narrow and wider margin clearance [hazard ratio (HR) 1.0; 95% confidence interval (CI): 0.9-1.2; P = 0.579 at 5-mm cutoff and HR 1.1; 95% CI: 0.96-1.3; P = 0.149 at 10-mm cutoff]. Patients with extrahepatic disease and positive lymph node primary tumor did not have disease-free survival advantage despite surgical margin clearance (9 months for <1-mm vs 12 months for ≥1-mm margin clearance; P = 0.062). A 1-mm cancer-free resection margin achieved in patients with colorectal liver metastases should now be considered the standard of care.

  3. SU-F-BRD-09: Is It Sufficient to Use Only Low Density Tissue-Margin to Compensate Inter-Fractionation Setup Uncertainties in Lung Treatment?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nie, K; Yue, N; Chen, T

    2014-06-15

    Purpose: In lung radiation treatment, the PTV is formed with a margin around the GTV (or CTV/ITV). Although the GTV is most likely of water-equivalent density, the PTV margin may be formed from the surrounding low-density tissue, which may lead to an unrealistic dosimetric plan. This study evaluates whether the concern about dose calculation inside a PTV with only a low-density margin is justified in lung treatment. Methods: Three SBRT cases were analyzed. The PTV in the original plan (Plan-O) was created with a 5-10 mm margin outside the ITV to incorporate setup errors and all mobility from 10 respiratory phases. Test plans were generated with the GTV shifted to the PTV edge to simulate the extreme situations with maximum setup uncertainties. Two representative positions, at the very posterior-superior (Plan-PS) and anterior-inferior (Plan-AI) edges, were considered. The virtual GTV was assigned a density of 1.0 g.cm−3 and the surrounding lung, including the PTV margin, was assigned 0.25 g.cm−3. An additional plan with a 1 mm tissue margin instead of a full lung margin was created to evaluate whether a composite margin (Plan-Comp) gives a better approximation for dose calculation. All plans were generated on the average CT using the Analytical Anisotropic Algorithm with heterogeneity correction on, and all planning parameters/monitor units remained unchanged. DVH analyses were performed for comparison. Results: Despite the non-static dose distribution, the high-dose region synchronized with tumor position. This might be due to scatter conditions, as greater doses were absorbed in the solid tumor than in the surrounding low-density lung tissue. However, the plans still showed missed target coverage in general. A certain level of composite margin might give a better approximation for the dose calculation. Conclusion: Our exploratory results suggest that with the lung margin only, the planning dose of the PTV might overestimate target coverage during treatment. The significance of this overestimation might warrant further investigation.

  4. MIMO equalization with adaptive step size for few-mode fiber transmission systems.

    PubMed

    van Uden, Roy G H; Okonkwo, Chigo M; Sleiffer, Vincent A J M; de Waardt, Hugo; Koonen, Antonius M J

    2014-01-13

    Optical multiple-input multiple-output (MIMO) transmission systems generally employ minimum mean squared error (MMSE) time- or frequency-domain equalizers. Using an experimental 3-mode dual polarization coherent transmission setup, we show that the convergence time of the MMSE time domain equalizer (TDE) and frequency domain equalizer (FDE) can be reduced by approximately 50% and 30%, respectively. The criterion used to estimate the system convergence time is the time it takes for the MIMO equalizer to reach an average output error within a margin of 5% of the average output error after 50,000 symbols. The convergence reduction difference between the TDE and FDE is attributed to the limited maximum step size for stable convergence of the frequency domain equalizer. The adaptive step size requires a small overhead in the form of a lookup table. It is highlighted that the convergence time reduction is achieved without sacrificing optical signal-to-noise ratio performance.
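The convergence criterion described above can be sketched as a small helper. This is only an illustration of the stated rule, not the authors' implementation; the error trace in the usage line is synthetic.

```python
# Sketch of the abstract's convergence criterion: the equalizer counts
# as converged once a running-average output error stays within a 5%
# margin of the steady-state error (approximated here by the average
# over the trailing symbols).
def convergence_time(errors, steady_tail=1000, margin=0.05, window=100):
    steady = sum(errors[-steady_tail:]) / steady_tail
    lo, hi = steady * (1 - margin), steady * (1 + margin)
    for n in range(window, len(errors) + 1):
        avg = sum(errors[n - window:n]) / window
        if lo <= avg <= hi:
            return n  # symbol index at which convergence is reached
    return None

errors = [0.5] * 10 + [0.1] * 2000   # synthetic error trace
n_conv = convergence_time(errors)
```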

  5. Estimation of pulse rate from ambulatory PPG using ensemble empirical mode decomposition and adaptive thresholding.

    PubMed

    Pittara, Melpo; Theocharides, Theocharis; Orphanidou, Christina

    2017-07-01

    A new method for deriving pulse rate from PPG obtained from ambulatory patients is presented. The method employs Ensemble Empirical Mode Decomposition to identify the pulsatile component from noise-corrupted PPG, and then uses a set of physiologically-relevant rules followed by adaptive thresholding, in order to estimate the pulse rate in the presence of noise. The method was optimized and validated using 63 hours of data obtained from ambulatory hospital patients. The F1 score obtained with respect to expertly annotated data was 0.857 and the mean absolute errors of estimated pulse rates with respect to heart rates obtained from ECG collected in parallel were 1.72 bpm for "good" quality PPG and 4.49 bpm for "bad" quality PPG. Both errors are within the clinically acceptable margin of error for pulse rate/heart rate measurements, showing the promise of the proposed approach for inclusion in next-generation wearable sensors.

  6. Improved techniques for predicting spacecraft power

    NASA Technical Reports Server (NTRS)

    Chmielewski, A. B.

    1987-01-01

    Radioisotope Thermoelectric Generators (RTGs) are going to supply power for the NASA Galileo and Ulysses spacecraft now scheduled to be launched in 1989 and 1990. The duration of the Galileo mission is expected to be over 8 years. This brings the total RTG lifetime to 13 years. In 13 years, the RTG power drops by more than 20 percent, leaving a very small power margin over what is consumed by the spacecraft. Thus it is very important to accurately predict RTG performance and to be able to assess the magnitude of the errors involved. The paper lists the error sources involved in RTG power predictions and describes a statistical method for calculating the tolerance.
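One common way to form a statistical tolerance from several independent error sources is a root-sum-square (RSS) combination. The sketch below is a hedged illustration of that general idea only; the error-source names and percentages are invented, not taken from the paper.

```python
import math

# Hypothetical illustration: combining independent power-prediction
# error sources by root-sum-square (RSS) to form a 1-sigma tolerance.
# All names and values below are invented for illustration.
error_sources_pct = {
    "fuel_loading": 1.0,
    "thermocouple_degradation_model": 2.5,
    "telemetry_calibration": 0.8,
}
rss_pct = math.sqrt(sum(e ** 2 for e in error_sources_pct.values()))
print(f"combined 1-sigma tolerance: {rss_pct:.2f}% of predicted power")
```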

  7. Design and evaluation of a robust dynamic neurocontroller for a multivariable aircraft control problem

    NASA Technical Reports Server (NTRS)

    Troudet, T.; Garg, S.; Merrill, W.

    1992-01-01

    The design of a dynamic neurocontroller with good robustness properties is presented for a multivariable aircraft control problem. The internal dynamics of the neurocontroller are synthesized by a state estimator feedback loop. The neurocontrol is generated by a multilayer feedforward neural network which is trained through backpropagation to minimize an objective function that is a weighted sum of tracking errors, and control input commands and rates. The neurocontroller exhibits good robustness through stability margins in phase and vehicle output gains. By maintaining performance and stability in the presence of sensor failures in the error loops, the structure of the neurocontroller is also consistent with the classical approach of flight control design.

  8. The effects of age and mood on saccadic function in older individuals.

    PubMed

    Shafiq-Antonacci, R; Maruff, P; Whyte, S; Tyler, P; Dudgeon, P; Currie, J

    1999-11-01

    To investigate the effect of age and mood on saccadic function, we recorded prosaccades, predictive saccades, and antisaccades from 238 cognitively normal, physically healthy volunteers aged 44 to 85 years old. Mood levels were measured using the State-Trait Anxiety Inventory and Center for Epidemiological Studies Depression Scale inventories. Small, but significant, positive relationships with age were observed for the mean latency and associated variability of latency for all types of saccades, as well as the antisaccade error rate. Neither saccade velocity nor accuracy was affected by age. Increasing levels of depression had a minor negative influence on the antisaccade latency, whereas increasing levels of anxiety raised the antisaccade error rate marginally.

  9. Variable Structure PID Control to Prevent Integrator Windup

    NASA Technical Reports Server (NTRS)

    Hall, C. E.; Hodel, A. S.; Hung, J. Y.

    1999-01-01

    PID controllers are frequently used to control systems requiring zero steady-state error while maintaining requirements for settling time and robustness (gain/phase margins). PID controllers suffer significant loss of performance due to short-term integrator wind-up when used in systems with actuator saturation. We examine several existing and proposed methods for the prevention of integrator wind-up in both continuous and discrete time implementations.
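The abstract surveys several anti-windup schemes without listing them; as a hedged sketch of one common scheme (conditional integration combined with output clamping, not necessarily one of the paper's methods), a discrete-time PID might look like this:

```python
# Minimal discrete-time PID with integrator clamping (conditional
# integration): the integrator is frozen while the actuator is
# saturated, unless the error would drive the output back toward
# the linear region. Gains and limits are illustrative.
def make_pid(kp, ki, kd, u_min, u_max, dt):
    state = {"i": 0.0, "e_prev": 0.0}
    def step(e):
        d = (e - state["e_prev"]) / dt
        state["e_prev"] = e
        u_unsat = kp * e + ki * (state["i"] + e * dt) + kd * d
        u = min(max(u_unsat, u_min), u_max)
        # Integrate only when unsaturated, or when the error pushes
        # the output back inside the actuator limits.
        if u_unsat == u or (u_unsat > u_max and e < 0) or (u_unsat < u_min and e > 0):
            state["i"] += e * dt
        return u
    return step
```

Without the conditional test, a sustained large error would keep charging the integrator during saturation and produce the short-term wind-up overshoot the abstract describes.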

  10. Low Average Sidelobe Slot Array Antennas for Radiometer Applications

    NASA Technical Reports Server (NTRS)

    Rengarajan, Sembiam; Zawardzki, Mark S.; Hodges, Richard E.

    2012-01-01

    In radiometer applications, it is required to design antennas that meet low average sidelobe levels and low average return loss over a specified frequency bandwidth. It is a challenge to meet such specifications over a frequency range when one uses resonant elements such as waveguide feed slots. In addition to their inherent narrow frequency band performance, the problem is exacerbated due to modeling errors and manufacturing tolerances. There was a need to develop a design methodology to solve the problem. An iterative design procedure was developed by starting with an array architecture, lattice spacing, aperture distribution, waveguide dimensions, etc. The array was designed using Elliott's technique with appropriate values of the total slot conductance in each radiating waveguide, and the total resistance in each feed waveguide. Subsequently, the array performance was analyzed by the full wave method of moments solution to the pertinent integral equations. Monte Carlo simulations were also carried out to account for amplitude and phase errors introduced for the aperture distribution due to modeling errors as well as manufacturing tolerances. If the design margins for the average sidelobe level and the average return loss were not adequate, array architecture, lattice spacing, aperture distribution, and waveguide dimensions were varied in subsequent iterations. Once the design margins were found to be adequate, the iteration was stopped and a good design was achieved. A symmetric array architecture was found to meet the design specification with adequate margin. The specifications were near 40 dB for angular regions beyond 30 degrees from broadside. Separable Taylor distribution with nbar=4 and 35 dB sidelobe specification was chosen for each principal plane. A non-separable distribution obtained by the genetic algorithm was found to have similar characteristics. 
The element spacing was obtained to provide the required beamwidth and close to a null in the E-plane end-fire direction. Because of the alternating slot offsets, grating lobes called butterfly lobes are produced in non-principal planes close to the H-plane. An attempt to reduce the influence of such grating lobes resulted in a symmetric design.

  11. Multi-Dimensional Shallow Landslide Stability Analysis Suitable for Application at the Watershed Scale

    NASA Astrophysics Data System (ADS)

    Milledge, D.; Bellugi, D.; McKean, J. A.; Dietrich, W.

    2012-12-01

    The infinite slope model is the basis for almost all watershed scale slope stability models. However, it assumes that a potential landslide is infinitely long and wide. As a result, it cannot represent resistance at the margins of a potential landslide (e.g. from lateral roots), and is unable to predict the size of a potential landslide. Existing three-dimensional models generally require computationally expensive numerical solutions and have previously been applied only at the hillslope scale. Here we derive an alternative analytical treatment that accounts for lateral resistance by representing the forces acting on each margin of an unstable block. We apply 'at rest' earth pressure on the lateral sides, and 'active' and 'passive' pressure using a log-spiral method on the upslope and downslope margins. We represent root reinforcement on each margin assuming that root cohesion is an exponential function of soil depth. We benchmark this treatment against other more complete approaches (Finite Element (FE) and closed form solutions) and find that our model: 1) converges on the infinite slope predictions as length / depth and width / depth ratios become large; 2) agrees with the predictions from state-of-the-art FE models to within +/- 30% error, for the specific cases in which these can be applied. We then test our model's ability to predict failure of an actual (mapped) landslide where the relevant parameters are relatively well constrained. We find that our model predicts failure at the observed location with a nearly identical shape and predicts that larger or smaller shapes conformal to the observed shape are indeed more stable. Finally, we perform a sensitivity analysis using our model to show that lateral reinforcement sets a minimum landslide size, while the additional strength at the downslope boundary means that the optimum shape for a given size is longer in a downslope direction. 
However, reinforcement effects cannot fully explain the size or shape distributions of observed landslides, highlighting the importance of spatial patterns of key parameters (e.g. pore water pressure) and motivating the model's watershed scale application. This watershed scale application requires an efficient method to find the least stable shapes among an almost infinite set. However, when applied in this context, it allows a more complete examination of the controls on landslide size, shape and location.
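The one-dimensional infinite-slope model that this record extends has a standard closed-form factor of safety; a short sketch of that baseline is below. The formula is the textbook expression; the parameter values in the test are illustrative, not from the paper.

```python
import math

# Infinite-slope factor of safety (the 1-D baseline the abstract's
# multi-dimensional model reduces to for long, wide failures):
#   FS = [c_root + c_soil + (gamma_s - m*gamma_w) z cos^2(b) tan(phi)]
#        / [gamma_s z sin(b) cos(b)]
def infinite_slope_fos(c_root, c_soil, phi_deg, slope_deg, z, m,
                       gamma_s=18e3, gamma_w=9.81e3):
    """c_*: cohesions [Pa]; phi: friction angle [deg]; slope angle [deg];
    z: soil depth [m]; m: saturated fraction of z; gammas: unit weights [N/m^3]."""
    b = math.radians(slope_deg)
    phi = math.radians(phi_deg)
    cohesion = c_root + c_soil
    frictional = (gamma_s - m * gamma_w) * z * math.cos(b) ** 2 * math.tan(phi)
    driving = gamma_s * z * math.sin(b) * math.cos(b)
    return (cohesion + frictional) / driving
```

Because this expression has no length or width terms, it cannot see lateral root reinforcement or margin resistance, which is exactly the limitation the abstract's multi-dimensional treatment addresses.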

  12. Lithosphere structure and subsidence evolution of the conjugate S-African and Argentine margins

    NASA Astrophysics Data System (ADS)

    Dressel, Ingo; Scheck-Wenderoth, Magdalena; Cacace, Mauro; Götze, Hans-Jürgen; Franke, Dieter

    2016-04-01

    The bathymetric evolution of the South Atlantic passive continental margins is a matter of debate. Though it is commonly accepted that passive margins experience thermal subsidence as a result of lithospheric cooling, as well as load-induced subsidence in response to sediment deposition, it is disputed whether the South Atlantic passive margins were affected by additional processes influencing the subsidence history after continental breakup. We present a subsidence analysis along the SW African margin and offshore Argentina and restore paleobathymetries to assess the subsidence evolution of the margins. These results are discussed with respect to the mechanisms behind margin evolution. To this end, we use available information about the lithosphere-scale present-day structural configuration of these margins as a starting point for the subsidence analysis. A multi-1D backward modelling method is applied to separate individual subsidence components, such as thermal and load-induced subsidence, and to restore paleobathymetries for the conjugate margins. The comparison of the restored paleobathymetries shows that the conjugate margins evolved differently: continuous subsidence is obtained offshore Argentina, whereas the subsidence history of the SW African margin is interrupted by phases of uplift. These differing results also correlate with different structural configurations of the subcrustal mantle. In the light of these results we discuss possible implications for uplift mechanisms.

  13. An efficient genome-wide association test for multivariate phenotypes based on the Fisher combination function.

    PubMed

    Yang, James J; Li, Jia; Williams, L Keoki; Buu, Anne

    2016-01-05

    In genome-wide association studies (GWAS) for complex diseases, the association between a SNP and each phenotype is usually weak. Combining multiple related phenotypic traits can increase the power of gene search and thus is a practically important area that requires methodology work. This study provides a comprehensive review of existing methods for conducting GWAS on complex diseases with multiple phenotypes including the multivariate analysis of variance (MANOVA), the principal component analysis (PCA), the generalized estimating equations (GEE), the trait-based association test involving the extended Simes procedure (TATES), and the classical Fisher combination test. We propose a new method that relaxes the unrealistic independence assumption of the classical Fisher combination test and is computationally efficient. To demonstrate applications of the proposed method, we also present the results of statistical analysis on the Study of Addiction: Genetics and Environment (SAGE) data. Our simulation study shows that the proposed method has higher power than existing methods while controlling for the type I error rate. The GEE and the classical Fisher combination test, on the other hand, do not control the type I error rate and thus are not recommended. In general, the power of the competing methods decreases as the correlation between phenotypes increases. All the methods tend to have lower power when the multivariate phenotypes come from long-tailed distributions. The real data analysis also demonstrates that the proposed method allows us to compare the marginal results with the multivariate results and specify which SNPs are specific to a particular phenotype or contribute to the common construct. The proposed method outperforms existing methods in most settings and also has great applications in GWAS on complex diseases with multiple phenotypes such as the substance abuse disorders.
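The classical Fisher combination test that the record takes as its starting point is simple to state: under the null, with k independent p-values, X = -2 * sum(log p_i) follows a chi-square distribution with 2k degrees of freedom. The sketch below shows only this classical statistic, with its independence assumption intact; the paper's contribution is precisely to relax that assumption.

```python
import math

# Classical Fisher combination of independent p-values:
#   X = -2 * sum(log p_i)  ~  chi-square with 2k df under H0.
# For even df (2k), the chi-square survival function has the closed
# form P(X > x) = exp(-x/2) * sum_{j=0}^{k-1} (x/2)^j / j!.
def fisher_combine(p_values):
    k = len(p_values)
    stat = -2.0 * sum(math.log(p) for p in p_values)
    s2 = stat / 2.0
    term, total = 1.0, 1.0
    for j in range(1, k):
        term *= s2 / j
        total += term
    return math.exp(-s2) * total   # combined p-value
```

For a single p-value the combination returns it unchanged, and two p-values of 0.05 combine to roughly 0.017, which is the usual sanity check for this statistic.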

  14. Dental impression technique using optoelectronic devices

    NASA Astrophysics Data System (ADS)

    Sinescu, Cosmin; Barua, Souman; Topala, Florin Ionel; Negrutiu, Meda Lavinia; Duma, Virgil-Florin; Gabor, Alin Gabriel; Zaharia, Cristian; Bradu, Adrian; Podoleanu, Adrian G.

    2018-03-01

    INTRODUCTION: The use of Optical Coherence Tomography (OCT) as a non-invasive, high-precision tool providing quantitative information has been well established by research within the last decade. Marginal discrepancy values can be scrutinized in an optical biopsy at three-dimensional (3D) micrometre scale, revealing detailed qualitative and quantitative information on soft and hard tissues. OCT-based high-resolution 3D images can have a significant impact on finding recurrent caries and restorative failures, analysing the precision of crown preparation, and detecting marginal adaptation errors of prosthetic elements with respect to the gingiva and dental hard tissues. During the CAD/CAM process for prosthodontic restorations, avoiding any error is important for the practitioner and the technician to reduce waste of time and material. Additionally, OCT images help new or semi-skilled practitioners analyse their crown preparations and develop their skills faster than in the conventional way. The aim of this study is to highlight the advantages of OCT in high-precision prosthodontic restorations. MATERIALS AND METHODS: 25 preparations of frontal and lateral teeth were performed in 7 different patients. Impressions of the prosthetic fields were obtained both with a conventional optoelectronic system (Apolo Di, Syrona) and with a Spectral Domain OCT system (dental prototype, working at 860 nm). For the conventional impression technique, the preparation margins were exposed using impregnated gingival retraction cords. No specific treatment was required for the OCT impression technique. RESULTS: The scanning performed by the conventional optoelectronic system proved to be quick and accurate in terms of impression technology. The results were represented by 3D virtual models obtained after the scanning procedure was completed. In order to obtain a good optical impression, a gingival retraction cord was inserted between the prepared tooth and the gingival tissue for a better elevation of the cervical margin of the preparation. Spectral OCT was employed in order to assess the quality as well as the advantages of this technology. No special preparation was performed for this operation. CONCLUSION: Considering these aspects, OCT could be used as a valuable tool for dental impression technology, being non-invasive and non-destructive to the marginal gingival tissue, in comparison with conventional optoelectronic technology, where the gingival retraction cord is still mandatory.

  15. Pattern of eyelid motion predictive of decision errors during drowsiness: oculomotor indices of altered states.

    PubMed

    Lobb, M L; Stern, J A

    1986-08-01

    Sequential patterns of eye and eyelid motion were identified in seven subjects performing a modified serial probe recognition task under drowsy conditions. Using simultaneous EOG and video recordings, eyelid motion was divided into components above, within, and below the pupil, and the durations in sequence were recorded. A serial probe recognition task was modified to allow for distinguishing decision errors from attention errors. Decision errors were found to be more frequent following a downward shift in the gaze angle, in which the eyelid closing sequence was reduced from a five-element to a three-element sequence. The velocity of the eyelid moving over the pupil during decision errors was slow in the closing and fast in the reopening phase, while on decision-correct trials it was fast in closing and slower in reopening. Due to the high variability of eyelid motion under drowsy conditions these findings were only marginally significant. When a five-element blink occurred, the velocity of the lid-over-pupil motion component of these endogenous eye blinks was significantly faster on decision-correct than on decision-error trials. Furthermore, the highly variable, long-duration closings associated with the decision response produced slow eye movements in the horizontal plane (SEM) which were more frequent and significantly longer in duration on decision-error versus decision-correct responses.

  16. Comparison of direct and heterodyne detection optical intersatellite communication links

    NASA Technical Reports Server (NTRS)

    Chen, C. C.; Gardner, C. S.

    1987-01-01

    The performance of direct and heterodyne detection optical intersatellite communication links are evaluated and compared. It is shown that the performance of optical links is very sensitive to the pointing and tracking errors at the transmitter and receiver. In the presence of random pointing and tracking errors, optimal antenna gains exist that will minimize the required transmitter power. In addition to limiting the antenna gains, random pointing and tracking errors also impose a power penalty in the link budget. This power penalty is between 1.6 and 3 dB for a direct detection QPPM link, and 3 to 5 dB for a heterodyne QFSK system. For the heterodyne systems, the carrier phase noise presents another major factor of performance degradation that must be considered. In contrast, the loss due to synchronization error is small. The link budgets for direct and heterodyne detection systems are evaluated. It is shown that, for systems with large pointing and tracking errors, the link budget is dominated by the spatial tracking error, and the direct detection system shows a superior performance because it is less sensitive to the spatial tracking error. On the other hand, for systems with small pointing and tracking jitters, the antenna gains are in general limited by the launch cost, and suboptimal antenna gains are often used in practice. In that case, the heterodyne system has a slightly higher power margin because of higher receiver sensitivity.

  17. A phantom study for the comparison of different brands of computed tomography scanners and software packages for endovascular aneurysm repair sizing and planning.

    PubMed

    Velu, Juliëtte F; Groot Jebbink, Erik; de Vries, Jean-Paul Pm; van der Palen, Job Am; Slump, Cornelis H; Geelkerken, Robert H

    2018-04-01

    Objectives Correct sizing of endoprostheses used for the treatment of abdominal aortic aneurysms is important to prevent endoleaks and migration. Sizing requires several steps, and each step introduces a possible sizing error. The goal of this study was to investigate the magnitude of these errors against the gold standard: a vessel phantom. This study focuses on the errors in sizing with three different brands of computed tomography angiography scanners in combination with three reconstruction software packages. Methods Three phantoms with different diameter, altitude and azimuth were scanned with three computed tomography scanners: Toshiba Aquilion 64-slice, Philips Brilliance iCT 256-slice and Siemens Somatom Sensation 64-slice. The phantom diameters were determined in the stretched view after central lumen line reconstruction by three observers using Simbionix PROcedure Rehearsal Studio, 3mensio and TeraRecon planning software. The observers, all novices in sizing endoprostheses using planning software, measured 108 slices each. Two senior vascular surgeons set the tolerated sizing error margin at ±1.0 mm. Results In total, 11.3% of the measurements (73/648) were outside the set margin of ±1.0 mm from the phantom diameter, with significant differences between the scanner types (14.8%, 12.1% and 6.9% for the Siemens, Philips and Toshiba scanners, respectively, p-value = 0.032), but not between the software packages (8.3%, 11.1%, 14.4%, p-value = 0.141) or the observers (10.6%, 9.7%, 13.4%, p-value = 0.448). Conclusions The sizing errors were independent of the software package used, but phantoms scanned with the Siemens scanner were measured incorrectly significantly more often than phantoms scanned with the Toshiba scanner. 
Consequently, awareness of the type of computed tomography scanner and its settings is necessary, especially when sizing complex abdominal aortic aneurysms for fenestrated or branched endovascular aneurysm repair, where accurate sizing is of utmost importance.
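As a quick arithmetic check on the reported rates, the fraction of measurements outside a ±1.0 mm tolerance can be computed as follows; the function name and the sample data are illustrative only, not the study's measurements.

```python
def fraction_outside_tolerance(measured, reference, tol=1.0):
    """Fraction of diameter measurements deviating from the
    reference (phantom) diameter by more than the tolerance (mm)."""
    outside = sum(abs(m - reference) > tol for m in measured)
    return outside / len(measured)

# The study's overall rate: 73 of 648 measurements outside +/- 1.0 mm
print(73 / 648)  # ~0.1127, i.e. the reported 11.3%
```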

  18. [Effect of stock abundance and environmental factors on the recruitment success of small yellow croaker in the East China Sea].

    PubMed

    Liu, Zun-lei; Yuan, Xing-wei; Yang, Lin-lin; Yan, Li-ping; Zhang, Hui; Cheng, Jia-hua

    2015-02-01

    Multiple hypotheses are available to explain recruitment rate. Model selection methods can be used to identify the model that best supports a particular hypothesis. However, using a single model to estimate recruitment success is often inadequate for an overexploited population because of high model uncertainty. In this study, stock-recruitment data of small yellow croaker in the East China Sea, collected from fishery-dependent and fishery-independent surveys between 1992 and 2012, were used to examine density-dependent effects on recruitment success. Model selection methods based on frequentist criteria (AIC, maximum adjusted R2 and P-values) and a Bayesian method (Bayesian model averaging, BMA) were applied to identify the relationship between recruitment and environmental conditions. Interannual variability of the East China Sea environment was indicated by sea surface temperature (SST), meridional wind stress (MWS), zonal wind stress (ZWS), sea surface pressure (SPP) and runoff of the Changjiang River (RCR). Mean absolute error, mean squared predictive error and the continuous ranked probability score were calculated to evaluate the predictive performance for recruitment success. The results showed that model structures were not consistent across the three model selection methods: the predictive variables were spawning abundance and MWS under AIC, spawning abundance alone under P-values, and spawning abundance, MWS and RCR under maximum adjusted R2. Recruitment success decreased linearly with stock abundance (P < 0.01), suggesting that the overcompensation effect in recruitment success might be due to cannibalism or food competition. Meridional wind intensity showed a marginally significant positive effect on recruitment success (P = 0.06), while runoff of the Changjiang River showed a marginally significant negative effect (P = 0.07). 
Based on mean absolute error and the continuous ranked probability score, the predictive error of the models obtained from BMA was the smallest among the approaches, while that of the models selected by the P-values of the independent variables was the highest; by mean squared predictive error, however, the models selected by maximum adjusted R2 had the highest error. We found that the BMA method can improve the prediction of recruitment success, derive more accurate prediction intervals and quantitatively evaluate model uncertainty.
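The AIC-based selection step among candidate predictor sets can be sketched for ordinary linear models; the synthetic data and variable names below are stand-ins for the study's survey data, not a reproduction of its analysis.

```python
import numpy as np

def aic_linear(X, y):
    """AIC for an OLS fit with Gaussian errors: n*ln(RSS/n) + 2k."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    n, k = X.shape
    return n * np.log(rss / n) + 2 * k

rng = np.random.default_rng(0)
n = 100
spawn = rng.normal(size=n)  # stand-in for spawning abundance
mws = rng.normal(size=n)    # stand-in for meridional wind stress
rcr = rng.normal(size=n)    # stand-in for Changjiang runoff
# synthetic "recruitment success" driven by spawn and mws only
y = -0.8 * spawn + 0.3 * mws + rng.normal(scale=0.5, size=n)

ones = np.ones(n)
candidates = {
    "spawn": np.column_stack([ones, spawn]),
    "spawn+MWS": np.column_stack([ones, spawn, mws]),
    "spawn+MWS+RCR": np.column_stack([ones, spawn, mws, rcr]),
}
best = min(candidates, key=lambda name: aic_linear(candidates[name], y))
print(best)
```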

  19. Modeling Precheck Parallel Screening Process in the Face of Strategic Applicants with Incomplete Information and Screening Errors.

    PubMed

    Song, Cen; Zhuang, Jun

    2018-01-01

    In security check systems, tighter screening processes increase the security level but also cause more congestion, which leads to longer wait times. The added congestion can also create problems for the screeners. The Transportation Security Administration (TSA) Precheck Program was introduced to create fast lanes in airports, with the goal of expediting passengers whom the TSA does not deem to be threats. In this lane, the TSA allows passengers to enjoy fewer restrictions in order to speed up screening. Motivated by the TSA Precheck Program, we study parallel imperfect-screening queueing systems in which potential normal and adversary participants/applicants decide whether or not to apply to the Precheck Program. Approved participants are assigned to a faster screening channel based on a screening policy determined by an approver, who balances passenger safety against line congestion. There exist three types of optimal application strategy for a normal applicant, depending on whether the marginal payoff is negative or positive, or whether the marginal benefit equals the marginal cost. An adversary applicant will not apply when the screening policy is sufficiently large or the number of utilized benefits is sufficiently small. The basic model is extended by considering (1) applicants' parameters following different distributions and (2) applicants having risk levels, where the approver determines the threshold value needed to qualify for Precheck. This article integrates game theory and queueing theory to study the optimal screening policy and provides insights into imperfect parallel queueing screening systems. © 2017 Society for Risk Analysis.

  20. Quantification of Organ Motion During Chemoradiotherapy of Rectal Cancer Using Cone-Beam Computed Tomography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chong, Irene; Hawkins, Maria; Hansen, Vibeke

    2011-11-15

    Purpose: There have been no previously published data on the quantification of rectal motion using cone-beam computed tomography (CBCT) during standard conformal long-course chemoradiotherapy. The purpose of the present study was to quantify the interfractional changes in rectal movement and dimensions and in rectal and bladder volume using CBCT, and to quantify the bony anatomy displacements to calculate the margins required to account for systematic (Σ) and random (σ) setup errors. Methods and Materials: CBCT images were acquired from 16 patients on the first 3 days of treatment and weekly thereafter. The rectum and bladder were outlined on all CBCT images. The interfraction movement was measured using fixed bony landmarks as references to define the rectal location (upper, mid, and low). The maximal rectal diameter at the three rectal locations was also measured. The bony anatomy displacements were quantified, allowing the calculation of systematic (Σ) and random (σ) setup errors. Results: A total of 123 CBCT data sets were analyzed. Analysis of variance for standard deviation from the planning scans showed that rectal anterior and lateral wall movement differed significantly by rectal location. Anterior and lateral rectal wall movements were larger in the mid and upper rectum than in the low rectum. The posterior rectal wall movement did not change significantly with rectal location. The rectal diameter changed more in the mid and upper rectum than in the low rectum. No consistent relationship was found between rectal or bladder volume and time, nor was a significant relationship found between rectal volume and bladder volume. Conclusions: In the present study, the anterior and lateral rectal movement and the rectal diameter changed most in the upper rectum, followed by the mid rectum, with the smallest changes in the low rectum. Asymmetric margins are warranted to ensure phase 2 coverage.
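The abstract does not state which margin recipe was used to convert the systematic (Σ) and random (σ) setup errors into a margin. A widely cited recipe in radiotherapy is the van Herk formula M = 2.5Σ + 0.7σ, sketched here as a hedged assumption rather than as this study's method.

```python
def ctv_to_ptv_margin(sigma_systematic, sigma_random):
    """van Herk margin recipe: M = 2.5*Sigma + 0.7*sigma (all in mm).
    Aims to cover the CTV with 95% of the dose for 90% of patients.
    Illustrative only; the study may have used a different recipe."""
    return 2.5 * sigma_systematic + 0.7 * sigma_random

print(ctv_to_ptv_margin(1.0, 2.0))  # 3.9 mm for Sigma=1 mm, sigma=2 mm
```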

  1. Carbon-Ion Pencil Beam Scanning Treatment With Gated Markerless Tumor Tracking: An Analysis of Positional Accuracy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mori, Shinichiro, E-mail: shinshin@nirs.go.jp; Karube, Masataka; Shirai, Toshiyuki

    Purpose: Having implemented amplitude-based respiratory gating for scanned carbon-ion beam therapy, we sought to evaluate its effect on positional accuracy and throughput. Methods and Materials: A total of 10 patients with tumors of the lung and liver participated in the first clinical trials at our center. Treatment planning was conducted with 4-dimensional computed tomography (4DCT) under free-breathing conditions. The planning target volume (PTV) was calculated by adding a 2- to 3-mm setup margin outside the clinical target volume (CTV) within the gating window. The treatment beam was on when the CTV was within the PTV. Tumor position was detected in real time with a markerless tumor tracking system using paired x-ray fluoroscopic imaging units. Results: The patient setup error (mean ± SD) was 1.1 ± 1.2 mm/0.6 ± 0.4°. The mean internal gating accuracy (95% confidence interval [CI]) was 0.5 mm. If external gating had been applied to this treatment, the mean gating accuracy (95% CI) would have been 4.1 mm. The fluoroscopic radiation doses (mean ± SD) were 23.7 ± 21.8 mGy per beam and less than 487.5 mGy in total throughout the treatment course. The setup, preparation, and irradiation times (mean ± SD) were 8.9 ± 8.2 min, 9.5 ± 4.6 min, and 4.0 ± 2.4 min, respectively. The treatment room occupation time was 36.7 ± 67.5 min. Conclusions: Internal gating had much higher accuracy than external gating. With the addition of a setup margin of 2 to 3 mm, the internal gating positional error was less than 2.2 mm at the 95% CI.

  2. Systematic reviews: I. The correlation between laboratory tests on marginal quality and bond strength. II. The correlation between marginal quality and clinical outcome.

    PubMed

    Heintze, Siegward D

    2007-01-01

    An accepted principle in restorative dentistry states that restorations should be placed with the best marginal quality possible to avoid postoperative sensitivity, marginal discoloration, and secondary caries. Different laboratory methods claim to predict the clinical performance of restorative materials, for example, tests of bond strength and microleakage and gap analysis. The purpose of this review was twofold: (1) to find studies that correlated the results of bond strength tests with either microleakage or gap analysis for the same materials, and (2) to find studies that correlated the results of microleakage and/or gap analysis with clinical parameters for the same materials. Furthermore, factors influencing the results of the laboratory tests were reviewed and assessed. For the first question, searches for studies were conducted in the MEDLINE database and the IADR/AADR abstracts online with specific search and inclusion criteria. The outcome of each study was assessed on the basis of the statistical test applied in the study, and the number of studies with or without correlation was then compiled. For the second question, results of the quantitative marginal analysis of Class V restorations published by the University of Zürich with the same test protocol were compared with prospective clinical trials that investigated the same materials for at least 2 years in Class V cavities. Pearson correlation coefficients were calculated for pooled data of materials and clinical outcome parameters such as retention loss, marginal discoloration, marginal integrity, and secondary caries. For the correlation of dye penetration and clinical outcome, studies on Class V restorations published by the same research institute were searched in MEDLINE that examined the same adhesive systems as the selected clinical trials. For the correlation bond strength/microleakage, 30 studies were included in the review, and for the correlation bond strength/gap analysis, 18 studies.
For both topics, about 80% of the studies revealed no correlation between the two methods. For the correlation quantitative marginal analysis/clinical outcome, the laboratory data were compared with the clinical outcomes of 11 selected clinical studies. In only 2 of the 11 studies (18%) did the clinical outcome match the prognosis based on the laboratory tests; the remaining studies showed no correlation. When pooling data on 20 adhesive systems, no correlation was found between the percentage of continuous margin of restorations placed in extracted premolars and the percentage of teeth in clinical studies that showed no retention loss, no discoloured margins, acceptable margins, or absence of secondary caries. With regard to the correlation of dye penetration and clinical studies, too few studies matching the inclusion criteria were found. However, literature data suggest that there is no correlation between microleakage as measured in the laboratory and clinical parameters. The results of bond strength tests did not correlate with laboratory tests that evaluated the marginal seal of restorations, such as microleakage or gap analysis. The quantitative marginal analysis of Class V fillings in the laboratory was unable to predict the performance of the same materials in vivo. Therefore, microleakage tests and quantitative marginal analysis should be abandoned, and research should focus on laboratory tests that are validated with regard to their ability to satisfactorily predict the clinical performance of restorative materials.

  3. Margined winner-take-all: New learning rule for pattern recognition.

    PubMed

    Fukushima, Kunihiko

    2018-01-01

    The neocognitron is a deep (multi-layered) convolutional neural network that can be trained to recognize visual patterns robustly. In the intermediate layers of the neocognitron, local features are extracted from input patterns. In the deepest layer, input patterns are classified into classes based on the features extracted in the intermediate layers. A method called IntVec (interpolating-vector) is used for this purpose. This paper proposes a new learning rule called margined winner-take-all (mWTA) for training the deepest layer. Each time a training pattern is presented during learning, if recognition by WTA (winner-take-all) produces an error, a new cell is generated in the deepest layer. Here we add a certain amount of margin to the WTA: only during learning, a certain amount of handicap is given to cells of classes other than that of the training vector, and the winner is chosen under this handicap. By introducing the margin to the WTA, we can generate a compact set of cells with which a high recognition rate can be obtained at a small computational cost. The ability of mWTA is demonstrated by computer simulation. Copyright © 2017 Elsevier Ltd. All rights reserved.
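A minimal sketch of the margined-WTA rule as described above, reading "handicap" as a similarity boost given to wrong-class cells so that the correct class must win by at least the margin; this is an interpretation of the abstract, not Fukushima's exact implementation.

```python
import numpy as np

def train_mwta(X, y, margin=0.1):
    """Grow a new cell whenever the handicapped winner misclassifies
    the training vector (sketch of the rule in the abstract)."""
    templates, labels = [], []
    for x, label in zip(X, y):
        x = np.asarray(x, float)
        x = x / np.linalg.norm(x)
        if templates:
            sims = np.array([t @ x for t in templates])
            # only during learning: boost cells of other classes
            sims = sims + margin * (np.array(labels) != label)
            if labels[int(np.argmax(sims))] == label:
                continue  # correct class wins despite the handicap
        templates.append(x)  # generate a new cell for this class
        labels.append(label)
    return templates, labels

def classify(templates, labels, x):
    """Plain WTA at recognition time (no handicap)."""
    x = np.asarray(x, float)
    x = x / np.linalg.norm(x)
    return labels[int(np.argmax([t @ x for t in templates]))]
```

With the margin, near-duplicate training vectors of an already-represented class do not spawn new cells, which is what keeps the cell set compact.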

  4. [A new technique for ensuring negative surgical margins during partial nephrectomy: the ex vivo ultrasound control].

    PubMed

    Desmonts, A; Tillou, X; Le Gal, S; Secco, M; Orczyk, C; Bensadoun, H; Doerfler, A

    2013-10-01

    To evaluate the feasibility and efficiency of intraoperative ex vivo ultrasound of resection margins, performed by the urologist, in patients undergoing partial nephrectomy. Patients undergoing partial nephrectomy for T1-T2 renal tumors from July 2010 to November 2012 were included in the analysis. Tumor margin status was determined immediately by ex vivo ultrasound performed by the surgeon himself. Results were compared with margin status on definitive pathological evaluation. A total of 26 men and 15 women with a median age of 61 (range 30-82) years were included in the analysis. Intraoperative ex vivo ultrasound revealed negative surgical margins in 38 cases and positive margins in two. Final pathological results revealed negative margins in all but one case. Ultrasound sensitivity and specificity were 100% and 97%, respectively. Mean ultrasound duration was 1 ± 1 minute. Mean tumor and margin sizes were 3.4 ± 1.8 cm and 2.38 ± 1.76 mm, respectively. Intraoperative ex vivo ultrasound of resection margins by a urologist in patients undergoing partial nephrectomy appears feasible, efficient and easy. Copyright © 2013 Elsevier Masson SAS. All rights reserved.

  5. Meta-analysis of studies with bivariate binary outcomes: a marginal beta-binomial model approach

    PubMed Central

    Chen, Yong; Hong, Chuan; Ning, Yang; Su, Xiao

    2018-01-01

    When conducting a meta-analysis of studies with bivariate binary outcomes, challenges arise because the within-study correlation and between-study heterogeneity must be taken into account. In this paper, we propose a marginal beta-binomial model for the meta-analysis of studies with binary outcomes. This model is based on the composite likelihood approach and has several attractive features compared with existing models such as the bivariate generalized linear mixed model (Chu and Cole, 2006) and the Sarmanov beta-binomial model (Chen et al., 2012). The advantages of the proposed marginal model include modeling the probabilities on the original scale, requiring no transformation of probabilities or link function, having a closed-form likelihood function, and imposing no constraints on the correlation parameter. More importantly, because the marginal beta-binomial model is based only on the marginal distributions, it does not suffer from potential misspecification of the joint distribution of the bivariate study-specific probabilities. Such misspecification is difficult to detect and can lead to biased inference under current methods. We compare the performance of the marginal beta-binomial model with the bivariate generalized linear mixed model and the Sarmanov beta-binomial model in simulation studies. Interestingly, the results show that the marginal beta-binomial model performs better than the Sarmanov beta-binomial model whether or not the true model is Sarmanov beta-binomial, and that the marginal beta-binomial model is more robust than the bivariate generalized linear mixed model under model misspecification. Two meta-analyses of diagnostic accuracy studies and a meta-analysis of case-control studies are presented for illustration. PMID:26303591
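As background for the marginal beta-binomial likelihood, the beta-binomial pmf has the standard closed form C(n,k)·B(k+a, n−k+b)/B(a,b). A minimal log-scale implementation of this textbook form (not the paper's full composite likelihood) can be written with only the standard library:

```python
from math import lgamma, exp

def betabinom_logpmf(k, n, a, b):
    """Log pmf of the beta-binomial: C(n,k) * B(k+a, n-k+b) / B(a,b),
    computed via log-gamma for numerical stability."""
    def lbeta(x, y):
        return lgamma(x) + lgamma(y) - lgamma(x + y)
    lchoose = lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)
    return lchoose + lbeta(k + a, n - k + b) - lbeta(a, b)

# a = b = 1 gives a uniform distribution over k = 0..n
print(exp(betabinom_logpmf(2, 4, 1.0, 1.0)))  # 0.2
```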

  6. Clinical and semiquantitative marginal analysis of four tooth-coloured inlay systems at 3 years.

    PubMed

    Gladys, S; Van Meerbeek, B; Inokoshi, S; Willems, G; Braem, M; Lambrechts, P; Vanherle, G

    1995-12-01

    The marginal quality of four tooth-coloured inlay systems was clinically investigated and subjected to computer-aided semiquantitative marginal analysis under scanning electron microscopy (SEM) after 3 years of clinical service. Three of the restoration types were made using the Cerec CAD-CAM apparatus: one was milled from preformed glass ceramic blocks, and the two other inlay types were milled from preformed porcelain blocks. The fourth system was based on an experimental indirect resin composite inlay system. Each inlay type was luted with a different luting resin composite. The clinical evaluation was performed with a mirror and explorer by two clinicians separately, and the marginal analysis was conducted microscopically on replicas (SEM x 200). After 3 years in situ, all the restorations were clinically acceptable. No recurrent caries was observed. Marginal analysis under SEM detected a high percentage of submargination for all four systems, which suggests that their respective resin composite luting agents were all subject to wear. The percentage of marginal fractures on the enamel side as well as on the inlay side did not increase dramatically compared to the 6-month results. The first recall after 6 months of clinical service indicated how tooth-coloured inlays behave at their margins. The 3-year results confirmed the early findings, indicating that wear of resin composite lutes is important and present in all systems. The two ceramic materials showed a similar behaviour at the margins. The resin composite inlay performed better at the inlay site than at the enamel site.

  7. Data analysis in emission tomography using emission-count posteriors

    NASA Astrophysics Data System (ADS)

    Sitek, Arkadiusz

    2012-11-01

    A novel approach to the analysis of emission tomography data using the posterior probability of the number of emissions per voxel (emission count) conditioned on acquired tomographic data is explored. The posterior is derived from the prior and the Poisson likelihood of the emission-count data by marginalizing voxel activities. Based on emission-count posteriors, examples of Bayesian analysis including estimation and classification tasks in emission tomography are provided. The application of the method to computer simulations of 2D tomography is demonstrated. In particular, the minimum-mean-square-error point estimator of the emission count is demonstrated. The process of finding this estimator can be considered as a tomographic image reconstruction technique since the estimates of the number of emissions per voxel divided by voxel sensitivities and acquisition time are the estimates of the voxel activities. As an example of a classification task, a hypothesis stating that some region of interest (ROI) emitted at least or at most r-times the number of events in some other ROI is tested. The ROIs are specified by the user. The analysis described in this work provides new quantitative statistical measures that can be used in decision making in diagnostic imaging using emission tomography.

  8. PAX5 methylation detection by droplet digital PCR for ultra-sensitive deep surgical margins analysis of head and neck squamous cell carcinoma

    PubMed Central

    Hayashi, Masamichi; Guerrero-Preston, Rafael; Sidransky, David; Koch, Wayne M.

    2015-01-01

    Molecular deep surgical margin analysis has been shown to predict locoregional recurrence of head and neck squamous cell carcinoma (HNSCC). To improve the accuracy and versatility of the analysis, we used a highly tumor-specific methylation marker and highly sensitive detection technology to test DNA from surgical margins. Histologically cancer-negative deep surgical margin samples were prospectively collected from 82 eligible HNSCC surgeries by an imprinting procedure (n=75) and primary tissue collection (n=70). Bisulfite-treated DNA from each sample was analyzed by both conventional quantitative methylation-specific polymerase chain reaction (QMSP) and QMSP by droplet digital PCR (ddQMSP), targeting PAX5 gene promoter methylation. The association between the presence of PAX5 methylation and locoregional recurrence-free survival (LRFS) was evaluated. PAX5 methylation was found in 68.0% (51/75) of tumors in the imprint samples and 71.4% (50/70) in the primary tissue samples. Among cases that did not have postoperative radiation (n=31 imprint samples, n=29 tissue samples), both conventional QMSP and ddQMSP revealed that PAX5 methylation-positive margins were significantly associated with poor LRFS by univariate analysis. In particular, ddQMSP increased detection of the PAX5 marker from 29% to 71% in the non-radiated imprint cases. Moreover, PAX5-methylated imprint margins were an excellent predictor of poor LRFS (HR = 3.89, 95% CI: 1.19-17.52, P = 0.023) by multivariate analysis. PAX5 methylation appears to be an excellent tumor-specific marker for molecular deep surgical margin analysis of HNSCC, and the ddQMSP assay displays increased sensitivity for methylation marker detection. PMID:26304463

  9. The application of Gaussian mixture models for signal quantification in MALDI-TOF mass spectrometry of peptides.

    PubMed

    Spainhour, John Christian G; Janech, Michael G; Schwacke, John H; Velez, Juan Carlos Q; Ramakrishnan, Viswanathan

    2014-01-01

    Matrix-assisted laser desorption/ionization time-of-flight (MALDI-TOF) mass spectrometry coupled with stable isotope standards (SIS) has been used to quantify native peptides. This MALDI-TOF quantification approach has difficulty with samples containing peptides whose ion currents overlap in the spectrum. In these overlapping spectra the currents sum together, which modifies the peak heights and makes normal SIS estimation problematic. An approach using Gaussian mixtures, based on known physical constants, to model the isotopic cluster of a known compound is proposed here. The characteristics of this approach are examined for single and overlapping compounds. The approach is compared with two commonly used SIS quantification methods for single compounds, namely the peak intensity method and the Riemann sum area under the curve (AUC) method. To study the characteristics of the Gaussian mixture method, Angiotensin II, Angiotensin-2-10 and Angiotensin-1-9 and their associated SIS peptides were used. The findings suggest that the Gaussian mixture method has characteristics similar to the two comparison methods for estimating the quantity of isolated isotopic clusters of single compounds. All three methods were tested using MALDI-TOF mass spectra collected for peptides of the renin-angiotensin system. The Gaussian mixture method accurately estimated the native-to-labeled ratio of several isolated angiotensin peptides (5.2% error in ratio estimation), with estimation errors similar to those calculated using the peak intensity and Riemann sum AUC methods (5.9% and 7.7%, respectively). For overlapping angiotensin peptides (where the other two methods are not applicable), the estimation error of the Gaussian mixture method was 6.8%, which is within the acceptable range. 
In summary, for single compounds the Gaussian mixture method is equivalent or marginally superior to existing methods of peptide quantification, and it is capable of quantifying overlapping (convolved) peptides within an acceptable margin of error.
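The Gaussian-mixture idea can be sketched as a linear deconvolution: model each isotopic cluster as a fixed mixture of Gaussians at known isotope spacings, then solve for the cluster amplitudes by least squares. The peptide masses, isotope abundances and peak width below are illustrative placeholders, not the paper's calibrated values.

```python
import numpy as np

def gaussian(mz, center, sigma):
    return np.exp(-0.5 * ((mz - center) / sigma) ** 2)

def cluster_basis(mz, mono_mass, sigma, n_iso=4, spacing=1.00335):
    """Unit-amplitude isotopic cluster: Gaussians at ~1.003 Da spacing.
    The relative isotope abundances here are hypothetical placeholders."""
    abund = np.array([1.0, 0.55, 0.2, 0.05])[:n_iso]
    return sum(a * gaussian(mz, mono_mass + i * spacing, sigma)
               for i, a in enumerate(abund))

def estimate_ratio(mz, signal, mass_native, mass_labeled, sigma=0.05):
    """Native/labeled ratio via least-squares fit of two cluster shapes,
    which remains well-posed even when the clusters overlap."""
    A = np.column_stack([cluster_basis(mz, mass_native, sigma),
                         cluster_basis(mz, mass_labeled, sigma)])
    amps, *_ = np.linalg.lstsq(A, signal, rcond=None)
    return amps[0] / amps[1]
```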

  10. Influence of different cusp coverage methods for the extension of ceramic inlays on marginal integrity and enamel crack formation in vitro.

    PubMed

    Krifka, Stephanie; Stangl, Martin; Wiesbauer, Sarah; Hiller, Karl-Anton; Schmalz, Gottfried; Federlin, Marianne

    2009-09-01

    No information has been available to date about the cusp design of thin (1.0 mm) non-functional cusps and its influence on (1) the marginal integrity of ceramic inlays (CI) and partial ceramic crowns (PCC) and (2) crack formation in dental tissues. The aim of this in vitro study was to investigate the effect of covering thin non-functional cusps on marginal integrity and enamel crack formation. CI and PCC preparations were performed on extracted human molars. Non-functional cusps were adjusted to 1.0-mm wall thickness, or to 1.0-mm wall thickness with a horizontal reduction of about 2.0 mm. Ceramic restorations (Vita Mark II, Cerec3 System) were adhesively luted with Excite/Variolink II. The specimens were exposed to thermocycling and central mechanical loading. Marginal integrity was assessed by evaluating dye penetration after thermal cycling and mechanical loading. Enamel cracks were documented under a reflected-light microscope. The data were statistically analysed with the Mann-Whitney U test, Fisher's exact test (alpha = 0.05) and the error rates method. PCC with horizontal reduction of the non-functional cusps showed statistically significantly less microleakage than PCC without such cusp coverage. Preparation designs with horizontal reduction of the non-functional cusps showed a tendency toward less enamel crack formation than preparation designs without cusp coverage. Thin non-functional cusp walls of adhesively bonded restorations should be completely covered or reduced to avoid enamel cracks and marginal deficiency.

  11. Nuclear power: Siting and safety

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Openshaw, S.

    1986-01-01

    By 2030, half, or even two-thirds, of all electricity may be generated by nuclear power. Major reactor accidents are still expected to be rare occurrences, but nuclear safety is largely a matter of faith. Terrorist attacks, sabotage, and human error could cause a significant accident. Reactor siting can offer an additional, design-independent margin of safety. Remote geographical sites for new plants would minimize health risks, protect the industry from negative shifts in public opinion concerning nuclear energy, and improve long-term public acceptance of nuclear power. U.K. siting practices usually do not consider the contribution to safety that could be obtained from remote sites. This book discusses present trends in nuclear power siting policies and their design-independent margin of safety.

  12. Supervised learning with decision margins in pools of spiking neurons.

    PubMed

    Le Mouel, Charlotte; Harris, Kenneth D; Yger, Pierre

    2014-10-01

    Learning to categorise sensory inputs by generalising from a few examples whose category is precisely known is a crucial step for the brain to produce appropriate behavioural responses. At the neuronal level, this may be performed by adaptation of synaptic weights under the influence of a training signal, in order to group spiking patterns impinging on the neuron. Here we describe a framework that allows spiking neurons to perform such "supervised learning", using principles similar to the Support Vector Machine, a well-established and robust classifier. Using a hinge-loss error function, we show that requesting a margin similar to that of the SVM improves performance on linearly non-separable problems. Moreover, we show that using pools of neurons to discriminate categories can also increase the performance by sharing the load among neurons.
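The margin requirement can be illustrated with a perceptron-style weight update under the hinge loss, analogous in spirit (though far simpler than a spiking-neuron implementation) to the SVM-like rule described above; the data and hyperparameters are illustrative.

```python
import numpy as np

def train_hinge(X, y, margin=1.0, lr=0.1, epochs=50):
    """Update weights whenever a sample falls inside the decision margin,
    i.e. whenever the hinge loss max(0, margin - t * (w @ x)) is positive."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for x, t in zip(X, y):          # t in {-1, +1}
            if t * (w @ x) < margin:    # inside the margin: push outward
                w += lr * t * x
    return w
```

Requiring a margin (rather than mere correctness) during training is what improves generalisation on near-separable data, mirroring the abstract's claim.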

  13. Assessment and quantification of patient set-up errors in nasopharyngeal cancer patients and their biological and dosimetric impact in terms of generalized equivalent uniform dose (gEUD), tumour control probability (TCP) and normal tissue complication probability (NTCP)

    PubMed Central

    Marcie, S; Fellah, M; Chami, S; Mekki, F

    2015-01-01

    Objective: The aim of this study is to assess and quantify patients' set-up errors using an electronic portal imaging device and to evaluate their dosimetric and biological impact in terms of generalized equivalent uniform dose (gEUD) on predictive models, such as the tumour control probability (TCP) and the normal tissue complication probability (NTCP). Methods: 20 patients treated for nasopharyngeal cancer were enrolled in the radiotherapy–oncology department of HCA. Systematic and random errors were quantified. The dosimetric and biological impact of these set-up errors on the target volume and the organ at risk (OARs) coverage were assessed using calculation of dose–volume histogram, gEUD, TCP and NTCP. For this purpose, an in-house software was developed and used. Results: The standard deviations (1SDs) of the systematic set-up and random set-up errors were calculated for the lateral and subclavicular fields and gave the following results: ∑ = 0.63 ± (0.42) mm and σ = 3.75 ± (0.79) mm, respectively. Thus a planning organ at risk volume (PRV) margin of 3 mm was defined around the OARs, and a 5-mm margin used around the clinical target volume. The gEUD, TCP and NTCP calculations obtained with and without set-up errors have shown increased values for tumour, where ΔgEUD (tumour) = 1.94% Gy (p = 0.00721) and ΔTCP = 2.03%. The toxicity of OARs was quantified using gEUD and NTCP. The values of ΔgEUD (OARs) vary from 0.78% to 5.95% in the case of the brainstem and the optic chiasm, respectively. The corresponding ΔNTCP varies from 0.15% to 0.53%, respectively. Conclusion: The quantification of set-up errors has a dosimetric and biological impact on the tumour and on the OARs. The developed in-house software using the concept of gEUD, TCP and NTCP biological models has been successfully used in this study. It can be used also to optimize the treatment plan established for our patients. 
Advances in knowledge: The gEUD, TCP and NTCP may be more suitable tools to assess the treatment plans before treating the patients. PMID:25882689
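The abstract reports the systematic (Σ) and random (σ) set-up components but not the recipe that turns them into the quoted margins. A common choice, assumed here for illustration and not stated in the record, is the van Herk recipe M = 2.5Σ + 0.7σ:

```python
def van_herk_margin(sigma_systematic, sigma_random):
    """CTV-to-PTV margin (mm) from the van Herk recipe: M = 2.5*Sigma + 0.7*sigma."""
    return 2.5 * sigma_systematic + 0.7 * sigma_random

# Values reported in the abstract: Sigma = 0.63 mm, sigma = 3.75 mm
margin = van_herk_margin(0.63, 3.75)
print(round(margin, 2))  # 4.2, consistent with the 5-mm clinical margin after rounding up
```

With the reported components the recipe gives 4.2 mm, which is consistent with (but not proof of) the 5-mm clinical margin quoted above.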

  14. Interobserver delineation variation in lung tumour stereotactic body radiotherapy

    PubMed Central

    Persson, G F; Nygaard, D E; Hollensen, C; Munck af Rosenschöld, P; Mouritsen, L S; Due, A K; Berthelsen, A K; Nyman, J; Markova, E; Roed, A P; Roed, H; Korreman, S; Specht, L

    2012-01-01

Objectives In radiotherapy, delineation uncertainties are important as they contribute to systematic errors and can lead to geographical miss of the target. For margin computation, the standard deviations (SDs) of all uncertainties must be included as SDs. The aim of this study was to quantify the interobserver delineation variation for stereotactic body radiotherapy (SBRT) of peripheral lung tumours using a cross-sectional study design. Methods 22 consecutive patients with 26 tumours were included. Positron emission tomography/CT scans were acquired for planning of SBRT. Three oncologists and three radiologists independently delineated the gross tumour volume. The interobserver variation was calculated as a mean of multiple SDs of distances to a reference contour, computed separately for the transversal plane (SDtrans) and the craniocaudal (CC) direction (SDcc). Concordance indexes and volume deviations were also calculated. Results Median tumour volume was 13.0 cm3, ranging from 0.3 to 60.4 cm3. The mean SDtrans was 0.15 cm (SD 0.08 cm) and the overall mean SDcc was 0.26 cm (SD 0.15 cm). Tumours with pleural contact had a significantly larger SDtrans than tumours surrounded by lung tissue. Conclusions The interobserver delineation variation was very small in this systematic cross-sectional analysis, although significantly larger in the CC direction than in the transversal plane, stressing that anisotropic margins should be applied. This study is the first to make a systematic cross-sectional analysis of delineation variation for peripheral lung tumours referred for SBRT, establishing the evidence that interobserver variation is very small for these tumours. PMID:22919015
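The abstract's point that every uncertainty source must enter margin computation as an SD is usually implemented by combining independent components in quadrature. A minimal sketch (the component values are hypothetical, not the study's):

```python
import math

def combined_sd(*sds):
    """Combine independent uncertainty components (each a 1 SD value) in quadrature."""
    return math.sqrt(sum(s * s for s in sds))

# Hypothetical systematic components (cm): delineation, set-up, baseline shift
total = combined_sd(0.26, 0.15, 0.20)
print(round(total, 3))
```

Because the components add in quadrature, the largest SD dominates the total, which is why a comparatively small delineation SD contributes little to the final margin.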

  15. On the tidally driven circulation in the South China Sea: modeling and analysis

    NASA Astrophysics Data System (ADS)

    Nelko, Varjola; Saha, Abhishek; Chua, Vivien P.

    2014-03-01

The South China Sea is a large marginal sea surrounded by land masses and island chains, and characterized by complex bathymetry and irregular coastlines. An unstructured-grid SUNTANS model is employed to perform depth-averaged simulations of the circulation in the South China Sea. The model is tidally forced at the open ocean boundaries using the eight main tidal constituents as derived from the OSU Tidal Prediction Software. The model simulations are performed for the year 2005 using a time step of 60 s. The model reproduces the spring-neap and diurnal and semidiurnal variability in the observed data. Skill assessment of the model is performed by comparing model-predicted surface elevations with observations. For stations located in the central region of the South China Sea, the root mean squared errors (RMSE) are less than 10 % and the Pearson's correlation coefficient (r) is as high as 0.9. The simulations show that the quality of the model prediction is dependent on the horizontal grid resolution, coastline accuracy, and boundary locations. The maximum RMSE and minimum correlation coefficients occur at Kaohsiung (located in the northern South China Sea off the Taiwan coast) and Tioman (located in the southern South China Sea off the Malaysia coast). This may be explained by spectral analysis of sea level residuals and winds, which reveals that the dynamics at Kaohsiung and Tioman are strongly influenced by the seasonal monsoon winds. Our model demonstrates the importance of tidally driven circulation in the central region of the South China Sea.
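The two skill metrics used here, RMSE and Pearson's r between predicted and observed elevations, can be sketched as follows (the elevation series are hypothetical, stand-ins for model output and a tide gauge):

```python
import math

def rmse(pred, obs):
    """Root mean squared error between paired predictions and observations."""
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs))

def pearson_r(x, y):
    """Pearson's correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

# Hypothetical hourly surface elevations (m): model vs tide gauge
model = [0.0, 0.4, 0.7, 0.4, 0.0, -0.4, -0.7, -0.4]
gauge = [0.1, 0.5, 0.65, 0.35, -0.05, -0.45, -0.7, -0.35]
print(round(rmse(model, gauge), 3), round(pearson_r(model, gauge), 3))
```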

  16. The Disk Wind in the Rapidly Spinning Stellar-mass Black Hole 4U 1630-472 Observed with NuSTAR

    NASA Technical Reports Server (NTRS)

    King, Ashley L.; Walton, Dominic J.; Miller, Jon M.; Barret, Didier; Boggs, Steven E.; Christensen, Finn E.; Craig, William W.; Fabian, Andy C.; Furst, Felix; Hailey, Charles J.; hide

    2014-01-01

We present an analysis of a short NuSTAR observation of the stellar-mass black hole and low-mass X-ray binary 4U 1630-472. Reflection from the inner accretion disk is clearly detected for the first time in this source, owing to the sensitivity of NuSTAR. With fits to the reflection spectrum, we find evidence for a rapidly spinning black hole, a* = 0.985(+0.005/-0.014) (1 sigma statistical errors). However, archival data show that the source has relatively low radio luminosity. Recently claimed relationships between jet power and black hole spin would predict either a lower spin or a higher peak radio luminosity. We also report the clear detection of an absorption feature at 7.03 +/- 0.03 keV, likely signaling a disk wind. If this line arises in dense, moderately ionized gas (log xi = 3.6(+0.2/-0.3)) and is dominated by He-like Fe xxv, the wind has a velocity of v/c = 0.043(+0.002/-0.007) (12900(+600/-2100) km s(exp -1)). If the line is instead associated with a more highly ionized gas (log xi = 6.1(+0.7/-0.6)), and is dominated by Fe xxvi, evidence of a blueshift is only marginal, after taking systematic errors into account. Our analysis suggests the ionized wind may be launched within 200-1100 Rg, and may be magnetically driven.

  17. Measurement error of mean sac diameter and crown-rump length among pregnant women at Mulago hospital, Uganda.

    PubMed

    Ali, Sam; Byanyima, Rosemary Kusaba; Ononge, Sam; Ictho, Jerry; Nyamwiza, Jean; Loro, Emmanuel Lako Ernesto; Mukisa, John; Musewa, Angella; Nalutaaya, Annet; Ssenyonga, Ronald; Kawooya, Ismael; Temper, Benjamin; Katamba, Achilles; Kalyango, Joan; Karamagi, Charles

    2018-05-04

Ultrasonography is essential in prenatal diagnosis and care for pregnant mothers. However, the measurements obtained often contain a small percentage of unavoidable error that may have serious clinical implications if substantial. We therefore evaluated the intra- and inter-observer error in measuring mean sac diameter (MSD) and crown-rump length (CRL) in women between 6 and 10 weeks' gestation at Mulago hospital. This was a cross-sectional study conducted from January to March 2016. We enrolled 56 women with an intrauterine single viable embryo. The women were scanned using a transvaginal (TVS) technique by two observers who were blinded to each other's measurements. Each observer measured the CRL twice and the MSD once for each woman. Intra-class correlation coefficients (ICCs), 95% limits of agreement (LOA) and technical error of measurement (TEM) were used for analysis. Intra-observer ICCs for CRL measurements were 0.995 and 0.993 while inter-observer ICCs were 0.988 for CRL and 0.955 for MSD measurements. Intra-observer 95% LOA for CRL were ± 2.04 mm and ± 1.66 mm. Inter-observer LOA were ± 2.35 mm for CRL and ± 4.87 mm for MSD. The intra-observer relative TEM for CRL were 4.62% and 3.70%, whereas inter-observer relative TEM were 5.88% and 5.93% for CRL and MSD, respectively. Intra- and inter-observer error of CRL and MSD measurements among pregnant women at Mulago hospital were acceptable. This implies that at Mulago hospital, the error in pregnancy dating is within the acceptable margin of ±3 days in the first trimester, and the CRL and MSD cut-offs of ≥7 mm and ≥25 mm, respectively, are fit for diagnosis of miscarriage on TVS. These findings should be extrapolated to the whole country with caution. Sonographers can achieve acceptable and comparable diagnostic accuracy levels for MSD and CRL measurements with proper training and adherence to practice guidelines.
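The agreement statistics used in this record follow standard formulas: the technical error of measurement for paired repeats, TEM = sqrt(Σd²/2n), its relative form as a percentage of the grand mean, and Bland-Altman 95% limits of agreement. A sketch with hypothetical CRL values (not the study's data):

```python
import math

def tem(m1, m2):
    """Technical error of measurement for paired repeats: sqrt(sum(d^2) / (2n))."""
    d2 = sum((a - b) ** 2 for a, b in zip(m1, m2))
    return math.sqrt(d2 / (2 * len(m1)))

def relative_tem(m1, m2):
    """TEM expressed as a percentage of the grand mean of both measurement sets."""
    grand_mean = (sum(m1) + sum(m2)) / (2 * len(m1))
    return 100.0 * tem(m1, m2) / grand_mean

def limits_of_agreement(m1, m2):
    """Bland-Altman 95% limits: mean difference +/- 1.96 SD of the differences."""
    d = [a - b for a, b in zip(m1, m2)]
    n = len(d)
    mean_d = sum(d) / n
    sd_d = math.sqrt(sum((x - mean_d) ** 2 for x in d) / (n - 1))
    return mean_d - 1.96 * sd_d, mean_d + 1.96 * sd_d

# Hypothetical CRL measurements (mm) by two observers
obs1 = [10.2, 14.8, 21.5, 9.9, 17.3]
obs2 = [10.6, 14.1, 21.9, 10.4, 16.8]
print(round(tem(obs1, obs2), 3), round(relative_tem(obs1, obs2), 2))
```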

  18. Analysis of Fractal Parameters of the Lunar Surface

    NASA Astrophysics Data System (ADS)

    Nefedyev, Yuri; Petrova, Natalia; Andreev, Alexey; Demina, Natalya; Demin, Sergey

    2016-07-01

    Analysis of complex selenographic systems is a complicated issue, and this fully applies to the lunar topography. In this report a new method for the reliable comparative estimation of lunar map data is presented. The estimation was made by comparing high-altitude (isohypse) lines using fractal analysis, and the influence of variances in the lunar macrofigure was determined by comparing fractal dimensions. It should be noted that investigations of the lunar figure and rotation imply the study of marginal-zone charts constructed by various methods, work traditionally carried out at the Engelhardt Astronomical Observatory (EAO). In particular, this research is important for the reduction of lunar occultations, on the basis of which a number of astrometric and astrophysical problems can be solved. By now, highly accurate theories of lunar motion have been obtained, and star coordinates have been determined from space measurements with multi-arcsecond accuracy, but several factors strongly limit the accuracy of occultation results: the exactitude of the recorded occultation moment, errors in the star coordinates, the accuracy of lunar ephemeris positions, and the unreliability of lunar marginal-zone charts. Difficulties therefore arise in the reduction of lunar occultations because of the irregularities of the lunar limb. Existing charts of the lunar marginal zone have defects, and assessing them, above all the reliability of their data, is difficult. To resolve this task, a comparison method can be used in which the structures of high-altitude lines associated with identical lunar coordinates are compared; such a comparison, however, requires a great deal of calculation, and the many marginal-zone maps constructed by different methods raise numerous questions about the accuracy of their data.
In other words, the lunar relief has a very complex structure, and traditional research methods are inadequate; it was therefore decided to use the method of fractal dimension comparison. For this purpose, two lunar marginal-zone maps were taken: one made in the celestial coordinate system (maps N1) and one constructed from heliometric observations, taking into account the first model of the figure of the Moon given by Jakovkin (maps N2). The charts contain isohypses of the lunar marginal zone extending over 10" on both sides of the mean position of the limb line. To find the variations of the irregularities of limb points above the mean level of the lunar surface, the position angles of these points P (reckoned from the centre of the Moon's disc) and their D coordinates were computed. These coordinates were introduced by Hayn: P is the selenocentric longitude reckoned along the mean limb from the north pole of the Moon, like a position angle, and D is the latitude, counted positive for the part of the disc nearer to the observer. The data of the two maps were thus reduced to identical form. First, segments of the lunar marginal zone were considered for every 45" in P. For each segment, a profile of the surface at constant D was constructed with a step of 2", yielding 80 profiles. Second, the fractal dimension d of each profile was determined. Third, the obtained values of d were compared between the maps considered in this work. The results show good agreement between the mean fractal dimensions of maps N1 and N2. It can therefore be concluded that using the fractal method to assess the accuracy of lunar map data gives good results. The work was supported by grants RFBR 15-02-01638-a, 16-32-60071-mol-dk-a and 16-02-00496-a.
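The record leans on fractal dimensions of limb profiles but does not specify the estimator. The standard box-counting estimator (the slope of log N(ε) against log(1/ε)) can be sketched as below; the estimator choice and the test profile are illustrative assumptions, not details taken from the record:

```python
import math

def box_count_dimension(points, scales):
    """Estimate the fractal dimension of a planar curve by box counting:
    d = slope of log N(eps) versus log(1/eps), fitted by least squares."""
    logs = []
    for eps in scales:
        # count distinct eps-sized boxes touched by the point set
        boxes = {(math.floor(x / eps), math.floor(y / eps)) for x, y in points}
        logs.append((math.log(1.0 / eps), math.log(len(boxes))))
    n = len(logs)
    mx = sum(x for x, _ in logs) / n
    my = sum(y for _, y in logs) / n
    num = sum((x - mx) * (y - my) for x, y in logs)
    den = sum((x - mx) ** 2 for x, _ in logs)
    return num / den

# Sanity check on a straight segment, whose dimension should be close to 1
line = [(t / 10000.0, t / 10000.0) for t in range(10001)]
d = box_count_dimension(line, [0.1, 0.05, 0.025, 0.0125])
print(round(d, 2))
```

A rougher (more fractal) limb profile touches more boxes at fine scales, so its estimated d rises above 1, which is what makes d usable as a comparison statistic between maps.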

  19. Tissue preservation with mass spectroscopic analysis: Implications for cancer diagnostics.

    PubMed

    Hall, O Morgan; Peer, Cody J; Figg, William D

    2018-05-17

Surgical intervention is a common treatment modality for localized cancer. Post-operative analysis involves evaluation of surgical margins to assess whether all malignant tissue has been resected, because positive surgical margins lead to a greater likelihood of recurrence. Secondary treatments are utilized to minimize the negative effects of positive surgical margins. Recently, in Science Translational Medicine, Zhang et al. described a new mass spectroscopic technique that could potentially decrease the likelihood of positive surgical margins. Their nondestructive in vivo tissue sampling leads to a highly accurate and rapid cancer diagnosis, with precise discrimination between healthy and malignant tissue. This new tool has the potential to improve surgical margins and accelerate cancer diagnostics by analyzing biomolecular signatures of various tissues and diseases.

  20. Key Findings from a National Internet Survey of 400 Teachers and 95 Principals Conducted November 12-21, 2008

    ERIC Educational Resources Information Center

    McCleskey, Nicole

    2010-01-01

    This paper presents the key findings from a national Internet survey of 400 teachers and 95 principals. This survey was conducted November 12-21, 2008. The sample was based on a list provided by EMI Surveys, a custom online research sample provider with an extensive portfolio of projects. The margin of error for a sample of 495 interviews is [plus…
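The abstract is truncated before the quoted figure. For reference, the textbook worst-case margin of error for a simple random sample is z·sqrt(p(1−p)/n) at p = 0.5; the sketch below applies it to the stated n = 495, as an illustration rather than the survey's reported value:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Worst-case sampling margin of error at 95% confidence: z * sqrt(p(1-p)/n)."""
    return z * math.sqrt(p * (1.0 - p) / n)

# For the combined sample of 495 interviews mentioned in the abstract
moe = margin_of_error(495)
print(round(100 * moe, 1))  # 4.4 percentage points at 95% confidence
```

Note the square-root dependence: quadrupling the sample size only halves the margin of error.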

  1. Self-Supervised Learning to Visually Detect Terrain Surfaces for Autonomous Robots Operating in Forested Terrain

    DTIC Science & Technology

    2012-01-01

values of EAFP, EAFN, and EAF can be compared with three user-defined threshold values, TAFP, TAFN, and TAF. These threshold values determine the update...values were chosen as TAFP = E0AFP + 0.02, TAFN = E0AFN + 0.02, and TAF = E0AF + 0.02). We called the value of 0.02 the margin of error tolerance. In

  2. Future consequences of decreasing marginal production efficiency in the high-yielding dairy cow.

    PubMed

    Moallem, U

    2016-04-01

    The objectives were to examine the gross and marginal production efficiencies in high-yielding dairy cows and the future consequences on dairy industry profitability. Data from 2 experiments were used in across-treatments analysis (n=82 mid-lactation multiparous Israeli-Holstein dairy cows). Milk yields, body weights (BW), and dry matter intakes (DMI) were recorded daily. In both experiments, cows were fed a diet containing 16.5 to 16.6% crude protein and net energy for lactation (NEL) at 1.61 Mcal/kg of dry matter (DM). The means of milk yield, BW, DMI, NEL intake, and energy required for maintenance were calculated individually over the whole study, and used to calculate gross and marginal efficiencies. Data were analyzed in 2 ways: (1) simple correlation between variables; and (2) cows were divided into 3 subgroups, designated low, moderate, and high DMI (LDMI, MDMI, and HDMI), according to actual DMI per day: ≤ 26 kg (n=27); >26 through 28.2 kg (n=28); and >28.2 kg (n=27). The phenotypic Pearson correlations among variables were analyzed, and the GLM procedure was used to test differences between subgroups. The relationships between milk and fat-corrected milk yields and the corresponding gross efficiencies were positive, whereas BW and gross production efficiency were negatively correlated. The marginal production efficiency from DM and energy consumed decreased with increasing DMI. The difference between BW gain as predicted by the National Research Council model (2001) and the present measurements increased with increasing DMI (r=0.68). The average calculated energy balances were 1.38, 2.28, and 4.20 Mcal/d (standard error of the mean=0.64) in the LDMI, MDMI, and HDMI groups, respectively. The marginal efficiency for milk yields from DMI or energy consumed was highest in LDMI, intermediate in MDMI, and lowest in HDMI. The predicted BW gains for the whole study period were 22.9, 37.9, and 75.8 kg for the LDMI, MDMI, and HDMI groups, respectively. 
The present study demonstrated that marginal production efficiency decreased with increasing feed intake. Because of the close association between production and intake, the principle of diminishing marginal productivity may explain why increasing milk production (and consequently increasing intake) does not always enhance profitability. To maintain high production efficiency in the future, more attention should be given to optimizing rather than maximizing feed intake, a goal that could be achieved by nutritional manipulations that would increase digestibility or by using a diet of denser nutrients that would provide all nutritional requirements from lower intake. Copyright © 2016 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
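Diminishing marginal productivity means the slope d(milk)/d(DMI) falls as intake rises, even while gross efficiency (milk/DMI) can remain high. With a hypothetical fitted quadratic response (the coefficients below are illustrative, not from the study) this is immediate:

```python
def gross_efficiency(milk, dmi):
    """Gross production efficiency: milk output per kg of dry matter intake."""
    return milk / dmi

def marginal_efficiency(a, b, dmi):
    """Marginal milk response d(milk)/d(DMI) for a fitted quadratic
    milk = a*DMI + b*DMI^2 (b < 0 gives diminishing marginal productivity)."""
    return a + 2 * b * dmi

# Hypothetical fitted response: milk = 2.2*DMI - 0.02*DMI^2
for dmi in (24, 26, 28):
    print(dmi, round(marginal_efficiency(2.2, -0.02, dmi), 2))
```

The declining marginal slope is the quantitative form of the paper's conclusion: past some intake, each extra kg of feed buys less milk, so maximizing intake is not the same as maximizing profit.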

  3. Surface Temperature Reconstructions for the Last 1000 Years

    NASA Astrophysics Data System (ADS)

    North, G. R.

    2006-12-01

This is a presentation of results from a recently released report written by a committee established by the National Research Council and chaired by the speaker. The report bears the same title as this talk. It focused on the methods of reconstructing the large scales of surface temperature fields, since there has been considerable discussion in the scientific literature, in assessments such as the IPCC, in the popular press and blogs, and even in Congressional hearings. The so-called 'hockey stick' curve indicates a gradual cooling from the beginning of the record at about 1000 AD until roughly 150 years ago, when the curve takes a steep upward trend (the so-called global warming). The original publications by Mann, Bradley and Hughes were careful to present and emphasize error margins that have been ignored by many in the controversy. The Committee found that numerous subsequent publications have reported reconstructions that utilized different data and different statistical assumptions; these all fall within the error margins of the original studies. While the committee has some reservations about the period prior to the year 1600 AD, it still concludes that surface temperatures averaged over the Northern Hemisphere over the last three decades are plausibly the warmest for any comparable period in the last 1000 years.

  4. Accuracy of Patient Specific Cutting Blocks in Total Knee Arthroplasty

    PubMed Central

    Helmy, Naeder; Kühnel, Stefanie P.

    2014-01-01

Background. Long-term survival of total knee arthroplasty (TKA) is mainly determined by optimal positioning of the components and prosthesis alignment. Implant positioning can be optimized by computer assisted surgery (CAS). Patient specific cutting blocks (PSCB) seem to have the potential to improve component alignment compared to the conventional technique and to be comparable to CAS. Methods. 113 knees were selected for PSCB and included in this study. Pre- and postoperative mechanical axis, represented by the hip-knee angle (HKA), the proximal tibial angle (PTA), the distal femoral angle (DFA), and the tibial slope (TS), were measured and the deviation from the expected ideal values was calculated. Results. With a margin of error of ±3°, success rates were 81.4% for HKA, 92.0% for PTA, and 94.7% for DFA. With the margin of error for alignments extended to ±4°, we obtained a success rate of 92.9% for the HKA, 98.2% for the PTA, and 99.1% for the DFA. The TS showed postoperative results of 2.86 ± 2.02° (mean change 1.76 ± 2.85°). Conclusion. PSCBs for TKA seem to restore the overall leg alignment. Our data suggest that each individual component can be implanted accurately and that the results are comparable to the ones in CAS. PMID:25254210

  5. Read margin analysis of crossbar arrays using the cell-variability-aware simulation method

    NASA Astrophysics Data System (ADS)

    Sun, Wookyung; Choi, Sujin; Shin, Hyungsoon

    2018-02-01

This paper proposes a new concept of read margin analysis of crossbar arrays using cell-variability-aware simulation. The size of the crossbar array should be considered when predicting the read margin characteristic of the crossbar array, because the read margin depends on the number of word lines and bit lines. However, excessively high CPU time is required to simulate large arrays using a commercial circuit simulator. A variability-aware MATLAB simulator that considers independent variability sources is developed to analyze the characteristics of the read margin according to the array size. The developed MATLAB simulator provides an effective method for reducing the simulation time while maintaining the accuracy of the read margin estimation in the crossbar array. The simulation is also highly efficient in analyzing the characteristics of the crossbar memory array considering the statistical variations in the cell characteristics.
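The array-size dependence of the read margin can be illustrated with a much-simplified worst-case sneak-path model (three groups of unselected cells in series, each group in parallel). All values here are hypothetical, and this lumped model is a stand-in for, not a reproduction of, the paper's variability-aware simulator:

```python
def read_voltage(r_cell, r_unsel, n, m, r_sense=1e3, v_read=1.0):
    """Sense-resistor voltage for one selected cell in an n x m crossbar.
    Worst-case sneak path: three cell groups in series (floating-line scheme,
    all unselected cells at resistance r_unsel)."""
    r_sneak = r_unsel / (m - 1) + r_unsel / ((m - 1) * (n - 1)) + r_unsel / (n - 1)
    r_eff = 1.0 / (1.0 / r_cell + 1.0 / r_sneak)  # selected cell parallel to sneak path
    return v_read * r_sense / (r_sense + r_eff)

def read_margin(r_lrs, r_hrs, n, m):
    """Normalized sense-voltage difference between reading an LRS and an HRS cell."""
    return read_voltage(r_lrs, r_lrs, n, m) - read_voltage(r_hrs, r_lrs, n, m)

# Margin shrinks as the array grows (hypothetical 10 kOhm / 1 MOhm cells)
for size in (8, 32, 128):
    print(size, round(read_margin(1e4, 1e6, size, size), 3))
```

Even this crude model reproduces the qualitative point of the abstract: more word and bit lines mean more parallel sneak paths, so the read margin collapses with array size.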

  6. Evaluation of margins in head and neck squamous cell carcinoma from the surgeon's perspective.

    PubMed

    Baumeister, Philipp; Baumüller, Konstantin; Harréus, Ulrich; Reiter, Maximilian; Welz, Christian

    2018-05-01

The surgeon's evaluation of resection status based on frozen section analysis during an operation and the pathological examination of resected specimens often differ. For this study, we recapitulated the surgeon's perspective during an operation, classified the evaluation of margins by the surgeon accordingly, and analyzed its impact on the outcome compared with the pathological results. This was a retrospective analysis. As data sources, paper-based and digital patient files, as well as the Munich Cancer Registry database, were used. Three hundred ninety-six cases were included in this analysis. Only the evaluation of margins by the surgeon influenced local control, whereas the pathological results influenced disease-free survival (DFS). Surprisingly, margins of >5 mm of normal tissue around cancer growth led to significantly worse local control and overall survival (OS) than 1- to 5-mm resections. The evaluation of margins by the surgeon is of significant importance for local control and OS. It is largely based on frozen section analysis, which, therefore, should be used whenever possible. © 2018 Wiley Periodicals, Inc.

  7. Spatial and temporal distributions of surface mass balance between Concordia and Vostok stations, Antarctica, from combined radar and ice core data: first results and detailed error analysis

    NASA Astrophysics Data System (ADS)

    Le Meur, Emmanuel; Magand, Olivier; Arnaud, Laurent; Fily, Michel; Frezzotti, Massimo; Cavitte, Marie; Mulvaney, Robert; Urbini, Stefano

    2018-05-01

    Results from ground-penetrating radar (GPR) measurements and shallow ice cores carried out during a scientific traverse between Dome Concordia (DC) and Vostok stations are presented in order to infer both spatial and temporal characteristics of snow accumulation over the East Antarctic Plateau. Spatially continuous accumulation rates along the traverse are computed from the identification of three equally spaced radar reflections spanning about the last 600 years. Accurate dating of these internal reflection horizons (IRHs) is obtained from a depth-age relationship derived from volcanic horizons and bomb testing fallouts on a DC ice core and shows a very good consistency when tested against extra ice cores drilled along the radar profile. Accumulation rates are then inferred by accounting for density profiles down to each IRH. For the latter purpose, a careful error analysis showed that using a single and more accurate density profile along a DC core provided more reliable results than trying to include the potential spatial variability in density from extra (but less accurate) ice cores distributed along the profile. The most striking feature is an accumulation pattern that remains constant through time with persistent gradients such as a marked decrease from 26 mm w.e. yr-1 at DC to 20 mm w.e. yr-1 at the south-west end of the profile over the last 234 years on average (with a similar decrease from 25 to 19 mm w.e. yr-1 over the last 592 years). As for the time dependency, despite an overall consistency with similar measurements carried out along the main East Antarctic divides, interpreting possible trends remains difficult. Indeed, error bars in our measurements are still too large to unambiguously infer an apparent time increase in accumulation rate. For the proposed absolute values, maximum margins of error are in the range 4 mm w.e. yr-1 (last 234 years) to 2 mm w.e. 
yr-1 (last 592 years), a decrease with depth mainly resulting from the time-averaging when computing accumulation rates.

  8. MR Imaging Radiomics Signatures for Predicting the Risk of Breast Cancer Recurrence as Given by Research Versions of MammaPrint, Oncotype DX, and PAM50 Gene Assays.

    PubMed

    Li, Hui; Zhu, Yitan; Burnside, Elizabeth S; Drukker, Karen; Hoadley, Katherine A; Fan, Cheng; Conzen, Suzanne D; Whitman, Gary J; Sutton, Elizabeth J; Net, Jose M; Ganott, Marie; Huang, Erich; Morris, Elizabeth A; Perou, Charles M; Ji, Yuan; Giger, Maryellen L

    2016-11-01

Purpose To investigate relationships between computer-extracted breast magnetic resonance (MR) imaging phenotypes and multigene assays of MammaPrint, Oncotype DX, and PAM50 to assess the role of radiomics in evaluating the risk of breast cancer recurrence. Materials and Methods Analysis was conducted on an institutional review board-approved retrospective data set of 84 deidentified, multi-institutional breast MR examinations from the National Cancer Institute Cancer Imaging Archive, along with clinical, histopathologic, and genomic data from The Cancer Genome Atlas. The data set of biopsy-proven invasive breast cancers included 74 (88%) ductal, eight (10%) lobular, and two (2%) mixed cancers. Of these, 73 (87%) were estrogen receptor positive, 67 (80%) were progesterone receptor positive, and 19 (23%) were human epidermal growth factor receptor 2 positive. For each case, computerized radiomics of the MR images yielded computer-extracted tumor phenotypes of size, shape, margin morphology, enhancement texture, and kinetic assessment. Regression and receiver operating characteristic analysis were conducted to assess the predictive ability of the MR radiomics features relative to the multigene assay classifications. Results Multiple linear regression analyses demonstrated significant associations (R² = 0.25-0.32, r = 0.5-0.56, P < .0001) between radiomics signatures and multigene assay recurrence scores. Important radiomics features included tumor size and enhancement texture, which indicated tumor heterogeneity.
Use of radiomics in the task of distinguishing between good and poor prognosis yielded area under the receiver operating characteristic curve values of 0.88 (standard error, 0.05), 0.76 (standard error, 0.06), 0.68 (standard error, 0.08), and 0.55 (standard error, 0.09) for MammaPrint, Oncotype DX, PAM50 risk of relapse based on subtype, and PAM50 risk of relapse based on subtype and proliferation, respectively, with all but the latter showing statistical difference from chance. Conclusion Quantitative breast MR imaging radiomics shows promise for image-based phenotyping in assessing the risk of breast cancer recurrence. © RSNA, 2016 Online supplemental material is available for this article.
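The reported AUC values and standard errors follow standard recipes: AUC as the Mann-Whitney rank statistic and its Hanley-McNeil (1982) standard error. A sketch with hypothetical radiomics scores (not the study's data):

```python
import math

def auc_mann_whitney(pos_scores, neg_scores):
    """AUC as the probability that a positive case outranks a negative one
    (ties count one half)."""
    wins = 0.0
    for p in pos_scores:
        for q in neg_scores:
            wins += 1.0 if p > q else 0.5 if p == q else 0.0
    return wins / (len(pos_scores) * len(neg_scores))

def auc_se_hanley_mcneil(auc, n_pos, n_neg):
    """Hanley-McNeil standard error of the AUC."""
    q1 = auc / (2 - auc)
    q2 = 2 * auc * auc / (1 + auc)
    var = (auc * (1 - auc) + (n_pos - 1) * (q1 - auc ** 2)
           + (n_neg - 1) * (q2 - auc ** 2)) / (n_pos * n_neg)
    return math.sqrt(var)

# Hypothetical radiomics scores for poor- vs good-prognosis cases
poor = [0.9, 0.8, 0.75, 0.6, 0.55]
good = [0.7, 0.5, 0.45, 0.3, 0.2]
a = auc_mann_whitney(poor, good)
print(round(a, 2), round(auc_se_hanley_mcneil(a, len(poor), len(good)), 3))
```

The "statistical difference from chance" claim in the abstract corresponds to checking whether the AUC minus 1.96 times this standard error stays above 0.5.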

  9. Differential Motion Between Mediastinal Lymph Nodes and Primary Tumor in Radically Irradiated Lung Cancer Patients

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schaake, Eva E.; Department of Radiation Oncology, The Netherlands Cancer Institute, Amsterdam; Rossi, Maddalena M.G.

    2014-11-15

Purpose/Objective: In patients with locally advanced lung cancer, planning target volume margins for mediastinal lymph nodes and tumor after a correction protocol based on bony anatomy registration typically range from 1 to 1.5 cm. Detailed information about lymph node motion variability and differential motion with the primary tumor, however, is lacking from large series. In this study, lymph node and tumor position variability were analyzed in detail and correlated to the main carina to evaluate possible margin reduction. Methods and Materials: Small gold fiducial markers (0.35 × 5 mm) were placed in the mediastinal lymph nodes of 51 patients with non-small cell lung cancer during routine diagnostic esophageal or bronchial endoscopic ultrasonography. Four-dimensional (4D) planning computed tomographic (CT) and daily 4D cone beam (CB) CT scans were acquired before and during radical radiation therapy (66 Gy in 24 fractions). Each CBCT was registered in 3-dimensions (bony anatomy) and 4D (tumor, marker, and carina) to the planning CT scan. Subsequently, systematic and random residual misalignments of the time-averaged lymph node and tumor position relative to the bony anatomy and carina were determined. Additionally, tumor and lymph node respiratory amplitude variability was quantified. Finally, required margins were quantified by use of a recipe for dual targets. Results: Relative to the bony anatomy, systematic and random errors ranged from 0.16 to 0.32 cm for the markers and from 0.15 to 0.33 cm for the tumor, but despite similar ranges there was limited correlation (0.17-0.71) owing to differential motion. A large variability in lymph node amplitude between patients was observed, with an average motion of 0.56 cm in the cranial-caudal direction.
Margins could be reduced by 10% (left-right), 27% (cranial-caudal), and 10% (anteroposterior) for the lymph nodes and −2%, 15%, and 7% for the tumor if an online carina registration protocol replaced a protocol based on bony anatomy registration. Conclusions: Detailed analysis revealed considerable lymph node position variability, differential motion, and respiratory motion. Planning target volume margins can be reduced up to 27% in lung cancer patients when the carina registration replaces bony anatomy registration.

  10. A Quantitative Diffuse Reflectance Imaging (QDRI) System for Comprehensive Surveillance of the Morphological Landscape in Breast Tumor Margins.

    PubMed

    Nichols, Brandon S; Schindler, Christine E; Brown, Jonathon Q; Wilke, Lee G; Mulvey, Christine S; Krieger, Marlee S; Gallagher, Jennifer; Geradts, Joseph; Greenup, Rachel A; Von Windheim, Jesko A; Ramanujam, Nirmala

    2015-01-01

In an ongoing effort to address the clear unmet clinical needs surrounding breast conserving surgery (BCS), our group has developed a next-generation multiplexed optical-fiber-based tool to assess breast tumor margin status during initial surgeries. Specifically detailed in this work is the performance and clinical validation of a research-grade intra-operative tool for margin assessment based on diffuse optical spectroscopy. Previous work published by our group has illustrated the proof-of-concept generations of this device; here we incorporate a highly optimized quantitative diffuse reflectance imaging (QDRI) system utilizing a wide-field (imaging area = 17 cm(2)) 49-channel multiplexed fiber optic probe, a custom raster-scanning imaging platform, a custom dual-channel white LED source, and an astronomy grade imaging CCD and spectrograph. The system signal to noise ratio (SNR) was found to be greater than 40 dB for all channels. Optical property estimation error was found to be less than 10%, on average, over a wide range of absorption (μa = 0-8.9 cm(-1)) and scattering (μs' = 7.0-9.7 cm(-1)) coefficients. Very low inter-channel and CCD crosstalk was observed (2% max) when used on turbid media (including breast tissue). A raster-scanning mechanism was developed to achieve sub-pixel resolution and was found to be optimally performed at an upsample factor of 8, affording 0.75 mm spatially resolved diffuse reflectance images (λ = 450-600 nm) of an entire margin (area = 17 cm(2)) in 13.8 minutes (1.23 cm(2)/min). Moreover, controlled pressure application at the probe-tissue interface afforded by the imaging platform reduces repeated scan variability, providing <1% variation across repeated scans of clinical specimens. We demonstrate the clinical utility of this device through a pilot 20-patient study of high-resolution optical parameter maps of the ratio of the β-carotene concentration to the reduced scattering coefficient.
An empirical cumulative distribution function (eCDF) analysis is used to reduce optical property maps to quantitative distributions representing the morphological landscape of breast tumor margins. The optimizations presented in this work provide an avenue to rapidly survey large tissue areas on intra-operative time scales with improved sensitivity to regions of focal disease that may otherwise be overlooked.
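The eCDF reduction described above is a standard construction; the sketch below shows it on a hypothetical optical-property map (`optical_map` and its dimensions are illustrative assumptions, not values from the paper).

```python
import numpy as np

def ecdf(values):
    """Return sorted sample points and the empirical CDF evaluated at them."""
    x = np.sort(np.asarray(values, dtype=float).ravel())
    y = np.arange(1, x.size + 1) / x.size  # F(x_i) = i / n
    return x, y

# Hypothetical 2D map of per-pixel [beta-carotene] / mu_s' ratios.
rng = np.random.default_rng(0)
optical_map = rng.lognormal(mean=0.0, sigma=0.5, size=(64, 64))

x, F = ecdf(optical_map)
# The eCDF is monotone and ends at 1, so whole-margin maps can be
# compared as distributions rather than pixel-by-pixel.
```

Reducing each margin image to its eCDF makes margins of different sizes directly comparable as quantitative distributions.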

  11. Using sediment 'fingerprints' to assess sediment-budget errors, north Halawa Valley, Oahu, Hawaii, 1991-92

    USGS Publications Warehouse

    Hill, B.R.; DeCarlo, E.H.; Fuller, C.C.; Wong, M.F.

    1998-01-01

Reliable estimates of sediment-budget errors are important for interpreting sediment-budget results. Sediment-budget errors are commonly considered equal to sediment-budget imbalances, which may underestimate actual sediment-budget errors if they include compensating positive and negative errors. We modified the sediment 'fingerprinting' approach to qualitatively evaluate compensating errors in an annual (1991) fine (<63 μm) sediment budget for the North Halawa Valley, a mountainous, forested drainage basin on the island of Oahu, Hawaii, during construction of a major highway. We measured concentrations of aeolian quartz and 137Cs in sediment sources and fluvial sediments, and combined concentrations of these aerosols with the sediment budget to construct aerosol budgets. Aerosol concentrations were independent of the sediment budget, hence aerosol budgets were less likely than sediment budgets to include compensating errors. Differences between sediment-budget and aerosol-budget imbalances therefore provide a measure of compensating errors in the sediment budget. The sediment-budget imbalance equalled 25% of the fluvial fine-sediment load. Aerosol-budget imbalances were equal to 19% of the fluvial 137Cs load and 34% of the fluvial quartz load. The reasonably close agreement between sediment- and aerosol-budget imbalances indicates that compensating errors in the sediment budget were not large and that the sediment-budget imbalance is a reliable measure of sediment-budget error. We attribute at least one-third of the 1991 fluvial fine-sediment load to highway construction. Continued monitoring indicated that highway construction produced 90% of the fluvial fine-sediment load during 1992. Erosion of channel margins and attrition of coarse particles provided most of the fine sediment produced by natural processes. Hillslope processes contributed relatively minor amounts of sediment.

  12. Enhanced robust fractional order proportional-plus-integral controller based on neural network for velocity control of permanent magnet synchronous motor.

    PubMed

    Zhang, Bitao; Pi, YouGuo

    2013-07-01

The traditional integer order proportional-integral-differential (IO-PID) controller is sensitive to parameter variation and/or external load disturbance of the permanent magnet synchronous motor (PMSM). A fractional order proportional-integral-differential (FO-PID) control scheme based on a robustness tuning method has been proposed to enhance robustness, but that robustness addresses only open-loop gain variation of the controlled plant. In this paper, an enhanced robust fractional order proportional-plus-integral (ERFOPI) controller based on a neural network is proposed. The control law of the ERFOPI controller acts on a fractional order implement function (FOIF) of the tracking error rather than on the tracking error directly, which, according to theoretical analysis, can enhance the robust performance of the system. Tuning rules and approaches, based on phase margin and crossover frequency specifications and on robustness to gain variation, are introduced to obtain the parameters of the ERFOPI controller, and a neural network algorithm is used to adjust the parameter of the FOIF. Simulation and experimental results show that the proposed method not only achieves favorable tracking performance but is also robust with regard to external load disturbance and parameter variation. Crown Copyright © 2013. Published by Elsevier Ltd. All rights reserved.
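The phase margin and crossover frequency specifications mentioned above can be checked numerically. The sketch below evaluates a fractional-order PI controller C(s) = Kp + Ki/s^λ against a hypothetical integrator-plus-lag velocity-loop plant; the gains, λ, and the plant are illustrative assumptions, not the paper's ERFOPI design.

```python
import numpy as np

def fo_pi(w, Kp, Ki, lam):
    """Frequency response of a fractional-order PI, C(s) = Kp + Ki / s**lam,
    at s = jw, using (jw)**lam = w**lam * exp(j * lam * pi / 2)."""
    s_lam = (w ** lam) * np.exp(1j * lam * np.pi / 2)
    return Kp + Ki / s_lam

def plant(w, K=1.0, T=0.1):
    """Hypothetical velocity-loop plant: an integrator plus first-order lag."""
    jw = 1j * w
    return K / (jw * (T * jw + 1))

w = np.logspace(-2, 3, 20000)                      # rad/s
L = fo_pi(w, Kp=10.0, Ki=50.0, lam=0.9) * plant(w)

# Gain-crossover frequency is where |L(jw)| = 1; the phase margin is
# 180 degrees plus the open-loop phase there.
i = int(np.argmin(np.abs(np.abs(L) - 1.0)))
wc = w[i]
pm_deg = 180.0 + np.degrees(np.angle(L[i]))
```

Robustness tuning then amounts to choosing Kp, Ki, and λ so that `pm_deg` hits the specification and the phase is flat near `wc`.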

  13. Comparison of frequency-domain and time-domain rotorcraft vibration control methods

    NASA Technical Reports Server (NTRS)

    Gupta, N. K.

    1984-01-01

Active control of rotor-induced vibration in rotorcraft has received significant attention recently. Two classes of techniques have been proposed. The more developed approach works with harmonic analysis of measured time histories and is called the frequency-domain approach. The more recent approach computes the control input directly from the measured time history data and is called the time-domain approach. This report summarizes the results of a theoretical investigation comparing the two approaches. Five specific areas were addressed: (1) techniques to derive models needed for control design (system identification methods), (2) robustness with respect to errors, (3) transient response, (4) susceptibility to noise, and (5) implementation difficulties. The system identification methods are more difficult for the time-domain models. The time-domain approach is more robust (e.g., has higher gain and phase margins) than the frequency-domain approach. It might thus be possible to avoid real-time system identification in the time-domain approach by storing models at a number of flight conditions. The most significant error source is the variation in open-loop vibrations caused by pilot inputs, maneuvers, or gusts. The implementation requirements are similar, except that the time-domain approach could be much simpler to implement if real-time system identification were not necessary.

  14. Meta-analysis of studies with bivariate binary outcomes: a marginal beta-binomial model approach.

    PubMed

    Chen, Yong; Hong, Chuan; Ning, Yang; Su, Xiao

    2016-01-15

When conducting a meta-analysis of studies with bivariate binary outcomes, challenges arise because the within-study correlation and between-study heterogeneity must be taken into account. In this paper, we propose a marginal beta-binomial model for the meta-analysis of studies with binary outcomes. This model is based on the composite likelihood approach and has several attractive features compared with existing models such as the bivariate generalized linear mixed model (Chu and Cole, 2006) and the Sarmanov beta-binomial model (Chen et al., 2012). The advantages of the proposed marginal model include modeling the probabilities on the original scale, not requiring any transformation of probabilities or any link function, having a closed-form expression of the likelihood function, and placing no constraints on the correlation parameter. More importantly, because the marginal beta-binomial model is based only on the marginal distributions, it does not suffer from potential misspecification of the joint distribution of bivariate study-specific probabilities. Such misspecification is difficult to detect and can lead to biased inference using current methods. We compare the performance of the marginal beta-binomial model with the bivariate generalized linear mixed model and the Sarmanov beta-binomial model by simulation studies. Interestingly, the results show that the marginal beta-binomial model performs better than the Sarmanov beta-binomial model, whether or not the true model is Sarmanov beta-binomial, and the marginal beta-binomial model is more robust than the bivariate generalized linear mixed model under model misspecifications. Two meta-analyses of diagnostic accuracy studies and a meta-analysis of case-control studies are conducted for illustration. Copyright © 2015 John Wiley & Sons, Ltd.
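The closed-form marginal likelihood that such a model exploits is the beta-binomial pmf. A minimal sketch using SciPy's `betabinom` distribution; the study counts and shape parameters are hypothetical, not taken from the paper.

```python
from scipy.stats import betabinom

# Hypothetical meta-analysis ingredient: in one study, k of n diseased
# subjects test positive; sensitivity varies across studies as Beta(a, b),
# so the marginal likelihood of k is beta-binomial in closed form.
n, a, b = 50, 8.0, 2.0
mean_sensitivity = a / (a + b)                    # marginal mean = 0.8
lik_40 = betabinom.pmf(40, n, a, b)               # closed-form likelihood of k = 40
loglik = sum(betabinom.logpmf(k, n, a, b) for k in (38, 41, 45))
```

No link function or probability transformation is needed: the Beta shape parameters act directly on the original probability scale.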

  15. Kerr Reservoir LANDSAT experiment analysis for March 1981

    NASA Technical Reports Server (NTRS)

    Lecroy, S. R. (Principal Investigator)

    1982-01-01

LANDSAT radiance data were used in an experiment conducted on the waters of Kerr Reservoir to determine if reliable algorithms could be developed that relate water quality parameters to remotely sensed data. A mix of different types of algorithms using the LANDSAT bands was generated to provide a thorough understanding of the relationships among the data involved. Except for Secchi depth, the study demonstrated that for the ranges measured, the algorithms that satisfactorily represented the data encompass a mix of linear and nonlinear forms using only one LANDSAT band. Ratioing techniques did not improve the results, since the initial design of the experiment minimized the errors against which this procedure is effective. Good correlations were found for total suspended solids, iron, turbidity, and Secchi depth. Marginal correlations were found for nitrate and tannin + lignin. Quantification maps of Kerr Reservoir are presented for many of the water quality parameters using the developed algorithms.

  16. Hybrid density-functional calculations of phonons in LaCoO3

    NASA Astrophysics Data System (ADS)

    Gryaznov, Denis; Evarestov, Robert A.; Maier, Joachim

    2010-12-01

Phonon frequencies at the Γ point in the nonmagnetic rhombohedral phase of LaCoO3 were calculated using density-functional theory with the hybrid exchange-correlation functional PBE0. The calculations involved a comparison of results for two types of basis functions commonly used in ab initio calculations, namely, the plane-wave approach and the linear combination of atomic orbitals, as implemented in the VASP and CRYSTAL computer codes, respectively. Good qualitative agreement, and quantitative agreement within an error margin of less than 30%, was observed not only between the two formalisms but also between theoretical and experimental phonon frequencies. Moreover, the correlation between the phonon symmetries in the cubic and rhombohedral phases is discussed in detail on the basis of group-theoretical analysis. It is concluded that the hybrid PBE0 functional is able to predict correctly the phonon properties of LaCoO3.

  17. An efficient genome-wide association test for mixed binary and continuous phenotypes with applications to substance abuse research.

    PubMed

    Buu, Anne; Williams, L Keoki; Yang, James J

    2018-03-01

We propose a new genome-wide association test for mixed binary and continuous phenotypes that uses an efficient numerical method to estimate the empirical distribution of the Fisher's combination statistic under the null hypothesis. Our simulation study shows that the proposed method controls the type I error rate and also maintains its power at the level of the permutation method. More importantly, the computational efficiency of the proposed method is much higher than that of the permutation method. The simulation results also indicate that the power of the test increases when the genetic effect increases, the minor allele frequency increases, and the correlation between responses decreases. The statistical analysis of the database of the Study of Addiction: Genetics and Environment demonstrates that the proposed method combining multiple phenotypes can increase the power of identifying markers that may not otherwise be chosen using marginal tests.
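Fisher's combination statistic itself is simple to compute. The sketch below uses the chi-square reference distribution, which holds only for independent tests; the paper estimates an empirical null precisely because mixed phenotypes are correlated. The p-values are hypothetical.

```python
import numpy as np
from scipy.stats import chi2

def fisher_combine(pvalues):
    """Fisher's combination statistic T = -2 * sum(log p_i).  With k
    independent tests under the null, T ~ chi-square with 2k degrees of
    freedom; correlated phenotypes require an empirical null instead."""
    p = np.asarray(pvalues, dtype=float)
    T = -2.0 * np.sum(np.log(p))
    return T, float(chi2.sf(T, df=2 * p.size))

# Hypothetical marginal p-values for one marker: one binary and one
# continuous phenotype.
T, p_combined = fisher_combine([0.04, 0.03])
```

Combining the two marginal tests gives a smaller p-value than either alone, which is the power gain the abstract describes.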

  18. Aeronautical audio broadcasting via satellite

    NASA Technical Reports Server (NTRS)

    Tzeng, Forrest F.

    1993-01-01

A system design for aeronautical audio broadcasting, with C-band uplink and L-band downlink, via Inmarsat space segments is presented. Near-transparent-quality compression of 5-kHz bandwidth audio at 20.5 kbit/s is achieved based on a hybrid technique employing linear predictive modeling and transform-domain residual quantization. Concatenated Reed-Solomon/convolutional codes with quadrature phase shift keying are selected for bandwidth and power efficiency. An RF bandwidth of 25 kHz per channel, and a decoded bit error rate of 10⁻⁶ at an Eb/N0 of 3.75 dB, are obtained. An interleaver, scrambler, modem synchronization, and frame format were designed, and frequency-division multiple access was selected over code-division multiple access. A link budget computation based on a worst-case scenario indicates sufficient system power margins. Transponder occupancy analysis for 72 audio channels demonstrates ample remaining capacity to accommodate emerging aeronautical services.
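As a rough check on why the concatenated coding is needed, the uncoded QPSK bit error rate at the quoted Eb/N0 can be computed from the standard AWGN formula; the coding gain itself is not modeled here.

```python
import math

def qpsk_ber_uncoded(ebn0_db):
    """Uncoded QPSK/BPSK bit error rate over AWGN: 0.5 * erfc(sqrt(Eb/N0))."""
    ebn0 = 10.0 ** (ebn0_db / 10.0)   # dB -> linear
    return 0.5 * math.erfc(math.sqrt(ebn0))

ber = qpsk_ber_uncoded(3.75)
# Roughly 1.5e-2 uncoded at 3.75 dB; the concatenated Reed-Solomon /
# convolutional code is what closes the gap to the 1e-6 decoded BER above.
```

The several-orders-of-magnitude gap between the uncoded and decoded BER is exactly the coding gain the concatenated scheme supplies.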

  19. Power Budget Analysis of Colorless Hybrid WDM/TDM-PON Scheme Using Downstream DPSK and Re-modulated Upstream OOK Data Signals

    NASA Astrophysics Data System (ADS)

    Khan, Yousaf; Afridi, Muhammad Idrees; Khan, Ahmed Mudassir; Rehman, Waheed Ur; Khan, Jahanzeb

    2014-09-01

Hybrid wavelength-division multiplexed/time-division multiplexed passive optical access networks (WDM/TDM-PONs) combine the advanced features of both WDM and TDM PONs to provide a cost-effective access network solution. We demonstrate and analyze the transmission performance and power budget issues of a colorless hybrid WDM/TDM-PON scheme. A 10-Gb/s downstream differential phase shift keying (DPSK) signal and re-modulated upstream on/off keying (OOK) data signals are transmitted over 25 km of standard single mode fiber. Simulation results show error-free transmission with adequate power margins in both downstream and upstream directions, proving the applicability of the proposed scheme to future passive optical access networks. The power budget confines both the PON splitting ratio and the distance between the Optical Line Terminal (OLT) and the Optical Network Unit (ONU).

  20. Accuracy of computer-guided implantation in a human cadaver model.

    PubMed

    Yatzkair, Gustavo; Cheng, Alice; Brodie, Stan; Raviv, Eli; Boyan, Barbara D; Schwartz, Zvi

    2015-10-01

    To examine the accuracy of computer-guided implantation using a human cadaver model with reduced experimental variability. Twenty-eight (28) dental implants representing 12 clinical cases were placed in four cadaver heads using a static guided implantation template. All planning and surgeries were performed by one clinician. All radiographs and measurements were performed by two examiners. The distance of the implants from buccal and lingual bone and mesial implant or tooth was analyzed at the apical and coronal levels, and measurements were compared to the planned values. No significant differences were seen between planned and implanted measurements. Average deviation of an implant from its planning radiograph was 0.8 mm, which is within the range of variability expected from CT analysis. Guided implantation can be used safely with a margin of error of 1 mm. © 2014 The Authors. Clinical Oral Implants Research Published by John Wiley & Sons Ltd.

  1. CEREC CAD/CAM Chairside System

    PubMed Central

    SANNINO, G.; GERMANO, F.; ARCURI, L.; BIGELLI, E.; ARCURI, C.; BARLATTANI, A.

    2014-01-01

SUMMARY Purpose. The aim of this paper was to describe the CEREC 3 chairside system, providing clinicians a detailed analysis of the whole digital workflow. Benefits and limitations of this technology compared with the conventional prosthetic workflow were also highlighted and discussed. Materials and methods. Clinical procedures (tooth preparation, impression taking, adhesive luting), operational components and their capabilities, as well as restorative materials used with the CEREC 3 chairside system, were reported. Results. The CEREC system has shown many positive aspects that make the prosthetic workflow easier, faster, and less expensive. Operator-dependent errors are minimized compared to the conventional prosthetic protocol. Furthermore, a better acceptance level for the impression procedure was shown by the patients. The only drawback could be the subgingival placement of the margins compared with supra/juxta gingival margins, since more time was required for the impression taking as well as the adhesive luting phase. The biocopy project seemed to be the best tool to obtain functionalized surfaces and keep gnathological data unchanged. Material selection was related to the type of restoration. Conclusions. The evidence of our clinical practice suggests that the CEREC 3 chairside system makes it possible to produce highly aesthetic and reliable restorations in a single visit, while minimizing costs and patient discomfort during prosthetic treatment. However, improvements in materials and technologies are needed in order to overcome the current drawbacks. PMID:25992260

  2. Stability analysis of fuzzy parametric uncertain systems.

    PubMed

    Bhiwani, R J; Patre, B M

    2011-10-01

In this paper, the determination of the stability margin and the gain and phase margins of fuzzy parametric uncertain systems (FPUS) is dealt with. The stability analysis of uncertain linear systems with coefficients described by fuzzy functions is studied, and a complexity-reduced technique for determining the stability margin of FPUS is proposed. The method suggested depends on the order of the characteristic polynomial. In order to find the stability margin of interval polynomials of order less than 5, it is not always necessary to determine and check all four Kharitonov polynomials. It has been shown that, for determining the stability margin of FPUS of order five, four, and three, we require only three, two, and one Kharitonov polynomials, respectively. Only for sixth- and higher-order polynomials is the complete set of Kharitonov polynomials needed to determine the stability margin. Thus, for lower-order systems, the calculations are reduced to a large extent. This idea has been extended to determine the stability margin of fuzzy interval polynomials. It is also shown that the gain and phase margins of FPUS can be determined analytically without using graphical techniques. Copyright © 2011 ISA. Published by Elsevier Ltd. All rights reserved.
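The four Kharitonov polynomials referred to above are formed by fixed min/max selection patterns over the interval coefficients. A minimal sketch with a hypothetical interval polynomial (the coefficient bounds are illustrative):

```python
def kharitonov(lower, upper):
    """Build the four Kharitonov polynomials of an interval polynomial.

    lower[i] and upper[i] bound the coefficient of s**i (lowest degree
    first).  Each polynomial picks bounds in a repeating 4-pattern:
      K1: (min, min, max, max, ...)   K2: (max, max, min, min, ...)
      K3: (min, max, max, min, ...)   K4: (max, min, min, max, ...)
    """
    patterns = [(0, 0, 1, 1), (1, 1, 0, 0), (0, 1, 1, 0), (1, 0, 0, 1)]
    bounds = (lower, upper)
    return [[bounds[pat[i % 4]][i] for i in range(len(lower))]
            for pat in patterns]

# Hypothetical 3rd-order interval polynomial a0 + a1*s + a2*s^2 + a3*s^3.
K1, K2, K3, K4 = kharitonov([1.0, 2.0, 3.0, 1.0], [1.5, 2.5, 3.5, 1.2])
```

By Kharitonov's theorem, the whole interval family is Hurwitz-stable if and only if these four vertex polynomials are; the paper's contribution is that fewer than four suffice below order six.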

  3. Program budgeting and marginal analysis: a case study in chronic airflow limitation.

    PubMed

    Crockett, A; Cranston, J; Moss, J; Scown, P; Mooney, G; Alpers, J

    1999-01-01

    Program budgeting and marginal analysis is a method of priority-setting in health care. This article describes how this method was applied to the management of a disease-specific group, chronic airflow limitation. A sub-program flow chart clarified the major cost drivers. After assessment of the technical efficiency of the sub-programs and careful and detailed analysis, incremental and decremental wish lists of activities were established. Program budgeting and marginal analysis provides a framework for rational resource allocation. The nurturing of a vigorous program management group, with members representing all participants in the process (including patients/consumers), is the key to a successful outcome.

  4. Dosimetric effects of patient rotational setup errors on prostate IMRT treatments

    NASA Astrophysics Data System (ADS)

    Fu, Weihua; Yang, Yong; Li, Xiang; Heron, Dwight E.; Saiful Huq, M.; Yue, Ning J.

    2006-10-01

    The purpose of this work is to determine dose delivery errors that could result from systematic rotational setup errors (ΔΦ) for prostate cancer patients treated with three-phase sequential boost IMRT. In order to implement this, different rotational setup errors around three Cartesian axes were simulated for five prostate patients and dosimetric indices, such as dose-volume histogram (DVH), tumour control probability (TCP), normal tissue complication probability (NTCP) and equivalent uniform dose (EUD), were employed to evaluate the corresponding dosimetric influences. Rotational setup errors were simulated by adjusting the gantry, collimator and horizontal couch angles of treatment beams and the dosimetric effects were evaluated by recomputing the dose distributions in the treatment planning system. Our results indicated that, for prostate cancer treatment with the three-phase sequential boost IMRT technique, the rotational setup errors do not have significant dosimetric impacts on the cumulative plan. Even in the worst-case scenario with ΔΦ = 3°, the prostate EUD varied within 1.5% and TCP decreased about 1%. For seminal vesicle, slightly larger influences were observed. However, EUD and TCP changes were still within 2%. The influence on sensitive structures, such as rectum and bladder, is also negligible. This study demonstrates that the rotational setup error degrades the dosimetric coverage of target volume in prostate cancer treatment to a certain degree. However, the degradation was not significant for the three-phase sequential boost prostate IMRT technique and for the margin sizes used in our institution.
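The generalized EUD used in the evaluation above has a simple closed form, EUD = (Σ v_i D_i^a)^(1/a). A minimal sketch with a hypothetical differential DVH; the dose bins and the parameter a are illustrative, not the study's data.

```python
import numpy as np

def eud(dose_gy, volume_frac, a):
    """Generalized equivalent uniform dose, EUD = (sum v_i * D_i**a)**(1/a).
    Negative a emphasises cold spots (targets); large positive a emphasises
    hot spots (serial organs at risk)."""
    D = np.asarray(dose_gy, dtype=float)
    v = np.asarray(volume_frac, dtype=float)
    v = v / v.sum()                          # normalise partial volumes
    return float(np.sum(v * D ** a) ** (1.0 / a))

# Hypothetical differential DVH: three equal-volume dose bins of a target.
eud_target = eud([70.0, 72.0, 74.0], [1.0, 1.0, 1.0], a=-10.0)
# With a < 0 the result sits below the arithmetic mean, pulled toward
# the coldest bin, which is why EUD is sensitive to target underdosage.
```

A sub-2% EUD change under a 3° rotation, as reported above, therefore indicates that neither cold spots in the target nor hot spots in the organs at risk moved appreciably.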

  5. TED: A Tolerant Edit Distance for segmentation evaluation.

    PubMed

    Funke, Jan; Klein, Jonas; Moreno-Noguer, Francesc; Cardona, Albert; Cook, Matthew

    2017-02-15

In this paper, we present a novel error measure to compare a computer-generated segmentation of images or volumes against ground truth. This measure, which we call Tolerant Edit Distance (TED), is motivated by two observations that we usually encounter in biomedical image processing: (1) Some errors, like small boundary shifts, are tolerable in practice. Which errors are tolerable is application dependent and should be explicitly expressible in the measure. (2) Non-tolerable errors have to be corrected manually. The effort needed to do so should be reflected by the error measure. Our measure is the minimal weighted sum of split and merge operations to apply to one segmentation such that it resembles another segmentation within specified tolerance bounds. This is in contrast to other commonly used measures like Rand index or variation of information, which integrate small, but tolerable, differences. Additionally, the TED provides intuitive numbers and allows the localization and classification of errors in images or volumes. We demonstrate the applicability of the TED on 3D segmentations of neurons in electron microscopy images, where topological correctness is arguably more important than exact boundary locations. Furthermore, we show that the TED is not just limited to evaluation tasks. We use it as the loss function in a max-margin learning framework to find parameters of an automatic neuron segmentation algorithm. We show that training to minimize the TED, i.e., to minimize crucial errors, leads to higher segmentation accuracy compared to other learning methods. Copyright © 2016. Published by Elsevier Inc.

  6. Maximizing the quantitative accuracy and reproducibility of Förster resonance energy transfer measurement for screening by high throughput widefield microscopy

    PubMed Central

    Schaufele, Fred

    2013-01-01

    Förster resonance energy transfer (FRET) between fluorescent proteins (FPs) provides insights into the proximities and orientations of FPs as surrogates of the biochemical interactions and structures of the factors to which the FPs are genetically fused. As powerful as FRET methods are, technical issues have impeded their broad adoption in the biologic sciences. One hurdle to accurate and reproducible FRET microscopy measurement stems from variable fluorescence backgrounds both within a field and between different fields. Those variations introduce errors into the precise quantification of fluorescence levels on which the quantitative accuracy of FRET measurement is highly dependent. This measurement error is particularly problematic for screening campaigns since minimal well-to-well variation is necessary to faithfully identify wells with altered values. High content screening depends also upon maximizing the numbers of cells imaged, which is best achieved by low magnification high throughput microscopy. But, low magnification introduces flat-field correction issues that degrade the accuracy of background correction to cause poor reproducibility in FRET measurement. For live cell imaging, fluorescence of cell culture media in the fluorescence collection channels for the FPs commonly used for FRET analysis is a high source of background error. These signal-to-noise problems are compounded by the desire to express proteins at biologically meaningful levels that may only be marginally above the strong fluorescence background. Here, techniques are presented that correct for background fluctuations. Accurate calculation of FRET is realized even from images in which a non-flat background is 10-fold higher than the signal. PMID:23927839
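The non-flat background correction described here can be approximated by fitting a low-order polynomial surface to background-masked pixels and subtracting it. The sketch below is a generic version of that idea on synthetic data, not the paper's algorithm; the frame size, tilt, and mask geometry are all assumptions.

```python
import numpy as np

def fit_background(img, mask, deg=2):
    """Least-squares fit of a 2-D polynomial surface to the pixels where
    mask is True (pure background), returned evaluated on the full frame
    so the non-flat background can be subtracted everywhere."""
    ny, nx = img.shape
    Y, X = np.mgrid[0:ny, 0:nx]
    x = X / nx
    y = Y / ny
    terms = [x ** i * y ** j for i in range(deg + 1) for j in range(deg + 1 - i)]
    A = np.stack([t[mask] for t in terms], axis=1)
    coef, *_ = np.linalg.lstsq(A, img[mask], rcond=None)
    return sum(c * t for c, t in zip(coef, terms))

# Synthetic frame: a tilted background ~10x brighter than the dim signal.
ny, nx = 64, 64
Y, X = np.mgrid[0:ny, 0:nx]
img = 100.0 + 0.5 * X + 0.3 * Y
img[24:40, 24:40] += 12.0                 # dim foreground ("cell") signal
mask = np.ones((ny, nx), dtype=bool)
mask[20:44, 20:44] = False                # exclude the cell neighbourhood
corrected = img - fit_background(img, mask)
```

After subtraction the dim foreground stands out even though the original background was an order of magnitude brighter, which is the regime the abstract describes.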

  7. Engineering analysis division internal note. OFT-1 margin assessment. [results of analysis to determine space shuttle subsystem margins for orbital flight test

    NASA Technical Reports Server (NTRS)

    Austin, L. D., Jr.

    1977-01-01

This note documents the results of an analysis conducted to determine space shuttle vehicle (SSV) subsystem margins for the reference flight profile of the first orbital flight test (OFT-1). In general, the results show increased margins for the OFT-1 indicators when compared to Mission 3A results. The inclusion of element and wing aerodynamic variations on OFT-1, in combination with winds and systems dispersions, resulted in the forward Z (FTO1) attach load indicator exceeding specified limits. The inboard and outboard elevon hinge moment margins (9 percent and 3 percent, respectively) on OFT-1 reflect values consistent with the load relief requirements dictated by elevon hinge moment variations. All other structural load indicators for OFT-1 had margins in excess of 10 percent. The SSV first stage stagnation heating indicators for OFT-1 were about 45 percent less than for Mission 3A. The hydraulic system demands for both missions were essentially the same. The results also show OFT-1 performance requirements, from the consideration of winds and dispersions, to be approximately 3,300 lbs greater than for Mission 3A.

  8. Evaluation of marginal fit of two all-ceramic copings with two finish lines

    PubMed Central

    Subasi, Gulce; Ozturk, Nilgun; Inan, Ozgur; Bozogullari, Nalan

    2012-01-01

Objectives: This in-vitro study investigated the marginal fit of two all-ceramic copings with two finish line designs. Methods: Forty machined stainless steel molar die models with two different margin designs (chamfer and rounded shoulder) were prepared. A total of 40 standardized copings were fabricated and divided into 4 groups (n=10 for each finish line/coping material combination). The coping materials tested were IPS e.max Press and Zirkonzahn; the luting agent was Variolink II. Marginal fit was evaluated after cementation with a stereomicroscope (Leica MZ16). Two-way analysis of variance and the Tukey-HSD test were performed to assess the influence of each finish line design and ceramic type on the marginal fit of the two all-ceramic copings (α = .05). Results: Two-way analysis of variance revealed no statistically significant differences in marginal fit relative to finish lines (P=.362) or ceramic types (P=.065). Conclusion: Within the limitations of this study, both types of all-ceramic copings demonstrated a mean marginal fit considered acceptable for clinical application (⩽120 μm). PMID:22509119

  9. The Ocean and Climate: Results from the TOPEX/POSEIDON Mission

    NASA Technical Reports Server (NTRS)

    Fu, L. -L.

    1995-01-01

    Since 1992, the TOPEX/POSEIDON satellite has been making altimetric sea surface observations with a sea level accuracy of 4.4 cm. This data can be used for studying regional and seasonal differences in sea level and for evaluating oceanic circulation models and tidal models. Longer term changes can also be studied, such as El Nino and overall sea level rising (although the latter is still within the margin of error).

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Iverson, Aaron

Ra Power Management (RPM) has developed a cloud-based software platform that manages the financial and operational functions of third-party financed solar projects throughout their lifecycle. RPM’s software streamlines and automates the sales, financing, and management of a portfolio of solar assets. The software helps solar developers automate the most difficult aspects of asset management, leading to increased transparency, efficiency, and reduction in human error. More importantly, the platform will help developers save money by improving their operating margins.

  11. Overlay improvement methods with diffraction based overlay and integrated metrology

    NASA Astrophysics Data System (ADS)

    Nam, Young-Sun; Kim, Sunny; Shin, Ju Hee; Choi, Young Sin; Yun, Sang Ho; Kim, Young Hoon; Shin, Si Woo; Kong, Jeong Heung; Kang, Young Seog; Ha, Hun Hwan

    2015-03-01

To meet the new requirement of securing more overlay margin, optical overlay measurement not only faces technical limitations in representing cell-pattern behavior, but larger measurement samples are also needed to minimize statistical errors and better estimate conditions across a lot. For these reasons, diffraction based overlay (DBO) and integrated metrology (IM) are proposed in this paper as new approaches for overlay enhancement.

  12. Setup errors and effectiveness of Optical Laser 3D Surface imaging system (Sentinel) in postoperative radiotherapy of breast cancer.

    PubMed

    Wei, Xiaobo; Liu, Mengjiao; Ding, Yun; Li, Qilin; Cheng, Changhai; Zong, Xian; Yin, Wenming; Chen, Jie; Gu, Wendong

    2018-05-08

Breast-conserving surgery (BCS) plus postoperative radiotherapy has become the standard treatment for early-stage breast cancer. The aim of this study was to compare the setup accuracy of optical surface imaging by the Sentinel system with the cone-beam computerized tomography (CBCT) imaging currently used in our clinic for patients who received BCS. Two optical surface scans were acquired before and immediately after couch movement correction. The correlation between the setup errors as determined by the initial optical surface scan and by CBCT was analyzed. The deviation of the second optical surface scan from the reference planning CT was considered an estimate of the residual errors of the new method for patient setup correction. The consequences in terms of the planning target volume (PTV) margins necessary for treatment sessions without setup correction were also evaluated. We analyzed 145 scans in 27 patients treated for early stage breast cancer. The setup errors of skin-marker-based patient alignment determined by optical surface scan and by CBCT were correlated, and the residual setup errors as determined by the optical surface scan after couch movement correction were reduced. Optical surface imaging provides a convenient method for improving the setup accuracy for breast cancer patients without unnecessary imaging dose.

  13. A Bayesian Hierarchical Model for Glacial Dynamics Based on the Shallow Ice Approximation and its Evaluation Using Analytical Solutions

    NASA Astrophysics Data System (ADS)

    Gopalan, Giri; Hrafnkelsson, Birgir; Aðalgeirsdóttir, Guðfinna; Jarosch, Alexander H.; Pálsson, Finnur

    2018-03-01

Bayesian hierarchical modeling can assist the study of glacial dynamics and ice flow properties. This approach allows glaciologists to make fully probabilistic predictions for the thickness of a glacier at unobserved spatio-temporal coordinates, and it also allows for the derivation of posterior probability distributions for key physical parameters such as ice viscosity and basal sliding. The goal of this paper is to develop a proof of concept for a Bayesian hierarchical model that uses exact analytical solutions for the shallow ice approximation (SIA) introduced by Bueler et al. (2005). A suite of test simulations utilizing these exact solutions suggests that this approach is able to adequately model numerical errors and produce useful physical parameter posterior distributions and predictions. A byproduct of the development of the Bayesian hierarchical model is the derivation of a novel finite difference method for solving the SIA partial differential equation (PDE). An additional novelty of this work is the correction, using a statistical model, of numerical errors induced by the numerical solution. This error-correcting process models numerical errors that accumulate forward in time, as well as the spatial variation of numerical errors between the dome, interior, and margin of a glacier.
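The SIA PDE referred to above is a nonlinear diffusion equation for ice thickness. A minimal explicit finite-difference sketch for the flat-bed case is shown below; the rate factor `gamma`, the grid, and the initial dome profile are illustrative assumptions, not the authors' scheme.

```python
import numpy as np

def sia_step(H, dx, dt, gamma=1.0e-16, n=3):
    """One explicit finite-difference step of the flat-bed shallow ice
    approximation, dH/dt = d/dx( D dH/dx ), with nonlinear diffusivity
    D = gamma * H**(n+2) * |dH/dx|**(n-1) evaluated on staggered
    half-grid points (a common SIA discretisation)."""
    Hm = 0.5 * (H[1:] + H[:-1])             # thickness at i + 1/2
    dHdx = (H[1:] - H[:-1]) / dx            # slope at i + 1/2
    D = gamma * Hm ** (n + 2) * np.abs(dHdx) ** (n - 1)
    flux = D * dHdx                         # D * dH/dx at i + 1/2
    Hnew = H.copy()
    Hnew[1:-1] += dt * (flux[1:] - flux[:-1]) / dx
    return np.maximum(Hnew, 0.0)            # ice thickness stays non-negative

# Hypothetical parabolic dome on a 50 km domain, stepped one year forward.
x = np.linspace(-25e3, 25e3, 101)
H0 = np.maximum(1500.0 * (1 - (x / 25e3) ** 2), 0.0)
H1 = sia_step(H0, dx=x[1] - x[0], dt=3.15e7)
```

Because the numerical errors of such a scheme differ between the dome, interior, and margin, the paper layers a statistical error model on top of steps like this one.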

  14. Utilizing Integrated Prediction Error Filter Analysis (INPEFA) to divide base-level cycle of fan-deltas: A case study of the Triassic Baikouquan Formation in Mabei Slope Area, Mahu Depression, Junggar Basin, China

    NASA Astrophysics Data System (ADS)

    Yuan, Rui; Zhu, Rui; Qu, Jianhua; Wu, Jun; You, Xincai; Sun, Yuqiu; Zhou, Yuanquan (Nancy)

    2018-05-01

    The Mahu Depression is an important hydrocarbon-bearing foreland sag located at the northwestern margin of the Junggar Basin, China. On the northern slope of the depression, large coarse-grained proximal fan-delta depositional systems developed in the Lower Triassic Baikouquan Formation (T1b). Some lithologic hydrocarbon reservoirs have been found in the conglomerates of the formation in recent years. However, the rapid vertical and horizontal lithology variations make it difficult to divide the base-level cycle of the formation using conventional methods. Spectral analysis technologies, such as Integrated Prediction Error Filter Analysis (INPEFA), provide another effective way to overcome this difficulty. In this paper, conventional resistivity logs processed by INPEFA are utilized to study the base-level cycle of the fan-delta depositional systems. A negative trend of the INPEFA curve indicates base-level fall semi-cycles; conversely, a positive trend suggests rise semi-cycles. Base-level cycles of the Baikouquan Formation are divided in single wells and correlated between wells. One long-term base-level rise semi-cycle, including three medium-term base-level cycles, is identified across the whole Baikouquan Formation. The medium-term base-level cycles are characterized as rise semi-cycles mainly in the fan-delta plain, symmetric cycles in the fan-delta front, and fall semi-cycles mainly in the pro-fan-delta. The short-term base-level rise semi-cycles are most developed in the braided channels, sub-aqueous distributary channels, and sheet sands, while the interdistributary bays and pro-fan-delta mud indicate short-term base-level fall semi-cycles. Finally, based on the INPEFA method, a sequence filling model of the Baikouquan Formation is established.
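
In essence, an INPEFA curve is the running integral of the residuals left after a prediction-error (autoregressive) filter is applied to a well-log curve; trend reversals of the integrated curve are read as base-level turnarounds. A minimal sketch of that idea, using an AR(1) filter and made-up log values (production INPEFA uses higher-order filters on resistivity or gamma-ray logs):

```python
def inpefa(log_values):
    """Toy INPEFA: fit an AR(1) prediction-error filter to a demeaned log
    curve, then integrate (cumulatively sum) the prediction errors.
    A simplified sketch only; real INPEFA uses higher-order filters."""
    mean = sum(log_values) / len(log_values)
    x = [v - mean for v in log_values]
    # Least-squares AR(1) coefficient
    num = sum(x[t] * x[t - 1] for t in range(1, len(x)))
    den = sum(x[t - 1] ** 2 for t in range(1, len(x)))
    a = num / den
    errors = [x[t] - a * x[t - 1] for t in range(1, len(x))]
    # Integrated prediction error: cumulative sum of the residuals
    out, total = [], 0.0
    for e in errors:
        total += e
        out.append(total)
    return out

# Hypothetical resistivity-log samples (illustrative values only)
curve = inpefa([10, 12, 11, 13, 15, 14, 16, 18, 17, 19])
```

Sustained positive slopes of the returned curve would be read as rise semi-cycles, and sustained negative slopes as fall semi-cycles.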

  15. A space imaging concept based on a 4m structured spun-cast borosilicate monolithic primary mirror

    NASA Astrophysics Data System (ADS)

    West, S. C.; Bailey, S. H.; Bauman, S.; Cuerden, B.; Granger, Z.; Olbert, B. H.

    2010-07-01

    Lockheed Martin Corporation (LMC) tasked The University of Arizona Steward Observatory (UASO) to conduct an engineering study to examine the feasibility of creating a 4m space telescope based on mature borosilicate technology developed at the UASO for ground-based telescopes. UASO has completed this study and concluded that existing launch vehicles can deliver a 4m monolithic telescope system to a 500 km circular orbit and provide reliable imagery at NIIRS 7-8. An analysis of such an imager based on a lightweight, high-performance, structured 4m primary mirror cast from borosilicate glass is described. The relatively high CTE of this glass is used to advantage by maintaining mirror shape quality with a thermal figuring method. Placed in a 290 K thermal shroud (similar to the Hubble Space Telescope), the orbit-averaged figure surface error is 6 nm rms when earth-looking. Space-looking optical performance shows that a similar thermal conditioning scheme combined with a 270 K shroud achieves primary mirror distortion of 10 nm rms surface. Analysis shows that a 3-point bipod mount will provide launch survivability with ample margin. The primary mirror naturally maintains its shape at 1g allowing excellent end-to-end pre-launch testing with e.g. the LOTIS 6.5m Collimator. The telescope includes simple systems to measure and correct mirror shape and alignment errors incorporating technologies already proven on the LOTIS Collimator. We have sketched a notional earth-looking 4m telescope concept combined with a wide field TMA concept into a DELTA IV or ATLAS 552 EELV fairing. We have combined an initial analysis of launch and space performance of a special light-weighted honeycomb borosilicate mirror (areal density 95 kg/m2) with public domain information on the existing launch vehicles.

  16. Data file, Continental Margin Program, Atlantic Coast of the United States: vol. 2 sample collection and analytical data

    USGS Publications Warehouse

    Hathaway, John C.

    1971-01-01

    The purpose of the data file presented below is twofold: the first purpose is to make available in printed form the basic data relating to the samples collected as part of the joint U.S. Geological Survey - Woods Hole Oceanographic Institution program of study of the Atlantic continental margin of the United States; the second purpose is to maintain these data in a form that is easily retrievable by modern computer methods. With the data in such form, repeated manual transcription for statistical or similar mathematical treatment becomes unnecessary. Manual plotting of information or derivatives from the information may also be eliminated. Not only is handling of data by the computer considerably faster than manual techniques, but a fruitful source of errors, transcription mistakes, is eliminated.

  17. Replacing the CCSDS Telecommand Protocol with Next Generation Uplink

    NASA Technical Reports Server (NTRS)

    Kazz, Greg; Burleigh, Scott; Greenberg, Ed

    2012-01-01

    Better-performing Forward Error Correction on the forward link, along with adequate power in the data, opens an uplink operations trade space that enables missions to: command to greater distances in deep space (increased uplink margin); increase the size of the payload data (latency may be a factor); and provide space for the security header/trailer of the CCSDS Space Data Link Security Protocol. Note: these higher rates could be used for relief of emergency communication margins/rates and are not limited to improving top-end rate performance. A higher-performance uplink could also reduce the requirements on flight emergency antenna size and/or the performance required from ground stations. Use of a selective repeat ARQ protocol may increase the uplink design requirements, but the resultant development is deemed acceptable due to the factor of 4 to 8 potential increase in uplink data rate.

  18. NAND Flash Qualification Guideline

    NASA Technical Reports Server (NTRS)

    Heidecker, Jason

    2012-01-01

  19. Impact of Resection Margin Distance on Survival of Pancreatic Cancer: A Systematic Review and Meta-Analysis

    PubMed Central

    Kim, Kyung Su; Kwon, Jeanny; Kim, Kyubo; Chie, Eui Kyu

    2017-01-01

    Purpose While curative resection is the only chance of cure in pancreatic cancer, controversies exist about the impact of surgical margin status on survival. Non-standardized pathologic report and different criteria on the R1 status made it difficult to implicate adjuvant therapy after resection based on the margin status. We evaluated the influence of resection margins on survival by meta-analysis. Materials and Methods We thoroughly searched electronic databases of PubMed, EMBASE, and Cochrane Library. We included studies reporting survival outcomes with different margin status: involved margin (R0 mm), margin clearance with ≤ 1 mm (R0-1 mm), and margin with > 1 mm (R>1 mm). Hazard ratio (HR) for overall survival was extracted, and a random-effects model was used for pooled analysis. Results A total of eight retrospective studies involving 1,932 patients were included. Pooled HR for overall survival showed that patients with R>1 mm had reduced risk of death than those with R0-1 mm (HR, 0.74; 95% confidence interval [CI], 0.61 to 0.88; p=0.001). In addition, patients with R0-1 mm had reduced risk of death than those with R0 mm (HR, 0.81; 95% CI, 0.72 to 0.91; p < 0.001). There was no heterogeneity between the included studies (I2 index, 42% and 0%; p=0.10 and p=0.82, respectively). Conclusion Our results suggest that stratification of the patients based on margin status is warranted in the clinical trials assessing the role of adjuvant treatment for pancreatic cancer. PMID:27561314
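
The pooled hazard ratios above come from inverse-variance weighting under a random-effects model. A minimal sketch of DerSimonian-Laird pooling on the log-HR scale, with made-up study values rather than the paper's data:

```python
import math

def pool_random_effects(hrs, cis):
    """DerSimonian-Laird random-effects pooling of hazard ratios.

    hrs: per-study hazard ratios; cis: (lower, upper) 95% CIs.
    Returns the pooled HR and its 95% CI.
    """
    # Work on the log scale; SE recovered from the 95% CI width.
    y = [math.log(h) for h in hrs]
    se = [(math.log(u) - math.log(l)) / (2 * 1.96) for l, u in cis]
    w = [1 / s**2 for s in se]                      # fixed-effect weights
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - ybar)**2 for wi, yi in zip(w, y))  # Cochran's Q
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)         # between-study variance
    wstar = [1 / (s**2 + tau2) for s in se]         # random-effects weights
    mu = sum(wi * yi for wi, yi in zip(wstar, y)) / sum(wstar)
    se_mu = math.sqrt(1 / sum(wstar))
    return math.exp(mu), (math.exp(mu - 1.96 * se_mu),
                          math.exp(mu + 1.96 * se_mu))

# Illustrative (made-up) study-level HRs with their 95% CIs
hr, ci = pool_random_effects([0.70, 0.85, 0.78],
                             [(0.55, 0.89), (0.70, 1.03), (0.60, 1.01)])
```

With homogeneous studies (Q below its degrees of freedom), tau2 collapses to zero and the result coincides with the fixed-effect pooled estimate.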

  20. Analysis of vertical marginal discrepancy in feldspathic porcelain crowns manufactured with different CAD/CAM systems: Closed and open.

    PubMed

    Kricheldorf, Fabio; Bueno, Cleuber Rodrigo de Souza; Amaral, Wilson da Silva; Junior, Joel Ferreira Santiago; Filho, Hugo Nary

    2018-01-01

    The objective of this study is to compare the marginal adaptation of feldspathic porcelain crowns produced with two computer-aided design/computer-aided manufacturing systems, one open and the other closed. Twenty identical titanium abutments were divided into two groups: open system (OS), where ceramic crowns were created using varied equipment and software, and closed system (CS), where ceramic crowns were created using the CEREC system. Through optical microscopy analysis, we assessed the marginal adaptation of the prosthetic interfaces. The data were tested for normality and homogeneity of variance. The t-test was used for the comparison between the groups, and one-way ANOVA was used to compare the variance of the crown analysis regions within each group. A significance level of 5% was considered for the analyses. There was a significant difference between the systems (P = 0.007), with the CS group having the higher mean marginal discrepancy (23.75 μm ± 3.05) when compared to the open group (17.94 μm ± 4.77). Furthermore, there were no differences in marginal discrepancy between the different points between the groups (P ≥ 0.05). Both groups presented results within the requirements set out in the literature. However, the OS presented better results in marginal adaptation.

  1. The impact of frozen sections on final surgical margins in squamous cell carcinoma of the oral cavity and lips: a retrospective analysis over an 11-year period

    PubMed Central

    2011-01-01

    Background Taking intraoperative frozen sections (FS) is a widely used procedure in oncologic surgery. However, so far no evidence exists of an association between FS analysis and premalignant changes in the surgical margin. Therefore, the aim of this study was to evaluate the impact of FS on different categories of the final margins of squamous cell carcinoma (SCC) of the oral cavity and lips. Methods FS, pT-stage, grading, and tumor localization of 178 patients with SCC of the oral cavity and lips were compared by uni- and multivariate analysis in patients with positive, dysplastic and negative surgical margin status. Results Performed on 111 patients (62.4%), intraoperative FS did not have any statistically significant influence on final margin status, independent of whether it was positive (p = 0.40), dysplastic (p = 0.70), or negative (p = 0.70). Positive surgical margins in permanent sections were significantly associated with pT4 tumors (OR 5.61, p = 0.001). The chance of negative margins in permanent sections was significantly higher in tumors located in the tongue (OR 4.70, p = 0.01). Conclusions Our data suggest that intraoperative FS in SCC can be useful in selected cases. However, it is not advisable as a routine approach. PMID:22208692

  2. Effect of Immobilization and Performance Status on Intrafraction Motion for Stereotactic Lung Radiotherapy: Analysis of 133 Patients

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Winnie, E-mail: winnie.li@rmp.uhn.on.ca; Department of Radiation Oncology, University of Toronto, Toronto, Ontario; Purdie, Thomas G.

    2011-12-01

    Purpose: To assess intrafractional geometric accuracy of lung stereotactic body radiation therapy (SBRT) patients treated with volumetric image guidance. Methods and Materials: Treatment setup accuracy was analyzed in 133 SBRT patients treated via research ethics board-approved protocols. For each fraction, a localization cone-beam computed tomography (CBCT) scan was acquired for soft-tissue registration to the internal target volume, followed by a couch adjustment for positional discrepancies greater than 3 mm, verified with a second CBCT scan. CBCT scans were also performed at intrafraction and end fraction. Patient positioning data from 2047 CBCT scans were recorded to determine systematic (Σ) and random (σ) uncertainties, as well as planning target volume margins. Data were further stratified and analyzed by immobilization method (evacuated cushion [n = 75], evacuated cushion plus abdominal compression [n = 33], or chest board [n = 25]) and by patients' Eastern Cooperative Oncology Group performance status (PS): 0 (n = 31), 1 (n = 70), or 2 (n = 32). Results: Using CBCT, the internal target volume was matched within ±3 mm in 16% of all fractions at localization, 89% at verification, 72% during treatment, and 69% after treatment. Planning target volume margins required to encompass residual setup errors after couch corrections (verification CBCT scans) were 4 mm, and they increased to 5 mm with target intrafraction motion (post-treatment CBCT scans). Small differences (<1 mm) in the cranial-caudal direction of target position were observed between the immobilization cohorts in the localization, verification, intrafraction, and post-treatment CBCT scans (p < 0.01). Positional drift varied according to patient PS, with the PS 1 and 2 cohorts drifting out of position by mid treatment more than the PS 0 cohort in the cranial-caudal direction (p = 0.04). 
Conclusions: Image guidance ensures high geometric accuracy for lung SBRT irrespective of immobilization method or PS. A 5-mm setup margin suffices to address intrafraction motion. This setup margin may be further reduced by strategies such as frequent image guidance or volumetric arc therapy to correct or limit intrafraction motion.

  3. Effect of immobilization and performance status on intrafraction motion for stereotactic lung radiotherapy: analysis of 133 patients.

    PubMed

    Li, Winnie; Purdie, Thomas G; Taremi, Mojgan; Fung, Sharon; Brade, Anthony; Cho, B C John; Hope, Andrew; Sun, Alexander; Jaffray, David A; Bezjak, Andrea; Bissonnette, Jean-Pierre

    2011-12-01

    To assess intrafractional geometric accuracy of lung stereotactic body radiation therapy (SBRT) patients treated with volumetric image guidance. Treatment setup accuracy was analyzed in 133 SBRT patients treated via research ethics board-approved protocols. For each fraction, a localization cone-beam computed tomography (CBCT) scan was acquired for soft-tissue registration to the internal target volume, followed by a couch adjustment for positional discrepancies greater than 3 mm, verified with a second CBCT scan. CBCT scans were also performed at intrafraction and end fraction. Patient positioning data from 2047 CBCT scans were recorded to determine systematic (Σ) and random (σ) uncertainties, as well as planning target volume margins. Data were further stratified and analyzed by immobilization method (evacuated cushion [n=75], evacuated cushion plus abdominal compression [n=33], or chest board [n=25]) and by patients' Eastern Cooperative Oncology Group performance status (PS): 0 (n=31), 1 (n=70), or 2 (n=32). Using CBCT, the internal target volume was matched within ±3 mm in 16% of all fractions at localization, 89% at verification, 72% during treatment, and 69% after treatment. Planning target volume margins required to encompass residual setup errors after couch corrections (verification CBCT scans) were 4 mm, and they increased to 5 mm with target intrafraction motion (post-treatment CBCT scans). Small differences (<1 mm) in the cranial-caudal direction of target position were observed between the immobilization cohorts in the localization, verification, intrafraction, and post-treatment CBCT scans (p<0.01). Positional drift varied according to patient PS, with the PS 1 and 2 cohorts drifting out of position by mid treatment more than the PS 0 cohort in the cranial-caudal direction (p=0.04). Image guidance ensures high geometric accuracy for lung SBRT irrespective of immobilization method or PS. A 5-mm setup margin suffices to address intrafraction motion. 
This setup margin may be further reduced by strategies such as frequent image guidance or volumetric arc therapy to correct or limit intrafraction motion. Copyright © 2011 Elsevier Inc. All rights reserved.

  4. Combining forecast weights: Why and how?

    NASA Astrophysics Data System (ADS)

    Yin, Yip Chee; Kok-Haur, Ng; Hock-Eam, Lim

    2012-09-01

    This paper proposes a procedure called forecast weight averaging, a specific combination of the forecast weights obtained from different methods of constructing forecast weights, for the purpose of improving the accuracy of pseudo out-of-sample forecasting. It is found that under certain specified conditions, forecast weight averaging can lower the mean squared forecast error obtained from model averaging. In addition, we show that in a linear and homoskedastic environment, this superior predictive ability of forecast weight averaging holds true irrespective of whether the coefficients are tested by the t statistic or the z statistic, provided the significance level is within the 10% range. By theoretical proofs and a simulation study, we show that model averaging methods such as variance model averaging, simple model averaging and standard error model averaging each produce a mean squared forecast error larger than that of forecast weight averaging. Finally, this result also holds true, marginally, when applied to business and economic empirical data sets: the Gross Domestic Product (GDP) growth rate, Consumer Price Index (CPI) and Average Lending Rate (ALR) of Malaysia.
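
The core idea, averaging the weight vectors produced by different weighting schemes before forming the combined forecast, can be sketched as follows. The two schemes shown (equal weights and inverse-MSE weights) are illustrative stand-ins, not necessarily the schemes used in the paper:

```python
def forecast_weight_averaging(forecasts, errors_history):
    """Average the weight vectors from two weighting schemes, then
    combine the candidate forecasts with the averaged weights.

    forecasts: per-model point forecasts for the next period.
    errors_history: per-model lists of past forecast errors.
    """
    k = len(forecasts)
    # Scheme 1: equal weights (simple model averaging)
    w_equal = [1.0 / k] * k
    # Scheme 2: inverse mean-squared-error weights, normalized to sum to 1
    mse = [sum(e * e for e in errs) / len(errs) for errs in errors_history]
    inv = [1.0 / m for m in mse]
    w_mse = [v / sum(inv) for v in inv]
    # Forecast weight averaging: average the two weight vectors
    w_avg = [(a + b) / 2 for a, b in zip(w_equal, w_mse)]
    return sum(w * f for w, f in zip(w_avg, forecasts))

# Hypothetical forecasts from three models and their past errors
combined = forecast_weight_averaging([2.1, 1.9, 2.4],
                                     [[0.5, -0.4], [0.1, 0.2], [0.9, -0.8]])
```

Because each scheme's weights sum to one, the averaged weights also sum to one, so the combined forecast always lies within the range of the candidate forecasts.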

  5. Improve homology search sensitivity of PacBio data by correcting frameshifts.

    PubMed

    Du, Nan; Sun, Yanni

    2016-09-01

    Single-molecule, real-time (SMRT) sequencing developed by Pacific Biosciences produces longer reads than second-generation sequencing technologies such as Illumina. The long read length enables PacBio sequencing to close gaps in genome assembly, reveal structural variations, and identify gene isoforms with higher accuracy in transcriptomic sequencing. However, PacBio data has a high sequencing error rate, and most of the errors are insertion or deletion errors. During alignment-based homology search, insertion or deletion errors in genes cause frameshifts that may lead to only marginal alignment scores and short alignments. As a result, it is hard to distinguish true alignments from random alignments, and the ambiguity incurs errors in structural and functional annotation. Existing frameshift correction tools are designed for data with much lower error rates and are not optimized for PacBio data. As an increasing number of groups are using SMRT sequencing, there is an urgent need for dedicated homology search tools for PacBio data. In this work, we introduce Frame-Pro, a profile homology search tool for PacBio reads. Our tool corrects sequencing errors and also outputs the profile alignments of the corrected sequences against characterized protein families. We applied our tool to both simulated and real PacBio data. The results showed that our method enables more sensitive homology search, especially for PacBio data sets of low sequencing coverage. In addition, we can correct more errors compared with a popular error correction tool that does not rely on hybrid sequencing. The source code is freely available at https://sourceforge.net/projects/frame-pro/. Contact: yannisun@msu.edu. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  6. Independent Component Analysis of Textures

    NASA Technical Reports Server (NTRS)

    Manduchi, Roberto; Portilla, Javier

    2000-01-01

    A common method for texture representation is to use the marginal probability densities over the outputs of a set of multi-orientation, multi-scale filters as a description of the texture. We propose a technique, based on Independent Components Analysis, for choosing the set of filters that yield the most informative marginals, meaning that the product over the marginals most closely approximates the joint probability density function of the filter outputs. The algorithm is implemented using a steerable filter space. Experiments involving both texture classification and synthesis show that compared to Principal Components Analysis, ICA provides superior performance for modeling of natural and synthetic textures.

  7. Importance of flexure in response to sedimentation and erosion along the US Atlantic passive margin in reconciling sea level change and paleoshorelines

    NASA Astrophysics Data System (ADS)

    Moucha, R.; Ruetenik, G.; de Boer, B.

    2017-12-01

    Reconciling elevations of paleoshorelines along the US Atlantic passive margin with estimates of eustatic sea level has long posed a challenge. Discrepancies between shoreline elevation and sea level have been attributed to combinations of tectonics, glacial isostatic adjustment, mantle convection, gravitation and/or errors, for example, in the inference of eustatic sea level from the marine 18O record. Herein we present a numerical model of landscape evolution combined with sea level change and solid Earth deformations to demonstrate the importance of flexural effects in response to erosion and sedimentation along the US Atlantic passive margin. We quantify these effects using two different temporal models. One reconciles the Orangeburg scarp, a well-documented 3.5-million-year-old mid-Pliocene shoreline, with a mid-Pliocene sea level 15 m above present-day (Moucha and Ruetenik, 2017). The other model focuses on the evolution of the South Carolina and northern Georgia margin since MIS 11 (~400 ka) using a fully coupled ice sheet, sea level and solid Earth model (de Boer et al., 2014), while relating our results to a series of enigmatic sea level high stand markers. de Boer, B., Stocci, P., and van de Wal, R. (2014). A fully coupled 3-d ice-sheet-sea-level model: algorithm and applications. Geoscientific Model Development, 7:2141-2156. Moucha, R. and Ruetenik, G. A. (2017). Interplay between dynamic topography and flexure along the US Atlantic passive margin: Insights from landscape evolution modeling. Global and Planetary Change, 149:72-78.

  8. Indication and method of frozen section in vaginal radical trachelectomy.

    PubMed

    Chênevert, Jacinthe; Têtu, Bernard; Plante, Marie; Roy, Michel; Renaud, Marie-Claude; Grégoire, Jean; Grondin, Katherine; Dubé, Valérie

    2009-09-01

    Vaginal radical trachelectomy (VRT) is a fertility-sparing surgical technique used as an alternative to radical hysterectomy in early stage cervical carcinoma. With the advent of VRT, preoperative evaluation of the surgical margin has become imperative, because if the tumor is found within 5 mm of the endocervical margin, additional surgical resection is required. In a study published earlier from our center, we came to the conclusion that a frozen section should be conducted only when a cancerous lesion is grossly visible, and that it could be omitted in normal-looking specimens or VRT with nonspecific lesions. Since then, 53 VRT have been performed in our center, and frozen sections were conducted according to these recommendations. Fifteen VRT were grossly normal, 24 had a nonspecific lesion and 14 showed a grossly visible lesion. Final margins were satisfactory on all 15 grossly normal specimens. Of the 24 VRT with nonspecific lesions, 2 cases for which no frozen section was performed had unsatisfactory final margins (<5 mm). Of the 14 VRT with grossly visible lesions, 3 cases were inadequately evaluated by frozen section due to sampling errors, which led to unsatisfactory final margin assessment. These results confirm that a frozen section can be omitted on normal looking VRT specimens, but contrary to results published earlier, we recommend that a frozen section be performed on all VRT with nonspecific lesions. As for VRT with a grossly visible lesion, frozen section evaluation is still warranted, and we recommend increasing the sampling to improve the adequacy of frozen sections.

  9. Cumulative uncertainty in measured streamflow and water quality data for small watersheds

    USGS Publications Warehouse

    Harmel, R.D.; Cooper, R.J.; Slade, R.M.; Haney, R.L.; Arnold, J.G.

    2006-01-01

    The scientific community has not established an adequate understanding of the uncertainty inherent in measured water quality data, which is introduced by four procedural categories: streamflow measurement, sample collection, sample preservation/storage, and laboratory analysis. Although previous research has produced valuable information on relative differences in procedures within these categories, little information is available that compares the procedural categories or presents the cumulative uncertainty in resulting water quality data. As a result, quality control emphasis is often misdirected, and data uncertainty is typically either ignored or accounted for with an arbitrary margin of safety. Faced with the need for scientifically defensible estimates of data uncertainty to support water resource management, the objectives of this research were to: (1) compile selected published information on uncertainty related to measured streamflow and water quality data for small watersheds, (2) use a root mean square error propagation method to compare the uncertainty introduced by each procedural category, and (3) use the error propagation method to determine the cumulative probable uncertainty in measured streamflow, sediment, and nutrient data. Best case, typical, and worst case "data quality" scenarios were examined. Averaged across all constituents, the calculated cumulative probable uncertainty (±%) contributed under typical scenarios ranged from 6% to 19% for streamflow measurement, from 4% to 48% for sample collection, from 2% to 16% for sample preservation/storage, and from 5% to 21% for laboratory analysis. Under typical conditions, errors in storm loads ranged from 8% to 104% for dissolved nutrients, from 8% to 110% for total N and P, and from 7% to 53% for TSS. Results indicated that uncertainty can increase substantially under poor measurement conditions and limited quality control effort. 
This research provides introductory scientific estimates of uncertainty in measured water quality data. The results and procedures presented should also assist modelers in quantifying the "quality" of calibration and evaluation data sets, determining model accuracy goals, and evaluating model performance.
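
The root mean square error propagation method referred to above combines independent procedural uncertainties by summing them in quadrature. A minimal sketch with illustrative ±% values (not the study's data):

```python
import math

def cumulative_uncertainty(category_uncertainties):
    """Root mean square error propagation: combine independent procedural
    uncertainties (each a probable error in ±%) into a single cumulative
    probable uncertainty (±%)."""
    return math.sqrt(sum(u ** 2 for u in category_uncertainties))

# Illustrative ±% values for streamflow measurement, sample collection,
# sample preservation/storage, and laboratory analysis (hypothetical)
total = cumulative_uncertainty([6.0, 10.0, 5.0, 8.0])  # -> 15.0
```

Because the categories add in quadrature, the largest single uncertainty dominates the cumulative value, which is why misdirected quality control effort yields little overall improvement.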

  10. Historical shoreline mapping (II): Application of the Digital Shoreline Mapping and Analysis Systems (DSMS/DSAS) to shoreline change mapping in Puerto Rico

    USGS Publications Warehouse

    Thieler, E. Robert; Danforth, William W.

    1994-01-01

    A new, state-of-the-art method for mapping historical shorelines from maps and aerial photographs, the Digital Shoreline Mapping System (DSMS), has been developed. The DSMS is a freely available, public domain software package that meets the cartographic and photogrammetric requirements of precise coastal mapping, and provides a means to quantify and analyze different sources of error in the mapping process. The DSMS is also capable of resolving imperfections in aerial photography that commonly are assumed to be nonexistent. The DSMS utilizes commonly available computer hardware and software, and permits the entire shoreline mapping process to be executed rapidly by a single person in a small lab. The DSMS generates output shoreline position data that are compatible with a variety of Geographic Information Systems (GIS). A second suite of programs, the Digital Shoreline Analysis System (DSAS), has been developed to calculate shoreline rates-of-change from a series of shoreline data residing in a GIS. Four rate-of-change statistics are calculated simultaneously (end-point rate, average of rates, linear regression and jackknife) at a user-specified interval along the shoreline using a measurement baseline approach. An example of DSMS and DSAS application using historical maps and air photos of Punta Uvero, Puerto Rico provides a basis for assessing the errors associated with the source materials as well as the accuracy of computed shoreline positions and erosion rates. The maps and photos used here represent a common situation in shoreline mapping: marginal-quality source materials. The maps and photos are near the usable upper limit of scale and accuracy, yet the shoreline positions are still accurate to within ±9.25 m when all sources of error are considered. This level of accuracy yields a resolution of ±0.51 m/yr for shoreline rates-of-change in this example, and is sufficient to identify the short-term trend (36 years) of shoreline change in the study area.
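
Two of the rate-of-change statistics named above, the end-point rate and the linear regression rate, can be sketched directly; the survey years and baseline distances below are hypothetical:

```python
def end_point_rate(years, positions):
    """End-point rate: net shoreline movement between the oldest and
    newest shorelines, divided by the elapsed time."""
    return (positions[-1] - positions[0]) / (years[-1] - years[0])

def linear_regression_rate(years, positions):
    """Ordinary least-squares slope of shoreline position vs. time,
    which uses every dated shoreline rather than just the end points."""
    n = len(years)
    ty = sum(years) / n
    py = sum(positions) / n
    num = sum((t - ty) * (p - py) for t, p in zip(years, positions))
    den = sum((t - ty) ** 2 for t in years)
    return num / den

# Hypothetical shoreline positions (meters from a measurement baseline)
years, pos = [1950, 1963, 1978, 1986], [100.0, 92.0, 85.0, 82.0]
epr = end_point_rate(years, pos)          # -> -0.5 m/yr (erosion)
lrr = linear_regression_rate(years, pos)
```

The two statistics agree here because the retreat is nearly linear; they diverge when intermediate shorelines depart from the end-point trend, which is why DSAS reports several statistics at once.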

  11. Structure of the North American Atlantic Continental Margin.

    ERIC Educational Resources Information Center

    Klitgord, K. K.; Schlee, J. S.

    1986-01-01

    Offers explanations on the origin of the North American Atlantic continental margin. Provides an analysis and illustrations of structural and stratigraphic elements of cross sections of the Atlantic continental margin. Also explains the operations and applications of seismic-reflection profiles in studying ocean areas. (ML)

  12. Novel Method for Incorporating Model Uncertainties into Gravitational Wave Parameter Estimates

    NASA Astrophysics Data System (ADS)

    Moore, Christopher J.; Gair, Jonathan R.

    2014-12-01

    Posterior distributions on parameters computed from experimental data using Bayesian techniques are only as accurate as the models used to construct them. In many applications, these models are incomplete, which both reduces the prospects of detection and leads to a systematic error in the parameter estimates. In the analysis of data from gravitational wave detectors, for example, accurate waveform templates can be computed using numerical methods, but the prohibitive cost of these simulations means this can only be done for a small handful of parameters. In this Letter, a novel method to fold model uncertainties into data analysis is proposed; the waveform uncertainty is analytically marginalized over using a prior distribution constructed by using Gaussian process regression to interpolate the waveform difference from a small training set of accurate templates. The method is well motivated, easy to implement, and no more computationally expensive than standard techniques. The new method is shown to perform extremely well when applied to a toy problem. While we use the application to gravitational wave data analysis to motivate and illustrate the technique, it can be applied in any context where model uncertainties exist.
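
The key ingredient, interpolating the model (waveform) difference with Gaussian process regression so it can serve as a prior, can be sketched in one dimension. The kernel choice, training points, and values below are illustrative assumptions, not the paper's actual setup:

```python
import math

def rbf(a, b, length=1.0):
    """Squared-exponential (RBF) covariance between two scalar inputs."""
    return math.exp(-0.5 * ((a - b) / length) ** 2)

def solve(A, b):
    """Naive Gaussian elimination with partial pivoting (small systems)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for i in reversed(range(n)):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

def gp_mean(train_x, train_y, test_x, noise=1e-6):
    """Zero-mean GP regression posterior mean at test_x: k*^T K^-1 y."""
    n = len(train_x)
    K = [[rbf(train_x[i], train_x[j]) + (noise if i == j else 0.0)
          for j in range(n)] for i in range(n)]
    alpha = solve(K, train_y)
    return sum(rbf(test_x, xi) * ai for xi, ai in zip(train_x, alpha))

# Hypothetical "waveform difference" samples at three parameter values;
# the GP interpolates the difference at an untried parameter value.
mu = gp_mean([0.0, 1.0, 2.0], [0.00, 0.10, 0.05], 1.5)
```

At a training input the posterior mean reproduces the training value (up to the noise jitter), which is what makes the interpolant usable as the mean of a prior over the model error at untried parameters.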

  13. Intra-surgical total and re-constructible pathological prostate examination for safer margins and nerve preservation (Istanbul preserve).

    PubMed

    Öbek, Can; Saglican, Yesim; Ince, Umit; Argun, Omer Burak; Tuna, Mustafa Bilal; Doganca, Tunkut; Tufek, Ilter; Keskin, Selcuk; Kural, Ali Riza

    2018-04-01

To demonstrate a novel frozen section analysis (FSA) technique during robot-assisted radical prostatectomy (RARP) with two distinct advantages: evaluation of the entire circumference and easier reconstruction for whole-mount evaluation. Istanbul Preserve was performed on patients who underwent robotic prostatectomy with nerve sparing between 10/2014 and 7/2016. The gland was sectioned at 3-4 mm intervals from apex to bladder neck. The entire tissue representing the margins (except for the most anterior portion) was circumferentially excised and microscopically analyzed. In cases of margin positivity, the approach was individualized based on the extent of the positive margin and the Gleason pattern. A matched cohort was established for comparison. A retrospective analysis of a prospectively maintained database was performed. The impact of FSA on the positive surgical margin (PSM) rate was primarily assessed. Data on 170 patients were analyzed. A positive surgical margin was reported in 56 (33%) on frozen section. The neurovascular bundle was partially or totally resected in 79% and 18% of cases, respectively. Conversion of a positive margin to negative was achieved in 85%. The overall positive margin rate decreased from 22.5% to 7.5%. Nerve sparing increased from 87% to 93%. The location of the positive margin at frozen section was in the neurovascular bundle area in 39%; thus Istanbul Preserve detected 61% additional margin positivity compared with other techniques. Reconstruction for whole mount was easy. Istanbul Preserve is a novel technique for intraoperative FSA during RARP, allowing microscopic examination of the entire prostate for margin status and easy reconstruction for whole-mount examination. It guarantees safer margins together with an increased rate of nerve sparing. Copyright © 2017 Elsevier Inc. All rights reserved.

  14. Interpretable inference on the mixed effect model with the Box-Cox transformation.

    PubMed

    Maruo, K; Yamaguchi, Y; Noma, H; Gosho, M

    2017-07-10

We derived results for inference on parameters of the marginal model of the mixed effect model with the Box-Cox transformation based on the asymptotic theory approach. We also provided a robust variance estimator of the maximum likelihood estimator of the parameters of this model in consideration of model misspecification. Using these results, we developed an inference procedure for the difference of the model median between treatment groups at a specified occasion in the context of mixed effects models for repeated measures analysis for randomized clinical trials, which provided interpretable estimates of the treatment effect. Simulation studies showed that our proposed method controlled the type I error of the statistical test for the model median difference in almost all situations and had moderate or high power compared with the existing methods. We illustrated our method with cluster of differentiation 4 (CD4) data in an AIDS clinical trial, where the interpretability of the analysis results based on our proposed method is demonstrated. Copyright © 2017 John Wiley & Sons, Ltd.
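
The interpretability device rests on a simple property: the Box-Cox transform is monotone, so the model median on the original scale is the back-transform of the transformed-scale median (the model mean, under normality). A minimal sketch of the transform pair, with illustrative function names:

```python
import math

def boxcox(y, lam):
    """Box-Cox transform of y > 0 with parameter lam."""
    return math.log(y) if lam == 0 else (y ** lam - 1.0) / lam

def inv_boxcox(z, lam):
    """Inverse Box-Cox transform (monotone, so it maps medians to medians)."""
    return math.exp(z) if lam == 0 else (lam * z + 1.0) ** (1.0 / lam)
```

If mu_treat and mu_control are the fitted transformed-scale means at the occasion of interest, the treatment effect on the original scale would be reported as inv_boxcox(mu_treat, lam) - inv_boxcox(mu_control, lam), a difference of model medians.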

  15. A novel method to quantify and compare anatomical shape: application in cervix cancer radiotherapy

    NASA Astrophysics Data System (ADS)

    Oh, Seungjong; Jaffray, David; Cho, Young-Bin

    2014-06-01

Adaptive radiation therapy (ART) has been proposed to restore dosimetric deficiencies during treatment delivery. In this paper, we developed a technique of Geometric reLocation for analyzing anatomical OBjects' Evolution (GLOBE) for a numerical model of tumor evolution under radiation therapy and characterized geometric changes of the target using GLOBE. A total of 174 clinical target volumes (CTVs) obtained from 32 cervical cancer patients were analyzed. GLOBE consists of three main steps: (1) deforming a 3D surface object to a sphere by a parametric active contour (PAC), (2) sampling the deformed PAC on the 642 nodes of an icosahedron geodesic dome as a reference frame, and (3) unfolding the 3D data to a 2D plane for convenient visualization and analysis. The performance was evaluated with respect to (1) convergence of deformation (iteration number and computation time) and (2) accuracy of deformation (residual deformation). Based on deformation vectors from the planning CTV to the weekly CTVs, target-specific (TS) margins were calculated on each sampled node of GLOBE, and the systematic (Σ) and random (σ) variations of the vectors were calculated. Population-based anisotropic (PBA) margins were generated using van Herk's margin recipe. GLOBE successfully modeled 152 CTVs from 28 patients. Fast convergence was observed for most cases (137/152), with an iteration number of 65 ± 74 (average ± STD) and a computation time of 13.7 ± 18.6 min. Residual deformation of the PAC was 0.9 ± 0.7 mm, and more than 97% was less than 3 mm. Margin analysis showed the random nature of the TS-margin. As a consequence, PBA-margins perform similarly to ISO-margins. For example, the PBA-margin for 90% patient coverage at the 95% dose level is close to a 13 mm ISO-margin in terms of target coverage and OAR sparing.
GLOBE demonstrates a systematic analysis of tumor motion and deformation in patients with cervix cancer during radiation therapy and numerical modeling of the PBA-margin at 642 locations on the CTV surface.

  16. Wavelet Filtering to Reduce Conservatism in Aeroservoelastic Robust Stability Margins

    NASA Technical Reports Server (NTRS)

    Brenner, Marty; Lind, Rick

    1998-01-01

Wavelet analysis for filtering and system identification was used to improve the estimation of aeroservoelastic stability margins. The conservatism of the robust stability margins was reduced with parametric and nonparametric time-frequency analysis of flight data in the model validation process. Nonparametric wavelet processing of data was used to reduce the effects of external disturbances and unmodeled dynamics. Parametric estimates of modal stability were also extracted using the wavelet transform. Computation of robust stability margins for stability boundary prediction depends on uncertainty descriptions derived from the data for model validation. F-18 High Alpha Research Vehicle aeroservoelastic flight test data demonstrated improved robust stability prediction by extension of the stability boundary beyond the flight regime.
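
The flight-data processing in the study is more elaborate, but the basic idea of wavelet filtering — decompose, shrink the noise-dominated detail coefficients, reconstruct — can be sketched with a single-level Haar transform and soft thresholding. This is an illustrative stand-in, not the filters used in the study:

```python
import math

def haar_denoise(x, thresh):
    """One-level Haar wavelet decomposition, soft-threshold the detail
    coefficients, then reconstruct. len(x) must be even."""
    s2 = math.sqrt(2.0)
    approx = [(x[2 * i] + x[2 * i + 1]) / s2 for i in range(len(x) // 2)]
    detail = [(x[2 * i] - x[2 * i + 1]) / s2 for i in range(len(x) // 2)]
    # soft thresholding: shrink small (noise-dominated) details toward zero
    detail = [math.copysign(max(abs(d) - thresh, 0.0), d) for d in detail]
    out = []
    for a, d in zip(approx, detail):
        out.extend([(a + d) / s2, (a - d) / s2])
    return out
```

With thresh = 0 the transform is exactly invertible, so the signal passes through unchanged; a positive threshold suppresses high-frequency content below that scale while leaving large structural features intact.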

  17. Inter- and Intrafraction Target Motion in Highly Focused Single Vocal Cord Irradiation of T1a Larynx Cancer Patients

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kwa, Stefan L.S., E-mail: s.kwa@erasmusmc.nl; Al-Mamgani, Abrahim; Osman, Sarah O.S.

    2015-09-01

Purpose: The purpose of this study was to verify clinical target volume-planning target volume (CTV-PTV) margins in single vocal cord irradiation (SVCI) of T1a larynx tumors and to characterize inter- and intrafraction target motion. Methods and Materials: For 42 patients, a single vocal cord was irradiated using intensity modulated radiation therapy at a total dose of 58.1 Gy (16 fractions × 3.63 Gy). A daily cone beam computed tomography (CBCT) scan was performed to correct the setup of the thyroid cartilage online after patient positioning with in-room lasers (interfraction motion correction). To monitor intrafraction motion, CBCT scans were also acquired just after patient repositioning and after dose delivery. A mixed online-offline setup correction protocol ("O2 protocol") was designed to compensate for both inter- and intrafraction motion. Results: Observed interfraction systematic (Σ) and random (σ) setup errors in the left-right (LR), craniocaudal (CC), and anteroposterior (AP) directions were 0.9, 2.0, and 1.1 mm and 1.0, 1.6, and 1.0 mm, respectively. After correction of these errors, the intrafraction movements derived from the CBCT acquired after dose delivery were: Σ = 0.4, 1.3, and 0.7 mm, and σ = 0.8, 1.4, and 0.8 mm. More than half of the patients showed a systematic non-zero intrafraction shift in target position (ie, the mean intrafraction displacement over the treatment fractions was statistically significantly different from zero; P<.05). With the applied CTV-PTV margins (for most patients 3, 5, and 3 mm in the LR, CC, and AP directions, respectively), the minimum CTV dose, estimated from the target displacements observed in the last CBCT, was at least 94% of the prescribed dose for all patients and more than 98% for most patients (37 of 42). The proposed O2 protocol could effectively reduce the systematic intrafraction errors observed after dose delivery to almost zero (Σ = 0.1, 0.2, 0.2 mm).
Conclusions: With adequate image guidance and CTV-PTV margins in LR, CC, and AP directions of 3, 5, and 3 mm, respectively, excellent target coverage in SVCI could be ensured.

  18. A comparative study of set up variations and bowel volumes in supine versus prone positions of patients treated with external beam radiation for carcinoma rectum.

    PubMed

    Rajeev, K R; Menon, Smrithy S; Beena, K; Holla, Raghavendra; Kumar, R Rajaneesh; Dinesh, M

    2014-01-01

A prospective study was undertaken to evaluate the influence of patient positioning on setup variations, to determine the planning target volume (PTV) margins, and to evaluate the clinically relevant volume of small bowel (SB) within the irradiated volume. Between December 2011 and April 2012, a computed tomography (CT) scan was done either in the supine position or in the prone position using a belly board (BB) for 20 consecutive patients. All patients had histologically proven rectal cancer and received either post- or pre-operative pelvic irradiation. Using a three-dimensional planning system, the dose-volume histogram for SB was defined in each axial CT slice. The total dose was 46-50 Gy (2 Gy/fraction), delivered using the 4-field box technique. The setup variation of the study group was assessed from the data received from the electronic portal imaging device in the linear accelerator. The shifts along the X, Y, and Z directions were noted. Both systematic and random errors were calculated, and from these values the PTV margin was calculated. The systematic errors of patients treated in the supine position were 0.87 mm (X), 0.66 mm (Y), 1.6 mm (Z), and in the prone position were 1.3 mm (X), 0.59 mm (Y), 1.17 mm (Z). The random errors in the supine position were 1.81 mm (X), 1.73 mm (Y), 1.83 mm (Z), and in the prone position were 2.02 mm (X), 1.21 mm (Y), 3.05 mm (Z). The calculated PTV margins in the supine position were 3.45 mm (X), 2.87 mm (Y), 5.31 mm (Z), and in the prone position were 4.91 mm (X), 2.32 mm (Y), 5.08 mm (Z). The mean volume of the peritoneal cavity was 648.65 cm³ in the prone position and 1197.37 cm³ in the supine position. The prone position using the BB device was more effective in reducing the irradiated SB volume in rectal cancer patients. There were no significant variations in the daily setup for patients treated in the supine and prone positions.
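
The reported supine margins appear consistent with the widely used van Herk population recipe M = 2.5Σ + 0.7σ; the paper does not state its exact formula, so this is an inference, but a quick check reproduces the supine numbers up to rounding:

```python
def van_herk_margin(sigma_sys, sigma_rand):
    """Population PTV margin recipe: 2.5 * systematic + 0.7 * random (mm)."""
    return 2.5 * sigma_sys + 0.7 * sigma_rand

# Supine-position errors from the study (mm): (Sigma, sigma) per axis
supine = {"X": (0.87, 1.81), "Y": (0.66, 1.73), "Z": (1.6, 1.83)}
margins = {ax: van_herk_margin(S, s) for ax, (S, s) in supine.items()}
```

For example, 2.5 × 0.87 + 0.7 × 1.81 ≈ 3.44 mm, matching the reported 3.45 mm X-margin to within rounding (the Y and Z axes match similarly).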

  19. Technical Note: Introduction of variance component analysis to setup error analysis in radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matsuo, Yukinori, E-mail: ymatsuo@kuhp.kyoto-u.ac.

Purpose: The purpose of this technical note is to introduce variance component analysis for the estimation of systematic and random components of setup error in radiotherapy. Methods: Balanced data according to a one-factor random effect model were assumed. Results: Analysis-of-variance (ANOVA)-based computation was applied to estimate the values and confidence intervals (CIs) of the systematic and random errors and the population mean of the setup errors. The conventional method overestimates the systematic error, especially in hypofractionated settings. The CI for the systematic error is much wider than that for the random error. The ANOVA-based estimation can be extended to a multifactor model considering multiple causes of setup errors (e.g., interpatient, interfraction, and intrafraction). Conclusions: Variance component analysis may lead to novel applications in setup error analysis in radiotherapy.
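
A minimal sketch of the ANOVA estimator for a balanced one-factor random-effects model (patients × fractions), using the standard textbook decomposition — this follows the usual formulas, not code from the note:

```python
def variance_components(data):
    """ANOVA estimates (Sigma^2, sigma^2) from balanced data:
    data[i][j] = setup error of patient i at fraction j."""
    k, m = len(data), len(data[0])
    patient_means = [sum(row) / m for row in data]
    grand = sum(patient_means) / k
    ms_between = m * sum((pm - grand) ** 2 for pm in patient_means) / (k - 1)
    ms_within = sum((x - pm) ** 2
                    for row, pm in zip(data, patient_means)
                    for x in row) / (k * (m - 1))
    sigma2 = ms_within                              # random component
    Sigma2 = max((ms_between - ms_within) / m, 0.0)  # systematic component
    return Sigma2, sigma2
```

This also shows why the conventional estimator (the sample variance of the per-patient mean errors) overestimates the systematic component: that variance equals Σ² + σ²/m, and the inflation term σ²/m is largest when the number of fractions m is small, i.e., in hypofractionated settings.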

  20. Error field assessment from driven rotation of stable external kinks at EXTRAP-T2R reversed field pinch

    NASA Astrophysics Data System (ADS)

    Volpe, F. A.; Frassinetti, L.; Brunsell, P. R.; Drake, J. R.; Olofsson, K. E. J.

    2013-04-01

    A new non-disruptive error field (EF) assessment technique not restricted to low density and thus low beta was demonstrated at the EXTRAP-T2R reversed field pinch. Stable and marginally stable external kink modes of toroidal mode number n = 10 and n = 8, respectively, were generated, and their rotation sustained, by means of rotating magnetic perturbations of the same n. Due to finite EFs, and in spite of the applied perturbations rotating uniformly and having constant amplitude, the kink modes were observed to rotate non-uniformly and be modulated in amplitude. This behaviour was used to precisely infer the amplitude and approximately estimate the toroidal phase of the EF. A subsequent scan permitted to optimize the toroidal phase. The technique was tested against deliberately applied as well as intrinsic EFs of n = 8 and 10. Corrections equal and opposite to the estimated error fields were applied. The efficacy of the error compensation was indicated by the increased discharge duration and more uniform mode rotation in response to a uniformly rotating perturbation. The results are in good agreement with theory, and the extension to lower n, to tearing modes and to tokamaks, including ITER, is discussed.

  1. Evaluation of marginal failures of dental composite restorations by acoustic emission analysis.

    PubMed

    Gu, Ja-Uk; Choi, Nak-Sam

    2013-01-01

In this study, a nondestructive method based on acoustic emission (AE) analysis was developed to evaluate the marginal failure states of dental composite restorations. Three types of ring-shaped substrates, which were modeled after a Class I cavity, were prepared from polymethyl methacrylate, stainless steel, and human molar teeth. A bonding agent and a composite resin were applied to the ring-shaped substrates and cured by light exposure. At each time-interval measurement, the tooth substrate presented a higher number of AE hits than the polymethyl methacrylate and steel substrates. Marginal disintegration estimates derived from cumulative AE hits and cumulative AE energy parameters showed that a significant portion of marginal gap formation was already realized within 1 min of the initial light-curing stage. Estimation based on cumulative AE energy gave a higher level of marginal failure than that based on AE hits. It was concluded that the AE analysis method developed in this study was a viable approach for predicting the clinical survival of dental composite restorations efficiently within a short test period.

  2. Comparison of weighting approaches for genetic risk scores in gene-environment interaction studies.

    PubMed

    Hüls, Anke; Krämer, Ursula; Carlsten, Christopher; Schikowski, Tamara; Ickstadt, Katja; Schwender, Holger

    2017-12-16

Weighted genetic risk scores (GRS), defined as weighted sums of risk alleles of single nucleotide polymorphisms (SNPs), are statistically powerful for detecting gene-environment (GxE) interactions. To assign weights, the gold standard is to use external weights from an independent study. However, appropriate external weights are not always available. In such situations, and in the presence of predominant marginal genetic effects, we have shown in a previous study that GRS with internal weights from marginal genetic effects ("GRS-marginal-internal") are a powerful and reliable alternative to single-SNP approaches or the use of unweighted GRS. However, this approach might not be appropriate for detecting predominant interactions, i.e. interactions showing an effect stronger than the marginal genetic effect. In this paper, we present a weighting approach for such predominant interactions ("GRS-interaction-training") in which part of the data is used to estimate the weights from the interaction terms and the remaining data are used to determine the GRS. We conducted a simulation study for the detection of GxE interactions in which we evaluated power, type I error, and sign misspecification. We compared this new weighting approach to the GRS-marginal-internal approach and to GRS with external weights. Our simulation study showed that in the absence of external weights and with predominant interaction effects, the highest power was reached with the GRS-interaction-training approach. If marginal genetic effects were predominant, the GRS-marginal-internal approach was more appropriate. Furthermore, the power to detect interactions reached by the GRS-interaction-training approach was only slightly lower than the power achieved by GRS with external weights. The power of the GRS-interaction-training approach was confirmed in a real data application to the Traffic, Asthma and Genetics (TAG) Study (N = 4465 observations).
When appropriate external weights are unavailable, we recommend using internal weights from the study population itself to construct weighted GRS for GxE interaction studies. If the SNPs were chosen because a strong marginal genetic effect was hypothesized, GRS-marginal-internal should be used. If the SNPs were chosen because of their collective impact on the biological mechanisms mediating the environmental effect (the hypothesis of predominant interactions), GRS-interaction-training should be applied.
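
The core construction can be sketched as follows; weight_fn is a hypothetical placeholder for whatever supplies the per-SNP weights (external betas, marginal genetic effects, or interaction terms estimated on the training split, as in GRS-interaction-training):

```python
def weighted_grs(genotypes, weights):
    """Weighted genetic risk score: sum of risk-allele counts (0/1/2 per SNP)
    times the per-SNP weights, one score per person."""
    return [sum(w * g for w, g in zip(weights, person)) for person in genotypes]

def split_train_score(genotypes, weight_fn, train_frac=0.5):
    """GRS-interaction-training idea (sketch): estimate weights on a training
    split, then score only the held-out subjects with those weights."""
    n_train = int(len(genotypes) * train_frac)
    weights = weight_fn(genotypes[:n_train])  # hypothetical weight estimator
    return weighted_grs(genotypes[n_train:], weights)
```

The split matters: estimating interaction weights and testing the GRS×E interaction on the same subjects would reuse the data and inflate the type I error, which is why the paper reserves part of the sample for weight estimation.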

  3. Street Sex Work: Re/Constructing Discourse from Margin to Center

    ERIC Educational Resources Information Center

    McCracken, Jill Linnette

    2009-01-01

    Newspaper media create interpretations of marginalized groups that require rhetorical analysis so that we can better understand these representations. This article focuses on how newspaper articles create interpretations of sex work that affect both the marginalized and mainstream communities. My ethnographic case study argues that the material…

  4. On Time Delay Margin Estimation for Adaptive Control and Optimal Control Modification

    NASA Technical Reports Server (NTRS)

    Nguyen, Nhan T.

    2011-01-01

This paper presents methods for estimating the time delay margin for adaptive control of input delay systems with almost linear structured uncertainty. The bounded linear stability analysis method seeks to represent an adaptive law by a locally bounded linear approximation within a small time window. The time delay margin of this input delay system represents a local stability measure and is computed analytically by three methods: Pade approximation, the Lyapunov-Krasovskii method, and the matrix measure method. These methods are applied to standard model-reference adaptive control, the σ-modification adaptive law, and the optimal control modification adaptive law. The windowing analysis results in non-unique estimates of the time delay margin, since it depends on the length of the time window and on parameters that vary from one time window to the next. The optimal control modification adaptive law overcomes this limitation in that, as the adaptive gain tends to infinity and if the matched uncertainty is linear, the closed-loop input delay system tends to an LTI system. A lower bound of the time delay margin of this system can then be estimated uniquely without the need for the windowing analysis. Simulation results demonstrate the feasibility of the bounded linear stability method for time delay margin estimation.
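
For contrast with the windowed adaptive analysis, the classical LTI delay margin (the baseline the limiting LTI closed loop would be measured against) is simply the phase margin divided by the gain-crossover frequency — the pure delay whose phase lag at crossover erases the phase margin:

```python
import math

def lti_delay_margin(phase_margin_deg, crossover_rad_s):
    """Classical LTI delay margin (seconds): the pure input delay that
    consumes the phase margin at the gain-crossover frequency."""
    return math.radians(phase_margin_deg) / crossover_rad_s
```

For example, an integrator loop L(s) = K/s with K = 2 crosses over at 2 rad/s with a 90° phase margin, so the delay margin is π/4 ≈ 0.785 s.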

  5. Robust Stability Analysis of the Space Launch System Control Design: A Singular Value Approach

    NASA Technical Reports Server (NTRS)

    Pei, Jing; Newsome, Jerry R.

    2015-01-01

Classical stability analysis consists of breaking the feedback loops one at a time and determining separately how much gain or phase variation would destabilize the stable nominal feedback system. For typical launch vehicle control design, classical control techniques are generally employed. In addition to stability margins, frequency domain Monte Carlo methods are used to evaluate the robustness of the design. However, such techniques were developed for Single-Input-Single-Output (SISO) systems and do not take into consideration the off-diagonal terms in the transfer function matrix of Multi-Input-Multi-Output (MIMO) systems. Robust stability analysis techniques such as H(sub infinity) and mu are applicable to MIMO systems but have not been adopted as standard practice within the launch vehicle controls community. This paper applied a simple singular-value-based MIMO stability margin evaluation method, based on work by Mukhopadhyay and Newsom, to the SLS high-fidelity dynamics model. The method computes a simultaneous multi-loop gain and phase margin that can be related back to classical margins. The results presented in this paper suggest that for the SLS system, traditional SISO stability margins are similar to the MIMO margins. This additional level of verification provides confidence in the robustness of the control design.
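
A sketch of the singular-value idea (illustrative, not the SLS analysis code): if α is the minimum over frequency of σ_min(I + L(jω)) for loop transfer matrix L, standard results guarantee a simultaneous multiloop phase margin of ±2 arcsin(α/2). For a 2×2 loop, σ_min can be computed directly from the eigenvalues of (I+L)ᴴ(I+L):

```python
import math

def sigma_min_2x2(M):
    """Smallest singular value of a 2x2 complex matrix, via the eigenvalues
    of the Hermitian product M^H M (closed form for the 2x2 case)."""
    a = [[sum(M[k][i].conjugate() * M[k][j] for k in range(2)) for j in range(2)]
         for i in range(2)]
    t = (a[0][0] + a[1][1]).real                       # trace of M^H M
    d = (a[0][0] * a[1][1] - a[0][1] * a[1][0]).real   # det of M^H M
    lam_min = (t - math.sqrt(max(t * t - 4.0 * d, 0.0))) / 2.0
    return math.sqrt(max(lam_min, 0.0))

def guaranteed_phase_margin_deg(alpha):
    """Simultaneous multiloop phase margin implied by alpha = min sigma_min(I+L)."""
    return math.degrees(2.0 * math.asin(min(alpha, 2.0) / 2.0))
```

In practice σ_min(I + L(jω)) is evaluated over a frequency grid and the worst-case value α is converted to the guaranteed simultaneous margins; α = 1 (e.g., L = 0) corresponds to the familiar ±60° guarantee.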

  6. The Influence of Observation Errors on Analysis Error and Forecast Skill Investigated with an Observing System Simulation Experiment

    NASA Technical Reports Server (NTRS)

    Prive, N. C.; Errico, R. M.; Tai, K.-S.

    2013-01-01

    The Global Modeling and Assimilation Office (GMAO) observing system simulation experiment (OSSE) framework is used to explore the response of analysis error and forecast skill to observation quality. In an OSSE, synthetic observations may be created that have much smaller error than real observations, and precisely quantified error may be applied to these synthetic observations. Three experiments are performed in which synthetic observations with magnitudes of applied observation error that vary from zero to twice the estimated realistic error are ingested into the Goddard Earth Observing System Model (GEOS-5) with Gridpoint Statistical Interpolation (GSI) data assimilation for a one-month period representing July. The analysis increment and observation innovation are strongly impacted by observation error, with much larger variances for increased observation error. The analysis quality is degraded by increased observation error, but the change in root-mean-square error of the analysis state is small relative to the total analysis error. Surprisingly, in the 120 hour forecast increased observation error only yields a slight decline in forecast skill in the extratropics, and no discernable degradation of forecast skill in the tropics.

  7. Nuclear Munitions and Missile Maintenance Officer Attraction and Retention

    DTIC Science & Technology

    2009-03-24

    referring to Maslow and Herzberg.25 Abraham Maslow developed a five-level hierarchy of needs, where once a lower level of needs is met, they no...purely on 21M experience (need either 21A or 13S as well) 2. Interesting work – although nuclear deterrence theory is interesting, the actual work...by necessity , is repetitive with no margin for error 3. Recognition for good work – while the normal awards programs exist in the nuclear enterprise

  8. Report of the Commission to Assess the Threat to the United States from Electromagnetic Pulse (EMP) Attack: Critical National Infrastructures

    DTIC Science & Technology

    2008-04-01

consumers and electric utilities in Arizona and Southern California. Twelve people, including five children, died as a result of the explosion. The...Modern electronics, communications, protection, control and computers have allowed the physical system to be utilized fully with ever smaller... margins for error. Therefore, a relatively modest upset to the system can cause functional collapse. As the system grows in complexity and interdependence

  9. Solar Asset Management Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Iverson, Aaron; Zviagin, George

    Ra Power Management (RPM) has developed a cloud based software platform that manages the financial and operational functions of third party financed solar projects throughout their lifecycle. RPM’s software streamlines and automates the sales, financing, and management of a portfolio of solar assets. The software helps solar developers automate the most difficult aspects of asset management, leading to increased transparency, efficiency, and reduction in human error. More importantly, our platform will help developers save money by improving their operating margins.

  10. Recruiter Stress: An Experiment Using Text-messages as a Stress Intervention

    DTIC Science & Technology

    2013-02-01

messages covered a number of areas (e.g., physiological, cognitive, social) and were generally self-contained with 140 characters or less (to meet text...although most differences here are small and within margins of error. The two most noticeable differences are for the impact of work stress on job...order to compute estimated hour and day losses. Using recruiter population estimates, over 4,500 man-hours were lost due to late arrivals and over 9,500

  11. Margins in extra-abdominal desmoid tumors: a comparative analysis.

    PubMed

    Leithner, Andreas; Gapp, Markus; Leithner, Katharina; Radl, Roman; Krippl, Peter; Beham, Alfred; Windhager, Reinhard

    2004-06-01

The main treatment of extra-abdominal desmoid tumors remains surgery, but recurrence rates of up to 80% are reported. The impact of microscopic surgical margin status according to the Enneking classification system remains controversial. The authors therefore screened the published literature for reliable data on the importance of a wide or radical excision of extra-abdominal desmoid tumors. All studies with more than ten patients, surgical treatment only, and margin status stated were included. Only 12 of 49 identified studies fulfilled the inclusion criteria. One hundred fifty-two primary tumors were excised with wide or radical microscopic surgical margins, while in 260 cases a marginal or intralesional excision was performed. In the first group, 41 patients (27%) developed a recurrence, compared with 187 patients (72%) in the second. Microscopic surgical margin status according to the Enneking classification system is therefore a significant prognostic factor (P < 0.001). The data of this review underline the strategy of a wide or radical local excision as the treatment of choice. Furthermore, as a large number of studies had to be excluded from this analysis, exact microscopic surgical margin status should be provided in future studies to allow comparability. Copyright 2004 Wiley-Liss, Inc.
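
The stated significance can be re-checked with a Pearson chi-square test on the pooled 2×2 table (41/152 vs. 187/260 recurrences). This is an illustrative re-computation; the review does not name the test it used:

```python
import math

def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic (df = 1) for the 2x2 table [[a, b], [c, d]]
    and its p-value via the chi-square survival function with 1 df."""
    n = a + b + c + d
    chi2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    p = math.erfc(math.sqrt(chi2 / 2.0))
    return chi2, p

# Recurrence vs. no recurrence: wide/radical (41 of 152) vs. marginal/intralesional (187 of 260)
chi2, p = chi2_2x2(41, 152 - 41, 187, 260 - 187)
```

The statistic comes out near 78, with a p-value far below 0.001, consistent with the reported P < 0.001.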

  12. Intra-patient semi-automated segmentation of the cervix-uterus in CT-images for adaptive radiotherapy of cervical cancer

    NASA Astrophysics Data System (ADS)

    Luiza Bondar, M.; Hoogeman, Mischa; Schillemans, Wilco; Heijmen, Ben

    2013-08-01

    For online adaptive radiotherapy of cervical cancer, fast and accurate image segmentation is required to facilitate daily treatment adaptation. Our aim was twofold: (1) to test and compare three intra-patient automated segmentation methods for the cervix-uterus structure in CT-images and (2) to improve the segmentation accuracy by including prior knowledge on the daily bladder volume or on the daily coordinates of implanted fiducial markers. The tested methods were: shape deformation (SD) and atlas-based segmentation (ABAS) using two non-rigid registration methods: demons and a hierarchical algorithm. Tests on 102 CT-scans of 13 patients demonstrated that the segmentation accuracy significantly increased by including the bladder volume predicted with a simple 1D model based on a manually defined bladder top. Moreover, manually identified implanted fiducial markers significantly improved the accuracy of the SD method. For patients with large cervix-uterus volume regression, the use of CT-data acquired toward the end of the treatment was required to improve segmentation accuracy. Including prior knowledge, the segmentation results of SD (Dice similarity coefficient 85 ± 6%, error margin 2.2 ± 2.3 mm, average time around 1 min) and of ABAS using hierarchical non-rigid registration (Dice 82 ± 10%, error margin 3.1 ± 2.3 mm, average time around 30 s) support their use for image guided online adaptive radiotherapy of cervical cancer.
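
The Dice similarity coefficient quoted above has a one-line definition — twice the overlap divided by the total size of the two segmentations. A minimal version over voxel-index sets:

```python
def dice(a, b):
    """Dice similarity coefficient between two segmentations given as
    iterables of voxel indices; 1.0 means identical, 0.0 means disjoint."""
    a, b = set(a), set(b)
    return 2.0 * len(a & b) / (len(a) + len(b)) if (a or b) else 1.0
```

In practice the sets would be the voxel indices inside the automated and the manually delineated cervix-uterus contours; a score of 85% as reported above indicates substantial but not perfect overlap.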

  13. Intra-patient semi-automated segmentation of the cervix-uterus in CT-images for adaptive radiotherapy of cervical cancer.

    PubMed

    Bondar, M Luiza; Hoogeman, Mischa; Schillemans, Wilco; Heijmen, Ben

    2013-08-07

    For online adaptive radiotherapy of cervical cancer, fast and accurate image segmentation is required to facilitate daily treatment adaptation. Our aim was twofold: (1) to test and compare three intra-patient automated segmentation methods for the cervix-uterus structure in CT-images and (2) to improve the segmentation accuracy by including prior knowledge on the daily bladder volume or on the daily coordinates of implanted fiducial markers. The tested methods were: shape deformation (SD) and atlas-based segmentation (ABAS) using two non-rigid registration methods: demons and a hierarchical algorithm. Tests on 102 CT-scans of 13 patients demonstrated that the segmentation accuracy significantly increased by including the bladder volume predicted with a simple 1D model based on a manually defined bladder top. Moreover, manually identified implanted fiducial markers significantly improved the accuracy of the SD method. For patients with large cervix-uterus volume regression, the use of CT-data acquired toward the end of the treatment was required to improve segmentation accuracy. Including prior knowledge, the segmentation results of SD (Dice similarity coefficient 85 ± 6%, error margin 2.2 ± 2.3 mm, average time around 1 min) and of ABAS using hierarchical non-rigid registration (Dice 82 ± 10%, error margin 3.1 ± 2.3 mm, average time around 30 s) support their use for image guided online adaptive radiotherapy of cervical cancer.

  14. Baryon acoustic oscillations from the complete SDSS-III Lyα-quasar cross-correlation function at z = 2.4

    DOE PAGES

    du Mas des Bourboux, Helion; Le Goff, Jean-Marc; Blomqvist, Michael; ...

    2017-08-08

We present a measurement of baryon acoustic oscillations (BAO) in the cross-correlation of quasars with the Lyα-forest flux-transmission at a mean redshift z = 2.40. The measurement uses the complete SDSS-III data sample: 168,889 forests and 234,367 quasars from the SDSS Data Release DR12. In addition to the statistical improvement on our previous study using DR11, we have implemented numerous improvements at the analysis level allowing a more accurate measurement of this cross-correlation. We also developed the first simulations of the cross-correlation allowing us to test different aspects of our data analysis and to search for potential systematic errors in the determination of the BAO peak position. We measure the two ratios D_H(z = 2.40)/r_d = 9.01 ± 0.36 and D_M(z = 2.40)/r_d = 35.7 ± 1.7, where the errors include marginalization over the non-linear velocity of quasars and the metal-quasar cross-correlation contribution, among other effects. These results are within 1.8σ of the prediction of the flat-ΛCDM model describing the observed CMB anisotropies. We combine this study with the Lyα-forest auto-correlation function (Bautista et al. 2017), yielding D_H(z = 2.40)/r_d = 8.94 ± 0.22 and D_M(z = 2.40)/r_d = 36.6 ± 1.2, within 2.3σ of the same flat-ΛCDM model.

  15. Baryon acoustic oscillations from the complete SDSS-III Lyα-quasar cross-correlation function at z = 2.4

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    du Mas des Bourboux, Helion; Le Goff, Jean-Marc; Blomqvist, Michael

    We present a measurement of baryon acoustic oscillations (BAO) in the cross-correlation of quasars with the Lyα-forest flux-transmission at a mean redshift z = 2.40. The measurement uses the complete SDSS-III data sample: 168,889 forests and 234,367 quasars from the SDSS Data Release DR12. In addition to the statistical improvement on our previous study using DR11, we have implemented numerous improvements at the analysis level allowing a more accurate measurement of this cross-correlation. We also developed the first simulations of the cross-correlation allowing us to test different aspects of our data analysis and to search for potential systematic errors in the determination of the BAO peak position. We measure the two ratios D_H(z = 2.40)/r_d = 9.01 ± 0.36 and D_M(z = 2.40)/r_d = 35.7 ± 1.7, where the errors include marginalization over the non-linear velocity of quasars and the metal-quasar cross-correlation contribution, among other effects. These results are within 1.8σ of the prediction of the flat-ΛCDM model describing the observed CMB anisotropies. We combine this study with the Lyα-forest auto-correlation function (Bautista et al. 2017), yielding D_H(z = 2.40)/r_d = 8.94 ± 0.22 and D_M(z = 2.40)/r_d = 36.6 ± 1.2, within 2.3σ of the same flat-ΛCDM model.

  16. Bounded Linear Stability Analysis - A Time Delay Margin Estimation Approach for Adaptive Control

    NASA Technical Reports Server (NTRS)

    Nguyen, Nhan T.; Ishihara, Abraham K.; Krishnakumar, Kalmanje Srinivas; Bakhtiari-Nejad, Maryam

    2009-01-01

    This paper presents a method for estimating the time delay margin of model-reference adaptive control of systems with almost-linear structured uncertainty. The bounded linear stability analysis method seeks to represent the conventional model-reference adaptive law by a locally bounded linear approximation within a small time window using the comparison lemma. The locally bounded linear approximation of the combined adaptive system is cast in the form of an input-time-delay differential equation over a small time window. The time delay margin of this system represents a local stability measure and is computed analytically by a matrix measure method, which provides a simple analytical technique for estimating an upper bound on the time delay margin. Based on simulation results for a scalar model-reference adaptive control system, both the bounded linear stability method and the matrix measure method are seen to provide a reasonably accurate and yet not overly conservative time delay margin estimation.
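The matrix measure (logarithmic norm) at the heart of the bound is easy to compute; a sketch for the 2-norm case, with an invented system matrix (not one from the paper):

```python
import numpy as np

def matrix_measure_2(A):
    """Matrix measure (logarithmic norm) induced by the 2-norm:
    mu_2(A) = largest eigenvalue of the symmetric part (A + A^T)/2."""
    return float(np.linalg.eigvalsh((A + A.T) / 2.0).max())

# Invented stable system matrix
A = np.array([[-1.0, 2.0],
              [0.0, -3.0]])
mu = matrix_measure_2(A)
# ||exp(A t)|| <= exp(mu * t), so a negative measure certifies exponential decay
print(mu)
```

Unlike an eigenvalue bound, the matrix measure controls the transient as well as the asymptotic behaviour, which is what makes it usable for delay-margin estimates over a finite time window.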

  17. OCT structure, COB location and magmatic type of the SE Brazilian & S Angolan margins from integrated quantitative analysis of deep seismic reflection and gravity anomaly data

    NASA Astrophysics Data System (ADS)

    Cowie, L.; Kusznir, N. J.; Horn, B.

    2013-12-01

    Knowledge of ocean-continent transition (OCT) structure, continent-ocean boundary (COB) location and magmatic type is of critical importance for understanding rifted continental margin formation processes and in evaluating petroleum systems in deep-water frontier oil and gas exploration. The OCT structure, COB location and magmatic type of the SE Brazilian and S Angolan rifted continental margins are much debated; exhumed and serpentinised mantle have been reported at these margins. Integrated quantitative analysis using deep seismic reflection data and gravity inversion has been used to determine OCT structure, COB location and magmatic type for the SE Brazilian and S Angolan margins. Gravity inversion has been used to determine Moho depth, crustal basement thickness and continental lithosphere thinning. Residual Depth Anomaly (RDA) analysis has been used to investigate OCT bathymetric anomalies with respect to expected oceanic bathymetries and subsidence analysis has been used to determine the distribution of continental lithosphere thinning. These techniques have been validated on the Iberian margin for profiles IAM9 and ISE-01. In addition a joint inversion technique using deep seismic reflection and gravity anomaly data has been applied to the ION-GXT BS1-575 SE Brazil and ION-GXT CS1-2400 S Angola profiles. The joint inversion method solves for coincident seismic and gravity Moho in the time domain and calculates the lateral variations in crustal basement densities and velocities along profile. Gravity inversion, RDA and subsidence analysis along the S Angolan ION-GXT CS1-2400 profile have been used to determine OCT structure and COB location. Analysis suggests that exhumed mantle, corresponding to a magma poor margin, is absent beneath the allochthonous salt. The thickness of earliest oceanic crust, derived from gravity and deep seismic reflection data, is approximately 7 km. 
The joint inversion predicts crustal basement densities and seismic velocities which are slightly less than expected for 'normal' oceanic crust. The difference between the sediment corrected RDA and that predicted from gravity inversion crustal thickness variation implies that this margin is experiencing ~300m of anomalous uplift attributed to mantle dynamic uplift. Gravity inversion, RDA and subsidence analysis have also been used to determine OCT structure and COB location along the ION-GXT BS1-575 profile, crossing the Sao Paulo Plateau and Florianopolis Ridge of the SE Brazilian margin. Gravity inversion, RDA and subsidence analysis predict the COB to be located SE of the Florianopolis Ridge. Analysis shows no evidence for exhumed mantle on this margin profile. The joint inversion technique predicts normal oceanic basement seismic velocities and densities and beneath the Sao Paulo Plateau and Florianopolis Ridge predicts crustal basement thicknesses between 10-15km. The Sao Paulo Plateau and Florianopolis Ridge are separated by a thin region of crustal basement beneath the salt interpreted as a regional transtensional structure. Sediment corrected RDAs and gravity derived 'synthetic' RDAs are of a similar magnitude on oceanic crust, implying negligible mantle dynamic topography.

  18. CO2 laser versus cold steel margin analysis following endoscopic excision of glottic cancer

    PubMed Central

    2014-01-01

    Objective To compare the suitability of CO2 laser with steel instruments for margin excision in transoral laser microsurgery. Methods Prospective randomized blinded study. Patients with glottic cancer undergoing laser resection were randomized to margin excision by either steel instruments or CO2 laser. Margins were analyzed for size, interpretability and degree of artifact by a pathologist who was blinded to technique. Results 45 patients were enrolled in the study with 226 total margins taken. 39 margins taken by laser had marked artifact and 0 were uninterpretable. 20 margins taken by steel instruments had marked artifact, and 2 were uninterpretable. Controlling for margin size, the laser technique was associated with increasing degrees of margin artifact (p = 0.210), but there was no difference in crude rates of uninterpretability (p = 0.24). Conclusion Laser margin excision is associated with a greater degree of artifact than steel instrument excision, but was not associated with higher rate of uninterpretability. PMID:24502856

  19. A primer on marginal effects-part II: health services research applications.

    PubMed

    Onukwugha, E; Bergtold, J; Jain, R

    2015-02-01

    Marginal analysis evaluates changes in a regression function associated with a unit change in a relevant variable. The primary statistic of marginal analysis is the marginal effect (ME). The ME facilitates the examination of outcomes for defined patient profiles or individuals while measuring the change in original units (e.g., costs, probabilities). The ME has a long history in economics; however, it is not widely used in health services research despite its flexibility and ability to provide unique insights. This article, the second in a two-part series, discusses practical issues that arise in the estimation and interpretation of the ME for a variety of regression models often used in health services research. Part one provided an overview of prior studies discussing the ME, followed by derivation of ME formulas for various regression models relevant to health services research studies examining costs and utilization. The current article illustrates the calculation and interpretation of the ME in practice and discusses practical issues that arise during implementation, including: understanding differences between software packages in terms of the functionality available for calculating the ME and its confidence interval, interpretation of the average marginal effect versus the marginal effect at the mean, and the difference between the ME and relative effects (e.g., odds ratios). Programming code to calculate the ME using SAS, Stata, LIMDEP, and MATLAB is also provided. The illustration, discussion, and application of the ME in this two-part series support the conduct of future studies applying the concept of marginal analysis.
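For a logistic model, the average marginal effect (AME) and the marginal effect at the mean (MEM) contrasted above differ only in where the density p(1−p) is evaluated; a numpy sketch with invented coefficients and data (not output from the article):

```python
import numpy as np

def ame_logit(beta, X):
    """Average marginal effect of each covariate in a logistic model.
    `beta` holds fitted coefficients (intercept first); X has a leading ones column."""
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    return (p * (1 - p)).mean() * beta[1:]   # average the density, then scale

def mem_logit(beta, X):
    """Marginal effect evaluated at the mean covariate vector."""
    xbar = X.mean(axis=0)
    p = 1.0 / (1.0 + np.exp(-xbar @ beta))
    return p * (1 - p) * beta[1:]

rng = np.random.default_rng(0)
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=n), rng.binomial(1, 0.4, n)])
beta = np.array([-0.5, 0.8, 0.3])   # hypothetical fitted coefficients
print(ame_logit(beta, X), mem_logit(beta, X))
```

The two quantities agree only when p(1−p) is roughly constant over the sample, which is one practical reason the article distinguishes them.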

  20. Effect of resealing on microleakage of resin composite restorations in relationship to margin design and composite type

    PubMed Central

    Antonson, Sibel A.; Yazici, A. Ruya; Okte, Zeynep; Villalta, Patricia; Antonson, Donald E.; Hardigan, Patrick C.

    2012-01-01

    Objective: To determine the relationship between margin preparation design and resin-composite type on microleakage with or without re-application of surface-penetrating sealant. Methods: Class-I resin-composite restorations were completed for 128 extracted human molars. Half of the margins were beveled, the other half, butt-joint. Half of each group was restored with Filtek-Supreme (FS), the other half with Esthet-X (EX) using their respective adhesive systems. Margins were etched and sealed with a surface-penetrating sealant, Fortify. The samples were stored in water 24h, and thermocycled (5,000 cycles, 5°C–55°C). Then, samples were abraded using a toothbrush machine (6,000 strokes). Half of the restorations from each sealant group (n=16) were resealed, and the other half had no further treatment. Thermocycling and tooth brushing were repeated. The samples were sealed with nail polish, immersed in methylene-blue for 8h, sectioned, and magnified digital photographs were taken. Three examiners assessed dye penetration. A 2×2×2 multi-layered Chi-Square analysis, using Cochran-Mantel-Haenszel test was conducted for statistical analysis. Results: No difference was observed between sealed and resealed FS and EX restorations with butt-joint margins. In beveled margins, resealing caused significantly less microleakage (P<.01). No differences were found between restorations either sealed or resealed with bevel margins. In butt-joint margins, at the leakage level deeper than 2/3 of the preparation depth, resealed FS showed less microleakage than EX resealed restorations (P<.01). Conclusion: Resealing reduced microleakage in bevel margins, however, in butt-joint margins resealing did not affect the leakage. A significant statistical relationship exists between and within resealing, margin preparation design, type of composite, and microleakage. PMID:23077418

  1. Optimum Component Design in N-Stage Series Systems to Maximize the Reliability Under Budget Constraint

    DTIC Science & Technology

    2003-03-01

    Table-of-contents excerpt: 2.8.5 Marginal Analysis Method … Figure 11: Improved Configuration of Figure 10; Increases Basic System Reliability … Figure 12: Example of marginal analysis … View of Main Book of Software … Figure 20: The View of Data Worksheet

  2. Estimation Of TMDLs And Margin Of Safety Under Conditions Of Uncertainty

    EPA Science Inventory

    In TMDL development, an adequate margin of safety (MOS) is required in the calculation process to provide a cushion needed because of uncertainties in the data and analysis. Current practices, however, rarely factor analysis' uncertainty in TMDL development and the MOS is largel...

  3. Lack of dependence on resonant error field of locked mode island size in ohmic plasmas in DIII-D

    NASA Astrophysics Data System (ADS)

    La Haye, R. J.; Paz-Soldan, C.; Strait, E. J.

    2015-02-01

    DIII-D experiments show that fully penetrated resonant n = 1 error field locked modes in ohmic plasmas with safety factor q95 ≳ 3 grow to similar large disruptive size, independent of resonant error field correction. Relatively small resonant (m/n = 2/1) static error fields are shielded in ohmic plasmas by the natural rotation at the electron diamagnetic drift frequency. However, the drag from error fields can lower rotation such that a bifurcation results, from nearly complete shielding to full penetration, i.e., to a driven locked mode island that can induce disruption. Error field correction (EFC) is performed on DIII-D (in ITER relevant shape and safety factor q95 ≳ 3) with either the n = 1 C-coil (no handedness) or the n = 1 I-coil (with ‘dominantly’ resonant field pitch). Despite EFC, which allows significantly lower plasma density (a ‘figure of merit’) before penetration occurs, the resulting saturated islands have similar large size; they differ only in the phase of the locked mode after typically being pulled (by up to 30° toroidally) in the electron diamagnetic drift direction as they grow to saturation. Island amplification and phase shift are explained by a second change-of-state in which the classical tearing index changes from stable to marginal by the presence of the island, which changes the current density profile. The eventual island size is thus governed by the inherent stability and saturation mechanism rather than the driving error field.

  4. A comparison of machine learning methods for classification using simulation with multiple real data examples from mental health studies.

    PubMed

    Khondoker, Mizanur; Dobson, Richard; Skirrow, Caroline; Simmons, Andrew; Stahl, Daniel

    2016-10-01

    Recent literature on the comparison of machine learning methods has raised questions about the neutrality, unbiasedness and utility of many comparative studies. Reporting of results on favourable datasets and sampling error in the estimated performance measures based on single samples are thought to be the major sources of bias in such comparisons. Better performance in one or a few instances does not necessarily imply better performance on average or at the population level, and simulation studies may be a better alternative for objectively comparing the performances of machine learning algorithms. We compare the classification performance of a number of important and widely used machine learning algorithms, namely Random Forests (RF), Support Vector Machines (SVM), Linear Discriminant Analysis (LDA) and k-Nearest Neighbour (kNN). Using massively parallel processing on high-performance supercomputers, we compare the generalisation errors at various combinations of levels of several factors: number of features, training sample size, biological variation, experimental variation, effect size, replication and correlation between features. For a small number of correlated features, with the number of features not exceeding approximately half the sample size, LDA was found to be the method of choice in terms of average generalisation errors as well as stability (precision) of error estimates. SVM (with RBF kernel) outperforms LDA as well as RF and kNN by a clear margin as the feature set gets larger, provided the sample size is not too small (at least 20). The performance of kNN also improves as the number of features grows and overtakes that of LDA and RF unless the data variability is too high and/or effect sizes are too small. RF was found to outperform only kNN in some instances where the data are more variable and have smaller effect sizes, in which cases it also provides more stable error estimates than kNN and LDA. 
Applications to a number of real datasets supported the findings from the simulation study. © The Author(s) 2013.
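The simulation design described above can be miniaturised. The sketch below uses invented Gaussian class data and plain-numpy LDA and kNN (not the authors' implementations) to estimate generalisation error at a single factor combination:

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate(n_per_class, p, delta):
    """Two Gaussian classes whose means differ by `delta` on every feature."""
    X0 = rng.normal(0.0, 1.0, size=(n_per_class, p))
    X1 = rng.normal(delta, 1.0, size=(n_per_class, p))
    return np.vstack([X0, X1]), np.array([0] * n_per_class + [1] * n_per_class)

def lda_fit_predict(Xtr, ytr, Xte):
    """Two-class LDA with a pooled covariance estimate."""
    m0, m1 = Xtr[ytr == 0].mean(0), Xtr[ytr == 1].mean(0)
    S = np.cov(np.vstack([Xtr[ytr == 0] - m0, Xtr[ytr == 1] - m1]).T)
    w = np.linalg.solve(S, m1 - m0)          # discriminant direction
    c = w @ (m0 + m1) / 2.0                  # midpoint threshold
    return (Xte @ w > c).astype(int)

def knn_predict(Xtr, ytr, Xte, k=5):
    """Brute-force k-nearest-neighbour majority vote."""
    d = ((Xte[:, None, :] - Xtr[None, :, :]) ** 2).sum(-1)
    idx = np.argsort(d, axis=1)[:, :k]
    return (ytr[idx].mean(1) > 0.5).astype(int)

Xtr, ytr = simulate(50, 5, 1.0)    # small training set
Xte, yte = simulate(500, 5, 1.0)   # large test set approximates generalisation error
err_lda = (lda_fit_predict(Xtr, ytr, Xte) != yte).mean()
err_knn = (knn_predict(Xtr, ytr, Xte) != yte).mean()
print(err_lda, err_knn)
```

Repeating this over a grid of sample sizes, feature counts and effect sizes (and many replicates) is the essence of the study's factorial simulation, here reduced to one cell.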

  5. Common Bolted Joint Analysis Tool

    NASA Technical Reports Server (NTRS)

    Imtiaz, Kauser

    2011-01-01

    Common Bolted Joint Analysis Tool (comBAT) is an Excel/VB-based bolted joint analysis/optimization program that lays out a systematic foundation for an inexperienced or seasoned analyst to determine fastener size, material, and assembly torque for a given design. Analysts are able to perform numerous what-if scenarios within minutes to arrive at an optimal solution. The program evaluates input design parameters, performs joint assembly checks, and steps through numerous calculations to arrive at several key margins of safety for each member in a joint. It also checks for joint gapping, provides fatigue calculations, and generates joint diagrams for a visual reference. Optimum fastener size and material, as well as correct torque, can then be provided. Analysis methodology, equations, and guidelines are provided throughout the solution sequence so that this program does not become a "black box" for the analyst. There are built-in databases that reduce the legwork required by the analyst. Each step is clearly identified and results are provided in number format, as well as color-coded spelled-out words to draw user attention. The three key features of the software are robust technical content, innovative and user friendly I/O, and a large database. The program addresses every aspect of bolted joint analysis and proves to be an instructional tool at the same time. It saves analysis time, has intelligent messaging features, and catches operator errors in real time.
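The "margins of safety" such a tool reports follow the standard structural definition; a minimal sketch with invented numbers (comBAT's actual equations are not reproduced here):

```python
def margin_of_safety(allowable, applied, factor_of_safety=1.0):
    """Classic margin of safety: MS = allowable / (applied * FS) - 1.
    MS >= 0 means the member passes at the stated factor of safety."""
    return allowable / (applied * factor_of_safety) - 1.0

# Hypothetical bolt: 60 kN tensile allowable, 32 kN applied load, FS = 1.4
ms = margin_of_safety(60e3, 32e3, 1.4)
print(f"MS = {ms:+.2f}")  # positive -> acceptable
```

A joint tool evaluates one such margin per failure mode (bolt tension, shear, thread stripping, gapping) and reports the smallest as governing.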

  6. Evaluation of the marginal fit of single-unit, complete-coverage ceramic restorations fabricated after digital and conventional impressions: A systematic review and meta-analysis.

    PubMed

    Tsirogiannis, Panagiotis; Reissmann, Daniel R; Heydecke, Guido

    2016-09-01

    In existing published reports, some studies indicate the superiority of digital impression systems in terms of the marginal accuracy of ceramic restorations, whereas others show that the conventional method provides restorations with better marginal fit than fully digital fabrication. Which impression method provides the lowest mean values for marginal adaptation is inconclusive. The findings from those studies cannot be easily generalized, and in vivo studies that could provide valid and meaningful information are limited in the existing publications. The purpose of this study was to systematically review existing reports and evaluate the marginal fit of ceramic single-tooth restorations after either digital or conventional impression methods by combining the available evidence in a meta-analysis. The search strategy for this systematic review of the publications was based on a Population, Intervention, Comparison, and Outcome (PICO) framework. For the statistical analysis, the mean marginal fit values of each study were extracted and categorized according to the impression method to calculate the mean value, together with the 95% confidence intervals (CI) of each category, and to evaluate the impact of each impression method on the marginal adaptation by comparing digital and conventional techniques separately for in vitro and in vivo studies. Twelve studies were included in the meta-analysis from the 63 identified records after database searching. For the in vitro studies, where ceramic restorations were fabricated after conventional impressions, the mean value of the marginal fit was 58.9 μm (95% CI: 41.1-76.7 μm), whereas after digital impressions, it was 63.3 μm (95% CI: 50.5-76.0 μm). 
In the in vivo studies, the mean marginal discrepancy of the restorations after digital impressions was 56.1 μm (95% CI: 46.3-65.8 μm), whereas after conventional impressions, it was 79.2 μm (95% CI: 59.6-98.9 μm). No significant difference was observed regarding the marginal discrepancy of single-unit ceramic restorations fabricated after digital or conventional impressions. Copyright © 2016 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.
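Pooling per-study means into a category mean with a 95% CI, as done in the meta-analysis above, is conventionally an inverse-variance calculation; a fixed-effect sketch (the study means and standard errors below are invented, not extracted from the included studies):

```python
import math

def pooled_mean(means, ses):
    """Fixed-effect (inverse-variance) pooled mean with a 95% CI."""
    w = [1.0 / se**2 for se in ses]                       # precision weights
    m = sum(wi * mi for wi, mi in zip(w, means)) / sum(w)
    se = math.sqrt(1.0 / sum(w))
    return m, (m - 1.96 * se, m + 1.96 * se)

# Hypothetical marginal-fit means (micrometres) and standard errors per study
m, ci = pooled_mean([58.9, 63.3, 56.1], [9.1, 6.5, 5.0])
print(round(m, 1), tuple(round(x, 1) for x in ci))
```

A random-effects model (adding a between-study variance term to each weight) would be the more defensible choice when study heterogeneity is non-trivial.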

  7. "Doing School": A New Unit of Analysis for Schools Serving Marginalized Students

    ERIC Educational Resources Information Center

    Atkinson, Helen

    2009-01-01

    This study asserts a new unit of analysis for school reform that goes beyond the mental representations of individuals, beyond the isolated lesson, and beyond the confines of a school building. I argue that the special case of expanding time and space as a method of engagement for marginalized students requires that the unit of analysis change to…

  8. Evaluation of the marginal fit of metal copings fabricated on three different marginal designs using conventional and accelerated casting techniques: an in vitro study.

    PubMed

    Vaidya, Sharad; Parkash, Hari; Bhargava, Akshay; Gupta, Sharad

    2014-01-01

    Abundant resources and techniques have been used for complete coverage crown fabrication. Conventional investing and casting procedures for phosphate-bonded investments require a 2- to 4-h procedure before completion. Accelerated casting techniques have been used, but may not result in castings with matching marginal accuracy. The study measured the marginal gap and determined the clinical acceptability of single cast copings invested in a phosphate-bonded investment with the use of conventional and accelerated methods. One hundred and twenty cast coping samples were fabricated using conventional and accelerated methods, with three finish lines: chamfer, shoulder, and shoulder with bevel. Sixty copings were prepared with each technique. Each coping was examined with a stereomicroscope at four predetermined sites and measurements of marginal gaps were documented for each. A master chart was prepared for all the data and analyzed using the Statistical Package for the Social Sciences. Evidence of marginal gap was then evaluated by t-test. Analysis of variance and post-hoc analysis were used to compare the two groups as well as to make comparisons between the three subgroups. Measurements recorded showed no statistically significant difference between conventional and accelerated groups. Among the three marginal designs studied, shoulder with bevel showed the best marginal fit with conventional as well as accelerated casting techniques. Accelerated casting technique could be a vital alternative to the time-consuming conventional casting technique. The marginal fit between the two casting techniques showed no statistical difference.

  9. Large Area Crop Inventory Experiment (LACIE). Phase 1: Evaluation report

    NASA Technical Reports Server (NTRS)

    1976-01-01

    It appears that the Large Area Crop Inventory Experiment over the Great Plains can, with a reasonable expectation, be a satisfactory component of a 90/90 production estimator. The area estimator produced more accurate area estimates for the total winter wheat region than for the mixed spring and winter wheat region of the northern Great Plains. The accuracy does appear to degrade somewhat in regions of marginal agriculture where there are small fields and abundant confusion crops. However, it would appear that these regions tend also to be marginal with respect to wheat production, and thus increased area estimation errors do not greatly influence the overall production estimation accuracy in the United States. The loss of segments resulting from cloud cover appears to be a random phenomenon that introduces no significant bias into the estimates. This loss does increase the variance of the estimates.

  10. Microwave and physical properties of sea ice in the winter marginal ice zone

    NASA Technical Reports Server (NTRS)

    Tucker, W. B., III; Perovich, D. K.; Gow, A. J.; Grenfell, T. C.; Onstott, R. G.

    1991-01-01

    Surface-based active and passive microwave measurements were made in conjunction with ice property measurements for several distinct ice types in the Fram Strait during March and April 1987. Synthetic aperture radar imagery downlinked from an aircraft was used to select study sites. The surface-based radar scattering cross section and emissivity spectra generally support previously inferred qualitative relationships between ice types, exhibiting expected separation between young, first-year and multiyear ice. Gradient ratios, calculated for both active and passive data, appear to allow clear separation of ice types when used jointly. Surface flooding of multiyear floes, resulting from excessive loading and perhaps wave action, causes both active and passive signatures to resemble those of first-year ice. This effect could possibly cause estimates of ice type percentages in the marginal ice zone to be in error when derived from aircraft- or satellite-borne sensors.
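The gradient ratios mentioned above are a standard spectral index of brightness temperatures at two frequencies; a minimal sketch with invented values (in passive sea-ice work a markedly negative ratio is characteristic of multiyear ice, while first-year ice sits near zero):

```python
def gradient_ratio(tb_hi, tb_lo):
    """Spectral gradient ratio of two brightness temperatures (kelvin):
    GR = (TB_hi - TB_lo) / (TB_hi + TB_lo)."""
    return (tb_hi - tb_lo) / (tb_hi + tb_lo)

# Hypothetical 37 GHz and 19 GHz V-pol brightness temperatures
print(round(gradient_ratio(231.0, 252.0), 4))  # -> -0.0435
```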

  11. Ensuring the validity of calculated subcritical limits

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clark, H.K.

    1977-01-01

    The care taken at the Savannah River Laboratory and Plant to ensure the validity of calculated subcritical limits is described. Close attention is given to ANSI N16.1-1975, "Validation of Calculational Methods for Nuclear Criticality Safety." The computer codes used for criticality safety computations, which are listed and are briefly described, have been placed in the SRL JOSHUA system to facilitate calculation and to reduce input errors. A driver module, KOKO, simplifies and standardizes input and links the codes together in various ways. For any criticality safety evaluation, correlations of the calculational methods are made with experiment to establish bias. Occasionally subcritical experiments are performed expressly to provide benchmarks. Calculated subcritical limits contain an adequate but not excessive margin to allow for uncertainty in the bias. The final step in any criticality safety evaluation is the writing of a report describing the calculations and justifying the margin.

  12. Profitability analysis in the hospital industry.

    PubMed Central

    Cleverley, W O

    1978-01-01

    Measures of marginal profit are derived for the two payment classes--cost payers and charge payers--that the hospital industry must consider in profitability analysis, i.e., prediction of the excess of revenue over expenses. Two indexes of profitability, for use when payment mix is constant and when it is nonconstant, respectively, are derived from the two marginal profit measures, and one of them is shown to be a modification of the contribution margin, the conventional measure of profitability used in general industry. All three measures--the contribution margin and the two new indexes of profitability--are used to estimate changes in net income resulting from changes in patient volume with and without accompanying changes in payment mix. The conventional measure yields large overestimates of expected excess revenue. PMID:632101
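The payment-mix logic above can be made concrete: cost payers reimburse incurred cost, so their marginal profit is roughly zero, while charge payers contribute charge minus variable cost per case. A sketch with invented figures (not the article's data):

```python
def expected_income_change(delta_volume, payer_mix, marginal_profit):
    """Change in net income from a volume change, weighting each payment
    class's marginal profit by its share of the added volume."""
    return delta_volume * sum(payer_mix[k] * marginal_profit[k] for k in payer_mix)

# Hypothetical hospital: 60% cost payers (break even at the margin),
# 40% charge payers contributing $250 per additional case
mix = {"cost": 0.6, "charge": 0.4}
mp = {"cost": 0.0, "charge": 250.0}
print(expected_income_change(100, mix, mp))  # -> 10000.0
```

Applying a single all-payer contribution margin to the same 100-case increase would overstate the gain, which is the overestimation the article attributes to the conventional measure.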

  13. TU-AB-303-06: Does Online Adaptive Radiation Therapy Mean Zero Margin for Intermediate-Risk Prostate Cancer? An Intra-Fractional Seminal Vesicles Motion Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sheng, Y; Li, T; Lee, W

    Purpose: To provide a benchmark for seminal vesicle (SV) margin selection to account for intra-fractional motion, and to investigate the effectiveness of two motion surrogates in predicting intra-fractional SV underdosage. Methods: 9 prostate SBRT patients were studied; each had five pairs of pre-treatment and post-treatment cone-beam CTs (CBCTs). Each pair of CBCTs was registered based on fiducial markers in the prostate. To provide “ground truth” for coverage evaluation, all pre-treatment SVs were expanded with isotropic margins of 1, 2, 3, 5 and 8 mm, and their overlap with post-treatment SVs was used to quantify intra-fractional coverage. Two commonly used motion surrogates, the center-of-mass (COM) and the border of the contour (the most distal points in the SI/AP/LR directions), were evaluated using Receiver-Operating Characteristic (ROC) analyses for predicting SV underdosage due to intra-fractional motion. The action threshold for determining underdosage for each surrogate was calculated by selecting the optimal balance between sensitivity and specificity. For comparison, the margin for each surrogate was also calculated based on a traditional margin recipe. Results: 90% post-treatment SV coverage was achieved in 47%, 82%, 91%, 98% and 98% of fractions for 1, 2, 3, 5 and 8 mm margins, respectively. A 3 mm margin ensured 90% intra-fractional SV coverage in 90% of fractions when the prostate was aligned. The ROC analysis indicated that the AUCs for COM and border were 0.88 and 0.72. The underdosage threshold was 2.9 mm for COM and 4.1 mm for border. Van Herk’s margin recipe recommended 0.5, 0 and 1.8 mm margins in the LR, AP and SI directions based on COM; for border, the corresponding margins were 2.1, 4.5 and 3 mm. Conclusion: A 3 mm isotropic margin is the minimum required to mitigate intra-fractional SV motion when the prostate is aligned. ROC analysis reveals that both COM and border are acceptable predictors of SV underdosage, with 2.9 mm and 4.1 mm action thresholds. 
Traditional margin calculation is less reliable for this application. This work is partially supported by a master research grant from Varian Medical Systems.
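Choosing an action threshold as "the optimal balance between sensitivity and specificity" on a ROC curve is usually done by maximising Youden's J; a sketch with invented surrogate displacements and underdosage outcomes (not the study's data):

```python
def roc_points(scores, labels):
    """(threshold, TPR, FPR) at every candidate threshold; score >= t predicts positive."""
    thresholds = sorted(set(scores), reverse=True)
    P = sum(labels)
    N = len(labels) - P
    pts = []
    for t in thresholds:
        tp = sum(1 for s, y in zip(scores, labels) if s >= t and y)
        fp = sum(1 for s, y in zip(scores, labels) if s >= t and not y)
        pts.append((t, tp / P, fp / N))
    return pts

def youden_threshold(scores, labels):
    """Threshold maximising sensitivity + specificity - 1 (Youden's J)."""
    return max(roc_points(scores, labels), key=lambda p: p[1] - p[2])[0]

# Hypothetical COM displacements (mm) and whether SV coverage was lost (1 = yes)
disp = [1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0, 4.5]
under = [0,   0,   0,   0,   1,   0,   1,   1]
print(youden_threshold(disp, under))  # -> 3.0
```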

  14. Mexican Americans in Higher Education: Cultural Adaptation and Marginalization as Predictors of College Persistence Intentions and Life Satisfaction

    ERIC Educational Resources Information Center

    Ojeda, Lizette; Castillo, Linda G.; Rosales Meza, Rocío; Piña-Watson, Brandy

    2014-01-01

    This study examined how college persistence intentions and life satisfaction were influenced by acculturation, enculturation, White marginalization, and Mexican American marginalization among 515 Mexican American college students. The utility of a path analysis model was supported. Enculturation positively predicted persistence and life satisfaction.…

  15. MRI Evaluation of Resection Margins in Bone Tumour Surgery

    PubMed Central

    Traore, Sidi Yaya; Lecouvet, Frédéric; Galant, Christine

    2014-01-01

    In 12 patients operated on for bone sarcoma resection, a postoperative magnetic resonance imaging of the resection specimens was obtained in order to assess the surgical margins. Margins were classified according to MRI as R0, R1, and R2 by three independent observers: a radiologist and two orthopaedic surgeons. Final margin evaluation (R0, R1, and R2) was assessed by a confirmed pathologist. Agreement for margin evaluation between the pathologist and the radiologist was perfect (κ = 1). Agreement between the pathologist and an experienced orthopaedic surgeon was very good, while it was fair between the pathologist and a junior orthopaedic surgeon. MRI should be considered as a tool to give quick information about the adequacy of margins, to help the pathologist focus on doubtful areas, and to save time in specimen analysis. But it may not replace the pathological evaluation, which gives additional information about tumor necrosis. This study shows that MRI extemporaneous analysis of a resection specimen may be efficient in bone tumor oncologic surgery, if made by an experienced radiologist with perfect agreement with the pathologist. PMID:24976785
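The inter-observer agreement figures above (κ = 1 for perfect agreement) are Cohen's kappa; a minimal sketch with invented margin calls, where identical ratings reproduce the κ = 1 case:

```python
def cohens_kappa(r1, r2):
    """Cohen's kappa for two raters' categorical labels."""
    assert len(r1) == len(r2)
    n = len(r1)
    cats = set(r1) | set(r2)
    po = sum(1 for a, b in zip(r1, r2) if a == b) / n          # observed agreement
    pe = sum((r1.count(c) / n) * (r2.count(c) / n) for c in cats)  # chance agreement
    return (po - pe) / (1 - pe)

# Hypothetical R0/R1/R2 margin calls by pathologist vs. radiologist
path  = ["R0", "R0", "R1", "R0", "R2", "R1", "R0", "R0", "R1", "R0", "R0", "R2"]
radio = ["R0", "R0", "R1", "R0", "R2", "R1", "R0", "R0", "R1", "R0", "R0", "R2"]
print(cohens_kappa(path, radio))  # identical ratings -> 1.0
```

Kappa discounts the agreement expected by chance, which is why it is preferred over raw percent agreement for ordinal margin grades.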

  16. WE-AB-209-08: Novel Beam-Specific Adaptive Margins for Reducing Organ-At-Risk Doses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tsang, H; Kamerling, CP; Ziegenhein, P

    2016-06-15

    Purpose: Current practice of using 3D margins in radiotherapy with high-energy photon beams provides larger-than-required target coverage. According to the photon depth-dose curve, target displacements in beam direction result in minute changes in dose delivered. We exploit this behavior by generating margins on a per-beam basis which simultaneously account for the relative distance of the target and adjacent organs-at-risk (OARs). Methods: For each beam, we consider only geometrical uncertainties of the target location perpendicular to beam direction. By weighting voxels based on their proximity to an OAR, we generate adaptive margins that yield similar overall target coverage probability and reduced OAR dose-burden, at the expense of increased target volume. Three IMRT plans, using 3D margins and 2D per-beam margins with and without adaptation, were generated for five prostate patients with a prescription dose Dpres of 78Gy in 2Gy fractions using identical optimisation constraints. Systematic uncertainties of 1.1, 1.1, 1.5mm in the LR, SI, and AP directions, respectively, and 0.9, 1.1, 1.0mm for the random uncertainties, were assumed. A verification tool was employed to simulate the effects of systematic and random errors using a population size of 50,000. The fraction of the population that satisfies or violates a given DVH constraint was used for comparison. Results: We observe similar target coverage across all plans, with at least 97.5% of the population meeting the D98%>95%Dpres constraint. When looking at the probability of the population receiving D5<70Gy for the rectum, we observed median absolute increases of 23.61% (range, 2.15%–27.85%) and 6.97% (range, 0.65%–17.76%) using per-beam margins with and without adaptation, respectively, relative to using 3D margins. Conclusion: We observed sufficient and similar target coverage using per-beam margins. 
By adapting each per-beam margin away from an OAR, we can further reduce the OAR dose without significantly lowering the target coverage probability, at the cost of irradiating more of the less critical surrounding tissue. This work is supported by Cancer Research UK under Programme C33589/A19908. Research at ICR is also supported by Cancer Research UK under Programme C33589/A19727 and NHS funding to the NIHR Biomedical Research Centre at RMH and ICR.
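The population-based verification described in this record can be sketched with a small Monte Carlo simulation. This is an illustrative reconstruction, not the authors' tool: the per-axis sigmas come from the abstract, while the 3 mm test margin, the 39-fraction course (78 Gy in 2 Gy fractions), and the coverage criterion (mean displacement magnitude within the margin) are assumptions of this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Geometric-error simulation in the spirit of the verification tool above.
# Sigmas (mm, LR/SI/AP) are taken from the abstract; the 3 mm test margin
# is an assumed illustration, not a value from the study.
SIGMA_SYS = np.array([1.1, 1.1, 1.5])   # systematic error (once per patient)
SIGMA_RND = np.array([0.9, 1.1, 1.0])   # random error (per fraction)
N_POP, N_FRACTIONS, MARGIN_MM = 50_000, 39, 3.0

sys_err = rng.normal(0.0, SIGMA_SYS, size=(N_POP, 3))
rnd_err = rng.normal(0.0, SIGMA_RND, size=(N_POP, N_FRACTIONS, 3))

# Mean geometric target displacement over the whole treatment course.
mean_shift = sys_err + rnd_err.mean(axis=1)
covered = np.linalg.norm(mean_shift, axis=1) <= MARGIN_MM
print(f"fraction of population within {MARGIN_MM} mm: {covered.mean():.3f}")
```

Averaging the random error over many fractions shrinks its contribution, so coverage is dominated by the systematic component, which is why population-based margin recipes weight systematic errors more heavily.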

  17. OCT structure, COB location and magmatic type of the S Angolan & SE Brazilian margins from integrated quantitative analysis of deep seismic reflection and gravity anomaly data

    NASA Astrophysics Data System (ADS)

    Cowie, Leanne; Kusznir, Nick; Horn, Brian

    2014-05-01

Integrated quantitative analysis using deep seismic reflection data and gravity inversion has been applied to the S Angolan and SE Brazilian margins to determine OCT structure, COB location and magmatic type. Knowledge of these margin parameters is of critical importance for understanding rifted continental margin formation processes and in evaluating petroleum systems in deep-water frontier oil and gas exploration. The OCT structure, COB location and magmatic type of the S Angolan and SE Brazilian rifted continental margins are much debated; exhumed and serpentinised mantle have been reported at these margins. Gravity anomaly inversion, incorporating a lithosphere thermal gravity anomaly correction, has been used to determine Moho depth, crustal basement thickness and continental lithosphere thinning. Residual Depth Anomaly (RDA) analysis has been used to investigate OCT bathymetric anomalies with respect to expected oceanic bathymetries, and subsidence analysis has been used to determine the distribution of continental lithosphere thinning. These techniques have been validated for profiles Lusigal 12 and ISE-01 on the Iberian margin. In addition, a joint inversion technique using deep seismic reflection and gravity anomaly data has been applied to the ION-GXT BS1-575 SE Brazil and ION-GXT CS1-2400 S Angola deep seismic reflection lines. The joint inversion method solves for coincident seismic and gravity Moho in the time domain and calculates the lateral variations in crustal basement densities and velocities along the seismic profiles. Gravity inversion, RDA and subsidence analysis along the ION-GXT BS1-575 profile, which crosses the Sao Paulo Plateau and Florianopolis Ridge of the SE Brazilian margin, predict the COB to be located SE of the Florianopolis Ridge. Integrated quantitative analysis shows no evidence for exhumed mantle on this margin profile.
The joint inversion technique predicts oceanic crustal thicknesses of between 7 and 8 km with normal oceanic basement seismic velocities and densities. Beneath the Sao Paulo Plateau and Florianopolis Ridge, joint inversion predicts crustal basement thicknesses between 10 and 15 km, with high values of basement density and seismic velocity under the Sao Paulo Plateau which are interpreted as indicating a significant magmatic component within the crustal basement. The Sao Paulo Plateau and Florianopolis Ridge are separated by a thin region of crustal basement beneath the salt, interpreted as a regional transtensional structure. Sediment-corrected RDAs and gravity-derived "synthetic" RDAs are of a similar magnitude on oceanic crust, implying negligible mantle dynamic topography. Gravity inversion, RDA and subsidence analysis along the S Angolan ION-GXT CS1-2400 profile suggest that exhumed mantle, corresponding to a magma-poor margin, is absent. The thickness of the earliest oceanic crust, derived from gravity and deep seismic reflection data, is approximately 7 km, consistent with the global average oceanic crustal thickness. The joint inversion predicts a small difference between oceanic and continental crustal basement density and seismic velocity, with the change in basement density and velocity corresponding to the COB independently determined from RDA and subsidence analysis. The difference between the sediment-corrected RDA and that predicted from gravity inversion crustal thickness variation implies that this margin is experiencing approximately 500 m of anomalous uplift, attributed to mantle dynamic uplift.

  18. Study of a safety margin system for powered-lift STOL aircraft

    NASA Technical Reports Server (NTRS)

    Heffley, R. K.; Jewell, W. F.

    1978-01-01

    A study was conducted to explore the feasibility of a safety margin system for powered-lift aircraft which require a backside piloting technique. The objective of the safety margin system was to present multiple safety margin criteria as a single variable which could be tracked manually or automatically and which could be monitored for the purpose of deriving safety margin status. The study involved a pilot-in-the-loop analysis of several safety margin system concepts and a simulation experiment to evaluate those concepts which showed promise of providing a good solution. A system was ultimately configured which offered reasonable compromises in controllability, status information content, and the ability to regulate the safety margin at some expense of the allowable low speed flight path envelope.

  19. Incorporating rainfall uncertainty in a SWAT model: the river Zenne basin (Belgium) case study

    NASA Astrophysics Data System (ADS)

    Tolessa Leta, Olkeba; Nossent, Jiri; van Griensven, Ann; Bauwens, Willy

    2013-04-01

The European Union Water Framework Directive (EU-WFD) called on its member countries to achieve a good ecological status for all inland and coastal water bodies by 2015. According to recent studies, the river Zenne (Belgium) is far from this objective. Therefore, an interuniversity and multidisciplinary project, "Towards a Good Ecological Status in the river Zenne (GESZ)", was launched to evaluate the effects of wastewater management plans on the river. In this project, different models have been developed and integrated using the Open Modelling Interface (OpenMI). The hydrologic, semi-distributed Soil and Water Assessment Tool (SWAT) is hereby used as one of the model components in the integrated modelling chain in order to model the upland catchment processes. The assessment of the uncertainty of SWAT is an essential aspect of the decision-making process, in order to design robust management strategies that take the predicted uncertainties into account. Model uncertainty stems from the uncertainties on the model parameters, the input data (e.g., rainfall), the calibration data (e.g., stream flows) and on the model structure itself. The objective of this paper is to assess the first three sources of uncertainty in a SWAT model of the river Zenne basin. For the assessment of rainfall measurement uncertainty, first, we identified independent rainfall periods, based on the daily precipitation and stream flow observations and using the Water Engineering Time Series PROcessing tool (WETSPRO). Secondly, we assigned a rainfall multiplier parameter for each of the independent rainfall periods, which serves as a multiplicative input error corruption. Finally, we treated these multipliers as latent parameters in the model optimization and uncertainty analysis (UA). For parameter uncertainty assessment, due to the high number of parameters of the SWAT model, we first screened out its most sensitive parameters using the Latin Hypercube One-factor-At-a-Time (LH-OAT) technique.
Subsequently, we only considered the most sensitive parameters for parameter optimization and UA. To explicitly account for the stream flow uncertainty, we assumed that the stream flow measurement error increases linearly with the stream flow value. To assess the uncertainty and infer posterior distributions of the parameters, we used a Markov Chain Monte Carlo (MCMC) sampler, DiffeRential Evolution Adaptive Metropolis (DREAM), which uses sampling from an archive of past states to generate candidate points in each individual chain. It is shown that the marginal posterior distributions of the rainfall multipliers vary widely between individual events, as a consequence of rainfall measurement errors and the spatial variability of the rain. Only a few of the rainfall events are well defined. The marginal posterior distributions of the SWAT model parameter values are well defined and identified by DREAM, within their prior ranges. The posterior distributions of the output uncertainty parameter values also show that the stream flow data are highly uncertain. The approach of using rainfall multipliers to treat rainfall uncertainty for a complex model has an impact on the model parameter marginal posterior distributions and on the model results.
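The input-error treatment described in this record (one latent multiplier per independent rainfall event, plus a measurement error that grows linearly with flow) can be sketched as follows. This is a minimal toy illustration with synthetic numbers, not the GESZ SWAT/DREAM implementation; the function names, the 10% relative error, and the stand-in "model" are assumptions of this sketch.

```python
import numpy as np

def corrupt_rainfall(rain, event_ids, multipliers):
    """Apply one latent multiplier per independent rainfall event."""
    return rain * np.asarray(multipliers)[event_ids]

def log_likelihood(q_obs, q_sim, rel_err=0.1):
    """Gaussian log-likelihood whose std grows linearly with observed flow."""
    sigma = np.maximum(rel_err * q_obs, 1e-6)
    return np.sum(-0.5 * ((q_obs - q_sim) / sigma) ** 2
                  - np.log(sigma * np.sqrt(2.0 * np.pi)))

# Toy demonstration (synthetic data, not the Zenne catchment).
rain = np.array([2.0, 5.0, 0.0, 8.0, 1.0])   # daily rainfall, mm
event_ids = np.array([0, 0, 1, 1, 1])        # two independent events
rain_adj = corrupt_rainfall(rain, event_ids, [1.2, 0.8])
q_obs = np.array([1.0, 2.5, 0.5, 4.0, 1.2])  # observed flows
q_sim = 0.5 * rain_adj                       # stand-in for the SWAT run
print(log_likelihood(q_obs, q_sim))
```

In an MCMC scheme such as DREAM, the multiplier vector would simply be appended to the parameter vector being sampled, so each proposal re-corrupts the rainfall before the model run.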

  20. A comparative analysis of primary and secondary Gleason pattern predictive ability for positive surgical margins after radical prostatectomy.

    PubMed

    Sfoungaristos, S; Kavouras, A; Kanatas, P; Polimeros, N; Perimenis, P

    2011-01-01

To compare the predictive ability of the primary and secondary Gleason patterns for positive surgical margins in patients with clinically localized prostate cancer and a preoperative Gleason score ≤ 6. A retrospective analysis of the medical records of patients who underwent a radical prostatectomy between January 2005 and October 2010 was conducted. Patients' age, prostate volume, preoperative PSA, biopsy Gleason score, and the 1st and 2nd Gleason patterns were entered into univariate and multivariate analyses. The 1st and 2nd patterns were tested for their ability to predict positive surgical margins using receiver operating characteristic (ROC) curves. Positive surgical margins were noticed in 56 cases (38.1%) of the 147 studied patients. The 2nd pattern was significantly greater in those with positive surgical margins, while the 1st pattern was not significantly different between the 2 groups of patients. ROC analysis revealed that the area under the curve was 0.53 (p=0.538) for the 1st pattern and 0.60 (p=0.048) for the 2nd pattern. Concerning the cases with PSA <10 ng/ml, it was also found that only the 2nd pattern had predictive ability (p=0.050). When multiple logistic regression analysis was conducted, the 2nd pattern was the only independent predictor. The second Gleason pattern was found to be of higher value than the 1st for the prediction of positive surgical margins in patients with a preoperative Gleason score ≤ 6, and this should be considered especially when a neurovascular bundle-sparing radical prostatectomy is planned, in order not to harm the oncological outcome.
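The ROC comparison in this record reduces to estimating the area under the curve, which for a discrete predictor equals the Mann-Whitney probability that a margin-positive case outranks a margin-negative case, with ties counted as one half. A sketch with hypothetical pattern values, not the study's data:

```python
import numpy as np

def auc_mann_whitney(scores_pos, scores_neg):
    """AUC as the probability that a positive case outranks a negative one,
    counting ties as 1/2 (equivalent to the Mann-Whitney U statistic)."""
    pos = np.asarray(scores_pos, float)[:, None]
    neg = np.asarray(scores_neg, float)[None, :]
    return (pos > neg).mean() + 0.5 * (pos == neg).mean()

# Hypothetical secondary Gleason patterns (3 treated as higher risk than 2).
pattern2_pos_margin = [3, 3, 3, 2, 3, 3]   # positive surgical margins
pattern2_neg_margin = [2, 3, 2, 2, 3, 2]   # negative surgical margins
print(auc_mann_whitney(pattern2_pos_margin, pattern2_neg_margin))  # → 0.75
```

With only two distinct pattern values the ROC curve has a single interior point, which is why a modest AUC such as the study's 0.60 can still reach significance in a larger sample.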

  1. GIS-based multicriteria overlay analysis in soil-suitability evaluation for cotton (Gossypium spp.): A case study in the black soil region of Central India

    NASA Astrophysics Data System (ADS)

    Walke, N.; Obi Reddy, G. P.; Maji, A. K.; Thayalan, S.

    2012-04-01

In this study an attempt was made to characterize the soils of the Ringanbodi watershed, Nagpur district, Maharashtra, Central India, for soil-suitability evaluation for cotton using geographic information system (GIS)-based multicriteria overlay analysis techniques. The study identified 8 soil series and 16 soil series associations in the study area, and the soils were classified into three orders, i.e., Entisol, Inceptisol, and Vertisol. The analysis reveals that the soil associations E-F, F-G, G-H, and H-G are "moderately suitable" (S2), D-E are "marginally to moderately suitable," and C-D are "marginally suitable" (S3). However, soils B-C are "not suitable" to "marginally suitable" (N2-S3) and A-B are "unsuitable" (N2) for cultivation of cotton. The area analysis shows that for a cotton crop an area of about 966.7 ha (49.1%) of the total geographical area (TGA) is moderately suitable and classified as S2. An area of about 469.9 ha (23.8%) of the TGA is marginally to moderately suitable (S3-S2). The marginally suitable soils for cotton are classified as S3 and cover an area of about 35.2 ha (1.8%) of the TGA. However, a 172.3 ha (8.7%) area is not suitable (N2) to marginally suitable (S3) and a 326.9 ha (16.6%) area is not suitable (N2) for cotton because of uncorrectable factors like soil depth, slope, etc. The study demonstrated that GIS-based multicriteria overlay analysis of soil thematic parameters can be of immense help in soil-suitability evaluation for cotton.

  2. Cervical and Incisal Marginal Discrepancy in Ceramic Laminate Veneering Materials: A SEM Analysis

    PubMed Central

    Ranganathan, Hemalatha; Ganapathy, Dhanraj M.; Jain, Ashish R.

    2017-01-01

Context: Marginal discrepancy influenced by the choice of processing material used for the ceramic laminate veneers needs to be explored further for better clinical application. Aims: This study aimed to evaluate the amount of cervical and incisal marginal discrepancy associated with different ceramic laminate veneering materials. Settings and Design: This was an experimental, single-blinded, in vitro trial. Subjects and Methods: Ten central incisors were prepared for laminate veneers with 2 mm uniform reduction and a heavy chamfer finish line. Ceramic laminate veneers fabricated over the prepared teeth using four different processing materials were categorized into four groups: Group I - aluminous porcelain veneers, Group II - lithium disilicate ceramic veneers, Group III - lithium disilicate-leucite-based veneers, Group IV - zirconia-based ceramic veneers. The cervical and incisal marginal discrepancy was measured using a scanning electron microscope. Statistical Analysis Used: ANOVA and post hoc Tukey honest significant difference (HSD) tests were used for statistical analysis. Results: The cervical and incisal marginal discrepancies for the four groups were Group I - 114.6 ± 4.3 μm, 132.5 ± 6.5 μm, Group II - 86.1 ± 6.3 μm, 105.4 ± 5.3 μm, Group III - 71.4 ± 4.4 μm, 91.3 ± 4.7 μm, and Group IV - 123.1 ± 4.1 μm, 142.0 ± 5.4 μm. ANOVA and post hoc Tukey HSD tests showed a statistically significant difference between the four test groups with regard to cervical marginal discrepancy. The cervical and incisal marginal discrepancy scored F = 243.408, P < 0.001 and F = 180.844, P < 0.001, respectively. Conclusion: This study concluded that veneers fabricated using leucite-reinforced lithium disilicate exhibited the least marginal discrepancy, followed by lithium disilicate ceramic, aluminous porcelain, and zirconia-based ceramics. The marginal discrepancy was greater in the incisal region than in the cervical region in all the groups. PMID:28839415
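The group comparison in this record can be illustrated with a one-way ANOVA on simulated data drawn from the cervical means and SDs reported above. The group size of 10 matches the ten prepared incisors, but the samples themselves are simulated here, so the exact F value will differ from the paper's reported F = 243.408.

```python
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(1)

# Cervical marginal discrepancy (um): means and SDs from the abstract;
# drawing n=10 normal samples per group is an assumption of this sketch.
groups = {"aluminous porcelain": (114.6, 4.3),
          "lithium disilicate": (86.1, 6.3),
          "disilicate-leucite": (71.4, 4.4),
          "zirconia-based": (123.1, 4.1)}
samples = [rng.normal(mu, sd, size=10) for mu, sd in groups.values()]

f_stat, p_value = f_oneway(*samples)  # one-way ANOVA across the 4 groups
print(f"F = {f_stat:.1f}, p = {p_value:.2e}")
```

With group means separated by many within-group standard deviations, the ANOVA rejects decisively; a post hoc Tukey HSD test would then localize which pairs differ.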

  3. Sci-Fri PM: Radiation Therapy, Planning, Imaging, and Special Techniques - 04: Assessment of intra-fraction motion during lung SABR VMAT using a custom abdominal compression device

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hyde, Derek; Robinson, Mark; Araujo, Cynthia

    2016-08-15

Purpose: Lung SABR patients are treated using Volumetrically Modulated Arc Therapy (VMAT), utilizing 2 arcs with Cone-beam CT (CBCT) image guidance prior to each arc. Intra-fraction imaging can prolong treatment time (by up to 20%), and the aim of this study is to determine if it is necessary. Methods: We utilize an in-house abdominal compression device to minimize respiratory motion, 4DCT to define the ITV, a 5 mm PTV margin and a 2–3 mm PRV margin. We treated 23 patients with VMAT; fifteen were treated to 48 Gy in 4 fractions, while eight were treated with up to 60 Gy in 8 fractions. Intra-fraction motion was assessed by the translational errors recorded for the second CBCT. Results: There was no significant difference (t-test, p=0.93) in the intra-fraction motion between the patients treated with 4 and 8 fractions, or between the absolute translations in each direction (ANOVA, p=0.17). All 124 intra-fraction CBCT images were analysed and 95% remained localized within the 5 mm PTV margin. The mean magnitude of the vector displacement was 1.8 mm. Conclusions: For patients localized with an abdominal compression device, the intra-fraction CBCT image may not be necessary if it is only the tumor coverage that is of concern, as the patients are typically well within the 5 mm PTV margin. On the other hand, if there is a structure with a smaller PRV margin, an intra-fraction CBCT is recommended to ensure that the dose limit for the organ at risk is not exceeded.

  4. Distribution and depth of bottom-simulating reflectors in the Nankai subduction margin.

    PubMed

    Ohde, Akihiro; Otsuka, Hironori; Kioka, Arata; Ashi, Juichiro

    2018-01-01

    Surface heat flow has been observed to be highly variable in the Nankai subduction margin. This study presents an investigation of local anomalies in surface heat flows on the undulating seafloor in the Nankai subduction margin. We estimate the heat flows from bottom-simulating reflectors (BSRs) marking the lower boundaries of the methane hydrate stability zone and evaluate topographic effects on heat flow via two-dimensional thermal modeling. BSRs have been used to estimate heat flows based on the known stability characteristics of methane hydrates under low-temperature and high-pressure conditions. First, we generate an extensive map of the distribution and subseafloor depths of the BSRs in the Nankai subduction margin. We confirm that BSRs exist at the toe of the accretionary prism and the trough floor of the offshore Tokai region, where BSRs had previously been thought to be absent. Second, we calculate the BSR-derived heat flow and evaluate the associated errors. We conclude that the total uncertainty of the BSR-derived heat flow should be within 25%, considering allowable ranges in the P-wave velocity, which influences the time-to-depth conversion of the BSR position in seismic images, the resultant geothermal gradient, and thermal resistance. Finally, we model a two-dimensional thermal structure by comparing the temperatures at the observed BSR depths with the calculated temperatures at the same depths. The thermal modeling reveals that most local variations in BSR depth over the undulating seafloor can be explained by topographic effects. Those areas that cannot be explained by topographic effects can be mainly attributed to advective fluid flow, regional rapid sedimentation, or erosion. Our spatial distribution of heat flow data provides indispensable basic data for numerical studies of subduction zone modeling to evaluate margin parallel age dependencies of subducting plates.
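Once a BSR depth and the hydrate-stability temperature at that depth are fixed, the heat-flow estimate described in this record is a conductive-gradient calculation. A minimal sketch; the seafloor temperature, BSR temperature, depth, and thermal conductivity below are hypothetical values, not numbers from the study:

```python
def bsr_heat_flow(t_seafloor_c, t_bsr_c, z_bsr_m, k_w_per_m_k):
    """Conductive heat flow (mW/m^2) from the thermal gradient between the
    seafloor and the bottom-simulating reflector (BSR) depth."""
    gradient = (t_bsr_c - t_seafloor_c) / z_bsr_m   # geothermal gradient, degC/m
    return k_w_per_m_k * gradient * 1000.0          # W/m^2 -> mW/m^2

# Hypothetical inputs: 2 degC seafloor, 18 degC hydrate-stability temperature
# at a BSR 300 m below seafloor, bulk thermal conductivity 1.0 W/(m K).
q = bsr_heat_flow(2.0, 18.0, 300.0, 1.0)
print(f"{q:.1f} mW/m^2 (+/-25% band: {0.75*q:.1f} to {1.25*q:.1f})")
```

The 25% band printed above mirrors the total uncertainty the authors quote: an error in the P-wave velocity shifts the time-to-depth conversion of the BSR, which propagates into both the gradient and the heat flow.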

  5. The performance of different propensity score methods for estimating marginal hazard ratios.

    PubMed

    Austin, Peter C

    2013-07-20

    Propensity score methods are increasingly being used to reduce or minimize the effects of confounding when estimating the effects of treatments, exposures, or interventions when using observational or non-randomized data. Under the assumption of no unmeasured confounders, previous research has shown that propensity score methods allow for unbiased estimation of linear treatment effects (e.g., differences in means or proportions). However, in biomedical research, time-to-event outcomes occur frequently. There is a paucity of research into the performance of different propensity score methods for estimating the effect of treatment on time-to-event outcomes. Furthermore, propensity score methods allow for the estimation of marginal or population-average treatment effects. We conducted an extensive series of Monte Carlo simulations to examine the performance of propensity score matching (1:1 greedy nearest-neighbor matching within propensity score calipers), stratification on the propensity score, inverse probability of treatment weighting (IPTW) using the propensity score, and covariate adjustment using the propensity score to estimate marginal hazard ratios. We found that both propensity score matching and IPTW using the propensity score allow for the estimation of marginal hazard ratios with minimal bias. Of these two approaches, IPTW using the propensity score resulted in estimates with lower mean squared error when estimating the effect of treatment in the treated. Stratification on the propensity score and covariate adjustment using the propensity score result in biased estimation of both marginal and conditional hazard ratios. Applied researchers are encouraged to use propensity score matching and IPTW using the propensity score when estimating the relative effect of treatment on time-to-event outcomes. Copyright © 2012 John Wiley & Sons, Ltd.
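The IPTW estimator favored in this record weights each subject by the inverse of the probability of the treatment actually received; a Cox model fit with these weights then targets a marginal hazard ratio. Below is a self-contained sketch of the stabilized-weight computation on simulated confounded data. The hand-rolled Newton-Raphson logistic regression is an illustrative choice for the propensity model, not the paper's software.

```python
import numpy as np

rng = np.random.default_rng(2)

def fit_logistic(X, y, n_iter=25):
    """Fit logistic regression by Newton-Raphson; return fitted probabilities."""
    Xd = np.column_stack([np.ones(len(X)), X])      # add intercept
    beta = np.zeros(Xd.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-Xd @ beta))
        W = p * (1.0 - p)                           # IRLS weights
        H = Xd.T @ (Xd * W[:, None]) + 1e-8 * np.eye(Xd.shape[1])
        beta += np.linalg.solve(H, Xd.T @ (y - p))  # Newton step
    return 1.0 / (1.0 + np.exp(-Xd @ beta))

# Simulated confounded treatment assignment (two baseline covariates).
X = rng.normal(size=(2000, 2))
logit = 0.5 * X[:, 0] - 0.25 * X[:, 1]
treated = (rng.random(2000) < 1.0 / (1.0 + np.exp(-logit))).astype(float)

e = fit_logistic(X, treated)        # estimated propensity scores
p_treat = treated.mean()
# Stabilized IPTW weights: marginal treatment probability over propensity.
w = np.where(treated == 1, p_treat / e, (1 - p_treat) / (1 - e))
print(w.mean())                     # stabilized weights average near 1
```

Passing `w` as case weights to a proportional-hazards fitter (e.g., a weighted Cox model) would then give the marginal hazard ratio the paper recommends estimating this way.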

  6. Univariate and bivariate likelihood-based meta-analysis methods performed comparably when marginal sensitivity and specificity were the targets of inference.

    PubMed

    Dahabreh, Issa J; Trikalinos, Thomas A; Lau, Joseph; Schmid, Christopher H

    2017-03-01

    To compare statistical methods for meta-analysis of sensitivity and specificity of medical tests (e.g., diagnostic or screening tests). We constructed a database of PubMed-indexed meta-analyses of test performance from which 2 × 2 tables for each included study could be extracted. We reanalyzed the data using univariate and bivariate random effects models fit with inverse variance and maximum likelihood methods. Analyses were performed using both normal and binomial likelihoods to describe within-study variability. The bivariate model using the binomial likelihood was also fit using a fully Bayesian approach. We use two worked examples-thoracic computerized tomography to detect aortic injury and rapid prescreening of Papanicolaou smears to detect cytological abnormalities-to highlight that different meta-analysis approaches can produce different results. We also present results from reanalysis of 308 meta-analyses of sensitivity and specificity. Models using the normal approximation produced sensitivity and specificity estimates closer to 50% and smaller standard errors compared to models using the binomial likelihood; absolute differences of 5% or greater were observed in 12% and 5% of meta-analyses for sensitivity and specificity, respectively. Results from univariate and bivariate random effects models were similar, regardless of estimation method. Maximum likelihood and Bayesian methods produced almost identical summary estimates under the bivariate model; however, Bayesian analyses indicated greater uncertainty around those estimates. Bivariate models produced imprecise estimates of the between-study correlation of sensitivity and specificity. Differences between methods were larger with increasing proportion of studies that were small or required a continuity correction. The binomial likelihood should be used to model within-study variability. 
Univariate and bivariate models give similar estimates of the marginal distributions for sensitivity and specificity. Bayesian methods fully quantify uncertainty and their ability to incorporate external evidence may be useful for imprecisely estimated parameters. Copyright © 2017 Elsevier Inc. All rights reserved.

  7. Plutonium Critical Mass Curve Comparison to Mass at Upper Subcritical Limit (USL) Using Whisper

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alwin, Jennifer Louise; Zhang, Ning

Whisper is computational software designed to assist the nuclear criticality safety analyst with validation studies with the MCNP® Monte Carlo radiation transport package. Standard approaches to validation rely on the selection of benchmarks based upon expert judgment. Whisper uses sensitivity/uncertainty (S/U) methods to select benchmarks relevant to a particular application or set of applications being analyzed. Using these benchmarks, Whisper computes a calculational margin. Whisper also attempts to quantify the margin of subcriticality (MOS) arising from errors in software and uncertainties in nuclear data. The combination of the Whisper-derived calculational margin and MOS comprises the baseline upper subcritical limit (USL), to which an additional margin may be applied by the nuclear criticality safety analyst as appropriate to ensure subcriticality. A series of critical mass curves for plutonium, similar to those found in Figure 31 of LA-10860-MS, have been generated using MCNP6.1.1 and the iterative parameter study software, WORM_Solver. The baseline USL for each of the data points of the curves was then computed using Whisper 1.1. The USL was then used to determine the equivalent mass for the plutonium metal-water system. ANSI/ANS-8.1 states that it is acceptable to use handbook data, such as the data directly from LA-10860-MS, as it is already considered validated (Section 4.3.4: “Use of subcritical limit data provided in ANSI/ANS standards or accepted reference publications does not require further validation.”). This paper attempts to take a novel approach to visualize traditional critical mass curves and allows comparison with the mass for which keff is equal to the USL (calculational margin + margin of subcriticality). However, the intent is to plot the critical mass data along with the USL, not to suggest that already accepted handbook data should have new and more rigorous requirements for validation.
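The USL arithmetic this record describes can be made concrete. This is a hedged sketch of one common form (the critical point lowered by the calculational margin and the margin of subcriticality); the numbers are illustrative only and are not Whisper output for any real system.

```python
def baseline_usl(calc_margin, mos, k_critical=1.0):
    """Baseline upper subcritical limit: the critical point lowered by the
    calculational margin (CM) and the margin of subcriticality (MOS)."""
    return k_critical - calc_margin - mos

def is_subcritical(keff, keff_sigma, usl, n_sigma=2.0):
    """A configuration passes if keff plus n_sigma Monte Carlo standard
    deviations stays at or below the USL."""
    return keff + n_sigma * keff_sigma <= usl

# Illustrative numbers only (assumed CM and MOS, not Whisper-derived).
usl = baseline_usl(calc_margin=0.01, mos=0.029)
print(usl, is_subcritical(0.940, 0.001, usl))
```

On a critical mass curve, the mass plotted against this USL is the mass whose computed keff equals `usl`, which is the comparison the paper visualizes.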

  8. Comparison of gating methods for the real-time analysis of left ventricular function in nonimaging blood pool studies.

    PubMed

    Beard, B B; Stewart, J R; Shiavi, R G; Lorenz, C H

    1995-01-01

Gating methods developed for electrocardiographic-triggered radionuclide ventriculography are being used with nonimaging detectors. These methods have not been compared on the basis of their real-time performance or suitability for determination of load-independent indexes of left ventricular function. This work evaluated the relative merits of different gating methods for nonimaging radionuclide ventriculographic studies, with particular emphasis on their suitability for real-time measurements and the determination of pressure-volume loops. A computer model was used to investigate the relative accuracy of forward gating, backward gating, and phase-mode gating. The durations of simulated left ventricular time-activity curves were randomly varied. Three acquisition parameters were considered: frame rate, acceptance window, and sample size. Twenty-five studies were performed for each combination of acquisition parameters. Hemodynamic and shape parameters from each study were compared with reference parameters derived directly from the random time-activity curves. Backward gating produced the largest errors under all conditions. For both forward gating and phase-mode gating, ejection fraction was underestimated and time to end systole and normalized peak ejection rate were overestimated. For the hemodynamic parameters, forward gating was marginally superior to phase-mode gating. The mean difference in errors between forward and phase-mode gating was 1.47% (SD 2.78%). However, for root mean square shape error, forward gating was several times worse in every case and seven times worse than phase-mode gating on average. Both forward and phase-mode gating are suitable for real-time hemodynamic measurements by nonimaging techniques. The small statistical difference between the methods is not clinically significant. The true shape of the time-activity curve is maintained most accurately by phase-mode gating.

  9. Comparison of gating methods for the real-time analysis of left ventricular function in nonimaging blood pool studies

    PubMed Central

    Beard, Brian B.; Stewart, James R.; Shiavi, Richard G.; Lorenz, Christine H.

    2018-01-01

Background Gating methods developed for electrocardiographic-triggered radionuclide ventriculography are being used with nonimaging detectors. These methods have not been compared on the basis of their real-time performance or suitability for determination of load-independent indexes of left ventricular function. This work evaluated the relative merits of different gating methods for nonimaging radionuclide ventriculographic studies, with particular emphasis on their suitability for real-time measurements and the determination of pressure-volume loops. Methods and Results A computer model was used to investigate the relative accuracy of forward gating, backward gating, and phase-mode gating. The durations of simulated left ventricular time-activity curves were randomly varied. Three acquisition parameters were considered: frame rate, acceptance window, and sample size. Twenty-five studies were performed for each combination of acquisition parameters. Hemodynamic and shape parameters from each study were compared with reference parameters derived directly from the random time-activity curves. Backward gating produced the largest errors under all conditions. For both forward gating and phase-mode gating, ejection fraction was underestimated and time to end systole and normalized peak ejection rate were overestimated. For the hemodynamic parameters, forward gating was marginally superior to phase-mode gating. The mean difference in errors between forward and phase-mode gating was 1.47% (SD 2.78%). However, for root mean square shape error, forward gating was several times worse in every case and seven times worse than phase-mode gating on average. Conclusions Both forward and phase-mode gating are suitable for real-time hemodynamic measurements by nonimaging techniques. The small statistical difference between the methods is not clinically significant. The true shape of the time-activity curve is maintained most accurately by phase-mode gating. PMID:9420820

  10. Derivation and Error Analysis of the Earth Magnetic Anomaly Grid at 2 arc min Resolution Version 3 (EMAG2v3)

    NASA Astrophysics Data System (ADS)

    Meyer, B.; Chulliat, A.; Saltus, R.

    2017-12-01

The Earth Magnetic Anomaly Grid at 2 arc min resolution version 3, EMAG2v3, combines marine and airborne trackline observations, satellite data, and magnetic observatory data to map the location, intensity, and extent of lithospheric magnetic anomalies. EMAG2v3 includes over 50 million new data points added to NCEI's Geophysical Database System (GEODAS) in recent years. The new grid relies only on observed data and does not utilize a priori geologic structure or ocean-age information. Comparing this grid to other global magnetic anomaly compilations (e.g., EMAG2 and WDMAM) shows that the inclusion of a priori ocean-age patterns forces an artificial linear pattern onto the grid; the data-only approach allows for greater complexity in representing the evolution along oceanic spreading ridges and continental margins. EMAG2v3 also makes use of the satellite-derived lithospheric field model MF7 in order to accurately represent anomalies with wavelengths greater than 300 km and to create smooth grid-merging boundaries. The heterogeneous distribution of errors in the observations used in compiling EMAG2v3 was explored and is reported in the final distributed grid. This grid is delivered both at a continuous altitude of 4 km above WGS84 and at sea level for all oceanic and coastal regions.

  11. Effect of aerodynamic and angle-of-attack uncertainties on the blended entry flight control system of the Space Shuttle from Mach 10 to 2.5

    NASA Technical Reports Server (NTRS)

    Stone, H. W.; Powell, R. W.

    1984-01-01

    A six-degree-of-freedom simulation analysis has been performed for the Space Shuttle Orbiter during entry from Mach 10 to 2.5 with realistic off-nominal conditions using the entry flight control system specified in May 1978. The off-nominal conditions included the following: (1) aerodynamic uncertainties, (2) an error in deriving the angle of attack from onboard instrumentation, (3) the failure of two of the four reaction control-system thrusters on each side, and (4) a lateral center-of-gravity offset. With combinations of the above off-nominal conditions, the control system performed satisfactorily with a few exceptions. The cases that did not exhibit satisfactory performance displayed the following main weaknesses. Marginal performance was exhibited at hypersonic speeds with a sensed angle-of-attack error of 4 deg. At supersonic speeds the system tended to be oscillatory, and the system diverged for several cases because of the inability to hold lateral trim. Several system modifications were suggested to help solve these problems and to maximize safety on the first flight: alter the elevon-trim and speed-brake schedules, delay switching to rudder trim until the rudder effectiveness is adequate, and reduce the overall rudder loop gain. These and other modifications were incorporated in a flight-control-system redesign in May 1979.

  12. A replication and methodological critique of the study "Evaluating drug trafficking on the Tor Network".

    PubMed

    Munksgaard, Rasmus; Demant, Jakob; Branwen, Gwern

    2016-09-01

    The development of cryptomarkets has gained increasing attention from academics, including a growing scientific literature on the distribution of illegal goods using cryptomarkets. Dolliver's 2015 article "Evaluating drug trafficking on the Tor Network: Silk Road 2, the Sequel" addresses this theme by evaluating drug trafficking on one of the most well-known cryptomarkets, Silk Road 2.0. The research on cryptomarkets in general-and Dolliver's article in particular-raises a number of new methodological questions. This commentary is structured around a replication of Dolliver's original study. The replication study is not based on Dolliver's original dataset, but on a second dataset collected applying the same methodology. We have found that the results produced by Dolliver differ greatly from our replicated study. While a margin of error is to be expected, the inconsistencies we found are too great to attribute to anything other than methodological issues. The analyses and conclusions drawn from studies using these methods are promising and insightful. However, based on the replication of Dolliver's study, we suggest that researchers using these methodologies carefully consider such methodological issues, that datasets be made available to other researchers, and that methodology and dataset metrics (e.g. number of downloaded pages, error logs) be described thoroughly in the context of webometrics and web crawling. Copyright © 2016 Elsevier B.V. All rights reserved.

  13. Incidence and location of positive surgical margin among open, laparoscopic and robot-assisted radical prostatectomy in prostate cancer patients: a single institutional analysis.

    PubMed

    Koizumi, Atsushi; Narita, Shintaro; Nara, Taketoshi; Takayama, Koichiro; Kanda, Sohei; Numakura, Kazuyuki; Tsuruta, Hiroshi; Maeno, Atsushi; Huang, Mingguo; Saito, Mitsuru; Inoue, Takamitsu; Tsuchiya, Norihiko; Satoh, Shigeru; Nanjo, Hiroshi; Habuchi, Tomonori

    2018-06-19

    To evaluate the positive surgical margin rates and locations in radical prostatectomy among three surgical approaches, including open radical prostatectomy, laparoscopic radical prostatectomy and robot-assisted radical prostatectomy. We retrospectively reviewed clinical outcomes at our institution of 450 patients who received radical prostatectomy. Multiple surgeons were involved in the three approaches, and a single pathologist conducted the histopathological diagnoses. Positive surgical margin rates and locations among the three approaches were statistically assessed, and the risk factors of positive surgical margin were analyzed. This study included 127, 136 and 187 patients in the open radical prostatectomy, laparoscopic radical prostatectomy and robot-assisted radical prostatectomy groups, respectively. The positive surgical margin rates were 27.6% (open radical prostatectomy), 18.4% (laparoscopic radical prostatectomy) and 13.4% (robot-assisted radical prostatectomy). In propensity score-matched analyses, the positive surgical margin rate in the robot-assisted radical prostatectomy was significantly lower than that in the open radical prostatectomy, whereas there was no significant difference in the positive surgical margin rates between robot-assisted radical prostatectomy and laparoscopic radical prostatectomy. In the multivariable analysis, PSA level at diagnosis and surgical approach (open radical prostatectomy vs robot-assisted radical prostatectomy) were independent risk factors for positive surgical margin. The apex was the most common location of positive surgical margin in the open radical prostatectomy and laparoscopic radical prostatectomy groups, whereas the bladder neck was the most common location in the robot-assisted radical prostatectomy group. The significant difference of positive surgical margin locations continued after the propensity score adjustment. 
Robot-assisted radical prostatectomy may potentially achieve the lowest positive surgical margin rate among the three surgical approaches. The bladder neck was the most common location of positive surgical margin in robot-assisted radical prostatectomy, whereas the apex was the most common location in open and laparoscopic radical prostatectomy. Although robot-assisted radical prostatectomy may contribute to the reduction of positive surgical margins, dissection of the bladder neck requires careful attention to avoid positive surgical margins.

  14. Joint Inversion for 3-Dimensional S-Velocity Mantle Structure Along the Tethyan Margin

    DTIC Science & Technology

    2007-09-01

    Hindu Kush and encompasses northeastern Africa, the Arabian peninsula, the Middle East, and part of the Atlantic Ocean for reference. We have fitted...several microplates within an area of one quarter of the Earth’s circumference yields this region rich with tectonic complexity. The three...assigned the largest errors. For the oceans we use a constraint of 10 km for Moho depth, but only for points also covered by data from our other data sets

  15. Palladium-chromium static strain gage for high temperature propulsion systems

    NASA Technical Reports Server (NTRS)

    Lei, Jih-Fen

    1991-01-01

    The present electrical strain gage for high-temperature static strain measurements is designed, in both its fine-wire and thin-film forms, to be temperature-compensated on any substrate material. The gage element is a Pd-Cr alloy, while the compensator is Pt. Because the thermally induced apparent strain of this compensated wire strain gage is sufficiently small, with good reproducibility between thermal cycles to 800 C, output figures can be corrected within a reasonable margin of error.

  16. Final Report Ra Power Management 1255 10-15-16 FINAL_Public

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Iverson, Aaron

    Ra Power Management (RPM) has developed a cloud-based software platform that manages the financial and operational functions of third party financed solar projects throughout their lifecycle. RPM’s software streamlines and automates the sales, financing, and management of a portfolio of solar assets. The software helps solar developers automate the most difficult aspects of asset management, leading to increased transparency, efficiency, and reduction in human error. More importantly, our platform will help developers save money by improving their operating margins.

  17. Hexagonally packed DNA within bacteriophage T7 stabilized by curvature stress.

    PubMed Central

    Odijk, T

    1998-01-01

    A continuum computation is proposed for the bending stress stabilizing DNA that is hexagonally packed within bacteriophage T7. Because the inner radius of the DNA spool is rather small, the stress of the curved DNA genome is strong enough to balance its electrostatic self-repulsion so as to form a stable hexagonal phase. The theory is in accord with the microscopically determined structure of bacteriophage T7 filled with DNA within the experimental margin of error. PMID:9726924

  18. Effects of online cone-beam computed tomography with active breath control in determining planning target volume during accelerated partial breast irradiation.

    PubMed

    Li, Y; Zhong, R; Wang, X; Ai, P; Henderson, F; Chen, N; Luo, F

    2017-04-01

    To test if active breath control during cone-beam computed tomography (CBCT) could improve planning target volume during accelerated partial breast radiotherapy for breast cancer. Patients older than 40 years who underwent breast-conserving dissection, were planned for accelerated partial breast irradiation, and had postoperative staging limited to T1-2 N0 M0, or a postoperative T2 lesion no larger than 3 cm with a negative surgical margin greater than 2 mm, were enrolled. Patients with lobular carcinoma or extensive ductal carcinoma in situ were excluded. CBCT images were obtained pre-correction, post-correction and post-treatment. Set-up errors were recorded in the left-right, anterior-posterior and superior-inferior directions. The differences between these CBCT images, as well as calculated radiation doses, were compared between patients with active breath control or free breathing. Forty patients were enrolled, among whom 25 had active breath control. A total of 836 CBCT images were obtained for analysis. CBCT significantly reduced planning target volume. However, active breath control did not show significant benefit in decreasing planning target volume margin and the doses of organs-at-risk when compared to free breathing. CBCT, but not active breath control, could reduce planning target volume during accelerated partial breast irradiation. Copyright © 2017 Société française de radiothérapie oncologique (SFRO). Published by Elsevier SAS. All rights reserved.
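Set-up errors like those recorded above are commonly translated into planning target volume (PTV) margins with a recipe such as van Herk's M = 2.5Σ + 0.7σ, where Σ and σ are the systematic and random set-up error standard deviations per axis. The abstract does not state which recipe the authors used, so the sketch below is illustrative only, with hypothetical error values:

```python
# Van Herk PTV margin recipe: M = 2.5*Sigma + 0.7*sigma (per axis).
# Illustrative only; the 2 mm / 3 mm inputs below are hypothetical,
# not this study's measured set-up errors.

def ptv_margin_mm(systematic_sd_mm: float, random_sd_mm: float) -> float:
    """PTV margin (mm) from systematic (Sigma) and random (sigma) SDs."""
    return 2.5 * systematic_sd_mm + 0.7 * random_sd_mm

# e.g. 2 mm systematic and 3 mm random set-up error on one axis:
print(ptv_margin_mm(2.0, 3.0))  # → 7.1
```

In practice the margin is computed separately for the left-right, anterior-posterior and superior-inferior axes, since set-up errors differ by direction.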

  19. Circumferential resection margin (CRM) positivity after MRI assessment and adjuvant treatment in 189 patients undergoing rectal cancer resection.

    PubMed

    Simpson, G S; Eardley, N; McNicol, F; Healey, P; Hughes, M; Rooney, P S

    2014-05-01

    The management of rectal cancer relies on accurate MRI staging. Multi-modal treatments can downstage rectal cancer prior to surgery and may have an effect on MRI accuracy. We aim to correlate the findings of MRI staging of rectal cancer with histological analysis, the effect of neoadjuvant therapy on this, and the implications of circumferential resection margin (CRM) positivity following neoadjuvant therapy. An analysis of histological data and radiological staging of all cases of rectal cancer in a single centre between 2006 and 2011 was conducted. Two hundred forty-one patients had histologically proven rectal cancer during the study period. One hundred eighty-two patients underwent resection. Median age was 66.6 years, and the male-to-female ratio was 13:5. The R1 resection rate was 11.1%. MRI assessment of the circumferential resection margin was accurate in 93.6% of patients without neoadjuvant radiotherapy and in 88.1% of patients who underwent neoadjuvant radiotherapy. Eighteen patients had predicted positive margins following chemoradiotherapy, of which 38.9% had an involved CRM on histological analysis. MRI assessment of the circumferential resection margin in rectal cancer is associated with high accuracy. Neoadjuvant chemoradiotherapy has a detrimental effect on this accuracy, although accuracy remains high. In the presence of persistently predicted positive margins, complete resection remains achievable but may necessitate a more radical approach to resection.

  20. Influence of Manufacturing Methods of Implant-Supported Crowns on External and Internal Marginal Fit: A Micro-CT Analysis.

    PubMed

    Moris, Izabela C M; Monteiro, Silas Borges; Martins, Raíssa; Ribeiro, Ricardo Faria; Gomes, Erica A

    2018-01-01

    To evaluate the influence of different manufacturing methods of single implant-supported metallic crowns on the internal and external marginal fit through computed microtomography. Forty external hexagon implants were divided into 4 groups (n = 8), according to the manufacturing method: GC, conventional casting; GI, induction casting; GP, plasma casting; and GCAD, CAD/CAM machining. The crowns were attached to the implants with an insertion torque of 30 N·cm. The external (vertical and horizontal) marginal fit and internal fit were assessed through computed microtomography. Internal and external marginal fit data (μm) were submitted to a one-way ANOVA and Tukey's test (α = .05). Qualitative evaluation of the images was conducted using micro-CT. The statistical analysis revealed no significant difference between the groups for vertical misfit (P = 0.721). There was no significant difference (P > 0.05) for the internal and horizontal marginal misfit in the groups GC, GI, and GP, but there was for the group GCAD (P ≤ 0.05). Qualitative analysis revealed that most of the samples of the cast groups exhibited underextension of the crowns, while the group GCAD showed overextension. The manufacturing method of the crowns influenced the accuracy of marginal fit between the prosthesis and implant. The best results were found for the crowns fabricated through CAD/CAM machining.

  1. Excess Capacity in China’s Power Systems: A Regional Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, Jiang; Liu, Xu; Karl, Fredrich

    2016-11-01

    This paper examines China’s regional electricity grids using a reliability perspective, which is commonly measured in terms of a reserve margin. Our analysis shows that at the end of 2014, the average reserve margin for China as a whole was roughly 28%, almost twice as high as a typical planning reserve margin in the U.S. However, this national average masks huge variations in reserve margins across major regional power grid areas: the northeastern region has the highest reserve margin of over 60%, followed by the northwestern region at 49%, and the southern grid area at 35%. In this analysis, we also examined future reserve margins for regional electricity grids in China under two scenarios: 1) a low scenario of national annual electricity consumption growth rates of 1.5% between 2015 and 2020 and 1.0% between 2020 and 2025, and 2) a high scenario of annual average growth rates of 3.0% and 2.0%, respectively. Both scenarios suggest that the northeastern, northwestern, and southern regions have significant excess generation capacity, and that this excess capacity situation will continue over the next decade without regulatory intervention. The northern and central regions could have sufficient generation capacity to 2020, but may require additional resources in a higher growth scenario. The eastern region requires new resources by 2020 in both scenarios.
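The reserve-margin figures above follow from the standard definition, reserve margin = (installed capacity − peak demand) / peak demand. A minimal sketch, with hypothetical capacity and demand numbers chosen only to reproduce a 60% margin like the northeastern region's:

```python
# Reserve margin = (installed capacity - peak demand) / peak demand.
# The 160 GW / 100 GW figures are hypothetical, not the paper's data.

def reserve_margin(capacity_gw: float, peak_demand_gw: float) -> float:
    """Return the planning reserve margin as a fraction of peak demand."""
    return (capacity_gw - peak_demand_gw) / peak_demand_gw

# A grid with 160 GW of capacity serving a 100 GW peak carries a 60% margin.
print(f"{reserve_margin(160, 100):.0%}")  # → 60%
```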

  2. Error analysis of mathematical problems on TIMSS: A case of Indonesian secondary students

    NASA Astrophysics Data System (ADS)

    Priyani, H. A.; Ekawati, R.

    2018-01-01

    Indonesian students’ competence in solving mathematical problems is still considered weak, as pointed out by the results of international assessments such as TIMSS. This might be caused by the various types of errors made. Hence, this study aimed at identifying students’ errors in solving TIMSS mathematical problems on the topic of numbers, considered the fundamental concept in mathematics. This study applied descriptive qualitative analysis. The subjects were the three students with the most errors on the test indicators, selected from 34 eighth-grade students. Data were obtained through a paper-and-pencil test and student interviews. The error analysis indicated that in solving the Applying-level problem, the type of error that students made was operational errors. In addition, for the Reasoning-level problem, three types of errors were made: conceptual errors, operational errors and principal errors. Meanwhile, analysis of the causes of students’ errors showed that students did not comprehend the mathematical problems given.

  3. Factors affecting surgical margin recurrence after hepatectomy for colorectal liver metastases.

    PubMed

    Akyuz, Muhammet; Aucejo, Federico; Quintini, Cristiano; Miller, Charles; Fung, John; Berber, Eren

    2016-06-01

    Hepatic recurrence after resection of colorectal liver metastasis (CLM) occurs in 50% of patients during follow-up, with 2.8% to 13.9% presenting with surgical margin recurrence (SMR). The aim of this study is to analyze factors related to SMR in patients with CLM undergoing hepatectomy. Demographics, clinical and survival data of patients who underwent hepatectomy between 2000 and 2012 were identified from a prospectively maintained, institutional review board (IRB)-approved database. Statistical analysis was performed using univariate Kaplan-Meier analysis and a Cox proportional hazards model. There were 85 female and 121 male patients who underwent liver resection for CLM. An R0 resection was performed in 157 (76%) patients and R1 resection in 49. SMR was detected in 32 patients (15.5%) followed up for a median of 29 months (range, 3-121 months). Half of these patients had undergone R1 resection (n=16) and the other half R0 resection (n=16). Tumor size, preoperative carcinoembryonic antigen (CEA) level and margin status were associated with SMR on univariate analysis. On multivariate analysis, a positive surgical margin was the only independent predictor of SMR. The receipt of adjuvant chemotherapy did not affect margin recurrence. SMR was an independent risk factor associated with worse disease-free (DFS) and overall survival (OS). This study shows that SMR, which can be detected in up to 15.5% of patients after liver resection for CLM, adversely affects DFS and OS. The fact that a positive surgical margin was the only predictive factor for SMR in these patients underscores the importance of achieving negative margins during hepatectomy.

  4. Error Propagation Analysis in the SAE Architecture Analysis and Design Language (AADL) and the EDICT Tool Framework

    NASA Technical Reports Server (NTRS)

    LaValley, Brian W.; Little, Phillip D.; Walter, Chris J.

    2011-01-01

    This report documents the capabilities of the EDICT tools for error modeling and error propagation analysis when operating with models defined in the Architecture Analysis & Design Language (AADL). We discuss our experience using the EDICT error analysis capabilities on a model of the Scalable Processor-Independent Design for Enhanced Reliability (SPIDER) architecture that uses the Reliable Optical Bus (ROBUS). Based on these experiences we draw some initial conclusions about model based design techniques for error modeling and analysis of highly reliable computing architectures.

  5. A study of the mutational landscape of pediatric-type follicular lymphoma and pediatric nodal marginal zone lymphoma.

    PubMed

    Ozawa, Michael G; Bhaduri, Aparna; Chisholm, Karen M; Baker, Steven A; Ma, Lisa; Zehnder, James L; Luna-Fineman, Sandra; Link, Michael P; Merker, Jason D; Arber, Daniel A; Ohgami, Robert S

    2016-10-01

    Pediatric-type follicular lymphoma and pediatric marginal zone lymphoma are two of the rarest B-cell lymphomas. These lymphomas occur predominantly in the pediatric population and show features distinct from their more common counterparts in adults: adult-type follicular lymphoma and adult-type nodal marginal zone lymphoma. Here we report a detailed whole-exome deep sequencing analysis of a cohort of pediatric-type follicular lymphomas and pediatric marginal zone lymphomas. This analysis revealed a recurrent somatic variant encoding p.Lys66Arg in the transcription factor interferon regulatory factor 8 (IRF8) in 3 of 6 cases (50%) of pediatric-type follicular lymphoma. This specific point mutation was not detected in pediatric marginal zone lymphoma or in adult-type follicular lymphoma. Additional somatic point mutations in pediatric-type follicular lymphoma were observed in genes involved in transcription, intracellular signaling, and cell proliferation. In pediatric marginal zone lymphoma, no recurrent mutation was identified; however, somatic point mutations were observed in genes involved in cellular adhesion, cytokine regulatory elements, and cellular proliferation. A somatic variant in AMOTL1, a recurrently mutated gene in splenic marginal zone lymphoma, was also identified in a case of pediatric marginal zone lymphoma. The overall non-synonymous mutational burden was low in both pediatric-type follicular lymphoma and pediatric marginal zone lymphoma (4.6 mutations per exome). Altogether, these findings support a distinctive genetic basis for pediatric-type follicular lymphoma and pediatric marginal zone lymphoma when compared with adult subtypes and to one another. Moreover, identification of a recurrent point mutation in IRF8 provides insight into a potential driver mutation in the pathogenesis of pediatric-type follicular lymphoma with implications for novel diagnostic or therapeutic strategies.

  6. A study of the mutational landscape of pediatric-type follicular lymphoma and pediatric nodal marginal zone lymphoma

    PubMed Central

    Ozawa, Michael G; Bhaduri, Aparna; Chisholm, Karen M; Baker, Steven A; Ma, Lisa; Zehnder, James L; Luna-Fineman, Sandra; Link, Michael P; Merker, Jason D; Arber, Daniel A; Ohgami, Robert S

    2016-01-01

    Pediatric-type follicular lymphoma and pediatric marginal zone lymphoma are two of the rarest B-cell lymphomas. These lymphomas occur predominantly in the pediatric population and show features distinct from their more common counterparts in adults: adult-type follicular lymphoma and adult-type nodal marginal zone lymphoma. Here we report a detailed whole-exome deep sequencing analysis of a cohort of pediatric-type follicular lymphomas and pediatric marginal zone lymphomas. This analysis revealed a recurrent somatic variant encoding p.Lys66Arg in the transcription factor interferon regulatory factor 8 (IRF8) in 3 of 6 cases (50%) of pediatric-type follicular lymphoma. This specific point mutation was not detected in pediatric marginal zone lymphoma or in adult-type follicular lymphoma. Additional somatic point mutations in pediatric-type follicular lymphoma were observed in genes involved in transcription, intracellular signaling, and cell proliferation. In pediatric marginal zone lymphoma, no recurrent mutation was identified; however, somatic point mutations were observed in genes involved in cellular adhesion, cytokine regulatory elements, and cellular proliferation. A somatic variant in AMOTL1, a recurrently mutated gene in splenic marginal zone lymphoma, was also identified in a case of pediatric marginal zone lymphoma. The overall non-synonymous mutational burden was low in both pediatric-type follicular lymphoma and pediatric marginal zone lymphoma (4.6 mutations per exome). Altogether, these findings support a distinctive genetic basis for pediatric-type follicular lymphoma and pediatric marginal zone lymphoma when compared with adult subtypes and to one another. Moreover, identification of a recurrent point mutation in IRF8 provides insight into a potential driver mutation in the pathogenesis of pediatric-type follicular lymphoma with implications for novel diagnostic or therapeutic strategies. PMID:27338637

  7. Marginal and Random Intercepts Models for Longitudinal Binary Data with Examples from Criminology

    ERIC Educational Resources Information Center

    Long, Jeffrey D.; Loeber, Rolf; Farrington, David P.

    2009-01-01

    Two models for the analysis of longitudinal binary data are discussed: the marginal model and the random intercepts model. In contrast to the linear mixed model (LMM), the two models for binary data are not subsumed under a single hierarchical model. The marginal model provides group-level information whereas the random intercepts model provides…

  8. Marginal lands for biocontrol and ecosystem services: Where to enhance and what do we put there?

    USDA-ARS?s Scientific Manuscript database

    Analysis of the Coastal Plain of Georgia, USA identified over 300,000 hectares of marginal land. There is a potential to grow other non-commodity native plants in marginal areas that have the potential to improve the diversity of the landscape and promote ecosystem services. Bioenergy feedstocks are...

  9. Constructions of Difference and Deficit, a Case Study: Nicaraguan Families and Children on the Margins in Costa Rica

    ERIC Educational Resources Information Center

    Purcell-Gates, Victoria

    2014-01-01

    This analysis examines the nexus of marginalization and education, particularly the literacy potential and achievement of young children from socially and politically marginalized communities. Drawing on data from a study of literacy practice among Nicaraguan immigrants in Costa Rica and the schooling of the Nicaraguan children in Costa Rican…

  10. Systematic effects on dark energy from 3D weak shear

    NASA Astrophysics Data System (ADS)

    Kitching, T. D.; Taylor, A. N.; Heavens, A. F.

    2008-09-01

    We present an investigation into the potential effect of systematics inherent in multiband wide-field surveys on the dark energy equation-of-state determination for two 3D weak lensing methods. The weak lensing methods are a geometric shear-ratio method and 3D cosmic shear. The analysis here uses an extension of the Fisher matrix framework to include jointly photometric redshift systematics, shear distortion systematics and intrinsic alignments. Using analytic parametrizations of these three primary systematic effects allows an isolation of systematic parameters of particular importance. We show that assuming systematic parameters are fixed, but possibly biased, results in potentially large biases in dark energy parameters. We quantify any potential bias by defining a Bias Figure of Merit. By marginalizing over extra systematic parameters, such biases are negated at the expense of an increase in the cosmological parameter errors. We show the effect on the dark energy Figure of Merit of marginalizing over each systematic parameter individually. We also show the overall reduction in the Figure of Merit due to all three types of systematic effects. Based on some assumption of the likely level of systematic errors, we find that the largest effect on the Figure of Merit comes from uncertainty in the photometric redshift systematic parameters. These can reduce the Figure of Merit by up to a factor of 2 to 4 in both 3D weak lensing methods, if no informative prior on the systematic parameters is applied. Shear distortion systematics have a smaller overall effect. Intrinsic alignment effects can reduce the Figure of Merit by up to a further factor of 2. This, however, is a worst-case scenario, within the assumptions of the parametrizations used. By including prior information on systematic parameters, the Figure of Merit can be recovered to a large extent, and combined constraints from 3D cosmic shear and shear ratio are robust to systematics. 
We conclude that, as a rule of thumb, given a realistic current understanding of intrinsic alignments and photometric redshifts, then including all three primary systematic effects reduces the Figure of Merit by at most a factor of 2.
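The trade-off described above, where fixing a systematic parameter risks bias while marginalizing over it inflates the cosmological errors, can be seen in a toy Fisher-matrix calculation. The matrix entries below are made up for illustration; they are not from the paper's survey forecasts:

```python
import numpy as np

# Toy 2x2 Fisher matrix for (dark-energy parameter w, systematic s).
# Entries are invented for illustration; off-diagonal terms encode the
# degeneracy between the two parameters.
F = np.array([[40.0, 12.0],
              [12.0, 9.0]])

# Fixing the systematic at its assumed value: sigma_w = 1/sqrt(F_ww)
sigma_fixed = 1.0 / np.sqrt(F[0, 0])

# Marginalizing over the systematic: sigma_w = sqrt((F^-1)_ww)
sigma_marg = np.sqrt(np.linalg.inv(F)[0, 0])

# The marginalized error is never smaller than the fixed-parameter error.
print(sigma_fixed, sigma_marg)
assert sigma_marg >= sigma_fixed
```

This mirrors the paper's finding: applying an informative prior on the systematic parameters (pushing the matrix toward the fixed-parameter case) recovers much of the Figure of Merit, at the cost of trusting that prior.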

  11. Spine detection in CT and MR using iterated marginal space learning.

    PubMed

    Michael Kelm, B; Wels, Michael; Kevin Zhou, S; Seifert, Sascha; Suehling, Michael; Zheng, Yefeng; Comaniciu, Dorin

    2013-12-01

    Examinations of the spinal column with both Magnetic Resonance (MR) imaging and Computed Tomography (CT) often require a precise three-dimensional positioning, angulation and labeling of the spinal disks and the vertebrae. A fully automatic and robust approach is a prerequisite for an automated scan alignment as well as for the segmentation and analysis of spinal disks and vertebral bodies in Computer Aided Diagnosis (CAD) applications. In this article, we present a novel method that combines Marginal Space Learning (MSL), a recently introduced concept for efficient discriminative object detection, with a generative anatomical network that incorporates relative pose information for the detection of multiple objects. It is used to simultaneously detect and label the spinal disks. While a novel iterative version of MSL is used to quickly generate candidate detections comprising position, orientation, and scale of the disks with high sensitivity, the anatomical network selects the most likely candidates using a learned prior on the individual nine-dimensional transformation spaces. Finally, we propose an optional case-adaptive segmentation approach that allows segmentation of the spinal disks and vertebrae in MR and CT, respectively. Since the proposed approaches are learning-based, they can be trained for MR or CT alike. Experimental results based on 42 MR and 30 CT volumes show that our system not only achieves superior accuracy but also is among the fastest systems of its kind in the literature. On the MR data set the spinal disks of a whole spine are detected in 11.5 s on average with 98.6% sensitivity and 0.073 false positive detections per volume. On the CT data a comparable sensitivity of 98.0% with 0.267 false positives is achieved. Detected disks are localized with an average position error of 2.4 mm/3.2 mm and angular error of 3.9°/4.5° in MR/CT, which is close to the employed hypothesis resolution of 2.1 mm and 3.3°. Copyright © 2012 Elsevier B.V. All rights reserved.

  12. From least squares to multilevel modeling: A graphical introduction to Bayesian inference

    NASA Astrophysics Data System (ADS)

    Loredo, Thomas J.

    2016-01-01

    This tutorial presentation will introduce some of the key ideas and techniques involved in applying Bayesian methods to problems in astrostatistics. The focus will be on the big picture: understanding the foundations (interpreting probability, Bayes's theorem, the law of total probability and marginalization), making connections to traditional methods (propagation of errors, least squares, chi-squared, maximum likelihood, Monte Carlo simulation), and highlighting problems where a Bayesian approach can be particularly powerful (Poisson processes, density estimation and curve fitting with measurement error). The "graphical" component of the title reflects an emphasis on pictorial representations of some of the math, but also on the use of graphical models (multilevel or hierarchical models) for analyzing complex data. Code for some examples from the talk will be available to participants, in Python and in the Stan probabilistic programming language.
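The marginalization idea mentioned above, the law of total probability applied to a nuisance parameter, can be sketched numerically. Assuming a single Gaussian datum with an uncertain additive bias, integrating the bias out on a grid recovers the familiar error-propagation result that the posterior width grows from σ to √(σ² + τ²). All numbers below are illustrative:

```python
import numpy as np

# Datum d = mu + b + noise(sigma); nuisance bias b ~ N(0, tau).
# Marginalizing over b should widen the posterior on mu from sigma
# to sqrt(sigma**2 + tau**2) -- the Gaussian analogue of error propagation.
sigma, tau, d = 1.0, 0.5, 2.0
mu = np.linspace(-4.0, 8.0, 1201)   # grid symmetric about d = 2
b = np.linspace(-4.0, 4.0, 1601)    # +/- 8 tau, wide enough to be safe

# p(mu | d) proportional to the integral of p(d | mu, b) p(b) db, on a grid
like = np.exp(-0.5 * ((d - mu[:, None] - b[None, :]) / sigma) ** 2)
prior_b = np.exp(-0.5 * (b / tau) ** 2)
post = (like * prior_b).sum(axis=1)      # marginalize over b
post /= post.sum() * (mu[1] - mu[0])     # normalize on the mu grid

mean = (mu * post).sum() * (mu[1] - mu[0])
sd = np.sqrt(((mu - mean) ** 2 * post).sum() * (mu[1] - mu[0]))
print(round(sd, 3))  # close to sqrt(1 + 0.25) ≈ 1.118
```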

  13. Nurses' rights of medication administration: Including authority with accountability and responsibility.

    PubMed

    Jones, Jackie H; Treiber, Linda A

    2018-04-23

    Medication errors continue to occur too frequently in the United States. Although the five rights of medication administration have expanded to include several others, evidence that the number of errors has decreased is missing. This study suggests that medication rights for nurses as they administer medications are needed. The historical marginalization of the voice of nurses has been perpetuated with detrimental impacts to nurses and patients. In recent years, a focus on the creation of a just culture, with a balance of accountability and responsibility, has sought to bring a fairer and safer construct to the healthcare environment. This paper proposes that in order for a truly just culture to exist, the balance must also include nurses' authority. Only when a triumvirate of responsibility, accountability, and authority exists can an environment that supports reduced medication errors flourish. Through identification and implementation of Nurses Rights of Medication Administration, nurses' authority to control the administration process is both formalized and legitimized. Further study is needed to identify these rights and how to fully implement them. © 2018 Wiley Periodicals, Inc.

  14. Comparative assessment of marginal accuracy of grade II titanium and Ni–Cr alloy before and after ceramic firing: An in vitro study

    PubMed Central

    Patil, Abhijit; Singh, Kishan; Sahoo, Sukant; Suvarna, Suraj; Kumar, Prince; Singh, Anupam

    2013-01-01

    Objective: The aims of the study are to assess the marginal accuracy of base metal and titanium alloy casting and to evaluate the effect of repeated ceramic firing on the marginal accuracy of base metal and titanium alloy castings. Materials and Methods: Twenty metal copings were fabricated with each casting material. Specimens were divided into 4 groups of 10 each representing base metal alloys castings without (Group A) and with metal shoulder margin (Group B), titanium castings without (Group C) and with metal shoulder margin (Group D). The measurement of fit of the metal copings was carried out before the ceramic firing at four different points and the same was followed after porcelain build-up. Results: Significant difference was found when Ni–Cr alloy samples were compared with Grade II titanium samples both before and after ceramic firings. The titanium castings with metal shoulder margin showed highest microgap among all the materials tested. Conclusions: Based on the results that were found and within the limitations of the study design, it can be concluded that there is marginal discrepancy in the copings made from Ni–Cr and Grade II titanium. This marginal discrepancy increased after ceramic firing cycles for both Ni–Cr and Grade II titanium. The comparative statistical analysis for copings with metal-collar showed maximum discrepancy for Group D. The comparative statistical analysis for copings without metal-collar showed maximum discrepancy for Group C. PMID:24926205

  15. Monitoring Rainfall by Combining Ground-based Observed Precipitation and PERSIANN Satellite Product (Case Study Area: Lake Urmia Basin)

    NASA Astrophysics Data System (ADS)

    Abrishamchi, A.; Mirshahi, A.

    2015-12-01

    The global coverage, quick access, and appropriate spatial-temporal resolution of satellite precipitation data make these products well suited to hydrologic studies, especially in regions without a sufficient rain-gauge network. On the other hand, satellite precipitation products may contain major errors. The present study aims to reduce the estimation error of the PERSIANN satellite precipitation product. Bayesian logic was employed to develop a statistical relationship between historical ground-based and satellite precipitation data. This relationship can then be used to reduce satellite precipitation product error in near real time, when no ground-based precipitation observation is available. The method was evaluated in the Lake Urmia basin on a monthly time scale: November to May of 2000-2008 for model development, and the two years 2009 and 2010 for validation of the established relationships. Moreover, the Kriging interpolation method was employed to estimate the average rainfall in the basin. Furthermore, to downscale the satellite precipitation product from 0.25° to 0.05°, a data-location downscaling algorithm was used. During the validation period, the final product had less error than the raw satellite precipitation in 76 percent of months. Additionally, its performance was marginally better than the adjusted PERSIANN product.
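
    A minimal sketch of the idea behind such a correction, assuming an ordinary linear fit in place of the paper's Bayesian formulation; the gauge and PERSIANN values and the `correct` helper are invented for illustration:

```python
import numpy as np

# Hypothetical monthly precipitation totals (mm): rain-gauge observations and
# the corresponding raw PERSIANN estimates over a calibration period. The
# numbers are invented; only the correction arithmetic is illustrated.
gauge = np.array([42.0, 55.0, 61.0, 30.0, 48.0, 70.0, 25.0, 58.0])
persiann = np.array([50.0, 65.0, 80.0, 38.0, 60.0, 90.0, 33.0, 72.0])

# Fit a linear relation gauge ~ a * persiann + b as a simple stand-in for the
# paper's Bayesian relationship between ground and satellite data.
a, b = np.polyfit(persiann, gauge, 1)

def correct(sat_mm):
    """Adjust a near-real-time satellite estimate with the fitted relation."""
    return a * sat_mm + b

# In-sample check: the corrected product cannot have a larger RMSE than the
# raw product, because the identity map is one of the candidate linear fits.
raw_rmse = np.sqrt(np.mean((persiann - gauge) ** 2))
corrected_rmse = np.sqrt(np.mean((correct(persiann) - gauge) ** 2))
```

    In operation, `correct` would be applied to new satellite estimates for months with no gauge observation, which is the near-real-time use case the abstract describes.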

  16. Refining Field Measurements of Methane Flux Rates from Abandoned Oil and Gas Wells

    NASA Astrophysics Data System (ADS)

    Lagron, C. S.; Kang, M.; Riqueros, N. S.; Jackson, R. B.

    2015-12-01

    Recent studies in Pennsylvania demonstrate the potential for significant methane emissions from abandoned oil and gas wells. A subset of tested wells was high emitting, with methane flux rates up to seven orders of magnitude greater than natural fluxes (up to 10^5 mg CH4/hour, or about 2.5 LPM). These wells contribute disproportionately to the total methane emissions from abandoned oil and gas wells. The principles guiding chamber design have been developed for the lower flux rates typically found in natural environments, and chamber design modifications may reduce uncertainty in flux rates associated with high-emitting wells. Kang et al. estimate errors of a factor of two in measured values based on previous studies. We conduct controlled releases of methane to refine error estimates and improve chamber design with a focus on high emitters. Controlled releases of methane are conducted at 0.05 LPM, 0.50 LPM, 1.0 LPM, 2.0 LPM, 3.0 LPM, and 5.0 LPM, and with two chamber dimensions typically used in field measurement studies of abandoned wells. As most sources of error tabulated by Kang et al. tend to bias the results toward underreporting of methane emissions, a flux-targeted chamber design modification can reduce error margins and/or provide grounds for a potential upward revision of emission estimates.
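
    The static-chamber arithmetic underlying such flux measurements can be sketched as follows. The function name, chamber volume, and concentration series are invented, and the study's actual processing is not given in the abstract:

```python
import numpy as np

# Static-chamber flux estimation: the methane flux is inferred from the rate
# of concentration rise inside a closed chamber of known volume. All numbers
# here are illustrative and are not taken from the study.

CH4_MOLAR_MASS_G = 16.04   # g/mol
MOLAR_VOLUME_L = 24.45     # L/mol of ideal gas at 25 degC, 1 atm

def chamber_flux_mg_per_hr(conc_ppm, times_hr, chamber_volume_l):
    """Least-squares slope of CH4 concentration (ppm) against time (hours),
    converted to a mass flux in mg CH4/hour."""
    slope_ppm_per_hr = np.polyfit(times_hr, conc_ppm, 1)[0]
    # ppm -> mole fraction; times chamber volume -> litres of CH4 per hour;
    # divide by molar volume -> mol/hr; times molar mass -> g/hr -> mg/hr.
    mol_per_hr = slope_ppm_per_hr * 1e-6 * chamber_volume_l / MOLAR_VOLUME_L
    return mol_per_hr * CH4_MOLAR_MASS_G * 1000.0

# A 30 L chamber in which CH4 rises from 2 to 202 ppm over half an hour:
flux = chamber_flux_mg_per_hr([2.0, 102.0, 202.0], [0.0, 0.25, 0.5], 30.0)
```

    At the high flux rates discussed above, the concentration rise is fast enough that chamber leakage and nonlinearity become the dominant error sources, which is what motivates the design modifications.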

  17. Competing regression models for longitudinal data.

    PubMed

    Alencar, Airlane P; Singer, Julio M; Rocha, Francisco Marcelo M

    2012-03-01

    The choice of an appropriate family of linear models for the analysis of longitudinal data is often a matter of concern for practitioners. To attenuate such difficulties, we discuss some issues that emerge when analyzing this type of data via a practical example involving pretest-posttest longitudinal data. In particular, we consider log-normal linear mixed models (LNLMM), generalized linear mixed models (GLMM), and models based on generalized estimating equations (GEE). We show how some special features of the data, like a nonconstant coefficient of variation, may be handled in the three approaches and evaluate their performance with respect to the magnitude of standard errors of interpretable and comparable parameters. We also show how different diagnostic tools may be employed to identify outliers and comment on available software. We conclude by noting that the results are similar, but that GEE-based models may be preferable when the goal is to compare the marginal expected responses. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. Build-up Approach to Updating the Mock Quiet Spike(TradeMark) Beam Model

    NASA Technical Reports Server (NTRS)

    Herrera, Claudia Y.; Pak, Chan-gi

    2007-01-01

    A crucial part of aircraft design is ensuring that the required margin for flutter is satisfied. A trustworthy flutter analysis, which begins by possessing an accurate dynamics model, is necessary for this task. Traditionally, a model was updated manually by fine tuning specific stiffness parameters until the analytical results matched test data. This is a time consuming iterative process. NASA Dryden Flight Research Center has developed a mode matching code to execute this process in a more efficient manner. Recently, this code was implemented in the F-15B/Quiet Spike(TradeMark) (Gulfstream Aerospace Corporation, Savannah, Georgia) model update. A build-up approach requiring several ground vibration test configurations and a series of model updates was implemented in order to determine the connection stiffness between aircraft and test article. The mode matching code successfully updated various models for the F-15B/Quiet Spike(TradeMark) project to within 1 percent error in frequency and the modal assurance criteria values ranged from 88.51-99.42 percent.

  19. Build-up Approach to Updating the Mock Quiet Spike(TM)Beam Model

    NASA Technical Reports Server (NTRS)

    Herrera, Claudia Y.; Pak, Chan-gi

    2007-01-01

    A crucial part of aircraft design is ensuring that the required margin for flutter is satisfied. A trustworthy flutter analysis, which begins by possessing an accurate dynamics model, is necessary for this task. Traditionally, a model was updated manually by fine tuning specific stiffness parameters until the analytical results matched test data. This is a time consuming iterative process. The NASA Dryden Flight Research Center has developed a mode matching code to execute this process in a more efficient manner. Recently, this code was implemented in the F-15B/Quiet Spike (Gulfstream Aerospace Corporation, Savannah, Georgia) model update. A build-up approach requiring several ground vibration test configurations and a series of model updates was implemented to determine the connection stiffness between aircraft and test article. The mode matching code successfully updated various models for the F-15B/Quiet Spike project to within 1 percent error in frequency and the modal assurance criteria values ranged from 88.51-99.42 percent.

  20. Sex determination from the frontal bone: a geometric morphometric study.

    PubMed

    Perlaza, Néstor A

    2014-09-01

    Sex estimation from the cranium of human skeletal remains through traditional methods is a fundamental pillar of human identification; however, a margin of error may be incurred because of the state of preservation of incomplete or fragmented remains. The aim of this investigation was sex estimation through geometric morphometric analysis of the frontal bone. The sample comprised 60 lateral radiographs of adult subjects of both sexes (30 males and 30 females), aged between 18 and 40 years, with a mean age of 28 ± 4 years for males and 30 ± 6 years for females. Thin-plate splines evidenced strong expansion of the glabellar region in males and contraction in females. No significant differences were found between sexes with respect to size. The findings suggest differences in shape and size in the glabellar region, besides reaffirming the use of geometric morphometrics as a quantitative method in sex estimation. © 2014 American Academy of Forensic Sciences.

  1. Effects of pan cooking on micropollutants in meat.

    PubMed

    Planche, Christelle; Ratel, Jérémy; Blinet, Patrick; Mercier, Frédéric; Angénieux, Magaly; Chafey, Claude; Zinck, Julie; Marchond, Nathalie; Chevolleau, Sylvie; Marchand, Philippe; Dervilly-Pinel, Gaud; Guérin, Thierry; Debrauwer, Laurent; Engel, Erwan

    2017-10-01

    This work presents the effects of pan cooking on PCBs, PCDD/Fs, pesticides and trace elements in meat from a risk assessment perspective. Three different realistic cooking intensities were studied. A GC×GC-TOF/MS method was set up for the multiresidue analysis of 189 PCBs, 17 PCDD/Fs and 16 pesticides whereas Cd, As, Pb and Hg were assayed by ICP-MS. In terms of quantity, average PCB losses after cooking were 18±5% for rare, 30±3% for medium, and 48±2% for well-done meat. In contrast, average PCDD/F losses were not significant. For pesticides, no loss occurred for aldrin, lindane, DDE or DDD, whereas losses exceeding 80% were found for dieldrin, sulfotep or phorate. Losses close to the margin of error were observed for trace elements. These results are discussed in light of the physicochemical properties of the micropollutants as well as of water and fat losses into cooking juice. Copyright © 2017 Elsevier Ltd. All rights reserved.

  2. Writing Stories, Rewriting Identities: Using Journalism Education and Mobile Technologies to Empower Marginalized High School Students

    ERIC Educational Resources Information Center

    Cybart-Persenaire, Alena; Literat, Ioana

    2018-01-01

    This study examines the impact that producing a print newspaper using cell phones had on marginalized students in a high school journalism classroom. Analysis of data from participant observation, artifact analysis and student interviews revealed that a) students negotiated cell phone use for educational purposes, despite school bans on such…

  3. Flight-determined stability analysis of multiple-input-multiple-output control systems

    NASA Technical Reports Server (NTRS)

    Burken, John J.

    1992-01-01

    Singular value analysis can give conservative stability margin results. Applying structure to the uncertainty can reduce this conservatism. This paper presents flight-determined stability margins for the X-29A lateral-directional, multiloop control system. These margins are compared with the predicted unscaled singular values and scaled structured singular values. The algorithm was further evaluated with flight data by changing the roll-rate-to-aileron command-feedback gain by +/- 20 percent. Minimum eigenvalues of the return difference matrix which bound the singular values are also presented. Extracting multiloop singular values from flight data and analyzing the feedback gain variations validates this technique as a measure of robustness. This analysis can be used for near-real-time flight monitoring and safety testing.

  4. Flight-determined stability analysis of multiple-input-multiple-output control systems

    NASA Technical Reports Server (NTRS)

    Burken, John J.

    1992-01-01

    Singular value analysis can give conservative stability margin results. Applying structure to the uncertainty can reduce this conservatism. This paper presents flight-determined stability margins for the X-29A lateral-directional, multiloop control system. These margins are compared with the predicted unscaled singular values and scaled structured singular values. The algorithm was further evaluated with flight data by changing the roll-rate-to-aileron-command-feedback gain by +/- 20 percent. Also presented are the minimum eigenvalues of the return difference matrix which bound the singular values. Extracting multiloop singular values from flight data and analyzing the feedback gain variations validates this technique as a measure of robustness. This analysis can be used for near-real-time flight monitoring and safety testing.
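
    The singular-value margin computation described above can be illustrated with a small numerical sketch; the 2x2 complex loop-gain matrix `L` is invented, not taken from the X-29A data:

```python
import numpy as np

# The minimum singular value of the return-difference matrix I + L at a given
# frequency is a standard multiloop stability-margin measure: values near zero
# flag frequencies at which the closed loop approaches singularity.

def min_return_difference_sv(L):
    """Smallest singular value of I + L for a square loop-gain matrix L."""
    I = np.eye(L.shape[0])
    return np.linalg.svd(I + L, compute_uv=False)[-1]

L = np.array([[-0.5 + 0.2j, 0.10 + 0.0j],
              [0.05 + 0.0j, -0.4 - 0.3j]])
sigma_min = min_return_difference_sv(L)

# As the abstract notes, eigenvalues of the return-difference matrix bound the
# singular values: sigma_min <= |lambda| for every eigenvalue lambda of I + L.
lam_min = min(abs(np.linalg.eigvals(np.eye(2) + L)))
```

    In practice this computation is repeated over a grid of frequencies, and the smallest value over the grid serves as the multiloop margin.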

  5. Risk Informed Margins Management as part of Risk Informed Safety Margin Characterization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Curtis Smith

    2014-06-01

    The ability to better characterize and quantify safety margin is important to improved decision making about Light Water Reactor (LWR) design, operation, and plant life extension. A systematic approach to the characterization of safety margins and the subsequent margin management options represents a vital input to the licensee and regulatory analysis and decision making that will be involved. In addition, as research and development in the LWR Sustainability (LWRS) Program and other collaborative efforts yield new data, sensors, and improved scientific understanding of the physical processes that govern the aging and degradation of plant SSCs, needs and opportunities to better optimize plant safety and performance will become known. To support decision making related to economics, reliability, and safety, the Risk Informed Safety Margin Characterization (RISMC) Pathway provides methods and tools that enable mitigation options known as risk informed margins management (RIMM) strategies.

  6. 42 CFR 431.992 - Corrective action plan.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... CMS, designed to reduce improper payments in each program based on its analysis of the error causes in... State must take the following actions: (1) Data analysis. States must conduct data analysis such as reviewing clusters of errors, general error causes, characteristics, and frequency of errors that are...

  7. 42 CFR 431.992 - Corrective action plan.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... CMS, designed to reduce improper payments in each program based on its analysis of the error causes in... State must take the following actions: (1) Data analysis. States must conduct data analysis such as reviewing clusters of errors, general error causes, characteristics, and frequency of errors that are...

  8. An Analysis of Cost Analysis Methods Used during Contract Evaluation and Source Selection in Government Contracting.

    DTIC Science & Technology

    1986-12-01

    optimal value can be stated as: Marginal Productivity of Good A / Price of Good A = Marginal Productivity of Good B / Price of Good B. This...contractor proposed production costs could be used. II. CONTRACT PROPOSAL EVALUATION A. PRICE ANALYSIS Price analysis, in its broadest sense...enters the market with a supply function represented by line S2, then the new price will be reestablished at price OP2 and quantity OQ2. Price

  9. Beam-specific planning volumes for scattered-proton lung radiotherapy

    NASA Astrophysics Data System (ADS)

    Flampouri, S.; Hoppe, B. S.; Slopsema, R. L.; Li, Z.

    2014-08-01

    This work describes the clinical implementation of a beam-specific planning treatment volume (bsPTV) calculation for lung cancer proton therapy and its integration into the treatment planning process. Uncertainties incorporated in the calculation of the bsPTV included setup errors, machine delivery variability, breathing effects, inherent proton range uncertainties, and combinations of the above. Margins were added for translational and rotational setup errors and breathing motion variability during the course of treatment, as well as for their effect on the proton range of each treatment field. The effect of breathing motion and deformation on the proton range was calculated from 4D computed tomography data. Range uncertainties were considered taking into account the individual voxel HU uncertainty along each proton beamlet. Beam-specific treatment volumes generated for 12 patients were used: a) as planning targets, b) for routine plan evaluation, c) to aid beam angle selection, and d) to create beam-specific margins for organs at risk to ensure sparing. The alternative planning technique based on the bsPTVs produced target coverage similar to that of the conventional proton plans while better sparing the surrounding tissues. Conventional proton plans were evaluated by comparing the dose distributions per beam with the corresponding bsPTV. The bsPTV volume as a function of beam angle revealed some unexpected sources of uncertainty and could help the planner choose more robust beams. A beam-specific planning volume for the spinal cord was used for dose distribution shaping to ensure organ sparing laterally and distally to the beam.

  10. Parametric analysis for matched pair survival data.

    PubMed

    Manatunga, A K; Oakes, D

    1999-12-01

    Hougaard's (1986) bivariate Weibull distribution with positive stable frailties is applied to matched pairs survival data when either or both components of the pair may be censored and covariate vectors may be of arbitrary fixed length. When there is no censoring, we quantify the corresponding gain in Fisher information over a fixed-effects analysis. With the appropriate parameterization, the results take a simple algebraic form. An alternative marginal ("independence working model") approach to estimation is also considered. This method ignores the correlation between the two survival times in the derivation of the estimator, but provides a valid estimate of standard error. It is shown that when the correlation between the two survival times and the ratio of the within-pair to the between-pair variability of the covariates are both high, the fixed-effects analysis captures most of the information about the regression coefficient but the independence working model does badly. When the correlation is low, and/or most of the variability of the covariates occurs between pairs, the reverse is true. The random effects model is applied to data on skin grafts and on loss of visual acuity among diabetics. In conclusion, some extensions of the methods are indicated and placed in the wider context of Generalized Estimating Equation methodology.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ke, Yinghai; Coleman, Andre M.; Diefenderfer, Heida L.

    We delineated 8 watersheds contributing to previously defined river reaches within the 1,468-km2 historical floodplain of the tidally influenced lower Columbia River and estuary. We assessed land-cover change at the watershed, reach, and restoration site scales by reclassifying remote-sensing data from the National Oceanic and Atmospheric Administration Coastal Change Analysis Program’s land cover/land change product into forest, wetland, and urban categories. The analysis showed a 198.3 km2 loss of forest cover during the first 6 years of the Columbia Estuary Ecosystem Restoration Program, 2001–2006. Total measured urbanization in the contributing watersheds of the estuary during the full 1996-2006 change analysis period was 48.4 km2. Trends in forest gain/loss and urbanization differed between watersheds. Wetland gains and losses were within the margin of error of the satellite imagery analysis. No significant land cover change was measured at restoration sites, although change was visible in aerial imagery; therefore, the 30-m land-cover product may not be appropriate for assessment of early-stage wetland restoration. These findings suggest that floodplain restoration sites in reaches downstream of watersheds with decreasing forest cover will be subject to increased sediment loads, and those downstream of urbanization will experience the effects of increased impervious surfaces on hydrologic processes.

  12. Positive Surgical Margins in Favorable-Stage Differentiated Thyroid Cancer.

    PubMed

    Mercado, Catherine E; Drew, Peter A; Morris, Christopher G; Dziegielewski, Peter T; Mendenhall, William M; Amdur, Robert J

    2018-04-16

    The significance of a positive margin in favorable-stage well-differentiated thyroid cancer is controversial. We report outcomes of positive-margin patients with a matched-pair comparison to a negative-margin group. A total of 25 patients with classic-histology papillary or follicular carcinoma, total thyroidectomy +/- node dissection, stage T1-3N0-1bM0, a positive surgical margin at the primary site, adjuvant radioactive iodine (I-131), and age older than 18 years were treated between 2003 and 2013. Endpoints were clinical and biochemical (thyroglobulin-only) recurrence-free survival. Matched-pair analysis involved a 1:1 match with negative-margin cases matched for overall stage and I-131 dose. Recurrence-free survival in positive-margin patients was 71% at 10 years. No patient was successfully salvaged with additional treatment. Only 1 patient died of thyroid cancer. Recurrence-free survival at 10 years was worse with a positive (71%) versus negative (90%) margin (P=0.140). Cure with a microscopically positive margin was suboptimal (71%) despite patients having classic-histology papillary and follicular carcinoma, favorable stage, and moderate-dose I-131 therapy.

  13. Factors affecting surgical margin recurrence after hepatectomy for colorectal liver metastases

    PubMed Central

    Akyuz, Muhammet; Aucejo, Federico; Quintini, Cristiano; Miller, Charles; Fung, John

    2016-01-01

    Background Hepatic recurrence after resection of colorectal liver metastasis (CLM) occurs in 50% of patients during follow-up, with 2.8% to 13.9% presenting with surgical margin recurrence (SMR). The aim of this study is to analyze factors related to SMR in patients with CLM undergoing hepatectomy. Methods Demographics and clinical and survival data of patients who underwent hepatectomy between 2000 and 2012 were identified from a prospectively maintained, institutional review board (IRB)-approved database. Statistical analysis was performed using the univariate Kaplan-Meier method and the Cox proportional hazard model. Results There were 85 female and 121 male patients who underwent liver resection for CLM. An R0 resection was performed in 157 (76%) patients and an R1 resection in 49. SMR was detected in 32 patients (15.5%) followed up for a median of 29 months (range, 3–121 months). Half of these patients had undergone R1 resection (n=16) and the other half R0 resection (n=16). Tumor size, preoperative carcinoembryonic antigen (CEA) level, and margin status were associated with SMR on univariate analysis. On multivariate analysis, a positive surgical margin was the only independent predictor of SMR. The receipt of adjuvant chemotherapy did not affect margin recurrence. SMR was an independent risk factor associated with worse disease-free (DFS) and overall survival (OS). Conclusions This study shows that SMR, which can be detected in up to 15.5% of patients after liver resection for CLM, adversely affects DFS and OS. The fact that a positive surgical margin was the only predictive factor for SMR in these patients underscores the importance of achieving negative margins during hepatectomy. PMID:27294032

  14. Prognostic significance of positive circumferential resection margin in esophageal cancer: a systematic review and meta-analysis.

    PubMed

    Wu, Jie; Chen, Qi-Xun; Teng, Li-song; Krasna, Mark J

    2014-02-01

    To assess the prognostic significance of positive circumferential resection margin on overall survival in patients with esophageal cancer, a systematic review and meta-analysis was performed. Studies were identified from PubMed, EMBASE, and Web of Science. Survival data were extracted from eligible studies to compare overall survival in patients with a positive circumferential resection margin with patients having a negative circumferential resection margin according to the Royal College of Pathologists (RCP) criteria and the College of American Pathologists (CAP) criteria. Survival data were pooled with hazard ratios (HRs) and their corresponding 95% confidence intervals (CIs). A random-effects model meta-analysis on overall survival was performed. The pooled HRs for survival were 1.510 (95% CI, 1.329-1.717; p<0.001) and 2.053 (95% CI, 1.597-2.638; p<0.001) according to the RCP and CAP criteria, respectively. Positive circumferential resection margin was associated with worse survival in patients with T3 stage disease according to the RCP (HR, 1.381; 95% CI, 1.028-1.584; p=0.001) and CAP (HR, 2.457; 95% CI, 1.902-3.175; p<0.001) criteria, respectively. Positive circumferential resection margin was associated with worse survival in patients receiving neoadjuvant therapy according to the RCP (HR, 1.676; 95% CI, 1.023-2.744; p=0.040) and CAP (HR, 1.847; 95% CI, 1.226-2.78; p=0.003) criteria, respectively. Positive circumferential resection margin is associated with poor prognosis in patients with esophageal cancer, particularly in patients with T3 stage disease and patients receiving neoadjuvant therapy. Copyright © 2014 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.
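
    A sketch of the random-effects pooling behind such HRs, assuming the standard DerSimonian-Laird estimator (the abstract does not specify the exact implementation); the three input studies are invented:

```python
import math

# DerSimonian-Laird random-effects pooling of study-level hazard ratios with
# 95% confidence intervals. The inputs are illustrative, not the meta-analysis
# data; only the pooling arithmetic is shown.

def pool_random_effects(hrs, ci_lows, ci_highs):
    y = [math.log(h) for h in hrs]                   # log hazard ratios
    # Standard error recovered from the width of the 95% CI on the log scale.
    se = [(math.log(hi) - math.log(lo)) / (2 * 1.96)
          for lo, hi in zip(ci_lows, ci_highs)]
    w = [1.0 / s ** 2 for s in se]                   # fixed-effect weights
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, y))   # Cochran's Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)          # between-study variance
    w_re = [1.0 / (s ** 2 + tau2) for s in se]       # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
    se_pooled = math.sqrt(1.0 / sum(w_re))
    return (math.exp(pooled),
            math.exp(pooled - 1.96 * se_pooled),
            math.exp(pooled + 1.96 * se_pooled))

hr, lo, hi = pool_random_effects([1.5, 2.1, 1.3],
                                 [1.2, 1.5, 0.9], [1.9, 2.9, 1.9])
```

    Because the pooled log-HR is a weighted average of the study log-HRs, the point estimate always lies within the range of the input HRs, and the random-effects weights widen the interval when between-study heterogeneity (tau2) is nonzero.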

  15. Low-dimensional Representation of Error Covariance

    NASA Technical Reports Server (NTRS)

    Tippett, Michael K.; Cohn, Stephen E.; Todling, Ricardo; Marchesin, Dan

    2000-01-01

    Ensemble and reduced-rank approaches to prediction and assimilation rely on low-dimensional approximations of the estimation error covariances. Here stability properties of the forecast/analysis cycle for linear, time-independent systems are used to identify factors that cause the steady-state analysis error covariance to admit a low-dimensional representation. A useful measure of forecast/analysis cycle stability is the bound matrix, a function of the dynamics, observation operator and assimilation method. Upper and lower estimates for the steady-state analysis error covariance matrix eigenvalues are derived from the bound matrix. The estimates generalize to time-dependent systems. If much of the steady-state analysis error variance is due to a few dominant modes, the leading eigenvectors of the bound matrix approximate those of the steady-state analysis error covariance matrix. The analytical results are illustrated in two numerical examples where the Kalman filter is carried to steady state. The first example uses the dynamics of a generalized advection equation exhibiting nonmodal transient growth. Failure to observe growing modes leads to increased steady-state analysis error variances. Leading eigenvectors of the steady-state analysis error covariance matrix are well approximated by leading eigenvectors of the bound matrix. The second example uses the dynamics of a damped baroclinic wave model. The leading eigenvectors of a lowest-order approximation of the bound matrix are shown to approximate well the leading eigenvectors of the steady-state analysis error covariance matrix.
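
    How a steady-state analysis-error covariance emerges from the forecast/analysis cycle can be sketched by iterating a toy Kalman filter to convergence; the 2-state system below is an invented stand-in, not the paper's advection or baroclinic model:

```python
import numpy as np

# Iterate the Kalman filter covariance recursion for a linear,
# time-independent system until the analysis-error covariance P reaches
# steady state, then examine its eigenstructure.

A = np.array([[0.95, 0.2],
              [0.0,  0.9]])          # stable dynamics
H = np.array([[1.0, 0.0]])          # observe only the first state
Q = 0.01 * np.eye(2)                # model-error covariance
R = np.array([[0.05]])              # observation-error covariance

P = np.eye(2)                       # initial analysis-error covariance
for _ in range(500):
    Pf = A @ P @ A.T + Q                                 # forecast step
    K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + R)       # Kalman gain
    P = (np.eye(2) - K @ H) @ Pf                         # analysis step

# The steady-state covariance is symmetric positive definite; its leading
# eigenvector identifies the dominant direction of analysis error.
eigvals, eigvecs = np.linalg.eigh((P + P.T) / 2)
```

    In the paper's terms, a steep drop-off in `eigvals` is what makes a low-dimensional representation of the analysis-error covariance effective.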

  16. Error-Analysis for Correctness, Effectiveness, and Composing Procedure.

    ERIC Educational Resources Information Center

    Ewald, Helen Rothschild

    The assumptions underpinning grammatical mistakes can often be detected by looking for patterns of errors in a student's work. Assumptions that negatively influence rhetorical effectiveness can similarly be detected through error analysis. On a smaller scale, error analysis can also reveal assumptions affecting rhetorical choice. Snags in the…

  17. The role of tectonic inheritance in the morphostructural evolution of the Galicia continental margin and adjacent abyssal plains from digital bathymetric model (DBM) analysis (NW Spain)

    NASA Astrophysics Data System (ADS)

    Maestro, A.; Jané, G.; Llave, E.; López-Martínez, J.; Bohoyo, F.; Druet, M.

    2018-06-01

    The identification of recent major tectonic structures in the Galicia continental margin and adjacent abyssal plains was carried out by means of a quantitative analysis of the linear structures having bathymetric expression on the seabed. It was possible to identify about 5800 lineaments throughout the entire study area, of approximately 271,500 km2. Most lineaments are located in the Charcot and Coruña highs, in the western sector of the Galicia Bank, in the area of the Marginal Platforms and in the northern sector of the margin. Analysis of the lineament orientations shows a predominant NE-SW direction and three relative maximum directions: NW-SE, E-W and N-S. The total length of the lineaments identified is over 44,000 km, with a mode around 5000 m and an average length of about 7800 m. In light of different tectonic studies undertaken in the northwestern margin of the Iberian Peninsula, we establish that the lineaments obtained from analysis of the digital bathymetric model of the Galicia continental margin and adjacent abyssal plains would correspond to fracture systems. In general, the orientation of lineaments corresponds to main faults, tectonic structures following the directions of ancient faults that resulted from late stages of the Variscan orogeny and Mesozoic extension phases related to Triassic rifting and Upper Jurassic to Early Cretaceous opening of the North Atlantic Ocean. The N-S convergence between Eurasian and African plates since Palaeogene times until the Miocene, and NW-SE convergence from Neogene to present, reactivated the Variscan and Mesozoic fault systems and related physiography.

  18. Automatic Error Analysis Using Intervals

    ERIC Educational Resources Information Center

    Rothwell, E. J.; Cloud, M. J.

    2012-01-01

    A technique for automatic error analysis using interval mathematics is introduced. A comparison to standard error propagation methods shows that in cases involving complicated formulas, the interval approach gives comparable error estimates with much less effort. Several examples are considered, and numerical errors are computed using the INTLAB…
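
    A minimal sketch of interval-based error propagation in the spirit of the technique described; this toy `Interval` class is hypothetical and far simpler than the INTLAB toolbox named in the abstract:

```python
# Interval arithmetic propagates measurement uncertainty through a formula by
# carrying [lo, hi] bounds instead of point values. Only the operations needed
# for the example below are implemented.

class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        p = [self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi]
        return Interval(min(p), max(p))

    def __truediv__(self, other):
        assert other.lo > 0 or other.hi < 0, "divisor interval contains zero"
        return self * Interval(1.0 / other.hi, 1.0 / other.lo)

    def __repr__(self):
        return f"[{self.lo:.6g}, {self.hi:.6g}]"

# Resistance R = V / I with V = 5.0 +/- 0.1 V and I = 2.0 +/- 0.05 A: the
# result interval encloses every value consistent with the stated errors.
V = Interval(4.9, 5.1)
I = Interval(1.95, 2.05)
R = V / I
```

    The resulting bounds are guaranteed to contain the true value, which is why interval methods give rigorous, if sometimes conservative, error estimates compared with first-order propagation formulas.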

  19. Fractionated stereotactic radiotherapy: a method to evaluate geometric and dosimetric uncertainties using radiochromic films.

    PubMed

    Coscia, Gianluca; Vaccara, Elena; Corvisiero, Roberta; Cavazzani, Paolo; Ruggieri, Filippo Grillo; Taccini, Gianni

    2009-07-01

    In the authors' hospital, stereotactic radiotherapy treatments are performed with a Varian Clinac 600C equipped with a BrainLAB m3 micro-multileaf collimator, generally using the dynamic conformal arc technique. Patient immobilization during the treatment is achieved with a fixation mask supplied by BrainLAB, made with two reinforced thermoplastic sheets fitting the patient's head. With this work the authors propose a method to evaluate treatment geometric accuracy and, consequently, to determine the margin to keep in the CTV-PTV expansion during treatment planning. The reproducibility of the isocenter position was tested by simulating a complete treatment on the anthropomorphic phantom Alderson Rando, inserting between two phantom slices a high-sensitivity Gafchromic EBT film, properly prepared and calibrated, and repeating several treatment sessions, each time removing the fixing mask and replacing the film inside the phantom. The comparison between the dose distributions measured on films and computed by the TPS, after a precise image registration procedure performed by a commercial piece of software (FILMQA, 3cognition LLC (Division of ISP), Wayne, NJ), allowed the authors to measure the repositioning errors, obtaining about 0.5 mm in the case of a central spherical PTV and about 1.5 mm in the case of a peripheral irregular PTV. Moreover, an evaluation of the errors in the registration procedure was performed, giving negligible values with respect to the quantities to be measured. The above intrinsic two-dimensional estimate of treatment accuracy has to be increased for the error in the third dimension, but the 2 mm margin the authors generally use for the CTV-PTV expansion seems adequate anyway. Using the same EBT films, a dosimetric verification of the treatment planning system was done. Measured dose values are larger or smaller than the nominal ones depending on geometric irradiation conditions but, in the authors' experimental conditions, always within 4%.

  20. Saline water in southeastern New Mexico

    USGS Publications Warehouse

    Hiss, W.L.; Peterson, J.B.; Ramsey, T.R.

    1969-01-01

    Saline waters from formations of several geologic ages are being studied in a seven-county area in southeastern New Mexico and western Texas, where more than 30,000 oil and gas tests have been drilled in the past 40 years. This area of 7,500 sq. miles, which is stratigraphically complex, includes the northern and eastern margins of the Delaware Basin between the Guadalupe and Glass Mountains. Chloride-ion concentrations in water produced from rocks of various ages and depths have been mapped in Lea County, New Mexico, using machine map-plotting techniques and trend analyses. Anomalously low chloride concentrations (1,000-3,000 mg/l) were found along the western margin of the Central Basin platform in the San Andres and Capitan Limestone Formations of Permian age. These low chloride-ion concentrations may be due to preferential circulation of ground water through the more porous and permeable rocks. Data being used in the study were obtained principally from oil companies and from related service companies. The P.B.W.D.S. (Permian Basin Well Data System) scout-record magnetic-tape file was used as a framework in all computer operations. Shallow or non-oil-field water analyses acquired from state, municipal, or federal agencies were added to these data utilizing P.B.W.D.S.-compatible reference numbers and decimal latitude-longitude coordinates. Approximately 20,000 water analyses collected from over 65 sources were coded, recorded on punch cards and stored on magnetic tape for computer operations. Extensive manual and computer error checks for duplication and accuracy were made to eliminate data errors resulting from poorly located or identified samples; non-representative or contaminated samples; mistakes in coding, reproducing or key-punching; laboratory errors; and inconsistent reporting. The original 20,000 analyses considered were reduced to 6,000 representative analyses which are being used in the saline water studies. © 1969.
