[Development of Comparison Software for Automatically Generated DVHs in Eclipse TPS].
Xie, Zhao; Luo, Kelin; Zou, Lian; Hu, Jinyou
2016-03-01
This study aimed to automatically calculate the dose-volume histogram (DVH) for a treatment plan and compare it with the requirements of the physician's prescription. The scripting language AutoHotkey and the programming language C# were used to develop comparison software for automatically generated DVHs in Eclipse TPS. The software, named Show Dose Volume Histogram (ShowDVH), comprises prescription-document generation, DVH operation functions, software visualization, and DVH comparison report generation. Ten cases of different cancers were selected; in Eclipse TPS 11.0, ShowDVH not only automatically generated DVH reports but also accurately determined whether treatment plans met the requirements of the physicians' prescriptions, and the reports guided the setting of optimization parameters for intensity-modulated radiation therapy. ShowDVH is user-friendly and powerful software that can automatically generate comparison DVH reports quickly in Eclipse TPS 11.0. It greatly reduces plan design time and improves the working efficiency of radiation therapy physicists.
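The core of such a tool is comparing DVH-derived metrics against prescription constraints. A minimal sketch of that logic (in Python; the metric conventions and constraint-tuple format are my assumptions, not ShowDVH's actual AutoHotkey/C# implementation):

```python
def dvh_metric(doses_gy, volumes_pct, kind, value):
    """Evaluate D<x> (dose covering x% of the volume) or V<d> (% volume
    receiving >= d Gy) from a cumulative DVH given as parallel arrays of
    dose bins and relative volumes."""
    if kind == "D":
        # largest dose bin still covering at least `value` percent of the volume
        best = doses_gy[0]
        for d, v in zip(doses_gy, volumes_pct):
            if v >= value:
                best = d
        return best
    # kind == "V": relative volume at the first bin reaching `value` Gy
    for d, v in zip(doses_gy, volumes_pct):
        if d >= value:
            return v
    return 0.0


def check_prescription(dvh, constraints):
    """dvh: {structure: (doses_gy, volumes_pct)};
    constraints: list of (structure, kind, x, op, limit) tuples."""
    report = []
    for struct, kind, x, op, limit in constraints:
        actual = dvh_metric(*dvh[struct], kind, x)
        ok = actual >= limit if op == ">=" else actual <= limit
        report.append((struct, f"{kind}{x}", actual, ok))
    return report
```

A report generated this way can then be rendered per structure, flagging each failed constraint for the planner.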
NASA Astrophysics Data System (ADS)
Min, Byung Jun; Nam, Heerim; Jeong, Il Sun; Lee, Hyebin
2015-07-01
In recent years, the use of a picture archiving and communication system (PACS) for radiation therapy has become the norm in hospital environments and has been suggested for collecting and managing data using Digital Imaging and Communications in Medicine (DICOM) objects from different treatment planning systems (TPSs). However, some TPSs do not provide the ability to export the dose-volume histogram (DVH) in text or another format. In addition, plan review systems for various TPSs often allow DVH recalculation with different algorithms. These algorithms result in inevitable discrepancies between the values obtained with the recalculation and those obtained with the TPS itself. The purpose of this study was to develop a simple method for generating reproducible DVH values by using the TPSs. Treatment planning information, including structures and delivered dose, was exported in the DICOM format from the Eclipse v8.9 or the Pinnacle v9.6 planning systems. Supersampling and trilinear interpolation were employed to calculate the DVH data from 35 treatment plans. The discrepancies between the DVHs extracted from each TPS and those obtained with the proposed calculation method were evaluated with respect to the supersampling ratio. The volume, minimum dose, maximum dose, and mean dose were compared. The variations in DVHs from multiple TPSs were compared by using the MIM software v6.1, a commercially available treatment plan comparison tool. The overall comparisons of the volume, minimum dose, maximum dose, and mean dose showed that the proposed method produced smaller discrepancies from the TPS than the MIM software did. As the structure volume decreased, the overall percent difference increased. The largest differences were observed in small organs, such as the eyeball, eye lens, and optic nerve, with volumes below 10 cc.
A simple and useful technique was developed to generate a DVH with an acceptable error from a proprietary TPS. This study provides a convenient and common framework that will allow the use of a single well-managed storage solution for an independent information system.
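The supersampling idea described above can be sketched as follows (a simplified Python/NumPy illustration, not the authors' implementation: dose is linearly interpolated at sub-voxel points along each axis, which composes to trilinear interpolation, while the structure mask is replicated; a full version would also re-voxelize the contours on the finer grid):

```python
import numpy as np

def cumulative_dvh(dose, mask, bin_edges_gy):
    """Percent of the masked volume receiving at least each dose edge."""
    d = dose[mask]
    return np.array([100.0 * np.mean(d >= b) for b in bin_edges_gy])

def _upsample_linear(arr, factor, axis):
    """Linear interpolation along one axis onto a factor-times-finer grid."""
    n = arr.shape[axis]
    new = np.linspace(0, n - 1, n * factor)
    return np.apply_along_axis(lambda v: np.interp(new, np.arange(n), v), axis, arr)

def supersampled_dvh(dose, mask, bin_edges_gy, factor=4):
    """DVH on a supersampled grid: trilinear dose interpolation,
    nearest-neighbour replication of the boolean structure mask."""
    fine_dose, fine_mask = dose, mask
    for ax in range(3):
        fine_dose = _upsample_linear(fine_dose, factor, ax)
        fine_mask = np.repeat(fine_mask, factor, axis=ax)
    return cumulative_dvh(fine_dose, fine_mask, bin_edges_gy)
```

Increasing `factor` refines the dose sampling within each voxel, which is what drives the supersampling-ratio dependence studied in the abstract.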
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nakaguchi, Yuji, E-mail: nkgc2003@yahoo.co.jp; Ono, Takeshi; Onitsuka, Ryota
The COMPASS system (IBA Dosimetry, Schwarzenbruck, Germany) and ArcCHECK with 3DVH software (Sun Nuclear Corp., Melbourne, FL) are commercial quasi-three-dimensional (3D) dosimetry arrays. Cross-validation comparing them under the same conditions, such as a common treatment plan, allows for clear evaluation of such measurement devices. In this study, we evaluated the accuracy of reconstructed dose distributions from the COMPASS system and from ArcCHECK with 3DVH software using Monte Carlo (MC) simulation for multi-leaf collimator (MLC) test patterns and clinical VMAT plans. In a phantom study, ArcCHECK 3DVH showed clear differences from COMPASS, measurement, and MC due to its detector resolution and dose reconstruction method. In particular, ArcCHECK 3DVH showed a 7% difference from MC for the heterogeneous phantom. ArcCHECK 3DVH only corrects the 3D dose distribution of the treatment planning system (TPS) using the ArcCHECK measurement, and therefore its accuracy depends on the TPS. In contrast, COMPASS showed good agreement with MC for all cases. However, the COMPASS system requires many complicated installation procedures, such as beam modeling, and appropriate commissioning is needed. For the clinical cases, there were no large differences between the QA devices. The accuracy of the COMPASS and ArcCHECK 3DVH systems for phantoms and clinical cases was compared. Both systems have advantages and disadvantages for clinical use, and consideration of the operating environment is important. The choice of QA system depends on the purpose and workflow of each hospital.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Benhabib, S; Cardan, R; Huang, M
Purpose: To assess dose calculated by the 3DVH software (Sun Nuclear Systems, Melbourne, FL) against TLD measurements and treatment planning system calculations in anthropomorphic phantoms. Methods: The IROC Houston (RPC) head and neck (HN) and lung phantoms were scanned, and plans were generated using Eclipse (Varian Medical Systems, Milpitas, CA) following IROC Houston procedures. For the HN phantom, 6 MV VMAT and 9-field dynamic MLC (DMLC) plans were created. For the lung phantom, 6 MV VMAT and 15 MV 9-field DMLC plans were created. The plans were delivered to the phantoms and to an ArcCHECK (Sun Nuclear Systems, Melbourne, FL). The HN phantom contained 8 TLDs, located in PTV1 (4), PTV2 (2), and the cord OAR (2). The lung phantom contained 4 TLDs: 2 in the PTV, 1 in the cord, and 1 in the heart. Daily outputs were recorded before each measurement for correction. The 3DVH dose reconstruction software was used to project the calculated dose onto the patient anatomy. Results: For the HN phantom, the maximum difference between 3DVH and TLDs was -3.4%, and between 3DVH and Eclipse it was 1.2%. For the lung plan, the maximum difference between 3DVH and TLDs was 4.3%, except for the spinal cord, for which 3DVH overestimated the TLD dose by 12%. The maximum difference between 3DVH and Eclipse was 0.3%. 3DVH agreed well with Eclipse because the dose reconstruction algorithm uses the diode measurements to perturb the dose calculated by the treatment planning system; therefore, if there is a problem in the modeling or heterogeneity correction, it will be carried through to 3DVH. Conclusion: 3DVH agreed well with Eclipse and TLD measurements. Comparison of 3DVH with film measurements is ongoing. Work supported by PHS grants CA10953 and CA81647 (NCI, DHHS).
Initial experience of ArcCHECK and 3DVH software for RapidArc treatment plan verification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Infusino, Erminia; Mameli, Alessandra, E-mail: e.infusino@unicampus.it; Conti, Roberto
2014-10-01
The purpose of this study was to perform delivery quality assurance with the ArcCHECK and 3DVH system (Sun Nuclear, FL) and to evaluate the suitability of this system for volumetric-modulated arc therapy (VMAT) (RapidArc [RA]) verification. The software calculates the delivered dose distribution in the patient by perturbing the calculated dose using errors detected in fluence or planar dose measurements. The device was tested to correlate the gamma passing rate (%GP) with the composite dose predicted by the 3DVH software. A total of 28 patients with prostate cancer who were treated with RA were analyzed. RA treatments were delivered to a diode array phantom (ArcCHECK), which was used to create a planned dose perturbation (PDP) file. The 3DVH analysis used the dose differences derived from comparing the measured dose with the treatment planning system (TPS)-calculated dose to perturb the initial TPS-calculated dose. The 3DVH then overlays the resultant dose on the patient's structures using the resultant "PDP" beams. Measured dose distributions were compared with the calculated ones using the gamma index (GI) method, applying global (Van Dyk) normalization and 3%/3 mm acceptance criteria. Paired-difference tests were used to estimate the statistical significance of the differences between the composite dose calculated using 3DVH and %GP. Statistical correlation was also analyzed by means of logistic regression. Dose-volume histogram (DVH) analysis of the patient plans revealed small differences between the treatment plan calculations and the 3DVH results for organs at risk (OARs), whereas the planning target volume (PTV) dose of the measured plan was systematically higher than that predicted by the TPS. The t-test results between the planned and the estimated DVH values showed that the mean values differed significantly (p < 0.05).
The quality assurance (QA) gamma analysis at 3%/3 mm showed that in all cases there were only weak-to-moderate correlations (Pearson r: 0.12 to 0.74). Moreover, clinically relevant differences increased with increasing QA passing rate, indicating that some of the largest dose differences occurred in cases with high QA passing rates, which may be called "false negatives." The clinical importance of any disagreement between the measured and the calculated dose is often difficult to interpret; however, beam errors (either in delivery or in the TPS calculation) can affect the accuracy of the patient dose. Further research is needed to determine the ability of a PDP-type algorithm to accurately estimate the patient dose effect.
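For reference, the global-normalized gamma evaluation used above can be sketched in one dimension (a schematic Python illustration of the Low et al. gamma index with Van Dyk normalization; real QA software evaluates it in 2D/3D with sub-grid interpolation):

```python
import numpy as np

def gamma_pass_rate(ref, meas, spacing_mm, dd_pct=3.0, dta_mm=3.0):
    """Passing rate (%) of the 1D global-normalised gamma index.
    ref/meas: dose profiles on the same grid; spacing_mm: grid spacing."""
    dmax = ref.max()                       # global (Van Dyk) normalisation dose
    x = np.arange(len(ref)) * spacing_mm   # spatial positions of the samples
    passed = 0
    for i, dm in enumerate(meas):
        # squared distance-to-agreement and dose-difference terms vs. every ref point
        dist2 = ((x - x[i]) / dta_mm) ** 2
        dose2 = ((ref - dm) / (dd_pct / 100.0 * dmax)) ** 2
        gamma = np.sqrt((dist2 + dose2).min())
        passed += gamma <= 1.0
    return 100.0 * passed / len(meas)
```

As the abstract notes, a high passing rate from this test does not guarantee small DVH differences, which is exactly the "false negative" behavior under study.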
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nelms, Benjamin; Stambaugh, Cassandra; Hunt, Dylan
2015-08-15
Purpose: The authors designed data, methods, and metrics that can serve as a standard, independent of any software package, to evaluate dose-volume histogram (DVH) calculation accuracy and detect limitations. The authors use simple geometrical objects at different orientations combined with dose grids of varying spatial resolution carrying linear 1D dose gradients; when combined, ground-truth DVH curves can be calculated analytically in closed form to serve as absolute standards. Methods: DICOM RT structure sets containing a small sphere, cylinder, and cone were created programmatically with axial plane spacing varying from 0.2 to 3 mm. Cylinders and cones were modeled in two different orientations with respect to the IEC 1217 Y axis. The contours were designed to stringently but methodically test the voxelation methods required for DVH calculation. Synthetic RT dose files were generated with a 1D linear dose gradient and with grid resolution varying from 0.4 to 3 mm. Two commercial DVH algorithms—PINNACLE (Philips Radiation Oncology Systems) and PlanIQ (Sun Nuclear Corp.)—were tested against the analytical values using custom, noncommercial analysis software. In Test 1, axial contour spacing was held constant at 0.2 mm while dose grid resolution varied. In Tests 2 and 3, the dose grid resolution was matched to varying subsampled axial contours with spacing of 1, 2, and 3 mm, and difference analyses and metrics were employed: (1) histograms of the accuracy of various DVH parameters (total volume, Dmax, Dmin, and doses to % volume: D99, D95, D5, D1, D0.03 cm³) and (2) volume errors extracted along the DVH curves, summarized in tabular and graphical forms. Results: In Test 1, PINNACLE produced 52 deviations (15%) while PlanIQ produced 5 (1.5%). In Test 2, PINNACLE and PlanIQ differed from the analytical values by >3% in 93 (36%) and 18 (7%) instances, respectively.
Excluding Dmin and Dmax as least clinically relevant would result in 32 (15%) vs 5 (2%) scored deviations for PINNACLE vs PlanIQ in Test 1, while Test 2 would yield 53 (25%) vs 17 (8%). In Test 3, statistical analyses of volume errors extracted continuously along the curves show PINNACLE to have more errors and higher variability (relative to PlanIQ), primarily due to PINNACLE's lack of sufficient 3D grid supersampling. Another major driver of PINNACLE errors is an inconsistent implementation of "end-capping": the additional volume resulting from expanding the superior and inferior contours halfway to the next slice is included in the total volume calculation, but dose voxels in this expanded volume are excluded from the DVH. PlanIQ had fewer deviations, and most were associated with a rotated cylinder modeled by rectangular axial contours; for coarser axial spacing, the limited number of cross-sectional rectangles hinders the ability to render the true structure volume. Conclusions: The method is applicable to any DVH-calculating software capable of importing DICOM RT structure set and dose objects (the authors' examples are available for download). It includes a collection of tests that probe the design of the DVH algorithm, measure its accuracy, and identify failure modes. The merits and applicability of each test are discussed.
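As an illustration of the closed-form ground truth such objects permit, the sphere case can be written down directly (my own derivation, not the authors' code: for a sphere in a 1D linear dose gradient, each iso-dose plane cuts off a spherical cap whose volume is an elementary integral):

```python
import numpy as np

def sphere_volume_dose_ge(d, R=10.0, d0=50.0, grad=1.0):
    """Volume of a radius-R sphere centred at z = 0 receiving dose >= d
    under the 1D linear gradient D(z) = d0 + grad*z (grad > 0).
    The region {D >= d} is the part of the sphere above the plane z = (d - d0)/grad."""
    t = np.clip((d - d0) / grad, -R, R)
    # volume of the sphere below the plane z = t:  integral of pi*(R^2 - z^2) dz
    v_below = np.pi * (R**2 * t - t**3 / 3 + 2 * R**3 / 3)
    return 4.0 / 3.0 * np.pi * R**3 - v_below
```

Evaluating this at each dose level yields the exact cumulative DVH against which a voxel-based algorithm's output can be scored, which is the essence of the benchmark described above.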
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ma, T; Kumaraswamy, L
Purpose: Detection of treatment delivery errors is important in radiation therapy; however, accurate quantification of delivery errors is of equal importance. This study aims to evaluate the 3DVH software's ability to accurately quantify delivery errors. Methods: Three VMAT plans (prostate, H&N, and brain) were randomly chosen for this study. First, we evaluated whether delivery errors could be detected by gamma evaluation. Conventional per-beam IMRT QA was performed with the ArcCHECK diode detector for the original plans and for the following modified plans: (1) induced dose difference errors of up to ±4.0%, (2) control point (CP) deletion (3 to 10 CPs were deleted), and (3) gantry angle shift errors (3 degrees, uniformly shifted). 2D and 3D gamma evaluations were performed for all plans through SNC Patient and 3DVH, respectively. Subsequently, we investigated the accuracy of the 3DVH analysis for all cases. This part evaluated, using the Eclipse TPS plans as the standard, whether 3DVH can accurately model the changes in clinically relevant metrics caused by the delivery errors. Results: 2D evaluation appeared to be more sensitive to delivery errors. The average differences between Eclipse-predicted and 3DVH results for each pair of specific DVH constraints were within 2% for all three types of error-induced treatment plans, illustrating that 3DVH is fairly accurate in quantifying the delivery errors. Another interesting observation was that even though the gamma pass rates for the error plans were high, the DVHs showed significant differences between the original plan and the error-induced plans in both the Eclipse and 3DVH analyses. Conclusion: The 3DVH software is shown to accurately quantify errors in the delivered dose based on clinically relevant DVH metrics, which a conventional gamma-based pre-treatment QA might not necessarily detect.
Poster - Thur Eve - 54: A software solution for ongoing DVH quality assurance in radiation therapy.
Annis, S-L; Zeng, G; Wu, X; Macpherson, M
2012-07-01
A program has been developed in MATLAB for use in quality assurance of radiation therapy treatment planning. It analyzes patient DVH files and compiles dose-volume data for review, trending, comparison, and analysis. Patient DVH files are exported from the Eclipse treatment planning system and saved according to treatment site and date. Currently, analysis is available for four treatment sites (prostate, prostate bed, lung, and upper GI), with two functions for data reporting and analysis: patient-specific and organ-specific. The patient-specific function loads one patient's DVH file and reports the user-specified dose-volume data for organs and targets. These data can be compiled to an external file for third-party analysis. The organ-specific function extracts a requested dose-volume value for an organ from the DVH files of a patient group and reports the statistics over this population. A graphical user interface is used to select clinical sites, functions, and structures, and to input the user's requests. We have implemented this program in planning quality assurance at our center. The program has tracked the dosimetric improvement in GU sites after VMAT was implemented clinically. It has generated dose-volume statistics for different groups of patients grouped by technique or time range. This program allows reporting and statistical analysis of DVH files. It is an efficient tool for planning quality control in radiation therapy. © 2012 American Association of Physicists in Medicine.
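The organ-specific population statistics described above amount to extracting one metric per patient and summarizing it over the group. A minimal sketch (in Python rather than the authors' MATLAB; the data layout and function names are assumptions):

```python
import statistics

def organ_statistic(patient_dvhs, organ, metric_fn):
    """Summarise one DVH-derived metric for `organ` across a patient group.
    patient_dvhs: {patient_id: {organ: dvh_data}}; metric_fn maps dvh_data -> float.
    Patients lacking the organ are skipped."""
    values = [metric_fn(dvh[organ]) for dvh in patient_dvhs.values() if organ in dvh]
    return {
        "n": len(values),
        "mean": statistics.fmean(values),
        "stdev": statistics.stdev(values) if len(values) > 1 else 0.0,
        "min": min(values),
        "max": max(values),
    }
```

Running such a summary before and after a technique change (e.g., the VMAT rollout mentioned above) is one way to track dosimetric trends over time.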
SU-F-T-285: Evaluation of a Patient DVH-Based IMRT QA System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhen, H; Redler, G; Chu, J
2016-06-15
Purpose: To evaluate the clinical performance of a patient DVH-based QA system for prostate VMAT QA. Methods: Mobius3D (M3D) is QA software with an independent beam model and dose engine. The MobiusFX (MFX) add-on predicts patient dose using treatment machine log files. We commissioned the Mobius beam model in two steps. First, the stock beam model was customized using machine commissioning data and then verified against the TPS with 12 simple phantom plans and 7 clinical 3D plans. Second, the dosimetric leaf gap (DLG) in the Mobius model was fine-tuned for VMAT treatment based on ion chamber measurements for 6 clinical VMAT plans. Upon successful commissioning, we retrospectively performed IMRT QA for 12 VMAT plans with the Mobius system as well as the ArcCHECK-3DVH system. Selected patient DVH values (PTV D95 and D50; bladder D2cc and Dmean; rectum D2cc) were compared between TPS, M3D, MFX, and 3DVH. Results: During the first commissioning step, TPS- and M3D-calculated target Dmean for the 3D plans agreed within 0.7%±0.7%, with 3D gamma passing rates of 98%±2%. In the second commissioning step, the Mobius DLG was adjusted by 1.2 mm from the stock value, reducing the average difference between the MFX calculation and the ion chamber measurement from 3.2% to 0.1%. In the retrospective prostate VMAT QA, 5 of 60 MFX-calculated DVH values deviated by more than 5% from the TPS. One large deviation at a high dose level was identified as a potential QA failure. This echoes the 3DVH QA result, which identified 2 instances of large DVH deviation for the same structure. For all DVHs evaluated, M3D and MFX showed a high level of agreement (0.1%±0.2%), indicating that the observed deviations likely arise from beam modeling differences rather than delivery errors. Conclusion: The Mobius system provides a viable solution for DVH-based VMAT QA, with the capability of separating TPS and delivery errors.
A DICOM based radiotherapy plan database for research collaboration and reporting
NASA Astrophysics Data System (ADS)
Westberg, J.; Krogh, S.; Brink, C.; Vogelius, I. R.
2014-03-01
Purpose: To create a central radiotherapy (RT) plan database for dose analysis and reporting, capable of calculating and presenting statistics on user-defined patient groups. The goal is to facilitate multi-center research studies with easy and secure access to RT plans and statistics on protocol compliance. Methods: RT institutions are able to send data to the central database using DICOM communications on a secure computer network. The central system is composed of a number of DICOM servers, an SQL database, and in-house developed software services that process the incoming data. A web site within the secure network allows users to manage their submitted data. Results: The RT plan database has been developed in Microsoft .NET, and users are able to send DICOM data between RT centers in Denmark. Dose-volume histogram (DVH) calculations performed by the system are comparable to those of conventional RT software. A permission system was implemented to ensure access control and easy, yet secure, data sharing across centers. The reports contain DVH statistics for structures in user-defined patient groups. The system currently contains over 2200 patients in 14 collaborations. Conclusions: A central RT plan repository for use in multi-center trials and quality assurance was created. The system provides an attractive alternative to dummy runs by enabling continuous monitoring of protocol conformity and plan metrics in a trial.
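The group-level DVH statistics such a reporting system serves might be assembled as follows (a hypothetical Python sketch; the JSON layout is an assumption, not the system's actual schema):

```python
import json
import statistics

def dvh_group_report(dvh_curves, dose_bins_gy):
    """Per-bin quartile summary of relative-volume DVH curves (one curve per
    patient, all sampled on the same dose bins), serialised to JSON for
    cross-centre reporting. Requires at least two patients per bin."""
    report = []
    for i, d in enumerate(dose_bins_gy):
        vols = sorted(curve[i] for curve in dvh_curves)
        q1, med, q3 = statistics.quantiles(vols, n=4)
        report.append({"dose_gy": d, "q1": q1, "median": med, "q3": q3})
    return json.dumps(report)
```

A protocol-compliance check can then compare each new plan's DVH against the group quartiles at the protocol's constraint doses.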
DOE Office of Scientific and Technical Information (OSTI.GOV)
Patwe, P; Mhatre, V; Dandekar, P
Purpose: The 3DVH software is a patient-specific quality assurance tool that estimates the 3D dose to the patient geometry with the help of the Planned Dose Perturbation algorithm. The purpose of this study was to evaluate the impact of the HU value of the ArcCHECK phantom entered in the Eclipse TPS on the 3D dose and DVH QA analysis. Methods: The manufacturer of the ArcCHECK phantom provides a CT data set of the phantom and recommends treating it as a homogeneous phantom with an electron density (1.19 g/cc, or 282 HU) close to PMMA. We performed this study with the Eclipse TPS (v13, Varian Medical Systems), a TrueBeam STx linac (Varian), and an ArcCHECK phantom (Sun Nuclear Corp.). Plans were generated for a 6 MV photon beam with a 20 cm × 20 cm field size at the isocenter and a source-to-phantom distance (SPD) of 86.7 cm, delivering 100 cGy at the isocenter. The 3DVH software requires the patient's DICOM data generated by the TPS and the plan delivered on the ArcCHECK phantom. Plans were generated in the TPS by assigning different HU values to the phantom. We analyzed the gamma index and the dose profile for all plans along the vertically downward direction of the beam's central axis for the entry, exit, and isocenter doses. Results: The global gamma passing rate (2%/2 mm) for the manufacturer-recommended HU value of 282 was 96.3%; the detector entry, isocenter, and detector exit doses were 1.9048 (1.9270), 1.00 (1.0199), and 0.5078 (0.527) Gy for the TPS (measured), respectively. The global gamma passing rate for an electron density of 1.1302 g/cc was 98.6%; the detector entry, isocenter, and detector exit doses were 1.8714 (1.8873), 1.00 (0.9988), and 0.5211 (0.516) Gy for the TPS (measured), respectively. Conclusion: The electron density value assigned by the manufacturer does not hold true for every user. Proper modeling of the electron density of the ArcCHECK in the TPS is essential to avoid systematic error in the dose calculation of patient-specific QA.
Assessment of PlanIQ Feasibility DVH for head and neck treatment planning.
Fried, David V; Chera, Bhishamjit S; Das, Shiva K
2017-09-01
Designing a radiation plan that optimally delivers both target coverage and normal tissue sparing is challenging. There are limited tools to determine what is dosimetrically achievable, and frequently the experience of the planner/physician is relied upon to make these determinations. The PlanIQ software provides a tool that uses target and organ-at-risk (OAR) geometry to indicate the difficulty of achieving different points on organ dose-volume histograms (DVHs). We hypothesized that the PlanIQ Feasibility DVH may aid planners in reducing dose to OARs. Clinically delivered head and neck treatments (clinical plans) were re-planned (re-plans) with high emphasis on maximally sparing the contralateral parotid gland, contralateral submandibular gland, and larynx while maintaining routine clinical dosimetric objectives. The planner was blinded to the results of the clinically delivered plans as well as to the Feasibility DVHs from PlanIQ. The re-plan treatments were designed using 3-arc VMAT in RayStation (RaySearch Laboratories, Sweden). The planner was then given the results from the PlanIQ Feasibility DVH analysis and developed an additional plan incorporating this information using 4-arc VMAT (IQ plan). The DVHs across the three treatment plans were compared with what was deemed "impossible" by PlanIQ's Feasibility DVH (impossible DVH). The impossible DVH is defined as the DVH generated using the minimal dose that any voxel outside the targets must receive given 100% target coverage. The re-plans, performed blinded to the PlanIQ Feasibility DVH, achieved superior sparing of the aforementioned OARs compared with the clinically delivered plans, with discrepancies from the impossible DVHs averaging 200-700 cGy. Using the PlanIQ Feasibility DVH led to additional OAR sparing compared with both the re-plans and the clinical plans and reduced the discrepancies from the impossible DVHs to an average of approximately 100 cGy.
The dose reductions from the clinical plan to the re-plan and from the re-plan to the IQ plan were significantly different, even when accounting for multiple hypothesis testing, for both the contralateral parotid and the larynx (P < 0.004 for all comparisons). No significant differences were observed between the three plans for the contralateral submandibular gland when considering multiple hypothesis testing. Clinical treatment plans and blinded re-plans were found to suboptimally spare OARs. PlanIQ could aid planners in generating treatment plans that push the limits of OAR sparing while maintaining routine clinical target coverage goals. © 2017 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of the American Association of Physicists in Medicine.
WE-AB-207B-01: Dose Tolerance for SBRT/SABR
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grimm, J
Purpose: Stereotactic body radiation therapy (SBRT) / stereotactic ablative body radiotherapy (SABR) is gaining popularity, but quantitative dose tolerance data have been lacking. To improve this, the April 2016 issue of Seminars in Radiation Oncology will present normal tissue complication probability (NTCP) models for 10 critical structures: optic pathway, cochlea, oral mucosa, esophagus, chest wall, aorta, bronchi, duodenum, small bowel, and spinal cord. Methods: The project included more than 1500 treatments in 1-5 fractions using CyberKnife, Gamma Knife, or LINAC, with 60 authors from 15 institutions. NTCP models were constructed from the 97 grade 2-3 complications, predominantly scored using the Common Terminology Criteria for Adverse Events (CTCAE v4). Dose-volume histogram (DVH) data from each institutional dataset were loaded into the DVH Evaluator software (DiversiLabs, LLC, Huntingdon Valley, PA) for modeling. The current state of the literature for the critical structures was depicted using DVH Risk Maps: comparative graphs of dose tolerance limits that can include estimated risk levels, reported complications, DVH data for study patients, and high- and low-risk dose tolerance limits. Results: For relatively acceptable toxicities such as grade 1-3 rib fracture and chest wall pain, the high-risk limits carry 50% risk and the low-risk limits 5% risk. Emami et al (IJROBP 1991 May 15;21(1):109-22) used 50% and 5% risk levels for all structures, whereas this effort used clinically acceptable ranges for each: in structures such as the aorta or spinal cord, where complications must be avoided, the high- and low-risk limits in this issue of Seminars carry about 3% and 1% risk, respectively. These statistically based guidelines can help ensure plan quality for each patient. Conclusion: NTCP models for SBRT are now becoming available.
Hypofractionated dose tolerance can differ dramatically from extrapolations of conventional fractionation, so NTCP analysis of the SBRT/SABR data is important to ensure safe clinical practice. Dr. Grimm designed and holds intellectual property rights to the DVH Evaluator software tool, an FDA-cleared product in commercial use, which was used to analyze the data.
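NTCP models of the kind referred to above are often of the Lyman-Kutcher-Burman form, which maps a DVH to a complication probability via the generalized equivalent uniform dose. A generic sketch (with illustrative parameter values, not the fitted limits from the Seminars issue):

```python
import math

def lkb_ntcp(bin_doses_gy, bin_frac_vols, n=0.1, m=0.15, td50_gy=30.0):
    """Lyman-Kutcher-Burman NTCP from a differential DVH given as dose bins
    and the fraction of the organ volume in each bin. n: volume-effect
    parameter; m: slope parameter; td50_gy: uniform dose giving 50% risk."""
    # generalised equivalent uniform dose (gEUD)
    geud = sum(v * d ** (1.0 / n) for d, v in zip(bin_doses_gy, bin_frac_vols)) ** n
    t = (geud - td50_gy) / (m * td50_gy)
    # probit link: standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))
```

By construction, a uniform dose equal to `td50_gy` yields 50% risk, and small `n` makes the model serial (dominated by the hottest bins), as is typical for structures like the spinal cord.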
NASA Astrophysics Data System (ADS)
Pyakuryal, Anil
2009-05-01
Studies have shown that as many as 8 out of 10 men have prostate cancer by age 80. Prostate cancer begins with small changes (prostatic intraepithelial neoplasia (PIN)) in the size and shape of prostate gland cells and is known as prostate adenocarcinoma. With advances in technology, prostate cancer has become the most widely used application of IMRT, with the longest follow-up periods. Prostate cancer fits the ideal target criteria for IMRT, with adjacent sensitive dose-limiting tissues (rectum, bladder). A retrospective study was performed on 10 prostate cancer patients treated with radiation to a limited pelvic field with a standard 4-field arrangement at a dose of 45 Gy, plus an IMRT boost field to a total isocenter dose of 75 Gy. Plans were simulated for the 4-field and the supplementary IMRT treatments with a proposed dose delivery of 1.5 Gy/fraction on a BID basis. Automated DVH analysis software, HART (S. Jang et al., 2008, Med Phys 35, p. 2812), was used to perform DVH assessments of the IMRT plans. A statistical analysis of dose coverage for targets in the prostate gland and neighboring critical organs, and evaluations of the plan indices (homogeneity, conformality, etc.), were also performed using HART-extracted DVH statistics. The analyzed results showed good correlation with the predicted outcomes (TCP, NTCP) of the treatments.
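The plan indices mentioned (homogeneity, conformality) have standard DVH-based definitions. A sketch using two common choices (the ICRU-83 homogeneity index and the Paddick conformity index; HART's exact definitions may differ):

```python
def homogeneity_index(d2_gy, d98_gy, d50_gy):
    """ICRU-83 homogeneity index: (D2% - D98%) / D50%.
    Lower values indicate a more homogeneous target dose; 0 is ideal."""
    return (d2_gy - d98_gy) / d50_gy

def conformity_index(target_vol_cc, target_vol_in_piv_cc, piv_cc):
    """Paddick conformity index: (TV_PIV)^2 / (TV * PIV), where PIV is the
    prescription isodose volume and TV_PIV its overlap with the target.
    1.0 is perfect conformity; it penalises both under-coverage and spill."""
    return target_vol_in_piv_cc ** 2 / (target_vol_cc * piv_cc)
```

Both indices are computed directly from a handful of DVH points, so they drop out of the same extracted statistics the abstract describes.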
NASA Astrophysics Data System (ADS)
Maggio, Angelo; Carillo, Viviana; Cozzarini, Cesare; Perna, Lucia; Rancati, Tiziana; Valdagni, Riccardo; Gabriele, Pietro; Fiorino, Claudio
2013-04-01
The aim of this study was to evaluate the correlation between the 'true' absolute and relative dose-volume histograms (DVHs) of the bladder wall, the dose-wall histogram (DWH) defined on MRI imaging, and other surrogates of bladder dosimetry in prostate cancer patients planned with both 3D-conformal and intensity-modulated radiation therapy (IMRT) techniques. For 17 prostate cancer patients, previously treated with radical intent, CT and MRI scans were acquired and matched. The contours of the bladder walls were drawn using the MRI images. The external bladder surfaces were then used to generate artificial bladder walls by performing automatic contractions of 5, 7, and 10 mm. For each patient, a 3D conformal radiotherapy (3DCRT) and an IMRT treatment plan were generated with a prescription dose of 77.4 Gy (1.8 Gy/fr), and the DVHs of the whole bladder and of the artificial walls (DVH-5/7/10) and dose-surface histograms (DSHs) were calculated and compared against the DWH in absolute and relative terms for both treatment planning techniques. Dedicated software (VODCA v4.4.0, MSS Inc.) was used to calculate the dose-volume/surface histograms. Correlation was quantified for selected dose-volume/surface parameters by the Spearman correlation coefficient. The agreement between the %DWH and DVH-5, DVH-7, and DVH-10 was found to be very good (maximum average deviations below 2%, SD < 5%), with DVH-5 showing the best agreement. The correlation was slightly better for absolute (R = 0.80-0.94) than for relative (R = 0.66-0.92) histograms. The DSH was also found to be highly correlated with the DWH, although slightly larger deviations were generally found. The DVH was not a good surrogate of the DWH (R < 0.7 for most parameters). When comparing the two treatment techniques, more pronounced differences between relative histograms were seen for IMRT than for 3DCRT (p < 0.0001).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, Z; Feng, Y; Lo, S
2015-06-15
Purpose: The dose-volume histogram (DVH) is generally accepted as a tool for treatment plan evaluation; however, spatial information is lacking in the DVH. As a supplement to the DVH in three-dimensional treatment planning, the differential DVH (DDVH) provides the spatial variation and the size and magnitude of the different dose regions within a region of interest, which can be incorporated into a tumor control probability model. This study aimed to provide a method for evaluating and improving Gamma Knife treatment planning. Methods: 10 patients with brain metastases from different primary tumors, including melanoma (#1, #4, #5, #10), breast cancer (#2), prostate cancer (#3), and lung cancer (#6-9), were analyzed. Using the Leksell GammaPlan software, two plans were prepared for each patient. Special attention was given to the DDVHs, which differed between plans and were used for comparison. The dose distribution inside the target and the tumor control probability (TCP) based on the DDVH were calculated, with cell density and radiobiological parameters adopted from the literature. The plans were compared based on DVH, DDVH, and TCP. Results: Using the DVH, the coverage and selectivity were the same between plans for all 10 patients. The DDVHs differed between the two plans for each patient. The paired t-test showed no significant difference in TCP between the two plans. For brain metastases from melanoma (#1, #4-5), breast cancer (#2), and lung cancer (#6-8), the difference in TCP was less than 5%, but it was about 6.5% for patient #3 (metastasis from prostate cancer) and 10.1% and 178.7% for two patients (#9-10) with metastases from lung cancer. Conclusion: Although the DVH provides average dose-volume information, the DDVH provides differential dose-volume information with respect to different regions inside the tumor.
TCP provides radiobiological information and adds additional information for improving treatment planning as well as adaptive radiotherapy. Further clinical validation is necessary.
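As a rough illustration of how a differential DVH feeds a Poisson TCP model, a minimal sketch with a linear-quadratic survival term follows. All parameter values (alpha, beta, clonogen density) and the example dose bins are assumptions for illustration only, not values from the study.

```python
# Minimal sketch of a Poisson tumour-control-probability (TCP) estimate driven
# by a differential DVH. Parameters are illustrative placeholders.
import math

alpha = 0.35      # Gy^-1 (assumed LQ parameter)
beta = 0.035      # Gy^-2 (assumed LQ parameter)
n_fractions = 1   # single-fraction delivery, as in Gamma Knife
rho = 1.0e7       # clonogen density, cells/cm^3 (assumed)

# Differential DVH: (dose in Gy, absolute volume in cm^3) per dose bin.
ddvh = [(16.0, 0.5), (18.0, 1.0), (20.0, 0.8), (22.0, 0.2)]

def tcp(ddvh):
    """Poisson model: TCP = exp(-expected surviving clonogens), with LQ
    survival for total bin dose d delivered in n_fractions fractions."""
    surviving = 0.0
    for d, v in ddvh:
        sf = math.exp(-alpha * d - beta * d * d / n_fractions)
        surviving += rho * v * sf
    return math.exp(-surviving)

print(f"TCP = {tcp(ddvh):.3f}")
```

Comparing two plans then amounts to evaluating this sum on each plan's DDVH with the same radiobiological parameters.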
Mayo, Charles S; Yao, John; Eisbruch, Avraham; Balter, James M; Litzenberg, Dale W; Matuszak, Martha M; Kessler, Marc L; Weyburn, Grant; Anderson, Carlos J; Owen, Dawn; Jackson, William C; Haken, Randall Ten
2017-01-01
To develop statistical dose-volume histogram (DVH)-based metrics and a visualization method to quantify the comparison of treatment plans with historical experience and among different institutions. The descriptive statistical summary (ie, median, first and third quartiles, and 95% confidence intervals) of volume-normalized DVH curve sets of past experiences was visualized through the creation of statistical DVH plots. Detailed distribution parameters were calculated and stored in JavaScript Object Notation files to facilitate management, including transfer and potential multi-institutional comparisons. In the treatment plan evaluation, structure DVH curves were scored against computed statistical DVHs and weighted experience scores (WESs). Individual, clinically used, DVH-based metrics were integrated into a generalized evaluation metric (GEM) as a priority-weighted sum of normalized incomplete gamma functions. Historical treatment plans for 351 patients with head and neck cancer, 104 with prostate cancer who were treated with conventional fractionation, and 94 with liver cancer who were treated with stereotactic body radiation therapy were analyzed to demonstrate the usage of statistical DVH, WES, and GEM in a plan evaluation. A shareable dashboard plugin was created to display statistical DVHs and integrate GEM and WES scores into a clinical plan evaluation within the treatment planning system. Benchmarking with normal tissue complication probability scores was carried out to compare the behavior of GEM and WES scores. DVH curves from historical treatment plans were characterized and presented, with difficult-to-spare structures (ie, frequently compromised organs at risk) identified. Quantitative evaluations by GEM and/or WES compared favorably with the normal tissue complication probability Lyman-Kutcher-Burman model, transforming a set of discrete threshold-priority limits into a continuous model reflecting physician objectives and historical experience. 
Statistical DVH offers an easy-to-read, detailed, and comprehensive way to visualize the quantitative comparison with historical experiences and among institutions. WES and GEM metrics offer a flexible means of incorporating discrete threshold-prioritizations and historic context into a set of standardized scoring metrics. Together, they provide a practical approach for incorporating big data into clinical practice for treatment plan evaluations.
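The core of the statistical DVH described above is a per-dose-bin summary of a population of volume-normalized curves. A minimal sketch with three synthetic curves standing in for historical plans:

```python
# Sketch of the "statistical DVH" idea: summarize a population of
# volume-normalized DVH curves by median and quartiles at each dose level.
# The three example curves are synthetic, not clinical data.
import statistics

dose_axis = [0, 20, 40, 60, 80]  # Gy

# Fractional volume receiving >= dose, one list per historical plan (synthetic).
plans = [
    [1.00, 0.95, 0.70, 0.30, 0.02],
    [1.00, 0.90, 0.60, 0.25, 0.01],
    [1.00, 0.97, 0.75, 0.40, 0.05],
]

def statistical_dvh(plans):
    """Per dose bin: first quartile, median and third quartile across plans."""
    summary = []
    for values in zip(*plans):
        q1, med, q3 = statistics.quantiles(values, n=4)
        summary.append({"q1": q1, "median": med, "q3": q3})
    return summary

for d, s in zip(dose_axis, statistical_dvh(plans)):
    print(f"{d:2d} Gy: median={s['median']:.2f} IQR=[{s['q1']:.2f}, {s['q3']:.2f}]")
```

In practice these summaries would be serialized (e.g. to JSON, as the study does) so a new plan's curve can be scored against the historical bands.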
"SABER": A new software tool for radiotherapy treatment plan evaluation.
Zhao, Bo; Joiner, Michael C; Orton, Colin G; Burmeister, Jay
2010-11-01
Both spatial and biological information are necessary in order to perform true optimization of a treatment plan and for predicting clinical outcome. The goal of this work is to develop an enhanced treatment plan evaluation tool which incorporates biological parameters and retains spatial dose information. A software system is developed which provides biological plan evaluation with a novel combination of features. It incorporates hyper-radiosensitivity using the induced-repair model and applies the new concept of dose convolution filter (DCF) to simulate dose wash-out effects due to cell migration, bystander effect, and/or tissue motion during treatment. Further, the concept of spatial DVH (sDVH) is introduced to evaluate and potentially optimize the spatial dose distribution in the target volume. Finally, generalized equivalent uniform dose is derived from both the physical dose distribution (gEUD) and the distribution of equivalent dose in 2 Gy fractions (gEUD2) and the software provides three separate models for calculation of tumor control probability (TCP), normal tissue complication probability (NTCP), and probability of uncomplicated tumor control (P+). TCP, NTCP, and P+ are provided as a function of prescribed dose and multivariable TCP, NTCP, and P+ plots are provided to illustrate the dependence on individual parameters used to calculate these quantities. Ten plans from two clinical treatment sites are selected to test the three calculation models provided by this software. By retaining both spatial and biological information about the dose distribution, the software is able to distinguish features of radiotherapy treatment plans not discernible using commercial systems. Plans that have similar DVHs may have different spatial and biological characteristics and the application of novel tools such as sDVH and DCF within the software may substantially change the apparent plan quality or predicted plan metrics such as TCP and NTCP. 
For the cases examined, both the calculation method and the application of DCF can change the ranking order of competing plans. The voxel-by-voxel TCP model makes it feasible to incorporate spatial variations of clonogen densities (n), radiosensitivities (SF2), and fractionation sensitivities (alpha/beta) as those data become available. The new software incorporates both spatial and biological information into the treatment planning process. The application of multiple methods for the incorporation of biological and spatial information has demonstrated that the order of application of biological models can change the order of plan ranking. Thus, the results of plan evaluation and optimization are dependent not only on the models used but also on the order in which they are applied. This software can help the planner choose more biologically optimal treatment plans and potentially predict treatment outcome more accurately.
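The generalized equivalent uniform dose used above reduces a whole DVH to a single value via a power-law average. A minimal sketch of the standard gEUD formula (bin doses and volumes are illustrative; the parameter a is structure-specific, with a = 1 recovering the mean dose and large a approaching the maximum):

```python
# Sketch of gEUD = (sum_i v_i * d_i^a)^(1/a), with v_i the fractional volumes
# of the differential-DVH bins. Example values are illustrative only.

def geud(doses, volumes, a):
    """Power-law dose average over differential-DVH bins."""
    total = sum(volumes)
    return sum((v / total) * d ** a for d, v in zip(doses, volumes)) ** (1.0 / a)

doses = [60.0, 66.0, 70.0]   # Gy, bin doses (illustrative)
volumes = [0.2, 0.5, 0.3]    # fractional volumes per bin

print(round(geud(doses, volumes, 1), 2))   # a = 1: the mean dose
print(round(geud(doses, volumes, 8), 2))   # large a: skewed toward hot spots
```

The gEUD2 variant mentioned in the abstract would first convert each bin dose to its equivalent dose in 2 Gy fractions before applying the same average.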
Jin, X; Yan, H; Han, C; Zhou, Y; Yi, J; Xie, C
2015-03-01
To investigate comparatively the percentage gamma passing rate (%GP) of two-dimensional (2D) and three-dimensional (3D) pre-treatment volumetric modulated arc therapy (VMAT) dosimetric verification and their correlation and sensitivity with percentage dosimetric errors (%DE). %GP of 2D and 3D pre-treatment VMAT quality assurance (QA) with different acceptance criteria was obtained by ArcCHECK® (Sun Nuclear Corporation, Melbourne, FL) for 20 patients with nasopharyngeal cancer (NPC) and 20 patients with oesophageal cancer. %DE were calculated from planned dose-volume histogram (DVH) and patients' predicted DVH calculated by 3DVH® software (Sun Nuclear Corporation). Correlation and sensitivity between %GP and %DE were investigated using Pearson's correlation coefficient (r) and receiver operating characteristics (ROCs). Relatively higher %DE on some DVH-based metrics were observed for both patients with NPC and oesophageal cancer. Except for 2%/2 mm criterion, the average %GPs for all patients undergoing VMAT were acceptable with average rates of 97.11% ± 1.54% and 97.39% ± 1.37% for 2D and 3D 3%/3 mm criteria, respectively. The number of correlations for 3D was higher than that for 2D (21 vs 8). However, the general correlation was still poor for all the analysed metrics (9 out of 26 for 3D 3%/3 mm criterion). The average area under the curve (AUC) of ROCs was 0.66 ± 0.12 and 0.71 ± 0.21 for 2D and 3D evaluations, respectively. There is a lack of correlation between %GP and %DE for both 2D and 3D pre-treatment VMAT dosimetric evaluation. DVH-based dose metrics evaluation obtained from 3DVH will provide more useful analysis. Correlation and sensitivity of %GP with %DE for VMAT QA were studied for the first time.
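The ROC analysis above asks how well %GP separates plans with clinically relevant dosimetric error from those without. A minimal sketch of the rank-based AUC computation, with invented labels and passing rates rather than the study's data:

```python
# Sketch of ROC AUC via the rank (Mann-Whitney) formulation: the probability
# that a randomly chosen plan with large |%DE| has a LOWER gamma passing rate
# than a randomly chosen plan without (low %GP should predict error).

def roc_auc(scores, labels):
    pos = [s for s, l in zip(scores, labels) if l]       # plans with large %DE
    neg = [s for s, l in zip(scores, labels) if not l]
    wins = 0.0
    for p in pos:
        for n in neg:
            if p < n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos) * len(neg))

gp = [99.1, 97.8, 96.0, 93.2, 98.5, 91.0]        # %GP at 3%/3 mm (illustrative)
err = [False, False, True, True, False, True]    # |%DE| above tolerance?

print(round(roc_auc(gp, err), 2))
```

An AUC near 0.5 indicates no discriminating power, which is the situation the abstract reports for 2D evaluation (0.66 on average).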
Fractional labelmaps for computing accurate dose volume histograms
NASA Astrophysics Data System (ADS)
Sunderland, Kyle; Pinter, Csaba; Lasso, Andras; Fichtinger, Gabor
2017-03-01
PURPOSE: In radiation therapy treatment planning systems, structures are represented as parallel 2D contours. For treatment planning algorithms, structures must be converted into labelmap (i.e. 3D image denoting structure inside/outside) representations. This is often done by triangulating a surface from the contours, which is then converted into a binary labelmap. This surface-to-binary-labelmap conversion can cause large errors in small structures. Binary labelmaps are often represented using one byte per voxel, meaning a large amount of memory is unused. Our goal is to develop a fractional labelmap representation containing non-binary values, allowing more information to be stored in the same amount of memory. METHODS: We implemented an algorithm in 3D Slicer, which converts surfaces to fractional labelmaps by creating 216 binary labelmaps, changing the labelmap origin on each iteration. The binary labelmap values are summed to create the fractional labelmap. In addition, an algorithm is implemented in the SlicerRT toolkit that calculates dose volume histograms (DVH) using fractional labelmaps. RESULTS: We found that with manually segmented RANDO head and neck structures, fractional labelmaps represented structure volume up to 19.07% (average 6.81%) more accurately than binary labelmaps, while occupying the same amount of memory. When compared to baseline DVH from treatment planning software, DVH from fractional labelmaps had agreement acceptance percent (1% ΔD, 1% ΔV) up to 57.46% higher (average 4.33%) than DVH from binary labelmaps. CONCLUSION: Fractional labelmaps promise to be an effective method for structure representation, allowing considerably more information to be stored in the same amount of memory.
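The 216-labelmap summation above is equivalent to supersampling: each voxel is tested at 6 × 6 × 6 sub-voxel positions and stores the fraction found inside. A minimal sketch, where an implicit sphere stands in for the triangulated surface and the grid size and radius are arbitrary:

```python
# Sketch of a fractional labelmap via 6x6x6 = 216 sub-voxel samples per voxel.
# A sphere replaces the triangulated surface used in SlicerRT; all geometry
# here is illustrative.

def inside_sphere(x, y, z, r=2.5):
    return x * x + y * y + z * z <= r * r

def fractional_labelmap(size=7, subdiv=6):
    """Unit voxel grid centred on the origin; each voxel value is the fraction
    of its 216 sample points falling inside the structure."""
    half = size // 2
    offsets = [(i + 0.5) / subdiv - 0.5 for i in range(subdiv)]
    labelmap = {}
    for ix in range(-half, half + 1):
        for iy in range(-half, half + 1):
            for iz in range(-half, half + 1):
                hits = sum(
                    inside_sphere(ix + dx, iy + dy, iz + dz)
                    for dx in offsets for dy in offsets for dz in offsets
                )
                labelmap[(ix, iy, iz)] = hits / subdiv ** 3
    return labelmap

lm = fractional_labelmap()
volume = sum(lm.values())  # unit-volume voxels, so the sum estimates volume
print(f"fractional volume estimate: {volume:.2f} (analytic: {4/3*3.14159*2.5**3:.2f})")
```

A binary labelmap would round each voxel to 0 or 1, which is exactly where the volume error in small structures comes from.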
DOE Office of Scientific and Technical Information (OSTI.GOV)
Olch, A
2015-06-15
Purpose: Systematic radiotherapy plan quality assessment promotes quality improvement. Software tools can perform this analysis by applying site-specific structure dose metrics. The next step is to similarly evaluate the quality of the dose delivery. This study defines metrics for acceptable doses to targets and normal organs for a particular treatment site and scores each plan accordingly. The input can be the TPS or the measurement-based 3D patient dose. From this analysis, one can determine whether the delivered dose distribution to the patient receives a score which is comparable to the TPS plan score; otherwise replanning may be indicated. Methods: Eleven neuroblastoma patient plans were exported from Eclipse to the Quality Reports program. A scoring algorithm defined a score for each normal and target structure based on dose-volume parameters. Each plan was scored by this algorithm and the percentage of total possible points was obtained. Each plan also underwent IMRT QA measurements with a MapCHECK2 or ArcCHECK. These measurements were input into the 3DVH program to compute the patient 3D dose distribution, which was analyzed using the same scoring algorithm as the TPS plan. Results: The mean quality score for the TPS plans was 75.37% (std dev = 14.15%) compared to 71.95% (std dev = 13.45%) for the 3DVH dose distribution. For 3 of 11 plans, the 3DVH-based quality score was higher than the TPS score, by 0.5 to 8.4 percentage points. Eight of the 11 plan scores decreased based on IMRT QA measurements, by 1.2 to 18.6 points. Conclusion: Software was used to determine the degree to which the plan quality score differed between the TPS and the measurement-based dose. Although the delivery score was generally in good agreement with the planned dose score, some scores improved, while one plan's delivered dose quality was significantly lower than planned. This methodology helps evaluate both planned and delivered dose quality.
Sun Nuclear Corporation has provided a license for the software described.
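A scoring pass of the kind described, applied identically to the TPS dose and the measurement-reconstructed dose, can be sketched as a weighted pass/fail tally. The structures, metric values, limits and weights below are invented placeholders, not the study's neuroblastoma criteria:

```python
# Sketch of a plan-quality score: each structure earns its weight when its
# dose-volume metric meets a site-specific threshold; the plan score is the
# percentage of possible points. All criteria here are illustrative.

criteria = [
    # (structure metric, plan value, limit, pass_if_at_most, weight)
    ("PTV D95 coverage %", 96.5, 95.0, False, 3),   # want >= 95
    ("Liver Dmean Gy",     18.2, 21.0, True,  2),   # want <= 21
    ("Kidney V18 %",       35.0, 33.0, True,  2),   # want <= 33 (fails)
    ("Cord Dmax Gy",       38.0, 45.0, True,  3),   # want <= 45
]

def plan_score(criteria):
    """Percentage of possible points earned by the plan."""
    earned = sum(w for _, v, lim, at_most, w in criteria
                 if (v <= lim if at_most else v >= lim))
    possible = sum(w for *_, w in criteria)
    return 100.0 * earned / possible

print(f"quality score: {plan_score(criteria):.1f}%")
```

Running the same function on TPS-derived and 3DVH-derived metric values yields the paired scores whose difference the study reports.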
Opp, Daniel; Nelms, Benjamin E.; Zhang, Geoffrey; Stevens, Craig
2013-01-01
3DVH software (Sun Nuclear Corp., Melbourne, FL) is capable of generating a volumetric patient VMAT dose by applying a volumetric perturbation algorithm based on comparing measurement‐guided dose reconstruction and TPS‐calculated dose to a cylindrical phantom. The primary purpose of this paper is to validate this dose reconstruction on an anthropomorphic heterogeneous thoracic phantom by direct comparison to independent measurements. The dosimetric insert to the phantom is novel, and thus the secondary goal is to demonstrate how it can be used for the hidden target end‐to‐end testing of VMAT treatments in lung. A dosimetric insert contains a 4 cm diameter unit‐density spherical target located inside the right lung (0.21g/cm3 density). It has 26 slots arranged in two orthogonal directions, milled to hold optically stimulated luminescent dosimeters (OSLDs). Dose profiles in three cardinal orthogonal directions were obtained for five VMAT plans with varying degrees of modulation. After appropriate OSLD corrections were applied, 3DVH measurement‐guided VMAT dose reconstruction agreed 100% with the measurements in the unit density target sphere at 3%/3 mm level (composite analysis) for all profile points for the four less‐modulated VMAT plans, and for 96% of the points in the highly modulated C‐shape plan (from TG‐119). For this latter plan, while 3DVH shows acceptable agreement with independent measurements in the unit density target, in the lung disagreement with experiment is relatively high for both the TPS calculation and 3DVH reconstruction. For the four plans excluding the C‐shape, 3%/3mm overall composite analysis passing rates for 3DVH against independent measurement ranged from 93% to 100%. The C‐shape plan was deliberately chosen as a stress test of the algorithm. 
The dosimetric spatial alignment hidden target test demonstrated the average distance to agreement between the measured and TPS profiles in the steep dose gradient area at the edge of the 2 cm target to be 1.0 ± 0.7, 0.3 ± 0.3, and 0.3 ± 0.3 mm for the IEC X, Y, and Z directions, respectively. PACS number: 87.55Qr PMID: 23835381
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lewis, C; Jiang, R; Chow, J
2015-06-15
Purpose: We developed a method to predict the change of DVH for the PTV due to interfraction organ motion in prostate VMAT without repeating the CT scan and treatment planning. The method is based on a pre-calculated patient database with DVH curves of the PTV modelled by the Gaussian error function (GEF). Methods: For a group of 30 patients with different prostate sizes, VMAT plans were recalculated by shifting their PTVs in 10 increments over 1 cm in the anterior-posterior, left-right and superior-inferior directions. The DVH curve of the PTV in each replan was then fitted by the GEF to determine parameters describing the shape of the curve. Information on how these parameters vary with the DVH change due to prostate motion for different prostate sizes was analyzed and stored in a database by a program written in MATLAB. Results: To predict a new DVH for the PTV due to prostate interfraction motion, the prostate size and the shift distance with direction are input to the program. Parameters modelling the DVH for the PTV are determined based on the pre-calculated patient dataset. From the new parameters, DVH curves of PTVs with and without the prostate motion are plotted for comparison. The program was verified with different prostate cases involving interfraction prostate shifts and replans. Conclusion: Variation of the DVH for the PTV in prostate VMAT can be predicted using a pre-calculated patient database with DVH curve fitting. The computing time is short because CT rescan and replan are not required. This quick DVH estimation can help radiation staff determine whether the changed PTV coverage due to prostate shift is tolerable in the treatment. However, it should be noted that the program can only consider prostate interfraction motions along three axes, and is restricted to prostate VMAT plans using the same plan script in the treatment planning system.
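The Gaussian-error-function DVH model referred to above can be written as a smooth step in cumulative volume versus dose. A minimal sketch of the forward model (parameter values are illustrative, and fitting them to an exported DVH is not shown):

```python
# Sketch of a GEF-modelled cumulative PTV DVH: a near-uniform target dose gives
# a step-like curve, V(D) = 0.5 * (1 - erf((D - D50) / (sqrt(2) * sigma))).
# The d50/sigma values are illustrative placeholders.
import math

def gef_dvh(dose, d50, sigma):
    """Fractional volume receiving at least `dose`."""
    return 0.5 * (1.0 - math.erf((dose - d50) / (math.sqrt(2.0) * sigma)))

d50, sigma = 78.0, 1.2   # Gy: mid-dose and width of the DVH step (assumed)
for d in (74, 76, 78, 80, 82):
    print(f"{d} Gy: V = {gef_dvh(d, d50, sigma):.3f}")
```

Predicting a shifted-PTV DVH then reduces to looking up how d50 and sigma change with prostate size and shift in the pre-calculated dataset and re-evaluating this curve.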
Analysis of dose heterogeneity using a subvolume-DVH
NASA Astrophysics Data System (ADS)
Said, M.; Nilsson, P.; Ceberg, C.
2017-11-01
The dose-volume histogram (DVH) is universally used in radiation therapy for its highly efficient way of summarizing three-dimensional dose distributions. An apparent limitation that is inherent to standard histograms is the loss of spatial information, e.g. it is no longer possible to tell where low- and high-dose regions are, and whether they are connected or disjoint. Two methods for overcoming the spatial fragmentation of low- and high-dose regions are presented, both based on the gray-level size zone matrix, which is a two-dimensional histogram describing the frequencies of connected regions of similar intensities. The first approach is a quantitative metric which can be likened to a homogeneity index. The large cold spot metric (LCS) is here defined to emphasize large contiguous regions receiving too low a dose; emphasis is put on both size, and deviation from the prescribed dose. In contrast, the subvolume-DVH (sDVH) is an extension to the standard DVH and allows for a qualitative evaluation of the degree of dose heterogeneity. The information retained from the two-dimensional histogram is overlaid on top of the DVH and the two are presented simultaneously. Both methods gauge the underlying heterogeneity in ways that the DVH alone cannot, and both have their own merits—the sDVH being more intuitive and the LCS being quantitative.
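The size-zone idea underlying the LCS metric, finding connected regions of similar dose and weighting them by size and deviation, can be sketched with a flood fill over a dose grid. The 2D grid and the zone weighting below are illustrative only, not the paper's exact matrix or metric definition:

```python
# Sketch of the size-zone idea behind the LCS metric: locate connected zones of
# voxels below the prescription dose, then emphasize large, deeply underdosed
# zones. Grid and weighting are illustrative.

dose = [
    [70, 70, 66, 70],
    [70, 65, 64, 70],
    [70, 70, 70, 63],
    [70, 70, 70, 70],
]
prescription = 68.0

def cold_zones(dose, rx):
    """Connected (4-neighbour) zones of voxels with dose < rx, via flood fill."""
    rows, cols = len(dose), len(dose[0])
    seen, zones = set(), []
    for r in range(rows):
        for c in range(cols):
            if dose[r][c] < rx and (r, c) not in seen:
                stack, zone = [(r, c)], []
                seen.add((r, c))
                while stack:
                    y, x = stack.pop()
                    zone.append((y, x))
                    for ny, nx in ((y-1, x), (y+1, x), (y, x-1), (y, x+1)):
                        if (0 <= ny < rows and 0 <= nx < cols
                                and dose[ny][nx] < rx and (ny, nx) not in seen):
                            seen.add((ny, nx))
                            stack.append((ny, nx))
                zones.append(zone)
    return zones

for z in cold_zones(dose, prescription):
    # Assumed weighting: zone size times mean dose deficit below prescription.
    deficit = sum(prescription - dose[y][x] for y, x in z) / len(z)
    print(f"zone size {len(z)}, mean deficit {deficit:.1f} Gy")
```

A standard DVH would report only the total cold volume; the zone decomposition is what distinguishes one large contiguous cold spot from many scattered small ones.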
[Clinical evaluation of heavy-particle radiotherapy using dose volume histogram (DVH)].
Terahara, A; Nakano, T; Tsujii, H
1998-01-01
Radiotherapy with heavy particles such as protons and heavy charged particles is a promising modality for the treatment of localized malignant tumors because of its good dose distribution. Dose calculation and radiotherapy planning systems, which are essential for this kind of treatment, have been developed in recent years. They are capable of computing the dose volume histogram (DVH), which contains dose-volume information for the target volume and other volumes of interest. Recently, the DVH has been commonly used to evaluate and compare dose distributions in radiotherapy with both photons and heavy particles, and it shows that a superior dose distribution is obtained in heavy-particle radiotherapy. The DVH is also utilized for the evaluation of dose distributions in relation to clinical outcomes. In addition, models such as normal tissue complication probability (NTCP) and tumor control probability (TCP), which can be calculated from the DVH, have been proposed by several authors; they are applied to evaluate dose distributions both in themselves and in relation to clinical results. The DVH is now a useful and important tool, but further studies are needed to use the DVH and these models practically for the clinical evaluation of heavy-particle radiotherapy.
Dumas, J L; Lorchel, F; Perrot, Y; Aletti, P; Noel, A; Wolf, D; Courvoisier, P; Bosset, J F
2007-03-01
The goal of our study was to quantify the limits of the EUD models for use in score functions in inverse planning software, and for clinical application. We focused on oesophageal cancer irradiation. Our evaluation was based on theoretical dose volume histograms (DVHs), which we analyzed using volumetric and linear quadratic EUD models, average and maximum dose concepts, the linear quadratic model and the differential area between each DVH. We evaluated our models using theoretical and more complex DVHs for the regions of interest. We studied three types of DVH for the target volume: the first followed the ICRU dose homogeneity recommendations; the second was built from the first's requirements with the same average dose built in for all cases; the third was truncated by a small dose hole. We also built theoretical DVHs for the organs at risk, in order to evaluate the limits of, and the ways to use, both the EUD(1) and EUD/LQ models, comparing them to the traditional ways of scoring a treatment plan. For each volume of interest we built theoretical treatment plans with differences in fractionation. We concluded that both volumetric and linear quadratic EUDs should be used. The volumetric EUD(1) takes into account neither hot-cold spot compensation nor differences in fractionation, but it is more sensitive to an increase of the irradiated volume. With the linear quadratic EUD/LQ, a volumetric analysis of the fractionation variation effect can be performed.
Zeng, G; Murphy, J; Annis, S-L; Wu, X; Wang, Y; McGowan, T; Macpherson, M
2012-07-01
To report a quality control program in prostate radiation therapy at our center that includes a semi-automated planning process to generate high-quality plans and in-house software to track plan quality in the subsequent clinical application. Arc planning in Eclipse v10.0 was performed for both intact prostate and post-prostatectomy treatments. The planning focuses on DVH requirements and on dose distributions being able to tolerate daily setup variations. A modified structure set is used to standardize the optimization, including a short rectum and bladder in the fields to effectively tighten dose to the target, and a rectum expansion with 1 cm cropped from the PTV to block dose and shape the posterior isodose lines. Structure, plan and optimization templates are used to streamline plan generation. DVH files are exported from Eclipse to quality tracking software with a GUI written in MATLAB that can report the dose-volume data either for an individual patient or over a patient population. For 100 intact prostate patients treated with 78 Gy, rectal D50, D25, D15 and D5 are 30.1 ± 6.2 Gy, 50.6 ± 7.9 Gy, 65.9 ± 6.0 Gy and 76.6 ± 1.4 Gy respectively, well below the limits of 50 Gy, 65 Gy, 75 Gy and 78 Gy. For the prostate bed, with a prescription of 66 Gy, rectal D50 is 35.9 ± 6.9 Gy. In both sites, the PTV is covered by 95% of the prescription and hotspots are less than 5%. The semi-automated planning method can efficiently create high-quality plans while the tracking software monitors feedback from clinical application. Together they form a comprehensive and robust quality control program in radiation therapy. © 2012 American Association of Physicists in Medicine.
Trofimov, Alexei; Unkelbach, Jan; DeLaney, Thomas F; Bortfeld, Thomas
2012-01-01
Dose-volume histograms (DVH) are the most common tool used in the appraisal of the quality of a clinical treatment plan. However, when delivery uncertainties are present, the DVH may not always accurately describe the dose distribution actually delivered to the patient. We present a method, based on DVH formalism, to visualize the variability in the expected dosimetric outcome of a treatment plan. For a case of chordoma of the cervical spine, we compared 2 intensity modulated proton therapy plans. Treatment plan A was optimized based on dosimetric objectives alone (ie, desired target coverage, normal tissue tolerance). Plan B was created employing a published probabilistic optimization method that considered the uncertainties in patient setup and proton range in tissue. Dose distributions and DVH for both plans were calculated for the nominal delivery scenario, as well as for scenarios representing deviations from the nominal setup, and a systematic error in the estimate of range in tissue. The histograms from various scenarios were combined to create DVH bands to illustrate possible deviations from the nominal plan for the expected magnitude of setup and range errors. In the nominal scenario, the DVH from plan A showed superior dose coverage, higher dose homogeneity within the target, and improved sparing of the adjacent critical structure. However, when the dose distributions and DVH from plans A and B were recalculated for different error scenarios (eg, proton range underestimation by 3 mm), the plan quality, reflected by DVH, deteriorated significantly for plan A, while plan B was only minimally affected. In the DVH-band representation, plan A produced wider bands, reflecting its higher vulnerability to delivery errors, and uncertainty in the dosimetric outcome. The results illustrate that comparison of DVH for the nominal scenario alone does not provide any information about the relative sensitivity of dosimetric outcome to delivery uncertainties. 
Thus, such comparison may be misleading and may result in the selection of an inferior plan for delivery to a patient. A better-informed decision can be made if additional information about possible dosimetric variability is presented; for example, in the form of DVH bands. Copyright © 2012 American Society for Radiation Oncology. Published by Elsevier Inc. All rights reserved.
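Constructing the DVH bands described above amounts to recomputing the structure DVH under each error scenario and taking a per-dose-bin envelope around the nominal curve. A minimal sketch with synthetic scenario curves standing in for setup and range errors:

```python
# Sketch of DVH-band construction: per dose bin, the min-max envelope over the
# nominal curve and all error-scenario curves. Curves below are synthetic.

dose_axis = [60, 65, 70, 74, 76]  # Gy

nominal = [1.00, 1.00, 0.99, 0.95, 0.50]
scenarios = [
    [1.00, 0.99, 0.97, 0.88, 0.40],   # e.g. a setup shift (assumed)
    [1.00, 1.00, 0.98, 0.92, 0.45],   # e.g. a range underestimate (assumed)
    [1.00, 1.00, 0.99, 0.96, 0.55],
]

def dvh_band(nominal, scenarios):
    """Per dose bin: (lower, nominal, upper) over nominal plus all scenarios."""
    band = []
    for i, nom in enumerate(nominal):
        vals = [s[i] for s in scenarios] + [nom]
        band.append((min(vals), nom, max(vals)))
    return band

for d, (lo, nom, hi) in zip(dose_axis, dvh_band(nominal, scenarios)):
    print(f"{d} Gy: {lo:.2f} <= {nom:.2f} <= {hi:.2f}  (band width {hi - lo:.2f})")
```

A robust plan shows narrow bands (small scenario spread); a plan that is nominally superior but fragile, like plan A in the abstract, shows wide ones.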
Dose to mass for evaluation and optimization of lung cancer radiation therapy.
Tyler Watkins, William; Moore, Joseph A; Hugo, Geoffrey D; Siebers, Jeffrey V
2017-11-01
To evaluate potential organ-at-risk dose sparing by using dose-mass-histogram (DMH) objective functions compared with dose-volume-histogram (DVH) objective functions. Treatment plans were retrospectively optimized for 10 locally advanced non-small cell lung cancer patients based on DVH and DMH objectives. DMH objectives were the same as the DVH objectives, but with mass replacing volume. Plans were normalized to the dose to 95% of the PTV volume (PTV-D95v) or mass (PTV-D95m). For a given optimized dose, DVH and DMH were intercompared to ascertain dose-to-volume vs. dose-to-mass differences. Additionally, the optimized doses were intercompared using DVH and DMH metrics to ascertain differences in the optimized plans. Mean dose to volume (D̄v), mean dose to mass (D̄m), and fluence maps were intercompared. For a given dose distribution, DVH and DMH differ by >5% in heterogeneous structures. In homogeneous structures, including the heart and spinal cord, DVH and DMH are nearly equivalent. At fixed PTV-D95v, DMH optimization did not significantly reduce dose to OARs but reduced PTV-D̄v by 0.20 ± 0.2 Gy (p = 0.02) and PTV-D̄m by 0.23 ± 0.3 Gy (p = 0.02). Plans normalized to PTV-D95m also resulted in minor PTV dose reductions and esophageal dose sparing (D̄v reduced by 0.45 ± 0.5 Gy, p = 0.02, and D̄m reduced by 0.44 ± 0.5 Gy, p = 0.02) compared to DVH-optimized plans. Optimized fluence map comparisons indicate that DMH optimization reduces dose in the periphery of lung PTVs. DVH and DMH dose indices differ by >5% in lung and lung target volumes for fixed dose distributions, but optimizing DMH did not reduce dose to OARs. The primary difference observed between DVH- and DMH-optimized plans was variation in fluence to the periphery of lung target PTVs, where low-density lung surrounds tumor. Copyright © 2017 Elsevier B.V. All rights reserved.
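The DVH-versus-DMH distinction above comes down to the weight given to each voxel: volume for the DVH, density times volume for the DMH, so low-density lung contributes less to a DMH. A minimal sketch with invented voxel doses and densities:

```python
# Sketch of dose-volume vs. dose-mass histogram points: the DMH weights each
# voxel by mass (density * voxel volume). Doses/densities are illustrative.

voxel_volume = 0.008  # cm^3 (2 mm voxels, assumed)

# (dose Gy, density g/cm^3) per voxel: tumour ~1.0, surrounding lung ~0.25.
voxels = [(66, 1.0), (66, 1.0), (64, 1.0), (60, 0.25), (58, 0.25), (55, 0.25)]

def cumulative(weighted, threshold):
    """Fraction of total weight receiving >= threshold."""
    total = sum(w for _, w in weighted)
    return sum(w for d, w in weighted if d >= threshold) / total

dvh_w = [(d, voxel_volume) for d, _ in voxels]          # volume weights
dmh_w = [(d, rho * voxel_volume) for d, rho in voxels]  # mass weights

for t in (60, 64):
    print(f"V{t}: DVH={cumulative(dvh_w, t):.2f}  DMH={cumulative(dmh_w, t):.2f}")
```

The >5% DVH/DMH discrepancy reported for heterogeneous structures, and the near-equivalence in uniform-density organs, both follow directly from this weighting.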
DOE Office of Scientific and Technical Information (OSTI.GOV)
Viraganathan, H; Jiang, R; Chow, J
Purpose: We propose a method to predict the change of the dose-volume histogram (DVH) for the PTV due to patient weight loss in prostate volumetric modulated arc therapy (VMAT). This method is based on a pre-calculated patient dataset and DVH curve fitting using the Gaussian error function (GEF). Methods: Pre-calculated dose-volume data from patients having weight loss in prostate VMAT were employed to predict the change of PTV coverage due to reduced depth in the external contour. The effect of patient weight loss in treatment was described by a prostate dose-volume factor (PDVF), which was evaluated for the prostate PTV. Along with the PDVF, the GEF was used to fit the DVH curve for the PTV. To predict a new DVH due to weight loss, parameters of the GEF describing the shape of the DVH curve were determined. Since the parameters are related to the PDVF for a specific reduced depth, we could first predict the PDVF at a reduced depth based on the prostate size from the pre-calculated dataset. Parameters of the GEF could then be determined from the PDVF to plot the new DVH for the PTV corresponding to the reduced depth. Results: A MATLAB program was built based on the patient dataset with different prostate sizes. Data on the prostate size and reduced depth of the patient are input into the program, which then calculates the PDVF and DVH for the PTV considering the patient weight loss. The program was verified with different patient cases with various reduced depths. Conclusion: Our method can estimate the change of the DVH for the PTV due to patient weight loss quickly, without CT rescan and replan. This helps the radiation staff predict the change of PTV coverage when the patient's external contour is reduced in prostate VMAT.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zakariaee, R; Brown, C J; Hamarneh, G
2014-08-15
Dosimetric parameters based on dose-volume histograms (DVHs) of contoured structures are routinely used to evaluate the dose delivered to target structures and organs at risk. However, the DVH provides no information on the spatial distribution of the dose in situations of repeated fractions with changes in organ shape or size. The aim of this research was to develop methods to more accurately determine the geometrically localized, cumulative dose to the bladder wall in intracavitary brachytherapy for cervical cancer. The CT scans and treatment plans of 20 cervical cancer patients were used. Each patient was treated with five high-dose-rate (HDR) brachytherapy fractions of 600 cGy prescribed dose. The bladder inner and outer surfaces were delineated using MIM Maestro software (MIM Software Inc.) and were imported into MATLAB (MathWorks) as 3-dimensional point clouds constituting the “bladder wall”. A point-set registration toolbox for MATLAB, Coherent Point Drift (CPD), was used to non-rigidly transform the bladder-wall points from four of the fractions to the coordinate system of the remaining (reference) fraction, which was chosen to be the emptiest bladder for each patient. The doses were accumulated on the reference fraction and new cumulative dosimetric parameters were calculated. The LENT-SOMA toxicity scores of these patients were studied against the cumulative dose parameters. Based on this study, there was no significant correlation between the toxicity scores and the determined cumulative dose parameters.
Engberg, Lovisa; Forsgren, Anders; Eriksson, Kjell; Hårdemark, Björn
2017-06-01
To formulate convex planning objectives of treatment plan multicriteria optimization with explicit relationships to the dose-volume histogram (DVH) statistics used in plan quality evaluation. Conventional planning objectives are designed to minimize the violation of DVH statistics thresholds using penalty functions. Although successful in guiding the DVH curve towards these thresholds, conventional planning objectives offer limited control of the individual points on the DVH curve (doses-at-volume) used to evaluate plan quality. In this study, we abandon the usual penalty-function framework and propose planning objectives that more closely relate to DVH statistics. The proposed planning objectives are based on mean-tail-dose, resulting in convex optimization. We also demonstrate how to adapt a standard optimization method to the proposed formulation in order to obtain a substantial reduction in computational cost. We investigated the potential of the proposed planning objectives as tools for optimizing DVH statistics through juxtaposition with the conventional planning objectives on two patient cases. Sets of treatment plans with differently balanced planning objectives were generated using either the proposed or the conventional approach. Dominance in the sense of better distributed doses-at-volume was observed in plans optimized within the proposed framework. The initial computational study indicates that the DVH statistics are better optimized and more efficiently balanced using the proposed planning objectives than using the conventional approach. © 2017 American Association of Physicists in Medicine.
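Mean-tail-dose, the quantity underlying the proposed convex objectives, is simple to evaluate for a fixed dose distribution: it is the mean dose over the hottest (or coldest) fraction of the structure volume. A minimal sketch of the upper mean-tail-dose, assuming per-voxel doses are available (not the authors' optimization code):

```python
import numpy as np

def upper_mean_tail_dose(dose_voxels, volume_fraction):
    """Mean dose over the hottest `volume_fraction` of voxels.
    Unlike a dose-at-volume threshold, this statistic is convex in dose."""
    n_tail = max(1, int(round(volume_fraction * dose_voxels.size)))
    hottest = np.sort(dose_voxels)[::-1][:n_tail]
    return hottest.mean()

doses = np.array([10., 20., 30., 40., 50., 60., 70., 80., 90., 100.])
mtd = upper_mean_tail_dose(doses, 0.2)  # mean of the two hottest voxels
```

Because the upper mean-tail-dose bounds the corresponding dose-at-volume from above, constraining it also constrains the DVH statistic, which is what makes it attractive as a convex surrogate.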
A novel four-dimensional radiotherapy planning strategy from a tumor-tracking beam's eye view
NASA Astrophysics Data System (ADS)
Li, Guang; Cohen, Patrice; Xie, Huchen; Low, Daniel; Li, Diana; Rimner, Andreas
2012-11-01
To investigate the feasibility of four-dimensional radiotherapy (4DRT) planning from a tumor-tracking beam's eye view (ttBEV) with reliable gross tumor volume (GTV) delineation, realistic normal tissue representation, high planning accuracy and low clinical workload, we propose and validate a novel 4D conformal planning strategy based on a synthesized 3.5D computed tomographic (3.5DCT) image with a motion-compensated tumor. To recreate patient anatomy from a ttBEV in the moving tumor coordinate system for 4DRT planning (or 4D planning), the centers of delineated GTVs in all phase CT images of 4DCT were aligned, and then the aligned CTs were averaged to produce a new 3.5DCT image. This GTV-motion-compensated CT contains a motionless target (with motion artifacts minimized) and motion-blurred normal tissues (with a realistic temporal density average). Semi-automatic threshold-based segmentation of the tumor, lung and body was applied, while manual delineation was used for other organs at risk (OARs). To validate this 3.5DCT-based 4D planning strategy, five patients with peripheral lung lesions of small size (<5 cm3) and large motion range (1.2-3.5 cm) were retrospectively studied for stereotactic body radiotherapy (SBRT) using 3D conformal radiotherapy planning tools. The 3.5DCT-based 4D plan (3.5DCT plan) with 9-10 conformal beams was compared with the 4DCT-based 4D plan (4DCT plan). The 4DCT plan was derived from multiple 3D plans based on all phase CT images, each of which used the same conformal beam configuration but with an isocenter shift to aim at the moving tumor and a minor beam aperture and weighting adjustment to maintain plan conformality. 
The dose-volume histogram (DVH) of the 4DCT plan was created with two methods: one is an integrated DVH (iDVH4D), which is defined as the temporal average of all 3D-phase-plan DVHs, and the other (DVH4D) is based on the dose distribution in a reference phase CT image by dose warping from all phase plans using the displacement vector field (DVF) from a free-form deformable image registration (DIR). The DVH3.5D (for the 3.5DCT plan) was compared with both iDVH4D and DVH4D. To quantify the DVH difference between the 3.5DCT plan and the 4DCT plan, two methods were used: the relative difference (%) of the areas underneath the DVH curves, and the volumes receiving more than 20% (V20) and 50% (V50) of the prescribed dose of these 4D plans. The volume of the delineated GTV from different phase CTs varied dramatically, from 24% to 112%, among the five patients, whereas the GTV from 3.5DCT deviated from the averaged GTV in 4DCT by only -6%±6%. For planning target volume (PTV) coverage, the difference between the DVH3.5D and iDVH4D was negligible (<1% area), whereas the DVH3.5D and DVH4D were quite different, due to DIR uncertainty (~2 mm), which propagates to PTV dose coverage with a pronounced uncertainty for small tumors (0.3-4.0 cm3) in stereotactic plans with sharp dose falloff around the PTV. For OARs, such as the lung, heart, cord and esophagus, the three DVH curves (DVH3.5D, DVH4D and iDVH4D) were found to be almost identical for the same patients, especially in high-dose regions. For the tumor-containing lung, the relative difference of the areas underneath the DVH curves was found to be small (5.3% area on average), of which 65% resulted from the low-dose region (D < 20%). The averaged V20 difference between the two 4D plans was 1.2% ± 0.8%. For the mean lung dose (MLD), the 3.5DCT plan differed from the 4DCT plan by -1.1%±1.3%. GTV-motion-compensated CT (3.5DCT) produces an accurate and reliable GTV delineation, which is close to the mean GTV from 4DCT.
The 3.5DCT plan is equivalent to the 4DCT plan with <1% dose difference to the PTV and negligible dose difference in OARs. The 3.5DCT approach simplifies 4D planning and provides accurate dose calculation without a substantial increase of clinical workload for motion-tracking delivery to treat small peripheral lung tumors with large motion.
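The core 3.5DCT construction, aligning the GTV centroids across phases and then averaging, can be sketched on toy data (synthetic images and integer shifts; a clinical implementation would resample the actual phase CTs of the 4DCT):

```python
import numpy as np
from scipy.ndimage import center_of_mass, shift

# Toy 4DCT: four phases of a 3D image with a bright "GTV" cube that drifts
phases, gtv_masks = [], []
for offset in [(0, 0, 0), (3, 0, 0), (6, 0, 0), (3, 0, 0)]:
    img = np.zeros((40, 40, 40))
    z = 20 + offset[0]
    img[z - 2:z + 3, 18:23, 18:23] = 1000.0  # delineated GTV region
    phases.append(img)
    gtv_masks.append(img > 500.0)

# Align each phase so the GTV centroids coincide, then average -> "3.5DCT"
ref_center = np.array(center_of_mass(gtv_masks[0]))
aligned = []
for img, mask in zip(phases, gtv_masks):
    delta = ref_center - np.array(center_of_mass(mask))
    aligned.append(shift(img, delta, order=1))
ct35 = np.mean(aligned, axis=0)
```

After alignment the target stays sharp in the average while any moving background would be blurred, which is exactly the motionless-target / motion-blurred-normal-tissue property the paper exploits.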
SU-E-T-117: Analysis of the ArcCHECK Dosimetry Gamma Failure Using the 3DVH System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cho, S; Choi, W; Lee, H
2015-06-15
Purpose: To evaluate gamma analysis failures in VMAT patient-specific QA using the ArcCHECK cylindrical phantom. The 3DVH system (Sun Nuclear, FL) was used to analyze the dose difference statistics between the measured dose and the dose calculated by the treatment planning system. Methods: Four cases of gamma analysis failure were selected retrospectively. Our institution's gamma analysis criteria were absolute dose, 3%/3 mm, and a 90% pass rate in ArcCHECK dosimetry. The collapsed cone convolution superposition (CCCS) dose calculation algorithm was used for VMAT. Dose delivery was performed with an Elekta Agility. An A1SL chamber (Standard Imaging, WI) and a cavity plug were used for point dose measurement. Delivery QA plans and images were used as the 3DVH reference data instead of the patient plan and image. The measured data in the '.txt' file were used for comparison at the diodes to acquire a global dose level. The '.acml' file was used for AC-PDP and to calculate the point dose. Results: The global dose of 3DVH was calculated as 1.10, 1.13, 1.01 and 0.2 Gy, respectively. The 0.2 Gy case was induced by a distance discrepancy. The TPS-calculated point dose ranged from 2.33 Gy to 2.77 Gy and the 3DVH-calculated dose from 2.33 Gy to 2.68 Gy. The maximum dose differences were −2.83% and −3.1% for TPS vs. measured dose and TPS vs. 3DVH-calculated dose, respectively, in the same case. The difference between the measured and 3DVH doses was 0.1% in that case. The 3DVH gamma pass rate was 98% to 99.7%. Conclusion: We found a TPS calculation error through 3DVH calculation using the ArcCHECK measured dose. It appears that our CCCS-based RTP system overestimated dose in the central region and underestimated scatter at the peripheral diode detector points. Relative gamma analysis and point dose measurement are recommended for VMAT DQA in gamma failure cases of ArcCHECK dosimetry.
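For readers unfamiliar with the gamma analysis used in such QA comparisons, a minimal 1D sketch (global normalization, brute-force search over reference points; clinical tools operate on 2D/3D dose grids with interpolation):

```python
import numpy as np

def gamma_pass_rate(reference, measured, positions, dose_tol=0.03, dist_tol=3.0):
    """Global 1D gamma analysis: dose_tol is a fraction of the reference
    maximum; dist_tol is in mm. Returns the percentage of points passing."""
    d_norm = dose_tol * reference.max()
    passed = 0
    for pos_m, d_m in zip(positions, measured):
        # gamma = min over reference points of sqrt((dd/Dtol)^2 + (dr/dtol)^2)
        dd = (reference - d_m) / d_norm
        dr = (positions - pos_m) / dist_tol
        passed += np.sqrt(dd ** 2 + dr ** 2).min() <= 1.0
    return 100.0 * passed / len(measured)

x = np.arange(0.0, 50.0, 1.0)                  # positions, mm
ref = 2.0 * np.exp(-((x - 25.0) / 12.0) ** 2)  # Gy, toy dose profile
rate = gamma_pass_rate(ref, ref * 1.02, x)     # 2% uniform error vs. 3%/3 mm
```

A uniform 2% dose error stays within the 3%/3 mm criterion everywhere, while a 10% error would fail near the profile maximum; this is the sense in which a "3%/3 mm, 90% pass rate" criterion is evaluated.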
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, C
Purpose: To implement a novel, automatic, institutionally customizable DVH quantities evaluation and PDF report tool for the Philips Pinnacle treatment planning system (TPS). Methods: An add-on program (P3DVHStats) was developed to enable automatic evaluation of DVH quantities (including both volume- and dose-based quantities, such as V98, V100, and D2) and automatic PDF report generation, for EMR convenience. The implementation is based on a combination of the Philips Pinnacle scripting tool and the Java language pre-installed on each Pinnacle Sun Solaris workstation. A single Pinnacle script provides the user convenient access to the program when needed. The activated script first exports DVH data for user-selected ROIs from the current Pinnacle plan trial; a Java program then provides a simple GUI, uses the data to compute any user-requested DVH quantities, and compares them with preset institutional DVH planning goals; if accepted by the user, the program also generates a PDF report of the results and exports it from Pinnacle to the EMR import folder via FTP. Results: The program was tested thoroughly and has been released for clinical use at our institution (Pinnacle Enterprise server with both thin clients and P3PC access) for all dosimetry and physics staff, with excellent feedback. It used to take a few minutes with an MS-Excel worksheet to calculate these DVH quantities for IMRT/VMAT plans and manually save them as a PDF report; with the new program, it takes a few mouse clicks and less than 30 seconds to complete the same tasks. Conclusion: A Pinnacle scripting and Java based program was successfully implemented and customized to our institutional needs. It dramatically reduces the time and effort needed for DVH quantity computation and EMR reporting.
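The DVH quantities such a tool computes (e.g., D2, V98, V100) follow directly from sorted per-voxel doses. A minimal sketch, assuming relative-volume conventions (function names are illustrative, not from P3DVHStats):

```python
import numpy as np

def dvh_quantities(dose_voxels, prescription):
    """Common DVH statistics from per-voxel doses (Gy), relative volumes."""
    d = np.sort(dose_voxels)[::-1]   # descending
    n = d.size
    def d_at(percent):   # minimum dose to the hottest `percent`% of volume
        return d[max(0, int(np.ceil(percent * n / 100.0)) - 1)]
    def v_at(level):     # % of volume receiving at least `level` Gy
        return 100.0 * np.count_nonzero(dose_voxels >= level) / n
    return {"D2": d_at(2), "D98": d_at(98),
            "V100": v_at(prescription), "V98": v_at(0.98 * prescription)}

doses = np.linspace(60.0, 80.0, 1000)  # toy PTV voxel doses
stats = dvh_quantities(doses, prescription=70.0)
```

Commercial systems differ in interpolation and volume conventions (absolute cc vs. percent), so exact values vary slightly between TPSs even for the same structure.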
NASA Astrophysics Data System (ADS)
Alexander, A.; DeBlois, F.; Stroian, G.; Al-Yahya, K.; Heath, E.; Seuntjens, J.
2007-07-01
Radiotherapy research lacks a flexible computational research environment for Monte Carlo (MC) and patient-specific treatment planning. The purpose of this study was to develop a flexible software package on low-cost hardware with the aim of integrating new patient-specific treatment planning with MC dose calculations suitable for large-scale prospective and retrospective treatment planning studies. We designed the software package 'McGill Monte Carlo treatment planning' (MMCTP) for the research development of MC and patient-specific treatment planning. The MMCTP design consists of a graphical user interface (GUI), which runs on a simple workstation connected through the standard secure-shell protocol to a cluster for lengthy MC calculations. Treatment planning information (e.g., images, structures, beam geometry properties and dose distributions) is converted into a convenient MMCTP local file storage format, designated the McGill RT format. MMCTP features include (a) DICOM_RT, RTOG and CADPlan CART format imports; (b) 2D and 3D visualization views for images, structure contours, and dose distributions; (c) contouring tools; (d) DVH analysis and dose matrix comparison tools; (e) external beam editing; (f) MC transport calculation from beam source to patient geometry for photon and electron beams. The MC input files, which are prepared from the beam geometry properties and patient information (e.g., images and structure contours), are uploaded and run on a cluster using shell commands controlled from the MMCTP GUI. The visualization, dose matrix operation and DVH tools offer extensive options for plan analysis and comparison between MC plans and plans imported from commercial treatment planning systems. The MMCTP GUI provides a flexible research platform for the development of patient-specific MC treatment planning for photon and electron external beam radiation therapy.
The impact of this tool lies in the fact that it allows for systematic, platform-independent, large-scale MC treatment planning for different treatment sites. Patient recalculations were performed to validate the software and ensure proper functionality.
Integrating Multimodal Radiation Therapy Data into i2b2.
Zapletal, Eric; Bibault, Jean-Emmanuel; Giraud, Philippe; Burgun, Anita
2018-04-01
Clinical data warehouses are now widely used to foster clinical and translational research and the Informatics for Integrating Biology and the Bedside (i2b2) platform has become a de facto standard for storing clinical data in many projects. However, to design predictive models and assist in personalized treatment planning in cancer or radiation oncology, all available patient data need to be integrated into i2b2, including radiation therapy data that are currently not addressed in many existing i2b2 sites. To use radiation therapy data in projects related to rectal cancer patients, we assessed the feasibility of integrating radiation oncology data into the i2b2 platform. The Georges Pompidou European Hospital, a hospital from the Assistance Publique - Hôpitaux de Paris group, has developed an i2b2-based clinical data warehouse of various structured and unstructured clinical data for research since 2008. To store and reuse various radiation therapy data-dose details, activities scheduling, and dose-volume histogram (DVH) curves-in this repository, we first extracted raw data by using some reverse engineering techniques and a vendor's application programming interface. Then, we implemented a hybrid storage approach by combining the standard i2b2 "Entity-Attribute-Value" storage mechanism with a "JavaScript Object Notation (JSON) document-based" storage mechanism without modifying the i2b2 core tables. Validation was performed using (1) the Business Objects framework for replicating vendor's application screens showing dose details and activities scheduling data and (2) the R software for displaying the DVH curves. We developed a pipeline to integrate the radiation therapy data into the Georges Pompidou European Hospital i2b2 instance and evaluated it on a cohort of 262 patients. We were able to use the radiation therapy data on a preliminary use case by fetching the DVH curve data from the clinical data warehouse and displaying them in a R chart. 
By adding radiation therapy data into the clinical data warehouse, we were able to analyze radiation therapy response in cancer patients. We have leveraged the i2b2 platform to store radiation therapy data, including detailed information such as the DVH, and to create new ontology-based modules that provide research investigators with a wider spectrum of clinical data. Schattauer GmbH Stuttgart.
Effects of voxelization on dose volume histogram accuracy
NASA Astrophysics Data System (ADS)
Sunderland, Kyle; Pinter, Csaba; Lasso, Andras; Fichtinger, Gabor
2016-03-01
PURPOSE: In radiotherapy treatment planning systems, structures of interest such as targets and organs at risk are stored as 2D contours on evenly spaced planes. In order to be used in various algorithms, contours must be converted into binary labelmap volumes using voxelization. The voxelization process loses information, which has little effect on the volume of large structures but has a significant impact on small structures, which contain few voxels. Volume differences for segmented structures affect metrics such as dose volume histograms (DVHs), which are used for treatment planning. Our goal was to evaluate the impact of voxelization on segmented structures, as well as how factors like voxel size affect metrics such as the DVH. METHODS: We created a series of implicit functions representing simulated structures. These structures were sampled at varying resolutions and compared to labelmaps with high sub-millimeter resolution. We generated DVHs and evaluated the voxelization error for the same structures at different resolutions by calculating the agreement acceptance percentage between the DVHs. RESULTS: We implemented the analysis tools as modules in the SlicerRT toolkit based on the 3D Slicer platform. We found large DVH variations from the baseline for small structures and for structures located in regions with a high dose gradient, potentially leading to the creation of suboptimal treatment plans. CONCLUSION: This work demonstrates that labelmap and dose volume voxel size is an important factor in DVH accuracy, which must be accounted for in order to ensure the development of accurate treatment plans.
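The effect of voxel size on a small structure's volume can be demonstrated with a simple sphere-rasterization sketch (not SlicerRT code; an implicit sphere stands in for the simulated structures):

```python
import numpy as np

def voxelized_volume(radius, voxel_size):
    """Rasterize a sphere (mm) on a voxel grid; return labelmap volume (mm^3).
    A voxel is inside if its center satisfies the implicit function."""
    half = int(np.ceil(radius / voxel_size)) + 1
    centers = (np.arange(-half, half) + 0.5) * voxel_size  # voxel centers
    z, y, x = np.meshgrid(centers, centers, centers, indexing="ij")
    inside = (x ** 2 + y ** 2 + z ** 2) <= radius ** 2
    return inside.sum() * voxel_size ** 3

true_cc = 4.0 / 3.0 * np.pi * 10.0 ** 3 / 1000.0   # 10 mm sphere, ~4.19 cc
errors = []                                        # percent volume error
for vox in (1.0, 2.0, 5.0):                        # voxel size, mm
    vol_cc = voxelized_volume(10.0, vox) / 1000.0
    errors.append(100.0 * abs(vol_cc - true_cc) / true_cc)
```

For a structure of a few cc, coarsening the grid from 1 mm to 5 mm pushes the volume error from well under a percent to several percent, which propagates directly into the DVH's volume axis.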
SU-F-T-266: Dynalogs Based Evaluation of Different Dose Rate IMRT Using DVH and Gamma Index
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ahmed, S; Ahmed, S; Ahmed, F
2016-06-15
Purpose: This work investigates the impact of low and high dose rates on IMRT through Dynalogs, by evaluating the gamma index and dose volume histogram. Methods: The Eclipse™ treatment planning software was used to generate plans for prostate and head-and-neck sites. Dose rates of 300 MU/min and 600 MU/min were applied to each plan in order to investigate their effect on beam-on time, efficiency, and accuracy. Each plan had distinct monitor units per fraction, delivery time, mean dose rate, and leaf speed. The DVH data were used in the assessment of conformity and plan quality. The treatments were delivered on a Varian™ Clinac 2100C accelerator equipped with a 120-leaf Millennium MLC. Dynalogs of each plan were analyzed with a MATLAB™ program. Fluence measurements were performed using the Sun Nuclear™ 2D diode array, and results were assessed based on gamma analysis of dose fluence maps, beam delivery statistics, and Dynalogs data. Results: Minor differences were found by adjusted R-squared analysis of the DVHs for all plans with different dose rates. It was also found that larger and more numerous fields have greater time reduction at the high dose rate, and a sharp decrease in the number of control points was observed in the dynalog files when switching the dose rate from 300 MU/min to 600 MU/min. Gamma analysis of all plans passed the confidence limit of ≥95%, with a greater number of passing points in the 300 MU/min plans. Conclusion: Dynalog files are a suitable tool for software-based IMRT QA. They can work in parallel with a measurement-based QA setup and serve as a stand-by procedure for pre- and post-delivery verification of the treatment plan.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhou, Xin; Ou, Xiaomin; Xu, Tingting
Purpose: To determine dosimetric risk factors for the occurrence of temporal lobe necrosis (TLN) among nasopharyngeal carcinoma (NPC) patients treated with intensity modulated radiation therapy (IMRT) and to investigate the impact of dose-volume histogram (DVH) parameters on the volume of TLN lesions (V-N). Methods and Materials: Forty-three NPC patients who had developed TLN following IMRT and 43 control subjects free of TLN were retrospectively assessed. DVH parameters included maximum dose (Dmax), minimum dose (Dmin), mean dose (Dmean), absolute volumes receiving specific doses (Vds) from 20 to 76 Gy (V20-V76), and doses covering certain volumes (Dvs) from 0.25 to 6.0 cm³ (D0.25-D6.0). V-Ns were quantified with axial magnetic resonance images. Results: DVH parameters were ubiquitously higher in temporal lobes with necrosis than in healthy temporal lobes. Increased Vds and Dvs were significantly associated with a higher risk of TLN occurrence (P<.05). In particular, Vds at doses of ≥70 Gy were found to have the highest odds ratios. A common increasing trend was detected between V-N and DVH parameters through trend tests (P for trend <.05). Linear regression analysis showed that V45 had the strongest predictive power for V-N (adjusted R² = 0.305, P<.0001). V45 of <15.1 cm³ was a relatively safe dose constraint for preventing large TLN lesions with V-N of >5 cm³. Conclusions: Dosimetric parameters are significantly associated with TLN occurrence and the extent of temporal lobe injury. To better manage TLN, it is important to avoid both focal high doses and moderate doses delivered to a large area of the temporal lobes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Weili; Department of Radiation Oncology, the Fourth Affiliated Hospital, China Medical University, Shenyang; Xu, Yaping
2013-08-01
Purpose: This study aimed to compare lung dose-volume histogram (DVH) parameters, such as mean lung dose (MLD) and the lung volume receiving ≥20 Gy (V20), across commonly used definitions of normal lung in terms of tumor/target subtraction, and to determine to what extent they differ in predicting radiation pneumonitis (RP). Methods and Materials: One hundred lung cancer patients treated with definitive radiation therapy were assessed. The gross tumor volume (GTV) and clinical planning target volume (PTVc) were defined by the treating physician and dosimetrist. For this study, the clinical target volume (CTV) was defined as the GTV with an 8-mm uniform expansion, and the PTV was defined as the CTV with an 8-mm uniform expansion. Lung DVHs were generated with exclusion of targets: (1) GTV (DVH_G); (2) CTV (DVH_C); (3) PTV (DVH_P); and (4) PTVc (DVH_Pc). The lung DVHs, V20s, and MLDs from each of the 4 methods were compared, as was their significance in predicting radiation pneumonitis of grade 2 or greater (RP2). Results: There are significant differences in dosimetric parameters among the various definition methods (all P<.05). The mean and maximum differences in V20 are 4.4% and 12.6% (95% confidence interval, 3.6%-5.1%), respectively. The mean and maximum differences in MLD are 3.3 Gy and 7.5 Gy (95% confidence interval, 1.7-4.8 Gy), respectively. MLDs of all methods are highly correlated with each other and significantly correlated with clinical RP2, although V20s are not. For RP2 prediction, on the receiver operating characteristic curve, the MLD from DVH_G (MLD_G) has a greater area under the curve than the MLD from DVH_C (MLD_C) or DVH_P (MLD_P). Limiting RP2 to 30%, the threshold is 22.4, 20.6, and 18.8 Gy for MLD_G, MLD_C, and MLD_P, respectively. Conclusions: The differences in MLD and V20 from the various lung definitions are significant. The MLD from the GTV exclusion method may be more accurate in predicting clinically significant radiation pneumonitis.
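The dependence of MLD and V20 on which target volume is subtracted from the lung can be illustrated with a toy sketch (synthetic doses; the "GTV" mask is artificially taken from the hottest voxels purely to mimic a target sitting in high dose):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy per-voxel lung doses (Gy); hottest voxels stand in for target overlap
lung_dose = rng.uniform(0.0, 66.0, size=5000)
gtv_mask = lung_dose > 55.0

def mld_and_v20(dose, exclude=None):
    """Mean lung dose (Gy) and V20 (%), optionally excluding target voxels."""
    d = dose if exclude is None else dose[~exclude]
    return d.mean(), 100.0 * np.count_nonzero(d >= 20.0) / d.size

mld_all, v20_all = mld_and_v20(lung_dose)                # no subtraction
mld_g, v20_g = mld_and_v20(lung_dose, exclude=gtv_mask)  # GTV excluded
```

Because the excluded voxels are systematically high-dose, subtracting a larger target (CTV, PTV) removes more of the hot tail and lowers MLD and V20 further, which is exactly why the different lung definitions yield systematically different parameters.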
Taylor, Bruce G; Mumford, Elizabeth A; Stein, Nan D
2015-02-01
We examine whether the Shifting Boundaries (SB) intervention, a primary intervention to prevent youth dating violence and sexual harassment (DV/H), is differentially effective for girls compared with boys or for youth with a history of DV/H experiences. We randomly assigned SB to 30 public middle schools in New York City, enrolling 117 sixth- and seventh-grade classes to receive a classroom, building, combined, or neither intervention. The SB classroom intervention included six sessions emphasizing the laws and consequences of DV/H and establishing boundaries and safe relationships. The SB schoolwide/building intervention included the use of school-based restraining orders, greater faculty/security presence in unsafe "hot spots" mapped by students, and posters to increase DV/H awareness and reporting. Student surveys were administered at baseline, immediately after the intervention, and 6 months after the intervention. At 6 months after the intervention, the SB building-level intervention was associated with significant reductions in the frequency of sexual harassment (SH) perpetration and victimization; the prevalence and frequency of sexual dating violence victimization; and the frequency of total dating violence victimization and perpetration. We also had one anomalous finding, that the interventions were associated with an increase in the prevalence of SH victimization. These results were consistent for girls and boys, and for those with or without a history of DV/H, with one exception: those exposed to the SB building condition who had earlier reported perpetrating SH had a significantly lower frequency of perpetrating SH at follow-up than those without such a history. SB can provide effective universal prevention of middle school DV/H experiences, regardless of students' prior exposure histories, and for boys and girls. Copyright © 2015 Society for Adolescent Health and Medicine. Published by Elsevier Inc. All rights reserved.
Ahmed, Saeed; Nelms, Benjamin; Kozelka, Jakub; Zhang, Geoffrey; Moros, Eduardo
2016-01-01
The original helical ArcCHECK (AC) diode array and associated software for 3D measurement‐guided dose reconstruction were characterized and validated; however, recent design changes to the AC required that the subject be revisited. The most important AC change starting in 2014 was a significant reduction in the overresponse of diodes to scattered radiation outside of the direct beam, accomplished by reducing the amount of high‐Z materials adjacent to the diodes. This change improved the diode measurement accuracy, but in the process invalidated the dose reconstruction models that were assembled based on measured data acquired with the older version of the AC. A correction mechanism was introduced in the reconstruction software (3DVH) to accommodate this and potential future design changes without requiring updating model parameters. For each permutation of AC serial number and beam model, the user can define in 3DVH a single correction factor which will be used to compensate for the difference in the out‐of‐field response between the new and original AC designs. The exact value can be determined by minimizing the dose‐difference with an ionization chamber or another independent dosimeter. A single value of 1.17, corresponding to the maximum measured out‐of‐field response difference between the new and old AC, provided satisfactory results for all studied energies (6X, 15X, and flattening filter‐free 10XFFF). A library of standard cases recommended by the AAPM TG‐244 Report was used for reconstructed dose verification. The overall difference between reconstructed dose and an ion chamber in a water‐equivalent phantom in the targets was 0.0% ± 1.4% (1 SD). The reconstructed dose on a homogeneous phantom was also compared to a biplanar diode dosimeter (Delta4) using gamma analysis with 2% (local dose‐error normalization)/2 mm/10% cutoff criteria. The mean agreement rate was 96.7% ± 3.7%. 
For the plans common with the previous comparison, the mean agreement rate was 98.3% ± 0.8%, essentially unchanged. We conclude that the proposed software modification adequately addresses the change in the dosimeter response. PACS number(s): 87.55Qr PMID:27929491
SU-F-BRD-14: The Effect of Radiation-Induced Esophageal Swelling On Dose-Volume Histograms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Niedzielski, J; Martel, M; Tucker, S
2014-06-15
Purpose: Acute esophagitis results in esophageal swelling. Here we evaluate the effect of this response on DVH metrics calculated throughout the course of radiation therapy. Methods: Twenty-nine NSCLC patients who received weekly CT imaging were identified, with varying esophagitis grades (11 grade 0, 12 grade 2, and 6 grade 3 patients) using CTCAE scoring criteria. Deformable image registration was used to map the planning esophagus contour to the weekly CT images. Treatment plans were recalculated on each weekly 4DCT and DVH metrics for the esophagus were compared to the delivered treatment plan. DVH metrics were also extracted for esophagus planning-at-risk volumes (PRVs) with 1-3 mm uniform expansions. Results: The esophagus V50 increased as treatment progressed, by 2.3±0.7 cc and 9.0±1.1 cc for the grade 0 and grade 2/3 patients, respectively. The mean esophageal dose (MED) increased by 3.5±0.9 Gy and 8.0±1.1 Gy for the grade 0 and grade 2/3 patients, respectively. In some cases where the planned V50 was similar, it remained the same at the end of treatment for grade 0 cases but increased for higher-grade cases. These apparent changes in delivered dose, as expressed by the DVH, are mostly attributed to volume changes in the regions of esophagitis. In addition, portions of the esophagus of some patients moved into high-dose regions. The 2 mm PRV was able to account for these differences in all but 1 of the 18 G2/3 patients. The 1 mm PRV produced the DVH metrics from the average weekly plans that were closest to the true treatment plan. Conclusion: Esophageal radiation response affects DVH parameters throughout treatment, especially for patients with high toxicity. This effect must be considered when comparing DVHs calculated using daily IGRT CT images with those from the original planning CT (e.g., for adaptive planning). Adding a margin to the esophagus can account for the variation in DVH metrics.
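The uniform PRV expansions used above are typically generated by morphological dilation of the structure labelmap. A minimal sketch on a 1 mm isotropic grid (a face-connected structuring element approximates, rather than exactly reproduces, a Euclidean margin):

```python
import numpy as np
from scipy.ndimage import binary_dilation, generate_binary_structure

# Toy esophagus labelmap on a 1 mm isotropic grid: a thin tube along z
esophagus = np.zeros((60, 60, 60), dtype=bool)
esophagus[10:50, 29:32, 29:32] = True

# PRV = uniform expansion; with 1 mm voxels, n dilation passes ~ n mm margin
struct = generate_binary_structure(3, 1)  # face-connected neighborhood
prv_2mm = binary_dilation(esophagus, structure=struct, iterations=2)
```

Clinical systems usually expand with a true Euclidean distance transform and honor anisotropic slice spacing; the dilation above is the simplest approximation of the 1-3 mm PRVs evaluated in the study.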
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, K; Chen, X; Wang, J
Purpose: To incorporate dose volume histogram (DVH) prediction into Auto-Planning for volumetric-modulated arc therapy (VMAT) treatment planning and to investigate the benefit of this new technique for rectal cancer. Methods: Ninety clinically accepted VMAT plans for patients with rectal cancer were selected and used to train a RapidPlan model for DVH prediction. Both internal and external validations were performed before implementing the prediction model. A new VMAT planning method (hybrid-VMAT) was created by combining DVH prediction and Auto-Planning. For each new patient, the DVH is predicted, and individual DVH constraints are obtained and exported as the initial optimization parameters to Auto-Planning (Pinnacle3 treatment planning system, v9.10). A total of 20 rectal cancer patients previously treated with manual VMAT (manual-VMAT) plans were replanned using this new method. Dosimetric comparisons were performed between the manual-VMAT and new-method plans. Results: Hybrid-VMAT shows PTV coverage similar to manual-VMAT in D2%, D98% and HI (p>0.05) and superior CI (p=0.000). For the bladder, the mean V40 and mean dose are 36.0% and 35.6 Gy for hybrid-VMAT versus 42% and 38.0 Gy for manual-VMAT. For the left (right) femur, the mean V30 and mean dose are 10.6% (11.6%) and 17.9 Gy (19.2 Gy) for hybrid-VMAT versus 25.6% (24.1%) and 27.3 Gy (26.2 Gy) for manual-VMAT. Hybrid-VMAT significantly improved organ-at-risk sparing. Conclusion: The integration of DVH prediction and Auto-Planning significantly improves VMAT plan quality in rectal cancer radiotherapy. The benefit of the new method will be further investigated in other tumor sites.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grigorov, Grigor N.; Chow, James C.L.; Grigorov, Lenko
2006-05-15
The normal tissue complication probability (NTCP) is a predictor of radiobiological effect for organs at risk (OARs). The calculation of the NTCP is based on the dose-volume histogram (DVH), which is generated by the treatment planning system after calculation of the 3D dose distribution. Including the NTCP in the objective function for intensity-modulated radiation therapy (IMRT) plan optimization would make planning more effective in reducing post-radiation effects; however, doing so would lengthen the total planning time. The purpose of this work is to establish a method for NTCP determination that is independent of a DVH calculation, both as a quality assurance check and as a means of improving treatment planning efficiency. In the study, the CTs of ten randomly selected prostate patients were used. IMRT optimization was performed with a PINNACLE3 v6.2b planning system, using planning target volumes (PTVs) with margins in the range of 2 to 10 mm. The DVH control points of the PTV and OARs were adapted from the prescriptions of Radiation Therapy Oncology Group protocol P-0126 for an escalated prescribed dose of 82 Gy. This paper presents a new model for the determination of the rectal NTCP (rNTCP). The method uses a special function, named GVN (from Gy, Volume, NTCP), which describes the rNTCP if 1 cm³ of the volume of intersection of the PTV and rectum (R_int) is irradiated uniformly by a dose of 1 Gy. The function was 'geometrically' normalized using a prostate-prostate ratio (PPR) of the patients' prostates. A correction of the rNTCP for different prescribed doses, ranging from 70 to 82 Gy, was employed in our model. The argument of the normalized function is R_int, and its parameters are the prescribed dose, prostate volume, PTV margin, and PPR.
The rNTCPs of another group of patients were calculated by the new method, and the resulting difference was within ±5% of the NTCP calculated by the PINNACLE3 software, which adopts Kutcher's dose-response model for NTCP calculation.
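The Kutcher-style dose-response modeling referenced for the PINNACLE3 comparison is commonly formulated as the Lyman-Kutcher-Burman (LKB) NTCP, computed from a DVH via the generalized EUD. A generic sketch, assuming the LKB form; the parameter values and DVH bins below are illustrative placeholders, not values from the protocol or the study:

```python
import numpy as np
from math import erf, sqrt

def lkb_ntcp(bin_dose_gy, bin_rel_vol, n=0.09, m=0.13, td50=80.0):
    """Lyman-Kutcher-Burman NTCP from a differential DVH (generic sketch).

    bin_dose_gy : representative dose of each DVH bin (Gy)
    bin_rel_vol : fractional organ volume in each bin (sums to 1)
    n, m, td50  : organ-specific fit parameters -- the values here are
                  illustrative placeholders, not data from the study.
    """
    d = np.asarray(bin_dose_gy, dtype=float)
    v = np.asarray(bin_rel_vol, dtype=float)
    geud = float((v * d ** (1.0 / n)).sum()) ** n    # generalized EUD
    t = (geud - td50) / (m * td50)
    return 0.5 * (1.0 + erf(t / sqrt(2.0)))          # standard normal CDF

p_low = lkb_ntcp([20.0, 40.0], [0.5, 0.5])    # mild two-bin rectal DVH
p_high = lkb_ntcp([70.0, 82.0], [0.5, 0.5])   # escalated-dose two-bin DVH
```

The GVN approach in the abstract avoids this DVH dependence entirely; the sketch only shows the conventional DVH-based baseline it is compared against.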
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ono, K; Fujimoto, S; Akagi, Y
Purpose: To evaluate the dosimetric impact of the interplay effect between multileaf collimator (MLC) movement and tumor respiratory motion during delivery of volumetric-modulated arc therapy (VMAT) using a customized polymer gel dosimeter. Methods: A polyacrylamide-based gel dosimeter containing magnesium chloride as a sensitizer (iPAGAT) was used in this study. A PAN (BAREX) techno bottle (φ8 cm, 650 mL) with excellent gas-barrier properties was filled with iPAGAT, set in the QUASAR™ respiratory motion phantom, and moved with motion amplitudes of 1 and 2 cm at a 4-second period during VMAT delivery by the Novalis Tx linear accelerator (Varian/BrainLAB). Two spherical tumors with a 2 cm diameter (GTV1 and GTV2) were defined, and ITV1 (GTV1 + 1 cm) and ITV2 (GTV2 + 2 cm), expanded in the superior-inferior (S-I) direction, were defined to account for the simulated respiratory motion. The PTV margin was 2 mm around the ITV to allow for setup uncertainty. Two single-arc VMAT plans delivering 30 Gy at 3 Gy per fraction (GTV: D98 > 100%, PTV: D95 = 100%) were generated with the Varian Eclipse treatment planning system. The three-dimensional dose distribution in iPAGAT was read out with a Signa 1.5 T MRI system (GE) and evaluated by dose-volume histogram (DVH) analysis using in-house software. Results: According to the DVH analysis with iPAGAT, D98 of GTV1 and GTV2 exceeded 100% of the prescribed dose. In contrast, D95 of PTV1 and PTV2 was about 85% and 65%, respectively. Furthermore, the low-to-intermediate dose spread widely at a motion amplitude of 2 cm. Conclusion: DVH analysis using the iPAGAT polymer gel dosimeter was performed in this study. The interplay effect was negligible, since GTV dose coverage was sufficient during VMAT delivery with simulated respiratory motion. However, the dose reduction in the PTV and the spread of low-to-intermediate dose relative to the planned dose require scrupulous attention for large tumor respiratory motion.
Automatic treatment plan re-optimization for adaptive radiotherapy guided with the initial plan DVHs
NASA Astrophysics Data System (ADS)
Li, Nan; Zarepisheh, Masoud; Uribe-Sanchez, Andres; Moore, Kevin; Tian, Zhen; Zhen, Xin; Jiang Graves, Yan; Gautier, Quentin; Mell, Loren; Zhou, Linghong; Jia, Xun; Jiang, Steve
2013-12-01
Adaptive radiation therapy (ART) can reduce normal tissue toxicity and/or improve tumor control through treatment adaptations based on the current patient anatomy. Developing an efficient and effective re-planning algorithm is an important step toward the clinical realization of ART. In the re-planning process, a manual trial-and-error approach to fine-tuning planning parameters is time-consuming and usually considered impractical, especially for online ART. It is desirable to automate this step to yield a plan of acceptable quality with minimal intervention. In ART, prior information from the original plan is available, such as the dose-volume histogram (DVH), which can be employed to facilitate the automatic re-planning process. The goal of this work is to develop an automatic re-planning algorithm to generate a plan with similar, or possibly better, DVH curves compared with the clinically delivered original plan. Specifically, our algorithm iterates between two loops. The inner loop is a traditional fluence map optimization, in which we optimize a quadratic objective function penalizing the deviation of the dose received by each voxel from its prescribed or threshold dose, with a set of fixed voxel weighting factors. In the outer loop, the voxel weighting factors in the objective function are adjusted according to the deviation of the current DVH curves from those of the original plan. The process is repeated until the DVH curves are acceptable or the maximum number of iterations is reached. The whole algorithm is implemented on GPU for high efficiency. The feasibility of our algorithm has been demonstrated with three head-and-neck cancer IMRT cases, each having an initial planning CT scan and another treatment CT scan acquired in the middle of the treatment course. Compared with the DVH curves of the original plan, the DVH curves of the plan produced by our algorithm after 30 iterations are better for almost all structures.
The re-optimization process takes about 30 s using our in-house optimization engine. This work was originally presented at the 54th AAPM annual meeting in Charlotte, NC, July 29-August 2, 2012.
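The inner/outer-loop scheme described above can be sketched on a toy problem. The influence matrix, the weight-update rule, and all step sizes below are illustrative assumptions for a minimal CPU sketch, not the authors' GPU implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy influence matrix: 40 voxels, 8 beamlets (sizes are illustrative).
A = rng.uniform(0.0, 1.0, size=(40, 8))
prescription = np.full(40, 60.0)            # prescribed dose per voxel (Gy)
ref_dose = A @ rng.uniform(5.0, 9.0, 8)     # stand-in for the original plan's dose

w = np.ones(40)   # voxel weighting factors (outer-loop variables)
x = np.zeros(8)   # fluence map (inner-loop variables)

for outer in range(5):
    # Inner loop: projected gradient descent on the weighted quadratic
    # objective f(x) = sum_i w_i * ((A @ x - prescription)_i)**2.
    for inner in range(200):
        grad = 2.0 * A.T @ (w * (A @ x - prescription))
        x = np.maximum(x - 1e-4 * grad, 0.0)   # fluence stays non-negative
    # Outer loop: raise the weight of voxels doing worse than the original
    # plan, a crude stand-in for the DVH-deviation-driven weight update.
    worse = np.abs(A @ x - prescription) > np.abs(ref_dose - prescription)
    w[worse] *= 1.5

final_dose = A @ x
```

The design point the paper exploits is that only the outer loop looks at DVHs; the inner loop stays a plain weighted least-squares problem, which is what makes the GPU implementation fast.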
NASA Astrophysics Data System (ADS)
Kennedy, A. M.; Lane, J.; Ebert, M. A.
2014-03-01
Plan review systems often allow dose-volume histogram (DVH) recalculation as part of a quality assurance process for trials. A review of the algorithms provided by a number of systems indicated that they are often very similar. One notable point of variation between implementations is the location and frequency of dose sampling. This study explored the impact such variations can have on DVH-based plan evaluation metrics (normal tissue complication probability (NTCP), and minimum, mean, and maximum dose) for a plan with small structures placed over areas of high dose gradient. The dose grids considered were exported from the original planning system at a range of resolutions. We found that for the CT-based sampling resolutions used in all but one of the plan review systems (CT, and CT with a guaranteed minimum number of sampling voxels in the x and y directions), results were very similar and changed in a similar manner with changes in dose grid resolution, despite the extreme conditions. Differences became noticeable, however, when resolution was increased in the axial (z) direction. Evaluation metrics also varied differently with changing dose grid for CT-based resolutions compared with dose-grid-based resolutions. This suggests that if DVHs are compared between systems that use a different basis for selecting the sampling resolution, it may be important to confirm that a similar resolution was used during calculation.
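The sensitivity of a cumulative DVH to sampling resolution can be demonstrated with a toy dose grid containing a steep gradient; the geometry and numbers below are invented for illustration:

```python
import numpy as np

def cumulative_dvh(dose_grid, thresholds):
    """Fraction of the sampled volume receiving at least each threshold."""
    d = np.asarray(dose_grid).ravel()
    return np.array([(d >= t).mean() for t in thresholds])

# Toy 3-D dose grid with a steep gradient along z (invented geometry).
z = np.linspace(0.0, 70.0, 64)
dose = np.broadcast_to(z[:, None, None], (64, 32, 32))

thresholds = np.array([10.0, 30.0, 50.0])
dvh_fine = cumulative_dvh(dose, thresholds)          # every z-slice sampled
dvh_coarse = cumulative_dvh(dose[::8], thresholds)   # every 8th z-slice only
max_diff = np.abs(dvh_fine - dvh_coarse).max()       # sampling-induced shift
```

Because the gradient runs along z, decimating z-samples shifts every DVH point, mirroring the paper's finding that axial (z) resolution is where implementations diverge.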
SU-E-T-615: Plan Comparison Between Photon IMRT and Proton Plans Incorporating Uncertainty Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cheng, C; Wessels, B; Jesseph, F
2015-06-15
Purpose: In this study, we investigate the effect of setup uncertainty on DVH calculations, which may impact plan comparison. Methods: Treatment plans (6 MV VMAT calculated on the Pinnacle TPS) were chosen for different disease sites (brain, prostate, H&N, and spine) in this retrospective study. A proton plan (PP) using double-scattering beams was generated for each selected VMAT plan, subject to the same set of dose-volume constraints as the VMAT plan. An uncertainty analysis was incorporated into the DVH calculations, in which isocenter shifts of 1 to 5 mm in each of the ±x, ±y, and ±z directions were used to simulate setup uncertainty and residual positioning errors. A total of 40 different combinations of isocenter shifts were used in recalculating the DVHs of the PTV and the various OARs for both the VMAT plan and the corresponding PP. Results: For the brain case, the VMAT plan and PP are comparable in PTV coverage and OAR sparing, and VMAT is a clear choice for treatment due to its ease of delivery. However, when isocenter shifts are incorporated into the DVH calculations, a significant change in the dose-volume relationship emerges. For example, both plans provide adequate PTV coverage even with a ±3 mm shift. However, a +3 mm shift increases V40 (left cochlea, VMAT) from 7.2% in the original plan to 45%, and V40 (right cochlea, VMAT) from 75% to 92%. For the proton plan, V40 (left cochlea) increases from 62% in the initial plan to 75%, while V40 (right cochlea) increases from 7% to 26%. Conclusion: The DVH alone may not be sufficient for an unequivocal decision in plan comparison, especially when two rival plans are very similar in both PTV coverage and OAR sparing. It is good practice to incorporate uncertainty analysis into photon-proton plan comparison studies to test plan robustness during plan evaluation.
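Isocenter-shift robustness checks of the kind described can be approximated by rigidly shifting the dose grid and recomputing a DVH metric. A simplified whole-voxel sketch (toy geometry, no interpolation; not the study's calculation):

```python
import numpy as np

def shifted_v40(dose, mask, shift_vox):
    """V40 (%) of a structure after a rigid shift of the dose grid.

    The isocenter shift is approximated by rolling the grid a whole
    number of voxels per axis (simplified: no interpolation, and the
    high-dose region is assumed far from the array border).
    """
    d = np.roll(dose, shift_vox, axis=(0, 1, 2))
    return 100.0 * (d[mask] >= 40.0).mean()

# Toy geometry: a 60 Gy slab next to a small OAR straddling the gradient.
dose = np.zeros((20, 20, 20))
dose[10:, :, :] = 60.0
mask = np.zeros(dose.shape, dtype=bool)
mask[8:11, 8:12, 8:12] = True

v40_nominal = shifted_v40(dose, mask, (0, 0, 0))
# A 2-voxel shift that drags the high-dose region onto the OAR.
v40_shifted = shifted_v40(dose, mask, (-2, 0, 0))
```

Even this crude model reproduces the qualitative effect in the abstract: a small OAR sitting on a steep gradient can see its V40 jump dramatically under a millimeter-scale shift.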
NASA Astrophysics Data System (ADS)
Gordon, J. J.; Snyder, K.; Zhong, H.; Barton, K.; Sun, Z.; Chetty, I. J.; Matuszak, M.; Ten Haken, R. K.
2015-09-01
In conventionally fractionated radiation therapy (CFRT) for lung cancer, the dependence of radiation pneumonitis (RP) on the normal lung dose-volume histogram (DVH) is not well understood. Complication models instead make RP a function of a summary statistic, such as mean lung dose (MLD). This work searches over damage profiles, which quantify sub-volume damage as a function of dose. Profiles that achieve the best RP predictive accuracy on a clinical dataset are hypothesized to approximate the DVH dependence. Step-function damage-rate profiles R(D) are generated, having discrete steps at several dose points. A range of profiles is sampled by varying the step heights and dose point locations. Normal lung damage is the integral of R(D) with the cumulative DVH. Each profile is used in conjunction with a damage cutoff to predict grade 2 plus (G2+) RP for DVHs from a University of Michigan clinical trial dataset consisting of 89 CFRT patients, of whom 17 were diagnosed with G2+ RP. Optimal profiles achieve a modest increase in predictive accuracy: erroneous RP predictions are reduced from 11 (using MLD) to 8. A novel result is that the optimal profiles share a similar, distinctive shape: an enhanced damage contribution from low doses (<20 Gy), a flat contribution from doses in the range of ~20-40 Gy, then a further enhanced contribution from doses above 40 Gy. These features resemble the hyper-radiosensitivity/increased radioresistance (HRS/IRR) observed in some cell survival curves, which can be modeled using Joiner's induced repair model. A novel search strategy is employed, which has the potential to estimate the dependence of RP on the normal lung DVH. When applied to a clinical dataset, the identified profiles share a characteristic shape, which resembles HRS/IRR. This suggests that normal lung may have enhanced sensitivity to low doses and that this sensitivity can affect RP risk.
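The damage-profile construction (a step function R(D) integrated against the DVH, plus a damage cutoff for prediction) can be sketched with a differential DVH. All step edges, step heights, bin volumes, and the cutoff below are invented for illustration, not fitted values from the study:

```python
import numpy as np

def damage_score(diff_dose, diff_vol, step_edges, step_rates):
    """Normal-lung damage: sum of R(D) * dV over a differential DVH.

    R(D) is a step function: step_rates[k] applies to doses in
    [step_edges[k], step_edges[k+1]).  All numbers here are invented.
    """
    idx = np.clip(np.searchsorted(step_edges, diff_dose, side="right") - 1,
                  0, len(step_rates) - 1)
    return float((np.asarray(step_rates, dtype=float)[idx]
                  * np.asarray(diff_vol, dtype=float)).sum())

# A profile with the reported shape: enhanced below 20 Gy, flat for
# ~20-40 Gy, enhanced again above 40 Gy (step heights are made up).
edges = np.array([0.0, 20.0, 40.0])
rates = np.array([0.8, 0.3, 1.0])

# Toy differential DVH: fractional lung volume per dose-bin center.
bin_doses = np.array([5.0, 15.0, 30.0, 45.0, 60.0])
bin_vols = np.array([0.4, 0.2, 0.2, 0.1, 0.1])

score = damage_score(bin_doses, bin_vols, edges, rates)
predict_rp = bool(score > 0.6)   # hypothetical damage cutoff for G2+ RP
```

The search the paper describes amounts to varying `edges` and `rates` and keeping the profiles whose thresholded scores best separate the G2+ RP patients.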
SU-E-J-71: Spatially Preserving Prior Knowledge-Based Treatment Planning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, H; Xing, L
2015-06-15
Purpose: Prior knowledge-based treatment planning is impeded by the use of a single dose-volume histogram (DVH) curve. Critical spatial information is lost when the dose distribution is collapsed into a histogram. Even similar patients possess geometric variations that become inaccessible in the form of a single DVH. We propose a simple prior knowledge-based planning scheme that extracts features from the prior dose distribution while still preserving the spatial information. Methods: A prior patient plan is not used as a mere starting point for a new patient; rather, stopping criteria are constructed from it. Each structure from the prior patient is partitioned into multiple shells. For instance, the PTV is partitioned into an inner, middle, and outer shell. Prior dose statistics are then extracted for each shell and translated into the appropriate Dmin and Dmax parameters for the new patient. Results: The partitioned dose information from a prior case was applied to 14 2-D prostate cases. Using the prior case yielded final DVHs comparable to manual planning, even though the DVH for the prior case differed from the DVHs for the 14 cases. Using a single DVH for the entire organ was also tested for comparison but showed much poorer performance. Different ways of translating the prior dose statistics into parameters for the new patient were also tested. Conclusion: Prior knowledge-based treatment planning needs to salvage the spatial information without transforming patients on a voxel-to-voxel basis. An efficient balance between the anatomy and dose domains is gained by partitioning the organs into multiple shells. The prior knowledge not only serves as a starting point for a new case; the information extracted from the partitioned shells is also translated into stopping criteria for the optimization problem at hand.
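The shell partitioning of a structure can be sketched with a surface-peeling depth map standing in for a distance transform. A pure-NumPy sketch on a toy spherical mask; the peeling depth, shell count, and geometry are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def peel_depth(mask):
    """Depth of each voxel inside a binary mask, found by repeatedly
    peeling off the surface layer (pure-NumPy stand-in for a distance
    transform; assumes the mask does not touch the array border)."""
    depth = np.zeros(mask.shape, dtype=int)
    cur = mask.copy()
    layer = 0
    while cur.any():
        layer += 1
        interior = cur.copy()
        for ax in range(cur.ndim):         # 6-connected erosion
            interior &= np.roll(cur, 1, ax) & np.roll(cur, -1, ax)
        depth[cur & ~interior] = layer     # voxels peeled this round
        cur = interior
    return depth

def partition_shells(mask, n_shells=3):
    """Split a structure into concentric shells (outer -> inner): the
    partitioning idea behind the shell-wise prior dose statistics."""
    depth = peel_depth(mask)
    edges = np.linspace(0, depth.max(), n_shells + 1)
    return [mask & (depth > edges[k]) & (depth <= edges[k + 1])
            for k in range(n_shells)]

# Toy spherical "PTV" mask (illustrative geometry).
zz, yy, xx = np.mgrid[:32, :32, :32]
ptv = (zz - 16) ** 2 + (yy - 16) ** 2 + (xx - 16) ** 2 <= 10 ** 2
outer, middle, inner = partition_shells(ptv)
```

Per-shell dose statistics (e.g. a Dmin/Dmax pair per shell) can then be read off the prior plan's dose inside each boolean shell mask.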
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fan, J; Fan, J; Hu, W
Purpose: To develop a fast automatic algorithm based on two-dimensional kernel density estimation (2D KDE) to predict the dose-volume histogram (DVH), which can be employed for radiotherapy quality assurance and automatic treatment planning. Methods: We propose a machine learning method that uses previous treatment plans to predict the DVH. The key to the approach is the framing of the DVH in a probabilistic setting. Training consists of estimating, from the patients in the training set, the joint probability distribution of the dose and the predictive features. The joint distribution provides an estimate of the conditional probability of the dose given the values of the predictive features. For a new patient, prediction consists of estimating the distribution of the predictive features and marginalizing the conditional probability from the training over this distribution. Integrating the resulting probability distribution for the dose yields an estimate of the DVH. The 2D KDE is implemented to estimate the joint probability distribution of the training set and the distribution of the predictive features for the new patient. Two variables, the signed minimal distance from each OAR (organ at risk) voxel to the target boundary and its opening angle with respect to the origin of the voxel coordinate system, are used as the predictive features to represent the OAR-target spatial relationship. The feasibility of our method has been demonstrated with rectum, breast, and head-and-neck cancer cases by comparing the predicted DVHs with the planned ones. Results: Consistent results were found between these two DVHs for each cancer site, and the average relative point-wise difference was about 5%, within the clinically acceptable extent. Conclusion: According to the results of this study, our method can be used to predict a clinically acceptable DVH and has the ability to evaluate the quality and consistency of treatment planning.
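The train/marginalize/integrate pipeline described can be sketched with a 2-D histogram standing in for the 2D KDE. Synthetic feature and dose data, one predictive feature instead of two, and histogram binning are all simplifying assumptions for brevity:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic training data: for each OAR voxel, one predictive feature
# (distance to the target boundary) and the dose it received.  The paper
# uses a 2-D KDE over (feature, dose); a 2-D histogram serves as a
# dependency-free stand-in here.
dist_train = rng.uniform(0.0, 5.0, 5000)
dose_train = np.clip(70.0 - 12.0 * dist_train + rng.normal(0.0, 3.0, 5000),
                     0.0, 80.0)

dist_edges = np.linspace(0.0, 5.0, 11)
dose_edges = np.linspace(0.0, 80.0, 33)
joint, _, _ = np.histogram2d(dist_train, dose_train,
                             bins=[dist_edges, dose_edges])
# Conditional P(dose bin | distance bin): normalize each distance row.
cond = joint / np.maximum(joint.sum(axis=1, keepdims=True), 1e-12)

# New patient: estimate the distribution of the predictive feature only.
dist_new = rng.uniform(0.0, 5.0, 2000)
feat_counts, _ = np.histogram(dist_new, bins=dist_edges)
feat_p = feat_counts / feat_counts.sum()

# Marginalize the conditional over the feature distribution, then
# accumulate into a cumulative DVH: P(dose >= bin edge).
dose_pdf = feat_p @ cond
predicted_dvh = dose_pdf[::-1].cumsum()[::-1]
```

Swapping the histograms for a smooth kernel density estimate (and adding the opening-angle feature) recovers the structure of the method the abstract describes.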
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peppa, V; Pappas, E; Pantelis, E
2015-06-15
Purpose: To assess the dosimetric and radiobiological differences between TG43-based and model-based dosimetry in the treatment planning of ¹⁹²Ir HDR brachytherapy for breast and head and neck cancer. Methods: Two cohorts of 57 accelerated partial breast irradiation (APBI) patients and 22 head and neck (H&N) patients with oral cavity carcinoma were studied. Dosimetry for the treatment plans was performed using the TG43 algorithm of the Oncentra Brachy v4.4 treatment planning system (TPS). Corresponding Monte Carlo (MC) simulations were performed using MCNP6, with input files automatically prepared by the BrachyGuide software tool from DICOM RT plan data. TG43 and MC data were compared in terms of percentage dose differences, dose-volume histograms (DVHs), and related indices of clinical interest for the planning target volume (PTV) and the organs at risk (OARs). A radiobiological analysis was also performed using the equivalent uniform dose (EUD), mean survival fraction (S), and tumor control probability (TCP) for the PTV, and the normal tissue complication probability (NTCP) and the generalized EUD (gEUD) for the OARs. Significance testing of the observed differences was performed using the Wilcoxon paired-sample test. Results: Differences between TG43 and MC DVH indices, associated with the increased corresponding local percentage dose differences, were statistically significant. This significance is mainly attributable to the consistency of the differences, however, since TG43 agrees closely with MC for the majority of DVH and radiobiological parameters in both patient cohorts. Differences varied considerably among patients only for the ipsilateral lung and ribs in the APBI cohort, with a strong correlation to target location. Conclusion: While the consistency and magnitude of the differences in the majority of clinically relevant DVH indices imply that no change is needed in treatment planning practice, individualized dosimetry improves accuracy and addresses the instances of inter-patient variability observed.
Research co-financed by the ESF and Greek funds through the Operational Program Education and Lifelong Learning, Investing in Knowledge Society, of the NSRF (Research Funding Program Aristeia). Nucletron, an Elekta company (Veenendaal, The Netherlands), is gratefully acknowledged for providing Oncentra Brachy v4.4 for research purposes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kusumoto, Chiaki; Ohira, Shingo; Department of Medical Physics and Engineering, Osaka University Graduate School of Medicine, Suita
2016-07-01
Several reports have dealt with correlations of late rectal toxicity with rectal dose-volume histograms (DVHs) at high dose levels. There are two techniques to assess the rectal volume receiving a specific dose: the relative DVH (R-DVH, %), whose vertical axis indicates relative volume, and the absolute DVH (A-DVH, cc), whose vertical axis shows the absolute volume of the rectum. DVH parameters vary depending on the rectum delineation method, but the literature does not present any standardization of such methods. The aim of the present study was to evaluate the effects of different delineation methods on rectal DVHs. The enrollment for this study comprised 28 patients with high-risk localized prostate cancer who had undergone intensity-modulated radiation therapy (IMRT) with a prescription dose of 78 Gy. The rectum was contoured with 4 different methods using 2 lengths, short (Sh) and long (Lg), and 2 cross sections, rectum (Rec) and rectal wall (Rw). Sh denotes the length from 1 cm above the seminal vesicles to 1 cm below the prostate, and Lg the length from the rectosigmoid junction to the anus. Rec represents the entire rectal volume including the rectal contents, and Rw the rectal volume of the area with a wall thickness of 4 mm. We compared dose-volume parameters obtained with the 4 rectal contouring methods for the same plan, for the R-DVHs as well as the A-DVHs. For high dose levels, the R-DVH parameters varied widely. The mean V70 for Sh-Rw was the highest (19.4%), nearly twice that for Lg-Rec (10.4%). By contrast, only small variations were observed in the A-DVH parameters (4.3, 4.3, 5.5, and 5.5 cc for Sh-Rw, Lg-Rw, Sh-Rec, and Lg-Rec, respectively). For the R-DVHs, the V70 parameters varied depending on the rectal length (Sh-Rec vs Lg-Rec: R = 0.76; Sh-Rw vs Lg-Rw: R = 0.85) and cross section (Sh-Rec vs Sh-Rw: R = 0.49; Lg-Rec vs Lg-Rw: R = 0.65).
For the A-DVHs, however, the parameters of the Sh rectal A-DVHs hardly changed with rectal length at any dose level. Moreover, at high dose levels (V70), the A-DVH parameters showed less dependence on the rectal cross section (Sh-Rec vs Sh-Rw: R = 0.66; Lg-Rec vs Lg-Rw: R = 0.59). This study showed that A-DVHs were less dependent on the delineation method than R-DVHs, especially for evaluating the rectal dose at higher dose levels. It can therefore be assumed that, in addition to R-DVHs, A-DVHs can be used for evaluating rectal toxicity.
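The differing sensitivity of R-DVHs and A-DVHs to delineation length can be reproduced on toy numbers: adding low-dose voxels to a longer contour changes the relative curve but not the absolute high-dose volume. All doses and counts below are invented for illustration:

```python
import numpy as np

def dvh_point(dose_gy, voxel_cc, dose_level_gy):
    """One cumulative-DVH point in both conventions: the (relative %,
    absolute cc) volume receiving at least dose_level_gy."""
    d = np.asarray(dose_gy, dtype=float)
    abs_cc = float((d >= dose_level_gy).sum()) * voxel_cc
    rel_pct = 100.0 * abs_cc / (d.size * voxel_cc)
    return rel_pct, abs_cc

# The same high-dose region seen with two delineations: a short rectum
# segment, and a long one that merely adds low-dose voxels.
hot = np.full(40, 72.0)                                  # voxels near the PTV
short_rectum = np.concatenate([hot, np.full(60, 30.0)])  # Sh-like contour
long_rectum = np.concatenate([hot, np.full(260, 20.0)])  # Lg-like contour

r_short, a_short = dvh_point(short_rectum, 0.1, 70.0)    # V70 both ways
r_long, a_long = dvh_point(long_rectum, 0.1, 70.0)
# r_short differs from r_long although the irradiated high-dose volume is
# identical; a_short equals a_long, mirroring the A-DVH's robustness.
```

This is exactly the mechanism behind the study's finding: the denominator of a relative DVH depends on how much rectum was contoured, while the absolute high-dose volume does not.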
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bhandare, N.
2014-06-01
Purpose: To estimate and compare the doses received by the obturator, external, and internal iliac lymph nodes and by point B. Methods: CT-MR fused image sets of 15 patients, obtained for each of 5 fractions of HDR brachytherapy using a tandem-and-ring applicator, were used to generate treatment plans optimized to deliver the prescription dose to HRCTV D90 and to minimize the doses to organs at risk (OARs). For each image set, the target volumes (GTV, HRCTV), OARs (bladder, rectum, sigmoid), and both left and right pelvic lymph nodes (obturator, external, and internal iliac) were delineated. Dose-volume histograms (DVHs) were generated for the pelvic nodal groups (left and right obturator groups, internal and external iliac chains). Per-fraction DVH parameters used for dose comparison included the dose to 100% of the volume (D100) and the dose received by 2 cc (D2cc), 1 cc (D1cc), and 0.1 cc (D0.1cc) of the nodal volume. The dose to point B was compared with each DVH parameter using a two-sided t-test, and Pearson correlations were determined to examine the relationship of the point B dose with the nodal DVH parameters. Results: FIGO clinical stage varied from IB1 to IIIB. The median pretreatment tumor diameter measured on MRI was 4.5 cm (2.7-6.4 cm). The median dose to bilateral point B was 1.20 Gy ± 0.12, or 20% of the prescription dose. The correlation coefficients were all <0.60 for all nodal DVH parameters, indicating a low degree of correlation. Only the obturator D2cc was not significantly different from the point B dose on the t-test. Conclusion: The dose to point B does not adequately represent the dose to any specific pelvic nodal group. When using image-guided 3D dose-volume-optimized treatment, nodal groups should be individually identified and delineated to obtain the doses received by the pelvic nodes.
The Impact of the Grid Size on TomoTherapy for Prostate Cancer
Kawashima, Motohiro; Kawamura, Hidemasa; Onishi, Masahiro; Takakusagi, Yosuke; Okonogi, Noriyuki; Okazaki, Atsushi; Sekihara, Tetsuo; Ando, Yoshitaka; Nakano, Takashi
2017-01-01
Discretization errors due to the digitization of computed tomography images and the calculation grid are a significant issue in radiation therapy. Such errors have been quantitatively reported for fixed multi-field intensity-modulated radiation therapy using traditional linear accelerators. The aim of this study is to quantify the influence of the calculation grid size on the dose distribution in TomoTherapy. This study used ten treatment plans for prostate cancer. The final dose calculation was performed with “fine” (2.73 mm) and “normal” (5.46 mm) grid sizes. The dose distributions were compared from different points of view: the dose-volume histogram (DVH) parameters for the planning target volume (PTV) and organs at risk (OARs), various indices, and dose differences. The DVH parameters used were Dmax, D2%, D2cc, Dmean, D95%, D98%, and Dmin for the PTV, and Dmax, D2%, and D2cc for the OARs. The indices used for plan evaluation were the homogeneity index and the equivalent uniform dose. Almost all DVH parameters for the “fine” calculations tended to be higher than those for the “normal” calculations. The largest difference in DVH parameters for the PTV was in Dmax, and that for the OARs was in rectal D2cc. The mean difference in Dmax was 3.5%, and rectal D2cc increased by up to 6% at maximum and 2.9% on average. The mean difference in D95% for the PTV was the smallest among the DVH parameter differences. For each index, whether there was a significant difference between the two grid sizes was determined with a paired t-test; there were significant differences for most of the indices. The dose difference between the “fine” and “normal” calculations was also evaluated: some points around high-dose regions had differences exceeding 5% of the prescription dose. The influence of the calculation grid size in TomoTherapy is smaller than in traditional linear accelerators, but the differences are still significant.
We recommend calculating the final dose using the “fine” grid size. PMID:28974860
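DVH parameters such as D2% and D95% are percentile-type quantities of the dose distribution, which is why the near-maximum ones are the most sensitive to grid size. A sketch of their extraction; the dose values below are invented to illustrate a coarse grid blurring a hot spot:

```python
import numpy as np

def d_percent(dose_gy, pct):
    """D_pct: the minimum dose received by the hottest pct% of the volume,
    i.e. the dose at which the cumulative DVH crosses pct%."""
    d = np.sort(np.asarray(dose_gy, dtype=float))[::-1]   # hottest first
    k = max(int(np.ceil(pct / 100.0 * d.size)) - 1, 0)
    return d[k]

# Illustrative PTV doses on two grids: the coarse grid misses the hot spot.
fine = np.array([78.0, 77.0, 76.5, 76.0, 75.5, 75.0, 74.5, 74.0, 73.0, 72.0])
coarse = np.array([76.5, 76.0, 75.5, 75.0, 74.5, 74.0])

d2_fine = d_percent(fine, 2)      # near-maximum dose, grid-size sensitive
d95_fine = d_percent(fine, 95)    # coverage dose, more stable
d2_coarse = d_percent(coarse, 2)
```

Because D2% sits at the extreme tail of the sorted doses, averaging a hot spot over a larger voxel pulls it down, while D95% sits in the bulk of the distribution and barely moves, matching the pattern the study reports.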
SU-F-T-419: Evaluation of PlanIQ Feasibility DVH as Planning Objectives for Skull Base SBRT Patients
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jiang, W; Wang, H; Chi, P
2016-06-15
Purpose: PlanIQ (Sun Nuclear Corporation) provides feasibility measures for organs at risk (OARs) around the target, based on depth, local anatomical density, and the energy of the radiation beam used. This study tests and evaluates PlanIQ feasibility DVHs as optimization objectives in the treatment planning process, and investigates their potential use in routine clinical cases to improve planning efficiency. Methods: Two- to three-arc VMAT treatment plans were generated in Pinnacle based on the PlanIQ feasibility DVH for six skull base patients previously treated with SBRT. The PlanIQ feasibility DVH for each OAR consists of four zones: impossible (at 100% target coverage), difficult, challenging, and probable. Constraints targeting DVHs in the difficult zone were used to start plan optimization, and further adjustments were made to improve coverage. The plan DVHs were compared with the PlanIQ feasibility DVHs to assess the dose received by 0% (D0), 5% (D5), 10% (D10), and 50% (D50) of each OAR volume. Results: A total of 90 OARs were evaluated for the 6 patients (mean 15 OARs, range 11-18). We used >98% PTV coverage as the planning goal, since it is difficult to achieve 100% target coverage. For the generated plans, 96.7% of the OARs achieved D0 or D5 within the difficult or impossible zone (ipsilateral OARs 93.5%, contralateral OARs 100%), while 90% and 65.6% of the OARs achieved D10 and D50 within the difficult zone, respectively. Seventeen of the contralateral and out-of-field OARs achieved DVHs in the impossible zone. For OARs adjacent to or overlapping the target volume, D0 and D5 were challenging to optimize into the difficult zone. All plans were completed within 2-4 adjustments to improve target coverage and uniformity. Conclusion: The PlanIQ feasibility tool has the potential to provide difficult but achievable initial optimization objectives and thereby reduce the planning time needed to obtain a well-optimized plan.
Threshold-driven optimization for reference-based auto-planning
NASA Astrophysics Data System (ADS)
Long, Troy; Chen, Mingli; Jiang, Steve; Lu, Weiguo
2018-02-01
We study threshold-driven optimization methodology for automatically generating a treatment plan that is motivated by a reference DVH for IMRT treatment planning. We present a framework for threshold-driven optimization for reference-based auto-planning (TORA). Commonly used voxel-based quadratic penalties have two components for penalizing under- and over-dosing of voxels: a reference dose threshold and associated penalty weight. Conventional manual- and auto-planning using such a function involves iteratively updating the preference weights while keeping the thresholds constant, an unintuitive and often inconsistent method for planning toward some reference DVH. However, driving a dose distribution by threshold values instead of preference weights can achieve similar plans with less computational effort. The proposed methodology spatially assigns reference DVH information to threshold values, and iteratively improves the quality of that assignment. The methodology effectively handles both sub-optimal and infeasible DVHs. TORA was applied to a prostate case and a liver case as a proof-of-concept. Reference DVHs were generated using a conventional voxel-based objective, then altered to be either infeasible or easy-to-achieve. TORA was able to closely recreate reference DVHs in 5-15 iterations of solving a simple convex sub-problem. TORA has the potential to be effective for auto-planning based on reference DVHs. As dose prediction and knowledge-based planning becomes more prevalent in the clinical setting, incorporating such data into the treatment planning model in a clear, efficient way will be crucial for automated planning. A threshold-focused objective tuning should be explored over conventional methods of updating preference weights for DVH-guided treatment planning.
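The threshold-based quadratic penalty at the core of TORA can be sketched as follows. The toy influence matrix, learning rate, iteration counts, and threshold bands are illustrative assumptions; the sketch only shows the key idea that moving the thresholds, not the penalty weights, steers the achieved dose:

```python
import numpy as np

def objective_and_grad(x, A, t_under, t_over, w=1.0):
    """Voxel-based quadratic penalty with under-/over-dose thresholds:
    f(x) = w * sum_i [ max(t_under_i - d_i, 0)^2 + max(d_i - t_over_i, 0)^2 ],
    where d = A @ x is the voxel dose vector."""
    d = A @ x
    under = np.maximum(t_under - d, 0.0)
    over = np.maximum(d - t_over, 0.0)
    f = w * float((under ** 2 + over ** 2).sum())
    g = 2.0 * w * (A.T @ (over - under))
    return f, g

rng = np.random.default_rng(2)
A = rng.uniform(0.0, 1.0, (30, 6))   # toy influence matrix

def solve(t_under, t_over, steps=2000, lr=5e-4):
    """Projected gradient descent keeping the fluence non-negative."""
    x = np.zeros(6)
    for _ in range(steps):
        _, g = objective_and_grad(x, A, t_under, t_over)
        x = np.maximum(x - lr * g, 0.0)
    return A @ x

# Moving the threshold band upward pulls the achieved doses upward,
# with the penalty weight held fixed throughout.
low = solve(np.full(30, 40.0), np.full(30, 45.0))
high = solve(np.full(30, 60.0), np.full(30, 65.0))
```

In TORA the reference DVH is what supplies those per-voxel threshold values, and the outer iterations refine the assignment of DVH points to voxels.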
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ghomi, Pooyan Shirvani; Zinchenko, Yuriy
2014-08-15
Purpose: To compare methods of incorporating dose-volume histogram (DVH) curves into treatment planning optimization. Method: The performance of three methods, namely, the conventional Mixed Integer Programming (MIP) model, a convex moment-based constrained optimization approach, and an unconstrained convex moment-based penalty approach, is compared using anonymized data of a prostate cancer patient. Three plans were generated using the corresponding optimization models. Four organs at risk (OARs) and one tumor were involved in the treatment planning. The OARs and tumor were discretized into a total of 50,221 voxels. The number of beamlets was 943. We used the commercially available optimization software Gurobi and Matlab to solve the models. Plan comparison was done by recording the model runtime followed by visual inspection of the resulting dose-volume histograms. Conclusion: We demonstrate the effectiveness of the moment-based approaches in replicating the set of prescribed DVH curves. The unconstrained convex moment-based penalty approach is concluded to have the greatest potential to reduce the computational effort and holds the promise of substantial computational speed-up.
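As a rough illustration of the moment-based idea, a DVH can be summarized by a few power means of the voxel doses; matching these moments to those of a prescribed DVH yields a convex problem, unlike matching the DVH curve directly. The exact formulation in the cited work may differ; this is only a sketch under the equal-voxel-volume assumption:

```python
import numpy as np

def dose_moments(dose, orders=(1, 2, 3)):
    """Normalized dose moments (E[d^p])^(1/p) of a structure's voxel doses.

    These power means increase with p (from mean dose toward max dose), so a
    small set of them captures the rough shape of the DVH; an optimizer can
    penalize their deviation from the moments of a prescribed DVH.
    """
    dose = np.asarray(dose, dtype=float)
    return [float(np.mean(dose**p) ** (1.0 / p)) for p in orders]
```

For a uniform dose all moments coincide; for a spread-out dose they fan out toward the maximum.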
Nitsche, Mirko; Brannath, Werner; Brückner, Matthias; Wagner, Dirk; Kaltenborn, Alexander; Temme, Nils; Hermann, Robert M
2017-02-01
The objective of this retrospective planning study was to find a contouring definition for the rectum as an organ at risk (OAR) in curative three-dimensional external beam radiotherapy (EBRT) for prostate cancer (PCa) with a predictive correlation between the dose-volume histogram (DVH) and rectal toxicity. In a pre-study, the planning CT scans of 23 patients with PCa receiving definitive EBRT were analyzed. The rectum was contoured according to 13 different definitions, and the dose distribution was correlated with the respective rectal volumes by generating DVH curves. Three definitions were identified to represent the most distinct differences in the shapes of the DVH curves: one anatomical definition recommended by the Radiation Therapy Oncology Group (RTOG) and two functional definitions based on the target volume. In the main study, the correlation between different relative DVH parameters derived from these three contouring definitions and the occurrence of rectal toxicity during and after EBRT was studied in two consecutive collectives. The first cohort consisted of 97 patients receiving primary curative EBRT and the second cohort consisted of 66 patients treated for biochemical recurrence after prostatectomy. Rectal toxicity was assessed by clinical investigation and scored according to the Common Terminology Criteria for Adverse Events. Candidate parameters were the volume of the rectum, mean dose, maximal dose, volume receiving at least 60 Gy (V60), area under the DVH curve up to 25 Gy and area under the DVH curve up to 75 Gy, each in dependence on the chosen rectum definition. Multivariable logistic regression also considered other clinical factors such as pelvic lymphatics vs local target volume, diabetes, prior rectal surgery, anticoagulation and haemorrhoids.
In Cohort 1 (primary EBRT), the mean rectal volumes for the definitions "RTOG", planning target volume "(PTV)-based" and "PTV-linked" were 100 cm³ [standard deviation (SD) 43 cm³], 60 cm³ (SD 26 cm³) and 74 cm³ (SD 31 cm³), respectively (p < 0.01; analysis of variance). The mean rectal doses according to these definitions were 35 Gy (SD 8 Gy), 48 Gy (SD 4 Gy) and 44 Gy (SD 5 Gy) (p < 0.01). In Cohort 2 (salvage EBRT), the mean rectal volumes were 114 cm³ (SD 47 cm³), 64 cm³ (SD 26 cm³) and 81 cm³ (SD 30 cm³) (p < 0.01), and the mean doses received by the rectum were 36 Gy (SD 8 Gy), 49 Gy (SD 5 Gy) and 44 Gy (SD 5 Gy) (p < 0.01). Acute or subacute rectal inflammation occurred in 69 (71.9%) patients in Cohort 1 and in 43 (70.5%) in Cohort 2. We did not find a correlation between any of the investigated DVH parameters and rectal toxicity, irrespective of the investigated definition. Adding further variables in multivariate analysis substantially improved the predictive ability. Still, there was essentially no difference between the tested contouring definitions in the probability of predicting the occurrence of rectal inflammation. The RTOG anatomy-based recommendations are questionable in comparison with functional definitions, as they result in higher variances in several relative DVH parameters. Moreover, the anatomy-based definition is no better and no worse in predictive value concerning clinical end points. Advances in knowledge: Functional definitions for the rectum as OAR are easier to apply, faster to contour, have smaller variances and offer no less information than the anatomy-based RTOG definition.
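The candidate DVH parameters used in this study (V60, area under the cumulative DVH curve up to a dose limit) can be computed directly from per-voxel doses. A sketch assuming equal voxel volumes and relative (percent) volumes; function names are ours:

```python
import numpy as np

def cumulative_dvh(dose, bin_edges):
    """Relative cumulative DVH: fraction of volume receiving >= each bin edge."""
    dose = np.asarray(dose, dtype=float)
    return np.array([np.mean(dose >= d) for d in bin_edges])

def v_at_dose(dose, level):
    """V_level: percent of structure volume receiving at least `level` Gy."""
    return 100.0 * float(np.mean(np.asarray(dose, dtype=float) >= level))

def dvh_auc(dose, d_max, step=0.1):
    """Area under the relative cumulative DVH up to d_max (trapezoidal rule)."""
    edges = np.arange(0.0, d_max + step, step)
    y = cumulative_dvh(dose, edges)
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(edges)))

print(v_at_dose([10.0, 30.0, 50.0, 70.0], 60.0))  # 25.0
```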
Kamian, S; Kazemian, A; Esfahani, M; Mohammadi, E; Aghili, M
2010-01-01
To assess the possibility of delivering homogeneous irradiation with respect to the maximal tolerated dose to the optic pathway for paranasal sinus (PNS) tumors. Treatment planning with conformal three-dimensional (3D) and conventional two-dimensional (2D) techniques was done on CT scans of 20 patients who had early or advanced PNS tumors. Four cases had been previously irradiated. Dose-volume histograms (DVH) for the planning target volume (PTV) and the visual pathway, including the globes, chiasma and optic nerves, were compared between the two treatment plans. The area under the curve (AUC) in the DVH of the globes on the same side and the contralateral side of tumor involvement was significantly higher in 2D planning (p < 0.05), which caused a higher integral dose to both globes. The AUC in the DVH of the chiasma was also higher in 2D treatment planning (p = 0.002). The integral dose to the contralateral optic nerve was significantly lower with 3D planning (p = 0.007), but there was no significant difference for the optic nerve on the same side as the tumor (p > 0.05). The AUC in the DVH of the PTV did not differ significantly (201.1 ± 16.23 mm³ in 2D planning vs. 201.15 ± 15.09 mm³ in 3D planning). The volume of the PTV receiving 90% of the prescribed dose was 96.9 ± 4.41 cm³ in 2D planning and 97.2 ± 2.61 cm³ in 3D planning (p > 0.05). 3D conformal radiotherapy (RT) for PNS tumors enables delivery of radiation to the tumor with respect to critical organs and with lower toxicity to the optic pathway.
Söhn, Matthias; Alber, Markus; Yan, Di
2007-09-01
The variability of dose-volume histogram (DVH) shapes in a patient population can be quantified using principal component analysis (PCA). We applied this to rectal DVHs of prostate cancer patients and investigated the correlation of the PCA parameters with late bleeding. PCA was applied to the rectal wall DVHs of 262 patients, who had been treated with a four-field box, conformal adaptive radiotherapy technique. The correlated changes in the DVH pattern were revealed as "eigenmodes," which were ordered by their importance to represent data set variability. Each DVH is uniquely characterized by its principal components (PCs). The correlation of the first three PCs and chronic rectal bleeding of Grade 2 or greater was investigated with uni- and multivariate logistic regression analyses. Rectal wall DVHs in four-field conformal RT can primarily be represented by the first two or three PCs, which describe approximately 94% or 96% of the DVH shape variability, respectively. The first eigenmode models the total irradiated rectal volume; thus, PC1 correlates to the mean dose. Mode 2 describes the interpatient differences of the relative rectal volume in the two- or four-field overlap region. Mode 3 reveals correlations of volumes with intermediate doses (approximately 40-45 Gy) and volumes with doses >70 Gy; thus, PC3 is associated with the maximal dose. According to univariate logistic regression analysis, only PC2 correlated significantly with toxicity. However, multivariate logistic regression analysis with the first two or three PCs revealed an increased probability of bleeding for DVHs with more than one large PC. PCA can reveal the correlation structure of DVHs for a patient population as imposed by the treatment technique and provide information about its relationship to toxicity. It proves useful for augmenting normal tissue complication probability modeling approaches.
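The PCA of DVH shapes described above can be reproduced with a standard SVD of the patient-by-dose-bin DVH matrix: the right singular vectors are the eigenmodes, the projections of each centered DVH onto them are the per-patient PCs. A sketch (implementation details ours, not from the paper):

```python
import numpy as np

def dvh_pca(dvh_matrix, n_components=3):
    """PCA of a population of cumulative DVHs (one DVH per row).

    Returns the eigenmodes (principal directions), per-patient scores
    (PC1, PC2, ...), and the fraction of DVH-shape variance each mode
    explains.
    """
    X = np.asarray(dvh_matrix, dtype=float)
    Xc = X - X.mean(axis=0)                       # center across patients
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    modes = Vt[:n_components]                     # eigenmodes, one per row
    scores = Xc @ modes.T                         # PCs for each patient
    explained = s**2 / np.sum(s**2)               # variance fractions
    return modes, scores, explained[:n_components]
```

On a population whose DVHs differ only by an overall scaling, the first mode captures essentially all the variability, mirroring the dominance of PC1 reported above.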
Poder, Joel; Yuen, Johnson; Howie, Andrew; Bece, Andrej; Bucci, Joseph
2017-11-01
The purpose of this study was to assess whether deformable image registration (DIR) is required for dose accumulation of multiple high dose rate prostate brachytherapy (HDRPBT) plans treated with the same catheter pattern on two different CT datasets. DIR was applied to the planning CT images of 20 HDRPBT patients who received two treatment fractions on sequential days, on two different CT datasets, with the same implant. Quality of DIR in the Velocity and MIM image registration systems was assessed by calculating the Dice similarity coefficient (DSC) and mean distance to agreement (MDA) for the prostate, urethra and rectum contours. Accumulated doses from each system were then calculated using the same DIR technique, and dose-volume histogram (DVH) parameters were compared to manual addition with no DIR. The average DSC was found to be 0.83 (Velocity) and 0.84 (MIM), 0.80 (Velocity) and 0.80 (MIM), and 0.80 (Velocity) and 0.81 (MIM) for the prostate, rectum and urethra contours, respectively. The average difference in calculated DVH parameters between the two systems using dose accumulation was less than 1%, and no statistically significant difference was found between deformably accumulated doses in the two systems versus manual DVH addition with no DIR. Contour propagation using DIR in Velocity and MIM was shown to be at least equivalent to inter-observer contouring variability on CT. The results also indicate that dose accumulation through manual addition of DVH parameters may be sufficient for HDRPBT treatments treated with the same catheter pattern on two different CT datasets. Crown Copyright © 2017. Published by Elsevier Ltd. All rights reserved.
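The Dice similarity coefficient used above to grade the registrations is twice the overlap volume divided by the sum of the two structure volumes (1 = perfect overlap). A minimal sketch on binary masks:

```python
import numpy as np

def dice_coefficient(mask_a, mask_b):
    """Dice similarity coefficient between two binary masks."""
    a = np.asarray(mask_a, dtype=bool)
    b = np.asarray(mask_b, dtype=bool)
    denom = a.sum() + b.sum()
    if denom == 0:
        return 1.0  # both empty: define as perfect agreement
    return 2.0 * np.logical_and(a, b).sum() / denom

a = np.zeros((4, 4), dtype=bool); a[:, :2] = True   # left half of the grid
b = np.zeros((4, 4), dtype=bool); b[:, 1:3] = True  # middle half, shifted
print(dice_coefficient(a, b))  # 0.5
```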
Hayashi, K; Fujiwara, Y; Nomura, M; Kamata, M; Kojima, H; Kohzai, M; Sumita, K; Tanigawa, N
2015-02-01
To identify predictive factors for the development of pericardial effusion (PCE) in patients with oesophageal cancer treated with chemotherapy and radiotherapy (RT). From March 2006 to November 2012, patients with oesophageal cancer treated with chemoradiotherapy (CRT) meeting the following criteria were evaluated: radiation dose >50 Gy; heart included in the radiation field; dose-volume histogram (DVH) data available for analysis; no previous thoracic surgery; and no PCE before treatment. The diagnosis of PCE was independently determined by two radiologists. Clinical factors, the percentage of heart volume receiving >5-60 Gy in increments of 5 Gy (V5-V60), maximum heart dose and mean heart dose were analysed. A total of 143 patients with oesophageal cancer were reviewed retrospectively. The median follow-up by CT was 15 months (range, 2.1-72.6 months) after RT. PCE developed in 55 patients (38.5%) after RT, and the median time to develop PCE was 3.5 months (range, 0.2-9.9 months). On univariate analysis, all DVH parameters except V60 were significantly associated with the development of PCE (p < 0.001). No clinical factor was significantly related to the development of PCE. Recursive partitioning analysis including all DVH parameters as variables showed a V10 cut-off value of 72.8% to be the most influential factor. The present results showed that DVH parameters are strong independent predictive factors for the development of PCE in patients with oesophageal cancer treated with CRT. Heart dose was associated with the development of PCE with radiation, even without prophylactic nodal irradiation.
Moore, Kevin L; Schmidt, Rachel; Moiseenko, Vitali; Olsen, Lindsey A; Tan, Jun; Xiao, Ying; Galvin, James; Pugh, Stephanie; Seider, Michael J; Dicker, Adam P; Bosch, Walter; Michalski, Jeff; Mutic, Sasa
2015-06-01
The purpose of this study was to quantify the frequency and clinical severity of quality deficiencies in intensity modulated radiation therapy (IMRT) planning in the Radiation Therapy Oncology Group 0126 protocol. A total of 219 IMRT patients from the high-dose arm (79.2 Gy) of RTOG 0126 were analyzed. To quantify plan quality, we used established knowledge-based methods for patient-specific dose-volume histogram (DVH) prediction of organs at risk and a Lyman-Kutcher-Burman (LKB) model for grade ≥2 rectal complications to convert DVHs into normal tissue complication probabilities (NTCPs). The LKB model was validated by fitting dose-response parameters relative to observed toxicities. The 90th percentile (22 of 219) of plans with the lowest excess risk (difference between clinical and model-predicted NTCP) were used to create a model for the presumed best practices in the protocol (pDVH0126,top10%). Applying the resultant model to the entire sample enabled comparisons between DVHs that patients could have received to DVHs they actually received. Excess risk quantified the clinical impact of suboptimal planning. Accuracy of pDVH predictions was validated by replanning 30 of 219 patients (13.7%), including equal numbers of presumed "high-quality," "low-quality," and randomly sampled plans. NTCP-predicted toxicities were compared to adverse events on protocol. Existing models showed that bladder-sparing variations were less prevalent than rectum quality variations and that increased rectal sparing was not correlated with target metrics D98 and D2 (dose received by 98% and 2% of the PTV, respectively). Observed toxicities were consistent with current LKB parameters. Converting DVH and pDVH0126,top10% to rectal NTCPs, we observed 94 of 219 patients (42.9%) with ≥5% excess risk, 20 of 219 patients (9.1%) with ≥10% excess risk, and 2 of 219 patients (0.9%) with ≥15% excess risk.
Replanning demonstrated the predicted NTCP reductions while maintaining the volume of the PTV receiving prescription dose. An equivalent sample of high-quality plans showed fewer toxicities than low-quality plans, 6 of 73 versus 10 of 73 respectively, although these differences were not significant (P=.21) due to insufficient statistical power in this retrospective study. Plan quality deficiencies in RTOG 0126 exposed patients to substantial excess risk for rectal complications. Copyright © 2015 Elsevier Inc. All rights reserved.
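The LKB model used in this study converts a DVH into an NTCP by reducing the dose distribution to a generalized equivalent uniform dose (volume parameter n) and applying a probit response (slope m, 50% tolerance dose TD50). A sketch assuming equal voxel volumes; the parameter values in the usage line are illustrative, not the fitted values from the protocol:

```python
import math

def lkb_ntcp(doses, n, m, td50):
    """Lyman-Kutcher-Burman NTCP from per-voxel doses (equal voxel volumes).

    Reduces the DVH to a gEUD with volume parameter n, then applies a
    probit dose-response with slope m and 50% tolerance dose TD50.
    """
    a = 1.0 / n
    geud = (sum(d**a for d in doses) / len(doses)) ** (1.0 / a)
    t = (geud - td50) / (m * td50)
    # standard normal CDF expressed via the error function
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

# Uniform irradiation at TD50 gives, by construction, a 50% complication risk.
print(round(lkb_ntcp([80.0] * 10, n=0.09, m=0.13, td50=80.0), 3))  # 0.5
```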
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yin, L; Lin, A; Ahn, P
Purpose: To utilize online CBCT scans to develop models for predicting DVH metrics in proton therapy of head and neck tumors. Methods: Nine patients with locally advanced oropharyngeal cancer were retrospectively selected for this study. Deformable image registration was applied to map the simulation CT, target volumes, and organ-at-risk (OAR) contours onto each weekly CBCT scan. Intensity modulated proton therapy (IMPT) treatment plans were created on the simulation CT and forward calculated onto each corrected CBCT scan. Thirty-six potentially predictive metrics were extracted from each corrected CBCT. These features include minimum/maximum/mean over- and under-ranges at the proximal and distal surfaces of PTV volumes, and geometrical and water-equivalent distances between the PTV and each OAR. Principal component analysis (PCA) was used to reduce the dimension of the extracted features; three principal components were found to account for over 90% of the variance in those features. Datasets from eight patients were used to train a machine learning model to fit these principal components to DVH metrics (dose to 95% and 5% of the PTV, mean or max dose to OARs) from the forward-calculated dose on each corrected CBCT. The accuracy of this model was verified on the dataset from the ninth patient. Results: The predicted changes of DVH metrics from the model were in good agreement with actual values calculated on corrected CBCT images. Median differences were within 1 Gy for most DVH metrics except for the larynx and constrictor mean doses. However, a large spread of the differences was observed, indicating that additional training datasets and predictive features are needed to improve the model. Conclusion: Intensity-corrected CBCT scans hold the potential to be used for online verification of proton therapy and prediction of delivered dose distributions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
D’Souza, W; Zhang, B; Feigenberg, S
Purpose: To evaluate compliance with evidence-based treatment planning organ-at-risk (OAR) guidelines in a single institution with four practice sites. Methods: Two hundred thirteen head and neck cancer patients treated between September 2009 and September 2013 were retrospectively selected. Consensus treatment planning guidelines, including OAR dose constraints, were established based on institutional experience and published data. Data spanned a time period of 2 years prior to (n=112) and 2 years after enactment (n=101) of the guidelines. We investigated the differences in the frequency with which (1) OARs were contoured and (2) OAR DVH goals were met. Trends in the proportion of cases with OAR contours over time were tested using linear regression. Trends in the proportion of contoured OARs achieving clinical DVH goals were similarly tested. The proportion of patients contoured and meeting DVH goals before and after guidelines was compared using a test of proportions. Results: When the proportion of cases with OAR contours before and after guidelines was compared, we observed an increase from 75% to 87% (p=0.02) for the brainstem, a decrease from 97% to 88% (p=0.01) for the cord, and an increase from 47% to 77% (p<0.001) for the mandible. For the proportion of cases with OAR contours in which clinical goals were met, a significant decrease from 99% to 90% was observed for the cord V48<0.3% (p=0.001). A significant decrease in the proportion of cases with left parotid contours (from 92% to 73%, p=0.03) was observed over the 2 years after guideline enactment, while the proportion meeting the clinical DVH goal of V30<50% increased significantly from 36% to 50% (p=0.007) over the same period. Conclusion: The enactment of OAR planning guidelines resulted in an overall increase in OAR contouring compliance. In cases with OAR contours, there was little to no change in the proportion that met clinical goals.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
The ultimate goal of radiotherapy treatment planning is to find a treatment that will yield a high tumor control probability (TCP) with an acceptable normal tissue complication probability (NTCP). Yet most treatment planning today is not based upon optimization of TCPs and NTCPs, but rather upon meeting physical dose and volume constraints defined by the planner. It has been suggested that treatment planning evaluation and optimization would be more effective if they were biologically and not dose/volume based, and this is the claim debated in this month's Point/Counterpoint. After a brief overview of biologically and DVH based treatment planning by the Moderator Colin Orton, Joseph Deasy (for biological planning) and Charles Mayo (against biological planning) will begin the debate. Some of the arguments in support of biological planning include: (1) it will result in more effective dose distributions for many patients; (2) DVH-based measures of plan quality are known to have little predictive value; (3) there is little evidence that either D95 or D98 of the PTV is a good predictor of tumor control; (4) sufficient validated outcome prediction models are now becoming available and should be used to drive planning and optimization. Some of the arguments against biological planning include: (1) several decades of experience with DVH-based planning should not be discarded; (2) we do not know enough about the reliability and errors associated with biological models; (3) the radiotherapy community in general has little direct experience with side-by-side comparisons of DVH vs biological metrics and outcomes; (4) it is unlikely that a clinician would accept extremely cold regions in a CTV or hot regions in a PTV, despite having acceptable TCP values. Learning Objectives: To understand dose/volume based treatment planning and its potential limitations. To understand biological metrics such as EUD, TCP, and NTCP. To understand biologically based treatment planning and its potential limitations.
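Of the biological metrics named in the learning objectives, TCP is often modelled with a Poisson cell-kill response. One common parameterization (the Källman form, chosen here purely for illustration; the debate does not endorse any specific model) combines per-voxel control probabilities weighted by relative volume:

```python
import math

def poisson_tcp(doses, d50, gamma50):
    """Poisson-model TCP from per-voxel doses (equal voxel volumes assumed).

    Per-voxel control probability uses the Källman form with D50 (dose for
    50% control) and normalized slope gamma50; voxel probabilities combine
    as a volume-weighted product.
    """
    n = len(doses)
    tcp = 1.0
    for d in doses:
        # P(d) = 2^(-exp(e * gamma50 * (1 - d/D50))); P(D50) = 0.5 by design
        p = 2.0 ** (-math.exp(math.e * gamma50 * (1.0 - d / d50)))
        tcp *= p ** (1.0 / n)
    return tcp
```

A uniform dose at D50 yields TCP = 0.5, and a cold spot in any voxel drags the whole product down, which is exactly the clinical objection to accepting cold CTV regions on TCP grounds alone.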
WE-B-304-03: Biological Treatment Planning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Orton, C.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deasy, J.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mayo, C.
MO-FG-CAMPUS-TeP2-04: Optimizing for a Specified Target Coverage Probability
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fredriksson, A
2016-06-15
Purpose: The purpose of this work is to develop a method for inverse planning of radiation therapy margins. When using this method, the user specifies a desired target coverage probability and the system optimizes to meet the demand without any explicit specification of margins to handle setup uncertainty. Methods: The method determines which voxels to include in an optimization function promoting target coverage in order to achieve a specified target coverage probability. Voxels are selected in a way that retains the correlation between them: the target is displaced according to the setup errors, and the voxels to include are selected as the union of the displaced target regions under the x% best scenarios according to some quality measure. The quality measure could depend on the dose to the considered structure alone, or on the dose to multiple structures in order to take into account correlation between structures. Results: A target coverage function was applied to the CTV of a prostate case with a prescription of 78 Gy and compared to conventional planning using a DVH function on the PTV. Planning was performed to achieve 90% probability of CTV coverage. The plan optimized using the coverage probability function had P(D98 > 77.95 Gy) = 0.97 for the CTV. The PTV plan using a constraint on minimum DVH of 78 Gy at 90% volume had P(D98 > 77.95 Gy) = 0.44 for the CTV. To match the coverage probability optimization, the DVH volume parameter had to be increased to 97%, which resulted in 0.5 Gy higher average dose to the rectum. Conclusion: Optimizing a target coverage probability is an easily used method to find a margin that achieves the desired coverage probability. It can lead to reduced OAR doses at the same coverage probability compared to planning with margins and DVH functions.
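The scenario-selection step described in the methods (displace the target per setup-error scenario, keep the union over the best x% of scenarios) can be sketched in one dimension as follows; the data layout and quality measure are simplified assumptions, not the paper's implementation:

```python
import numpy as np

def coverage_voxels(target_idx, shifts, quality, coverage_prob):
    """Select voxels for a target-coverage objective from setup-error scenarios.

    Each scenario i displaces the target voxel indices by shifts[i]; scenarios
    are ranked by `quality` (higher is better) and the union of the displaced
    targets over the best `coverage_prob` fraction of scenarios is returned.
    1-D toy version: real use would displace a 3-D mask.
    """
    order = np.argsort(quality)[::-1]                       # best scenarios first
    n_keep = max(1, int(round(coverage_prob * len(shifts))))
    sel = set()
    for i in order[:n_keep]:
        sel.update(v + shifts[i] for v in target_idx)       # displaced target
    return sorted(sel)

# Target at voxels 5-6; keep the best 50% of four shift scenarios.
print(coverage_voxels([5, 6], [-1, 0, 1, 2], [0.9, 1.0, 0.8, 0.1], 0.5))  # [4, 5, 6]
```

Because voxels enter as whole displaced targets, the correlation between neighboring voxels under a common setup shift is preserved, as the abstract requires.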
DOE Office of Scientific and Technical Information (OSTI.GOV)
Men, Yujie; Feil, Helene; Verberkmoes, Nathan C
2012-01-01
Dehalococcoides ethenogenes strain 195 (DE195) was grown in a sustainable syntrophic association with Desulfovibrio vulgaris Hildenborough (DVH) as a co-culture, as well as with DVH and the hydrogenotrophic methanogen Methanobacterium congolense (MC) as a tri-culture, using lactate as the sole energy and carbon source. In the co- and tri-cultures, maximum dechlorination rates of DE195 were enhanced by approximately three times (11.0 ± 0.01 μmol per day for the co-culture and 10.1 ± 0.3 μmol per day for the tri-culture) compared with DE195 grown alone (3.8 ± 0.1 μmol per day). Cell yield of DE195 was enhanced in the co-culture (9.0 ± 0.5 × 10⁷ cells per μmol Cl⁻ released, compared with 6.8 ± 0.9 × 10⁷ cells per μmol Cl⁻ released for the pure culture), whereas no further enhancement was observed in the tri-culture (7.3 ± 1.8 × 10⁷ cells per μmol Cl⁻ released). The transcriptome of DE195 grown in the co-culture was analyzed using a whole-genome microarray targeting DE195, which detected 102 significantly up- or down-regulated genes compared with DE195 grown in isolation, whereas no significant transcriptomic difference was observed between co- and tri-cultures. Proteomic analysis showed that 120 proteins were differentially expressed in the co-culture compared with DE195 grown in isolation. Physiological, transcriptomic and proteomic results indicate that the robust growth of DE195 in co- and tri-cultures is because of the advantages associated with the capabilities of DVH to ferment lactate to provide H₂ and acetate for growth, along with potential benefits from proton translocation, cobalamin salvaging and amino acid biosynthesis, whereas MC in the tri-culture provided no significant additional benefits beyond those of DVH.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Berger, Daniel; Dimopoulos, Johannes; Georg, Petra
2007-04-01
Purpose: The vagina has not been widely recognized as an organ at risk in brachytherapy for cervical cancer, and no widely accepted dose parameters are available. This study analyzes the uncertainties in dose reporting for the vaginal wall using tandem-ring applicators. Methods and Materials: Organ wall contours were delineated on axial magnetic resonance (MR) slices to perform dose-volume histogram (DVH) analysis. Different DVH parameters were used in a feasibility study based on 40 magnetic resonance imaging (MRI)-based treatment plans of different cervical cancer patients. Doses to the most irradiated 0.1 cm³, 1 cm³ and 2 cm³, and at defined points on the ring surface and at 5-mm tissue depth, were reported. Treatment-planning systems allow different methods of dose point definition. Film dosimetry was used to verify the maximum dose at the surface of the ring applicator in an experimental setup. Results: Dose reporting for the vagina is extremely sensitive to geometrical uncertainties, with variations of 25% for 1-mm shifts. Accurate delineation of the vaginal wall is limited by the finite pixel size of MRI and available treatment-planning systems. No significant correlation was found between dose-point and dose-volume parameters. The DVH parameters were often related to noncontiguous volumes and were not able to distinguish very different spatial dose distributions inside the vaginal wall. Deviations between measured and calculated doses were up to 21%. Conclusions: Reporting either point dose values or DVH parameters for the vaginal wall carries high inaccuracies because of contouring and geometric positioning. Therefore, the use of prospective dose constraints for individual treatment plans is not recommended at present. However, for large patient groups treated within one protocol, correlation with vaginal morbidity can be evaluated.
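The D0.1cm³/D1cm³/D2cm³ parameters reported here are the minimum doses in the most-irradiated absolute sub-volumes of the organ wall. A sketch assuming per-voxel doses with a uniform voxel volume (function name ours):

```python
import numpy as np

def dose_to_hottest_cc(doses, voxel_volume_cc, volume_cc):
    """Minimum dose in the most-irradiated `volume_cc` of a structure (e.g. D2cc).

    Sorts per-voxel doses hottest-first and takes enough voxels to fill the
    requested absolute volume; a uniform voxel volume is assumed.
    """
    d = np.sort(np.asarray(doses, dtype=float))[::-1]
    n = max(1, int(np.ceil(volume_cc / voxel_volume_cc)))
    return float(d[min(n, len(d)) - 1])

doses = np.array([70.0, 65.0, 60.0, 55.0, 50.0])  # 0.5 cc voxels
print(dose_to_hottest_cc(doses, voxel_volume_cc=0.5, volume_cc=1.0))  # 65.0
```

Because the contributing voxels need not be adjacent, the metric can describe a noncontiguous volume, which is exactly the limitation the abstract notes for thin-walled organs.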
DOE Office of Scientific and Technical Information (OSTI.GOV)
Demirag, N
Purpose: To verify the benefits of biological cost functions (CFs). Methods: The TG-166 patients were used as test case scenarios. Patients were planned using Monaco V5.0 (CMS/Elekta, St. Louis, MO), which provides 3 biological and 8 physical CFs. In this study the plans were optimized under 3 different scenarios: (1) biological CFs only; (2) physical CFs only; (3) a combination of physical and biological CFs. The Target EUD CF, used for targets and derived from the Poisson cell-kill model, has an α value that controls cold spots inside the target; α values of 0.5 and 0.8 were used in the optimization, and the α value is increased when cold spots need to be penalized. The Serial CF, named for mimicking the behavior of serial organs, has a k parameter (ranging from 1 to 20) that shapes the whole DVH curve: a low k value (k = 1) controls the mean dose, a high k value (e.g., 12 or 14) controls the maximum dose, and using 2 Serial CFs with different k values controls the entire DVH. The Parallel CF mimics the behavior of parallel organs by controlling the percentage of the volume that tolerates doses higher than the reference dose. Results: It was possible to achieve clinically accepted plans in all 3 scenarios. The benefits of the biological cost functions were control of the mean dose for targets and OARs, and shaping of the whole DVH curve with a single EUD value and a single k value, which simplifies the optimization process. Using the biological CFs alone, it was hard to control the dose at a point. Conclusion: The biological CFs in Monaco do not require NTCP/TCP values from the labs and are useful for shaping the whole DVH curve. Disclosure: The author works as an applications support specialist for Elekta and is a Ph.D. student in radiation therapy physics at Istanbul University.
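The volume-effect behavior the abstract attributes to the Serial cost function (k = 1 tracks the mean dose, large k tracks the maximum dose) can be illustrated with a generic generalized-EUD sketch. This is not Monaco's implementation; the function name and sample DVH bins are illustrative only.

```python
import numpy as np

def geud(bin_doses, bin_volumes, k):
    """Generalized EUD over a differential DVH (generic formula, not Monaco's code).

    bin_doses   : dose to each DVH bin (Gy)
    bin_volumes : fractional volume of each bin (should sum to 1)
    k           : volume-effect exponent; k = 1 gives the mean dose,
                  large k (e.g., 12-14) approaches the maximum dose
    """
    d = np.asarray(bin_doses, dtype=float)
    v = np.asarray(bin_volumes, dtype=float)
    return float((v * d**k).sum() ** (1.0 / k))
```

For a toy DVH with equal thirds at 10, 20, and 30 Gy, k = 1 returns the mean dose (20 Gy), while k = 20 returns a value close to the 30 Gy maximum, mirroring the mean-dose versus maximum-dose control described above.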
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moore, Kevin L., E-mail: kevinmoore@ucsd.edu; Schmidt, Rachel; Moiseenko, Vitali
Purpose: The purpose of this study was to quantify the frequency and clinical severity of quality deficiencies in intensity modulated radiation therapy (IMRT) planning in the Radiation Therapy Oncology Group 0126 protocol. Methods and Materials: A total of 219 IMRT patients from the high-dose arm (79.2 Gy) of RTOG 0126 were analyzed. To quantify plan quality, we used established knowledge-based methods for patient-specific dose-volume histogram (DVH) prediction of organs at risk and a Lyman-Kutcher-Burman (LKB) model for grade ≥2 rectal complications to convert DVHs into normal tissue complication probabilities (NTCPs). The LKB model was validated by fitting dose-response parameters relative to observed toxicities. The 90th percentile (22 of 219) of plans with the lowest excess risk (difference between clinical and model-predicted NTCP) was used to create a model for the presumed best practices in the protocol (pDVH_0126,top10%). Applying the resultant model to the entire sample enabled comparisons between the DVHs that patients could have received and the DVHs they actually received. Excess risk quantified the clinical impact of suboptimal planning. The accuracy of pDVH predictions was validated by replanning 30 of 219 patients (13.7%), including equal numbers of presumed "high-quality," "low-quality," and randomly sampled plans. NTCP-predicted toxicities were compared to adverse events on protocol. Results: Existing models showed that bladder-sparing variations were less prevalent than rectum quality variations and that increased rectal sparing was not correlated with target metrics (dose received by 98% and 2% of the PTV, respectively). Observed toxicities were consistent with current LKB parameters. Converting DVH and pDVH_0126,top10% to rectal NTCPs, we observed 94 of 219 patients (42.9%) with ≥5% excess risk, 20 of 219 patients (9.1%) with ≥10% excess risk, and 2 of 219 patients (0.9%) with ≥15% excess risk.
Replanning demonstrated the predicted NTCP reductions while maintaining the volume of the PTV receiving the prescription dose. An equivalent sample of high-quality plans showed fewer toxicities than low-quality plans (6 of 73 vs 10 of 73), although these differences were not significant (P=.21) because of insufficient statistical power in this retrospective study. Conclusions: Plan quality deficiencies in RTOG 0126 exposed patients to substantial excess risk for rectal complications.
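The conversion of a DVH into an NTCP used above follows the standard Lyman-Kutcher-Burman formalism. The sketch below is that generic formalism, not this study's fitted model; the parameter values in the test call are placeholders.

```python
from math import erf, sqrt

def lkb_ntcp(bin_doses, bin_volumes, n, m, td50):
    """Lyman-Kutcher-Burman NTCP from a differential DVH (generic formalism).

    n    : volume-effect parameter (small n ~ serial organ behavior)
    m    : slope of the dose-response curve
    td50 : uniform whole-organ dose giving 50% complication probability (Gy)
    """
    # Kutcher-Burman DVH reduction to a generalized EUD
    geud = sum(v * d**(1.0 / n) for d, v in zip(bin_doses, bin_volumes)) ** n
    t = (geud - td50) / (m * td50)
    # Probit (cumulative normal) dose-response
    return 0.5 * (1.0 + erf(t / sqrt(2.0)))
```

A uniform whole-organ dose equal to td50 returns an NTCP of 0.5 by construction, which is a quick sanity check on any implementation.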
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anetai, Y; Mizuno, H; Sumida, I
2015-06-15
Purpose: To determine which proton planning technique on average CT is more vulnerable to respiratory-motion-induced density changes and the interplay effect among (a) IMPT with CTV-based minimax robust optimization accounting for a 5-mm setup error, and (b, c) IMPT and SFUD optimized on a 5-mm-expanded PTV. Methods: The three planning techniques were optimized in RayStation with a prescription of 60 Gy in 25 fractions and nearly identical OAR constraints/objectives for each of 10 NSCLC patients. The 4D dose without and with the interplay effect was recalculated on eight 4D-CT phases and accumulated after deforming the dose of each phase to a reference (exhalation) phase. The change in D98% of each CTV caused by density changes and interplay was determined. In addition, the DVH information vector (D99%, D98%, D95%, Dave, D50%, D2%, D1%) was evaluated with the η score = (cosine similarity × Pearson correlation coefficient − 0.9) × 1000, which compares whole DVHs and quantifies the degree of DVH change; a score below 100 indicates a changed DVH. Results: The 3D plans of each technique satisfied our clinical goals. The D98% shift (mean±SD, Gy) due to density changes was largest in (c): −0.78±1.1, versus (a): −0.11±0.65 and (b): −0.59±0.93. The largest shift due to the interplay effect was also in (c): −0.54±0.70, versus (a): −0.25±0.93 and (b): −0.12±0.13. Moreover, the lowest η score caused by density changes was also in (c): 69, while (a) and (b) stayed around 90. The η score also indicated less effect from interplay than from density changes. Note that the changed DVHs were generally still clinically acceptable. Paired t-tests showed a significantly smaller density-change effect in (a) (p<0.05) than in (b) or (c) and no significant difference in the interplay effect. Conclusion: CTV-based robustly optimized IMPT was more robust against respiratory-motion-induced density changes than PTV-based IMPT and SFUD. The interplay effect was smaller than the effect of density changes and similar among the three techniques.
The JSPS Core-to-Core Program (No. 23003), Japan Society for the Promotion of Science Grant-in-Aid for Scientific Research (No. 23390300), Grant-in-Aid for Young Scientists (B) (No. 21791194) and Grant-in-Aid for Cancer Research (H22-3rd Term Cancer Control-General-043)
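The η score defined above combines the cosine similarity and Pearson correlation of two DVH information vectors (D99%, D98%, D95%, Dave, D50%, D2%, D1%). A minimal sketch of that published formula follows; the function name and sample vectors are illustrative.

```python
import numpy as np

def eta_score(dvh_ref, dvh_test):
    """η score = (cosine similarity × Pearson correlation − 0.9) × 1000.

    Identical DVH vectors score exactly 100; a score below 100
    indicates a changed DVH, per the abstract's criterion.
    """
    a = np.asarray(dvh_ref, dtype=float)
    b = np.asarray(dvh_test, dtype=float)
    cosine = a.dot(b) / (np.linalg.norm(a) * np.linalg.norm(b))
    pearson = np.corrcoef(a, b)[0, 1]
    return (cosine * pearson - 0.9) * 1000.0
```

A DVH vector compared with itself scores 100; any shift or reshaping lowers the cosine term, pulling the score below 100.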
Mayo, Charles S; Moran, Jean M; Bosch, Walter; Xiao, Ying; McNutt, Todd; Popple, Richard; Michalski, Jeff; Feng, Mary; Marks, Lawrence B; Fuller, Clifton D; Yorke, Ellen; Palta, Jatinder; Gabriel, Peter E; Molineu, Andrea; Matuszak, Martha M; Covington, Elizabeth; Masi, Kathryn; Richardson, Susan L; Ritter, Timothy; Morgas, Tomasz; Flampouri, Stella; Santanam, Lakshmi; Moore, Joseph A; Purdie, Thomas G; Miller, Robert C; Hurkmans, Coen; Adams, Judy; Jackie Wu, Qing-Rong; Fox, Colleen J; Siochi, Ramon Alfredo; Brown, Norman L; Verbakel, Wilko; Archambault, Yves; Chmura, Steven J; Dekker, Andre L; Eagle, Don G; Fitzgerald, Thomas J; Hong, Theodore; Kapoor, Rishabh; Lansing, Beth; Jolly, Shruti; Napolitano, Mary E; Percy, James; Rose, Mark S; Siddiqui, Salim; Schadt, Christof; Simon, William E; Straube, William L; St James, Sara T; Ulin, Kenneth; Yom, Sue S; Yock, Torunn I
2018-03-15
A substantial barrier to the single- and multi-institutional aggregation of data to support clinical trials, practice quality improvement efforts, and development of big data analytics resource systems is the lack of standardized nomenclatures for expressing dosimetric data. To address this issue, the American Association of Physicists in Medicine (AAPM) Task Group 263 was charged with providing nomenclature guidelines and values in radiation oncology for use in clinical trials, data-pooling initiatives, population-based studies, and routine clinical care by standardizing: (1) structure names across image processing and treatment planning system platforms; (2) nomenclature for dosimetric data (eg, dose-volume histogram [DVH]-based metrics); (3) templates for clinical trial groups and users of an initial subset of software platforms to facilitate adoption of the standards; (4) formalism for nomenclature schema, which can accommodate the addition of other structures defined in the future. A multisociety, multidisciplinary, multinational group of 57 members representing stakeholders ranging from large academic centers to community clinics and vendors was assembled, including physicists, physicians, dosimetrists, and vendors. The stakeholder groups represented in the membership included the AAPM, American Society for Radiation Oncology (ASTRO), NRG Oncology, European Society for Radiation Oncology (ESTRO), Radiation Therapy Oncology Group (RTOG), Children's Oncology Group (COG), Integrating Healthcare Enterprise in Radiation Oncology (IHE-RO), and Digital Imaging and Communications in Medicine working group (DICOM WG). A nomenclature system for target and organ-at-risk volumes and DVH nomenclature was developed and piloted to demonstrate viability across a range of clinics and within the framework of clinical trials. The final report was approved by AAPM in October 2017.
The approval process included review by 8 AAPM committees, with additional review by ASTRO, ESTRO, and the American Association of Medical Dosimetrists (AAMD). This Executive Summary of the report highlights the key recommendations for clinical practice, research, and trials. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
SU-E-T-29: A Web Application for GPU-Based Monte Carlo IMRT/VMAT QA with Delivered Dose Verification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Folkerts, M; University of California, San Diego, La Jolla, CA; Graves, Y
Purpose: To enable an existing web application for GPU-based Monte Carlo (MC) 3D dosimetry quality assurance (QA) to compute “delivered dose” from linac logfile data. Methods: We added significant features to an IMRT/VMAT QA web application built on existing technologies (HTML5, Python, and Django). The tool interfaces with Python and C code libraries and command-line GPU applications to perform MC-based IMRT/VMAT QA, and automates many complicated aspects of interfacing clinical DICOM and logfile data with cutting-edge GPU software to run an MC dose calculation. The resulting web app is powerful, easy to use, and able to re-compute both plan dose (from DICOM data) and delivered dose (from logfile data). Both dynalog and trajectory log file formats are supported. Users upload zipped DICOM RP, CT, and RD data and set the expected statistical uncertainty for the MC dose calculation. A 3D gamma index map, 3D dose distribution, gamma histogram, dosimetric statistics, and DVH curves are displayed to the user. Additionally, the user may upload the delivery logfile data from the linac to compute a “delivered dose” calculation and corresponding gamma tests. A comprehensive PDF QA report summarizing the results can also be downloaded. Results: We successfully improved a web app for a GPU-based QA tool that comprises logfile parsing, fluence map generation, CT image processing, GPU-based MC dose calculation, gamma index calculation, and DVH calculation. The result is an IMRT and VMAT QA tool that conducts an independent dose calculation for a given treatment plan and delivery log file, taking DICOM data and logfile data to compute the plan dose and delivered dose, respectively. Conclusion: We successfully improved a GPU-based MC QA tool to allow logfile-based dose calculation. Its high efficiency and accessibility will greatly facilitate IMRT and VMAT QA.
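The gamma tests mentioned above follow the standard gamma-index formalism, which combines distance-to-agreement with dose difference. The sketch below is a simplified 1D, global-normalization version on a shared grid, for illustration only; a clinical 3D implementation searches a neighborhood in three dimensions.

```python
import numpy as np

def gamma_1d(positions_mm, dose_ref, dose_eval, dta_mm=3.0, dd_percent=3.0):
    """Simplified 1D global gamma index on a common grid (illustrative sketch).

    For each reference point, gamma is the minimum over evaluated points of
    sqrt((distance/DTA)^2 + (dose difference/criterion)^2); gamma <= 1 passes.
    """
    x = np.asarray(positions_mm, dtype=float)
    ref = np.asarray(dose_ref, dtype=float)
    ev = np.asarray(dose_eval, dtype=float)
    dd = dd_percent / 100.0 * ref.max()  # global dose-difference criterion
    gamma = np.empty_like(ref)
    for i in range(len(ref)):
        dist_term = ((x - x[i]) / dta_mm) ** 2
        dose_term = ((ev - ref[i]) / dd) ** 2
        gamma[i] = np.sqrt((dist_term + dose_term).min())
    return gamma
```

Identical distributions give gamma = 0 everywhere, and a uniform 2% dose scaling passes a 3%/3 mm criterion, matching the intuition behind the passing rates reported in such QA tools.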
SU-G-BRC-04: Collimator Angle Optimization in Volumetric Modulated Arc Therapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andersen, A; Johnson, C; Bartlett, G
2016-06-15
Purpose: Volumetric modulated arc therapy (VMAT) has revolutionized radiation treatment by decreasing treatment time and monitor units, thus reducing scattered and whole-body radiation dose. As the collimator angle changes, the apparent leaf gap becomes larger, which can impact plan quality, organ-at-risk (OAR) sparing, and the IMRT QA passing rate; this is investigated here. Methods: Two sites with maximum utilization of VMAT (prostate and head and neck) were investigated, using two previously treated VMAT patients. For each patient, 10 plans were created by maintaining constant optimization constraints while varying the collimator angle from 0 to 90 degrees at 10-degree intervals for the first arc, with the appropriate complementary angle for the second arc. Plans were created with the AAA algorithm using a 6 MV beam on a Varian iX machine with a Millennium 120 MLC. The dose-volume histogram (DVH) for each plan was exported, and dosimetric parameters (D98, D95, D50, D2) as well as the homogeneity index (HI) and conformity index (CI) were computed. Each plan was validated for QA using ArcCheck with gamma index passing criteria of 2%/2 mm and 3%/3 mm. Additionally, the normal tissue complication probability (NTCP) for each OAR was computed using the Uzan-Nahum software. Results: The CI values showed no dependence on collimator angle for either site, since target volume coverage was the same at every collimator angle after optimization for adequate coverage. The HI, which reflects the DVH gradient and dose uniformity in the PTV, showed a clear trend at both sites. The NTCP for the OARs (brain and cochlea in the H&N plan; bladder and rectum in the prostate plan) showed a distinct superiority for collimator angles between 15 and 30 degrees. The gamma passing rates were not correlated with angle. Conclusion: Based on CI, HI, NTCP, and the gamma passing index, collimator angles should be maintained within 15-30 degrees.
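The HI and CI above can be computed directly from the exported DVH metrics. The abstract does not state which definitions were used, so the sketch below assumes the common ICRU 83-style HI = (D2 − D98)/D50 and a simple RTOG-style CI (prescription isodose volume over target volume); both definitions are assumptions for illustration.

```python
def homogeneity_index(d2, d98, d50):
    """ICRU 83-style HI = (D2 - D98) / D50; lower values mean a more uniform PTV dose."""
    return (d2 - d98) / d50

def conformity_index(prescription_isodose_volume, target_volume):
    """RTOG-style CI = V_prescription_isodose / V_target (one of several definitions in use)."""
    return prescription_isodose_volume / target_volume
```

For example, a plan with D2 = 52 Gy, D98 = 48 Gy, and D50 = 50 Gy gives HI = 0.08; a CI near 1 indicates the prescription isodose volume closely matches the target volume.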
Tyler, Madelaine K
2016-01-08
This study quantified the interplay and gradient effects on GTV dose coverage for 3D CRT, dMLC IMRT, and VMAT SABR treatments for target amplitudes of 5-30 mm, using 3DVH v3.1 software incorporating the 4D Respiratory MotionSim (4D RMS) module. For clinically relevant motion periods (5 s), the interplay effect was small, with deviations in the minimum dose covering the target volume (D99%) of less than ± 2.5% for target amplitudes up to 30 mm. Increasing the period to 60 s resulted in interplay effects of up to ± 15.0% on target D99% dose coverage. The gradient effect introduced by target motion resulted in deviations of up to ± 3.5% in D99% target dose coverage. VMAT treatments showed the largest deviation in dose metrics, which was attributed to their long delivery times in comparison to dMLC IMRT. Retrospective patient analysis indicated minimal interplay and gradient effects for patients treated with dMLC IMRT at the NCCI.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Q
Purpose: To meet clinical and research requirements, we developed and verified a function for automatically reading doses of interest from the dose-volume histogram (DVH), replacing the traditional method of pointing with a mouse one point at a time. Methods: The DVH automatic reading function was developed in an in-house radiotherapy information management system (RTIMS) based on Apache+PHP+MySQL. A DVH ASCII file is exported from Varian Eclipse V8.6 and includes the following contents: (1) basic information of the patient; (2) dose information of the plan; (3) dose information of the structures, including basic information and dose-volume data of target volumes and organs at risk. The default exported dose-volume data include relative doses in 1% steps, the corresponding absolute doses, and cumulative relative volumes given to 4 decimal places. Clinically, we often need to read the doses at integer-percent volumes, such as D50 and D30. These cannot be obtained directly from the above data, but can be computed by linear interpolation between the neighboring volumes and doses: Dx = D2 − (V2 − Vx)×(D2 − D1)/(V2 − V1). A function was programmed to search, read, and calculate the corresponding data; the doses for all preset volumes of interest of all structures can be read automatically, patient by patient, and saved as a CSV file. For verification, we selected 24 IMRT plans for prostate cancer; the doses of interest were PTV D98/D95/D5/D2, bladder D30/D50, and rectum D25/D50. Two groups of data, obtained with the automatic reading method (ARM) and the pointed dose method (PDM), were analyzed with SPSS 16: absolute difference = D_ARM − D_PDM; relative difference = absolute difference × 100% / prescription dose (7600 cGy). Results: The differences were as follows: PTV D98/D95/D5/D2: −0.04%/−0.04%/0.13%/0.19%; bladder D30/D50: −0.02%/0.01%; and rectum D25/D50: 0.03%/0.01%. Conclusion: Using this function, the error is very small and can be neglected.
It could greatly improve the efficiency of clinical work. Project supported by the National Natural Science Foundation of China (Grant No. 81101694)
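The interpolation step described above can be sketched directly from the stated formula Dx = D2 − (V2 − Vx)(D2 − D1)/(V2 − V1); the function name and the toy cumulative DVH below are illustrative, not the RTIMS code.

```python
def dose_at_volume(volumes, doses, vx):
    """Dose Dx at cumulative relative volume vx from a cumulative DVH.

    `volumes` must decrease monotonically along the curve (cumulative DVH);
    `doses` are the corresponding dose values. Uses the linear interpolation
    Dx = D2 - (V2 - Vx) * (D2 - D1) / (V2 - V1) between bracketing samples.
    """
    points = list(zip(volumes, doses))
    for (v1, d1), (v2, d2) in zip(points, points[1:]):
        if v1 >= vx >= v2:  # vx bracketed by this segment
            if v1 == v2:
                return d2
            return d2 - (v2 - vx) * (d2 - d1) / (v2 - v1)
    raise ValueError("requested volume is outside the DVH range")
```

On a toy cumulative DVH, asking for a volume that falls exactly on a sample returns that sample's dose, and a volume midway between two samples returns the linearly interpolated dose between them.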
A study of optimization techniques in HDR brachytherapy for the prostate
NASA Astrophysics Data System (ADS)
Pokharel, Ghana Shyam
Several studies carried out thus far favor dose escalation to the prostate gland for better local control of the disease, but the optimal way to deliver higher doses of radiation therapy to the prostate without harming neighboring critical structures is still debated. In this study, we proposed that real-time high dose rate (HDR) brachytherapy with highly efficient and effective optimization could be an alternative means of precise delivery of such higher doses. This delivery approach eliminates critical issues such as treatment setup uncertainties and target localization that arise in external beam radiation therapy. Likewise, dosimetry in HDR brachytherapy is not influenced by organ edema and potential source migration, as in permanent interstitial implants. Moreover, recent reports of radiobiological parameters further strengthen the argument for using hypofractionated HDR brachytherapy in the management of prostate cancer. Firstly, we studied the essential features and requirements of a real-time HDR brachytherapy treatment planning system. Automated catheter reconstruction with fast editing tools, a fast yet accurate dose engine, and robust, fast optimization and evaluation engines are some of the essential requirements for such procedures. Moreover, in most of the cases we performed, treatment plan optimization took a significant share of the overall procedure time, so making it automatic or semi-automatic with sufficient speed and accuracy was the goal of the remaining part of the project. Secondly, we studied the role of the optimization function and constraints in the overall quality of the optimized plan. We studied a gradient-based deterministic algorithm with dose-volume histogram (DVH)-based and more conventional variance-based objective functions. In this optimization strategy, the relative weight of a particular objective in the aggregate objective function signifies its importance with respect to the other objectives.
Based on our study, the DVH-based objective function performed better than the traditional variance-based objective function in creating a clinically acceptable plan when executed under identical conditions. Thirdly, we studied a multiobjective optimization strategy using both DVH- and variance-based objective functions. The strategy was to create several Pareto-optimal solutions by scanning the clinically relevant part of the Pareto front, decoupling optimization from decision-making so that the user could select the final solution from a pool of alternatives based on his or her clinical goals. The overall quality of the treatment plan improved with this approach compared to the traditional class-solution approach; in fact, the final plan selected using the decision engine with the DVH-based objective was comparable to a typical clinical plan created by an experienced physicist. Next, we studied a hybrid technique comprising both stochastic and deterministic algorithms to optimize both dwell positions and dwell times: a simulated annealing algorithm found the optimal catheter distribution, and the DVH-based algorithm optimized the 3D dose distribution for a given catheter distribution. This unique treatment planning and optimization tool was capable of producing clinically acceptable, highly reproducible treatment plans within a clinically reasonable time, which makes it appealing for real-time procedures. Finally, we studied the feasibility of multiobjective optimization using an evolutionary algorithm for real-time HDR brachytherapy of the prostate. With properly tuned algorithm-specific parameters, it created clinically acceptable plans within a clinically reasonable time. However, the algorithm was allowed to run for only a limited number of generations, fewer than is generally considered optimal for such algorithms.
This was done to keep the time window suitable for real-time procedures; further study under improved conditions is therefore required to realize the full potential of the algorithm.
MR and CT image fusion for postimplant analysis in permanent prostate seed implants.
Polo, Alfredo; Cattani, Federica; Vavassori, Andrea; Origgi, Daniela; Villa, Gaetano; Marsiglia, Hugo; Bellomi, Massimo; Tosi, Giampiero; De Cobelli, Ottavio; Orecchia, Roberto
2004-12-01
To compare the outcome of two different image-based postimplant dosimetry methods in permanent seed implantation. Between October 1999 and October 2002, 150 patients with low-risk prostate carcinoma were treated with (125)I and (103)Pd at our institution. A CT-MRI image fusion protocol was used in 21 consecutive patients treated with exclusive brachytherapy. The accuracy and reproducibility of the method were calculated, and the CT-based dosimetry was then compared with the CT-MRI-based dosimetry using the dose-volume histogram (DVH)-related parameters recommended by the American Brachytherapy Society and the American Association of Physicists in Medicine. Our method for CT-MRI image fusion was accurate and reproducible (median shift <1 mm). Differences in prostate volume were found depending on the image modality used. Quality-assurance DVH-related parameters depended strongly on the image modality (CT vs. CT-MRI): V(100) = 82% vs. 88%, p < 0.05; D(90) = 96% vs. 115%, p < 0.05. These results depend on the institutional implant technique and reflect the importance of lowering inter- and intraobserver discrepancies when outlining the prostate and organs at risk for postimplant dosimetry. CT-MRI fused images allow accurate determination of prostate size, significantly improving the dosimetric evaluation based on DVH analysis. This provides a consistent method to judge the quality of a prostate seed implant.
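The DVH-related parameters compared above, V(100) and D(90), can be computed from voxel dose samples of the delineated prostate. This is a minimal generic sketch with illustrative function names, not the institutional software.

```python
import numpy as np

def d90(voxel_doses):
    """D90: minimum dose received by the hottest 90% of the volume,
    i.e., the 10th percentile of the voxel dose distribution."""
    return float(np.percentile(voxel_doses, 10))

def v100(voxel_doses, prescription_dose):
    """V100: percentage of the volume receiving at least the prescription dose."""
    d = np.asarray(voxel_doses, dtype=float)
    return 100.0 * float((d >= prescription_dose).mean())
```

Because contouring on CT versus fused CT-MRI changes which voxels belong to the prostate, both metrics shift with the image modality even for the same dose grid, which is the effect the comparison above quantifies.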
Modelling duodenum radiotherapy toxicity using cohort dose-volume-histogram data.
Holyoake, Daniel L P; Aznar, Marianne; Mukherjee, Somnath; Partridge, Mike; Hawkins, Maria A
2017-06-01
Gastro-intestinal toxicity is dose-limiting in abdominal radiotherapy and is correlated with duodenum dose-volume parameters. We aimed to derive updated NTCP model parameters using published data and prospective, radiotherapy quality-assured cohort data. A systematic search identified publications providing duodenum dose-volume histogram (DVH) statistics for clinical studies of conventionally fractionated radiotherapy. Values for the Lyman-Kutcher-Burman (LKB) NTCP model were derived through sum-squared-error minimisation with leave-one-out cross-validation. Data were corrected for fraction size and weighted according to patient numbers, and the model was refined using individual patient DVH data for two further cohorts from prospective clinical trials. Six studies with published DVH data were utilised; with the individual patient data included, outcomes covered 531 patients in total (median follow-up 16 months). Observed gastro-intestinal toxicity rates ranged from 0% to 14% (median 8%). LKB parameter values for the unconstrained fit to published data were n=0.070, m=0.46, TD50(1) [Gy]=183.8, while the values for the model incorporating the individual patient data were n=0.193, m=0.51, TD50(1) [Gy]=299.1. LKB parameters derived using published data are shown to be consistent with those previously obtained using individual patient data, supporting a small volume effect and dependence on exposure to a high threshold dose. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
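The fraction-size correction mentioned above is conventionally performed with the linear-quadratic EQD2 conversion. The sketch below is that generic formula, not the authors' exact procedure, and the default α/β of 3 Gy is an assumed late-toxicity value rather than a parameter from the study.

```python
def eqd2(total_dose_gy, dose_per_fraction_gy, alpha_beta=3.0):
    """Equieffective dose in 2-Gy fractions (linear-quadratic model).

    EQD2 = D * (d + alpha/beta) / (2 + alpha/beta), where D is the total
    dose and d the dose per fraction; alpha_beta = 3 Gy is an assumed
    late-toxicity value, not a value fitted in the study.
    """
    return total_dose_gy * (dose_per_fraction_gy + alpha_beta) / (2.0 + alpha_beta)
```

At 2 Gy per fraction the conversion is the identity (50 Gy stays 50 Gy), while hypofractionated schedules map to higher equieffective doses, which is why pooling DVH data across fractionation schemes requires this correction.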
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Nan; Carmona, Ruben; Sirak, Igor
Purpose: To demonstrate an efficient method for training and validation of a knowledge-based planning (KBP) system as a radiation therapy clinical trial plan quality-control system. Methods and Materials: We analyzed 86 patients with stage IB through IVA cervical cancer treated with intensity modulated radiation therapy at 2 institutions according to the standards of the INTERTECC (International Evaluation of Radiotherapy Technology Effectiveness in Cervical Cancer, National Clinical Trials Network identifier: 01554397) protocol. The protocol used a planning target volume and 2 primary organs at risk: pelvic bone marrow (PBM) and bowel. Secondary organs at risk were rectum and bladder. Initial unfiltered dose-volume histogram (DVH) estimation models were trained using all 86 plans. Refined training sets were created by removing suboptimal plans from the unfiltered sample, and DVH estimation models were constructed by identifying 30 of 86 plans emphasizing PBM sparing (comparing the protocol-specified dosimetric cutpoints V10 and V20, the percentage volumes of PBM receiving at least 10 and 20 Gy, with unfiltered predictions) and another 30 of 86 plans emphasizing bowel sparing (comparing V40 and V45, the absolute volumes of bowel receiving at least 40 and 45 Gy; 9 plans in common with the PBM set). To obtain deliverable KBP plans, refined models must inform patient-specific optimization objectives and/or priorities (an auto-planning "routine"). Four candidate routines emphasizing different tradeoffs were composed, and a script was developed to automatically re-plan multiple patients with each routine.
After selection of the routine that best met protocol objectives in the 51-patient training sample (KBP_FINAL), protocol-specific DVH metrics and normal tissue complication probability were compared for original versus KBP_FINAL plans across the 35-patient validation set. Paired t tests were used to test differences between planning sets. Results: KBP_FINAL plans outperformed manual planning across the validation set in all protocol-specific DVH cutpoints. The mean normal tissue complication probability for gastrointestinal toxicity was lower for KBP_FINAL versus validation-set plans (48.7% vs 53.8%, P<.001). Similarly, the estimated mean white blood cell count nadir was higher (2.77 vs 2.49 k/mL, P<.001) with KBP_FINAL plans, indicating a lowered probability of hematologic toxicity. Conclusions: This work demonstrates that a KBP system can be efficiently trained and refined for use in radiation therapy clinical trials with minimal effort. This patient-specific plan quality control resulted in improvements on protocol-specific dosimetric endpoints.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Isohashi, Fumiaki, E-mail: isohashi@radonc.med.osaka-u.ac.jp; Yoshioka, Yasuo; Mabuchi, Seiji
2013-03-01
Purpose: The purpose of this study was to evaluate dose-volume histogram (DVH) predictors for the development of chronic gastrointestinal (GI) complications in cervical cancer patients who underwent radical hysterectomy and postoperative concurrent nedaplatin-based chemoradiation therapy. Methods and Materials: This study analyzed 97 patients who underwent postoperative concurrent chemoradiation therapy. The organs at risk contoured were the small bowel loops, large bowel loop, and peritoneal cavity. DVH parameters subjected to analysis included the volumes of these organs receiving more than 15, 30, 40, and 45 Gy (V15-V45) and their mean doses. Associations between DVH parameters or clinical factors and the incidence of grade 2 or higher chronic GI complications were evaluated. Results: Of the clinical factors, smoking and low body mass index (BMI) (<22) were significantly associated with grade 2 or higher chronic GI complications. Also, patients with chronic GI complications had significantly greater V15-V45 volumes and a higher mean dose to the small bowel loops than those without GI complications. In contrast, no parameters for the large bowel loop or peritoneal cavity were significantly associated with GI complications. Receiver operating characteristic (ROC) curve analysis led to the conclusion that V15-V45 of the small bowel loops has high accuracy for predicting GI complications; among these parameters, V40 gave the highest area under the ROC curve. Finally, multivariate analysis was performed with V40 of the small bowel loops and 2 other clinical parameters judged to be potential risk factors for chronic GI complications: BMI and smoking. Of these 3 parameters, V40 of the small bowel loops and smoking emerged as independent predictors of chronic GI complications.
Conclusions: DVH parameters of the small bowel loops may serve as predictors of grade 2 or higher chronic GI complications after postoperative concurrent nedaplatin-based chemoradiation therapy for early-stage cervical cancer.
Zhang, Hualin; Gopalakrishnan, Mahesh; Lee, Plato; Kang, Zhuang; Sathiaseelan, Vythialingam
2016-09-08
The purpose of this study was to evaluate the dosimetric impact of cylinder size in high-dose-rate (HDR) vaginal cuff brachytherapy (VCBT). Sample HDR VCBT plans for cylinders ranging from 2.5 to 4 cm in diameter at 0.5 cm increments were created and analyzed. The doses were prescribed either at the 0.5 cm depth with 5.5 Gy for 4 fractions or at the cylinder surface with 8.8 Gy for 4 fractions, for various treatment lengths. A 0.5 cm shell volume called PTV_Eval was contoured for each plan and served as the target volume for dosimetric evaluation. The cumulative and differential dose-volume histograms (c-DVH and d-DVH), mean doses (D-mean), and the doses covering 90% (D90), 10% (D10), and 5% (D5) of PTV_Eval were calculated. In the 0.5 cm depth regimen, the DVH curves shifted toward the lower dose zone when a larger cylinder was used, whereas in the surface regimen the DVH curves shifted toward the higher dose zone as the cylinder size increased. The D-means of both regimens were between 6.9 and 7.8 Gy and were dependent on the cylinder size but independent of the treatment length. A 0.5 cm variation in diameter could result in a 4% change in D-mean. Average D90s were 5.7 Gy (ranging from 5.6 to 5.8 Gy) and 6.1 Gy (from 5.7 to 6.4 Gy) for the 0.5 cm depth and surface regimens, respectively. Average D10 and D5 were 9.2 and 11 Gy for the 0.5 cm depth regimen, and 8.9 and 9.7 Gy for the surface regimen. D-mean, D90, D10, and D5 for other prescription doses can be calculated from the lookup tables of this study. The results indicated that cylinder size has a moderate dosimetric impact and that both regimens are comparable in dosimetric quality. © 2016 The Authors.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Z; Kennedy, A; Larsen, E
2015-06-15
Purpose: The study aims to develop and validate a knowledge based planning (KBP) model for external beam radiation therapy of locally advanced non-small cell lung cancer (LA-NSCLC). Methods: RapidPlan™ technology was used to develop a lung KBP model. Plans from 65 patients with LA-NSCLC were used to train the model. 25 patients were treated with VMAT, and the other patients were treated with IMRT. Organs-at-risk (OARs) included right lung, left lung, heart, esophagus, and spinal cord. DVH and geometric distribution DVH were extracted from the treated plans. The model was trained using principal component analysis and step-wise multiple regression. Box plot and regression plot tools were used to identify geometric outliers and dosimetric outliers and help fine-tune the model. The validation was performed by (a) comparing predicted DVH boundaries to actual DVHs of 63 patients and (b) using an independent set of treatment planning data. Results: 63 out of 65 plans were included in the final KBP model, with PTV volume ranging from 102.5 cc to 1450.2 cc. Total treatment dose prescription varied from 50 Gy to 70 Gy based on institutional guidelines. One patient was excluded due to a geometric outlier, where 2.18 cc of spinal cord was included in the PTV. The other patient was excluded due to a dosimetric outlier, where dose sparing to the spinal cord was heavily enforced in the clinical plan. Target volume, OAR volume, OAR overlap volume percentage to target, and OAR out-of-field volume were included in the trained model. Lungs and heart had two principal component scores of GEDVH, whereas spinal cord and esophagus had three in the final model. The predicted DVH band (mean ±1 standard deviation) represented 66.2±3.6% of all DVHs. Conclusion: A KBP model was developed and validated for radiotherapy of LA-NSCLC in a commercial treatment planning system. The clinical implementation may improve the consistency of IMRT/VMAT planning.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tran, A; Ruan, D; Woods, K
Purpose: The predictive power of knowledge based planning (KBP) has considerable potential in the development of automated treatment planning. Here, we examine the predictive capabilities and accuracy of previously reported KBP methods, as well as an artificial neural network (ANN) method. Furthermore, we compare the predictive accuracy of these methods on coplanar volumetric-modulated arc therapy (VMAT) and non-coplanar 4π radiotherapy. Methods: 30 liver SBRT patients previously treated using coplanar VMAT were selected for this study. The patients were re-planned using 4π radiotherapy, which involves 20 optimally selected non-coplanar IMRT fields. ANNs were used to incorporate enhanced geometric information including liver and PTV size, prescription dose, patient girth, and proximity to beams. The performance of the ANN was compared to three methods from statistical voxel dose learning (SVDL), wherein the doses of voxels sharing the same distance to the PTV are approximated by either taking the median of the distribution, non-parametric fitting, or skew-normal fitting. These three methods were shown to be capable of predicting the DVH, but only median approximation can predict 3D dose. Prediction methods were tested using leave-one-out cross-validation and evaluated using the residual sum of squares (RSS) for DVH and 3D dose predictions. Results: DVH prediction using non-parametric fitting had the lowest average RSS with 0.1176 (4π) and 0.1633 (VMAT), compared to 0.4879 (4π) and 1.8744 (VMAT) for the ANN. 3D dose prediction with median approximation had lower RSS with 12.02 (4π) and 29.22 (VMAT), compared to 27.95 (4π) and 130.9 (VMAT) for the ANN. Conclusion: Paradoxically, although the ANNs included geometric features in addition to the distances to the PTV, they did not perform better in predicting DVH or 3D dose compared to simpler, faster methods based on the distances alone.
The study further confirms that predictions for 4π non-coplanar plans were more accurate than for VMAT plans. NIH R43CA183390 and R01CA188300.
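The median-approximation flavor of SVDL and the RSS comparison can be sketched as follows; the distance binning, dose fall-off model, and noise level are illustrative assumptions, not the study's implementation:

```python
import numpy as np

def median_dose_by_distance(dist_to_ptv, voxel_doses, bin_width=2.0):
    """Approximate each voxel's dose by the median dose of training voxels
    sharing the same (binned) distance to the PTV."""
    dist = np.asarray(dist_to_ptv, float)
    dose = np.asarray(voxel_doses, float)
    bins = np.floor(dist / bin_width).astype(int)
    return {int(b): float(np.median(dose[bins == b])) for b in np.unique(bins)}

def rss(predicted, achieved):
    """Residual sum of squares between predicted and achieved doses."""
    p, a = np.asarray(predicted, float), np.asarray(achieved, float)
    return float(np.sum((p - a) ** 2))

# Hypothetical geometry: dose falls off with distance from the PTV, plus noise
rng = np.random.default_rng(0)
dist = rng.uniform(0.0, 10.0, 500)
dose = 60.0 * np.exp(-0.2 * dist) + rng.normal(0.0, 1.0, 500)
model = median_dose_by_distance(dist, dose)
pred = np.array([model[int(d // 2.0)] for d in dist])
```

A distance-binned median should beat a constant (global-mean) predictor in RSS, which is the flavor of comparison the abstract reports.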
NASA Astrophysics Data System (ADS)
Wall, Phillip D. H.; Carver, Robert L.; Fontenot, Jonas D.
2018-01-01
The overlap volume histogram (OVH) is an anatomical metric commonly used to quantify the geometric relationship between an organ at risk (OAR) and target volume when predicting expected dose-volumes in knowledge-based planning (KBP). This work investigated the influence of additional variables contributing to variations in the assumed linear DVH-OVH correlation for the bladder and rectum in VMAT plans of prostate patients, with the goal of increasing prediction accuracy and achievability of knowledge-based planning methods. VMAT plans were retrospectively generated for 124 prostate patients using multi-criteria optimization. DVHs quantified patient dosimetric data while OVHs quantified patient anatomical information. The DVH-OVH correlations were calculated for fractional bladder and rectum volumes of 30, 50, 65, and 80%. Correlations between potential influencing factors and dose were quantified using the Pearson product-moment correlation coefficient (R). Factors analyzed included the derivative of the OVH, prescribed dose, PTV volume, bladder volume, rectum volume, and in-field OAR volume. Out of the selected factors, only the in-field bladder volume (mean R = 0.86) showed a strong correlation with bladder doses. Similarly, only the in-field rectal volume (mean R = 0.76) showed a strong correlation with rectal doses. Therefore, an OVH formalism accounting for in-field OAR volumes was developed to determine the extent to which it improved the DVH-OVH correlation. Including the in-field factor improved the DVH-OVH correlation, with the mean R values over the fractional volumes studied improving from -0.79 to -0.85 and -0.82 to -0.86 for the bladder and rectum, respectively. A re-planning study was performed on 31 randomly selected database patients to verify the increased accuracy of KBP dose predictions by accounting for bladder and rectum volume within treatment fields. 
The in-field OVH led to significantly more precise and fewer unachievable KBP predictions, especially for lower bladder and rectum dose-volumes.
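The Pearson product-moment coefficient used above to screen candidate factors is straightforward to compute; the in-field-volume/dose pairs below are hypothetical illustration, not the study's data:

```python
def pearson_r(x, y):
    """Pearson product-moment correlation coefficient between two sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

# Hypothetical pairs: in-field bladder volume (cc) vs. bladder mean dose (Gy)
in_field_cc = [55, 80, 120, 150, 210, 260]
mean_dose_gy = [18.0, 22.5, 28.0, 31.0, 39.5, 44.0]
r = pearson_r(in_field_cc, mean_dose_gy)
```

A strongly linear relationship like this toy set yields r near 1, analogous to the mean R = 0.86 reported for in-field bladder volume.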
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deek, Matthew P.; Benenati, Brian; Kim, Sinae
Purpose: To determine the relationships between radiation doses to the thoracic bone marrow and declines in blood cell counts in non-small cell lung cancer (NSCLC) patients treated with chemoradiation therapy (CRT). Methods and Materials: We included 52 patients with NSCLC treated with definitive concurrent carboplatin–paclitaxel and RT. Dose-volume histogram (DVH) parameters for the thoracic vertebrae (TV), sternum, scapulae, clavicles, and ribs were assessed for associations with changes in blood counts during the course of CRT. Linear and logistic regression analyses were performed to identify associations between hematologic nadirs and DVH parameters. A DVH parameter Vx is the percentage of the total organ volume exceeding x radiation dose. Results: Grade ≥3 hematologic toxicity included neutropenia in 21% (n=11), leukopenia in 42% (n=22), anemia in 6% (n=3), and thrombocytopenia in 2% (n=1) of patients. Greater RT dose to the TV was associated with a higher risk of grade ≥3 leukopenia across multiple DVH parameters, including TV V20 (TVV) (odds ratio [OR] 1.06; P=.025), TVV30 (OR 1.07; P=.013), and mean vertebral dose (MVD) (OR 1.13; P=.026). On multiple regression analysis, TVV30 (β = −0.004; P=.018) and TVV20 (β = −0.003; P=.048) were associated with white blood cell nadir. Additional bone marrow sites (scapulae, clavicles, and ribs) did not affect hematologic toxicity. A 20% chance of grade ≥3 leukopenia was associated with an MVD of 13.5 Gy and a TVV30 of 28%. Cutoff values to avoid grade ≥3 leukopenia were MVD ≤23.9 Gy, TVV20 ≤56.0%, and TVV30 ≤52.1%. Conclusions: Hematologic toxicity is associated with greater RT doses to the TV during CRT for NSCLC. Sparing of the TV using advanced radiation techniques may improve tolerance of CRT and result in improved tolerance of concurrent chemotherapy.
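The Vx parameter defined above (percentage of the organ volume exceeding x Gy) reduces to a thresholded voxel count when voxels have equal volume. A minimal sketch with a hypothetical dose array:

```python
def v_x(voxel_doses_gy, threshold_gy):
    """Vx: percentage of the structure volume receiving >= x Gy,
    assuming equal-volume voxels."""
    d = list(voxel_doses_gy)
    return 100.0 * sum(1 for v in d if v >= threshold_gy) / len(d)

# Toy vertebral dose distribution: linear spread from 0 to 40 Gy
doses = [i * 40.0 / 999 for i in range(1000)]
v20 = v_x(doses, 20.0)  # analogous to the TVV20 screened in the study
v30 = v_x(doses, 30.0)
```

For this toy distribution V20 is 50% and V30 is 25%; a plan check against the reported cutoffs (e.g. TVV20 ≤ 56.0%) would compare such values to the thresholds.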
Fan, Jiawei; Wang, Jiazhou; Zhang, Zhen; Hu, Weigang
2017-06-01
To develop a new automated treatment planning solution for breast and rectal cancer radiotherapy. The automated treatment planning solution developed in this study includes selection of the iteratively optimized training dataset, dose volume histogram (DVH) prediction for the organs at risk (OARs), and automatic generation of clinically acceptable treatment plans. The iteratively optimized training dataset was selected by iterative optimization from 40 treatment plans for left-breast and rectal cancer patients who received radiation therapy. A two-dimensional kernel density estimation algorithm (denoted two-parameter KDE), which incorporated two predictive features, was implemented to produce the predicted DVHs. Finally, 10 additional new left-breast treatment plans were re-planned using the Pinnacle 3 Auto-Planning (AP) module (version 9.10, Philips Medical Systems) with the objective functions derived from the predicted DVH curves. The automatically generated re-optimized treatment plans were compared with the original manually optimized plans. By combining the iteratively optimized training dataset methodology and the two-parameter KDE prediction algorithm, our proposed automated planning strategy improves the accuracy of DVH prediction. The automatically generated treatment plans using objectives derived from the predicted DVHs can achieve better dose sparing for some OARs without compromising other metrics of plan quality. The proposed new automated treatment planning solution can be used to efficiently evaluate and improve the quality and consistency of treatment plans for intensity-modulated breast and rectal cancer radiation therapy. © 2017 American Association of Physicists in Medicine.
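As a rough stand-in for the two-parameter KDE predictor, a Nadaraya-Watson estimate with a 2-D Gaussian kernel illustrates predicting a DVH value from two geometric features; the feature names, data, and bandwidth here are assumptions, not the paper's algorithm:

```python
import numpy as np

def kde_predict(train_features, train_values, query, bandwidth=1.0):
    """Kernel-weighted regression: weight each training plan by the Gaussian
    kernel distance between its two features and the query plan's features."""
    X = np.asarray(train_features, float)   # shape (n_plans, 2)
    y = np.asarray(train_values, float)     # e.g. OAR dose at a fixed volume level
    d2 = np.sum((X - np.asarray(query, float)) ** 2, axis=1)
    w = np.exp(-0.5 * d2 / bandwidth ** 2)
    return float(np.sum(w * y) / np.sum(w))

# Hypothetical features: (overlap fraction with PTV, distance to PTV in cm)
X = [[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
y = [10.0, 12.0, 14.0, 16.0]
pred = kde_predict(X, y, [0.5, 0.5])
```

Predicted values interpolate the training doses, so an equidistant query returns their mean.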
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lancaster, Andrew; Menon, Angeli; Scott, Israel
2014-03-26
Although as many as half of all proteins are thought to require a metal cofactor, the metalloproteomes of microorganisms remain relatively unexplored. Microorganisms from different environments are likely to vary greatly in the metals that they assimilate, not just among the metals with well-characterized roles but also those lacking any known function. Herein we investigated the metal utilization of two microorganisms that were isolated from very similar environments and are of interest because of potential roles in the immobilization of heavy metals, such as uranium and chromium. The metals assimilated and their concentrations in the cytoplasm of Desulfovibrio vulgaris strain Hildenborough (DvH) and Enterobacter cloacae strain Hanford (EcH) varied dramatically, with a larger number of metals present in Enterobacter. For example, a total of 9 and 19 metals were assimilated into their cytoplasmic fractions, respectively, and DvH did not assimilate significant amounts of zinc or copper whereas EcH assimilated both. However, bioinformatic analysis of their genome sequences revealed a comparable number of predicted metalloproteins, 813 in DvH and 953 in EcH. These allowed some rationalization of the types of metal assimilated in some cases (Fe, Cu, Mo, W, V) but not in others (Zn, Nd, Ce, Pr, Dy, Hf and Th). It was also shown that U binds an unknown soluble protein in EcH but this incorporation was the result of extracellular U binding to cytoplasmic components after cell lysis.
2016-01-01
This study quantified the interplay and gradient effects on GTV dose coverage for 3D CRT, dMLC IMRT, and VMAT SABR treatments for target amplitudes of 5–30 mm using 3DVH v3.1 software incorporating 4D Respiratory MotionSim (4D RMS) module. For clinically relevant motion periods (5 s), the interplay effect was small, with deviations in the minimum dose covering the target volume (D99%) of less than ±2.5% for target amplitudes up to 30 mm. Increasing the period to 60 s resulted in interplay effects of up to ±15.0% on target D99% dose coverage. The gradient effect introduced by target motion resulted in deviations of up to ±3.5% in D99% target dose coverage. VMAT treatments showed the largest deviation in dose metrics, which was attributed to the long delivery times in comparison to dMLC IMRT. Retrospective patient analysis indicated minimal interplay and gradient effects for patients treated with dMLC IMRT at the NCCI. PACS numbers: 87.55.km, 87.56.Fc PMID:26894347
SU-G-TeP3-11: Radiobiological-Cum-Dosimetric Quality Assurance of Complex Radiotherapy Plans
DOE Office of Scientific and Technical Information (OSTI.GOV)
Paudel, N; Narayanasamy, G; Zhang, X
2016-06-15
Purpose: Dosimetric gamma-analysis used for QA of complex radiotherapy plans tests the dosimetric equivalence of a delivered plan with the treatment planning system (TPS) optimized plan. It does not examine whether a dosimetric difference results in any radiobiological difference. This study introduces a method to test the radiobiological and dosimetric equivalence between a delivered and the TPS optimized plan. Methods: Six head and neck and seven lung cancer VMAT or IMRT plans optimized for patient treatment were calculated and delivered to an ArcCheck phantom. ArcCheck measured dose distributions were compared with the TPS calculated dose distributions using a 2-D gamma-analysis. Dose volume histograms (DVHs) for various patient structures were obtained by using measured data in 3DVH software and compared against the TPS calculated DVHs using 3-D gamma analysis. DVH data were used in the Poisson model to calculate tumor control probability (TCP) for the treatment targets and in the sigmoid dose response model to calculate normal tissue complication probability (NTCP) for the normal structures. Results: Two-D and three-D gamma passing rates among six H&N patient plans differed by 0 to 2.7% and among seven lung plans by 0.1 to 4.5%. Average ± SD TCPs based on measurement and TPS were 0.665±0.018 and 0.674±0.044 for H&N, and 0.791±0.027 and 0.733±0.031 for lung plans, respectively. Differences in NTCPs were usually negligible. The differences in dosimetric results, TCPs and NTCPs were insignificant. Conclusion: The 2-D and 3-D gamma-analysis based agreement between measured and planned dose distributions may indicate their dosimetric equivalence. Small and insignificant differences in TCPs and NTCPs based on measured and planned dose distributions indicate the radiobiological equivalence between the measured and optimized plans.
However, patient plans showing larger differences between 2-D and 3-D gamma-analysis can help us make a more definite conclusion through our ongoing research with a larger number of patients.
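The Poisson TCP model referenced above can be sketched voxel-wise from differential DVH data; the radiosensitivity and clonogen density below are illustrative assumptions, not the study's fitted values:

```python
import numpy as np

def poisson_tcp(voxel_doses_gy, voxel_volumes_cc, alpha=0.3, clonogens_per_cc=1e7):
    """LQ-Poisson tumor control probability:
    TCP = exp(-sum_i rho * v_i * exp(-alpha * D_i)),
    i.e. the probability that no clonogenic cell survives."""
    D = np.asarray(voxel_doses_gy, float)
    v = np.asarray(voxel_volumes_cc, float)
    surviving = clonogens_per_cc * v * np.exp(-alpha * D)
    return float(np.exp(-np.sum(surviving)))

# A uniform 70 Gy vs. 80 Gy to a 100 cc target: TCP rises with dose
tcp70 = poisson_tcp(np.full(10, 70.0), np.full(10, 10.0))
tcp80 = poisson_tcp(np.full(10, 80.0), np.full(10, 10.0))
```

Running the same function on measured versus TPS-calculated DVHs and differencing the results mirrors the comparison described in the abstract.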
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pasciak, A; Kao, J
2014-06-15
Purpose: The process of converting Yttrium-90 (Y90) PET/CT images into 3D absorbed dose maps will be explained. The simple methods presented allow the medical physicist to analyze Y90 PET images following radioembolization and determine the absorbed dose to tumor, normal liver parenchyma, and other areas of interest, without application of Monte-Carlo radiation transport or dose-point-kernel (DPK) convolution. Methods: Absorbed dose can be computed from Y90 PET/CT images based on the premise that radioembolization is a permanent implant with a constant relative activity distribution after infusion. Many Y90 PET/CT publications have used DPK convolution to obtain 3D absorbed dose maps. However, this method requires specialized software, limiting clinical utility. The Local Deposition method, an alternative to DPK convolution, can be used to obtain absorbed dose and requires no additional computer processing. Pixel values from regions of interest drawn on Y90 PET/CT images can be converted to absorbed dose (Gy) by multiplication with a scalar constant. Results: There is evidence suggesting that the Local Deposition method may actually be more accurate than DPK convolution, and it has been successfully used in a recent Y90 PET/CT publication. We analytically compared dose-volume histograms (DVH) for phantom hot-spheres to determine the difference between the DPK and Local Deposition methods as a function of the PET scanner point-spread function for Y90. We found that for PET/CT systems with a FWHM greater than 3.0 mm when imaging Y90, the Local Deposition method provides a more accurate representation of the DVH than DPK convolution, regardless of target size. Conclusion: Using the Local Deposition method, post-radioembolization Y90 PET/CT images can be transformed into 3D absorbed dose maps of the liver.
An interventional radiologist or a medical physicist can perform this transformation in a clinical setting, allowing for rapid prediction of treatment efficacy by comparison to published tumoricidal thresholds.
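The scalar constant of the Local Deposition method follows from treating Y-90 radioembolization as a permanent implant whose beta energy is absorbed in the voxel of origin. A sketch of the conversion, with rounded physical constants and a soft-tissue density assumption (not the abstract's stated values):

```python
import numpy as np

Y90_MEAN_BETA_MEV = 0.9267        # mean beta energy per decay
Y90_HALF_LIFE_H = 64.05
MEV_TO_J = 1.602e-13
TISSUE_DENSITY_G_PER_ML = 1.03    # soft-tissue assumption

def local_deposition_dose(activity_conc_bq_per_ml):
    """Convert a Y-90 PET voxel value (Bq/mL at scan time) to absorbed dose (Gy):
    total decays (activity x mean life) x energy per decay / voxel mass."""
    mean_life_s = Y90_HALF_LIFE_H * 3600.0 / np.log(2.0)
    j_per_decay = Y90_MEAN_BETA_MEV * MEV_TO_J
    gy_per_bq_per_ml = mean_life_s * j_per_decay / (TISSUE_DENSITY_G_PER_ML * 1e-3)
    return np.asarray(activity_conc_bq_per_ml, float) * gy_per_bq_per_ml

dose_gy = float(local_deposition_dose(1.0e6))  # 1 MBq/mL
```

For 1 MBq/mL this yields roughly 48 Gy, consistent with the commonly quoted value of about 50 Gy per MBq/g for unit-density tissue; the dose map is simply the PET image times this constant.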
NASA Astrophysics Data System (ADS)
Jiang, Runqing
Intensity-modulated radiation therapy (IMRT) uses non-uniform beam intensities within a radiation field to provide patient-specific dose shaping, resulting in a dose distribution that conforms tightly to the planning target volume (PTV). Unavoidable geometric uncertainty arising from patient repositioning and internal organ motion can lead to a lower conformality index (CI) during treatment delivery, a decrease in tumor control probability (TCP) and an increase in normal tissue complication probability (NTCP). The CI of the IMRT plan depends heavily on steep dose gradients between the PTV and organ at risk (OAR). Geometric uncertainties reduce the planned dose gradients and result in a less steep or "blurred" dose gradient. The blurred dose gradients can be maximized by constraining the dose objective function in the static IMRT plan or by reducing geometric uncertainty during treatment with corrective verification imaging. Internal organ motion and setup error were evaluated simultaneously for 118 individual patients with implanted fiducials and MV electronic portal imaging (EPI). A Gaussian probability density function (PDF) is reasonable for modeling geometric uncertainties, as indicated by the 118-patient group. The Gaussian PDF is patient specific, and the group standard deviation (SD) should not be used for accurate treatment planning for individual patients. In addition, individual SD should not be determined or predicted from small imaging samples because of the random nature of the fluctuations. Frequent verification imaging should be employed in situations where geometric uncertainties are expected. Cumulative PDF data can be used for re-planning to assess accuracy of delivered dose. Group data is useful for determining the worst-case discrepancy between planned and delivered dose. The margins for the PTV should ideally represent true geometric uncertainties.
The measured geometric uncertainties were used in this thesis to assess PTV coverage, dose to OAR, equivalent uniform dose per fraction (EUDf) and NTCP. The dose distribution including geometric uncertainties was determined from integration of the convolution of the static dose gradient with the PDF. Integration of the convolution of the static dose and the derivative of the PDF can also be used to determine the dose including geometric uncertainties, although this method was not investigated in detail. The local maximum dose gradient (LMDG) was determined via optimization of the dose objective function by manually adjusting DVH control points or selecting beam numbers and directions during IMRT treatment planning. Minimum SD (SDmin) is used when geometric uncertainty is corrected with verification imaging. Maximum SD (SDmax) is used when the geometric uncertainty is known to be large and difficult to manage. SDmax was 4.38 mm in the anterior-posterior (AP) direction, 2.70 mm in the left-right (LR) direction and 4.35 mm in the superior-inferior (SI) direction; SDmin was 1.1 mm in all three directions if a 2 mm correction threshold was applied to uncorrected fractions in every direction. EUDf is a useful QA parameter for interpreting the biological impact of geometric uncertainties on the static dose distribution. The EUDf has been used as the basis for the time-course NTCP evaluation in the thesis. Relative NTCP values are useful for comparative QA checking by normalizing known complications (e.g. reported in the RTOG studies) to specific DVH control points. For prostate cancer patients, rectal complications were evaluated from specific RTOG clinical trials and detailed evaluation of the treatment techniques (e.g. dose prescription, DVH, number of beams, beam angles). Treatment plans that did not meet DVH constraints represented additional complication risk. Geometric uncertainties improved or worsened rectal NTCP depending on individual internal organ motion within the patient.
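The convolution of the static dose with a Gaussian geometric-uncertainty PDF, described above, can be sketched in one dimension; the grid spacing and 70 Gy field edge below are hypothetical, while the 4.38 mm SD matches the reported AP SDmax:

```python
import numpy as np

def blur_dose_profile(dose_1d, voxel_mm, sd_mm):
    """Expected delivered dose along one axis: convolution of the static dose
    with a Gaussian PDF of geometric uncertainty (standard deviation sd_mm)."""
    half = int(np.ceil(4 * sd_mm / voxel_mm))          # +/- 4 SD support
    x = np.arange(-half, half + 1) * voxel_mm
    pdf = np.exp(-0.5 * (x / sd_mm) ** 2)
    pdf /= pdf.sum()                                   # discrete normalization
    return np.convolve(dose_1d, pdf, mode="same")

# A sharp 70 Gy field edge on a 1 mm grid, blurred with SD = 4.38 mm
static = np.where(np.arange(200) < 100, 70.0, 0.0)
blurred = blur_dose_profile(static, voxel_mm=1.0, sd_mm=4.38)
```

The maximum gradient of the blurred profile is far smaller than the static step, which is exactly the "blurred dose gradient" effect the thesis quantifies.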
Automated support for experience-based software management
NASA Technical Reports Server (NTRS)
Valett, Jon D.
1992-01-01
To effectively manage a software development project, the software manager must have access to key information concerning a project's status. This information includes not only data relating to the project of interest, but also, the experience of past development efforts within the environment. This paper describes the concepts and functionality of a software management tool designed to provide this information. This tool, called the Software Management Environment (SME), enables the software manager to compare an ongoing development effort with previous efforts and with models of the 'typical' project within the environment, to predict future project status, to analyze a project's strengths and weaknesses, and to assess the project's quality. In order to provide these functions the tool utilizes a vast corporate memory that includes a data base of software metrics, a set of models and relationships that describe the software development environment, and a set of rules that capture other knowledge and experience of software managers within the environment. Integrating these major concepts into one software management tool, the SME is a model of the type of management tool needed for all software development organizations.
Using a knowledge-based planning solution to select patients for proton therapy.
Delaney, Alexander R; Dahele, Max; Tol, Jim P; Kuijper, Ingrid T; Slotman, Ben J; Verbakel, Wilko F A R
2017-08-01
Patient selection for proton therapy by comparing proton/photon treatment plans is time-consuming and prone to bias. RapidPlan™, a knowledge-based-planning solution, uses plan-libraries to model and predict organ-at-risk (OAR) dose-volume-histograms (DVHs). We investigated whether RapidPlan, utilizing an algorithm based only on photon beam characteristics, could generate proton DVH-predictions and whether these could correctly identify patients for proton therapy. ModelPROT and ModelPHOT comprised 30 head-and-neck cancer proton and photon plans, respectively. Proton and photon knowledge-based-plans (KBPs) were made for ten evaluation-patients. DVH-prediction accuracy was analyzed by comparing predicted-vs-achieved mean OAR doses. KBPs and manual plans were compared using salivary gland and swallowing muscle mean doses. For illustration, patients were selected for protons if predicted ModelPHOT mean dose minus predicted ModelPROT mean dose (ΔPrediction) for combined OARs was ≥ 6 Gy, and benchmarked using achieved KBP doses. Achieved and predicted ModelPROT/ModelPHOT mean dose R² was 0.95/0.98. Generally, achieved mean dose for ModelPHOT/ModelPROT KBPs was respectively lower/higher than predicted. Comparing ModelPROT/ModelPHOT KBPs with manual plans, salivary and swallowing mean doses increased/decreased by < 2 Gy, on average. ΔPrediction ≥ 6 Gy correctly selected 4 of 5 patients for protons. Knowledge-based DVH-predictions can provide efficient, patient-specific selection for protons. A proton-specific RapidPlan-solution could improve results. Copyright © 2017 Elsevier B.V. All rights reserved.
Wang, Yibing; Heijmen, Ben J M; Petit, Steven F
2017-12-01
To prospectively investigate the use of an independent DVH prediction tool to detect outliers in the quality of fully automatically generated treatment plans for prostate cancer patients. A plan QA tool was developed to predict rectum, anus and bladder DVHs, based on overlap volume histograms and principal component analysis (PCA). The tool was trained with 22 automatically generated, clinical plans, and independently validated with 21 plans. Its use was prospectively investigated for 50 new plans by replanning in case of detected outliers. For rectum Dmean, V65Gy, V75Gy, anus Dmean, and bladder Dmean, the difference between predicted and achieved was within 0.4 Gy or 0.3% (SD within 1.8 Gy or 1.3%). Thirteen detected outliers were re-planned, leading to moderate but statistically significant improvements (mean, max): rectum Dmean (1.3 Gy, 3.4 Gy), V65Gy (2.7%, 4.2%), anus Dmean (1.6 Gy, 6.9 Gy), and bladder Dmean (1.5 Gy, 5.1 Gy). The rectum V75Gy of the new plans slightly increased (0.2%, p = 0.087). A high accuracy DVH prediction tool was developed and used for independent QA of automatically generated plans. In 28% of plans, minor dosimetric deviations were observed that could be improved by plan adjustments. Larger gains are expected for manually generated plans. Copyright © 2017 Elsevier B.V. All rights reserved.
1992-06-01
presents the concept of software Total Quality Management (TQM), which focuses on the entire process of software acquisition, as a partial solution to...software TQM can be applied to software acquisition. Software Development, Software Acquisition, Total Quality Management (TQM), Army Tactical Missile
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robertson, SP; Quon, H; Kiess, AP
Purpose: To develop a framework for automatic extraction of clinically meaningful dosimetric-outcome relationships from an in-house, analytic oncology database. Methods: Dose-volume histograms (DVH) and clinical outcome-related structured data elements have been routinely stored to our database for 513 HN cancer patients treated from 2007 to 2014. SQL queries were developed to extract outcomes that had been assessed for at least 100 patients, as well as DVH curves for organs-at-risk (OAR) that were contoured for at least 100 patients. DVH curves for paired OAR (e.g., left and right parotids) were automatically combined and included as additional structures for analysis. For each OAR-outcome combination, DVH dose points, D(Vt), at a series of normalized volume thresholds, Vt = [0.01, 0.99], were stratified into two groups based on outcomes after treatment completion. The probability, P[D(Vt)], of an outcome was modeled at each Vt by logistic regression. Notable combinations, defined as having P[D(Vt)] increase by at least 5% per Gy (p<0.05), were further evaluated for clinical relevance using a custom graphical interface. Results: A total of 57 individual and combined structures and 115 outcomes were queried, resulting in over 6,500 combinations for analysis. Of these, 528 combinations met the 5%/Gy requirement, with further manual inspection revealing a number of reasonable models based on either reported literature or proximity between neighboring OAR. The data mining algorithm confirmed the following well-known toxicity/outcome relationships: dysphagia/larynx, voice changes/larynx, esophagitis/esophagus, xerostomia/combined parotids, and mucositis/oral mucosa. Other notable relationships included dysphagia/pharyngeal constrictors, nausea/brainstem, nausea/spinal cord, weight-loss/mandible, and weight-loss/combined parotids. Conclusion: Our database platform has enabled large-scale analysis of dose-outcome relationships.
The current data-mining framework revealed both known and novel dosimetric and clinical relationships, underscoring the potential utility of this analytic approach. Multivariate models may be necessary to further evaluate the complex relationship between neighboring OARs and observed outcomes. This research was supported through collaborations with Elekta, Philips, and Toshiba.
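The per-threshold logistic screening described above fits P[D(Vt)] at each volume threshold and flags combinations whose risk rises by at least 5% per Gy, here interpreted as an odds-ratio criterion. A single-feature sketch with synthetic data (the coefficients, sample size, and optimizer settings are assumptions, not the study's):

```python
import numpy as np

def fit_logistic_1d(dose_gy, outcome, lr=0.01, iters=20000):
    """Fit P(outcome | D) = 1 / (1 + exp(-(b0 + b1 * D))) by gradient ascent
    on the mean log-likelihood; returns (b0, b1)."""
    D = np.asarray(dose_gy, float)
    y = np.asarray(outcome, float)
    Dc = D - D.mean()                          # center dose for stable steps
    b0 = b1 = 0.0
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-(b0 + b1 * Dc)))
        b0 += lr * np.mean(y - p)
        b1 += lr * np.mean((y - p) * Dc)
    return b0 - b1 * D.mean(), b1              # un-center the intercept

# Synthetic screen: true slope 0.1/Gy, i.e. about 10% higher odds per Gy
rng = np.random.default_rng(1)
D = rng.uniform(0.0, 60.0, 400)
p_true = 1.0 / (1.0 + np.exp(-(-3.0 + 0.1 * D)))
y = (rng.random(400) < p_true).astype(float)
b0, b1 = fit_logistic_1d(D, y)
notable = bool(np.exp(b1) >= 1.05)             # >= 5% increase in odds per Gy
```

Repeating this fit at each Vt for every OAR-outcome pair, and keeping the pairs that pass the slope and p-value screens, reproduces the shape of the pipeline in the abstract.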
Software Configuration Management Guidebook
NASA Technical Reports Server (NTRS)
1995-01-01
The growth in cost and importance of software to NASA has caused NASA to address the improvement of software development across the agency. One of the products of this program is a series of guidebooks that define a NASA concept of the assurance processes which are used in software development. The Software Assurance Guidebook, SMAP-GB-A201, issued in September, 1989, provides an overall picture of the concepts and practices of NASA in software assurance. Lower level guidebooks focus on specific activities that fall within the software assurance discipline, and provide more detailed information for the manager and/or practitioner. This is the Software Configuration Management Guidebook which describes software configuration management in a way that is compatible with practices in industry and at NASA Centers. Software configuration management is a key software development process, and is essential for doing software assurance.
Office Computer Software: A Comprehensive Review of Software Programs.
ERIC Educational Resources Information Center
Secretary, 1992
1992-01-01
Describes types of software including system software, application software, spreadsheets, accounting software, graphics packages, desktop publishing software, database, desktop and personal information management software, project and records management software, groupware, and shareware. (JOW)
Software Management Environment (SME) concepts and architecture, revision 1
NASA Technical Reports Server (NTRS)
Hendrick, Robert; Kistler, David; Valett, Jon
1992-01-01
This document presents the concepts and architecture of the Software Management Environment (SME), developed for the Software Engineering Branch of the Flight Dynamic Division (FDD) of GSFC. The SME provides an integrated set of experience-based management tools that can assist software development managers in managing and planning flight dynamics software development projects. This document provides a high-level description of the types of information required to implement such an automated management tool.
Software Management Environment (SME): Components and algorithms
NASA Technical Reports Server (NTRS)
Hendrick, Robert; Kistler, David; Valett, Jon
1994-01-01
This document presents the components and algorithms of the Software Management Environment (SME), a management tool developed for the Software Engineering Branch (Code 552) of the Flight Dynamics Division (FDD) of the Goddard Space Flight Center (GSFC). The SME provides an integrated set of visually oriented, experience-based tools that can assist software development managers in managing and planning software development projects. This document describes and illustrates the analysis functions that underlie the SME's project monitoring, estimation, and planning tools. 'SME Components and Algorithms' is a companion reference to 'SME Concepts and Architecture' and 'Software Engineering Laboratory (SEL) Relationships, Models, and Management Rules.'
Software Engineering Guidebook
NASA Technical Reports Server (NTRS)
Connell, John; Wenneson, Greg
1993-01-01
The Software Engineering Guidebook describes SEPG (Software Engineering Process Group) supported processes and techniques for engineering quality software in NASA environments. Three process models are supported: structured, object-oriented, and evolutionary rapid-prototyping. The guidebook covers software life-cycles, engineering, assurance, and configuration management. The guidebook is written for managers and engineers who manage, develop, enhance, and/or maintain software under the Computer Software Services Contract.
SAGA: A project to automate the management of software production systems
NASA Technical Reports Server (NTRS)
Campbell, Roy H.; Laliberte, D.; Render, H.; Sum, R.; Smith, W.; Terwilliger, R.
1987-01-01
The Software Automation, Generation and Administration (SAGA) project is investigating the design and construction of practical software engineering environments for developing and maintaining aerospace systems and applications software. The research includes the practical organization of the software lifecycle, configuration management, software requirements specifications, executable specifications, design methodologies, programming, verification, validation and testing, version control, maintenance, the reuse of software, software libraries, documentation, and automated management.
The research and practice of spacecraft software engineering
NASA Astrophysics Data System (ADS)
Chen, Chengxin; Wang, Jinghua; Xu, Xiaoguang
2017-06-01
In order to ensure the safety and reliability of spacecraft software products, engineering management must be applied. The paper first introduces the problems of unsystematic planning, unclear classified management, and the lack of a continuous improvement mechanism in domestic and foreign spacecraft software engineering management. It then proposes a solution for software engineering management based on a system-integrated ideology from the perspective of the spacecraft system. Finally, an application to a spacecraft is given as an example. The research can provide a reference for executing spacecraft software engineering management and improving software product quality.
Integrating automated support for a software management cycle into the TAME system
NASA Technical Reports Server (NTRS)
Sunazuka, Toshihiko; Basili, Victor R.
1989-01-01
Software managers are interested in the quantitative management of software quality, cost, and progress. An integrated software management methodology, which can be applied throughout the software life cycle for any number of purposes, is required. The TAME (Tailoring A Measurement Environment) methodology is based on the improvement paradigm and the goal/question/metric (GQM) paradigm. This methodology helps generate a software engineering process and measurement environment based on the project characteristics. SQMAR (software quality measurement and assurance technology) is a software quality metric system and methodology applied to the development processes. It is based on the feed-forward control principle. Quality target setting is carried out before the plan-do-check-action activities are performed. These methodologies are integrated to realize goal-oriented measurement, process control, and visual management. A metric-setting procedure based on the GQM paradigm, a management system called the software management cycle (SMC), and its application to a case study based on NASA/SEL data are discussed. The expected effects of SMC are quality improvement, managerial cost reduction, accumulation and reuse of experience, and a highly visual management reporting system.
Software And Systems Engineering Risk Management
2010-04-01
2004 COSO Enterprise RSKM Framework; 2006 ISO/IEC 16085 Risk Management Process; 2008 ISO/IEC 12207 Software Lifecycle Processes; 2009 ISO/IEC... Software and Systems Engineering Risk Management. John Walz, VP Technical and Conferences Activities, IEEE Computer Society; Vice-Chair Planning, Software & Systems Engineering Standards Committee, IEEE Computer Society; US TAG to ISO TMB Risk Management Working Group, Systems and Software
Pasler, Marlies; Michel, Kilian; Marrazzo, Livia; Obenland, Michael; Pallotta, Stefania; Björnsgard, Mari; Lutterbach, Johannes
2017-09-01
The purpose of this study was to characterize a new single large-area ionization chamber, the integral quality monitor system (iRT, Germany), for online and real-time beam monitoring. Signal stability, monitor unit (MU) linearity and dose rate dependence were investigated for static and arc deliveries and compared to independent ionization chamber measurements. The dose verification capability of the transmission detector system was evaluated by comparing calculated and measured detector signals for 15 volumetric modulated arc therapy plans. The error detection sensitivity was tested by introducing MLC position and linac output errors. Deviations in dose distributions between the original and error-induced plans were compared in terms of detector signal deviation, dose-volume histogram (DVH) metrics and 2D γ-evaluation (2%/2 mm and 3%/3 mm). The detector signal is linearly dependent on linac output and shows negligible (<0.4%) dose rate dependence up to 460 MU min-1. Signal stability is within 1% for cumulative detector output; substantial variations were observed for the segment-by-segment signal. Calculated versus measured cumulative signal deviations ranged from -0.16% to 2.25%. DVH, mean 2D γ-value and detector signal evaluations showed increasing deviations with respect to the reference with growing MLC and dose output errors; good correlation between DVH metrics and detector signal deviation was found (e.g. PTV Dmean: R2 = 0.97). Positional MLC errors of 1 mm and errors in linac output of 2% were identified with the transmission detector system. The extensive tests performed in this investigation show that the new transmission detector provides a stable and sensitive cumulative signal output and is suitable for beam monitoring during patient treatment.
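The 2D γ-evaluation mentioned in this record combines a dose-difference criterion (e.g. 2%) with a distance-to-agreement criterion (e.g. 2 mm). A brute-force 1D sketch of the standard gamma-index concept is given below; it is illustrative only (toy profiles, simplified search) and not the detector vendor's implementation:

```python
import numpy as np

def gamma_1d(ref, meas, x, dose_tol=0.02, dist_tol=2.0):
    """1D global gamma index (e.g. 2%/2 mm) between a reference and a measured
    dose profile sampled at positions x (mm). Brute-force search over all
    reference points for the minimum generalized distance."""
    ref = np.asarray(ref, float)
    meas = np.asarray(meas, float)
    x = np.asarray(x, float)
    d_norm = dose_tol * ref.max()  # global dose criterion (fraction of max dose)
    gammas = []
    for xi, mi in zip(x, meas):
        # generalized distance to every reference point, in criterion units
        g2 = ((x - xi) / dist_tol) ** 2 + ((ref - mi) / d_norm) ** 2
        gammas.append(np.sqrt(g2.min()))
    return np.array(gammas)

x = np.linspace(0, 50, 51)            # 1 mm sampling grid
ref = np.exp(-((x - 25) / 10) ** 2)   # Gaussian toy dose profile
meas = ref * 1.01                     # simulated 1% output error
pass_rate = 100.0 * (gamma_1d(ref, meas, x) <= 1.0).mean()
print(pass_rate)  # 100.0 -- a 1% error passes a 2%/2 mm criterion
```

A point passes when its gamma value is at most 1; the pass rate over all points is the usual summary statistic reported in such studies.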
A nuclear factor kappa B-derived inhibitor tripeptide inhibits UVB-induced photoaging process.
Oh, Jee Eun; Kim, Min Seo; Jeon, Woo-Kwang; Seo, Young Kwon; Kim, Byung-Chul; Hahn, Jang Hee; Park, Chang Seo
2014-12-01
Ultraviolet (UV) irradiation of the skin induces photoaging, which is characterized by keratinocyte hyperproliferation, generation of coarse wrinkles, and worsening of laxity and roughness. Upon UV irradiation, nuclear factor kappa B (NF-κB) is activated; it plays a key role in the signaling pathway leading to the inflammation cascade, and this activation stimulates expression of pro-inflammatory cytokines such as tumor necrosis factor alpha (TNF-α) and interleukin-1alpha (IL-1α) and the stress response gene cyclooxygenase-2 (COX-2). In addition, activation of NF-κB up-regulates the expression of matrix metalloprotease-1 (MMP-1), and consequently collagen in the dermis is degraded. In this study, the effects of an NF-κB-derived inhibitor tripeptide on UVB-induced photoaging and inflammation were investigated in vitro and in vivo. An NF-κB-derived inhibitor tripeptide (NF-κB-DVH) was synthesized based on the sequence of the dimerization region of the p65 subunit of NF-κB. Its inhibitory activity was confirmed using a chromatin immunoprecipitation assay and an in situ proximity ligation assay. The anti-photoaging and anti-inflammatory effects were analyzed by enzyme-linked immunosorbent assay (ELISA), immunoblotting and immunochemistry. NF-κB-DVH significantly decreased UV-induced expression of TNF-α, IL-1α, MMP-1 and COX-2 while increasing production of type I procollagen. The results showed that NF-κB-DVH has strong anti-inflammatory activity, probably by inhibiting the NF-κB activation pathway, and suggest its use as a novel anti-photoaging agent. Copyright © 2014 Japanese Society for Investigative Dermatology. Published by Elsevier Ireland Ltd. All rights reserved.
Interaction between duck hepatitis virus and DDT in ducks
Ragland, W.L.; Friend, Milton; Trainer, D.O.; Sladek, N.E.
1971-01-01
Injections of duck hepatitis virus (DVH) decreased, and exposure to DDT increased, hepatic microsomal mixed-function oxidase activity. Injection of DVH prior to exposure to DDT did not prevent stimulation of hepatic microsomal mixed-function oxidase activity by DDT and may have enhanced it.
NASA Astrophysics Data System (ADS)
Dréan, G.; Acosta, O.; Ospina, J. D.; Voisin, C.; Rigaud, B.; Simon, A.; Haigron, P.; de Crevoisier, R.
2013-11-01
Nowadays, the definition of patient-specific constraints in prostate cancer radiotherapy planning is based solely on dose-volume histogram (DVH) parameters. Nevertheless, those DVH models lack spatial accuracy since they do not use the complete 3D information of the dose distribution. The goal of the study was to propose an automatic workflow to define patient-specific rectal sub-regions (RSR) involved in rectal bleeding (RB) in the case of prostate cancer radiotherapy. A multi-atlas database spanning the large rectal shape variability was built from a population of 116 individuals. Non-rigid registration followed by voxel-wise statistical analysis on those templates allowed finding RSR likely correlated with RB (from a learning cohort of 63 patients). To define patient-specific RSR, weighted atlas-based segmentation with a vote was then applied to 30 test patients. Results show the potential of the method to be used for patient-specific planning of intensity modulated radiotherapy (IMRT).
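The spatial limitation noted in this abstract follows from what a DVH is: a cumulative histogram of dose over the voxels of a structure, with all 3D position information discarded. A minimal sketch of that computation (toy arrays and a hypothetical function name, not the authors' code):

```python
import numpy as np

def cumulative_dvh(dose, mask, bin_width=0.5):
    """Cumulative DVH: fraction of structure volume receiving >= each dose level.

    dose : 3D array of dose values (Gy); mask : boolean array of the same shape
    marking the structure's voxels. Returns (dose_bins, volume_fraction).
    Illustrative sketch, not a clinical tool."""
    d = dose[mask]  # doses inside the structure; spatial layout is lost here
    edges = np.arange(0.0, d.max() + bin_width, bin_width)
    # fraction of structure voxels with dose >= each bin edge
    frac = np.array([(d >= e).mean() for e in edges])
    return edges, frac

# Toy example: uniform 2 Gy everywhere, cubic structure mask
dose = np.full((10, 10, 10), 2.0)
mask = np.zeros_like(dose, dtype=bool)
mask[3:7, 3:7, 3:7] = True
bins, vf = cumulative_dvh(dose, mask)
print(vf[0])  # 1.0 -- all structure voxels receive >= 0 Gy
```

Because `dose[mask]` flattens the voxels, two very different spatial dose patterns can yield identical DVHs, which is exactly the gap the atlas-based sub-region approach above targets.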
TMT approach to observatory software development process
NASA Astrophysics Data System (ADS)
Buur, Hanne; Subramaniam, Annapurni; Gillies, Kim; Dumas, Christophe; Bhatia, Ravinder
2016-07-01
The purpose of the Observatory Software System (OSW) is to integrate all software and hardware components of the Thirty Meter Telescope (TMT) to enable observations and data capture; thus it is a complex software system that is defined by four principal software subsystems: Common Software (CSW), Executive Software (ESW), Data Management System (DMS) and Science Operations Support System (SOSS), all of which have interdependencies with the observatory control systems and data acquisition systems. Therefore, the software development process and plan must consider dependencies to other subsystems, manage architecture, interfaces and design, manage software scope and complexity, and standardize and optimize use of resources and tools. Additionally, the TMT Observatory Software will largely be developed in India through TMT's workshare relationship with the India TMT Coordination Centre (ITCC) and use of Indian software industry vendors, which adds complexity and challenges to the software development process, communication and coordination of activities and priorities as well as measuring performance and managing quality and risk. The software project management challenge for the TMT OSW is thus a multi-faceted technical, managerial, communications and interpersonal relations challenge. 
The approach TMT is using to manage this multifaceted challenge is a combination of establishing an effective geographically distributed software team (Integrated Product Team) with strong project management and technical leadership provided by the TMT Project Office (PO) and the ITCC partner to manage plans, process, performance, risk and quality, and to facilitate effective communications; establishing an effective cross-functional software management team composed of stakeholders, OSW leadership and ITCC leadership to manage dependencies and software release plans, technical complexities and change to approved interfaces, architecture, design and tool set, and to facilitate effective communications; adopting an agile-based software development process across the observatory to enable frequent software releases to help mitigate subsystem interdependencies; defining concise scope and work packages for each of the OSW subsystems to facilitate effective outsourcing of software deliverables to the ITCC partner, and to enable performance monitoring and risk management. At this stage, the architecture and high-level design of the software system has been established and reviewed. During construction each subsystem will have a final design phase with reviews, followed by implementation and testing. The results of the TMT approach to the Observatory Software development process will only be preliminary at the time of the submittal of this paper, but it is anticipated that the early results will be a favorable indication of progress.
49 CFR 236.18 - Software management control plan.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 49 Transportation 4 2013-10-01 2013-10-01 false Software management control plan. 236.18 Section... Instructions: All Systems General § 236.18 Software management control plan. (a) Within 6 months of June 6, 2005, each railroad shall develop and adopt a software management control plan for its signal and train...
49 CFR 236.18 - Software management control plan.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 49 Transportation 4 2011-10-01 2011-10-01 false Software management control plan. 236.18 Section... Instructions: All Systems General § 236.18 Software management control plan. (a) Within 6 months of June 6, 2005, each railroad shall develop and adopt a software management control plan for its signal and train...
49 CFR 236.18 - Software management control plan.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 49 Transportation 4 2012-10-01 2012-10-01 false Software management control plan. 236.18 Section... Instructions: All Systems General § 236.18 Software management control plan. (a) Within 6 months of June 6, 2005, each railroad shall develop and adopt a software management control plan for its signal and train...
49 CFR 236.18 - Software management control plan.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 49 Transportation 4 2010-10-01 2010-10-01 false Software management control plan. 236.18 Section... Instructions: All Systems General § 236.18 Software management control plan. (a) Within 6 months of June 6, 2005, each railroad shall develop and adopt a software management control plan for its signal and train...
49 CFR 236.18 - Software management control plan.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 49 Transportation 4 2014-10-01 2014-10-01 false Software management control plan. 236.18 Section... Instructions: All Systems General § 236.18 Software management control plan. (a) Within 6 months of June 6, 2005, each railroad shall develop and adopt a software management control plan for its signal and train...
ERIC Educational Resources Information Center
Eisen, Daniel
2013-01-01
This study explores how project managers, working for private federal IT contractors, experience and understand managing the development of software applications for U.S. federal government agencies. Very little is known about how they manage their projects in this challenging environment. Software development is a complex task and only grows in…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, S; Zhang, H; Zhang, B
2015-06-15
Purpose: To investigate the feasibility of a logistic function-based model to predict organ-at-risk (OAR) DVHs for IMRT planning. The predicted DVHs are compared to DVHs achieved by expert treatment planners. Methods: A logistic function is used to model the OAR dose-gradient function. This function describes the percentage of the prescription dose as a function of the normal distance to the PTV surface. The slope of the dose-gradient function is a function of the relative spatial orientation of the PTV and OARs. The OAR DVH is calculated using the OAR dose-gradient function, assuming that the dose is the same for voxels with the same normal distance to the PTV. Ten previously planned prostate IMRT plans were selected to build the model, and the following plan parameters were calculated as possible features of the model: the PTV maximum/minimum dose, PTV volume, bladder/rectum volume in the radiation field, percentage of the bladder/rectum overlapping with the PTV, and the distance between the bladder/rectum centroid and the PTV. The bladder/rectum dose-gradient function was modeled and applied to 10 additional test cases, and the predicted and achieved clinical bladder/rectum DVHs were compared at V70 (percentage of volume receiving 70 Gy and above), V65, V60, V55, V50, V45, and V40. Results: The following parameters were selected as model features: PTV volume, and distance of the centroid of the rectum/bladder to the PTV. The model was tested with 10 additional patients. For the bladder, the absolute difference (mean±standard deviation) between predicted and clinical DVHs is V70=−0.3±3.2, V65=−0.8±3.9, V60=1.5±4.3, V55=1.7±5.3, V50=−0.6±6.4, V45=0.6±6.5, and V40=0.9±5.7, and the correlation coefficient is 0.96; for the rectum, the difference is V70=1.5±3.8, V65=1.2±4.2, V60=−0.1±5.3, V55=1.0±6.6, V50=1.6±8.7, V45=1.9±9.8, and V40=1.5±10.1, and the correlation coefficient is 0.87. Conclusion: The OAR DVH can be accurately predicted using the OAR dose-gradient function in IMRT plans. This approach may be used as a quality control tool and help less experienced planners determine benchmarks for plan quality.
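The dose-gradient model described in this record can be sketched directly: a logistic falloff of dose with normal distance to the PTV surface, from which V_x values follow by thresholding. The parameters `r50` and `k` below are invented placeholders, not the values fitted from the study's geometric features:

```python
import numpy as np

def logistic_dose_fraction(r, r50=5.0, k=0.8):
    """Fraction of the prescription dose at normal distance r (mm) from the
    PTV surface. r50 (distance of 50% dose) and k (falloff slope) are
    illustrative placeholders, not fitted study values."""
    return 1.0 / (1.0 + np.exp(k * (r - r50)))

def predicted_dvh(distances, prescription=78.0,
                  levels=(40, 45, 50, 55, 60, 65, 70)):
    """Predict V_x values for an OAR from per-voxel normal distances to the PTV.

    Assumes every voxel at the same distance receives the same dose, as in the
    abstract's model. Returns {x: % of volume receiving >= x Gy}."""
    dose = prescription * logistic_dose_fraction(np.asarray(distances, float))
    return {x: 100.0 * (dose >= x).mean() for x in levels}

# Toy OAR: voxels spread from the PTV surface (0 mm) out to 30 mm
dists = np.linspace(0.0, 30.0, 301)
vx = predicted_dvh(dists)
print(vx[70])
```

As expected for a cumulative DVH, the predicted V_x values decrease monotonically as the dose level rises.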
Software Program: Software Management Guidebook
NASA Technical Reports Server (NTRS)
1996-01-01
The purpose of this NASA Software Management Guidebook is twofold. First, this document defines the core products and activities required of NASA software projects. It defines life-cycle models and activity-related methods but acknowledges that no single life-cycle model is appropriate for all NASA software projects. It also acknowledges that the appropriate method for accomplishing a required activity depends on characteristics of the software project. Second, this guidebook provides specific guidance to software project managers and team leaders in selecting appropriate life cycles and methods to develop a tailored plan for a software engineering project.
The Software Management Environment (SME)
NASA Technical Reports Server (NTRS)
Valett, Jon D.; Decker, William; Buell, John
1988-01-01
The Software Management Environment (SME) is a research effort designed to utilize the past experiences and results of the Software Engineering Laboratory (SEL) and to incorporate this knowledge into a tool for managing projects. SME provides the software development manager with the ability to observe, compare, predict, analyze, and control key software development parameters such as effort, reliability, and resource utilization. The major components of the SME, the architecture of the system, and examples of the functionality of the tool are discussed.
Avoidable Software Procurements
2012-09-01
Keywords: software license, software usage, ELA, Software as a Service, SaaS, Software Asset... PaaS: Platform as a Service; SaaS: Software as a Service; SAM: Software Asset Management; SMS: System Management Server; SEWP: Solutions for Enterprise Wide... With the delivery of full Cloud Services, we will see the transition of the Cloud Computing service model from IaaS to SaaS, or Software as a Service.
Lorenzetti, Diane L; Ghali, William A
2013-11-15
Reference management software programs enable researchers to more easily organize and manage large volumes of references typically identified during the production of systematic reviews. The purpose of this study was to determine the extent to which authors are using reference management software to produce systematic reviews; identify which programs are used most frequently and rate their ease of use; and assess the degree to which software usage is documented in published studies. We reviewed the full text of systematic reviews published in core clinical journals indexed in ACP Journal Club from 2008 to November 2011 to determine the extent to which reference management software usage is reported in published reviews. We surveyed corresponding authors to verify and supplement information in published reports, and gather frequency and ease-of-use data on individual reference management programs. Of the 78 researchers who responded to our survey, 79.5% reported that they had used a reference management software package to prepare their review. Of these, 4.8% reported this usage in their published studies. EndNote, Reference Manager, and RefWorks were the programs of choice for more than 98% of authors who used this software. Comments with respect to ease-of-use issues focused on the integration of this software with other programs and computer interfaces, and the sharing of reference databases among researchers. Despite underreporting of use, reference management software is frequently adopted by authors of systematic reviews. The transparency, reproducibility and quality of systematic reviews may be enhanced through increased reporting of reference management software usage.
Software Management Environment (SME) installation guide
NASA Technical Reports Server (NTRS)
Kistler, David; Jeletic, Kellyann
1992-01-01
This document contains installation information for the Software Management Environment (SME), developed for the Systems Development Branch (Code 552) of the Flight Dynamics Division of Goddard Space Flight Center (GSFC). The SME provides an integrated set of management tools that can be used by software development managers in their day-to-day management and planning activities. This document provides a list of hardware and software requirements as well as detailed installation instructions and trouble-shooting information.
Managing the Software Development Process
NASA Technical Reports Server (NTRS)
Lubelczky, Jeffrey T.; Parra, Amy
1999-01-01
The goal of any software development project is to produce a product that is delivered on time, within the allocated budget, and with the capabilities expected by the customer; unfortunately, this goal is rarely achieved. However, a properly managed project in a mature software engineering environment can consistently achieve this goal. In this paper we provide an introduction to three project success factors: a properly managed project, a competent project manager, and a mature software engineering environment. We also present an overview of the benefits of a mature software engineering environment based on 24 years of data from the Software Engineering Lab, and suggest some first steps that an organization can take to begin benefiting from this environment. The depth and breadth of software engineering exceed the scope of this paper; various references are cited with the goal of raising awareness and encouraging further investigation into software engineering and project management practices.
NASA Astrophysics Data System (ADS)
Kurz, Christopher; Landry, Guillaume; Resch, Andreas F.; Dedes, George; Kamp, Florian; Ganswindt, Ute; Belka, Claus; Raaymakers, Bas W.; Parodi, Katia
2017-11-01
Combining magnetic-resonance imaging (MRI) and proton therapy (PT) using pencil-beam scanning (PBS) may improve image-guided radiotherapy. We aimed at assessing the impact of a magnetic field on PBS-PT plan quality and robustness. Specifically, the robustness against anatomical changes and positioning errors in an MRI-guided scenario with a 30 cm radius 1.5 T magnetic field was studied for prostate PT. Five prostate cancer patients with three consecutive CT images (CT1-3) were considered. Single-field uniform dose PBS-PT plans were generated on the segmented CT1 with Monte-Carlo-based treatment planning software for inverse optimization. Plans were optimized at 90° gantry angle without B-field (no B), with ±1.5 T B-field (B and minus B), as well as at 81° gantry angle and +1.5 T (B G81). Plans were re-calculated on aligned CT2 and CT3 to study the impact of anatomical changes. Dose distributions were compared in terms of changes in DVH parameters, proton range and gamma-index pass-rates. To assess the impact of positioning errors, DVH parameters were compared for ±5 mm CT1 patient shifts in anterior-posterior (AP) and left-right (LR) direction. Proton beam deflection considerably reduced robustness against inter-fractional changes for the B scenario. Range agreement, gamma-index pass-rates and PTV V95% were significantly lower compared to no B. Improved robustness was obtained for minus B and B G81, the latter showing only minor differences to no B. The magnetic field introduced slight dosimetric changes under LR shifts. The impact of AP shifts was considerably larger, and equivalent for scenarios with and without B-field. Results suggest that robustness equivalent to PT without magnetic field can be achieved by adaptation of the treatment parameters, such as B-field orientation (minus B) with respect to the patient and/or gantry angle (B G81). 
MRI-guided PT for prostate cancer might thus be implemented without compromising robustness compared to state-of-the-art CT-guided PT.
Manager's handbook for software development, revision 1
NASA Technical Reports Server (NTRS)
1990-01-01
Methods and aids for the management of software development projects are presented. The recommendations are based on analyses and experiences of the Software Engineering Laboratory (SEL) with flight dynamics software development. The management aspects of the following subjects are described: organizing the project, producing a development plan, estimating costs, scheduling, staffing, preparing deliverable documents, using management tools, monitoring the project, conducting reviews, auditing, testing, and certifying.
Solar Asset Management Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Iverson, Aaron; Zviagin, George
Ra Power Management (RPM) has developed a cloud based software platform that manages the financial and operational functions of third party financed solar projects throughout their lifecycle. RPM’s software streamlines and automates the sales, financing, and management of a portfolio of solar assets. The software helps solar developers automate the most difficult aspects of asset management, leading to increased transparency, efficiency, and reduction in human error. More importantly, our platform will help developers save money by improving their operating margins.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rosenthal, David I.; Chambers, Mark S.; Fuller, Clifton D.
2008-11-01
Background: Intensity-modulated radiation therapy (IMRT) beams traverse nontarget normal structures not irradiated during three-dimensional conformal RT (3D-CRT) for head and neck cancer (HNC). This study estimates the doses and toxicities to nontarget structures during IMRT. Materials and Methods: Oropharyngeal cancer IMRT and 3D-CRT cases were reviewed. Dose-volume histograms (DVH) were used to evaluate radiation dose to the lip, cochlea, brainstem, occipital scalp, and segments of the mandible. Toxicity rates were compared for 3D-CRT, IMRT alone, or IMRT with concurrent cisplatin. Descriptive statistics and exploratory recursive partitioning analysis were used to estimate dose 'breakpoints' associated with observed toxicities. Results: A total of 160 patients were evaluated for toxicity; 60 had detailed DVH evaluation and 15 had 3D-CRT plan comparison. Comparing IMRT with 3D-CRT, there was significant (p ≤ 0.002) nonparametric differential dose to all clinically significant structures of interest. Thirty percent of IMRT patients had headaches and 40% had occipital scalp alopecia. A total of 76% and 38% of patients treated with IMRT alone had nausea and vomiting, compared with 99% and 68%, respectively, of those with concurrent cisplatin. IMRT had a markedly distinct toxicity profile from 3D-CRT. In recursive partitioning analysis, National Cancer Institute Common Toxicity Criteria for Adverse Effects 3.0 nausea and vomiting, scalp alopecia, and anterior mucositis were associated with reconstructed mean brainstem dose >36 Gy, occipital scalp dose >30 Gy, and anterior mandible dose >34 Gy, respectively. Conclusions: Dose reduction to specified structures during IMRT implies an increased beam path dose to alternate nontarget structures that may result in clinical toxicities that were uncommon with previous, less conformal approaches. These findings have implications for IMRT treatment planning and research, toxicity assessment, and multidisciplinary patient management.
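The breakpoint logic in this record (mean dose to a structure exceeding a threshold predicts a toxicity) reduces to a simple lookup once mean doses are extracted from the DVHs. A toy sketch follows; the plan doses below are invented examples, not study data, and only the three breakpoints reported in the abstract are used:

```python
# Toxicity 'breakpoints' (Gy) reported in the abstract for mean structure dose
BREAKPOINTS_GY = {
    "brainstem": 36.0,          # nausea and vomiting
    "occipital_scalp": 30.0,    # scalp alopecia
    "anterior_mandible": 34.0,  # anterior mucositis
}

def flag_toxicity_risk(mean_doses):
    """Return the structures whose mean dose exceeds its breakpoint.

    mean_doses : dict of structure name -> mean dose (Gy). Structures without
    a known breakpoint are never flagged. Illustrative sketch only."""
    return sorted(s for s, d in mean_doses.items()
                  if d > BREAKPOINTS_GY.get(s, float("inf")))

# Hypothetical plan: two structures exceed their breakpoints, one does not
plan = {"brainstem": 38.2, "occipital_scalp": 28.5, "anterior_mandible": 35.0}
print(flag_toxicity_risk(plan))  # ['anterior_mandible', 'brainstem']
```

Such a check could run as a plan-review screen, though the abstract's thresholds are exploratory recursive-partitioning estimates, not validated clinical limits.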
Validation of a Quality Management Metric
2000-09-01
A quality management metric (QMM) was used to measure the performance of ten software managers on Department of Defense (DoD) software development programs. Informal verification and validation of the metric compared the QMM score to an overall program success score for the entire program and yielded positive correlation. The results of applying the QMM can be used to characterize the quality of software management and can serve as a template to improve software management performance. Future work includes further refining the QMM and applying the QMM scores to provide feedback
Software Management Environment (SME) release 9.4 user reference material
NASA Technical Reports Server (NTRS)
Hendrick, R.; Kistler, D.; Manter, K.
1992-01-01
This document contains user reference material for the Software Management Environment (SME) prototype, developed for the Systems Development Branch (Code 552) of the Flight Dynamics Division (FDD) of Goddard Space Flight Center (GSFC). The SME provides an integrated set of management tools that can be used by software development managers in their day-to-day management and planning activities. This document provides an overview of the SME, a description of all functions, and detailed instructions concerning the software's installation and use.
Repository-Based Software Engineering Program: Working Program Management Plan
NASA Technical Reports Server (NTRS)
1993-01-01
Repository-Based Software Engineering Program (RBSE) is a National Aeronautics and Space Administration (NASA) sponsored program dedicated to introducing and supporting common, effective approaches to software engineering practices. The process of conceiving, designing, building, and maintaining software systems by using existing software assets that are stored in a specialized operational reuse library or repository, accessible to system designers, is the foundation of the program. In addition to operating a software repository, RBSE promotes (1) software engineering technology transfer, (2) academic and instructional support of reuse programs, (3) the use of common software engineering standards and practices, (4) software reuse technology research, and (5) interoperability between reuse libraries. This Program Management Plan (PMP) is intended to communicate program goals and objectives, describe major work areas, and define a management report and control process. This process will assist the Program Manager, University of Houston at Clear Lake (UHCL) in tracking work progress and describing major program activities to NASA management. The goal of this PMP is to make managing the RBSE program a relatively easy process that improves the work of all team members. The PMP describes work areas addressed and work efforts being accomplished by the program; however, it is not intended as a complete description of the program. Its focus is on providing management tools and management processes for monitoring, evaluating, and administering the program; and it includes schedules for charting milestones and deliveries of program products. The PMP was developed by soliciting and obtaining guidance from appropriate program participants, analyzing program management guidance, and reviewing related program management documents.
1988-09-01
Institute of Technology, Air University, in partial fulfillment of the requirements for the degree of Master of Science in Systems Management, Dexter R... The surviving fragment is a glossary of software-category abbreviations: ...management system software; Diag/Prob = diagnosis and problem solving or problem finding; GR = graphics software; Int/Transp = interoperability and ...language software; Plan/D.S. = planning and decision support or decision making; PM = program management software; SC = systems for command, control, communications
An information model for use in software management estimation and prediction
NASA Technical Reports Server (NTRS)
Li, Ningda R.; Zelkowitz, Marvin V.
1993-01-01
This paper describes the use of cluster analysis for determining the information model within collected software engineering development data at the NASA/GSFC Software Engineering Laboratory. We describe the Software Management Environment tool that allows managers to predict development attributes during early phases of a software project and the modifications we propose to allow it to develop dynamic models for better predictions of these attributes.
CrossTalk: The Journal of Defense Software Engineering. Volume 21, Number 1
2008-01-01
project management and the individual components of the software life-cycle model; it will be awarded for...software professionals that had been formally educated in software project management. The study indicated that our industry is lacking in program managers...software developments get bigger, more complicated, and more dependent on senior software professionals to get the project on the right path
An Operations Management System for the Space Station
NASA Astrophysics Data System (ADS)
Rosenthal, H. G.
1986-09-01
This paper presents an overview of the conceptual design of an integrated onboard Operations Management System (OMS). Both hardware and software concepts are presented and the integrated space station network is discussed. It is shown that using currently available software technology, an integrated software solution for Space Station management and control, implemented with OMS software, is feasible.
Continuous Risk Management: An Overview
NASA Technical Reports Server (NTRS)
Rosenberg, Linda; Hammer, Theodore F.
1999-01-01
Software risk management is important because it helps avoid disasters, rework, and overkill, but more importantly because it stimulates win-win situations. The objectives of software risk management are to identify, address, and eliminate software risk items before they become threats to success or major sources of rework. In general, good project managers are also good managers of risk. It makes good business sense for all software development projects to incorporate risk management as part of project management. The Software Assurance Technology Center (SATC) at NASA GSFC has been tasked with the responsibility for developing and teaching a systems level course for risk management that provides information on how to implement risk management. The course was developed in conjunction with the Software Engineering Institute at Carnegie Mellon University, then tailored to the NASA systems community. This is an introductory tutorial to continuous risk management based on this course. The rationale for continuous risk management and how it is incorporated into project management are discussed. The risk management structure of six functions is discussed in sufficient depth for managers to understand what is involved in risk management and how it is implemented. These functions include: (1) Identify the risks in a specific format; (2) Analyze the risk probability, impact/severity, and timeframe; (3) Plan the approach; (4) Track the risk through data compilation and analysis; (5) Control and monitor the risk; (6) Communicate and document the process and decisions.
SU-E-T-357: Electronic Compensation Technique to Deliver Total Body Dose
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lakeman, T; Wang, I; Podgorsak, M
Purpose: Total body irradiation (TBI) uses large parallel-opposed radiation fields to suppress the patient’s immune system and eradicate the residual cancer cells in preparation of the recipient for bone marrow transplant. The manual placement of lead compensators has conventionally been used to compensate for the varying thickness through the entire body in large-field TBI. The goal of this study is to pursue utilizing the modern electronic compensation technique to more accurately and efficiently deliver dose to patients in need of TBI. Methods: Treatment plans utilizing electronic compensation to deliver a total body dose were created retrospectively for patients for whom CT data had been previously acquired. Each treatment plan includes two specifically weighted pairs of opposed fields: one pair of open, large fields (collimator = 45°) to encompass the patient’s entire anatomy, and one pair of smaller fields (collimator = 0°) focused only on the thicker midsection of the patient. The optimal fluence for each of the smaller fields was calculated at a patient-specific penetration depth. Irregular surface compensators provide a more uniform dose distribution within the smaller opposed fields. Results: Dose-volume histograms (DVH) were calculated for evaluating the electronic compensation technique. In one case, the maximum body dose calculated from the DVH was reduced from 195.8% in the non-compensated plan to 165.3% in the electronically compensated plan, indicating a more uniform dose within the region of electronic compensation. The mean body dose calculated from the DVH was also reduced from 120.6% to 112.7%, indicating a more accurate delivery of the prescription dose. All calculated monitor units were well within clinically acceptable limits.
Conclusion: Electronic compensation technique for TBI will not substantially increase the beam-on time while it can significantly reduce the compensator setup time and the potential risk of errors in manually placing lead compensators.
NASA Astrophysics Data System (ADS)
Houweling, Antonetta C.; Crama, Koen; Visser, Jorrit; Fukata, Kyohei; Rasch, Coen R. N.; Ohno, Tatsuya; Bel, Arjan; van der Horst, Astrid
2017-04-01
Radiotherapy using charged particles is characterized by a low dose to the surrounding healthy organs, while delivering a high dose to the tumor. However, interfractional anatomical changes can greatly affect the robustness of particle therapy. Therefore, we compared the dosimetric impact of interfractional anatomical changes (i.e. body contour differences and gastrointestinal gas volume changes) in photon, proton and carbon ion therapy for pancreatic cancer patients. In this retrospective planning study, photon, proton and carbon ion treatment plans were created for 9 patients. Fraction dose calculations were performed using daily cone-beam CT (CBCT) images. To this end, the planning CT was deformably registered to each CBCT; gastrointestinal gas volumes were delineated on the CBCTs and copied to the deformed CT. Fraction doses were accumulated rigidly. To compare planned and accumulated dose, dose-volume histogram (DVH) parameters of the planned and accumulated dose of the different radiotherapy modalities were determined for the internal gross tumor volume, internal clinical target volume (iCTV) and organs-at-risk (OARs; duodenum, stomach, kidneys, liver and spinal cord). Photon plans were highly robust against interfractional anatomical changes. The difference between the planned and accumulated DVH parameters for the photon plans was less than 0.5% for the target and OARs. In both proton and carbon ion therapy, however, coverage of the iCTV was considerably reduced for the accumulated dose compared with the planned dose. The near-minimum dose (D98%) of the iCTV was reduced by 8% for proton therapy and by 10% for carbon ion therapy. The DVH parameters of the OARs differed less than 3% for both particle modalities. Fractionated radiotherapy using photons is highly robust against interfractional anatomical changes. In proton and carbon ion therapy, such changes can severely reduce the dose coverage of the target.
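DVH parameters such as the near-minimum dose D98% and the mean dose can be read directly off a structure's voxel doses. A minimal sketch, assuming equal voxel volumes and using synthetic dose values (the target and its numbers are invented for illustration, not taken from the study above):

```python
import numpy as np

def dvh_parameters(doses, percent=98.0):
    """Return (D_percent, mean dose) for a structure's voxel doses.

    D98% is the near-minimum dose: the dose received by at least 98% of
    the structure volume, i.e. the 2nd percentile of the voxel doses
    when all voxels have equal volume.
    """
    doses = np.asarray(doses, dtype=float)
    d_p = np.percentile(doses, 100.0 - percent)  # dose covering `percent` of the volume
    return d_p, doses.mean()

# Hypothetical voxel doses (Gy) for an illustrative target volume
rng = np.random.default_rng(0)
target = rng.normal(loc=50.0, scale=1.0, size=10_000)
d98, dmean = dvh_parameters(target, 98.0)
```

For non-uniform voxel volumes, the percentile would instead be taken over a volume-weighted cumulative histogram.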
Electronic compensation technique to deliver a total body dose
NASA Astrophysics Data System (ADS)
Lakeman, Tara E.
Purpose: Total body irradiation (TBI) uses large parallel-opposed radiation fields to suppress the patient's immune system and eradicate the residual cancer cells in preparation of the recipient for bone marrow transplant. The manual placement of lead compensators has conventionally been used to compensate for the varying thickness throughout the body in large-field TBI. The goal of this study is to pursue utilizing the modern electronic compensation technique to more accurately and efficiently deliver dose to patients in need of TBI. Method: Treatment plans utilizing electronic compensation to deliver a total body dose were created retrospectively for patients for whom CT data had been previously acquired. Each treatment plan includes two pairs of parallel-opposed fields. One pair of large fields is used to encompass the majority of the patient's anatomy. The other pair consists of very small open fields focused only on the thin bottom portion of the patient's anatomy, which requires much less radiation than the rest of the body to reach 100% of the prescribed dose. A desirable fluence pattern was manually painted within each of the larger fields for each patient to provide a more uniform distribution. Results: Dose-volume histograms (DVH) were calculated for evaluating the electronic compensation technique. In the electronically compensated plans, the maximum body doses calculated from the DVH were reduced from the conventionally-compensated plans by an average of 15%, indicating a more uniform dose. The mean body doses calculated from the electronically compensated DVH remained comparable to those of the conventionally-compensated plans, indicating an accurate delivery of the prescription dose using electronic compensation. All calculated monitor units were within clinically acceptable limits.
Conclusion: Electronic compensation technique for TBI will not increase the beam-on time beyond clinically acceptable limits while it can substantially reduce the compensator setup time and the potential risk of errors in manually placing lead compensators.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Takahashi, Yutaka; Verneris, Michael R.; Dusenbery, Kathryn E.
Purpose: To report potential dose heterogeneity leading to underdosing at different skeletal sites in total marrow irradiation (TMI) with helical tomotherapy due to the thread effect and provide possible solutions to reduce this effect. Methods and Materials: Nine cases were divided into 2 groups based on patient size, defined as maximum left-to-right arm distance (mLRD): small mLRD (≤47 cm) and large mLRD (>47 cm). TMI treatment planning was conducted by varying the pitch and modulation factor while the jaw size (5 cm) was kept fixed. Ripple amplitude, defined as the peak-to-trough dose relative to the average dose due to the thread effect, and the dose-volume histogram (DVH) parameters for the 9 cases with various mLRD were analyzed in different skeletal regions at off-axis (eg, bones of the arm or femur), at the central axis (eg, vertebrae), and in the planning target volume (PTV), defined as the entire skeleton plus a 1-cm margin. Results: Average ripple amplitude for a pitch of 0.430, known as one of the magic pitches that reduce the thread effect, was 9.2% at 20 cm off-axis. No significant differences in DVH parameters of PTV, vertebrae, or femur were observed between small and large mLRD groups for a pitch of ≤0.287. Conversely, in the bones of the arm, average differences in the volume receiving 95% and 107% dose (V95 and V107, respectively) between large and small mLRD groups were 4.2% (P=.016) and 16% (P=.016), respectively. Strong correlations were found between mLRD and ripple amplitude (rs=.965), mLRD and V95 (rs=−.742), and mLRD and V107 (rs=.870) of bones of the arm. Conclusions: Thread effect significantly influences DVH parameters in the bones of the arm for large mLRD patients. By implementing a favorable pitch value and adjusting arm position, peripheral dose heterogeneity could be reduced.
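Ripple amplitude as defined above (peak-to-trough dose relative to the average dose) is straightforward to compute from a longitudinal dose profile. A sketch using a synthetic sinusoidal profile standing in for the thread-effect ripple (all numbers are illustrative, not from the study):

```python
import numpy as np

def ripple_amplitude(longitudinal_dose):
    """Peak-to-trough dose variation relative to the mean dose (%),
    as defined for the tomotherapy thread effect."""
    d = np.asarray(longitudinal_dose, dtype=float)
    return 100.0 * (d.max() - d.min()) / d.mean()

# Illustrative longitudinal profile: ~±4.5% sinusoidal ripple around 100%
z = np.linspace(0.0, 10.0, 500)                   # position along the couch (cm)
dose = 100.0 + 4.5 * np.sin(2 * np.pi * z / 2.5)  # ripple period of 2.5 cm
amp = ripple_amplitude(dose)                      # close to 9% peak-to-trough
```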
DOE Office of Scientific and Technical Information (OSTI.GOV)
Isohashi, Fumiaki, E-mail: isohashi@radonc.med.osaka-u.ac.j; Yoshioka, Yasuo; Koizumi, Masahiko
2010-07-01
Purpose: The purpose of this study was to reconfirm our previous findings that the rectal dose and source strength both affect late rectal bleeding after high-dose-rate intracavitary brachytherapy (HDR-ICBT), by using a rectal dose calculated in accordance with the definitions of the International Commission on Radiation Units and Measurements Report 38 (ICRU_RP) or of dose-volume histogram (DVH) parameters by the Groupe Europeen de Curietherapie of the European Society for Therapeutic Radiology and Oncology. Methods and Materials: Sixty-two patients who underwent HDR-ICBT and were followed up for 1 year or more were studied. The rectal dose for ICBT was calculated by using the ICRU_RP based on orthogonal radiographs or the DVH parameters based on computed tomography (CT). The total dose was calculated as the biologically equivalent dose expressed in 2-Gy fractions (EQD2). The relationship between averaged source strength or the EQD2 and late rectal bleeding was then analyzed. Results: When patients were divided into four groups according to rectal EQD2 (≥ or
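The EQD2 conversion used above follows the standard linear-quadratic relation EQD2 = D(d + α/β)/(2 + α/β), with D = nd the total physical dose and d the dose per fraction. A minimal sketch; the α/β of 3 Gy and the 6 Gy × 4 fraction example are common illustrative values for late rectal effects, not figures from this study:

```python
def eqd2(dose_per_fraction, n_fractions, alpha_beta=3.0):
    """Biologically equivalent dose in 2-Gy fractions (linear-quadratic model).

    EQD2 = D * (d + a/b) / (2 + a/b), where D = n*d is the total physical
    dose, d the dose per fraction, and a/b the tissue alpha/beta ratio
    (3 Gy here, a commonly assumed value for late-responding rectum).
    """
    total = dose_per_fraction * n_fractions
    return total * (dose_per_fraction + alpha_beta) / (2.0 + alpha_beta)

# Illustrative HDR schedule: 6 Gy x 4 fractions at the rectal reference point
print(round(eqd2(6.0, 4), 1))  # -> 43.2
```

Note that a 2 Gy-per-fraction schedule maps to itself, which is a quick sanity check on any EQD2 implementation.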
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, J; Eldib, A; Ma, C
2016-06-15
Purpose: Dose-volume histogram (DVH) is widely used for plan evaluation in radiation treatment. The concept of dose-mass histogram (DMH) is expected to provide a more representative description as it accounts for heterogeneity in tissue density. This study is intended to assess the difference between DVH and DMH for evaluating treatment planning quality. Methods: 12 lung cancer treatment plans were exported from the treatment planning system. DVHs for the planning target volume (PTV), the normal lung and other structures of interest were calculated. DMHs were calculated in a similar way as DVHs except that the voxel density converted from the CT number was used in tallying the dose histogram bins. The equivalent uniform dose (EUD) was calculated based on voxel volume and mass, respectively. The normal tissue complication probability (NTCP) in relation to the EUD was calculated for the normal lung to provide a quantitative comparison of DVHs and DMHs for evaluating the radiobiological effect. Results: Large differences were observed between DVHs and DMHs for lungs and PTVs. For PTVs with dense tumor cores, DMHs are higher than DVHs due to larger mass weighting in the high-dose conformal core regions. For the normal lungs, DMHs can either be higher or lower than DVHs depending on the target location within the lung. When the target is close to the lower lung, DMHs show higher values than DVHs because the lower lung has higher density than the central portion or the upper lung. DMHs are lower than DVHs for targets in the upper lung. The calculated NTCPs showed a large range of difference between DVHs and DMHs. Conclusion: The heterogeneity of lung can be well considered using DMH for evaluating target coverage and normal lung pneumonitis. Further studies are warranted to quantify the benefits of DMH over DVH for plan quality evaluation.
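The DVH/DMH distinction above amounts to a choice of per-voxel weights when tallying dose: equal (volume) weights for a DVH-style EUD versus density-proportional (mass) weights for a DMH-style EUD. A sketch with invented voxel doses and densities mimicking a dense tumor core inside low-density lung:

```python
import numpy as np

def eud(dose, weights, a):
    """Generalized equivalent uniform dose with arbitrary voxel weights.

    Equal weights give the DVH-based (volume-weighted) EUD; weighting each
    voxel by its mass (density * voxel volume) gives the DMH-based EUD.
    """
    dose = np.asarray(dose, dtype=float)
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                      # normalize to fractional weights
    return (np.sum(w * dose ** a)) ** (1.0 / a)

# Illustrative voxels: dense high-dose tumor core, low-density periphery
# (densities in g/cm^3, doses in Gy; assumed values, equal voxel volumes)
dose = np.array([60.0, 60.0, 20.0, 20.0])
density = np.array([1.0, 1.0, 0.3, 0.3])
eud_volume = eud(dose, np.ones_like(dose), a=1.0)  # mean dose, DVH-style
eud_mass = eud(dose, density, a=1.0)               # mass-weighted, DMH-style
```

With a dense high-dose core, the mass-weighted EUD exceeds the volume-weighted one, matching the trend reported above for PTVs with dense tumor cores.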
Dose calculation and verification of the Vero gimbal tracking treatment delivery
NASA Astrophysics Data System (ADS)
Prasetio, H.; Wölfelschneider, J.; Ziegler, M.; Serpa, M.; Witulla, B.; Bert, C.
2018-02-01
The Vero linear accelerator delivers dynamic tumor tracking (DTT) treatment using a gimbal motion. However, the availability of treatment planning systems (TPS) to simulate DTT is limited. This study aims to implement and verify the gimbal tracking beam geometry in the dose calculation. Gimbal tracking was implemented by rotating the reference CT outside the TPS according to the ring, gantry, and gimbal tracking position obtained from the tracking log file. The dose was calculated using these rotated CTs. The geometric accuracy was verified by comparing calculated and measured film response using a ball bearing phantom. The dose was verified by comparing calculated 2D dose distributions and film measurements in a ball bearing and a homogeneous phantom using a gamma criterion of 2%/2 mm. The effect of implementing the gimbal tracking beam geometry in a 3D patient data dose calculation was evaluated using dose volume histograms (DVH). Geometrically, the gimbal tracking implementation accuracy was <0.94 mm. The isodose lines agreed with the film measurement. The largest dose difference of 9.4% was observed at maximum tilt positions with an isocenter and target separation of 17.51 mm. Dosimetrically, gamma passing rates were >98.4%. The introduction of the gimbal tracking beam geometry in the dose calculation shifted the DVH curves by 0.05%-1.26% for the phantom geometry and by 5.59% for the patient CT dataset. This study successfully demonstrates a method to incorporate the gimbal tracking beam geometry into dose calculations. By combining CT rotation and MU distribution according to the log file, the TPS was able to simulate the Vero tracking treatment dose delivery. The DVH analysis from the gimbal tracking dose calculation revealed changes in the dose distribution during gimbal DTT that are not visible with static dose calculations.
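The 2%/2 mm gamma comparison used above for the film verification can be illustrated with a minimal 1D implementation. The Gaussian profile, grid, and 0.5 mm shift below are invented for illustration; clinical gamma tools operate on 2D/3D dose grids with interpolation:

```python
import numpy as np

def gamma_1d(ref_pos, ref_dose, eval_pos, eval_dose, dd=0.02, dta=2.0):
    """Minimal global 1D gamma index (2%/2 mm by default).

    For each reference point, gamma is the minimum over all evaluated
    points of sqrt((dr/dta)^2 + (dD/(dd*Dmax))^2); a point passes when
    gamma <= 1. Positions in mm; dd is a fraction of the global maximum.
    """
    dmax = ref_dose.max()
    gammas = []
    for rp, rd in zip(ref_pos, ref_dose):
        dr = (eval_pos - rp) / dta              # distance-to-agreement term
        dD = (eval_dose - rd) / (dd * dmax)     # dose-difference term
        gammas.append(np.sqrt(dr ** 2 + dD ** 2).min())
    return np.array(gammas)

# Illustrative Gaussian profile shifted by 0.5 mm (well within 2 mm DTA)
x = np.linspace(-10.0, 10.0, 201)
ref = 100.0 * np.exp(-x ** 2 / 50.0)
evl = 100.0 * np.exp(-(x - 0.5) ** 2 / 50.0)
pass_rate = 100.0 * np.mean(gamma_1d(x, ref, x, evl) <= 1.0)
```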
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mashouf, Shahram; Department of Radiation Oncology, Sunnybrook Odette Cancer Centre, Toronto, Ontario; Fleury, Emmanuelle
Purpose: The inhomogeneity correction factor (ICF) method provides heterogeneity correction for the fast-calculation TG43 formalism in seed brachytherapy. This study compared ICF-corrected plans to their standard TG43 counterparts, looking at their capacity to assess inadequate coverage and/or risk of any skin toxicities for patients who received permanent breast seed implant (PBSI). Methods and Materials: Two-month postimplant computed tomography scans and plans of 140 PBSI patients were used to calculate dose distributions by using the TG43 and the ICF methods. Multiple dose-volume histogram (DVH) parameters of clinical target volume (CTV) and skin were extracted and compared for both ICF and TG43 dose distributions. Short-term (desquamation and erythema) and long-term (telangiectasia) skin toxicity data were available on 125 and 110 of the patients, respectively, at the time of the study. The predictive value of each DVH parameter of skin was evaluated using the area under the receiver operating characteristic (ROC) curve for each toxicity endpoint. Results: Dose-volume histogram parameters of CTV, calculated using the ICF method, showed an overall decrease compared to TG43, whereas those of skin showed an increase, confirming previously reported findings of the impact of heterogeneity with low-energy sources. The ICF methodology enabled us to distinguish patients for whom the CTV V100 and V90 are up to 19% lower compared to TG43, which could present a risk of recurrence not detected when heterogeneities are not accounted for. The ICF method also led to an increase in the prediction of desquamation, erythema, and telangiectasia for 91% of skin DVH parameters studied. Conclusions: The ICF methodology has the advantage of distinguishing any inadequate dose coverage of CTV due to breast heterogeneity, which can be missed by TG43. Use of ICF correction also led to an increase in prediction accuracy of skin toxicities in most cases.
Gomez, Daniel R.; Tucker, Susan L.; Martel, Mary K.; Mohan, Radhe; Balter, Peter A.; Guerra, Jose Luis Lopez; Liu, Hongmei; Komaki, Ritsuko; Cox, James D.; Liao, Zhongxing
2014-01-01
Introduction We analyzed the ability of various patient- and treatment-related factors to predict radiation-induced esophagitis (RE) in patients with non-small cell lung cancer (NSCLC) treated with three-dimensional (3D) conformal radiation therapy (3D-CRT), intensity-modulated radiation therapy (IMRT), or proton beam therapy (PBT). Methods and Materials Patients were treated for NSCLC with 3D-CRT, IMRT, or PBT at MD Anderson from 2000 to 2008 and had full dose-volume histogram (DVH) data available. The endpoint was severe (grade ≥3) RE. The Lyman-Kutcher-Burman (LKB) model was used to analyze RE as a function of the fractional esophageal DVH, with clinical variables included as dose-modifying factors. Results Overall, 652 patients were included: 405 treated with 3D-CRT, 139 with IMRT, and 108 with PBT; corresponding rates of grade ≥3 RE were 8%, 28%, and 6%, with a median time to onset of 42 days (range 11–93 days). A fit of the fractional-DVH LKB model demonstrated that the volume parameter n was significantly different from 1 (p=0.046), indicating that high doses to small volumes are more predictive than mean esophageal dose. The model fit was better for 3D-CRT and PBT than for IMRT. Including receipt of concurrent chemotherapy as a dose-modifying factor significantly improved the LKB model (p=0.005), and the model was further improved by including a variable representing treatment with >30 fractions. Examining individual types of chemotherapy agents revealed a trend toward receipt of concurrent taxanes and increased risk of RE (p=0.105). Conclusions The fractional dose (dose rate) and number of fractions (total dose) distinctly affect the risk of severe RE estimated using the LKB model, and concurrent chemotherapy improves the model fit. This risk of severe RE is underestimated by this model in patients receiving IMRT. PMID:22920974
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gomez, Daniel R., E-mail: dgomez@mdanderson.org; Tucker, Susan L.; Martel, Mary K.
2012-11-15
Introduction: We analyzed the ability of various patient- and treatment-related factors to predict radiation-induced esophagitis (RE) in patients with non-small cell lung cancer (NSCLC) treated with three-dimensional conformal radiation therapy (3D-CRT), intensity-modulated radiation therapy (IMRT), or proton beam therapy (PBT). Methods and Materials: Patients were treated for NSCLC with 3D-CRT, IMRT, or PBT at MD Anderson from 2000 to 2008 and had full dose-volume histogram (DVH) data available. The endpoint was severe (grade ≥3) RE. The Lyman-Kutcher-Burman (LKB) model was used to analyze RE as a function of the fractional esophageal DVH, with clinical variables included as dose-modifying factors. Results: Overall, 652 patients were included: 405 patients were treated with 3D-CRT, 139 with IMRT, and 108 with PBT; corresponding rates of grade ≥3 RE were 8%, 28%, and 6%, respectively, with a median time to onset of 42 days (range, 11-93 days). A fit of the fractional DVH LKB model demonstrated that the fractional effective dose was significantly different from 1 (the fractional mean dose) (P=.046), indicating that high doses to small volumes are more predictive than mean esophageal dose. The model fit was better for 3D-CRT and PBT than for IMRT. Including receipt of concurrent chemotherapy as a dose-modifying factor significantly improved the LKB model (P=.005), and the model was further improved by including a variable representing treatment with >30 fractions. Examining individual types of chemotherapy agents revealed a trend toward receipt of concurrent taxanes and increased risk of RE (P=.105). Conclusions: Fractional dose (dose rate) and number of fractions (total dose) distinctly affect the risk of severe RE, estimated using the LKB model, and concurrent chemotherapy improves the model fit. This risk of severe RE is underestimated by this model in patients receiving IMRT.
Gomez, Daniel R; Tucker, Susan L; Martel, Mary K; Mohan, Radhe; Balter, Peter A; Lopez Guerra, Jose Luis; Liu, Hongmei; Komaki, Ritsuko; Cox, James D; Liao, Zhongxing
2012-11-15
We analyzed the ability of various patient- and treatment-related factors to predict radiation-induced esophagitis (RE) in patients with non-small cell lung cancer (NSCLC) treated with three-dimensional conformal radiation therapy (3D-CRT), intensity-modulated radiation therapy (IMRT), or proton beam therapy (PBT). Patients were treated for NSCLC with 3D-CRT, IMRT, or PBT at MD Anderson from 2000 to 2008 and had full dose-volume histogram (DVH) data available. The endpoint was severe (grade ≥3) RE. The Lyman-Kutcher-Burman (LKB) model was used to analyze RE as a function of the fractional esophageal DVH, with clinical variables included as dose-modifying factors. Overall, 652 patients were included: 405 patients were treated with 3D-CRT, 139 with IMRT, and 108 with PBT; corresponding rates of grade ≥3 RE were 8%, 28%, and 6%, respectively, with a median time to onset of 42 days (range, 11-93 days). A fit of the fractional DVH LKB model demonstrated that the fractional effective dose was significantly different from 1 (the fractional mean dose) (P=.046), indicating that high doses to small volumes are more predictive than mean esophageal dose. The model fit was better for 3D-CRT and PBT than for IMRT. Including receipt of concurrent chemotherapy as a dose-modifying factor significantly improved the LKB model (P=.005), and the model was further improved by including a variable representing treatment with >30 fractions. Examining individual types of chemotherapy agents revealed a trend toward receipt of concurrent taxanes and increased risk of RE (P=.105). Fractional dose (dose rate) and number of fractions (total dose) distinctly affect the risk of severe RE, estimated using the LKB model, and concurrent chemotherapy improves the model fit. This risk of severe RE is underestimated by this model in patients receiving IMRT. Copyright © 2012 Elsevier Inc. All rights reserved.
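The LKB model used in the three records above can be sketched as follows. The parameter values (n, m, TD50) and the uniform-dose example are generic placeholders, not the fitted esophagitis parameters reported in the study:

```python
import math
import numpy as np

def lkb_ntcp(doses, volumes, n=0.5, m=0.3, td50=50.0):
    """Lyman-Kutcher-Burman NTCP sketch with illustrative parameters.

    EUD = (sum v_i * D_i^(1/n))^n, with v_i the fractional volumes and n
    the volume parameter (small n emphasizes high doses to small volumes);
    NTCP = Phi((EUD - TD50) / (m * TD50)), Phi the standard normal CDF.
    """
    v = np.asarray(volumes, dtype=float)
    v = v / v.sum()
    d = np.asarray(doses, dtype=float)
    eud = np.sum(v * d ** (1.0 / n)) ** n
    t = (eud - td50) / (m * td50)
    return eud, 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

# Uniform 50 Gy to the whole organ: EUD = TD50, so NTCP = 50% by construction
eud, ntcp = lkb_ntcp([50.0], [1.0])
```

The finding above that n differed significantly from 1 corresponds, in this formulation, to the EUD departing from the simple mean dose toward a high-dose-weighted average.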
SU-G-TeP3-14: Three-Dimensional Cluster Model in Inhomogeneous Dose Distribution
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wei, J; Penagaricano, J; Narayanasamy, G
2016-06-15
Purpose: We aim to investigate 3D cluster formation in inhomogeneous dose distributions to search for new models predicting radiation tissue damage, further leading to a new optimization paradigm for radiotherapy planning. Methods: The aggregation of voxels in the organ at risk (OAR) with dose higher than a preset threshold was chosen as the cluster, whose connectivity dictates the cluster structure. Upon selection of the dose threshold, the fractional density, defined as the fraction of voxels in the organ eligible to be part of the cluster, was determined according to the dose-volume histogram (DVH). A Monte Carlo method was implemented to establish a case pertinent to the corresponding DVH. Ones and zeros were randomly assigned to each OAR voxel with the sampling probability equal to the fractional density. Ten thousand samples were randomly generated to ensure a sufficient number of cluster sets. A recursive cluster-searching algorithm was developed to analyze the cluster with various connectivity choices such as 1-, 2-, and 3-connectivity. The mean size of the largest cluster (MSLC) from the Monte Carlo samples was taken to be a function of the fractional density. Various OARs from clinical plans were included in the study. Results: The intensive Monte Carlo study demonstrates the anticipated inverse relationship between the MSLC and the cluster connectivity, and the cluster size does not change linearly with fractional density regardless of the connectivity type. A transition of the MSLC from initially slow increase to exponential growth was observed from low to high density. The cluster sizes were found to vary within a large range and are relatively independent of the OARs. Conclusion: The Monte Carlo study revealed that the cluster size could serve as a suitable index of tissue damage (percolation cluster) and that the clinical outcome for the same DVH might be potentially different.
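The Monte Carlo procedure described above (occupy each voxel with probability equal to the fractional density, then measure the largest connected cluster) can be sketched as follows. The grid size and sample count are scaled down for illustration, and only face (1-)connectivity is shown:

```python
import numpy as np
from collections import deque

def largest_cluster_size(occ):
    """Size of the largest face-connected (1-connectivity) cluster of True
    voxels in a 3D boolean array, found by breadth-first search."""
    visited = np.zeros_like(occ, dtype=bool)
    neighbors = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]
    best = 0
    for idx in zip(*np.nonzero(occ)):
        if visited[idx]:
            continue
        size, queue = 0, deque([idx])
        visited[idx] = True
        while queue:
            x, y, z = queue.popleft()
            size += 1
            for dx, dy, dz in neighbors:
                nb = (x + dx, y + dy, z + dz)
                if (all(0 <= nb[i] < occ.shape[i] for i in range(3))
                        and occ[nb] and not visited[nb]):
                    visited[nb] = True
                    queue.append(nb)
        best = max(best, size)
    return best

# Mean size of the largest cluster (MSLC) as a function of fractional density
rng = np.random.default_rng(1)
def mslc(p, shape=(8, 8, 8), samples=50):
    return np.mean([largest_cluster_size(rng.random(shape) < p)
                    for _ in range(samples)])
```

Sweeping `p` from low to high reproduces the percolation-like behavior noted above: the MSLC grows slowly at low density and then rapidly once a system-spanning cluster forms.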
Mashouf, Shahram; Fleury, Emmanuelle; Lai, Priscilla; Merino, Tomas; Lechtman, Eli; Kiss, Alex; McCann, Claire; Pignol, Jean-Philippe
2016-03-15
The inhomogeneity correction factor (ICF) method provides heterogeneity correction for the fast-calculation TG43 formalism in seed brachytherapy. This study compared ICF-corrected plans to their standard TG43 counterparts, looking at their capacity to assess inadequate coverage and/or risk of any skin toxicities for patients who received permanent breast seed implant (PBSI). Two-month postimplant computed tomography scans and plans of 140 PBSI patients were used to calculate dose distributions by using the TG43 and the ICF methods. Multiple dose-volume histogram (DVH) parameters of clinical target volume (CTV) and skin were extracted and compared for both ICF and TG43 dose distributions. Short-term (desquamation and erythema) and long-term (telangiectasia) skin toxicity data were available on 125 and 110 of the patients, respectively, at the time of the study. The predictive value of each DVH parameter of skin was evaluated using the area under the receiver operating characteristic (ROC) curve for each toxicity endpoint. Dose-volume histogram parameters of CTV, calculated using the ICF method, showed an overall decrease compared to TG43, whereas those of skin showed an increase, confirming previously reported findings of the impact of heterogeneity with low-energy sources. The ICF methodology enabled us to distinguish patients for whom the CTV V100 and V90 are up to 19% lower compared to TG43, which could present a risk of recurrence not detected when heterogeneities are not accounted for. The ICF method also led to an increase in the prediction of desquamation, erythema, and telangiectasia for 91% of skin DVH parameters studied. The ICF methodology has the advantage of distinguishing any inadequate dose coverage of CTV due to breast heterogeneity, which can be missed by TG43. Use of ICF correction also led to an increase in prediction accuracy of skin toxicities in most cases. Copyright © 2016 Elsevier Inc. All rights reserved.
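The ROC-based predictive value mentioned above can be computed as the area under the curve via the Mann-Whitney statistic: the probability that a randomly chosen toxicity case scores higher on a DVH parameter than a randomly chosen non-toxicity case. A sketch with hypothetical skin-dose values and toxicity labels (none of these numbers come from the study):

```python
import numpy as np

def auc_mann_whitney(scores_pos, scores_neg):
    """Area under the ROC curve via the Mann-Whitney U statistic.

    Equals the probability that a randomly chosen positive (toxicity) case
    has a higher score than a randomly chosen negative case, with ties
    counted as one half.
    """
    pos = np.asarray(scores_pos, dtype=float)[:, None]
    neg = np.asarray(scores_neg, dtype=float)[None, :]
    return float(np.mean((pos > neg) + 0.5 * (pos == neg)))

# Hypothetical skin DVH-parameter values (Gy) by desquamation status
with_tox = [95.0, 110.0, 120.0, 130.0]
without_tox = [60.0, 75.0, 90.0, 95.0, 100.0]
auc = auc_mann_whitney(with_tox, without_tox)
```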
Software engineering project management - A state-of-the-art report
NASA Technical Reports Server (NTRS)
Thayer, R. H.; Lehman, J. H.
1977-01-01
The management of software engineering projects in the aerospace industry was investigated. The survey assessed such features as contract type, specification preparation techniques, software documentation required by customers, planning and cost-estimating, quality control, the use of advanced program practices, software tools and test procedures, the education levels of project managers, programmers and analysts, work assignment, automatic software monitoring capabilities, design and coding reviews, production times, success rates, and organizational structure of the projects.
Managers Handbook for Software Development
NASA Technical Reports Server (NTRS)
Agresti, W.; Mcgarry, F.; Card, D.; Page, J.; Church, V.; Werking, R.
1984-01-01
Methods and aids for the management of software development projects are presented. The recommendations are based on analyses and experiences with flight dynamics software development. The management aspects of organizing the project, producing a development plan, estimating costs, scheduling, staffing, preparing deliverable documents, using management tools, monitoring the project, conducting reviews, auditing, testing, and certifying are described.
Software Framework for Peer Data-Management Services
NASA Technical Reports Server (NTRS)
Hughes, John; Hardman, Sean; Crichton, Daniel; Hyon, Jason; Kelly, Sean; Tran, Thuy
2007-01-01
Object Oriented Data Technology (OODT) is a software framework for creating a Web-based system for exchange of scientific data that are stored in diverse formats on computers at different sites under the management of scientific peers. OODT software consists of a set of cooperating, distributed peer components that provide distributed peer-to-peer (P2P) services that enable one peer to search and retrieve data managed by another peer. In effect, computers running OODT software at different locations become parts of an integrated data-management system.
NASA Technical Reports Server (NTRS)
Church, Victor E.; Long, D.; Hartenstein, Ray; Perez-Davila, Alfredo
1992-01-01
This report is one of a series discussing configuration management (CM) topics for Space Station ground systems software development. It provides a description of the Software Support Environment (SSE)-developed Software Test Management (STM) capability, and discusses the possible use of this capability for management of developed software during testing performed on target platforms. This is intended to supplement the formal documentation of STM provided by the SSE Project. How STM can be used to integrate contractor CM and formal CM for software before delivery to operations is described. STM provides a level of control that is flexible enough to support integration and debugging, but sufficiently rigorous to ensure the integrity of the testing process.
NASA's Approach to Software Assurance
NASA Technical Reports Server (NTRS)
Wetherholt, Martha
2015-01-01
NASA defines software assurance as: the planned and systematic set of activities that ensure conformance of software life cycle processes and products to requirements, standards, and procedures via quality, safety, reliability, and independent verification and validation. NASA's implementation of this approach to the quality, safety, reliability, security, and verification and validation of software is brought together in one discipline, software assurance. Organizationally, NASA has software assurance at each NASA center, a Software Assurance Manager at NASA Headquarters, a Software Assurance Technical Fellow (currently the same person as the SA Manager), and an Independent Verification and Validation Organization with its own facility. As an umbrella risk mitigation strategy for safety and mission success assurance of NASA's software, software assurance covers a wide area and is well structured to address the dynamic changes in how software is developed, used, and managed, as well as its increasingly complex functionality. Being flexible, risk based, and prepared for challenges in software at NASA is essential, especially as much of our software is unique for each mission.
Software for Managing Personal Files.
ERIC Educational Resources Information Center
Lundeen, Gerald
1989-01-01
Discusses the special characteristics of personal file management software and compares four microcomputer software packages: Notebook II with Bibliography and Convert, Pro-Cite with Biblio-Links, askSam, and Reference Manager. Each package is evaluated in terms of the user interface, file maintenance, retrieval capabilities, output, and…
Software Development Management: Empirical and Analytical Perspectives
ERIC Educational Resources Information Center
Kang, Keumseok
2011-01-01
Managing software development is a very complex activity because it must deal with people, organizations, technologies, and business processes. My dissertation consists of three studies that examine software development management from various perspectives. The first study empirically investigates the impacts of prior experience with similar…
Guideline for Software Documentation Management.
ERIC Educational Resources Information Center
National Bureau of Standards (DOC), Washington, DC.
Designed as a basic reference for federal personnel concerned with the development, maintenance, enhancement, control, and management of computer-based systems, this manual provides a general overview of the software development process and software documentation issues so that managers can assess their own documentation requirements. Reference is…
The integration of the risk management process with the lifecycle of medical device software.
Pecoraro, F; Luzi, D
2014-01-01
The application of software in the Medical Device (MD) domain has become central to the improvement of diagnoses and treatments. The new European regulations that specifically address software as an important component of MD, require complex procedures to make software compliant with safety requirements, introducing thereby new challenges in the qualification and classification of MD software as well as in the performance of risk management activities. Under this perspective, the aim of this paper is to propose an integrated framework that combines the activities to be carried out by the manufacturer to develop safe software within the development lifecycle based on the regulatory requirements reported in US and European regulations as well as in the relevant standards and guidelines. A comparative analysis was carried out to identify the main issues related to the application of the current new regulations. In addition, standards and guidelines recently released to harmonise procedures for the validation of MD software have been used to define the risk management activities to be carried out by the manufacturer during the software development process. This paper highlights the main issues related to the qualification and classification of MD software, providing an analysis of the different regulations applied in Europe and the US. A model that integrates the risk management process within the software development lifecycle has been proposed too. It is based on regulatory requirements and considers software risk analysis as a central input to be managed by the manufacturer already at the initial stages of the software design, in order to prevent MD failures. Relevant changes in the process of MD development have been introduced with the recognition of software being an important component of MDs as stated in regulations and standards. 
This implies the performance of highly iterative processes that have to integrate the risk management in the framework of software development. It also makes it necessary to involve both medical and software engineering competences to safeguard patient and user safety.
NASA Astrophysics Data System (ADS)
Hardiyanti, Y.; Haekal, M.; Waris, A.; Haryanto, F.
2016-08-01
This research compares a quadratic optimization program for Intensity Modulated Radiation Therapy Treatment Planning (IMRTP) with the Computational Environment for Radiotherapy Research (CERR) software. We assumed that the number of beams used by the treatment planner was 9 or 13. The case used an energy of 6 MV with a Source Skin Distance (SSD) of 100 cm from the target volume. Dose calculation used the Quadratic Infinite beam (QIB) method from CERR. CERR was used in a comparison study between the Gauss primary threshold method and the Gauss primary exponential method. For the lung cancer case, threshold values of 0.01 and 0.004 were used. The dose output was analyzed in the form of DVHs from CERR. The maximum dose distributions obtained were on the planning target volume (PTV), clinical target volume (CTV), gross tumor volume (GTV), liver, and skin when the dose calculation used the exponential method with 9 beams. When the dose calculation used the threshold method with 13 beams, the maximum dose distributions obtained were on the PTV, GTV, heart, and skin.
NASA Astrophysics Data System (ADS)
Yamashita, T.; Akagi, T.; Aso, T.; Kimura, A.; Sasaki, T.
2012-11-01
The pencil beam algorithm (PBA) is reasonably accurate and fast. It is, therefore, the primary method used in routine clinical treatment planning for proton radiotherapy; still, it needs to be validated for use in highly inhomogeneous regions. To investigate the effect of patient inhomogeneity, the PBA was compared with Monte Carlo (MC) simulation. A software framework was developed for the MC simulation of radiotherapy based on Geant4. The anatomical sites selected for the comparison were the head/neck, liver, lung, and pelvis regions. The dose distributions calculated by the two methods in selected examples were compared, as well as the dose volume histograms (DVHs) derived from the dose distributions. The comparison of the off-center ratio (OCR) at the iso-center showed good agreement between the PBA and MC, while discrepancies were seen around the distal fall-off regions. Whereas MC showed a fine structure in the OCR in the distal fall-off region, the PBA showed a smoother distribution. The fine structures in the MC calculation appeared downstream of very low-density regions. Comparison of DVHs showed that most of the target volumes were similarly covered, while some OARs located around the distal region received a higher dose when calculated by MC than by the PBA.
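Several records in this listing compare cumulative DVHs produced by different systems. As background, a cumulative DVH reports, for each dose level, the fraction of a structure's volume receiving at least that dose. The following is a minimal illustrative sketch (not any of the cited implementations); the function name, the hypothetical voxel doses, and the assumption of uniform voxel volume are all invented for illustration:

```python
def cumulative_dvh(voxel_doses, dose_bins):
    """Fraction of structure volume receiving at least each bin dose.

    Assumes every voxel has the same volume, so volume fractions reduce
    to voxel counts. Real TPS implementations additionally handle
    partial-volume voxels and interpolation (see supersampling above).
    """
    n = len(voxel_doses)
    return [sum(1 for d in voxel_doses if d >= b) / n for b in dose_bins]

# Hypothetical voxel doses (Gy) for a tiny 4-voxel structure
doses = [10.0, 20.0, 30.0, 40.0]
dvh = cumulative_dvh(doses, [0.0, 15.0, 25.0, 35.0, 45.0])
print(dvh)  # [1.0, 0.75, 0.5, 0.25, 0.0]
```

Discrepancies like those reported above typically enter through how each system bins doses and handles voxels that straddle the structure boundary, which this sketch deliberately ignores.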
Code of Federal Regulations, 2012 CFR
2012-07-01
... such as disk, tape) or type (e.g., data bases, applications software, data base management software, utilities), sufficient to reflect ... timeliness of the cost data, the FBI or other representatives of the Government shall have the right to ...
The dynamics of software development project management: An integrative systems dynamic perspective
NASA Technical Reports Server (NTRS)
Vandervelde, W. E.; Abdel-Hamid, T.
1984-01-01
Rather than continuing to focus on software development projects per se, the system dynamics modeling approach outlined is extended to investigate a broader set of issues pertaining to the software development organization. Rather than trace the life cycle(s) of one or more software projects, the focus is on the operations of a software development department as a continuous stream of software products is developed, placed into operation, and maintained. A number of research questions are "ripe" for investigation, including: (1) the efficacy of different organizational structures in different software development environments, (2) personnel turnover, (3) the impact of management approaches such as management by objectives, and (4) the organizational/environmental determinants of productivity.
Ada and software management in NASA: Assessment and recommendations
NASA Technical Reports Server (NTRS)
1989-01-01
Recent NASA missions have required software systems that are larger, more complex, and more critical than NASA software systems of the past. The Ada programming language and the software methods and support environments associated with it are seen as potential breakthroughs in meeting NASA's software requirements. The findings of a study by the Ada and Software Management Assessment Working Group (ASMAWG) are presented. The study was chartered to perform three tasks: (1) assess the agency's ongoing and planned Ada activities; (2) assess the infrastructure (standards, policies, and internal organizations) supporting software management and the Ada activities; and (3) present an Ada implementation and use strategy appropriate for NASA over the next 5 years.
15 CFR 995.25 - Quality management system.
Code of Federal Regulations, 2012 CFR
2012-01-01
... management system are those defined in this part. The quality management system must ensure that the... type approved conversion software is maintained by a third party, CEVAD shall ensure that no changes made to the conversion software render the type approval of the conversion software invalid, and shall...
15 CFR 995.25 - Quality management system.
Code of Federal Regulations, 2012 CFR
2012-01-01
... management system are those defined in this part. The quality management system must ensure that the... type approved conversion software is maintained by a third party, CEVAD shall ensure that no changes made to the conversion software render the type approval of the conversion software invalid, and shall...
Academic Web Authoring Mulitmedia Development and Course Management Tools
ERIC Educational Resources Information Center
Halloran, Margaret E.
2005-01-01
Course management software enables faculty members to learn one software package for web-based curriculum, assessment, synchronous and asynchronous discussions, collaborative work, multimedia and interactive resource development. There are as many as 109 different course management software packages on the market and several studies have evaluated…
Management Aspects of Software Maintenance.
1984-09-01
The software manager must be educated in the complex nature of software maintenance to be able to properly evaluate and manage the software maintenance effort. ... Maintenance and improvement together may be called "software evolution." ... Each complaint of error or request for modification is also studied in order to determine what action needs to be taken.
System integration test plan for HANDI 2000 business management system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilson, D.
This document presents the system integration test plan for the Commercial-Off-The-Shelf (COTS) PassPort (PP) and PeopleSoft (PS) software, and for custom software created to work with the COTS products. The PP software is an integrated application for AP, Contract Management, Inventory Management, Purchasing, and Material Safety Data Sheets. The PS software is an integrated application for Project Costing, General Ledger, Human Resources/Training, Payroll, and Base Benefits.
Software Acquisition Risk Management Key Process Area (KPA) - A Guidebook Version 1.0.
1997-08-01
The Effectiveness of Software Project Management Practices: A Quantitative Measurement
2011-03-01
... Assessment (SPMMA) model (Ramli, 2007). The purpose of the SPMMA was to help a company measure the strengths and weaknesses of its software project ... practices. Fuazi and Ramli presented a model to assess software project management practices using their Software Project Management Maturity ... The SPMMA was carried out on one mid-size Information Technology (IT) company, based on the questionnaire responses, interviews, and discussions ...
ISWHM: Tools and Techniques for Software and System Health Management
NASA Technical Reports Server (NTRS)
Schumann, Johann; Mengshoel, Ole J.; Darwiche, Adnan
2010-01-01
This presentation presents status and results of research on Software Health Management done within the NRA "ISWHM: Tools and Techniques for Software and System Health Management." Topics include: Ingredients of a Guidance, Navigation, and Control System (GN and C); Selected GN and C Testbed example; Health Management of major ingredients; ISWHM testbed architecture; and Conclusions and next Steps.
Writing references and using citation management software.
Sungur, Mukadder Orhan; Seyhan, Tülay Özkan
2013-09-01
The correct citation of references is obligatory to gain scientific credibility, to honor the original ideas of previous authors and to avoid plagiarism. Currently, researchers can easily find, cite and store references using citation management software. In this review, two popular citation management software programs (EndNote and Mendeley) are summarized.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-27
... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-677] In the Matter of: Certain Course Management System Software Products; Notice of Commission Determination Not To Review an Initial... course management system software products that infringe certain claims of United States Patent No. 6,988...
76 FR 12617 - Airworthiness Directives; The Boeing Company Model 777-200 and -300 Series Airplanes
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-08
... installing new operational software for the electrical load management system and configuration database... the electrical load management system operational software and configuration database software, in... Management, P.O. Box 3707, MC 2H-65, Seattle, Washington 98124-2207; telephone 206- 544-5000, extension 1...
Concept Development for Software Health Management
NASA Technical Reports Server (NTRS)
Riecks, Jung; Storm, Walter; Hollingsworth, Mark
2011-01-01
This report documents the work performed by Lockheed Martin Aeronautics (LM Aero) under NASA contract NNL06AA08B, delivery order NNL07AB06T. The Concept Development for Software Health Management (CDSHM) program was a NASA-funded effort sponsored by the Integrated Vehicle Health Management Project, one of the four pillars of the NASA Aviation Safety Program. The CDSHM program focused on defining a structured approach to software health management (SHM) through the development of a comprehensive failure taxonomy that is used to characterize the fundamental failure modes of safety-critical software.
GSC configuration management plan
NASA Technical Reports Server (NTRS)
Withers, B. Edward
1990-01-01
The tools and methods used for the configuration management of the artifacts (including software and documentation) associated with the Guidance and Control Software (GCS) project are described. The GCS project is part of a software error studies research program. Three implementations of GCS are being produced in order to study the fundamental characteristics of the software failure process. The Code Management System (CMS) is used to track and retrieve versions of the documentation and software. Application of the CMS for this project is described and the numbering scheme is delineated for the versions of the project artifacts.
The personal receiving document management and the realization of email function in OAS
NASA Astrophysics Data System (ADS)
Li, Biqing; Li, Zhao
2017-05-01
This software is an independent system developed with the popular B/S (browser/server) architecture and ASP.NET technology, using the Windows 7 operating system, Visual Studio 2008, and a Microsoft SQL Server 2005 database as the development platform. It is suitable for small and medium enterprises, contains personal office, scientific research project management, and system management functions, runs independently in the relevant environment, and addresses practical needs.
Glaser, S; Warfel, B; Price, J; Sinacore, J; Albuquerque, K
2012-10-01
Virtual reality simulation software (VRS - FocalSim Version 4.40 with VRS prototype, Computerized Medical Systems, St. Louis, MO) is a new radiation dose planning tool that allows for 3D visualization of the patient and the machine couch (treatment table) in relationship to the linear accelerator. This gives the radiation treatment planner a "room's-eye-view" and enhances the process of virtual simulation. The aim of this study was to compare VRS to a standard planning program (XiO - Version 4.50, Computerized Medical Systems, St. Louis, MO) with regard to the time it took to use each program and the angles chosen in each, and to determine whether there was a dosimetric benefit to using VRS. Ten patients who had undergone left-sided lumpectomies were chosen to have treatment plans generated. A partial breast irradiation (PBI) treatment plan by external beam radiation therapy (EBRT) was generated for each patient using two different methods. In the first method the full plan was generated using XiO software. In the second method beam angles were chosen using the VRS software, those angles were transferred to XiO, and the remaining part of the plan was completed using XiO (since VRS does not allow dose calculations). On average, using VRS to choose angles took about 10 minutes longer than XiO. None of the five gantry angles differed significantly between the two programs, but four of the five couch angles did. Dose-volume histogram (DVH) data showed a significantly better conformality index, and trends toward decreased hot spots and increased coverage of the planning target volume (PTV), when using VRS. However, when angles were chosen in VRS a greater volume of the ipsilateral breast received a low dose of radiation (between 3% and 50% of the prescribed dose) (VRS = 23.06%, XiO = 19.57%, p < 0.0005).
A significant advantage that VRS provided over XiO was the ability to detect potential collisions prior to actual treatment of the patient in three of the ten patients studied. The potential to save time with VRS by not having to redo plans because of a collision increases clinic efficiency.
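The record above reports a DVH-derived "conformality index" without defining it, and definitions vary between authors. As an illustration only, one common ratio form can be sketched as follows; the function name, the exact definition chosen, and the example volumes are assumptions, not taken from the study:

```python
def conformality_index(prescription_isodose_volume_cc, target_volume_cc):
    """Illustrative conformality index: the volume enclosed by the
    prescription isodose surface divided by the target (PTV) volume.

    Values close to 1.0 suggest the prescribed dose tightly conforms
    to the target; values well above 1.0 indicate spill of prescription
    dose into surrounding tissue. Other authors use inverse or
    intersection-based definitions.
    """
    return prescription_isodose_volume_cc / target_volume_cc

# Hypothetical plan: 120 cc enclosed by the prescription isodose, 100 cc PTV
print(conformality_index(120.0, 100.0))  # 1.2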
Program Helps Standardize Documentation Of Software
NASA Technical Reports Server (NTRS)
Howe, G.
1994-01-01
Intelligent Documentation Management System (IDMS) is a computer program developed to assist project managers in implementing the information-system documentation standard known as NASA-STD-2100-91, NASA STD, COS-10300, of NASA's Software Management and Assurance Program. The standard consists of data-item descriptions, or templates, each of which governs a particular component of software documentation. IDMS helps the program manager tailor the documentation standard to a project. Written in the C language.
78 FR 23866 - Airworthiness Directives; the Boeing Company
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-23
... operational software in the cabin management system, and loading new software into the mass memory card. The...-200 and -300 series airplanes. The proposed AD would have required installing new operational software in the cabin management system, and loading new software into the mass memory card. Since the...
Software-Enabled Project Management Techniques and Their Relationship to the Triple Constraints
ERIC Educational Resources Information Center
Elleh, Festus U.
2013-01-01
This study investigated the relationship between software-enabled project management techniques and the triple constraints (time, cost, and scope). There was the dearth of academic literature that focused on the relationship between software-enabled project management techniques and the triple constraints (time, cost, and scope). Based on the gap…
ERIC Educational Resources Information Center
Managan, William H.
1999-01-01
Describes a facilities-management software program that helps managers better document and understand maintenance backlogs, improvements, and future cyclic renewal needs. Major software components are examined including a software tool that filters, groups, and ranks projects to help determine funding requests. (GR)
1992-12-01
... provide program managers some level of confidence that their software will operate at an acceptable level of risk. A number of structured safety ... safety within the constraints of operational effectiveness, schedule, and cost through timely application of system safety management and engineering ... Master of Science in Software Systems Management. Peter W. Colan, B.S.E., Captain, USAF; Robert W. Prouhet, B.S., Captain, USAF. December 1992. Approved for ...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beltran, C; Kamal, H
Purpose: To provide a multicriteria optimization algorithm for intensity modulated radiation therapy using pencil proton beam scanning. Methods: Intensity modulated radiation therapy using pencil proton beam scanning requires efficient optimization algorithms to overcome the uncertainties in the Bragg peak locations. This work is focused on optimization algorithms that are based on Monte Carlo simulation of the treatment planning and use the weights and the dose volume histogram (DVH) control points to steer toward desired plans. The proton beam treatment planning process based on single-objective optimization (representing a weighted sum of multiple objectives) usually leads to time-consuming iterations involving treatment planning team members. We provide a time-efficient multicriteria optimization algorithm developed to run on an NVIDIA GPU (Graphical Processing Unit) cluster. The running time of the multicriteria optimization algorithm benefits from up-sampling of the CT voxel size of the calculations without loss of fidelity. Results: We will present preliminary results of multicriteria optimization for intensity modulated proton therapy based on DVH control points. The results will show optimization results for a phantom case and a brain tumor case. Conclusion: The multicriteria optimization of intensity modulated radiation therapy using pencil proton beam scanning provides a novel tool for treatment planning. Work supported by a grant from Varian Inc.
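The abstract above describes steering the optimizer with weights and DVH control points, which in the single-objective formulation it mentions amounts to a weighted sum of per-constraint penalties. The sketch below shows one hypothetical quadratic form of such a penalty; the function name, the penalty shape, and the example constraint are illustrative assumptions, not the paper's actual objective:

```python
def dvh_penalty(voxel_doses, controls):
    """Weighted quadratic penalty for DVH control-point violations.

    controls: list of (weight, dose_level, max_fraction) tuples meaning
    'at most max_fraction of the structure volume may receive a dose
    >= dose_level'. A plan that satisfies every control scores 0.0.
    The quadratic form and tuple layout are illustrative only.
    """
    n = len(voxel_doses)
    total = 0.0
    for weight, dose_level, max_fraction in controls:
        fraction = sum(1 for d in voxel_doses if d >= dose_level) / n
        violation = max(0.0, fraction - max_fraction)
        total += weight * violation ** 2
    return total

doses = [5.0, 12.0, 18.0, 25.0]
# Hypothetical OAR constraint: at most 25% of the volume may receive >= 15 Gy
print(dvh_penalty(doses, [(100.0, 15.0, 0.25)]))  # 6.25
```

In a multicriteria setting, each control point's violation would instead be kept as a separate objective and the weights varied to trace the Pareto surface, which is what makes the GPU acceleration described above attractive.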
Software development environments: Status and trends
NASA Technical Reports Server (NTRS)
Duffel, Larry E.
1988-01-01
Currently, software engineers are the essential integrating factor tying several components together. The components consist of process, methods, computers, tools, support environments, and software engineers. Today the engineers empower the tools, rather than the tools empowering the engineers. Some of the issues in software engineering are quality, managing the software engineering process, and productivity. A strategy to address these is to promote the evolution of software engineering from an ad hoc, labor-intensive activity to a managed, technology-supported discipline. This strategy may be implemented by putting the process under management control, adopting appropriate methods, inserting the technology that provides automated support for the process and methods, collecting automated tools into an integrated environment, and educating the personnel.
[Application of password manager software in health care].
Ködmön, József
2016-12-01
When using multiple IT systems, handling passwords in a secure manner is a potential source of problems. The most frequent issues are choosing passwords of appropriate length and complexity, and then remembering those strong passwords. Password manager software provides a good solution to this problem, while greatly increasing the security of sensitive medical data. This article introduces password manager software and provides basic information on its application. It also discusses how to select a really secure password manager and suggests a practical application for efficient, safe, and comfortable use in health care. Orv. Hetil., 2016, 157(52), 2066-2073.
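On the length-and-complexity point the abstract raises, most password managers generate credentials from a cryptographically secure random source. A minimal sketch of that idea using Python's standard `secrets` module (the function name and chosen alphabet are illustrative, not taken from the article or any particular product):

```python
import secrets
import string

def generate_password(length=16):
    """Generate a random password from letters, digits, and punctuation
    using a cryptographically secure source (secrets, not random)."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

pw = generate_password(20)
print(len(pw))  # 20
```

Because each character carries roughly log2(94) ≈ 6.55 bits of entropy with this alphabet, a 20-character password of this form is far stronger than anything a user would memorize unaided, which is exactly the trade-off a password manager resolves.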
Final Report Ra Power Management 1255 10-15-16 FINAL_Public
DOE Office of Scientific and Technical Information (OSTI.GOV)
Iverson, Aaron
Ra Power Management (RPM) has developed a cloud-based software platform that manages the financial and operational functions of third-party financed solar projects throughout their lifecycle. RPM's software streamlines and automates the sales, financing, and management of a portfolio of solar assets. The software helps solar developers automate the most difficult aspects of asset management, leading to increased transparency, efficiency, and reduction in human error. More importantly, the platform will help developers save money by improving their operating margins.
Software Size Estimation Using Expert Estimation: A Fuzzy Logic Approach
ERIC Educational Resources Information Center
Stevenson, Glenn A.
2012-01-01
For decades software managers have been using formal methodologies such as the Constructive Cost Model and Function Points to estimate the effort of software projects during the early stages of project development. While some research shows these methodologies to be effective, many software managers feel that they are overly complicated to use and…
A Process for Evaluating Student Records Management Software. ERIC/AE Digest.
ERIC Educational Resources Information Center
Vecchioli, Lisa
This digest provides practical advice on evaluating software for managing student records. An evaluation of record-keeping software should start with a process to identify all of the individual needs the software product must meet in order to be considered for purchase. The first step toward establishing an administrative computing system is…
NASA Technical Reports Server (NTRS)
Hancock, David W., III
1999-01-01
This document provides the Software Management Plan for the GLAS Standard Data Software (SDS) supporting the GLAS instrument of the EOS ICESat Spacecraft. The SDS encompasses the ICESat Science Investigator-led Processing System (I-SIPS) Software and the Instrument Support Terminal (IST) Software. For the I-SIPS Software, the SDS will produce Level 0, Level 1, and Level 2 data products as well as the associated product quality assessments and descriptive information. For the IST Software, the SDS will accommodate the GLAS instrument support areas of engineering status, command, performance assessment, and instrument health status.
A Prototype for the Support of Integrated Software Process Development and Improvement
NASA Astrophysics Data System (ADS)
Porrawatpreyakorn, Nalinpat; Quirchmayr, Gerald; Chutimaskul, Wichian
An efficient software development process is one of the key success factors for quality software. Both the appropriate establishment and the continuous improvement of integrated project management and of the software development process contribute to this efficiency. This paper therefore proposes a software process maintenance framework which consists of two core components: an integrated PMBOK-Scrum model describing how to establish a comprehensive set of project management and software engineering processes, and a software development maturity model advocating software process improvement. In addition, a prototype tool to support the framework is introduced.
NASA Technical Reports Server (NTRS)
Hayhurst, Kelly J. (Editor)
2008-01-01
The Guidance and Control Software (GCS) project was the last in a series of software reliability studies conducted at Langley Research Center between 1977 and 1994. The technical results of the GCS project were recorded after the experiment was completed. Some of the support documentation produced as part of the experiment, however, is serving an unexpected role far beyond its original project context. Some of the software used as part of the GCS project was developed to conform to the RTCA/DO-178B software standard, "Software Considerations in Airborne Systems and Equipment Certification," used in the civil aviation industry. That standard requires extensive documentation throughout the software development life cycle, including plans, software requirements, design and source code, verification cases and results, and configuration management and quality control data. The project documentation that includes this information is open for public scrutiny without the legal or safety implications associated with comparable data from an avionics manufacturer. This public availability has afforded an opportunity to use the GCS project documents for DO-178B training. This report provides a brief overview of the GCS project, describes the 4-volume set of documents and the role they are playing in training, and includes configuration management and quality assurance documents from the GCS project. Volume 4 contains six appendices: A. Software Accomplishment Summary for the Guidance and Control Software Project; B. Software Configuration Index for the Guidance and Control Software Project; C. Configuration Management Records for the Guidance and Control Software Project; D. Software Quality Assurance Records for the Guidance and Control Software Project; E. Problem Report for the Pluto Implementation of the Guidance and Control Software Project; and F. Support Documentation Change Reports for the Guidance and Control Software Project.
Improving the Effectiveness of Program Managers
2006-05-03
A presentation delivered at the Systems and Software Technology Conference, Salt Lake City, Utah, May 3, 2006, presented by GAO. Topics recoverable from the slides include: companies' best practices (Motorola, Caterpillar, Toyota, FedEx, NCR Teradata, Boeing, Hughes Space and Communications); disciplined software and management practices; total ownership costs; collection of metrics data to improve software reliability; technology readiness levels and design maturity; statistical…
Health software: a new CEI Guide for software management in medical environment.
Giacomozzi, Claudia; Martelli, Francesco
2016-01-01
The increasing spread of software components in the healthcare context makes explanatory guides relevant and mandatory for interpreting laws and standards, and for supporting the safe management of software products in healthcare. In 2012 a working group was established for these purposes at the Italian Electrotechnical Committee (CEI), composed of experts from the Italian National Institute of Health (ISS), representatives of industry, and representatives of healthcare organizations. As a first outcome of the group's activity, Guide CEI 62-237 was published in February 2015. The Guide incorporates an innovative approach based on the proper contextualization of software products, whether medical devices or not, to the specific healthcare scenario, and addresses the risk management of IT systems. The Guide provides operators and manufacturers with interpretative support and many detailed examples to facilitate the proper contextualization and management of health software, in compliance with related European and international regulations and standards.
Vehicle management and mission planning systems with shuttle applications
NASA Technical Reports Server (NTRS)
1972-01-01
A preliminary definition of a concept for an automated system is presented that will support the effective management and planning of space shuttle operations. It is called the Vehicle Management and Mission Planning System (VMMPS). In addition to defining the system and its functions, some of the software requirements of the system are identified and a phased and evolutionary method is recommended for software design, development, and implementation. The concept is composed of eight software subsystems supervised by an executive system. These subsystems are mission design and analysis, flight scheduler, launch operations, vehicle operations, payload support operations, crew support, information management, and flight operations support. In addition to presenting the proposed system, a discussion of the evolutionary software development philosophy that the Mission Planning and Analysis Division (MPAD) would propose to use in developing the required supporting software is included. A preliminary software development schedule is also included.
Advanced information processing system: Input/output network management software
NASA Technical Reports Server (NTRS)
Nagle, Gail; Alger, Linda; Kemp, Alexander
1988-01-01
The purpose of this document is to provide the software requirements and specifications for the Input/Output Network Management Services for the Advanced Information Processing System. This introduction and overview section is provided to briefly outline the overall architecture and software requirements of the AIPS system before discussing the details of the design requirements and specifications of the AIPS I/O Network Management software. A brief overview of the AIPS architecture is followed by a more detailed description of the network architecture.
Software management tools: Lessons learned from use
NASA Technical Reports Server (NTRS)
Reifer, D. J.; Valett, J.; Knight, J.; Wenneson, G.
1985-01-01
Experience in inserting software project planning tools into more than 100 projects producing mission critical software is discussed. The problems the software project manager faces are listed, along with methods and tools available to handle them. Experience is reported with the Project Manager's Workstation (PMW) and the SoftCost-R cost estimating package. Finally, the results of a survey, which looked at what could be done in the future to overcome the problems experienced and build a set of truly useful tools, are presented.
SAGA: A project to automate the management of software production systems
NASA Technical Reports Server (NTRS)
Campbell, R. H.; Badger, W.; Beckman, C. S.; Beshers, G.; Hammerslag, D.; Kimball, J.; Kirslis, P. A.; Render, H.; Richards, P.; Terwilliger, R.
1984-01-01
The project to automate the management of software production systems is described. The SAGA system is a software environment that is designed to support most of the software development activities that occur in a software lifecycle. The system can be configured to support specific software development applications using given programming languages, tools, and methodologies. Meta-tools are provided to ease configuration. Several major components of the SAGA system are completed to prototype form. The construction methods are described.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Iverson, Aaron
Ra Power Management (RPM) has developed a cloud-based software platform that manages the financial and operational functions of third-party financed solar projects throughout their lifecycle. RPM's software streamlines and automates the sales, financing, and management of a portfolio of solar assets. The software helps solar developers automate the most difficult aspects of asset management, leading to increased transparency, efficiency, and a reduction in human error. More importantly, our platform will help developers save money by improving their operating margins.
Continuous Risk Management Course. Revised
NASA Technical Reports Server (NTRS)
Hammer, Theodore F.
1999-01-01
This document includes a course plan for Continuous Risk Management taught by the Software Assurance Technology Center along with the Continuous Risk Management Guidebook of the Software Engineering Institute of Carnegie Mellon University and a description of Continuous Risk Management at NASA.
Implementing Extreme Programming in Distributed Software Project Teams: Strategies and Challenges
NASA Astrophysics Data System (ADS)
Maruping, Likoebe M.
Agile software development methods and distributed forms of organizing teamwork are two team process innovations that are gaining prominence in today's demanding software development environment. Individually, each of these innovations has yielded gains in the practice of software development. Agile methods have enabled software project teams to meet the challenges of an ever turbulent business environment through enhanced flexibility and responsiveness to emergent customer needs. Distributed software project teams have enabled organizations to access highly specialized expertise across geographic locations. Although much progress has been made in understanding how to more effectively manage agile development teams and how to manage distributed software development teams, managers have little guidance on how to leverage these two potent innovations in combination. In this chapter, I outline some of the strategies and challenges associated with implementing agile methods in distributed software project teams. These are discussed in the context of a study of a large-scale software project in the United States that lasted four months.
Microcomputer Software for Libraries: A Survey.
ERIC Educational Resources Information Center
Nolan, Jeanne M.
1983-01-01
Reports on findings of research done by Nolan Information Management Services concerning availability of microcomputer software for libraries. Highlights include software categories (specific, generic-database management programs, original); number of programs available in 1982 for 12 applications; projections for 1983; and future software…
7 Processes that Enable NASA Software Engineering Technologies: Value-Added Process Engineering
NASA Technical Reports Server (NTRS)
Housch, Helen; Godfrey, Sally
2011-01-01
The presentation reviews Agency process requirements and the purpose, benefits, and experiences of seven software engineering processes. The processes include: product integration, configuration management, verification, software assurance, measurement and analysis, requirements management, and planning and monitoring.
Software Engineering Laboratory (SEL) relationships, models, and management rules
NASA Technical Reports Server (NTRS)
Decker, William; Hendrick, Robert; Valett, Jon D.
1991-01-01
Over 50 individual Software Engineering Laboratory (SEL) research results, extracted from a review of published SEL documentation, that can be applied directly to managing software development projects are captured. Four basic categories of results are defined and discussed - environment profiles, relationships, models, and management rules. In each category, research results are presented as a single page that summarizes the individual result, lists potential uses of the result by managers, and references the original SEL documentation where the result was found. The document serves as a concise reference summary of applicable research for SEL managers.
NASA Technical Reports Server (NTRS)
Voigt, S. (Editor); Beskenis, S. (Editor)
1985-01-01
Issues in the development of software for the Space Station are discussed. Software acquisition and management, software development environment, standards, information system support for software developers, and a future software advisory board are addressed.
Space Flight Software Development Software for Intelligent System Health Management
NASA Technical Reports Server (NTRS)
Trevino, Luis C.; Crumbley, Tim
2004-01-01
The slide presentation examines the Marshall Space Flight Center Flight Software Branch, including software development projects, mission critical space flight software development, software technical insight, advanced software development technologies, and continuous improvement in the software development processes and methods.
Custodial Management in the Information Age.
ERIC Educational Resources Information Center
Harris, Jim, Sr.
1999-01-01
Explains how computerizing the custodial department can be achieved through bar coding, hand-held readers, and the appropriate software packages. Software programs that aid cleaning management, track assets, and manage stock are discussed. (GR)
New technologies for supporting real-time on-board software development
NASA Astrophysics Data System (ADS)
Kerridge, D.
1995-03-01
The next generation of on-board data management systems will be significantly more complex than current designs, and will be required to perform more complex and demanding tasks in software. Improved hardware technology, in the form of the MA31750 radiation hard processor, is one key component in addressing the needs of future embedded systems. However, to complement these hardware advances, improved support for the design and implementation of real-time data management software is now needed. This will help to control the cost and risk associated with developing data management software as it becomes an increasingly significant element within embedded systems. One particular problem with developing embedded software is managing the non-functional requirements in a systematic way. This paper identifies how Logica has exploited recent developments in hard real-time theory to address this problem through the use of new hard real-time analysis and design methods which can be supported by specialized tools. The first stage in transferring this technology from the research domain to industrial application has already been completed. The MA31750 Hard Real-Time Embedded Software Support Environment (HESSE) is a loosely integrated set of hardware and software tools which directly supports the process of hard real-time analysis for software targeting the MA31750 processor. With further development, this HESSE promises to provide embedded system developers with software tools which can reduce the risks associated with developing complex hard real-time software. Supported in this way by more sophisticated software methods and tools, it is foreseen that MA31750-based embedded systems can meet the processing needs of the next generation of on-board data management systems.
NASA Astrophysics Data System (ADS)
Schyns, Lotte E. J. R.; Persoon, Lucas C. G. G.; Podesta, Mark; van Elmpt, Wouter J. C.; Verhaegen, Frank
2016-05-01
The aim of this work is to compare time-resolved (TR) and time-integrated (TI) portal dosimetry, focussing on the role of an object’s position with respect to the isocenter in volumetric modulated arc therapy (VMAT). Portal dose images (PDIs) are simulated and measured for different cases: a sphere (1), a bovine bone (2) and a patient geometry (3). For the simulated case (1) and the experimental case (2), several transformations are applied at different off-axis positions. In the patient case (3), three simple plans with different isocenters are created and pleural effusion is simulated in the patient. The PDIs before and after the sphere transformations, as well as the PDIs with and without simulated pleural effusion, are compared using a TI and TR gamma analysis. In addition, the performance of the TI and TR gamma analyses for the detection of real geometric changes in patients treated with clinical plans is investigated and a correlation analysis is performed between gamma fail rates and differences in dose volume histogram (DVH) metrics. The TI gamma analysis can show large differences in gamma fail rates for the same transformation at different off-axis positions (or for different plan isocenters). The TR gamma analysis, however, shows consistent gamma fail rates. For the detection of real geometric changes in patients treated with clinical plans, the TR gamma analysis has a higher sensitivity than the TI gamma analysis. However, the specificity for the TR gamma analysis is lower than for the TI gamma analysis. Both the TI and TR gamma fail rates show no correlation with changes in DVH metrics. This work shows that TR portal dosimetry is fundamentally superior to TI portal dosimetry, because it removes the strong dependence of the gamma fail rate on the off-axis position/plan isocenter. However, for 2D TR portal dosimetry, it is still difficult to interpret gamma fail rates in terms of changes in DVH metrics for patients treated with VMAT.
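As background for the gamma analyses discussed above: a global gamma index combines a dose-difference criterion with a distance-to-agreement (DTA) criterion. The following is a minimal, illustrative 1D sketch (not the authors' implementation; a clinical tool works on finely interpolated 2D/3D data), with hypothetical dose profiles:

```python
import numpy as np

def gamma_1d(dose_ref, dose_eval, spacing_mm, dta_mm=3.0, dd_frac=0.03):
    """Simplified global gamma index on a 1D profile (default 3%/3 mm).

    For each reference point, gamma is the minimum over evaluated points of
    sqrt((distance/DTA)^2 + (dose difference/criterion)^2); a point "passes"
    when gamma <= 1.
    """
    x = np.arange(len(dose_ref)) * spacing_mm
    dd = dd_frac * dose_ref.max()  # global dose-difference criterion
    gammas = np.empty(len(dose_ref))
    for i, (xi, di) in enumerate(zip(x, dose_ref)):
        dist2 = ((x - xi) / dta_mm) ** 2
        dose2 = ((dose_eval - di) / dd) ** 2
        gammas[i] = np.sqrt(np.min(dist2 + dose2))
    return gammas

# Hypothetical profiles: a uniform 2% overdose stays within the 3% criterion.
ref = np.array([0.0, 10.0, 50.0, 100.0, 50.0, 10.0, 0.0])
ev = ref * 1.02
pass_rate = float(np.mean(gamma_1d(ref, ev, spacing_mm=1.0) <= 1.0))
```

The "gamma fail rate" quoted in the abstract is simply one minus this pass rate over the analyzed region.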
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gueorguiev, G; Cotter, C; Young, M
2016-06-15
Purpose: To present a 3D QA method and clinical results for 550 patients. Methods: Five hundred and fifty patient treatment deliveries (400 IMRT, 75 SBRT and 75 VMAT) from various treatment sites, planned on the Raystation treatment planning system (TPS), were measured on three beam-matched Elekta linear accelerators using IBA's COMPASS system. The difference between the TPS-computed and delivered dose was evaluated in 3D by applying three statistical parameters to each structure of interest: absolute average dose difference (AADD, 6% allowed difference), absolute dose difference greater than 6% (ADD6, 4% of structure volume allowed to fail) and a 3D gamma test (3%/3mm DTA, 4% of structure volume allowed to fail). If the allowed value was not met for a given structure, manual review was performed. The review consisted of overlaying the dose difference or gamma results with the patient CT and scrolling through the slices. For QA to pass, areas of high dose difference or gamma must be small and not on consecutive slices. For AADD to manually pass QA, the average dose difference must be less than 50 cGy. The QA protocol also includes DVH analysis based on QUANTEC and TG-101 recommended dose constraints. Results: Figures 1-3 show the results for the three parameters per treatment modality. Manual review was performed on 67 deliveries (27 IMRT, 22 SBRT and 18 VMAT), all of which passed QA. Results show that the statistical parameter AADD may be overly sensitive for structures receiving low dose, especially for the SBRT deliveries (Fig. 1). The TPS-computed and measured DVH values were in excellent agreement, with minimal differences. Conclusion: Applying DVH analysis and different statistical parameters to any structure of interest, as part of the 3D QA protocol, provides a comprehensive treatment plan evaluation. Author G. Gueorguiev discloses receiving travel and research funding from IBA for work unrelated to this project. Author B. Crawford discloses receiving travel funding from IBA for work unrelated to this project.
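The abstract names AADD and ADD6 but does not define them formally; the sketch below is only one plausible reading of those two per-structure statistics, and the per-voxel doses are invented for illustration:

```python
import numpy as np

def structure_qa(dose_tps, dose_meas, tol_frac=0.06):
    """One possible reading of the abstract's statistical parameters.

    AADD: mean absolute dose difference as a percent of the mean TPS dose.
    ADD6: percent of voxels whose absolute difference exceeds 6% of that mean.
    (Assumed definitions -- the abstract does not spell them out.)
    """
    diff = np.abs(dose_meas - dose_tps)
    ref = dose_tps.mean()
    aadd = 100.0 * diff.mean() / ref
    add6 = 100.0 * np.mean(diff > tol_frac * ref)
    return aadd, add6

tps = np.array([200.0, 210.0, 190.0, 205.0])   # cGy, hypothetical voxels
meas = np.array([202.0, 208.0, 191.0, 206.0])
aadd, add6 = structure_qa(tps, meas)  # small differences: AADD well under 6%
```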
NASA Astrophysics Data System (ADS)
Burgos, Ninon; Guerreiro, Filipa; McClelland, Jamie; Presles, Benoît; Modat, Marc; Nill, Simeon; Dearnaley, David; deSouza, Nandita; Oelfke, Uwe; Knopf, Antje-Christin; Ourselin, Sébastien; Cardoso, M. Jorge
2017-06-01
To tackle the problem of magnetic resonance imaging (MRI)-only radiotherapy treatment planning (RTP), we propose a multi-atlas information propagation scheme that jointly segments organs and generates pseudo x-ray computed tomography (CT) data from structural MR images (T1-weighted and T2-weighted). As the performance of the method strongly depends on the quality of the atlas database composed of multiple sets of aligned MR, CT and segmented images, we also propose a robust way of registering atlas MR and CT images, which combines structure-guided registration, and CT and MR image synthesis. We first evaluated the proposed framework in terms of segmentation and CT synthesis accuracy on 15 subjects with prostate cancer. The segmentations obtained with the proposed method were compared to the manual segmentations using the Dice score coefficient (DSC). Mean DSCs of 0.73, 0.90, 0.77 and 0.90 were obtained for the prostate, bladder, rectum and femur heads, respectively. The mean absolute error (MAE) and the mean error (ME) were computed between the reference CTs (non-rigidly aligned to the MRs) and the pseudo CTs generated with the proposed method. The MAE was on average 45.7 ± 4.6 HU and the ME -1.6 ± 7.7 HU. We then performed a dosimetric evaluation by re-calculating plans on the pseudo CTs and comparing them to the plans optimised on the reference CTs. We compared the cumulative dose volume histograms (DVH) obtained for the pseudo CTs to those obtained for the reference CTs in the planning target volume (PTV) located in the prostate, and in the organs at risk at different DVH points. We obtained average differences of -0.14% in the PTV for D98%, and between -0.14% and 0.05% in the PTV, bladder, rectum and femur heads for Dmean and D2%. Overall, we demonstrate that the proposed framework is able to automatically generate accurate pseudo CT images and segmentations in the pelvic region, potentially bypassing the need for a CT scan for accurate RTP.
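The accuracy metrics used in the abstract above (Dice score coefficient for segmentations; MAE and ME in Hounsfield units for the pseudo CTs) are standard and easy to state in code. A minimal sketch with made-up masks and HU values (not the study's data):

```python
import numpy as np

def dice(mask_a, mask_b):
    """Dice score coefficient between two binary segmentation masks."""
    inter = np.logical_and(mask_a, mask_b).sum()
    return 2.0 * inter / (mask_a.sum() + mask_b.sum())

def mae_me(ct_ref, ct_pseudo):
    """Mean absolute error and mean error between CT volumes, in HU."""
    diff = ct_pseudo.astype(float) - ct_ref.astype(float)
    return float(np.abs(diff).mean()), float(diff.mean())

# Hypothetical 1D "masks" and HU values, for illustration only.
a = np.array([0, 1, 1, 1, 0], dtype=bool)
b = np.array([0, 0, 1, 1, 1], dtype=bool)
dsc = dice(a, b)  # overlap of 2 voxels against masks of 3 voxels each

mae, me = mae_me(np.array([0.0, 100.0, 50.0]), np.array([10.0, 90.0, 50.0]))
```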
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang Jiayi; Robertson, John M., E-mail: jrobertson@beaumont.edu; Ye Hong
2012-07-15
Purpose: To identify dosimetric predictors for the development of gastrointestinal (GI) toxicity in patients with locally advanced pancreatic adenocarcinoma (LAPC) treated with concurrent full-dose gemcitabine and radiotherapy (GemRT). Methods and Materials: From June 2002 to June 2009, 46 LAPC patients treated with definitive GemRT were retrospectively analyzed. The stomach and duodenum were retrospectively contoured separately to determine their dose-volume histogram (DVH) parameters. GI toxicity was defined as Grade 3 or higher GI toxicity. The follow-up time was calculated from the start of RT to the date of death or last contact. Univariate analysis (UVA) and multivariate analysis (MVA) using Kaplan-Meier and Cox regression models were performed to identify risk factors associated with GI toxicity. The receiver operating characteristic curve and the area under the receiver operating characteristic curve (AUC) were used to determine the best DVH parameter to predict GI toxicity. Results: Of the patients, 28 (61%) received concurrent gemcitabine alone, and 18 (39%) had concurrent gemcitabine with daily erlotinib. On UVA, only the V20Gy to V35Gy of the duodenum were significantly associated with GI toxicity (all p ≤ 0.05). On MVA, the V25Gy of the duodenum and the use of erlotinib were independent risk factors for GI toxicity (p = 0.006 and 0.02, respectively). For the entire cohort, the V25Gy of the duodenum is the best predictor for GI toxicity (AUC = 0.717), and the 12-month GI toxicity rate was 8% vs. 48% for V25Gy ≤ 45% and V25Gy > 45%, respectively (p = 0.03). However, excluding the erlotinib group, the V35Gy is the best predictor (AUC = 0.725), and the 12-month GI toxicity rate was 0% vs. 41% for V35Gy ≤ 20% and V35Gy > 20%, respectively (p = 0.04). Conclusions: DVH parameters of the duodenum may predict Grade 3 GI toxicity after GemRT for LAPC. Concurrent use of erlotinib during GemRT may increase GI toxicity.
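The AUC values quoted above (e.g. AUC = 0.717 for V25Gy) come from receiver operating characteristic analysis. A minimal rank-based (Mann-Whitney) AUC sketch, using invented V25Gy values rather than the study's data:

```python
def auc(scores_pos, scores_neg):
    """AUC via the Mann-Whitney statistic: the probability that a randomly
    chosen toxicity case has a higher predictor value than a randomly
    chosen non-toxicity case (ties count as 1/2)."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            wins += 1.0 if p > n else (0.5 if p == n else 0.0)
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical duodenal V25Gy values (percent volume receiving >= 25 Gy).
v25_toxicity = [55.0, 60.0, 48.0]          # patients with Grade 3+ GI toxicity
v25_no_toxicity = [30.0, 44.0, 50.0, 25.0]
roc_auc = auc(v25_toxicity, v25_no_toxicity)  # 11 of 12 pairs correctly ordered
```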
A Varian DynaLog file-based procedure for patient dose-volume histogram-based IMRT QA.
Calvo-Ortega, Juan F; Teke, Tony; Moragues, Sandra; Pozo, Miquel; Casals-Farran, Joan
2014-03-06
In the present study, we describe a method based on the analysis of the dynamic MLC log files (DynaLog) generated by the controller of a Varian linear accelerator in order to perform patient-specific IMRT QA. The DynaLog files of a Varian Millennium MLC, recorded during an IMRT treatment, can be processed using a MATLAB-based code in order to generate the actual fluence for each beam and so recalculate the actual patient dose distribution using the Eclipse treatment planning system. The accuracy of the DynaLog-based dose reconstruction procedure was assessed by introducing ten intended errors to perturb the fluence of the beams of a reference plan such that ten subsequent erroneous plans were generated. In-phantom measurements with an ionization chamber (ion chamber) and planar dose measurements using an EPID system were performed to investigate the correlation between the measured dose changes and the expected ones detected by the reconstructed plans for the ten intended erroneous cases. Moreover, the method was applied to 20 cases of clinical plans for different locations (prostate, lung, breast, and head and neck). A dose-volume histogram (DVH) metric was used to evaluate the impact of the delivery errors in terms of dose to the patient. The ionometric measurements revealed a significant positive correlation (R² = 0.9993) between the variations of the dose induced in the erroneous plans with respect to the reference plan and the corresponding changes indicated by the DynaLog-based reconstructed plans. The EPID measurements showed that the accuracy of the DynaLog-based method to reconstruct the beam fluence was comparable with the dosimetric resolution of the portal dosimetry used in this work (3%/3 mm). The DynaLog-based reconstruction method described in this study is a suitable tool to perform a patient-specific IMRT QA. 
This method allows us to perform patient-specific IMRT QA by evaluating the result based on the DVH metric of the planning CT image (patient DVH-based IMRT QA).
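Most of the studies above evaluate plans through DVH metrics. For background, a cumulative DVH can be computed directly from per-voxel doses (equal voxel volumes assumed; the dose values below are hypothetical):

```python
import numpy as np

def cumulative_dvh(voxel_doses, bin_gy=0.5):
    """Cumulative DVH: for each dose level D, the percent of the structure
    volume receiving at least D. Assumes equal voxel volumes."""
    levels = np.arange(0.0, voxel_doses.max() + bin_gy, bin_gy)
    volume_pct = np.array([100.0 * np.mean(voxel_doses >= d) for d in levels])
    return levels, volume_pct

doses = np.array([10.0, 20.0, 20.0, 30.0])  # Gy, one value per voxel
d_levels, v_pct = cumulative_dvh(doses, bin_gy=10.0)
# 100% of the volume receives at least 0 Gy; 25% receives at least 30 Gy.
```

DVH points such as D2% or V20 are then read off this curve.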
A Quality Assurance Method that Utilizes 3D Dosimetry and Facilitates Clinical Interpretation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oldham, Mark, E-mail: mark.oldham@duke.edu; Thomas, Andrew; O'Daniel, Jennifer
2012-10-01
Purpose: To demonstrate a new three-dimensional (3D) quality assurance (QA) method that provides comprehensive dosimetry verification and facilitates evaluation of the clinical significance of QA data acquired in a phantom, and to apply the method to investigate the dosimetric efficacy of base-of-skull (BOS) intensity-modulated radiotherapy (IMRT) treatment. Methods and Materials: Two types of IMRT QA verification plans were created for 6 patients who received BOS IMRT. The first plan enabled conventional 2D planar IMRT QA using the Varian portal dosimetry system. The second plan enabled 3D verification using an anthropomorphic head phantom. In the latter, the 3D dose distribution was measured using the DLOS/Presage dosimetry system (DLOS = Duke Large-field-of-view Optical-CT System; Presage, Heuris Pharma, Skillman, NJ), which yielded isotropic 2-mm data throughout the treated volume. In a novel step, measured 3D dose distributions were transformed back to the patient's CT to enable calculation of dose-volume histograms (DVH) and dose overlays. Measured and planned patient DVHs were compared to investigate clinical significance. Results: Close agreement between measured and calculated dose distributions was observed for all 6 cases. For gamma criteria of 3%, 2 mm, the mean passing rate for portal dosimetry was 96.8% (range, 92.0%-98.9%), compared to 94.9% (range, 90.1%-98.9%) for 3D. There was no clear correlation between 2D and 3D passing rates. Planned and measured dose distributions were evaluated on the patient's anatomy, using DVH and dose overlays. Minor deviations were detected, and their clinical significance is presented and discussed. Conclusions: Two advantages accrue to the methods presented here. First, treatment accuracy is evaluated throughout the whole treated volume, yielding comprehensive verification. Second, the clinical significance of any deviations can be assessed through the generation of DVH curves and dose overlays on the patient's anatomy. The latter step represents an important development that advances the clinical relevance of complex treatment QA.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Badkul, R; Nicolai, W; Pokhrel, D
Purpose: To compare the impact of the Pencil Beam (PB) and Anisotropic Analytic Algorithm (AAA) dose calculation algorithms on OARs and the planning target volume (PTV) in thoracic spine stereotactic body radiation therapy (SBRT). Methods: Ten spine SBRT patients were planned on the Brainlab iPlan system using a hybrid plan consisting of 1-2 non-coplanar conformal-dynamic arcs and a few IMRT beams, treated on a Novalis Tx with 6 MV photons. The dose prescription varied from 20 Gy to 30 Gy in 5 fractions depending on the situation of the patient. PB plans were retrospectively recalculated using the Varian Eclipse with the AAA algorithm using the same MUs, MLC pattern and grid size (3 mm). Differences in dose volume parameters for the PTV, spinal cord, lung, and esophagus were analyzed and compared for the PB and AAA algorithms. OAR constraints were followed per RTOG-0631. Results: Since patients were treated using the PB calculation, we compared all the AAA DVH values with respect to the PB plan values as the standard, although AAA predicts the dose more accurately than PB. PTV(min), PTV(max), PTV(mean), PTV(D99%) and PTV(D90%) were overestimated with the AAA calculation on average by 3.5%, 1.84%, 0.95%, 3.98% and 1.55%, respectively, as compared to PB. All lung DVH parameters were underestimated with the AAA algorithm; the mean deviations of lung V20, V10, V5, and 1000cc were 42.81%, 19.83%, 18.79%, and 18.35%, respectively. AAA overestimated cord (0.35cc) by a mean of 17.3%, cord (0.03cc) by 12.19% and cord (max) by 10.5% as compared to PB. The esophagus max dose was overestimated by 4.4% and the 5cc dose by 3.26% for the AAA algorithm as compared to PB. Conclusion: AAA overestimated the PTV dose values by up to 4%. The lung DVH had the greatest underestimation of dose by AAA versus PB. Spinal cord dose was overestimated by AAA versus PB. Given the critical importance of accuracy of OAR and PTV dose calculation for SBRT spine, more accurate algorithms and validation of calculated doses in phantom models are indicated.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Magome, T; Haga, A; Igaki, H
Purpose: Although many outcome prediction models based on dose-volume information have been proposed, it is well known that prognosis may also be affected by multiple clinical factors. The purpose of this study is to predict the survival time after radiotherapy for high-grade glioma patients based on features including clinical and dose-volume histogram (DVH) information. Methods: A total of 35 patients with high-grade glioma (oligodendroglioma: 2, anaplastic astrocytoma: 3, glioblastoma: 30) were selected for this study. All patients were treated with a prescribed dose of 30–80 Gy after surgical resection or biopsy from 2006 to 2013 at The University of Tokyo Hospital. All cases were randomly separated into a training dataset (30 cases) and a test dataset (5 cases). The survival time after radiotherapy was predicted based on a multiple linear regression analysis and an artificial neural network (ANN) using 204 candidate features. The candidate features included 12 clinical features (tumor location, extent of surgical resection, treatment duration of radiotherapy, etc.) and 192 DVH features (maximum dose, minimum dose, D95, V60, etc.). The effective features for the prediction were selected by a step-wise method using the 30 training cases. The prediction accuracy was evaluated by the coefficient of determination (R²) between the predicted and actual survival times for the training and test datasets. Results: In the multiple regression analysis, the value of R² between the predicted and actual survival time was 0.460 for the training dataset and 0.375 for the test dataset. On the other hand, in the ANN analysis, the value of R² was 0.806 for the training dataset and 0.811 for the test dataset. Conclusion: Although a larger number of patients would be needed for more accurate and robust prediction, our preliminary results showed the potential to predict the outcome in patients with high-grade glioma.
This work was partly supported by the JSPS Core-to-Core Program (No. 23003) and a Grant-in-Aid for JSPS Fellows.
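The accuracy metric used in the study above, the coefficient of determination between predicted and actual survival times, is simple to compute directly. A minimal Python sketch (the survival values below are hypothetical illustration, not the study's data):

```python
def r_squared(actual, predicted):
    """Coefficient of determination R^2 between actual and predicted values."""
    mean_a = sum(actual) / len(actual)
    ss_tot = sum((a - mean_a) ** 2 for a in actual)                 # total variance
    ss_res = sum((a - p) ** 2 for a, p in zip(actual, predicted))   # residual variance
    return 1.0 - ss_res / ss_tot

# Hypothetical survival times (months): actual vs. model-predicted
actual = [12.0, 18.0, 7.0, 24.0, 15.0]
predicted = [11.0, 20.0, 8.5, 22.0, 14.0]
print(round(r_squared(actual, predicted), 3))
```

An R² of 1.0 means perfect prediction; values near 0 mean the model does no better than predicting the mean, which is why the study compares regression (0.375 on test) against the ANN (0.811 on test).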
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sanguineti, Giuseppe, E-mail: gsangui1@jhmi.edu; Department of Radiation Oncology and Molecular Radiation Sciences, Johns Hopkins University, Baltimore, MD; Sormani, Maria Pia
2012-05-01
Purpose: To define the roles of radiotherapy and chemotherapy in the risk of Grade 3+ mucositis during intensity-modulated radiation therapy (IMRT) for oropharyngeal cancer. Methods and Materials: 164 consecutive patients treated with IMRT at two institutions in nonoverlapping treatment eras were selected. All patients were treated with a dose painting approach, three dose levels, and comprehensive bilateral neck treatment under the supervision of the same radiation oncologist. Ninety-three patients received concomitant chemotherapy (cCHT) and 14 received induction chemotherapy (iCHT). Individual information on the dose received by the oral mucosa (OM) was extracted as an absolute cumulative dose-volume histogram (DVH), corrected for the elapsed treatment days and reported as a weekly (w) DVH. Patients were seen weekly during treatment, and peak acute toxicity equal to or greater than confluent mucositis at any point during the course of IMRT was considered the endpoint. Results: Overall, 129 patients (78.7%) reached the endpoint. The regions that best discriminated between patients with/without Grade 3+ mucositis were found at 10.1 Gy/w (V10.1) and 21 cc (D21), along the x-axis and y-axis of the OM-wDVH, respectively. On multivariate analysis, D21 (odds ratio [OR] = 1.016, 95% confidence interval [CI], 1.009-1.023, p < 0.001) and cCHT (OR = 4.118, 95% CI, 1.659-10.217, p = 0.002) were the only independent predictors. However, V10.1 and D21 were highly correlated (rho = 0.954, p < 0.001) and mutually interchangeable. cCHT would correspond to 88.4 cGy/w to at least 21 cc of OM. Conclusions: Radiotherapy and chemotherapy act independently in determining acute mucosal toxicity; cCHT increases the risk of Grade 3 mucosal toxicity ≈4 times over radiation therapy alone, and it is equivalent to an extra ≈6.2 Gy to 21 cc of OM over a 7-week course.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Badkul, R; Doke, K; Pokhrel, D
Purpose: Lung and heart doses and the associated toxicity are of concern in radiotherapy for esophageal cancer. This study evaluates the dosimetry of the deep-inspiration breath-hold (DIBH) technique as compared to free-breathing (FB) using 3D-conformal radiotherapy (3D-CRT) for esophageal cancer. Methods: Eight patients were planned with FB and DIBH CT scans. DIBH scans were acquired using the Varian RPM system. FB and DIBH CTs were contoured per RTOG-1010 to create the planning target volume (PTV) as well as organ-at-risk (OAR) volumes. Two sets of gross target volumes (GTVs) of 5 cm length were contoured for each patient, proximal at the level of the carina and distal at the level of the gastroesophageal junction, and were enlarged with appropriate margins to generate the Clinical Target Volume and PTV. 3D-CRT plans were created on the Eclipse planning system for 45 Gy to cover 95% of the PTV in 25 fractions for both proximal and distal tumors on FB and DIBH scans. For distal tumors, celiac nodes were covered electively. DVH parameters for the lung and heart OARs were generated and analyzed. Results: All DIBH DVH parameters were normalized to the FB plan values. The average heart-mean and heart-V40 ratios were 0.70 and 0.66 for proximal lesions; for distal lesions the ratios were 1.21 and 2.22, respectively. For DIBH, total lung volume increased by 2.43 times versus the FB scan. The average lung-mean, V30, V20, V10 and V5 ratios were 0.82, 0.92, 0.76, 0.77 and 0.79 for proximal lesions and 1.17, 0.66, 0.87, 0.93 and 1.03 for distal lesions. Heart doses were lower for breath-hold proximal lesions but higher for distal lesions as compared to free-breathing plans. Lung doses were lower for both proximal and distal breath-hold lesions except mean lung dose and V5 for distal lesions. Conclusion: This study showed improvement of OAR doses for esophageal lesions at the mid-thoracic level utilizing the DIBH vs FB technique but did not show consistent OAR sparing with DIBH for distal lesions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tang, Chad; Liao, Zhongxing, E-mail: zliao@mdanderson.org; Gomez, Daniel
2014-08-01
Purpose: Radiation therapy (RT) can both suppress and stimulate the immune system. We sought to investigate the mechanisms underlying radiation-induced lymphopenia and its associations with patient outcomes in non-small cell lung cancer (NSCLC). Methods and Materials: Subjects consisted of 711 patients who had received definitive RT for NSCLC. The lymphocyte nadir was calculated as the minimum lymphocyte value measured during definitive RT. Associations of gross tumor volumes (GTVs) and lung dose-volume histogram (DVH) parameters with lymphocyte nadirs were assessed with Spearman correlation coefficients. Relationships of lymphocyte nadirs with overall survival (OS) and event-free survival (EFS) were evaluated with Kaplan-Meier analysis and compared with log-rank test results. Multivariate regressions were conducted with linear and Cox regression analyses. All variables were analyzed as continuous if possible. Results: Larger GTVs were correlated with lower lymphocyte nadirs regardless of concurrent chemotherapy receipt (with concurrent: r = −0.26, P<.0001; without: r = −0.48, P<.0001). Analyses of lung DVH parameters revealed significant correlations at lower doses (lung V5-V10: P<.0001) that incrementally decreased and became nonsignificant at higher doses (lung V60-V70: P>.05). Of note, no significant associations were detected between GTV or lung DVH parameters and total leukocyte, neutrophil, or monocyte nadirs during RT, or lymphocyte count prior to RT. Multivariate analysis revealed larger GTV (P<.0001), receipt of concurrent chemotherapy (P<.0001), twice-daily radiation fractionation (P=.02), and stage III disease (P=.05) to be associated with lower lymphocyte nadirs. On univariate analysis, patients with higher lymphocyte nadirs exhibited significantly improved OS (hazard ratio [HR] = 0.51 per 10³ lymphocytes/μL, P=.01) and EFS (HR = 0.46 per 10³ lymphocytes/μL, P<.0001).
These differences held on multivariate analyses controlling for common disease and treatment characteristics, including GTV. Conclusions: Lower lymphocyte nadirs during definitive RT were associated with larger GTVs and worse patient outcomes.
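The Spearman coefficients reported above are Pearson correlations computed on ranks. A minimal pure-Python sketch with hypothetical GTV and lymphocyte-nadir values (not the study's data), chosen so that larger tumors track lower nadirs:

```python
def rank(values):
    """Assign 1-based average ranks, handling ties by averaging."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2.0 + 1.0  # average rank for a tied group
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman_rho(x, y):
    """Spearman correlation = Pearson correlation of the rank vectors."""
    rx, ry = rank(x), rank(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical data: larger GTV (cc), lower lymphocyte nadir -> negative rho
gtv_cc = [50, 120, 200, 80, 300, 150]
nadir = [0.9, 0.6, 0.4, 0.8, 0.3, 0.5]
print(round(spearman_rho(gtv_cc, nadir), 2))
```

Because the test data are perfectly inversely ordered, this toy example yields rho = -1.0; real clinical data, as in the abstract, give intermediate values such as -0.26 or -0.48.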
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kalet, A; Cao, N; Meyer, J
Purpose: The purpose of this study was to evaluate the dosimetric and practical effects of the Monaco treatment planning system's “max arcs-per-beam” optimization parameter in pelvic radiotherapy treatments. Methods: A total of 17 previously treated patients were selected for this study, covering a range of pelvic disease sites including prostate (9), bladder (1), uterus (3), rectum (3), and cervix (1). For each patient, two plans were generated, one using an arcs-per-beam setting of ‘1’ and another with a setting of ‘2’. The setting allows the optimizer to add a gantry direction change, creating multiple arc passes per beam sequence. Volumes and constraints established from the initial clinical treatments were used for planning. All constraints and dose coverage objects were kept the same between plans, and all plans were normalized to 99.7% to ensure 100% of the PTV received 95% of the prescription dose. We evaluated the PTV conformity index, homogeneity index, total monitor units, number of control points, and various dose-volume histogram (DVH) points for statistical comparison (alpha=0.05). Results: We found for the 10 complex-shaped target volumes (small central volumes with extending bilateral ‘arms’ to cover nodal regions) that the use of 2 arcs per beam achieved significantly lower average DVH values for the bladder V20 (p=0.036) and rectum V30 (p=0.001) while still meeting the high-dose target constraints. DVH values for the simpler, more spherical PTVs were not found to be significantly different. Additionally, we found a beam delivery time reduction of approximately 25%. Conclusion: In summary, the dosimetric benefit, while moderate, was improved over a 1 arc-per-beam setting for complex PTVs, and equivalent in other cases. The overall reduced delivery time suggests that the use of multiple arcs per beam could lead to reduced patient on-table time, increased clinical throughput, and reduced medical physics quality assurance effort.
Chen, Huixiao; Winey, Brian A; Daartz, Juliane; Oh, Kevin S; Shin, John H; Gierga, David P
2015-01-01
To evaluate plan quality and delivery efficiency gains of volumetric modulated arc therapy (VMAT) versus multicriteria optimization-based intensity modulated radiation therapy (MCO-IMRT) for stereotactic radiosurgery of spinal metastases. MCO-IMRT plans (RayStation V2.5; RaySearch Laboratories, Stockholm, Sweden) of 10 spinal radiosurgery cases using 7-9 beams were developed for clinical delivery, and patients were replanned using VMAT with partial arcs. The prescribed dose was 18 Gy, and target coverage was maximized such that the maximum dose to the planning organ-at-risk volume (PRV) of the spinal cord was 10 or 12 Gy. Dose-volume histogram (DVH) constraints from the clinically acceptable MCO-IMRT plans were utilized for VMAT optimization. Plan quality and delivery efficiency with and without collimator rotation for MCO-IMRT and VMAT were compared and analyzed based upon DVH, planning target volume coverage, homogeneity index, conformity number, cord PRV sparing, total monitor units (MU), and delivery time. The VMAT plans were capable of matching most DVH constraints from the MCO-IMRT plans. The ranges of MU were 4808-7193 for MCO-IMRT without collimator rotation, 3509-5907 for MCO-IMRT with collimator rotation, 4444-7309 for VMAT without collimator rotation, and 3277-5643 for VMAT with a 90-degree collimator rotation. The MU for the VMAT plans were similar to those of their corresponding MCO-IMRT plans, depending upon the complexity of the target and PRV geometries, but had a larger range. The delivery times of the MCO-IMRT and VMAT plans, both with collimator rotation, were 18.3 ± 2.5 minutes and 14.2 ± 2.0 minutes, respectively (P < .05). Both MCO-IMRT and VMAT can create clinically acceptable plans for spinal radiosurgery. The MU for MCO-IMRT and VMAT can be reduced significantly by utilizing a collimator rotation following the orientation of the spinal cord. Plan quality for VMAT is similar to MCO-IMRT, with similar MU for both modalities.
Delivery times can be reduced by nominally 25% with VMAT. Copyright © 2015 American Society for Radiation Oncology. Published by Elsevier Inc. All rights reserved.
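Plan-quality indices like the homogeneity index and conformity number compared above have several definitions in the literature; a minimal sketch of two common conventions (the van't Riet conformity number and a D2%/D98% homogeneity index), computed from hypothetical per-voxel dose lists (all names and numbers here are illustrative assumptions, not this study's method):

```python
def percentile_dose(doses, volume_pct):
    """DVH D_x%: minimum dose received by the hottest x% of voxels."""
    s = sorted(doses, reverse=True)
    idx = max(0, int(round(volume_pct / 100.0 * len(s))) - 1)
    return s[idx]

def conformity_number(target_doses, outside_doses, rx):
    """van't Riet CN = (TV_PIV / TV) * (TV_PIV / PIV); 1.0 is ideal.

    TV_PIV: target voxels at/above prescription; TV: all target voxels;
    PIV: all voxels at/above prescription, inside and outside the target."""
    tv_piv = sum(1 for d in target_doses if d >= rx)
    piv = tv_piv + sum(1 for d in outside_doses if d >= rx)
    return (tv_piv / len(target_doses)) * (tv_piv / piv)

# Hypothetical voxel doses (Gy) for an 18 Gy prescription
target = [18.5] * 95 + [17.0] * 5    # 95% of the PTV covered at prescription
outside = [18.2] * 10 + [5.0] * 10   # some prescription dose spills outside
hi = percentile_dose(target, 2) / percentile_dose(target, 98)  # D2% / D98%
print(round(conformity_number(target, outside, 18.0), 3), round(hi, 3))
```

A conformity number near 1 means the prescription isodose tightly wraps the target; a homogeneity index near 1 means a nearly uniform target dose.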
Selecting Data-Base Management Software for Microcomputers in Libraries and Information Units.
ERIC Educational Resources Information Center
Pieska, K. A. O.
1986-01-01
Presents a model for the evaluation of database management systems software from the viewpoint of librarians and information specialists. The properties of data management systems, database management systems, and text retrieval systems are outlined and compared. (10 references) (CLB)
Executive Guide to Software Maintenance. Reports on Computer Science and Technology.
ERIC Educational Resources Information Center
Osborne, Wilma M.
This guide is designed for federal executives and managers who have a responsibility for the planning and management of software projects and for federal staff members who are affected by, or involved in, making software changes, and who need to be aware of steps that can reduce both the difficulty and cost of software maintenance. Organized in a…
Special Report: Part One. New Tools for Professionals.
ERIC Educational Resources Information Center
Liskin, Miriam; And Others
1984-01-01
This collection of articles includes an examination of word-processing software; project management software; new expert systems that turn microcomputers into logical, well-informed consultants; simulated negotiation software; telephone management systems; and the physical design of an efficient microcomputer work space. (MBR)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Branson, Donald
The KCNSC Automated RAIL (Rolling Action Item List) system provides an electronic platform to manage and escalate rolling action items within a business and manufacturing environment at Honeywell. The software enables a tiered approach to issue management in which issues are escalated up a management chain based on team input and compared to business metrics. The software manages action items at different levels of the organization and allows all users to discuss action items concurrently. In addition, the software drives accountability through timely emails and proper visibility during team meetings.
Watershed Management Optimization Support Tool (WMOST) is a software application designed to facilitate integrated water resources management across wet and dry climate regions. It allows water resources managers and planners to screen a wide range of practices across their watersh...
NASA Technical Reports Server (NTRS)
1989-01-01
At their March 1988 meeting, members of the National Aeronautics and Space Administration (NASA) Information Resources Management (IRM) Council expressed concern that NASA may not have the infrastructure necessary to support the use of Ada for major NASA software projects. Members also observed that the agency has no coordinated strategy for applying its experiences with Ada to subsequent projects (Hinners, 27 June 1988). To deal with these problems, the IRM Council chair appointed an intercenter Ada and Software Management Assessment Working Group (ASMAWG). They prepared a report (McGarry et al., March 1989) entitled, 'Ada and Software Management in NASA: Findings and Recommendations'. That report presented a series of recommendations intended to enable NASA to develop better software at lower cost through the use of Ada and other state-of-the-art software engineering technologies. The purpose here is to describe the steps (called objectives) by which this goal may be achieved, to identify the NASA officials or organizations responsible for carrying out the steps, and to define a schedule for doing so. This document sets forth four goals: adopt agency-wide software standards and policies; use Ada as the programming language for all mission software; establish an infrastructure to support software engineering, including the use of Ada, and to leverage the agency's software experience; and build the agency's knowledge base in Ada and software engineering. A schedule for achieving the objectives and goals is given.
Teaching Reprint File Management: Basic Principles and Software Programs.
ERIC Educational Resources Information Center
Wood, Elizabeth H.
1989-01-01
Describes a workshop for teaching library users how to manage reprint files which was developed at the University of Southern California Norris Medical Library. Software programs designed for this purpose are suggested, and a sidebar lists software features to consider. (eight references) (MES)
Earth Observing System/Advanced Microwave Sounding Unit-A (EOS/AMSU-A) software management plan
NASA Technical Reports Server (NTRS)
Schwantje, Robert
1994-01-01
This document defines the responsibilities for the management of the life-cycle development of the flight software installed in the AMSU-A instruments, and the ground support software used in the test and integration of the AMSU-A instruments.
Security Risks: Management and Mitigation in the Software Life Cycle
NASA Technical Reports Server (NTRS)
Gilliam, David P.
2004-01-01
A formal approach to managing and mitigating security risks in the software life cycle is requisite to developing software with a higher degree of assurance that it is free of security defects that pose risk to the computing environment and the organization. Due to its criticality, security should be integrated as a formal approach in the software life cycle. Both a software security checklist and assessment tools should be incorporated into this life cycle process and integrated with a security risk assessment and mitigation tool. The current research at JPL addresses these areas through the development of a Software Security Assessment Instrument (SSAI) and its integration with a Defect Detection and Prevention (DDP) risk management tool.
Software to Manage the Unmanageable
NASA Technical Reports Server (NTRS)
2005-01-01
In 1995, NASA's Jet Propulsion Laboratory (JPL) contracted Redmond, Washington-based Lucidoc Corporation to design a technology infrastructure to automate the intersection between policy management and operations management with advanced software that automates document workflow, document status, and uniformity of document layout. JPL had very specific parameters for the software. It expected to store and catalog over 8,000 technical and procedural documents integrated with hundreds of processes. The project ended in 2000, but NASA still uses the resulting highly secure document management system, and Lucidoc has gone on to help other organizations, large and small, with integrating document flow and operations management to ensure a compliance-ready culture.
Airland Battlefield Environment (ALBE) Tactical Decision Aid (TDA) Demonstration Program,
1987-11-12
Management System (DBMS) software, GKS graphics libraries, and user interface software. These components of the ATB system software architecture will be... knowledge base and augment the decision-making process by providing information useful in the formulation and execution of battlefield strategies... Topographic Laboratories as an Engineer. Ms. Capps is managing the software development of the AirLand Battlefield Environment (ALBE) geographic
Assessments of Sequential Intensity Modulated Radiation Therapy Boost (SqIB) Treatments Using HART
NASA Astrophysics Data System (ADS)
Pyakuryal, Anil
2009-05-01
A retrospective study was pursued to evaluate the SqIB treatments performed on ten head and neck cancer patients (n=10). Average prescription doses (PDs) of 39 Gy, 15 Gy and 17.8 Gy were delivered consecutively from larger to smaller planning target volumes (PTVs) in three different treatment plans using 6 MV X-ray photon beams from a linear accelerator (SLA Linac, Elekta) on BID week-on/week-off schedules. These plans were statistically evaluated on the basis of plan indices (PIs), dose response of targets and critical structures, and dose tolerance (DT) of various organs utilizing the automated DVH analysis software known as Histogram Analysis in Radiation Therapy-HART (S. Jang et al., 2008, Med Phys 35, p. 2812). Mean SqIB PIs were found consistent with the reported values for various radiosurgical systems. 95.5% (n=10) of each PTV and of the gross tumor volume received 95% (n=10) of the PDs in treatments. The average volume of ten organs (N=10) affected by each PD shrank with decreasing size of the PTVs in the above plans. The largest volume of the oropharynx (79%, n=10, N=10) was irradiated at the PD, but the largest volume of the larynx (98%, n=10, N=10) was vulnerable to the DT of the structure (TD50). Thus, we have demonstrated the efficiency and accuracy of HART in the assessment of Linac-based plans in radiation therapy treatments of cancer.
Health Monitor for Multitasking, Safety-Critical, Real-Time Software
NASA Technical Reports Server (NTRS)
Zoerner, Roger
2011-01-01
Health Manager can detect Bad Health prior to a failure occurring by periodically monitoring the application software by looking for code corruption errors, and sanity-checking each critical data value prior to use. A processor's memory can fail and corrupt the software, or the software can accidentally write to the wrong address and overwrite the executing software. This innovation will continuously calculate a checksum of the software load to detect corrupted code. This will allow a system to detect a failure before it happens. This innovation monitors each software task (thread) so that if any task reports "bad health," or does not report to the Health Manager, the system is declared bad. The Health Manager reports overall system health to the outside world by outputting a square wave signal. If the square wave stops, this indicates that system health is bad or hung and cannot report. Either way, "bad health" can be detected, whether caused by an error, corrupted data, or a hung processor. A separate Health Monitor Task is started and run periodically in a loop that starts and stops pending on a semaphore. Each monitored task registers with the Health Manager, which maintains a count for the task. The registering task must indicate if it will run more or less often than the Health Manager. If the task runs more often than the Health Manager, the monitored task calls a health function that increments the count and verifies it did not go over max-count. When the periodic Health Manager runs, it verifies that the count did not go over the max-count and zeroes it. If the task runs less often than the Health Manager, the periodic Health Manager will increment the count. The monitored task zeroes the count, and both the Health Manager and monitored task verify that the count did not go over the max-count.
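The count-based watchdog scheme described above can be sketched in a few lines. A simplified, hypothetical Python model (the real system is embedded, multitasking code driven by semaphores; the class and parameter names here are assumptions for illustration):

```python
class HealthManager:
    """Simplified model of the count-based task watchdog described above."""

    def __init__(self, max_count=10):
        self.max_count = max_count
        self.counts = {}   # task name -> heartbeat count
        self.faster = {}   # task name -> runs more often than the manager?
        self.healthy = True

    def register(self, task, runs_more_often):
        self.counts[task] = 0
        self.faster[task] = runs_more_often

    def heartbeat(self, task):
        """Called by a fast task each cycle; a slow task calls it to zero its count."""
        if self.faster[task]:
            self.counts[task] += 1
            if self.counts[task] > self.max_count:
                self.healthy = False  # manager stopped zeroing, or task ran away
        else:
            self.counts[task] = 0

    def periodic_check(self):
        """The manager's own periodic pass over all registered tasks."""
        for task, fast in self.faster.items():
            if fast:
                if self.counts[task] == 0:
                    self.healthy = False  # fast task never reported this period
                self.counts[task] = 0
            else:
                self.counts[task] += 1
                if self.counts[task] > self.max_count:
                    self.healthy = False  # slow task stopped reporting

hm = HealthManager(max_count=3)
hm.register("telemetry", runs_more_often=True)
hm.register("logger", runs_more_often=False)
hm.heartbeat("telemetry")
hm.periodic_check()
print(hm.healthy)
```

The key property, as in the abstract, is symmetry: a hung task, a hung manager, or corrupted counts all drive the count past its bound or leave it unreported, so "bad health" is declared either way.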
EDRMS for Academic Records Management: A Design Study in a Malaysian University
ERIC Educational Resources Information Center
Miah, Shah Jahan; Samsudin, Ahmad Zam Hariro
2017-01-01
Higher education institutes such as universities suffer from a range of issues in managing their academic records and relevant digital contents. Many universities nowadays use specific software applications as an effective mechanism for records management. The effective provision of enterprise records management (ERM) software for managing…
Report on a Knowledge-Based Software Assistant.
1983-08-01
maintainers, project managers, and end-users). In this paradigm, software activities, including definition, management, and validation will be... project management. This report also presents a plan for the development of the KBSA, along with a description of the necessary supporting technology...
Software-Engineering Process Simulation (SEPS) model
NASA Technical Reports Server (NTRS)
Lin, C. Y.; Abdel-Hamid, T.; Sherif, J. S.
1992-01-01
The Software Engineering Process Simulation (SEPS) model, developed at JPL, is described. SEPS is a dynamic simulation model of the software project development process. It uses the feedback principles of system dynamics to simulate the dynamic interactions among various software life cycle development activities and management decision-making processes. The model is designed to be a planning tool to examine tradeoffs of cost, schedule, and functionality, and to test the implications of different managerial policies on a project's outcome. Furthermore, SEPS will enable software managers to gain a better understanding of the dynamics of software project development and perform postmortem assessments.
DOE Office of Scientific and Technical Information (OSTI.GOV)
RIECK, C.A.
1999-02-23
This Software Configuration Management Plan (SCMP) provides the instructions for change control of the W-211 Project, Retrieval Control System (RCS) software after initial approval/release but prior to the transfer of custody to the waste tank operations contractor. This plan applies to the W-211 system software developed by the project, consisting of the computer human-machine interface (HMI) and programmable logic controller (PLC) software source and executable code, for production use by the waste tank operations contractor. The plan encompasses that portion of the W-211 RCS software represented on project-specific AUTOCAD drawings that are released as part of the C1 definitive design package (these drawings are identified on the drawing list associated with each C-1 package), and the associated software code. Implementation of the plan is required for formal acceptance testing and production release. The software configuration management plan does not apply to reports and data generated by the software except where specifically identified. Control of information produced by the software once it has been transferred for operation is the responsibility of the receiving organization.
Computer software management, evaluation, and dissemination
NASA Technical Reports Server (NTRS)
1983-01-01
The activities of the Computer Software Management and Information Center involving the collection, processing, and distribution of software developed under the auspices of NASA and certain other federal agencies are reported. Program checkout and evaluation, inventory control, customer services and marketing, dissemination, program maintenance, and special development tasks are discussed.
At the Creation: Chaos, Control, and Automation--Commercial Software Development for Archives.
ERIC Educational Resources Information Center
Drr, W. Theodore
1988-01-01
An approach to the design of flexible text-based management systems for archives includes tiers for repository, software, and user management systems. Each tier has four layers--objective, program, result, and interface. Traps awaiting software development companies involve the market, competition, operations, and finance. (10 references) (MES)
Records Inventory Data Collection Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cole, Brian A.
1995-03-01
DATALINK was created to provide an easy-to-use data collection program for records management software products. It provides several useful tools for capturing and validating record index data in the field. It also allows users to easily create a comma-delimited ASCII text file for data export into most records management software products.
The Birth, Death, and Resurrection of an SPI Project
NASA Astrophysics Data System (ADS)
Carlsson, Sven; Schönström, Mikael
Commentators on contemporary themes of strategic management and firm competitiveness stress that a firm's competitive advantage flows from its unique knowledge and how it manages knowledge, and for many firms the ability to create, share, exchange, and use knowledge has a major impact on their competitiveness (Nonaka & Teece 2001). In software development, knowledge management (KM) plays an increasingly important role. It has been argued that the KM field is an important source for creating new perspectives on the software development process (Iivari 2000). Several Software Process Improvement (SPI) approaches stress the importance of managing knowledge and experiences as a way of improving software processes (Ahern et al. 2001). Another SPI trend is the use of ideas from process management, as in the Capability Maturity Model (CMM). Unfortunately, little research on the effects of the use of process management ideas in SPI exists. Given the influx of process management ideas into SPI, the impact of these ideas should be addressed.
2015-01-01
Objective: Spacer gel is used to reduce the rectal dose in prostate radiotherapy. It is injected to increase the distance between the prostate and rectum. During the course of external radiotherapy treatment, physiological changes in rectal volume occur. When using polyethylene glycol material, such as DuraSeal® (Covidien, Mansfield, MA), gel resorption also occurs. Together, these factors alter the original dose plan distribution. Methods: External dose planning and calculations were simulated using images acquired from 10 patients who were treated with brachytherapy and gel. The CT series were taken relative to gel injection: pre 1 day, post 1 day, post 1 month and post 2 months. Adaptive planning was compared with a single plan. Results: Adaptive planning shows better results compared with the single plan used over the total treatment course; however, the effect is minor. Conclusion: Gel usage is clearly favourable to rectal DVH. Using adaptive planning with gel improves rectal DVH but is not necessary according to this study. Advances in knowledge: Spacer gel is used in prostate radiotherapy to increase the distance between the prostate and the rectum, thus reducing the rectal doses. During the treatment course, gel resorption occurs, which affects the rectal doses. The usefulness of adaptive planning to compensate for this resorption effect has not been studied before. PMID:26370300
Radiation-induced complications in prostate cancer patients treated with radiotherapy
NASA Astrophysics Data System (ADS)
Azuddin, A. Yusof; Rahman, I. Abdul; Siah, N. J.; Mohamed, F.; Saadc, M.; Ismail, F.
2014-09-01
The purpose of the study is to determine the relationship between radiation-induced complications and dosimetric and radiobiological parameters for prostate cancer patients who underwent conformal radiotherapy treatment. 17 prostate cancer patients treated with conformal radiotherapy were retrospectively analysed. The dosimetric data were retrieved in the form of dose-volume histograms (DVHs) from the Radiotherapy Treatment Planning System. The DVH was utilised to derive the Normal Tissue Complication Probability (NTCP) as radiobiological data. Follow-up data from medical records were used to grade the occurrence of acute gastrointestinal (GI) and genitourinary (GU) complications using the Radiation Therapy Oncology Group (RTOG) scoring system. The chi-square test was used to determine the relationship between radiation-induced complications and the dosimetric and radiobiological parameters. 8 (47%) and 7 (41%) patients had acute GI and GU complications, respectively. The acute GI complication was associated with V60rectum, rectal mean dose and NTCPrectum, with p-values of 0.016, 0.038 and 0.049 respectively. There were no significant relationships between acute GU complications and the dosimetric and radiobiological variables. Further study can be done by increasing the sample size and follow-up duration for a deeper understanding of the factors affecting GU and GI complications in prostate cancer radiotherapy.
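The chi-square association test used in the study above is straightforward for a 2×2 contingency table. A minimal pure-Python sketch (the patient counts below are hypothetical illustration, not the study's data):

```python
def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table.

    table = [[a, b], [c, d]]: rows = exposure (e.g. high vs low rectal V60),
    columns = outcome (complication: yes / no)."""
    row = [sum(r) for r in table]
    col = [table[0][j] + table[1][j] for j in range(2)]
    n = row[0] + row[1]
    chi2 = 0.0
    for i in range(2):
        for j in range(2):
            expected = row[i] * col[j] / n  # expected count under independence
            chi2 += (table[i][j] - expected) ** 2 / expected
    return chi2

# Hypothetical: 9 patients with high rectal V60 (6 with GI complication),
# 8 patients with low rectal V60 (2 with GI complication)
stat = chi_square_2x2([[6, 3], [2, 6]])
print(round(stat, 3))  # compare against the df=1, alpha=0.05 critical value 3.841
```

With such small cell counts a continuity correction (or Fisher's exact test) would normally be preferred; this sketch shows only the basic statistic that underlies the p-values quoted in the abstract.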
FAILSAFE Health Management for Embedded Systems
NASA Technical Reports Server (NTRS)
Horvath, Gregory A.; Wagner, David A.; Wen, Hui Ying; Barry, Matthew
2010-01-01
The FAILSAFE project is developing concepts and prototype implementations for software health management in mission-critical, real-time embedded systems. The project unites features of the industry-standard ARINC 653 Avionics Application Software Standard Interface and JPL's Mission Data System (MDS) technology (see figure). The ARINC 653 standard establishes requirements for the services provided by partitioned, real-time operating systems. The MDS technology provides a state analysis method, canonical architecture, and software framework that facilitates the design and implementation of software-intensive complex systems. The MDS technology has been used to provide the health management function for an ARINC 653 application implementation. In particular, the focus is on showing how this combination enables reasoning about, and recovering from, application software problems.
Using Decision Structures for Policy Analysis in Software Product-line Evolution - A Case Study
NASA Astrophysics Data System (ADS)
Sarang, Nita; Sanglikar, Mukund A.
Project management decisions are the primary basis for project success (or failure). Mostly, such decisions are based on an intuitive understanding of the underlying software engineering and management process and are therefore liable to be misjudged. Our problem domain is product-line evolution. We model the dynamics of the process by incorporating feedback loops appropriate to two decision structures: staffing policy, and the forces of growth associated with long-term software evolution. The model is executable and enables project managers to assess the long-term effects of possible actions. Our work also corroborates results from earlier studies of E-type systems, in particular the FEAST project and the rules of software evolution, planning and management.
Model Transformation for a System of Systems Dependability Safety Case
NASA Technical Reports Server (NTRS)
Murphy, Judy; Driskell, Stephen B.
2010-01-01
Software plays an increasingly large role in all aspects of NASA's science missions. This extends to the identification, management, and control of faults which affect safety-critical functions and, by extension, the overall success of the mission. Traditionally, the analysis of fault identification, management, and control has been hardware based. Due to the increasing complexity of systems, there has been a corresponding increase in the complexity of fault management software. The NASA Independent Verification & Validation (IV&V) program is creating processes and procedures to identify and incorporate safety-critical software requirements, along with corresponding software faults, so that potential hazards may be mitigated. This paper, "Specific to Generic ... A Case for Reuse," describes the phases of a dependability and safety study which identifies a new process to create a foundation for reusable assets. These assets support the identification and management of specific software faults and their transformation from specific to generic software faults. This approach also has applications to systems outside of the NASA environment. This paper addresses how a mission-specific dependability and safety case is being transformed into a generic dependability and safety case which can be reused for any type of space mission, with an emphasis on software fault conditions.
NASA Astrophysics Data System (ADS)
1992-06-01
The House Committee on Science, Space, and Technology asked NASA to study software development issues for the space station, including how well NASA has implemented key software engineering practices for the station. Specifically, the objectives were to determine: (1) whether independent verification and validation techniques are being used to ensure that critical software meets specified requirements and functions; (2) whether NASA has incorporated software risk management techniques into the program; (3) whether standards are in place that will prescribe a disciplined, uniform approach to software development; and (4) whether software support tools will help, as intended, to maximize efficiency in developing and maintaining the software. To meet these objectives, the study proceeded by: (1) reviewing and analyzing software development objectives and strategies contained in NASA conference publications; (2) reviewing and analyzing NASA, other government, and industry guidelines for establishing good software development practices; (3) reviewing and analyzing technical proposals and contracts; (4) reviewing and analyzing software management plans, risk management plans, and program requirements; (5) reviewing and analyzing reports prepared by NASA and contractor officials that identified key issues and challenges facing the program; (6) obtaining expert opinions on what constitutes appropriate independent V&V and software risk management activities; (7) interviewing program officials at NASA headquarters in Washington, DC; at the Space Station Program Office in Reston, Virginia; and at the three work package centers: Johnson in Houston, Texas; Marshall in Huntsville, Alabama; and Lewis in Cleveland, Ohio; and (8) interviewing contractor officials doing work for NASA at Johnson and Marshall. The audit work was performed in accordance with generally accepted government auditing standards between April 1991 and May 1992.
ERIC Educational Resources Information Center
McDonald, Joseph
1986-01-01
Focusing on management decisions in academic libraries, this article compares management information systems (MIS) with decision support systems (DSS) and discusses the decision-making process, information needs of library managers, sources of data, reasons for choosing a microcomputer, preprogrammed application software, prototyping a system, and…
NASA software documentation standard software engineering program
NASA Technical Reports Server (NTRS)
1991-01-01
The NASA Software Documentation Standard (hereinafter referred to as Standard) can be applied to the documentation of all NASA software. This Standard is limited to documentation format and content requirements. It does not mandate specific management, engineering, or assurance standards or techniques. This Standard defines the format and content of documentation for software acquisition, development, and sustaining engineering. Format requirements address where information shall be recorded and content requirements address what information shall be recorded. This Standard provides a framework to allow consistency of documentation across NASA and visibility into the completeness of project documentation. This basic framework consists of four major sections (or volumes). The Management Plan contains all planning and business aspects of a software project, including engineering and assurance planning. The Product Specification contains all technical engineering information, including software requirements and design. The Assurance and Test Procedures contains all technical assurance information, including Test, Quality Assurance (QA), and Verification and Validation (V&V). The Management, Engineering, and Assurance Reports is the library and/or listing of all project reports.
Configuring the Orion Guidance, Navigation, and Control Flight Software for Automated Sequencing
NASA Technical Reports Server (NTRS)
Odegard, Ryan G.; Siliwinski, Tomasz K.; King, Ellis T.; Hart, Jeremy J.
2010-01-01
The Orion Crew Exploration Vehicle is being designed with greater automation capabilities than any other crewed spacecraft in NASA's history. The Guidance, Navigation, and Control (GN&C) flight software architecture is designed to provide a flexible and evolvable framework that accommodates increasing levels of automation over time. Within the GN&C flight software, a data-driven approach is used to configure software. This approach allows data reconfiguration and updates to automated sequences without requiring recompilation of the software. Because of the great dependency of the automation and the flight software on the configuration data, data management is a vital component of the processes for software certification, mission design, and flight operations. To enable the automated sequencing and data configuration of the GN&C subsystem on Orion, a desktop database configuration tool has been developed. The database tool allows the specification of the GN&C activity sequences, the automated transitions in the software, and the corresponding parameter reconfigurations. These aspects of the GN&C automation on Orion are all coordinated via data management, and the database tool provides the ability to test the automation capabilities during the development of the GN&C software. In addition to providing the infrastructure to manage the GN&C automation, the database tool has been designed with capabilities to import and export artifacts for simulation analysis and documentation purposes. Furthermore, the database configuration tool, currently used to manage simulation data, is envisioned to evolve into a mission planning tool for generating and testing GN&C software sequences and configurations.
A key enabler of the GN&C automation design, the database tool allows both the creation and maintenance of the data artifacts, as well as serving the critical role of helping to manage, visualize, and understand the data-driven parameters both during software development and throughout the life of the Orion project.
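The data-driven approach described above can be sketched in miniature: activity sequences, automated transitions, and their parameters live in a data table that the database tool regenerates, so the automation changes without recompiling flight code. All mode names and parameters below are invented for illustration and are not Orion's actual schema:

```python
# Hypothetical sequence table: each activity names its trigger event,
# its successor, and the parameters loaded when the mode is active.
SEQUENCE_TABLE = {
    "coast":        {"next": "deorbit_burn", "trigger": "burn_window_open",
                     "params": {"attitude_deadband_deg": 5.0}},
    "deorbit_burn": {"next": "entry", "trigger": "delta_v_achieved",
                     "params": {"attitude_deadband_deg": 0.5}},
    "entry":        {"next": None, "trigger": None,
                     "params": {"bank_angle_limit_deg": 60.0}},
}

def step(mode, event):
    """Advance to the next activity when its trigger event arrives;
    otherwise stay in the current mode."""
    entry = SEQUENCE_TABLE[mode]
    if event == entry["trigger"] and entry["next"]:
        return entry["next"]
    return mode
```

Updating a deadband or inserting a new activity is then a data edit that can be tested against the same unchanged sequencing code.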
Top down, bottom up structured programming and program structuring
NASA Technical Reports Server (NTRS)
Hamilton, M.; Zeldin, S.
1972-01-01
New design and programming techniques for shuttle software are presented. Based on previous Apollo experience, recommendations are made to apply top-down structured programming techniques to shuttle software. New software verification techniques for large software systems are recommended. HAL, the higher-order language selected for the shuttle flight code, is discussed and found to be adequate for implementing these techniques. Recommendations are made to apply a workable combination of top-down and bottom-up methods in the management of shuttle software. Program structuring is discussed as it relates to both programming and management techniques.
CrossTalk. The Journal of Defense Software Engineering. Volume 13, Number 6, June 2000
2000-06-01
Techniques for Efficiently Generating and Testing Software: This paper presents a proven process that uses advanced tools to design, develop and test... optimal software. by Keith R. Wegner. Large Software Systems—Back to Basics: Development methods that work on small problems seem to not scale well to... Ability Requirements for Teamwork: Implications for Human Resource Management, Journal of Management, Vol. 20, No. 2, 1994. 11. Ferguson, Pat, Watts S
NASA Technical Reports Server (NTRS)
Jester, Peggy L.; Hancock, David W., III
1999-01-01
This document provides the Data Management Plan for the GLAS Standard Data Software (SDS) supporting the GLAS instrument of the EOS ICESat Spacecraft. The SDS encompasses the ICESat Science Investigator-led Processing System (I-SIPS) Software and the Instrument Support Facility (ISF) Software. This Plan addresses the identification, authority, and description of the interface nodes associated with the GLAS Standard Data Products and the GLAS Ancillary Data.
Software Project Management and Measurement on the World-Wide-Web (WWW)
NASA Technical Reports Server (NTRS)
Callahan, John; Ramakrishnan, Sudhaka
1996-01-01
We briefly describe a system for forms-based, work-flow management that helps members of a software development team overcome geographical barriers to collaboration. Our system, called the Web Integrated Software Environment (WISE), is implemented as a World-Wide-Web service that allows management and measurement of software development projects based on dynamic analysis of change activity in the workflow. WISE tracks issues in a software development process, supports informal communication among users with different roles, maintains to-do lists, and helps in software process improvement. WISE minimizes the time devoted to metrics collection and analysis by providing implicit delivery of messages between users based on the content of project documents. The use of a database in WISE is hidden from the users, who view WISE as maintaining a personal 'to-do list' of tasks related to the many projects on which they may play different roles.
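The implicit-delivery idea, routing notices to users based on document content rather than explicit addressing, can be sketched roughly as follows. The roles, subscription terms, and keyword matching are assumptions made for illustration, not WISE's actual mechanism:

```python
# Hypothetical role subscriptions: a user receives a notice when a
# project document mentions a term their role is subscribed to.
SUBSCRIPTIONS = {
    "tester":    {"test", "defect", "regression"},
    "developer": {"build", "defect", "api"},
    "manager":   {"schedule", "milestone"},
}

def route(document_text):
    """Return the sorted list of users whose subscriptions overlap
    the words appearing in the document."""
    words = set(document_text.lower().split())
    return sorted(user for user, terms in SUBSCRIPTIONS.items()
                  if terms & words)

recipients = route("Defect found in regression test of build 42")
```

No one addressed the message; the document's own content determined whose to-do lists it reaches.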
Pros and Cons of Clinical Pathway Software Management: A Qualitative Study.
Aarnoutse, M F; Brinkkemper, S; de Mul, M; Askari, M
2018-01-01
In this study we aimed to assess the perceived effectiveness of clinical pathway management software for healthcare professionals. A case study of the clinical pathway management software program Check-It was performed in three departments at an academic medical center. Four months after the implementation of the software, interviews were held with healthcare professionals who work with the system. The interview questions were posed in a semi-structured format, and the participants were asked about the perceived positive or negative effects of Check-It and whether they thought the software was effective for them. The interviews were recorded and transcribed, then analyzed using grounded theory with different coding techniques. Our results showed fewer overlooked tasks, pre-filled orders and letters, better overview, and increased protocol insight as positive aspects of using the software. A lack of flexibility was reported as a negative aspect.
Software for minimalistic data management in large camera trap studies
Krishnappa, Yathin S.; Turner, Wendy C.
2014-01-01
The use of camera traps is now widespread and their importance in wildlife studies is well understood. Camera trap studies can produce millions of photographs, and there is a need for software to help manage photographs efficiently. In this paper, we describe a software system that was built to successfully manage a large behavioral camera trap study that produced more than a million photographs. We describe the software architecture and the design decisions that shaped the evolution of the program over the study's three-year period. The software system has the ability to automatically extract metadata from images and add customized metadata to the images in a standardized format. The software system can be installed as a standalone application on popular operating systems. It is minimalistic, scalable, and extendable so that it can be used by small teams or individual researchers for a broad variety of camera trap studies. PMID:25110471
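The metadata standardization step can be illustrated with a minimal stdlib sketch. The paper's system reads metadata from the images themselves; here, purely as an assumption for illustration, the camera ID and capture time are taken from a filename convention instead:

```python
from datetime import datetime
from pathlib import Path

# Assumed (hypothetical) filename convention: CAMID_YYYYMMDD_HHMM.jpg
def standardize(path, site=None, species=None):
    """Extract metadata from a camera-trap photo path and merge it with
    customized fields into one standardized key/value record."""
    camera, date, time = Path(path).stem.split("_")
    return {
        "file": str(path),
        "camera": camera,
        "captured": datetime.strptime(date + time, "%Y%m%d%H%M").isoformat(),
        "site": site,        # customized metadata added by the
        "species": species,  # researcher, stored in the same format
    }

rec = standardize("CAM03_20140615_1432.jpg", site="waterhole-A", species="zebra")
```

Records in this uniform shape can then be written out or queried regardless of which camera model produced the photo.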
Microcomputer Database Management Systems that Interface with Online Public Access Catalogs.
ERIC Educational Resources Information Center
Rice, James
1988-01-01
Describes a study that assessed the availability and use of microcomputer database management interfaces to online public access catalogs. The software capabilities needed to effect such an interface are identified, and available software packages are evaluated by these criteria. A directory of software vendors is provided. (4 notes with…
DATALINK. Records Inventory Data Collection Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cole, B.
1995-03-01
DATALINK was created to provide an easy-to-use data collection program for records management software products. It provides several useful tools for capturing and validating record index data in the field. It also allows users to easily create a comma-delimited ASCII text file for data export into most records management software products.
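The comma-delimited export DATALINK performs can be sketched with the standard csv module; the field names below are illustrative, since DATALINK's actual record-index layout is not described here:

```python
import csv
import io

# Hypothetical record-index fields for a records inventory.
def export_records(records, fileobj):
    """Write records as a comma-delimited ASCII text file suitable for
    import into a records management product."""
    writer = csv.writer(fileobj)
    writer.writerow(["box_id", "title", "date_range", "location"])
    for rec in records:
        writer.writerow([rec["box_id"], rec["title"],
                         rec["date_range"], rec["location"]])

buf = io.StringIO()
export_records([{"box_id": "B-001", "title": "Contracts",
                 "date_range": "1990-1992", "location": "Vault 3"}], buf)
```

Using the csv module rather than manual string joining handles quoting automatically when a field happens to contain a comma.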
DOT National Transportation Integrated Search
1978-01-01
This report deals with the Periodic Motor Vehicle Inspection Management Evaluation System software documentation and implementation procedures. A companion report entitled "A Management System for Evaluating the Virginia Periodic Motor Vehicle Inspec...
Computer Bits: Child Care Center Management Software Buying Guide Update.
ERIC Educational Resources Information Center
Neugebauer, Roger
1987-01-01
Compares seven center management programs used for basic financial and data management tasks such as accounting, payroll and attendance records, and mailing lists. Describes three other specialized programs and gives guidelines for selecting the best software for a particular center. (NH)
DOT National Transportation Integrated Search
2009-04-01
This project evaluates the process that was followed by MDOT and other stakeholders for the acquisition : of new Advanced Traffic Management System (ATMS) software aiming to integrate and facilitate the : management of various Intelligent Transportat...
Management Guidelines for Database Developers' Teams in Software Development Projects
NASA Astrophysics Data System (ADS)
Rusu, Lazar; Lin, Yifeng; Hodosi, Georg
The worldwide job market for database developers (DBDs) has grown continually over the last several years. In some companies, DBDs are organized as a dedicated team (the DBD team) to support other projects and roles. As a new role, the DBD team faces a major problem: there are no management guidelines for it. The team manager does not know which kinds of tasks should be assigned to this team or what practices should be used during the DBDs' work. In this paper we therefore develop a set of management guidelines, comprising 8 fundamental tasks and 17 practices from the software development process, using two methodologies, the Capability Maturity Model (CMM) and agile software development (in particular Scrum), in order to improve the DBD team's work. The management guidelines developed here have been complemented with practices from the authors' experience in this area and have been evaluated in the case of a software company. The management guidelines for DBD teams presented in this paper could be very useful for other companies that use a DBD team and could contribute to increasing the efficiency of these teams in their work on software development projects.
Web-Based Software for Managing Research
NASA Technical Reports Server (NTRS)
Hoadley, Sherwood T.; Ingraldi, Anthony M.; Gough, Kerry M.; Fox, Charles; Cronin, Catherine K.; Hagemann, Andrew G.; Kemmerly, Guy T.; Goodman, Wesley L.
2007-01-01
aeroCOMPASS is a software system, originally designed to aid in the management of wind tunnels at Langley Research Center, that could be adapted to provide similar aid to other enterprises in which research is performed in common laboratory facilities by users who may be geographically dispersed. Included in aeroCOMPASS is Web-interface software that provides a single, convenient portal to a set of project- and test-related software tools and other application programs. The heart of aeroCOMPASS is a user-oriented document-management software subsystem that enables geographically dispersed users to easily share and manage a variety of documents. A principle of "write once, read many" is implemented throughout aeroCOMPASS to eliminate the need for multiple entry of the same information. The Web framework of aeroCOMPASS provides links to client-side application programs that are fully integrated with databases and server-side application programs. Other subsystems of aeroCOMPASS include ones for reserving hardware, tracking of requests and feedback from users, generating interactive notes, administration of a customer-satisfaction questionnaire, managing execution of tests, managing archives of metadata about tests, planning tests, and providing online help and instruction for users.
Space Station Software Recommendations
NASA Technical Reports Server (NTRS)
Voigt, S. (Editor)
1985-01-01
Four panels of invited experts and NASA representatives focused on the following topics: software management, software development environment, languages, and software standards. Each panel deliberated in private, held two open sessions with audience participation, and developed recommendations for the NASA Space Station Program. The major thrusts of the recommendations were as follows: (1) The software management plan should establish policies, responsibilities, and decision points for software acquisition; (2) NASA should furnish a uniform modular software support environment and require its use for all space station software acquired (or developed); (3) The language Ada should be selected for space station software, and NASA should begin to address issues related to the effective use of Ada; and (4) The space station software standards should be selected (based upon existing standards where possible), and an organization should be identified to promulgate and enforce them. These and related recommendations are described in detail in the conference proceedings.
Computer-aided software development process design
NASA Technical Reports Server (NTRS)
Lin, Chi Y.; Levary, Reuven R.
1989-01-01
The authors describe an intelligent tool designed to aid managers of software development projects in planning, managing, and controlling the development process of medium- to large-scale software projects. Its purpose is to reduce uncertainties in the budget, personnel, and schedule planning of software development projects. It is based on a dynamic model of the software development and maintenance life-cycle process. This dynamic process is composed of a number of time-varying, interacting developmental phases, each characterized by its intended functions and requirements. System dynamics is used as a modeling methodology. The resulting Software LIfe-Cycle Simulator (SLICS) and the hybrid expert simulation system of which it is a subsystem are described.
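The system-dynamics style of model that SLICS builds on can be illustrated with a toy staffing loop (not SLICS's actual model): new hires assimilate into experienced staff with a delay, so only part of the nominal headcount contributes at full productivity at any time:

```python
# Toy system-dynamics sketch: stocks (new, experienced, remaining work)
# evolve month by month under a simple assimilation flow. All rates
# and productivities are invented for illustration.
def simulate(months, new=4.0, experienced=0.0, assimilation_months=3.0):
    remaining = 100.0  # tasks left to complete
    for _ in range(months):
        assimilated = new / assimilation_months   # hires becoming experienced
        new -= assimilated
        experienced += assimilated
        # new staff work at half the productivity of experienced staff
        remaining -= 0.5 * new + 1.0 * experienced  # tasks per person-month
        remaining = max(remaining, 0.0)
    return remaining
```

Even this tiny feedback loop reproduces a familiar qualitative effect: added headcount helps less than it suggests until the assimilation delay has played out, which is the kind of schedule uncertainty such simulators are meant to expose.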
NASA Technical Reports Server (NTRS)
Pepe, J. T.
1972-01-01
A functional design of the software executive system for the space shuttle avionics computer is presented. Three primary functions of the executive are emphasized in the design: task management, I/O management, and configuration management. The executive system organization is based on the applications software and configuration requirements established during the Phase B definition of the Space Shuttle program. Although the primary features of the executive system architecture were derived from Phase B requirements, it was specified for implementation with the IBM 4 Pi EP aerospace computer and is expected to be incorporated into a breadboard data management computer system at NASA Manned Spacecraft Center's Information Systems Division. The executive system was structured for internal operation on the IBM 4 Pi EP system, with its external configuration and applications software assumed to be characteristic of the centralized quad-redundant avionics systems defined in Phase B.
Federal Emergency Management Information System (FEMIS) system administration guide. Version 1.3
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burford, M.J.; Burnett, R.A.; Downing, T.R.
The Federal Emergency Management Information System (FEMIS) is an emergency management planning and analysis tool that was developed by the Pacific Northwest National Laboratory (PNNL) under the direction of the U.S. Army Chemical Biological Defense Command. The FEMIS System Administration Guide defines FEMIS hardware and software requirements and gives instructions for installing the FEMIS software package. This document also contains information on the following: software installation for the FEMIS data servers, communication server, mail server, and the emergency management workstations; distribution media loading and FEMIS installation validation and troubleshooting; and system management of FEMIS users, login, privileges, and usage. The system administration utilities (tools) available in the FEMIS client software are described for user accounts and site profiles. This document also describes the installation and use of system and database administration utilities that will assist in keeping the FEMIS system running in an operational environment.
Software engineering and Ada in design
NASA Technical Reports Server (NTRS)
O'Neill, Don
1986-01-01
Modern software engineering promises significant reductions in software costs and improvements in software quality. The Ada language is the focus for these software methodology and tool improvements. The IBM FSD approach, including the software engineering practices that guide the systematic design and development of software products and the management of the software process are examined. The revised Ada design language adaptation is revealed. This four level design methodology is detailed including the purpose of each level, the management strategy that integrates the software design activity with the program milestones, and the technical strategy that maps the Ada constructs to each level of design. A complete description of each design level is provided along with specific design language recording guidelines for each level. Finally, some testimony is offered on education, tools, architecture, and metrics resulting from project use of the four level Ada design language adaptation.
SU-F-T-94: Plan2pdf - a Software Tool for Automatic Plan Report for Philips Pinnacle TPS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, C
Purpose: To implement an automatic electronic PDF plan reporting tool for the Philips Pinnacle treatment planning system (TPS). Methods: An electronic treatment plan reporting tool was developed by us to enable fully automatic PDF reporting from Pinnacle TPS to external EMR programs such as MOSAIQ. The tool is named "plan2pdf". plan2pdf is implemented using Pinnacle scripts, Java, and UNIX shell scripts, without any external program needed. plan2pdf supports full auto-mode and manual-mode reporting. In full auto-mode, with a single mouse click, plan2pdf will generate a detailed Pinnacle plan report in PDF format, which includes a customizable cover page, Pinnacle plan summary, orthogonal views through each plan POI and maximum dose point, DRRs for each beam, serial transverse views captured throughout the dose grid at a user-specified interval, and DVH and scorecard windows. The final PDF report is also automatically bookmarked for each section above for convenient plan review. The final PDF report can either be saved in a user-specified folder on Pinnacle, or it can be automatically exported to an EMR import folder via a user-configured FTP service. In manual capture mode, plan2pdf allows users to capture any Pinnacle plan by full screen, individual window, or rectangular ROI drawn on screen. Furthermore, to avoid possible mix-up of patients' plans during auto-mode reporting, a user conflict check feature is included in plan2pdf: it prompts the user to wait if another patient is being exported by plan2pdf by another user. Results: plan2pdf was tested extensively and successfully at our institution, which consists of 5 centers, 15 dosimetrists, and 10 physicists, running Pinnacle version 9.10 on Enterprise servers. Conclusion: plan2pdf provides a highly efficient, user-friendly, and clinically proven platform for all Philips Pinnacle users to generate a detailed plan report in PDF format for external EMR systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nie, K; Pouliot, J; Smith, E
Purpose: To evaluate the performance variations in commercial deformable image registration (DIR) tools for adaptive radiation therapy. Methods: Representative plans from three different anatomical sites, prostate, head-and-neck (HN), and cranial spinal irradiation (CSI) with L-spine boost, were included. Computerized deformed CT images were first generated using virtual DIR QA software (ImSimQA) for each case. The corresponding transformations served as the "reference". Three commercial software packages, MIMVista v5.5 with MIMMaestro v6.0, VelocityAI v2.6.2, and OnQ rts v2.1.15, were tested. The warped contours and doses were compared with the "reference" and among each other. Results: The performance in transferring contours was comparable among all three tools, with an average DICE coefficient of 0.81 for all the organs. However, the measured accuracy of dose warping depended on the evaluation end points. Volume-based DVH comparisons were not sensitive enough to illustrate all the detailed variations, while isodose assessment on a slice-by-slice basis could be tedious. Point-based evaluation was over-sensitive, showing up to 30% hot/cold-spot differences. When the 3mm/3% gamma analysis was adapted for the evaluation of dose warping, all three algorithms presented a reasonable level of equivalency, although one algorithm had over 10% of the voxels not meeting this criterion for the HN case, while another showed disagreement for the CSI case. Conclusion: Overall, our results demonstrated that evaluation based only on the performance of contour transformation cannot guarantee accuracy in dose warping, and the measured accuracy of dose warping depends on the evaluation methodology. As more DIR tools become available for clinical use, their performance can vary to a certain degree. A standard quality assurance criterion with clinical meaning, similar to the gamma index concept, should be established for DIR QA in the near future.
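The DICE coefficient used for the contour comparison above is straightforward to compute on voxelized contours; the voxel sets below are illustrative, not data from the study:

```python
def dice(a, b):
    """DICE coefficient 2|A ∩ B| / (|A| + |B|) between two contours
    represented as sets of voxel indices; 1.0 means identical."""
    if not a and not b:
        return 1.0
    return 2.0 * len(a & b) / (len(a) + len(b))

# Illustrative 10x10x3 reference contour and a warped copy shifted by
# one voxel along x, giving 90% overlap.
reference = {(x, y, z) for x in range(10) for y in range(10) for z in range(3)}
warped    = {(x, y, z) for x in range(1, 11) for y in range(10) for z in range(3)}
score = dice(reference, warped)
```

As the study notes, a high DICE score only certifies the contour transfer; it says nothing by itself about how accurately the dose was warped inside that contour.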
A Quantitative Study of Global Software Development Teams, Requirements, and Software Projects
ERIC Educational Resources Information Center
Parker, Linda L.
2016-01-01
The study explored the relationship between global software development teams, effective software requirements, and stakeholders' perception of successful software development projects within the field of information technology management. It examined the critical relationship between Global Software Development (GSD) teams creating effective…
NASA Technical Reports Server (NTRS)
Church, Victor E.; Long, D.; Hartenstein, Ray; Perez-Davila, Alfredo
1992-01-01
A set of functional requirements for software configuration management (CM) and metrics reporting for Space Station Freedom ground systems software are described. This report is one of a series from a study of the interfaces among the Ground Systems Development Environment (GSDE), the development systems for the Space Station Training Facility (SSTF) and the Space Station Control Center (SSCC), and the target systems for SSCC and SSTF. The focus is on the CM of the software following delivery to NASA and on the software metrics that relate to the quality and maintainability of the delivered software. The CM and metrics requirements address specific problems that occur in large-scale software development. Mechanisms to assist in the continuing improvement of mission operations software development are described.
Introduction to Software Packages. [Final Report.
ERIC Educational Resources Information Center
Frankel, Sheila, Ed.; And Others
This document provides an introduction to applications computer software packages that support functional managers in government and encourages the use of such packages as an alternative to in-house development. A review of current application areas includes budget/project management, financial management/accounting, payroll, personnel,…
Managing Automation: A Process, Not a Project.
ERIC Educational Resources Information Center
Hoffmann, Ellen
1988-01-01
Discussion of issues in management of library automation includes: (1) hardware, including systems growth and contracts; (2) software changes, vendor relations, local systems, and microcomputer software; (3) item and authority databases; (4) automation and library staff, organizational structure, and managing change; and (5) environmental issues,…
NIRP Core Software Suite v. 1.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whitener, Dustin Heath; Folz, Wesley; Vo, Duong
The NIRP Core Software Suite is a core set of code that supports multiple applications. It includes miscellaneous base code for data objects, mathematical equations, and user interface components, and the framework includes several fully-developed software applications that exist as stand-alone tools to complement other applications. The stand-alone tools are described below. Analyst Manager: an application to manage contact information for people (analysts) who use the software products; this information is often included in generated reports and may be used to identify the owners of calculations. Radionuclide Viewer: an application for viewing the DCFPAK radiological data; complements the Mixture Manager tool. Mixture Manager: an application to create and manage radionuclide mixtures that are commonly used in other applications. High Explosive Manager: an application to manage explosives and their properties. Chart Viewer: an application to view charts of data (e.g., meteorology charts). Other applications may use this framework to create charts specific to their data needs.
NASA Astrophysics Data System (ADS)
Gunawan, D.; Amalia, A.; Rahmat, R. F.; Muchtar, M. A.; Siregar, I.
2018-02-01
Identification of software maturity level is a technique to determine the quality of software. By identifying the software maturity level, the weaknesses of the software can be observed; the findings can then serve as a reference for future software maintenance and development. This paper discusses software Capability Level (CL) with a case study on the Quality Management Unit (Unit Manajemen Mutu) of the University of Sumatera Utara (UMM-USU). This research utilized the Standard CMMI Appraisal Method for Process Improvement class C (SCAMPI C) model with continuous representation. This model focuses on activities for developing quality products and services. The observation covers three process areas: Project Planning (PP), Project Monitoring and Control (PMC), and Requirements Management (REQM). According to the measurement of software capability level for the UMM-USU software, the capability level for the observed process areas is in the range of CL1 to CL2. Project Planning (PP) is the only process area which reaches capability level 2; meanwhile, PMC and REQM are still at CL1, the performed level. This research reveals several weaknesses of the existing UMM-USU software and therefore proposes several recommendations for UMM-USU to improve the capability level of the observed process areas.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Angel, L.K.; Bower, J.C.; Burnett, R.A.
1999-06-29
The Federal Emergency Management Information System (FEMIS) is an emergency management planning and response tool that was developed by the Pacific Northwest National Laboratory (PNNL) under the direction of the U.S. Army Chemical Biological Defense Command. The FEMIS System Administration Guide provides information necessary for the system administrator to maintain the FEMIS system. The FEMIS system is designed for a single Chemical Stockpile Emergency Preparedness Program (CSEPP) site that has multiple Emergency Operations Centers (EOCs). Each EOC has personal computers (PCs) that emergency planners and operations personnel use to do their jobs. These PCs are connected via a local area network (LAN) to servers that provide EOC-wide services. Each EOC is interconnected to other EOCs via a Wide Area Network (WAN). Thus, FEMIS is an integrated software product that resides on a client/server computer architecture. The main body of FEMIS software, referred to as the FEMIS Application Software, resides on the PC client(s) and is directly accessible to emergency management personnel. The remainder of the FEMIS software, referred to as the FEMIS Support Software, resides on the UNIX server. The Support Software provides the communication, data distribution, and notification functionality necessary to operate FEMIS in a networked, client/server environment.
Agile Methods: Selected DoD Management and Acquisition Concerns
2011-10-01
SIDRE: Software Intensive Innovative Development and Reengineering/Evolution. SLIM: Software Lifecycle Management - Estimate. SLOC: source lines of code. ... ISBN #0321502752. Coaching Agile Teams, Lyssa Adkins, ISBN #0321637704. Agile Project Management: Creating Innovative Products, 2nd ed., Jim ... Accessed July 13, 2011. [Highsmith 2009] Highsmith, J. Agile Project Management: Creating Innovative Products, 2nd ed. Addison-Wesley, 2009.
NASA Technical Reports Server (NTRS)
1998-01-01
Under an SBIR (Small Business Innovative Research) contract with Johnson Space Center, Knowledge Based Systems Inc. (KBSI) developed an intelligent software environment for modeling and analyzing mission planning activities, simulating behavior, and, using a unique constraint propagation mechanism, updating plans with each change in mission planning activities. KBSI developed this technology into a commercial product, PROJECTLINK, a two-way bridge between PROSIm, KBSI's process modeling and simulation software and leading project management software like Microsoft Project and Primavera's SureTrak Project Manager.
NASA Technical Reports Server (NTRS)
Clinedinst, Winston C.; Debure, Kelly R.; Dickson, Richard W.; Heaphy, William J.; Parks, Mark A.; Slominski, Christopher J.; Wolverton, David A.
1988-01-01
The Flight Management/Flight Controls (FM/FC) software for the Norden 2 (PDP-11/70M) computer installed on the NASA 737 aircraft is described. The software computes the navigation position estimates, guidance commands, and the commands to be issued to the control surfaces to direct the aircraft in flight, based on the modes selected on the Advanced Guidance Control System (AGCS) mode panel and the flight path selected via the Navigation Control/Display Unit (NCDU).
NASA Astrophysics Data System (ADS)
Arif Shah, Muhammad; Hashim, Rathiah; Shah, Adil Ali; Farooq Khattak, Umar
2016-11-01
Developing software through Global Software Development (GSD) has become very common in the software industry. Pakistan is one of the countries where projects are taken on and designed for clients from different countries, including Afghanistan. The purpose of this paper is to identify and analyze several communication barriers that can have a negative impact on a project, and to provide management guidelines for medium-size software organizations working in Pakistan with clients from Afghanistan, helping them overcome the communication barriers and challenges they face when coordinating with clients. Initially, we performed a literature review to identify different communication barriers and to check whether any standardized communication-management guidelines for medium-size software houses had been provided in the past. The second stage of the research develops guidelines from the vendor's perspective, drawing on interviews and focus-group discussions with different stakeholders and employees of software houses with clients from Afghanistan. Based on those interviews and discussions, we established communication-management guidelines to overcome the communication problems and barriers of working with clients from Afghanistan. As a result of the literature review, we identified that barriers such as cultural and language barriers were among the main reasons behind project failure, and we suggest that software organizations working in Pakistan follow certain defined communication guidelines to overcome communication barriers that affect the project directly.
NASA Technical Reports Server (NTRS)
Flora-Adams, Dana; Makihara, Jeanne; Benenyan, Zabel; Berner, Jeff; Kwok, Andrew
2007-01-01
Object Oriented Data Technology (OODT) is a software framework for creating a Web-based system for exchange of scientific data that are stored in diverse formats on computers at different sites under the management of scientific peers. OODT software consists of a set of cooperating, distributed peer components that provide distributed peer-to-peer (P2P) services that enable one peer to search and retrieve data managed by another peer. In effect, computers running OODT software at different locations become parts of an integrated data-management system.
Simulation modeling for the health care manager.
Kennedy, Michael H
2009-01-01
This article addresses the use of simulation software to solve administrative problems faced by health care managers. Spreadsheet add-ins, process simulation software, and discrete event simulation software are available at a range of costs and complexity. All use the Monte Carlo method to realistically integrate probability distributions into models of the health care environment. Problems typically addressed by health care simulation modeling are facility planning, resource allocation, staffing, patient flow and wait time, routing and transportation, supply chain management, and process improvement.
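The abstract notes that all three classes of tools rest on the Monte Carlo method. As a minimal, hypothetical sketch (not taken from the article), the following simulates patient waiting time at a single-server clinic by drawing exponential interarrival and service times; the distributions, parameters, and function name are illustrative assumptions:

```python
import random

def simulate_wait_times(n_patients, arrival_mean, service_mean, seed=0):
    """Monte Carlo sketch of a single-server clinic queue: draws
    exponential interarrival and service times and returns the
    mean patient wait (time units are arbitrary)."""
    rng = random.Random(seed)
    clock = 0.0        # arrival time of the current patient
    server_free = 0.0  # time at which the server next becomes free
    total_wait = 0.0
    for _ in range(n_patients):
        clock += rng.expovariate(1.0 / arrival_mean)
        start = max(clock, server_free)  # service begins when both are ready
        total_wait += start - clock      # time this patient spent waiting
        server_free = start + rng.expovariate(1.0 / service_mean)
    return total_wait / n_patients
```

Fixing the seed makes runs reproducible, which is how such models are typically validated before varying staffing or arrival-rate parameters.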
Project management in the development of scientific software
NASA Astrophysics Data System (ADS)
Platz, Jochen
1986-08-01
This contribution is a rough outline of a comprehensive project management model for the development of software for scientific applications. The model was tested in the unique environment of the Siemens AG Corporate Research and Technology Division. Its focal points are the structuring of project content - the so-called phase organization, the project organization and the planning model used, and its particular applicability to innovative projects. The outline focuses largely on actual project management aspects rather than associated software engineering measures.
Use of a Wiki-Based Software to Manage Research Group Activities
ERIC Educational Resources Information Center
Wang, Ting; Vezenov, Dmitri V.; Simboli, Brian
2014-01-01
This paper discusses use of the wiki software Confluence to organize research group activities and lab resources. Confluence can serve as an electronic lab notebook (ELN), as well as an information management and collaboration tool. The article provides a case study in how researchers can use wiki software in "home-grown" fashion to…
Software for Intelligent System Health Management (ISHM)
NASA Technical Reports Server (NTRS)
Trevino, Luis C.
2004-01-01
The slide presentation is a briefing in four areas: overview of health management paradigms; overview of the ARC-Houston Software Engineering Technology Workshop held on April 20-22, 2004; identified technologies relevant to technical themes of intelligent system health management; and the author's thoughts on these topics.
Multimedia Delivery of Coastal Zone Management Training.
ERIC Educational Resources Information Center
Clark, M. J.; And Others
1995-01-01
Describes Coastal Zone Management (CZM) multimedia course modules, educational software written by the GeoData Institute at the University of Southampton for an environmental management undergraduate course. Examines five elements that converge to create CZM multimedia teaching: course content, source material, a hardware/software delivery system,…
Software Engineering Education Directory
1990-04-01
and Engineering (CMSC 735). Codes: GPEV2. Textbooks: IEEE Tutorial on Models and Metrics for Software Management and Engineering by Basili, Victor R. ... Software Engineering (Comp 227). Codes: GPRY5. Textbooks: IEEE Tutorial on Software Design Techniques by Freeman, Peter and Wasserman, Anthony I. Software
Building quality into medical product software design.
Mallory, S R
1993-01-01
The software engineering and quality assurance disciplines are a requisite to the design of safe and effective software-based medical devices. It is in the areas of software methodology and process that the most beneficial application of these disciplines to software development can be made. Software is a product of complex operations and methodologies and is not amenable to the traditional electromechanical quality assurance processes. Software quality must be built in by the developers, with the software verification and validation engineers acting as the independent instruments for ensuring compliance with performance objectives and with development and maintenance standards. The implementation of a software quality assurance program is a complex process involving management support, organizational changes, and new skill sets, but the benefits are profound. Its rewards provide safe, reliable, cost-effective, maintainable, and manageable software, which may significantly speed the regulatory review process and therefore potentially shorten the overall time to market. The use of a trial project can greatly facilitate the learning process associated with the first-time application of a software quality assurance program.
Software architecture of the Magdalena Ridge Observatory Interferometer
NASA Astrophysics Data System (ADS)
Farris, Allen; Klinglesmith, Dan; Seamons, John; Torres, Nicolas; Buscher, David; Young, John
2010-07-01
Merging software from 36 independent work packages into a coherent, unified software system with a lifespan of twenty years is the challenge faced by the Magdalena Ridge Observatory Interferometer (MROI). We solve this problem by using standardized interface software automatically generated from simple high-level descriptions of these systems, relying only on Linux, GNU, and POSIX without complex software such as CORBA. This approach, based on gigabit Ethernet with a TCP/IP protocol, provides the flexibility to integrate and manage diverse, independent systems using a centralized supervisory system that provides a database manager, data collectors, fault handling, and an operator interface.
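The idea of generating standardized interface code from simple high-level descriptions can be illustrated with a small, hypothetical sketch. The description format (a mapping of command names to docstrings) and the `generate_stub` helper are inventions for illustration, not the MROI generator itself:

```python
def generate_stub(system: str, commands: dict) -> str:
    """Emit Python source for a command-interface class from a
    high-level description mapping command names to docstrings
    (a hypothetical description format)."""
    lines = [f"class {system}Interface:"]
    if not commands:
        lines.append("    pass")
    for name, doc in commands.items():
        lines.append(f"    def {name}(self):")
        lines.append(f'        """{doc}"""')
        lines.append("        raise NotImplementedError")
    return "\n".join(lines)
```

Generating every subsystem's interface from one description keeps the 36 work packages consistent by construction, which is the design choice the abstract credits for avoiding heavier middleware.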
Training survey -- educational profile for Hanford HANDI 2000 project
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilson, D.
Fluor Daniel Hanford, Inc. (FDH) is currently adopting streamlined business processes through integrated software solutions. Replacing the legacy software (current/replacement systems, attached) also avoids significant maintenance required to resolve Year 2000 issues. This initiative, which encompasses all the system replacements that will occur, is referred to as "HANDI 2000." The software being implemented in the first phase of the project includes Indus International's PASSPORT software, PeopleSoft, and Primavera P3 software. The PASSPORT applications being implemented are Inventory Management, Purchasing, Contract Management, Accounts Payable, and MSDS (Material Safety Data Sheets).
Securing Ground Data System Applications for Space Operations
NASA Technical Reports Server (NTRS)
Pajevski, Michael J.; Tso, Kam S.; Johnson, Bryan
2014-01-01
The increasing prevalence and sophistication of cyber attacks has prompted the Multimission Ground Systems and Services (MGSS) Program Office at Jet Propulsion Laboratory (JPL) to initiate the Common Access Manager (CAM) effort to protect software applications used in Ground Data Systems (GDSs) at JPL and other NASA Centers. The CAM software provides centralized services and software components used by GDS subsystems to meet access control requirements and ensure data integrity, confidentiality, and availability. In this paper we describe the CAM software; examples of its integration with spacecraft commanding software applications and an information management service; and measurements of its performance and reliability.
Proceedings of Tenth Annual Software Engineering Workshop
NASA Technical Reports Server (NTRS)
1985-01-01
Papers are presented on the following topics: measurement of software technology, recent studies of the Software Engineering Lab, software management tools, expert systems, error seeding as a program validation technique, software quality assurance, software engineering environments (including knowledge-based environments), the Distributed Computing Design System, and various Ada experiments.
For operation of the Computer Software Management and Information Center (COSMIC)
NASA Technical Reports Server (NTRS)
Carmon, J. L.
1983-01-01
Computer programs for relational information management data base systems, spherical roller bearing analysis, a generalized pseudoinverse of a rectangular matrix, and software design and documentation language are summarized.
ERIC Educational Resources Information Center
Milshtein, Amy
1998-01-01
Discusses how to avoid costly mistakes when buying facility-management software. Provides answers to questions buyers should ask before committing funds to a particular program. Selected facility-management software companies and product profiles are highlighted. (GR)
An approach to software cost estimation
NASA Technical Reports Server (NTRS)
Mcgarry, F.; Page, J.; Card, D.; Rohleder, M.; Church, V.
1984-01-01
A general procedure for software cost estimation in any environment is outlined. The basic concepts of work and effort estimation are explained, some popular resource estimation models are reviewed, and the accuracy of source estimates is discussed. A software cost prediction procedure based on the experiences of the Software Engineering Laboratory in the flight dynamics area and incorporating management expertise, cost models, and historical data is described. The sources of information and relevant parameters available during each phase of the software life cycle are identified. The methodology suggested incorporates these elements into a customized management tool for software cost prediction. Detailed guidelines for estimation in the flight dynamics environment developed using this methodology are presented.
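Power-law effort models of the kind reviewed here relate software size to effort. A hedged sketch follows; the default coefficients are the Basic COCOMO organic-mode values, used purely for illustration, since the SEL-calibrated parameters for the flight dynamics environment are not given in this abstract:

```python
def estimate_effort(kloc: float, a: float = 2.4, b: float = 1.05) -> float:
    """Effort in person-months from size in thousands of lines of
    code (KLOC), using a power-law resource model. The defaults are
    the Basic COCOMO organic-mode coefficients, shown only as an
    example of the model family; a calibrated environment would
    substitute its own historical values of a and b."""
    return a * kloc ** b
```

Because the exponent exceeds 1, doubling size more than doubles predicted effort, which is the diseconomy of scale such models encode.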
Organizational management practices for achieving software process improvement
NASA Technical Reports Server (NTRS)
Kandt, Ronald Kirk
2004-01-01
The crisis in developing software has been known for over thirty years. Problems that existed in developing software in the early days of computing still exist today. These problems include the delivery of low-quality products, actual development costs that exceed expected development costs, and actual development time that exceeds expected development time. Several solutions have been offered to overcome our inability to deliver high-quality software on time and within budget. One of these solutions involves software process improvement. However, such efforts often fail because of organizational management issues. This paper discusses business practices that organizations should follow to improve their chances of initiating and sustaining successful software process improvement efforts.
Selecting Software for Libraries.
ERIC Educational Resources Information Center
Beiser, Karl
1993-01-01
Discusses resources and strategies that libraries can use to evaluate competing database management software for purchase. Needs assessments, types of software available, features of good software, evaluation aids, shareware, and marketing and product trends are covered. (KRN)
A Brief Study of Software Engineering Professional Continuing Education in DoD Acquisition
2010-04-01
Lifecycle Processes (IEEE 12207) (810): 37% / 61% / 2%. Guide to the Software Engineering Body of Knowledge (SWEBOK) (804): 67% / 31% / 2%. Software Engineering - Software Measurement Process (ISO/IEC 15939) (797): 55% / 44% / 2%. Capability Maturity Model Integration (806): 17% / 81% / 2%. Six Sigma Process Improvement (804): 7% / 91% / 1%. ISO 9000 Quality Management Systems (803): 10% / 89% / 1%. Conclusions: significant problem areas; Requirements Management: Very
1979-08-21
Appendix - Outline and Draft Material for Proposed Triservice Interim Guideline on Application of Software Acceptance Criteria ... Appendix 9 ... OUTLINE AND DRAFT MATERIAL FOR PROPOSED TRISERVICE INTERIM GUIDELINE ON APPLICATION OF SOFTWARE ACCEPTANCE CRITERIA. INTRODUCTION: The purpose of this guide ... contract item (CPCI) (code); 5. CPCI test plan; 6. CPCI test procedures; 7. CPCI test report; 8. Handbooks and manuals. Although additional material does
Models and metrics for software management and engineering
NASA Technical Reports Server (NTRS)
Basili, V. R.
1988-01-01
This paper attempts to characterize and present a state-of-the-art view of several quantitative models and metrics of the software life cycle. These models and metrics can be used to aid in managing and engineering software projects. They deal with various aspects of the software process and product, including resource allocation and estimation, changes and errors, size, complexity, and reliability. Some indication is given of the extent to which the various models have been used and the success they have achieved.
Seven Processes that Enable NASA Software Engineering Technologies
NASA Technical Reports Server (NTRS)
Housch, Helen; Godfrey, Sally
2011-01-01
This slide presentation reviews seven processes that NASA uses to ensure that software is developed, acquired, and maintained as specified in the NPR 7150.2A requirement. The requirement is to ensure that all software be appraised for the Capability Maturity Model Integration (CMMI). The enumerated processes are: (7) Product Integration, (6) Configuration Management, (5) Verification, (4) Software Assurance, (3) Measurement and Analysis, (2) Requirements Management, and (1) Planning & Monitoring. Each of these is described, along with the group(s) responsible for it.
Increase Return on Investment of Software Development Life Cycle by Managing the Risk - A Case Study
2015-04-01
for increasing the return on investment during the Software Development Life Cycle (SDLC) through selected quantitative analyses employing both the ... defect rate, return on investment (ROI), software development life cycle (SDLC) ... becomes comfortable due to its intricacies and learning cycle. The same may be said with respect to software development life cycle (SDLC) management
2009-02-01
management, available at <http://www.iso.org/iso/en/CatalogueDetailPage.CatalogueDetail?CSNUMBER=39612&ICS1=35&ICS2=40&ICS3=>. ISO/IEC 27001. Information ... Management of the Systems Engineering Process. [ISO/IEC 27001] ISO/IEC 27001:2005. Information technology -- Security techniques -- Information security ... software life cycles [ISO/IEC 15026]. Software assurance is a key element of national security and homeland security. It is critical because dramatic
The determination of measures of software reliability
NASA Technical Reports Server (NTRS)
Maxwell, F. D.; Corn, B. C.
1978-01-01
Measurement of software reliability was carried out during the development of data base software for a multi-sensor tracking system. The failure ratio and failure rate were found to be consistent measures. Trend lines could be established from these measurements that provide good visualization of the progress on the job as a whole as well as on individual modules. Over one-half of the observed failures were due to factors associated with the individual run submission rather than with the code proper. Possible application of these findings for line management, project managers, functional management, and regulatory agencies is discussed. Steps for simplifying the measurement process and for use of these data in predicting operational software reliability are outlined.
NASA Technical Reports Server (NTRS)
Wolverton, David A.; Dickson, Richard W.; Clinedinst, Winston C.; Slominski, Christopher J.
1993-01-01
The flight software developed for the Flight Management/Flight Controls (FM/FC) MicroVAX computer used on the Transport Systems Research Vehicle for Advanced Transport Operating Systems (ATOPS) research is described. The FM/FC software computes navigation position estimates, guidance commands, and those commands issued to the control surfaces to direct the aircraft in flight. Various modes of flight are provided for, ranging from computer assisted manual modes to fully automatic modes including automatic landing. A high-level system overview as well as a description of each software module comprising the system is provided. Digital systems diagrams are included for each major flight control component and selected flight management functions.
SSCR Automated Manager (SAM) release 1. 1 reference manual
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1988-10-01
This manual provides instructions for using the SSCR Automated Manager (SAM) to manage System Software Change Records (SSCRs) online. SSCRs are forms required to document all system software changes for the Martin Marietta Energy Systems, Inc., Central computer systems. SAM, a program developed at Energy Systems, is accessed through IDMS/R (Integrated Database Management System) on an IBM system.
Unisys' experience in software quality and productivity management of an existing system
NASA Technical Reports Server (NTRS)
Munson, John B.
1988-01-01
A summary of Quality Improvement techniques, implementation, and results in the maintenance, management, and modification of large software systems for the Space Shuttle Program's ground-based systems is provided.
For operation of the Computer Software Management and Information Center (COSMIC)
NASA Technical Reports Server (NTRS)
Carmon, J. L.
1983-01-01
Progress report on current status of computer software management and information center (COSMIC) includes the following areas: inventory, evaluation and publication, marketing, customer service, maintenance and support, and budget summary.
Software Reviews Since Acquisition Reform - The Artifact Perspective
2004-01-01
Risk Management: OLD vs. NEW. Acquisition of Software Intensive Systems, 2004 - Peter Hantos. Single, basic software paradigm; single processor; low ... software risk mitigation related trade-offs must be done together. Integral Software Engineering Activities; Process Maturity and Quality Frameworks; Quality
A Strategy for Improved System Assurance
2007-06-20
Quality (Measurements, Life Cycle, Safety, Security & Others): ISO/IEC 12207, Software Life Cycle Processes; ISO 9001, Quality Management System ... 14598, Software Product Evaluation. Related: ISO/IEC 90003, Guidelines for the Application of ISO 9001:2000 to Computer Software; IEEE 12207, Industry ... Implementation of International Standard ISO/IEC 12207; IEEE 1220, Standard for Application and Management of the System Engineering Process. Use in
Modular Filter and Source-Management Upgrade of RADAC
NASA Technical Reports Server (NTRS)
Lanzi, R. James; Smith, Donna C.
2007-01-01
In an upgrade of the Range Data Acquisition Computer (RADAC) software, a modular software object library was developed to implement required functionality for filtering of flight-vehicle-tracking data and management of tracking-data sources. (The RADAC software is used to process flight-vehicle metric data for realtime display in the Wallops Flight Facility Range Control Center and Mobile Control Center.)
ERIC Educational Resources Information Center
Bodén, Linnea
2013-01-01
An increasing number of Swedish municipalities use digital software to manage the registration of students' school absences. The software is regarded as a problem-solving tool to make registration more efficient, but its effects on the educational setting have been largely neglected. Focusing on an event with two students from a class of…
Architecture independent environment for developing engineering software on MIMD computers
NASA Technical Reports Server (NTRS)
Valimohamed, Karim A.; Lopez, L. A.
1990-01-01
Engineers are constantly faced with solving problems of increasing complexity and detail. Multiple Instruction stream Multiple Data stream (MIMD) computers have been developed to overcome the performance limitations of serial computers. The hardware architectures of MIMD computers vary considerably and are much more sophisticated than serial computers. Developing large scale software for a variety of MIMD computers is difficult and expensive. There is a need to provide tools that facilitate programming these machines. First, the issues that must be considered to develop those tools are examined. The two main areas of concern were architecture independence and data management. Architecture independent software facilitates software portability and improves the longevity and utility of the software product. It provides some form of insurance for the investment of time and effort that goes into developing the software. The management of data is a crucial aspect of solving large engineering problems. It must be considered in light of the new hardware organizations that are available. Second, the functional design and implementation of a software environment that facilitates developing architecture independent software for large engineering applications are described. The topics of discussion include: a description of the model that supports the development of architecture independent software; identifying and exploiting concurrency within the application program; data coherence; engineering data base and memory management.
NASA Software Documentation Standard
NASA Technical Reports Server (NTRS)
1991-01-01
The NASA Software Documentation Standard (hereinafter referred to as "Standard") is designed to support the documentation of all software developed for NASA; its goal is to provide a framework and model for recording the essential information needed throughout the development life cycle and maintenance of a software system. The NASA Software Documentation Standard can be applied to the documentation of all NASA software. The Standard is limited to documentation format and content requirements. It does not mandate specific management, engineering, or assurance standards or techniques. This Standard defines the format and content of documentation for software acquisition, development, and sustaining engineering. Format requirements address where information shall be recorded and content requirements address what information shall be recorded. This Standard provides a framework to allow consistency of documentation across NASA and visibility into the completeness of project documentation. The basic framework consists of four major sections (or volumes). The Management Plan contains all planning and business aspects of a software project, including engineering and assurance planning. The Product Specification contains all technical engineering information, including software requirements and design. The Assurance and Test Procedures contains all technical assurance information, including Test, Quality Assurance (QA), and Verification and Validation (V&V). The Management, Engineering, and Assurance Reports is the library and/or listing of all project reports.
NASA Technical Reports Server (NTRS)
Khambatta, Cyrus F.
2007-01-01
A technique for automated development of scenarios for use in the Multi-Center Traffic Management Advisor (McTMA) software simulations is described. The resulting software is designed and implemented to automate the generation of simulation scenarios with the intent of reducing the time it currently takes using an observational approach. The software program is effective in achieving this goal. The scenarios created for use in the McTMA simulations are based on data taken from data files from the McTMA system and were manually edited before incorporation into the simulations to ensure accuracy. Despite the software's overall favorable performance, several key software issues are identified. Proposed solutions to these issues are discussed. Future enhancements to the scenario generator software may address the limitations identified in this paper.
3DVEM Software Modules for Efficient Management of Point Clouds and Photorealistic 3d Models
NASA Astrophysics Data System (ADS)
Fabado, S.; Seguí, A. E.; Cabrelles, M.; Navarro, S.; García-De-San-Miguel, D.; Lerma, J. L.
2013-07-01
Cultural heritage managers in general, and information users in particular, are not usually accustomed to dealing with high-technology hardware and software. On the contrary, information providers of metric surveys are most of the time applying the latest developments to real-life conservation and restoration projects. This paper addresses the software issue of handling and managing either 3D point clouds or (photorealistic) 3D models to bridge the gap between information users and information providers as regards the management of information that users and providers share as a tool for decision-making, analysis, visualization, and management. There are not many viewers specifically designed to handle, manage, and easily create animations of architectural and/or archaeological 3D objects, monuments, and sites, among others. 3DVEM - 3D Viewer, Editor & Meter software will be introduced to the scientific community, as well as 3DVEM - Live and 3DVEM - Register. The advantages of managing projects with both sets of data, 3D point clouds and photorealistic 3D models, will be introduced. Different visualizations of true documentation projects in the fields of architecture, archaeology, and industry will be presented. Emphasis will be placed on highlighting the features of the new user-friendly software for managing virtual projects. Furthermore, the ease of creating controlled interactive animations (both walk-through and fly-through) by the user, either on-the-fly or as a traditional movie file, will be demonstrated through 3DVEM - Live.
Detailed requirements document for common software of shuttle program information management system
NASA Technical Reports Server (NTRS)
Everette, J. M.; Bradfield, L. D.; Horton, C. L.
1975-01-01
Common software was investigated as a method for minimizing development and maintenance costs of the shuttle program information management system (SPIMS) applications while reducing the time frame of their development. Those requirements satisfying these criteria are presented, along with the stand-alone modules which may be used directly by applications. The SPIMS applications, operating on the CYBER 74 computer, are specialized information management systems which use System 2000 as a data base manager. Common software provides the features to support user interactions on a CRT terminal using form input and command response capabilities. These features are available as subroutines to the applications.
Evaluating foodservice software: a suggested approach.
Fowler, K D
1986-09-01
In an era of cost containment, the computer has become a viable management tool. Its use in health care has demonstrated accelerated growth in recent years, and a literature review supports an increased trend in this direction. Foodservice, which is a major cost center, is no exception to this predicted trend. Because software has proliferated, foodservice managers and dietitians are experiencing growing concern about how to evaluate the numerous software packages from which to choose. A suggested approach to evaluating software is offered to dietitians and managers alike to lessen the confusion in software selection and to improve post-purchase satisfaction with the system. Steps of the software evaluation approach include: delineation of goals, assessment of needs, assignment of value weight factors, development of a vendor checklist, survey of vendors by means of the vendor checklist and elimination of inappropriate systems, thorough development of the request for proposal (RFP) for submission to the selected vendors, analysis of the returned RFPs in terms of system features and cost factors, and selection of the system(s) for implementation.
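The value-weight-factor step amounts to a weighted sum over checklist criteria. The sketch below is hypothetical; the criterion names, weights, and rating scale are invented examples, not taken from the article:

```python
def score_vendor(weights, ratings):
    """Weighted score for one vendor: sum of weight * rating over
    the checklist criteria (hypothetical criteria and weights)."""
    return sum(w * ratings.get(criterion, 0) for criterion, w in weights.items())

# Illustrative checklist: weights sum to 1.0; ratings on a 0-5 scale.
weights = {"recipe_costing": 0.4, "inventory": 0.35, "support": 0.25}
vendor_a = {"recipe_costing": 4, "inventory": 3, "support": 5}
```

Ranking vendors by this score, then weighing the result against the cost factors from the returned RFPs, mirrors the analysis step the article proposes.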
Master Pump Shutdown MPS Software Quality Assurance Plan (SQAP)
DOE Office of Scientific and Technical Information (OSTI.GOV)
BEVINS, R.R.
2000-09-20
The MPS Software Quality Assurance Plan (SQAP) describes the tools and strategy used in the development of the MPS software. The document also describes the methodology for controlling and managing changes to the software.
Is Dose Deformation–Invariance Hypothesis Verified in Prostate IGRT?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simon, Antoine, E-mail: antoine.simon@univ-rennes1.fr; Laboratoire Traitement du Signal et de l'Image, Université de Rennes 1, 35000 Rennes; Le Maitre, Amandine
Purpose: To assess dose uncertainties resulting from the dose deformation–invariance hypothesis in prostate cone beam computed tomography (CT)–based image guided radiation therapy (IGRT), namely to evaluate whether rigidly propagated planned dose distribution enables good estimation of fraction dose distributions. Methods and Materials: Twenty patients underwent a CT scan for planning intensity modulated radiation therapy–IGRT delivering 80 Gy to the prostate, followed by weekly CT scans. Two methods were used to obtain the dose distributions on the weekly CT scans: (1) recalculating the dose using the original treatment plan; and (2) rigidly propagating the planned dose distribution. The cumulative doses were then estimated in the organs at risk for each dose distribution by deformable image registration. The differences between recalculated and propagated doses were finally calculated for the fraction and the cumulative dose distributions, by use of per-voxel and dose-volume histogram (DVH) metrics. Results: For the fraction dose, the mean per-voxel absolute dose difference was <1 Gy for 98% and 95% of the fractions for the rectum and bladder, respectively. The maximum dose difference within 1 voxel reached, however, 7.4 Gy in the bladder and 8.0 Gy in the rectum. The mean dose differences were correlated with gas volume for the rectum and patient external contour variations for the bladder. The mean absolute differences for the considered volume receiving greater than or equal to dose x (V{sub x}) of the DVH were between 0.37% and 0.70% for the rectum and between 0.53% and 1.22% for the bladder. For the cumulative dose, the mean differences in the DVH were between 0.23% and 1.11% for the rectum and between 0.55% and 1.66% for the bladder. The largest dose difference was 6.86%, for bladder V{sub 80Gy}. The mean dose differences were <1.1 Gy for the rectum and <1 Gy for the bladder.
Conclusions: The deformation–invariance hypothesis was corroborated for the organs at risk in prostate IGRT except in cases of a large disappearance or appearance of rectal gas for the rectum and large external contour variations for the bladder.
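The per-voxel and DVH comparisons described in this abstract reduce to two small computations: an element-wise dose difference over a common voxel grid, and a cumulative DVH giving V_x, the fraction of a structure receiving at least dose x. A minimal sketch with NumPy follows; the array sizes, dose values, and function names are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def cumulative_dvh(dose, dose_levels):
    """Cumulative DVH: fraction of structure volume receiving >= each level (V_x)."""
    dose = np.asarray(dose, dtype=float).ravel()
    return np.array([(dose >= x).mean() for x in dose_levels])

def per_voxel_metrics(recalculated, propagated):
    """Mean and maximum absolute per-voxel dose difference between two
    dose distributions sampled on the same voxel grid."""
    diff = np.abs(np.asarray(recalculated) - np.asarray(propagated))
    return diff.mean(), diff.max()

# Toy fraction dose: a uniform 2 Gy distribution vs. a rigidly propagated
# distribution perturbed by small random deviations (illustrative only).
rng = np.random.default_rng(0)
recalculated = np.full(1000, 2.0)
propagated = recalculated + rng.normal(0.0, 0.05, size=1000)

mean_diff, max_diff = per_voxel_metrics(recalculated, propagated)
vx = cumulative_dvh(recalculated, dose_levels=[1.0, 2.0, 3.0])  # V_1Gy, V_2Gy, V_3Gy
```

The same two functions apply unchanged to fraction doses and to accumulated doses, which is what makes the per-voxel and DVH-level comparisons in the study directly comparable.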
DOE Office of Scientific and Technical Information (OSTI.GOV)
Velec, Michael; Haddad, Carol R.; Craig, Tim
Purpose: To identify risk factors associated with a decline in liver function after stereotactic body radiation therapy (SBRT) for hepatocellular carcinoma. Methods and Materials: Data were analyzed from patients with hepatocellular carcinoma treated on clinical trials of 6-fraction SBRT. Liver toxicity was defined as an increase in Child-Pugh (CP) score ≥2 three months after SBRT. Clinical factors, SBRT details, and liver dose-volume histogram (DVH) parameters were tested for association with toxicity using logistic regression. CP class B patients were analyzed separately. Results: Among CP class A patients, 101 were evaluable, with a baseline score of A5 (72%) or A6 (28%). Fifty-three percent had portal vein thrombus. The median liver volume was 1286 cc (range, 766-3967 cc), and the median prescribed dose was 36 Gy (range, 27-54 Gy). Toxicity was seen in 26 patients (26%). Thrombus, baseline CP of A6, and lower platelet count were associated with toxicity on univariate analysis, as were several liver DVH-based parameters. Absolute and spared liver volumes were not significant. On multivariate analysis for CP class A patients, significant associations were found for baseline CP score of A6 (odds ratio [OR], 4.85), lower platelet count (OR, 0.90; median, 108 × 10{sup 9}/L vs 150 × 10{sup 9}/L), higher mean liver dose (OR, 1.33; median, 16.9 Gy vs 14.7 Gy), and higher dose to 800 cc of liver (OR, 1.11; median, 14.3 Gy vs 6.0 Gy). With 13 CP-B7 patients included or when dose to 800 cc of liver was replaced with other DVH parameters (eg, dose to 700 or 900 cc of liver) in the multivariate analysis, effective volume and portal vein thrombus were associated with an increased risk. Conclusions: Baseline CP scores and higher liver doses (eg, mean dose, effective volume, doses to 700-900 cc) were strongly associated with liver function decline 3 months after SBRT.
A lower baseline platelet count and portal vein thrombus were also associated with an increased risk.
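The dose-toxicity association reported above comes from logistic regression, where the odds ratio per unit of a predictor is exp(slope). The sketch below illustrates the idea on synthetic data (not the study's cohort) using a plain gradient-descent fit in NumPy; every name and value here is an illustrative assumption:

```python
import numpy as np

def fit_logistic(x, y, lr=0.05, steps=4000):
    """Univariate logistic regression via gradient descent on centered x.
    Returns the slope; the odds ratio per unit of x is exp(slope)."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    xc = x - x.mean()  # centering keeps the two gradient components well conditioned
    b0 = b1 = 0.0
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(b0 + b1 * xc)))  # predicted toxicity probability
        b0 -= lr * (p - y).mean()
        b1 -= lr * ((p - y) * xc).mean()
    return b1

# Synthetic cohort: toxicity probability rises with mean liver dose (Gy).
rng = np.random.default_rng(1)
dose = rng.uniform(10.0, 22.0, size=200)
p_true = 1.0 / (1.0 + np.exp(-(-6.0 + 0.3 * dose)))
toxicity = rng.uniform(size=200) < p_true

slope = fit_logistic(dose, toxicity)
odds_ratio = np.exp(slope)  # >1 means each extra Gy raises the odds of toxicity
```

In practice a statistics package (e.g. statsmodels or R's glm) would be used for the multivariate fit with confidence intervals; the hand-rolled version only shows where an OR such as 1.33 per Gy of mean liver dose comes from.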
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, W; Patel, S; Shen, J
Purpose: Lack of plan robustness may contribute to local failure in volumetric-modulated arc therapy (VMAT) to treat head and neck (H&N) cancer. Thus we compared plan robustness of VMAT with intensity-modulated radiation therapy (IMRT). Methods: VMAT and IMRT plans were created for 9 H&N cancer patients. For each plan, six new perturbed dose distributions were computed — one each for ±3 mm setup deviations along the S-I, A-P and L-R directions. We used three robustness quantification tools: (1) worst-case analysis (WCA); (2) dose-volume histogram (DVH) band (DVHB); and (3) root-mean-square dose deviation (RMSD) volume histogram (DDVH). The DDVH plots relative volume (y) on the vertical axis against RMSD (x) on the horizontal axis; analogous to a DVH, it indicates that y% of the volume of the structure has an RMSD of at least x Gy[RBE]. The width from the first two methods at different target DVH indices (such as D95 and D5) and the area under the DDVH curve (AUC) for the target were used to indicate plan robustness. For all of these robustness quantification tools, the smaller the value, the more robust the plan. Plan robustness evaluation metrics were compared using the Wilcoxon test. Results: DVHB showed the width at D95 from IMRT to be larger than from VMAT (unit Gy) [1.59 vs 1.18 (p=0.49)], while the width at D5 from IMRT was found to be slightly larger than from VMAT [0.59 vs 0.54 (p=0.84)]. WCA showed similar results [D95: 3.28 vs 3.00 (p=0.56); D5: 1.68 vs 1.95 (p=0.23)]. DDVH showed the AUC from IMRT to be slightly smaller than from VMAT [1.13 vs 1.15 (p=0.43)]. Conclusion: VMAT plan robustness is comparable to IMRT plan robustness. The plan robustness conclusions from WCA and DVHB are DVH-parameter dependent; DDVH, on the other hand, captures the overall effect of uncertainties on the dose to a volume of interest. NIH/NCI K25CA168984; Eagles Cancer Research Career Development; The Lawrence W. and Marilyn W.
Matteson Fund for Cancer Research; Mayo ASU Seed Grant; The Kemper Marley Foundation.
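The DDVH defined in this abstract can be computed directly: for each voxel, take the root-mean-square dose deviation across the perturbed dose distributions, then record the volume fraction whose RMSD is at least x for a range of x; the area under that curve summarizes robustness (smaller is more robust). A minimal sketch, with the six perturbation scenarios modeled as random dose noise rather than actual ±3 mm setup-shift recalculations:

```python
import numpy as np

def rmsd_per_voxel(nominal, perturbed):
    """Root-mean-square dose deviation of each voxel across the perturbed
    dose distributions (one row per perturbation scenario)."""
    diffs = np.asarray(perturbed, dtype=float) - np.asarray(nominal, dtype=float)
    return np.sqrt((diffs ** 2).mean(axis=0))

def ddvh(rmsd, levels):
    """DDVH: fraction of voxels whose RMSD is at least each level."""
    rmsd = np.asarray(rmsd, dtype=float)
    return np.array([(rmsd >= x).mean() for x in levels])

# Toy target: 60 Gy nominal dose; six perturbation scenarios modeled as
# random dose noise (stand-ins for the +/-3 mm setup-shift recalculations).
rng = np.random.default_rng(2)
nominal = np.full(500, 60.0)
perturbed = nominal + rng.normal(0.0, 1.0, size=(6, 500))

rmsd = rmsd_per_voxel(nominal, perturbed)
levels = np.linspace(0.0, 3.0, 31)
curve = ddvh(rmsd, levels)
# Area under the DDVH curve via the trapezoidal rule: smaller = more robust.
auc = float(np.sum((curve[1:] + curve[:-1]) / 2.0 * np.diff(levels)))
```

Because the DDVH aggregates over all voxels and all perturbations, a single AUC per structure avoids the DVH-parameter dependence the authors note for WCA and DVHB.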
Software Quality Perceptions of Stakeholders Involved in the Software Development Process
ERIC Educational Resources Information Center
Padmanabhan, Priya
2013-01-01
Software quality is one of the primary determinants of project management success. Stakeholders involved in software development widely agree that quality is important (Barney and Wohlin 2009). However, they may differ on what constitutes software quality, and which of its attributes are more important than others. Although, software quality…
Spacelab software development and integration concepts study report. Volume 2: Appendices
NASA Technical Reports Server (NTRS)
1973-01-01
Software considerations were developed for incorporation in the spacelab systems design, and include management concepts for top-down structured programming, composite designs for modular programs, and team management methods for production programming.
Managing Written Directives: A Software Solution to Streamline Workflow.
Wagner, Robert H; Savir-Baruch, Bital; Gabriel, Medhat S; Halama, James R; Bova, Davide
2017-06-01
A written directive is required by the U.S. Nuclear Regulatory Commission for any use of 131 I above 1.11 MBq (30 μCi) and for patients receiving radiopharmaceutical therapy. This requirement has also been adopted and must be enforced by the agreement states. As the introduction of new radiopharmaceuticals increases therapeutic options in nuclear medicine, time spent on regulatory paperwork also increases. The pressure of managing these time-consuming regulatory requirements may heighten the potential for inaccurate or incomplete directive data and subsequent regulatory violations. To improve on the paper-trail method of directive management, we created a software tool using a Health Insurance Portability and Accountability Act (HIPAA)-compliant database. This software allows for secure data-sharing among physicians, technologists, and managers while saving time, reducing errors, and eliminating the possibility of loss and duplication. Methods: The software tool was developed using Visual Basic, which is part of the Visual Studio development environment for the Windows platform. Patient data are deposited in an Access database on a local HIPAA-compliant secure server or hard disk. Once a working version had been developed, it was installed at our institution and used to manage directives. Updates and modifications of the software were released regularly until no more significant problems were found with its operation. Results: The software has been used at our institution for over 2 y and has reliably kept track of all directives. All physicians and technologists use the software daily and find it superior to paper directives. They can retrieve active directives at any stage of completion, as well as completed directives. Conclusion: We have developed a software solution for the management of written directives that streamlines and structures the departmental workflow. 
This solution saves time, centralizes the information for all staff to share, and decreases confusion about the creation, completion, filing, and retrieval of directives. © 2017 by the Society of Nuclear Medicine and Molecular Imaging.
Production roll out plan for HANDI 2000 business management system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adams, D.E.
The Hanford Data Integration 2000 (HANDI 2000) Project will result in an integrated and comprehensive set of functional applications containing core information necessary to support the Project Hanford Management Contract (PHMC). It is based on the Commercial-Off-The-Shelf (COTS) product solution with commercially proven business processes. The COTS product solution set, of Passport (PP) and PeopleSoft (PS) software, supports finance, supply, human resources, and payroll activities under the current PHMC direction. The PP software is an integrated application for Accounts Payable, Contract Management, Inventory Management, Purchasing and Material Safety Data Sheets (MSDS). The PS software is an integrated application for Projects, General Ledger, Human Resources Training, Payroll, and Base Benefits. This set of software constitutes the Business Management System (BMS) and MSDS, a subset of the HANDI 2000 suite of systems. The primary objective of the Production Roll Out Plan is to communicate the methods and schedules for implementation and roll out to end users of BMS.
ERIC Educational Resources Information Center
Davies, Daniel K.; Stock, Steven E.; Wehmeyer, Michael L.
2003-01-01
This report describes results of an initial investigation of the utility of a specially designed money management software program for improving management of personal checking accounts for individuals with mental retardation. Use with 19 adults with mental retardation indicated the software resulted in significant reduction in check writing and…
An Ontology and a Software Framework for Competency Modeling and Management
ERIC Educational Resources Information Center
Paquette, Gilbert
2007-01-01
The importance given to competency management is well justified. Acquiring new competencies is the central goal of any education or knowledge management process. Thus, it must be embedded in any software framework as an instructional engineering tool, to inform the runtime environment of the knowledge that is processed by actors, and their…
Mark J. Twery; Peter D. Knopp; Scott A. Thomasma; Donald E. Nute
2011-01-01
This is the user's guide for NED-2, which is the latest version of NED, a forest ecosystem management decision support system. This software is part of a family of software products intended to help resource managers develop goals, assess current and future conditions, and produce sustainable management plans for forest properties. Designed for stand-alone Windows...
Mark J. Twery; Peter D. Knopp; Scott A. Thomasma; Donald E. Nute
2012-01-01
This is the reference guide for NED-2, which is the latest version of NED, a forest ecosystem management decision support system. This software is part of a family of software products intended to help resource managers develop goals, assess current and future conditions, and produce sustainable management plans for forest properties. Designed for stand-alone Windows-...
Benefits of an automated GLP final report preparation software solution.
Elvebak, Larry E
2011-07-01
The final product of analytical laboratories performing US FDA-regulated (or GLP) method validation and bioanalysis studies is the final report. Although there are commercial-off-the-shelf (COTS) software/instrument systems available to laboratory managers to automate and manage almost every aspect of the instrumental and sample-handling processes of GLP studies, there are few software systems available to fully manage the GLP final report preparation process. This lack of appropriate COTS tools results in the implementation of rather Byzantine and manual processes to cobble together all the information needed to generate a GLP final report. The manual nature of these processes results in the need for several iterative quality control and quality assurance events to ensure data accuracy and report formatting. The industry is in need of a COTS solution that gives laboratory managers and study directors the ability to manage as many portions as possible of the GLP final report writing process and the ability to generate a GLP final report with the click of a button. This article describes the COTS software features needed to give laboratory managers and study directors such a solution.
Teaching Tip: Managing Software Engineering Student Teams Using Pellerin's 4-D System
ERIC Educational Resources Information Center
Doman, Marguerite; Besmer, Andrew; Olsen, Anne
2015-01-01
In this article, we discuss the use of Pellerin's Four Dimension Leadership System (4-D) as a way to manage teams in a classroom setting. Over a 5-year period, we used a modified version of the 4-D model to manage teams within a senior level Software Engineering capstone course. We found that this approach for team management in a classroom…
Software Health Management: A Short Review of Challenges and Existing Techniques
NASA Technical Reports Server (NTRS)
Pipatsrisawat, Knot; Darwiche, Adnan; Mengshoel, Ole J.; Schumann, Johann
2009-01-01
Modern spacecraft (as well as most other complex mechanisms like aircraft, automobiles, and chemical plants) rely more and more on software, to a point where software failures have caused severe accidents and loss of missions. Software failures during a manned mission can cause loss of life, so there are severe requirements to make the software as safe and reliable as possible. Typically, verification and validation (V&V) has the task of making sure that all software errors are found before the software is deployed and that it always conforms to the requirements. Experience, however, shows that this gold standard of error-free software cannot be reached in practice. Even if the software alone is free of glitches, its interoperation with the hardware (e.g., with sensors or actuators) can cause problems. Unexpected operational conditions or changes in the environment may ultimately cause a software system to fail. Is there a way to surmount this problem? In most modern aircraft and many automobiles, hardware such as central electrical, mechanical, and hydraulic components are monitored by IVHM (Integrated Vehicle Health Management) systems. These systems can recognize, isolate, and identify faults and failures, both those that already occurred as well as imminent ones. With the help of diagnostics and prognostics, appropriate mitigation strategies can be selected (replacement or repair, switch to redundant systems, etc.). In this short paper, we discuss some challenges and promising techniques for software health management (SWHM). In particular, we identify unique challenges for preventing software failure in systems which involve both software and hardware components. We then present our classifications of techniques related to SWHM. These classifications are performed based on dimensions of interest to both developers and users of the techniques, and hopefully provide a map for dealing with software faults and failures.
Software Management for the NOνA Experiment
NASA Astrophysics Data System (ADS)
Davies, G. S.; Davies, J. P.; C Group; Rebel, B.; Sachdev, K.; Zirnstein, J.
2015-12-01
The NOvA software (NOνASoft) is written in C++ and built on the Fermilab Computing Division's art framework, which uses the ROOT analysis software. NOνASoft makes use of more than 50 external software packages, is developed by more than 50 developers, and is used by more than 100 physicists from over 30 universities and laboratories on 3 continents. The software builds are handled by Fermilab's custom version of Software Release Tools (SRT), a UNIX-based software management system for large, collaborative projects that is used by several experiments at Fermilab. The system provides software version control with SVN configured in a client-server mode and is based on the code originally developed by the BaBar collaboration. In this paper, we present efforts towards distributing the NOvA software via the CernVM File System distributed file system. We also describe our recent work to use a CMake build system and Jenkins, the open source continuous integration system, for NOνASoft.
NASA Technical Reports Server (NTRS)
Mizell, Carolyn; Malone, Linda
2007-01-01
It is very difficult for project managers to develop accurate cost and schedule estimates for large, complex software development projects. None of the approaches or tools available today can estimate the true cost of software with a high degree of accuracy early in a project. This paper provides an approach that utilizes a software development process simulation model that considers and conveys the level of uncertainty that exists when developing an initial estimate. A NASA project will be analyzed using simulation and data from the Software Engineering Laboratory to show the benefits of such an approach.
2012-08-14
[Slide-extraction residue from a briefing on Army fighting-vehicle modernization plans (Bradley, Stryker, GCV, and AMPV variants through 2035); no abstract text is recoverable.]
Federal Emergency Management Information System (FEMIS) system administration guide, version 1.4.5
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arp, J.A.; Burnett, R.A.; Carter, R.J.
The Federal Emergency Management Information System (FEMIS) is an emergency management planning and response tool that was developed by the Pacific Northwest National Laboratory (PNNL) under the direction of the US Army Chemical Biological Defense Command. The FEMIS System Administration Guide provides information necessary for the system administrator to maintain the FEMIS system. The FEMIS system is designed for a single Chemical Stockpile Emergency Preparedness Program (CSEPP) site that has multiple Emergency Operations Centers (EOCs). Each EOC has personal computers (PCs) that emergency planners and operations personnel use to do their jobs. These PCs are connected via a local area network (LAN) to servers that provide EOC-wide services. Each EOC is interconnected to other EOCs via a Wide Area Network (WAN). Thus, FEMIS is an integrated software product that resides on a client/server computer architecture. The main body of FEMIS software, referred to as the FEMIS Application Software, resides on the PC client(s) and is directly accessible to emergency management personnel. The remainder of the FEMIS software, referred to as the FEMIS Support Software, resides on the UNIX server. The Support Software provides the communication, data distribution, and notification functionality necessary to operate FEMIS in a networked, client/server environment. The UNIX server provides Oracle relational database management system (RDBMS) services, ARC/INFO GIS (optional) capabilities, and basic file management services. PNNL-developed utilities that reside on the server include the Notification Service, the Command Service that executes the evacuation model, and AutoRecovery. To operate FEMIS, the Application Software must have access to a site-specific FEMIS emergency management database. Data that pertain to an individual EOC's jurisdiction are stored on the EOC's local server.
Information that needs to be accessible to all EOCs is automatically distributed by the FEMIS database to the other EOCs at the site.
DATALINK: Records inventory data collection software. User's guide
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cole, B.A.
1995-03-01
DATALINK was created to provide an easy-to-use data collection program for records management software products. It provides several useful tools for capturing and validating record index data in the field. It also allows users to easily create a comma-delimited ASCII text file for data export into most records management software products. It runs on virtually any computer using MS-DOS.
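A comma-delimited ASCII export of the kind DATALINK produces can be generated with a few lines of modern Python; the field names and records here are hypothetical, chosen only to show the format:

```python
import csv
import io

# Hypothetical records-inventory rows; the field names are illustrative only.
records = [
    {"box": "001", "title": "FY1994 invoices", "date_range": "1994-01/1994-12"},
    {"box": "002", "title": "Personnel files", "date_range": "1990-01/1993-06"},
]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["box", "title", "date_range"])
writer.writeheader()          # header row names the exported fields
writer.writerows(records)
export_text = buf.getvalue()  # comma-delimited ASCII, ready for import
```

Most records management products that accept delimited imports expect exactly this shape: a header row naming the fields, then one record per line.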
Power subsystem automation study
NASA Technical Reports Server (NTRS)
Tietz, J. C.; Sewy, D.; Pickering, C.; Sauers, R.
1984-01-01
The purpose of phase 2 of the power subsystem automation study was to demonstrate the feasibility of using computer software to manage an aspect of the electrical power subsystem on a space station. The state of the art in expert systems software was investigated in this study. This effort resulted in the demonstration of prototype expert system software for managing one aspect of a simulated space station power subsystem.
Global Hawk Systems Engineering. Case Study
2010-01-01
[Extraction residue from the case study's acronym list; recoverable entries: TBMCS, Theater Battle Management Core System; TACC, Tailored Airworthiness Certification Criteria; no abstract text is recoverable.]
Management of an affiliated Physics Residency Program using a commercial software tool.
Zacarias, Albert S; Mills, Michael D
2010-06-01
A review of commercially available allied health educational management software tools was performed to evaluate their capacity to manage program data associated with a CAMPEP-accredited Therapy Physics Residency Program. Features of these software tools include: a) didactic course reporting and organization, b) competency reporting by topic, category and didactic course, c) student time management and accounting, and d) student patient case reporting by topic, category and course. The software package includes features for recording school administrative information; setting up lists of courses, faculty, clinical sites, categories, competencies, and time logs; and the inclusion of standardized external documents. There are provisions for developing evaluation and survey instruments. The mentors and program may be evaluated by residents, and residents may be evaluated by faculty members using this feature. Competency documentation includes the time spent on the problem or with the patient, time spent with the mentor, date of the competency, and approval by the mentor and program director. Course documentation includes course and lecture title, lecturer, topic information, date of lecture and approval by the Program Director. These software tools have the facility to include multiple clinical sites, with local subadministrators having the ability to approve competencies and attendance at clinical conferences. In total, these software tools have the capability of managing all components of a CAMPEP-accredited residency program. The application database lends the software to the support of multiple affiliated clinical sites within a single residency program. Such tools are a critical and necessary component if the medical physics profession is to meet the projected needs for qualified medical physicists in future years.
Web Application Software for Ground Operations Planning Database (GOPDb) Management
NASA Technical Reports Server (NTRS)
Lanham, Clifton; Kallner, Shawn; Gernand, Jeffrey
2013-01-01
A Web application facilitates collaborative development of the ground operations planning document. This will reduce costs and development time for new programs by incorporating the data governance, access control, and revision tracking of the ground operations planning data. Ground Operations Planning requires the creation and maintenance of detailed timelines and documentation. The GOPDb Web application was created using state-of-the-art Web 2.0 technologies, and was deployed as SaaS (Software as a Service), with an emphasis on data governance and security needs. Application access is managed using two-factor authentication, with data write permissions tied to user roles and responsibilities. Multiple instances of the application can be deployed on a Web server to meet the robust needs for multiple, future programs with minimal additional cost. This innovation features high availability and scalability, with no additional software that needs to be bought or installed. For data governance and security (data quality, management, business process management, and risk management for data handling), the software uses NAMS. No local copy/cloning of data is permitted. Data change log/tracking is addressed, as well as collaboration, work flow, and process standardization. The software provides on-line documentation and detailed Web-based help. There are multiple ways that this software can be deployed on a Web server to meet ground operations planning needs for future programs. The software could be used to support commercial crew ground operations planning, as well as commercial payload/satellite ground operations planning. The application source code and database schema are owned by NASA.
Integrated software system for low level waste management
DOE Office of Scientific and Technical Information (OSTI.GOV)
Worku, G.
1995-12-31
In the continually changing and uncertain world of low level waste management, many generators in the US are faced with the prospect of having to store their waste on site for the indefinite future. This consequently increases the set of tasks performed by the generators in the areas of packaging, characterizing, classifying, screening (if a set of acceptance criteria applies), and managing the inventory for the duration of onsite storage. When disposal sites become available, it is expected that the work will require re-evaluating the waste packages, including possible re-processing, re-packaging, or re-classifying in preparation for shipment for disposal under the regulatory requirements of the time. In this day and age, when there is wide use of computers and computer literacy is at high levels, an important waste management tool would be an integrated software system that aids waste management personnel in conducting these tasks quickly and accurately. It has become evident that such an integrated radwaste management software system offers great benefits to radwaste generators both in the US and other countries. This paper discusses one such approach to integrated radwaste management utilizing some globally accepted radiological assessment software applications.
NASA Astrophysics Data System (ADS)
Preradović, D. M.; Mićić, Lj S.; Barz, C.
2017-05-01
Production conditions in today's world require software support at every stage of production and development of new products, for quality assurance and compliance with ISO standards. In addition to ISO standards as the usual metrics of quality, companies today are focused on other optional standards, such as CMMI (Capability Maturity Model Integration), or prescribe their own standards. However, while intensive progress is being made in project management (PM), there is still a significant number of projects, at the global level, that are failures: they have failed to achieve their goals within budget or timeframe. This paper focuses on checking the role of software tools through the rate of success in projects implemented in the case of internationally manufactured electrical equipment. The results of this research show the level of contribution of the project management software used to manage and develop new products to improve PM processes and PM functions, and how selection of the software tools affects the quality of PM processes and successfully completed projects.
The M68HC11 gripper controller software. Thesis
NASA Technical Reports Server (NTRS)
Tsai, Jodi Wei-Duk
1991-01-01
This thesis discusses the development of firmware for the 68HC11 gripper controller. A general description of the software and hardware interfaces is given. The C library interface for the gripper is then described and followed by a detailed discussion of the software architecture of the firmware. A procedure to assemble and download 68HC11 programs is presented in the form of a tutorial. The tools used to implement this environment are then described. Finally, the implementation of the configuration management scheme used to manage all CIRSSE software is presented.
1982-03-01
pilot systems. The magnitude of a mutant error is classified as: (1) the program does not compute; (2) the program computes but does not run the test data; (3) the program… …and funds. While the test phase concludes the normal development cycle, one should realize that with software the development continues in the…
Understanding Acceptance of Software Metrics--A Developer Perspective
ERIC Educational Resources Information Center
Umarji, Medha
2009-01-01
Software metrics are measures of software products and processes. Metrics are widely used by software organizations to help manage projects, improve product quality and increase efficiency of the software development process. However, metrics programs tend to have a high failure rate in organizations, and developer pushback is one of the sources…
48 CFR 208.7401 - Definitions.
Code of Federal Regulations, 2010 CFR
2010-10-01
... OF DEFENSE ACQUISITION PLANNING REQUIRED SOURCES OF SUPPLIES AND SERVICES Enterprise Software... a contract that is used to acquire designated commercial software or related services such as... Officer to develop processes for DoD-wide software asset management. Software maintenance means services...
The Elements of an Effective Software Development Plan - Software Development Process Guidebook
2011-11-11
standards and practices required for all XMPL software development. This SDP implements the <corporate> Standard Software Process (SSP), as tailored… • Developing and integrating reusable software products • Approach to managing COTS/Reuse software implementation • COTS/Reuse software selection… final selection and submit to change board for approval. Maintenance: monitor current products for obsolescence or end of support; track new…
Feasibility study for the redesign of MDOT's pavement management systems software.
DOT National Transportation Integrated Search
2011-04-01
In August of 2006 the Mississippi Department of Transportation (MDOT) initiated State Study No. 191, entitled Feasibility Study for the Redesign of MDOT's Pavement Management System (PMS) Software. At the initiation of this study, the Dep...
Software Schedules Missions, Aids Project Management
NASA Technical Reports Server (NTRS)
2008-01-01
NASA missions require advanced planning, scheduling, and management, and the Space Agency has worked extensively to develop the programs and software suites necessary to facilitate these complex missions. These enormously intricate undertakings have hundreds of active components that need constant management and monitoring. It is no surprise, then, that the software developed for these tasks is often applicable in other high-stress, complex environments, like in government or industrial settings. NASA work over the past few years has resulted in a handful of new scheduling, knowledge-management, and research tools developed under contract with one of NASA's partners. These tools have the unique responsibility of supporting NASA missions, but they are also finding uses outside of the Space Program.
Enhance Learning on Software Project Management through a Role-Play Game in a Virtual World
ERIC Educational Resources Information Center
Maratou, Vicky; Chatzidaki, Eleni; Xenos, Michalis
2016-01-01
This article presents a role-play game for software project management (SPM) in a three-dimensional online multiuser virtual world. The Opensimulator platform is used for the creation of an immersive virtual environment that facilitates students' collaboration and realistic interaction, in order to manage unexpected events occurring during the…
New and revised fire effects tools for fire management
Robert E. Keane; Greg Dillon; Stacy Drury; Robin Innes; Penny Morgan; Duncan Lutes; Susan J. Prichard; Jane Smith; Eva Strand
2014-01-01
Announcing the release of new software packages for application in wildland fire science and management, two fields that are already fully saturated with computer technology, may seem a bit too much to many managers. However, there have been some recent releases of new computer programs and revisions of existing software and information tools that deserve mention...
ERIC Educational Resources Information Center
Anderson, Scott; Raasch, Kevin
2002-01-01
Provides an evaluation template for student activities professionals charged with evaluating competitive event scheduling software. Guides staff in making an informed decision on whether to retain event management technology provided through an existing vendor or choose "best-of-breed" scheduling software. (EV)
Cleanroom Software Engineering Reference Model. Version 1.0.
1996-11-01
teams. It also serves as a baseline for continued evolution of Cleanroom practice. The scope of the CRM is software management, specification...addition to project staff, participants include management, peer organization representatives, and customer representatives as appropriate for...Review the status of the process with management, the project team, peer groups, and the customer. These verification activities include
Better software, better research: the challenge of preserving your research and your reputation
NASA Astrophysics Data System (ADS)
Chue Hong, N.
2017-12-01
Software is fundamental to research. From short, thrown-together temporary scripts, through an abundance of complex spreadsheets analysing collected data, to the hundreds of software engineers and millions of lines of code behind international efforts such as the Large Hadron Collider and the Square Kilometre Array, software has made an invaluable contribution to advancing our research knowledge. Within the earth and space sciences, data is being generated, collected, processed and analysed in ever greater amounts and detail. However the pace of this improvement leads to challenges around the persistence of research outputs and artefacts. A specific challenge in this field is that often experiments and measurements cannot be repeated, yet the infrastructure used to manage, store and process this data must be continually updated and developed: constant change just to stay still. The UK-based Software Sustainability Institute (SSI) aims to improve research software sustainability, working with researchers, funders, research software engineers, managers, and other stakeholders across the research spectrum. In this talk, I will present lessons learned and good practice based on the work of the Institute and its collaborators. I will summarise some of the work that is being done to improve the integration of infrastructure for managing research outputs, including around software citation and reward, extending data management plans, and improving researcher skills: "better software, better research". Ultimately, being a modern researcher in the geosciences requires you to efficiently balance the pursuit of new knowledge with making your work reusable and reproducible. And as scientists are placed under greater scrutiny about whether others can trust their results, the preservation of your artefacts has a key role in the preservation of your reputation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Landry, Guillaume, E-mail: g.landry@lmu.de; Nijhuis, Reinoud; Thieke, Christian
2015-03-15
Purpose: Intensity modulated proton therapy (IMPT) of head and neck (H and N) cancer patients may be improved by plan adaptation. The decision to adapt the treatment plan based on a dose recalculation on the current anatomy requires a diagnostic quality computed tomography (CT) scan of the patient. As gantry-mounted cone beam CT (CBCT) scanners are currently being offered by vendors, they may offer daily or weekly updates of patient anatomy. CBCT image quality may not be sufficient for accurate proton dose calculation and it is likely necessary to perform CBCT CT number correction. In this work, the authors investigated deformable image registration (DIR) of the planning CT (pCT) to the CBCT to generate a virtual CT (vCT) to be used for proton dose recalculation. Methods: Datasets of six H and N cancer patients undergoing photon intensity modulated radiation therapy were used in this study to validate the vCT approach. Each dataset contained a CBCT acquired within 3 days of a replanning CT (rpCT), in addition to a pCT. The pCT and rpCT were delineated by a physician. A Morphons algorithm was employed in this work to perform DIR of the pCT to CBCT following a rigid registration of the two images. The contours from the pCT were deformed using the vector field resulting from DIR to yield a contoured vCT. The DIR accuracy was evaluated with a scale invariant feature transform (SIFT) algorithm comparing automatically identified matching features between vCT and CBCT. The rpCT was used as reference for evaluation of the vCT. The vCT and rpCT CT numbers were converted to stopping power ratio and the water equivalent thickness (WET) was calculated. IMPT dose distributions from treatment plans optimized on the pCT were recalculated with a Monte Carlo algorithm on the rpCT and vCT for comparison in terms of gamma index, dose volume histogram (DVH) statistics as well as proton range.
The DIR generated contours on the vCT were compared to physician-drawn contours on the rpCT. Results: The DIR accuracy was better than 1.4 mm according to the SIFT evaluation. The mean WET differences between vCT (pCT) and rpCT were below 1 mm (2.6 mm). The fraction of voxels passing the 3%/3 mm gamma criteria was above 95% for the vCT vs rpCT. When using the rpCT contour set to derive DVH statistics from dose distributions calculated on the rpCT and vCT, the differences, expressed in terms of 30 fractions of 2 Gy, were within [−4, 2 Gy] for parotid glands (D{sub mean}), spinal cord (D{sub 2%}), brainstem (D{sub 2%}), and CTV (D{sub 95%}). When using DIR generated contours for the vCT, those differences ranged within [−8, 11 Gy]. Conclusions: In this work, the authors generated CBCT based stopping power distributions using DIR of the pCT to a CBCT scan. DIR accuracy was better than 1.4 mm as evaluated by the SIFT algorithm. Dose distributions calculated on the vCT agreed well with those calculated on the rpCT when using gamma index evaluation as well as DVH statistics based on the same contours. The use of DIR generated contours introduced variability in DVH statistics.
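The DVH statistics compared above (D{sub mean}, D{sub 2%}, D{sub 95%}) are read off a cumulative dose-volume histogram. As a minimal illustration of the underlying computation only (not the Monte Carlo pipeline used in the study), the sketch below builds a cumulative DVH from a flat list of per-voxel doses inside a structure; the function name and fixed-width binning are assumptions:

```python
def cumulative_dvh(doses, bin_width=0.5):
    """Cumulative DVH: percent of structure volume receiving at least
    each dose level, for per-voxel doses (Gy) inside one structure."""
    n = len(doses)
    max_dose = max(doses)
    bins, volume = [], []
    d = 0.0
    while d <= max_dose:
        bins.append(d)
        # fraction of voxels with dose >= d, expressed as a percentage
        volume.append(100.0 * sum(1 for v in doses if v >= d) / n)
        d += bin_width
    return bins, volume

# Toy structure: half the voxels get 2 Gy, half get 1 Gy
bins, vol = cumulative_dvh([2.0, 2.0, 1.0, 1.0])
# vol == [100.0, 100.0, 100.0, 50.0, 50.0]
```

Point statistics such as D{sub 95%} then fall out as the dose level at which the curve crosses 95% volume.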
Engineering Quality Software: 10 Recommendations for Improved Software Quality Management
2010-04-27
lack of user involvement • Inadequate Software Process Management & Control By Contractors • No "Team" of Vendors and users; little SME participation...1990 Quality Perspectives • Process Quality (CMMI) • Product Quality (ISO/IEC 2500x) – Internal Quality Attributes – External Quality Attributes...CMMI/ISO 9000 Assessments – Capture organizational knowledge • Identify best practices, lessons learned Know where you are, and where you need to be
Improving Software Quality and Management Through Use of Service Level Agreements
2005-03-01
many who believe that the quality of the development process is the best predictor of software product quality. (Fenton) Repeatable software processes...reduced errors per KLOC for small projects (Fenton), and the quality management metric (QMM) (Machniak, Osmundson). There are also numerous IEEE 14...attention to cosmetic user interface issues and any problems that may arise with the prototype. (Sawyer) The validation process is also another check
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arp, J.A.; Bower, J.C.; Burnett, R.A.
The Federal Emergency Management Information System (FEMIS) is an emergency management planning and response tool that was developed by the Pacific Northwest National Laboratory (PNNL) under the direction of the U.S. Army Chemical Biological Defense Command. The FEMIS System Administration Guide provides information necessary for the system administrator to maintain the FEMIS system. The FEMIS system is designed for a single Chemical Stockpile Emergency Preparedness Program (CSEPP) site that has multiple Emergency Operations Centers (EOCs). Each EOC has personal computers (PCs) that emergency planners and operations personnel use to do their jobs. These PCs are connected via a local area network (LAN) to servers that provide EOC-wide services. Each EOC is interconnected to other EOCs via a Wide Area Network (WAN). Thus, FEMIS is an integrated software product that resides on client/server computer architecture. The main body of FEMIS software, referred to as the FEMIS Application Software, resides on the PC client(s) and is directly accessible to emergency management personnel. The remainder of the FEMIS software, referred to as the FEMIS Support Software, resides on the UNIX server. The Support Software provides the communication, data distribution, and notification functionality necessary to operate FEMIS in a networked, client/server environment.
Federal Emergency Management Information System (FEMIS), Installation Guide for FEMIS 1.4.6
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arp, J.A.; Burnett, R.A.; Carter, R.J.
Software project management tools in global software development: a systematic mapping study.
Chadli, Saad Yasser; Idri, Ali; Ros, Joaquín Nicolás; Fernández-Alemán, José Luis; de Gea, Juan M Carrillo; Toval, Ambrosio
2016-01-01
Global software development (GSD), a growing trend in the software industry, is characterized by a highly distributed environment. Performing software project management (SPM) in such conditions implies the need to overcome new limitations resulting from cultural, temporal and geographic separation. The aim of this research is to discover and classify the various tools mentioned in the literature that provide GSD project managers with support and to identify in what way they support group interaction. A systematic mapping study has been performed by means of automatic searches in five sources. We have then synthesized the data extracted and presented the results of this study. A total of 102 tools were identified as being used in SPM activities in GSD. We have classified these tools according to the software life cycle process on which they focus and how they support the 3C collaboration model (communication, coordination and cooperation). The majority of the tools found are standalone tools (77%). A small number of platforms (8%) also offer a set of interacting tools that cover the software development lifecycle. Results also indicate that SPM areas in GSD are not adequately supported by corresponding tools and deserve more attention from tool builders.
Fowler, K. R.; Jenkins, E.W.; Parno, M.; Chrispell, J.C.; Colón, A. I.; Hanson, Randall T.
2016-01-01
The development of appropriate water management strategies requires, in part, a methodology for quantifying and evaluating the impact of water policy decisions on regional stakeholders. In this work, we describe the framework we are developing to enhance the body of resources available to policy makers, farmers, and other community members in their efforts to understand, quantify, and assess the often competing objectives water consumers have with respect to usage. The foundation for the framework is the construction of a simulation-based optimization software tool using two existing software packages. In particular, we couple a robust optimization software suite (DAKOTA) with the USGS MF-OWHM water management simulation tool to provide a flexible software environment that will enable the evaluation of one or multiple (possibly competing) user-defined (or stakeholder) objectives. We introduce the individual software components and outline the communication strategy we defined for the coupled development. We present numerical results for case studies related to crop portfolio management with several defined objectives. The objectives are not optimally satisfied for any single user class, demonstrating the capability of the software tool to aid in the evaluation of a variety of competing interests.
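The coupling described above follows a standard simulation-based optimization pattern: an optimizer repeatedly calls the simulation model to evaluate candidate decisions. The sketch below is a deliberately tiny stand-in, assuming a one-variable pumping decision, a quadratic toy "simulator" in place of MF-OWHM, and a brute-force grid search in place of DAKOTA's optimizers; every name and number here is invented for illustration:

```python
def simulate(pump_rate):
    """Toy water-management simulator: crop value rises with water,
    while aquifer drawdown cost grows quadratically with pumping."""
    crop_value = 4.0 * pump_rate
    drawdown_cost = 0.05 * pump_rate ** 2
    return crop_value - drawdown_cost

def optimize(lo=0.0, hi=100.0, steps=1000):
    """Brute-force grid search standing in for a real optimizer:
    evaluate the simulator on a grid and keep the best decision."""
    grid = [lo + i * (hi - lo) / steps for i in range(steps + 1)]
    best_rate = max(grid, key=simulate)
    return best_rate, simulate(best_rate)

best_rate, net_value = optimize()
# optimum of 4x - 0.05x^2 is at x = 40 with value 80
```

In the real coupling, `simulate` would launch an MF-OWHM run and parse its outputs, and DAKOTA would supply derivative-free or multi-objective methods instead of exhaustive search.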
Quantitative CMMI Assessment for Offshoring through the Analysis of Project Management Repositories
NASA Astrophysics Data System (ADS)
Sunetnanta, Thanwadee; Nobprapai, Ni-On; Gotel, Olly
The nature of distributed teams and the existence of multiple sites in offshore software development projects pose a challenging setting for software process improvement. Often, the improvement and appraisal of software processes is achieved through a turnkey solution where best practices are imposed or transferred from a company’s headquarters to its offshore units. In so doing, successful project health checks and monitoring for quality on software processes requires strong project management skills, well-built onshore-offshore coordination, and often needs regular onsite visits by software process improvement consultants from the headquarters’ team. This paper focuses on software process improvement as guided by the Capability Maturity Model Integration (CMMI) and proposes a model to evaluate the status of such improvement efforts in the context of distributed multi-site projects without some of this overhead. The paper discusses the application of quantitative CMMI assessment through the collection and analysis of project data gathered directly from project repositories to facilitate CMMI implementation and reduce the cost of such implementation for offshore-outsourced software development projects. We exemplify this approach to quantitative CMMI assessment through the analysis of project management data and discuss the future directions of this work in progress.
Selecting information technology for physicians' practices: a cross-sectional study.
Eden, Karen Beekman
2002-04-05
Many physicians are transitioning from paper to electronic formats for billing, scheduling, medical charts, communications, etc. The primary objective of this research was to identify the relationship (if any) between the software selection process and the office staff's perceptions of the software's impact on practice activities. A telephone survey was conducted with office representatives of 407 physician practices in Oregon who had purchased information technology. The respondents, usually office managers, answered scripted questions about their selection process and their perceptions of the software after implementation. Multiple logistic regression revealed that software type, selection steps, and certain factors influencing the purchase were related to whether the respondents felt the software improved the scheduling and financial analysis practice activities. Specifically, practices that selected electronic medical record or practice management software, that made software comparisons, or that considered prior user testimony as important were more likely to have perceived improvements in the scheduling process than were other practices. Practices that considered value important, that did not consider compatibility important, that selected managed care software, that spent less than 10,000 dollars, or that provided learning time (most dramatic increase in odds ratio, 8.2) during implementation were more likely to perceive that the software had improved the financial analysis process than were other practices. Perhaps one of the most important predictors of improvement was providing learning time during implementation, particularly when the software involves several practice activities. Despite this importance, less than half of the practices reported performing this step.
Workflow-Based Software Development Environment
NASA Technical Reports Server (NTRS)
Izygon, Michel E.
2013-01-01
The Software Developer's Assistant (SDA) helps software teams more efficiently and accurately conduct or execute software processes associated with NASA mission-critical software. SDA is a process enactment platform that guides software teams through project-specific standards, processes, and procedures. Software projects are decomposed into all of their required process steps or tasks, and each task is assigned to project personnel. SDA orchestrates the performance of work required to complete all process tasks in the correct sequence. The software then notifies team members when they may begin work on their assigned tasks and provides the tools, instructions, reference materials, and supportive artifacts that allow users to compliantly perform the work. A combination of technology components captures and enacts any software process used to support the software lifecycle. It creates an adaptive workflow environment that can be modified as needed. SDA achieves software process automation through a Business Process Management (BPM) approach to managing the software lifecycle for mission-critical projects. It contains five main parts: TieFlow (workflow engine), Business Rules (rules to alter process flow), Common Repository (storage for project artifacts, versions, history, schedules, etc.), SOA (interface to allow internal, GFE, or COTS tools integration), and the Web Portal Interface (collaborative web environment).
Programmable bandwidth management in software-defined EPON architecture
NASA Astrophysics Data System (ADS)
Li, Chengjun; Guo, Wei; Wang, Wei; Hu, Weisheng; Xia, Ming
2016-07-01
This paper proposes a software-defined EPON architecture which replaces the hardware-implemented DBA module with a reprogrammable DBA module. The DBA module allows pluggable bandwidth allocation algorithms among multiple ONUs, adaptive to traffic profiles and network states. We also introduce a bandwidth management scheme executed at the controller to manage the customized DBA algorithms for all data queues of ONUs. Our performance investigation verifies the effectiveness of this new EPON architecture, and numerical results show that software-defined EPONs can achieve lower traffic delay and provide better support for service differentiation in comparison with traditional EPONs.
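The reprogrammable DBA module described above can be pictured as a controller that delegates grant computation to a swappable strategy. The sketch below is an assumption-laden illustration: the strategy names, the simple equal-share and capped-grant rules, and the controller interface are invented for this example, not taken from the paper:

```python
def fixed_dba(requests, capacity):
    """Static allocation: equal share per ONU regardless of demand."""
    share = capacity // len(requests)
    return {onu: share for onu in requests}

def limited_dba(requests, capacity):
    """Demand-aware allocation: grant what each ONU asked for,
    capped at a per-cycle ceiling (a crude limited-service scheme)."""
    cap = capacity // len(requests)
    return {onu: min(req, cap) for onu, req in requests.items()}

class DBAController:
    """Controller holding a pluggable DBA module that can be
    reprogrammed at runtime without touching OLT hardware."""
    def __init__(self, algorithm):
        self.algorithm = algorithm

    def swap(self, algorithm):
        self.algorithm = algorithm

    def allocate(self, requests, capacity):
        return self.algorithm(requests, capacity)

ctrl = DBAController(fixed_dba)
reqs = {"onu1": 30, "onu2": 100}        # bandwidth requests per cycle
ctrl.allocate(reqs, 120)                # {'onu1': 60, 'onu2': 60}
ctrl.swap(limited_dba)                  # reprogram the DBA module
ctrl.allocate(reqs, 120)                # {'onu1': 30, 'onu2': 60}
```

The point of the software-defined split is exactly this `swap`: allocation policy becomes controller software rather than fixed OLT firmware.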
[Sem: a suitable statistical software adapted for research in oncology].
Kwiatkowski, F; Girard, M; Hacene, K; Berlie, J
2000-10-01
Many software packages have been adapted for medical use, but they rarely combine convenient data management with statistics. A recent cooperative effort produced a new package, Sem (Statistics Epidemiology Medicine), which supports both trial data management and statistical analysis. Sem is convenient enough to be used by non-professionals in statistics (biologists, doctors, researchers, data managers): except with multivariate models, the software itself selects the most appropriate test, after which complementary tests can be requested if needed. Sem's database manager (DBM) is not compatible with common DBMs, which constitutes a first layer of privacy protection. Other safeguards (passwords, encryption...) strengthen data security, all the more necessary now that Sem can run on computer networks. The data organization allows multiplicity: forms can be duplicated per patient. Dates are handled in a special but transparent manner (sorting, date and delay calculations...). Sem communicates with common desktop applications, often with a simple copy/paste, so statistics can easily be performed on data stored in external spreadsheets, and slides can be built by pasting graphs with a single mouse click (survival curves...). Already used in over fifty locations in different hospitals for daily work, this product, combining data management and statistics, appears to be a convenient and innovative solution.
Health management and controls for Earth-to-orbit propulsion systems
NASA Astrophysics Data System (ADS)
Bickford, R. L.
1995-03-01
Avionics and health management technologies increase the safety and reliability while decreasing the overall cost for Earth-to-orbit (ETO) propulsion systems. New ETO propulsion systems will depend on highly reliable fault tolerant flight avionics, advanced sensing systems and artificial intelligence aided software to ensure critical control, safety and maintenance requirements are met in a cost effective manner. Propulsion avionics consist of the engine controller, actuators, sensors, software and ground support elements. In addition to control and safety functions, these elements perform system monitoring for health management. Health management is enhanced by advanced sensing systems and algorithms which provide automated fault detection and enable adaptive control and/or maintenance approaches. Aerojet is developing advanced fault tolerant rocket engine controllers which provide very high levels of reliability. Smart sensors and software systems which significantly enhance fault coverage and enable automated operations are also under development. Smart sensing systems, such as flight capable plume spectrometers, have reached maturity in ground-based applications and are suitable for bridging to flight. Software to detect failed sensors has reached similar maturity. This paper will discuss fault detection and isolation for advanced rocket engine controllers as well as examples of advanced sensing systems and software which significantly improve component failure detection for engine system safety and health management.
Proceedings of the 14th Annual Software Engineering Workshop
NASA Technical Reports Server (NTRS)
1989-01-01
Several software related topics are presented. Topics covered include studies and experiments at the Software Engineering Laboratory at the Goddard Space Flight Center, predicting project success from the Software Project Management Process, software environments, testing in a reuse environment, domain directed reuse, and classification tree analysis using the Amadeus measurement and empirical analysis.
NASA Technical Reports Server (NTRS)
1990-01-01
Papers presented at RICIS Software Engineering Symposium are compiled. The following subject areas are covered: flight critical software; management of real-time Ada; software reuse; megaprogramming software; Ada net; POSIX and Ada integration in the Space Station Freedom Program; and assessment of formal methods for trustworthy computer systems.
Evaluation of Agricultural Accounting Software. Improved Decision Making. Third Edition.
ERIC Educational Resources Information Center
Lovell, Ashley C., Comp.
Following a discussion of the evaluation criteria for choosing accounting software, this guide contains reviews of 27 accounting software programs that could be used by farm or ranch business managers. The information in the reviews was provided by the software vendors and covers the following points for each software package: general features,…
Software Health Management with Bayesian Networks
NASA Technical Reports Server (NTRS)
Mengshoel, Ole; Schumann, Johann
2011-01-01
Most modern aircraft, as well as other complex machinery, are equipped with diagnostics systems for their major subsystems. During operation, sensors provide important information about the subsystem (e.g., the engine), and that information is used to detect and diagnose faults. Most of these systems focus on the monitoring of a mechanical, hydraulic, or electromechanical subsystem of the vehicle or machinery. Only recently have health management systems that monitor software been developed. In this paper, we will discuss our approach of using Bayesian networks for Software Health Management (SWHM). We will discuss SWHM requirements, which make advanced reasoning capabilities for detection and diagnosis important. Then we will present our approach to using Bayesian networks for the construction of health models that dynamically monitor a software system and are capable of detecting and diagnosing faults.
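At its core, Bayesian-network-based fault diagnosis computes a posterior probability of a fault given monitor observations. A minimal one-edge example (fault → alarm), with invented prior and likelihood values standing in for a real SWHM health model, looks like this:

```python
def posterior_fault(p_fault, p_alarm_given_fault, p_alarm_given_ok):
    """Bayes' rule for a two-node network: probability the monitored
    software component is faulty given that the monitor raised an alarm."""
    joint_fault = p_fault * p_alarm_given_fault        # P(fault, alarm)
    joint_ok = (1.0 - p_fault) * p_alarm_given_ok      # P(ok, alarm)
    return joint_fault / (joint_fault + joint_ok)

# Rare fault (1% prior), sensitive monitor (95% detection),
# occasional false alarms (5%): the alarm alone is weak evidence.
p = posterior_fault(0.01, 0.95, 0.05)
# p ≈ 0.161
```

A full SWHM health model chains many such conditional probability tables over sensors, software state, and fault nodes, and the same inference principle (marginalize, then normalize) applies.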
Software reliability through fault-avoidance and fault-tolerance
NASA Technical Reports Server (NTRS)
Vouk, Mladen A.; Mcallister, David F.
1993-01-01
Strategies and tools for the testing, risk assessment and risk control of dependable software-based systems were developed. Part of this project consists of studies to enable the transfer of technology to industry, for example the risk management techniques for safety-conscious systems. Theoretical investigations of Boolean and Relational Operator (BRO) testing strategy were conducted for condition-based testing. The Basic Graph Generation and Analysis tool (BGG) was extended to fully incorporate several variants of the BRO metric. Single- and multi-phase risk, coverage and time-based models are being developed to provide additional theoretical and empirical basis for estimation of the reliability and availability of large, highly dependable software. A model for software process and risk management was developed. The use of cause-effect graphing for software specification and validation was investigated. Lastly, advanced software fault-tolerance models were studied to provide alternatives and improvements in situations where simple software fault-tolerance strategies break down.
Program Manager: Journal of the Defense Systems Management College, Volume 17, Number 3
1988-06-01
modernizing plants and processes, streamlining management, pooling trade associations...What does "quality" mean? First, the...We have established a network...Network building may be the first opportunity for some managers to reflect on the organizational climate and hierarchical...result of the soaring cost of software enhancements. This difference in hardware and software...Software Performance Testing.
Theory and Practice Meets in Industrial Process Design -Educational Perspective-
NASA Astrophysics Data System (ADS)
Aramo-Immonen, Heli; Toikka, Tarja
A software engineer should see himself as a business process designer in an enterprise resource planning (ERP) system re-engineering project, and software engineers and managers should engage in design dialogue. The objective of this paper is to discuss the motives for studying design research in connection with management education, in order to envision and understand the soft human issues in the management context. A second goal is to develop means of practicing social skills between designers and managers. This article explores the affective components of design thinking in the industrial management domain. The conceptual part of the paper discusses the concepts of network and project economy, creativity, communication, the use of metaphors, and design thinking. Finally, the empirical research plan is introduced, together with the first empirical results from design method experiments among multi-disciplined groups of master-level students of industrial engineering and management and software engineering.
Software Assessment of the Global Force Management (GFM) Search Capability Study
2017-02-01
Study by Timothy Hanratty, Mark Mittrick, Alex Vertlieb, and Frederick Brundick. Approved for public release; distribution...Army Research Laboratory.
ERIC Educational Resources Information Center
Moreau, Nancy
2008-01-01
This article discusses the impact of patents for computer algorithms in course management systems. Referring to historical documents and court cases, the positive and negative aspects of software patents are presented. The key argument is the accessibility to algorithms comprising a course management software program such as Blackboard. The…
Assessing Your Assets: Systems for Tracking and Managing IT Assets Can Save Time and Dollars
ERIC Educational Resources Information Center
Holub, Patricia A.
2007-01-01
The average school district loses more than $80,000 per year because of lost or damaged IT assets, according to a QED survey cosponsored by Follett Software Company. And many districts--59 percent--still use manual systems to track assets. Enter asset management systems. Software for managing assets, when implemented properly, can save time,…
An Investigation of Techniques for Detecting Data Anomalies in Earned Value Management Data
2011-12-01
...Management Studio; Harte Hanks Trillium Software System; IBM InfoSphere Foundation Tools; Informatica Data Explorer, Informatica Analyst, Informatica Developer, Informatica Administrator; Pitney Bowes Business Insight Spectrum; SAP BusinessObjects Data Quality Management; DataFlux...menting quality monitoring efforts and tracking data quality improvements. Informatica: http://www.informatica.com/products_services/Pages/index.aspx
Software Configuration Management Plan for the B-Plant Canyon Ventilation Control System
DOE Office of Scientific and Technical Information (OSTI.GOV)
MCDANIEL, K.S.
1999-08-31
Project W-059 installed a new B Plant Canyon Ventilation System. Monitoring and control of the system is implemented by the Canyon Ventilation Control System (CVCS). This Software Configuration Management Plan provides instructions for change control of the CVCS.
SafetyAnalyst : software tools for safety management of specific highway sites
DOT National Transportation Integrated Search
2010-07-01
SafetyAnalyst provides a set of software tools for use by state and local highway agencies for highway safety management. SafetyAnalyst can be used by highway agencies to improve their programming of site-specific highway safety improvements. SafetyA...
Integrated Modeling Environment
NASA Technical Reports Server (NTRS)
Mosier, Gary; Stone, Paul; Holtery, Christopher
2006-01-01
The Integrated Modeling Environment (IME) is a software system that establishes a centralized Web-based interface for integrating people (who may be geographically dispersed), processes, and data involved in a common engineering project. The IME includes software tools for life-cycle management, configuration management, visualization, and collaboration.
ENVIRONMENTAL METHODS TESTING SITE PROJECT: DATA MANAGEMENT PROCEDURES PLAN
The Environmental Methods Testing Site (EMTS) Data Management Procedures Plan identifies the computer hardware and software resources used in the EMTS project. It identifies the major software packages that are available for use by principal investigators for the analysis of data...
2009-09-01
...(NII)/CIO: Assistant Secretary of Defense for Networks and Information Integration/Chief Information Officer; CMMI: Capability Maturity Model Integration...a Web-based portal to share knowledge about software process-related methodologies, such as the SEI's Capability Maturity Model Integration (CMMI), the SEI's IDEAL model, and Lean Six Sigma. For example, the portal features content areas such as software acquisition management and the SEI CMMI...
1985-11-01
...(McAuto) Transaction Manager Subsystem during the 1984/1985 period. On-Line Software International (OSI): responsible for programming the Communications Network Transaction Manager (NTM) in the 1981/1984 period. Software Performance Engineering (SPE): responsible for directing the work on performance...computer software...
Database Access Manager for the Software Engineering Laboratory (DAMSEL) user's guide
NASA Technical Reports Server (NTRS)
1990-01-01
Operating instructions for the Database Access Manager for the Software Engineering Laboratory (DAMSEL) system are presented. Step-by-step instructions for performing various data entry and report generation activities are included. Sample sessions showing the user interface display screens are also included. Instructions for generating reports are accompanied by sample outputs for each of the reports. The document groups the available software functions by the classes of users that may access them.
Sustainable Software Decisions for Long-term Projects (Invited)
NASA Astrophysics Data System (ADS)
Shepherd, A.; Groman, R. C.; Chandler, C. L.; Gaylord, D.; Sun, M.
2013-12-01
Adopting new, emerging technologies can be difficult for established projects that are positioned to exist for years to come. In some cases the challenge lies in the pre-existing software architecture. In others, the challenge lies in the fluctuation of resources like people, time and funding. The Biological and Chemical Oceanography Data Management Office (BCO-DMO) was created in late 2006 by combining the data management offices for the U.S. GLOBEC and U.S. JGOFS programs to publish data for researchers funded by the National Science Foundation (NSF). Since its inception, BCO-DMO has been supporting access and discovery of these data through web-accessible software systems, and the office has worked through many of the challenges of incorporating new technologies into its software systems. From migrating human readable, flat file metadata storage into a relational database, and now, into a content management system (Drupal) to incorporating controlled vocabularies, new technologies can radically affect the existing software architecture. However, through the use of science-driven use cases, effective resource management, and loosely coupled software components, BCO-DMO has been able to adapt its existing software architecture to adopt new technologies. One of the latest efforts at BCO-DMO revolves around applying metadata semantics for publishing linked data in support of data discovery. This effort primarily affects the metadata web interface software at http://bco-dmo.org and the geospatial interface software at http://mapservice.bco-dmo.org/. With guidance from science-driven use cases and consideration of our resources, implementation decisions are made using a strategy to loosely couple the existing software systems to the new technologies. 
The results of this process led to the use of REST web services and a combination of contributed and custom Drupal modules for publishing BCO-DMO's content using the Resource Description Framework (RDF) via an instance of the Virtuoso Open-Source triplestore.
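The linked-data publishing workflow described above can be illustrated with a minimal sketch. The subject URI, Dublin Core properties, and record values below are placeholders invented for illustration, not BCO-DMO's actual vocabulary or identifiers; the N-Triples output is the kind of serialization a triplestore such as Virtuoso would ingest.

```python
# Minimal sketch of publishing metadata as RDF triples in N-Triples form.
# All URIs and values here are illustrative, not BCO-DMO's real records.

def to_ntriples(triples):
    """Render (subject, predicate, object) triples as N-Triples lines.
    Objects starting with 'http' are treated as URIs, others as literals."""
    lines = []
    for s, p, o in triples:
        obj = f"<{o}>" if o.startswith("http") else f'"{o}"'
        lines.append(f"<{s}> <{p}> {obj} .")
    return "\n".join(lines)

DCT = "http://purl.org/dc/terms/"  # Dublin Core Terms namespace
record = [
    ("http://bco-dmo.org/dataset/12345", DCT + "title", "Example oceanographic dataset"),
    ("http://bco-dmo.org/dataset/12345", DCT + "creator", "Example Investigator"),
]

print(to_ntriples(record))
```

A REST endpoint in front of such a serializer is one loosely coupled way to expose existing relational or Drupal-managed content as linked data without rewriting the underlying systems.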
NASA Technical Reports Server (NTRS)
McNeill, Justin
1995-01-01
The Multimission Image Processing Subsystem (MIPS) at the Jet Propulsion Laboratory (JPL) has managed transitions of application software sets from one operating system and hardware platform to multiple operating systems and hardware platforms. As a part of these transitions, cost estimates were generated from the personal experience of in-house developers and managers to calculate the total effort required for such projects. Productivity measures have been collected for two such transitions, one very large and the other relatively small in terms of source lines of code. These estimates used a cost estimation model similar to the Software Engineering Laboratory (SEL) Effort Estimation Model. Experience in transitioning software within JPL MIPS have uncovered a high incidence of interface complexity. Interfaces, both internal and external to individual software applications, have contributed to software transition project complexity, and thus to scheduling difficulties and larger than anticipated design work on software to be ported.
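The estimation approach the abstract references can be illustrated with a toy model of the common Effort = a * KSLOC^b form used by SEL-style models; the coefficients and the interface-complexity multiplier below are invented for illustration, not the SEL's published values.

```python
# Toy sketch of a SEL-style effort model. The coefficients are
# illustrative placeholders, not calibrated values.

def estimate_effort(ksloc, a=1.4, b=0.93):
    """Return estimated staff-months for `ksloc` thousand source lines of code."""
    return a * ksloc ** b

def port_effort(ksloc, interface_multiplier=1.25):
    """Model interface complexity in a porting project as an effort
    multiplier applied on top of the base estimate (an assumption,
    consistent with the larger-than-anticipated design work reported)."""
    return estimate_effort(ksloc) * interface_multiplier

base = estimate_effort(50)    # base estimate for a 50 KSLOC transition
ported = port_effort(50)      # same size, with interface-complexity penalty
```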
Omics Metadata Management Software v. 1 (OMMS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Our application, the Omics Metadata Management Software (OMMS), answers both needs, empowering experimentalists to generate intuitive, consistent metadata, and to perform bioinformatics analyses and information management tasks via a simple and intuitive web-based interface. Several use cases with short-read sequence datasets are provided to showcase the full functionality of the OMMS, from metadata curation tasks to bioinformatics analyses, results management, and downloading. The OMMS can be implemented as a stand-alone package for individual laboratories, or can be configured for web-based deployment supporting geographically dispersed research teams. Our software was developed with open-source bundles, and is flexible, extensible, and easily installed and run by operators with general system administration and scripting language literacy.
Technology in nursing scholarship: use of citation reference managers.
Smith, Cheryl M; Baker, Bradford
2007-06-01
Nurses, especially those in academia, feel the pressure to publish but have a limited time to write. One of the more time-consuming and frustrating tasks of research, and subsequent publications, is the collection and organization of accurate citations of sources of information. The purpose of this article is to discuss three types of citation reference managers (personal bibliographic software) and how their use can provide consistency and accuracy in recording all the information needed for the research and writing process. The advantages and disadvantages of three software programs, EndNote, Reference Manager, and ProCite, are discussed. These three software products have a variety of options that can be used in personal data management to assist researchers in becoming published authors.
An Introduction to Flight Software Development: FSW Today, FSW 2010
NASA Technical Reports Server (NTRS)
Gouvela, John
2004-01-01
Experience and knowledge gained from ongoing maintenance of Space Shuttle Flight Software and from new development projects, including the Cockpit Avionics Upgrade, are applied to the projected needs of the National Space Exploration Vision through Spiral 2. Lessons learned from these current activities are applied to create a sustainable, reliable model for development of critical software to support Project Constellation. This presentation introduces the technologies, methodologies, and infrastructure needed to produce and sustain high-quality software, and proposes what is needed to support a Vision for Space Exploration that places heavy demands on innovation and productivity. The technologies in use today within FSW development include tools that provide requirements tracking, integrated change management, and modeling and simulation software. Specific challenges that have been met include the introduction and integration of a Commercial Off-The-Shelf (COTS) Real-Time Operating System for critical functions. Though technology prediction has proved to be imprecise, Project Constellation requirements will need continued integration of new technology with evolving methodologies and changing project infrastructure. Targets for continued technology investment are integrated health monitoring and management, self-healing software, standard payload interfaces, autonomous operation, and improvements in training. Emulation of the target hardware will also allow significant streamlining of development and testing. The methodologies in use today for FSW development are object-oriented UML design and iterative development using independent components, as well as rapid prototyping. In addition, Lean Six Sigma and CMMI play a critical role in the quality and efficiency of workforce processes.
Over the next six years, we expect these methodologies to merge with other improvements into a consolidated office culture with all processes being guided by automated office assistants. The infrastructure in use today includes strict software development and configuration management procedures, including strong control of resource management and critical skills coverage. This will evolve to a fully integrated staff organization with efficient and effective communication throughout all levels, guided by a Mission-Systems Architecture framework with a focus on risk management and attention toward inevitable product obsolescence. This infrastructure of computing equipment, software, and processes will itself be subject to technological change and the need for management of change and improvement.
A conceptual model for megaprogramming
NASA Technical Reports Server (NTRS)
Tracz, Will
1990-01-01
Megaprogramming is component-based software engineering and life-cycle management. Megaprogramming and its relationship to other research initiatives (common prototyping system/common prototyping language, domain-specific software architectures, and software understanding) are analyzed. The desirable attributes of megaprogramming software components are identified, and a software development model and resulting prototype megaprogramming system (a library interconnection language extended by annotated Ada) are described.
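The component-composition idea behind megaprogramming can be sketched as a toy interface-wiring check; the Component class and check_wiring function are hypothetical illustrations of the concept, not the paper's library interconnection language.

```python
# Toy sketch of megaprogramming-style composition: components declare
# provided and required interfaces, and a wiring check verifies the
# assembly before any code is connected. Names are illustrative.

class Component:
    def __init__(self, name, provides, requires=()):
        self.name = name
        self.provides = set(provides)
        self.requires = set(requires)

def check_wiring(components):
    """Return the set of required interfaces no component provides."""
    provided = set().union(*(c.provides for c in components))
    required = {r for c in components for r in c.requires}
    return required - provided

system = [
    Component("parser", provides={"ast"}),
    Component("codegen", provides={"binary"}, requires={"ast"}),
]
assert check_wiring(system) == set()  # every requirement is satisfied
```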
A Quantitative Analysis of Open Source Software's Acceptability as Production-Quality Code
ERIC Educational Resources Information Center
Fischer, Michael
2011-01-01
The difficulty in writing defect-free software has been long acknowledged both by academia and industry. A constant battle occurs as developers seek to craft software that works within aggressive business schedules and deadlines. Many tools and techniques are used in attempt to manage these software projects. Software metrics are a tool that has…
ERIC Educational Resources Information Center
Kamthan, Pankaj
2007-01-01
Open Source Software (OSS) has introduced a new dimension in software community. As the development and use of OSS becomes prominent, the question of its integration in education arises. In this paper, the following practices fundamental to projects and processes in software engineering are examined from an OSS perspective: project management;…
Software Tools for Development on the Peregrine System | High-Performance Computing | NREL
Software tools for development on the Peregrine system are used to build and manage software at the source code level. Cross-Platform Make and SCons: the "Cross-Platform Make" (CMake) package is from Kitware, and SCons is a modern software build tool based on Python.
Modern Corneal Eye-Banking Using a Software-Based IT Management Solution.
Kern, C; Kortuem, K; Wertheimer, C; Nilmayer, O; Dirisamer, M; Priglinger, S; Mayer, W J
2018-01-01
Increasing government legislation and regulations in manufacturing have led to additional documentation regarding the pharmaceutical product requirements of corneal grafts in the European Union. The aim of this project was to develop a software within a hospital information system (HIS) to support the documentation process, to improve the management of the patient waiting list and to increase informational flow between the clinic and eye bank. After an analysis of the current documentation process, a new workflow and software were implemented in our electronic health record (EHR) system. The software takes over most of the documentation and reduces the time required for record keeping. It guarantees real-time tracing of all steps during human corneal tissue processing from the start of production until allocation during surgery and includes follow-up within the HIS. Moreover, listing of the patient for surgery as well as waiting list management takes place in the same system. The new software for corneal eye banking supports the whole process chain by taking over both most of the required documentation and the management of the transplant waiting list. It may provide a standardized IT-based solution for German eye banks working within the same HIS.
Software for the occupational health and safety integrated management system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vătăsescu, Mihaela
2015-03-10
This paper intends to present the design and the production of a software for the Occupational Health and Safety Integrated Management System with the view to a rapid drawing up of the system documents in the field of occupational health and safety.
78 FR 23685 - Airworthiness Directives; The Boeing Company
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-22
... installing new operational software for the electrical load management system and configuration database..., installing a new electrical power control panel, and installing new operational software for the electrical load management system and configuration database. Since the proposed AD was issued, we have received...
User systems guidelines for software projects
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abrahamson, L.
1986-04-01
This manual presents guidelines for software standards which were developed so that software project-development teams and management involved in approving the software could have a generalized view of all phases in the software production procedure and the steps involved in completing each phase. Guidelines are presented for six phases of software development: project definition, building a user interface, designing software, writing code, testing code, and preparing software documentation. The discussions for each phase include examples illustrating the recommended guidelines. 45 refs. (DWL)
Introduction of the UNIX International Performance Management Work Group
NASA Technical Reports Server (NTRS)
Newman, Henry
1993-01-01
In this paper we present the planned direction of the UNIX International Performance Management Work Group. This group consists of concerned system developers and users who have organized to synthesize recommendations for standard UNIX performance management subsystem interfaces and architectures. The purpose of these recommendations is to provide a core set of performance management functions that can be used to build tools by hardware system developers, vertical application software developers, and performance application software developers.
A Management Information System for Allocating, Monitoring and Reviewing Work Assignments.
1986-06-01
This thesis investigated the feasibility of developing a small-scale management information system on a micro-computer. The working system was...ORSA journal. The management information system was designed using Ashton-Tate's dBaseIII software. As designed, the system will operate on any...computer operating under Microsoft's Disk Operating System (MS-DOS). The user must provide his own dBaseIII software. A similar management information system could
NASA Technical Reports Server (NTRS)
Mayer, Richard J.; Blinn, Thomas M.; Mayer, Paula S. D.; Ackley, Keith A.; Crump, John W., IV; Henderson, Richard; Futrell, Michael T.
1991-01-01
The Framework Programmable Software Development Platform (FPP) is a project aimed at combining effective tool and data integration mechanisms with a model of the software development process in an intelligent integrated software environment. Guided by the model, this system development framework will take advantage of an integrated operating environment to automate effectively the management of the software development process so that costly mistakes during the development phase can be eliminated. The focus here is on the design of components that make up the FPP. These components serve as supporting systems for the Integration Mechanism and the Framework Processor and provide the 'glue' that ties the FPP together. Also discussed are the components that allow the platform to operate in a distributed, heterogeneous environment and to manage the development and evolution of software system artifacts.
Policy-Based Management Natural Language Parser
NASA Technical Reports Server (NTRS)
James, Mark
2009-01-01
The Policy-Based Management Natural Language Parser (PBEM) is a rules-based approach to enterprise management that can be used to automate certain management tasks. This parser simplifies the management of a given endeavor by establishing policies to deal with situations that are likely to occur. Policies are operating rules that can be referred to as a means of maintaining order, security, consistency, or other ways of successfully furthering a goal or mission. PBEM provides a way of managing configuration of network elements, applications, and processes via a set of high-level rules or business policies rather than managing individual elements, thus switching the control to a higher level. This software allows unique management rules (or commands) to be specified and applied to a cross-section of the Global Information Grid (GIG). This software embodies a parser that is capable of recognizing and understanding conversational English. Because all possible dialect variants cannot be anticipated, a unique capability was developed that parses based on conversational intent rather than the exact way the words are used. This software can increase productivity by enabling a user to converse with the system in conversational English to define network policies. PBEM can be used in both manned and unmanned science-gathering programs. Because policy statements can be domain-independent, this software can be applied equally to a wide variety of applications.
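The policy-driven style described here, where high-level rules rather than per-element commands produce configuration, can be sketched as follows; the rule format and field names are hypothetical, not PBEM's actual syntax or API.

```python
# Toy sketch of policy-based management: declarative rules are matched
# against each network element to derive its configuration, instead of
# configuring elements one by one. All names are illustrative.

policies = [
    {"if": {"role": "router"}, "then": {"logging": "verbose"}},
    {"if": {"zone": "dmz"},    "then": {"firewall": "strict"}},
]

def apply_policies(element, policies):
    """Return the configuration a set of policies implies for one element."""
    config = {}
    for rule in policies:
        # A rule fires only if every condition matches the element.
        if all(element.get(k) == v for k, v in rule["if"].items()):
            config.update(rule["then"])
    return config

edge = {"name": "r1", "role": "router", "zone": "dmz"}
print(apply_policies(edge, policies))
```

A natural-language front end like the one the abstract describes would sit above such a rule store, translating conversational intent into entries in the `policies` list.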
International Inventory of Software Packages in the Information Field.
ERIC Educational Resources Information Center
Keren, Carl, Ed.; Sered, Irina, Ed.
Designed to provide guidance in selecting appropriate software for library automation, information storage and retrieval, or management of bibliographic databases, this inventory describes 188 computer software packages. The information was obtained through a questionnaire survey of 600 software suppliers and developers who were asked to describe…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peck, T; Sparkman, D; Storch, N
''The LLNL Site-Specific Advanced Simulation and Computing (ASCI) Software Quality Engineering Recommended Practices VI.I'' document describes a set of recommended software quality engineering (SQE) practices for ASCI code projects at Lawrence Livermore National Laboratory (LLNL). In this context, SQE is defined as the process of building quality into software products by applying the appropriate guiding principles and management practices. Continual code improvement and ongoing process improvement are expected benefits. Certain practices are recommended, although projects may select the specific activities they wish to improve, and the appropriate time lines for such actions. Additionally, projects can rely on the guidance of this document when generating ASCI Verification and Validation (V&V) deliverables. ASCI program managers will gather information about their software engineering practices and improvement. This information can be shared to leverage the best SQE practices among development organizations. It will further be used to ensure the currency and vitality of the recommended practices. This Overview is intended to provide basic information to the LLNL ASCI software management and development staff from the ''LLNL Site-Specific ASCI Software Quality Engineering Recommended Practices VI.I'' document. Additionally, the Overview provides steps to using the ''LLNL Site-Specific ASCI Software Quality Engineering Recommended Practices VI.I'' document. For definitions of terminology and acronyms, refer to the Glossary and Acronyms sections in the ''LLNL Site-Specific ASCI Software Quality Engineering Recommended Practices VI.I''.
NASA Technical Reports Server (NTRS)
Mallasch, Paul G.
1993-01-01
This volume contains the complete software system documentation for the Federal Communications Commission (FCC) Transponder Loading Data Conversion Software (FIX-FCC). This software was written to facilitate the formatting and conversion of FCC Transponder Occupancy (Loading) Data before it is loaded into the NASA Geosynchronous Satellite Orbital Statistics Database System (GSOSTATS). The information that FCC supplies NASA is in report form and must be converted into a form readable by the database management software used in the GSOSTATS application. Both the User's Guide and Software Maintenance Manual are contained in this document. This volume of documentation passed an independent quality assurance review and certification by the Product Assurance and Security Office of the Planning Research Corporation (PRC). The manuals were reviewed for format, content, and readability. The Software Management and Assurance Program (SMAP) life cycle and documentation standards were used in the development of this document. Accordingly, these standards were used in the review. Refer to the System/Software Test/Product Assurance Report for the Geosynchronous Satellite Orbital Statistics Database System (GSOSTATS) for additional information.
Non-Algorithmic Issues in Automated Computational Mechanics
1991-04-30
...Tworzydlo, Senior Research Engineer and Manager of the Advanced Projects Group; Professor J. T. Oden, President and Senior Scientist of COMCO, was project...The lack of practical applications of the systems reported so far is due to the extremely arduous and complex development and management of a realistic knowledge base...software, designed to effectively implement deep, algorithmic knowledge, and "intelligent" software, designed to manage shallow, heuristic...
ERIC Educational Resources Information Center
Kurien, Sam
2013-01-01
The purpose of the study was to explore whether there are relationships between elements of information technology (IT) governance, strategic planning, and strategic functions among senior and mid-level management at medium-scaled software development firms. Several topics and models of IT governance literature were discussed and the gap in…
NASA Technical Reports Server (NTRS)
1981-01-01
Guidelines and recommendations are presented for the collection of software development data. Motivation and planning for, and implementation and management of, a data collection effort are discussed. Topics covered include types, sources, and availability of data; methods and costs of data collection; types of analyses supported; and warnings and suggestions based on software engineering laboratory (SEL) experiences. This document is intended as a practical guide for software managers and engineers, abstracted and generalized from 5 years of SEL data collection.
Natural language processing-based COTS software and related technologies survey.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stickland, Michael G.; Conrad, Gregory N.; Eaton, Shelley M.
Natural language processing-based knowledge management software, traditionally developed for security organizations, is now becoming commercially available. An informal survey was conducted to discover and examine current NLP and related technologies and potential applications for information retrieval, information extraction, summarization, categorization, terminology management, link analysis, and visualization for possible implementation at Sandia National Laboratories. This report documents our current understanding of the technologies, lists software vendors and their products, and identifies potential applications of these technologies.
Space Station Information Systems
NASA Technical Reports Server (NTRS)
Pittman, Clarence W.
1988-01-01
The utility of the Space Station is improved, the ability to manage and integrate its development and operation enhanced, and the cost and risk of developing the software for it is minimized by three major information systems. The Space Station Information System (SSIS) provides for the transparent collection and dissemination of operational information to all users and operators. The Technical and Management Information System (TMIS) provides all the developers with timely and consistent program information and a project management 'window' to assess the project status. The Software Support Environment (SSE) provides automated tools and standards to be used by all software developers. Together, these three systems are vital to the successful execution of the program.
Engineering Documentation and Data Control
NASA Technical Reports Server (NTRS)
Matteson, Michael J.; Bramley, Craig; Ciaruffoli, Veronica
2001-01-01
Mississippi Space Services (MSS), the facility services contractor for NASA's John C. Stennis Space Center (SSC), is utilizing technology to improve engineering documentation and data control. Two identified improvement areas, labor-intensive documentation research and outdated drafting standards, were targeted as top priority. MSS selected AutoManager(R) WorkFlow from Cyco Software to manage engineering documentation. The software is currently installed on over 150 desktops. The outdated SSC drafting standard was written for pre-CADD drafting methods, in other words, board drafting. Implementation of COTS software solutions to manage engineering documentation and update the drafting standard resulted in significant increases in productivity by reducing the time spent searching for documents.
Mission Management Computer Software for RLV-TD
NASA Astrophysics Data System (ADS)
Manju, C. R.; Joy, Josna Susan; Vidya, L.; Sheenarani, I.; Sruthy, C. N.; Viswanathan, P. C.; Dinesh, Sudin; Jayalekshmy, L.; Karuturi, Kesavabrahmaji; Sheema, E.; Syamala, S.; Unnikrishnan, S. Manju; Ali, S. Akbar; Paramasivam, R.; Sheela, D. S.; Shukkoor, A. Abdul; Lalithambika, V. R.; Mookiah, T.
2017-12-01
The Mission Management Computer (MMC) software is responsible for the autonomous navigation, sequencing, guidance and control of the Re-usable Launch Vehicle (RLV), through lift-off, ascent, coasting, re-entry, controlled descent and splashdown. A hard real-time system has been designed for handling the mission requirements in an integrated manner and for meeting the stringent timing constraints. Redundancy management and fault-tolerance techniques are also built into the system, in order to achieve a successful mission even in presence of component failures. This paper describes the functions and features of the components of the MMC software which has accomplished the successful RLV-Technology Demonstrator mission.
Architectures and Evaluation for Adjustable Control Autonomy for Space-Based Life Support Systems
NASA Technical Reports Server (NTRS)
Malin, Jane T.; Schreckenghost, Debra K.
2001-01-01
In the past five years, a number of automation applications for control of crew life support systems have been developed and evaluated in the Adjustable Autonomy Testbed at NASA's Johnson Space Center. This paper surveys progress on an adjustable autonomous control architecture for situations where software and human operators work together to manage anomalies and other system problems. When problems occur, the level of control autonomy can be adjusted, so that operators and software agents can work together on diagnosis and recovery. In 1997 adjustable autonomy software was developed to manage gas transfer and storage in a closed life support test. Four crewmembers lived and worked in a chamber for 91 days, with both air and water recycling. CO2 was converted to O2 by gas processing systems and wheat crops. With the automation software, significantly fewer hours were spent monitoring operations. System-level validation testing of the software by interactive hybrid simulation revealed problems both in software requirements and implementation. Since that time, we have been developing multi-agent approaches for automation software and human operators, to cooperatively control systems and manage problems. Each new capability has been tested and demonstrated in realistic dynamic anomaly scenarios, using the hybrid simulation tool.
NASA Technical Reports Server (NTRS)
Wichmann, Benjamin C.
2013-01-01
I work directly with the System Monitoring and Control (SMC) software engineers who develop, test, and release custom and commercial software in support of the Kennedy Space Center Spaceport Command and Control System (SCCS). SMC uses Commercial Off-The-Shelf (COTS) Enterprise Management Systems (EMS) software, which provides a centralized subsystem for configuring, monitoring, and controlling SCCS hardware and software used in the Control Rooms. There are multiple projects being worked on using the COTS EMS software. I am currently working with the HP Operations Manager for UNIX (OMU) software, which allows Master Console Operators (MCO) to access, view, and interpret messages regarding the status of the SCCS hardware and software. The OMU message browser gets cluttered with messages, which can make it difficult for the MCO to manage. My main project involves determining ways to reduce the number of messages being displayed in the OMU message browser. I plan to accomplish this task in two different ways: (1) by correlating multiple messages into one single message being displayed, and (2) by creating policies that will determine the significance of each message and whether or not it needs to be displayed to the MCO. The core idea is to lessen the number of messages being sent to the OMU message browser so the MCO can use it more effectively.
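The two reduction strategies described, correlating duplicates into a single entry and filtering by significance, can be sketched like this; the message fields and severity scale are hypothetical, not the real OMU schema.

```python
# Illustrative sketch of console-message reduction: (1) filter by a
# severity policy, then (2) collapse duplicates into one correlated
# entry with a count. Field names are invented for illustration.

from collections import Counter

def significant(messages, min_severity=2):
    """Keep only messages at or above a severity threshold."""
    return [m for m in messages if m.get("severity", 0) >= min_severity]

def correlate(messages):
    """Collapse duplicate (source, text) messages into one entry each."""
    counts = Counter((m["source"], m["text"]) for m in messages)
    return [{"source": s, "text": t, "count": n} for (s, t), n in counts.items()]

raw = [
    {"source": "gw1", "text": "link down", "severity": 3},
    {"source": "gw1", "text": "link down", "severity": 3},
    {"source": "ws2", "text": "disk 70% full", "severity": 1},
]
browser = correlate(significant(raw))
print(browser)  # one correlated entry instead of three raw messages
```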
Cost Estimation of Software Development and the Implications for the Program Manager
1992-06-01
Software Lifecycle Model (SLIM), the Jensen System-4 model, the Software Productivity, Quality, and Reliability Estimator (SPQR/20), the Constructive... function models in current use are the Software Productivity, Quality, and Reliability Estimator (SPQR/20) and the Software Architecture Sizing and... Estimator (SPQR/20) was developed by T. Capers Jones of Software Productivity Research, Inc., in 1985. The model is intended to estimate the outcome
DOE Office of Scientific and Technical Information (OSTI.GOV)
WHITE, D.A.
1999-12-29
This Software Configuration Management Plan (SCMP) provides the instructions for change control of the AZ1101 Mixer Pump Demonstration Data Acquisition System (DAS) and the Sludge Mobilization Cart (Gamma Cart) Data Acquisition and Control System (DACS).
From Workstation to Teacher Support System: A Tool to Increase Productivity.
ERIC Educational Resources Information Center
Chen, J. Wey
1989-01-01
Describes a teacher support system which is a computer-based workstation that provides support for teachers and administrators by integrating teacher utility programs, instructional management software, administrative packages, and office automation tools. Hardware is described and software components are explained, including database managers,…
For operation of the Computer Software Management and Information Center (COSMIC)
NASA Technical Reports Server (NTRS)
Carmon, J. L.
1983-01-01
During the month of June, the Survey Research Center (SRC) at the University of Georgia designed new benefits questionnaires for computer software management and information center (COSMIC). As a test of their utility, these questionnaires are now used in the benefits identification process.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-02
... Software Used in Safety Systems of Nuclear Power Plants AGENCY: Nuclear Regulatory Commission. ACTION... Computer Software Used in Safety Systems of Nuclear Power Plants." This RG endorses, with clarifications... Electrical and Electronic Engineers (IEEE) Standard 828-2005, "IEEE Standard for Software Configuration...
Managing configuration software of ground software applications with glueware
NASA Technical Reports Server (NTRS)
Larsen, B.; Herrera, R.; Sesplaukis, T.; Cheng, L.; Sarrel, M.
2003-01-01
This paper reports on a simple, low-cost effort to streamline the configuration of the uplink software tools. Even though the existing ground system consisted of JPL and custom Cassini software rather than COTS, we chose a glueware approach--reintegrating with wrappers and bridges and adding minimal new functionality.
Designing Educational Software for Tomorrow.
ERIC Educational Resources Information Center
Harvey, Wayne
Designed to address the management and use of computer software in education and training, this paper explores both good and poor software design, calling for improvements in the quality of educational software by attending to design considerations that are based on general principles of learning rather than specific educational objectives. This…
Ethics in computer software design and development
Alan J. Thomson; Daniel L. Schmoldt
2001-01-01
Over the past 20 years, computer software has become integral and commonplace for operational and management tasks throughout agricultural and natural resource disciplines. During this software infusion, however, little thought has been afforded human impacts, both good and bad. This paper examines current ethical issues of software system design and development in...
Implementing Large Projects in Software Engineering Courses
ERIC Educational Resources Information Center
Coppit, David
2006-01-01
In software engineering education, large projects are widely recognized as a useful way of exposing students to the real-world difficulties of team software development. But large projects are difficult to put into practice. First, educators rarely have additional time to manage software projects. Second, classrooms have inherent limitations that…
NASA Technical Reports Server (NTRS)
Martin, F. H.
1972-01-01
An overview of the executive system design task is presented. The flight software executive system, software verification, phase B baseline avionics system review, higher order languages and compilers, and computer hardware features are also discussed.
A methodology for producing reliable software, volume 1
NASA Technical Reports Server (NTRS)
Stucki, L. G.; Moranda, P. B.; Foshee, G.; Kirchoff, M.; Omre, R.
1976-01-01
An investigation into the areas having an impact on producing reliable software including automated verification tools, software modeling, testing techniques, structured programming, and management techniques is presented. This final report contains the results of this investigation, analysis of each technique, and the definition of a methodology for producing reliable software.
WISE: Automated support for software project management and measurement. M.S. Thesis
NASA Technical Reports Server (NTRS)
Ramakrishnan, Sudhakar
1995-01-01
One important aspect of software development and IV&V is measurement. Unless a software development effort is measured in some way, it is difficult to judge the effectiveness of current efforts and predict future performance. Collection of metrics and adherence to a process are difficult tasks in a software project. Change activity is a powerful indicator of project status. Automated systems that can handle change requests, issues, and other process documents provide an excellent platform for tracking the status of the project. A World Wide Web based architecture is developed for (a) making metrics collection an implicit part of the software process, (b) providing metric analysis dynamically, (c) supporting automated tools that can complement current practices of in-process improvement, and (d) overcoming geographical barriers. An operational system (WISE) instantiates this architecture, allowing for the improvement of the software process in a realistic environment. The tool tracks issues in the software development process, provides informal communication between users with different roles, supports to-do lists (TDL), and helps in software process improvement. WISE minimizes the time devoted to metrics collection and analysis, and captures software change data. Automated tools like WISE focus on understanding and managing the software process. The goal is improvement through measurement.
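The claim that change activity indicates project status can be illustrated with a toy metric: change requests opened per ISO week. The record fields are invented for the example, not the WISE data model:

```python
from collections import Counter
from datetime import date

# Toy illustration of "change activity is a powerful indicator of project
# status": count change requests opened per ISO week. The record fields
# are invented, not the actual WISE data model.

def change_activity(change_requests):
    """Return the number of change requests opened in each ISO (year, week)."""
    weekly = Counter()
    for cr in change_requests:
        year, week, _ = cr["opened"].isocalendar()
        weekly[(year, week)] += 1
    return dict(weekly)

crs = [
    {"id": 1, "opened": date(1995, 3, 6)},
    {"id": 2, "opened": date(1995, 3, 8)},
    {"id": 3, "opened": date(1995, 3, 15)},
]
```

A rising weekly count would flag churn worth investigating; the point of a system like WISE is that this tabulation happens implicitly as requests flow through it.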
NASA Astrophysics Data System (ADS)
Liang, Likai; Bi, Yushen
Considering the distributed network management system's requirements for distribution, extensibility, and reusability, a framework model for a three-tier distributed network management system based on COM/COM+ and DNA is proposed, which adopts software component technology and the N-tier application software framework design idea. We also give the concrete design plan for each layer of this model. Finally, we discuss the internal running process of each layer in the distributed network management system's framework model.
A novel method for the evaluation of uncertainty in dose-volume histogram computation.
Henríquez, Francisco Cutanda; Castrillón, Silvia Vargas
2008-03-15
Dose-volume histograms (DVHs) are a useful tool in state-of-the-art radiotherapy treatment planning, and it is essential to recognize their limitations. Even after a specific dose-calculation model is optimized, dose distributions computed by using treatment-planning systems are affected by several sources of uncertainty, such as algorithm limitations, measurement uncertainty in the data used to model the beam, and residual differences between measured and computed dose. To take these uncertainties into account, this report introduces a probabilistic approach using a new kind of histogram, the dose-expected volume histogram. The expected value of the volume in the region of interest receiving an absorbed dose equal to or greater than a certain value is found by using the probability distribution of the dose at each point. A rectangular probability distribution is assumed for this point dose, and a formulation that accounts for uncertainties associated with the point dose is presented for practical computations. This method is applied to a set of DVHs for different regions of interest, including 6 brain patients, 8 lung patients, 8 pelvis patients, and 6 prostate patients planned for intensity-modulated radiation therapy. Results show a greater effect on planning target volume coverage than on organs at risk. In cases of steep DVH gradients, such as planning target volumes, this new method shows the largest differences with the corresponding DVH; thus, the effect of the uncertainty is larger.
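The expected-volume computation described above can be sketched in a few lines. This is an illustrative reimplementation, not the authors' code: the voxel doses, voxel volume, and uncertainty half-width u are made-up numbers, and a rectangular (uniform) distribution of half-width u is assumed for each point dose, as in the abstract.

```python
def prob_at_least(d, D, u):
    """P(true dose >= D) when the point dose is uniform on [d - u, d + u]."""
    p = (d + u - D) / (2 * u)
    return min(1.0, max(0.0, p))

def expected_volume(doses, voxel_volume, u, D):
    """Expected volume receiving at least dose D: sum of per-voxel probabilities."""
    return voxel_volume * sum(prob_at_least(d, D, u) for d in doses)

doses = [58.0, 59.5, 60.0, 60.5, 62.0]   # Gy, illustrative voxel doses
# Tabulate the dose-expected volume histogram over 55-65 Gy
evh = [(D, expected_volume(doses, 0.1, 1.5, D)) for D in range(55, 66)]
```

Where the ordinary cumulative DVH steps discontinuously at each voxel dose, this expected-volume curve is smoothed over a width of 2u, which is why the largest deviations from the plain DVH appear where the DVH gradient is steep.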
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xiao, Z; Zou, J; Yue, N
Purpose: To evaluate whether the same DVH constraints used in photon plans can be safely used to plan proton therapy for lung cancer. Since protons and photons have different dose deposition patterns, the hypothesis is that following DVH constraints derived from the photon world is not safe for protons. Methods: We retrospectively evaluated plans for 11 lung cancer patients. Each patient was planned with photons and protons following the same dose constraints. Dose statistics on PTV, normal lung, heart, and esophagus were extracted for comparison. gEUD for normal lung was calculated and compared between proton and photon plans. We calculated a series of gEUDs for each plan by varying the parameter "a" in the gEUD formula from 0.1 to 3, covering the whole confidence interval. Results: For all patients, proton plans yield similar PTV coverage and lower dose to heart and esophagus than photon plans. Normal lung V5 was 32.3% on average in proton plans versus 55.4% in photon plans. Normal lung gEUD monotonically increased with increasing "a" for all proton and photon plans. For a given patient, gEUD-proton(a) had a steeper slope than gEUD-photon(a). The two curves crossed for 8 out of 11 patients when "a" = [0.1, 3]. The a-crossing ranged from 0.8 to 2.44 with an average of 1.15. For a...
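The gEUD sweep described in the Methods can be sketched as follows. The generalized EUD is the power mean (1/N Σ d_i^a)^(1/a); the dose samples below are invented for illustration (the study used full normal-lung dose distributions), and the sign-change crossing search is an assumption about how the "a-crossing" was located.

```python
def gEUD(doses, a):
    """Generalized EUD: the power mean (1/N * sum d_i^a)^(1/a), for a != 0."""
    return (sum(d ** a for d in doses) / len(doses)) ** (1.0 / a)

# Invented normal-lung dose samples (Gy): a hypothetical proton plan
# (many low-dose voxels, a few high) versus a photon plan (broader
# low-dose bath). The study itself used full 3D dose grids.
proton = [0.5, 1.0, 2.0, 8.0, 20.0]
photon = [3.0, 4.0, 5.0, 7.0, 12.0]

a_values = [0.1 + 0.01 * k for k in range(291)]        # a in [0.1, 3.0]
curves = [(a, gEUD(proton, a), gEUD(photon, a)) for a in a_values]

# Locate the "a-crossing" where the two curves intersect, via a sign
# change of the proton-photon difference between adjacent samples.
crossings = [a1 for (a1, gp1, gx1), (_, gp2, gx2) in zip(curves, curves[1:])
             if (gp1 - gx1) * (gp2 - gx2) <= 0]
```

With these toy numbers the proton curve starts below the photon curve at small "a" and overtakes it as "a" grows, reproducing the steeper-slope, single-crossing behavior the abstract reports.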
Taylor, Bruce G; Stein, Nan D; Mumford, Elizabeth A; Woods, Daniel
2013-02-01
We randomly assigned the Shifting Boundaries interventions to 30 public middle schools in New York City, enrolling 117 sixth and seventh grade classes (over 2,500 students) to receive a classroom, a building, a combined, or neither intervention. The classroom intervention included a six-session curriculum emphasizing the laws and consequences for perpetrators of dating violence and sexual harassment (DV/H), the social construction of gender roles, and healthy relationships. The building-based intervention included the use of building-based restraining orders, higher levels of faculty/security presence in safe/unsafe "hot spots" mapped by students, and posters to increase DV/H awareness and reporting. Student surveys were implemented at baseline, immediately after the intervention, and 6-months post-intervention. As hypothesized, behaviors improved as a result of the interventions. The building-only and the combined interventions were effective in reducing sexual violence victimization involving either peers or dating partners at 6-months post-intervention. This was mirrored by reductions in sexual violence perpetration by peers in the building-only intervention. While the preponderance of results indicates that the interventions were effective, an anomalous result (increase in sexual harassment victimization reports that was contradicted by lower frequency estimates) did emerge. However, after analysis these anomalous results were deemed to be most likely spurious. The success of the building-only intervention alone is important because it can be implemented with very few extra costs to schools.
Respiratory gating and multifield technique radiotherapy for esophageal cancer.
Ohta, Atsushi; Kaidu, Motoki; Tanabe, Satoshi; Utsunomiya, Satoru; Sasamoto, Ryuta; Maruyama, Katsuya; Tanaka, Kensuke; Saito, Hirotake; Nakano, Toshimichi; Shioi, Miki; Takahashi, Haruna; Kushima, Naotaka; Abe, Eisuke; Aoyama, Hidefumi
2017-03-01
To investigate the effects of respiratory gating and multifield techniques on the dose-volume histogram (DVH) in radiotherapy for esophageal cancer. Twenty patients who underwent four-dimensional computed tomography for esophageal cancer were included. We retrospectively created four treatment plans for each patient, with or without the respiratory gating and multifield techniques: No gating-2-field, No gating-4-field, Gating-2-field, and Gating-4-field plans. We compared the DVH parameters of the lung and heart in the No gating-2-field plan with the other three plans. In this comparison, there were significant differences in the Lung V5Gy, V20Gy, and mean dose for all three plans; in the Heart V25Gy-V40Gy for the Gating-2-field plan; in V35Gy, V40Gy, and mean dose for the No gating-4-field plan; and in V30Gy-V40Gy and mean dose for the Gating-4-field plan. The lung parameters were smaller in the Gating-2-field plan and larger in the No gating-4-field and Gating-4-field plans. The heart parameters were all larger in the No gating-2-field plan. The lung parameters were reduced by the respiratory gating technique and increased by the multifield technique. The heart parameters were reduced by both techniques. It is important to select the optimal technique according to the risk of complications.
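The DVH parameters compared above have a simple definition: V_xGy is the volume receiving at least x Gy. A minimal sketch with made-up voxel doses rather than a real dose grid:

```python
# Minimal sketch of the DVH parameters compared in the abstract, using
# made-up lung voxel doses rather than a real dose grid. V_xGy is taken
# here as absolute volume (cc) receiving at least x Gy.

def v_dose(doses, voxel_volume, threshold_gy):
    """V_xGy: volume (in the units of voxel_volume) receiving >= threshold_gy."""
    return voxel_volume * sum(1 for d in doses if d >= threshold_gy)

def mean_dose(doses):
    return sum(doses) / len(doses)

lung = [1.0, 4.0, 6.0, 12.0, 22.0, 30.0]   # Gy, illustrative
v5 = v_dose(lung, voxel_volume=2.0, threshold_gy=5.0)
v20 = v_dose(lung, voxel_volume=2.0, threshold_gy=20.0)
```

Dividing by the total structure volume would give the relative form (percent of organ volume), which is how lung V5Gy and V20Gy are usually reported.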
Kawakami, Shogo; Ishiyama, Hiromichi; Satoh, Takefumi; Tsumura, Hideyasu; Sekiguchi, Akane; Takenaka, Kouji; Tabata, Ken-Ichi; Iwamura, Masatsugu; Hayakawa, Kazushige
2017-08-01
To compare prostate contours on conventional stepping transverse image acquisitions with those on twister-based sagittal image acquisitions. Twenty prostate cancer patients who were planned to have permanent interstitial prostate brachytherapy were prospectively accrued. A transrectal ultrasonography probe was inserted with the patient in the lithotomy position. Transverse images were obtained with stepping movement of the transverse transducer. In the same patient, sagittal images were also obtained through rotation of the sagittal transducer using the "Twister" mode. The differences in prostate size between the two types of image acquisitions were compared. The relationships among the differences between the two types of image acquisitions, dose-volume histogram (DVH) parameters on the post-implant computed tomography (CT) analysis, and other factors were analyzed. The sagittal image acquisitions showed a larger prostate size compared to the transverse image acquisitions, especially in the anterior-posterior (AP) direction (p < 0.05). Interestingly, the relative size of the prostate apex in the AP direction in sagittal image acquisitions compared to that in transverse image acquisitions was correlated with DVH parameters such as D90 (R = 0.518, p = 0.019) and V100 (R = 0.598, p = 0.005). There were small but significant differences in the prostate contours between the transverse and the sagittal planning image acquisitions. Furthermore, our study suggested that the differences between the two types of image acquisitions might correlate with dosimetric results on CT analysis.
Fukugawa, Yoshiyuki; Namimoto, Tomohiro; Toya, Ryo; Saito, Tetsuo; Yuki, Hideaki; Matsuyama, Tomohiko; Ikeda, Osamu; Yamashita, Yasuyuki; Oya, Natsuo
2017-02-01
Focal liver reaction (FLR) appears in the hepatobiliary-phase images of gadolinium-ethoxybenzyl-diethylenetriamine pentaacetic acid-enhanced magnetic resonance imaging (Gd-EOB-DTPA-enhanced MRI) following radiotherapy (RT). We investigated the threshold dose (TD) for FLR development in 13 patients with hepatocellular carcinoma (HCC) who underwent three-dimensional conformal radiotherapy (3D-CRT) with 45 Gy in 15 fractions. FLR volumes (FLRVs) were calculated based on planning CT images by referring to fused hepatobiliary-phase images. We also calculated the TD and the irradiated volumes (IVs) of the liver parenchyma at every 5-Gy dose level (IVdose) based on a dose-volume histogram (DVH). The median TD was 35.2 Gy. The median IV20, IV25, IV30, IV35, IV40, and IV45 values were 371.1, 274.8, 233.4, 188.6, 145.8, and 31.0 ml, respectively. The median FLRV was 144.9 ml. There was a significant difference between the FLRV and IV20, IV25, and IV45 (p<0.05), but no significant differences between the FLRV and IV30, IV35, or IV40. These results suggest that the threshold dose of the FLR is approx. 35 Gy in HCC patients who undergo 3D-CRT in 15 fractions. The percentage of the whole liver volume receiving a dose of more than 30-40 Gy (V30-40) is a potential candidate optimal DVH parameter for this fractionation schedule.
The NCC project: A quality management perspective
NASA Technical Reports Server (NTRS)
Lee, Raymond H.
1993-01-01
The Network Control Center (NCC) Project introduced the concept of total quality management (TQM) in mid-1990. The CSC project team established a program which focused on continuous process improvement in software development methodology and consistent deliveries of high quality software products for the NCC. The vision of the TQM program was to produce error free software. Specific goals were established to allow continuing assessment of the progress toward meeting the overall quality objectives. The total quality environment, now a part of the NCC Project culture, has become the foundation for continuous process improvement and has resulted in the consistent delivery of quality software products over the last three years.
STGT program: Ada coding and architecture lessons learned
NASA Technical Reports Server (NTRS)
Usavage, Paul; Nagurney, Don
1992-01-01
STGT (Second TDRSS Ground Terminal) is currently halfway through the System Integration Test phase (Level 4 Testing). To date, many software architecture and Ada language issues have been encountered and solved. This paper, which is the transcript of a presentation at the 3 Dec. meeting, attempts to define these lessons plus others learned regarding software project management and risk management issues, training, performance, reuse, and reliability. Observations are included regarding the use of particular Ada coding constructs, software architecture trade-offs during the prototyping, development and testing stages of the project, and dangers inherent in parallel or concurrent systems, software, hardware, and operations engineering.
NASA Astrophysics Data System (ADS)
Hussain, Azham; Mkpojiogu, Emmanuel O. C.; Abdullah, Inam
2016-08-01
Requirements Engineering (RE) is a systemic and integrated process of eliciting, elaborating, negotiating, validating, and managing the requirements of a system in a software development project. UUM has been supported by various systems developed and maintained by the UUM Information Technology (UUMIT) Centre. The aim of this study was to assess the current requirements engineering practices at UUMIT. The main problem that prompted this research is the lack of studies that support software development activities at the UUMIT. The study is geared toward helping UUMIT produce quality but time- and cost-saving software products by implementing cutting-edge and state-of-the-art requirements engineering practices. Also, the study contributes to UUM by identifying the activities needed for software development so that management will be able to allocate budget to provide adequate and precise training for the software developers. Three variables were investigated: Requirements Description, Requirements Development (comprising Requirements Elicitation, Requirements Analysis and Negotiation, and Requirements Validation), and Requirements Management. The results from the study showed that the current practice of requirements engineering in UUMIT is encouraging, but still needs further development and improvement because a few RE practices were seldom practiced.
Design of a secure remote management module for a software-operated medical device.
Burnik, Urban; Dobravec, Štefan; Meža, Marko
2017-12-09
Software-based medical devices need to be maintained throughout their entire life cycle. The efficiency of after-sales maintenance can be improved by managing medical systems remotely. This paper presents how to design the remote access function extensions in order to prevent risks imposed by uncontrolled remote access. A thorough analysis of standards and legislation requirements regarding safe operation and risk management of medical devices is presented. Based on the formal requirements, a multi-layer machine design solution is proposed that eliminates remote connectivity risks by strict separation of regular device functionalities from remote management service, deploys encrypted communication links and uses digital signatures to prevent mishandling of software images. The proposed system may also be used as an efficient version update of the existing medical device designs.
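A minimal sketch of the signed-image check described above. For brevity it uses an HMAC from the Python standard library as a stand-in for a true asymmetric signature; the paper's design would verify a vendor-issued digital signature (e.g. RSA or ECDSA) against a public key instead, and the key and image bytes here are illustrative.

```python
import hashlib
import hmac

# Sketch of "digital signatures to prevent mishandling of software
# images". An HMAC stands in for a real asymmetric signature; a
# production device would verify an RSA/ECDSA signature against a
# vendor public key. Key and image bytes are illustrative.

def sign_image(image: bytes, key: bytes) -> str:
    return hmac.new(key, image, hashlib.sha256).hexdigest()

def verify_image(image: bytes, key: bytes, signature: str) -> bool:
    # constant-time comparison avoids leaking where the tags differ
    return hmac.compare_digest(sign_image(image, key), signature)

key = b"vendor-secret"                      # illustrative key material
firmware = b"\x7fELF...device firmware..."  # illustrative image bytes
tag = sign_image(firmware, key)
```

The remote-management service would refuse to install any image whose signature check fails, which is the mishandling protection the design calls for; a tampered image changes the digest and the verification returns False.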
Increasing productivity through Total Reuse Management (TRM)
NASA Technical Reports Server (NTRS)
Schuler, M. P.
1991-01-01
Total Reuse Management (TRM) is a new concept currently being promoted by the NASA Langley Software Engineering and Ada Lab (SEAL). It uses concepts similar to those promoted in Total Quality Management (TQM). Both technical and management personnel are continually encouraged to think in terms of reuse. Reuse is not something that is aimed for after a product is completed, but rather it is built into the product from inception through development. Lowering software development costs, reducing risk, and increasing code reliability are the more prominent goals of TRM. Procedures and methods used to adopt and apply TRM are described. Reuse is frequently thought of as only being applicable to code. However, reuse can apply to all products and all phases of the software life cycle. These products include management and quality assurance plans, designs, and testing procedures. Specific examples of successfully reused products are given and future goals are discussed.
Johnston, Sharon; Wong, Sabrina T; Blackman, Stephanie; Chau, Leena W; Grool, Anne M; Hogg, William
2017-11-16
Recruiting family physicians into primary care research studies requires researchers to continually manage information coming in, going out, and coming in again. In many research groups, Microsoft Excel and Access are the usual data management tools, but they are very basic and do not support any automation, linking, or reminder systems to manage and integrate recruitment information and processes. We explored whether a commercial customer relationship management (CRM) software program - designed for sales people in businesses to improve customer relations and communications - could be used to make the research recruitment system faster, more effective, and more efficient. We found that while there was potential for long-term studies, it simply did not adapt effectively enough for our shorter study and recruitment budget. The amount of training required to master the software and our need for ongoing flexible and timely support were greater than the benefit of using CRM software for our study.
Development of a change management system
NASA Technical Reports Server (NTRS)
Parks, Cathy Bonifas
1993-01-01
The complexity and interdependence of software on a computer system can create a situation where a solution to one problem causes failures in dependent software. In the computer industry, software problems arise and are often solved with 'quick and dirty' solutions. But in implementing these solutions, documentation about the solution or user notification of changes is often overlooked, and new problems are frequently introduced because of insufficient review or testing. These problems increase when numerous heterogeneous systems are involved. Because of this situation, a change management system plays an integral part in the maintenance of any multisystem computing environment. At the NASA Ames Advanced Computational Facility (ACF), the Online Change Management System (OCMS) was designed and developed to manage the changes being applied to its multivendor computing environment. This paper documents the research, design, and modifications that went into the development of this change management system (CMS).
Using Knowledge Management to Revise Software-Testing Processes
ERIC Educational Resources Information Center
Nogeste, Kersti; Walker, Derek H. T.
2006-01-01
Purpose: This paper aims to use a knowledge management (KM) approach to effectively revise a utility retailer's software testing process. This paper presents a case study of how the utility organisation's customer services IT production support group improved their test planning skills through applying the American Productivity and Quality Center…
Power Electronics and Electric Machines Publications | Transportation
electric machines. For more information about the following publications, contact Sreekant Narumanchi. NREL Software: Spray System Evaluation (1.1 MB). Papers, 2017: Electric Motor Thermal Management. Douglas DeVoto. 2017. 14 pp. NREL/MP-5400-67117. Power Electronics Thermal Management Research.
Bibliographic Management Software Seminars: Funding and Implementation.
ERIC Educational Resources Information Center
Henry, Marcia
This paper contains the grant proposal and final report for a project conducted by the California State University at Northridge library to demonstrate online database searching and introduce the use of bibliographic management software to faculty and graduate students. Day-long, discipline-oriented seminars were planned to increase the…
Building Databases for Education. ERIC Digest.
ERIC Educational Resources Information Center
Klausmeier, Jane A.
This digest provides a brief explanation of what a database is; explains how a database can be used; identifies important factors that should be considered when choosing database management system software; and provides citations to sources for finding reviews and evaluations of database management software. The digest is concerned primarily with…
Database Software Selection for the Egyptian National STI Network.
ERIC Educational Resources Information Center
Slamecka, Vladimir
The evaluation and selection of information/data management system software for the Egyptian National Scientific and Technical (STI) Network are described. An overview of the state-of-the-art of database technology elaborates on the differences between information retrieval and database management systems (DBMS). The desirable characteristics of…
Archiving a Software Development Project
2013-04-01
an ongoing monitoring system that identifies attempts and requests for retrieval, and ensures that the attempts and requests cannot proceed without... Intelligence Division. Peter Fisher has worked as a consultant, systems analyst, software developer, and project manager in Australia, Holland, and the USA... 3.1.3 DRMS - Defence Records Management System
Frameworks Coordinate Scientific Data Management
NASA Technical Reports Server (NTRS)
2012-01-01
Jet Propulsion Laboratory computer scientists developed a unique software framework to help NASA manage its massive amounts of science data. Through a partnership with the Apache Software Foundation of Forest Hill, Maryland, the technology is now available as an open-source solution and is in use by cancer researchers and pediatric hospitals.
2003-03-01
private sector. Researchers have also identified software acquisitions as one of the major differences between the private sector and public sector MIS. This indicates that the elements for a successful software project in the public sector may be different from the private sector. Private sector project success depends on many elements. Three of them are user interaction with the project's development, critical success factors, and how the project manager prioritizes the traditional success criteria.
NASA Technical Reports Server (NTRS)
1994-01-01
A software management system, originally developed for Goddard Space Flight Center (GSFC) by Century Computing, Inc., has evolved from a menu and command oriented system to a state-of-the-art user interface development system supporting high resolution graphics workstations. Transportable Applications Environment (TAE) was initially distributed through COSMIC and backed by a TAE support office at GSFC. In 1993, Century Computing assumed the support and distribution functions and began marketing TAE Plus, the system's latest version. The software is easy to use and does not require programming experience.
NASA Astrophysics Data System (ADS)
Barlow, P. M.; Filali-Meknassi, Y.; Sanford, W. E.; Winston, R. B.; Kuniansky, E.; Dawson, C.
2015-12-01
UNESCO's HOPE Initiative—the Hydro Free and (or) Open-source Platform of Experts—was launched in June 2013 as part of UNESCO's International Hydrological Programme. The Initiative arose in response to a recognized need to make free and (or) open-source water-resources software more widely accessible to Africa's water sector. A kit of software is being developed to provide African water authorities, teachers, university lecturers, and researchers with a set of programs that can be enhanced and (or) applied to the development of efficient and sustainable management strategies for Africa's water resources. The Initiative brings together experts from the many fields of water resources to identify software that might be included in the kit, to oversee an objective process for selecting software for the kit, and to engage in training and other modes of capacity building to enhance dissemination of the software. To date, teams of experts from the fields of wastewater treatment, groundwater hydrology, surface-water hydrology, and data management have been formed to identify relevant software from their respective fields. An initial version of the HOPE Software Kit was released in late August 2014 and consists of the STOAT model for wastewater treatment developed by the Water Research Center (United Kingdom) and the MODFLOW-2005 model for groundwater-flow simulation developed by the U.S. Geological Survey. The Kit is available on the UNESCO HOPE website (http://www.hope-initiative.net/). Training in the theory and use of MODFLOW-2005 is planned in southern Africa in conjunction with UNESCO's study of the Kalahari-Karoo/Stampriet Transboundary Aquifer, which extends over an area that includes parts of Botswana, Namibia, and South Africa, and in support of the European Commission's Horizon 2020 FREEWAT project (FREE and open source software tools for WATer resource management; see the UNESCO HOPE website).
RICIS Software Engineering 90 Symposium: Aerospace Applications and Research Directions Proceedings
NASA Technical Reports Server (NTRS)
1990-01-01
Papers presented at RICIS Software Engineering Symposium are compiled. The following subject areas are covered: synthesis - integrating product and process; Serpent - a user interface management system; prototyping distributed simulation networks; and software reuse.
A study of software management and guidelines for flight projects
NASA Technical Reports Server (NTRS)
1980-01-01
A survey of present software development policies and practices, and an analysis of these policies and practices are summarized. Background information necessary to assess the adequacy of present NASA flight software development approaches is presented.
Data systems and computer science: Software Engineering Program
NASA Technical Reports Server (NTRS)
Zygielbaum, Arthur I.
1991-01-01
An external review of the Integrated Technology Plan for the Civil Space Program is presented. This review is specifically concerned with the Software Engineering Program. The goals of the Software Engineering Program are as follows: (1) improve NASA's ability to manage development, operation, and maintenance of complex software systems; (2) decrease NASA's cost and risk in engineering complex software systems; and (3) provide technology to assure safety and reliability of software in mission critical applications.
Insights into software development in Japan
NASA Technical Reports Server (NTRS)
Duvall, Lorraine M.
1992-01-01
The interdependence of the U.S.-Japanese economies makes it imperative that we in the United States understand how business and technology developments take place in Japan. We can gain insight into these developments in software engineering by studying the context in which Japanese software is developed, the practices that are used, the problems encountered, the setting surrounding these problems, and the resolution of these problems. Context includes the technological and sociological characteristics of the software development environment, the software processes applied, personnel involved in the development process, and the corporate and social culture surrounding the development. Presented in this paper is a summary of results of a study that addresses these issues. Data for this study was collected during a three month visit to Japan where the author interviewed 20 software managers representing nine companies involved in developing software in Japan. These data are compared to similar data from the United States in which 12 managers from five companies were interviewed.
SAGA: A project to automate the management of software production systems
NASA Technical Reports Server (NTRS)
Campbell, Roy H.; Beckman-Davies, C. S.; Benzinger, L.; Beshers, G.; Laliberte, D.; Render, H.; Sum, R.; Smith, W.; Terwilliger, R.
1986-01-01
Research into software development is required to reduce its production cost and to improve its quality. Modern software systems, such as the embedded software required for NASA's space station initiative, stretch current software engineering techniques. The requirements to build large, reliable, and maintainable software systems increase with time. Much theoretical and practical research is in progress to improve software engineering techniques. One such technique is to build a software system or environment which directly supports the software engineering process. This is the aim of the SAGA project, which comprises the research necessary to design and build a software development environment that automates the software engineering process. Progress under SAGA is described.
Utilizing knowledge from prior plans in the evaluation of quality assurance
NASA Astrophysics Data System (ADS)
Stanhope, Carl; Wu, Q. Jackie; Yuan, Lulin; Liu, Jianfei; Hood, Rodney; Yin, Fang-Fang; Adamson, Justus
2015-06-01
Increased interest regarding the sensitivity of pre-treatment intensity modulated radiotherapy and volumetric modulated arc radiotherapy (VMAT) quality assurance (QA) to delivery errors has led to the development of dose-volume histogram (DVH) based analysis. This paradigm shift necessitates a change in the acceptance criteria and action tolerance for QA. Here we present a knowledge-based technique to objectively quantify degradations in DVH for prostate radiotherapy. Using machine learning, organ-at-risk (OAR) DVHs from a population of 198 prior patients' plans were adapted to a test patient's anatomy to establish patient-specific DVH ranges. This technique was applied to single-arc prostate VMAT plans to evaluate various simulated delivery errors: systematic single leaf offsets, systematic leaf bank offsets, random normally distributed leaf fluctuations, systematic lag in gantry angle of the multi-leaf collimators (MLCs), fluctuations in dose rate, and delivery of each VMAT arc with a constant rather than variable dose rate. Quantitative Analyses of Normal Tissue Effects in the Clinic suggests V75Gy dose limits of 15% for the rectum and 25% for the bladder; however, the knowledge-based constraints were more stringent: 8.48 ± 2.65% for the rectum and 4.90 ± 1.98% for the bladder. 19 ± 10 mm single leaf and 1.9 ± 0.7 mm single bank offsets resulted in rectum DVHs worse than 97.7% (2σ) of clinically accepted plans. PTV degradations fell outside of the acceptable range for 0.6 ± 0.3 mm leaf offsets, 0.11 ± 0.06 mm bank offsets, 0.6 ± 1.3 mm of random noise, and 1.0 ± 0.7° of gantry-MLC lag. Utilizing a training set comprised of prior treatment plans, machine learning is used to predict a range of achievable DVHs for the test patient's anatomy. Consequently, degradations leading to statistical outliers may be identified.
A knowledge based QA evaluation enables customized QA criteria per treatment site, institution and/or physician and can often be more sensitive to errors than criteria based on organ complication rates.
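The outlier test the abstract describes can be sketched in a few lines: a plan's DVH metric is flagged when it falls outside the range observed in a knowledge-based population of prior accepted plans. This is a minimal illustration with invented rectum V75Gy values, not the study's model (which adapts full DVHs to the test patient's anatomy via machine learning).

```python
# Hedged sketch: flag a QA metric as an outlier against a population of
# prior clinically accepted plans. All numbers below are invented.
from statistics import mean, stdev

def is_dvh_outlier(prior_values, new_value, n_sigma=2.0):
    """Return True if new_value exceeds the population mean by more than
    n_sigma standard deviations (one-sided: a higher OAR dose is worse)."""
    mu = mean(prior_values)
    sigma = stdev(prior_values)
    return new_value > mu + n_sigma * sigma

# Hypothetical rectum V75Gy values (%) from prior accepted plans:
prior_v75 = [8.1, 7.9, 9.5, 6.8, 8.8, 10.2, 7.4, 9.0]
print(is_dvh_outlier(prior_v75, 13.5))  # far above the accepted range
print(is_dvh_outlier(prior_v75, 8.0))   # well inside the accepted range
```

A 2σ threshold corresponds to the 97.7% one-sided criterion quoted in the abstract.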
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, K; Zhou, L; Chen, Z
Purpose: RapidPlan uses a library consisting of expert plans from different patients to create a model that can predict achievable dose-volume histograms (DVHs) for new patients. The goal of this study is to investigate the impact of model library population (plan numbers) on DVH prediction for rectal cancer patients treated with volumetric-modulated arc therapy (VMAT). Methods: Ninety clinically accepted rectal cancer patients' VMAT plans were selected to establish 3 models, named Model30, Model60 and Model90, with 30, 60, and 90 plans in the model training. All plans had sufficient target coverage and bladder and femora sparing. An additional 10 patients were enrolled to test the DVH prediction differences with these 3 models. The predicted DVHs from these 3 models were compared and analyzed. Results: Predicted V40 (Vx, percent of volume that received x Gy for the organs at risk) and Dmean (mean dose, cGy) of the bladder were 39.84±13.38 and 2029.4±141.6 for the Model30, 37.52±16.00 and 2012.5±152.2 for the Model60, and 36.33±18.35 and 2066.5±174.3 for the Model90. Predicted V30 and Dmean of the left femur were 23.33±9.96 and 1443.3±114.5 for the Model30, 21.83±5.75 and 1436.6±61.9 for the Model60, and 20.31±4.6 and 1415.0±52.4 for the Model90. There were no significant differences among the 3 models for the bladder and left femur predictions. Predicted V40 and Dmean of the right femur were 19.86±10.00 and 1403.6±115.6 (Model30), 18.97±6.19 and 1401.9±68.78 (Model60), and 21.08±7.82 and 1424.0±85.3 (Model90). Although a slightly lower DVH prediction of the right femur was found for the Model60, the mean differences for V30 and mean dose were less than 2% and 1%, respectively. Conclusion: There were no significant differences among Model30, Model60 and Model90 for predicting DVHs for rectal patients treated with VMAT. The impact of plan numbers for the model library might be limited for cancers with similar target shapes.
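The quantities being compared across the three models, Vx and Dmean, are simple functionals of the dose distribution within a structure. A minimal sketch under the usual definitions (the voxel doses below are made up for illustration, not data from the study):

```python
# Vx: percent of structure volume receiving at least x Gy.
# Dmean: mean dose over the structure's voxels.

def v_x(voxel_doses_gy, threshold_gy):
    """Percent of voxels at or above threshold_gy (equal-volume voxels)."""
    hit = sum(1 for d in voxel_doses_gy if d >= threshold_gy)
    return 100.0 * hit / len(voxel_doses_gy)

def d_mean(voxel_doses_gy):
    """Mean dose over the structure."""
    return sum(voxel_doses_gy) / len(voxel_doses_gy)

# Invented bladder voxel doses (Gy):
bladder = [12.0, 25.5, 41.2, 44.0, 18.3, 39.9, 47.1, 8.6]
print(v_x(bladder, 40.0))  # percent of volume receiving >= 40 Gy
print(d_mean(bladder))     # mean dose in Gy
```

In clinical systems these sums run over a resampled structure mask on the dose grid, which is where the supersampling and interpolation choices discussed in the header abstract come in.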
Software control and system configuration management - A process that works
NASA Technical Reports Server (NTRS)
Petersen, K. L.; Flores, C., Jr.
1983-01-01
A comprehensive software control and system configuration management process for flight-crucial digital control systems of advanced aircraft has been developed and refined to ensure efficient flight system development and safe flight operations. Because of the highly complex interactions among the hardware, software, and system elements of state-of-the-art digital flight control system designs, a systems-wide approach to configuration control and management has been used. Specific procedures are implemented to govern discrepancy reporting and reconciliation, software and hardware change control, systems verification and validation testing, and formal documentation requirements. An active and knowledgeable configuration control board reviews and approves all flight system configuration modifications and revalidation tests. This flexible process has proved effective during the development and flight testing of several research aircraft and remotely piloted research vehicles with digital flight control systems that ranged from relatively simple to highly complex, integrated mechanizations.
Software life cycle dynamic simulation model: The organizational performance submodel
NASA Technical Reports Server (NTRS)
Tausworthe, Robert C.
1985-01-01
The submodel structure of a software life cycle dynamic simulation model is described. The software process is divided into seven phases, each with product, staff, and funding flows. The model is subdivided into an organizational response submodel, a management submodel, a management influence interface, and a model analyst interface. The concentration here is on the organizational response model, which simulates the performance characteristics of a software development subject to external and internal influences. These influences emanate from two sources: the model analyst interface, which configures the model to simulate the response of an implementing organization subject to its own internal influences, and the management submodel that exerts external dynamic control over the production process. A complete characterization is given of the organizational response submodel in the form of parameterized differential equations governing product, staffing, and funding levels. The parameter values and functions are allocated to the two interfaces.
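The abstract describes the organizational response submodel as parameterized differential equations governing product, staffing, and funding levels, without reproducing them. The toy system below merely illustrates that form with a forward-Euler step; the rates and couplings are invented placeholders, not the model's actual parameters.

```python
# Toy sketch of coupled first-order flows for product, staff, and funding,
# integrated with forward Euler. All parameter values are invented.

def simulate(steps=100, dt=0.1, productivity=0.5, hire_rate=0.2,
             burn_rate=1.0, staff_target=10.0, funding=100.0):
    product, staff = 0.0, 0.0
    for _ in range(steps):
        if funding <= 0:                 # work stops when funding runs out
            break
        d_product = productivity * staff              # output scales with staff
        d_staff = hire_rate * (staff_target - staff)  # staffing approaches target
        d_funding = -burn_rate * staff                # funding drains with headcount
        product += d_product * dt
        staff += d_staff * dt
        funding += d_funding * dt
    return product, staff, funding

p, s, f = simulate()
print(round(p, 2), round(s, 2), round(f, 2))
```

In the actual model, the management submodel and the analyst interface would set and perturb these parameters dynamically rather than leaving them constant.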
Marshall Space Flight Center Ground Systems Development and Integration
NASA Technical Reports Server (NTRS)
Wade, Gina
2016-01-01
Ground Systems Development and Integration performs a variety of tasks in support of the Mission Operations Laboratory (MOL) and other Center and Agency projects. These tasks include various systems engineering processes such as performing system requirements development, system architecture design, integration, verification and validation, software development, and sustaining engineering of mission operations systems, and have evolved the Huntsville Operations Support Center (HOSC) into a leader in remote operations for current and future NASA space projects. The group is also responsible for developing and managing telemetry and command configuration and calibration databases. Personnel are responsible for maintaining and enhancing their disciplinary skills in the areas of project management, software engineering, software development, software process improvement, telecommunications, networking, and systems management. Domain expertise in the ground systems area is also maintained and includes detailed proficiency in the areas of real-time telemetry systems, command systems, voice, video, data networks, and mission planning systems.
NASA Technical Reports Server (NTRS)
Felippa, Carlos A.
1989-01-01
This is the fifth of a set of five volumes which describe the software architecture for the Computational Structural Mechanics Testbed. Derived from NICE, an integrated software system developed at Lockheed Palo Alto Research Laboratory, the architecture is composed of the command language (CLAMP), the command language interpreter (CLIP), and the data manager (GAL). Volumes 1, 2, and 3 (NASA CR's 178384, 178385, and 178386, respectively) describe CLAMP and CLIP and the CLIP-processor interface. Volumes 4 and 5 (NASA CR's 178387 and 178388, respectively) describe GAL and its low-level I/O. CLAMP, an acronym for Command Language for Applied Mechanics Processors, is designed to control the flow of execution of processors written for NICE. Volume 5 describes the low-level data management component of the NICE software. It is intended only for advanced programmers involved in maintenance of the software.
Software control and system configuration management: A systems-wide approach
NASA Technical Reports Server (NTRS)
Petersen, K. L.; Flores, C., Jr.
1984-01-01
A comprehensive software control and system configuration management process for flight-crucial digital control systems of advanced aircraft has been developed and refined to ensure efficient flight system development and safe flight operations. Because of the highly complex interactions among the hardware, software, and system elements of state-of-the-art digital flight control system designs, a systems-wide approach to configuration control and management has been used. Specific procedures are implemented to govern discrepancy reporting and reconciliation, software and hardware change control, systems verification and validation testing, and formal documentation requirements. An active and knowledgeable configuration control board reviews and approves all flight system configuration modifications and revalidation tests. This flexible process has proved effective during the development and flight testing of several research aircraft and remotely piloted research vehicles with digital flight control systems that ranged from relatively simple to highly complex, integrated mechanizations.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-06
..., ``Configuration Management Plans for Digital Computer Software used in Safety Systems of Nuclear Power Plants... Digital Computer Software Used in Safety Systems of Nuclear Power Plants AGENCY: Nuclear Regulatory..., Reviews, and Audits for Digital Computer Software Used in Safety Systems of Nuclear Power Plants.'' This...
Experimental Evaluation of a Serious Game for Teaching Software Process Modeling
ERIC Educational Resources Information Center
Chaves, Rafael Oliveira; von Wangenheim, Christiane Gresse; Furtado, Julio Cezar Costa; Oliveira, Sandro Ronaldo Bezerra; Santos, Alex; Favero, Eloi Luiz
2015-01-01
Software process modeling (SPM) is an important area of software engineering because it provides a basis for managing, automating, and supporting software process improvement (SPI). Teaching SPM is a challenging task, mainly because it lays great emphasis on theory and offers few practical exercises. Furthermore, as yet few teaching approaches…
Guidance and Control Software Project Data - Volume 1: Planning Documents
NASA Technical Reports Server (NTRS)
Hayhurst, Kelly J. (Editor)
2008-01-01
The Guidance and Control Software (GCS) project was the last in a series of software reliability studies conducted at Langley Research Center between 1977 and 1994. The technical results of the GCS project were recorded after the experiment was completed. Some of the support documentation produced as part of the experiment, however, is serving an unexpected role far beyond its original project context. Some of the software used as part of the GCS project was developed to conform to the RTCA/DO-178B software standard, "Software Considerations in Airborne Systems and Equipment Certification," used in the civil aviation industry. That standard requires extensive documentation throughout the software development life cycle, including plans, software requirements, design and source code, verification cases and results, and configuration management and quality control data. The project documentation that includes this information is open for public scrutiny without the legal or safety implications associated with comparable data from an avionics manufacturer. This public availability has afforded an opportunity to use the GCS project documents for DO-178B training. This report provides a brief overview of the GCS project, describes the 4-volume set of documents and the role they are playing in training, and includes the planning documents from the GCS project. Volume 1 contains five appendices: A. Plan for Software Aspects of Certification for the Guidance and Control Software Project; B. Software Development Standards for the Guidance and Control Software Project; C. Software Verification Plan for the Guidance and Control Software Project; D. Software Configuration Management Plan for the Guidance and Control Software Project; and E. Software Quality Assurance Activities.
Software Formal Inspections Guidebook
NASA Technical Reports Server (NTRS)
1993-01-01
The Software Formal Inspections Guidebook is designed to support the inspection process of software developed by and for NASA. This document provides information on how to implement a recommended and proven method for conducting formal inspections of NASA software. This Guidebook is a companion document to NASA Standard 2202-93, Software Formal Inspections Standard, approved April 1993, which provides the rules, procedures, and specific requirements for conducting software formal inspections. Application of the Formal Inspections Standard is optional to NASA program or project management. In cases where program or project management decide to use the formal inspections method, this Guidebook provides additional information on how to establish and implement the process. The goal of the formal inspections process as documented in the above-mentioned Standard and this Guidebook is to provide a framework and model for an inspection process that will enable the detection and elimination of defects as early as possible in the software life cycle. An ancillary aspect of the formal inspection process incorporates the collection and analysis of inspection data to effect continual improvement in the inspection process and the quality of the software subjected to the process.
Investigating interoperability of the LSST data management software stack with Astropy
NASA Astrophysics Data System (ADS)
Jenness, Tim; Bosch, James; Owen, Russell; Parejko, John; Sick, Jonathan; Swinbank, John; de Val-Borro, Miguel; Dubois-Felsmann, Gregory; Lim, K.-T.; Lupton, Robert H.; Schellart, Pim; Krughoff, K. S.; Tollerud, Erik J.
2016-07-01
The Large Synoptic Survey Telescope (LSST) will be an 8.4m optical survey telescope sited in Chile and capable of imaging the entire sky twice a week. The data rate of approximately 15TB per night and the requirements to both issue alerts on transient sources within 60 seconds of observing and create annual data releases mean that automated data management systems and data processing pipelines are a key deliverable of the LSST construction project. The LSST data management software has been in development since 2004 and is based on a C++ core with a Python control layer. The software consists of nearly a quarter of a million lines of code covering the system from fundamental WCS and table libraries to pipeline environments and distributed process execution. The Astropy project began in 2011 as an attempt to bring together disparate open source Python projects and build a core standard infrastructure that can be used and built upon by the astronomy community. This project has been phenomenally successful in the years since it began and has grown to be the de facto standard for Python software in astronomy. Astropy brings with it considerable expectations from the community on how astronomy Python software should be developed, and it is clear that by the time LSST is fully operational in the 2020s many of the prospective users of the LSST software stack will expect it to be fully interoperable with Astropy. In this paper we describe the overlap between the LSST science pipeline software and Astropy software and investigate areas where the LSST software provides new functionality. We also discuss the possibilities of re-engineering the LSST science pipeline software to build upon Astropy, including the option of contributing affiliated packages.
Guidance and Control Software,
1980-05-01
commitments of function, cost, and schedule. The phrase "software engineering" was intended to contrast with the phrase "computer science"; the latter aims...the software problems of cost, delivery schedule, and quality were gradually being recognized at the highest management levels. Thus, in a project... schedule dates. Although the analysis of software problems indicated that the entire software development process (figure 1) needed new methods, only
NASA space station software standards issues
NASA Technical Reports Server (NTRS)
Tice, G. D., Jr.
1985-01-01
The selection and application of software standards present the NASA Space Station Program with the opportunity to serve as a pacesetter for United States software in the area of software standards. The strengths and weaknesses of each of the NASA-defined software standards issues are summarized and discussed. Several significant standards issues are offered for NASA consideration. A challenge is presented for the NASA Space Station Program to serve as a pacesetter for the U.S. software industry through: (1) management commitment to software standards; (2) overall program participation in software standards; and (3) employment of the best available technology to support software standards.
Reliability measurement during software development. [for a multisensor tracking system
NASA Technical Reports Server (NTRS)
Hecht, H.; Sturm, W. A.; Trattner, S.
1977-01-01
During the development of data base software for a multi-sensor tracking system, reliability was measured. The failure ratio and failure rate were found to be consistent measures. Trend lines were established from these measurements that provided good visualization of the progress on the job as a whole as well as on individual modules. Over one-half of the observed failures were due to factors associated with the individual run submission rather than with the code proper. Possible application of these findings for line management, project managers, functional management, and regulatory agencies is discussed. Steps for simplifying the measurement process and for use of these data in predicting operational software reliability are outlined.
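The two measures the abstract names can be computed directly from run records. The exact definitions used in the study are not given here, so this sketch follows the common conventions (failures per run and failures per unit of execution time); the weekly data are invented to show the kind of trend line described.

```python
# Hedged sketch of the reliability measures named in the abstract.
# All run data below are invented for illustration.

def failure_ratio(failures, runs):
    """Failures per run submitted."""
    return failures / runs

def failure_rate(failures, exec_hours):
    """Failures per hour of execution time."""
    return failures / exec_hours

# Weekly snapshots: (failures, runs, execution hours)
weekly = [(9, 30, 12.0), (7, 32, 14.5), (4, 35, 15.0), (2, 33, 16.0)]
ratios = [failure_ratio(f, r) for f, r, _ in weekly]
rates = [failure_rate(f, h) for f, _, h in weekly]
# A declining sequence is the trend-line evidence of progress:
print(all(a >= b for a, b in zip(ratios, ratios[1:])))
```

Plotting such per-week ratios per module gives the trend lines the study used to visualize progress on individual modules as well as the job as a whole.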
NASA Astrophysics Data System (ADS)
Kang, Won-Seok; Son, Chang-Sik; Lee, Sangho; Choi, Rock-Hyun; Ha, Yeong-Mi
2017-07-01
In this paper, we introduce WellnessHumanCare, a semi-automatic wellness management software platform that provides complex wellness data acquisition (mental, physical, and environmental) with smart wearable devices, complex wellness condition analysis, privacy-aware online/offline recommendation, real-time monitoring apps (smartphone-based and Web-based), and so on. To show the efficiency of WellnessHumanCare, we demonstrated a wellness management service in Korea over 3 months with 79 participants: an experimental group of 39 (H Corp.) and a control group of 40 (K Corp.).
Software risk management through independent verification and validation
NASA Technical Reports Server (NTRS)
Callahan, John R.; Zhou, Tong C.; Wood, Ralph
1995-01-01
Software project managers need tools to estimate and track project goals in a continuous fashion before, during, and after development of a system. In addition, they need an ability to compare the current project status with past project profiles to validate management intuition, identify problems, and then direct appropriate resources to the sources of problems. This paper describes a measurement-based approach to calculating the risk inherent in meeting project goals that leverages past project metrics and existing estimation and tracking models. We introduce the IV&V Goal/Questions/Metrics model, explain its use in the software development life cycle, and describe our attempts to validate the model through the reverse engineering of existing projects.
Research a Novel Integrated and Dynamic Multi-object Trade-Off Mechanism in Software Project
NASA Astrophysics Data System (ADS)
Jiang, Weijin; Xu, Yuhui
Aiming at the practical requirements of present software project management and control, this paper constructs an integrated multi-object trade-off model based on software project process management, so as to actualize integrated and dynamic trade-off of the project's multi-object system. After analyzing the basic principles of dynamic control and of the integrated multi-object trade-off system process, the paper integrates methods from cybernetics and network technology and, by monitoring critical reference points according to the control objects, discusses the integrated and dynamic multi-object trade-off model and the corresponding rules and mechanism needed to realize integration of process management and trade-off of the multi-object system.
A CMMI-based approach for medical software project life cycle study.
Chen, Jui-Jen; Su, Wu-Chen; Wang, Pei-Wen; Yen, Hung-Chi
2013-01-01
In terms of medical techniques, Taiwan has gained international recognition in recent years. However, the medical information system industry in Taiwan is still at a developing stage compared with the software industries in other nations. In addition, systematic development processes are indispensable elements of software development. They can help developers increase their productivity and efficiency and also avoid unnecessary risks arising during the development process. Thus, this paper presents an application of Light-Weight Capability Maturity Model Integration (LW-CMMI) to the Chang Gung Medical Research Project (CMRP) in the nuclear medicine field. This application was intended to integrate the user requirements, system design, and testing of software development processes into a three-layer (Domain, Concept and Instance) model, expressed in structural Systems Modeling Language (SysML) diagrams, converting part of the manual effort necessary for project management maintenance into computational effort, for example (semi-)automatic delivery of traceability management. In this application, it supports establishing the artifacts "requirement specification document", "project execution plan document", "system design document" and "system test document", and can deliver a prototype lightweight project management tool for the nuclear medicine software project. The results of this application can be a reference for other medical institutions in developing medical information systems and supporting project management to achieve the aim of patient safety.
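The traceability check that such (semi-)automatic management performs can be reduced to a reachability question: every requirement must be covered by at least one design element and at least one test. A minimal sketch with invented artifact IDs (the paper's actual tooling and SysML representation are not reproduced here):

```python
# Hypothetical traceability check: requirements must be linked from every
# downstream layer (design, test). All IDs below are invented.

requirements = {"REQ-1", "REQ-2", "REQ-3"}
design_links = {"DES-A": {"REQ-1"}, "DES-B": {"REQ-2", "REQ-3"}}
test_links = {"TST-1": {"REQ-1"}, "TST-2": {"REQ-2"}}

def untraced(requirements, *link_maps):
    """Requirements not covered by at least one element in every layer."""
    missing = set()
    for links in link_maps:
        covered = set().union(*links.values())
        missing |= requirements - covered
    return missing

print(sorted(untraced(requirements, design_links, test_links)))
```

Here the design layer covers all three requirements but no test exercises REQ-3, so the check reports it; automating this is the kind of manual-to-computational shift the abstract describes.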
NASA Technical Reports Server (NTRS)
Gawadiak, Yuri; Wong, Alan; Maluf, David; Bell, David; Gurram, Mohana; Tran, Khai Peter; Hsu, Jennifer; Yagi, Kenji; Patel, Hemil
2007-01-01
The Program Management Tool (PMT) is a comprehensive, Web-enabled business intelligence software tool for assisting program and project managers within NASA enterprises in gathering, comprehending, and disseminating information on the progress of their programs and projects. The PMT provides planning and management support for implementing NASA programmatic and project management processes and requirements. It provides an online environment for program and line management to develop, communicate, and manage their programs, projects, and tasks in a comprehensive tool suite. The information managed by use of the PMT can include monthly reports as well as data on goals, deliverables, milestones, business processes, personnel, task plans, and budgetary allocations. The PMT provides an intuitive and enhanced Web interface to automate the tedious process of gathering and sharing monthly progress reports, task plans, financial data, and other information on project resources based on technical, schedule, budget, and management criteria and merits. The PMT is consistent with the latest Web standards and software practices, including the use of Extensible Markup Language (XML) for exchanging data and the WebDAV (Web Distributed Authoring and Versioning) protocol for collaborative management of documents. The PMT provides graphical displays of resource allocations in the form of bar and pie charts using Microsoft Excel Visual Basic for Applications (VBA) libraries. The PMT has an extensible architecture that enables integration of PMT with other strategic-information software systems, including, for example, the Erasmus reporting system, now part of the NASA Integrated Enterprise Management Program (IEMP) tool suite, at NASA Marshall Space Flight Center (MSFC).
The PMT data architecture provides automated and extensive software interfaces and reports to various strategic information systems to eliminate duplicative human entries and minimize data integrity issues among various NASA systems that impact schedules and planning.
A microprocessor card software server to support the Quebec health microprocessor card project.
Durant, P; Bérubé, J; Lavoie, G; Gamache, A; Ardouin, P; Papillon, M J; Fortin, J P
1995-01-01
The Quebec Health Smart Card Project is advocating the use of a memory card software server[1] (SCAM) to implement a portable medical record (PMR) on a smart card. The PMR is viewed as an object that can be manipulated by SCAM's services; in fact, we can talk about a pseudo-object-oriented approach. This software architecture provides a flexible and evolutive way to manage and optimize the PMR. SCAM is a generic software server; it can manage smart cards as well as optical (laser) cards or other types of memory cards. But, in the specific case of the Quebec Health Card Project, SCAM is used to provide services between physicians' or pharmacists' software and IBM smart card technology. We propose to expose the concepts and techniques used to provide a generic environment for dealing with smart cards (and, more generally, with memory cards), to obtain a dynamic and evolutive PMR, to raise the system's global security level and data integrity, to significantly optimize the management of the PMR, and to provide statistical information about the use of the PMR.
Modeling and analysis of selected space station communications and tracking subsystems
NASA Technical Reports Server (NTRS)
Richmond, Elmer Raydean
1993-01-01
The Communications and Tracking System on board Space Station Freedom (SSF) provides space-to-ground, space-to-space, audio, and video communications, as well as tracking data reception and processing services. Each major category of service is provided by a communications subsystem which is controlled and monitored by software. Among these subsystems, the Assembly/Contingency Subsystem (ACS) and the Space-to-Ground Subsystem (SGS) provide communications with the ground via the Tracking and Data Relay Satellite (TDRS) System. The ACS is effectively SSF's command link, while the SGS is primarily intended as the data link for SSF payloads. The research activities of this project focused on the ACS and SGS antenna management algorithms identified in the Flight System Software Requirements (FSSR) documentation, including: (1) software modeling and evaluation of antenna management (positioning) algorithms; and (2) analysis and investigation of selected variables and parameters of these antenna management algorithms i.e., descriptions and definitions of ranges, scopes, and dimensions. In a related activity, to assist those responsible for monitoring the development of this flight system software, a brief summary of software metrics concepts, terms, measures, and uses was prepared.