Sample records for modelling validation plan

  1. Some guidance on preparing validation plans for the DART Full System Models.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gray, Genetha Anne; Hough, Patricia Diane; Hills, Richard Guy

    2009-03-01

    Planning is an important part of computational model verification and validation (V&V) and the requisite planning document is vital for effectively executing the plan. The document provides a means of communicating intent to the typically large group of people, from program management to analysts to test engineers, who must work together to complete the validation activities. This report provides guidelines for writing a validation plan. It describes the components of such a plan and includes important references and resources. While the initial target audience is the DART Full System Model teams in the nuclear weapons program, the guidelines are generally applicable to other modeling efforts. Our goal in writing this document is to provide a framework for consistency in validation plans across weapon systems, different types of models, and different scenarios. Specific details contained in any given validation plan will vary according to application requirements and available resources.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, X; Wang, J; Hu, W

    Purpose: The Varian RapidPlan™ is a commercial knowledge-based optimization process which uses a set of clinically used treatment plans to train a model that can predict individualized dose-volume objectives. The purpose of this study is to evaluate the performance of RapidPlan in generating intensity modulated radiation therapy (IMRT) plans for cervical cancer. Methods: In total, 70 IMRT plans for cervical cancer with varying clinical and physiological indications were enrolled in this study. These patients were all previously treated in our institution, where two prescription levels were routinely used: 45Gy/25 fractions and 50.4Gy/28 fractions. Fifty of these plans were selected to train the RapidPlan model for predicting dose-volume constraints. After training, the model was validated with 10 plans from the training pool (internal validation) and 20 additional new plans (external validation). All plans used for the validation were re-optimized with the original beam configuration, and the priorities generated by RapidPlan were manually adjusted to ensure that the re-optimized DVHs fell within the range of the model prediction. Quantitative DVH analysis was performed to compare the RapidPlan-generated and the original manually optimized plans. Results: For all the validation cases, RapidPlan-based plans showed similar or superior results compared to the manually optimized ones. RapidPlan increased D98% and homogeneity in both validations. For organs at risk, RapidPlan decreased the mean dose of the bladder by 1.25Gy/1.13Gy (internal/external validation) on average, with p=0.12/p<0.01. The mean doses of rectum and bowel were also decreased, by an average of 2.64Gy/0.83Gy and 0.66Gy/1.05Gy, with p<0.01/p<0.01 and p=0.04/p<0.01 for the internal/external validation, respectively. Conclusion: RapidPlan-based cervical cancer plans show the ability to systematically improve IMRT plan quality, suggesting that RapidPlan has great potential to make the treatment planning process more efficient.
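
    The statistical comparison reported here is a standard paired test on per-patient dose metrics. As a rough illustration only, the sketch below runs a paired t-test on hypothetical mean bladder doses; the arrays and values are invented stand-ins, not data from the study.

    ```python
    import numpy as np
    from scipy import stats

    # Hypothetical paired per-patient mean bladder doses (Gy); illustrative
    # values only, not data from the study.
    manual = np.array([38.2, 41.5, 39.8, 40.1, 37.6, 42.0, 39.0, 40.7, 38.9, 41.2])
    rapidplan = np.array([37.0, 40.1, 38.9, 38.8, 36.2, 40.9, 37.8, 39.6, 37.5, 40.3])

    t_stat, p_value = stats.ttest_rel(manual, rapidplan)  # paired t-test
    print(f"mean reduction: {(manual - rapidplan).mean():.2f} Gy, p = {p_value:.4f}")
    ```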

  3. The Challenge of Grounding Planning in Simulation with an Interactive Model Development Environment

    NASA Technical Reports Server (NTRS)

    Clement, Bradley J.; Frank, Jeremy D.; Chachere, John M.; Smith, Tristan B.; Swanson, Keith J.

    2011-01-01

    A principal obstacle to fielding automated planning systems is the difficulty of modeling. Physical systems are conventionally modeled from specification documents and the modeler's understanding of the system. Thus, the model is developed in a way that is disconnected from the system's actual behavior and is vulnerable to manual error. Another obstacle to fielding planners is testing and validation. For a space mission, generated plans must often be validated by translating them into command sequences that are run in a simulation testbed. Testing in this way is complex and onerous because of the large number of possible plans and states of the spacecraft. If used as a source of domain knowledge, however, the simulator can ease validation. This paper poses a challenge: to ground planning models in the system physics represented by simulation. A proposed interactive model development environment illustrates the integration of planning and simulation to meet the challenge. This integration reveals research paths for automated model construction and validation.

  4. Institutional Effectiveness: A Model for Planning, Assessment & Validation.

    ERIC Educational Resources Information Center

    Truckee Meadows Community Coll., Sparks, NV.

    The report presents Truckee Meadows Community College's (Nevada) model for assessing institutional effectiveness and validating the College's mission and vision, and the strategic plan for carrying out the institutional effectiveness model. It also outlines strategic goals for the years 1999-2001. From the system-wide directive that education…

  5. SU-E-T-97: An Analysis of Knowledge Based Planning for Stereotactic Body Radiation Therapy of the Spine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Foy, J; Marsh, R; Owen, D

    2015-06-15

    Purpose: Creating high quality SBRT treatment plans for the spine is often tedious and time consuming. In addition, the quality of treatment plans can vary greatly between treatment facilities due to inconsistencies in planning methods. This study investigates the performance of knowledge-based planning (KBP) for spine SBRT. Methods: Treatment plans were created for 28 spine SBRT patients. Each case was planned to meet strict dose objectives and guidelines. After physician and physicist approval, the plans were added to a custom model in a KBP system (RapidPlan, Varian Eclipse v13.5). The model was then trained to predict estimated DVHs and provide starting objective functions for future patients based on both generated and manual objectives. To validate the model, ten additional spine SBRT cases were planned manually as well as using the model objectives. Plans were compared based on planning time and quality (ability to meet the plan objectives, including dose metrics and conformity). Results: The average dose to the spinal cord and the cord PRV differed between the validation and control plans by <0.25%, demonstrating iso-toxicity. Six out of 10 validation plans met all dose objectives without the need for modifications, and overall, target dose coverage was increased by about 4.8%. If a validation plan did not meet the dose requirements initially, only 1–2 iterations of modifying the planning parameters were required before an acceptable plan was achieved. While manually created plans usually required 30 minutes to 3 hours to create, KBP can be used to create plans of similar quality in 15–20 minutes. Conclusion: KBP for spinal tumors has been shown to greatly decrease the amount of time required to achieve high quality treatment plans with minimal human intervention and could feasibly be used to standardize plan quality between institutions. Supported by Varian Medical Systems.
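
    Plan quality above is judged partly on conformity. As a minimal sketch of how such metrics might be computed from a voxel dose grid: the Paddick conformity index formula below is standard, but the grid, masks, and prescription are toy values, not the study's data.

    ```python
    import numpy as np

    def dose_metrics(dose, target_mask, prescription):
        """Mean target dose and Paddick conformity index (illustrative only)."""
        target_dose = dose[target_mask]
        piv = np.count_nonzero(dose >= prescription)             # prescription isodose volume
        tv_piv = np.count_nonzero(target_dose >= prescription)   # target covered by Rx dose
        tv = np.count_nonzero(target_mask)                       # target volume
        ci = tv_piv**2 / (tv * piv) if piv else 0.0              # Paddick CI
        return target_dose.mean(), ci

    # Toy 3D grid: uniform 18 Gy inside a cubic region, zero elsewhere.
    dose = np.zeros((20, 20, 20)); dose[5:15, 5:15, 5:15] = 18.0
    target = np.zeros(dose.shape, bool); target[6:14, 6:14, 6:14] = True
    print(dose_metrics(dose, target, prescription=18.0))
    ```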

  6. The Theory of Planned Behavior (TPB) and Pre-Service Teachers' Technology Acceptance: A Validation Study Using Structural Equation Modeling

    ERIC Educational Resources Information Center

    Teo, Timothy; Tan, Lynde

    2012-01-01

    This study applies the theory of planned behavior (TPB), a theory that is commonly used in commercial settings, to the educational context to explain pre-service teachers' technology acceptance, and examines the validity of the TPB when used for this purpose. It found evidence that the TPB is a valid model to explain pre-service…

  7. The Role of Structural Models in the Solar Sail Flight Validation Process

    NASA Technical Reports Server (NTRS)

    Johnston, John D.

    2004-01-01

    NASA is currently soliciting proposals via the New Millennium Program ST-9 opportunity for a potential Solar Sail Flight Validation (SSFV) experiment to develop and operate in space a deployable solar sail that can be steered and provides measurable acceleration. The approach planned for this experiment is to test and validate models and processes for solar sail design, fabrication, deployment, and flight. These models and processes would then be used to design, fabricate, and operate scalable solar sails for future space science missions. There are six validation objectives planned for the ST-9 SSFV experiment: 1) Validate solar sail design tools and fabrication methods; 2) Validate controlled deployment; 3) Validate in-space structural characteristics (the focus of this poster); 4) Validate solar sail attitude control; 5) Validate solar sail thrust performance; 6) Characterize the sail's electromagnetic interaction with the space environment. This poster presents a top-level assessment of the role of structural models in the validation process for in-space structural characteristics.

  8. Prototyping and validating requirements of radiation and nuclear emergency plan simulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hamid, AHA., E-mail: amyhamijah@nm.gov.my; Faculty of Computing, Universiti Teknologi Malaysia; Rozan, MZA.

    2015-04-29

    Organizational incapability can produce unrealistic, impractical, inadequate, and ambiguous radiological and nuclear emergency preparedness and response (EPR) plans, causing emergency-plan disorder and severe disasters. These situations stem largely (65.6%) from poorly defined and unidentified roles and duties of the disaster coordinator. Such unexpected conditions bring severe consequences for first responders, operators, workers, patients, and the community at large. Hence, in this report we discuss the prototyping and validation of Malaysia's radiation and nuclear emergency preparedness and response plan simulation model (EPRM). A prototyping technique was required to formalize the simulation model requirements. Prototyping as systems-requirements validation was carried out to confirm the correctness of the model against stakeholders' intentions in resolving this organizational incapability. We made assumptions for the proposed emergency preparedness and response model (EPRM) in the simulation software. Those assumptions covered two expected mechanisms: planning and handling of the respective emergency plan, and managing the hazard involved. The resulting model, the RANEPF (Radiation and Nuclear Emergency Planning Framework) simulator, demonstrates emergency-response training prerequisites rather than intervention principles alone. The demonstrations involved determining the casualties' absorbed-dose screening ranges and coordinating the capacity planning of the expected trauma triage. Through user-centred design and a sociotechnical approach, the RANEPF simulator was strategized and simplified, though the underlying problem remains complex.

  9. Prototyping and validating requirements of radiation and nuclear emergency plan simulator

    NASA Astrophysics Data System (ADS)

    Hamid, AHA.; Rozan, MZA.; Ibrahim, R.; Deris, S.; Selamat, A.

    2015-04-01

    Organizational incapability can produce unrealistic, impractical, inadequate, and ambiguous radiological and nuclear emergency preparedness and response (EPR) plans, causing emergency-plan disorder and severe disasters. These situations stem largely (65.6%) from poorly defined and unidentified roles and duties of the disaster coordinator. Such unexpected conditions bring severe consequences for first responders, operators, workers, patients, and the community at large. Hence, in this report we discuss the prototyping and validation of Malaysia's radiation and nuclear emergency preparedness and response plan simulation model (EPRM). A prototyping technique was required to formalize the simulation model requirements. Prototyping as systems-requirements validation was carried out to confirm the correctness of the model against stakeholders' intentions in resolving this organizational incapability. We made assumptions for the proposed emergency preparedness and response model (EPRM) in the simulation software. Those assumptions covered two expected mechanisms: planning and handling of the respective emergency plan, and managing the hazard involved. The resulting model, the RANEPF (Radiation and Nuclear Emergency Planning Framework) simulator, demonstrates emergency-response training prerequisites rather than intervention principles alone. The demonstrations involved determining the casualties' absorbed-dose screening ranges and coordinating the capacity planning of the expected trauma triage. Through user-centred design and a sociotechnical approach, the RANEPF simulator was strategized and simplified, though the underlying problem remains complex.

  10. Pretest information for a test to validate plume simulation procedures (FA-17)

    NASA Technical Reports Server (NTRS)

    Hair, L. M.

    1978-01-01

    The results of an effort to plan a final verification wind tunnel test to validate the recommended correlation parameters and application techniques were presented. The test planning effort was complete except for test site finalization and the associated coordination. Two suitable test sites were identified. Desired test conditions were shown. Subsequent sections of this report present the selected model and test site, instrumentation of this model, planned test operations, and some concluding remarks.

  11. SU-E-J-244: Development and Validation of a Knowledge Based Planning Model for External Beam Radiation Therapy of Locally Advanced Non-Small Cell Lung Cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Z; Kennedy, A; Larsen, E

    2015-06-15

    Purpose: The study aims to develop and validate a knowledge based planning (KBP) model for external beam radiation therapy of locally advanced non-small cell lung cancer (LA-NSCLC). Methods: RapidPlan™ technology was used to develop a lung KBP model. Plans from 65 patients with LA-NSCLC were used to train the model. 25 patients were treated with VMAT, and the others were treated with IMRT. Organs-at-risk (OARs) included right lung, left lung, heart, esophagus, and spinal cord. DVHs and geometric distribution DVHs were extracted from the treated plans. The model was trained using principal component analysis and step-wise multiple regression. Box plot and regression plot tools were used to identify geometric outliers and dosimetric outliers and help fine-tune the model. The validation was performed by (a) comparing predicted DVH boundaries to actual DVHs of 63 patients and (b) using an independent set of treatment planning data. Results: 63 out of 65 plans were included in the final KBP model, with PTV volume ranging from 102.5cc to 1450.2cc. Total treatment dose prescription varied from 50Gy to 70Gy based on institutional guidelines. One patient was excluded due to a geometric outlier, in which 2.18cc of spinal cord was included in the PTV. The other patient was excluded due to a dosimetric outlier, in which dose sparing of the spinal cord was heavily enforced in the clinical plan. Target volume, OAR volume, OAR overlap volume percentage to target, and OAR out-of-field volume were included in the trained model. Lungs and heart had two principal component scores of GEDVH, whereas spinal cord and esophagus had three in the final model. The predicted DVH band (mean ±1 standard deviation) represented 66.2±3.6% of all DVHs. Conclusion: A KBP model was developed and validated for radiotherapy of LA-NSCLC in a commercial treatment planning system. Its clinical implementation may improve the consistency of IMRT/VMAT planning.
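
    RapidPlan-style training couples principal component analysis of DVHs with regression on anatomical features, as this abstract describes. The sketch below shows the general pattern with scikit-learn on synthetic data; it is an assumption-laden illustration of the technique, not Varian's implementation.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(0)
    n_patients, n_bins = 63, 100
    # Synthetic "cumulative DVHs": one monotonically decreasing row per patient.
    dvhs = np.sort(rng.random((n_patients, n_bins)), axis=1)[:, ::-1]
    # Synthetic anatomical features, e.g. target volume, OAR volume,
    # overlap percentage, out-of-field volume (all invented).
    anatomy = rng.random((n_patients, 4))

    pca = PCA(n_components=2).fit(dvhs)              # principal DVH modes
    scores = pca.transform(dvhs)                     # per-patient component scores
    model = LinearRegression().fit(anatomy, scores)  # anatomy -> DVH scores

    # Predict a new patient's DVH from anatomy alone.
    predicted_dvh = pca.inverse_transform(model.predict(rng.random((1, 4))))
    ```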

  12. RELAP-7 Software Verification and Validation Plan: Requirements Traceability Matrix (RTM) Part 1 – Physics and numerical methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Choi, Yong Joon; Yoo, Jun Soo; Smith, Curtis Lee

    2015-09-01

    This INL plan comprehensively describes the Requirements Traceability Matrix (RTM) for the main physics and numerical methods of RELAP-7. The plan also describes the testing-based software verification and validation (SV&V) process: a set of specially designed software models used to test RELAP-7.

  13. Using Structural Equation Modeling to Validate the Theory of Planned Behavior as a Model for Predicting Student Cheating

    ERIC Educational Resources Information Center

    Mayhew, Matthew J.; Hubbard, Steven M.; Finelli, Cynthia J.; Harding, Trevor S.; Carpenter, Donald D.

    2009-01-01

    The purpose of this paper is to validate the use of a modified Theory of Planned Behavior (TPB) for predicting undergraduate student cheating. Specifically, we administered a survey assessing how the TPB relates to cheating, along with a measure of moral reasoning (DIT-2), to 527 undergraduate students across three institutions, and analyzed the…

  14. Applicability of action planning and coping planning to dental flossing among Norwegian adults: a confirmatory factor analysis approach.

    PubMed

    Astrøm, Anne Nordrehaug

    2008-06-01

    Using a prospective design and a representative sample of 25-yr-old Norwegians, this study hypothesized that action planning and coping planning would add to the prediction of flossing at 4 wk of follow-up over and above the effect of intention and previous flossing. This study tested the validity of a proposed 3-factor structure of the measurement model of intention, action planning, and coping planning, and its invariance across gender. A survey was conducted in three Norwegian counties, and 1,509 out of 8,000 randomly selected individuals completed questionnaires assessing the constructs of action planning and coping planning related to daily flossing. A random subsample of 500 participants was followed up at 4 wk with a telephone interview to assess flossing. Confirmatory factor analysis (CFA) confirmed the proposed 3-factor model after respecification. Although the chi-square test was statistically significant (chi-square = 58.501, degrees of freedom = 17), complementary fit indices were satisfactory (goodness-of-fit index (GFI) = 0.99, root mean squared error of approximation (RMSEA) = 0.04). Multigroup CFA provided evidence of complete invariance of the measurement model across gender. After controlling for previous flossing, intention (beta = 0.08) and action planning (beta = 0.11) emerged as independent predictors of subsequent flossing, accounting for 2.3% of its variance. The factorial validity of intention, action planning, and coping planning, and the validity of action planning in predicting flossing prospectively, were confirmed by the present study.
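
    The reported fit statistics are internally consistent: RMSEA can be recomputed from the chi-square value, degrees of freedom, and sample size using the standard formula, as the snippet below shows (assuming the CFA was run on all 1,509 respondents).

    ```python
    import math

    def rmsea(chi2, df, n):
        """Root mean squared error of approximation from a chi-square fit statistic."""
        return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

    # Values reported in the abstract: chi-square = 58.501, d.f. = 17, N = 1,509.
    print(round(rmsea(58.501, 17, 1509), 3))  # ~0.04, matching the reported RMSEA
    ```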

  15. Spatial calibration and temporal validation of flow for regional scale hydrologic modeling

    USDA-ARS?s Scientific Manuscript database

    Physically based regional scale hydrologic modeling is gaining importance for planning and management of water resources. Calibration and validation of such regional scale model is necessary before applying it for scenario assessment. However, in most regional scale hydrologic modeling, flow validat...

  16. SU-F-T-447: The Impact of Treatment Planning Methods On RapidPlan Modeling for Rectum Cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, S; Peng, J; Li, K

    2016-06-15

    Purpose: To investigate dose volume histogram (DVH) prediction variations between models trained on intensity modulated radiotherapy (IMRT) plans and volumetric modulated arc therapy (VMAT) plans in RapidPlan. Methods: Two DVH prediction models were generated in this study: an IMRT model trained from 83 IMRT rectum plans and a VMAT model trained from 60 VMAT rectum plans. In the internal validation, 20 plans from each training database were selected to verify the clinical feasibility of the model. Then, 10 IMRT plans generated from the IMRT model (PIMRT-by-IMRT-model) and 10 IMRT plans generated from the VMAT model (PIMRT-by-VMAT-model) were compared on the dose to organs at risk (OARs), which included the bladder and the left and right femoral heads. A similar comparison was performed for VMAT plans generated from the IMRT model (PVMAT-by-IMRT-model) and VMAT plans generated from the VMAT model (PVMAT-by-VMAT-model). Results: In the internal validation, all plans from the IMRT and VMAT models showed significant improvement in OAR sparing compared with the corresponding clinical ones. Compared to the PIMRT-by-VMAT-model plans, the PIMRT-by-IMRT-model plans showed reductions of 6.90±3.87% (p<0.001) in V40, 6.63±3.62% (p<0.001) in V45, and 4.74±2.26% (p<0.001) in V50 for the bladder, and mean dose reductions of 2.12±1.75Gy (p=0.004) and 2.84±1.53Gy (p<0.001) in the right and left femoral heads, respectively. There was no significant difference in OAR sparing between PVMAT-by-IMRT-model and PVMAT-by-VMAT-model plans. Conclusion: The IMRT model for rectal cancer in RapidPlan can be applied to VMAT planning. However, the VMAT model is not recommended for IMRT planning. Caution should be taken, as a planning model based on one technique may not be feasible for other planning techniques.
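
    The Vx metrics compared above give the percentage of a structure receiving at least x Gy. A minimal sketch, using synthetic voxel doses rather than the study's plans:

    ```python
    import numpy as np

    def v_x(structure_dose, threshold_gy):
        """Percentage of a structure's voxels receiving at least threshold_gy."""
        structure_dose = np.asarray(structure_dose)
        return 100.0 * np.count_nonzero(structure_dose >= threshold_gy) / structure_dose.size

    # Illustrative bladder voxel doses (Gy); not data from the study.
    bladder = np.random.default_rng(1).normal(42.0, 6.0, 10_000).clip(min=0)
    for t in (40, 45, 50):
        print(f"V{t} = {v_x(bladder, t):.1f}%")
    ```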

  17. Agent independent task planning

    NASA Technical Reports Server (NTRS)

    Davis, William S.

    1990-01-01

    Agent-Independent Planning is a technique that allows the construction of activity plans without regard to the agent that will perform them. Once generated, a plan is then validated and translated into instructions for a particular agent, whether a robot, crewmember, or software-based control system. Because Space Station Freedom (SSF) is planned for orbital operations for approximately thirty years, it will almost certainly experience numerous enhancements and upgrades, including upgrades in robotic manipulators. Agent-Independent Planning provides the capability to construct plans for SSF operations, independent of specific robotic systems, by combining techniques of object oriented modeling, nonlinear planning and temporal logic. Since a plan is validated using the physical and functional models of a particular agent, new robotic systems can be developed and integrated with existing operations in a robust manner. This technique also provides the capability to generate plans for crewmembers with varying skill levels, and later apply these same plans to more sophisticated robotic manipulators made available by evolutions in technology.

  18. Census mapbook for transportation planning.

    DOT National Transportation Integrated Search

    1994-12-01

    Geographic displays of Census data for transportation planning and policy decisions are compiled in a report of 49 maps, depicting use of the data in applications such as travel demand model development and model validation, population forecasting, cor...

  19. Drift-Scale Coupled Processes (DST and THC Seepage) Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    P. Dixon

    The purpose of this Model Report (REV02) is to document the unsaturated zone (UZ) models used to evaluate the potential effects of coupled thermal-hydrological-chemical (THC) processes on UZ flow and transport. This Model Report has been developed in accordance with the ''Technical Work Plan for: Performance Assessment Unsaturated Zone'' (Bechtel SAIC Company, LLC (BSC) 2002 [160819]). The technical work plan (TWP) describes planning information pertaining to the technical scope, content, and management of this Model Report in Section 1.12, Work Package AUZM08, ''Coupled Effects on Flow and Seepage''. The plan for validation of the models documented in this Model Report is given in Attachment I, Model Validation Plans, Section I-3-4, of the TWP. Except for variations in acceptance criteria (Section 4.2), there were no deviations from this TWP. This report was developed in accordance with AP-SIII.10Q, ''Models''. This Model Report documents the THC Seepage Model and the Drift Scale Test (DST) THC Model. The THC Seepage Model is a drift-scale process model for predicting the composition of gas and water that could enter waste emplacement drifts and the effects of mineral alteration on flow in rocks surrounding drifts. The DST THC Model is a drift-scale process model relying on the same conceptual model and much of the same input data (i.e., physical, hydrological, thermodynamic, and kinetic) as the THC Seepage Model. The DST THC Model is the primary method for validating the THC Seepage Model: it compares predicted water and gas compositions, as well as mineral alteration patterns, with observed data from the DST. These models provide the framework to evaluate THC coupled processes at the drift scale, predict flow and transport behavior for specified thermal-loading conditions, and predict the evolution of mineral alteration and fluid chemistry around potential waste emplacement drifts. The DST THC Model is used solely for the validation of the THC Seepage Model and is not used for calibration to measured data.

  20. An Overview of NASA's IM&S Verification and Validation Process Plan and Specification for Space Exploration

    NASA Technical Reports Server (NTRS)

    Gravitz, Robert M.; Hale, Joseph

    2006-01-01

    NASA's Exploration Systems Mission Directorate (ESMD) is implementing a management approach for modeling and simulation (M&S) that will provide decision-makers information on a model's fidelity, credibility, and quality. This information will allow the decision-maker to understand the risks involved in using a model's results in the decision-making process. This presentation will discuss NASA's approach for verification and validation (V&V) of its models and simulations supporting space exploration, describing NASA's V&V process and the associated M&S V&V activities required to support the decision-making process. The M&S V&V Plan and V&V Report templates for ESMD will also be illustrated.

  21. Thermal System Verification and Model Validation for NASA's Cryogenic Passively Cooled James Webb Space Telescope

    NASA Technical Reports Server (NTRS)

    Cleveland, Paul E.; Parrish, Keith A.

    2005-01-01

    A thorough and unique thermal verification and model validation plan has been developed for NASA's James Webb Space Telescope. The JWST observatory consists of a large deployed-aperture optical telescope passively cooled to below 50 Kelvin, along with a suite of several instruments passively and actively cooled to below 37 Kelvin and 7 Kelvin, respectively. Passive cooling to these extremely low temperatures is made feasible by the use of a large deployed high-efficiency sunshield and an orbit location at the L2 Lagrange point. Another enabling feature is the scale or size of the observatory, which allows for large radiator sizes that are compatible with the expected power dissipation of the instruments and large-format Mercury Cadmium Telluride (HgCdTe) detector arrays. This passive cooling concept is simple, reliable, and mission enabling when compared to the alternatives of mechanical coolers and stored cryogens. However, these same large-scale observatory features, which make passive cooling viable, also prevent the typical flight-configuration fully-deployed thermal balance test that is the keystone of most space missions' thermal verification plans. JWST is simply too large in its deployed configuration to be properly thermal balance tested in the facilities that currently exist. This reality, combined with a mission thermal concept with little to no flight heritage, has necessitated a unique and alternative approach to thermal system verification and model validation. This paper describes the thermal verification and model validation plan that has been developed for JWST. The plan relies on judicious use of cryogenic and thermal design margin, a completely independent thermal modeling cross-check utilizing different analysis teams and software packages, and finally, a comprehensive set of thermal tests that occur at different levels of JWST assembly. After a brief description of the JWST mission and thermal architecture, a detailed description of the three aspects of the thermal verification and model validation plan is presented.

  22. Development and validation of a treatment planning model for magnetic nanoparticle hyperthermia cancer therapy

    NASA Astrophysics Data System (ADS)

    Stigliano, Robert Vincent

    The use of magnetic nanoparticles (mNPs) to induce local hyperthermia has been emerging in recent years as a promising cancer therapy, both stand-alone and in combination treatment settings, including surgery, radiation, and chemotherapy. The mNP solution can be injected either directly into the tumor or administered intravenously. Studies have shown that some cancer cells associate with, internalize, and aggregate mNPs more preferentially than normal cells, with and without antibody targeting. Once the mNPs are delivered inside the cells, a low frequency (30-300kHz) alternating electromagnetic field is used to activate the mNPs. The nanoparticles absorb the applied field and provide localized heat generation at nano-to-micron scales. Treatment planning models have been shown to improve treatment efficacy in radiation therapy by limiting normal tissue damage while maximizing dose to the tumor. To date, there does not exist a clinical treatment planning model for magnetic nanoparticle hyperthermia that is robust, validated, and commercially available. The focus of this research is the development and experimental validation of a treatment planning model, consisting of a coupled electromagnetic and thermal model that predicts dynamic thermal distributions during treatment. When allowed to incubate, the mNPs are often sequestered by cancer cells and packed into endosomes. The proximity of the mNPs has a strong influence on their ability to heat due to interparticle magnetic interaction effects. A model of mNP heating which takes into account the effects of magnetic interaction was developed and validated against experimental data. An animal study in mice was conducted to determine the effects of mNP solution injection duration and PEGylation on macroscale mNP distribution within the tumor, in order to further inform the treatment planning model and future experimental technique. In clinical applications, a critical limiting factor for the maximum applied field is the heating caused by eddy currents, which are induced in the noncancerous tissue. Phantom studies were conducted to validate the ability of the model to accurately predict eddy current heating in the case of zero blood perfusion, and preliminary data were collected to show the validity of the model in live mice to incorporate blood perfusion.
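
    The thermal half of such a treatment planning model typically solves the Pennes bioheat equation. The sketch below is a crude 1D explicit finite-difference version with textbook-order tissue constants; it is not the dissertation's coupled electromagnetic-thermal model, and every parameter is an assumed placeholder.

    ```python
    import numpy as np

    # 1D explicit finite-difference sketch of the Pennes bioheat equation:
    #   rho*c*dT/dt = k*d2T/dx2 + w_b*rho_b*c_b*(T_a - T) + Q(x)
    k, rho, c = 0.5, 1050.0, 3600.0                     # W/m/K, kg/m^3, J/kg/K
    w_b, rho_b, c_b, T_a = 0.002, 1050.0, 3600.0, 37.0  # perfusion (1/s), blood properties
    nx, dx, dt, steps = 201, 1e-4, 5e-3, 2000           # grid and (stable) time step
    x = np.arange(nx) * dx
    Q = np.where(np.abs(x - 0.01) < 0.002, 5e5, 0.0)    # mNP heat source near x = 1 cm
    T = np.full(nx, 37.0)

    for _ in range(steps):
        lap = (np.roll(T, -1) - 2 * T + np.roll(T, 1)) / dx**2
        lap[0] = lap[-1] = 0.0                          # hold boundaries at 37 degC
        T += dt * (k * lap + w_b * rho_b * c_b * (T_a - T) + Q) / (rho * c)

    print(f"peak temperature after {steps * dt:.0f} s: {T.max():.2f} degC")
    ```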

  23. Haptic simulation framework for determining virtual dental occlusion.

    PubMed

    Wu, Wen; Chen, Hui; Cen, Yuhai; Hong, Yang; Khambay, Balvinder; Heng, Pheng Ann

    2017-04-01

    The surgical treatment of many dentofacial deformities is often complex due to its three-dimensional nature. Determining the dental occlusion in the most stable position is essential for the success of the treatment. Computer-aided virtual planning on an individualized patient-specific 3D model can help formulate the surgical plan and predict the surgical change. However, in current computer-aided planning systems, it is not possible to determine the dental occlusion of the digital models in an intuitive way during virtual surgical planning because of the absence of haptic feedback. In this paper, a physically based haptic simulation framework is proposed, which can provide surgeons with intuitive haptic feedback to determine the dental occlusion of the digital models in their most stable position. To provide physically realistic force feedback when the dental models contact each other during the searching process, a contact model is proposed to describe the dynamic and collision properties of the dental models during the alignment. The simulated impulse/contact-based forces are integrated into a unified simulation framework. A validation study was conducted on fifteen sets of virtual dental models chosen at random and covering a wide range of the dental relationships found clinically. The dental occlusions obtained by an expert were employed as a benchmark against which to compare the virtual occlusion results. The mean translational and angular deviations of the virtual occlusion results from the benchmark were small. The experimental results show the validity of our method. The simulated forces can provide valuable insights to determine the virtual dental occlusion. The findings of this work and the validation of the proposed concept lead the way toward full virtual surgical planning on patient-specific virtual models, allowing fully customized treatment plans for the surgical correction of dentofacial deformities.

  24. A broad scope knowledge based model for optimization of VMAT in esophageal cancer: validation and assessment of plan quality among different treatment centers.

    PubMed

    Fogliata, Antonella; Nicolini, Giorgia; Clivio, Alessandro; Vanetti, Eugenio; Laksar, Sarbani; Tozzi, Angelo; Scorsetti, Marta; Cozzi, Luca

    2015-10-31

    To evaluate the performance of a broad scope model-based optimisation process for volumetric modulated arc therapy applied to esophageal cancer. A set of 70 patients previously treated in two different institutions was selected to train a model for the prediction of dose-volume constraints. The model was built with a broad-scope purpose, aiming to be effective for different dose prescriptions and tumour localisations. It was validated on three groups of patients, from the same institutions and from another clinic that did not provide patients for the training phase. Comparison of the automated plans was made against reference cases given by the clinically accepted plans. Quantitative improvements (statistically significant for the majority of the analysed dose-volume parameters) were observed between the benchmark and the test plans. Of 624 dose-volume objectives assessed for plan evaluation, in 21 cases (3.3%) the reference plans failed to respect the constraints while the model-based plans succeeded. Only in 3 cases (<0.5%) did the reference plans pass the criteria while the model-based plans failed. In 5.3% of the cases both groups of plans failed, and in the remaining cases both passed the tests. Plans were optimised using a broad scope knowledge-based model to determine the dose-volume constraints. The results showed dosimetric improvements when compared to the benchmark data. In particular, the plans optimised for patients from the third centre, which did not participate in the training, were of superior quality. The data suggest that the new engine is reliable and could encourage its application to clinical practice.

  25. Highly Efficient Training, Refinement, and Validation of a Knowledge-based Planning Quality-Control System for Radiation Therapy Clinical Trials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Nan; Carmona, Ruben; Sirak, Igor

    Purpose: To demonstrate an efficient method for training and validation of a knowledge-based planning (KBP) system as a radiation therapy clinical trial plan quality-control system. Methods and Materials: We analyzed 86 patients with stage IB through IVA cervical cancer treated with intensity modulated radiation therapy at 2 institutions according to the standards of the INTERTECC (International Evaluation of Radiotherapy Technology Effectiveness in Cervical Cancer, National Clinical Trials Network identifier: 01554397) protocol. The protocol used a planning target volume and 2 primary organs at risk: pelvic bone marrow (PBM) and bowel. Secondary organs at risk were rectum and bladder. Initial unfiltered dose-volume histogram (DVH) estimation models were trained using all 86 plans. Refined training sets were created by removing sub-optimal plans from the unfiltered sample: refined DVH estimation models were constructed by identifying 30 of 86 plans emphasizing PBM sparing (comparing the protocol-specified dosimetric cutpoints V10 and V20, the percentage volumes of PBM receiving at least 10 Gy and 20 Gy, with unfiltered predictions) and another 30 of 86 plans emphasizing bowel sparing (comparing V40 and V45, the absolute volumes of bowel receiving at least 40 Gy and 45 Gy; 9 plans in common with the PBM set). To obtain deliverable KBP plans, refined models must inform patient-specific optimization objectives and/or priorities (an auto-planning "routine"). Four candidate routines emphasizing different tradeoffs were composed, and a script was developed to automatically re-plan multiple patients with each routine. After selection of the routine that best met protocol objectives in the 51-patient training sample (KBP-FINAL), protocol-specific DVH metrics and normal tissue complication probability were compared for original versus KBP-FINAL plans across the 35-patient validation set. Paired t tests were used to test differences between planning sets. Results: KBP-FINAL plans outperformed manual planning across the validation set in all protocol-specific DVH cutpoints. The mean normal tissue complication probability for gastrointestinal toxicity was lower for KBP-FINAL versus validation-set plans (48.7% vs 53.8%, P<.001). Similarly, the estimated mean white blood cell count nadir was higher (2.77 vs 2.49 k/mL, P<.001) with KBP-FINAL plans, indicating a lowered probability of hematologic toxicity. Conclusions: This work demonstrates that a KBP system can be efficiently trained and refined for use in radiation therapy clinical trials with minimal effort. This patient-specific plan quality control resulted in improvements on protocol-specific dosimetric endpoints.
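
    The normal tissue complication probabilities quoted above come from fitted toxicity models. As a generic illustration of how an NTCP value is computed from a differential DVH, the sketch below implements the standard Lyman-Kutcher-Burman formulation with placeholder parameters; it is not the trial's specific model.

    ```python
    import math
    import numpy as np

    def lkb_ntcp(bin_doses, bin_volumes, td50, m, n):
        """Lyman-Kutcher-Burman NTCP from a differential DVH (generic form)."""
        geud = np.sum(bin_volumes * np.power(bin_doses, 1.0 / n)) ** n  # gEUD
        t = (geud - td50) / (m * td50)
        return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))  # probit dose response

    doses = np.array([10.0, 20.0, 30.0, 40.0])  # bin doses (Gy), illustrative
    vols = np.array([0.4, 0.3, 0.2, 0.1])       # fractional volumes, sum to 1
    print(f"NTCP = {lkb_ntcp(doses, vols, td50=55.0, m=0.15, n=0.15):.4f}")
    ```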

  26. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lian, Jun, E-mail: jun-lian@med.unc.edu; Chera, Bhishamjit S.; Chang, Sha

    Purpose: To build a statistical model to quantitatively correlate the anatomic features of structures and the corresponding dose-volume histograms (DVHs) of head and neck (HN) Tomotherapy (Tomo) plans. To study whether a model built upon one intensity modulated radiation therapy (IMRT) technique (such as conventional Linac) can be used to predict anticipated organs-at-risk (OAR) DVHs of patients treated with a different IMRT technique (such as Tomo). To study whether a model built upon the clinical experience of one institution can be used to aid IMRT planning for another institution. Methods: Forty-four Tomotherapy intensity modulated radiotherapy plans of HN cases (Tomo-IMRT) from Institution A were included in the study. A different patient group of 53 HN fixed gantry IMRT (FG-IMRT) plans was selected from Institution B. The analyzed OARs included the parotid, larynx, spinal cord, brainstem, and submandibular gland. Two major groups of anatomical features were considered: volumetric information and spatial information. The volumetric information includes the volume of the target, the OAR, and the overlap between target and OAR. The spatial information of OARs relative to PTVs was represented by the distance-to-target histogram (DTH). Important anatomical and dosimetric features were extracted from DTH and DVH by principal component analysis. Two regression models, one for Tomotherapy plans and one for IMRT plans, were built independently. The accuracy of intra-treatment-modality model prediction was validated by a leave-one-out cross-validation method. The inter-technique and inter-institution validations were performed by using the FG-IMRT model to predict the OAR dosimetry of Tomo-IMRT plans. The dosimetry of OARs, under the same and different institutional preferences, was analyzed to examine the correlation between the model prediction and planning protocol. Results: Significant patient anatomical factors contributing to OAR dose sparing in HN Tomotherapy plans have been analyzed and identified. For all the OARs, the discrepancies of dose indices between the model-predicted values and the actual plan values were within 2.1%. Similar results were obtained from the modeling of FG-IMRT plans. The parotid gland was spared in a comparable fashion during the treatment planning of the two institutions. The model based on FG-IMRT plans was found to predict the median dose of the parotid in Tomotherapy plans quite well, with a mean error of 2.6%. Predictions from the FG-IMRT model suggested the median dose of the larynx, median dose of the brainstem, and D2 of the brainstem could be reduced by 10.5%, 12.8%, and 20.4%, respectively, in the Tomo-IMRT plans. This was found to be correlated with the institutional differences in OAR constraint settings. Re-planning of six Tomotherapy patients confirmed that the optimization improvement predicted by the FG-IMRT model was achievable. Conclusions: The authors established a mathematical model to correlate the anatomical features and dosimetric indices of OARs of HN patients in Tomotherapy plans. The model can be used for the setup of patient-specific OAR dose-sparing goals and quality control of planning results. The institutional clinical experience was incorporated into the model, which allows the model from one institution to generate a reference plan for another institution, or for another IMRT technique.
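
    The distance-to-target histogram (DTH) used here as a spatial feature can be built from a Euclidean distance transform of the PTV mask. A toy sketch with invented masks rather than clinical contours:

    ```python
    import numpy as np
    from scipy.ndimage import distance_transform_edt

    # Toy binary masks on a voxel grid; not clinical contours.
    ptv = np.zeros((40, 40, 40), bool); ptv[15:25, 15:25, 15:25] = True
    oar = np.zeros(ptv.shape, bool);    oar[15:25, 15:25, 27:33] = True

    # Signed distance to the PTV surface: positive outside, negative inside.
    dist = np.where(ptv, -distance_transform_edt(ptv), distance_transform_edt(~ptv))

    # Cumulative DTH: fraction of OAR volume within each distance of the PTV.
    hist, edges = np.histogram(dist[oar], bins=20)
    cum_dth = np.cumsum(hist) / hist.sum()
    print(list(zip(edges[1:].round(1), cum_dth.round(2))))
    ```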

  27. Model Validation Against The Modelers’ Data Archive

    DTIC Science & Technology

    2014-08-01

    completion of the planned Jack Rabbit 2 field trials. The relevant task for the effort addressed here is Task 4 of the current Interagency Agreement, as...readily simulates the Prairie Grass sulfur dioxide plumes. Also, Jack Rabbit II field trials are set to be completed during FY16. Once these data are...available, they will also be used to validate the combined models. This validation may prove to be more useful, as the Jack Rabbit II will release

  28. Global Precipitation Measurement (GPM) Ground Validation (GV) Science Implementation Plan

    NASA Technical Reports Server (NTRS)

    Petersen, Walter A.; Hou, Arthur Y.

    2008-01-01

    For pre-launch algorithm development and post-launch product evaluation, Global Precipitation Measurement (GPM) Ground Validation (GV) goes beyond direct comparisons of surface rain rates between ground and satellite measurements to provide the means for improving retrieval algorithms and model applications. Three approaches to GPM GV include direct statistical validation (at the surface), precipitation physics validation (in a vertical column), and integrated science validation (4-dimensional). These three approaches support five themes: core satellite error characterization; constellation satellites validation; development of physical models of snow, cloud water, and mixed phase; development of cloud-resolving models (CRMs) and land-surface models to bridge observations and algorithms; and development of coupled CRM-land surface modeling for basin-scale water budget studies and natural hazard prediction. This presentation describes the implementation of these approaches.

  29. Soldier Dimensions in Combat Models

    DTIC Science & Technology

    1990-05-07

    and performance. Questionnaires, SQTs, and ARTEPs were often used. Many scales had estimates of reliability but few had validity data. Most studies...pending its validation. Research plans were provided for applications in simulated combat and with simulation devices, for data previously gathered...regarding reliability and validity. Lack of information following an instrument indicates neither reliability nor validity information was provided by the

  30. Integrated Medical Model Verification, Validation, and Credibility

    NASA Technical Reports Server (NTRS)

    Walton, Marlei; Kerstman, Eric; Foy, Millennia; Shah, Ronak; Saile, Lynn; Boley, Lynn; Butler, Doug; Myers, Jerry

    2014-01-01

    The Integrated Medical Model (IMM) was designed to forecast relative changes for a specified set of crew health and mission success risk metrics by using a probabilistic (stochastic process) model based on historical data, cohort data, and subject matter expert opinion. A probabilistic approach is taken since exact (deterministic) results would not appropriately reflect the uncertainty in the IMM inputs. Once the IMM was conceptualized, a plan was needed to rigorously assess input information, framework and code, and output results of the IMM, and ensure that end user requests and requirements were considered during all stages of model development and implementation. METHODS: In 2008, the IMM team developed a comprehensive verification and validation (V&V) plan, which specified internal and external review criteria encompassing 1) verification of data and IMM structure to ensure proper implementation of the IMM, 2) several validation techniques to confirm that the simulation capability of the IMM appropriately represents occurrences and consequences of medical conditions during space missions, and 3) credibility processes to develop user confidence in the information derived from the IMM. When the NASA-STD-7009 (7009) was published, the IMM team updated their verification, validation, and credibility (VV&C) project plan to meet 7009 requirements and include 7009 tools in reporting the VV&C status of the IMM. RESULTS: IMM VV&C updates are compiled recurrently and include 7009 Compliance and Credibility matrices, IMM V&V Plan status, and a synopsis of any changes or updates to the IMM during the reporting period. Reporting tools have evolved over the lifetime of the IMM project to better communicate VV&C status. This has included refining the original 7009 methodology with augmentation from the NASA-STD-7009 Guidance Document. End user requests and requirements are being satisfied, as evidenced by ISS Program acceptance of IMM risk forecasts, transition to an operational model and simulation tool, and completion of service requests from a broad end user consortium including Operations, Science and Technology Planning, and Exploration Planning. CONCLUSIONS: The VV&C approach established by the IMM project, combining the IMM V&V Plan with 7009 requirements, is comprehensive and includes the involvement of end users at every stage in IMM evolution. Methods and techniques used to quantify the VV&C status of the IMM have not only received approval from the local NASA community but have also garnered recognition by other federal agencies seeking to develop similar guidelines in the medical modeling community.

  31. Remembering the past and planning for the future in rats

    PubMed Central

    Crystal, Jonathon D.

    2012-01-01

    A growing body of research suggests that rats represent and remember specific earlier events from the past. An important criterion for validating a rodent model of episodic memory is to establish that the content of the representation is about a specific event in the past rather than vague information about remoteness. Recent evidence suggests that rats may also represent events that are anticipated to occur in the future. An important capacity afforded by a representation of the future is the ability to plan for the occurrence of a future event. However, relatively little is known about the content of represented future events and the cognitive mechanisms that may support planning. This article reviews evidence that rats remember specific earlier events from the past and represent events that are anticipated to occur in the future, and develops criteria for validating a rodent model of future planning. These criteria include representing a specific time in the future, the ability to temporarily disengage from a plan and reactivate the plan at an appropriate time in the future, and flexibility to deploy a plan in novel conditions. PMID:23219951

  32. Clinical implementation of a knowledge based planning tool for prostate VMAT.

    PubMed

    Powis, Richard; Bird, Andrew; Brennan, Matthew; Hinks, Susan; Newman, Hannah; Reed, Katie; Sage, John; Webster, Gareth

    2017-05-08

    A knowledge based planning tool has been developed and implemented for prostate VMAT radiotherapy plans providing a target average rectum dose value based on previously achievable values for similar rectum/PTV overlap. The purpose of this planning tool is to highlight sub-optimal clinical plans and to improve plan quality and consistency. A historical cohort of 97 VMAT prostate plans was interrogated using a RayStation script and used to develop a local model for predicting optimum average rectum dose based on individual anatomy. A preliminary validation study was performed whereby historical plans identified as "optimal" and "sub-optimal" by the local model were replanned in a blinded study by four experienced planners and compared to the original clinical plan to assess whether any improvement in rectum dose was observed. The predictive model was then incorporated into a RayStation script and used as part of the clinical planning process. Planners were asked to use the script during planning to provide a patient specific prediction for optimum average rectum dose and to optimise the plan accordingly. Plans identified as "sub-optimal" in the validation study observed a statistically significant improvement in average rectum dose compared to the clinical plan when replanned whereas plans that were identified as "optimal" observed no improvement when replanned. This provided confidence that the local model can identify plans that were suboptimal in terms of rectal sparing. Clinical implementation of the knowledge based planning tool reduced the population-averaged mean rectum dose by 5.6Gy. There was a small but statistically significant increase in total MU and femoral head dose and a reduction in conformity index. These did not affect the clinical acceptability of the plans and no significant changes to other plan quality metrics were observed. The knowledge-based planning tool has enabled substantial reductions in population-averaged mean rectum dose for prostate VMAT patients. This suggests plans are improved when planners receive quantitative feedback on plan quality against historical data.
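
    The core of such a tool is a lookup from rectum/PTV overlap to a historically achievable mean rectum dose. A minimal stand-in, fitting a linear trend to a synthetic cohort (not the paper's RayStation script or model), might look like:

    ```python
    import numpy as np

    # Synthetic historical cohort: overlap fraction vs. achieved mean rectum dose.
    rng = np.random.default_rng(2)
    overlap = rng.uniform(0.02, 0.30, 97)                         # rectum/PTV overlap
    mean_rectum = 25.0 + 80.0 * overlap + rng.normal(0, 2.0, 97)  # Gy, invented trend

    predict = np.poly1d(np.polyfit(overlap, mean_rectum, 1))      # linear fit
    print(f"predicted achievable mean rectum dose at 15% overlap: {predict(0.15):.1f} Gy")
    ```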

  33. Space Weather Modeling Services at the Community Coordinated Modeling Center

    NASA Technical Reports Server (NTRS)

    Hesse, Michael

    2006-01-01

    The Community Coordinated Modeling Center (CCMC) is a multi-agency partnership, which aims at the creation of next generation space weather models. The goal of the CCMC is to support the research and developmental work necessary to substantially increase the present-day modeling capability for space weather purposes, and to provide models for transition to the Rapid Prototyping Centers at the space weather forecast centers. This goal requires close collaborations with and substantial involvement of the research community. The physical regions to be addressed by CCMC-related activities range from the solar atmosphere to the Earth's upper atmosphere. The CCMC is an integral part of the National Space Weather Program Implementation Plan, of NASA's Living With a Star (LWS) initiative, and of the Department of Defense Space Weather Transition Plan. CCMC includes a facility at NASA Goddard Space Flight Center. CCMC also provides, to the research community, access to state-of-the-art space research models. In this paper we will provide a description of the current CCMC status, discuss current plans, research and development accomplishments and goals, and describe the model testing and validation process undertaken as part of the CCMC mandate. Special emphasis will be on solar and heliospheric models currently residing at CCMC, and on plans for validation and verification.

  34. Space Weather Modeling at the Community Coordinated Modeling Center

    NASA Technical Reports Server (NTRS)

    Hesse M.

    2005-01-01

    The Community Coordinated Modeling Center (CCMC) is a multi-agency partnership, which aims at the creation of next generation space weather models. The goal of the CCMC is to support the research and developmental work necessary to substantially increase the present-day modeling capability for space weather purposes, and to provide models for transition to the rapid prototyping centers at the space weather forecast centers. This goal requires close collaborations with and substantial involvement of the research community. The physical regions to be addressed by CCMC-related activities range from the solar atmosphere to the Earth's upper atmosphere. The CCMC is an integral part of the National Space Weather Program Implementation Plan, of NASA's Living With a Star (LWS) initiative, and of the Department of Defense Space Weather Transition Plan. CCMC includes a facility at NASA Goddard Space Flight Center, as well as distributed computing facilities provided by the US Air Force. CCMC also provides, to the research community, access to state-of-the-art space research models. In this paper we will provide updates on CCMC status, on current plans, research and development accomplishments and goals, and on the model testing and validation process undertaken as part of the CCMC mandate. Special emphasis will be on solar and heliospheric models currently residing at CCMC, and on plans for validation and verification.

  35. Planning of reach-and-grasp movements: effects of validity and type of object information

    NASA Technical Reports Server (NTRS)

    Loukopoulos, L. D.; Engelbrecht, S. F.; Berthier, N. E.

    2001-01-01

    Individuals are assumed to plan reach-and-grasp movements by using two separate processes. In one of the processes, extrinsic (direction, distance) object information is used in planning the movement of the arm that transports the hand to the target location (transport planning); in the other, intrinsic (shape) object information is used in planning the preshaping of the hand and the grasping of the target object (manipulation planning). In 2 experiments, the authors used primes to provide information to participants (N = 5, Experiment 1; N = 6, Experiment 2) about extrinsic and intrinsic object properties. The validity of the prime information was systematically varied. The primes were succeeded by a cue, which always correctly identified the location and shape of the target object. Reaction times were recorded. Four models of transport and manipulation planning were tested. The only model that was consistent with the data was one in which arm transport and object manipulation planning were postulated to be independent processes that operate partially in parallel. The authors suggest that the processes involved in motor planning before execution are primarily concerned with the geometric aspects of the upcoming movement but not with the temporal details of its execution.

  36. Pre- and Post-Planned Evaluation: Which Is Preferable?

    ERIC Educational Resources Information Center

    Strasser, Stephen; Deniston, O. Lynn

    1978-01-01

    Factors involved in pre-planned and post-planned evaluation of program effectiveness are compared: (1) reliability and cost of data; (2) internal and external validity; (3) obtrusiveness and threat; (4) goal displacement and program direction. A model to help program administrators decide which approach is more appropriate is presented. (Author/MH)

  37. SBKF Modeling and Analysis Plan: Buckling Analysis of Compression-Loaded Orthogrid and Isogrid Cylinders

    NASA Technical Reports Server (NTRS)

    Lovejoy, Andrew E.; Hilburger, Mark W.

    2013-01-01

    This document outlines a Modeling and Analysis Plan (MAP) to be followed by the SBKF analysts. It includes instructions on modeling and analysis formulation and execution, model verification and validation, identifying sources of error and uncertainty, and documentation. The goal of this MAP is to provide a standardized procedure that ensures uniformity and quality of the results produced by the project and corresponding documentation.

  18. Validation of a Parametric Approach for 3d Fortification Modelling: Application to Scale Models

    NASA Astrophysics Data System (ADS)

    Jacquot, K.; Chevrier, C.; Halin, G.

    2013-02-01

    The parametric modelling approach applied to the virtual representation of cultural heritage is a field of research explored for years, since it can address many limitations of digitising tools. For example, essential historical sources for fortification virtual reconstructions like plans-reliefs have several shortcomings when they are scanned. To overcome those problems, knowledge-based modelling can be used: knowledge models based on the analysis of the theoretical literature of a specific domain, such as bastioned fortification treatises, can be the cornerstone of the creation of a parametric library of fortification components. Implemented in Grasshopper, these components are manually adjusted to the data available (i.e., 3D surveys of plans-reliefs or scanned maps). Most of the fortification area is now modelled, and the question of accuracy assessment is raised. A specific method is used to evaluate the accuracy of the parametric components, and the results of the assessment process will allow us to validate the parametric approach. The automation of the adjustment process can then be planned. The virtual model of the fortification is part of a larger project aimed at promoting and disseminating a unique cultural heritage item: the collection of plans-reliefs. As such, knowledge models are precious assets when automation and semantic enhancements are considered.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sudhyadhom, A; McGuinness, C; Descovich, M

    Purpose: To develop a methodology for validation of a Monte-Carlo dose calculation model for robotic small-field SRS/SBRT deliveries. Methods: In a robotic treatment planning system, a Monte-Carlo model was iteratively optimized to match beam data. A two-part analysis was developed to verify this model: 1) the Monte-Carlo model was validated in a simulated water phantom against a Ray-Tracing calculation on a single-beam, collimator-by-collimator basis; 2) the Monte-Carlo model was validated to be accurate in the most challenging situation, lung, by acquiring in-phantom measurements. A plan was created and delivered in a CIRS lung phantom with a film insert. Separately, plans were delivered in an in-house lung phantom with a PinPoint chamber insert within a lung-simulating material. For medium to large collimator sizes, a single beam was delivered to the phantom. For small collimators (10, 12.5, and 15 mm), a robotically delivered plan was created to generate a uniform dose field of irradiation over a 2×2 cm² area. Results: Dose differences in simulated water between Ray-Tracing and Monte-Carlo were all within 1% at dmax and deeper. Maximum dose differences occurred prior to dmax but were all within 3%. Film measurements in a lung phantom showed high correspondence, with over 95% gamma passing at the 2%/2mm level for Monte-Carlo. Ion chamber measurements for collimator sizes of 12.5 mm and above were within 3% of Monte-Carlo calculated values. Uniform irradiation involving the 10 mm collimator resulted in a dose difference of ~8% for both Monte-Carlo and Ray-Tracing, indicating that there may be limitations with the dose calculation. Conclusion: We have developed a methodology to validate a Monte-Carlo model by verifying that it matches measurements in water and, separately, that it corresponds well in lung-simulating materials. The Monte-Carlo model and algorithm tested may have more limited accuracy for 10 mm fields and smaller.

  20. A flexible Monte Carlo tool for patient or phantom specific calculations: comparison with preliminary validation measurements

    NASA Astrophysics Data System (ADS)

    Davidson, S.; Cui, J.; Followill, D.; Ibbott, G.; Deasy, J.

    2008-02-01

    The Dose Planning Method (DPM) is one of several 'fast' Monte Carlo (MC) computer codes designed to produce an accurate dose calculation for advanced clinical applications. We have developed a flexible machine modeling process and validation tests for open-field and IMRT calculations. To complement the DPM code, a practical and versatile source model has been developed, whose parameters are derived from a standard set of planning system commissioning measurements. The primary photon spectrum and the spectrum resulting from the flattening filter are modeled by a Fatigue function, cut off by a multiplying Fermi function, which effectively regularizes the difficult energy spectrum determination process. Commonly used functions are applied to represent the off-axis softening, the increase of primary fluence with increasing angle (the 'horn effect'), and electron contamination. The patient-dependent aspect of the MC dose calculation utilizes the multi-leaf collimator (MLC) leaf sequence file exported from the treatment planning system DICOM output, coupled with the source model, to derive the particle transport. This model has been commissioned for Varian 2100C 6 MV and 18 MV photon beams using percent depth dose, dose profiles, and output factors. A 3-D conformal plan and an IMRT plan delivered to an anthropomorphic thorax phantom were used to benchmark the model. The calculated results were compared to Pinnacle v7.6c results and measurements made using radiochromic film and thermoluminescent detectors (TLD).
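    As a rough illustration only: the sketch below samples a photon spectrum built from a rise-and-decay term shaped by a multiplying Fermi cutoff near the nominal accelerating potential. The functional form is a simplified stand-in for the Fatigue function named above, and every parameter value is invented for the example.

```python
import numpy as np

def primary_spectrum(energy_mev, e_char=1.5, e_cut=6.0, width=0.25):
    """Toy photon spectrum: a rise-and-decay term (standing in for the
    Fatigue function) multiplied by a Fermi cutoff near the nominal
    accelerating potential. All parameter values are illustrative."""
    rise_decay = energy_mev * np.exp(-energy_mev / e_char)
    fermi_cutoff = 1.0 / (1.0 + np.exp((energy_mev - e_cut) / width))
    return rise_decay * fermi_cutoff

energies = np.linspace(0.1, 7.0, 200)   # MeV grid for a 6 MV beam
weights = primary_spectrum(energies)
weights /= weights.sum()                # normalize into a sampling PDF
```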

  1. Next generation terminology infrastructure to support interprofessional care planning.

    PubMed

    Collins, Sarah; Klinkenberg-Ramirez, Stephanie; Tsivkin, Kira; Mar, Perry L; Iskhakova, Dina; Nandigam, Hari; Samal, Lipika; Rocha, Roberto A

    2017-11-01

    Develop a prototype of an interprofessional terminology and information model infrastructure that can enable care planning applications to facilitate patient-centered care, learn care plan linkages and associations, provide decision support, and enable automated, prospective analytics. The study followed a three-step approach: (1) process model and clinical scenario development; (2) requirements analysis; and (3) development and validation of information and terminology models. Components of the terminology model include: Health Concerns, Goals, Decisions, Interventions, Assessments, and Evaluations. A terminology infrastructure should: (A) Include discrete care plan concepts; (B) Include sets of profession-specific concerns, decisions, and interventions; (C) Communicate rationales, anticipatory guidance, and guidelines that inform decisions among the care team; (D) Define semantic linkages across clinical events and professions; (E) Define sets of shared patient goals and sub-goals, including patient stated goals; (F) Capture evaluation toward achievement of goals. These requirements were mapped to the AHRQ Care Coordination Measures Framework. This study used a constrained set of clinician-validated clinical scenarios. Terminology models for goals and decisions are unavailable in SNOMED CT, limiting the ability to evaluate these aspects of the proposed infrastructure. Defining and linking subsets of care planning concepts appears to be feasible, but also essential to model interprofessional care planning for common co-occurring conditions and chronic diseases. We recommend the creation of goal dynamics and decision concepts in SNOMED CT to further enable the necessary models. Systems with flexible terminology management infrastructure may enable intelligent decision support to identify conflicting and aligned concerns, goals, decisions, and interventions in shared care plans, ultimately decreasing documentation effort and cognitive burden for clinicians and patients. Copyright © 2017 Elsevier Inc. All rights reserved.

  2. The SCALE Verified, Archived Library of Inputs and Data - VALID

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marshall, William BJ J; Rearden, Bradley T

    The Verified, Archived Library of Inputs and Data (VALID) at ORNL contains high quality, independently reviewed models and results that improve confidence in analysis. VALID is developed and maintained according to a procedure of the SCALE quality assurance (QA) plan. This paper reviews the origins of the procedure and its intended purpose, the philosophy of the procedure, some highlights of its implementation, and the future of the procedure and associated VALID library. The original focus of the procedure was the generation of high-quality models that could be archived at ORNL and applied to many studies. The review process associated with model generation minimized the chances of errors in these archived models. Subsequently, the scope of the library and procedure was expanded to provide high quality, reviewed sensitivity data files for deployment through the International Handbook of Evaluated Criticality Safety Benchmark Experiments (IHECSBE). Sensitivity data files for approximately 400 such models are currently available. The VALID procedure and library continue fulfilling these multiple roles. The VALID procedure is based on the quality assurance principles of ISO 9001 and nuclear safety analysis. Some of these key concepts include: independent generation and review of information, generation and review by qualified individuals, use of appropriate references for design data and documentation, and retrievability of the models, results, and documentation associated with entries in the library. Some highlights of the detailed procedure are discussed to provide background on its implementation and to indicate limitations of data extracted from VALID for use by the broader community. Specifically, external users of data generated within VALID must take responsibility for ensuring that the files are used within the QA framework of their organization and that use is appropriate. The future plans for the VALID library include expansion to include additional experiments from the IHECSBE, to include experiments from areas beyond criticality safety, such as reactor physics and shielding, and to include application models. In the future, external SCALE users may also obtain qualification under the VALID procedure and be involved in expanding the library. The VALID library provides a pathway for the criticality safety community to leverage modeling and analysis expertise at ORNL.

  3. Deep Impact Sequence Planning Using Multi-Mission Adaptable Planning Tools With Integrated Spacecraft Models

    NASA Technical Reports Server (NTRS)

    Wissler, Steven S.; Maldague, Pierre; Rocca, Jennifer; Seybold, Calina

    2006-01-01

    The Deep Impact mission was ambitious and challenging. JPL's well-proven, easily adaptable multi-mission sequence planning tools, combined with integrated spacecraft subsystem models, enabled a small operations team to develop, validate, and execute extremely complex sequence-based activities within very short development times. This paper focuses on the core planning tool used in the mission, APGEN. It shows how the multi-mission design and adaptability of APGEN made it possible to model spacecraft subsystems as well as ground assets throughout the lifecycle of the Deep Impact project, starting with models of initial, high-level mission objectives and culminating in detailed predictions of spacecraft behavior during mission-critical activities.

  4. Description of a Website Resource for Turbulence Modeling Verification and Validation

    NASA Technical Reports Server (NTRS)

    Rumsey, Christopher L.; Smith, Brian R.; Huang, George P.

    2010-01-01

    The activities of the Turbulence Model Benchmarking Working Group - which is a subcommittee of the American Institute of Aeronautics and Astronautics (AIAA) Fluid Dynamics Technical Committee - are described. The group's main purpose is to establish a web-based repository for Reynolds-averaged Navier-Stokes turbulence model documentation, including verification and validation cases. This turbulence modeling resource has been established based on feedback from a survey on what is needed to achieve consistency and repeatability in turbulence model implementation and usage, and to document and disseminate information on new turbulence models or improvements to existing models. The various components of the website are described in detail: description of turbulence models, turbulence model readiness rating system, verification cases, validation cases, validation databases, and turbulence manufactured solutions. An outline of future plans of the working group is also provided.

  5. EDMS Multi-year Validation Plan

    DOT National Transportation Integrated Search

    2001-06-01

    The Emissions and Dispersion Modeling System (EDMS) is the air quality model required for use on airport projects by the Federal Aviation Administration (FAA). This model has continued to be improved and recently has included several important enhanc...

  6. Experimental Evaluation of a Planning Language Suitable for Formal Verification

    NASA Technical Reports Server (NTRS)

    Butler, Rick W.; Munoz, Cesar A.; Siminiceanu, Radu I.

    2008-01-01

    The marriage of model checking and planning faces two seemingly diverging alternatives: the need for a planning language expressive enough to capture the complexity of real-life applications, as opposed to a language simple, yet robust enough to be amenable to exhaustive verification and validation techniques. In an attempt to reconcile these differences, we have designed an abstract plan description language, ANMLite, inspired from the Action Notation Modeling Language (ANML) [17]. We present the basic concepts of the ANMLite language as well as an automatic translator from ANMLite to the model checker SAL (Symbolic Analysis Laboratory) [7]. We discuss various aspects of specifying a plan in terms of constraints and explore the implications of choosing a robust logic behind the specification of constraints, rather than simply propose a new planning language. Additionally, we provide an initial assessment of the efficiency of model checking to search for solutions of planning problems. To this end, we design a basic test benchmark and study the scalability of the generated SAL models in terms of plan complexity.
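    To make "model checking as plan search" concrete, here is a minimal sketch: a toy three-action planning problem solved by breadth-first exploration of the reachable state space, the same exhaustive style a model checker such as SAL applies to a translated plan. The actions and predicates are invented; this is not ANMLite or SAL syntax.

```python
from collections import deque

# Hypothetical mini planning domain: each action maps a state (a set of
# facts) to a successor state, or to None when its precondition fails.
actions = {
    "move":   lambda s: (s - {"at_base"}) | {"at_target"} if "at_base" in s else None,
    "pickup": lambda s: s | {"holding"} if "at_target" in s else None,
    "store":  lambda s: (s - {"holding"}) | {"stored"} if "holding" in s else None,
}

def search_plan(initial, goal):
    """Breadth-first exhaustive search over reachable states, analogous to
    a model checker searching for a trace that satisfies the goal."""
    frontier = deque([(frozenset(initial), [])])
    visited = {frozenset(initial)}
    while frontier:
        state, plan = frontier.popleft()
        if goal <= state:
            return plan
        for name, apply_action in actions.items():
            successor = apply_action(set(state))
            if successor is not None and frozenset(successor) not in visited:
                visited.add(frozenset(successor))
                frontier.append((frozenset(successor), plan + [name]))
    return None  # goal unreachable

print(search_plan({"at_base"}, {"stored"}))  # -> ['move', 'pickup', 'store']
```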

  7. A Model-based Approach to Controlling the ST-5 Constellation Lights-Out Using the GMSEC Message Bus and Simulink

    NASA Technical Reports Server (NTRS)

    Witt, Kenneth J.; Stanley, Jason; Shendock, Robert; Mandl, Daniel

    2005-01-01

    Space Technology 5 (ST-5) is a three-satellite constellation, technology validation mission under the New Millennium Program at NASA to be launched in March 2006. One of the key technologies to be validated is a lights-out, model-based operations approach to be used for one week to control the ST-5 constellation with no manual intervention. The ground architecture features the GSFC Mission Services Evolution Center (GMSEC) middleware, which allows easy plugging in of software components and a standardized messaging protocol over a software bus. A predictive modeling tool built on MATLAB's Simulink software package makes use of the GMSEC standard messaging protocol to interface to the Advanced Mission Planning System (AMPS) Scenario Scheduler, which controls all activities, resource allocation, and real-time re-profiling of constellation resources when non-nominal events occur. The key features of this system, which we refer to as the ST-5 Simulink system, are as follows: The original daily plan is checked to make sure that predicted resources needed are available by comparing the plan against the model. As the plan is run in real time, the system re-profiles future activities in real time if planned activities do not occur in the predicted timeframe or fashion. Alert messages are sent out on the GMSEC bus by the system if future predicted problems are detected. This allows the Scenario Scheduler to correct the situation before the problem happens. The predictive model is evolved automatically over time via telemetry updates, thus reducing the cost of implementing and maintaining the models by an order of magnitude compared with previous efforts at GSFC, such as the model-based system built for MAP in the mid-1990s. This paper will describe the key features, lessons learned, and implications for future missions once this system is successfully validated on-orbit in 2006.

  8. Validation of Mission Plans Through Simulation

    NASA Astrophysics Data System (ADS)

    St-Pierre, J.; Melanson, P.; Brunet, C.; Crabtree, D.

    2002-01-01

    The purpose of a spacecraft mission planning system is to automatically generate safe and optimized mission plans for a single spacecraft, or for several functioning in unison. The system verifies user input syntax, conformance to commanding constraints, absence of duty cycle violations, timing conflicts, state conflicts, etc. Present-day constraint-based systems with state-based predictive models use verification rules derived from expert knowledge. A familiar solution found in Mission Operations Centers is to complement the planning system with a high-fidelity spacecraft simulator. Often a dedicated workstation, the simulator is frequently used for operator training and procedure validation, and may be interfaced to actual control stations with command and telemetry links. While there are distinct advantages to having a planning system offer realistic operator training using the actual flight control console, physical verification of data transfer across layers, and procedure validation, experience has revealed some drawbacks and inefficiencies in ground segment operations. With these considerations, two simulation-based mission plan validation projects are under way at the Canadian Space Agency (CSA): RVMP and ViSION. The tools proposed in these projects will automatically run scenarios and provide execution reports to operations planning personnel, prior to actual command upload. This can provide an important safeguard against system or human errors that can only be detected with high-fidelity, interdependent spacecraft models running concurrently. The core element common to these projects is a spacecraft simulator, built with off-the-shelf components such as CAE's Real-Time Object-Based Simulation Environment (ROSE) technology, MathWorks' MATLAB/Simulink, and Analytical Graphics' Satellite Tool Kit (STK). To complement these tools, additional components were developed, such as an emulated Spacecraft Test and Operations Language (STOL) interpreter and CCSDS TM/TC encoders and decoders. This paper discusses the use of simulation in the context of space mission planning, describes the projects under way, and proposes additional venues of investigation and development.

  9. Development Instrument’s Learning of Physics Through Scientific Inquiry Model Based Batak Culture to Improve Science Process Skill and Student’s Curiosity

    NASA Astrophysics Data System (ADS)

    Nasution, Derlina; Syahreni Harahap, Putri; Harahap, Marabangun

    2018-03-01

    This research aims to: (1) develop physics learning instruments (lesson plan, worksheet, student's book, teacher's guide book, and test instrument) for a scientific inquiry learning model based on Batak culture, intended to improve students' science process skills and curiosity; and (2) describe the quality of the developed instruments for high school use with the scientific inquiry learning model based on Batak culture. This is development research: the physics learning instruments were developed using a model adapted from the development model of Thiagarajan, Semmel, and Semmel. Its stages, followed until a valid, practical, and effective set of instruments was obtained, are: (1) the definition phase, (2) the planning phase, and (3) the development phase. Testing included expert validation, small-group trials, and a limited classroom trial, which was conducted in class X MIA at SMAN 1 Padang Bolak. This research resulted in: (1) physics learning instruments on static fluid material for high school grade 10 (lesson plan, worksheet, student's book, teacher's guide book, and test instrument) of a quality worthy of use in the learning process; and (2) evidence that each component of the instruments meets the criteria of being valid, practical, and effective for improving students' science process skills and curiosity.

  10. Analytical and Experimental Study to Improve Computer Models for Mixing and Dilution of Soluble Hazardous Chemicals.

    DTIC Science & Technology

    1982-08-01

    [Only OCR fragments of the report's front matter survive in this record: list-of-figures entries (Trajectory and Concentration of Various Plumes; IV.2 Tank and Cargo Geometry Assumed for Discharge Rate Calculation Using HACS Venting Rate Model; IV.4 Original Test Plan for Validation of the Continuous Spill Model; IV.5 Final Test Plan) and nomenclature entries (εx, εy, εz = turbulent diffusivities; ρ = water density; ρc = chemical density; symbols used only in continuous-spill models for a steady river).]

  11. Space Weather Model Testing And Validation At The Community Coordinated Modeling Center

    NASA Astrophysics Data System (ADS)

    Hesse, M.; Kuznetsova, M.; Rastaetter, L.; Falasca, A.; Keller, K.; Reitan, P.

    The Community Coordinated Modeling Center (CCMC) is a multi-agency partnership aimed at the creation of next generation space weather models. The goal of the CCMC is to undertake the research and developmental work necessary to substantially increase the present-day modeling capability for space weather purposes, and to provide models for transition to the rapid prototyping centers at the space weather forecast centers. This goal requires close collaborations with and substantial involvement of the research community. The physical regions to be addressed by CCMC-related activities range from the solar atmosphere to the Earth's upper atmosphere. The CCMC is an integral part of NASA's Living With a Star initiative, of the National Space Weather Program Implementation Plan, and of the Department of Defense Space Weather Transition Plan. CCMC includes a facility at NASA Goddard Space Flight Center, as well as distributed computing facilities provided by the Air Force. CCMC also provides, to the research community, access to state-of-the-art space research models. In this paper we will provide updates on CCMC status, on current plans, research and development accomplishments and goals, and on the model testing and validation process undertaken as part of the CCMC mandate.

  12. Monte Carlo modeling of HD120 multileaf collimator on Varian TrueBeam linear accelerator for verification of 6X and 6X FFF VMAT SABR treatment plans

    PubMed Central

    Gete, Ermias; Duzenli, Cheryl; Teke, Tony

    2014-01-01

    A Monte Carlo (MC) validation of the vendor‐supplied Varian TrueBeam 6 MV flattened (6X) phase‐space file and the first implementation of the Siebers‐Keall MC MLC model as applied to the HD120 MLC (for 6X flat and 6X flattening filter-free (6X FFF) beams) are described. The MC model is validated in the context of VMAT patient‐specific quality assurance. The Monte Carlo commissioning process involves: 1) validating the calculated open‐field percentage depth doses (PDDs), profiles, and output factors (OF), 2) adapting the Siebers‐Keall MLC model to match the new HD120‐MLC geometry and material composition, 3) determining the absolute dose conversion factor for the MC calculation, and 4) validating this entire linac/MLC in the context of dose calculation verification for clinical VMAT plans. MC PDDs for the 6X beams agree with the measured data to within 2.0% for field sizes ranging from 2 × 2 to 40 × 40 cm². Measured and MC profiles show agreement in the 50% field width and the 80%‐20% penumbra region to within 1.3 mm for all square field sizes. MC OFs for the 2 to 40 cm² square fields agree with measurement to within 1.6%. Verification of VMAT SABR lung, liver, and vertebra plans demonstrates that measured and MC ion chamber doses agree within 0.6% for the 6X beam and within 2.0% for the 6X FFF beam. A 3D gamma factor analysis demonstrates that for the 6X beam, > 99% of voxels meet the pass criteria (3%/3 mm). For the 6X FFF beam, > 94% of voxels meet this criterion. The TrueBeam accelerator delivering 6X and 6X FFF beams with the HD120 MLC can be modeled in Monte Carlo to provide an independent 3D dose calculation for clinical VMAT plans. This quality assurance tool has been used clinically to verify over 140 6X and 16 6X FFF TrueBeam treatment plans. PACS number: 87.55.K‐ PMID:24892341
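    The plan verification above rests on a gamma analysis. For reference, here is a minimal 1D global-gamma sketch under simplifying assumptions (uniform grid, dose tolerance taken as a fraction of the measured maximum, no interpolation); clinical 3D implementations add interpolation and full volume handling.

```python
import numpy as np

def gamma_1d(measured, calculated, positions, dose_tol=0.03, dist_tol=3.0):
    """Simplified 1D global gamma index: dose_tol is a fraction of the
    measured maximum; dist_tol shares the units of `positions` (mm)."""
    d_norm = dose_tol * measured.max()
    gammas = []
    for x_ref, d_ref in zip(positions, measured):
        dd = (calculated - d_ref) / d_norm       # dose differences
        dx = (positions - x_ref) / dist_tol      # spatial offsets
        gammas.append(np.sqrt(dd**2 + dx**2).min())
    return np.array(gammas)

x = np.linspace(0, 100, 101)        # positions in mm
meas = np.exp(-x / 80.0)            # toy depth-dose curves
calc = np.exp(-x / 82.0)
pass_rate = (gamma_1d(meas, calc, x) <= 1.0).mean()
print(f"gamma pass rate: {pass_rate:.1%}")
```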

  13. WE-AB-209-05: Development of an Ultra-Fast High Quality Whole Breast Radiotherapy Treatment Planning System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sheng, Y; Li, T; Yoo, S

    2016-06-15

    Purpose: To enable near-real-time (<20 sec) and interactive planning, without compromising quality, for whole breast RT treatment planning using tangential fields. Methods: Whole breast RT plans from 20 patients treated with single energy (SE, 6MV, 10 patients) or mixed energy (ME, 6/15MV, 10 patients) were randomly selected for model training. An additional 20 cases were used as the validation cohort. The planning process for a new case consists of three fully automated steps: 1. Energy Selection: a classification model automatically selects the energy level. To build the energy selection model, principal component analysis (PCA) was applied to the digital reconstructed radiographs (DRRs) of training cases to extract the anatomy-energy relationship. 2. Fluence Estimation: once the energy is selected, a random forest (RF) model generates the initial fluence. This model summarizes the relationship between shape-based features of the patient anatomy and the output fluence. 3. Fluence Fine-tuning: this step balances the overall dose contribution throughout the whole breast tissue by automatically selecting reference points and applying centrality correction. Fine-tuning works at the beamlet level until the dose distribution meets clinical objectives. Prior to finalization, physicians can also make patient-specific trade-offs between target coverage and high-dose volumes. The proposed method was validated by comparing auto-plans with manually generated clinical plans using the Wilcoxon signed-rank test. Results: In 19/20 cases the model suggested the same energy combination as the clinical plans. The target volume coverage V100% was 78.1±4.7% for auto-plans and 79.3±4.8% for clinical plans (p=0.12). Volumes receiving 105% of Rx were 69.2±78.0 cc for auto-plans compared to 83.9±87.2 cc for clinical plans (p=0.13). The mean V10Gy and V20Gy of the ipsilateral lung were 24.4±6.7% and 18.6±6.0% for auto-plans and 24.6±6.7% and 18.9±6.1% for clinical plans (p=0.04, p<0.001). Total computational time for auto-plans was <20 s. Conclusion: We developed an automated method that generates breast radiotherapy plans with accurate energy selection, similar target volume coverage, reduced hotspot volumes, and a significant reduction in planning time, allowing for near-real-time planning.
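    Step 1 of the pipeline (energy selection from DRRs via PCA) can be illustrated in miniature. The data below are synthetic stand-ins for flattened DRR images, and pairing the PCA features with a logistic-regression classifier is an assumption; the abstract states only that PCA extracts the anatomy-energy relationship used for classification.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Synthetic stand-ins: one flattened DRR per training case, plus the
# clinically chosen energy class (0 = 6MV only, 1 = mixed 6/15MV).
rng = np.random.default_rng(0)
drrs = rng.normal(size=(100, 64 * 64))
energy_choice = rng.integers(0, 2, size=100)

# PCA compresses anatomy-related image variation into a few components,
# which a simple classifier then maps to an energy selection.
model = make_pipeline(PCA(n_components=10), LogisticRegression(max_iter=1000))
model.fit(drrs, energy_choice)
print(model.predict(drrs[:1]))   # predicted energy class for one case
```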

  14. Partial validation of the Dutch model for emission and transport of nutrients (STONE).

    PubMed

    Overbeek, G B; Tiktak, A; Beusen, A H; van Puijenbroek, P J

    2001-11-17

    The Netherlands has to cope with large losses of N and P to groundwater and surface water. Agriculture is the dominant source of these nutrients, particularly with reference to nutrient excretion due to intensive animal husbandry in combination with fertilizer use. The Dutch government has recently launched a stricter eutrophication abatement policy to comply with the EC nitrate directive. The Dutch consensus model for N and P emission to groundwater and surface water (STONE) has been developed to evaluate the environmental benefits of abatement plans. Due to the possibly severe socioeconomic consequences of eutrophication abatement plans, it is of utmost importance that the model is thoroughly validated. Because STONE is applied on a nationwide scale, the model validation has also been carried out on this scale. For this purpose the model outputs were compared with lumped results from monitoring networks in the upper groundwater and in surface waters. About 13,000 recent point source observations of nitrate in the upper groundwater were available, along with several hundreds of observations showing N and P in local surface water systems. Comparison of observations from the different spatial scales available showed the issue of scale to be important. Scale issues will be addressed in the next stages of the validation study.

  15. Automated Planning and Scheduling for Orbital Express (151)

    NASA Technical Reports Server (NTRS)

    Knight, Russell

    2008-01-01

    The challenging timeline for DARPA's Orbital Express mission demanded a flexible, responsive, and (above all) safe approach to mission planning. Because the mission was a technology demonstration, pertinent planning information was learned during actual mission execution. This information led to amendments to procedures, which led to changes in the mission plan. In general, we used the ASPEN planner-scheduler to generate and validate the mission plans. We enhanced ASPEN to enable it to reason about uncertainty. We also developed a model generator that would read the text of a procedure and translate it into an ASPEN model. These technologies had a significant impact on the success of the Orbital Express mission.

  16. Implementation of the validation testing in MPPG 5.a "Commissioning and QA of treatment planning dose calculations-megavoltage photon and electron beams".

    PubMed

    Jacqmin, Dustin J; Bredfeldt, Jeremy S; Frigo, Sean P; Smilowitz, Jennifer B

    2017-01-01

    The AAPM Medical Physics Practice Guideline (MPPG) 5.a provides concise guidance on the commissioning and QA of beam modeling and dose calculation in radiotherapy treatment planning systems. This work discusses the implementation of the validation testing recommended in MPPG 5.a at two institutions. The two institutions worked collaboratively to create a common set of treatment fields and analysis tools to deliver and analyze the validation tests. This included the development of a novel, open-source software tool to compare scanning water tank measurements to 3D DICOM-RT Dose distributions. Dose calculation algorithms in both Pinnacle and Eclipse were tested with MPPG 5.a to validate the modeling of Varian TrueBeam linear accelerators. The validation process resulted in more than 200 water tank scans and more than 50 point measurements per institution, each of which was compared to a dose calculation from the institution's treatment planning system (TPS). Overall, the validation testing recommended in MPPG 5.a took approximately 79 person-hours for a machine with four photon and five electron energies for a single TPS. Of the 79 person-hours, 26 person-hours required time on the machine, and the remainder involved preparation and analysis. The basic photon, electron, and heterogeneity correction tests were evaluated with the tolerances in MPPG 5.a, and the tolerances were met for all tests. The MPPG 5.a evaluation criteria were used to assess the small field and IMRT/VMAT validation tests. Both institutions found the use of MPPG 5.a to be a valuable resource during the commissioning process. The validation testing in MPPG 5.a showed the strengths and limitations of the TPS models. In addition, the data collected during the validation testing is useful for routine QA of the TPS, validation of software upgrades, and commissioning of new algorithms. © 2016 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.
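    The comparison tool described above matches water-tank scans against 3D DICOM-RT Dose distributions. A minimal sketch of that kind of comparison with the pydicom package follows; the file names are hypothetical, and the geometry handling (beam axis aligned with the dose-grid frame axis, nearest-voxel lookup at the central in-plane voxel) is deliberately simplified.

```python
import numpy as np
import pydicom

def central_axis_dose(rtdose_path, depths_mm):
    """Sample a DICOM-RT Dose grid at its central in-plane voxel for the
    requested depths, via nearest-frame lookup. Assumes the beam axis is
    aligned with the dose-grid frame axis, as in a water-tank setup."""
    ds = pydicom.dcmread(rtdose_path)
    dose = ds.pixel_array * float(ds.DoseGridScaling)   # (frames, rows, cols)
    offsets = np.asarray(ds.GridFrameOffsetVector, dtype=float)
    r, c = dose.shape[1] // 2, dose.shape[2] // 2
    frames = [int(np.abs(offsets - d).argmin()) for d in depths_mm]
    return dose[frames, r, c]

# Hypothetical files: TPS dose export and a measured percent-depth-dose scan.
depths = np.arange(0.0, 300.0, 5.0)                    # mm
tps = central_axis_dose("plan_dose.dcm", depths)
measured = np.loadtxt("tank_pdd.csv", delimiter=",")   # same depth grid assumed
diff_pct = 100.0 * (tps / tps.max() - measured / measured.max())
```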

  17. Validity and reliability analysis of the planned behavior theory scale related to the testicular self-examination in a Turkish context.

    PubMed

    Iyigun, Emine; Tastan, Sevinc; Ayhan, Hatice; Kose, Gulsah; Acikel, Cengizhan

    2016-06-01

    This study aimed to determine the validity and reliability of the Planned Behavior Theory Scale related to testicular self-examination. The study was carried out in a health-profession higher-education school in Ankara, Turkey, from April to June 2012. The study participants comprised 215 male students. Study data were collected by using a questionnaire, a planned behavior theory scale related to testicular self-examination, and Champion's Health Belief Model Scale (CHBMS). The sub-dimensions of the planned behavior theory scale, namely those of intention, attitude, subjective norms, and self-efficacy, were found to have Cronbach's alpha values between 0.81 and 0.89. Exploratory factor analysis showed that items of the scale had five factors that accounted for 75% of the variance. Of these, the sub-dimension of intention was found to have the highest level of contribution. A significant correlation was found between the sub-dimensions of the testicular self-examination planned behavior theory scale and those of the CHBMS (p < 0.05). The findings suggest that the Turkish version of the testicular self-examination Planned Behavior Theory Scale is a valid and reliable measurement for Turkish society.
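    For reference, the internal-consistency statistic reported above is Cronbach's alpha, whose standard form is

\[
\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k}\sigma^{2}_{Y_i}}{\sigma^{2}_{X}}\right),
\]

    where \(k\) is the number of items, \(\sigma^{2}_{Y_i}\) the variance of item \(i\), and \(\sigma^{2}_{X}\) the variance of the total score; values of 0.81-0.89 indicate high internal consistency.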

  18. The space shuttle payload planning working groups. Volume 8: Earth and ocean physics

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The findings and recommendations of the Earth and Ocean Physics working group of the space shuttle payload planning activity are presented. The requirements for the space shuttle mission are defined as: (1) precision measurement for earth and ocean physics experiments, (2) development and demonstration of new and improved sensors and analytical techniques, (3) acquisition of surface truth data for evaluation of new measurement techniques, (4) conduct of critical experiments to validate geophysical phenomena and instrumental results, and (5) development and validation of analytical/experimental models for global ocean dynamics and solid earth dynamics/earthquake prediction. Tables of data are presented to show the flight schedule, estimated costs, and the mission model.

  19. Hospital-based expert model for health technology procurement planning in hospitals.

    PubMed

    Miniati, R; Cecconi, G; Frosini, F; Dori, F; Regolini, J; Iadanza, E; Biffi Gentili, G

    2014-01-01

    Although technology innovation in healthcare has brought major improvements in the level of care and in patients' quality of life in recent years, hospital complexity and management costs have also grown. For this reason, planning for medical equipment procurement within hospitals is becoming increasingly important in order to sustainably provide appropriate technology for both routine activity and innovative procedures. To support hospital decision makers in technology procurement planning, an expert model was designed, as reported in this paper. It combines the most widely used approaches for technology evaluation, taking into consideration Health Technology Assessment (HTA) and the Medical Equipment Replacement Model (MERM). The design phases included an initial definition of prioritization algorithms, a weighting process based on experts' interviews, and a final validation step comprising both statistical testing and comparison with real decisions. In conclusion, the designed model provides a semi-automated tool that, through the use of multidisciplinary information, can prioritize different requests for technology acquisition in hospitals. Validation outcomes improved the model's accuracy and led to different "user profiles" according to the specific needs of decision makers.
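    The paper's prioritization algorithm is not reproduced here; as a pattern, though, combining criteria scores under expert-derived weights reduces to a weighted sum. In the sketch below, the criteria names, weights, and scores are hypothetical placeholders, not the model's actual inputs.

```python
# Hypothetical criteria and expert-derived weights (sum to 1.0).
CRITERIA_WEIGHTS = {
    "clinical_impact": 0.35,
    "equipment_age_risk": 0.25,
    "utilization": 0.20,
    "maintenance_cost": 0.20,
}

def priority_score(scores):
    """Combine per-criterion scores (0-10) into one priority value."""
    return sum(CRITERIA_WEIGHTS[c] * scores[c] for c in CRITERIA_WEIGHTS)

requests = {
    "MRI upgrade":    {"clinical_impact": 9, "equipment_age_risk": 6,
                       "utilization": 8, "maintenance_cost": 7},
    "Infusion pumps": {"clinical_impact": 6, "equipment_age_risk": 9,
                       "utilization": 7, "maintenance_cost": 5},
}
ranked = sorted(requests, key=lambda r: priority_score(requests[r]), reverse=True)
print(ranked)   # acquisition requests ordered by priority
```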

  20. Grid Transmission Expansion Planning Model Based on Grid Vulnerability

    NASA Astrophysics Data System (ADS)

    Tang, Quan; Wang, Xi; Li, Ting; Zhang, Quanming; Zhang, Hongli; Li, Huaqiang

    2018-03-01

    Based on grid vulnerability and uniformity theory, proposed global network structure and state vulnerability factor model used to measure different grid models. established a multi-objective power grid planning model which considering the global power network vulnerability, economy and grid security constraint. Using improved chaos crossover and mutation genetic algorithm to optimize the optimal plan. For the problem of multi-objective optimization, dimension is not uniform, the weight is not easy given. Using principal component analysis (PCA) method to comprehensive assessment of the population every generation, make the results more objective and credible assessment. the feasibility and effectiveness of the proposed model are validated by simulation results of Garver-6 bus system and Garver-18 bus.

  1. MOSAIC : Model Of Sustainability And Integrated Corridors, phase 3 : comprehensive model calibration and validation and additional model enhancement.

    DOT National Transportation Integrated Search

    2015-02-01

    The Maryland State Highway Administration (SHA) has initiated major planning efforts to improve transportation : efficiency, safety, and sustainability on critical highway corridors through its Comprehensive Highway Corridor : (CHC) program. This pro...

  2. Predicting Patient-specific Dosimetric Benefits of Proton Therapy for Skull-base Tumors Using a Geometric Knowledge-based Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hall, David C.; Trofimov, Alexei V.; Winey, Brian A.

    Purpose: To predict the organ at risk (OAR) dose levels achievable with proton beam therapy (PBT), solely based on the geometric arrangement of the target volume in relation to the OARs. A comparison with an alternative therapy yields a prediction of the patient-specific benefits offered by PBT. This could enable physicians at hospitals without proton capabilities to make a better-informed referral decision or aid patient selection in model-based clinical trials. Methods and Materials: Skull-base tumors were chosen to test the method, owing to their geometric complexity and multitude of nearby OARs. By exploiting the correlations between the dose and distance-to-target in existing PBT plans, the models were independently trained for 6 types of OARs: brainstem, cochlea, optic chiasm, optic nerve, parotid gland, and spinal cord. Once trained, the models could estimate the feasible dose–volume histogram and generalized equivalent uniform dose (gEUD) for OAR structures of new patients. The models were trained using 20 patients and validated using an additional 21 patients. Validation was achieved by comparing the predicted gEUD to that of the actual PBT plan. Results: The predicted and planned gEUD were in good agreement. Considering all OARs, the prediction error was +1.4 ± 5.1 Gy (mean ± standard deviation), and Pearson's correlation coefficient was 93%. By comparing with an intensity modulated photon treatment plan, the model could classify whether an OAR structure would experience a gain, with a sensitivity of 93% (95% confidence interval: 87%-97%) and specificity of 63% (95% confidence interval: 38%-84%). Conclusions: We trained and validated models that could quickly and accurately predict the patient-specific benefits of PBT for skull-base tumors. Similar models could be developed for other tumor sites. Such models will be useful when an estimation of the feasible benefits of PBT is desired but the experience and/or resources required for treatment planning are unavailable.
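    The core idea, predicting achievable OAR dose from the geometric distance of OAR voxels to the target alone, can be sketched as a per-distance-bin dose envelope learned from prior plans. The construction below uses synthetic data and an invented binning/percentile rule; the published models are more elaborate.

```python
import numpy as np

# Synthetic training data standing in for existing proton plans: for OAR
# voxels, distance to the target surface (mm) and the dose received (Gy).
rng = np.random.default_rng(1)
train_dist = rng.uniform(0, 50, 2000)
train_dose = 70 * np.exp(-train_dist / 12) + rng.normal(0, 2, 2000)

# Summarize the dose-distance correlation as a per-bin envelope: the dose
# level typically achievable at each distance from the target.
bins = np.arange(0, 55, 5)
which = np.digitize(train_dist, bins)
envelope = np.array([np.percentile(train_dose[which == b], 90)
                     if np.any(which == b) else 0.0
                     for b in range(1, len(bins) + 1)])

def predict_voxel_doses(new_distances):
    """Estimate achievable dose for each OAR voxel of a new patient from
    its distance to the target alone (feeds a feasible DVH estimate)."""
    b = np.clip(np.digitize(new_distances, bins) - 1, 0, len(envelope) - 1)
    return envelope[b]

feasible = predict_voxel_doses(rng.uniform(0, 50, 500))
print(f"predicted mean OAR dose: {feasible.mean():.1f} Gy")
```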

  3. RELAP-7 Software Verification and Validation Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Curtis L.; Choi, Yong-Joon; Zou, Ling

    This INL plan comprehensively describes the software for RELAP-7 and documents the software, interface, and software design requirements for the application. The plan also describes the testing-based software verification and validation (SV&V) process—a set of specially designed software models used to test RELAP-7. The RELAP-7 (Reactor Excursion and Leak Analysis Program) code is a nuclear reactor system safety analysis code being developed at Idaho National Laboratory (INL). The code is based on the INL’s modern scientific software development framework – MOOSE (Multi-Physics Object-Oriented Simulation Environment). The overall design goal of RELAP-7 is to take advantage of the previous thirty years of advancements in computer architecture, software design, numerical integration methods, and physical models. The end result will be a reactor systems analysis capability that retains and improves upon RELAP5’s capability and extends the analysis capability for all reactor system simulation scenarios.

  4. Pilot utilization plan for satellite data-based service for agriculture in Poland

    NASA Astrophysics Data System (ADS)

    Gatkowska, Martyna; Paradowski, Karol; Wróbel, Karolina

    2017-10-01

    The paper demonstrates the assumptions and achievements of the Pilot Utilization Plan activities performed within the project ASAP "Advanced Sustainable Agricultural Production", co-financed by the European Space Agency under the ARTES IAP Programme. Within the course of the project, the Pilot Utilization Plan (PilUP) activities are performed in order to develop remote sensing based models, and to further calibrate and validate them to achieve an accuracy that meets the requirements of paying customers. The completion of the first PilUP resulted in the development of the following models based on Landsat 8 and Sentinel-2 satellite data: a model for demarcating homogeneous polygons on the basis of a comparison of electromagnetic scanning results and bare-soil spectral reflectance, a model for indicating problematic areas, and a model for yield potential, delivered on the basis of an NDVI map produced one month before harvest and the map of collected yield provided by users participating in PilUP. The second edition of the PilUP is being conducted from March 2017 until the end of 2017. This edition includes farmers and insurance companies. The following activities are planned: development of a model for delimiting losses due to unfavorable wintering of winter crops and validation of the model with in-situ data collected by the insurance companies' in-field investigators, further enhancement of the model for homogeneous polygon delimitation and primary indication of soil productivity, and testing of the applicability and viability of the map of problematic areas with the farmers.
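    Since the yield-potential product above is driven by an NDVI map, the index itself is worth stating: NDVI = (NIR - Red) / (NIR + Red). A minimal computation follows; pairing Sentinel-2 bands B8 (NIR) and B4 (red) is the standard convention, and the reflectance values are toy inputs.

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index from near-infrared and red
    reflectances (e.g., Sentinel-2 bands B8 and B4)."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + 1e-12)   # epsilon guards zero pixels

# Toy reflectance values for three pixels (dense crop, sparse crop, soil):
print(ndvi([0.45, 0.30, 0.20], [0.10, 0.15, 0.18]))
```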

  5. Patient-specific musculoskeletal modeling of the hip joint for preoperative planning of total hip arthroplasty: A validation study based on in vivo measurements

    PubMed Central

    Schick, Fabian; Asseln, Malte; Damm, Philipp; Radermacher, Klaus

    2018-01-01

    Validation of musculoskeletal models for application in preoperative planning is still a challenging task. Ideally, the simulation results of a patient-specific musculoskeletal model are compared to corresponding in vivo measurements. Currently, the only possibility to measure in vivo joint forces is to implant an instrumented prosthesis in patients undergoing a total joint replacement. In this study, a musculoskeletal model of the AnyBody Modeling System was adapted patient-specifically and validated against the in vivo hip joint force measurements of ten subjects performing one-leg stance and level walking. The impact of four model parameters was evaluated: hip joint width, muscle strength, muscle recruitment, and type of muscle model. The smallest difference between the simulated and in vivo hip joint force was achieved by using the hip joint width measured in computed tomography images, a muscle strength of 90 N/cm², a third-order polynomial muscle recruitment, and a simple muscle model. This parameter combination reached mean deviations between simulation and in vivo measurement during the peak force phase of 12% ± 14% in magnitude and 11° ± 5° in orientation for one-leg stance, and 8% ± 6% in magnitude and 10° ± 5° in orientation for level walking. PMID:29649235
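    The two validation metrics used above, percentage deviation in force magnitude and angular deviation in orientation, fall straight out of the simulated and measured force vectors. A small sketch with invented vectors:

```python
import numpy as np

def force_deviation(simulated, measured):
    """Magnitude deviation (% of the measured magnitude) and orientation
    deviation (degrees) between simulated and in vivo force vectors."""
    sim_mag, meas_mag = np.linalg.norm(simulated), np.linalg.norm(measured)
    mag_pct = 100.0 * abs(sim_mag - meas_mag) / meas_mag
    cos_angle = np.dot(simulated, measured) / (sim_mag * meas_mag)
    angle_deg = np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))
    return mag_pct, angle_deg

# Hypothetical peak hip joint force vectors in a femur frame (N).
sim = np.array([-450.0, -320.0, -1900.0])
vivo = np.array([-400.0, -300.0, -1750.0])
print(force_deviation(sim, vivo))
```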

  6. Multiple Roles: The Conflicted Realities of Community College Mission Statements

    ERIC Educational Resources Information Center

    Mrozinski, Mark D.

    2010-01-01

    Questions of efficacy have always plagued the use of mission statement as a strategic planning tool. In most planning models, the mission statement serves to clarify goals and guide the formation of strategies. However, little empirical evidence exists validating that mission statements actually improve the performance of organizations, even…

  7. Using individual patient anatomy to predict protocol compliance for prostate intensity-modulated radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Caine, Hannah; Whalley, Deborah; Kneebone, Andrew

    If a prostate intensity-modulated radiation therapy (IMRT) or volumetric-modulated arc therapy (VMAT) plan has protocol violations, it is often a challenge to know whether this is due to unfavorable anatomy or suboptimal planning. This study aimed to create a model to predict protocol violations based on patient anatomical variables and their potential relationship to target and organ at risk (OAR) end points in the setting of definitive, dose-escalated IMRT/VMAT prostate planning. Radiotherapy plans from 200 consecutive patients treated with definitive radiation for prostate cancer using IMRT or VMAT were analyzed. The first 100 patient plans (hypothesis-generating cohort) were examined to identify anatomical variables that predict dosimetric outcome, in particular OAR end points. Variables that scored significance were further assessed for their ability to predict protocol violations using a Classification and Regression Tree (CART) analysis. These results were then validated in a second group of 100 patients (validation cohort). In the initial analysis of the hypothesis-generating cohort, the percentage of rectum overlapping the planning target volume (PTV) (%OR) and the percentage of bladder overlapping the PTV (%OB) were highlighted as significant predictors of rectal and bladder dosimetry. Lymph node treatment was also significant for bladder outcomes. For the validation cohort, CART analysis showed that %OR of <6%, 6% to 9%, and >9% predicted a 13%, 63%, and 100% rate of rectal protocol violations, respectively. For the bladder, %OB of <9% vs >9% was associated with a 13% vs 88% rate of bladder constraint violations when lymph nodes were not treated. If nodal irradiation was delivered, plans with a %OB of <9% had a 59% risk of violations. The percentage of rectum and bladder within the PTV can be used to identify an individual plan's potential to achieve dose-volume histogram (DVH) constraints. A model based on these factors could be used to reduce planning time, improve work flow, and strengthen plan quality and consistency.
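    The CART step can be reproduced in miniature with scikit-learn's decision tree on a single overlap feature. The data below are synthetic and tuned only to echo the kind of threshold splits reported (cut-points on %OR), not the study's actual fits.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

# Synthetic stand-ins: percentage of rectum overlapping the PTV, and
# whether the plan violated a rectal constraint.
rng = np.random.default_rng(2)
overlap = rng.uniform(0, 15, 200).reshape(-1, 1)
violation = (overlap.ravel() + rng.normal(0, 1.5, 200) > 7.5).astype(int)

# A shallow tree recovers interpretable overlap cut-points.
tree = DecisionTreeClassifier(max_depth=2).fit(overlap, violation)
print(export_text(tree, feature_names=["pct_rectum_overlap"]))
```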

  8. Supporting the Use of CERT (registered trademark) Secure Coding Standards in DoD Acquisitions

    DTIC Science & Technology

    2012-07-01

    [Only OCR fragments of the report survive in this record: a service-mark note for Capability Maturity Model Integration (CMMI®) [Davis 2009]; acronym-list entries (STP = Software Test Plan; TEP = Test and Evaluation Plan; TSP = Team Software Process; V&V = verification and validation); the report number CMU/SEI-2012-TN-016; and the title and authors (Supporting the Use of CERT® Secure Coding Standards in DoD Acquisitions; Tim Morrow and Robert Seacord, Software Engineering Institute).]

  9. Patient-specific cardiac phantom for clinical training and preprocedure surgical planning.

    PubMed

    Laing, Justin; Moore, John; Vassallo, Reid; Bainbridge, Daniel; Drangova, Maria; Peters, Terry

    2018-04-01

    Minimally invasive mitral valve repair procedures, including MitraClip®, are becoming increasingly common. For cases of complex or diseased anatomy, clinicians may benefit from using a patient-specific cardiac phantom for training, surgical planning, and the validation of devices or techniques. An imaging-compatible cardiac phantom was developed to simulate a MitraClip® procedure. The phantom contained a patient-specific cardiac model manufactured using tissue-mimicking materials. To evaluate accuracy, the patient-specific model was imaged using computed tomography (CT) and segmented, and the resulting point cloud dataset was compared, using absolute distance, to the original patient data. Comparing the molded model point cloud to the original dataset yielded a maximum Euclidean distance error of 7.7 mm, an average error of 0.98 mm, and a standard deviation of 0.91 mm. The phantom was validated using a MitraClip® device to ensure anatomical features and tools are identifiable under image guidance. Patient-specific cardiac phantoms may allow for surgical complications to be accounted for in preoperative planning. The information gained by clinicians involved in planning and performing the procedure should lead to shorter procedural times and better outcomes for patients.
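    The accuracy evaluation above, absolute distances between the molded model's point cloud and the original patient data, is essentially a nearest-neighbour query. A minimal sketch with synthetic clouds, using SciPy's k-d tree:

```python
import numpy as np
from scipy.spatial import cKDTree

def cloud_to_cloud_error(model_points, reference_points):
    """Nearest-neighbour distances from each point of the molded model's
    cloud to the reference cloud; both inputs are (N, 3) arrays in mm."""
    dists, _ = cKDTree(reference_points).query(model_points)
    return dists.max(), dists.mean(), dists.std()

# Toy clouds standing in for the segmented CT and the original patient data.
rng = np.random.default_rng(4)
reference = rng.uniform(0, 50, size=(5000, 3))
molded = reference + rng.normal(0, 0.5, size=reference.shape)  # small error
print(cloud_to_cloud_error(molded, reference))
```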

  10. Comparison and Field Validation of Binomial Sampling Plans for Oligonychus perseae (Acari: Tetranychidae) on Hass Avocado in Southern California.

    PubMed

    Lara, Jesus R; Hoddle, Mark S

    2015-08-01

    Oligonychus perseae Tuttle, Baker, & Abatiello is a foliar pest of 'Hass' avocados [Persea americana Miller (Lauraceae)]. The recommended action threshold is 50-100 motile mites per leaf, but this count range and other ecological factors associated with O. perseae infestations limit the application of enumerative sampling plans in the field. Consequently, a comprehensive modeling approach was implemented to compare the practical application of various binomial sampling models for decision-making on O. perseae in California. An initial set of sequential binomial sampling models was developed using three mean-proportion modeling techniques (i.e., Taylor's power law, maximum likelihood, and an empirical model) in combination with two leaf-infestation tally thresholds of either one or two mites. Model performance was evaluated using a robust mite count database consisting of >20,000 Hass avocado leaves infested with varying densities of O. perseae and collected from multiple locations. Operating characteristic and average sample number results for the sequential binomial models were used as the basis to develop and validate a standardized fixed-size binomial sampling model with guidelines on sample tree and leaf selection within blocks of avocado trees. This final validated model requires a sampling cost of 30 leaves and takes into account the spatial dynamics of O. perseae to make reliable mite density classifications for a 50-mite action threshold. Recommendations for implementing this fixed-size binomial sampling plan to assess densities of O. perseae in commercial California avocado orchards are discussed. © The Authors 2015. Published by Oxford University Press on behalf of Entomological Society of America. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
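    Of the three mean-proportion techniques named, Taylor's power law is the most compact to illustrate: with the count variance modelled as a·m^b, a negative-binomial zero term links mean mite density m to the proportion of leaves holding at least one mite (the one-mite tally threshold). This is one standard route, not necessarily the exact formulation fitted in the paper, and the coefficients below are invented for the sketch.

```python
import numpy as np

def prop_infested(mean_density, a=3.0, b=1.4):
    """Predicted proportion of leaves with at least one mite, using the
    negative-binomial zero term with k derived from Taylor's power law
    (variance = a * mean**b). Coefficients are illustrative; the model
    needs var > mean, i.e., aggregated counts."""
    m = np.asarray(mean_density, dtype=float)
    var = a * m**b
    k = m**2 / (var - m)
    return 1.0 - (1.0 + m / k) ** (-k)

# Proportion infested below, at, and above the 50-mite action threshold:
for m in (10, 50, 100):
    print(m, round(float(prop_infested(m)), 3))
```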

  11. Integrated tokamak modeling: when physics informs engineering and research planning

    NASA Astrophysics Data System (ADS)

    Poli, Francesca

    2017-10-01

    Simulations that integrate virtually all the relevant engineering and physics aspects of a real tokamak experiment are a powerful tool for experimental interpretation, model validation, and planning for both present and future devices. This tutorial will guide through the building blocks of an ``integrated'' tokamak simulation, such as magnetic flux diffusion; thermal, momentum, and particle transport; external heating and current drive sources; and wall particle sources and sinks. Emphasis is given to the connection and interplay between external actuators and plasma response, and between the slow time scales of current diffusion and the fast time scales of transport, and to how reduced and high-fidelity models can contribute to simulating a whole device. To illustrate the potential and limitations of integrated tokamak modeling for discharge prediction, a helium plasma scenario for the ITER pre-nuclear phase is taken as an example. This scenario presents challenges because it requires core-edge integration and advanced models for the interaction between waves and fast ions, which are subject to a limited experimental database for validation and guidance. Starting from a scenario obtained by re-scaling parameters from the demonstration inductive ``ITER baseline'', it is shown how self-consistent simulations that encompass both core and edge plasma regions, as well as high-fidelity heating and current drive source models, are needed to set constraints on the density, magnetic field, and heating scheme. This tutorial aims at demonstrating how integrated modeling, when used with an adequate level of criticism, can not only support the design of operational scenarios but also help to assess the limitations and gaps in the available models, thus indicating where improved modeling tools are required and how present experiments can help their validation and inform research planning. Work supported by DOE under DE-AC02-09CH1146.
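    The time-scale separation stressed above can be stated compactly: in textbook order-of-magnitude form the resistive (current diffusion) time satisfies

\[
\tau_R \sim \frac{\mu_0 a^2}{\eta} \gg \tau_E,
\]

    where \(a\) is the plasma minor radius, \(\eta\) the plasma resistivity, and \(\tau_E\) the energy confinement time; an integrated simulation must advance both scales self-consistently.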

  12. Recovery Act. Development and Validation of an Advanced Stimulation Prediction Model for Enhanced Geothermal System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gutierrez, Marte

    The research project aims to develop and validate an advanced computer model that can be used in the planning and design of stimulation techniques to create engineered reservoirs for Enhanced Geothermal Systems. The specific objectives of the proposal are to: 1) develop a true three-dimensional hydro-thermal fracturing simulator that is particularly suited for EGS reservoir creation; 2) perform laboratory scale model tests of hydraulic fracturing and proppant flow/transport using a polyaxial loading device, and use the laboratory results to test and validate the 3D simulator; 3) perform discrete element/particulate modeling of proppant transport in hydraulic fractures, and use the results to improve understanding of proppant flow and transport; 4) test and validate the 3D hydro-thermal fracturing simulator against case histories of EGS energy production; and 5) develop a plan to commercialize the 3D fracturing and proppant flow/transport simulator. The project is expected to yield several specific results and benefits. Major technical products from the proposal include: 1) a true-3D hydro-thermal fracturing computer code that is particularly suited to EGS, 2) documented results of scale model tests on hydro-thermal fracturing and fracture propping in an analogue crystalline rock, 3) documented procedures and results of discrete element/particulate modeling of flow and transport of proppants for EGS applications, and 4) a database of monitoring data, with a focus on Acoustic Emissions (AE) from lab-scale modeling and field case histories of EGS reservoir creation.

  13. Recovery Act. Development and Validation of an Advanced Stimulation Prediction Model for Enhanced Geothermal Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gutierrez, Marte

    2013-12-31

    This research project aims to develop and validate an advanced computer model that can be used in the planning and design of stimulation techniques to create engineered reservoirs for Enhanced Geothermal Systems. The specific objectives of the proposal are to: develop a true three-dimensional hydro-thermal fracturing simulator that is particularly suited for EGS reservoir creation; perform laboratory scale model tests of hydraulic fracturing and proppant flow/transport using a polyaxial loading device, and use the laboratory results to test and validate the 3D simulator; perform discrete element/particulate modeling of proppant transport in hydraulic fractures, and use the results to improve understanding of proppant flow and transport; test and validate the 3D hydro-thermal fracturing simulator against case histories of EGS energy production; and develop a plan to commercialize the 3D fracturing and proppant flow/transport simulator. The project is expected to yield several specific results and benefits. Major technical products from the proposal include: a true-3D hydro-thermal fracturing computer code that is particularly suited to EGS; documented results of scale model tests on hydro-thermal fracturing and fracture propping in an analogue crystalline rock; documented procedures and results of discrete element/particulate modeling of flow and transport of proppants for EGS applications; and a database of monitoring data, with a focus on Acoustic Emissions (AE) from lab-scale modeling and field case histories of EGS reservoir creation.

  14. Development and Validation of a Predictive Model to Identify Individuals Likely to Have Undiagnosed Chronic Obstructive Pulmonary Disease Using an Administrative Claims Database.

    PubMed

    Moretz, Chad; Zhou, Yunping; Dhamane, Amol D; Burslem, Kate; Saverno, Kim; Jain, Gagan; Devercelli, Giovanna; Kaila, Shuchita; Ellis, Jeffrey J; Hernandez, Gemzel; Renda, Andrew

    2015-12-01

    Despite the importance of early detection, delayed diagnosis of chronic obstructive pulmonary disease (COPD) is relatively common. Approximately 12 million people in the United States have undiagnosed COPD. Diagnosis of COPD is essential for the timely implementation of interventions, such as smoking cessation programs, drug therapies, and pulmonary rehabilitation, which are aimed at improving outcomes and slowing disease progression. To develop and validate a predictive model to identify patients likely to have undiagnosed COPD using administrative claims data. A predictive model was developed and validated utilizing a retrospective cohort of patients with and without a COPD diagnosis (cases and controls), aged 40-89, with a minimum of 24 months of continuous health plan enrollment (Medicare Advantage Prescription Drug [MAPD] and commercial plans), and identified between January 1, 2009, and December 31, 2012, using Humana's claims database. Stratified random sampling based on plan type (commercial or MAPD) and index year was performed to ensure that cases and controls had a similar distribution of these variables. Cases and controls were compared to identify demographic, clinical, and health care resource utilization (HCRU) characteristics associated with a COPD diagnosis. Stepwise logistic regression (SLR), neural networking, and decision trees were used to develop a series of models. The models were trained, validated, and tested on randomly partitioned subsets of the sample (Training, Validation, and Test data subsets). Measures used to evaluate and compare the models included the area under the curve (AUC) index of the receiver operating characteristic (ROC) curve, sensitivity, specificity, positive predictive value (PPV), and negative predictive value (NPV). The optimal model was selected based on the AUC index on the Test data subset. A total of 50,880 cases and 50,880 controls were included, with MAPD patients comprising 92% of the study population. Compared with controls, cases had a statistically significantly higher comorbidity burden and HCRU (including hospitalizations, emergency room visits, and medical procedures). The optimal predictive model was generated using SLR, which included 34 variables that were statistically significantly associated with a COPD diagnosis. After adjusting for covariates, anticholinergic bronchodilators (OR = 3.336) and tobacco cessation counseling (OR = 2.871) were found to have a large influence on the model. The final predictive model had an AUC of 0.754, sensitivity of 60%, specificity of 78%, PPV of 73%, and an NPV of 66%. This claims-based predictive model provides an acceptable level of accuracy in identifying patients likely to have undiagnosed COPD in a large national health plan. Identification of patients with undiagnosed COPD may enable timely management and lead to improved health outcomes and reduced COPD-related health care expenditures.
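
    A minimal sketch of the kind of model evaluation this record describes, assuming synthetic stand-in data: a logistic model is fit on a partitioned sample and reported with AUC, sensitivity, specificity, PPV, and NPV. Nothing here reproduces the study's actual variables or coefficients.

    ```python
    # Sketch: fit a classifier on partitioned data and report the metrics named
    # in the abstract. Synthetic data; features are stand-ins, not the study's.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import confusion_matrix, roc_auc_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    X = rng.normal(size=(2000, 10))  # stand-in for claims-derived predictors
    y = (X[:, 0] + X[:, 1] + rng.normal(size=2000) > 0).astype(int)  # stand-in labels

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
    model = LogisticRegression().fit(X_train, y_train)

    prob = model.predict_proba(X_test)[:, 1]
    pred = (prob >= 0.5).astype(int)
    tn, fp, fn, tp = confusion_matrix(y_test, pred).ravel()
    print(f"AUC={roc_auc_score(y_test, prob):.3f}",
          f"sens={tp/(tp+fn):.2f}", f"spec={tn/(tn+fp):.2f}",
          f"PPV={tp/(tp+fp):.2f}", f"NPV={tn/(tn+fn):.2f}")
    ```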

  15. Beam commissioning and measurements validating the beam model in a new TPS that converts helical tomotherapy plans to step-and-shoot IMRT plans.

    PubMed

    Petersson, Kristoffer; Ceberg, Crister; Engström, Per; Knöös, Tommy

    2011-01-01

    A new type of treatment planning system called SHAREPLAN has been studied, which enables the transfer of treatment plans generated for helical tomotherapy delivery to plans that can be delivered on C-arm linacs. The purpose is to ensure continuous patient treatment during periods of unscheduled downtime for the TomoTherapy unit, particularly in clinics without a backup unit. The purpose of this work was to verify that the plans generated in this novel planning system are deliverable and accurate. The work consists primarily of beam commissioning, verification of the beam model, and measurements verifying that generated plans are deliverable with sufficient accuracy. The beam commissioning process involves input of general geometric properties of the modeled linac, profiles and depth dose curves for a specific photon nominal energy (6 MV), and the automated modeling of other beam properties. Some manual tuning of the beam model is required. To evaluate its accuracy, the confidence limit concept [J. Venselaar et al., "Tolerances for the accuracy of photon beam dose calculations of treatment planning systems," Radiother. Oncol. 60, 191-201 (2001)] was used, which is a method supported by ESTRO. Measurements were conducted with a 2D diode array at the commissioned linac as a final check of the beam model and to evaluate whether the generated plans were deliverable and accurate. The comparison and evaluation of calculated data points and measured data according to the method applied confirmed the accuracy of the beam model. The profiles had a confidence limit of 1.1% and the depth dose curves had a confidence limit of 1.7%, both of which were well below the tolerance limit of 2%. Plan specific QC measurements and evaluation verified that different plans generated in the TPS were deliverable with sufficient accuracy at the commissioned linac, as none of the 160 beams for the 20 different plans evaluated had a fraction of approved data points below 90%, the local clinical approval criterion for delivery QA measurements. This study is a validation of the new TPS as it verifies that the generated plans are deliverable at a commissioned linac with adequate accuracy. A thorough investigation of the treatment plan quality will require a separate study. The TPS is proving to be a useful and time-saving complement, especially for clinics having a single unit for helical delivery among its conventional linacs.
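
    For reference, the confidence limit concept from Venselaar et al. cited above is commonly stated as the absolute mean deviation plus 1.5 times the standard deviation of the deviations. A minimal sketch with invented numbers:

    ```python
    # Confidence limit per Venselaar et al. (2001): CL = |mean| + 1.5 * SD,
    # with deviations in percent. Illustrative values, not the paper's data.
    import numpy as np

    deviations = np.array([0.4, -0.2, 0.7, 0.1, -0.5, 0.3])  # (calc - meas) in %
    cl = abs(deviations.mean()) + 1.5 * deviations.std(ddof=1)
    print(f"confidence limit = {cl:.1f}%  (tolerance limit: 2%)")
    ```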

  16. Validation and uncertainty analysis of a pre-treatment 2D dose prediction model

    NASA Astrophysics Data System (ADS)

    Baeza, Jose A.; Wolfs, Cecile J. A.; Nijsten, Sebastiaan M. J. J. G.; Verhaegen, Frank

    2018-02-01

    Independent verification of complex treatment delivery with megavolt photon beam radiotherapy (RT) has been effectively used to detect and prevent errors. This work presents the validation and uncertainty analysis of a model that predicts 2D portal dose images (PDIs) without a patient or phantom in the beam. The prediction model is based on an exponential point dose model with separable primary and secondary photon fluence components. The model includes a scatter kernel, off-axis ratio map, transmission values and penumbra kernels for beam-delimiting components. These parameters were derived through a model fitting procedure supplied with point dose and dose profile measurements of radiation fields. The model was validated against a treatment planning system (TPS; Eclipse) and radiochromic film measurements for complex clinical scenarios, including volumetric modulated arc therapy (VMAT). Confidence limits on fitted model parameters were calculated based on simulated measurements. A sensitivity analysis was performed to evaluate the effect of the parameter uncertainties on the model output. For the maximum uncertainty, the maximum deviating measurement sets were propagated through the fitting procedure and the model. The overall uncertainty was assessed using all simulated measurements. The validation of the prediction model against the TPS and the film showed good agreement, with on average 90.8% and 90.5% of pixels, respectively, passing a (2%, 2 mm) global gamma analysis with a low dose threshold of 10%. The maximum and overall uncertainty of the model is dependent on the type of clinical plan used as input. The results can be used to study the robustness of the model. A model for predicting accurate 2D pre-treatment PDIs in complex RT scenarios can be used clinically, and its uncertainties can be taken into account.
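
    A brute-force sketch of the global (2%, 2 mm) gamma analysis with a 10% low-dose threshold referenced above, under the assumption that reference and evaluated doses share one uniform 2D grid. Illustrative only, not the authors' implementation.

    ```python
    # Global gamma: gamma^2 = dr^2/DTA^2 + dd^2/(tol * Dmax)^2, minimized over a
    # local search window; a point passes if the minimum gamma <= 1.
    import numpy as np

    def gamma_pass_rate(ref, ev, spacing_mm, dd=0.02, dta_mm=2.0, cutoff=0.10):
        norm = ref.max()                             # global normalization
        win = int(np.ceil(dta_mm / spacing_mm)) + 1  # search window in pixels
        ny, nx = ref.shape
        passed, total = 0, 0
        for i in range(ny):
            for j in range(nx):
                if ref[i, j] < cutoff * norm:
                    continue                         # below low-dose threshold
                total += 1
                best = np.inf
                for di in range(-win, win + 1):
                    for dj in range(-win, win + 1):
                        ii, jj = i + di, j + dj
                        if not (0 <= ii < ny and 0 <= jj < nx):
                            continue
                        r2 = ((di * spacing_mm) ** 2 + (dj * spacing_mm) ** 2) / dta_mm ** 2
                        d2 = ((ev[ii, jj] - ref[i, j]) / (dd * norm)) ** 2
                        best = min(best, r2 + d2)
                passed += best <= 1.0
        return 100.0 * passed / total if total else float("nan")
    ```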

  17. Statistical considerations on prognostic models for glioma

    PubMed Central

    Molinaro, Annette M.; Wrensch, Margaret R.; Jenkins, Robert B.; Eckel-Passow, Jeanette E.

    2016-01-01

    Given the lack of beneficial treatments in glioma, there is a need for prognostic models for therapeutic decision making and life planning. Recently several studies defining subtypes of glioma have been published. Here, we review the statistical considerations of how to build and validate prognostic models, explain the models presented in the current glioma literature, and discuss advantages and disadvantages of each model. The three statistical considerations for establishing clinically useful prognostic models are: study design, model building, and validation. Careful study design helps to ensure that the model is unbiased and generalizable to the population of interest. During model building, a discovery cohort of patients can be used to choose variables, construct models, and estimate prediction performance via internal validation. Via external validation, an independent dataset can assess how well the model performs. It is imperative that published models properly detail the study design and methods for both model building and validation. This provides readers the information necessary to assess the bias in a study, compare other published models, and determine the model's clinical usefulness. As editors, reviewers, and readers of the relevant literature, we should be cognizant of the needed statistical considerations and insist on their use. PMID:26657835

  18. Integrated Land-Use, Transportation and Environmental Modeling: Validation Case Studies

    DOT National Transportation Integrated Search

    2010-08-01

    For decades the transportation-planning research community has acknowledged the interactions between the evolution of our transportation systems and our land-use, and the need to unify the practices of land-use forecasting and travel-demand modeling ...

  19. Drift-Scale THC Seepage Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    C.R. Bryan

    The purpose of this report (REV04) is to document the thermal-hydrologic-chemical (THC) seepage model, which simulates the composition of waters that could potentially seep into emplacement drifts, and the composition of the gas phase. The THC seepage model is processed and abstracted for use in the total system performance assessment (TSPA) for the license application (LA). This report has been developed in accordance with "Technical Work Plan for: Near-Field Environment and Transport: Coupled Processes (Mountain-Scale TH/THC/THM, Drift-Scale THC Seepage, and Post-Processing Analysis for THC Seepage) Report Integration" (BSC 2005 [DIRS 172761]). The technical work plan (TWP) describes planning information pertaining to the technical scope, content, and management of this report. The plan for validation of the models documented in this report is given in Section 2.2.2, "Model Validation for the DS THC Seepage Model," of the TWP. The TWP (Section 3.2.2) identifies Acceptance Criteria 1 to 4 for "Quantity and Chemistry of Water Contacting Engineered Barriers and Waste Forms" (NRC 2003 [DIRS 163274]) as being applicable to this report; however, in variance to the TWP, Acceptance Criterion 5 has also been determined to be applicable, and is addressed, along with the other Acceptance Criteria, in Section 4.2 of this report. Also, three FEPs not listed in the TWP (2.2.10.01.0A, 2.2.10.06.0A, and 2.2.11.02.0A) are partially addressed in this report, and have been added to the list of excluded FEPs in Table 6.1-2. This report has been developed in accordance with LP-SIII.10Q-BSC, "Models". This report documents the THC seepage model and a derivative used for validation, the Drift Scale Test (DST) THC submodel. The THC seepage model is a drift-scale process model for predicting the composition of gas and water that could enter waste emplacement drifts and the effects of mineral alteration on flow in rocks surrounding drifts. The DST THC submodel uses a drift-scale process model relying on the same conceptual model and many of the same input data (i.e., physical, hydrologic, thermodynamic, and kinetic) as the THC seepage model. The DST THC submodel is the primary means for validating the THC seepage model. The DST THC submodel compares predicted water and gas compositions, and mineral alteration patterns, with observed data from the DST. These models provide the framework to evaluate THC coupled processes at the drift scale, predict flow and transport behavior for specified thermal-loading conditions, and predict the evolution of mineral alteration and fluid chemistry around potential waste emplacement drifts. The DST THC submodel is used solely for the validation of the THC seepage model and is not used for calibration to measured data.

  20. Validation, Proof-of-Concept, and Postaudit of the Groundwater Flow and Transport Model of the Project Shoal Area

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ahmed Hassan

    2004-09-01

    The groundwater flow and radionuclide transport model characterizing the Shoal underground nuclear test has been accepted by the State of Nevada Division of Environmental Protection. According to the Federal Facility Agreement and Consent Order (FFACO) between DOE and the State of Nevada, the next steps in the closure process for the site are model validation (or postaudit), proof-of-concept, and the long-term monitoring stage. This report addresses the development of the validation strategy for the Shoal model, needed for preparing the subsurface Corrective Action Decision Document-Corrective Action Plan, and the development of the proof-of-concept tools needed during the five-year monitoring/validation period. The approach builds on a previous model, but is adapted and modified to the site-specific conditions and challenges of the Shoal site.

  1. Simulation Assessment Validation Environment (SAVE). Software User’s Manual

    DTIC Science & Technology

    2000-09-01

    requirements and decisions are made. The integration is leveraging work from other DoD organizations so that high-end results are attainable much faster than... planning through the modeling and simulation data capture and visualization process. The planners can complete the manufacturing process plan with a high... technologies. This tool is also used to perform "high level" factory process simulation prior to full CAD model development and help define feasible

  2. Teachers and Technology: Development of an Extended Theory of Planned Behavior

    ERIC Educational Resources Information Center

    Teo, Timothy; Zhou, Mingming; Noyes, Jan

    2016-01-01

    This study tests the validity of an extended theory of planned behaviour (TPB) to explain teachers' intention to use technology for teaching and learning. Five hundred and ninety-two participants completed a survey questionnaire measuring their responses to eight constructs which form an extended TPB. Using structural equation modelling, the…

  3. Process for Refining and Validating a Finite Element Model of an Experimental High-Altitude, Long-Endurance (HALE) Aircraft

    DTIC Science & Technology

    2011-06-01

    [Report front matter: list of figures, including Figure 4, "Helios flying near the Hawaiian islands of Niihau and Lehua" [15], and Figure 5, "Plan view of ERAST Program aircraft".]

  4. Virtual Factory Framework for Supporting Production Planning and Control.

    PubMed

    Kibira, Deogratias; Shao, Guodong

    2017-01-01

    Developing optimal production plans for smart manufacturing systems is challenging because shop floor events change dynamically. A virtual factory incorporating engineering tools, simulation, and optimization generates and communicates performance data to guide wise decision making for different control levels. This paper describes such a platform specifically for production planning. We also discuss verification and validation of the constituent models. A case study of a machine shop is used to demonstrate data generation for production planning in a virtual factory.

  5. TU-D-201-05: Validation of Treatment Planning Dose Calculations: Experience Working with MPPG 5.a

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xue, J; Park, J; Kim, L

    2016-06-15

    Purpose: Newly published medical physics practice guideline (MPPG 5.a.) has set the minimum requirements for commissioning and QA of treatment planning dose calculations. We present our experience in the validation of a commercial treatment planning system based on MPPG 5.a. Methods: In addition to tests traditionally performed to commission a model-based dose calculation algorithm, extensive tests were carried out at short and extended SSDs, various depths, oblique gantry angles, and off-axis conditions to verify the robustness and limitations of the dose calculation algorithm. A comparison between measured and calculated dose was performed based on validation tests and evaluation criteria recommended by MPPG 5.a. An ion chamber was used for the measurement of dose at points of interest, and diodes were used for photon IMRT/VMAT validations. Dose profiles were measured with a three-dimensional scanning system and calculated in the TPS using a virtual water phantom. Results: Calculated and measured absolute dose profiles were compared at each specified SSD and depth for open fields. Disagreement is easily identifiable with the difference curve. Subtle discrepancies revealed the limitations of the measurements, e.g., a spike at the high dose region and an asymmetrical penumbra observed on the tests with an oblique MLC beam. The excellent results (>98% pass rate on a 3%/3 mm gamma index) on the end-to-end tests for both IMRT and VMAT are attributed to the quality of the beam data and a good understanding of the modeling. The limitations of the model and the uncertainty of measurement were considered when comparing the results. Conclusion: The extensive tests recommended by the MPPG encourage us to understand the accuracy and limitations of a dose algorithm as well as the uncertainty of measurement. Our experience has shown how the suggested tests can be performed effectively to validate dose calculation models.

  6. Modification and validation of an analytical source model for external beam radiotherapy Monte Carlo dose calculations.

    PubMed

    Davidson, Scott E; Cui, Jing; Kry, Stephen; Deasy, Joseph O; Ibbott, Geoffrey S; Vicic, Milos; White, R Allen; Followill, David S

    2016-08-01

    A dose calculation tool, which combines the accuracy of the dose planning method (DPM) Monte Carlo code with the versatility of a practical analytical multisource model and which was previously reported, has been improved and validated for the Varian 6 and 10 MV linear accelerators (linacs). The calculation tool can be used to calculate doses in advanced clinical application studies. One shortcoming of current clinical trials that report dose from patient plans is the lack of a standardized dose calculation methodology. Because commercial treatment planning systems (TPSs) have their own dose calculation algorithms and the clinical trial participant who uses these systems is responsible for commissioning the beam model, variation exists in the reported calculated dose distributions. Today's modern linac is manufactured to tight specifications so that variability within a linac model is quite low. The expectation is that a single dose calculation tool for a specific linac model can be used to accurately recalculate dose from patient plans that have been submitted to the clinical trial community from any institution. The calculation tool would provide for a more meaningful outcome analysis. The analytical source model was described by a primary point source, a secondary extra-focal source, and a contaminant electron source. Off-axis energy softening and fluence effects were also included. The additions of hyperbolic functions have been incorporated into the model to correct for the changes in output and in electron contamination with field size. A multileaf collimator (MLC) model is included to facilitate phantom and patient dose calculations. An offset to the MLC leaf positions was used to correct for the rudimentary assumed primary point source. Dose calculations of the depth dose and profiles for field sizes 4 × 4 to 40 × 40 cm agree with measurement within 2% of the maximum dose or 2 mm distance to agreement (DTA) for 95% of the data points tested. The model was capable of predicting the depth of the maximum dose within 1 mm. Anthropomorphic phantom benchmark testing of modulated and patterned MLCs treatment plans showed agreement to measurement within 3% in target regions using thermoluminescent dosimeters (TLD). Using radiochromic film normalized to TLD, a gamma criteria of 3% of maximum dose and 2 mm DTA was applied with a pass rate of at least 85% in the high dose, high gradient, and low dose regions. Finally, recalculations of patient plans using DPM showed good agreement relative to a commercial TPS when comparing dose volume histograms and 2D dose distributions. A unique analytical source model coupled to the dose planning method Monte Carlo dose calculation code has been modified and validated using basic beam data and anthropomorphic phantom measurement. While this tool can be applied in general use for a particular linac model, specifically it was developed to provide a singular methodology to independently assess treatment plan dose distributions from those clinical institutions participating in National Cancer Institute trials.

  7. Prediction of exercise in patients across various stages of bariatric surgery: a comparison of the merits of the theory of reasoned action versus the theory of planned behavior.

    PubMed

    Hunt, Hillary R; Gross, Alan M

    2009-11-01

    Obesity is a world-wide health concern approaching epidemic proportions. Successful long-term treatment involves a combination of bariatric surgery, diet, and exercise. Social cognitive models, such as the Theory of Reasoned Action (TRA) and the Theory of Planned Behavior (TPB), are among the most commonly tested theories utilized in the prediction of exercise. As exercise is not a completely volitional behavior, it is hypothesized that the TPB is a superior theoretical model for the prediction of exercise intentions and behavior. This study tested the validity of the TPB in a sample of bariatric patients and further validated its improvement over the TRA in predicting exercise adherence at different operative stages. Results generally confirmed research hypotheses. Superiority of the TPB model was validated in this sample of bariatric patients, and Perceived Behavioral Control emerged as the single-best predictor of both exercise intentions and self-reported behavior. Finally, results suggested that both subjective norms and attitudes toward exercise played a larger role in the prediction of intention and behavior than previously reported.

  8. The use of discrete-event simulation modelling to improve radiation therapy planning processes.

    PubMed

    Werker, Greg; Sauré, Antoine; French, John; Shechter, Steven

    2009-07-01

    The planning portion of the radiation therapy treatment process at the British Columbia Cancer Agency is efficient but nevertheless contains room for improvement. The purpose of this study is to show how a discrete-event simulation (DES) model can be used to represent this complex process and to suggest improvements that may reduce the planning time and ultimately reduce overall waiting times. A simulation model of the radiation therapy (RT) planning process was constructed using the Arena simulation software, representing the complexities of the system. Several types of inputs feed into the model; these inputs come from historical data, a staff survey, and interviews with planners. The simulation model was validated against historical data and then used to test various scenarios to identify and quantify potential improvements to the RT planning process. Simulation modelling is an attractive tool for describing complex systems, and can be used to identify improvements to the processes involved. It is possible to use this technique in the area of radiation therapy planning with the intent of reducing process times and subsequent delays for patient treatment. In this particular system, reducing the variability and length of oncologist-related delays contributes most to improving the planning time.
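
    The study above built its model in Arena; as a rough stand-in, the same queueing idea can be sketched in SimPy, where an oncologist approval stage with scarce capacity drives the delays. All rates and capacities below are invented for illustration.

    ```python
    # Sketch of a discrete-event simulation of an RT planning queue:
    # contour/dosimetry work, then oncologist review, tracking turnaround time.
    import random
    import simpy

    random.seed(1)
    waits = []

    def patient(env, dosimetrist, oncologist):
        arrive = env.now
        with dosimetrist.request() as req:                   # plan preparation
            yield req
            yield env.timeout(random.expovariate(1 / 4.0))   # ~4 h of planning
        with oncologist.request() as req:                    # plan approval
            yield req
            yield env.timeout(random.expovariate(1 / 1.0))   # ~1 h review
        waits.append(env.now - arrive)

    def arrivals(env, dosimetrist, oncologist):
        while True:
            yield env.timeout(random.expovariate(1 / 2.0))   # a patient every ~2 h
            env.process(patient(env, dosimetrist, oncologist))

    env = simpy.Environment()
    dosimetrist = simpy.Resource(env, capacity=3)
    oncologist = simpy.Resource(env, capacity=1)             # bottleneck resource
    env.process(arrivals(env, dosimetrist, oncologist))
    env.run(until=500)
    print(f"mean turnaround: {sum(waits)/len(waits):.1f} h over {len(waits)} patients")
    ```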

  9. Planning Model of Physics Learning In Senior High School To Develop Problem Solving Creativity Based On National Standard Of Education

    NASA Astrophysics Data System (ADS)

    Putra, A.; Masril, M.; Yurnetti, Y.

    2018-04-01

    One of the causes of students' low competence in high school physics learning is a learning process that has not developed students' creativity in problem solving. This indicates that teachers' learning plans are not in accordance with the National Education Standard. This study aims to produce a reconstructed model of physics learning that fulfils the competency standards, content standards, and assessment standards of the applicable curriculum. The development process follows: needs analysis, product design, product development, implementation, and product evaluation. The research process involves two peer judgments, four expert judgments, and two study groups of high school students in Padang. The data obtained, both qualitative and quantitative, were collected through documentation, observation, questionnaires, and tests. The research has so far reached the product development stage, yielding a physics learning plan model that meets content and construct validity in terms of fulfilling the Basic Competences, Content Standards, Process Standards, and Assessment Standards.

  10. Development and evaluation of a calibration and validation procedure for microscopic simulation models.

    DOT National Transportation Integrated Search

    2004-01-01

    Microscopic traffic simulation models have been widely accepted and applied in transportation engineering and planning practice for the past decades because simulation is cost-effective, safe, and fast. To achieve high fidelity and credibility for a ...

  11. STAR-CCM+ Verification and Validation Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pointer, William David

    2016-09-30

    The commercial Computational Fluid Dynamics (CFD) code STAR-CCM+ provides general purpose finite volume method solutions for fluid dynamics and energy transport. This document defines plans for verification and validation (V&V) of the base code and models implemented within the code by the Consortium for Advanced Simulation of Light Water Reactors (CASL). The software quality assurance activities described herein are part of the overall software life cycle defined in the CASL Software Quality Assurance (SQA) Plan [Sieger, 2015]. STAR-CCM+ serves as the principal foundation for development of an advanced predictive multi-phase boiling simulation capability within CASL. The CASL Thermal Hydraulics Methods (THM) team develops advanced closure models required to describe the subgrid-resolution behavior of secondary fluids or fluid phases in multiphase boiling flows within the Eulerian-Eulerian framework of the code. These include wall heat partitioning models that describe the formation of vapor on the surface and the forces that define bubble/droplet dynamic motion. The CASL models are implemented as user coding or field functions within the general framework of the code. This report defines procedures and requirements for V&V of the multi-phase CFD capability developed by CASL THM. Results of V&V evaluations will be documented in a separate STAR-CCM+ V&V assessment report. This report is expected to be a living document and will be updated as additional validation cases are identified and adopted as part of the CASL THM V&V suite.

  12. A framework for propagation of uncertainty contributed by parameterization, input data, model structure, and calibration/validation data in watershed modeling

    USDA-ARS?s Scientific Manuscript database

    The progressive improvement of computer science and the development of auto-calibration techniques mean that calibration of simulation models is no longer a major challenge for watershed planning and management. Modelers now increasingly focus on challenges such as improved representation of watershed...

  13. Computational quench model applicable to the SMES/CICC

    NASA Astrophysics Data System (ADS)

    Luongo, Cesar A.; Chang, Chih-Lien; Partain, Kenneth D.

    1994-07-01

    A computational quench model accounting for the hydraulic peculiarities of the 200 kA SMES cable-in-conduit conductor has been developed. The model is presented and used to simulate the quench on the SMES-ETM. Conclusions are drawn concerning quench detection and protection. A plan for quench model validation is presented.

  14. Experimental validation of the van Herk margin formula for lung radiation therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ecclestone, Gillian; Heath, Emily; Bissonnette, Jean-Pierre

    2013-11-15

    Purpose: To validate the van Herk margin formula for lung radiation therapy using realistic dose calculation algorithms and respiratory motion modeling. The robustness of the margin formula against variations in lesion size, peak-to-peak motion amplitude, tissue density, treatment technique, and plan conformity was assessed, along with the margin formula assumption of a homogeneous dose distribution with perfect plan conformity. Methods: 3DCRT and IMRT lung treatment plans were generated within the ORBIT treatment planning platform (RaySearch Laboratories, Sweden) on 4DCT datasets of virtual phantoms. Random and systematic respiratory motion induced errors were simulated using deformable registration and dose accumulation tools available within ORBIT for simulated cases of varying lesion sizes, peak-to-peak motion amplitudes, tissue densities, and plan conformities. A detailed comparison between the margin formula dose profile model, the planned dose profiles, and penumbra widths was also conducted to test the assumptions of the margin formula. Finally, a correction to account for imperfect plan conformity was tested, as well as a novel application of the margin formula that accounts for the patient-specific motion trajectory. Results: The van Herk margin formula ensured full clinical target volume coverage for all 3DCRT and IMRT plans of all conformities with the exception of small lesions in soft tissue. No dosimetric trends with respect to plan technique or lesion size were observed for the systematic and random error simulations. However, accumulated plans showed that plan conformity decreased with increasing tumor motion amplitude. When comparing dose profiles assumed in the margin formula model to the treatment plans, discrepancies in the low dose regions were observed for the random and systematic error simulations. However, the margin formula respected, in all experiments, the 95% dose coverage required for planning target volume (PTV) margin derivation, as defined by the ICRU; thus, suitable PTV margins were estimated. The penumbra widths calculated in lung tissue for each plan were found to be very similar to the 6.4 mm value assumed by the margin formula model. The plan conformity correction yielded inconsistent results which were largely affected by image and dose grid resolution, while the trajectory modified PTV plans yielded a dosimetric benefit over the standard internal target volume approach with up to a 5% decrease in the V20 value. Conclusions: The margin formula was shown to be robust against variations in tumor size and motion, treatment technique, plan conformity, as well as low tissue density. This was validated by maintaining coverage of all of the derived PTVs by the 95% dose level, as required by the formal definition of the PTV. However, the assumption of perfect plan conformity in the margin formula derivation yields a conservative margin estimate. Future modifications to the margin formula will require a correction for plan conformity. Plan conformity can also be improved by using the proposed trajectory modified PTV planning approach. This proves especially beneficial for tumors with a large anterior–posterior component of respiratory motion.
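
    For context, the margin recipe being validated here is usually quoted as M = 2.5Σ + 1.64(σ' − σp), with σ' = sqrt(σ² + σp²), where Σ and σ are the standard deviations of the systematic and random errors and σp ≈ 3.2 mm describes a Gaussian penumbra; it is often approximated as 2.5Σ + 0.7σ. A sketch with illustrative inputs:

    ```python
    # van Herk margin recipe (90% population coverage at the 95% dose level):
    #   M = 2.5*Sigma + 1.64*(sqrt(sigma^2 + sigma_p^2) - sigma_p)
    # sigma_p = 3.2 mm is van Herk's Gaussian penumbra value; inputs below are
    # invented for illustration, not taken from the paper.
    import math

    def van_herk_margin(Sigma_mm, sigma_mm, sigma_p_mm=3.2):
        return 2.5 * Sigma_mm + 1.64 * (math.hypot(sigma_mm, sigma_p_mm) - sigma_p_mm)

    print(f"{van_herk_margin(2.0, 3.0):.1f} mm")  # e.g. Sigma = 2 mm, sigma = 3 mm
    ```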

  15. Effective learning among elite football players: the development of a football-specific self-regulated learning questionnaire.

    PubMed

    Toering, Tynke; Jordet, Geir; Ripegutu, Anders

    2013-01-01

    The present study aimed to develop a football-specific self-report instrument measuring self-regulated learning in the context of daily practice, which can be used to monitor the extent to which players take responsibility for their own learning. Development of the instrument involved six steps: 1. Literature review based on Zimmerman's (2006) theory of self-regulated learning, 2. Item generation, 3. Item validation, 4. Pilot studies, 5. Exploratory factor analysis (EFA), and 6. Confirmatory factor analysis (CFA). The instrument was tested for reliability and validity among 204 elite youth football players aged 13-16 years (Mage = 14.6; s = 0.60; 123 boys, 81 girls). The EFA indicated that a five-factor model fitted the observed data best (reflection, evaluation, planning, speaking up, and coaching). However, the CFA showed that a three-factor structure including 22 items produced a satisfactory model fit (reflection, evaluation, and planning; non-normed fit index [NNFI] = 0.96, comparative fit index [CFI] = 0.95, root mean square error of approximation [RMSEA] = 0.067). While the self-regulation processes of reflection, evaluation, and planning are strongly related and fit well into one model, other self-regulated learning processes seem to be more individually determined. In conclusion, the questionnaire developed in this study is considered a reliable and valid instrument to measure self-regulated learning among elite football players.

  16. ILS Localizer Performance Study : Part I. Dallas/Fort Worth Regional Airport and Model Validation - Syracuse Hancock Airport

    DOT National Transportation Integrated Search

    1972-07-01

    The TSC electromagnetic scattering model has been used to predict the course deviation indications (CDI) at the planned Dallas Fort Worth Regional Airport. The results show that the CDI due to scattering from the modeled airport structures are within...

  17. MIZMAS: Modeling the Evolution of Ice Thickness and Floe Size Distributions in the Marginal Ice Zone of the Chukchi and Beaufort Seas

    DTIC Science & Technology

    2014-09-30

    [SF-298 report documentation boilerplate omitted.] ...through downscaling future projection simulations. APPROACH: To address the scientific objectives, we plan to develop, implement, and validate a... ITD and FSD at the same time. The development of MIZMAS will be based on systematic model parameterization, calibration, and validation, and data

  18. Multi-day activity scheduling reactions to planned activities and future events in a dynamic model of activity-travel behavior

    NASA Astrophysics Data System (ADS)

    Nijland, Linda; Arentze, Theo; Timmermans, Harry

    2014-01-01

    Modeling multi-day planning has so far received scarce attention in activity-based transport demand modeling; however, new dynamic activity-based approaches are currently being developed. The frequency and inflexibility of planned activities and events in the activity schedules of individuals indicate the importance of incorporating those pre-planned activities in the new generation of dynamic travel demand models. Elaborating on and combining previous work on event-driven activity generation, the aim of this paper is to develop and illustrate an extension of a need-based model of activity generation that takes into account possible influences of pre-planned activities and events. This paper describes the theory and shows the results of simulations of the extension. The simulation was conducted for six different activities, and the parameter values used were consistent with an earlier estimation study. The results show that the model works well and that the influences of the parameters are consistent, logical, and have clear interpretations. These findings offer further evidence of face and construct validity for the suggested modeling approach.

  19. SU-E-T-186: Cloud-Based Quality Assurance Application for Linear Accelerator Commissioning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rogers, J

    2015-06-15

    Purpose: To identify anomalies and safety issues during data collection and modeling for treatment planning systems. Methods: A cloud-based quality assurance system (AQUIRE - Automated QUalIty REassurance) has been developed to allow the uploading and analysis of beam data acquired during the treatment planning system commissioning process. In addition to comparing and aggregating measured data, tools have also been developed to extract dose from the treatment planning system for end-to-end testing. A gamma index is performed on the data to give a dose difference and distance-to-agreement for validation that a beam model is generating plans consistent with the beam data collection. Results: Over 20 linear accelerators have been commissioned using this platform, and a variety of errors and potential safety issues have been caught through the validation process. For example, the gamma index of 2% dose, 2 mm DTA is quite sufficient to see curves not corrected for effective point of measurement. Also, data imported into the database is analyzed against an aggregate of similar linear accelerators to show data points that are outliers. The resulting curves in the database exhibit a very small standard deviation and imply that a preconfigured beam model based on aggregated linear accelerators will be sufficient in most cases. Conclusion: With the use of this new platform for beam data commissioning, errors in beam data collection and treatment planning system modeling are greatly reduced. With the reduction in errors during acquisition, the resulting beam models are quite similar, suggesting that a common beam model may be possible in the future. Development is ongoing to create routine quality assurance tools to compare back to the beam data acquired during commissioning. I am a medical physicist for Alzyen Medical Physics, and perform commissioning services.

  20. A Systematic Planning for Science Laboratory Instruction: Research-Based Evidence

    ERIC Educational Resources Information Center

    Balta, Nuri

    2015-01-01

    The aim of this study is to develop an instructional design model for science laboratory instruction. Well-known ID models were analysed, and the Dick and Carey model was imitated to produce a science laboratory instructional design (SLID) model. In order to validate the usability of the designed model, the views of 34 high school teachers related to…

  1. A Genetically Engineered Mouse Model of Neuroblastoma Driven by Mutated ALK and MYCN

    DTIC Science & Technology

    2015-09-01

    Results were disseminated through publications in scientific journals (see Journal publications). Plans for the next reporting period: elucidate the mechanism of action of the synergy between ALK and CDK inhibitors, and validate CDK7 inhibition as a tractable therapeutic... Delays and actions or plans to resolve them: Nothing to Report. Changes that had a significant impact on expenditures: Nothing to Report. Significant...

  2. Optimal allocation model of construction land based on two-level system optimization theory

    NASA Astrophysics Data System (ADS)

    Liu, Min; Liu, Yanfang; Xia, Yuping; Lei, Qihong

    2007-06-01

    The allocation of construction land is an important task in land-use planning. Whether the implementation of planning decisions succeeds usually depends on a reasonable and scientific distribution method. Considering the constitution of the land-use planning system and the planning process in China, the allocation problem is in essence a multi-level, multi-objective decision problem. Planning quantity decomposition, in particular, is a two-level system optimization problem: an optimal resource allocation decision between a decision-maker at the upper level and a number of parallel decision-makers at the lower level. According to the characteristics of the decision-making process of a two-level decision-making system, this paper develops an optimal allocation model of construction land based on two-level linear programming. In order to verify the rationality and the validity of our model, Baoan district of Shenzhen City has been taken as a test case. Under the assistance of the allocation model, construction land is allocated to ten townships of Baoan district. The result obtained from our model is compared to that of the traditional method, and the results show that our model is reasonable and usable. In the end, the paper points out the shortcomings of the model and further research directions.
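
    As a rough illustration of the lower-level allocation step only, the following single-level linear program distributes a district quota among townships. All coefficients are invented; the paper's actual model is a two-level (bi-level) program, which generally requires iterative or specialized solution methods rather than a single solver call.

    ```python
    # Sketch: maximize a linear benefit of allocated construction land, subject
    # to a total district quota and per-township demand bounds. Invented numbers.
    import numpy as np
    from scipy.optimize import linprog

    benefit = np.array([1.2, 0.9, 1.5, 1.1])  # benefit per hectare, per township
    total_quota = 1000.0                      # hectares available to the district
    bounds = [(100, 400)] * 4                 # min/max demand per township

    res = linprog(c=-benefit,                 # linprog minimizes, so negate
                  A_ub=[[1, 1, 1, 1]], b_ub=[total_quota],
                  bounds=bounds, method="highs")
    print("allocation (ha):", res.x.round(1))
    ```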

  3. Landscape capability models as a tool to predict fine-scale forest bird occupancy and abundance

    USGS Publications Warehouse

    Loman, Zachary G.; DeLuca, William; Harrison, Daniel J.; Loftin, Cynthia S.; Rolek, Brian W.; Wood, Petra B.

    2018-01-01

    Context: Species-specific models of landscape capability (LC) can inform landscape conservation design. Landscape capability is “the ability of the landscape to provide the environment […] and the local resources […] needed for survival and reproduction […] in sufficient quantity, quality and accessibility to meet the life history requirements of individuals and local populations.” Landscape capability incorporates species’ life histories, ecologies, and distributions to model habitat for current and future landscapes and climates as a proactive strategy for conservation planning. Objectives: We tested the ability of a set of LC models to explain variation in point occupancy and abundance for seven bird species representative of spruce-fir, mixed conifer-hardwood, and riparian and wooded wetland macrohabitats. Methods: We compiled point count data sets used for biological inventory, species monitoring, and field studies across the northeastern United States to create an independent validation data set. Our validation explicitly accounted for underestimation in validation data using joint distance and time removal sampling. Results: Blackpoll warbler (Setophaga striata), wood thrush (Hylocichla mustelina), and Louisiana (Parkesia motacilla) and northern waterthrush (P. noveboracensis) models were validated as predicting variation in abundance, although this varied from not biologically meaningful (1%) to strongly meaningful (59%). We verified all seven species models [including ovenbird (Seiurus aurocapilla), blackburnian (Setophaga fusca) and cerulean warbler (Setophaga cerulea)], as all were positively related to occupancy data. Conclusions: LC models represent a useful tool for conservation planning owing to their predictive ability over a regional extent. As improved remote-sensed data become available, LC layers are updated, which will improve predictions.

  4. Validating Behavioural Change: Teachers' Perception and Use of ICT in England and Korea.

    ERIC Educational Resources Information Center

    Carter, D. S. G.; Leeh, D. J. K.

    This study focused on the test and cross-cultural validation of an organizational and behavioral model of planned change. The aim of the research was to ascertain the nature and direction of different cultural aspects influencing the change process when Information and Communication Technology (ICT) was being implemented in schools. The…

  5. Preparation of the implementation plan of AASHTO Mechanistic-Empirical Pavement Design Guide (M-EPDG) in Connecticut : Phase II : expanded sensitivity analysis and validation with pavement management data.

    DOT National Transportation Integrated Search

    2017-02-08

    The study re-evaluates distress prediction models using the Mechanistic-Empirical Pavement Design Guide (MEPDG) and expands the sensitivity analysis to a wide range of pavement structures and soils. In addition, an extensive validation analysis of th...

  6. Optimal loop placement and models for length - based vehicle classification and stop - and - go traffic.

    DOT National Transportation Integrated Search

    2011-01-01

    Inductive loops are widely used nationwide for traffic monitoring as a data source for a variety of needs in generating traffic information for operation and planning analysis, validations of travel-demand models, freight studies, pavement design...

  7. A Longitudinal Study of Premarital Couples: A Social Exchange Perspective.

    ERIC Educational Resources Information Center

    Markman, Howard J.

    The attributes of couples planning to marry can affect their future relationship satisfaction. To study this phenomenon, a social exchange model was applied to a sample of couples planning to marry to assess the predictive validity of a measure of positive exchange. The longitudinal design of the two-and-a-half year investigation provided direct…

  8. Validation Test Report For The CRWMS Analysis and Logistics Visually Interactive Model Calvin Version 3.0, 10074-Vtr-3.0-00

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    S. Gillespie

    2000-07-27

    This report describes the tests performed to validate the CRWMS "Analysis and Logistics Visually Interactive" Model (CALVIN) Version 3.0 (V3.0) computer code (STN: 10074-3.0-00). To validate the code, a series of test cases was developed in the CALVIN V3.0 Validation Test Plan (CRWMS M&O 1999a) that exercises the principal calculation models and options of CALVIN V3.0. Twenty-five test cases were developed: 18 logistics test cases and 7 cost test cases. These cases test the features of CALVIN in a sequential manner, so that the validation of each test case is used to demonstrate the accuracy of the input to subsequent calculations. Where necessary, the test cases utilize reduced-size data tables to make the hand calculations used to verify the results more tractable, while still adequately testing the code's capabilities. Acceptance criteria were established for the logistics and cost test cases in the Validation Test Plan (CRWMS M&O 1999a). The logistics test cases were developed to test the following CALVIN calculation models: spent nuclear fuel (SNF) and reactivity calculations; options for altering reactor life; adjustment of commercial SNF (CSNF) acceptance rates for fiscal year calculations and mid-year acceptance start; fuel selection, transportation cask loading, and shipping to the Monitored Geologic Repository (MGR); transportation cask shipping to and storage at an Interim Storage Facility (ISF); reactor pool allocation options; and disposal options at the MGR. Two types of cost test cases were developed: cases to validate the detailed transportation costs, and cases to validate the costs associated with the Civilian Radioactive Waste Management System (CRWMS) Management and Operating Contractor (M&O) and Regional Servicing Contractors (RSCs). For each test case, values calculated using Microsoft Excel 97 worksheets were compared to CALVIN V3.0 scenarios with the same input data and assumptions. All of the test case results compare with the CALVIN V3.0 results within the bounds of the acceptance criteria. Therefore, it is concluded that the CALVIN V3.0 calculation models and options tested in this report are validated.

  9. Utility of the Montreal assessment of need questionnaire for community mental health planning.

    PubMed

    Tremblay, Jacques; Bamvita, Jean-Marie; Grenier, Guy; Fleury, Marie-Josée

    2014-09-01

    Needs assessment facilitates mental health services planning, provision, and evaluation. This study aimed to (a) validate a new instrument, the Montreal Assessment of Needs Questionnaire (MANQ), and (b) use this to assess variations and predictors of need (number and seriousness) in 297 individuals with severe mental disorders for 18 months, during implementation of the Quebec Mental Health Action Plan. MANQ internal and external validations were adequate. Variables significantly associated with need number and seriousness variations were used to build multiple linear regression models. Autonomous housing, not receiving welfare, not having consulted a health educator, higher level of help from services, Alcohol Use Disorders Identification Test total score, and social support were associated with decreasing need number and seriousness over time. Having a higher education was also associated with decreasing need number. In a reform context, the MANQ's unique ability to detect rapid improvement in patient needs has usefulness for Quebec mental health planning.

  10. SU-E-T-129: Are Knowledge-Based Planning Dose Estimates Valid for Distensible Organs?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lalonde, R; Heron, D; Huq, M

    2015-06-15

    Purpose: Knowledge-based planning programs have become available to assist treatment planning in radiation therapy. Such programs can be used to generate estimated DVHs and planning constraints for organs at risk (OARs), based upon a model generated from previous plans. These estimates are based upon the planning CT scan. However, for distensible OARs like the bladder and rectum, daily variations in volume may make the dose estimates invalid. The purpose of this study is to determine whether knowledge-based DVH dose estimates may be valid for distensible OARs. Methods: The Varian RapidPlan™ knowledge-based planning module was used to generate OAR dose estimates and planning objectives for 10 prostate cases previously planned with VMAT, and final plans were calculated for each. Five weekly setup CBCT scans of each patient were then downloaded and contoured (assuming no change in size and shape of the target volume), and rectum and bladder DVHs were recalculated for each scan. Dose volumes were then compared at 75, 60, and 40 Gy for the bladder and rectum between the planning scan and the CBCTs. Results: Plan doses and estimates matched well at all dose points. Volumes of the rectum and bladder varied widely between the planning CT and the CBCTs, ranging from 0.46 to 2.42 for the bladder and 0.71 to 2.18 for the rectum, causing relative dose volumes to vary between planning CT and CBCT, but absolute dose volumes were more consistent. The overall ratio of CBCT/plan dose volumes was 1.02 ± 0.27 for rectum and 0.98 ± 0.20 for bladder in these patients. Conclusion: Knowledge-based planning dose volume estimates for distensible OARs are still valid, in absolute volume terms, between treatment planning scans and CBCTs taken during daily treatment. Further analysis of the data is being undertaken to determine how differences depend upon rectum and bladder filling state. This work has been supported by Varian Medical Systems.
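
    The distinction the authors draw can be made concrete with a toy calculation, using invented volumes: when the organ distends between planning CT and CBCT, the relative V75 changes markedly while the absolute irradiated volume stays nearly constant.

    ```python
    # Relative vs absolute dose-volume points for a distensible organ.
    # All numbers are illustrative only, not taken from the study.
    v75_rel_plan, organ_cc_plan = 0.20, 60.0   # 20% of a 60 cc rectum >= 75 Gy
    v75_rel_cbct, organ_cc_cbct = 0.09, 130.0  # same plan, distended rectum on CBCT

    abs_plan = v75_rel_plan * organ_cc_plan    # absolute V75 at planning, in cc
    abs_cbct = v75_rel_cbct * organ_cc_cbct    # absolute V75 on CBCT, in cc
    print(f"relative V75: {v75_rel_plan:.0%} vs {v75_rel_cbct:.0%}; "
          f"absolute V75: {abs_plan:.1f} cc vs {abs_cbct:.1f} cc "
          f"(ratio {abs_cbct/abs_plan:.2f})")
    ```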

  11. Relations Between Autonomous Motivation and Leisure-Time Physical Activity Participation: The Mediating Role of Self-Regulation Techniques.

    PubMed

    Nurmi, Johanna; Hagger, Martin S; Haukkala, Ari; Araújo-Soares, Vera; Hankonen, Nelli

    2016-04-01

    This study tested the predictive validity of a multitheory process model in which the effect of autonomous motivation from self-determination theory on physical activity participation is mediated by the adoption of self-regulatory techniques based on control theory. Finnish adolescents (N = 411, aged 17-19) completed a prospective survey including validated measures of the predictors and physical activity, at baseline and after one month (N = 177). A subsample used an accelerometer to objectively measure physical activity and further validate the physical activity self-report assessment tool (n = 44). Autonomous motivation statistically significantly predicted action planning, coping planning, and self-monitoring. Coping planning and self-monitoring mediated the effect of autonomous motivation on physical activity, although self-monitoring was the most prominent. Controlled motivation had no effect on self-regulation techniques or physical activity. Developing interventions that support autonomous motivation for physical activity may foster increased engagement in self-regulation techniques and positively affect physical activity behavior.

  12. SU-G-BRC-13: Model Based Classification for Optimal Position Selection for Left-Sided Breast Radiotherapy: Free Breathing, DIBH, Or Prone

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, H; Liu, T; Xu, X

    Purpose: There are clinical decision challenges in selecting optimal treatment positions for left-sided breast cancer patients: supine free breathing (FB), supine Deep Inspiration Breath Hold (DIBH), and prone free breathing (prone). Physicians often make the decision based on experience and trials, which might not always result in optimal OAR doses. We herein propose a mathematical model to predict the lowest OAR doses among these three positions, providing a quantitative tool for the corresponding clinical decision. Methods: Patients were scanned in FB, DIBH, and prone positions under an IRB approved protocol. Tangential beam plans were generated for each position, and OAR doses were calculated. The position with the least OAR doses is defined as the optimal position. The following features were extracted from each scan to build the model: heart, ipsilateral lung, and breast volume; in-field heart and ipsilateral lung volume; distance between heart and target; laterality of heart; and dose to heart and ipsilateral lung. Principal Components Analysis (PCA) was applied to remove the collinearity of the input data and also to lower the data dimensionality. Feature selection, another method to reduce dimensionality, was applied as a comparison. Support Vector Machine (SVM) was then used for classification. Thirty-seven patient data sets were acquired; up to now, five patient plans were available. K-fold cross validation was used to validate the accuracy of the classifier model with a small training size. Results: The classification results and K-fold cross validation demonstrated that the model is capable of predicting the optimal position for patients. The accuracy of K-fold cross validation reached 80%. Compared to PCA, feature selection allows causal features of dose to be determined, which provides more clinical insight. Conclusion: The proposed classification system appeared to be feasible. We are generating plans for the rest of the 37 patient images, and more statistically significant results are to be presented.
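
    A minimal sketch of the pipeline this record describes: PCA to remove collinearity and reduce dimensionality, an SVM classifier, and k-fold cross validation. Feature values below are synthetic stand-ins for the geometric and dosimetric features listed; label assignment is a placeholder.

    ```python
    # Sketch: PCA + SVM with k-fold cross validation on 37 synthetic "patients".
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    X = rng.normal(size=(37, 9))   # 37 patients, 9 extracted features (stand-ins)
    y = np.arange(37) % 3          # placeholder labels: FB / DIBH / prone

    clf = make_pipeline(StandardScaler(), PCA(n_components=5), SVC(kernel="rbf"))
    scores = cross_val_score(clf, X, y, cv=5)   # k-fold cross validation
    print(f"accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
    ```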

  13. Global Precipitation Measurement (GPM) Ground Validation: Plans and Preparations

    NASA Technical Reports Server (NTRS)

    Schwaller, M.; Bidwell, S.; Durning, F. J.; Smith, E.

    2004-01-01

    The Global Precipitation Measurement (GPM) program is an international partnership led by the National Aeronautics and Space Administration (NASA) and the Japan Aerospace Exploration Agency (JAXA). GPM will improve climate, weather, and hydro-meteorological forecasts through more frequent and more accurate measurement of precipitation across the globe. This paper describes the concept, the planning, and the preparations for Ground Validation within the GPM program. Ground Validation (GV) plays an important role in the program by investigating and quantitatively assessing the errors within the satellite retrievals. These quantitative estimates of retrieval errors will assist the scientific community by bounding the errors within their research products. The two fundamental requirements of the GPM Ground Validation program are: (1) error characterization of the precipitation retrievals and (2) continual improvement of the satellite retrieval algorithms. These two driving requirements determine the measurements, instrumentation, and location for ground observations. This paper outlines GV plans for estimating the systematic and random components of retrieval error and for characterizing the spatial and temporal structure of the error, as well as plans for algorithm improvement in which error models are developed and experimentally explored to uncover the physical causes of errors within the retrievals. This paper discusses NASA locations for GV measurements as well as anticipated locations from international GPM partners. NASA's primary locations for validation measurements are an oceanic site at Kwajalein Atoll in the Republic of the Marshall Islands and a continental site in north-central Oklahoma at the U.S. Department of Energy's Atmospheric Radiation Measurement Program site.
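
    A toy sketch of the error characterization named above: matched satellite-vs-ground precipitation differences are split into a systematic component (bias) and a random component (spread). The arrays are synthetic stand-ins for matched retrievals at a validation site.

    ```python
    # Decompose retrieval error into systematic (mean) and random (SD) parts.
    # Synthetic data; distributions and coefficients are invented.
    import numpy as np

    rng = np.random.default_rng(0)
    ground = rng.gamma(2.0, 2.0, size=500)              # mm/h, site reference
    satellite = 1.1 * ground + rng.normal(0, 0.8, 500)  # retrieval: bias + noise

    err = satellite - ground
    print(f"systematic error (bias): {err.mean():+.2f} mm/h")
    print(f"random error (SD):       {err.std(ddof=1):.2f} mm/h")
    ```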

  14. An Approach to Comprehensive and Sustainable Solar Wind Model Validation

    NASA Astrophysics Data System (ADS)

    Rastaetter, L.; MacNeice, P. J.; Mays, M. L.; Boblitt, J. M.; Wiegand, C.

    2017-12-01

    The number of models of the corona and inner heliosphere, and of their updates and upgrades, grows steadily, as does the number and character of the model inputs. Maintaining up-to-date validation of these models, in the face of this constant model evolution, is a necessary but very labor-intensive activity. In the last year alone, both NASA's LWS program and the CCMC's ongoing support of model forecasting activities at NOAA SWPC have sought model validation reports on the quality of all aspects of the community's coronal and heliospheric models, including both ambient and CME-related wind solutions at L1. In this presentation I will give a brief review of the community's previous model validation results for the L1 wind representation. I will discuss the semi-automated web-based system we are constructing at the CCMC to present comparative visualizations of all interesting aspects of the solutions from competing models. This system is designed to be easily queried to provide the essential comprehensive inputs to repeat and update previous validation studies and to support extensions to them. I will illustrate this by demonstrating how the system is being used to support the CCMC/LWS Model Assessment Forum teams focused on the ambient and time-dependent corona and solar wind, including CME arrival time and IMF Bz. I will also discuss plans to extend the system to include results from the Forum teams addressing SEP model validation.

  15. Commissioning of intensity modulated neutron radiotherapy (IMNRT).

    PubMed

    Burmeister, Jay; Spink, Robyn; Liang, Liang; Bossenberger, Todd; Halford, Robert; Brandon, John; Delauter, Jonathan; Snyder, Michael

    2013-02-01

    Intensity modulated neutron radiotherapy (IMNRT) has been developed using in-house treatment planning and delivery systems at the Karmanos Cancer Center/Wayne State University Fast Neutron Therapy facility. The process of commissioning IMNRT for clinical use is presented here. Results of commissioning tests are provided, including validation measurements using representative patient plans as well as those from the TG-119 test suite. IMNRT plans were created using the Varian Eclipse optimization algorithm and an in-house planning system for calculation of neutron dose distributions. Tissue equivalent ionization chambers and an ionization chamber array were used for point dose and planar dose distribution comparisons with calculated values. Validation plans were delivered to water and virtual water phantoms using TG-119 measurement points and evaluation techniques. Photon and neutron doses were evaluated both inside and outside the target volume for a typical IMNRT plan to determine the effects of intensity modulation on the photon dose component. Monitor unit linearity and the effects of beam current and gantry angle on output were investigated, and an independent validation of neutron dosimetry was obtained. While IMNRT plan quality is superior to conventional fast neutron therapy plans for clinical sites such as prostate and head and neck, it is inferior to photon IMRT for most TG-119 planning goals, particularly for complex cases. This stems largely from current limitations on the number of segments. Measured and calculated doses for 11 representative plans (six prostate/five head and neck) agreed to within -0.8 ± 1.4% and 5.0 ± 6.0% within and outside the target, respectively. Nearly all (22/24) ion chamber point measurements in the two phantom arrangements were within the respective confidence intervals for the quantity [(measured - planned)/prescription dose] derived in TG-119. Mean differences for all measurements were 0.5% (max = 7.0%) and 1.4% (max = 4.1%) in water and virtual water, respectively. The mean gamma pass rate for all cases was 92.8% (min = 88.6%). These pass rates are lower than typically achieved with photon IMRT, warranting development of a planar dosimetry system designed specifically for IMNRT and/or improvement of neutron beam modeling in the penumbral region. The fractional photon dose component did not change significantly in a typical IMNRT plan versus a conventional fast neutron therapy plan, and IMNRT delivery is not expected to significantly alter the RBE. All other commissioning results were considered satisfactory for clinical implementation of IMNRT, including the external neutron dose validation, which agreed with the predicted neutron dose to within 1%. IMNRT has been successfully commissioned for clinical use. While current plan quality is inferior to photon IMRT, it is superior to conventional fast neutron therapy. Ion chamber validation results for IMNRT commissioning are also comparable to those typically achieved with photon IMRT. Gamma pass rates for planar dose distributions are lower than typically observed for photon IMRT but may be improved with better planar dosimetry equipment and beam modeling techniques. In the meantime, patient-specific quality assurance measurements should rely more heavily on point dose measurements with tissue equivalent ionization chambers. No significant technical impediments are anticipated in the clinical implementation of IMNRT as described here.
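
    The planar comparisons above are scored with the gamma index, which combines a dose-difference criterion with a distance-to-agreement (DTA) criterion. Below is a minimal one-dimensional sketch of that calculation under assumed 3%/3 mm global criteria and hypothetical dose profiles; clinical IMNRT QA uses full 2D distributions.

        # Minimal 1D gamma-index sketch (global 3%/3 mm criteria). For each
        # evaluated point, gamma is the minimum over all reference points of
        # the combined distance/dose metric; a point passes if gamma <= 1.
        import numpy as np

        def gamma_1d(x_ref, d_ref, x_eval, d_eval, dta_mm=3.0, dd_frac=0.03):
            """Return the gamma value at each evaluation point."""
            d_norm = dd_frac * d_ref.max()   # global dose criterion
            gammas = []
            for xe, de in zip(x_eval, d_eval):
                cap = np.sqrt(((x_ref - xe) / dta_mm) ** 2 +
                              ((d_ref - de) / d_norm) ** 2)
                gammas.append(cap.min())
            return np.array(gammas)

        x = np.linspace(0, 100, 201)                   # position in mm
        ref = np.exp(-((x - 50) / 20) ** 2)            # reference (planned) profile
        meas = np.exp(-((x - 51) / 20) ** 2) * 1.02    # measured: shifted and scaled

        g = gamma_1d(x, ref, x, meas)
        print(f"gamma pass rate: {100 * (g <= 1).mean():.1f}%")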

  16. Development of Lesson Plans and Student Worksheets Based Socio-Scientific Issues on Pollution Environmental Topic

    NASA Astrophysics Data System (ADS)

    Rahayu, S.; Meyliana, M.; Arlingga, A.; Reny, R.; Siahaan, P.; Hernani, H.

    2017-09-01

    The aim of this study is to develop lesson plans and student worksheets based on socio-scientific issues on the environmental pollution topic for seventh-grade junior high school students. The environmental pollution topic is split into several subtopics, namely air pollution, water pollution, and soil pollution. The lesson plans were developed around socio-scientific issues in six stages, namely (1) Motivate; (2) Challenge; (3) Collect scientific evidence; (4) Analyse the evidence; (5) Build knowledge and make connections; and (6) Use evidence. The student worksheets contain articles on socio-scientific issues, practice exercises, and a few questions to probe students’ reasoning. The method used in this research is research and development (the R&D method). The development model used in this study is Plomp's model, which consists of five stages, namely: (1) Initial research; (2) Design; (3) Realization or construction; (4) Testing, evaluation, and revision; and (5) Implementation; the research was limited to the fourth stage. The lesson plans and student worksheets based on socio-scientific issues were validated through expert review. The results showed that the lesson plans and student worksheets based on socio-scientific issues on the pollution theme are highly suitable and can be applied in the science classroom.

  17. MO-G-304-01: FEATURED PRESENTATION: Expanding the Knowledge Base for Data-Driven Treatment Planning: Incorporating Patient Outcome Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robertson, SP; Quon, H; Cheng, Z

    2015-06-15

    Purpose: To extend the capabilities of knowledge-based treatment planning beyond simple dose queries by incorporating validated patient outcome models. Methods: From an analytic, relational database of 684 head and neck cancer patients, 372 patients were identified as having dose data for both the left and right parotid glands as well as baseline and follow-up xerostomia assessments. For each existing patient, knowledge-based treatment planning was simulated by querying the dose-volume histograms and geometric shape relationships (overlap volume histograms) of all other patients. Dose predictions were captured at normalized volume thresholds (NVT) of 0%, 10%, 20%, 30%, 40%, 50%, and 85% and were compared with the actual achieved doses using the Wilcoxon signed-rank test. Next, a logistic regression model was used to predict the maximum severity of xerostomia up to three months following radiotherapy. Baseline xerostomia scores were subtracted from follow-up assessments and were also included in the model. The relative risks from predicted doses and actual doses were computed and compared. Results: The predicted doses for both parotid glands were significantly less than the achieved doses (p < 0.0001), with differences ranging from 830 cGy ± 1270 cGy (0% NVT) to 1673 cGy ± 1197 cGy (30% NVT). The modelled risk of xerostomia ranged from 54% to 64% for achieved doses and from 33% to 51% for the dose predictions. Relative risks varied from 1.24 to 1.87, with the maximum relative risk occurring at 85% NVT. Conclusions: Data-driven generation of treatment planning objectives without consideration of the underlying normal tissue complication probability may result in inferior plans, even if quality metrics indicate otherwise. Inclusion of complication models in knowledge-based treatment planning is necessary in order to close the feedback loop between radiotherapy treatments and patient outcomes. Future work includes advancing and validating complication models in the context of knowledge-based treatment planning. This work is supported by Philips Radiation Oncology Systems.
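
    The outcome step described above maps a parotid dose to a xerostomia probability with a logistic model and then compares risks between achieved and predicted doses. The sketch below shows that arithmetic; the b0 and b1 coefficients and dose values are illustrative placeholders, not the study's fitted parameters.

        # Sketch of the outcome step: a logistic model maps mean parotid dose
        # (cGy) to xerostomia probability, and the relative risk of the
        # achieved plan vs the knowledge-based prediction is computed.
        # Coefficients and doses are hypothetical.
        import math

        def xerostomia_prob(mean_dose_cgy, b0=-3.0, b1=0.0012):
            """Logistic dose-response: P = 1 / (1 + exp(-(b0 + b1 * dose)))."""
            return 1.0 / (1.0 + math.exp(-(b0 + b1 * mean_dose_cgy)))

        achieved_dose, predicted_dose = 3200.0, 2400.0   # cGy, illustrative
        p_achieved = xerostomia_prob(achieved_dose)
        p_predicted = xerostomia_prob(predicted_dose)
        print(f"risk at achieved dose:  {p_achieved:.2f}")
        print(f"risk at predicted dose: {p_predicted:.2f}")
        print(f"relative risk:          {p_achieved / p_predicted:.2f}")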

  18. The Determinants of Student Effort at Learning ERP: A Cultural Perspective

    ERIC Educational Resources Information Center

    Alshare, Khaled A.; El-Masri, Mazen; Lane, Peggy L.

    2015-01-01

    This paper develops a research model based on the Unified Theory of Acceptance and Use of Technology model (UTAUT) and Hofstede's cultural dimensions to explore factors that influence student effort at learning Enterprise Resource Planning (ERP) systems. A Structural Equation Model (SEM) using LISREL was utilized to validate the proposed research…

  19. Development of state and transition model assumptions used in National Forest Plan revision

    Treesearch

    Eric B. Henderson

    2008-01-01

    State and transition models are being utilized in forest management analysis processes to evaluate assumptions about disturbances and succession. These models assume valid information about seral class successional pathways and timing. The Forest Vegetation Simulator (FVS) was used to evaluate seral class succession assumptions for the Hiawatha National Forest in...

  20. Dosimetric accuracy of a treatment planning system for actively scanned proton beams and small target volumes: Monte Carlo and experimental validation

    NASA Astrophysics Data System (ADS)

    Magro, G.; Molinelli, S.; Mairani, A.; Mirandola, A.; Panizza, D.; Russo, S.; Ferrari, A.; Valvo, F.; Fossati, P.; Ciocca, M.

    2015-09-01

    This study was performed to evaluate the accuracy of a commercial treatment planning system (TPS) in optimising proton pencil beam dose distributions for small targets of different sizes (5-30 mm side) located at increasing depths in water. The TPS analytical algorithm was benchmarked against experimental data and the FLUKA Monte Carlo (MC) code, previously validated for the selected beam-line. We tested the Siemens syngo® TPS plan optimisation module for water cubes, fixing the configurable parameters at clinical standards, with homogeneous target coverage to a 2 Gy (RBE) dose prescription as the unique goal. Plans were delivered and the dose at each volume centre was measured in water with a calibrated PTW Advanced Markus® chamber. An EBT3® film was also positioned at the phantom entrance window for the acquisition of 2D dose maps. Discrepancies between TPS-calculated and MC-simulated values were mainly due to differences in lateral spread modelling and were found to be related to the field-to-spot size ratio. The accuracy of the TPS proved to be clinically acceptable in all cases except very small and shallow volumes. In this context, the use of MC to validate TPS results proved to be a reliable procedure for pre-treatment plan verification.

  1. Dosimetric accuracy of a treatment planning system for actively scanned proton beams and small target volumes: Monte Carlo and experimental validation.

    PubMed

    Magro, G; Molinelli, S; Mairani, A; Mirandola, A; Panizza, D; Russo, S; Ferrari, A; Valvo, F; Fossati, P; Ciocca, M

    2015-09-07

    This study was performed to evaluate the accuracy of a commercial treatment planning system (TPS) in optimising proton pencil beam dose distributions for small targets of different sizes (5-30 mm side) located at increasing depths in water. The TPS analytical algorithm was benchmarked against experimental data and the FLUKA Monte Carlo (MC) code, previously validated for the selected beam-line. We tested the Siemens syngo® TPS plan optimisation module for water cubes, fixing the configurable parameters at clinical standards, with homogeneous target coverage to a 2 Gy (RBE) dose prescription as the unique goal. Plans were delivered and the dose at each volume centre was measured in water with a calibrated PTW Advanced Markus® chamber. An EBT3® film was also positioned at the phantom entrance window for the acquisition of 2D dose maps. Discrepancies between TPS-calculated and MC-simulated values were mainly due to differences in lateral spread modelling and were found to be related to the field-to-spot size ratio. The accuracy of the TPS proved to be clinically acceptable in all cases except very small and shallow volumes. In this context, the use of MC to validate TPS results proved to be a reliable procedure for pre-treatment plan verification.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Y; Giebeler, A; Mascia, A

    Purpose: To quantitatively evaluate the dosimetric consequences of spot size variations and validate beam-matching criteria for commissioning a pencil beam model for multiple treatment rooms. Methods: A planning study was first conducted by simulating spot size variations to systematically evaluate their dosimetric impact in selected cases; this was used to establish the in-air spot size tolerance for the beam-matching specifications. A beam model in the treatment planning system was created using in-air spot profiles acquired in one treatment room. These spot profiles were also acquired from another treatment room to assess the actual spot size variations between the two treatment rooms. We created twenty-five test plans with targets of different sizes at different depths, and performed dose measurements along the entrance, proximal, and distal target regions. The absolute doses at those locations were measured using ionization chambers in both treatment rooms and were compared against the doses calculated by the beam model. Fifteen additional patient plans were also measured and included in our validation. Results: The beam model is relatively insensitive to spot size variations. With an average of less than 15% measured in-air spot size variation between the two treatment rooms, the average dose difference was −0.15% with a standard deviation of 0.40% for 55 measurement points within the target region; the differences increased to 1.4% ± 1.1% in the entrance regions, which are more affected by in-air spot size variations. Overall, our single-room-based beam model in the treatment planning system agreed with measurements in both rooms to within 0.5% in the target region. For the fifteen patient cases, the agreement was within 1%. Conclusion: We have demonstrated that dosimetrically equivalent machines can be established when in-air spot size variations are within 15% between the two treatment rooms.

  3. External validation of the Cairns Prediction Model (CPM) to predict conversion from laparoscopic to open cholecystectomy.

    PubMed

    Hu, Alan Shiun Yew; Donohue, Peter O'; Gunnarsson, Ronny K; de Costa, Alan

    2018-03-14

    Valid and user-friendly prediction models for conversion to open cholecystectomy allow for proper planning prior to surgery. The Cairns Prediction Model (CPM) has been in clinical use at the original study site for the past three years, but has not been tested at other sites. A retrospective, single-centred study collected ultrasonic measurements and clinical variables, along with conversion status, from consecutive patients who underwent laparoscopic cholecystectomy from 2013 to 2016 in The Townsville Hospital, North Queensland, Australia. An area under the curve (AUC) was calculated to externally validate the CPM. Conversion was necessary in 43 (4.2%) of 1035 patients. External validation showed an area under the curve of 0.87 (95% CI 0.82-0.93, p = 1.1 × 10⁻¹⁴). In comparison with most previously published models, which have an AUC of approximately 0.80 or less, the CPM has the highest AUC of all published prediction models for both internal and external validation. Crown Copyright © 2018. Published by Elsevier Inc. All rights reserved.
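
    External validation here amounts to scoring the model's predicted conversion risks against observed outcomes with an AUC. A minimal sketch follows, assuming hypothetical risk scores and outcomes rather than CPM data.

        # Sketch of the external-validation step: area under the ROC curve for
        # a prediction model's risk scores against observed conversion status.
        # Scores and outcomes below are hypothetical stand-ins.
        import numpy as np
        from sklearn.metrics import roc_auc_score

        converted = np.array([0, 0, 1, 0, 1, 0, 0, 1, 0, 0])   # observed outcomes
        risk_score = np.array([0.05, 0.10, 0.80, 0.20, 0.65,
                               0.15, 0.30, 0.90, 0.10, 0.25])  # model predictions

        auc = roc_auc_score(converted, risk_score)
        print(f"external validation AUC: {auc:.2f}")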

  4. Technology Readiness of the NEXT Ion Propulsion System

    NASA Technical Reports Server (NTRS)

    Benson, Scott W.; Patterson, Michael J.

    2008-01-01

    NASA's Evolutionary Xenon Thruster (NEXT) ion propulsion system has been in advanced technology development under the NASA In-Space Propulsion Technology project. The highest-fidelity hardware planned has now been completed by the government/industry team, including: a flight prototype model (PM) thruster, an engineering model (EM) power processing unit, EM propellant management assemblies, a breadboard gimbal, and control unit simulators. Subsystem and system level technology validation testing is in progress. To achieve the objective Technology Readiness Level 6, environmental testing is being conducted to qualification levels in ground facilities simulating the space environment. Additional tests have been conducted to characterize the performance range and life capability of the NEXT thruster. This paper presents the status and results of technology validation testing accomplished to date, the validated subsystem and system capabilities, and the plans for completion of this phase of NEXT development. The next round of competed planetary science mission announcements of opportunity, and directed mission decisions, are anticipated to occur in 2008 and 2009. Progress to date, and the success of ongoing technology validation, indicate that the NEXT ion propulsion system will be a primary candidate for mission consideration in these upcoming opportunities.

  5. Modelling human skull growth: a validated computational model

    PubMed Central

    Marghoub, Arsalan; Johnson, David; Khonsari, Roman H.; Fagan, Michael J.; Moazen, Mehran

    2017-01-01

    During the first year of life, the brain grows rapidly and the neurocranium increases to about 65% of its adult size. Our understanding of the relationship between the biomechanical forces, especially from the growing brain, the craniofacial soft tissue structures and the individual bone plates of the skull vault is still limited. This basic knowledge could help in the future planning of craniofacial surgical operations. The aim of this study was to develop a validated computational model of skull growth, based on the finite-element (FE) method, to help understand the biomechanics of skull growth. To do this, a two-step validation study was carried out. First, an in vitro physical three-dimensional printed model and an in silico FE model were created from the same micro-CT scan of an infant skull and loaded with forces from the growing brain from zero to two months of age. The results from the in vitro model validated the FE model before it was further developed to expand from 0 to 12 months of age. This second FE model was compared directly with in vivo clinical CT scans of infants without craniofacial conditions (n = 56). The various models were compared in terms of predicted skull width, length and circumference, while the overall shape was quantified using three-dimensional distance plots. Statistical analysis yielded no significant differences between the male skull models. All size measurements from the FE model versus the in vitro physical model were within 5%, with one exception showing a 7.6% difference. The FE model and in vivo data also correlated well, with the largest percentage difference in size being 8.3%. Overall, the FE model results matched well with both the in vitro and in vivo data. With further development and model refinement, this modelling method could be used to assist in preoperative planning of craniofacial surgery procedures and could help to reduce reoperation rates. PMID:28566514

  6. Modelling human skull growth: a validated computational model.

    PubMed

    Libby, Joseph; Marghoub, Arsalan; Johnson, David; Khonsari, Roman H; Fagan, Michael J; Moazen, Mehran

    2017-05-01

    During the first year of life, the brain grows rapidly and the neurocranium increases to about 65% of its adult size. Our understanding of the relationship between the biomechanical forces, especially from the growing brain, the craniofacial soft tissue structures and the individual bone plates of the skull vault is still limited. This basic knowledge could help in the future planning of craniofacial surgical operations. The aim of this study was to develop a validated computational model of skull growth, based on the finite-element (FE) method, to help understand the biomechanics of skull growth. To do this, a two-step validation study was carried out. First, an in vitro physical three-dimensional printed model and an in silico FE model were created from the same micro-CT scan of an infant skull and loaded with forces from the growing brain from zero to two months of age. The results from the in vitro model validated the FE model before it was further developed to expand from 0 to 12 months of age. This second FE model was compared directly with in vivo clinical CT scans of infants without craniofacial conditions (n = 56). The various models were compared in terms of predicted skull width, length and circumference, while the overall shape was quantified using three-dimensional distance plots. Statistical analysis yielded no significant differences between the male skull models. All size measurements from the FE model versus the in vitro physical model were within 5%, with one exception showing a 7.6% difference. The FE model and in vivo data also correlated well, with the largest percentage difference in size being 8.3%. Overall, the FE model results matched well with both the in vitro and in vivo data. With further development and model refinement, this modelling method could be used to assist in preoperative planning of craniofacial surgery procedures and could help to reduce reoperation rates. © 2017 The Author(s).

  7. Modification and validation of an analytical source model for external beam radiotherapy Monte Carlo dose calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davidson, Scott E., E-mail: sedavids@utmb.edu

    Purpose: A previously reported dose calculation tool, which combines the accuracy of the dose planning method (DPM) Monte Carlo code with the versatility of a practical analytical multisource model, has been improved and validated for the Varian 6 and 10 MV linear accelerators (linacs). The calculation tool can be used to calculate doses in advanced clinical application studies. One shortcoming of current clinical trials that report dose from patient plans is the lack of a standardized dose calculation methodology. Because commercial treatment planning systems (TPSs) have their own dose calculation algorithms and the clinical trial participant who uses these systems is responsible for commissioning the beam model, variation exists in the reported calculated dose distributions. Today's modern linac is manufactured to tight specifications, so variability within a linac model is quite low. The expectation is that a single dose calculation tool for a specific linac model can be used to accurately recalculate dose from patient plans that have been submitted to the clinical trial community from any institution. The calculation tool would provide for a more meaningful outcome analysis. Methods: The analytical source model was described by a primary point source, a secondary extra-focal source, and a contaminant electron source. Off-axis energy softening and fluence effects were also included. Hyperbolic functions have been incorporated into the model to correct for the changes in output and in electron contamination with field size. A multileaf collimator (MLC) model is included to facilitate phantom and patient dose calculations. An offset to the MLC leaf positions was used to correct for the rudimentary assumed primary point source. Results: Dose calculations of the depth dose and profiles for field sizes of 4 × 4 to 40 × 40 cm² agree with measurement within 2% of the maximum dose or 2 mm distance to agreement (DTA) for 95% of the data points tested. The model was capable of predicting the depth of the maximum dose within 1 mm. Anthropomorphic phantom benchmark testing of modulated and patterned MLC treatment plans showed agreement with measurement within 3% in target regions using thermoluminescent dosimeters (TLD). Using radiochromic film normalized to TLD, a gamma criterion of 3% of maximum dose and 2 mm DTA was applied with a pass rate of at least 85% in the high dose, high gradient, and low dose regions. Finally, recalculations of patient plans using DPM showed good agreement relative to a commercial TPS when comparing dose volume histograms and 2D dose distributions. Conclusions: A unique analytical source model coupled to the dose planning method Monte Carlo dose calculation code has been modified and validated using basic beam data and anthropomorphic phantom measurements. While this tool can be applied in general use for a particular linac model, it was specifically developed to provide a single methodology to independently assess treatment plan dose distributions from clinical institutions participating in National Cancer Institute trials.
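
    The multisource structure described in the Methods (a primary point source, an extra-focal source, and a contaminant electron source, with hyperbolic field-size corrections) can be shown schematically. The sketch below illustrates that structure only, under loud assumptions: the tanh terms stand in for the unspecified hyperbolic functions, and all weights are hypothetical, not the paper's fitted parameters.

        # Schematic of a multisource output model: total relative output is the
        # sum of a primary term, an extra-focal term, and an electron
        # contamination term, each with an assumed field-size dependence.
        import math

        def relative_output(field_cm, w_primary=0.92,
                            w_extrafocal=0.06, w_electron=0.02):
            primary = w_primary                                   # field-size independent
            extrafocal = w_extrafocal * math.tanh(field_cm / 10)  # grows, then saturates
            electron = w_electron * math.tanh(field_cm / 15)      # contamination build-up
            return primary + extrafocal + electron

        for f in (4, 10, 20, 40):
            print(f"{f:>2} x {f:<2} cm field: relative output {relative_output(f):.3f}")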

  8. A Theory of Planned Behaviour-Based Analysis of TIMSS 2011 to Determine Factors Influencing Inquiry Teaching Practices in High-Performing Countries

    ERIC Educational Resources Information Center

    Pongsophon, Pongprapan; Herman, Benjamin C.

    2017-01-01

    Given the abundance of literature describing the strong relationship between inquiry-based teaching and student achievement, more should be known about the factors impacting science teachers' classroom inquiry implementation. This study utilises the theory of planned behaviour to propose and validate a causal model of inquiry-based teaching…

  9. Validation of a Fast-Response Urban Micrometeorological Model to Assess the Performance of Urban Heat Island Mitigation Strategies

    NASA Astrophysics Data System (ADS)

    Nadeau, D.; Girard, P.; Overby, M.; Pardyjak, E.; Stoll, R., II; Willemsen, P.; Bailey, B.; Parlange, M. B.

    2015-12-01

    Urban heat islands (UHI) are a real threat in many cities worldwide, and mitigation measures have become a central component of urban planning strategies. Even within a city, the causes of UHI vary from one neighborhood to another, mostly due to the spatial variability in surface thermal properties, building geometry, anthropogenic heat flux releases, and vegetation cover. As a result, the performance of UHI mitigation measures also varies in space. Hence, there is a need to develop a tool to quantify the efficiency of UHI mitigation measures at the neighborhood scale. The objective of this ongoing study is to validate the fast-response micrometeorological model QUIC EnvSim (QES). This model can provide all the information required for UHI studies with a fine spatial resolution (down to 0.5 m) and short computation time. QES combines QUIC, a CFD-based wind solver and dispersion model, with EnvSim, composed of a radiation model, a land-surface model, and a turbulent transport model. Here, high-resolution (1 m) simulations are run over a subset of the École Polytechnique Fédérale de Lausanne (EPFL) campus including complex buildings, various surface properties, and vegetation. For nearly five months in 2006-07, a dense network of meteorological observations (92 weather stations over 0.1 km²) was deployed over the campus, and these unique data are used here as a validation dataset. We present validation results for different test cases (e.g., sunny vs cloudy days, different incoming wind speeds and directions) and explore the effect of a few UHI mitigation strategies on the spatial distribution of near-surface air temperatures. Preliminary results suggest that QES may be a valuable tool in decision-making regarding the adaptation of urban planning to UHI.

  10. Real-world use of the risk-need-responsivity model and the level of service/case management inventory with community-supervised offenders.

    PubMed

    Dyck, Heather L; Campbell, Mary Ann; Wershler, Julie L

    2018-06-01

    The risk-need-responsivity model (RNR; Bonta & Andrews, 2017) has become a leading approach for effective offender case management, but field tests of this model are still required. The present study first assessed the predictive validity of the RNR-informed Level of Service/Case Management Inventory (LS/CMI; Andrews, Bonta, & Wormith, 2004) with a sample of Atlantic Canadian male and female community-supervised provincial offenders (N = 136). Next, the case management plans prepared from these LS/CMI results were analyzed for adherence to the principles of risk, need, and responsivity. As expected, the LS/CMI was a strong predictor of general recidivism for both males (area under the curve = .75, 95% confidence interval [.66, .85]), and especially females (area under the curve = .94, 95% confidence interval [.84, 1.00]), over an average 3.42-year follow-up period. The LS/CMI was predictive of time to recidivism, with lower risk cases taking longer to reoffend than higher risk cases. Despite the robust predictive validity of the LS/CMI, case management plans developed by probation officers generally reflected poor adherence to the RNR principles. These findings highlight the need for better training on how to transfer risk appraisal information from valid risk tools to case plans to better meet the best-practice principles of risk, need, and responsivity for criminal behavior risk reduction. (PsycINFO Database Record © 2018 APA, all rights reserved).

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, J; Hu, W; Chen, X

    Purpose: The aim of this study is to investigate the feasibility of using RapidPlan for breast cancer radiotherapy and to evaluate its performance for planners with different levels of planning experience. Methods: A training database was collected with 80 expert plan datasets from patients who previously received left-sided breast-conserving surgery and IMRT with simultaneously integrated boost radiotherapy. The model was created in RapidPlan. Five patients from the training database and 5 external patients were used for internal and external validation, respectively. Three planners with different levels of planning experience (beginner, junior, senior) designed manual and RapidPlan-based plans for an additional ten patients, and the qualities of the manual and RapidPlan-based plans were compared. Results: For the internal and external validations, there were no significant dose differences in target coverage between the RapidPlan-based and manual plans. Also, no difference was found in the mean doses to the contralateral breast and lung. RapidPlan improved heart (V5, V10, V20, V30, and mean dose) and ipsilateral lung (V5, V10, V20, V30, and mean dose) sparing for the beginner and junior planners. Compared to the plans from the senior planner, 6 out of 16 clinically checked parameters were improved in the RapidPlan-based plans, and the remaining parameters were similar. Conclusion: It is feasible to generate clinically acceptable plans using RapidPlan for breast cancer radiotherapy. RapidPlan helps to systematically improve the quality of IMRT plans against the benchmark of clinically accepted plans, and it shows promise for homogenizing plan quality by transferring planning expertise from more experienced to less experienced planners.

  12. Applicability Analysis of Validation Evidence for Biomedical Computational Models

    DOE PAGES

    Pathmanathan, Pras; Gray, Richard A.; Romero, Vicente J.; ...

    2017-09-07

    Computational modeling has the potential to revolutionize medicine the way it transformed engineering. However, despite decades of work, there has only been limited progress to successfully translate modeling research to patient care. One major difficulty which often occurs with biomedical computational models is an inability to perform validation in a setting that closely resembles how the model will be used. For example, for a biomedical model that makes in vivo clinically relevant predictions, direct validation of predictions may be impossible for ethical, technological, or financial reasons. Unavoidable limitations inherent to the validation process lead to challenges in evaluating the credibility of biomedical model predictions. Therefore, when evaluating biomedical models, it is critical to rigorously assess applicability, that is, the relevance of the computational model, and its validation evidence, to the proposed context of use (COU). However, there are no well-established methods for assessing applicability. In this paper, we present a novel framework for performing applicability analysis and demonstrate its use with a medical device computational model. The framework provides a systematic, step-by-step method for breaking down the broad question of applicability into a series of focused questions, which may be addressed using supporting evidence and subject matter expertise. The framework can be used for model justification, model assessment, and validation planning. While motivated by biomedical models, it is relevant to a broad range of disciplines and underlying physics. Finally, the proposed applicability framework could help overcome some of the barriers inherent to validation of, and aid clinical implementation of, biomedical models.

  13. Applicability Analysis of Validation Evidence for Biomedical Computational Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pathmanathan, Pras; Gray, Richard A.; Romero, Vicente J.

    Computational modeling has the potential to revolutionize medicine the way it transformed engineering. However, despite decades of work, there has only been limited progress to successfully translate modeling research to patient care. One major difficulty which often occurs with biomedical computational models is an inability to perform validation in a setting that closely resembles how the model will be used. For example, for a biomedical model that makes in vivo clinically relevant predictions, direct validation of predictions may be impossible for ethical, technological, or financial reasons. Unavoidable limitations inherent to the validation process lead to challenges in evaluating the credibility of biomedical model predictions. Therefore, when evaluating biomedical models, it is critical to rigorously assess applicability, that is, the relevance of the computational model, and its validation evidence, to the proposed context of use (COU). However, there are no well-established methods for assessing applicability. In this paper, we present a novel framework for performing applicability analysis and demonstrate its use with a medical device computational model. The framework provides a systematic, step-by-step method for breaking down the broad question of applicability into a series of focused questions, which may be addressed using supporting evidence and subject matter expertise. The framework can be used for model justification, model assessment, and validation planning. While motivated by biomedical models, it is relevant to a broad range of disciplines and underlying physics. Finally, the proposed applicability framework could help overcome some of the barriers inherent to validation of, and aid clinical implementation of, biomedical models.

  14. Manpower planning using Markov Chain model

    NASA Astrophysics Data System (ADS)

    Saad, Syafawati Ab; Adnan, Farah Adibah; Ibrahim, Haslinda; Rahim, Rahela

    2014-07-01

    Manpower planning is a modelling approach for understanding the flow of manpower as policies change. For this purpose, researchers have made numerous attempts to develop models that track the movement of lecturers at various universities. Because a university employs a large number of lecturers, their movement is difficult to track, and no quantitative method has been used to do so. This research aims to determine an appropriate manpower model for understanding the flow of lecturers at a university in Malaysia by determining the probability that lecturers remain in the same rank and the mean time they do so. In addition, this research estimates the number of lecturers in each rank (lecturer, senior lecturer, and associate professor). Several methods have been applied to manpower planning in previous studies; the method used in this research is the Markov chain model. Results obtained from this study indicate that the manpower planning model is validated by comparison with actual data. A smaller margin of error gives a better result, meaning that the projection is closer to the actual data. These results can help the university plan the hiring of lecturers and its budget in the future.
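
    A minimal sketch of the Markov chain calculation described above follows: a rank-transition matrix projects headcounts forward and gives the expected time a lecturer remains in a rank. The transition probabilities and headcounts are hypothetical, not the study's estimates.

        # Markov chain manpower sketch over three ranks. Rows are the current
        # rank, columns the rank next year; diagonals are self-transitions.
        # All numbers are hypothetical placeholders.
        import numpy as np

        P = np.array([[0.80, 0.18, 0.02],    # lecturer
                      [0.00, 0.85, 0.15],    # senior lecturer
                      [0.00, 0.00, 1.00]])   # associate professor (absorbing here)

        headcount = np.array([120.0, 60.0, 20.0])
        for year in range(5):                # five-year projection, no new hires
            headcount = headcount @ P
        print("projected headcount:", np.round(headcount, 1))

        # expected sojourn time in a rank is 1 / (1 - p_ii) for a geometric stay
        for rank, p_stay in zip(("lecturer", "senior", "assoc. prof."), np.diag(P)):
            if p_stay < 1.0:
                print(f"mean years as {rank}: {1 / (1 - p_stay):.1f}")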

  15. Development and evaluation of a clinical model for lung cancer patients using stereotactic body radiotherapy (SBRT) within a knowledge-based algorithm for treatment planning.

    PubMed

    Chin Snyder, Karen; Kim, Jinkoo; Reding, Anne; Fraser, Corey; Gordon, James; Ajlouni, Munther; Movsas, Benjamin; Chetty, Indrin J

    2016-11-08

    The purpose of this study was to describe the development of a clinical model for lung cancer patients treated with stereotactic body radiotherapy (SBRT) within a knowledge-based algorithm for treatment planning, and to evaluate the model performance and applicability to different planning techniques, tumor locations, and beam arrangements. 105 SBRT plans for lung cancer patients previously treated at our institution were included in the development of the knowledge-based model (KBM). The KBM was trained with a combination of IMRT, VMAT, and 3D CRT techniques. Model performance was validated with 25 cases, for both IMRT and VMAT. The full KBM encompassed lesions located centrally vs. peripherally (43:62), upper vs. lower (62:43), and anterior vs. posterior (60:45). Four separate sub-KBMs were created based on tumor location. Results were compared with the full KBM to evaluate its robustness. Beam templates were used in conjunction with the optimizer to evaluate the model's ability to handle suboptimal beam placements. Dose differences to organs-at-risk (OAR) were evaluated between the plans generated by each KBM. Knowledge-based plans (KBPs) were comparable to clinical plans with respect to target conformity and OAR doses. The KBPs resulted in a lower maximum spinal cord dose by 1.0 ± 1.6 Gy compared to clinical plans, p = 0.007. Sub-KBMs split according to tumor location did not produce significantly better DVH estimates compared to the full KBM. For central lesions, compared to the full KBM, the peripheral sub-KBM resulted in lower dose to 0.035 cc and 5 cc of the esophagus, both by 0.4 ± 0.8 Gy, p = 0.025. For all lesions, compared to the full KBM, the posterior sub-KBM resulted in higher dose to 0.035 cc, 0.35 cc, and 1.2 cc of the spinal cord by 0.2 ± 0.4 Gy, p = 0.01. Plans using template beam arrangements met target and OAR criteria, with an increase noted in maximum heart dose (1.2 ± 2.2 Gy, p = 0.01) and GI (0.2 ± 0.4, p = 0.01) for the nine-field plans relative to KBPs planned with custom beam angles. A knowledge-based model for lung SBRT consisting of multiple treatment modalities and lesion locations produced comparable plan quality to clinical plans. With proper training and validation, a robust KBM can be created that encompasses both IMRT and VMAT techniques, as well as different lesion locations. © 2016 The Authors.

  16. Quantifying Unnecessary Normal Tissue Complication Risks due to Suboptimal Planning: A Secondary Study of RTOG 0126.

    PubMed

    Moore, Kevin L; Schmidt, Rachel; Moiseenko, Vitali; Olsen, Lindsey A; Tan, Jun; Xiao, Ying; Galvin, James; Pugh, Stephanie; Seider, Michael J; Dicker, Adam P; Bosch, Walter; Michalski, Jeff; Mutic, Sasa

    2015-06-01

    The purpose of this study was to quantify the frequency and clinical severity of quality deficiencies in intensity modulated radiation therapy (IMRT) planning in the Radiation Therapy Oncology Group 0126 protocol. A total of 219 IMRT patients from the high-dose arm (79.2 Gy) of RTOG 0126 were analyzed. To quantify plan quality, we used established knowledge-based methods for patient-specific dose-volume histogram (DVH) prediction of organs at risk and a Lyman-Kutcher-Burman (LKB) model for grade ≥2 rectal complications to convert DVHs into normal tissue complication probabilities (NTCPs). The LKB model was validated by fitting dose-response parameters relative to observed toxicities. The 90th percentile (22 of 219) of plans with the lowest excess risk (difference between clinical and model-predicted NTCP) were used to create a model for the presumed best practices in the protocol (pDVH0126,top10%). Applying the resultant model to the entire sample enabled comparisons between DVHs that patients could have received to DVHs they actually received. Excess risk quantified the clinical impact of suboptimal planning. Accuracy of pDVH predictions was validated by replanning 30 of 219 patients (13.7%), including equal numbers of presumed "high-quality," "low-quality," and randomly sampled plans. NTCP-predicted toxicities were compared to adverse events on protocol. Existing models showed that bladder-sparing variations were less prevalent than rectum quality variations and that increased rectal sparing was not correlated with target metrics (dose received by 98% and 2% of the PTV, respectively). Observed toxicities were consistent with current LKB parameters. Converting DVH and pDVH0126,top10% to rectal NTCPs, we observed 94 of 219 patients (42.9%) with ≥5% excess risk, 20 of 219 patients (9.1%) with ≥10% excess risk, and 2 of 219 patients (0.9%) with ≥15% excess risk. Replanning demonstrated the predicted NTCP reductions while maintaining the volume of the PTV receiving prescription dose. An equivalent sample of high-quality plans showed fewer toxicities than low-quality plans, 6 of 73 versus 10 of 73 respectively, although these differences were not significant (P=.21) due to insufficient statistical power in this retrospective study. Plan quality deficiencies in RTOG 0126 exposed patients to substantial excess risk for rectal complications. Copyright © 2015 Elsevier Inc. All rights reserved.
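
    The DVH-to-NTCP step above uses the Lyman-Kutcher-Burman model. A minimal sketch of that calculation follows, reducing a toy differential DVH to a generalized equivalent uniform dose (gEUD) and mapping it through the probit function; the parameter values (n, m, TD50) are commonly cited rectal values used here only as placeholders, not the fitted RTOG 0126 values.

        # LKB sketch: gEUD = (sum_i v_i * D_i^(1/n))^n,
        # t = (gEUD - TD50) / (m * TD50), NTCP = Phi(t).
        # DVH bins and parameters below are illustrative placeholders.
        import numpy as np
        from math import erf, sqrt

        def lkb_ntcp(doses_gy, volumes, n=0.09, m=0.13, td50=76.9):
            """doses_gy: DVH bin doses; volumes: fractional volume per bin."""
            geud = (np.sum(volumes * doses_gy ** (1 / n))) ** n
            t = (geud - td50) / (m * td50)
            return 0.5 * (1 + erf(t / sqrt(2)))   # standard normal CDF

        doses = np.array([20.0, 40.0, 60.0, 70.0, 78.0])   # Gy
        vols = np.array([0.30, 0.25, 0.20, 0.15, 0.10])    # fractional volumes
        print(f"NTCP (grade >= 2 rectal): {lkb_ntcp(doses, vols):.3f}")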

  17. An in vivo dose verification method for SBRT-VMAT delivery using the EPID.

    PubMed

    McCowan, P M; Van Uytven, E; Van Beek, T; Asuni, G; McCurdy, B M C

    2015-12-01

    Radiation treatments have become increasingly complex with the development of volumetric modulated arc therapy (VMAT) and the use of stereotactic body radiation therapy (SBRT). SBRT involves the delivery of substantially larger doses over fewer fractions than conventional therapy. SBRT-VMAT treatments will strongly benefit from in vivo patient dose verification, as any errors in delivery can be more detrimental to the radiobiology of the patient than with conventional therapy. Electronic portal imaging devices (EPIDs) are available on most commercial linear accelerators (linacs), and their documented use for dosimetry makes them valuable tools for patient dose verification. In this work, the authors customize and validate a physics-based model which utilizes on-treatment EPID images to reconstruct the 3D dose delivered to the patient during SBRT-VMAT delivery. The SBRT linac head, including jaws, multileaf collimators, and flattening filter, was modeled using Monte Carlo methods and verified with measured data. The simulation provides energy spectrum data that are used by their "forward" model to accurately predict the fluence generated by a SBRT beam at a plane above the patient. This fluence is then transported through the patient, and the dose to the phosphor layer in the EPID is calculated. Their "inverse" model back-projects the EPID-measured focal fluence to a plane upstream of the patient and recombines it with the extra-focal fluence predicted by the forward model. This estimate of total delivered fluence is then forward-projected onto the patient's density matrix, and a collapsed cone convolution algorithm calculates the dose delivered to the patient. The model was tested by reconstructing the dose for two prostate, three lung, and two spine SBRT-VMAT treatment fractions delivered to an anthropomorphic phantom. It was further validated against actual patient data for a lung and a spine SBRT-VMAT plan. The results were verified against the treatment planning system (TPS; Eclipse AAA) dose calculation. The SBRT-VMAT reconstruction model performed very well when compared to the TPS. A stringent 2%/2 mm χ-comparison calculation gave pass rates better than 91% for the prostate plans, 88% for the lung plans, and 86% for the spine plans for voxels containing 80% or more of the prescribed dose. Patient data pass rates were 86% for the lung and 95% for the spine. A 3%/3 mm χ-comparison was also performed and gave pass rates better than 93% for all plan types. The authors have customized and validated a robust, physics-based model that calculates the delivered dose to a patient for SBRT-VMAT delivery using on-treatment EPID images. The accuracy of the results indicates that this approach is suitable for clinical implementation. Future work will incorporate this model into both offline and real-time clinical adaptive radiotherapy.

  18. Extended Axiomatic Conjoint Measurement: A Solution to a Methodological Problem in Studying Fertility-Related Behaviors.

    ERIC Educational Resources Information Center

    Nickerson, Carol A.; McClelland, Gary H.

    1988-01-01

    A methodology is developed based on axiomatic conjoint measurement to accompany a fertility decision-making model. The usefulness of the model is then demonstrated via an application to a study of contraceptive choice (N=100 male and female family-planning clinic clients). Finally, the validity of the model is evaluated. (TJH)

  19. The task of validation of gas-dynamic characteristics of a multistage centrifugal compressor for a natural gas booster compressor station

    NASA Astrophysics Data System (ADS)

    Danilishin, A. M.; Kozhukhov, Y. V.; Neverov, V. V.; Malev, K. G.; Mironov, Y. R.

    2017-08-01

    The aim of this work is a validation study of the numerical modelling of the characteristics of a multistage centrifugal compressor for natural gas. The research process included an analysis of the grid interfaces and software systems used. The results revealed discrepancies between the simulated and experimental characteristics, and a plan for future work was outlined.

  20. Development of Novel Therapeutics for Neglected Tropical Disease Leishmaniasis

    DTIC Science & Technology

    2015-10-01

    Approved for public release; distribution unlimited. We undertook planning of a kick-off coordination meeting. A low-dose infection model of CL was validated... A large-scale synthesis of PEN was optimized, and in vitro studies revealed that PEN alters the parasite lipidome. Further studies were... Pentalinonsterol, Leishmania, cutaneous leishmaniasis, treatment. Accomplishments: • Undertook planning of kick-off coordination meeting • Large-scale synthesis of

  1. IDEA: Planning at the Core of Autonomous Reactive Agents

    NASA Technical Reports Server (NTRS)

    Muscettola, Nicola; Dorais, Gregory A.; Fry, Chuck; Levinson, Richard; Plaunt, Christian; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Several successful autonomous systems are separated into technologically diverse functional layers operating at different levels of abstraction. This diversity makes them difficult to implement and validate. In this paper, we present IDEA (Intelligent Distributed Execution Architecture), a unified planning and execution framework. In IDEA, a layered system can be implemented as separate agents, one per layer, each representing its interactions with the world in a model. At all levels, the model representation primitives and their semantics are the same. Moreover, each agent relies on a single model, plan database, and plan runner, and on a variety of planners, both reactive and deliberative. The framework allows the specification of agents that operate within a guaranteed reaction time and supports flexible specification of reactive vs. deliberative agent behavior. Within the IDEA framework, we are working to fully duplicate the functionalities of the DS1 Remote Agent and extend it to domains of higher complexity than autonomous spacecraft control.

  2. A GPU-accelerated Monte Carlo dose calculation platform and its application toward validating an MRI-guided radiation therapy beam model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Yuhe; Mazur, Thomas R.; Green, Olga

    Purpose: The clinical commissioning of IMRT subject to a magnetic field is challenging. The purpose of this work is to develop a GPU-accelerated Monte Carlo dose calculation platform based on PENELOPE and then use the platform to validate a vendor-provided MRIdian head model toward quality assurance of clinical IMRT treatment plans subject to a 0.35 T magnetic field. Methods: PENELOPE was first translated from FORTRAN to C++ and the result was confirmed to produce equivalent results to the original code. The C++ code was then adapted to CUDA in a workflow optimized for GPU architecture. The original code was expanded to include voxelized transport with Woodcock tracking, faster electron/positron propagation in a magnetic field, and several features that make gPENELOPE highly user-friendly. Moreover, the vendor-provided MRIdian head model was incorporated into the code in an effort to apply gPENELOPE as both an accurate and rapid dose validation system. A set of experimental measurements were performed on the MRIdian system to examine the accuracy of both the head model and gPENELOPE. Ultimately, gPENELOPE was applied toward independent validation of patient doses calculated by MRIdian’s KMC. Results: An acceleration factor of 152 was achieved in comparison to the original single-thread FORTRAN implementation with the original accuracy being preserved. For 16 treatment plans including stomach (4), lung (2), liver (3), adrenal gland (2), pancreas (2), spleen (1), mediastinum (1), and breast (1), the MRIdian dose calculation engine agrees with gPENELOPE with a mean gamma passing rate of 99.1% ± 0.6% (2%/2 mm). Conclusions: A Monte Carlo simulation platform was developed based on a GPU-accelerated version of PENELOPE. This platform was used to validate that both the vendor-provided head model and fast Monte Carlo engine used by the MRIdian system are accurate in modeling radiation transport in a patient using 2%/2 mm gamma criteria. Future applications of this platform will include dose validation and accumulation, IMRT optimization, and dosimetry system modeling for next generation MR-IGRT systems.
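
    Of the features listed above, Woodcock (delta) tracking is the one that most changes the transport loop: particles step through the voxel grid using a single majorant cross section and accept or reject collisions locally, avoiding voxel-boundary ray tracing. The sketch below illustrates the idea in one dimension with hypothetical attenuation values; it is not gPENELOPE code.

        # Minimal 1D Woodcock (delta) tracking sketch: sample free paths with
        # the majorant cross section, then accept a collision as real with
        # probability mu_local / mu_max. All values are hypothetical.
        import numpy as np

        rng = np.random.default_rng(1)
        mu = np.array([0.02, 0.05, 0.20, 0.05])   # attenuation per voxel (1/mm)
        voxel_mm = 10.0
        mu_max = mu.max()                          # majorant cross section

        def woodcock_free_path():
            """Distance to a real interaction, or None if the particle escapes."""
            x = 0.0
            while x < len(mu) * voxel_mm:
                x += -np.log(rng.random()) / mu_max     # tentative step
                if x >= len(mu) * voxel_mm:
                    break
                local = mu[int(x // voxel_mm)]
                if rng.random() < local / mu_max:       # real vs virtual collision
                    return x
            return None

        paths = [woodcock_free_path() for _ in range(10000)]
        real = [p for p in paths if p is not None]
        print(f"interacted: {len(real)}, mean depth {np.mean(real):.1f} mm")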

  3. A GPU-accelerated Monte Carlo dose calculation platform and its application toward validating an MRI-guided radiation therapy beam model

    PubMed Central

    Wang, Yuhe; Mazur, Thomas R.; Green, Olga; Hu, Yanle; Li, Hua; Rodriguez, Vivian; Wooten, H. Omar; Yang, Deshan; Zhao, Tianyu; Mutic, Sasa; Li, H. Harold

    2016-01-01

    Purpose: The clinical commissioning of IMRT subject to a magnetic field is challenging. The purpose of this work is to develop a GPU-accelerated Monte Carlo dose calculation platform based on PENELOPE and then use the platform to validate a vendor-provided MRIdian head model toward quality assurance of clinical IMRT treatment plans subject to a 0.35 T magnetic field. Methods: PENELOPE was first translated from FORTRAN to C++ and the result was confirmed to produce equivalent results to the original code. The C++ code was then adapted to CUDA in a workflow optimized for GPU architecture. The original code was expanded to include voxelized transport with Woodcock tracking, faster electron/positron propagation in a magnetic field, and several features that make gPENELOPE highly user-friendly. Moreover, the vendor-provided MRIdian head model was incorporated into the code in an effort to apply gPENELOPE as both an accurate and rapid dose validation system. A set of experimental measurements were performed on the MRIdian system to examine the accuracy of both the head model and gPENELOPE. Ultimately, gPENELOPE was applied toward independent validation of patient doses calculated by MRIdian’s KMC. Results: An acceleration factor of 152 was achieved in comparison to the original single-thread FORTRAN implementation with the original accuracy being preserved. For 16 treatment plans including stomach (4), lung (2), liver (3), adrenal gland (2), pancreas (2), spleen (1), mediastinum (1), and breast (1), the MRIdian dose calculation engine agrees with gPENELOPE with a mean gamma passing rate of 99.1% ± 0.6% (2%/2 mm). Conclusions: A Monte Carlo simulation platform was developed based on a GPU-accelerated version of PENELOPE. This platform was used to validate that both the vendor-provided head model and fast Monte Carlo engine used by the MRIdian system are accurate in modeling radiation transport in a patient using 2%/2 mm gamma criteria. Future applications of this platform will include dose validation and accumulation, IMRT optimization, and dosimetry system modeling for next generation MR-IGRT systems. PMID:27370123

  4. A GPU-accelerated Monte Carlo dose calculation platform and its application toward validating an MRI-guided radiation therapy beam model.

    PubMed

    Wang, Yuhe; Mazur, Thomas R; Green, Olga; Hu, Yanle; Li, Hua; Rodriguez, Vivian; Wooten, H Omar; Yang, Deshan; Zhao, Tianyu; Mutic, Sasa; Li, H Harold

    2016-07-01

    The clinical commissioning of IMRT subject to a magnetic field is challenging. The purpose of this work is to develop a GPU-accelerated Monte Carlo dose calculation platform based on PENELOPE and then use the platform to validate a vendor-provided MRIdian head model toward quality assurance of clinical IMRT treatment plans subject to a 0.35 T magnetic field. PENELOPE was first translated from FORTRAN to C++ and the result was confirmed to produce equivalent results to the original code. The C++ code was then adapted to CUDA in a workflow optimized for GPU architecture. The original code was expanded to include voxelized transport with Woodcock tracking, faster electron/positron propagation in a magnetic field, and several features that make gPENELOPE highly user-friendly. Moreover, the vendor-provided MRIdian head model was incorporated into the code in an effort to apply gPENELOPE as both an accurate and rapid dose validation system. A set of experimental measurements were performed on the MRIdian system to examine the accuracy of both the head model and gPENELOPE. Ultimately, gPENELOPE was applied toward independent validation of patient doses calculated by MRIdian's KMC. An acceleration factor of 152 was achieved in comparison to the original single-thread FORTRAN implementation with the original accuracy being preserved. For 16 treatment plans including stomach (4), lung (2), liver (3), adrenal gland (2), pancreas (2), spleen (1), mediastinum (1), and breast (1), the MRIdian dose calculation engine agrees with gPENELOPE with a mean gamma passing rate of 99.1% ± 0.6% (2%/2 mm). A Monte Carlo simulation platform was developed based on a GPU-accelerated version of PENELOPE. This platform was used to validate that both the vendor-provided head model and fast Monte Carlo engine used by the MRIdian system are accurate in modeling radiation transport in a patient using 2%/2 mm gamma criteria. Future applications of this platform will include dose validation and accumulation, IMRT optimization, and dosimetry system modeling for next generation MR-IGRT systems.

  5. Fast and accurate Monte Carlo modeling of a kilovoltage X-ray therapy unit using a photon-source approximation for treatment planning in complex media.

    PubMed

    Zeinali-Rafsanjani, B; Mosleh-Shirazi, M A; Faghihi, R; Karbasi, S; Mosalaei, A

    2015-01-01

    To accurately recompute dose distributions in chest-wall radiotherapy with 120 kVp kilovoltage X-rays, an MCNP4C Monte Carlo model is presented using a fast method that obviates the need to fully model the tube components. To validate the model, half-value layer (HVL), percentage depth doses (PDDs) and beam profiles were measured. Dose measurements were performed for a more complex situation using thermoluminescence dosimeters (TLDs) placed within a Rando phantom. The measured and computed first and second HVLs were 3.8, 10.3 mm Al and 3.8, 10.6 mm Al, respectively. The differences between measured and calculated PDDs and beam profiles in water were within 2 mm/2% for all data points. In the Rando phantom, differences for the majority of data points were within 2%. The proposed model reduced the run time approximately 9500-fold compared to the conventional full simulation. The acceptable agreement, based on international criteria, between the simulations and the measurements validates the accuracy of the model for its use in treatment planning and radiobiological modeling studies of superficial therapies, including chest-wall irradiation using kilovoltage beams.
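
    The HVL comparison above reduces to locating where a measured transmission curve crosses 50% and 25%. A minimal sketch using log-linear interpolation follows; the transmission data are fabricated for illustration and do not reproduce the paper's 3.8/10.3 mm Al values.

    ```python
    import numpy as np

    def half_value_layers(thickness_mm_al, transmission):
        """Estimate first and second HVLs (mm Al) by log-linear interpolation
        of a measured transmission curve (sketch; data below are made up)."""
        t = np.asarray(thickness_mm_al, dtype=float)
        logT = np.log(np.asarray(transmission, dtype=float))
        def thickness_at(frac):
            # np.interp needs an increasing abscissa, so reverse both arrays
            return float(np.interp(np.log(frac), logT[::-1], t[::-1]))
        hvl1 = thickness_at(0.5)
        hvl2 = thickness_at(0.25) - hvl1   # second HVL: 50% -> 25% transmission
        return hvl1, hvl2

    d = [0, 1, 2, 4, 6, 8, 10, 15, 20]                       # mm Al
    T = [1.00, 0.83, 0.70, 0.52, 0.41, 0.33, 0.27, 0.18, 0.125]
    print(half_value_layers(d, T))
    ```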

  6. Applying the plan-do-study-act model to increase the use of kangaroo care.

    PubMed

    Stikes, Reetta; Barbier, Denise

    2013-01-01

    To increase the rate of participation in kangaroo care within a level III neonatal intensive care unit. Preterm birth typically results in initial separation of mother and infant, which may disrupt the bonding process. Nurses within the neonatal intensive care unit can introduce strategies that will assist parents in overcoming fears and developing relationships with their infants. Kangaroo care is a method of skin-to-skin holding that has been shown to enhance the mother-infant relationship while also improving infant outcomes. However, kangaroo care has been used inconsistently within neonatal intensive care unit settings. The Plan-Do-Study-Act Model, which uses four cyclical steps for continuous quality improvement, was used as a framework for this project. Based upon the Plan-Do-Study-Act Model, education was planned, surveys were developed, and strategies were implemented to overcome barriers. Four months post-implementation, the use of kangaroo care had increased by 31%. Staff surveys demonstrated a decrease in the perceived barriers to kangaroo care as well as an increase in kangaroo care. Application of the Plan-Do-Study-Act Model was successful in meeting the goal of increasing the use of kangaroo care. The use of the Plan-Do-Study-Act Model framework encourages learning, reflection and validation throughout implementation. The Plan-Do-Study-Act Model is a strategy that can promote the effective use of innovative practices in nursing. © 2013 Blackwell Publishing Ltd.

  7. Propagation and Directional Scattering of Ocean Waves in the Marginal Ice Zone and Neighboring Seas

    DTIC Science & Technology

    2015-09-30

    expected to be the average of the kernel for 10 s and 12 s. This means that we should be able to calculate empirical formulas for the scattering kernel...floe packing. Thus, establish a way to incorporate what has been done by Squire and co-workers into the wave model paradigm (in which the phase of the...cases observed by Kohout et al. (2014) in Antarctica. vii. Validation: We are planning validation tests for the wave-ice scattering/attenuation model by

  8. Goals and Status of the NASA Juncture Flow Experiment

    NASA Technical Reports Server (NTRS)

    Rumsey, Christopher L.; Morrison, Joseph H.

    2016-01-01

    The NASA Juncture Flow experiment is a new effort whose focus is attaining validation data in the juncture region of a wing-body configuration. The experiment is designed specifically for the purpose of CFD validation. Current turbulence models routinely employed by Reynolds-averaged Navier-Stokes CFD are inconsistent in their prediction of corner flow separation in aircraft juncture regions, so experimental data in the near-wall region of such a configuration will be useful both for assessment as well as for turbulence model improvement. This paper summarizes the Juncture Flow effort to date, including preliminary risk-reduction experiments already conducted and planned future experiments. The requirements and challenges associated with conducting a quality validation test are discussed.

  9. Design, development and clinical validation of computer-aided surgical simulation system for streamlined orthognathic surgical planning.

    PubMed

    Yuan, Peng; Mai, Huaming; Li, Jianfu; Ho, Dennis Chun-Yu; Lai, Yingying; Liu, Siting; Kim, Daeseung; Xiong, Zixiang; Alfi, David M; Teichgraeber, John F; Gateno, Jaime; Xia, James J

    2017-12-01

    There are many proven problems associated with traditional surgical planning methods for orthognathic surgery. To address these problems, we developed a computer-aided surgical simulation (CASS) system, the AnatomicAligner, to plan orthognathic surgery following our streamlined clinical protocol. The system includes six modules: image segmentation and three-dimensional (3D) reconstruction, registration and reorientation of models to neutral head posture, 3D cephalometric analysis, virtual osteotomy, surgical simulation, and surgical splint generation. The accuracy of the system was validated in a stepwise fashion: first to evaluate the accuracy of AnatomicAligner using 30 sets of patient data, then to evaluate the fitting of splints generated by AnatomicAligner using 10 sets of patient data. The industrial gold standard system, Mimics, was used as the reference. When comparing the results of segmentation, virtual osteotomy and transformation achieved with AnatomicAligner to the ones achieved with Mimics, the absolute deviation between the two systems was clinically insignificant. The average surface deviation between the two models after 3D model reconstruction in AnatomicAligner and Mimics was 0.3 mm with a standard deviation (SD) of 0.03 mm. All the average surface deviations between the two models after virtual osteotomy and transformations were smaller than 0.01 mm with a SD of 0.01 mm. In addition, the fitting of splints generated by AnatomicAligner was at least as good as the ones generated by Mimics. We successfully developed a CASS system, the AnatomicAligner, for planning orthognathic surgery following the streamlined planning protocol. The system has been proven accurate. AnatomicAligner will soon be available freely to the broader clinical and research communities.

  10. Design, development and clinical validation of computer-aided surgical simulation system for streamlined orthognathic surgical planning

    PubMed Central

    Yuan, Peng; Mai, Huaming; Li, Jianfu; Ho, Dennis Chun-Yu; Lai, Yingying; Liu, Siting; Kim, Daeseung; Xiong, Zixiang; Alfi, David M.; Teichgraeber, John F.; Gateno, Jaime

    2017-01-01

    Purpose There are many proven problems associated with traditional surgical planning methods for orthognathic surgery. To address these problems, we developed a computer-aided surgical simulation (CASS) system, the AnatomicAligner, to plan orthognathic surgery following our streamlined clinical protocol. Methods The system includes six modules: image segmentation and three-dimensional (3D) reconstruction, registration and reorientation of models to neutral head posture, 3D cephalometric analysis, virtual osteotomy, surgical simulation, and surgical splint generation. The accuracy of the system was validated in a stepwise fashion: first to evaluate the accuracy of AnatomicAligner using 30 sets of patient data, then to evaluate the fitting of splints generated by AnatomicAligner using 10 sets of patient data. The industrial gold standard system, Mimics, was used as the reference. Result When comparing the results of segmentation, virtual osteotomy and transformation achieved with AnatomicAligner to the ones achieved with Mimics, the absolute deviation between the two systems was clinically insignificant. The average surface deviation between the two models after 3D model reconstruction in AnatomicAligner and Mimics was 0.3 mm with a standard deviation (SD) of 0.03 mm. All the average surface deviations between the two models after virtual osteotomy and transformations were smaller than 0.01 mm with a SD of 0.01 mm. In addition, the fitting of splints generated by AnatomicAligner was at least as good as the ones generated by Mimics. Conclusion We successfully developed a CASS system, the AnatomicAligner, for planning orthognathic surgery following the streamlined planning protocol. The system has been proven accurate. AnatomicAligner will soon be available freely to the broader clinical and research communities. PMID:28432489

  11. Alphabus Mechanical Validation Plan and Test Campaign

    NASA Astrophysics Data System (ADS)

    Calvisi, G.; Bonnet, D.; Belliol, P.; Lodereau, P.; Redoundo, R.

    2012-07-01

    A joint team of the two leading European satellite companies (Astrium and Thales Alenia Space) worked with the support of ESA and CNES to define a product line able to efficiently address the upper segment of communications satellites: Alphabus. Starting in 2009 and continuing through 2011, the mechanical validation of the Alphabus platform was obtained thanks to static tests performed on a dedicated static model and to environmental tests performed on the first satellite based on Alphabus: Alphasat I-XL. The mechanical validation of the Alphabus platform presented an excellent opportunity to improve the validation and qualification process, with respect to the static, sine vibration, acoustic and L/V shock environments, while minimizing the recurrent cost of manufacturing, integration and testing. A main driver on mechanical testing is that mechanical acceptance testing at satellite level will be performed with empty tanks, due to technical constraints (limitation of existing vibration devices) and programmatic advantages (test risk reduction, test schedule minimization). In this paper the impacts that such a testing logic has on the validation plan are briefly recalled, and its actual application to the Alphasat PFM mechanical test campaign is detailed.

  12. MISR - Science Data Validation Plan

    NASA Technical Reports Server (NTRS)

    Conel, J.; Ledeboer, W.; Ackerman, T.; Marchand, R.; Clothiaux, E.

    2000-01-01

    This Science Data Validation Plan describes the plans for validating a subset of the Multi-angle Imaging SpectroRadiometer (MISR) Level 2 algorithms and data products and supplying top-of-atmosphere (TOA) radiances to the In-flight Radiometric Calibration and Characterization (IFRCC) subsystem for vicarious calibration.

  13. Automated Data Assimilation and Flight Planning for Multi-Platform Observation Missions

    NASA Technical Reports Server (NTRS)

    Oza, Nikunj; Morris, Robert A.; Strawa, Anthony; Kurklu, Elif; Keely, Leslie

    2008-01-01

    This is a progress report on an effort in which our goal is to demonstrate the effectiveness of automated data mining and planning for the daily management of Earth Science missions. Currently, data mining and machine learning technologies are being used by scientists at research labs for validating Earth science models. However, few if any of these advanced techniques are currently being integrated into daily mission operations. Consequently, there are significant gaps in the knowledge that can be derived from the models and data that are used each day for guiding mission activities. The result can be sub-optimal observation plans, lack of useful data, and wasteful use of resources. Recent advances in data mining, machine learning, and planning make it feasible to migrate these technologies into the daily mission planning cycle. We describe the design of a closed loop system for data acquisition, processing, and flight planning that integrates the results of machine learning into the flight planning process.

  14. TH-AB-BRA-07: PENELOPE-Based GPU-Accelerated Dose Calculation System Applied to MRI-Guided Radiation Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Y; Mazur, T; Green, O

    Purpose: The clinical commissioning of IMRT subject to a magnetic field is challenging. The purpose of this work is to develop a GPU-accelerated Monte Carlo dose calculation platform based on PENELOPE and then use the platform to validate a vendor-provided MRIdian head model toward quality assurance of clinical IMRT treatment plans subject to a 0.35 T magnetic field. Methods: We first translated PENELOPE from FORTRAN to C++ and validated that the translation produced equivalent results. Then we adapted the C++ code to CUDA in a workflow optimized for GPU architecture. We expanded upon the original code to include voxelized transport boosted by Woodcock tracking, faster electron/positron propagation in a magnetic field, and several features that make gPENELOPE highly user-friendly. Moreover, we incorporated the vendor-provided MRIdian head model into the code. We performed a set of experimental measurements on MRIdian to examine the accuracy of both the head model and gPENELOPE, and then applied gPENELOPE toward independent validation of patient doses calculated by MRIdian’s KMC. Results: We achieve an average acceleration factor of 152 compared to the original single-thread FORTRAN implementation with the original accuracy preserved. For 16 treatment plans including stomach (4), lung (2), liver (3), adrenal gland (2), pancreas (2), spleen (1), mediastinum (1) and breast (1), the MRIdian dose calculation engine agrees with gPENELOPE with a mean gamma passing rate of 99.1% ± 0.6% (2%/2 mm). Conclusions: We developed a Monte Carlo simulation platform based on a GPU-accelerated version of PENELOPE. We validated that both the vendor-provided head model and fast Monte Carlo engine used by the MRIdian system are accurate in modeling radiation transport in a patient using 2%/2 mm gamma criteria. Future applications of this platform will include dose validation and accumulation, IMRT optimization, and dosimetry system modeling for next-generation MR-IGRT systems.

  15. Using SysML for verification and validation planning on the Large Synoptic Survey Telescope (LSST)

    NASA Astrophysics Data System (ADS)

    Selvy, Brian M.; Claver, Charles; Angeli, George

    2014-08-01

    This paper provides an overview of the tool, language, and methodology used for Verification and Validation Planning on the Large Synoptic Survey Telescope (LSST) Project. LSST has implemented a Model Based Systems Engineering (MBSE) approach as a means of defining all systems engineering planning and definition activities that have historically been captured in paper documents. Specifically, LSST has adopted the Systems Modeling Language (SysML) standard and is utilizing a software tool called Enterprise Architect, developed by Sparx Systems. Much of the historical use of SysML has focused on the early phases of the project life cycle. Our approach is to extend the advantages of MBSE into later stages of the construction project. This paper details the methodology employed to use the tool to document the verification planning phases, including the extension of the language to accommodate the project's needs. The process includes defining the Verification Plan for each requirement, which in turn consists of a Verification Requirement, Success Criteria, Verification Method(s), Verification Level, and Verification Owner. Each Verification Method for each Requirement is defined as a Verification Activity and mapped into Verification Events, which are collections of activities that can be executed concurrently in an efficient and complementary way. Verification Event dependencies and sequences are modeled using Activity Diagrams. The methodology employed also ties into the Project Management Control System (PMCS), which utilizes Primavera P6 software, mapping each Verification Activity as a step in a planned activity. This approach leads to full traceability from initial Requirement to scheduled, costed, and resource-loaded PMCS task-based activities, ensuring all requirements will be verified.

  16. WE-A-BRD-01: Innovation in Radiation Therapy Planning I: Knowledge Guided Treatment Planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Q; Olsen, L

    2014-06-15

    Intensity modulated radiation therapy (IMRT) and volumetric modulated arc therapy (VMAT) offer the capability of sparing normal tissues and organs. However, the exact amount of sparing is often unknown until the plan is complete. This lack of prior guidance has led to the iterative, trial-and-error approach in current planning practice. Even with this effort, the search for patient-specific optimal organ sparing is still strongly influenced by the planner's experience. While experience generally helps in maximizing the dosimetric advantages of IMRT/VMAT, there have been several reports showing an unnecessarily high degree of plan quality variability at individual institutions and amongst different institutions, even with a large amount of experience and the best available tools. Further, when physician and physicist evaluate a plan, the dosimetric quality of the plan is often compared with a standard protocol that ignores individual patient anatomy and tumor characteristic variations. In recent years, developments of knowledge models for clinical IMRT/VMAT planning guidance have shown promising clinical potential. These knowledge models extract past expert clinical experience into mathematical models that predict dose sparing references at the patient-specific level. For physicians and planners, these references provide objective values that reflect the best achievable dosimetric constraints. For quality assurance, applying patient-specific dosimetry requirements will enable more quantitative and objective assessment of protocol compliance for complex IMRT planning. Learning Objectives: Modeling and representation of knowledge for knowledge-guided treatment planning. Demonstrations of knowledge-guided treatment planning with a few clinical anatomical sites. Validation and evaluation of knowledge models for cost- and quality-effective standardization of plan optimization.

  17. Utility of the Montreal Assessment of Need Questionnaire for Community Mental Health Planning

    PubMed Central

    Tremblay, Jacques; Bamvita, Jean-Marie; Grenier, Guy; Fleury, Marie-Josée

    2014-01-01

    Needs assessment facilitates mental health services planning, provision, and evaluation. This study aimed to a) validate a new instrument, the Montreal Assessment of Needs Questionnaire (MANQ), and b) use this to assess variations and predictors of need (number and seriousness) in 297 individuals with severe mental disorders for 18 months, during implementation of the Quebec Mental Health Action Plan. MANQ internal and external validations were adequate. Variables significantly associated with need number and seriousness variations were used to build multiple linear regression models. Autonomous housing, not receiving welfare, not having consulted a health educator, higher level of help from services, Alcohol Use Disorders Identification Test total score, and social support were associated with decreasing need number and seriousness over time. Having a higher education was also associated with decreasing need number. In a reform context, the MANQ’s unique ability to detect rapid improvement in patient needs has usefulness for Quebec mental health planning. PMID:25099300

  18. On-Board Entry Trajectory Planning Expanded to Sub-orbital Flight

    NASA Technical Reports Server (NTRS)

    Lu, Ping; Shen, Zuojun

    2003-01-01

    A methodology for on-board planning of sub-orbital entry trajectories is developed. The algorithm is able to generate, in a time frame consistent with the on-board environment, a three-degree-of-freedom (3DOF) feasible entry trajectory, given the boundary conditions and vehicle modeling. This trajectory is then tracked by feedback guidance laws which issue guidance commands. The current trajectory planning algorithm complements the recently developed method for on-board 3DOF entry trajectory generation for orbital missions, and provides full-envelope autonomous adaptive entry guidance capability. The algorithm is validated and verified by extensive high-fidelity simulations using a sub-orbital reusable launch vehicle model and difficult mission scenarios including failures and aborts.

  19. Preliminary Multivariable Cost Model for Space Telescopes

    NASA Technical Reports Server (NTRS)

    Stahl, H. Philip

    2010-01-01

    Parametric cost models are routinely used to plan missions, compare concepts and justify technology investments. Previously, the authors published two single-variable cost models based on 19 flight missions. The current paper presents the development of a multi-variable space telescope cost model. The validity of the previously published models is tested, cost estimating relationships that are and are not significant cost drivers are identified, and interrelationships between variables are explored.

  20. SU-F-T-156: Monte Carlo Simulation Using TOPAS for Synchrotron Based Proton Discrete Spot Scanning System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moskvin, V; Pirlepesov, F; Tsiamas, P

    Purpose: This study provides an overview of the design and commissioning of the Monte Carlo (MC) model of the spot-scanning proton therapy nozzle and its implementation for the patient plan simulation. Methods: The Hitachi PROBEAT V scanning nozzle was simulated based on vendor specifications using the TOPAS extension of Geant4 code. FLUKA MC simulation was also utilized to provide supporting data for the main simulation. Validation of the MC model was performed using vendor provided data and measurements collected during acceptance/commissioning of the proton therapy machine. Actual patient plans using CT based treatment geometry were simulated and compared to the dose distributions produced by the treatment planning system (Varian Eclipse 13.6), and patient quality assurance measurements. In-house MATLAB scripts are used for converting DICOM data into TOPAS input files. Results: Comparison analysis of integrated depth doses (IDDs), therapeutic ranges (R90), and spot shape/sizes at different distances from the isocenter, indicate good agreement between MC and measurements. R90 agreement is within 0.15 mm across all energy tunes. IDDs and spot shapes/sizes differences are within statistical error of simulation (less than 1.5%). The MC simulated data, validated with physical measurements, were used for the commissioning of the treatment planning system. Patient geometry simulations were conducted based on the Eclipse produced DICOM plans. Conclusion: The treatment nozzle and standard option beam model were implemented in the TOPAS framework to simulate a highly conformal discrete spot-scanning proton beam system.

  1. Vibro-Acoustic FE Analyses of the Saab 2000 Aircraft

    NASA Technical Reports Server (NTRS)

    Green, Inge S.

    1992-01-01

    A finite element model of the Saab 2000 fuselage structure and interior cavity has been created in order to compute the noise level in the passenger cabin due to propeller noise. Areas covered in viewgraph format include the following: coupled acoustic/structural noise; data base creation; frequency response analysis; model validation; and planned analyses.

  2. Design and architecture of the Mars relay network planning and analysis framework

    NASA Technical Reports Server (NTRS)

    Cheung, K. M.; Lee, C. H.

    2002-01-01

    In this paper we describe the design and architecture of the Mars Network planning and analysis framework that supports generation and validation of efficient planning and scheduling strategies. The goals are to minimize the transmitting time, minimize the delaying time, and/or maximize the network throughput. The proposed framework would require (1) a client-server architecture to support interactive, batch, WEB, and distributed analysis and planning applications for the relay network analysis scheme, (2) a high-fidelity modeling and simulation environment that expresses link capabilities between spacecraft and from spacecraft to Earth stations as time-varying resources, and that models spacecraft activities, link priority, Solar System dynamic events, the laws of orbital mechanics, and other limiting factors such as spacecraft power and thermal constraints, and (3) an optimization methodology that casts the resource and constraint models into a standard linear or nonlinear constrained optimization problem that lends itself to commercial off-the-shelf (COTS) planning and scheduling algorithms.

  3. Development and Validation of a Polarimetric-MCScene 3D Atmospheric Radiation Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berk, Alexander; Hawes, Frederick; Fox, Marsha

    2016-03-15

    Polarimetric measurements can substantially enhance the ability of both spectrally resolved and single band imagery to detect the proliferation of weapons of mass destruction, providing data for locating and identifying facilities, materials, and processes of undeclared and proliferant nuclear weapons programs worldwide. Unfortunately, models do not exist that efficiently and accurately predict spectral polarized signatures for the materials of interest embedded in complex 3D environments. Having such a model would enable one to test hypotheses and optimize both the enhancement of scene contrast and the signal processing for spectral signature extraction. The Phase I set the groundwork for development of fully validated polarimetric spectral signature and scene simulation models. This has been accomplished 1. by (a) identifying and downloading state-of-the-art surface and atmospheric polarimetric data sources, (b) implementing tools for generating custom polarimetric data, and (c) identifying and requesting US Government funded field measurement data for use in validation; 2. by formulating an approach for upgrading the radiometric spectral signature model MODTRAN to generate polarimetric intensities through (a) ingestion of the polarimetric data, (b) polarimetric vectorization of existing MODTRAN modules, and (c) integration of a newly developed algorithm for computing polarimetric multiple scattering contributions; 3. by generating an initial polarimetric model that demonstrates calculation of polarimetric solar and lunar single scatter intensities arising from the interaction of incoming irradiances with molecules and aerosols; 4. by developing a design and implementation plan to (a) automate polarimetric scene construction and (b) efficiently sample polarimetric scattering and reflection events, for use in a to-be-developed polarimetric version of the existing first-principles synthetic scene simulation model, MCScene; and 5. by planning a validation field measurement program in collaboration with the Remote Sensing and Exploitation group at Sandia National Laboratories (SNL) in which data from their ongoing polarimetric field and laboratory measurement program will be shared and, to the extent allowed, tailored for model validation in exchange for model predictions under conditions and for geometries outside of their measurement domain.

  4. Agent-Based Simulation for Interconnection-Scale Renewable Integration and Demand Response Studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chassin, David P.; Behboodi, Sahand; Crawford, Curran

    This paper collects and synthesizes the technical requirements, implementation, and validation methods for quasi-steady agent-based simulations of interconnection-scale models with particular attention to the integration of renewable generation and controllable loads. Approaches for modeling aggregated controllable loads are presented and placed in the same control and economic modeling framework as generation resources for interconnection planning studies. Model performance is examined with system parameters that are typical for an interconnection approximately the size of the Western Electricity Coordinating Council (WECC) and a control area about 1/100 the size of the system. These results are used to demonstrate and validate the methods presented.

  5. Choice Inconsistencies among the Elderly: Evidence from Plan Choice in the Medicare Part D Program: Comment.

    PubMed

    Ketcham, Jonathan D; Kuminoff, Nicolai V; Powers, Christopher A

    2016-12-01

    Consumers' enrollment decisions in Medicare Part D can be explained by Abaluck and Gruber’s (2011) model of utility maximization with psychological biases or by a neoclassical version of their model that precludes such biases. We evaluate these competing hypotheses by applying nonparametric tests of utility maximization and model validation tests to administrative data. We find that 79 percent of enrollment decisions from 2006 to 2010 satisfied basic axioms of consumer theory under the assumption of full information. The validation tests provide evidence against widespread psychological biases. In particular, we find that precluding psychological biases improves the structural model's out-of-sample predictions for consumer behavior.

  6. Agent-Based Simulation for Interconnection-Scale Renewable Integration and Demand Response Studies

    DOE PAGES

    Chassin, David P.; Behboodi, Sahand; Crawford, Curran; ...

    2015-12-23

    This paper collects and synthesizes the technical requirements, implementation, and validation methods for quasi-steady agent-based simulations of interconnection-scale models with particular attention to the integration of renewable generation and controllable loads. Approaches for modeling aggregated controllable loads are presented and placed in the same control and economic modeling framework as generation resources for interconnection planning studies. Model performance is examined with system parameters that are typical for an interconnection approximately the size of the Western Electricity Coordinating Council (WECC) and a control area about 1/100 the size of the system. These results are used to demonstrate and validate the methods presented.

  7. A Public-Private Partnership Develops and Externally Validates a 30-Day Hospital Readmission Risk Prediction Model

    PubMed Central

    Choudhry, Shahid A.; Li, Jing; Davis, Darcy; Erdmann, Cole; Sikka, Rishi; Sutariya, Bharat

    2013-01-01

    Introduction: Preventing the occurrence of hospital readmissions is needed to improve quality of care and foster population health across the care continuum. Hospitals are being held accountable for improving transitions of care to avert unnecessary readmissions. Advocate Health Care in Chicago and Cerner (ACC) collaborated to develop all-cause, 30-day hospital readmission risk prediction models to identify patients who need interventional resources. Ideally, prediction models should encompass several qualities: they should have high predictive ability; use reliable and clinically relevant data; use rigorous performance metrics to assess the models; be validated in populations where they are applied; and be scalable in heterogeneous populations. However, a systematic review of prediction models for hospital readmission risk determined that most performed poorly (average C-statistic of 0.66) and efforts to improve their performance are needed for widespread usage. Methods: The ACC team incorporated electronic health record data, utilized a mixed-method approach to evaluate risk factors, and externally validated their prediction models for generalizability. Inclusion and exclusion criteria were applied on the patient cohort and then split for derivation and internal validation. Stepwise logistic regression was performed to develop two predictive models: one for admission and one for discharge. The prediction models were assessed for discrimination ability, calibration, overall performance, and then externally validated. Results: The ACC Admission and Discharge Models demonstrated modest discrimination ability during derivation, internal and external validation post-recalibration (C-statistic of 0.76 and 0.78, respectively), and reasonable model fit during external validation for utility in heterogeneous populations. Conclusions: The ACC Admission and Discharge Models embody the design qualities of ideal prediction models. The ACC plans to continue its partnership to further improve and develop valuable clinical models. PMID:24224068
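
    Model discrimination in studies like this one is summarized by the C-statistic: the probability that the model ranks a randomly chosen readmitted patient above a randomly chosen non-readmitted one. Below is a minimal sketch of computing it on a held-out validation cohort; the outcome labels and predicted probabilities are fabricated for illustration.

    ```python
    import numpy as np

    def c_statistic(y_true, y_score):
        """C-statistic (ROC AUC): probability that a readmitted patient
        receives a higher predicted risk than a non-readmitted one,
        counting ties as one half."""
        y_true = np.asarray(y_true)
        y_score = np.asarray(y_score)
        pos = y_score[y_true == 1]
        neg = y_score[y_true == 0]
        diff = pos[:, None] - neg[None, :]          # all pos/neg pairs
        return (np.sum(diff > 0) + 0.5 * np.sum(diff == 0)) / (len(pos) * len(neg))

    # Hypothetical external-validation check
    y = np.array([0, 0, 1, 0, 1, 1, 0, 1])          # 1 = readmitted
    p = np.array([0.1, 0.3, 0.4, 0.2, 0.8, 0.7, 0.5, 0.6])
    print(f"C-statistic: {c_statistic(y, p):.2f}")
    ```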

  8. The National Shipbuilding Research Program. Shipyard MACT Implementation Plan and Compliance Tools

    DTIC Science & Technology

    1996-06-01

    [Standard report documentation form residue removed. Recoverable contents: report dated June 1996; sections include "...Achievable Control Technology"; "Section Two: Model Shipyard Implementation Plan"; "Section Three: Thinning Ratio Calculation Sheets for Options 2 & 3 and..."; "Interpretation of the Shipyard Maximum Achievable Control Technology"; and "EPA's Maximum Achievable Control Technology Rule for Shipyards: A Plain English..."]

  9. Municipal resilience: A paradigm shift in emergency and continuity management.

    PubMed

    Solecki, Greg; Luchia, Mike

    More than a decade of emergency and continuity management vision was instrumental in providing the unprecedented level of response to, and recovery from, the great flood of 2013. Earlier assessments, planning and validation promulgated the development of corporate continuity, emergency and contingency plans, along with tactical, strategic and recovery operations centres, all of which led to a reliable emergency management model that will continue to provide the backbone for municipal resilience.

  10. Why Do College Students Cheat? A Structural Equation Modeling Validation of the Theory of Planned Behavior

    ERIC Educational Resources Information Center

    AL-Dossary, Saeed Abdullah

    2017-01-01

    Cheating on tests is a serious problem in education. The purpose of this study was to test the efficacy of a modified form of the theory of planned behavior (TPB) to predict cheating behavior among a sample of Saudi university students. This study also sought to test the influence of cheating in high school on cheating in college within the…

  11. Integration of second cancer risk calculations in a radiotherapy treatment planning system

    NASA Astrophysics Data System (ADS)

    Hartmann, M.; Schneider, U.

    2014-03-01

    Second cancer risk in patients, in particular in children, who were treated with radiotherapy is an important side effect. It should be minimized by selecting an appropriate treatment plan for the patient. The objectives of this study were to integrate a risk model for radiation-induced cancer into a treatment planning system, which allows different treatment plans to be judged with regard to second cancer induction, and to quantify the potential reduction in predicted risk. A model for radiation-induced cancer including fractionation effects, which is valid for doses in the radiotherapy range, was integrated into a treatment planning system. From the three-dimensional (3D) dose distribution, the 3D risk equivalent dose (RED) was calculated on an organ-specific basis. In addition to RED, further risk coefficients like OED (organ equivalent dose), EAR (excess absolute risk) and LAR (lifetime attributable risk) are computed. A risk model for radiation-induced cancer was successfully integrated in a treatment planning system. Several risk coefficients can be viewed and used to identify critical situations where a plan can be optimised. Risk-volume histograms and organ-specific risks were calculated for different treatment plans and were used in combination with NTCP estimates for plan evaluation. It is concluded that the integration of second cancer risk estimates in a commercial treatment planning system is feasible. It can be used in addition to NTCP modelling for optimising treatment plans toward the lowest possible second cancer risk for a patient.
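
    Of the risk coefficients named above, OED is the simplest to state: it is the volume-weighted average of an organ-specific dose-response function applied to the 3D dose distribution. The sketch below shows linear and plateau response variants in the style of Schneider-type models; the parameter value and DVH data are illustrative assumptions, not the paper's.

    ```python
    import numpy as np

    def oed(dose_gy, volume_fraction, model="linear", delta=0.18):
        """Organ equivalent dose (sketch). For a linear dose-response, OED
        reduces to the volume-weighted mean dose; the plateau variant uses
        RED(d) = (1 - exp(-delta*d)) / delta. delta here is illustrative."""
        d = np.asarray(dose_gy, dtype=float)
        v = np.asarray(volume_fraction, dtype=float)
        v = v / v.sum()                         # normalize volume weights
        if model == "linear":
            red = d
        elif model == "plateau":
            red = (1.0 - np.exp(-delta * d)) / delta
        else:
            raise ValueError(model)
        return float(np.sum(v * red))

    # Illustrative differential DVH: dose bins (Gy) and volume fractions
    d = np.array([2.0, 10.0, 25.0, 40.0])
    v = np.array([0.4, 0.3, 0.2, 0.1])
    print(oed(d, v, "linear"), oed(d, v, "plateau"))
    ```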

  12. Analytical Plan for Roman Glasses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Strachan, Denis M.; Buck, Edgar C.; Mueller, Karl T.

    Roman glasses that have been in the sea or underground for about 1800 years can serve as the independent “experiment” that is needed for validation of codes and models that are used in performance assessment. Two sets of Roman-era glasses have been obtained for this purpose. One set comes from the sunken vessel the Iulia Felix; the second from recently excavated glasses from a Roman villa in Aquileia, Italy. The specimens contain glass artifacts and attached sediment or soil. In the case of the Iulia Felix glasses quite a lot of analytical work has been completed at the University of Padova, but from an archaeological perspective. The glasses from Aquileia have not been so carefully analyzed, but they are similar to other Roman glasses. Both glass and sediment or soil need to be analyzed and are the subject of this analytical plan. The glasses need to be analyzed with the goal of validating the model used to describe glass dissolution. The sediment and soil need to be analyzed to determine the profile of elements released from the glass. This latter need represents a significant analytical challenge because of the trace quantities that need to be analyzed. Both pieces of information will yield important information useful in the validation of the glass dissolution model and the chemical transport code(s) used to determine the migration of elements once released from the glass. In this plan, we outline the analytical techniques that should be useful in obtaining the needed information and suggest a useful starting point for this analytical effort.

  13. Standards and Guidelines for Numerical Models for Tsunami Hazard Mitigation

    NASA Astrophysics Data System (ADS)

    Titov, V.; Gonzalez, F.; Kanoglu, U.; Yalciner, A.; Synolakis, C. E.

    2006-12-01

    An increased number of nations around the world need to develop tsunami mitigation plans, which invariably involve inundation maps for warning guidance and evacuation planning. There is the risk that inundation maps may be produced with older or untested methodology, as there are currently no standards for modeling tools. In the aftermath of the 2004 megatsunami, some models were used to model inundation for Cascadia events with results much larger than sediment records and existing state-of-the-art studies suggest, leading to confusion among emergency managers. Incorrectly assessing tsunami impact is hazardous, as recent events in 2006 in Tonga, Kythira, Greece and Central Java have suggested (Synolakis and Bernard, 2006). To calculate tsunami currents, forces and runup on coastal structures, and inundation of coastlines, one must calculate the evolution of the tsunami wave from the deep ocean to its target site, numerically. No matter what the numerical model, validation (the process of ensuring that the model solves the parent equations of motion accurately) and verification (the process of ensuring that the model used represents geophysical reality appropriately) are both essential. Validation ensures that the model performs well in a wide range of circumstances and is accomplished through comparison with analytical solutions. Verification ensures that the computational code performs well over a range of geophysical problems. A few analytic solutions have been validated themselves with laboratory data. Even fewer existing numerical models have been both validated with the analytical solutions and verified with both laboratory measurements and field measurements, thus establishing a gold standard for numerical codes for inundation mapping. While there is in principle no absolute certainty that a numerical code that has performed well in all the benchmark tests will also produce correct inundation predictions with any given source motions, validated codes reduce the level of uncertainty in their results to the uncertainty in the geophysical initial conditions. Further, when coupled with real-time free-field tsunami measurements from tsunameters, validated codes are the only choice for realistic forecasting of inundation; the consequences of failure are too ghastly to take chances with numerical procedures that have not been validated. We discuss a ten-step process of benchmark tests for models used for inundation mapping. The associated methodology and algorithms must first be validated with analytical solutions, then verified with laboratory measurements and field data. The models need to be published in the scientific literature in peer-reviewed journals indexed by ISI. While this process may appear onerous, it reflects our state of knowledge, and is the only defensible methodology when human lives are at stake. Synolakis, C.E., and Bernard, E.N., Tsunami science before and beyond Boxing Day 2004, Phil. Trans. R. Soc. A 364(1845), 2231-2263, 2005.

  14. Royal London space analysis: plaster versus digital model assessment.

    PubMed

    Grewal, Balpreet; Lee, Robert T; Zou, Lifong; Johal, Ama

    2017-06-01

    With the advent of digital study models, the importance of being able to evaluate space requirements becomes valuable to treatment planning and the justification for any required extraction pattern. This study was undertaken to compare the validity and reliability of the Royal London space analysis (RLSA) undertaken on plaster as compared with digital models. A pilot study (n = 5) was undertaken on plaster and digital models to evaluate the feasibility of digital space planning. This also helped to determine the sample size calculation and as a result, 30 sets of study models with specified inclusion criteria were selected. All five components of the RLSA, namely: crowding; depth of occlusal curve; arch expansion/contraction; incisor antero-posterior advancement and inclination (assessed from the pre-treatment lateral cephalogram) were accounted for in relation to both model types. The plaster models served as the gold standard. Intra-operator measurement error (reliability) was evaluated along with a direct comparison of the measured digital values (validity) with the plaster models. The measurement error or coefficient of repeatability was comparable for plaster and digital space analyses and ranged from 0.66 to 0.95 mm. No difference was found between the space analysis performed in either the upper or lower dental arch. Hence, the null hypothesis was accepted. The digital model measurements were consistently larger, albeit by a relatively small amount, than the plaster models (0.35 mm upper arch and 0.32 mm lower arch). No difference was detected in the RLSA when performed using either plaster or digital models. Thus, digital space analysis provides a valid and reproducible alternative method in the new era of digital records. © The Author 2016. Published by Oxford University Press on behalf of the European Orthodontic Society. All rights reserved. For permissions, please email: journals.permissions@oup.com

  15. Implementation of a multi-variable regression analysis in the assessment of the generation rate and composition of hospital solid waste for the design of a sustainable management system in developing countries.

    PubMed

    Al-Khatib, Issam A; Abu Fkhidah, Ismail; Khatib, Jumana I; Kontogianni, Stamatia

    2016-03-01

    Forecasting of hospital solid waste generation is a critical challenge for future planning. The proposed methodology was applied to the composition and generation rate of hospital solid waste in hospital units in order to validate the results and secure the outcomes of a management plan for national hospitals. A set of three multiple-variable regression models has been derived for estimating the daily total hospital waste, general hospital waste, and total hazardous waste as functions of the number of inpatients, number of total patients, and number of beds. The application of several key indicators and validation procedures indicates the high significance and reliability of the developed models in predicting the hospital solid waste of any hospital. Methodology data were drawn from the existing scientific literature, and useful raw data were retrieved from international organisations and the investigated hospitals' personnel. The generation outcomes are compared with those of other local hospitals and with hospitals from other countries. The main outcome, the developed model results, is presented and analysed thoroughly. The goal is for this model to act as leverage in the discussions among governmental authorities on the implementation of a national plan for safe hospital waste management in Palestine. © The Author(s) 2016.
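
    As a concrete illustration of the modeling step described above, the sketch below fits a single multiple-variable regression for daily total waste by ordinary least squares; the predictor columns mirror the three covariates named in the abstract, but all numbers are fabricated.

    ```python
    import numpy as np

    # Fabricated data: columns are inpatients, total patients, beds
    X = np.array([[120, 300, 150],
                  [ 80, 210, 100],
                  [200, 520, 260],
                  [150, 400, 190],
                  [ 60, 150,  80]], dtype=float)
    y = np.array([310.0, 205.0, 540.0, 400.0, 150.0])   # total waste, kg/day

    A = np.column_stack([np.ones(len(X)), X])           # add intercept column
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)        # least-squares fit
    pred = A @ coef
    r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
    print("coefficients:", np.round(coef, 3), "R^2:", round(r2, 3))
    ```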

  16. A novel heuristic for optimization aggregate production problem: Evidence from flat panel display in Malaysia

    NASA Astrophysics Data System (ADS)

    Al-Kuhali, K.; Hussain M., I.; Zain Z., M.; Mullenix, P.

    2015-05-01

    Aim: This paper contributes to the flat panel display industry in terms of aggregate production planning. Methodology: To minimize the total production cost of LCD manufacturing, linear programming was applied. The decision variables are general production costs, additional costs incurred for overtime production, additional costs incurred for subcontracting, inventory carrying costs, backorder costs, and adjustments for changes incurred within labour levels. The model was developed for a manufacturer having several product types, up to a maximum of N, over a total time period of T. Results: An industrial case study based in Malaysia is presented to test and validate the developed linear programming model for aggregate production planning. Conclusion: The model is suitable under stable environmental conditions. Overall, the proven linear programming model can be recommended for adaptation to production planning in the Malaysian flat panel display industry.
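
    A minimal version of such an aggregate production planning LP is sketched below for a single product type over three periods, with regular production, overtime, and inventory as decision variables linked by an inventory-balance constraint. All costs, capacities, and demands are illustrative assumptions, not the paper's data.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    T = 3
    demand = [100.0, 140.0, 120.0]
    c_reg, c_ot, c_inv = 10.0, 15.0, 2.0        # unit costs (assumed)
    cap_reg, cap_ot = 110.0, 30.0               # per-period capacities (assumed)

    # Variable order: [reg_1..reg_T, ot_1..ot_T, inv_1..inv_T]
    c = [c_reg] * T + [c_ot] * T + [c_inv] * T

    # Inventory balance: reg_t + ot_t + inv_{t-1} - inv_t = demand_t
    A_eq, b_eq = [], []
    for t in range(T):
        row = [0.0] * (3 * T)
        row[t] = 1.0               # reg_t
        row[T + t] = 1.0           # ot_t
        row[2 * T + t] = -1.0      # inv_t
        if t > 0:
            row[2 * T + t - 1] = 1.0   # inv_{t-1}
        A_eq.append(row)
        b_eq.append(demand[t])

    bounds = [(0, cap_reg)] * T + [(0, cap_ot)] * T + [(0, None)] * T
    res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
    print(res.status, np.round(res.x, 1), round(res.fun, 1))
    ```

    Extending this sketch to N product types means replicating the variable blocks per product and adding shared-capacity and labour-level constraints, which is the structure the abstract describes.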

  17. Building a national model of public mental health preparedness and community resilience: validation of a dual-intervention, systems-based approach.

    PubMed

    McCabe, O Lee; Semon, Natalie L; Thompson, Carol B; Lating, Jeffrey M; Everly, George S; Perry, Charlene J; Moore, Suzanne Straub; Mosley, Adrian M; Links, Jonathan M

    2014-12-01

    Working within a series of partnerships among an academic health center, local health departments (LHDs), and faith-based organizations (FBOs), we validated companion interventions to address community mental health planning and response challenges in public health emergency preparedness. We implemented the project within the framework of an enhanced logic model and employed a multi-cohort, pre-test/post-test design to assess the outcomes of 1-day workshops in psychological first aid (PFA) and guided preparedness planning (GPP). The workshops were delivered to urban and rural communities in eastern and midwestern regions of the United States. Intervention effectiveness was based on changes in relevant knowledge, skills, and attitudes (KSAs) and on several behavioral indexes. Significant improvements were observed in self-reported and objectively measured KSAs across all cohorts. Additionally, GPP teams proved capable of producing quality drafts of basic community disaster plans in 1 day, and PFA trainees confirmed upon follow-up that their training proved useful in real-world trauma contexts. We documented examples of policy and practice changes at the levels of local and state health departments. Given appropriate guidance, LHDs and FBOs can implement an effective and potentially scalable model for promoting disaster mental health preparedness and community resilience, with implications for positive translational impact.

  18. Development of an audit instrument for nursing care plans in the patient record

    PubMed Central

    Bjorvell, C; Thorell-Ekstrand, I; Wredling, R

    2000-01-01

    Objectives—To develop, validate, and test the reliability of an audit instrument that measures the extent to which patient records describe important aspects of nursing care. Material—Twenty records from each of three hospital wards were collected and audited. The auditors were registered nurses with a knowledge of nursing documentation in accordance with the VIPS model—a model designed to structure nursing documentation. (VIPS is an acronym formed from the Swedish words for wellbeing, integrity, prevention, and security.) Methods—An audit instrument was developed by determining specific criteria to be met. The audit questions were aimed at revealing the content of the patient record regarding nursing assessment, nursing diagnosis, planned interventions, and outcome. Each of the 60 records was reviewed by the three auditors independently and the reliability of the instrument was tested by calculating the inter-rater reliability coefficient. Content validity was tested by using an expert panel and calculating the content validity ratio. The criterion-related validity was estimated by the correlation between the score of the Cat-ch-Ing instrument and the score of an earlier developed and used audit instrument. The results were then tested by using Pearson's correlation coefficient. Results—The new audit instrument, named Cat-ch-Ing, consists of 17 questions designed to judge the nursing documentation. Both quantity and quality variables are judged on a rating scale from zero to three, with a maximum score of 80. The inter-rater reliability coefficients were 0.98, 0.98, and 0.92, respectively for each group of 20 records, the content validity ratio ranged between 0.20 and 1.0 and the criterion-related validity showed a significant correlation of r = 0.68 (p< 0.0001, 95% CI 0.57 to 0.76) between the two audit instruments. Conclusion—The Cat-ch-Ing instrument has proved to be a valid and reliable audit instrument for nursing records when the VIPS model is used as the basis of the documentation. (Quality in Health Care 2000;9:6–13) Key Words: audit instrument; nursing care plans; quality assurance PMID:10848373
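
    The content validity ratio reported above is conventionally Lawshe's CVR, computed per item from how many panelists rate it essential. A minimal sketch, with hypothetical panel counts:

    ```python
    def content_validity_ratio(n_essential, n_panelists):
        """Lawshe's CVR = (n_e - N/2) / (N/2), where n_e of N panelists
        rate an item 'essential'. Ranges from -1 (none) to +1 (all)."""
        half = n_panelists / 2.0
        return (n_essential - half) / half

    # Hypothetical panel of 10 experts rating three audit questions
    for n_e in (10, 8, 6):
        print(n_e, content_validity_ratio(n_e, 10))
    ```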

  19. The PLAN score: a bedside prediction rule for death and severe disability following acute ischemic stroke.

    PubMed

    O'Donnell, Martin J; Fang, Jiming; D'Uva, Cami; Saposnik, Gustavo; Gould, Linda; McGrath, Emer; Kapral, Moira K

    2012-11-12

    We sought to develop and validate a simple clinical prediction rule for death and severe disability after acute ischemic stroke that can be used by general clinicians at the time of hospital admission. We analyzed data from a registry of 9847 patients (4943 in the derivation cohort and 4904 in the validation cohort) hospitalized with acute ischemic stroke and included in the Registry of the Canadian Stroke Network (July 1, 2003, to March 31, 2008; 11 regional stroke centers in Ontario, Canada). Outcome measures were 30-day and 1-year mortality and a modified Rankin score of 5 to 6 at discharge. Overall 30-day mortality was 11.5% (derivation cohort) and 13.5% (validation cohort). In the final multivariate model, we included 9 clinical variables that could be categorized as preadmission comorbidities (5 points for preadmission dependence [1.5], cancer [1.5], congestive heart failure [1.0], and atrial fibrillation [1.0]), level of consciousness (5 points for reduced level of consciousness), age (10 points, 1 point/decade), and neurologic focal deficit (5 points for significant/total weakness of the leg [2], weakness of the arm [2], and aphasia or neglect [1]). Maximum score is 25. In the validation cohort, the PLAN score (derived from preadmission comorbidities, level of consciousness, age, and neurologic deficit) predicted 30-day mortality (C statistic, 0.87), death or severe dependence at discharge (0.88), and 1-year mortality (0.84). The PLAN score also predicted favorable outcome (modified Rankin score, 0-2) at discharge (C statistic, 0.80). The PLAN clinical prediction rule identifies patients who will have a poor outcome after hospitalization for acute ischemic stroke. The score comprises clinical data available at the time of admission and may be determined by nonspecialist clinicians. Additional studies to independently validate the PLAN rule in different populations and settings are required.
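
    Because the abstract spells out every point value, the PLAN score can be written directly as a small scoring function. The sketch below uses the published weights (maximum 25 points); the variable names and the example patient are mine, and the age term assumes 1 point per decade capped at 10.

    ```python
    def plan_score(age_years, preadmission_dependence, cancer, chf, afib,
                   reduced_consciousness, leg_weakness, arm_weakness,
                   aphasia_or_neglect):
        """PLAN score sketch from the published point values (max 25):
        comorbidities up to 5, consciousness 5, age up to 10, deficits up to 5.
        Binary inputs are 0 or 1."""
        score = 0.0
        score += 1.5 * preadmission_dependence + 1.5 * cancer
        score += 1.0 * chf + 1.0 * afib                   # comorbidities (max 5)
        score += 5.0 * reduced_consciousness              # level of consciousness
        score += min(age_years // 10, 10)                 # 1 point/decade, max 10
        score += 2.0 * leg_weakness + 2.0 * arm_weakness  # focal deficit (max 5)
        score += 1.0 * aphasia_or_neglect
        return score

    # Example: 78-year-old with atrial fibrillation, reduced consciousness,
    # and significant leg weakness -> 1 + 5 + 7 + 2 = 15
    print(plan_score(78, 0, 0, 0, 1, 1, 1, 0, 0))
    ```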

  20. The Role of Integrated Modeling in the Design and Verification of the James Webb Space Telescope

    NASA Technical Reports Server (NTRS)

    Mosier, Gary E.; Howard, Joseph M.; Johnston, John D.; Parrish, Keith A.; Hyde, T. Tupper; McGinnis, Mark A.; Bluth, Marcel; Kim, Kevin; Ha, Kong Q.

    2004-01-01

    The James Webb Space Telescope (JWST) is a large, infrared-optimized space telescope scheduled for launch in 2011. System-level verification of critical optical performance requirements will rely on integrated modeling to a considerable degree. In turn, requirements for accuracy of the models are significant. The size of the lightweight observatory structure, coupled with the need to test at cryogenic temperatures, effectively precludes validation of the models and verification of optical performance with a single test in 1-g. Rather, a complex series of steps is planned by which the components of the end-to-end models are validated at various levels of subassembly, and the ultimate verification of optical performance is by analysis using the assembled models. This paper describes the critical optical performance requirements driving the integrated modeling activity, shows how the error budget is used to allocate and track contributions to total performance, and presents examples of integrated modeling methods and results that support the preliminary observatory design. Finally, the concepts for model validation and the role of integrated modeling in the ultimate verification of the observatory are described.

  1. Active distribution network planning considering linearized system loss

    NASA Astrophysics Data System (ADS)

    Li, Xiao; Wang, Mingqiang; Xu, Hao

    2018-02-01

    In this paper, various distribution network planning techniques with DGs are reviewed, and a new distribution network planning method is proposed. It assumes that the locations of DGs and the topology of the network are fixed. The proposed model optimizes the DG capacities and the optimal distribution line capacities simultaneously through a cost/benefit analysis, where the benefit is quantified by the reduction of the expected interruption cost. In addition, the network loss is explicitly analyzed in the paper. For simplicity, the network loss is approximated as a quadratic function of the difference in voltage phase angles and is then piecewise linearized; a piecewise linearization technique with different segment lengths is proposed. To validate its effectiveness and superiority, the proposed distribution network planning model with the elaborated linearization technique is tested on the IEEE 33-bus distribution network system.
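
    To make the linearization step concrete, the sketch below piecewise-linearizes a quadratic loss term f(x) = k·x² with non-uniform segment lengths (finer near zero, where relative curvature error is largest). The breakpoints and coefficient are illustrative assumptions, not the paper's values.

    ```python
    import numpy as np

    def segment_slopes(k, breakpoints):
        """Secant slopes of f(x) = k*x^2 between consecutive breakpoints."""
        bp = np.asarray(breakpoints, dtype=float)
        f = k * bp**2
        return (f[1:] - f[:-1]) / (bp[1:] - bp[:-1])

    def evaluate(x, k, breakpoints):
        """Evaluate the piecewise-linear approximation at x by filling
        each segment up to its length in order."""
        bp = np.asarray(breakpoints, dtype=float)
        slopes = segment_slopes(k, bp)
        fills = np.clip(x - bp[:-1], 0.0, np.diff(bp))   # segment fill lengths
        return float(k * bp[0]**2 + np.sum(slopes * fills))

    k = 0.8
    bp = [0.0, 0.02, 0.05, 0.10, 0.20, 0.40]   # phase-angle differences (rad)
    x = 0.17
    print(evaluate(x, k, bp), "vs exact", k * x**2)
    ```

    Because f is convex, the secant slopes increase with x, so in a cost-minimizing LP the segments fill in order without needing binary variables; that property is what makes this kind of linearization attractive in planning models.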

  2. An in vivo dose verification method for SBRT–VMAT delivery using the EPID

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCowan, P. M., E-mail: peter.mccowan@cancercare.mb.ca; Medical Physics Department, CancerCare Manitoba, 675 McDermot Avenue, Winnipeg, Manitoba R3E 0V9; Van Uytven, E.

    2015-12-15

    Purpose: Radiation treatments have become increasingly more complex with the development of volumetric modulated arc therapy (VMAT) and the use of stereotactic body radiation therapy (SBRT). SBRT involves the delivery of substantially larger doses over fewer fractions than conventional therapy. SBRT–VMAT treatments will strongly benefit from in vivo patient dose verification, as any errors in delivery can be more detrimental to the radiobiology of the patient as compared to conventional therapy. Electronic portal imaging devices (EPIDs) are available on most commercial linear accelerators (Linacs) and their documented use for dosimetry makes them valuable tools for patient dose verification. In this work, the authors customize and validate a physics-based model which utilizes on-treatment EPID images to reconstruct the 3D dose delivered to the patient during SBRT–VMAT delivery. Methods: The SBRT Linac head, including jaws, multileaf collimators, and flattening filter, were modeled using Monte Carlo methods and verified with measured data. The simulation provides energy spectrum data that are used by their “forward” model to then accurately predict fluence generated by a SBRT beam at a plane above the patient. This fluence is then transported through the patient and then the dose to the phosphor layer in the EPID is calculated. Their “inverse” model back-projects the EPID measured focal fluence to a plane upstream of the patient and recombines it with the extra-focal fluence predicted by the forward model. This estimate of total delivered fluence is then forward projected onto the patient’s density matrix and a collapsed cone convolution algorithm calculates the dose delivered to the patient. The model was tested by reconstructing the dose for two prostate, three lung, and two spine SBRT–VMAT treatment fractions delivered to an anthropomorphic phantom. It was further validated against actual patient data for a lung and spine SBRT–VMAT plan. The results were verified with the treatment planning system (TPS) (ECLIPSE AAA) dose calculation. Results: The SBRT–VMAT reconstruction model performed very well when compared to the TPS. A stringent 2%/2 mm χ-comparison calculation gave pass rates better than 91% for the prostate plans, 88% for the lung plans, and 86% for the spine plans for voxels containing 80% or more of the prescribed dose. Patient data were 86% for the lung and 95% for the spine. A 3%/3 mm χ-comparison was also performed and gave pass rates better than 93% for all plan types. Conclusions: The authors have customized and validated a robust, physics-based model that calculates the delivered dose to a patient for SBRT–VMAT delivery using on-treatment EPID images. The accuracy of the results indicates that this approach is suitable for clinical implementation. Future work will incorporate this model into both offline and real-time clinical adaptive radiotherapy.

  3. Quantifying Unnecessary Normal Tissue Complication Risks due to Suboptimal Planning: A Secondary Study of RTOG 0126

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moore, Kevin L., E-mail: kevinmoore@ucsd.edu; Schmidt, Rachel; Moiseenko, Vitali

    Purpose: The purpose of this study was to quantify the frequency and clinical severity of quality deficiencies in intensity modulated radiation therapy (IMRT) planning in the Radiation Therapy Oncology Group 0126 protocol. Methods and Materials: A total of 219 IMRT patients from the high-dose arm (79.2 Gy) of RTOG 0126 were analyzed. To quantify plan quality, we used established knowledge-based methods for patient-specific dose-volume histogram (DVH) prediction of organs at risk and a Lyman-Kutcher-Burman (LKB) model for grade ≥2 rectal complications to convert DVHs into normal tissue complication probabilities (NTCPs). The LKB model was validated by fitting dose-response parameters relative to observed toxicities. The 10% of plans (22 of 219) with the lowest excess risk (difference between clinical and model-predicted NTCP) were used to create a model for the presumed best practices in the protocol (pDVH_0126,top10%). Applying the resultant model to the entire sample enabled comparisons between DVHs that patients could have received and DVHs they actually received. Excess risk quantified the clinical impact of suboptimal planning. Accuracy of pDVH predictions was validated by replanning 30 of 219 patients (13.7%), including equal numbers of presumed "high-quality," "low-quality," and randomly sampled plans. NTCP-predicted toxicities were compared to adverse events on protocol. Results: Existing models showed that bladder-sparing variations were less prevalent than rectum quality variations and that increased rectal sparing was not correlated with target metrics (dose received by 98% and 2% of the PTV, respectively). Observed toxicities were consistent with current LKB parameters. Converting DVH and pDVH_0126,top10% to rectal NTCPs, we observed 94 of 219 patients (42.9%) with ≥5% excess risk, 20 of 219 patients (9.1%) with ≥10% excess risk, and 2 of 219 patients (0.9%) with ≥15% excess risk. Replanning demonstrated the predicted NTCP reductions while maintaining the volume of the PTV receiving prescription dose. An equivalent sample of high-quality plans showed fewer toxicities than low-quality plans, 6 of 73 versus 10 of 73, respectively, although these differences were not significant (P=.21) due to insufficient statistical power in this retrospective study. Conclusions: Plan quality deficiencies in RTOG 0126 exposed patients to substantial excess risk for rectal complications.
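
    The LKB conversion of a DVH into an NTCP follows the standard formulation NTCP = Φ((gEUD − TD50)/(m·TD50)), with gEUD the generalized equivalent uniform dose; a minimal sketch is below. The DVH bins are hypothetical, and the parameter values are commonly cited literature estimates for rectum chosen for illustration, not necessarily those fitted in this study.

```python
import numpy as np
from scipy.stats import norm

def gEUD(doses, volumes, n):
    """Generalized equivalent uniform dose from a differential DVH."""
    v = np.asarray(volumes, dtype=float)
    v = v / v.sum()                              # normalize fractional volumes
    return np.sum(v * np.asarray(doses, dtype=float) ** (1.0 / n)) ** n

def lkb_ntcp(doses, volumes, TD50, m, n):
    """Lyman-Kutcher-Burman NTCP: Phi((gEUD - TD50) / (m * TD50))."""
    t = (gEUD(doses, volumes, n) - TD50) / (m * TD50)
    return norm.cdf(t)

# Hypothetical rectal DVH bins (Gy, fractional volume); illustrative parameters
doses = [20, 40, 60, 75]
vols = [0.40, 0.30, 0.20, 0.10]
print(f"NTCP = {lkb_ntcp(doses, vols, TD50=76.9, m=0.13, n=0.09):.3f}")
```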

  4. Short-term forecasting of emergency inpatient flow.

    PubMed

    Abraham, Gad; Byrnes, Graham B; Bain, Christopher A

    2009-05-01

    Hospital managers have to manage resources effectively while maintaining a high quality of care. For hospitals where admissions from the emergency department to the wards represent a large proportion of admissions, the ability to forecast these admissions and the resultant ward occupancy is especially useful for resource planning purposes. Since emergency admissions often compete with planned elective admissions, modeling emergency demand may result in improved elective planning as well. We compare several models for forecasting daily emergency inpatient admissions and occupancy. The models are applied to three years of daily data. By measuring their mean square error in a cross-validation framework, we find that emergency admissions are largely random and hence unpredictable, whereas emergency occupancy can be forecast for up to one week ahead using either a model combining regression with an autoregressive integrated moving average (ARIMA) component or a seasonal ARIMA model. Faced with variable admissions and occupancy, hospitals must prepare a reserve capacity of beds and staff. Our approach allows estimation of the required reserve capacity.
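
    A sketch of the forecasting setup the abstract describes: a seasonal ARIMA model with a weekly cycle, scored by rolling-origin cross-validated mean square error on one-week-ahead forecasts. The occupancy series is synthetic and the model orders are illustrative assumptions, not the paper's fitted specification.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Synthetic daily occupancy with a weekly cycle (stand-in for real ward data)
rng = np.random.default_rng(0)
days = pd.date_range("2022-01-01", periods=3 * 365, freq="D")
occ = 80 + 8 * np.sin(2 * np.pi * np.arange(len(days)) / 7) + rng.normal(0, 3, len(days))
y = pd.Series(occ, index=days)

# Rolling-origin cross-validation of one-week-ahead forecasts
horizon, errors = 7, []
for end in range(900, len(y) - horizon, 30):
    fit = SARIMAX(y[:end], order=(1, 0, 1), seasonal_order=(1, 0, 1, 7)).fit(disp=False)
    fc = fit.forecast(steps=horizon)
    errors.append(np.mean((fc.values - y[end:end + horizon].values) ** 2))
print(f"mean one-week-ahead MSE: {np.mean(errors):.2f}")
```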

  5. Planning, Enactment, and Reflection in Inquiry-Based Learning: Validating the McGill Strategic Demands of Inquiry Questionnaire

    ERIC Educational Resources Information Center

    Shore, Bruce M.; Chichekian, Tanya; Syer, Cassidy A.; Aulls, Mark W.; Frederiksen, Carl H.

    2012-01-01

    Tools are needed to track the elements of students' successful engagement in inquiry. The "McGill Strategic Demands of Inquiry Questionnaire" (MSDIQ) is a 79-item, criterion-referenced, learner-focused questionnaire anchored in Schön's model and related models of self-regulated learning. The MSDIQ addresses three phases of inquiry…

  6. Validation and Calibration of Nuclear Thermal Hydraulics Multiscale Multiphysics Models - Subcooled Flow Boiling Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anh Bui; Nam Dinh; Brian Williams

    In addition to a validation data plan, the development of advanced techniques for the calibration and validation of complex multiscale, multiphysics nuclear reactor simulation codes is a main objective of the CASL VUQ plan. Advanced modeling of LWR systems normally involves a range of physico-chemical models describing multiple interacting phenomena, such as thermal hydraulics, reactor physics, coolant chemistry, etc., which occur over a wide range of spatial and temporal scales. To a large extent, the accuracy of (and uncertainty in) overall model predictions is determined by the correctness of various sub-models, which are not based on conservation laws but are empirically derived from measurement data. Such sub-models normally require extensive calibration before the models can be applied to the analysis of real reactor problems. This work demonstrates a case study of calibration of a common model of subcooled flow boiling, which is an important multiscale, multiphysics phenomenon in LWR thermal hydraulics. The calibration process is based on a new strategy of model-data integration, in which all sub-models are simultaneously analyzed and calibrated using multiple sets of data of different types. Specifically, both data on large-scale distributions of void fraction and fluid temperature and data on small-scale physics of wall evaporation were used simultaneously in this work's calibration. In a departure from the traditional (or common-sense) practice of tuning/calibrating complex models, a modern calibration technique based on statistical modeling and Bayesian inference was employed, which allowed simultaneous calibration of multiple sub-models (and related parameters) using different datasets. Quality of data (relevancy, scalability, and uncertainty) could be taken into consideration in the calibration process. This work presents a step forward in the development and realization of the "CIPS Validation Data Plan" at the Consortium for Advanced Simulation of LWRs to enable quantitative assessment of the CASL modeling of the Crud-Induced Power Shift (CIPS) phenomenon, in particular, and the CASL advanced predictive capabilities, in general. This report is prepared for the Department of Energy's Consortium for Advanced Simulation of LWRs program's VUQ Focus Area.
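
    The Bayesian calibration idea can be illustrated at toy scale: below, a random-walk Metropolis sampler calibrates a single parameter of a hypothetical boiling-closure sub-model against synthetic data. The sub-model form, noise level, and prior are stand-ins, not CASL's actual models or calibration machinery.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy sub-model: predicted void fraction as a function of a wall-evaporation
# coefficient theta (hypothetical stand-in for an empirical boiling closure)
def sub_model(theta, q):
    return 1.0 - np.exp(-theta * q)

q_obs = np.linspace(0.1, 2.0, 15)
theta_true = 1.3
y_obs = sub_model(theta_true, q_obs) + rng.normal(0, 0.02, q_obs.size)  # noisy "data"

def log_post(theta, sigma=0.02):
    if theta <= 0:
        return -np.inf                      # flat prior on theta > 0
    resid = y_obs - sub_model(theta, q_obs)
    return -0.5 * np.sum((resid / sigma) ** 2)

# Random-walk Metropolis sampling of the posterior
theta, samples = 0.5, []
for _ in range(20000):
    prop = theta + rng.normal(0, 0.05)
    if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
        theta = prop
    samples.append(theta)
post = np.array(samples[5000:])             # discard burn-in
print(f"posterior mean {post.mean():.3f} +/- {post.std():.3f}")
```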

  7. Development and Validation of Linear Alternator Models for the Advanced Stirling Convertor

    NASA Technical Reports Server (NTRS)

    Metscher, Jonathan F.; Lewandowski, Edward

    2014-01-01

    Two models of the linear alternator of the Advanced Stirling Convertor (ASC) have been developed using the Sage 1-D modeling software package. The first model relates the piston motion to electric current by means of a motor constant. The second uses electromagnetic model components to model the magnetic circuit of the alternator. The models are tuned and validated using test data and compared against each other. Results show both models can be tuned to achieve results within 7% of ASC test data under normal operating conditions. Using Sage enables a complete ASC model to be developed and simulations completed quickly compared to more complex multi-dimensional models. These models allow for better insight into overall Stirling convertor performance, aid with Stirling power system modeling, and in the future support NASA mission planning for Stirling-based power systems.

  8. Development and Validation of Linear Alternator Models for the Advanced Stirling Convertor

    NASA Technical Reports Server (NTRS)

    Metscher, Jonathan F.; Lewandowski, Edward J.

    2015-01-01

    Two models of the linear alternator of the Advanced Stirling Convertor (ASC) have been developed using the Sage 1-D modeling software package. The first model relates the piston motion to electric current by means of a motor constant. The second uses electromagnetic model components to model the magnetic circuit of the alternator. The models are tuned and validated using test data and also compared against each other. Results show both models can be tuned to achieve results within 7% of ASC test data under normal operating conditions. Using Sage enables a complete ASC model to be developed and simulations completed quickly compared to more complex multi-dimensional models. These models allow for better insight into overall Stirling convertor performance, aid with Stirling power system modeling, and in the future support NASA mission planning for Stirling-based power systems.

  9. Medical talent management: a model for physician deployment.

    PubMed

    Brightman, Baird

    2007-01-01

    This article aims to provide a focused, cost-effective method for triaging physicians into appropriate non-clinical roles to benefit both doctors and healthcare organizations. It reviews a validated career-planning process and customizes it for medical talent management. A structured career assessment can differentiate between different physician work styles and direct medical talent into best-fit positions. This allows healthcare organizations to create a more finely tuned career ladder than the familiar "in or out" binary choice. Practical implications: healthcare organizations can invest in cost-effective processes for the optimal utilization of their medical talent. The article provides a new use for a well-validated career assessment and planning system. The actual value of this approach should be studied using best practices in ROI research.

  10. Landslide susceptibility modeling in a landslide-prone area in Mazandaran Province, north of Iran: a comparison between GLM, GAM, MARS, and M-AHP methods

    NASA Astrophysics Data System (ADS)

    Pourghasemi, Hamid Reza; Rossi, Mauro

    2017-10-01

    Landslides are identified as one of the most important natural hazards in many areas throughout the world. The essential purpose of this study is to compare general linear model (GLM), general additive model (GAM), multivariate adaptive regression spline (MARS), and modified analytical hierarchy process (M-AHP) models and to assess their performance for landslide susceptibility modeling in the west of Mazandaran Province, Iran. First, landslides were identified by interpreting aerial photographs and through extensive field work. In total, 153 landslides were identified in the study area. Among these, 105 landslides were randomly selected as training data (i.e., used to train the models) and the remaining 48 (30%) cases were used for validation. Afterward, based on a deep literature review of 220 scientific papers (period between 2005 and 2012), eleven conditioning factors including lithology, land use, distance from rivers, distance from roads, distance from faults, slope angle, slope aspect, altitude, topographic wetness index (TWI), plan curvature, and profile curvature were selected. The certainty factor (CF) model was used for managing uncertainty in rule-based systems and for evaluating the correlation between the dependent (landslides) and independent variables. Finally, the landslide susceptibility zonation was produced using the GLM, GAM, MARS, and M-AHP models. For evaluation of the models, the area under the curve (AUC) method was used, and both success and prediction rate curves were calculated. The AUC values for GLM, GAM, and MARS were 90.50%, 88.90%, and 82.10% for the training data and 77.52%, 70.49%, and 78.17% for the validation data, respectively. Furthermore, the landslide susceptibility map produced using M-AHP showed a training AUC of 77.82% and a validation AUC of 82.77%. Based on the overall assessments, the proposed approaches showed reasonable results for landslide susceptibility mapping in the study area. Moreover, the results showed that the M-AHP model performed slightly better in prediction than the MARS, GLM, and GAM models. These algorithms can be very useful for landslide susceptibility and hazard mapping and for land use planning at the regional scale.
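
    A compact sketch of the success-rate/prediction-rate evaluation: train a GLM analogue (logistic regression) on a roughly 70/30 split and report training and validation AUC. The conditioning-factor matrix and labels are synthetic; only the 105/48 split mirrors the study design.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Synthetic stand-ins for 11 conditioning factors (slope, distance to river, TWI, ...)
X = rng.normal(size=(153, 11))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 1, 153) > 0).astype(int)  # landslide yes/no

# Split mirroring the 105-training / 48-validation design
X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=48, random_state=0)

glm = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)   # GLM analogue
print(f"success rate (training AUC):    {roc_auc_score(y_tr, glm.predict_proba(X_tr)[:, 1]):.3f}")
print(f"prediction rate (validation AUC): {roc_auc_score(y_va, glm.predict_proba(X_va)[:, 1]):.3f}")
```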

  11. Applying the theory of planned behavior to self-report dental attendance in Norwegian adults through structural equation modelling approach.

    PubMed

    Åstrøm, Anne N; Lie, Stein Atle; Gülcan, Ferda

    2018-05-31

    Understanding the factors that affect dental attendance behavior helps in constructing effective oral health campaigns. A socio-cognitive model that adequately explains variance in regular dental attendance has yet to be validated among younger adults in Norway. Focusing on a representative sample of younger Norwegian adults, this cross-sectional study provided an empirical test of the Theory of Planned Behavior (TPB), augmented with descriptive norm and action planning, and estimated direct and indirect effects of attitudes, subjective norms, descriptive norms, perceived behavioral control, and action planning on intended and self-reported regular dental attendance. Self-administered questionnaires completed by 2551 25- to 35-year-olds, randomly selected from the Norwegian national population registry, were used to assess socio-demographic factors, dental attendance, and the constructs of the augmented TPB model (attitudes, subjective norms, descriptive norms, intention, action planning). A two-stage structural equation modelling (SEM) process was used to test the augmented TPB model. Confirmatory factor analysis (CFA) confirmed the proposed correlated 6-factor measurement model after re-specification. SEM revealed that attitudes, perceived behavioral control, subjective norms, and descriptive norms explained intention. The corresponding standardized regression coefficients were β = 0.70, β = 0.18, β = -0.17, and β = 0.11, respectively (p < 0.001). Intention (β = 0.46) predicted action planning, and action planning (β = 0.19) predicted dental attendance behavior (p < 0.001). The model revealed indirect effects of intention and of perceived behavioral control on behavior, through action planning and through intention and action planning, respectively. The final model explained 64% and 41% of the total variance in intention and dental attendance behavior, respectively. The findings support the utility of the TPB, the expanded normative component, and action planning in predicting younger adults' intended and self-reported dental attendance. Interventions targeting young adults' dental attendance might usefully focus on the positive consequences of this behavior, accompanied by modeling and group performance.

  12. Patient-specific dosimetric endpoints based treatment plan quality control in radiotherapy.

    PubMed

    Song, Ting; Staub, David; Chen, Mingli; Lu, Weiguo; Tian, Zhen; Jia, Xun; Li, Yongbao; Zhou, Linghong; Jiang, Steve B; Gu, Xuejun

    2015-11-07

    In intensity modulated radiotherapy (IMRT), the optimal plan for each patient is specific to that patient's unique anatomy. To achieve such a plan, patient-specific dosimetric goals reflecting each patient's anatomy should be defined and adopted in the treatment planning procedure for plan quality control. This study develops such a personalized treatment plan quality control tool by predicting patient-specific dosimetric endpoints (DEs). The incorporation of patient-specific DEs is realized by a multi-OAR geometry-dosimetry model capable of predicting optimal DEs based on the individual patient's geometry. The overall quality of a treatment plan is then judged with a numerical treatment plan quality indicator and characterized as optimal or suboptimal. Taking advantage of clinically available prostate volumetric modulated arc therapy (VMAT) treatment plans, we built and evaluated the proposed plan quality control tool. Using the developed tool, six of twenty evaluated plans were identified as suboptimal. After re-optimization, these suboptimal plans achieved better OAR dose sparing without sacrificing PTV coverage, and the dosimetric endpoints of the re-optimized plans agreed well with the model-predicted values, validating the predictive accuracy of the proposed tool. In conclusion, the developed tool is able to accurately predict optimally achievable DEs of multiple OARs, identify suboptimal plans, and guide plan optimization. It is a useful tool for achieving patient-specific treatment plan quality control.

  13. Development and Initial Validation of an Instrument for Human Capital Planning

    ERIC Educational Resources Information Center

    Zula, Kenneth J.; Chermack, Thomas J.

    2008-01-01

    This article reports on development and validation of an instrument for use in human capital approaches for organizational planning. The article describes use of a team of subject matter experts in developing a measure of human capital planning, and use of exploratory factor analysis techniques to validate the resulting instrument. These data were…

  14. Radioactive waste isolation in salt: special advisory report on the status of the Office of Nuclear Waste Isolation's plans for repository performance assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ditmars, J.D.; Walbridge, E.W.; Rote, D.M.

    1983-10-01

    Repository performance assessment is analysis that identifies events and processes that might affect a repository system for isolation of radioactive waste, examines their effects on barriers to waste migration, and estimates the probabilities of their occurrence and their consequences. In 1983 Battelle Memorial Institute's Office of Nuclear Waste Isolation (ONWI) prepared two plans - one for performance assessment for a waste repository in salt and one for verification and validation of performance assessment technology. At the request of the US Department of Energy's Salt Repository Project Office (SRPO), Argonne National Laboratory reviewed those plans and prepared this report to advise SRPO of specific areas where ONWI's plans for performance assessment might be improved. This report presents a framework for repository performance assessment that clearly identifies the relationships among the disposal problems, the processes underlying the problems, the tools for assessment (computer codes), and the data. In particular, the relationships among important processes and 26 model codes available to ONWI are indicated. A common suggestion for computer code verification and validation is the need for specific and unambiguous documentation of the results of performance assessment activities. A major portion of this report consists of status summaries of 27 model codes indicated as potentially useful by ONWI. The code summaries focus on three main areas: (1) the code's purpose, capabilities, and limitations; (2) status of the elements of documentation and review essential for code verification and validation; and (3) proposed application of the code for performance assessment of salt repository systems. 15 references, 6 figures, 4 tables.

  15. Measuring infrastructure: A key step in program evaluation and planning.

    PubMed

    Schmitt, Carol L; Glasgow, LaShawn; Lavinghouze, S Rene; Rieker, Patricia P; Fulmer, Erika; McAleer, Kelly; Rogers, Todd

    2016-06-01

    State tobacco prevention and control programs (TCPs) require a fully functioning infrastructure to respond effectively to the Surgeon General's call for accelerating the national reduction in tobacco use. The literature describes common elements of infrastructure; however, a lack of valid and reliable measures has made it difficult for program planners to monitor relevant infrastructure indicators and address observed deficiencies, or for evaluators to determine the association among infrastructure, program efforts, and program outcomes. The Component Model of Infrastructure (CMI) is a comprehensive, evidence-based framework that facilitates TCP program planning efforts to develop and maintain their infrastructure. Measures of CMI components were needed to evaluate the model's utility and predictive capability for assessing infrastructure. This paper describes the development of CMI measures and results of a pilot test with nine state TCP managers. Pilot test findings indicate that the tool has good face validity and is clear and easy to follow. The CMI tool yields data that can enhance public health efforts in a funding-constrained environment and provides insight into program sustainability. Ultimately, the CMI measurement tool could facilitate better evaluation and program planning across public health programs. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. Towards Personalized Cardiology: Multi-Scale Modeling of the Failing Heart

    PubMed Central

    Amr, Ali; Neumann, Dominik; Georgescu, Bogdan; Seegerer, Philipp; Kamen, Ali; Haas, Jan; Frese, Karen S.; Irawati, Maria; Wirsz, Emil; King, Vanessa; Buss, Sebastian; Mereles, Derliz; Zitron, Edgar; Keller, Andreas; Katus, Hugo A.; Comaniciu, Dorin; Meder, Benjamin

    2015-01-01

    Background: Despite modern pharmacotherapy and advanced implantable cardiac devices, the overall prognosis and quality of life of heart failure (HF) patients remain poor. This is in part due to insufficient patient stratification and lack of individualized therapy planning, resulting in less effective treatments and a significant number of non-responders. Methods and Results: State-of-the-art clinical phenotyping was acquired, including magnetic resonance imaging (MRI) and biomarker assessment. An individualized, multi-scale model of heart function covering cardiac anatomy, electrophysiology, biomechanics, and hemodynamics was estimated using a robust framework. The model was computed on n=46 HF patients, showing for the first time that advanced multi-scale models can be fitted consistently on large cohorts. Novel multi-scale parameters derived from the model of all cases were analyzed and compared against clinical parameters, cardiac imaging, lab tests, and survival scores to evaluate the explicative power of the model and its potential for better patient stratification. Model validation was pursued by comparing clinical parameters that were not used in the fitting process against model parameters. Conclusion: This paper illustrates how advanced multi-scale models can complement cardiovascular imaging and how they could be applied in patient care. Based on the obtained results, it becomes conceivable that, after thorough validation, such heart failure models could be applied for patient management and therapy planning in the future, as we illustrate in one patient of our cohort who received CRT-D implantation. PMID:26230546

  17. Optical dosimetry probes to validate Monte Carlo and empirical-method-based NIR dose planning in the brain.

    PubMed

    Verleker, Akshay Prabhu; Shaffer, Michael; Fang, Qianqian; Choi, Mi-Ran; Clare, Susan; Stantz, Keith M

    2016-12-01

    Three-dimensional photon dosimetry in tissues is critical for designing optical therapeutic protocols to trigger light-activated drug release. The objective of this study is to investigate the feasibility of a Monte Carlo-based optical therapy planning software by developing dosimetry tools to characterize and cross-validate the local photon fluence in brain tissue, as part of a long-term strategy to quantify the effects of photoactivated drug release in brain tumors. An existing GPU-based 3D Monte Carlo (MC) code was modified to simulate near-infrared photon transport with differing laser beam profiles within phantoms of skull bone (B), white matter (WM), and gray matter (GM). A novel titanium-based optical dosimetry probe with isotropic acceptance was used to validate the local photon fluence, and an empirical model of photon transport was developed to significantly decrease execution time for clinical application. Differences between the MC and the dosimetry probe measurements averaged 11.27%, 13.25%, and 11.81% along the illumination beam axis, and 9.4%, 12.06%, and 8.91% perpendicular to the beam axis for the WM, GM, and B phantoms, respectively. For a heterogeneous head phantom, the measured errors were 17.71% and 18.04% along and perpendicular to the beam axis. The empirical algorithm was validated by probe measurements and matched the MC results (R² ~ 0.99), with average errors of 10.1%, 45.2%, and 22.1% relative to probe measurements, and 22.6%, 35.8%, and 21.9% relative to the MC, for the WM, GM, and B phantoms, respectively. The simulation time for the empirical model was 6 s versus 8 h for the GPU-based Monte Carlo for a head phantom simulation. These tools provide the capability to develop and optimize treatment plans for optimal release of pharmaceuticals in the treatment of cancer. Future work will test and validate these novel delivery and release mechanisms in vivo.
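
    For intuition about the Monte Carlo side of the comparison, here is a deliberately crude 1D photon random walk that tallies a depth-resolved fluence surrogate in a homogeneous slab. The optical coefficients and geometry are illustrative guesses, and the real study used a GPU-based 3D code with realistic beam profiles.

```python
import numpy as np

rng = np.random.default_rng(7)

def mc_fluence_1d(mu_a, mu_s, depth_cm, n_photons=10_000, bins=50):
    """Crude 1D Monte Carlo photon transport with isotropic scattering.
    Returns the depth-resolved photon visit density (a fluence surrogate)."""
    mu_t = mu_a + mu_s
    edges = np.linspace(0, depth_cm, bins + 1)
    visits = np.zeros(bins)
    for _ in range(n_photons):
        z, dirz = 0.0, 1.0                            # photon enters along +z
        while True:
            z += dirz * rng.exponential(1.0 / mu_t)   # free path to next event
            if z < 0 or z > depth_cm:
                break                                 # photon escapes the slab
            visits[np.searchsorted(edges, z) - 1] += 1
            if rng.uniform() < mu_a / mu_t:
                break                                 # absorbed
            dirz = rng.uniform(-1, 1)                 # isotropic scatter (1D cosine)
    return visits / n_photons

# Illustrative NIR optical properties (not calibrated to brain tissue)
profile = mc_fluence_1d(mu_a=0.08, mu_s=10.0, depth_cm=1.0)
print(profile[:5])
```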

  18. 78 FR 17466 - Shipping Coordinating Committee; Notice of Committee Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-21

    ... of other IMO bodies --Validation of model training courses --Unlawful practices associated with certificates of competency --Casualty analysis --Development of an e-navigation strategy implementation plan... personnel involved with tug-barge operations --Revision of the Recommendations on training of personnel on...

  19. Fontan Surgical Planning: Previous Accomplishments, Current Challenges, and Future Directions.

    PubMed

    Trusty, Phillip M; Slesnick, Timothy C; Wei, Zhenglun Alan; Rossignac, Jarek; Kanter, Kirk R; Fogel, Mark A; Yoganathan, Ajit P

    2018-04-01

    The ultimate goal of Fontan surgical planning is to provide additional insights into the clinical decision-making process. In its current state, surgical planning offers an accurate hemodynamic assessment of the pre-operative condition, provides anatomical constraints for potential surgical options, and produces decent post-operative predictions if boundary conditions are similar enough between the pre-operative and post-operative states. Moving forward, validation with post-operative data is a necessary step in order to assess the accuracy of surgical planning and determine which methodological improvements are needed. Future efforts to automate the surgical planning process will reduce the individual expertise needed and encourage use in the clinic by clinicians. As post-operative physiologic predictions improve, Fontan surgical planning will become a more effective tool for accurately modeling patient-specific hemodynamics.

  20. Treatment Planning and Image Guidance for Radiofrequency Ablations of Large Tumors

    PubMed Central

    Ren, Hongliang; Campos-Nanez, Enrique; Yaniv, Ziv; Banovac, Filip; Abeledo, Hernan; Hata, Nobuhiko; Cleary, Kevin

    2014-01-01

    This article addresses the two key challenges in computer-assisted percutaneous tumor ablation: planning multiple overlapping ablations for large tumors while avoiding critical structures, and executing the prescribed plan. Towards semi-automatic treatment planning for image-guided surgical interventions, we develop a systematic approach to the needle-based ablation placement task, ranging from pre-operative planning algorithms to an intra-operative execution platform. The planning system incorporates clinical constraints on ablations and trajectories using a multiple objective optimization formulation, which consists of optimal path selection and ablation coverage optimization based on integer programming. The system implementation is presented and validated in phantom studies and on an animal model. The presented system can potentially be further extended for other ablation techniques such as cryotherapy. PMID:24235279
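
    The overlapping-ablation coverage problem described above can be approximated greedily, as sketched below for a spherical toy tumor. The paper formulates it as integer programming with trajectory and critical-structure constraints, so this greedy cover is only a simplified stand-in; the geometry and radii are hypothetical.

```python
import numpy as np
from itertools import product

# Voxelized toy tumor: all 1-mm grid points inside a 15-mm-radius sphere
grid = np.array(list(product(range(-20, 21), repeat=3)), dtype=float)
tumor = grid[np.linalg.norm(grid, axis=1) <= 15]

def greedy_cover(tumor_pts, candidates, radius=10.0):
    """Greedily add ablation centers until all tumor voxels are covered
    (or no candidate helps). A stand-in for the paper's integer program."""
    uncovered = np.ones(len(tumor_pts), dtype=bool)
    chosen = []
    while uncovered.any():
        gains = [np.sum(uncovered & (np.linalg.norm(tumor_pts - c, axis=1) <= radius))
                 for c in candidates]
        best = int(np.argmax(gains))
        if gains[best] == 0:
            break                       # remaining voxels unreachable from candidates
        chosen.append(candidates[best])
        uncovered &= np.linalg.norm(tumor_pts - candidates[best], axis=1) > radius
    return chosen

candidates = tumor[::125]               # coarse subsample of tumor voxels as centers
plan = greedy_cover(tumor, candidates)
print(f"{len(plan)} overlapping 10-mm ablations planned")
```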

  1. Key Questions in Building Defect Prediction Models in Practice

    NASA Astrophysics Data System (ADS)

    Ramler, Rudolf; Wolfmaier, Klaus; Stauder, Erwin; Kossak, Felix; Natschläger, Thomas

    The information about which modules of a future version of a software system are defect-prone is a valuable planning aid for quality managers and testers. Defect prediction promises to indicate these defect-prone modules. However, constructing effective defect prediction models in an industrial setting involves a number of key questions. In this paper we discuss ten key questions identified in the context of establishing defect prediction in a large software development project. Seven consecutive versions of the software system have been used to construct and validate defect prediction models for system test planning. Furthermore, the paper presents initial empirical results from the studied project and, by this means, contributes answers to the identified questions.

  2. Performance validation of the ANSER control laws for the F-18 HARV

    NASA Technical Reports Server (NTRS)

    Messina, Michael D.

    1995-01-01

    The ANSER control laws were implemented in Ada by NASA Dryden for flight test on the High Alpha Research Vehicle (HARV). The Ada implementation was tested in the hardware-in-the-loop (HIL) simulation, and results were compared to those obtained with the NASA Langley batch Fortran implementation of the control laws which are considered the 'truth model.' This report documents the performance validation test results between these implementations. This report contains the ANSER performance validation test plan, HIL versus batch time-history comparisons, simulation scripts used to generate checkcases, and detailed analysis of discrepancies discovered during testing.

  3. Performance validation of the ANSER Control Laws for the F-18 HARV

    NASA Technical Reports Server (NTRS)

    Messina, Michael D.

    1995-01-01

    The ANSER control laws were implemented in Ada by NASA Dryden for flight test on the High Alpha Research Vehicle (HARV). The Ada implementation was tested in the hardware-in-the-loop (HIL) simulation, and results were compared to those obtained with the NASA Langley batch Fortran implementation of the control laws which are considered the 'truth model'. This report documents the performance validation test results between these implementations. This report contains the ANSER performance validation test plan, HIL versus batch time-history comparisons, simulation scripts used to generate checkcases, and detailed analysis of discrepancies discovered during testing.

  4. Robot body self-modeling algorithm: a collision-free motion planning approach for humanoids.

    PubMed

    Leylavi Shoushtari, Ali

    2016-01-01

    Motion planning for humanoid robots is a critical issue due to their high redundancy and to theoretical and technical considerations such as stability, motion feasibility, and collision avoidance. The strategies the central nervous system employs to plan, signal, and control human movement are a source of inspiration for dealing with these problems. Self-modeling is a concept inspired by body self-awareness in humans. In this research it is integrated into an optimal motion planning framework in order to detect and avoid collision of the manipulated object with the humanoid's body while performing a dynamic task. Twelve parametric functions are designed as self-models to determine the boundary of the humanoid's body. The boundaries mathematically defined by the self-models are then employed to calculate the safe region in which the box can be moved without colliding with the robot. Four different objective functions are employed in motion simulation to validate the robustness of the algorithm under different dynamics. The results also confirm the collision avoidance, realism, and stability of the predicted motion.

  5. SU-D-BRB-01: A Predictive Planning Tool for Stereotactic Radiosurgery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Palefsky, S; Roper, J; Elder, E

    Purpose: To demonstrate the feasibility of a predictive planning tool which provides SRS planning guidance based on simple patient anatomical properties: PTV size, PTV shape, and distance from critical structures. Methods: Ten framed SRS cases treated at the Winship Cancer Institute of Emory University were analyzed to extract data on PTV size, sphericity (shape), and distance from critical structures such as the brainstem and optic chiasm. The cases consisted of five pairs, each comprising two cases with a similar diagnosis (such as pituitary adenoma or arteriovenous malformation) that were treated with different techniques: DCA or IMRS. A Naive Bayes classifier was trained on these data to establish the conditions under which each treatment modality was used. The model was validated by classifying ten other randomly selected cases into DCA or IMRS classes, calculating the probability of each technique, and comparing the results to the technique actually used for treatment. Results: Of the ten cases used to validate the model, nine had their technique predicted correctly. The three cases treated with IMRS were all identified as such. Their probabilities of being treated with IMRS ranged between 59% and 100%. Six of the seven cases treated with DCA were correctly classified. These probabilities ranged between 51% and 95%. One case treated with DCA was incorrectly predicted to be an IMRS plan. The model's confidence in this case was 91%. Conclusion: These findings indicate that a predictive planning tool based on simple patient anatomical properties can predict the SRS technique used for treatment. The algorithm operated with 90% accuracy. With further validation on larger patient populations, this tool may be used clinically to guide planners in choosing an appropriate treatment technique. The prediction algorithm could also be adapted to guide selection of treatment parameters such as treatment modality and number of fields for radiotherapy across anatomical sites.
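
    A sketch of the classifier described: a Gaussian Naive Bayes model over [PTV volume, sphericity, distance to critical structure]. The feature distributions for the DCA and IMRS classes are hypothetical assumptions for illustration, not the Emory training data.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(3)

# Hypothetical training features: [PTV volume (cc), sphericity, distance to
# nearest critical structure (mm)]; labels: 0 = DCA, 1 = IMRS
X_dca = np.column_stack([rng.uniform(1, 6, 25), rng.uniform(0.8, 1.0, 25),
                         rng.uniform(5, 30, 25)])
X_imrs = np.column_stack([rng.uniform(4, 20, 25), rng.uniform(0.4, 0.8, 25),
                          rng.uniform(0, 8, 25)])
X = np.vstack([X_dca, X_imrs])
y = np.array([0] * 25 + [1] * 25)

clf = GaussianNB().fit(X, y)

# A large, irregular target abutting a critical structure: expect IMRS
case = np.array([[12.0, 0.55, 2.0]])
print("predicted:", "IMRS" if clf.predict(case)[0] else "DCA",
      "| P(IMRS) =", f"{clf.predict_proba(case)[0, 1]:.2f}")
```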

  6. SU-G-TeP1-06: Fast GPU Framework for Four-Dimensional Monte Carlo in Adaptive Intensity Modulated Proton Therapy (IMPT) for Mobile Tumors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Botas, P; Heidelberg University, Heidelberg; Grassberger, C

    Purpose: To demonstrate the feasibility of fast Monte Carlo (MC) treatment planning and verification using four-dimensional CT (4DCT) for adaptive IMPT for lung cancer patients. Methods: A validated GPU MC code, gPMC, has been linked to the patient database at our institution and employed to compute the dose-influence matrices (Dij) on the planning CT (pCT). The pCT is an average over the respiratory motion of the patient. The Dijs and patient structures were fed to the optimizer to calculate a treatment plan. To validate the plan against motion, a 4D dose distribution averaged over the possible starting phases is calculated using the 4DCT and a model of the time structure of the delivered spot map. The dose is accumulated using vector maps created by a GPU-accelerated deformable image registration program (DIR) from each phase of the 4DCT to the reference phase using the B-spline method. Calculation of the Dij matrices and the DIR are performed on a cluster, with each field and vector map calculated in parallel. Results: The Dij production takes ~3.5 s per beamlet for 10e6 protons, depending on the energy and the CT size. Generating a plan with 4D simulation of 1000 spots in 4 fields takes approximately 1 h. To test the framework, IMPT plans for 10 lung cancer patients were generated for validation. Differences between the planned and the delivered dose of 19% in dose to some organs at risk and of 1.4%/21.1% in target mean dose/homogeneity were observed, suggesting potential for improvement if adaptation is considered. Conclusion: A fast MC treatment planning framework has been developed that allows reliable plan design and verification for mobile targets and adaptation of treatment plans. This will significantly impact treatments for lung tumors, as 4D-MC dose calculations can now become part of planning strategies.

  7. Modeling synchronous voltage source converters in transmission system planning studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kosterev, D.N.

    1997-04-01

    A Voltage Source Converter (VSC) can be beneficial to power utilities in many ways. To evaluate the VSC performance in potential applications, the device has to be represented appropriately in planning studies. This paper addresses VSC modeling for EMTP, powerflow, and transient stability studies. First, the VSC operating principles are overviewed, and the device model for EMTP studies is presented. The ratings of VSC components are discussed, and the device operating characteristics are derived based on these ratings. A powerflow model is presented and various control modes are proposed. A detailed stability model is developed, and its step-by-step initialization procedure is described. A simplified stability model is also derived under stated assumptions. Finally, validation studies are performed to demonstrate performance of the developed stability models and to compare it with EMTP simulations.

  8. A predictive score to identify hospitalized patients' risk of discharge to a post-acute care facility

    PubMed Central

    Louis Simonet, Martine; Kossovsky, Michel P; Chopard, Pierre; Sigaud, Philippe; Perneger, Thomas V; Gaspoz, Jean-Michel

    2008-01-01

    Background: Early identification of patients who need post-acute care (PAC) may improve discharge planning. The purposes of the study were to develop and validate a score predicting discharge to a post-acute care (PAC) facility and to determine its best assessment time. Methods: We conducted a prospective study including 349 (derivation cohort) and 161 (validation cohort) consecutive patients in a general internal medicine service of a teaching hospital. We developed logistic regression models predicting discharge to a PAC facility, based on patient variables measured on admission (day 1) and on day 3. The value of each model was assessed by its area under the receiver operating characteristics curve (AUC). A simple numerical score was derived from the best model and was validated in a separate cohort. Results: Prediction of discharge to a PAC facility was as accurate on day 1 (AUC: 0.81) as on day 3 (AUC: 0.82). The day-3 model was more parsimonious, with 5 variables: patient's partner unable to provide home help (4 pts); inability to self-manage drug regimen (4 pts); number of active medical problems on admission (1 pt per problem); dependency in bathing (4 pts); and dependency in transfers from bed to chair (4 pts) on day 3. A score ≥ 8 points predicted discharge to a PAC facility with a sensitivity of 87% and a specificity of 63%, and was significantly associated with inappropriate hospital days due to discharge delays. Internal and external validations confirmed these results. Conclusion: A simple score computed on the 3rd hospital day predicted discharge to a PAC facility with good accuracy. A score ≥ 8 points should prompt early discharge planning. PMID:18647410
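
    The published day-3 score translates directly into code; the sketch below implements the point weights and the ≥ 8 threshold exactly as stated in the abstract (the variable names are mine, not the paper's).

```python
def pac_score(partner_cannot_help, cannot_manage_drugs, n_active_problems,
              dependent_bathing, dependent_transfers):
    """Day-3 score from the abstract: 4 pts per deficit plus 1 pt per active
    medical problem on admission; >= 8 flags likely discharge to a PAC facility."""
    score = (4 * partner_cannot_help + 4 * cannot_manage_drugs
             + n_active_problems + 4 * dependent_bathing + 4 * dependent_transfers)
    return score, score >= 8

# Example: able partner, self-manages drugs, 3 active problems,
# needs help bathing, transfers independently -> 7 points, below threshold
score, flag = pac_score(0, 0, 3, 1, 0)
print(score, "-> early discharge planning" if flag else "-> routine planning")
```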

  9. Cross Validation of Selection of Variables in Multiple Regression.

    DTIC Science & Technology

    1979-12-01

    [Garbled OCR fragment: only scattered front-matter and table remnants of this report survive, including an introduction on long-term DoD planning goals, regression coefficient listings, and F-111D avionics equipment codes.]

  10. Dynamic Modeling and Soil Mechanics for Path Planning of the Mars Exploration Rovers

    NASA Technical Reports Server (NTRS)

    Trease, Brian

    2011-01-01

    To help minimize the risk of high sinkage and slippage during drives and to better understand soil properties and rover terramechanics from drive data, a multidisciplinary team was formed under the Mars Exploration Rover project to develop and utilize dynamic computer-based models for rover drives over realistic terrains. The resulting system, named ARTEMIS (Adams-based Rover Terramechanics and Mobility Interaction System), consists of the dynamic model, a library of terramechanics subroutines, and high-resolution digital elevation maps of the Mars surface. A 200-element model of the rovers was developed using Adams dynamic modeling software and validated against drop tests before launch. The external library was built in Fortran and is called by Adams to model the wheel-soil interactions, including the rut-formation effect of deformable soils, lateral and longitudinal forces, bulldozing effects, and applied wheel torque. The paper presents the details and implementation of the system. To validate the developed system, a case study is presented from a realistic drive of the Opportunity rover on Mars. The simulation results match well with the on-board telemetry measurements. In its final form, ARTEMIS will be used in a predictive manner to assess terrain navigability and will become part of the overall effort in path planning and navigation for both Martian and lunar rovers.
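
    Wheel-soil interaction models of this kind typically build on Bekker's pressure-sinkage relation p = (k_c/b + k_phi)·z^n; the sketch below inverts it for static sinkage. The soil parameters are textbook dry-sand values and the load/contact geometry is a rough guess, not ARTEMIS's calibrated terramechanics library.

```python
import numpy as np

def bekker_pressure(z_m, b_m, k_c, k_phi, n):
    """Classic Bekker pressure-sinkage relation: p = (k_c/b + k_phi) * z^n."""
    return (k_c / b_m + k_phi) * z_m ** n

def static_sinkage(load_N, contact_area_m2, b_m, k_c, k_phi, n):
    """Invert Bekker's relation for the sinkage under a uniform contact pressure."""
    p = load_N / contact_area_m2
    return (p / (k_c / b_m + k_phi)) ** (1.0 / n)

# Illustrative dry-sand parameters (k_c in N/m^(n+1), k_phi in N/m^(n+2))
k_c, k_phi, n = 990.0, 1528e3, 1.1
wheel_width = 0.16                      # m, roughly MER-class wheel width (assumed)
z = static_sinkage(load_N=180.0, contact_area_m2=0.16 * 0.05,
                   b_m=wheel_width, k_c=k_c, k_phi=k_phi, n=n)
print(f"predicted static sinkage: {1000 * z:.1f} mm")
```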

  11. Automatic Segmentation of the Eye in 3D Magnetic Resonance Imaging: A Novel Statistical Shape Model for Treatment Planning of Retinoblastoma

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ciller, Carlos, E-mail: carlos.cillerruiz@unil.ch; Ophthalmic Technology Group, ARTORG Center of the University of Bern, Bern; Centre d’Imagerie BioMédicale, University of Lausanne, Lausanne

    Purpose: Proper delineation of ocular anatomy in 3-dimensional (3D) imaging is a big challenge, particularly when developing treatment plans for ocular diseases. Magnetic resonance imaging (MRI) is presently used in clinical practice for diagnosis confirmation and treatment planning for treatment of retinoblastoma in infants, where it serves as a source of information, complementary to the fundus or ultrasonographic imaging. Here we present a framework to fully automatically segment the eye anatomy for MRI based on 3D active shape models (ASM), and we validate the results and present a proof of concept to automatically segment pathological eyes. Methods and Materials: Manual and automatic segmentation were performed in 24 images of healthy children's eyes (3.29 ± 2.15 years of age). Imaging was performed using a 3-T MRI scanner. The ASM consists of the lens, the vitreous humor, the sclera, and the cornea. The model was fitted by first automatically detecting the position of the eye center, the lens, and the optic nerve, and then aligning the model and fitting it to the patient. We validated our segmentation method by using a leave-one-out cross-validation. The segmentation results were evaluated by measuring the overlap, using the Dice similarity coefficient (DSC) and the mean distance error. Results: We obtained a DSC of 94.90 ± 2.12% for the sclera and the cornea, 94.72 ± 1.89% for the vitreous humor, and 85.16 ± 4.91% for the lens. The mean distance error was 0.26 ± 0.09 mm. The entire process took 14 seconds on average per eye. Conclusion: We provide a reliable and accurate tool that enables clinicians to automatically segment the sclera, the cornea, the vitreous humor, and the lens, using MRI. We additionally present a proof of concept for fully automatically segmenting eye pathology. This tool reduces the time needed for eye shape delineation and thus can help clinicians when planning eye treatment and confirming the extent of the tumor.

  12. Automatic Segmentation of the Eye in 3D Magnetic Resonance Imaging: A Novel Statistical Shape Model for Treatment Planning of Retinoblastoma.

    PubMed

    Ciller, Carlos; De Zanet, Sandro I; Rüegsegger, Michael B; Pica, Alessia; Sznitman, Raphael; Thiran, Jean-Philippe; Maeder, Philippe; Munier, Francis L; Kowal, Jens H; Cuadra, Meritxell Bach

    2015-07-15

    Proper delineation of ocular anatomy in 3-dimensional (3D) imaging is a big challenge, particularly when developing treatment plans for ocular diseases. Magnetic resonance imaging (MRI) is presently used in clinical practice for diagnosis confirmation and treatment planning for treatment of retinoblastoma in infants, where it serves as a source of information, complementary to the fundus or ultrasonographic imaging. Here we present a framework to fully automatically segment the eye anatomy for MRI based on 3D active shape models (ASM), and we validate the results and present a proof of concept to automatically segment pathological eyes. Manual and automatic segmentation were performed in 24 images of healthy children's eyes (3.29 ± 2.15 years of age). Imaging was performed using a 3-T MRI scanner. The ASM consists of the lens, the vitreous humor, the sclera, and the cornea. The model was fitted by first automatically detecting the position of the eye center, the lens, and the optic nerve, and then aligning the model and fitting it to the patient. We validated our segmentation method by using a leave-one-out cross-validation. The segmentation results were evaluated by measuring the overlap, using the Dice similarity coefficient (DSC) and the mean distance error. We obtained a DSC of 94.90 ± 2.12% for the sclera and the cornea, 94.72 ± 1.89% for the vitreous humor, and 85.16 ± 4.91% for the lens. The mean distance error was 0.26 ± 0.09 mm. The entire process took 14 seconds on average per eye. We provide a reliable and accurate tool that enables clinicians to automatically segment the sclera, the cornea, the vitreous humor, and the lens, using MRI. We additionally present a proof of concept for fully automatically segmenting eye pathology. This tool reduces the time needed for eye shape delineation and thus can help clinicians when planning eye treatment and confirming the extent of the tumor. Copyright © 2015 Elsevier Inc. All rights reserved.
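
    The Dice similarity coefficient used to evaluate the segmentation in both records above is straightforward to compute; a minimal sketch on two toy, slightly offset spherical masks (the grid and radii are arbitrary):

```python
import numpy as np

def dice(a, b):
    """Dice similarity coefficient between two binary masks."""
    a, b = a.astype(bool), b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

# Two toy spherical "lens" masks on a 3D grid, offset by one voxel
zz, yy, xx = np.mgrid[:40, :40, :40]
auto = (xx - 20)**2 + (yy - 20)**2 + (zz - 20)**2 <= 8**2
manual = (xx - 21)**2 + (yy - 20)**2 + (zz - 20)**2 <= 8**2
print(f"DSC = {100 * dice(auto, manual):.2f}%")
```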

  13. Joint Planning Of Energy Storage and Transmission Considering Wind-Storage Combined System and Demand Side Response

    NASA Astrophysics Data System (ADS)

    Huang, Y.; Liu, B. Z.; Wang, K. Y.; Ai, X.

    2017-12-01

    In response to the new requirements that the operation of wind-storage combined systems and demand-side response impose on transmission network planning, this paper presents a joint planning method for energy storage and transmission that considers both. First, the charge-discharge strategy of the energy storage system installed at the outlet of the wind farm and the demand-side response strategy are analysed, with the two coordinated to achieve the best overall benefit. Second, in a general transmission network planning model with wind power, both the energy storage cost and the demand-side response cost are added to the objective function, and both energy storage operation constraints and demand-side response constraints are introduced into the constraint set. Based on the classical formulation of TEP, a new formulation is developed that simultaneously incorporates the charge-discharge strategy of the energy storage system at the wind farm outlet and the demand-side response strategy; it is a typical mixed integer linear programming model that can be solved by mature optimization software. A case study based on the Garver 6-bus system shows the validity of the proposed model by comparison with a general transmission network planning model. Furthermore, the results across different cases demonstrate that the joint planning model can achieve greater economic benefit.

  14. NEAMS SOFTWARE V&V PLAN FOR THE MARMOT SOFTWARE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Michael R Tonks

    2014-03-01

    In order to ensure the accuracy and quality of the microstructure-based materials models being developed in conjunction with MARMOT simulations, MARMOT must undergo exhaustive verification and validation. Only after this process can we confidently rely on the MARMOT code to predict the microstructure evolution within the fuel. Therefore, in this report we lay out a V&V plan for the MARMOT code, highlighting where existing data could be used and where new data are required.

  15. CMC Research at NASA Glenn in 2015: Recent Progress and Plans

    NASA Technical Reports Server (NTRS)

    Grady, Joseph E.

    2015-01-01

    As part of NASA's Aeronautical Sciences project, Glenn Research Center has developed advanced fiber and matrix constituents for a 2700 °F CMC for turbine engine applications. Fiber and matrix development and characterization will be reviewed, and the resulting improvements in CMC mechanical properties and durability will be summarized. Plans for 2015 will be described, including development and validation of models predicting the effects of the engine environment on the durability of SiC/SiC composites with environmental barrier coatings.

  16. Active imaging system performance model for target acquisition

    NASA Astrophysics Data System (ADS)

    Espinola, Richard L.; Teaney, Brian; Nguyen, Quang; Jacobs, Eddie L.; Halford, Carl E.; Tofsted, David H.

    2007-04-01

    The U.S. Army RDECOM CERDEC Night Vision & Electronic Sensors Directorate has developed a laser-range-gated imaging system performance model for the detection, recognition, and identification of vehicle targets. The model is based on the established US Army RDECOM CERDEC NVESD sensor performance models of the human system response through an imaging system. The Java-based model, called NVLRG, accounts for the effect of active illumination, atmospheric attenuation, and turbulence effects relevant to LRG imagers, such as speckle and scintillation, and for the critical sensor and display components. This model can be used to assess the performance of recently proposed active SWIR systems through various trade studies. This paper will describe the NVLRG model in detail, discuss the validation of recent model components, present initial trade study results, and outline plans to validate and calibrate the end-to-end model with field data through human perception testing.

  17. A text-based data mining and toxicity prediction modeling system for a clinical decision support in radiation oncology: A preliminary study

    NASA Astrophysics Data System (ADS)

    Kim, Kwang Hyeon; Lee, Suk; Shim, Jang Bo; Chang, Kyung Hwan; Yang, Dae Sik; Yoon, Won Sup; Park, Young Je; Kim, Chul Yong; Cao, Yuan Jie

    2017-08-01

    This preliminary study presents an integrated text-based data mining and toxicity prediction modeling system for big-data clinical decision support in radiation oncology. The structured data were prepared from treatment plans, and the unstructured data were extracted from prostate cancer research articles crawled from the internet via image pattern recognition of dose-volume data. We modeled an artificial neural network to build a predictor system for toxicity of organs at risk, using a text-based data mining approach to construct the network model for bladder and rectum complication predictions. The pattern recognition method mined the unstructured dose-volume toxicity data with a detection accuracy of 97.9%. The confusion matrix and training model of the neural network were obtained with 50 modeled plans (n = 50) for validation. The toxicity level was analyzed, and the risk factors for 25% bladder, 50% bladder, 20% rectum, and 50% rectum were calculated by the artificial neural network algorithm. Of the 50 modeled plans, 32 were predicted to cause complications and 18 were classified as complication-free. We integrated data mining and a toxicity modeling method for toxicity prediction using prostate cancer cases, showing that a preprocessing analysis using text-based data mining and prediction modeling can be expanded to personalized patient treatment decision support based on big data.
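
    A toy version of the toxicity predictor: a small neural network over hypothetical dose-volume features (the bladder/rectum dose-volume cut-points follow the abstract, but the data, labels, and network size are synthetic assumptions), evaluated with a confusion matrix as in the study.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(5)

# Synthetic dose-volume features mined from plans: [V25_bladder, V50_bladder,
# V20_rectum, V50_rectum] as fractional volumes (hypothetical stand-ins)
X = rng.uniform(0, 1, size=(50, 4))
y = (0.5 * X[:, 1] + 0.5 * X[:, 3] + rng.normal(0, 0.1, 50) > 0.5).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
net = MLPClassifier(hidden_layer_sizes=(8,), max_iter=5000, random_state=0).fit(X_tr, y_tr)
print(confusion_matrix(y_te, net.predict(X_te)))   # rows: true, cols: predicted
```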

  18. Texas flexible pavements and overlays : calibration plans for M-E models and related software.

    DOT National Transportation Integrated Search

    2013-06-01

    This five-year project was initiated to collect materials and pavement performance data on a minimum of 100 highway test sections around the State of Texas, incorporating flexible pavements and overlays. Besides being used to calibrate and validate m...

  19. Rethinking Affirmative Action on Campus.

    ERIC Educational Resources Information Center

    La Noue, George R.

    1995-01-01

    The legal validity and public support of racial, ethnic, and gender preferences are eroding, and all affirmative action programs must be reconsidered. All American colleges and universities must develop new plans for affirmative action programs. Policies should cover admission, financial aid, employment, and contracting. Three primary models focus…

  20. MPST Software: grl_pef_check

    NASA Technical Reports Server (NTRS)

    Call, Jared A.; Kwok, John H.; Fisher, Forest W.

    2013-01-01

    This innovation is a tool used to verify and validate spacecraft sequences at the predicted events file (PEF) level for the GRAIL (Gravity Recovery and Interior Laboratory, see http://www.nasa.gov/mission_pages/grail/main/index.html) mission as part of the Multi-Mission Planning and Sequencing Team (MPST) operations process, to reduce the possibility of errors. The tool is used to catch any sequence-related errors or issues immediately after the seqgen modeling, streamlining downstream processes. This script verifies and validates the seqgen modeling for the GRAIL MPST process. A PEF is provided as input, and dozens of checks are performed on it to verify and validate the command products, including command content, command ordering, flight-rule violations, modeling boundary consistency, resource limits, and ground commanding consistency. By performing as many checks as early in the process as possible, grl_pef_check streamlines the MPST task of generating GRAIL command and modeled products on an aggressive schedule. By enumerating each check being performed and clearly stating the criteria and assumptions made at each step, grl_pef_check can be used as a manual checklist as well as an automated tool. This helper script was written with a focus on giving users the information they need to evaluate a sequence quickly and efficiently, while keeping them informed and active in the overall sequencing process. grl_pef_check verifies and validates the modeling and sequence content before any more effort is invested in the build. There are dozens of items in a modeling run that need to be checked, which is a time-consuming and error-prone task. Currently, no other software provides this functionality. Compared to a manual process, this script reduces human error and saves considerable man-hours by automating and streamlining the mission planning and sequencing task for the GRAIL mission.

  1. An Innovative Software Tool Suite for Power Plant Model Validation and Parameter Calibration using PMU Measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Yuanyuan; Diao, Ruisheng; Huang, Renke

    Maintaining good quality of power plant stability models is of critical importance to ensure the secure and economic operation and planning of today's power grid, with its increasingly stochastic and dynamic behavior. According to North American Electric Reliability Corporation (NERC) standards, all generators in North America with capacities larger than 10 MVA are required to validate their models every five years. Validation is quite costly and can significantly affect the revenue of generator owners, because traditional staged testing requires generators to be taken offline. Over the past few years, validating and calibrating parameters using online measurements, including phasor measurement units (PMUs) and digital fault recorders (DFRs), has been proven to be a cost-effective approach. In this paper, an innovative open-source tool suite is presented for validating power plant models using the PPMV tool, identifying bad parameters with trajectory sensitivity analysis, and finally calibrating parameters using an ensemble Kalman filter (EnKF) based algorithm. The architectural design and the detailed procedures to run the tool suite are presented, with results of tests on a realistic hydro power plant using PMU measurements for 12 different events. The calibrated parameters of the machine, exciter, governor, and PSS models demonstrate much better performance than the original models for all the events and show the robustness of the proposed calibration algorithm.
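
    The EnKF-based calibration step can be sketched in a few lines: each analysis nudges an ensemble of parameter members toward the measurements via the cross-covariance between parameters and model outputs. The toy plant model, noise levels, and iteration count below are assumptions for illustration, not the tool suite's implementation.

```python
import numpy as np

rng = np.random.default_rng(11)

def enkf_update(ensemble, predicted_obs, obs, obs_var):
    """One EnKF analysis step: ensemble is (n_members, n_params),
    predicted_obs is (n_members, n_obs), obs is (n_obs,)."""
    A = ensemble - ensemble.mean(axis=0)
    Y = predicted_obs - predicted_obs.mean(axis=0)
    n = len(ensemble) - 1
    Pxy = A.T @ Y / n                                  # param-output covariance
    Pyy = Y.T @ Y / n + obs_var * np.eye(obs.size)     # output covariance + noise
    K = Pxy @ np.linalg.inv(Pyy)                       # Kalman gain
    perturbed = obs + rng.normal(0, np.sqrt(obs_var), predicted_obs.shape)
    return ensemble + (perturbed - predicted_obs) @ K.T

# Toy "plant model": damped response whose decay rate is the unknown parameter
t = np.linspace(0, 5, 30)
model = lambda theta: np.exp(-np.outer(theta, t))      # (n_members, n_obs)
truth = 0.7
obs = np.exp(-truth * t) + rng.normal(0, 0.01, t.size) # stand-in for PMU data

ens = rng.uniform(0.2, 1.5, size=(100, 1))             # initial parameter ensemble
for _ in range(5):                                     # iterate the analysis step
    ens = enkf_update(ens, model(ens[:, 0]), obs, obs_var=1e-4)
print(f"calibrated decay rate: {ens.mean():.3f} (truth {truth})")
```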

  2. Modeling and Validation of Microwave Ablations with Internal Vaporization

    PubMed Central

    Chiang, Jason; Birla, Sohan; Bedoya, Mariajose; Jones, David; Subbiah, Jeyam; Brace, Christopher L.

    2014-01-01

    Numerical simulation is increasingly being utilized for computer-aided design of treatment devices, analysis of ablation growth, and clinical treatment planning. Simulation models to date have incorporated electromagnetic wave propagation and heat conduction, but not other relevant physics such as water vaporization and mass transfer. Such physical changes are particularly noteworthy during the intense heat generation associated with microwave heating. In this work, a numerical model was created that integrates microwave heating with water vapor generation and transport by using porous media assumptions in the tissue domain. The heating physics of the water vapor model was validated through temperature measurements taken at locations 5, 10 and 20 mm away from the heating zone of the microwave antenna in a homogenized ex vivo bovine liver setup. The cross-sectional area of water vapor transport was validated through intra-procedural computed tomography (CT) during microwave ablations in homogenized ex vivo bovine liver. Iso-density contours from CT images were compared to vapor concentration contours from the numerical model at intermittent time points using the Jaccard Index. In general, there was an improving correlation in ablation size dimensions as the ablation procedure proceeded, with Jaccard Indices of 0.27, 0.49, 0.61, 0.67 and 0.69 at 1, 2, 3, 4, and 5 minutes, respectively. This study demonstrates the feasibility and validity of incorporating water vapor concentration into thermal ablation simulations and validating such models experimentally. PMID:25330481
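
    The Jaccard Index used to compare CT iso-density contours with simulated vapor-concentration contours is straightforward to compute on binary masks; the two toy masks below stand in for segmented contours.

      import numpy as np

      def jaccard(mask_a, mask_b):
          """Jaccard index |A intersect B| / |A union B| of two boolean masks."""
          inter = np.logical_and(mask_a, mask_b).sum()
          union = np.logical_or(mask_a, mask_b).sum()
          return inter / union if union else 1.0

      # Toy stand-ins for an iso-density contour and a simulated vapor contour.
      ct = np.zeros((100, 100), bool);    ct[40:60, 40:60] = True
      model = np.zeros((100, 100), bool); model[45:65, 42:62] = True
      print(round(jaccard(ct, model), 2))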

  3. Longitudinal train dynamics model for a rail transit simulation system

    DOE PAGES

    Wang, Jinghui; Rakha, Hesham A.

    2018-01-01

    The paper develops a longitudinal train dynamics model in support of microscopic railway transportation simulation. The model can be calibrated without any mechanical data, making it ideal for implementation in transportation simulators. The calibration and validation work is based on data collected from the Portland light rail train fleet. The calibration procedure is mathematically formulated as a constrained non-linear optimization problem. The validity of the model is assessed by comparing instantaneous model predictions against field observations, and also evaluated in the domains of acceleration/deceleration versus speed and acceleration/deceleration versus distance. A test is conducted to investigate the adequacy of the model in simulation implementation. The results demonstrate that the proposed model can adequately capture instantaneous train dynamics and provides good performance in the simulation test. Thus, the model provides a simple theoretical foundation for microscopic simulators and will significantly support the planning, management and control of railway transportation systems.
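
    As an illustration of this kind of calibration, the sketch below fits a generic Davis-type resistance form to a toy speed/acceleration record under non-negativity constraints; the functional form and data are stand-ins, not the paper's exact formulation.

      import numpy as np
      from scipy.optimize import minimize

      # Toy observed drive cycle: speed v (m/s) and measured acceleration (m/s^2).
      v_obs = np.array([2.0, 5.0, 8.0, 11.0, 14.0])
      a_obs = np.array([1.10, 0.95, 0.78, 0.60, 0.41])

      def model_acc(p, v):
          # Davis-type form: tractive term minus speed-dependent resistance.
          f0, r1, r2 = p
          return f0 - r1 * v - r2 * v**2

      def sse(p):
          return np.sum((model_acc(p, v_obs) - a_obs) ** 2)

      # Constrain all coefficients to be non-negative, as the physics suggests.
      res = minimize(sse, x0=[1.0, 0.01, 0.001],
                     bounds=[(0, None)] * 3, method="L-BFGS-B")
      print(res.x)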

  4. Longitudinal train dynamics model for a rail transit simulation system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Jinghui; Rakha, Hesham A.

    The paper develops a longitudinal train dynamics model in support of microscopic railway transportation simulation. The model can be calibrated without any mechanical data, making it ideal for implementation in transportation simulators. The calibration and validation work is based on data collected from the Portland light rail train fleet. The calibration procedure is mathematically formulated as a constrained non-linear optimization problem. The validity of the model is assessed by comparing instantaneous model predictions against field observations, and also evaluated in the domains of acceleration/deceleration versus speed and acceleration/deceleration versus distance. A test is conducted to investigate the adequacy of the model in simulation implementation. The results demonstrate that the proposed model can adequately capture instantaneous train dynamics and provides good performance in the simulation test. Thus, the model provides a simple theoretical foundation for microscopic simulators and will significantly support the planning, management and control of railway transportation systems.

  5. Lessons Learned on Operating and Preparing Operations for a Technology Mission from the Perspective of the Earth Observing-1 Mission

    NASA Technical Reports Server (NTRS)

    Mandl, Dan; Howard, Joseph

    2000-01-01

    The New Millennium Program's first Earth-observing mission (EO-1) is a technology validation mission. It is managed by the NASA Goddard Space Flight Center in Greenbelt, Maryland, and is scheduled for launch in the summer of 2000. The purpose of this mission is to flight-validate revolutionary technologies that will contribute to reducing the cost and increasing the capabilities of future land imaging missions. In the EO-1 mission, there are five instrument, five spacecraft, and three supporting technologies to flight-validate during a year of operations. EO-1 operations and the accompanying ground system were intended to be simple in order to maintain low operational costs. For purposes of formulating operations, it was initially modeled as a small science mission. However, it quickly evolved into a more complex mission due to the difficulties in effectively integrating all of the validation plans of the individual technologies. As a consequence, more operational support was required to confidently complete the on-orbit validation of the new technologies. This paper will outline the issues and lessons learned applicable to future technology validation missions. Examples include the following: (1) operational complexity encountered in integrating all of the validation plans into a coherent operational plan, (2) the initial desire to run single-shift operations subsequently growing to "around-the-clock" operations, (3) managing changes in the technologies that ultimately affected operations, (4) the necessity for better team communications within the project to offset the effects of change on the Ground System Developers, Operations Engineers, Integration and Test Engineers, S/C Subsystem Engineers, and Scientists, and (5) the need for a more experienced Flight Operations Team to achieve the necessary operational flexibility. The discussion will conclude by providing several cost comparisons for developing operations from previous missions to EO-1 and discussing some details that might be done differently for future technology validation missions.

  6. Study of the Performance of Aids to Navigation Systems - Phase 1, An Empirical Model Approach

    DTIC Science & Technology

    1978-07-19

    [Garbled DTIC record; only fragments survive extraction. Recoverable information: the document is available to the U.S. public through the National Technical Information Service; subject keywords PILOTING, FIX, NAVIGATOR, PILOT, MONTE CARLO MODEL, SHIP SIMULATOR; and table-of-contents entries for "Validation of Entire Navigating and Steering Model" and "Overview of Model Capabilities and Achieved Goals".]

  7. Implementation of diffraction in a ray-tracing model for the prediction of noise in open-plan offices.

    PubMed

    Chevret, P; Chatillon, J

    2012-11-01

    Sound prediction in open-plan offices is a real challenge because of the complexity of the layout of such offices, and therefore because of the multitude of acoustic phenomena involved. One such phenomenon, of primary importance, and not the least challenging of them, is the diffraction by screens and low dividers that usually partition the workspace. This paper describes implementing the equations of the Uniform Theory of Diffraction [McNamara et al. (1990). Introduction to the Uniform Theory of Diffraction (Artech House, Boston)] in an existing ray-tracing model initially dedicated to sound prediction in industrial premises. For the purposes of validation, a series of measurements was conducted in a semi-anechoic chamber in the same manner as Wang and Bradley [(2002). Appl. Acoust. 63, 849-866] but including real desktops instead of single screens. A first phase was dedicated to controlling the quality of the installation by making comparisons with McNamara's solution for a single screen on a rigid floor. Then, the validation itself was conducted with measurements on real desktops, first without a ceiling, and then with a rigid ceiling suspended above the double desk. The results of the comparisons between calculations and measurements in this configuration have demonstrated that the model is an effective tool for predicting sound levels in an open-plan office.
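
    Full UTD coefficients are involved, but the flavor of screen diffraction can be conveyed with the much simpler classical Fresnel knife-edge approximation (ITU-R P.526 form) below; this is an illustrative stand-in, not the UTD implementation described in the paper.

      import math

      def knife_edge_loss_db(h, d1, d2, f_hz, c=343.0):
          """Fresnel knife-edge diffraction loss (dB) behind a single screen.
          h: screen-top height above the source-receiver line (m)
          d1, d2: source-to-screen and screen-to-receiver distances (m)."""
          lam = c / f_hz
          v = h * math.sqrt(2.0 * (d1 + d2) / (lam * d1 * d2))
          if v <= -0.78:
              return 0.0  # screen well below the line of sight: no extra loss
          return 6.9 + 20.0 * math.log10(math.sqrt((v - 0.1) ** 2 + 1.0) + v - 0.1)

      # Screen top 0.4 m above the path, 2 m from source and receiver, 1 kHz tone.
      print(round(knife_edge_loss_db(0.4, 2.0, 2.0, 1000.0), 1))  # ~13.7 dB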

  8. A radiobiology-based inverse treatment planning method for optimisation of permanent I-125 prostate implants in focal brachytherapy.

    PubMed

    Haworth, Annette; Mears, Christopher; Betts, John M; Reynolds, Hayley M; Tack, Guido; Leo, Kevin; Williams, Scott; Ebert, Martin A

    2016-01-07

    Treatment plans for ten patients, initially treated with a conventional approach to low dose-rate brachytherapy (LDR, 145 Gy to the entire prostate), were compared with plans for the same patients created with an inverse-optimisation planning process utilising a biologically-based objective. The 'biological optimisation' considered a non-uniform distribution of tumour cell density through the prostate based on known and expected locations of the tumour. Using dose-planning objectives derived from our previous biological-model validation study, the volume of the urethra receiving 125% of the conventional prescription (145 Gy) was reduced from a median value of 64% to less than 8% whilst maintaining high values of tumour control probability (TCP). On average, the number of planned seeds was reduced from 85 to less than 75. The robustness of plans to random seed displacements needs to be carefully considered when using contemporary seed placement techniques. We conclude that an inverse planning approach to LDR treatments, based on a biological objective, has the potential to maintain high rates of tumour control whilst minimising dose to healthy tissue. In future, the radiobiological model will be informed using multi-parametric MRI to provide a personalised-medicine approach.

  9. A radiobiology-based inverse treatment planning method for optimisation of permanent I-125 prostate implants in focal brachytherapy

    NASA Astrophysics Data System (ADS)

    Haworth, Annette; Mears, Christopher; Betts, John M.; Reynolds, Hayley M.; Tack, Guido; Leo, Kevin; Williams, Scott; Ebert, Martin A.

    2016-01-01

    Treatment plans for ten patients, initially treated with a conventional approach to low dose-rate brachytherapy (LDR, 145 Gy to the entire prostate), were compared with plans for the same patients created with an inverse-optimisation planning process utilising a biologically-based objective. The 'biological optimisation' considered a non-uniform distribution of tumour cell density through the prostate based on known and expected locations of the tumour. Using dose-planning objectives derived from our previous biological-model validation study, the volume of the urethra receiving 125% of the conventional prescription (145 Gy) was reduced from a median value of 64% to less than 8% whilst maintaining high values of tumour control probability (TCP). On average, the number of planned seeds was reduced from 85 to less than 75. The robustness of plans to random seed displacements needs to be carefully considered when using contemporary seed placement techniques. We conclude that an inverse planning approach to LDR treatments, based on a biological objective, has the potential to maintain high rates of tumour control whilst minimising dose to healthy tissue. In future, the radiobiological model will be informed using multi-parametric MRI to provide a personalised-medicine approach.

  10. Interval linear programming model for long-term planning of vehicle recycling in the Republic of Serbia under uncertainty.

    PubMed

    Simic, Vladimir; Dimitrijevic, Branka

    2015-02-01

    An interval linear programming approach is used to formulate and comprehensively test a model for optimal long-term planning of vehicle recycling in the Republic of Serbia. The proposed model is applied to a numerical case study: a 4-year planning horizon (2013-2016) is considered; three legislative cases and three scrap-metal price trends are analysed; and the availability of final destinations for sorted waste flows is explored. The potential and applicability of the developed model are fully illustrated. Detailed insights into the profitability and eco-efficiency of the projected, contemporarily equipped vehicle recycling factory are presented. The influence of the Serbian ordinance on the management of end-of-life vehicles on decisions about procuring vehicle hulks, sorting the generated material fractions, and allocating sorted waste flows and sorted metals is thoroughly examined. The validity of the waste management strategy for the period 2010-2019 is tested. The formulated model can create optimal plans for procuring vehicle hulks, sorting generated material fractions, allocating sorted waste flows and allocating sorted metals. The obtained results are valuable for supporting the construction and/or modernisation of a vehicle recycling system in the Republic of Serbia. © The Author(s) 2015.
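
    Interval linear programming is commonly handled by solving best-case and worst-case sub-problems over the interval coefficients; the toy two-variable profit model below, with invented prices and capacity, sketches that idea with scipy.

      from scipy.optimize import linprog

      # Toy interval LP: maximize profit c1*x1 + c2*x2, where the scrap-metal
      # price c1 is only known to lie in an interval [lo, hi].
      bounds_c1 = (30.0, 50.0)   # interval coefficient (price per tonne)
      c2 = 20.0
      A_ub = [[1.0, 1.0]]        # shared sorting capacity
      b_ub = [100.0]

      for label, c1 in zip(("pessimistic", "optimistic"), bounds_c1):
          # linprog minimizes, so negate the profit coefficients.
          res = linprog([-c1, -c2], A_ub=A_ub, b_ub=b_ub,
                        bounds=[(0, None), (0, None)])
          print(label, "profit:", -res.fun, "plan:", res.x)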

  11. Development, test-retest reliability and validity of the Pharmacy Value-Added Services Questionnaire (PVASQ).

    PubMed

    Tan, Christine L; Hassali, Mohamed A; Saleem, Fahad; Shafie, Asrul A; Aljadhey, Hisham; Gan, Vincent B

    2015-01-01

    (i) To develop the Pharmacy Value-Added Services Questionnaire (PVASQ) using emerging themes generated from interviews; (ii) to establish the reliability and validity of the questionnaire instrument. Using an extended Theory of Planned Behavior as the theoretical model, face-to-face interviews generated salient beliefs about pharmacy value-added services. The PVASQ was constructed initially in English, incorporating important themes, and later translated into the Malay language with forward and backward translation. Intention (INT) to adopt pharmacy value-added services is predicted by attitudes (ATT), subjective norms (SN), perceived behavioral control (PBC), knowledge and expectations. Using a 7-point Likert-type scale and a dichotomous scale, test-retest reliability (N=25) was assessed by administering the questionnaire instrument twice at an interval of one week. Internal consistency was measured by Cronbach's alpha, and construct validity between the two administrations was assessed using the kappa statistic and the intraclass correlation coefficient (ICC). Confirmatory Factor Analysis, CFA (N=410), was conducted to assess the construct validity of the PVASQ. The kappa coefficients indicate a moderate to almost perfect strength of agreement between test and retest. The ICC for all scales tested for intra-rater (test-retest) reliability was good. The overall Cronbach's alpha (N=25) was 0.912 and 0.908 at the two time points, respectively. The result of CFA (N=410) showed most items loaded strongly and correctly onto the corresponding factors. Only one item was eliminated. This study is the first to develop and establish the reliability and validity of the Pharmacy Value-Added Services Questionnaire instrument using the Theory of Planned Behavior as the theoretical model. The translated Malay language version of the PVASQ is reliable and valid to predict Malaysian patients' intention to adopt pharmacy value-added services to collect partial medicine supply.
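
    Cronbach's alpha, the internal-consistency statistic reported here, is simple to compute from an item-score matrix; the Likert scores below are made up for illustration.

      import numpy as np

      def cronbach_alpha(items):
          """items: respondents x items matrix of Likert scores."""
          items = np.asarray(items, float)
          k = items.shape[1]
          item_vars = items.var(axis=0, ddof=1).sum()
          total_var = items.sum(axis=1).var(ddof=1)
          return k / (k - 1) * (1.0 - item_vars / total_var)

      # Five respondents answering four 7-point Likert items (made-up scores).
      scores = [[6, 5, 6, 7], [4, 4, 5, 4], [7, 6, 7, 6],
                [3, 4, 3, 4], [5, 5, 6, 5]]
      print(round(cronbach_alpha(scores), 3))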

  12. Development of the monitoring system to detect the piping thickness reduction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, N. Y.; Ryu, K. H.; Oh, Y. J.

    2006-07-01

    As nuclear piping ages, secondary piping that was once considered safe now suffers from wall-thickness reduction. After several accidents caused by Flow Accelerated Corrosion (FAC), guidelines and recommendations for thinned-pipe management were issued, and the need for monitoring has grown. Under thinned-pipe management programs, monitoring activities based on various analyses and on case studies of other plants are also increasing. As the number of monitoring points increases, the time needed to cover the recommended inspection area grows, while the time available to inspect the piping during an overhaul shrinks. The existing ultrasonic technique (UT) can cover only a small area in a given time; moreover, it cannot be applied to complex piping geometries or to certain locations such as welded parts. In this paper, we suggest the Switching Direct Current Potential Drop (S-DCPD) method, by which the FAC-susceptible area can be narrowed down. To apply DCPD, we developed both a resistance model and a Finite Element Method (FEM) model to predict DCPD feasibility. We tested an elbow specimen to compare DCPD monitoring results with UT results and confirm their consistency. For the validation test, we designed a simulation loop. To determine the test conditions, we analyzed environmental parameters and introduced an applicable wearing-rate model. To obtain the model parameters, we developed electrodes and analyzed the velocity profile in the test loop using the CFX code. Based on the prediction model and prototype testing results, we are planning to perform a validation test to identify the applicability of S-DCPD in the NPP environment. The validation test plan is described as future work. (authors)
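
    Under the simplest uniform-field assumption, the DC potential drop across a span at constant current is inversely proportional to the remaining wall thickness, which gives a quick calibrated estimate; this is a textbook simplification for illustration, not the paper's resistance or FEM model.

      def remaining_thickness(v_now, v_ref, t_ref):
          """Uniform-field DCPD estimate: at constant current the potential
          drop V is inversely proportional to wall thickness, so
          t = t_ref * (V_ref / V)."""
          return t_ref * v_ref / v_now

      # Baseline: an 8.0 mm wall read 100 uV; the same span now reads 125 uV.
      print(remaining_thickness(125.0, 100.0, 8.0), "mm")  # -> 6.4 mm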

  13. A model-based 3D patient-specific pre-treatment QA method for VMAT using the EPID

    NASA Astrophysics Data System (ADS)

    McCowan, P. M.; Asuni, G.; van Beek, T.; van Uytven, E.; Kujanpaa, K.; McCurdy, B. M. C.

    2017-02-01

    This study reports the development and validation of a model-based, 3D patient dose reconstruction method for pre-treatment quality assurance using EPID images. The method is also investigated for sensitivity to potential MLC delivery errors. Each cine-mode EPID image acquired during plan delivery was processed using a previously developed back-projection dose reconstruction model providing a 3D dose estimate on the CT simulation data. Validation was carried out using 24 SBRT-VMAT patient plans by comparing: (1) ion chamber point dose measurements in a solid water phantom, (2) the treatment planning system (TPS) predicted 3D dose to the EPID reconstructed 3D dose in a solid water phantom, and (3) the TPS predicted 3D dose to the EPID and our forward predicted reconstructed 3D dose in the patient (CT data). AAA and AcurosXB were used for TPS predictions. Dose distributions were compared using 3%/3 mm (95% tolerance) and 2%/2 mm (90% tolerance) γ-tests in the planning target volume (PTV) and 20% dose volumes. The average percentage point dose differences between the ion chamber and the EPID, AcurosXB, and AAA were 0.73 ± 1.25%, 0.38 ± 0.96% and 1.06 ± 1.34%, respectively. For the patient (CT) dose comparisons, seven (3%/3 mm) and nine (2%/2 mm) plans failed the EPID versus AAA comparison. All plans passed the EPID versus AcurosXB and the EPID versus forward model γ-comparisons. Four types of MLC errors (opening, shifting, stuck, and retracting), of varying magnitude (0.2, 0.5, 1.0, 2.0 mm), were introduced into six different SBRT-VMAT plans. γ-comparisons of the erroneous EPID dose and the original predicted dose were carried out using the same criteria as above. For all plans, the sensitivity testing using a 3%/3 mm γ-test in the PTV successfully detected MLC errors on the order of 1.0 mm, except for the single-leaf retraction-type error. A 2%/2 mm criterion produced similar results, with two additional errors detected.
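
    A simplified 1-D version of the global γ-test used here can be written compactly; clinical tools evaluate γ in 3-D with interpolation, so the sketch below, with toy dose profiles and 3%/2 mm criteria, is only illustrative.

      import numpy as np

      def gamma_1d(ref, ref_x, ev, ev_x, dd=0.03, dta=2.0):
          """Simplified 1-D global gamma index (dd: dose criterion as a
          fraction of the max reference dose; dta: distance criterion, mm)."""
          norm = ref.max()
          gammas = []
          for d_r, x_r in zip(ref, ref_x):
              cap_d = (ev - d_r) / (dd * norm)
              cap_x = (ev_x - x_r) / dta
              gammas.append(np.sqrt(cap_d**2 + cap_x**2).min())
          return np.array(gammas)

      x = np.arange(0.0, 20.0, 1.0)                 # positions in mm
      reference = np.exp(-((x - 10) / 5.0) ** 2)    # toy dose profile
      evaluated = np.exp(-((x - 10.5) / 5.0) ** 2)  # slightly shifted delivery
      g = gamma_1d(reference, x, evaluated, x)
      print("pass rate:", (g <= 1.0).mean())        # fraction of points passing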

  14. Development and validation of a computational model of the knee joint for the evaluation of surgical treatments for osteoarthritis

    PubMed Central

    Mootanah, R.; Imhauser, C.W.; Reisse, F.; Carpanen, D.; Walker, R.W.; Koff, M.F.; Lenhoff, M.W.; Rozbruch, S.R.; Fragomen, A.T.; Dewan, Z.; Kirane, Y.M.; Cheah, Pamela A.; Dowell, J.K.; Hillstrom, H.J.

    2014-01-01

    A three-dimensional (3D) knee joint computational model was developed and validated to predict knee joint contact forces and pressures for different degrees of malalignment. A 3D computational knee model was created from high-resolution radiological images to emulate passive sagittal rotation (full-extension to 65°-flexion) and weight acceptance. A cadaveric knee mounted on a six-degree-of-freedom robot was subjected to matching boundary and loading conditions. A ligament-tuning process minimised kinematic differences between the robotically loaded cadaver specimen and the finite element (FE) model. The model was validated against intra-articular force and pressure measurements. Percent full-scale errors between FE-predicted and in vitro-measured values in the medial and lateral compartments were 6.67% and 5.94%, respectively, for normalised peak pressure values, and 7.56% and 4.48%, respectively, for normalised force values. The knee model can accurately predict normalised intra-articular pressure and forces for different loading conditions and could be further developed for subject-specific surgical planning. PMID:24786914
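
    The percent full-scale error metric quoted above is plain arithmetic; the numbers in the snippet are invented.

      def percent_full_scale_error(predicted, measured, full_scale):
          # Error expressed as a percentage of the instrument's full-scale range.
          return 100.0 * abs(predicted - measured) / full_scale

      # e.g. predicted 5.6 MPa vs measured 5.2 MPa on a 6.0 MPa full-scale range
      print(round(percent_full_scale_error(5.6, 5.2, 6.0), 2), "%")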

  15. Development and validation of a computational model of the knee joint for the evaluation of surgical treatments for osteoarthritis.

    PubMed

    Mootanah, R; Imhauser, C W; Reisse, F; Carpanen, D; Walker, R W; Koff, M F; Lenhoff, M W; Rozbruch, S R; Fragomen, A T; Dewan, Z; Kirane, Y M; Cheah, K; Dowell, J K; Hillstrom, H J

    2014-01-01

    A three-dimensional (3D) knee joint computational model was developed and validated to predict knee joint contact forces and pressures for different degrees of malalignment. A 3D computational knee model was created from high-resolution radiological images to emulate passive sagittal rotation (full-extension to 65°-flexion) and weight acceptance. A cadaveric knee mounted on a six-degree-of-freedom robot was subjected to matching boundary and loading conditions. A ligament-tuning process minimised kinematic differences between the robotically loaded cadaver specimen and the finite element (FE) model. The model was validated against intra-articular force and pressure measurements. Percent full-scale errors between FE-predicted and in vitro-measured values in the medial and lateral compartments were 6.67% and 5.94%, respectively, for normalised peak pressure values, and 7.56% and 4.48%, respectively, for normalised force values. The knee model can accurately predict normalised intra-articular pressure and forces for different loading conditions and could be further developed for subject-specific surgical planning.

  16. Verification and Validation Plan for Flight Performance Requirements on the CEV Parachute Assembly System

    NASA Technical Reports Server (NTRS)

    Morris, Aaron L.; Olson, Leah M.

    2011-01-01

    The Crew Exploration Vehicle Parachute Assembly System (CPAS) is engaged in a multi-year design and test campaign aimed at qualifying a parachute recovery system for human use on the Orion Spacecraft. Orion has parachute flight performance requirements that will ultimately be verified through the use of Monte Carlo multi-degree of freedom flight simulations. These simulations will be anchored by real world flight test data and iteratively improved to provide a closer approximation to the real physics observed in the inherently chaotic inflation and steady state flight of the CPAS parachutes. This paper will examine the processes necessary to verify the flight performance requirements of the human rated spacecraft. The focus will be on the requirements verification and model validation planned on CPAS.

  17. Planning Training Loads for the 400 M Hurdles in Three-Month Mesocycles using Artificial Neural Networks.

    PubMed

    Przednowek, Krzysztof; Iskra, Janusz; Wiktorowicz, Krzysztof; Krzeszowski, Tomasz; Maszczyk, Adam

    2017-12-01

    This paper presents a novel approach to planning training loads in hurdling using artificial neural networks. The neural models performed the task of generating loads for athletes' training for the 400 meters hurdles. All the models were calculated based on the training data of 21 Polish National Team hurdlers, aged 22.25 ± 1.96, competing between 1989 and 2012. The analysis included 144 training plans that represented different stages in the annual training cycle. The main contribution of this paper is to develop neural models for planning training loads for the entire career of a typical hurdler. In the models, 29 variables were used, where four characterized the runner and 25 described the training process. Two artificial neural networks were used: a multi-layer perceptron and a network with radial basis functions. To assess the quality of the models, the leave-one-out cross-validation method was used in which the Normalized Root Mean Squared Error was calculated. The analysis shows that the method generating the smallest error was the radial basis function network with nine neurons in the hidden layer. Most of the calculated training loads demonstrated a non-linear relationship across the entire competitive period. The resulting model can be used as a tool to assist a coach in planning training loads during a selected training period.
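
    The validation recipe here, leave-one-out cross-validation scored by a normalized RMSE, can be sketched with scikit-learn; the features and targets below are synthetic, the MLP stands in for the paper's networks (scikit-learn has no RBF network), and normalizing by the observed range is an assumption since the paper's exact normalization is not restated here.

      import numpy as np
      from sklearn.model_selection import LeaveOneOut
      from sklearn.neural_network import MLPRegressor

      rng = np.random.default_rng(1)
      X = rng.normal(size=(30, 5))     # stand-in training-state features
      y = X @ [1.0, -2.0, 0.5, 0.0, 1.5] + rng.normal(0, 0.1, 30)

      errs = []
      for train, test in LeaveOneOut().split(X):
          model = MLPRegressor(hidden_layer_sizes=(9,), max_iter=2000,
                               random_state=0)
          model.fit(X[train], y[train])
          errs.append((model.predict(X[test])[0] - y[test][0]) ** 2)

      # Normalized root mean squared error over all held-out predictions.
      nrmse = np.sqrt(np.mean(errs)) / (y.max() - y.min())
      print("NRMSE:", round(nrmse, 3))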

  18. Planning Training Loads for the 400 M Hurdles in Three-Month Mesocycles using Artificial Neural Networks

    PubMed Central

    Iskra, Janusz; Wiktorowicz, Krzysztof; Krzeszowski, Tomasz; Maszczyk, Adam

    2017-01-01

    Abstract This paper presents a novel approach to planning training loads in hurdling using artificial neural networks. The neural models performed the task of generating loads for athletes’ training for the 400 meters hurdles. All the models were calculated based on the training data of 21 Polish National Team hurdlers, aged 22.25 ± 1.96, competing between 1989 and 2012. The analysis included 144 training plans that represented different stages in the annual training cycle. The main contribution of this paper is to develop neural models for planning training loads for the entire career of a typical hurdler. In the models, 29 variables were used, where four characterized the runner and 25 described the training process. Two artificial neural networks were used: a multi-layer perceptron and a network with radial basis functions. To assess the quality of the models, the leave-one-out cross-validation method was used in which the Normalized Root Mean Squared Error was calculated. The analysis shows that the method generating the smallest error was the radial basis function network with nine neurons in the hidden layer. Most of the calculated training loads demonstrated a non-linear relationship across the entire competitive period. The resulting model can be used as a tool to assist a coach in planning training loads during a selected training period. PMID:29339998

  19. Release Fixed Heel Point (FHP) Accommodation Model Verification and Validation (V and V) Plan - Rev A

    DTIC Science & Technology

    2017-01-23

    [Garbled DTIC record (report documentation page residue). Recoverable information: performing organization RDECOM-TARDEC-ACT; subject keywords: occupant work space, central 90% of the Soldier population, encumbrance, posture and position, verification and validation, computer-aided design; abstract fragment: "... factors engineers could benefit by working with vehicle designers to perform virtual assessments in CAD when there is not enough time and/or funding to ..."]

  20. Development of Level 2 Calibration and Validation Plans for GOES-R; What is a RIMP?

    NASA Technical Reports Server (NTRS)

    Kopp, Thomas J.; Belsma, Leslie O.; Mollner, Andrew K.; Sun, Ziping; Deluccia, Frank

    2017-01-01

    Calibration and Validation (CalVal) plans for Geostationary Operational Environmental Satellite R series (GOES-R) Level 2 (L2) products were documented via Resource, Implementation, and Management Plans (RIMPs) for all of the official L2 products required from the GOES-R Advanced Baseline Imager (ABI). In 2015 the GOES-R program decided to replace the typical CalVal plans with RIMPs that covered, for a given L2 product, what was required from that product, how it would be validated, and what tools would be used to do so. Similar to Level 1b products, the intent was to cover the full spectrum of planning required for the CalVal of L2 ABI products. Instead of focusing on step-by-step procedures, the RIMPs concentrated on the criteria for each stage of the validation process (Beta, Provisional, and Full Validation) and the many elements required to prove when each stage was reached.

  1. NASA sea ice and snow validation plan for the Defense Meteorological Satellite Program special sensor microwave/imager

    NASA Technical Reports Server (NTRS)

    Cavalieri, Donald J. (Editor); Swift, Calvin T. (Editor)

    1987-01-01

    This document addresses the task of developing and executing a plan for validating the algorithm used for initial processing of sea ice data from the Special Sensor Microwave/Imager (SSMI). The document outlines a plan for monitoring the performance of the SSMI, for validating the derived sea ice parameters, and for providing quality data products before distribution to the research community. Because of recent advances in the application of passive microwave remote sensing to snow cover on land, the validation of snow algorithms is also addressed.

  2. Comparison and validation of statistical methods for predicting power outage durations in the event of hurricanes.

    PubMed

    Nateghi, Roshanak; Guikema, Seth D; Quiring, Steven M

    2011-12-01

    This article compares statistical methods for modeling power outage durations during hurricanes and examines the predictive accuracy of these methods. Being able to make accurate predictions of power outage durations is valuable because the information can be used by utility companies to plan their restoration efforts more efficiently. This information can also help inform customers and public agencies of the expected outage times, enabling better collective response planning, and coordination of restoration efforts for other critical infrastructures that depend on electricity. In the long run, outage duration estimates for future storm scenarios may help utilities and public agencies better allocate risk management resources to balance the disruption from hurricanes with the cost of hardening power systems. We compare the out-of-sample predictive accuracy of five distinct statistical models for estimating power outage duration times caused by Hurricane Ivan in 2004. The methods compared include both regression models (accelerated failure time (AFT) and Cox proportional hazard (Cox PH) models) and data mining techniques (regression trees, Bayesian additive regression trees (BART), and multivariate adaptive regression splines). We then validate our models against two other hurricanes. Our results indicate that BART yields the best prediction accuracy and that it is possible to predict outage durations with reasonable accuracy. © 2011 Society for Risk Analysis.

  3. Development and validation of Prediction models for Risks of complications in Early-onset Pre-eclampsia (PREP): a prospective cohort study.

    PubMed

    Thangaratinam, Shakila; Allotey, John; Marlin, Nadine; Mol, Ben W; Von Dadelszen, Peter; Ganzevoort, Wessel; Akkermans, Joost; Ahmed, Asif; Daniels, Jane; Deeks, Jon; Ismail, Khaled; Barnard, Ann Marie; Dodds, Julie; Kerry, Sally; Moons, Carl; Riley, Richard D; Khan, Khalid S

    2017-04-01

    The prognosis of early-onset pre-eclampsia (before 34 weeks' gestation) is variable. Accurate prediction of complications is required to plan appropriate management in high-risk women. To develop and validate prediction models for outcomes in early-onset pre-eclampsia. Prospective cohort for model development, with validation in two external data sets. Model development: 53 obstetric units in the UK. Model transportability: PIERS (Pre-eclampsia Integrated Estimate of RiSk for mothers) and PETRA (Pre-Eclampsia TRial Amsterdam) studies. Pregnant women with early-onset pre-eclampsia. Nine hundred and forty-six women in the model development data set and 850 women (634 in PIERS, 216 in PETRA) in the transportability (external validation) data sets. The predictors were identified from systematic reviews of tests to predict complications in pre-eclampsia and were prioritised by Delphi survey. The primary outcome was the composite of adverse maternal outcomes established using Delphi surveys. The secondary outcome was the composite of fetal and neonatal complications. We developed two prediction models: a logistic regression model (PREP-L) to assess the overall risk of any maternal outcome until postnatal discharge and a survival analysis model (PREP-S) to obtain individual risk estimates at daily intervals from diagnosis until 34 weeks. Shrinkage was used to adjust for overoptimism of predictor effects. For internal validation (of the full models in the development data) and external validation (of the reduced models in the transportability data), we computed the ability of the models to discriminate between those with and without poor outcomes (c-statistic), and the agreement between predicted and observed risk (calibration slope). The PREP-L model included maternal age, gestational age at diagnosis, medical history, systolic blood pressure, urine protein-to-creatinine ratio, platelet count, serum urea concentration, oxygen saturation, baseline treatment with antihypertensive drugs and administration of magnesium sulphate. The PREP-S model additionally included exaggerated tendon reflexes and serum alanine aminotransferase and creatinine concentrations. Both models showed good discrimination for maternal complications, with an optimism-adjusted c-statistic of 0.82 [95% confidence interval (CI) 0.80 to 0.84] for PREP-L and 0.75 (95% CI 0.73 to 0.78) for the PREP-S model in the internal validation. External validation of the reduced PREP-L model showed good performance, with a c-statistic of 0.81 (95% CI 0.77 to 0.85) in the PIERS cohort and 0.75 (95% CI 0.64 to 0.86) in the PETRA cohort for maternal complications, and the model calibrated well, with slopes of 0.93 (95% CI 0.72 to 1.10) and 0.90 (95% CI 0.48 to 1.32), respectively. In the PIERS data set, the reduced PREP-S model had a c-statistic of 0.71 (95% CI 0.67 to 0.75) and a calibration slope of 0.67 (95% CI 0.56 to 0.79). Low gestational age at diagnosis, high urine protein-to-creatinine ratio, increased serum urea concentration, treatment with antihypertensive drugs, magnesium sulphate, abnormal uterine artery Doppler scan findings and estimated fetal weight below the 10th centile were associated with fetal complications. The PREP-L model provided individualised risk estimates in early-onset pre-eclampsia to plan the management of high- or low-risk individuals. The PREP-S model has the potential to be used as a triage tool for risk assessment. The impact of using the models on outcomes needs further evaluation. Trial registration: Current Controlled Trials ISRCTN40384046. Funding: The National Institute for Health Research Health Technology Assessment programme.
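
    The two headline validation statistics, the c-statistic and the calibration slope, can be computed as below; the predictions and outcomes are simulated, and regressing outcomes on the logit of the predicted risk is the standard recipe for the calibration slope.

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(2)
      p_hat = rng.uniform(0.05, 0.9, 500)   # model-predicted risks
      y = rng.binomial(1, p_hat)            # outcomes (toy: well calibrated)

      c_statistic = roc_auc_score(y, p_hat)

      # Calibration slope: logistic regression of outcome on logit(p_hat);
      # a slope near 1 means predictions are neither too extreme nor too flat.
      logit = np.log(p_hat / (1.0 - p_hat)).reshape(-1, 1)
      slope = LogisticRegression(C=1e9).fit(logit, y).coef_[0][0]

      print(round(c_statistic, 2), round(slope, 2))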

  4. Modeling the Dynamic Interrelations between Mobility, Utility, and Land Asking Price

    NASA Astrophysics Data System (ADS)

    Hidayat, E.; Rudiarto, I.; Siegert, F.; Vries, W. D.

    2018-02-01

    Limited and insufficient information about the dynamic interrelations among mobility, utility, and land price is the main motivation for this research. Several studies, with several approaches and variables, have so far been conducted to model land price. However, most of these models appear to generate primarily static land prices. Thus, research is required to compare, design, and validate models that calculate and/or compare the inter-relational changes of mobility, utility, and land price. The applied method is a combination of literature review, expert interviews, and statistical analysis. The result is a newly improved mathematical model, which has been validated and is suitable for the case-study location. This improved model consists of 12 appropriate variables and can be implemented in the city of Salatiga, the case-study location, to support better land-use planning and mitigate uncontrolled urban growth.

  5. Three-dimensional surgical simulation improves the planning for correction of facial prognathism and asymmetry: A qualitative and quantitative study

    PubMed Central

    Ho, Cheng-Ting; Lin, Hsiu-Hsia; Liou, Eric J. W.; Lo, Lun-Jou

    2017-01-01

    The traditional planning method for orthognathic surgery has the limitations of cephalometric analysis, especially for patients with asymmetry. The aim of this study was to assess surgical plan modification after 3-dimensional (3D) simulation. The procedure was to perform traditional surgical planning, construction of a 3D model of the initial surgical plan (P1), a 3D model of the altered surgical plan after simulation (P2), comparison between the P1 and P2 models, surgical execution, and postoperative validation using superimposition and the root-mean-square difference (RMSD) between the postoperative 3D image and the P2 simulation model. The surgical plan was modified after 3D simulation in 93% of the cases. Absolute linear changes of landmarks in the mediolateral direction (x-axis) were significant, at between 1.11 and 1.62 mm. The pitch, yaw, and roll rotation as well as the ramus inclination correction also showed significant changes after the 3D planning. Yaw rotation of the maxillomandibular complex (1.88 ± 0.32°) and change of ramus inclination (3.37 ± 3.21°) were most frequently performed for correction of the facial asymmetry. Errors between the postsurgical image and the 3D simulation were acceptable, with RMSD 0.63 ± 0.25 mm for the maxilla and 0.85 ± 0.41 mm for the mandible. The information from this study could be used to augment clinical planning and surgical execution when a conventional approach is applied. PMID:28071714

  6. CMC Research at NASA Glenn in 2016: Recent Progress and Plans

    NASA Technical Reports Server (NTRS)

    Grady, Joseph E.

    2016-01-01

    As part of NASA's Aeronautical Sciences project, Glenn Research Center has developed advanced fiber and matrix constituents for a 2700 degrees Fahrenheit CMC (Ceramic Matrix Composite) for turbine engine applications. Fiber and matrix development and characterization will be reviewed. Resulting improvements in CMC mechanical properties and durability will be summarized. Plans for 2016 will be described, including development and validation of models predicting effects of the engine environment on durability of SiC/SiC composites with Environmental Barrier Coatings (EBCs).

  7. AN OPTIMAL MAINTENANCE MANAGEMENT MODEL FOR AIRPORT CONCRETE PAVEMENT

    NASA Astrophysics Data System (ADS)

    Shimomura, Taizo; Fujimori, Yuji; Kaito, Kiyoyuki; Obama, Kengo; Kobayashi, Kiyoshi

    In this paper, an optimal management model is formulated for the performance-based rehabilitation/maintenance contract for airport concrete pavement, whereby two types of life-cycle cost risk, i.e., ground consolidation risk and concrete depreciation risk, are explicitly considered. A non-homogeneous Markov chain model is formulated to represent the deterioration processes of concrete pavement, which are conditional upon the ground consolidation processes. An optimal non-homogeneous Markov decision model with multiple types of risk is presented to design the optimal rehabilitation/maintenance plans, together with a methodology to revise those plans based upon monitoring data using Bayesian updating rules. The validity of the methodology presented in this paper is examined through case studies carried out for the H airport.
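
    A non-homogeneous Markov chain differs from the familiar homogeneous one only in that the transition matrix depends on time; the three-state pavement example below, with an invented worsening rate standing in for ground-consolidation effects, shows the mechanics.

      import numpy as np

      def transition_matrix(t):
          """Condition states 0 (good) .. 2 (failed); the deterioration rate
          grows with time t, which makes the chain non-homogeneous."""
          p = min(0.05 + 0.01 * t, 0.5)   # per-period worsening probability
          return np.array([[1 - p,     p,   0.0],
                           [0.0,   1 - p,     p],
                           [0.0,     0.0,   1.0]])

      state = np.array([1.0, 0.0, 0.0])    # start in good condition
      for t in range(20):
          state = state @ transition_matrix(t)
      print("state distribution after 20 periods:", state.round(3))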

  8. Using Micro-Synchrophasor Data for Advanced Distribution Grid Planning and Operations Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stewart, Emma; Kiliccote, Sila; McParland, Charles

    2014-07-01

    This report reviews the potential for distribution-grid phase-angle data that will be available from new micro-synchrophasors (µPMUs) to be utilized in existing distribution-grid planning and operations analysis. This data could augment the current diagnostic capabilities of grid analysis software, used in both planning and operations for applications such as fault location, and provide data for more accurate modeling of the distribution system. µPMUs are new distribution-grid sensors that will advance measurement and diagnostic capabilities and provide improved visibility of the distribution grid, enabling analysis of the grid's increasingly complex loads that include features such as large volumes of distributed generation (DG). Large volumes of DG lead to concerns about continued reliable operation of the grid, due to changing power-flow characteristics and active generation with its own protection and control capabilities. Using µPMU data on the change in voltage phase angle between two points, in conjunction with new and existing distribution-grid planning and operational tools, is expected to enable model validation, state estimation, fault location, and renewable resource/load characterization. Our findings include: data measurement is outstripping the processing capabilities of planning and operational tools; not every tool can visualize a voltage phase-angle measurement to the degree of accuracy measured by advanced sensors, and the degree of measurement accuracy required for the distribution grid is not defined; solving methods cannot handle the high volumes of data generated by modern sensors, so new models and solving methods (such as graph trace analysis) are needed; and standardization of sensor-data communications platforms in planning and applications tools would allow integration of different vendors' sensors and advanced measurement devices. In addition, data from advanced sources such as µPMUs could be used to validate models to improve/ensure accuracy, providing information on normally estimated values such as underground conductor impedance, and characterization of complex loads. Although the input of high-fidelity data to existing tools will be challenging, µPMU data on phase angle (as well as other data from advanced sensors) will be useful for basic operational decisions that are based on a trend of changing data.

  9. Quality Control Analysis of Selected Aspects of Programs Administered by the Bureau of Student Financial Assistance. Task 1 and Quality Control Sample; Error-Prone Modeling Analysis Plan.

    ERIC Educational Resources Information Center

    Saavedra, Pedro; And Others

    Parameters and procedures for developing an error-prone model (EPM) to predict financial aid applicants who are likely to misreport on Basic Educational Opportunity Grant (BEOG) applications are introduced. Specifications to adapt these general parameters to secondary data analysis of the Validation, Edits, and Applications Processing Systems…

  10. FY2017 Pilot Project Plan for the Nuclear Energy Knowledge and Validation Center Initiative

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ren, Weiju

    To prepare for technical development of computational code validation under the Nuclear Energy Knowledge and Validation Center (NEKVAC) initiative, several meetings were held by a group of experts from the Idaho National Laboratory (INL) and the Oak Ridge National Laboratory (ORNL) to develop requirements for, and formulate a structure for, a transient fuel database by leveraging existing resources. It was concluded in these meetings that a pilot project is needed to address the most fundamental issues that can generate immediate stimulus for near-future validation developments as well as long-lasting benefits for NEKVAC operation. The present project is proposed based on the consensus of these discussions. Analysis of common scenarios in code validation indicates that the inability to acquire satisfactory validation data is often a showstopper that must be tackled before any confident validation developments can be carried out. Validation data are usually found scattered in different places, most likely with interrelationships among the data not well documented, incomplete with information for some parameters missing, nonexistent, or unrealistic to generate experimentally. Furthermore, with very different technical backgrounds, the modeler, the experimentalist, and the knowledgebase developer that must be involved in validation data development often cannot communicate effectively without a data package template that is representative of the data structure for the information domain of interest to the desired code validation. This pilot project is proposed to use the legendary TREAT Experiments Database to provide core elements for creating an ideal validation data package. Data gaps and missing data interrelationships will be identified from these core elements. All the identified missing elements will then be filled in with experimental data, if available from other existing sources, or with dummy data if nonexistent. The resulting hybrid validation data package (composed of experimental and dummy data) will provide a clear and complete instance delineating the structure of the desired validation data and enabling effective communication among the modeler, the experimentalist, and the knowledgebase developer. With a good common understanding of the desired data structure among the three parties of subject matter experts, further existing-data hunting can be conducted effectively, new experimental data generation can be pursued realistically, the knowledgebase schema can be designed practically, and code validation can be planned confidently.

  11. Adaptation of clinical prediction models for application in local settings.

    PubMed

    Kappen, Teus H; Vergouwe, Yvonne; van Klei, Wilton A; van Wolfswinkel, Leo; Kalkman, Cor J; Moons, Karel G M

    2012-01-01

    When planning to use a validated prediction model in new patients, adequate performance is not guaranteed. For example, changes in clinical practice over time or a different case mix than the original validation population may result in inaccurate risk predictions. To demonstrate how clinical information can direct updating a prediction model and development of a strategy for handling missing predictor values in clinical practice. A previously derived and validated prediction model for postoperative nausea and vomiting was updated using a data set of 1847 patients. The update consisted of 1) changing the definition of an existing predictor, 2) reestimating the regression coefficient of a predictor, and 3) adding a new predictor to the model. The updated model was then validated in a new series of 3822 patients. Furthermore, several imputation models were considered to handle real-time missing values, so that possible missing predictor values could be anticipated during actual model use. Differences in clinical practice between our local population and the original derivation population guided the update strategy of the prediction model. The predictive accuracy of the updated model was better (c statistic, 0.68; calibration slope, 1.0) than the original model (c statistic, 0.62; calibration slope, 0.57). Inclusion of logistical variables in the imputation models, besides observed patient characteristics, contributed to a strategy to deal with missing predictor values at the time of risk calculation. Extensive knowledge of local, clinical processes provides crucial information to guide the process of adapting a prediction model to new clinical practices.
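
    Update step 2, re-estimating coefficients, is often done by logistic recalibration: refit an intercept and slope on the old model's linear predictor in the new cohort. The sketch below uses simulated risks and outcomes to show the mechanics.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(3)
      p_old = rng.uniform(0.05, 0.9, 1000)     # old model's predicted risks
      lp = np.log(p_old / (1 - p_old))         # linear predictor (logit scale)
      # New setting: outcomes are systematically rarer than the old model says.
      y_new = rng.binomial(1, 1 / (1 + np.exp(-(0.7 * lp - 0.5))))

      # Logistic recalibration: refit intercept and slope on the linear predictor.
      recal = LogisticRegression(C=1e9).fit(lp.reshape(-1, 1), y_new)
      print("intercept:", recal.intercept_[0].round(2),
            "slope:", recal.coef_[0][0].round(2))   # ~ -0.5 and ~0.7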

  12. Nursing diagnosis of grieving: content validity in perinatal loss situations.

    PubMed

    Paloma-Castro, Olga; Romero-Sánchez, José Manuel; Paramio-Cuevas, Juan Carlos; Pastor-Montero, Sonia María; Castro-Yuste, Cristina; Frandsen, Anna J; Albar-Marín, María Jesús; Bas-Sarmiento, Pilar; Moreno-Corral, Luis Javier

    2014-06-01

    To validate the content of the NANDA-I nursing diagnosis of grieving in situations of perinatal loss. Using the Fehring's model, 208 Spanish experts were asked to assess the adequacy of the defining characteristics and other manifestations identified in the literature for cases of perinatal loss. The content validity index was 0.867. Twelve of the 18 defining characteristics were validated, seven as major and five as minor. From the manifestations proposed, "empty inside" was considered as major. The nursing diagnosis of grieving fits in content to the cases of perinatal loss according to experts. The results have provided evidence to support the use of the diagnosis in care plans for said clinical situation. © 2013 NANDA International.

  13. Integrated Medical Model (IMM) Project Verification, Validation, and Credibility (VV&C)

    NASA Technical Reports Server (NTRS)

    Walton, M.; Boley, L.; Keenan, L.; Kerstman, E.; Shah, R.; Young, M.; Saile, L.; Garcia, Y.; Meyers, J.; Reyes, D.

    2015-01-01

    The Integrated Medical Model (IMM) Project supports end user requests by employing the Integrated Medical Evidence Database (iMED) and IMM tools as well as subject matter expertise within the Project. The iMED houses data used by the IMM. The IMM is designed to forecast relative changes for a specified set of crew health and mission success risk metrics by using a probabilistic model based on historical data, cohort data, and subject matter expert opinion. A stochastic approach is taken because deterministic results would not appropriately reflect the uncertainty in the IMM inputs. Once the IMM was conceptualized, a plan was needed to rigorously assess input information, framework and code, and output results of the IMM, and ensure that end user requests and requirements were considered during all stages of model development and implementation, as well as lay the foundation for external review and application. METHODS: In 2008, the Project team developed a comprehensive verification and validation (V&V) plan, which specified internal and external review criteria encompassing 1) verification of data and IMM structure to ensure proper implementation of the IMM, 2) several validation techniques to confirm that the simulation capability of the IMM appropriately represents occurrences and consequences of medical conditions during space missions, and 3) credibility processes to develop user confidence in the information derived from the IMM. When the NASA-STD-7009 (7009) [1] was published, the Project team updated their verification, validation, and credibility (VV&C) project plan to meet 7009 requirements and include 7009 tools in reporting VV&C status of the IMM. Construction of these tools included meeting documentation and evidence requirements sufficient to meet external review success criteria. RESULTS: IMM Project VV&C updates are compiled recurrently and include updates to the 7009 Compliance and Credibility matrices. Reporting tools have evolved over the lifetime of the IMM Project to better communicate VV&C status. This has included refining the original 7009 methodology with augmentation from the HRP NASA-STD-7009 Guidance Document working group and the NASA-HDBK-7009 [2]. End user requests and requirements are being satisfied as evidenced by ISS Program acceptance of IMM risk forecasts, transition to an operational model and simulation tool, and completion of service requests from a broad end user consortium including operations, science and technology planning, and exploration planning. IMM v4.0 is slated for operational release in FY2015, and current VV&C assessments illustrate the expected VV&C status prior to the completion of customer-led external review efforts. CONCLUSIONS: The VV&C approach established by the IMM Project of incorporating Project-specific recommended practices and guidelines for implementing the 7009 requirements is comprehensive and includes the involvement of end users at every stage in IMM evolution. Methods and techniques used to quantify the VV&C status of the IMM Project represented a critical communication tool in providing clear and concise suitability assessments to IMM customers. These processes have not only received approval from the local NASA community but have also garnered recognition by other federal agencies seeking to develop similar guidelines in the medical modeling community.

  14. A discrete event simulation tool to support and predict hospital and clinic staffing.

    PubMed

    DeRienzo, Christopher M; Shaw, Ryan J; Meanor, Phillip; Lada, Emily; Ferranti, Jeffrey; Tanaka, David

    2017-06-01

    We demonstrate how to develop a simulation tool to help healthcare managers and administrators predict and plan for staffing needs in a hospital neonatal intensive care unit using administrative data. We developed a discrete event simulation model of nursing staff needed in a neonatal intensive care unit and then validated the model against historical data. The process flow was translated into a discrete event simulation model. Results demonstrated that the model can be used to give a respectable estimate of annual admissions, transfers, and deaths based upon two different staffing levels. The discrete event simulation tool model can provide healthcare managers and administrators with (1) a valid method of modeling patient mix, patient acuity, staffing needs, and costs in the present state and (2) a forecast of how changes in a unit's staffing, referral patterns, or patient mix would affect a unit in a future state.

  15. The Career Futures Inventory-Revised: Measuring Dimensions of Career Adaptability

    ERIC Educational Resources Information Center

    Rottinghaus, Patrick J.; Buelow, Kristine L.; Matyja, Anna; Schneider, Madalyn R.

    2012-01-01

    This study reports the development and initial validation of the "Career Futures Inventory-Revised" (CFI-R) in two large samples of university students. The 28-item CFI-R assesses aspects of career adaptability, including positive career planning attitudes, general outcome expectations, and components of Parsons' tripartite model and…

  16. Influences of personality traits and continuation intentions on physical activity participation within the theory of planned behaviour.

    PubMed

    Chatzisarantis, Nikos L D; Hagger, Martin S

    2008-01-01

    Previous research has suggested that the theory of planned behaviour is insufficient in capturing all the antecedents of physical activity participation and that continuation intentions or personality traits may improve the predictive validity of the model. The present study examined the combined effects of continuation intentions and personality traits on health behaviour within the theory of planned behaviour. To examine these effects, 180 university students (N = 180, Male = 87, Female = 93, Age = 19.14 years, SD = 0.94) completed self-report measures of the theory of planned behaviour, personality traits and continuation intentions. After 5 weeks, perceived achievement of behavioural outcomes and actual participation in physical activities were assessed. Results supported discriminant validity between continuation intentions, conscientiousness and extroversion and indicated that perceived achievement of behavioural outcomes and continuation intentions of failure predicted physical activity participation after controlling for personality effects, past behaviour and other variables in the theory of planned behaviour. In addition, results indicated that conscientiousness moderated the effects of continuation intentions of failure on physical activity such that continuation intentions of failure predicted physical activity participation among conscientious and not among less conscientious individuals. These findings suggest that the effects of continuation intentions on health behaviour are contingent on personality characteristics.

  17. Issues in knowledge representation to support maintainability: A case study in scientific data preparation

    NASA Technical Reports Server (NTRS)

    Chien, Steve; Kandt, R. Kirk; Roden, Joseph; Burleigh, Scott; King, Todd; Joy, Steve

    1992-01-01

    Scientific data preparation is the process of extracting usable scientific data from raw instrument data. This task involves noise detection (and subsequent noise classification and flagging or removal), extracting data from compressed forms, and construction of derivative or aggregate data (e.g. spectral densities or running averages). A software system called PIPE provides intelligent assistance to users developing scientific data preparation plans using a programming language called Master Plumber. PIPE provides this assistance capability by using a process description to create a dependency model of the scientific data preparation plan. This dependency model can then be used to verify syntactic and semantic constraints on processing steps to perform limited plan validation. PIPE also provides capabilities for using this model to assist in debugging faulty data preparation plans. In this case, the process model is used to focus the developer's attention upon those processing steps and data elements that were used in computing the faulty output values. Finally, the dependency model of a plan can be used to perform plan optimization and runtime estimation. These capabilities allow scientists to spend less time developing data preparation procedures and more time on scientific analysis tasks. Because the scientific data processing modules (called fittings) evolve to match scientists' needs, issues regarding maintainability are of prime importance in PIPE. This paper describes the PIPE system and describes how issues in maintainability affected the knowledge representation used in PIPE to capture knowledge about the behavior of fittings.

  18. Modeling and Simulation Plans in Support of Low Cost, Size, Weight, and Power Surveillance Systems for Detecting and Tracking Non-Cooperative Aircraft

    NASA Technical Reports Server (NTRS)

    Wu, Gilbert; Santiago, Confesor

    2017-01-01

    RTCA Special Committee (SC) 228 has initiated a second phase for the development of minimum operational performance standards (MOPS) for UAS detect and avoid (DAA) systems. Technologies to enable UAS with less available Size, Weight, and Power (SWaP) will be considered. RTCA SC-228 has established sub-working groups and one of the sub-working groups is focused on aligning modeling and simulation activities across all participating committee members. This briefing will describe NASA's modeling and simulation plans for the development of performance standards for low cost, size, weight, and power (C-SWaP) surveillance systems that detect and track non-cooperative aircraft. The briefing will also describe the simulation platform NASA intends to use to support end-to-end verification and validation for these DAA systems. Lastly, the briefing will highlight the experiment plan for our first simulation study, and provide a high-level description of our future flight test plans. This briefing does not contain any results or data.

  19. Capacity planning for maternal-fetal medicine using discrete event simulation.

    PubMed

    Ferraro, Nicole M; Reamer, Courtney B; Reynolds, Thomas A; Howell, Lori J; Moldenhauer, Julie S; Day, Theodore Eugene

    2015-07-01

    Maternal-fetal medicine is a rapidly growing field requiring collaboration from many subspecialties. We provide an evidence-based estimate of capacity needs for our clinic, as well as demonstrate how simulation can aid in capacity planning in similar environments. A Discrete Event Simulation of the Center for Fetal Diagnosis and Treatment and Special Delivery Unit at The Children's Hospital of Philadelphia was designed and validated. This model was then used to determine the time until demand overwhelms inpatient bed availability under increasing capacity. No significant deviation was found between historical inpatient censuses and simulated censuses for the validation phase (p = 0.889). Prospectively increasing capacity was found to delay time to balk (the inability of the center to provide bed space for a patient in need of admission). With current capacity, the model predicts mean time to balk of 276 days. Adding three beds delays mean time to first balk to 762 days; an additional six beds to 1,335 days. Providing sufficient access is a patient safety issue, and good planning is crucial for targeting infrastructure investments appropriately. Computer-simulated analysis can provide an evidence base for both medical and administrative decision making in a complex clinical environment.
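
    A stripped-down version of such a capacity simulation conveys the idea: Poisson arrivals, exponential lengths of stay, a fixed number of beds, and time to first balk as the outcome. All rates and capacities below are invented; the published model was built from the center's historical census data.

        # Stripped-down discrete event simulation of inpatient bed demand:
        # Poisson arrivals, exponential lengths of stay, and "time to
        # first balk" as the outcome. All rates are invented.
        import heapq
        import random

        def time_to_first_balk(beds, arrival_rate=2.0, mean_stay=3.0, seed=1):
            rng = random.Random(seed)
            t, occupied = 0.0, 0
            discharges = []                         # min-heap of discharge times
            while True:
                t += rng.expovariate(arrival_rate)  # next arrival
                while discharges and discharges[0] <= t:
                    heapq.heappop(discharges)       # free beds whose stay ended
                    occupied -= 1
                if occupied >= beds:
                    return t                        # first balk
                occupied += 1
                heapq.heappush(discharges, t + rng.expovariate(1.0 / mean_stay))

        # Mean time to first balk over many replications, per capacity level.
        for beds in (8, 11, 14):
            mean_t = sum(time_to_first_balk(beds, seed=s) for s in range(200)) / 200
            print(f"{beds} beds: mean time to first balk = {mean_t:.1f} days")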

  20. Predicted Risk of Radiation-Induced Cancers After Involved Field and Involved Node Radiotherapy With or Without Intensity Modulation for Early-Stage Hodgkin Lymphoma in Female Patients

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weber, Damien C., E-mail: damien.weber@unige.ch; Johanson, Safora; Peguret, Nicolas

    2011-10-01

    Purpose: To assess the excess relative risk (ERR) of radiation-induced cancers (RIC) in female patients with Hodgkin lymphoma (HL) treated with conformal (3DCRT), intensity modulated (IMRT), or volumetric modulated arc (RA) radiation therapy. Methods and Materials: Plans for 10 early-stage HL female patients were computed for 3DCRT, IMRT, and RA with involved field RT (IFRT) and involved-node RT (INRT) radiation fields. Organs at risk dose-volume histograms were computed and inter-compared for IFRT vs. INRT and 3DCRT vs. IMRT/RA, respectively. The ERR for cancer induction in breasts, lungs, and thyroid was estimated using both linear and nonlinear models. Results: The mean estimated ERR for breast, lung, and thyroid were significantly lower (p < 0.01) with INRT than with IFRT planning, regardless of the radiation delivery technique used, assuming a linear dose-risk relationship. We found that using the nonlinear model, the mean ERR values were significantly (p < 0.01) increased with IMRT or RA compared to those with 3DCRT planning for the breast, lung, and thyroid, using an IFRT paradigm. After INRT planning, IMRT or RA increased the risk of RIC for lung and thyroid only. Conclusions: In this comparative planning study, using a nonlinear dose-risk model, IMRT or RA increased the estimated risk of RIC for breast, lung, and thyroid for HL female patients. This study also suggests that INRT planning, compared to IFRT planning, may reduce the ERR of RIC when risk is predicted using a linear model. The opposite effect observed with the nonlinear model, however, calls the validity of these biologically parameterized models into question.
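
    The reversal between the linear and nonlinear conclusions follows from the shape of the dose-risk curve. The sketch below contrasts a linear ERR model with a schematic linear-exponential (bell-shaped) one over two toy dose distributions; the functional forms and coefficients are illustrative only, not the study's fitted models.

        # Schematic contrast of linear vs. nonlinear (linear-exponential)
        # dose-risk models over toy dose distributions. Functional forms
        # and coefficients are illustrative, not the study's fitted models.
        import numpy as np

        dose = np.linspace(0, 40, 200)                 # Gy, bin centres

        def err_linear(d, beta=0.1):
            return beta * d                            # risk keeps rising

        def err_linexp(d, beta=0.1, alpha=0.15):
            return beta * d * np.exp(-alpha * d)       # cell kill bends it over

        # Two toy dose distributions for the same organ: a widespread
        # low-dose bath (IMRT/RA-like) vs. a focal high dose (3DCRT-like).
        bath = np.exp(-((dose - 8) / 4) ** 2)
        focal = 0.4 * np.exp(-((dose - 30) / 4) ** 2)
        for name, w in (("low-dose bath", bath), ("focal high dose", focal)):
            lin = np.sum(w * err_linear(dose)) / np.sum(w)
            nln = np.sum(w * err_linexp(dose)) / np.sum(w)
            print(f"{name}: linear={lin:.2f}, linear-exponential={nln:.3f}")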

  1. Developing and validating risk prediction models in an individual participant data meta-analysis

    PubMed Central

    2014-01-01

    Background Risk prediction models estimate the risk of developing future outcomes for individuals based on one or more underlying characteristics (predictors). We review how researchers develop and validate risk prediction models within an individual participant data (IPD) meta-analysis, in order to assess the feasibility and conduct of the approach. Methods A qualitative review of the aims, methodology, and reporting in 15 articles that developed a risk prediction model using IPD from multiple studies. Results The IPD approach offers many opportunities but methodological challenges exist, including: unavailability of requested IPD, missing patient data and predictors, and between-study heterogeneity in methods of measurement, outcome definitions and predictor effects. Most articles develop their model using IPD from all available studies and perform only an internal validation (on the same set of data). Ten of the 15 articles did not allow for any study differences in baseline risk (intercepts), potentially limiting their model’s applicability and performance in some populations. Only two articles used external validation (on different data), including a novel method which develops the model on all but one of the IPD studies, tests performance in the excluded study, and repeats by rotating the omitted study. Conclusions An IPD meta-analysis offers unique opportunities for risk prediction research. Researchers can make more of this by allowing separate model intercept terms for each study (population) to improve generalisability, and by using ‘internal-external cross-validation’ to simultaneously develop and validate their model. Methodological challenges can be reduced by prospectively planned collaborations that share IPD for risk prediction. PMID:24397587
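
    The 'internal-external cross-validation' recipe is straightforward to operationalize: fit on all studies but one, test on the omitted study, and rotate. The sketch below does this on synthetic IPD with study-specific intercepts, using scikit-learn; all data and dimensions are invented.

        # Sketch of 'internal-external cross-validation' for an IPD
        # meta-analysis: develop the model on all studies but one, test on
        # the omitted study, rotate. Synthetic data; scikit-learn assumed.
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(42)
        studies = []
        for k in range(5):                       # 5 IPD studies
            n = 300
            X = rng.normal(0, 1, (n, 3))         # 3 predictors
            alpha_k = rng.normal(0, 0.7)         # study-specific baseline risk
            logit = alpha_k + X @ np.array([0.8, -0.5, 0.3])
            y = rng.random(n) < 1 / (1 + np.exp(-logit))
            studies.append((X, y.astype(int)))

        for held_out in range(5):
            X_dev = np.vstack([X for i, (X, y) in enumerate(studies) if i != held_out])
            y_dev = np.concatenate([y for i, (X, y) in enumerate(studies) if i != held_out])
            model = LogisticRegression(max_iter=1000).fit(X_dev, y_dev)
            X_val, y_val = studies[held_out]
            auc = roc_auc_score(y_val, model.predict_proba(X_val)[:, 1])
            print(f"held-out study {held_out}: external AUC = {auc:.3f}")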

  2. Validation of Mars-GRAM and Planned New Features

    NASA Technical Reports Server (NTRS)

    Justus, C. G.; Duvall, Aleta; Keller, Vernon W.

    2004-01-01

    For altitudes below 80 km, Mars Global Reference Atmospheric Model (Mars-GRAM 2001) is based on output climatology from NASA Ames Mars General Circulation Model (MGCM). At COSPAR 2002, results were presented of validation tests of Mars-GRAM versus data from Mars Global Surveyor Thermal Emission Spectrometer (TES) and Radio Science (RS) experiment. Further validation tests are presented comparing Mars-GRAM densities with those from the European Mars Climate Database (MCD), and comparing densities from both Mars-GRAM and MCD against TES observations. Throughout most of the height and latitude range of TES data (0-40 km and 70S to 70N), good agreement is found between atmospheric densities from Mars-GRAM and MCD. However, at the season and latitude zone for Mars Phoenix arrival and landing (Ls = 65 to 80 degrees and latitude 65 to 75N), Mars-GRAM densities are about 30 to 45 percent higher than MCD densities near 40 km altitude. Further evaluation is warranted concerning potential impact of these model differences on planning for Phoenix entry and descent. Three planned features for Mars-GRAM update are also discussed: (1) new MGCM and Thermospheric General Circulation Model data sets to be used as a revised basis for Mars-GRAM mean atmosphere, (2) a new feature to represent planetary-scale traveling waves for upper altitude density variations (such as found during Mars Odyssey aerobraking), and (3) a new model for effects of high resolution topographic slope on winds near the surface (0 to 4.5 km above MOLA topography level). Mars-GRAM slope winds will be computed from a diagnostic (algebraic) relationship based on Ye, Segal, and Pielke (1990). This approach differs from mesoscale models (such as MRAMS and Mars MM5), which use prognostic, full-physics solutions of the time- and space-dependent differential equations of motion. As such, slope winds in Mars-GRAM will be consistent with its "engineering-level" approach, and will be extremely fast and easy to evaluate, compared with mesoscale model solutions. Mars-GRAM slope winds are not being suggested as a replacement for sophisticated, full-physics Mars mesoscale models, but may have value, particularly for preliminary screening of large numbers of candidate landing sites for future Mars missions, such as Phoenix and Mars Science Laboratory. Test output is presented from Mars-GRAM slope winds in the area of Gusev Crater and Valles Marineris.

  3. Interactive Care Model: A Framework for More Fully Engaging People in Their Healthcare.

    PubMed

    Drenkard, Karen; Swartwout, Ellen; Deyo, Patricia; O'Neil, Michael B

    2015-10-01

    Transformation of care delivery requires rethinking the relationship between the person and clinician. The model described provides a process to more fully engage patients in their care. Five encounters include assessing capacity for engagement, exchanging information and choices, planning, determining interventions, and evaluating the effectiveness of engagement interventions. The model was created by researchers and validated by experts; implications for practice, education, and policy are explored.

  4. Empirical and Face Validity of Software Maintenance Defect Models Used at the Jet Propulsion Laboratory

    NASA Technical Reports Server (NTRS)

    Taber, William; Port, Dan

    2014-01-01

    At the Mission Design and Navigation Software Group at the Jet Propulsion Laboratory we make use of finite exponential based defect models to aid in maintenance planning and management for our widely used critical systems. However a number of pragmatic issues arise when applying defect models for a post-release system in continuous use. These include: how to utilize information from problem reports rather than testing to drive defect discovery and removal effort, practical model calibration, and alignment of model assumptions with our environment.
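
    A finite exponential defect model posits a fixed pool of latent defects discovered at a rate proportional to those remaining, so cumulative discoveries follow N(t) = a(1 - exp(-b t)). The sketch below fits that curve to invented problem-report counts; in practice the counts would come from the maintenance problem-report log.

        # Fitting a finite exponential defect model, N(t) = a*(1 - exp(-b*t)),
        # to cumulative problem-report counts. The counts below are invented.
        import numpy as np
        from scipy.optimize import curve_fit

        def finite_exponential(t, a, b):
            # a: total latent defects; b: per-defect discovery rate
            return a * (1.0 - np.exp(-b * t))

        months = np.arange(1, 13)
        cum_reports = np.array([9, 16, 22, 27, 31, 34, 36, 38, 39, 40, 41, 41])

        (a, b), _ = curve_fit(finite_exponential, months, cum_reports, p0=(50, 0.1))
        remaining = a - cum_reports[-1]
        print(f"estimated defect pool a = {a:.1f}, rate b = {b:.3f}/month")
        print(f"defects predicted to remain after month 12: {remaining:.1f}")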

  5. Computational Fluid Dynamics and Additive Manufacturing to Diagnose and Treat Cardiovascular Disease.

    PubMed

    Randles, Amanda; Frakes, David H; Leopold, Jane A

    2017-11-01

    Noninvasive engineering models are now being used for diagnosing and planning the treatment of cardiovascular disease. Techniques in computational modeling and additive manufacturing have matured concurrently, and results from simulations can inform and enable the design and optimization of therapeutic devices and treatment strategies. The emerging synergy between large-scale simulations and 3D printing is having a two-fold benefit: first, 3D printing can be used to validate the complex simulations, and second, the flow models can be used to improve treatment planning for cardiovascular disease. In this review, we summarize and discuss recent methods and findings for leveraging advances in both additive manufacturing and patient-specific computational modeling, with an emphasis on new directions in these fields and remaining open questions. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Underwood, Tracy, E-mail: tunderwood@mgh.harvard.edu; Department of Medical Physics and Bioengineering, University College London, London; Giantsoudi, Drosoula

    Purpose: For prostate treatments, robust evidence regarding the superiority of either intensity modulated radiation therapy (IMRT) or proton therapy is currently lacking. In this study we investigated the circumstances under which proton therapy should be expected to outperform IMRT, particularly the proton beam orientations and relative biological effectiveness (RBE) assumptions. Methods and Materials: For 8 patients, 4 treatment planning strategies were considered: (A) IMRT; (B) passively scattered standard bilateral (SB) proton beams; (C) passively scattered anterior oblique (AO) proton beams, and (D) AO intensity modulated proton therapy (IMPT). For modalities (B)-(D) the dose and linear energy transfer (LET) distributions were simulated using the TOPAS Monte Carlo platform and RBE was calculated according to 3 different models. Results: Assuming a fixed RBE of 1.1, our implementation of IMRT outperformed SB proton therapy across most normal tissue metrics. For the scattered AO proton plans, application of the variable RBE models resulted in substantial hotspots in rectal RBE weighted dose. For AO IMPT, it was typically not possible to find a plan that simultaneously met the tumor and rectal constraints for both fixed and variable RBE models. Conclusion: If either a fixed RBE of 1.1 or a variable RBE model could be validated in vivo, then it would always be possible to use AO IMPT to dose-boost the prostate and improve normal tissue sparing relative to IMRT. For a cohort without rectum spacer gels, this study (1) underlines the importance of resolving the question of proton RBE within the framework of an IMRT versus proton debate for the prostate and (2) highlights that without further LET/RBE model validation, great care must be taken if AO proton fields are to be considered for prostate treatments.

  7. A generative model for segmentation of tumor and organs-at-risk for radiation therapy planning of glioblastoma patients

    NASA Astrophysics Data System (ADS)

    Agn, Mikael; Law, Ian; Munck af Rosenschöld, Per; Van Leemput, Koen

    2016-03-01

    We present a fully automated generative method for simultaneous brain tumor and organs-at-risk segmentation in multi-modal magnetic resonance images. The method combines an existing whole-brain segmentation technique with a spatial tumor prior, which uses convolutional restricted Boltzmann machines to model tumor shape. The method is not tuned to any specific imaging protocol and can simultaneously segment the gross tumor volume, peritumoral edema and healthy tissue structures relevant for radiotherapy planning. We validate the method on a manually delineated clinical data set of glioblastoma patients by comparing segmentations of gross tumor volume, brainstem and hippocampus. The preliminary results demonstrate the feasibility of the method.

  8. A Method for Suppressing Line Overload Phenomena Using NAS Battery Systems

    NASA Astrophysics Data System (ADS)

    Ohtaka, Toshiya; Iwamoto, Shinichi

    In this paper, we draw on the superior operating control capability and fast discharge characteristics of NAS battery systems, and propose a method for determining installation planning and operating control schemes of NAS battery systems for suppressing line overload phenomena. At the planning stage, a target contingency is identified, and an optimal allocation and capacity of NAS battery systems and an amount of generation changes are determined for the contingency. At the operation stage, the control strategy of the NAS battery system is determined. Simulations are carried out to verify the validity of the proposed method using the IEEJ 1 machine V system model and an example 2 machine 16 bus system model.

  9. Knowledge, attitudes, and practice behaviors of oncology advanced practice nurses regarding advanced care planning for patients with cancer.

    PubMed

    Zhou, Guiyun; Stoltzfus, Jill C; Houldin, Arlene D; Parks, Susan M; Swan, Beth Ann

    2010-11-01

    To establish initial reliability and validity of a Web-based survey focused on oncology advanced practice nurses' (APNs') knowledge, attitudes, and practice behaviors regarding advanced care planning, and to obtain preliminary understanding of APNs' knowledge, attitudes, and practice behaviors and perceived barriers to advanced care planning. Descriptive, cross-sectional, pilot survey study. The eastern United States. 300 oncology APNs. Guided by the Theory of Planned Behavior, a knowledge, attitudes, and practice behaviors survey was developed and reviewed for content validity. The survey was distributed to 300 APNs via e-mail and sent again to the 89 APNs who responded to the initial survey. Exploratory factor analysis was used to examine the construct validity and test-retest reliability of the survey's attitudinal and practice behavior portions. Respondents' demographics, knowledge, attitudes, practice behaviors, and perceived barriers to advanced care planning practice. Exploratory factor analysis yielded a five-factor solution from the survey's attitudes and practice behavior portions, with internal consistency assessed using Cronbach's alpha. Respondents achieved an average of 67% correct answers in the 12-item knowledge section and scored positively in attitudes toward advanced care planning. Their practice behavior scores were marginally positive. The most commonly reported barriers were patients', families', and physicians' reluctance to discuss advanced care planning. The attitudinal and practice behavior portions of the survey demonstrated preliminary construct validity and test-retest reliability. Regarding advanced care planning, respondents were moderately knowledgeable, but their advanced care planning practice was not routine. Validly assessing oncology APNs' knowledge, attitudes, and practice behaviors regarding advanced care planning will enable more tailored approaches to improve end-of-life care outcomes.
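
    Cronbach's alpha, used above for internal consistency, has a closed form: alpha = k/(k-1) * (1 - sum of item variances / variance of the total score). A sketch on invented Likert-type responses follows.

        # Cronbach's alpha for internal consistency:
        # alpha = k/(k-1) * (1 - sum(item variances) / variance(total score)).
        # The 5-point Likert responses below are invented.
        import numpy as np

        def cronbach_alpha(items):
            """items: (n_respondents, k_items) array of item scores."""
            k = items.shape[1]
            item_vars = items.var(axis=0, ddof=1)
            total_var = items.sum(axis=1).var(ddof=1)
            return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

        rng = np.random.default_rng(7)
        latent = rng.normal(0, 1, 89)                 # one attitude factor
        items = np.clip(np.round(3 + latent[:, None]  # six correlated items
                                 + rng.normal(0, 0.8, (89, 6))), 1, 5)
        print(f"alpha = {cronbach_alpha(items):.2f}")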

  10. SU-D-207-07: Implementation of Full/half Bowtie Filter Model in a Commercial Treatment Planning System for Kilovoltage X-Ray Imaging Dose Estimation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, S; Alaei, P

    2015-06-15

    Purpose: To implement full/half bowtie filter models in a commercial treatment planning system (TPS) to calculate kilovoltage (kV) x-ray imaging dose of the Varian On-Board Imager (OBI) cone beam CT (CBCT) system. Methods: Full/half bowtie filters of the Varian OBI were created as compensator models in the Pinnacle TPS (version 9.6) using Matlab software (version 2011a). The profiles of both bowtie filters were acquired from the manufacturer, imported into the Matlab system and hard coded in binary file format. A Pinnacle script was written to import each bowtie filter data into a Pinnacle treatment plan as a compensator. A kV x-ray beam model without the compensator model was commissioned for each bowtie filter setting based on percent depth dose and lateral profile data acquired from Monte Carlo simulations. To validate the bowtie filter models, a rectangular water phantom was generated in the planning system and an anterior/posterior beam with each bowtie filter was created. Using the Pinnacle script, each bowtie filter compensator was added to the treatment plan. The lateral profile at the depth of 3 cm and the percent depth dose were measured using an ion chamber and compared with the data extracted from the treatment plans. Results: The kV x-ray beams for both full and half bowtie filters have been modeled in a commercial TPS. The differences in lateral and depth dose profiles between dose calculations and ion chamber measurements were within 6%. Conclusion: Both full/half bowtie filter models provide reasonable results in kV x-ray dose calculations in the water phantom. This study demonstrates the possibility of using a model-based treatment planning system to calculate the kV imaging dose for both full and half bowtie filter modes. Further study is to be performed to evaluate the models in clinical situations.

  11. Helium ions at the Heidelberg Ion Beam Therapy Center: comparisons between FLUKA Monte Carlo code predictions and dosimetric measurements

    NASA Astrophysics Data System (ADS)

    Tessonnier, T.; Mairani, A.; Brons, S.; Sala, P.; Cerutti, F.; Ferrari, A.; Haberer, T.; Debus, J.; Parodi, K.

    2017-08-01

    In the field of particle therapy helium ion beams could offer an alternative for radiotherapy treatments, owing to their interesting physical and biological properties intermediate between protons and carbon ions. We present in this work the comparisons and validations of the Monte Carlo FLUKA code against in-depth dosimetric measurements acquired at the Heidelberg Ion Beam Therapy Center (HIT). Depth dose distributions in water with and without ripple filter, lateral profiles at different depths in water and a spread-out Bragg peak were investigated. After experimentally-driven tuning of the less known initial beam characteristics in vacuum (beam lateral size and momentum spread) and simulation parameters (water ionization potential), comparisons of depth dose distributions were performed between simulations and measurements, which showed overall good agreement with range differences below 0.1 mm and dose-weighted average dose-differences below 2.3% throughout the entire energy range. Comparisons of lateral dose profiles showed differences in full-width-half-maximum lower than 0.7 mm. Measurements of the spread-out Bragg peak indicated differences with simulations below 1% in the high dose regions and 3% in all other regions, with a range difference less than 0.5 mm. Despite the promising results, some discrepancies between simulations and measurements were observed, particularly at high energies. These differences were attributed to an underestimation of dose contributions from secondary particles at large angles, as seen in a triple Gaussian parametrization of the lateral profiles along the depth. However, the results allowed us to validate FLUKA simulations against measurements, confirming its suitability for 4He ion beam modeling in preparation of clinical establishment at HIT. Future activities building on this work will include treatment plan comparisons using validated biological models between proton and helium ions, either within a Monte Carlo treatment planning engine based on the same FLUKA code, or an independent analytical planning system fed with a validated database of inputs calculated with FLUKA.
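
    One headline metric above, the full-width-half-maximum of a lateral profile, can be extracted by linear interpolation at the half-maximum crossings. The sketch below uses a synthetic Gaussian profile; real profiles would come from the measurements or the FLUKA scoring grid.

        # Extracting the full-width-half-maximum (FWHM) of a lateral dose
        # profile by linear interpolation at the half-maximum crossings.
        # The Gaussian profile below is synthetic.
        import numpy as np

        def fwhm(x, y):
            half = y.max() / 2.0
            above = np.where(y >= half)[0]
            i, j = above[0], above[-1]
            # Interpolate the left and right half-maximum crossings.
            left = np.interp(half, [y[i - 1], y[i]], [x[i - 1], x[i]])
            right = np.interp(half, [y[j + 1], y[j]], [x[j + 1], x[j]])
            return right - left

        x = np.linspace(-30, 30, 601)                 # mm
        sigma = 5.0
        y = np.exp(-x**2 / (2 * sigma**2))
        print(f"FWHM = {fwhm(x, y):.2f} mm (theory: {2.3548 * sigma:.2f} mm)")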

  12. Helium ions at the Heidelberg Ion Beam Therapy Center: comparisons between FLUKA Monte Carlo code predictions and dosimetric measurements.

    PubMed

    Tessonnier, T; Mairani, A; Brons, S; Sala, P; Cerutti, F; Ferrari, A; Haberer, T; Debus, J; Parodi, K

    2017-08-01

    In the field of particle therapy helium ion beams could offer an alternative for radiotherapy treatments, owing to their interesting physical and biological properties intermediate between protons and carbon ions. We present in this work the comparisons and validations of the Monte Carlo FLUKA code against in-depth dosimetric measurements acquired at the Heidelberg Ion Beam Therapy Center (HIT). Depth dose distributions in water with and without ripple filter, lateral profiles at different depths in water and a spread-out Bragg peak were investigated. After experimentally-driven tuning of the less known initial beam characteristics in vacuum (beam lateral size and momentum spread) and simulation parameters (water ionization potential), comparisons of depth dose distributions were performed between simulations and measurements, which showed overall good agreement with range differences below 0.1 mm and dose-weighted average dose-differences below 2.3% throughout the entire energy range. Comparisons of lateral dose profiles showed differences in full-width-half-maximum lower than 0.7 mm. Measurements of the spread-out Bragg peak indicated differences with simulations below 1% in the high dose regions and 3% in all other regions, with a range difference less than 0.5 mm. Despite the promising results, some discrepancies between simulations and measurements were observed, particularly at high energies. These differences were attributed to an underestimation of dose contributions from secondary particles at large angles, as seen in a triple Gaussian parametrization of the lateral profiles along the depth. However, the results allowed us to validate FLUKA simulations against measurements, confirming its suitability for 4He ion beam modeling in preparation of clinical establishment at HIT. Future activities building on this work will include treatment plan comparisons using validated biological models between proton and helium ions, either within a Monte Carlo treatment planning engine based on the same FLUKA code, or an independent analytical planning system fed with a validated database of inputs calculated with FLUKA.

  13. Maximally Expressive Modeling

    NASA Technical Reports Server (NTRS)

    Jaap, John; Davis, Elizabeth; Richardson, Lea

    2004-01-01

    Planning and scheduling systems organize tasks into a timeline or schedule. Tasks are logically grouped into containers called models. Models are a collection of related tasks, along with their dependencies and requirements, that when met will produce the desired result. One challenging domain for a planning and scheduling system is the operation of on-board experiments for the International Space Station. In these experiments, the equipment used is among the most complex hardware ever developed; the information sought is at the cutting edge of scientific endeavor; and the procedures are intricate and exacting. Scheduling is made more difficult by a scarcity of station resources. The models to be fed into the scheduler must describe both the complexity of the experiments and procedures (to ensure a valid schedule) and the flexibilities of the procedures and the equipment (to effectively utilize available resources). Clearly, scheduling International Space Station experiment operations calls for a maximally expressive modeling schema.

  14. Duchenne Regulatory Science Consortium Meeting on Disease Progression Modeling for Duchenne Muscular Dystrophy.

    PubMed

    Larkindale, Jane; Abresch, Richard; Aviles, Enrique; Bronson, Abby; Chin, Janice; Furlong, Pat; Gordish-Dressman, Heather; Habeeb-Louks, Elizabeth; Henricson, Erik; Kroger, Hans; Lynn, Charles; Lynn, Stephen; Martin, Dana; Nuckolls, Glen; Rooney, William; Romero, Klaus; Sweeney, Lee; Vandenborne, Krista; Walter, Glenn; Wolff, Jodi; Wong, Brenda; McDonald, Craig M; Members of the Duchenne Regulatory Science Consortium, the Imaging-DMD Consortium, and the CINRG Investigators

    2017-01-12

    The Duchenne Regulatory Science Consortium (D-RSC) was established to develop tools to accelerate drug development for DMD. The resulting tools are anticipated to meet validity requirements outlined by qualification/endorsement pathways at both the U.S. Food and Drug Administration (FDA) and European Medicines Agency (EMA), and will be made available to the drug development community. The consortium's initial goal is the development of a disease progression model that can forecast changes in clinically meaningful endpoints, which would inform clinical trial protocol development and data analysis. Methods: In April of 2016 the consortium and other experts met to formulate plans for the development of the model. Conclusions: Here we report the results of the meeting, and discussion as to the form of the model that we plan to move forward to develop, after input from the regulatory authorities.

  15. Validation of design procedure and performance modeling of a heat and fluid transport field experiment in the unsaturated zone

    NASA Astrophysics Data System (ADS)

    Nir, A.; Doughty, C.; Tsang, C. F.

    Validation methods developed in the context of the deterministic concepts of past generations often cannot be directly applied to environmental problems, which may be characterized by limited reproducibility of results and highly complex models. Instead, validation is interpreted here as a series of activities, including both theoretical and experimental tests, designed to enhance our confidence in the capability of a proposed model to describe some aspect of reality. We examine the validation process applied to a project concerned with heat and fluid transport in porous media, in which mathematical modeling, simulation, and results of field experiments are evaluated in order to determine the feasibility of a system for seasonal thermal energy storage in shallow unsaturated soils. Technical details of the field experiments are not included, but appear in previous publications. Validation activities are divided into three stages. The first stage, carried out prior to the field experiments, is concerned with modeling the relevant physical processes, optimization of the heat-exchanger configuration and the shape of the storage volume, and multi-year simulation. Subjects requiring further theoretical and experimental study are identified at this stage. The second stage encompasses the planning and evaluation of the initial field experiment. Simulations are made to determine the experimental time scale and optimal sensor locations. Soil thermal parameters and temperature boundary conditions are estimated using an inverse method. Then results of the experiment are compared with model predictions using different parameter values and modeling approximations. In the third stage, results of an experiment performed under different boundary conditions are compared to predictions made by the models developed in the second stage. Various aspects of this theoretical and experimental field study are described as examples of the verification and validation procedure. There is no attempt to validate a specific model, but several models of increasing complexity are compared with experimental results. The outcome is interpreted as a demonstration of the paradigm proposed by van der Heijde [26], that different constituencies have different objectives for the validation process and therefore their acceptance criteria differ also.

  16. Robotic Billiards: Understanding Humans in Order to Counter Them.

    PubMed

    Nierhoff, Thomas; Leibrandt, Konrad; Lorenz, Tamara; Hirche, Sandra

    2016-08-01

    Ongoing technological advances in the areas of computation, sensing, and mechatronics enable robotic-based systems to interact with humans in the real world. To succeed against a human in a competitive scenario, a robot must anticipate the human behavior and include it in its own planning framework. Then it can predict the next human move and counter it accordingly, thus not only achieving overall better performance but also systematically exploiting the opponent's weak spots. Pool is used as a representative scenario to derive a model-based planning and control framework where not only the physics of the environment but also a model of the opponent is considered. By representing the game of pool as a Markov decision process and incorporating a model of the human decision-making based on studies, an optimized policy is derived. This enables the robot to include the opponent's typical game style into its tactical considerations when planning a stroke. The results are validated in simulations and real-life experiments with an anthropomorphic robot playing pool against a human.
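
    Casting stroke selection as a Markov decision process means each shot is valued by its immediate reward plus the value of the table state it leaves behind, with the modelled opponent reply folded into the transitions. The value-iteration sketch below uses invented abstract states and numbers; the paper's state space and opponent model are far richer.

        # Tiny value-iteration sketch of stroke selection as an MDP.
        # States, actions, and probabilities are invented abstractions.
        STATES = ["easy_layout", "hard_layout", "lost_turn"]
        # P[state][action] -> list of (next_state, prob, reward); the
        # opponent's modelled reply is folded into these transitions.
        P = {
            "easy_layout": {
                "safe_shot":  [("easy_layout", 0.9, 1.0), ("lost_turn", 0.1, 0.0)],
                "risky_shot": [("easy_layout", 0.6, 2.0), ("lost_turn", 0.4, -1.0)],
            },
            "hard_layout": {
                "safe_shot":  [("easy_layout", 0.5, 0.5), ("lost_turn", 0.5, 0.0)],
                "risky_shot": [("easy_layout", 0.3, 2.0), ("lost_turn", 0.7, -1.0)],
            },
            "lost_turn": {},                  # opponent plays; terminal here
        }

        def value_iteration(gamma=0.9, sweeps=100):
            V = {s: 0.0 for s in STATES}
            for _ in range(sweeps):
                for s, actions in P.items():
                    if actions:
                        V[s] = max(
                            sum(p_ * (r + gamma * V[ns]) for ns, p_, r in outs)
                            for outs in actions.values())
            policy = {}
            for s, actions in P.items():
                if actions:
                    policy[s] = max(actions, key=lambda a: sum(
                        p_ * (r + gamma * V[ns]) for ns, p_, r in actions[a]))
            return V, policy

        V, policy = value_iteration()
        print(policy)   # preferred stroke per layout, under these toy numbers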

  17. Review of TRMM/GPM Rainfall Algorithm Validation

    NASA Technical Reports Server (NTRS)

    Smith, Eric A.

    2004-01-01

    A review is presented concerning current progress on evaluation and validation of standard Tropical Rainfall Measuring Mission (TRMM) precipitation retrieval algorithms and the prospects for implementing an improved validation research program for the next generation Global Precipitation Measurement (GPM) Mission. All standard TRMM algorithms are physical in design, and are thus based on fundamental principles of microwave radiative transfer and its interaction with semi-detailed cloud microphysical constituents. They are evaluated for consistency and degree of equivalence with one another, as well as intercompared to radar-retrieved rainfall at TRMM's four main ground validation sites. Similarities and differences are interpreted in the context of the radiative and microphysical assumptions underpinning the algorithms. Results indicate that the current accuracies of the TRMM Version 6 algorithms are approximately 15% at zonal-averaged / monthly scales with precisions of approximately 25% for full resolution / instantaneous rain rate estimates (i.e., level 2 retrievals). Strengths and weaknesses of the TRMM validation approach are summarized. Because the degree of convergence of level 2 TRMM algorithms is being used as a guide for setting validation requirements for the GPM mission, it is important that the GPM algorithm validation program be improved to ensure concomitant improvement in the standard GPM retrieval algorithms. An overview of the GPM Mission's validation plan is provided including a description of a new type of physical validation model using an analytic 3-dimensional radiative transfer model.

  18. 9 CFR 381.22 - Conditions for receiving inspection.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... have conducted a hazard analysis and developed and validated a HACCP plan, in accordance with §§ 417.2... exceed 90 days, during which period the establishment must validate its HACCP plan. (c) Before producing... developed a HACCP plan applicable to that product in accordance with § 417.2 of this chapter. During a...

  19. 9 CFR 381.22 - Conditions for receiving inspection.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... have conducted a hazard analysis and developed and validated a HACCP plan, in accordance with §§ 417.2... exceed 90 days, during which period the establishment must validate its HACCP plan. (c) Before producing... developed a HACCP plan applicable to that product in accordance with § 417.2 of this chapter. During a...

  20. Documenting historical data and accessing it on the World Wide Web

    Treesearch

    Malchus B. Baker; Daniel P. Huebner; Peter F. Ffolliott

    2000-01-01

    New computer technologies facilitate the storage, retrieval, and summarization of watershed-based data sets on the World Wide Web. These data sets are used by researchers when testing and validating predictive models, managers when planning and implementing watershed management practices, educators when learning about hydrologic processes, and decisionmakers when...

  1. Constraint Modeling for Curriculum Planning and Validation

    ERIC Educational Resources Information Center

    Baldoni, Matteo; Baroglio, Cristina; Brunkhorst, Ingo; Henze, Nicola; Marengo, Elisa; Patti, Viviana

    2011-01-01

    Curricula authoring is a complex process, involving different actors and different kinds of knowledge. Learners aim at acquiring expertise about some topic of their own interest, and need to perceive that the curriculum they attend will lead them toward their goal; when this does not happen, they become demotivated. Learners are all different, not…

  2. Effects of Planning Instruction on a Young Writer with Asperger Syndrome

    ERIC Educational Resources Information Center

    Asaro, Kristie; Saddler, Bruce

    2009-01-01

    One validated model for teaching strategies to less skilled writers is the self-regulated strategy development (SRSD) approach. This method has been used to successfully improve the writing of children with learning disabilities and has recently been extended to students with emotional and behavioral disorders and attention-deficit/hyperactivity…

  3. Evaluation of spray drift using low speed wind tunnel measurements and dispersion modeling

    USDA-ARS?s Scientific Manuscript database

    The objective of this work was to evaluate the EPA’s proposed Test Plan for the validation testing of pesticide spray drift reduction technologies (DRTs) for row and field crops, focusing on the evaluation of ground application systems using the low-speed wind tunnel protocols and processing the dat...

  4. School-to-Work Apprenticeship. Project Manual 1993-1995.

    ERIC Educational Resources Information Center

    Lee Coll., Baytown, TX.

    With 1993-94 and 1994-95 Perkins tech prep funds, Lee College, in cooperation with a consortium and local schools, planned, developed, and validated a school-to-work apprenticeship model for tech prep programs. The other educational partners were the Gulf Coast Tech Prep Consortium and nine high schools in eight area school districts. The…

  5. A Web Application for Validating and Disseminating Surface Energy Balance Evapotranspiration Estimates for Hydrologic Modeling Applications

    NASA Astrophysics Data System (ADS)

    Schneider, C. A.; Aggett, G. R.; Nevo, A.; Babel, N. C.; Hattendorf, M. J.

    2008-12-01

    The western United States faces an increasing threat from drought - and the social, economic, and environmental impacts that come with it. The combination of diminished water supplies along with increasing demand for urban and other uses is rapidly depleting surface and ground water reserves traditionally allocated for agricultural use. Quantification of consumptive water use is increasingly important as water resources come under growing pressure from more users and competing interests. Scarce water supplies can be managed more efficiently through use of information and prediction tools accessible via the internet. METRIC (Mapping ET at high Resolution with Internalized Calibration) represents a maturing technology for deriving a remote sensing-based surface energy balance for estimating ET from the earth's surface. This technology has the potential to become widely adopted and used by water resources communities, providing critical support to a host of water decision support tools. ET images created using METRIC or similar remote-sensing-based processing systems could be routinely used as input to operational and planning models for water demand forecasting, reservoir operations, ground-water management, irrigation water supply planning, water rights regulation, and for the improvement, validation, and use of hydrological models. The ET modeling and subsequent validation and distribution of results via the web presented here provide a vehicle through which METRIC ET parameters can be made more accessible to hydrologic modelers. It will enable users of the data to assess the results of the spatially distributed ET modeling and compare them with results from conventional ET estimation methods prior to assimilation in surface and ground water models. In addition, this ET-Server application will provide rapid and transparent access to the data, enabling quantification of uncertainties due to errors in temporal sampling and METRIC modeling, while the GIS-based analytical tools will facilitate quality assessments associated with the selected spatio-temporal scale of interest.

  6. CMC Research at NASA Glenn in 2017: Recent Progress and Plans

    NASA Technical Reports Server (NTRS)

    Grady, Joseph E.

    2017-01-01

    As part of NASA's Aeronautics research mission, Glenn Research Center has developed advanced constituents for 2700°F CMC turbine engine applications. In this presentation, fiber and matrix development and characterization for SiC/SiC composites will be reviewed and resulting improvements in CMC durability and mechanical properties will be summarized. Progress toward the development and validation of models predicting the effects of the engine environment on durability of CMC and Environmental Barrier Coatings will also be summarized, and plans for research and collaborations in 2017 will be outlined.

  7. Testing an explanatory model of nurses' intention to report adverse drug reactions in hospital settings.

    PubMed

    Angelis, Alessia De; Pancani, Luca; Steca, Patrizia; Colaceci, Sofia; Giusti, Angela; Tibaldi, Laura; Alvaro, Rosaria; Ausili, Davide; Vellone, Ercole

    2017-05-01

    To test an explanatory model of nurses' intention to report adverse drug reactions in hospital settings, based on the theory of planned behaviour. Under-reporting of adverse drug reactions is an important problem among nurses. A cross-sectional design was used. Data were collected with the adverse drug reporting nurses' questionnaire. Confirmatory factor analysis was performed to test the factor validity of the adverse drug reporting nurses' questionnaire, and structural equation modelling was used to test the explanatory model. The convenience sample comprised 500 Italian hospital nurses (mean age = 43.52). Confirmatory factor analysis supported the factor validity of the adverse drug reporting nurses' questionnaire. The structural equation modelling showed a good fit with the data. Nurses' intention to report adverse drug reactions was significantly predicted by attitudes, subjective norms and perceived behavioural control (R² = 0.16). The theory of planned behaviour effectively explained the mechanisms behind nurses' intention to report adverse drug reactions, showing how several factors come into play. In a scenario of organisational empowerment towards adverse drug reaction reporting, the major predictors of the intention to report are support for the decision to report adverse drug reactions from other health care practitioners, perceptions about the value of adverse drug reaction reporting and nurses' favourable self-assessment of their adverse drug reaction reporting skills. © 2017 John Wiley & Sons Ltd.

  8. The Fifth Calibration/Data Product Validation Panel Meeting

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The minutes and associated documents prepared from presentations and meetings at the Fifth Calibration/Data Product Validation Panel meeting in Boulder, Colorado, April 8 - 10, 1992, are presented. Key issues include (1) statistical characterization of data sets: finding statistics that characterize key attributes of the data sets, and defining ways to characterize the comparisons among data sets; (2) selection of specific intercomparison exercises: selecting characteristic spatial and temporal regions for intercomparisons, and impact of validation exercises on the logistics of current and planned field campaigns and model runs; and (3) preparation of data sets for intercomparisons: characterization of assumptions, transportable data formats, labeling data files, content of data sets, and data storage and distribution (EOSDIS interface).

  9. Parameter Identification and Uncertainty Analysis for Visual MODFLOW based Groundwater Flow Model in a Small River Basin, Eastern India

    NASA Astrophysics Data System (ADS)

    Jena, S.

    2015-12-01

    The overexploitation of groundwater resulted in abandoning many shallow tube wells in the river basin in Eastern India. For the sustainability of groundwater resources, basin-scale modelling of groundwater flow is essential for the efficient planning and management of the water resources. The main intent of this study is to develop a 3-D groundwater flow model of the study basin using the Visual MODFLOW package and successfully calibrate and validate it using 17 years of observed data. The sensitivity analysis was carried out to quantify the susceptibility of the aquifer system to the river bank seepage, recharge from rainfall and agriculture practices, horizontal and vertical hydraulic conductivities, and specific yield. To quantify the impact of parameter uncertainties, Sequential Uncertainty Fitting Algorithm (SUFI-2) and Markov chain Monte Carlo (MCMC) techniques were implemented. Results from the two techniques were compared and the advantages and disadvantages were analysed. Nash-Sutcliffe coefficient (NSE) and coefficient of determination (R2) were adopted as two criteria during calibration and validation of the developed model. NSE and R2 values of the groundwater flow model for calibration and validation periods were in the acceptable range. Also, the MCMC technique was able to provide more reasonable results than SUFI-2. The calibrated and validated model will be useful to identify the aquifer properties, analyse the groundwater flow dynamics and the change in groundwater levels in future forecasts.
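
    Both calibration criteria above have closed forms: NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2), and R2 is the squared Pearson correlation between observed and simulated values. A sketch with invented groundwater-head series follows.

        # The two calibration criteria used above, in closed form.
        # Groundwater-head series below are invented.
        import numpy as np

        def nse(obs, sim):
            return 1 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

        def r_squared(obs, sim):
            return np.corrcoef(obs, sim)[0, 1] ** 2

        obs = np.array([12.1, 11.8, 11.2, 10.9, 11.4, 12.0, 12.3])  # heads, m
        sim = np.array([12.0, 11.6, 11.4, 11.0, 11.3, 11.8, 12.4])
        print(f"NSE = {nse(obs, sim):.3f}, R2 = {r_squared(obs, sim):.3f}")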

  10. Development and internal validation of a side-specific, multiparametric magnetic resonance imaging-based nomogram for the prediction of extracapsular extension of prostate cancer.

    PubMed

    Martini, Alberto; Gupta, Akriti; Lewis, Sara C; Cumarasamy, Shivaram; Haines, Kenneth G; Briganti, Alberto; Montorsi, Francesco; Tewari, Ashutosh K

    2018-04-19

    To develop a nomogram for predicting side-specific extracapsular extension (ECE) for planning nerve-sparing radical prostatectomy. We retrospectively analysed data from 561 patients who underwent robot-assisted radical prostatectomy between February 2014 and October 2015. To develop a side-specific predictive model, we considered the prostatic lobes separately. Four variables were included: prostate-specific antigen; highest ipsilateral biopsy Gleason grade; highest ipsilateral percentage core involvement; and ECE on multiparametric magnetic resonance imaging (mpMRI). A multivariable logistic regression analysis was fitted to predict side-specific ECE. A nomogram was built based on the coefficients of the logit function. Internal validation was performed using 'leave-one-out' cross-validation. Calibration was graphically investigated. The decision curve analysis was used to evaluate the net clinical benefit. The study population consisted of 829 side-specific cases, after excluding negative biopsy observations (n = 293). ECE was reported on mpMRI and final pathology in 115 (14%) and 142 (17.1%) cases, respectively. Among these, mpMRI was able to predict ECE correctly in 57 (40.1%) cases. All variables in the model except highest percentage core involvement were predictors of ECE (all P ≤ 0.006). All variables were considered for inclusion in the nomogram. After internal validation, the area under the curve was 82.11%. The model demonstrated excellent calibration and improved clinical risk prediction, especially when compared with relying on mpMRI prediction of ECE alone. When the nomogram-derived probability was retrospectively applied with a 20% threshold for performing nerve-sparing surgery, nine out of 14 positive surgical margins (PSMs) at the site of ECE were above the threshold. We developed an easy-to-use model for the prediction of side-specific ECE, and hope it serves as a tool for planning nerve-sparing radical prostatectomy and for reducing PSMs in future series. © 2018 The Authors BJU International © 2018 BJU International Published by John Wiley & Sons Ltd.
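
    The modelling recipe above, a four-predictor logistic model internally validated with leave-one-out cross-validation, can be sketched as follows; the data are synthetic stand-ins, not the study cohort, and scikit-learn is assumed.

        # Sketch of a side-specific ECE logistic model with leave-one-out
        # internal validation. Data are synthetic stand-ins.
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import LeaveOneOut
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(3)
        n = 400                                    # side-specific cases
        X = np.column_stack([
            rng.lognormal(1.8, 0.5, n),            # PSA (ng/ml)
            rng.integers(1, 6, n),                 # ipsilateral Gleason grade
            rng.uniform(0, 100, n),                # % core involvement
            rng.integers(0, 2, n),                 # ECE on mpMRI (0/1)
        ])
        logit = -6 + 0.08 * X[:, 0] + 0.5 * X[:, 1] + 0.01 * X[:, 2] + 1.5 * X[:, 3]
        y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

        preds = np.empty(n)
        for train, test in LeaveOneOut().split(X):
            model = LogisticRegression(max_iter=1000).fit(X[train], y[train])
            preds[test] = model.predict_proba(X[test])[:, 1]
        print(f"leave-one-out AUC = {roc_auc_score(y, preds):.3f}")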

  11. PSOLA: A Heuristic Land-Use Allocation Model Using Patch-Level Operations and Knowledge-Informed Rules.

    PubMed

    Liu, Yaolin; Peng, Jinjin; Jiao, Limin; Liu, Yanfang

    2016-01-01

    Optimizing land-use allocation is important to regional sustainable development, as it promotes the social equality of public services, increases the economic benefits of land-use activities, and reduces the ecological risk of land-use planning. Most land-use optimization models allocate land-use using cell-level operations that fragment land-use patches. These models do not cooperate well with land-use planning knowledge, leading to irrational land-use patterns. This study focuses on building a heuristic land-use allocation model (PSOLA) using particle swarm optimization. The model allocates land-use with patch-level operations to avoid fragmentation. The patch-level operations include a patch-edge operator, a patch-size operator, and a patch-compactness operator that constrain the size and shape of land-use patches. The model is also integrated with knowledge-informed rules to provide auxiliary knowledge of land-use planning during optimization. The knowledge-informed rules consist of suitability, accessibility, land use policy, and stakeholders' preference. To validate the PSOLA model, a case study was performed in Gaoqiao Town in Zhejiang Province, China. The results demonstrate that the PSOLA model outperforms a basic PSO (Particle Swarm Optimization) in terms of social, economic, ecological, and overall benefits by 3.60%, 7.10%, 1.53% and 4.06%, respectively, which confirms the effectiveness of our improvements. Furthermore, the model has an open architecture, enabling its extension as a generic tool to support decision making in land-use planning.
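
    At PSOLA's core is standard particle swarm optimization; the minimal generic PSO below shows the velocity and position updates on a toy objective. The patch-level operators and knowledge-informed rules that distinguish PSOLA are not reproduced here.

        # Minimal generic particle swarm optimization on a toy objective
        # (sphere function); PSOLA's patch operators are not reproduced.
        import numpy as np

        def pso(objective, dim=2, n_particles=30, iters=200, seed=0,
                w=0.7, c1=1.5, c2=1.5):
            rng = np.random.default_rng(seed)
            x = rng.uniform(-5, 5, (n_particles, dim))   # positions
            v = np.zeros_like(x)                         # velocities
            pbest = x.copy()
            pbest_val = np.apply_along_axis(objective, 1, x)
            g = pbest[pbest_val.argmin()].copy()         # global best
            for _ in range(iters):
                r1, r2 = rng.random(x.shape), rng.random(x.shape)
                v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
                x = x + v
                vals = np.apply_along_axis(objective, 1, x)
                better = vals < pbest_val
                pbest[better], pbest_val[better] = x[better], vals[better]
                g = pbest[pbest_val.argmin()].copy()
            return g, pbest_val.min()

        best_x, best_val = pso(lambda p: np.sum(p ** 2))
        print(f"best value {best_val:.2e} at {best_x}")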

  12. PSOLA: A Heuristic Land-Use Allocation Model Using Patch-Level Operations and Knowledge-Informed Rules

    PubMed Central

    Liu, Yaolin; Peng, Jinjin; Jiao, Limin; Liu, Yanfang

    2016-01-01

    Optimizing land-use allocation is important to regional sustainable development, as it promotes the social equality of public services, increases the economic benefits of land-use activities, and reduces the ecological risk of land-use planning. Most land-use optimization models allocate land-use using cell-level operations that fragment land-use patches. These models do not cooperate well with land-use planning knowledge, leading to irrational land-use patterns. This study focuses on building a heuristic land-use allocation model (PSOLA) using particle swarm optimization. The model allocates land-use with patch-level operations to avoid fragmentation. The patch-level operations include a patch-edge operator, a patch-size operator, and a patch-compactness operator that constrain the size and shape of land-use patches. The model is also integrated with knowledge-informed rules to provide auxiliary knowledge of land-use planning during optimization. The knowledge-informed rules consist of suitability, accessibility, land use policy, and stakeholders’ preference. To validate the PSOLA model, a case study was performed in Gaoqiao Town in Zhejiang Province, China. The results demonstrate that the PSOLA model outperforms a basic PSO (Particle Swarm Optimization) in terms of social, economic, ecological, and overall benefits by 3.60%, 7.10%, 1.53% and 4.06%, respectively, which confirms the effectiveness of our improvements. Furthermore, the model has an open architecture, enabling its extension as a generic tool to support decision making in land-use planning. PMID:27322619

  13. Biped Robot Gait Planning Based on 3D Linear Inverted Pendulum Model

    NASA Astrophysics Data System (ADS)

    Yu, Guochen; Zhang, Jiapeng; Bo, Wu

    2018-01-01

    In order to optimize the biped robot’s gait, the biped robot’s walking motion is simplified to the 3D linear inverted pendulum motion model. The Center of Mass (CoM) locus is determined from the relationship between the CoM and the Zero Moment Point (ZMP) locus; the ZMP locus is planned in advance. Then, the forward gait and lateral gait are simplified as connecting-rod structures. The swing-leg trajectory is generated using B-spline interpolation, and the stability of the walking process is discussed in conjunction with the ZMP equation. Finally, the system simulation is carried out under the given conditions to verify the validity of the proposed planning method.
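
    Under the 3D linear inverted pendulum model, the CoM position x and ZMP p are linked by p = x - (z_c/g) * x_ddot, so for a given support point the CoM follows hyperbolic trajectories with the closed form used below. The sketch verifies that the closed form reproduces the planned ZMP; all parameters are invented.

        # LIPM relation between CoM and ZMP: p = x - (z_c/g) * x_ddot.
        # For one support phase the CoM has the closed form
        # x(t) = p + (x0 - p)*cosh(t/Tc) + Tc*v0*sinh(t/Tc), Tc = sqrt(z_c/g).
        # Parameters are invented for illustration.
        import numpy as np

        z_c, g = 0.8, 9.81                       # CoM height (m), gravity
        Tc = np.sqrt(z_c / g)                    # LIPM time constant
        p = 0.0                                  # support (ZMP) at origin
        x0, v0 = -0.05, 0.4                      # initial CoM state
        t = np.linspace(0, 0.4, 5)

        x = p + (x0 - p) * np.cosh(t / Tc) + Tc * v0 * np.sinh(t / Tc)
        xdd = ((x0 - p) * np.cosh(t / Tc) + Tc * v0 * np.sinh(t / Tc)) / Tc**2
        zmp = x - (z_c / g) * xdd                # should equal p everywhere
        print(np.allclose(zmp, p))               # -> True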

  14. Rapid inverse planning for pressure-driven drug infusions in the brain.

    PubMed

    Rosenbluth, Kathryn H; Martin, Alastair J; Mittermeyer, Stephan; Eschermann, Jan; Dickinson, Peter J; Bankiewicz, Krystof S

    2013-01-01

    Infusing drugs directly into the brain is advantageous to oral or intravenous delivery for large molecules or drugs requiring high local concentrations with low off-target exposure. However, surgeons manually planning the cannula position for drug delivery in the brain face a challenging three-dimensional visualization task. This study presents an intuitive inverse-planning technique to identify the optimal placement that maximizes coverage of the target structure while minimizing the potential for leakage outside the target. The technique was retrospectively validated using intraoperative magnetic resonance imaging of infusions into the striatum of non-human primates and into a tumor in a canine model and applied prospectively to upcoming human clinical trials.

  15. Urban water infrastructure optimization to reduce environmental impacts and costs.

    PubMed

    Lim, Seong-Rin; Suh, Sangwon; Kim, Jung-Hoon; Park, Hung Suck

    2010-01-01

    Urban water planning and policy have been focusing on environmentally benign and economically viable water management. The objective of this study is to develop a mathematical model to integrate and optimize urban water infrastructures for supply-side planning and policy: freshwater resources and treated wastewater are allocated to various water demand categories in order to reduce contaminants in the influents supplied for drinking water, and to reduce consumption of the water resources imported from the regions beyond a city boundary. A case study is performed to validate the proposed model. An optimal urban water system of a metropolitan city is calculated on the basis of the model and compared to the existing water system. The integration and optimization decrease (i) average concentrations of the influents supplied for drinking water, which can improve human health and hygiene; (ii) total consumption of water resources, as well as electricity, reducing overall environmental impacts; (iii) life cycle cost; and (iv) water resource dependency on other regions, improving regional water security. This model contributes to sustainable urban water planning and policy. 2009 Elsevier Ltd. All rights reserved.
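
    In its simplest form, such an allocation model is a linear program: route flows from each source to each demand category at minimum cost, subject to supply limits and demand satisfaction. The sketch below uses scipy and invented costs and volumes (including a bar on reclaimed water for drinking), not the paper's formulation.

        # Minimal linear-programming sketch of source-to-demand water
        # allocation. All costs and volumes are invented.
        import numpy as np
        from scipy.optimize import linprog

        sources = ["freshwater", "reclaimed"]          # supplies (ML/day)
        supply = [120.0, 80.0]
        demands = ["drinking", "industrial", "irrigation"]
        need = [70.0, 60.0, 50.0]
        # cost[i][j]: cost of meeting demand j from source i ($k per ML)
        cost = np.array([[1.0, 1.2, 1.3],
                         [np.inf, 0.6, 0.4]])          # no reclaimed -> drinking
        c = np.where(np.isinf(cost), 1e6, cost).ravel()  # big-M forbids that arc

        A_ub = np.kron(np.eye(2), np.ones(3))          # supply limit per source
        A_eq = np.kron(np.ones(2), np.eye(3))          # meet each demand exactly
        res = linprog(c, A_ub=A_ub, b_ub=supply, A_eq=A_eq, b_eq=need,
                      bounds=[(0, None)] * 6)
        flow = res.x.reshape(2, 3)
        for i, s in enumerate(sources):
            for j, d in enumerate(demands):
                if flow[i, j] > 1e-6:
                    print(f"{s:10s} -> {d:10s}: {flow[i, j]:6.1f} ML/day")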

  16. Artificial neural network based gynaecological image-guided adaptive brachytherapy treatment planning correction of intra-fractional organs at risk dose variation.

    PubMed

    Jaberi, Ramin; Siavashpour, Zahra; Aghamiri, Mahmoud Reza; Kirisits, Christian; Ghaderi, Reza

    2017-12-01

    Intra-fractional organs at risk (OARs) deformations can lead to dose variation during image-guided adaptive brachytherapy (IGABT). The aim of this study was to modify the final accepted brachytherapy treatment plan to dosimetrically compensate for these intra-fractional organs-applicators position variations and, at the same time, fulfil the dosimetric criteria. Thirty patients with locally advanced cervical cancer, after external beam radiotherapy (EBRT) of 45-50 Gy over five to six weeks with concomitant weekly chemotherapy, and qualified for intracavitary high-dose-rate (HDR) brachytherapy with tandem-ovoid applicators were selected for this study. A second computed tomography scan was done for each patient after finishing brachytherapy treatment with applicators in situ. Artificial neural network (ANN) based models were used to predict intra-fractional OAR dose-volume histogram parameter variations and propose a new final plan. A model was developed to estimate the intra-fractional organ dose variations during gynaecological intracavitary brachytherapy. Also, ANNs were used to modify the final brachytherapy treatment plan to compensate dosimetrically for changes in 'organs-applicators', while maintaining target dose at the original level. These are semi-automatic, fast-responding models that can be used in the routine clinical workflow to reduce individual IGABT uncertainties. Validation on a larger number of patient plans is needed before the models can serve as a clinical tool.
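
    The regression task described, mapping 'organ-applicator' geometry changes to DVH-parameter shifts, can be sketched with a small multilayer perceptron. Features, target, and data below are invented stand-ins, and scikit-learn's MLP is used in place of the authors' networks.

        # Sketch of the ANN regression task: predict the intra-fractional
        # change in an OAR DVH parameter from simple geometry features.
        # Features, target, and data are invented stand-ins.
        import numpy as np
        from sklearn.neural_network import MLPRegressor
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(11)
        n = 300
        X = np.column_stack([
            rng.normal(0, 3, n),     # applicator shift (mm)
            rng.normal(0, 40, n),    # organ volume change (cm^3)
            rng.uniform(5, 30, n),   # organ-applicator distance (mm)
        ])
        # Invented ground truth: dose change grows with shift, shrinks
        # with distance to the applicator.
        y = 0.3 * X[:, 0] + 0.01 * X[:, 1] - 0.05 * X[:, 2] + rng.normal(0, 0.2, n)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        model = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=5000,
                             random_state=0).fit(X_tr, y_tr)
        print(f"held-out R^2 = {model.score(X_te, y_te):.2f}")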

  17. Automatic tissue segmentation of head and neck MR images for hyperthermia treatment planning

    NASA Astrophysics Data System (ADS)

    Fortunati, Valerio; Verhaart, René F.; Niessen, Wiro J.; Veenland, Jifke F.; Paulides, Margarethus M.; van Walsum, Theo

    2015-08-01

    A hyperthermia treatment requires accurate, patient-specific treatment planning. This planning is based on 3D anatomical models which are generally derived from computed tomography. Because of its superior soft tissue contrast, magnetic resonance imaging (MRI) information can be introduced to improve the quality of these 3D patient models and therefore the treatment planning itself. Thus, we present here an automatic atlas-based segmentation algorithm for MR images of the head and neck. Our method combines multiatlas local weighting fusion with intensity modelling. The accuracy of the method was evaluated using a leave-one-out cross validation experiment over a set of 11 patients for which manual delineation were available. The accuracy of the proposed method was high both in terms of the Dice similarity coefficient (DSC) and the 95th percentile Hausdorff surface distance (HSD) with median DSC higher than 0.8 for all tissues except sclera. For all tissues, except the spine tissues, the accuracy was approaching the interobserver agreement/variability both in terms of DSC and HSD. The positive effect of adding the intensity modelling to the multiatlas fusion decreased when a more accurate atlas fusion method was used. Using the proposed approach we improved the performance of the approach previously presented for H&N hyperthermia treatment planning, making the method suitable for clinical application.
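
    The Dice similarity coefficient reported above is computed directly from binary masks as DSC = 2|A ∩ B| / (|A| + |B|). A sketch on toy 3D masks follows; the 95th-percentile Hausdorff surface distance needs surface extraction and is omitted.

        # Dice similarity coefficient on binary masks:
        # DSC = 2|A ∩ B| / (|A| + |B|). Toy 3D masks for illustration.
        import numpy as np

        def dice(a, b):
            a, b = a.astype(bool), b.astype(bool)
            denom = a.sum() + b.sum()
            return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

        # Two overlapping spheres on a small voxel grid stand in for an
        # automatic and a manual delineation of the same structure.
        zz, yy, xx = np.mgrid[0:40, 0:40, 0:40]
        auto = (xx - 20) ** 2 + (yy - 20) ** 2 + (zz - 20) ** 2 <= 8 ** 2
        manual = (xx - 22) ** 2 + (yy - 20) ** 2 + (zz - 20) ** 2 <= 8 ** 2
        print(f"DSC = {dice(auto, manual):.3f}")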

  18. A virtual source model for Monte Carlo simulation of helical tomotherapy.

    PubMed

    Yuan, Jiankui; Rong, Yi; Chen, Quan

    2015-01-08

    The purpose of this study was to present a Monte Carlo (MC) simulation method based on a virtual source, jaw, and MLC model to calculate dose in the patient for helical tomotherapy without the need of calculating phase-space files (PSFs). Current studies on tomotherapy MC simulation adopt a full MC model, which includes extensive modeling of the radiation source, primary and secondary jaws, and multileaf collimator (MLC). In the full MC model, PSFs need to be created at different scoring planes to facilitate the patient dose calculations. In the present work, the virtual source model (VSM) we established was based on the gold standard beam data of a tomotherapy unit, which can be exported from the treatment planning station (TPS). The TPS-generated sinograms were extracted from the archived patient XML (eXtensible Markup Language) files. The fluence map for the MC sampling was created by incorporating the percentage leaf open time (LOT) with the leaf filter, jaw penumbra, and leaf latency obtained from the sinogram files. The VSM was validated for various geometry setups and clinical situations involving heterogeneous media and delivery quality assurance (DQA) cases. An agreement of < 1% was obtained between the measured and simulated results for percent depth doses (PDDs) and open beam profiles for all three jaw settings in the VSM commissioning. The accuracy of the VSM leaf filter model was verified by comparing the measured and simulated results for a Picket Fence pattern. An agreement of < 2% was achieved between the presented VSM and a published full MC model for heterogeneous phantoms. For complex clinical head and neck (HN) cases, the VSM-based MC simulation of DQA plans agreed with the film measurement, with 98% of planar dose pixels passing the 2%/2 mm gamma criteria. For patient treatment plans, results showed comparable dose-volume histograms (DVHs) for planning target volumes (PTVs) and organs at risk (OARs). Deviations observed in this study were consistent with the literature. The VSM-based MC simulation approach can be feasibly built from the gold standard beam model of a tomotherapy unit. The accuracy of the VSM was validated against measurements in homogeneous media, as well as against a published full MC model in heterogeneous media.

  19. A virtual source model for Monte Carlo simulation of helical tomotherapy

    PubMed Central

    Yuan, Jiankui; Rong, Yi

    2015-01-01

    The purpose of this study was to present a Monte Carlo (MC) simulation method based on a virtual source, jaw, and MLC model to calculate dose in the patient for helical tomotherapy without the need of calculating phase-space files (PSFs). Current studies on tomotherapy MC simulation adopt a full MC model, which includes extensive modeling of the radiation source, primary and secondary jaws, and multileaf collimator (MLC). In the full MC model, PSFs need to be created at different scoring planes to facilitate the patient dose calculations. In the present work, the virtual source model (VSM) we established was based on the gold standard beam data of a tomotherapy unit, which can be exported from the treatment planning station (TPS). The TPS-generated sinograms were extracted from the archived patient XML (eXtensible Markup Language) files. The fluence map for the MC sampling was created by incorporating the percentage leaf open time (LOT) with the leaf filter, jaw penumbra, and leaf latency obtained from the sinogram files. The VSM was validated for various geometry setups and clinical situations involving heterogeneous media and delivery quality assurance (DQA) cases. An agreement of <1% was obtained between the measured and simulated results for percent depth doses (PDDs) and open beam profiles for all three jaw settings in the VSM commissioning. The accuracy of the VSM leaf filter model was verified by comparing the measured and simulated results for a Picket Fence pattern. An agreement of <2% was achieved between the presented VSM and a published full MC model for heterogeneous phantoms. For complex clinical head and neck (HN) cases, the VSM-based MC simulation of DQA plans agreed with the film measurement, with 98% of planar dose pixels passing the 2%/2 mm gamma criteria. For patient treatment plans, results showed comparable dose-volume histograms (DVHs) for planning target volumes (PTVs) and organs at risk (OARs). Deviations observed in this study were consistent with the literature. The VSM-based MC simulation approach can be feasibly built from the gold standard beam model of a tomotherapy unit. The accuracy of the VSM was validated against measurements in homogeneous media, as well as against a published full MC model in heterogeneous media. PACS numbers: 87.53.-j, 87.55.K- PMID:25679157

  20. An image-guided planning system for endosseous oral implants.

    PubMed

    Verstreken, K; Van Cleynenbreugel, J; Martens, K; Marchal, G; van Steenberghe, D; Suetens, P

    1998-10-01

    A preoperative planning system for oral implant surgery was developed which takes as input computed tomographies (CTs) of the jaws. Two-dimensional (2-D) reslices of these axial CT slices orthogonal to a curve following the jaw arch are computed and shown together with three-dimensional (3-D) surface rendered models of the bone and computer-aided design (CAD)-like implant models. A technique is developed for scanning and visualizing any existing removable prosthesis together with the bone structures. Evaluation of the planning done with the system shows a difference between 2-D and 3-D planning methods. Validation studies measure the benefits of the 3-D approach by comparing plans made in 2-D mode only with those further adjusted using the full 3-D visualization capabilities of the system. The benefits of a 3-D approach are especially evident where a prosthesis is involved in the planning. For the majority of the patients, clinically important adjustments and optimizations to the 2-D plans are made once the 3-D visualization is enabled, effectively resulting in a better plan. The alterations are related to bone quality and quantity (p < 0.05), biomechanics (p < 0.005), and esthetics (p < 0.005), and are so obvious that the 3-D plan stands out clearly (p < 0.005). The improvements often avoid complications such as mandibular nerve damage, sinus perforations, fenestrations, or dehiscences.

  1. Damage modeling and statistical analysis of optics damage performance in MJ-class laser systems.

    PubMed

    Liao, Zhi M; Raymond, B; Gaylord, J; Fallejo, R; Bude, J; Wegner, P

    2014-11-17

    Modeling the lifetime of a fused silica optic is described for a multiple beam, MJ-class laser system. This entails combining optic processing data with laser shot data to account for the complete history of optic processing and shot exposure. Integrating with online inspection data allows for the construction of a performance metric to describe how an optic performs with respect to the model. This methodology helps to validate the damage model, enables strategic planning, and identifies potential hidden parameters that affect the optic's performance.

  2. Predicting High Health Care Resource Utilization in a Single-payer Public Health Care System: Development and Validation of the High Resource User Population Risk Tool (HRUPoRT).

    PubMed

    Rosella, Laura C; Kornas, Kathy; Yao, Zhan; Manuel, Douglas G; Bornbaum, Catherine; Fransoo, Randall; Stukel, Therese

    2017-11-17

    A large proportion of health care spending is incurred by a small proportion of the population. Population-based health planning tools that consider both the clinical and upstream determinants of high resource users (HRU) of the health system are lacking. To develop and validate the High Resource User Population Risk Tool (HRUPoRT), a predictive model of which adults will become the top 5% of health care users over a 5-year period, based on self-reported clinical, sociodemographic, and health behavioral predictors in population survey data. The HRUPoRT model was developed in a prospective cohort design using the combined 2005 and 2007/2008 Canadian Community Health Surveys (CCHS) (N=58,617), and validated using the external 2009/2010 CCHS cohort (N=28,721). Health care utilization was determined for each of the 5 years following the CCHS interview date by applying a person-centered costing algorithm to the linked health administrative databases. Discrimination and calibration of the model were assessed using the c-statistic and the Hosmer-Lemeshow (HL) χ² statistic. The best prediction model for 5-year transition to HRU status included 12 predictors and had good discrimination (c-statistic=0.8213) and calibration (HL χ²=18.71) in the development cohort. The model performed similarly in the validation cohort (c-statistic=0.8171; HL χ²=19.95). The strongest predictors in the HRUPoRT model were age, perceived general health, and body mass index. HRUPoRT can accurately project the proportion of individuals in the population that will become an HRU over 5 years. HRUPoRT can be applied to inform health resource planning and prevention strategies at the community level.
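
    A rough sketch of the calibration statistic reported here, a Hosmer-Lemeshow χ² over risk-ordered groups, using hypothetical predicted risks and outcomes rather than the CCHS data (the grouping and tolerance term are illustrative simplifications):

    ```python
    import numpy as np
    from scipy.stats import chi2

    def hosmer_lemeshow(y_true, y_prob, n_groups=10):
        """Hosmer-Lemeshow chi-square over risk-ordered groups (illustrative)."""
        order = np.argsort(y_prob)
        stat = 0.0
        for g in np.array_split(order, n_groups):
            n, pi = len(g), y_prob[g].mean()
            obs, exp = y_true[g].sum(), n * pi
            stat += (obs - exp) ** 2 / (exp * (1 - pi) + 1e-12)
        return stat, chi2.sf(stat, df=n_groups - 2)

    rng = np.random.default_rng(0)
    risk = rng.uniform(0.01, 0.30, size=5000)    # hypothetical predicted 5-year HRU risks
    outcome = rng.binomial(1, risk)              # outcomes consistent with those risks
    stat, p = hosmer_lemeshow(outcome, risk)
    print(f"HL chi2 = {stat:.2f}, p = {p:.3f}")  # well calibrated -> non-significant
    ```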

  3. Benchmarking and validation of a Geant4-SHADOW Monte Carlo simulation for dose calculations in microbeam radiation therapy.

    PubMed

    Cornelius, Iwan; Guatelli, Susanna; Fournier, Pauline; Crosbie, Jeffrey C; Sanchez Del Rio, Manuel; Bräuer-Krisch, Elke; Rosenfeld, Anatoly; Lerch, Michael

    2014-05-01

    Microbeam radiation therapy (MRT) is a synchrotron-based radiotherapy modality that uses high-intensity beams of spatially fractionated radiation to treat tumours. The rapid evolution of MRT towards clinical trials demands accurate treatment planning systems (TPS), as well as independent tools for the verification of TPS calculated dose distributions in order to ensure patient safety and treatment efficacy. Monte Carlo computer simulation represents the most accurate method of dose calculation in patient geometries and is best suited for the purpose of TPS verification. A Monte Carlo model of the ID17 biomedical beamline at the European Synchrotron Radiation Facility has been developed, including recent modifications, using the Geant4 Monte Carlo toolkit interfaced with the SHADOW X-ray optics and ray-tracing libraries. The code was benchmarked by simulating dose profiles in water-equivalent phantoms subject to irradiation by broad-beam (without spatial fractionation) and microbeam (with spatial fractionation) fields, and comparing against those calculated with a previous model of the beamline developed using the PENELOPE code. Validation against additional experimental dose profiles in water-equivalent phantoms subject to broad-beam irradiation was also performed. Good agreement between codes was observed, with the exception of out-of-field doses and doses toward the field edge for larger field sizes. Microbeam results showed good agreement between both codes and experimental results within uncertainties. Results of the experimental validation showed agreement for different beamline configurations. The asymmetry in the out-of-field dose profiles due to polarization effects was also investigated, yielding important information for the treatment planning process in MRT. This work represents an important step in the development of a Monte Carlo-based independent verification tool for treatment planning in MRT.

  4. A computational method for estimating the dosimetric effect of intra-fraction motion on step-and-shoot IMRT and compensator plans

    NASA Astrophysics Data System (ADS)

    Waghorn, Ben J.; Shah, Amish P.; Ngwa, Wilfred; Meeks, Sanford L.; Moore, Joseph A.; Siebers, Jeffrey V.; Langen, Katja M.

    2010-07-01

    Intra-fraction organ motion during intensity-modulated radiation therapy (IMRT) treatment can cause differences between the planned and the delivered dose distribution. To investigate the extent of these dosimetric changes, a computational model was developed and validated. The computational method allows for calculation of the rigid-motion-perturbed three-dimensional dose distribution in the CT volume and therefore a dose volume histogram-based assessment of the dosimetric impact of intra-fraction motion on a rigidly moving body. The method was developed and validated for both step-and-shoot IMRT and solid compensator IMRT treatment plans. For each segment (or beam), fluence maps were exported from the treatment planning system. Fluence maps were shifted according to the target position deduced from a motion track. These shifted, motion-encoded fluence maps were then re-imported into the treatment planning system and used to calculate the motion-encoded dose distribution. To validate the accuracy of the motion-encoded dose distribution, the treatment plan was delivered to a moving cylindrical phantom using a programmed four-dimensional motion phantom. Extended dose response (EDR-2) film was used to measure a planar dose distribution for comparison with the calculated motion-encoded distribution using a gamma index analysis (3% dose difference, 3 mm distance-to-agreement). A series of motion tracks incorporating both inter-beam step-function shifts and continuous sinusoidal motion was tested. The method was shown to accurately predict the film's dose distribution for all of the tested motion tracks, for both the step-and-shoot IMRT and compensator plans. The average gamma analysis pass rate for the measured dose distribution with respect to the calculated motion-encoded distribution was 98.3 ± 0.7%. For static delivery the average film-to-calculation pass rate was 98.7 ± 0.2%. In summary, a computational technique has been developed to calculate the dosimetric effect of intra-fraction motion. This technique has the potential to evaluate a given plan's sensitivity to anticipated organ motion. With knowledge of the organ's motion it can also be used as a tool to assess the impact of measured intra-fraction motion after dose delivery.
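
    The core fluence-shifting step can be pictured with a minimal sketch; the array sizes, pixel spacing, sign convention, and toy motion track are all assumptions, not the authors' implementation:

    ```python
    import numpy as np
    from scipy.ndimage import shift as nd_shift

    def motion_encode(fluence: np.ndarray, track_mm, pixel_mm=1.0):
        """Average fluence maps shifted per sampled target displacement.

        Assumes target motion (dx, dy) in mm maps to an opposite shift of the
        fluence in beam's-eye view; the sign convention is an assumption here.
        """
        acc = np.zeros_like(fluence, dtype=float)
        for dx, dy in track_mm:
            acc += nd_shift(fluence, (-dy / pixel_mm, -dx / pixel_mm),
                            order=1, cval=0.0)
        return acc / len(track_mm)

    fluence = np.zeros((64, 64))
    fluence[24:40, 24:40] = 1.0                   # toy open segment
    track = [(0.0, 0.0), (3.0, 0.0), (5.0, 2.0)]  # sampled target positions (mm)
    blurred = motion_encode(fluence, track)       # motion-encoded fluence map
    ```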

  5. Models Required to Mitigate Impacts of Space Weather on Space Systems

    NASA Technical Reports Server (NTRS)

    Barth, Janet L.

    2003-01-01

    This viewgraph presentation attempts to develop a model of the factors which need to be considered in the design and construction of spacecraft to lessen the effects of space weather on these vehicles. Topics considered include: space environments and effects, radiation environments and effects, space weather drivers, space weather models, climate models, solar proton activity, and mission design for the GOES mission. The authors conclude that space environment models need to address issues from mission planning through operations, and that a program to develop and validate authoritative space environment models for application to spacecraft design does not exist at this time.

  6. Planning Under Continuous Time and Resource Uncertainty: A Challenge for AI

    NASA Technical Reports Server (NTRS)

    Bresina, John; Dearden, Richard; Meuleau, Nicolas; Smith, David; Washington, Rich; Clancy, Daniel (Technical Monitor)

    2002-01-01

    There has been considerable work in AI on decision-theoretic planning and planning under uncertainty. Unfortunately, all of this work suffers from one or more of the following limitations: 1) it relies on very simple models of actions and time, 2) it assumes that uncertainty is manifested in discrete action outcomes, and 3) it is only practical for very small problems. For many real world problems, these assumptions fail to hold. A case in point is planning the activities for a Mars rover. For this domain none of the above assumptions are valid: 1) actions can be concurrent and have differing durations, 2) there is uncertainty concerning action durations and consumption of continuous resources like power, and 3) typical daily plans involve on the order of a hundred actions. We describe the rover problem, discuss previous work on planning under uncertainty, and present a detailed, but very small, example illustrating some of the difficulties of finding good plans.

  7. Multiyear Plan for Validation of EnergyPlus Multi-Zone HVAC System Modeling using ORNL's Flexible Research Platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Im, Piljae; Bhandari, Mahabir S.; New, Joshua Ryan

    This document describes the Oak Ridge National Laboratory (ORNL) multiyear experimental plan for validation and uncertainty characterization of whole-building energy simulation for a multi-zone research facility using a traditional rooftop unit (RTU) as a baseline heating, ventilating, and air conditioning (HVAC) system. The project’s overarching objective is to increase the accuracy of energy simulation tools by enabling empirical validation of key inputs and algorithms. Doing so is required to inform the design of increasingly integrated building systems and to enable accountability for performance gaps between design and operation of a building. The project will produce documented data sets that can be used to validate key functionality in different energy simulation tools and to identify errors and inadequate assumptions in simulation engines so that developers can correct them. ASHRAE Standard 140, Method of Test for the Evaluation of Building Energy Analysis Computer Programs (ASHRAE 2004), currently consists primarily of tests to compare different simulation programs with one another. This project will generate sets of measured data to enable empirical validation, incorporate these test data sets in an extended version of Standard 140, and apply these tests to the Department of Energy’s (DOE) EnergyPlus software (EnergyPlus 2016) to initiate the correction of any significant deficiencies. The fitness-for-purpose of the key algorithms in EnergyPlus will be established and demonstrated, and vendors of other simulation programs will be able to demonstrate the validity of their products. The data set will be equally applicable to validation of other simulation engines as well.

  8. Use of a vision model to quantify the significance of factors effecting target conspicuity

    NASA Astrophysics Data System (ADS)

    Gilmore, M. A.; Jones, C. K.; Haynes, A. W.; Tolhurst, D. J.; To, M.; Troscianko, T.; Lovell, P. G.; Parraga, C. A.; Pickavance, K.

    2006-05-01

    When designing camouflage it is important to understand how the human visual system processes the information to discriminate the target from the background scene. A vision model has been developed to compare two images and detect differences in local contrast in each spatial frequency channel. Observer experiments are being undertaken to validate this vision model so that the model can be used to quantify the relative significance of different factors affecting target conspicuity. Synthetic imagery can be used to design improved camouflage systems. The vision model is being used to compare different synthetic images to understand what features in the image are important to reproduce accurately and to identify the optimum way to render synthetic imagery for camouflage effectiveness assessment. This paper will describe the vision model and summarise the results obtained from the initial validation tests. The paper will also show how the model is being used to compare different synthetic images and discuss future work plans.

  9. Calibration of a stochastic health evolution model using NHIS data

    NASA Astrophysics Data System (ADS)

    Gupta, Aparna; Li, Zhisheng

    2011-10-01

    This paper presents and calibrates an individual's stochastic health evolution model. In this health evolution model, the uncertainty of health incidents is described by a stochastic process with a finite number of possible outcomes. We construct a comprehensive health status index (HSI) to describe an individual's health status, as well as a health risk factor system (RFS) to classify individuals into different risk groups. Based on the maximum likelihood estimation (MLE) method and the method of nonlinear least squares fitting, model calibration is formulated in terms of two mixed-integer nonlinear optimization problems. Using the National Health Interview Survey (NHIS) data, the model is calibrated for specific risk groups. Longitudinal data from the Health and Retirement Study (HRS) is used to validate the calibrated model, which displays good validation properties. The end goal of this paper is to provide a model and methodology, whose output can serve as a crucial component of decision support for strategic planning of health related financing and risk management.
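
    A drastically simplified sketch of maximum-likelihood calibration for a finite-outcome health process; the paper's HSI, RFS, and mixed-integer formulation are not reproduced, and the states and trajectories below are purely hypothetical:

    ```python
    import numpy as np

    def mle_transition_matrix(sequences, n_states):
        """Maximum-likelihood transition probabilities for a finite-state process.

        For a Markov chain with multinomial transitions, the MLE is simply the
        normalized transition counts.
        """
        counts = np.zeros((n_states, n_states))
        for seq in sequences:
            for s, t in zip(seq[:-1], seq[1:]):
                counts[s, t] += 1
        row_sums = counts.sum(axis=1, keepdims=True)
        return np.divide(counts, row_sums,
                         out=np.zeros_like(counts), where=row_sums > 0)

    # Hypothetical yearly health-state trajectories (0=healthy, 1=ill, 2=severe)
    sequences = [[0, 0, 1, 0], [0, 1, 2, 2], [0, 0, 0, 1]]
    print(mle_transition_matrix(sequences, n_states=3))
    ```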

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tanny, S; Bogue, J; Parsai, E

    Purpose: Potential collisions between the gantry head and the patient or table assembly are difficult to detect in most treatment planning systems. We have developed and implemented a novel software package for the representation of potential gantry collisions with the couch assembly at the time of treatment planning. Methods: Physical dimensions of the Varian Edge linear accelerator treatment head were measured and reproduced using the Visual Python display package. A script was developed for the Pinnacle treatment planning system to generate a file with the relevant couch, gantry, and isocenter positions for each beam in a planning trial. A Python program was developed to parse the information from the TPS and produce a representative model of the couch/gantry system. Using the model and the Visual Python libraries, a rendering window is generated for each beam that allows the planner to evaluate the possibility of a collision. Results: Comparison against heuristic methods and direct verification on the machine validated the collision model generated by the software. Encounters of <1 cm between the gantry treatment head and table were visualized as collisions in our virtual model. Visual windows were created depicting the angle of collision for each beam, including the anticipated table coordinates. Visual rendering of a 6 arc trial with multiple couch positions was completed in under 1 minute, with network bandwidth being the primary bottleneck. Conclusion: The developed software allows for quick examination of possible collisions during the treatment planning process and helps to prevent major collisions prior to plan approval. The software can easily be implemented on future planning systems due to the versatility and platform independence of the Python programming language. Further integration of the software with the treatment planning system will allow the possibility of patient-gantry collision detection for a range of treatment machines.
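
    The underlying geometric test can be sketched in a few lines of Python; the point clouds, dimensions, rotation convention, and the 1 cm threshold below are illustrative stand-ins, not the package's actual geometry:

    ```python
    import numpy as np

    def min_clearance(gantry_pts: np.ndarray, couch_pts: np.ndarray) -> float:
        """Smallest distance (cm) between sampled gantry-head and couch surface points."""
        diff = gantry_pts[:, None, :] - couch_pts[None, :, :]
        return float(np.sqrt((diff ** 2).sum(axis=2)).min())

    def rotate_about_iso(points, gantry_deg, isocenter):
        """Rotate points about the isocenter's y-axis to a given gantry angle."""
        t = np.radians(gantry_deg)
        rot = np.array([[np.cos(t), 0, np.sin(t)],
                        [0, 1, 0],
                        [-np.sin(t), 0, np.cos(t)]])
        return (points - isocenter) @ rot.T + isocenter

    # Hypothetical coarse point clouds (cm) for one beam's geometry
    rng = np.random.default_rng(0)
    gantry_head = rotate_about_iso(rng.random((200, 3)) * 10 + [0, 0, 40],
                                   180.0, np.zeros(3))
    couch_top = rng.random((200, 3)) * [50, 20, 2] - [25, 10, 12]
    if min_clearance(gantry_head, couch_top) < 1.0:  # flag encounters under 1 cm
        print("potential collision for this beam")
    ```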

  11. The development and validation of the advance care planning questionnaire in Malaysia.

    PubMed

    Lai, Pauline Siew Mei; Mohd Mudri, Salinah; Chinna, Karuthan; Othman, Sajaratulnisah

    2016-10-18

    Advance care planning is a voluntary process whereby individual preferences, values and beliefs are used to aid a person in planning for end-of-life care. Currently, there is no local instrument to assess an individual's awareness of and attitude towards advance care planning. This study aimed to develop an Advance Care Planning Questionnaire and to determine its validity and reliability among older people in Malaysia. The Advance Care Planning Questionnaire was developed based on a literature review. Face and content validity were verified by an expert panel, and the questionnaire was piloted among 15 participants. Our study was conducted from October 2013 to February 2014, at an urban primary care clinic in Malaysia. Included were those aged >50 years who could understand English. A retest was conducted 2 weeks after the first administration. Participants from the pilot study did not encounter any problems in answering the Advance Care Planning Questionnaire. Hence, no further modifications were made. Flesch reading ease was 71. The final version of the Advance Care Planning Questionnaire consists of 66 items: 30 items were measured on a nominal scale, whilst 36 items were measured on a Likert-like scale, of which we were only able to validate 22 items, as the remaining 14 items were descriptive in nature. A total of 245 eligible participants were approached, of whom 230 agreed to participate (response rate = 93.9 %). Factor analysis on the 22 items measured on a Likert scale revealed four domains: "feelings regarding advance care planning", "justifications for advance care planning", "justifications for not having advance care planning: fate and religion", and "justifications for not having advance care planning: avoid thinking about death". The Cronbach's alpha values for items in each domain ranged from 0.637 to 0.915. In test-retest, kappa values ranged from 0.738 to 0.947. The final Advance Care Planning Questionnaire consisted of 63 items and 4 domains. It was found to be a valid and reliable instrument to assess the awareness and attitude of older people in Malaysia towards advance care planning.

  12. Coupling of Bayesian Networks with GIS for wildfire risk assessment on natural and agricultural areas of the Mediterranean

    NASA Astrophysics Data System (ADS)

    Scherb, Anke; Papakosta, Panagiota; Straub, Daniel

    2014-05-01

    Wildfires cause severe damages to ecosystems, socio-economic assets, and human lives in the Mediterranean. To facilitate coping with wildfire risks, an understanding of the factors influencing wildfire occurrence and behavior (e.g. human activity, weather conditions, topography, fuel loads) and their interaction is of importance, as is the implementation of this knowledge in improved wildfire hazard and risk prediction systems. In this project, a probabilistic wildfire risk prediction model is developed, with integrated fire occurrence and fire propagation probability and potential impact prediction on natural and cultivated areas. Bayesian Networks (BNs) are used to facilitate the probabilistic modeling. The final BN model is a spatial-temporal prediction system at the meso scale (1 km2 spatial and 1 day temporal resolution). The modeled consequences account for potential restoration costs and production losses referred to forests, agriculture, and (semi-) natural areas. BNs and a geographic information system (GIS) are coupled within this project to support semi-automated BN model parameter learning and spatial-temporal risk prediction. The coupling also enables the visualization of prediction results by means of daily maps. The BN parameters are learnt for Cyprus with data from 2006-2009. Data from 2010 is used as the validation data set. A special focus is placed on the performance evaluation of the BN for fire occurrence, which is modeled as a binary classifier and thus could be validated by means of Receiver Operating Characteristic (ROC) curves. With the final best models, validation AUC values of more than 70% were achieved, which indicates potential for reliable prediction performance via BN. Maps of selected days in 2010 are shown to illustrate final prediction results. The resulting system can be easily expanded to predict additional expected damages at the mesoscale (e.g. building and infrastructure damages). The system can support planning of preventive measures (e.g. state resources allocation for wildfire prevention and preparedness) and assist recuperation plans of damaged areas.
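
    The ROC-based validation of a binary fire/no-fire classifier reduces to computing the AUC from daily predictions; a minimal rank-based sketch with synthetic labels and scores (the BN itself is not reproduced here):

    ```python
    import numpy as np

    def roc_auc(y_true: np.ndarray, score: np.ndarray) -> float:
        """AUC via the rank-sum (Mann-Whitney) formulation; assumes no heavy ties."""
        order = np.argsort(score)
        ranks = np.empty(len(score), dtype=float)
        ranks[order] = np.arange(1, len(score) + 1)
        n_pos = y_true.sum()
        n_neg = len(y_true) - n_pos
        return (ranks[y_true == 1].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

    rng = np.random.default_rng(1)
    fire = rng.binomial(1, 0.1, size=2000)              # hypothetical daily fire labels
    score = 0.2 * fire + rng.normal(0.3, 0.15, 2000)    # stand-in for BN fire probability
    print(f"AUC = {roc_auc(fire, score):.2f}")
    ```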

  13. Independent Monte-Carlo dose calculation for MLC based CyberKnife radiotherapy

    NASA Astrophysics Data System (ADS)

    Mackeprang, P.-H.; Vuong, D.; Volken, W.; Henzen, D.; Schmidhalter, D.; Malthaner, M.; Mueller, S.; Frei, D.; Stampanoni, M. F. M.; Dal Pra, A.; Aebersold, D. M.; Fix, M. K.; Manser, P.

    2018-01-01

    This work aims to develop, implement and validate a Monte Carlo (MC)-based independent dose calculation (IDC) framework to perform patient-specific quality assurance (QA) for multi-leaf collimator (MLC)-based CyberKnife® (Accuray Inc., Sunnyvale, CA) treatment plans. The IDC framework uses an XML-format treatment plan as exported from the treatment planning system (TPS) and DICOM-format patient CT data, an MC beam model using phase spaces, CyberKnife MLC beam modifier transport using the EGS++ class library, a beam sampling and coordinate transformation engine, and dose scoring using DOSXYZnrc. The framework is validated against dose profiles and depth dose curves of single beams with varying field sizes in a water tank in units of cGy/Monitor Unit and against a 2D dose distribution of a full prostate treatment plan measured with Gafchromic EBT3 (Ashland Advanced Materials, Bridgewater, NJ) film in a homogeneous water-equivalent slab phantom. The film measurement is compared to IDC results by gamma analysis using 2% (global)/2 mm criteria. Further, the dose distribution of the clinical treatment plan in the patient CT is compared to the TPS calculation by gamma analysis using the same criteria. Dose profiles from IDC calculation in a homogeneous water phantom agree with measurements within 2.3% of the global maximum dose or 1 mm distance-to-agreement for all except the smallest field size. Comparing the film measurement to the calculated dose, 99.9% of all voxels pass gamma analysis; comparing dose calculated by the IDC framework to TPS-calculated dose for the clinical prostate plan shows a 99.0% passing rate. IDC-calculated dose is found to be up to 5.6% lower than dose calculated by the TPS in this case near metal fiducial markers. An MC-based modular IDC framework was successfully developed, implemented and validated against measurements and is now available to perform patient-specific QA by IDC.
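
    For readers unfamiliar with the gamma analysis used in records like this one, a brute-force 2D sketch of a global gamma pass rate; the toy dose planes, criteria, search radius, and low-dose threshold are illustrative parameters, not the authors' exact settings:

    ```python
    import numpy as np

    def gamma_pass_rate(ref, eval_, pixel_mm=1.0, dd=0.02, dta_mm=2.0, search_mm=5.0):
        """Brute-force global 2D gamma (dose difference dd, distance-to-agreement dta)."""
        r = int(np.ceil(search_mm / pixel_mm))
        dmax = ref.max()
        ny, nx = ref.shape
        passed, total = 0, 0
        for i in range(ny):
            for j in range(nx):
                if ref[i, j] < 0.1 * dmax:        # skip low-dose region
                    continue
                best = np.inf
                for di in range(-r, r + 1):
                    for dj in range(-r, r + 1):
                        ii, jj = i + di, j + dj
                        if not (0 <= ii < ny and 0 <= jj < nx):
                            continue
                        dist2 = (di * pixel_mm) ** 2 + (dj * pixel_mm) ** 2
                        dose2 = ((eval_[ii, jj] - ref[i, j]) / (dd * dmax)) ** 2
                        best = min(best, dist2 / dta_mm ** 2 + dose2)
                passed += best <= 1.0
                total += 1
        return passed / total

    ref = np.outer(np.hanning(40), np.hanning(40))  # toy reference dose plane
    ev = ref * 1.01                                 # evaluated dose, 1% hotter
    print(f"pass rate = {gamma_pass_rate(ref, ev):.3f}")
    ```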

  14. Development, test-retest reliability and validity of the Pharmacy Value-Added Services Questionnaire (PVASQ)

    PubMed Central

    Tan, Christine L.; Hassali, Mohamed A.; Saleem, Fahad; Shafie, Asrul A.; Aljadhey, Hisham; Gan, Vincent B.

    2015-01-01

    Objective: (i) To develop the Pharmacy Value-Added Services Questionnaire (PVASQ) using emerging themes generated from interviews. (ii) To establish the reliability and validity of the questionnaire instrument. Methods: Using an extended Theory of Planned Behavior as the theoretical model, face-to-face interviews generated salient beliefs about pharmacy value-added services. The PVASQ was constructed initially in English incorporating important themes and later translated into the Malay language with forward and backward translation. Intention (INT) to adopt pharmacy value-added services is predicted by attitudes (ATT), subjective norms (SN), perceived behavioral control (PBC), knowledge and expectations. Using a 7-point Likert-type scale and a dichotomous scale, test-retest reliability (N=25) was assessed by administering the questionnaire instrument twice at an interval of one week. Internal consistency was measured by Cronbach's alpha, and construct validity between the two administrations was assessed using the kappa statistic and the intraclass correlation coefficient (ICC). Confirmatory Factor Analysis, CFA (N=410), was conducted to assess the construct validity of the PVASQ. Results: The kappa coefficients indicate a moderate to almost perfect strength of agreement between test and retest. The ICC for all scales tested for intra-rater (test-retest) reliability was good. The overall Cronbach's alpha (N=25) was 0.912 and 0.908 for the two time points. The result of the CFA (N=410) showed that most items loaded strongly and correctly onto the corresponding factors. Only one item was eliminated. Conclusions: This study is the first to develop and establish the reliability and validity of the Pharmacy Value-Added Services Questionnaire instrument using the Theory of Planned Behavior as the theoretical model. The translated Malay language version of the PVASQ is reliable and valid to predict Malaysian patients' intention to adopt pharmacy value-added services to collect partial medicine supply. PMID:26445622
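
    Cronbach's alpha, the internal-consistency statistic reported here, is easy to compute from a respondents-by-items matrix; a minimal sketch with simulated 7-point responses (the PVASQ data are not reproduced):

    ```python
    import numpy as np

    def cronbach_alpha(items: np.ndarray) -> float:
        """Cronbach's alpha for a (respondents x items) Likert matrix."""
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_vars / total_var)

    rng = np.random.default_rng(2)
    latent = rng.normal(size=(25, 1))  # shared trait driving correlated answers
    responses = np.clip(np.round(4 + latent + rng.normal(0, 0.8, (25, 6))), 1, 7)
    print(f"alpha = {cronbach_alpha(responses):.2f}")
    ```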

  15. Laser-Induced Thermal Acoustic Measurements in a Highly Back-Pressured Scramjet Isolator Model: A Research Plan

    NASA Technical Reports Server (NTRS)

    Middleton, Troy F.; Balla, Robert J.; Baurle, Robert A.; Wilson, Lloyd G.

    2008-01-01

    Under the Propulsion Discipline of NASA's Fundamental Aeronautics Program's Hypersonics Project, a test apparatus for testing a scramjet isolator model is being constructed at NASA's Langley Research Center. The test apparatus will incorporate a 1-inch by 2-inch by 15-inch-long scramjet isolator model supplied with 2.1 lbm/sec of unheated dry air through a Mach 2.5 converging-diverging nozzle. The planned research will incorporate progressively more challenging measurement techniques to characterize the flow field within the isolator, concluding with the application of the Laser-Induced Thermal Acoustic (LITA) measurement technique. The primary goal of this research is to use the data acquired to validate Computational Fluid Dynamics (CFD) models employed to characterize the complex flow field of a scramjet isolator. This paper describes the test apparatus being constructed, pre-test CFD simulations, and the LITA measurement technique.

  16. Joint Seasonal ARMA Approach for Modeling of Load Forecast Errors in Planning Studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hafen, Ryan P.; Samaan, Nader A.; Makarov, Yuri V.

    2014-04-14

    To make informed and robust decisions in the probabilistic power system operation and planning process, it is critical to conduct multiple simulations of the generated combinations of wind and load parameters and their forecast errors to handle the variability and uncertainty of these time series. In order for the simulation results to be trustworthy, the simulated series must preserve the salient statistical characteristics of the real series. In this paper, we analyze day-ahead load forecast error data from multiple balancing authority locations and characterize statistical properties such as mean, standard deviation, autocorrelation, correlation between series, time-of-day bias, and time-of-day autocorrelation. We then construct and validate a seasonal autoregressive moving average (ARMA) model to model these characteristics, and use the model to jointly simulate day-ahead load forecast error series for all BAs.
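
    A minimal sketch of fitting and simulating a seasonal ARMA model with statsmodels; the synthetic hourly error series, the daily (lag-24) seasonality, and the ARMA orders are assumptions for illustration, not the paper's fitted model:

    ```python
    import numpy as np
    from statsmodels.tsa.statespace.sarimax import SARIMAX

    rng = np.random.default_rng(3)
    # Hypothetical hourly day-ahead load forecast errors with a daily (24 h) cycle
    t = np.arange(24 * 90)
    errors = 50 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 30, t.size)

    # Seasonal ARMA: ARMA(1,1) with a seasonal AR(1) term at lag 24
    model = SARIMAX(errors, order=(1, 0, 1), seasonal_order=(1, 0, 0, 24))
    fit = model.fit(disp=False)

    # Simulate a new error series that preserves the fitted statistical structure
    simulated = fit.simulate(nsimulations=24 * 7)
    print(fit.summary().tables[1])
    ```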

  17. Methodology, status and plans for development and assessment of Cathare code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bestion, D.; Barre, F.; Faydide, B.

    1997-07-01

    This paper presents the methodology, status and plans for the development, assessment and uncertainty evaluation of the Cathare code. Cathare is a thermalhydraulic code developed by CEA (DRN), IPSN, EDF and FRAMATOME for PWR safety analysis. First, the status of the code development and assessment is presented, along with the general strategy used for developing and assessing the code. Analytical experiments with separate effect tests and component tests are used for the development and validation of closure laws. Successive Revisions of constitutive laws are implemented in successive Versions of the code and assessed. System tests or integral tests are used to validate the general consistency of the Revision. Each delivery of a code Version + Revision is fully assessed and documented. A methodology is being developed to determine the uncertainty on all constitutive laws of the code using calculations of many analytical tests and applying the Discrete Adjoint Sensitivity Method (DASM). Finally, the plans for future development of the code are presented. They concern the optimization of code performance through parallel computing (the code will be used for real-time full-scope plant simulators), the coupling with many other codes (neutronic codes, severe accident codes), and the application of the code to containment thermalhydraulics. Physical improvements are also required in the field of low-pressure transients and in the modeling of the 3-D model.

  18. Nonlinear dynamic analysis and optimal trajectory planning of a high-speed macro-micro manipulator

    NASA Astrophysics Data System (ADS)

    Yang, Yi-ling; Wei, Yan-ding; Lou, Jun-qiang; Fu, Lei; Zhao, Xiao-wei

    2017-09-01

    This paper reports the nonlinear dynamic modeling and optimal trajectory planning for a flexure-based macro-micro manipulator dedicated to large-scale and high-speed tasks. In particular, a macro-micro manipulator composed of a servo motor, a rigid arm and a compliant microgripper is considered, and both flexure hinges and flexible beams are taken into account. By combining the pseudo-rigid-body-model method, the assumed mode method and the Lagrange equation, the overall dynamic model is derived. Then, the rigid-flexible coupling characteristics are analyzed by numerical simulations. After that, the microscopic-scale vibration excited by the large-scale motion is reduced through a trajectory planning approach. In particular, a fitness function regarding the comprehensive excitation torque of the compliant microgripper is proposed. A reference curve and an interpolation curve using quintic polynomial trajectories are adopted. Afterwards, an improved genetic algorithm is used to identify the optimal trajectory by minimizing the fitness function. Finally, numerical simulations and experiments validate the feasibility and effectiveness of the established dynamic model and the trajectory planning approach. The amplitude of the residual vibration is reduced by approximately 54.9%, and the settling time decreases by 57.1%. Therefore, the operation efficiency and manipulation stability are significantly improved.
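
    Quintic polynomial trajectories of the kind adopted here are fixed by six boundary conditions; a small sketch computing the coefficients for a rest-to-rest joint move (the boundary values and timing are hypothetical, and the GA layer is omitted):

    ```python
    import numpy as np

    def quintic_coeffs(q0, qf, T):
        """Quintic polynomial with zero boundary velocity/acceleration over [0, T]."""
        A = np.array([
            [1, 0, 0,    0,       0,        0],        # q(0)  = q0
            [0, 1, 0,    0,       0,        0],        # q'(0) = 0
            [0, 0, 2,    0,       0,        0],        # q''(0)= 0
            [1, T, T**2, T**3,    T**4,     T**5],     # q(T)  = qf
            [0, 1, 2*T,  3*T**2,  4*T**3,   5*T**4],   # q'(T) = 0
            [0, 0, 2,    6*T,     12*T**2,  20*T**3],  # q''(T)= 0
        ])
        b = np.array([q0, 0, 0, qf, 0, 0])
        return np.linalg.solve(A, b)   # ascending coefficients a0..a5

    c = quintic_coeffs(q0=0.0, qf=np.pi / 2, T=0.5)  # 90 degrees in 0.5 s
    ts = np.linspace(0, 0.5, 11)
    angles = np.polyval(c[::-1], ts)                 # sampled joint trajectory
    ```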

  19. A novel soft tissue prediction methodology for orthognathic surgery based on probabilistic finite element modelling

    PubMed Central

    Borghi, Alessandro; Ruggiero, Federica; Badiali, Giovanni; Bianchi, Alberto; Marchetti, Claudio; Rodriguez-Florez, Naiara; Breakey, Richard W. F.; Jeelani, Owase; Dunaway, David J.; Schievano, Silvia

    2018-01-01

    Repositioning of the maxilla in orthognathic surgery is carried out for functional and aesthetic purposes. Pre-surgical planning tools can predict 3D facial appearance by computing the response of the soft tissue to the changes to the underlying skeleton. The clinical use of commercial prediction software remains controversial, likely due to the deterministic nature of these computational predictions. A novel probabilistic finite element model (FEM) for the prediction of postoperative facial soft tissues is proposed in this paper. A probabilistic FEM was developed and validated on a cohort of eight patients who underwent maxillary repositioning and had pre- and postoperative cone beam computed tomography (CBCT) scans taken. Firstly, a correlation analysis assessed the various modelling parameters. Secondly, a design of experiments (DOE) provided a range of potential outcomes based on uniformly distributed input parameters, followed by an optimisation. Lastly, the second DOE iteration provided optimised predictions with a probability range. A range of 3D predictions was obtained using the probabilistic FEM and validated using reconstructed soft tissue surfaces from the postoperative CBCT data. The predictions in the nose and upper lip areas accurately include the true postoperative position, whereas the prediction underestimates the position of the cheeks and lower lip. A probabilistic FEM has been developed and validated for the prediction of the facial appearance following orthognathic surgery. This method shows how inaccuracies in the modelling and uncertainties in executing surgical planning influence the soft tissue prediction, and it provides a range of predictions including a minimum and maximum, which may be helpful for patients in understanding the impact of surgery on the face. PMID:29742139

  20. A novel soft tissue prediction methodology for orthognathic surgery based on probabilistic finite element modelling.

    PubMed

    Knoops, Paul G M; Borghi, Alessandro; Ruggiero, Federica; Badiali, Giovanni; Bianchi, Alberto; Marchetti, Claudio; Rodriguez-Florez, Naiara; Breakey, Richard W F; Jeelani, Owase; Dunaway, David J; Schievano, Silvia

    2018-01-01

    Repositioning of the maxilla in orthognathic surgery is carried out for functional and aesthetic purposes. Pre-surgical planning tools can predict 3D facial appearance by computing the response of the soft tissue to the changes to the underlying skeleton. The clinical use of commercial prediction software remains controversial, likely due to the deterministic nature of these computational predictions. A novel probabilistic finite element model (FEM) for the prediction of postoperative facial soft tissues is proposed in this paper. A probabilistic FEM was developed and validated on a cohort of eight patients who underwent maxillary repositioning and had pre- and postoperative cone beam computed tomography (CBCT) scans taken. Firstly, a correlation analysis assessed the various modelling parameters. Secondly, a design of experiments (DOE) provided a range of potential outcomes based on uniformly distributed input parameters, followed by an optimisation. Lastly, the second DOE iteration provided optimised predictions with a probability range. A range of 3D predictions was obtained using the probabilistic FEM and validated using reconstructed soft tissue surfaces from the postoperative CBCT data. The predictions in the nose and upper lip areas accurately include the true postoperative position, whereas the prediction underestimates the position of the cheeks and lower lip. A probabilistic FEM has been developed and validated for the prediction of the facial appearance following orthognathic surgery. This method shows how inaccuracies in the modelling and uncertainties in executing surgical planning influence the soft tissue prediction, and it provides a range of predictions including a minimum and maximum, which may be helpful for patients in understanding the impact of surgery on the face.

  1. Nonlinear scaling of the Unit Hydrograph Peaking Factor for dam safety

    NASA Astrophysics Data System (ADS)

    Pradhan, N. R.; Loney, D.

    2017-12-01

    Existing U.S. Army Corps of Engineers (USACE) policy suggests that the unit hydrograph peaking factor (UHPF), the ratio of the observed to the modeled event unit hydrograph peak, should range between 1.25 and 1.50 to ensure dam safety. It is pertinent to investigate the impact of extreme flood events on the validity of this range through physically based rainfall-runoff models that were not available during the planning and design of most USACE dams. The UHPF range was analyzed by deploying the Gridded Surface Subsurface Hydrologic Analysis (GSSHA) model in the Goose Creek, VA, watershed to develop a UHPF relationship with excess rainfall across various return-period events. An effective rainfall factor (ERF) is introduced to validate existing UHPF guidance as well as provide a nonlinear UHPF scaling relation when effective rainfall does not match that of the UH design event.
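
    The peaking factor itself is a simple peak ratio; a minimal sketch with a synthetic gamma-shaped unit hydrograph (the curves and numbers are invented for illustration, not Goose Creek data):

    ```python
    import numpy as np

    def unit_hydrograph_peaking_factor(observed_uh: np.ndarray,
                                       modeled_uh: np.ndarray) -> float:
        """UHPF: ratio of observed to modeled unit hydrograph peak discharge."""
        return observed_uh.max() / modeled_uh.max()

    # Hypothetical 1-hour unit hydrographs (cfs per inch of excess rainfall)
    t = np.arange(0, 48.0)
    modeled = 900 * (t / 8) * np.exp(1 - t / 8)   # synthetic gamma-shaped UH
    observed = 1.35 * modeled                     # observed event peaks higher
    uhpf = unit_hydrograph_peaking_factor(observed, modeled)
    print(f"UHPF = {uhpf:.2f}  (policy range: 1.25-1.50)")
    ```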

  2. Sci—Fri PM: Dosimetry—06: Commissioning of a 3D patient specific QA system for hypofractionated prostate treatments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rivest, R; Venkataraman, S; McCurdy, B

    The objective of this work is to commission the 6MV-SRS beam model in COMPASS (v2.1, IBA-Dosimetry) and validate its use for patient-specific QA of hypofractionated prostate treatments. The COMPASS system consists of a 2D ion chamber array (MatriXX Evolution), an independent gantry angle sensor and associated software. The system can either directly calculate or reconstruct (using measured detector responses) a 3D dose distribution on the patient CT dataset for plan verification. Beam models are developed and commissioned in the same manner as a beam model is commissioned in a standard treatment planning system. Model validation was initially performed by comparing both COMPASS calculations and reconstructions to measured open-field beam data. Next, 10 hypofractionated prostate RapidArc plans were delivered to both the COMPASS system and a phantom with ion chamber and film inserted. COMPASS dose distributions calculated and reconstructed on the phantom CT dataset were compared to the chamber and film measurements. The mean (± standard deviation) difference between COMPASS reconstructed dose and ion chamber measurement was 1.4 ± 1.0%. The maximum discrepancy was 2.6%. Corresponding values for COMPASS calculation were 0.9 ± 0.9% and 2.6%, respectively. The average gamma agreement index (3%/3mm) for COMPASS reconstruction and film was 96.7% and 95.3% when using 70% and 20% dose thresholds, respectively. The corresponding values for COMPASS calculation were 97.1% and 97.1%, respectively. Based on our results, COMPASS can be used for the patient-specific QA of hypofractionated prostate treatments delivered with the 6MV-SRS beam.

  3. SU-F-J-94: Development of a Plug-in Based Image Analysis Tool for Integration Into Treatment Planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Owen, D; Anderson, C; Mayo, C

    Purpose: To extend the functionality of a commercial treatment planning system (TPS) to support (i) direct use of quantitative image-based metrics within treatment plan optimization and (ii) evaluation of dose-functional volume relationships to assist in functional image adaptive radiotherapy. Methods: A script was written that interfaces with a commercial TPS via an Application Programming Interface (API). The script executes a program that performs dose-functional volume analyses. Written in C#, the script reads the dose grid and correlates it with image data on a voxel-by-voxel basis through API extensions that can access registration transforms. A user interface was designed through WinForms to input parameters and display results. To test the performance of this program, image- and dose-based metrics computed from perfusion SPECT images aligned to the treatment planning CT were generated, validated, and compared. Results: The integration of image analysis information was successfully implemented as a plug-in to a commercial TPS. Perfusion SPECT images were used to validate the calculation and display of image-based metrics as well as dose-intensity metrics and histograms for defined structures on the treatment planning CT. Various biological dose correction models, custom image-based metrics, dose-intensity computations, and dose-intensity histograms were applied to analyze the image-dose profile. Conclusion: It is possible to add image analysis features to commercial TPSs through custom scripting applications. A tool was developed to enable the evaluation of image-intensity-based metrics in the context of functional targeting and avoidance. In addition to providing dose-intensity metrics and histograms that can be easily extracted from a plan database and correlated with outcomes, the system can also be extended to a plug-in optimization system, which can directly use the computed metrics for optimization of post-treatment tumor or normal tissue response models. Supported by NIH - P01 - CA059827.
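
    The voxel-by-voxel dose-intensity correlation at the heart of this tool can be sketched in a few lines; the record's plug-in is C#, but a Python stand-in with hypothetical grids, mask, and bin count conveys the idea:

    ```python
    import numpy as np

    def dose_intensity_histogram(dose, intensity, mask, bins=20):
        """Mean image intensity binned by dose within a structure mask."""
        d, s = dose[mask], intensity[mask]
        edges = np.linspace(d.min(), d.max(), bins + 1)
        idx = np.digitize(d, edges[1:-1])   # bin index 0..bins-1 per voxel
        means = np.array([s[idx == b].mean() if np.any(idx == b) else np.nan
                          for b in range(bins)])
        return edges, means

    rng = np.random.default_rng(0)
    dose = rng.random((32, 32, 32)) * 60       # hypothetical dose grid (Gy)
    spect = rng.random((32, 32, 32))           # hypothetical perfusion intensity
    lung = np.zeros((32, 32, 32), dtype=bool)  # hypothetical structure mask
    lung[8:24, 8:24, 8:24] = True
    edges, mean_intensity = dose_intensity_histogram(dose, spect, lung)
    ```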

  4. Cost Modeling for Space Optical Telescope Assemblies

    NASA Technical Reports Server (NTRS)

    Stahl, H. Philip; Henrichs, Todd; Luedtke, Alexander; West, Miranda

    2011-01-01

    Parametric cost models are used to plan missions, compare concepts and justify technology investments. This paper reviews an on-going effort to develop cost models for space telescopes. It summarizes the methodology used to develop the cost models and documents how changes to the database have changed previously published preliminary cost models. While the cost models are evolving, the previously published findings remain valid: it costs less per square meter of collecting aperture to build a large telescope than a small telescope; technology development as a function of time reduces cost; and lower areal density telescopes cost more than more massive telescopes.
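
    The headline scaling claim can be illustrated by fitting a power law, cost ≈ a·D^b, in log space; the aperture/cost pairs below are invented for illustration and are not the paper's database:

    ```python
    import numpy as np

    # Hypothetical (aperture m, cost $M) pairs for illustration only
    aperture = np.array([0.5, 1.0, 2.4, 3.5, 6.5])
    cost = np.array([60, 150, 500, 900, 2500])

    # Fit cost = a * D^b via least squares in log space;
    # b < 2 implies cost per unit collecting area falls as aperture grows
    b, log_a = np.polyfit(np.log(aperture), np.log(cost), 1)
    print(f"cost ~ {np.exp(log_a):.0f} * D^{b:.2f}")
    print("cost per m^2:", cost / (np.pi * (aperture / 2) ** 2))
    ```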

  5. Moving Up to the Top of the Landfill: A Field-Validated, Science-Based Methane Emissions Inventory Model for California Landfills

    USDA-ARS?s Scientific Manuscript database

    California is typically at the forefront of innovative planning & regulatory strategies for environmental protection in the U.S. Two years ago, a research project was initiated by the California Energy Commission to develop an improved method for landfill methane emissions for the state greenhouse ...

  6. Man in space - The use of animal models

    NASA Technical Reports Server (NTRS)

    Ballard, Rodney W.; Souza, Kenneth A.

    1991-01-01

    The use of animal surrogates as experimental subjects in order to provide essential missing information on the effects of long-term spaceflights, to validate countermeasures, and to test medical treatment techniques is discussed. Research needs also include the definition of biomedical adaptations to flight, and the developments of standards for safe space missions to assure human health and productivity during and following flight. NASA research plans in this area are outlined. Over the next 40 years, NASA plans to concentrate on the use of rodents and nonhuman primates as the models of choice for various physiological responses observed in humans during extended stays in space. This research will include flights on the Space Shuttle, unmanned biosatellites, and the Space Station Freedom.

  7. Combining gait optimization with passive system to increase the energy efficiency of a humanoid robot walking movement

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pereira, Ana I.; ALGORITMI, University of Minho; Lima, José

    There are several approaches to humanoid robot gait planning. This problem presents a large number of unknown parameters that must be found to make the humanoid robot walk. Optimization in simulation models can be used to find the gait based on several criteria, such as energy minimization, acceleration, and step length, among others. The energy consumption can also be reduced with elastic elements coupled to each joint. The presented paper addresses an optimization method, Stretched Simulated Annealing, that runs in an accurate and stable simulation model to find the optimal gait combined with elastic elements. Final results demonstrate that optimization is a valid gait planning technique.

  8. Comparison of free-piston Stirling engine model predictions with RE1000 engine test data

    NASA Technical Reports Server (NTRS)

    Tew, R. C., Jr.

    1984-01-01

    Predictions of a free-piston Stirling engine model are compared with RE1000 engine test data taken at NASA-Lewis Research Center. The model validation and the engine testing are being done under a joint interagency agreement between the Department of Energy's Oak Ridge National Laboratory and NASA-Lewis. A kinematic code developed at Lewis was upgraded to permit simulation of free-piston engine performance; it was further upgraded and modified at Lewis and is currently being validated. The model predicts engine performance by numerical integration of equations for each control volume in the working space. Piston motions are determined by numerical integration of the force balance on each piston or can be specified as Fourier series. In addition, the model Fourier analyzes the various piston forces to permit the construction of phasor force diagrams. The paper compares predicted and experimental values of power and efficiency and shows phasor force diagrams for the RE1000 engine displacer and piston. Further development plans for the model are also discussed.

  9. Performance Comparison of NAMI DANCE and FLOW-3D® Models in Tsunami Propagation, Inundation and Currents using NTHMP Benchmark Problems

    NASA Astrophysics Data System (ADS)

    Velioglu Sogut, Deniz; Yalciner, Ahmet Cevdet

    2018-06-01

    Field observations provide valuable data regarding nearshore tsunami impact, yet only in inundation areas where tsunami waves have already flooded. Therefore, tsunami modeling is essential to understand tsunami behavior and prepare for tsunami inundation. It is necessary that all numerical models used in tsunami emergency planning be subject to benchmark tests for validation and verification. This study focuses on two numerical codes, NAMI DANCE and FLOW-3D®, for validation and performance comparison. NAMI DANCE is an in-house tsunami numerical model developed by the Ocean Engineering Research Center of Middle East Technical University, Turkey and the Laboratory of Special Research Bureau for Automation of Marine Research, Russia. FLOW-3D® is a general purpose computational fluid dynamics software, which was developed by scientists who pioneered in the design of the Volume-of-Fluid technique. The codes are validated and their performances are compared via analytical, experimental and field benchmark problems, which are documented in the "Proceedings and Results of the 2011 National Tsunami Hazard Mitigation Program (NTHMP) Model Benchmarking Workshop" and the "Proceedings and Results of the NTHMP 2015 Tsunami Current Modeling Workshop". The variations between the numerical solutions of these two models are evaluated through statistical error analysis.
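
    One common statistic in such benchmark comparisons is a normalized root-mean-square error against the benchmark signal; a minimal sketch with synthetic wave-gauge series (the benchmark curve and noise levels are invented, not NTHMP results):

    ```python
    import numpy as np

    def nrmse(benchmark: np.ndarray, model: np.ndarray) -> float:
        """Root-mean-square error normalized by the benchmark's range."""
        rmse = np.sqrt(np.mean((model - benchmark) ** 2))
        return rmse / (benchmark.max() - benchmark.min())

    t = np.linspace(0, 600, 301)                       # time (s)
    benchmark = 2.0 * np.exp(-((t - 300) / 80) ** 2)   # benchmark gauge record (m)
    model_a = benchmark + np.random.default_rng(4).normal(0, 0.05, t.size)
    model_b = benchmark + np.random.default_rng(5).normal(0, 0.08, t.size)
    print(f"model A NRMSE = {nrmse(benchmark, model_a):.3f}")
    print(f"model B NRMSE = {nrmse(benchmark, model_b):.3f}")
    ```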

  10. Model Development and Experimental Validation of the Fusible Heat Sink Design for Exploration Vehicles

    NASA Technical Reports Server (NTRS)

    Cognata, Thomas J.; Leimkuehler, Thomas O.; Sheth, Rubik B.; Le, Hung

    2012-01-01

    The Fusible Heat Sink is a novel vehicle heat rejection technology which combines a flow through radiator with a phase change material. The combined technologies create a multi-function device able to shield crew members against Solar Particle Events (SPE), reduce radiator extent by permitting sizing to the average vehicle heat load rather than to the peak vehicle heat load, and to substantially absorb heat load excursions from the average while constantly maintaining thermal control system setpoints. This multi-function technology provides great flexibility for mission planning, making it possible to operate a vehicle in hot or cold environments and under high or low heat load conditions for extended periods of time. This paper describes the model development and experimental validation of the Fusible Heat Sink technology. The model developed was intended to meet the radiation and heat rejection requirements of a nominal MMSEV mission. Development parameters and results, including sizing and model performance will be discussed. From this flight-sized model, a scaled test-article design was modeled, designed, and fabricated for experimental validation of the technology at Johnson Space Center thermal vacuum chamber facilities. Testing showed performance comparable to the model at nominal loads and the capability to maintain heat loads substantially greater than nominal for extended periods of time.

  11. Model Development and Experimental Validation of the Fusible Heat Sink Design for Exploration Vehicles

    NASA Technical Reports Server (NTRS)

    Cognata, Thomas J.; Leimkuehler, Thomas; Sheth, Rubik; Le, Hung

    2013-01-01

    The Fusible Heat Sink is a novel vehicle heat rejection technology which combines a flow through radiator with a phase change material. The combined technologies create a multi-function device able to shield crew members against Solar Particle Events (SPE), reduce radiator extent by permitting sizing to the average vehicle heat load rather than to the peak vehicle heat load, and to substantially absorb heat load excursions from the average while constantly maintaining thermal control system setpoints. This multi-function technology provides great flexibility for mission planning, making it possible to operate a vehicle in hot or cold environments and under high or low heat load conditions for extended periods of time. This paper describes the modeling and experimental validation of the Fusible Heat Sink technology. The model developed was intended to meet the radiation and heat rejection requirements of a nominal MMSEV mission. Development parameters and results, including sizing and model performance will be discussed. From this flight-sized model, a scaled test-article design was modeled, designed, and fabricated for experimental validation of the technology at Johnson Space Center thermal vacuum chamber facilities. Testing showed performance comparable to the model at nominal loads and the capability to maintain heat loads substantially greater than nominal for extended periods of time.

  12. Development and Validation of a Predictive Model for Functional Outcome After Stroke Rehabilitation: The Maugeri Model.

    PubMed

    Scrutinio, Domenico; Lanzillo, Bernardo; Guida, Pietro; Mastropasqua, Filippo; Monitillo, Vincenzo; Pusineri, Monica; Formica, Roberto; Russo, Giovanna; Guarnaschelli, Caterina; Ferretti, Chiara; Calabrese, Gianluigi

    2017-12-01

    Prediction of outcome after stroke rehabilitation may help clinicians in decision-making and planning rehabilitation care. We developed and validated a predictive tool to estimate the probability of achieving improvement in physical functioning (model 1) and a level of independence requiring no more than supervision (model 2) after stroke rehabilitation. The models were derived from 717 patients admitted for stroke rehabilitation. We used multivariable logistic regression analysis to build each model. Each model was then prospectively validated in 875 patients. Model 1 included age, time from stroke occurrence to rehabilitation admission, admission motor and cognitive Functional Independence Measure scores, and neglect. Model 2 included age, male gender, time since stroke onset, and admission motor and cognitive Functional Independence Measure scores. Both models demonstrated excellent discrimination. In the derivation cohort, the area under the curve was 0.883 (95% confidence interval, 0.858-0.910) for model 1 and 0.913 (95% confidence interval, 0.884-0.942) for model 2; the Hosmer-Lemeshow χ² was 4.12 (P=0.249) and 1.20 (P=0.754), respectively. In the validation cohort, the area under the curve was 0.866 (95% confidence interval, 0.840-0.892) for model 1 and 0.850 (95% confidence interval, 0.815-0.885) for model 2; the Hosmer-Lemeshow χ² was 8.86 (P=0.115) and 34.50 (P=0.001), respectively. Both improvement in physical functioning (hazard ratio, 0.43; 0.25-0.71; P=0.001) and a level of independence requiring no more than supervision (hazard ratio, 0.32; 0.14-0.68; P=0.004) were independently associated with improved 4-year survival. A calculator is freely available for download at https://goo.gl/fEAp81. This study provides researchers and clinicians with an easy-to-use, accurate, and validated predictive tool for potential application in rehabilitation research and stroke management. © 2017 American Heart Association, Inc.
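
    For readers unfamiliar with the two validation statistics quoted above, the following is a hedged sketch of how an AUC and a Hosmer-Lemeshow test can be computed for a logistic model; the data are synthetic stand-ins, not the Maugeri cohorts:

      import numpy as np
      from scipy import stats
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(0)
      X = rng.normal(size=(717, 4))  # stand-ins for age, onset-to-admission time, motor/cognitive FIM
      y = (X @ [-0.8, -0.4, 1.1, 0.7] + rng.normal(size=717) > 0).astype(int)

      model = LogisticRegression().fit(X, y)
      p = model.predict_proba(X)[:, 1]
      print("AUC:", roc_auc_score(y, p))

      def hosmer_lemeshow(y, p, g=10):
          """Chi-square over g groups of increasing predicted risk (df = g - 2)."""
          chi2 = 0.0
          for grp in np.array_split(np.argsort(p), g):
              obs, exp, n = y[grp].sum(), p[grp].sum(), len(grp)
              chi2 += (obs - exp) ** 2 / (exp * (1 - exp / n))
          return chi2, stats.chi2.sf(chi2, g - 2)

      print("Hosmer-Lemeshow chi2, p:", hosmer_lemeshow(y, p))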

  13. Logistics modelling: improving resource management and public information strategies in Florida.

    PubMed

    Walsh, Daniel M; Van Groningen, Chuck; Craig, Brian

    2011-10-01

    One of the most time-sensitive and logistically challenging emergency response operations today is providing mass prophylaxis to every man, woman and child in a community within 48 hours of a bioterrorism attack. To meet this challenge, federal, state and local public health departments in the USA have joined forces to develop, test and execute large-scale bioterrorism response plans. This preparedness and response effort is funded through the US Centers for Disease Control and Prevention's Cities Readiness Initiative (CRI), a programme dedicated to providing oral antibiotics to an entire population within 48 hours of a weaponised inhalation anthrax attack. This paper demonstrates how the State of Florida used a logistics modelling tool to improve its CRI mass prophylaxis plans, with special focus on how logistics modelling strengthened Florida's resource management policies and validated its public information strategies.

  14. Teaching and Learning Activity Sequencing System using Distributed Genetic Algorithms

    NASA Astrophysics Data System (ADS)

    Matsui, Tatsunori; Ishikawa, Tomotake; Okamoto, Toshio

    The purpose of this study is to develop a system that supports teachers in designing lesson plans, particularly for the new subject "Information Study". The system generates teaching and learning activity sequences by interlinking lesson activities corresponding to the various conditions specified in the user's input. Because the user's input comprises multiple pieces of information, contradictions may arise that the system must resolve. This multiobjective optimization problem is solved with Distributed Genetic Algorithms, in which fitness functions are defined with reference models of lessons, thinking styles and teaching styles. Results of various experiments verified the effectiveness and validity of the proposed methods and reference models; they also pointed to future work on the reference models and evaluation functions.
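
    A minimal sketch of the genetic-algorithm idea described above, with invented activities, operators and toy fitness terms (the paper's distributed GA and lesson reference models are considerably richer):

      import random

      ACTIVITIES = ["lecture", "demo", "group_work", "exercise", "review"]

      def fitness(seq):
          # Two toy objectives standing in for the reference models: begin with
          # "lecture", end with "review", and avoid back-to-back repetition.
          score = (seq[0] == "lecture") + (seq[-1] == "review")
          score += sum(a != b for a, b in zip(seq, seq[1:]))
          return score

      def mutate(seq):
          # Swap two activities to explore neighbouring sequences.
          i, j = random.sample(range(len(seq)), 2)
          seq = list(seq)
          seq[i], seq[j] = seq[j], seq[i]
          return seq

      population = [random.sample(ACTIVITIES, len(ACTIVITIES)) for _ in range(30)]
      for _ in range(100):
          population.sort(key=fitness, reverse=True)
          survivors = population[:10]
          population = survivors + [mutate(random.choice(survivors)) for _ in range(20)]
      print(max(population, key=fitness))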

  15. A Hyperbolic Ontology Visualization Tool for Model Application Programming Interface Documentation

    NASA Technical Reports Server (NTRS)

    Hyman, Cody

    2011-01-01

    Spacecraft modeling, a critically important part of validating planned spacecraft activities, is currently carried out through a time-consuming process of mission-to-mission model implementation and integration. A current project in early development, Integrated Spacecraft Analysis (ISCA), aims to remedy this hindrance by providing reusable architectures and reducing the time spent integrating models with planning and sequencing tools. The principal objective of this internship was to develop a user interface for an experimental ontology-based structure visualization of navigation and attitude control system modeling software. To this end, a number of tree and graph visualization tools were researched, and a Java-based hyperbolic graph viewer was selected for experimental adaptation. Early results show promise in the ability to organize and display large amounts of spacecraft model documentation efficiently and effectively through a web browser. This viewer serves as a conceptual implementation for future development, but trials with both ISCA developers and end users should be performed to truly evaluate the effectiveness of continued development of such visualizations.

  16. Maximally Expressive Modeling of Operations Tasks

    NASA Technical Reports Server (NTRS)

    Jaap, John; Richardson, Lea; Davis, Elizabeth

    2002-01-01

    Planning and scheduling systems organize "tasks" into a timeline or schedule. The tasks are defined within the scheduling system in logical containers called models. The dictionary might define a model of this type as "a system of things and relations satisfying a set of rules that, when applied to the things and relations, produce certainty about the tasks that are being modeled." One challenging domain for a planning and scheduling system is the operation of on-board experiments for the International Space Station. In these experiments, the equipment used is among the most complex hardware ever developed, the information sought is at the cutting edge of scientific endeavor, and the procedures are intricate and exacting. Scheduling is made more difficult by a scarcity of station resources. The models fed into the scheduler must describe both the complexity of the experiments and procedures (to ensure a valid schedule) and the flexibility of the procedures and the equipment (to effectively utilize available resources). Clearly, scheduling International Space Station experiment operations calls for a "maximally expressive" modeling schema.

  17. Implementing microscopic charcoal in a global climate-aerosol model

    NASA Astrophysics Data System (ADS)

    Gilgen, Anina; Lohmann, Ulrike; Brügger, Sandra; Adolf, Carole; Ickes, Luisa

    2017-04-01

    Information about past fire activity is crucial for validating fire models and better understanding their deficiencies. Several paleofire records exist, among them ice cores and sediments, which preserve fire tracers like levoglucosan, vanillic acid, and charcoal particles. In this work, we implement microscopic charcoal particles (maximum dimension 10-100 μm) in the global climate-aerosol model ECHAM6.3HAM2.3. Since we are not aware of any reliable estimates of microscopic charcoal emissions, we scaled black carbon emissions from GFAS to capture the charcoal fluxes of a calibration dataset; model results were then compared with a validation dataset. The coarse model resolution (T63L31; 1.9°x1.9°) prevents the model from capturing the local variability of charcoal fluxes. However, variability on the global scale is pronounced, owing to highly variable fire emissions. In the future, we plan to model charcoal fluxes over the past 1-2 centuries using fire emissions provided by fire models. Furthermore, we intend to compare modelled charcoal fluxes from prescribed fire emissions with those calculated by an interactive fire model.

  18. Reverse-translational biomarker validation of Abnormal Repetitive Behaviors in mice: an illustration of the 4P's modeling approach

    PubMed Central

    Garner, Joseph P.; Thogerson, Collette M.; Dufour, Brett D.; Würbel, Hanno; Murray, James D.; Mench, Joy A.

    2011-01-01

    The NIMH's new strategic plan, with its emphasis on the "4P's" (Prediction, Preemption, Personalization, and Populations) and biomarker-based medicine, requires a radical shift in animal modeling methodology. In particular, 4P's models will be non-determinant (i.e. disease severity will depend on secondary environmental and genetic factors) and validated by reverse-translation of animal homologues to human biomarkers. A powerful consequence of the biomarker approach is that different closely related disorders have a unique fingerprint of biomarkers. Animals can be validated as a highly specific model of a single disorder by matching this 'fingerprint', or as a model of a symptom seen in multiple disorders by matching common biomarkers. Here we illustrate this approach with two Abnormal Repetitive Behaviors (ARBs) in mice: stereotypies and barbering (hair pulling). We developed animal versions of the neuropsychological biomarkers that distinguish human ARBs and tested the fingerprints of the different mouse ARBs. As predicted, the two mouse ARBs were associated with different biomarkers. Both barbering and stereotypy could be discounted as models of OCD (even though they are widely used as such), due to the absence of the limbic biomarkers which are characteristic of OCD and hence necessary for a valid model. Conversely, barbering matched the fingerprint of trichotillomania (i.e. selective deficits in set-shifting), suggesting it may be a highly specific model of this disorder. In contrast, stereotypies correlated only with a biomarker (deficits in response shifting) associated with stereotypies in multiple disorders, suggesting that animal stereotypies model stereotypies in multiple disorders. PMID:21219937

  19. Combining correlative and mechanistic habitat suitability models to improve ecological compensation.

    PubMed

    Meineri, Eric; Deville, Anne-Sophie; Grémillet, David; Gauthier-Clerc, Michel; Béchet, Arnaud

    2015-02-01

    Only a few studies have shown positive impacts of ecological compensation on species dynamics affected by human activities. We argue that this is due to the inappropriate methods used to forecast required compensation in environmental impact assessments, which are mostly descriptive and only valid at limited spatial and temporal scales. Habitat suitability models developed to predict the impacts of environmental changes on potential species distributions, however, should provide rigorous science-based tools for compensation planning. Here we describe the two main classes of predictive models: correlative models and individual-based mechanistic models. We show how these models can be used alone or in combination to improve compensation planning. While correlative models are easier to implement, they tend to ignore underlying ecological processes and lack accuracy. In contrast, individual-based mechanistic models can integrate biological interactions, dispersal ability and adaptation; among mechanistic models, those considering animal energy balance are particularly efficient at predicting the impact of foraging habitat loss. However, mechanistic models require more field data than correlative models. Hence we present two approaches which combine both methods for compensation planning, especially in relation to the spatial scale considered. We show how the availability of biological databases and software enabling fast and accurate population projections could be used to assess ecological compensation requirements efficiently in environmental impact assessments. © 2014 The Authors. Biological Reviews © 2014 Cambridge Philosophical Society.

  20. Normal tissue complication probability (NTCP) modelling using spatial dose metrics and machine learning methods for severe acute oral mucositis resulting from head and neck radiotherapy.

    PubMed

    Dean, Jamie A; Wong, Kee H; Welsh, Liam C; Jones, Ann-Britt; Schick, Ulrike; Newbold, Kate L; Bhide, Shreerang A; Harrington, Kevin J; Nutting, Christopher M; Gulliford, Sarah L

    2016-07-01

    Severe acute mucositis commonly results from head and neck (chemo)radiotherapy. A predictive model of mucositis could guide clinical decision-making and inform treatment planning. We aimed to generate such a model using spatial dose metrics and machine learning. Predictive models of severe acute mucositis were generated using radiotherapy dose (dose-volume and spatial dose metrics) and clinical data. Penalised logistic regression, support vector classification and random forest classification (RFC) models were generated and compared. Internal validation was performed (with 100-iteration cross-validation), using multiple metrics, including area under the receiver operating characteristic curve (AUC) and calibration slope, to assess performance. Associations between covariates and severe mucositis were explored using the models. The dose-volume-based (standard) models performed as well as those incorporating spatial information. Discrimination was similar between models, but the standard RFC model had the best calibration: its mean AUC and calibration slope were 0.71 (s.d. = 0.09) and 3.9 (s.d. = 2.2), respectively. The volumes of oral cavity receiving intermediate and high doses were associated with severe mucositis. The standard RFC model's performance is modest to good, but should be improved, and requires external validation. Reducing the volumes of oral cavity receiving intermediate and high doses may reduce mucositis incidence. Copyright © 2016 The Author(s). Published by Elsevier Ireland Ltd. All rights reserved.
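
    As a hedged illustration of the internal validation scheme described (a random forest classifier assessed over 100 repeated splits with AUC and calibration slope), with synthetic stand-ins for the dose-volume features:

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import roc_auc_score
      from sklearn.model_selection import StratifiedShuffleSplit

      rng = np.random.default_rng(1)
      X = rng.uniform(0, 70, size=(200, 5))  # stand-ins for oral-cavity volumes at 5 dose levels
      y = (X[:, 3] + X[:, 4] + rng.normal(0, 15, 200) > 80).astype(int)

      aucs, slopes = [], []
      splitter = StratifiedShuffleSplit(n_splits=100, test_size=0.3, random_state=0)
      for train, test in splitter.split(X, y):
          p = RandomForestClassifier(random_state=0).fit(X[train], y[train]).predict_proba(X[test])[:, 1]
          aucs.append(roc_auc_score(y[test], p))
          # Calibration slope: coefficient of the outcome regressed on the logit of p.
          logit = np.log(np.clip(p, 1e-6, 1 - 1e-6) / np.clip(1 - p, 1e-6, 1))
          slopes.append(LogisticRegression().fit(logit[:, None], y[test]).coef_[0, 0])
      print("mean AUC:", np.mean(aucs), "mean calibration slope:", np.mean(slopes))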

  1. Space Technology 5: Changing the Mission Design without Changing the Hardware

    NASA Technical Reports Server (NTRS)

    Carlisle, Candace C.; Webb, Evan H.; Slavin, James A.

    2005-01-01

    The Space Technology 5 (ST-5) Project is part of NASA's New Millennium Program. Its validation objectives are to demonstrate the research-quality science capability of the ST-5 spacecraft; to operate the three spacecraft as a constellation; and to design, develop, test and flight-validate three capable micro-satellites with new technologies. A three-month flight demonstration phase is planned, beginning in March 2006. This year, the mission was re-planned for a Pegasus XL dedicated launch into an elliptical polar orbit (instead of the originally planned geosynchronous transfer orbit). The re-plan allows the mission to achieve the same high-level technology validation objectives with a different launch vehicle. The new mission design involves a revised science validation strategy, a new orbit and a different communication strategy, while minimizing changes to the ST-5 spacecraft itself. The constellation operations concepts have also been refined. While the system engineers, orbit analysts, and operations teams were re-planning the mission, the implementation team continued to make progress on the flight hardware. Most components have been delivered, and the first spacecraft is well into integration and test.

  2. The theory of planned behaviour: reactions and reflections.

    PubMed

    Ajzen, Icek

    2011-09-01

    The seven articles in this issue, and the accompanying meta-analysis in Health Psychology Review [McEachan, R.R.C., Conner, M., Taylor, N., & Lawton, R.J. (2011). Prospective prediction of health-related behaviors with the theory of planned behavior: A meta-analysis. Health Psychology Review, 5, 97-144], illustrate the wide application of the theory of planned behaviour [Ajzen, I. (1991). The theory of planned behavior. Organizational Behavior and Human Decision Processes, 50, 179-211] in the health domain. In this editorial, Ajzen reflects on some of the issues raised by the different authors. Among the topics addressed are the nature of intentions and the limits of predictive validity; rationality, affect and emotions; past behaviour and habit; the prototype/willingness model; and the role of such background factors as the big five personality traits and social comparison tendency.

  3. Turkish translation and adaptation of Champion's Health Belief Model Scales for breast cancer mammography screening.

    PubMed

    Yilmaz, Meryem; Sayin, Yazile Yazici

    2014-07-01

    To examine the translation and adaptation process from English to Turkish, and the validity and reliability, of the Champion's Health Belief Model Scales for Mammography Screening. The aims were (1) to provide data about and (2) to assess Turkish women's attitudes and behaviours towards mammography. The proportion of women who undergo mammography is low in Turkey. The Champion's Health Belief Model Scales for Mammography Screening-Turkish version can be helpful in determining Turkish women's health beliefs, particularly about mammography. A cross-sectional design with classical measurement methods was used to collect survey data from Turkish women. The Champion's Health Belief Model Scales for Mammography Screening was translated from English to Turkish and then back-translated into English. The meaning and clarity of the scale items were then evaluated by a bilingual group representing the culture of the target population. Finally, the tool was evaluated by two bilingual professional researchers in terms of content validity, translation validity and psychometric estimates of validity and reliability. The analysis included a total of 209 Turkish women. The validity of the scale was confirmed by confirmatory factor analysis and criterion-related validity testing. The Champion's Health Belief Model Scales for Mammography Screening aligned to four factors that were coherent and relatively independent of each other. There was a statistically significant relationship among all of the subscale items: a positive, high correlation with the total item test score and a high Cronbach's α. The scale has strong stability over time, and it demonstrated acceptable preliminary values of reliability and validity. The Champion's Health Belief Model Scales for Mammography Screening is both a reliable and valid instrument that can be useful in measuring the health beliefs of Turkish women. It can be used to provide data about healthcare practices required for mammography screening and breast cancer prevention. This scale will show nurses that nursing intervention planning is essential for increasing Turkish women's participation in mammography screening. © 2013 John Wiley & Sons Ltd.

  4. Simulation of runoff and nutrient export from a typical small watershed in China using the Hydrological Simulation Program-Fortran.

    PubMed

    Li, Zhaofu; Liu, Hongyu; Luo, Chuan; Li, Yan; Li, Hengpeng; Pan, Jianjun; Jiang, Xiaosan; Zhou, Quansuo; Xiong, Zhengqin

    2015-05-01

    The Hydrological Simulation Program-Fortran (HSPF), a hydrological and water-quality computer model developed by the United States Environmental Protection Agency, was employed to simulate runoff and nutrient export from a typical small watershed in a hilly eastern monsoon region of China. First, a parameter sensitivity analysis was performed to assess how changes in the model parameters affect runoff and nutrient export. Next, the model was calibrated and validated using measured runoff and nutrient concentration data. The Nash-Sutcliffe efficiency (E_NS) values for yearly runoff were 0.87 and 0.69 for the calibration and validation periods, respectively. For storm runoff events, the E_NS values were 0.93 for the calibration period and 0.47 for the validation period; antecedent precipitation and soil moisture conditions can affect the simulation accuracy of storm event flow. The E_NS values for total nitrogen (TN) export were 0.58 for the calibration period and 0.51 for the validation period, and the correlation coefficients between the observed and simulated TN concentrations were 0.84 and 0.74, respectively. For phosphorus export, the E_NS values were 0.89 for the calibration period and 0.88 for the validation period, and the correlation coefficients between the observed and simulated orthophosphate concentrations were 0.96 and 0.94, respectively. The nutrient simulation results are generally satisfactory even though the parameter-lumped HSPF model cannot represent the effects of the spatial pattern of land cover on nutrient export. The model parameters obtained in this study could serve as reference values for applying the model to similar regions. HSPF can properly describe the water quantity and quality processes in this area; after adjustment, calibration, and validation of the parameters, the model is suitable for hydrological and water-quality simulations in watershed planning and management and for designing best management practices.
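
    The Nash-Sutcliffe efficiency used throughout this record is a standard goodness-of-fit measure; written out (array names are illustrative):

      import numpy as np

      def nash_sutcliffe(simulated, observed):
          """E_NS = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2).
          A value of 1 is a perfect fit; values <= 0 mean the model is no
          better than the mean of the observations."""
          simulated, observed = np.asarray(simulated), np.asarray(observed)
          return 1.0 - np.sum((observed - simulated) ** 2) / np.sum((observed - observed.mean()) ** 2)

      # E_NS = 0.87 for the calibration period corresponds to residual errors
      # amounting to only 13% of the observed variance about the mean.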

  5. Chemical decontamination technical resources at Los Alamos National Laboratory (2008)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moore, Murray E

    This document supplies information resources for a person seeking to create planning or pre-planning documents for chemical decontamination operations. A building decontamination plan can be separated into four sections: Pre-planning, Characterization, Decontamination (initial response and complete cleanup), and Clearance. The identified Los Alamos resources can be matched with these four sections. Pre-planning -- Dave Seidel, EO-EPP, Emergency Planning and Preparedness; David DeCroix and Bruce Letellier, D-3, computational fluids modeling of structures; Murray E. Moore, RP-2, aerosol sampling and ventilation engineering. Characterization (this can include development projects) -- Beth Perry, IAT-3, Nuclear Counterterrorism Response (SNIPER database); Fernando Garzon, MPA-11, Sensors and Electrochemical Devices (development); George Havrilla, C-CDE, Chemical Diagnostics and Engineering; Kristen McCabe, B-7, Biosecurity and Public Health. Decontamination -- Adam Stively, EO-ER, Emergency Response; Dina Matz, IHS-IP, industrial hygiene; Don Hickmott, EES-6, chemical cleanup. Clearance (validation) -- Larry Ticknor, CCS-6, Statistical Sciences.

  6. Extension, validation and application of the NASCAP code

    NASA Technical Reports Server (NTRS)

    Katz, I.; Cassidy, J. J., III; Mandell, M. J.; Schnuelle, G. W.; Steen, P. G.; Parks, D. E.; Rotenberg, M.; Alexander, J. H.

    1979-01-01

    Numerous extensions were made to the NASCAP code. They fall into three categories: a greater range of definable objects, a more sophisticated computational model, and simplified code structure and usage. An important validation of NASCAP was performed using a new two-dimensional computer code (TWOD). An interactive code (MATCHG) was written to compare material parameter inputs with charging results. The first major application of NASCAP was performed on the SCATHA satellite; shadowing and charging calculations were completed. NASCAP was installed at the Air Force Geophysics Laboratory, where researchers plan to use it to interpret SCATHA data.

  7. Model Development and Process Analysis for Lean Cellular Design Planning in Aerospace Assembly and Manufacturing

    NASA Astrophysics Data System (ADS)

    Hilburn, Monty D.

    Successful lean manufacturing and cellular manufacturing execution relies upon a foundation of leadership commitment and strategic planning built upon solid data and robust analysis. The problem for this study was to create and employ a simple lean transformation planning model and review process that could be used to identify the functional support staff resources required to plan and execute lean manufacturing cells within aerospace assembly and manufacturing sites. The lean planning model was developed using available literature on lean manufacturing kaizen best practices and validated through a Delphi panel of lean experts. The resulting model and a standardized review process were used to assess the state of lean transformation planning at five sites of an international aerospace manufacturing and assembly company. The results of the three-day, on-site reviews were compared with baseline plans collected from each of the five sites to determine whether they differed. The data were analyzed with focus on three critical areas of lean planning: the number and type of manufacturing cells identified; the number, type, and duration of planned lean and continuous kaizen events; and the quantity and type of functional staffing resources planned to support the kaizen schedule. Summarized data from the baseline and on-site reviews were analyzed with descriptive statistics. ANOVAs and paired t-tests at the 95% significance level were conducted on the means of the data sets to determine whether null hypotheses related to cell, kaizen event, and support resources could be rejected. The research found significant differences between lean transformation plans developed by site leadership and plans developed utilizing the structured on-site review process and lean transformation planning model. The null hypothesis that there was no difference between the means of pre-review and on-site cell counts was rejected, as was the null hypothesis that there was no significant difference in kaizen event plans. These factors are critical inputs into the support staffing resources calculation used by the lean planning model. The null hypothesis related to functional support staff resources was rejected for most functional groups, indicating that the baseline site plans inadequately provided for the cross-functional staff involvement needed to support the lean transformation plan. Null hypotheses related to total lean transformation staffing could not be rejected, indicating that while total staffing plans were not significantly different from plans developed during the on-site review and through use of the lean planning model, the allocation of staffing among functional groups such as engineering, production, and materials planning was an issue. The on-site review process and the simple lean transformation planning model were found to be useful in identifying shortcomings in lean transformation planning within aerospace manufacturing and assembly sites. It was concluded that the differences uncovered were likely contributing factors affecting the effectiveness of aerospace manufacturing sites' implementation of lean cellular manufacturing.
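
    A hedged sketch of the hypothesis tests described — a paired t-test on baseline versus on-site review counts at the 95% level; the numbers are invented stand-ins for the five sites' cell counts:

      import numpy as np
      from scipy import stats

      baseline = np.array([6, 4, 8, 5, 7])    # cells identified in each site's baseline plan
      on_site  = np.array([9, 6, 11, 8, 10])  # cells identified during the structured review

      t, p = stats.ttest_rel(baseline, on_site)
      print(f"t = {t:.2f}, p = {p:.4f}; reject H0 at alpha = 0.05" if p < 0.05
            else f"t = {t:.2f}, p = {p:.4f}; fail to reject H0")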

  8. Helium ions for radiotherapy? Physical and biological verifications of a novel treatment modality

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krämer, Michael, E-mail: m.kraemer@gsi.de; Scifoni, Emanuele; Schuy, Christoph

    Purpose: Modern facilities for actively scanned ion beam radiotherapy allow in principle the use of helium beams, which could present specific advantages, especially for pediatric tumors. In order to assess the potential use of these beams for radiotherapy, i.e., to create realistic treatment plans, the authors set up a dedicated ⁴He beam model providing base data for their treatment planning system TRiP98, which they report in this work together with its physical and biological validation. Methods: A semiempirical beam model for the physical depth dose deposition and the production of nuclear fragments was developed and introduced in TRiP98. For the biological effect calculations, the latest version of the local effect model was used. The model predictions were experimentally verified at the HIT facility. The primary beam attenuation and the characteristics of secondary charged particles at various depths in water were investigated using ⁴He ion beams of 200 MeV/u. The nuclear charge of secondary fragments was identified using a ΔE/E telescope. 3D absorbed dose distributions were measured with pin-point ionization chambers, and the biological dosimetry experiments were performed by irradiating a stack of Chinese hamster ovary cells arranged in an extended target. Results: The few experimental data available on basic physical processes are reproduced by the beam model. The experimental verification of absorbed dose distributions in extended target volumes yields an overall agreement, with a slight underestimation of the lateral spread. Cell survival along a 4 cm extended target is reproduced with remarkable accuracy. Conclusions: The authors present a simple simulation model for therapeutic ⁴He beams, introduced in TRiP98 and validated experimentally by means of physical and biological dosimetry. It is now possible to perform detailed treatment planning studies with ⁴He beams, either exclusively or in combination with other ion modalities.

  9. "Could I return to my life?" Integrated Narrative Nursing Model in Education (INNE).

    PubMed

    Artioli, Giovanna; Foà, Chiara; Cosentino, Chiara; Sulla, Francesco; Sollami, Alfonso; Taffurelli, Chiara

    2018-03-28

    The Integrated Narrative Nursing Model (INNM) is an approach that integrates the qualitative methodology typical of the human sciences with the quantitative methodology more often associated with the natural sciences. This complex model, which combines a focus on narrative with quantitative measures, has recently been applied effectively to the assessment of chronic patients. In this study, the model is applied to the planning phase of education (Integrated Narrative Nursing Education, INNE) and proves to be a valid instrument for promoting the current educational paradigm, which is centered on the engagement of both the patient and the caregiver in their own path of care. The aim of this study is therefore to describe the nurse's strategy in planning an educational intervention using the INNE model. The case of a 70-year-old woman with pulmonary neoplasm at her first admission to hospice is described, along with each step taken by the reference nurse, who uses INNE to record the nurse-patient narrative and collect subsequent questionnaires in order to create a shared educational plan. The information collected was subjected, following a grounded methodology, to four levels of analysis: I. needs assessment, II. narrative diagnosis, III. quantitative outcome, and IV. integrated outcome. Step IV, derived from the integration of all levels of analysis, allows the nurse to define, even graphically, a conceptual map of the patient's needs, resources and perspectives in a completely tailored manner. The INNE model offers valid methodological support for the professional who intends to educate the patient through an inter-subjective and engaged pathway between the professional, the patient and the socio-relational context. It is a matter of adopting a complex vision that combines processes and methods requiring a solid scientific basis and advanced methodological expertise with active listening and empathy - skills which require emotional intelligence.

  10. Adaptation and Validation of the Tower of London Test of Planning and Problem Solving in People with Intellectual Disabilities

    ERIC Educational Resources Information Center

    Masson, J. D.; Dagnan, D.; Evans, J.

    2010-01-01

    Background: There is a need for validated, standardised tools for the assessment of executive functions in adults with intellectual disabilities (ID). This study examines the validity of a test of planning and problem solving (Tower of London) with adults with ID. Method: Participants completed an adapted version of the Tower of London (ToL) while…

  11. Application of Large-Scale Aptamer-Based Proteomic Profiling to Planned Myocardial Infarctions.

    PubMed

    Jacob, Jaison; Ngo, Debby; Finkel, Nancy; Pitts, Rebecca; Gleim, Scott; Benson, Mark D; Keyes, Michelle J; Farrell, Laurie A; Morgan, Thomas; Jennings, Lori L; Gerszten, Robert E

    2018-03-20

    Emerging proteomic technologies using novel affinity-based reagents allow for efficient multiplexing with high-sample throughput. To identify early biomarkers of myocardial injury, we recently applied an aptamer-based proteomic profiling platform that measures 1129 proteins to samples from patients undergoing septal alcohol ablation for hypertrophic cardiomyopathy, a human model of planned myocardial injury. Here, we examined the scalability of this approach using a markedly expanded platform to study a far broader range of human proteins in the context of myocardial injury. We applied a highly multiplexed, expanded proteomic technique that uses single-stranded DNA aptamers to assay 4783 human proteins (4137 distinct human gene targets) to derivation and validation cohorts of planned myocardial injury, individuals with spontaneous myocardial infarction, and at-risk controls. We found 376 target proteins that significantly changed in the blood after planned myocardial injury in a derivation cohort (n=20; P<1.05E-05, 1-way repeated measures analysis of variance, Bonferroni threshold). Two hundred forty-seven of these proteins were validated in an independent planned myocardial injury cohort (n=15; P<1.33E-04, 1-way repeated measures analysis of variance); >90% were directionally consistent and reached nominal significance in the validation cohort. Among the validated proteins that were increased within 1 hour after planned myocardial injury, 29 were also elevated in patients with spontaneous myocardial infarction (n=63; P<6.17E-04). Many of the novel markers identified in our study are intracellular proteins not previously identified in the peripheral circulation or have functional roles relevant to myocardial injury. For example, the cardiac LIM protein, cysteine- and glycine-rich protein 3, is thought to mediate cardiac mechanotransduction and stress responses, whereas the mitochondrial ATP synthase F₀ subunit component is a vasoactive peptide on its release from cells. Last, we performed aptamer-affinity enrichment coupled with mass spectrometry to technically verify aptamer specificity for a subset of the new biomarkers. Our results demonstrate the feasibility of large-scale aptamer multiplexing at a level that has not previously been reported and with sample throughput that greatly exceeds other existing proteomic methods. The expanded aptamer-based proteomic platform provides a unique opportunity for biomarker and pathway discovery after myocardial injury. © 2017 American Heart Association, Inc.
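
    The significance thresholds quoted above are consistent with a Bonferroni correction of α = 0.05 for the number of tests at each stage, which can be checked directly:

      alpha = 0.05
      print(alpha / 4783)  # derivation stage: 4783 proteins tested -> ~1.05E-05
      print(alpha / 376)   # validation stage: 376 candidates      -> ~1.33E-04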

  12. Novel prediction model of renal function after nephrectomy from automated renal volumetry with preoperative multidetector computed tomography (MDCT).

    PubMed

    Isotani, Shuji; Shimoyama, Hirofumi; Yokota, Isao; Noma, Yasuhiro; Kitamura, Kousuke; China, Toshiyuki; Saito, Keisuke; Hisasue, Shin-ichi; Ide, Hisamitsu; Muto, Satoru; Yamaguchi, Raizo; Ukimura, Osamu; Gill, Inderbir S; Horie, Shigeo

    2015-10-01

    A predictive model of postoperative renal function may inform the planning of nephrectomy. Our aims were to develop a novel predictive model combining clinical indices with computer volumetry to measure the preserved renal cortex volume (RCV) using multidetector computed tomography (MDCT), and to prospectively validate the model's performance. A total of 60 patients undergoing radical nephrectomy from 2011 to 2013 participated, comprising a development cohort of 39 patients and an external validation cohort of 21 patients. RCV was calculated by voxel count using software (Vincent, FUJIFILM). Renal function before and after radical nephrectomy was assessed via the estimated glomerular filtration rate (eGFR). Factors affecting postoperative eGFR were examined by regression analysis, using backward elimination, to develop the novel model for predicting postoperative eGFR. The predictive model was externally validated and its performance compared with that of previously reported models. The postoperative eGFR value was associated with age, preoperative eGFR, preserved renal parenchymal volume (RPV), preserved RCV, % RPV alteration, and % RCV alteration (p < 0.01). The variables significantly correlated with % eGFR alteration were % RCV preservation (r = 0.58, p < 0.01) and % RPV preservation (r = 0.54, p < 0.01). The resulting regression model is: postoperative eGFR = 57.87 - 0.55(age) - 15.01(body surface area) + 0.30(preoperative eGFR) + 52.92(%RCV preservation). A strong correlation was seen between postoperative eGFR and the model's estimates (r = 0.83; p < 0.001). In the external validation cohort (n = 21), our model outperformed previously reported models. Combining MDCT renal volumetry and clinical indices might yield an important tool for predicting postoperative renal function.
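
    The published regression model can be written directly as a function; units follow the abstract (age in years, body surface area in m², eGFR in mL/min/1.73 m²), and %RCV preservation is assumed here to be entered as a fraction (0-1) — if the paper entered it as a percentage, the last coefficient would scale accordingly:

      def predicted_postop_egfr(age, bsa, preop_egfr, rcv_preservation):
          # Coefficients as reported in the abstract.
          return (57.87
                  - 0.55 * age
                  - 15.01 * bsa
                  + 0.30 * preop_egfr
                  + 52.92 * rcv_preservation)

      # e.g. a 60-year-old, 1.7 m² body surface area, preoperative eGFR 75,
      # with 50% of the renal cortex preserved:
      print(predicted_postop_egfr(60, 1.7, 75, 0.5))  # ~48.3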

  13. High resolution infrared datasets useful for validating stratospheric models

    NASA Technical Reports Server (NTRS)

    Rinsland, Curtis P.

    1992-01-01

    An important objective of the High Speed Research Program (HSRP) is to support research in the atmospheric sciences that will improve the basic understanding of the circulation and chemistry of the stratosphere and lead to an interim assessment of the impact of a projected fleet of High Speed Civil Transports (HSCT's) on the stratosphere. As part of this work, critical comparisons between models and existing high quality measurements are planned. These comparisons will be used to test the reliability of current atmospheric chemistry models. Two suitable sets of high resolution infrared measurements are discussed.

  14. Adaptation and validation of the Tower of London test of planning and problem solving in people with intellectual disabilities.

    PubMed

    Masson, J D; Dagnan, D; Evans, J

    2010-05-01

    There is a need for validated, standardised tools for the assessment of executive functions in adults with intellectual disabilities (ID). This study examines the validity of a test of planning and problem solving (Tower of London) with adults with ID. Participants completed an adapted version of the Tower of London (ToL) while day-centre staff completed adaptive function (Adaptive Behaviour Scale - Residential and Community: Second Edition, modified version) and dysexecutive function (DEX-Independent Rater) questionnaires for each participant. Correlation analyses of test and questionnaire variables were undertaken. The adapted ToL has a robust structure and shows significant associations with independent living skills, challenging behaviour and behaviours related to dysexecutive function. The adapted ToL is a valid test for use with people with ID. However, there is also a need to develop other ecologically valid tools based on everyday planning tasks undertaken by people with ID.

  15. Parameter Optimisation and Uncertainty Analysis in Visual MODFLOW based Flow Model for predicting the groundwater head in an Eastern Indian Aquifer

    NASA Astrophysics Data System (ADS)

    Mohanty, B.; Jena, S.; Panda, R. K.

    2016-12-01

    Overexploitation of groundwater has led to the abandonment of several shallow tube wells in the study basin in Eastern India. For the sustainability of groundwater resources, basin-scale modelling of groundwater flow is indispensable for effective planning and management of water resources. The intent of this study is to develop a 3-D groundwater flow model of the study basin using the Visual MODFLOW Flex 2014.2 package and to calibrate and validate the model using 17 years of observed data. A sensitivity analysis was carried out to quantify the susceptibility of the aquifer system to river bank seepage, recharge from rainfall and agricultural practices, horizontal and vertical hydraulic conductivities, and specific yield. To quantify the impact of parameter uncertainties, the Sequential Uncertainty Fitting Algorithm (SUFI-2) and Markov chain Monte Carlo (McMC) techniques were implemented; results from the two techniques were compared and their advantages and disadvantages analysed. The Nash-Sutcliffe coefficient (NSE), coefficient of determination (R²), mean absolute error (MAE), mean percent deviation (Dv) and root mean squared error (RMSE) were adopted as model evaluation criteria during calibration and validation. NSE, R², MAE, Dv and RMSE values for the groundwater flow model during calibration and validation were in the acceptable range, and the McMC technique provided more reasonable results than SUFI-2. The calibrated and validated model will be useful for identifying aquifer properties, analysing groundwater flow dynamics, and forecasting changes in groundwater levels.

  16. Transport aircraft loading and balancing system: Using a CLIPS expert system for military aircraft load planning

    NASA Technical Reports Server (NTRS)

    Richardson, J.; Labbe, M.; Belala, Y.; Leduc, Vincent

    1994-01-01

    The requirement to improve aircraft utilization and responsiveness in airlift operations has been recognized for quite some time by the Canadian Forces. To date, the utilization of scarce airlift resources has been planned mainly through manpower-intensive manual methods combined with the expertise of highly qualified personnel. In this paper, we address the problem of facilitating the load planning process for military cargo aircraft through the development of a computer-based system. We introduce TALBAS (Transport Aircraft Loading and BAlancing System), a knowledge-based system designed to assist personnel in preparing valid load plans for the C130 Hercules aircraft. The main features of this system, accessible through a user-friendly graphical user interface, consist of the automatic generation of valid cargo arrangements given a list of items to be transported, user definition of load plans, and automatic validation of such load plans.
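
    A hedged sketch of the kind of automatic load-plan validation such a system performs — a total-weight and centre-of-gravity envelope check; the limits and item data below are invented, not C130 figures:

      def validate_load_plan(items, max_weight=20000.0, cg_limits=(9.0, 12.0)):
          """items: list of (weight_kg, station_m) tuples; returns (ok, cg_m)."""
          total = sum(w for w, _ in items)
          cg = sum(w * s for w, s in items) / total  # moment / total weight
          ok = total <= max_weight and cg_limits[0] <= cg <= cg_limits[1]
          return ok, cg

      # Three pallets at different fuselage stations:
      print(validate_load_plan([(3000, 8.0), (4500, 11.0), (2000, 13.0)]))  # (True, ~10.5)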

  17. Satellite (SWOT) and Airborne (AirSWOT) Wide-Swath Altimeters to Study the Garonne River

    NASA Astrophysics Data System (ADS)

    Biancamaria, S.; Rodriguez, E.; Goutal, N.; Ricci, S.; Mognard, N.; Rogel, P.; Le Pape, E.

    2013-09-01

    The future NASA/CNES Surface Water and Ocean Topography (SWOT) satellite mission will provide global 2D maps of water elevation, water surface volume change and river discharge at an unprecedented resolution. To prepare this mission, airborne campaigns, called AirSWOT, will fly over the Garonne River (and other targets of interest) in 2014. To plan the AirSWOT flights over the Garonne, 1D and 2D hydrodynamic models of the 50 km reach of the Garonne River between the towns of Tonneins and La Reole, developed by the Laboratoire National d'Hydraulique et Environnement (LNHE), will be used. Model outputs will help validate the airborne measurements. After validation, AirSWOT measurements will be assimilated into the models to reduce model errors. Finally, potential algorithms for estimating discharge from AirSWOT and SWOT observations will be tested over this river reach. This paper presents the study domain, the hydrodynamic models and their use in the context of the AirSWOT campaigns in France.

  18. Artificial neural network based gynaecological image-guided adaptive brachytherapy treatment planning correction of intra-fractional organs at risk dose variation

    PubMed Central

    Jaberi, Ramin; Aghamiri, Mahmoud Reza; Kirisits, Christian; Ghaderi, Reza

    2017-01-01

    Purpose: Intra-fractional organ-at-risk (OAR) deformations can lead to dose variation during image-guided adaptive brachytherapy (IGABT). The aim of this study was to modify the final accepted brachytherapy treatment plan to dosimetrically compensate for these intra-fractional organ-applicator position variations while still fulfilling the dosimetric criteria. Material and methods: Thirty patients with locally advanced cervical cancer, treated with external beam radiotherapy (EBRT) of 45-50 Gy over five to six weeks with concomitant weekly chemotherapy and qualified for intracavitary high-dose-rate (HDR) brachytherapy with tandem-ovoid applicators, were selected for this study. A second computed tomography scan was performed for each patient after the brachytherapy treatment, with applicators in situ. Artificial neural network (ANN) based models were used to predict intra-fractional OAR dose-volume histogram parameter variations and propose a new final plan. Results: A model was developed to estimate intra-fractional organ dose variations during gynaecological intracavitary brachytherapy, and ANNs were used to modify the final brachytherapy treatment plan to compensate dosimetrically for changes in organ-applicator positions while maintaining the target dose at the original level. Conclusions: These semi-automatic, fast-responding models can be used in the routine clinical workflow to reduce individual IGABT uncertainties. They can be further validated with more patients' plans in order to serve as a clinical tool. PMID:29441094
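
    An illustrative sketch of the ANN approach, using a small multi-layer perceptron regressor to map planning-time features to an intra-fractional DVH-parameter change; the features, shapes and target below are synthetic placeholders, not the study's inputs:

      import numpy as np
      from sklearn.neural_network import MLPRegressor

      rng = np.random.default_rng(2)
      X = rng.normal(size=(300, 6))  # placeholders for applicator/organ geometry features
      y = 0.5 * X[:, 0] - 0.3 * X[:, 2] + rng.normal(0, 0.1, 300)  # placeholder for an OAR DVH change (Gy)

      net = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0)
      net.fit(X[:250], y[:250])                     # train on the first 250 cases
      print("validation R^2:", net.score(X[250:], y[250:]))  # hold-out performance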

  19. A comprehensive surface-groundwater flow model

    NASA Astrophysics Data System (ADS)

    Arnold, Jeffrey G.; Allen, Peter M.; Bernhardt, Gilbert

    1993-02-01

    In this study, a simple groundwater flow and height model was added to an existing basin-scale surface water model. The linked model is: (1) watershed scale, allowing the basin to be subdivided; (2) designed to accept readily available inputs to allow general use over large regions; (3) continuous in time to allow simulation of land management, including such factors as climate and vegetation changes, pond and reservoir management, groundwater withdrawals, and stream and reservoir withdrawals. The model is described, and is validated on a 471 km² watershed near Waco, Texas. This linked model should provide a comprehensive tool for water resource managers in development and planning.

  20. A NASTRAN model of a large flexible swing-wing bomber. Volume 5: NASTRAN model development-fairing structure

    NASA Technical Reports Server (NTRS)

    Mock, W. D.; Latham, R. A.

    1982-01-01

    The NASTRAN model plan for the fairing structure was expanded in detail to generate the NASTRAN model of this substructure. The grid point coordinates, element definitions, material properties, and sizing data for each element were specified. The fairing model was thoroughly checked out for continuity, connectivity, and constraints. The substructure was processed for structural influence coefficients (SIC) point loadings to determine the deflection characteristics of the fairing model. Finally, a demonstration and validation processing of this substructure was accomplished using the NASTRAN finite element program. The bulk data deck, stiffness matrices, and SIC output data were delivered.

  1. Predicting intention to attend and actual attendance at a universal parent-training programme: a comparison of social cognition models.

    PubMed

    Thornton, Sarah; Calam, Rachel

    2011-07-01

    The predictive validity of the Health Belief Model (HBM) and the Theory of Planned Behaviour (TPB) was examined in relation to 'intention to attend' and 'actual attendance' at a universal parent-training intervention for parents of children with behavioural difficulties. A validation and reliability study was conducted to develop two questionnaires (N = 108 parents of children aged 4-7). These questionnaires were then used to investigate the predictive validity of the two models in relation to 'intention to attend' and 'actual attendance' at a parent-training intervention (N = 53 parents of children aged 4-7). Both models significantly predicted 'intention to attend a parent-training group'; however, the TPB accounted for more variance in the outcome variable than the HBM. Preliminary investigations highlighted that attendees were more likely to intend to attend the groups, have positive attitudes towards the groups, perceive important others as having positive attitudes towards the groups, and report elevated child problem behaviour scores. These findings provide useful information regarding the belief-based factors that affect attendance at universal parent-training groups. Possible interventions aimed at increasing 'intention to attend' and 'actual attendance' at parent-training groups are discussed.

  2. Duchenne Regulatory Science Consortium Meeting on Disease Progression Modeling for Duchenne Muscular Dystrophy

    PubMed Central

    Larkindale, Jane; Abresch, Richard; Aviles, Enrique; Bronson, Abby; Chin, Janice; Furlong, Pat; Gordish-Dressman, Heather; Habeeb-Louks, Elizabeth; Henricson, Erik; Kroger, Hans; Lynn, Charles; Lynn, Stephen; Martin, Dana; Nuckolls, Glen; Rooney, William; Romero, Klaus; Sweeney, Lee; Vandenborne, Krista; Walter, Glenn; Wolff, Jodi; Wong, Brenda; McDonald, Craig M.; members of the Duchenne Regulatory Science Consortium, Imaging-DMD Consortium and the CINRG Investigators

    2017-01-01

    Introduction: The Duchenne Regulatory Science Consortium (D-RSC) was established to develop tools to accelerate drug development for Duchenne muscular dystrophy (DMD). The resulting tools are anticipated to meet the validity requirements outlined by the qualification/endorsement pathways at both the U.S. Food and Drug Administration (FDA) and the European Medicines Agency (EMA), and will be made available to the drug development community. The initial goals of the consortium include the development of a disease progression model intended to forecast changes in clinically meaningful endpoints and thereby inform clinical trial protocol development and data analysis. Methods: In April 2016, the consortium and other experts met to formulate plans for the development of the model. Conclusions: Here we report the results of the meeting and the discussion as to the form of the model that we plan to develop, after input from the regulatory authorities. PMID:28228973

  3. Space Weather Modeling at the Community Coordinated Modeling Center

    NASA Astrophysics Data System (ADS)

    Hesse, M.; Falasca, A.; Johnson, J.; Keller, K.; Kuznetsova, M.; Rastaetter, L.

    2003-04-01

    The Community Coordinated Modeling Center (CCMC) is a multi-agency partnership aimed at the creation of next generation space weather models. The goal of the CCMC is to support the research and developmental work necessary to substantially increase the present-day modeling capability for space weather purposes, and to provide models for transition to the rapid prototyping centers at the space weather forecast centers. This goal requires close collaborations with and substantial involvement of the research community. The physical regions to be addressed by CCMC-related activities range from the solar atmosphere to the Earth's upper atmosphere. The CCMC is an integral part of NASA's Living With a Star (LWS) initiative, of the National Space Weather Program Implementation Plan, and of the Department of Defense Space Weather Transition Plan. CCMC includes a facility at NASA Goddard Space Flight Center, as well as distributed computing facilities provided by the US Air Force. CCMC also provides, to the research community, access to state-of-the-art space research models. In this paper we will provide updates on CCMC status, on current plans, research and development accomplishments and goals, and on the model testing and validation process undertaken as part of the CCMC mandate. We will demonstrate the capabilities of models resident at CCMC via the analysis of a geomagnetic storm, driven by a shock in the solar wind.

  4. Stakeholder validation of a model of readiness for transition to adult care.

    PubMed

    Schwartz, Lisa A; Brumley, Lauren D; Tuchman, Lisa K; Barakat, Lamia P; Hobbie, Wendy L; Ginsberg, Jill P; Daniel, Lauren C; Kazak, Anne E; Bevans, Katherine; Deatrick, Janet A

    2013-10-01

    That too few youth with special health care needs successfully make the transition to adult-oriented health care may be due, in part, to a lack of readiness to transfer care. Theoretical models to guide the development and implementation of evidence-based guidelines, assessments, and interventions to improve transition readiness are lacking. The objective was to further validate the Social-ecological Model of Adolescent and Young Adult Readiness to Transition (SMART) via feedback from stakeholders (patients, parents, and providers) from a medically diverse population in need of life-long follow-up care: survivors of childhood cancer. A mixed-methods participatory research design was used at a large Mid-Atlantic children's hospital, with adolescent and young adult survivors of childhood cancer (n = 14), parents (n = 18), and pediatric providers (n = 10). Patients and parents participated in focus groups; providers participated in individual semi-structured interviews. Validity of SMART was assessed in three ways: (1) ratings of the importance of SMART components for transition readiness on a 5-point scale (0-4; ratings >2 support validity), (2) nominations of the three "most important" components, and (3) directed content analysis of focus group/interview transcripts. Qualitative data supported the validity of SMART, with minor modifications to definitions of components. Quantitative ratings met criteria for validity; stakeholders endorsed all components of SMART as important for transition. No additional SMART variables were suggested by stakeholders, and the "most important" components varied by stakeholder, supporting the comprehensiveness of SMART and the need to involve multiple perspectives. SMART represents a comprehensive and empirically validated framework for transition research and program planning, supported by survivors of childhood cancer, parents, and pediatric providers. Future research should validate SMART among other populations with special health care needs.

  5. Modeling demand for public transit services in rural areas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Attaluri, P.; Seneviratne, P.N.; Javid, M.

    1997-05-01

    Accurate estimates of demand are critical for planning, designing, and operating public transit systems. Previous research has demonstrated that the expected demand in rural areas is a function of both demographic and transit system variables. Numerous models have been proposed to describe the relationship between the aforementioned variables. However, most of them are site specific and their validity over time and space is not reported or perhaps has not been tested. Moreover, input variables in some cases are extremely difficult to quantify. In this article, the estimation of demand using the generalized linear modeling technique is discussed. Two separate models, one for fixed-route and another for demand-responsive services, are presented. These models, calibrated with data from systems in nine different states, are used to demonstrate the appropriateness and validity of generalized linear models compared to regression models. They explain over 70% of the variation in expected demand for fixed-route services and 60% of the variation in expected demand for demand-responsive services. It was found that the models are spatially transferable and that data for calibration are easily obtainable.

  6. Performance assessment of Large Eddy Simulation (LES) for modeling dispersion in an urban street canyon with tree planting

    NASA Astrophysics Data System (ADS)

    Moonen, P.; Gromke, C.; Dorer, V.

    2013-08-01

    The potential of a Large Eddy Simulation (LES) model to reliably predict near-field pollutant dispersion is assessed. To that end, detailed time-resolved numerical simulations of coupled flow and dispersion are conducted for a street canyon with tree planting, considering different crown porosities. The model performance is assessed in several steps, ranging from qualitative comparison with measured concentrations, through statistical data analysis by means of scatter plots and box plots, to the calculation of objective validation metrics. The extensive validation effort highlights and quantifies notable features and shortcomings of the model which would otherwise remain unnoticed. The model performance is found to be spatially non-uniform: closer agreement with measurement data is achieved near the canyon ends than in the central part of the canyon, and typical model acceptance criteria are satisfied more easily for the leeward than for the windward canyon wall. This demonstrates the need for rigorous model evaluation; only quality-assured models can be used with confidence to support the assessment, planning and implementation of pollutant mitigation strategies.
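
    The record does not name the objective validation metrics used; the factor-of-two fraction (FAC2) and fractional bias (FB) sketched below are common choices in urban dispersion model evaluation and are shown only as an assumption, with toy concentration data:

      import numpy as np

      def fac2(pred, obs):
          """Fraction of predictions within a factor of two of the observations."""
          ratio = np.asarray(pred, float) / np.asarray(obs, float)
          return np.mean((ratio >= 0.5) & (ratio <= 2.0))

      def fractional_bias(pred, obs):
          """FB = (mean_obs - mean_pred) / (0.5 * (mean_obs + mean_pred))."""
          pred, obs = np.asarray(pred, float), np.asarray(obs, float)
          return 2.0 * (obs.mean() - pred.mean()) / (obs.mean() + pred.mean())

      c_obs  = np.array([1.0, 0.8, 0.5, 0.3])    # measured concentrations (normalised)
      c_pred = np.array([0.9, 1.1, 0.2, 0.35])   # simulated concentrations
      print(fac2(c_pred, c_obs), fractional_bias(c_pred, c_obs))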

  7. The validation index: a new metric for validation of segmentation algorithms using two or more expert outlines with application to radiotherapy planning.

    PubMed

    Juneja, Prabhjot; Evans, Philip M; Harris, Emma J

    2013-08-01

    Validation is required to ensure automated segmentation algorithms are suitable for radiotherapy target definition. In the absence of true segmentation, algorithmic segmentation is validated against expert outlining of the region of interest. Multiple experts are used to overcome inter-expert variability. Several approaches have been studied in the literature, but the most appropriate approach to combine the information from multiple expert outlines into a single validation metric is unclear. None considers a metric that can be tailored to case-specific requirements in radiotherapy planning. The validation index (VI), a new validation metric that uses the experts' level of agreement, was developed. A control parameter was introduced for the validation of segmentations required for different radiotherapy scenarios: for targets close to organs-at-risk and for difficult-to-discern targets, where large variation between experts is expected. The VI was evaluated using two simulated idealized cases and data from two clinical studies. The VI was compared with the commonly used pairwise Dice similarity coefficient (DSCpair-wise) and found to be more sensitive than the DSCpair-wise to changes in agreement between experts. The VI was shown to be adaptable to specific radiotherapy planning scenarios.
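
    The pairwise Dice similarity coefficient that the VI is benchmarked against is easy to reproduce; the sketch below computes algorithm-to-expert and inter-expert DSC on toy binary masks. The VI's own agreement-weighted formula is defined in the paper and is not reproduced here.

    ```python
    import numpy as np
    from itertools import combinations

    def dice(a, b):
        """Dice similarity coefficient between two binary masks."""
        inter = np.logical_and(a, b).sum()
        return 2.0 * inter / (a.sum() + b.sum())

    # One algorithmic segmentation and three slightly shifted expert outlines.
    algo = np.zeros((64, 64), bool); algo[20:40, 20:40] = True
    experts = []
    for shift in (0, 1, 2):
        m = np.zeros((64, 64), bool); m[20 + shift:40 + shift, 20:40] = True
        experts.append(m)

    dsc_pairwise = np.mean([dice(algo, e) for e in experts])
    inter_expert = np.mean([dice(a, b) for a, b in combinations(experts, 2)])
    print(f"mean algo-expert DSC: {dsc_pairwise:.3f}, inter-expert DSC: {inter_expert:.3f}")
    ```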

  8. Technical Note: Dosimetric effects of couch position variability on treatment plan quality with an MRI-guided Co-60 radiation therapy machine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chow, Phillip E., E-mail: pechow@mednet.ucla.edu

    2016-08-15

    Purpose: Magnetic resonance imaging (MRI) guidance in radiation therapy brings real-time imaging and adaptive planning into the treatment vault where it can account for interfraction and intrafraction movement of soft tissue. The only commercially available MRI-guided radiation therapy device is a three-head 60Co and MRI system with an integrated treatment planning system (TPS). Couch attenuation of the beam of up to 20% is well modeled in the TPS. Variations in the patient’s day-to-day position introduce discrepancies in the actual couch attenuation as modeled in the treatment plan. For this reason, the authors’ institution avoids plans with beams that pass through or near the couch edges. This study investigates the effects of differential beam attenuation by the couch due to couch shifts in order to determine whether couch edge avoidance restrictions can be lifted. Couch shifts were simulated using a Monte Carlo treatment planning system and ion chamber measurements performed for validation. Methods: A total of 27 plans from 23 patients were investigated. Couch shifts of 1 and 2 cm were introduced in combinations of lateral and vertical directions to simulate patient position variations, giving 16 shifted plans per reference plan. The 1 and 2 cm shifts were based on shifts recorded in 320 treatment fractions. Results: Following TG176 recommendations for measurement methods, couch attenuation measurements agreed with TPS modeled attenuation to within 2.1%. Planning target volume D95 changed less than 1% for 1 and 2 cm couch shifts in only the x-direction and less than 3% for all directions. Conclusions: Dosimetry of all plans tested was robust to couch shifts up to ±2 cm. In general, couch shifts resulted in clinically insignificant dosimetric deviations. It is conceivable that in certain cases with large systematic couch shifts and plans that are particularly sensitive to shifts, dosimetric changes might rise to a clinically significant level.

  9. A comprehensive scoring system to measure healthy community design in land use plans and regulations.

    PubMed

    Maiden, Kristin M; Kaplan, Marina; Walling, Lee Ann; Miller, Patricia P; Crist, Gina

    2017-02-01

    Comprehensive land use plans and their corresponding regulations play a role in determining the nature of the built environment and community design, which are factors that influence population health and health disparities. To determine the level to which a plan addresses healthy living and active design, there is a need for a systematic, reliable and valid method of analyzing and scoring health-related content in plans and regulations. This paper describes the development and validation of a scoring tool designed to measure the strength and comprehensiveness of health-related content found in land use plans and the corresponding regulations. The measures are scored based on the presence of a specific item and the specificity and action-orientation of language. To establish reliability and validity, 42 land use plans and regulations from across the United States were scored January-April 2016. Results of the psychometric analysis indicate the scorecard is a reliable scoring tool for land use plans and regulations related to healthy living and active design. Intraclass correlation coefficient (ICC) scores showed strong inter-rater reliability for total strength and comprehensiveness. ICC scores for total implementation scores showed acceptable consistency among scorers. Cronbach's alpha values for all focus areas were acceptable. Strong content validity was measured through a committee vetting process. The development of this tool has far-reaching implications, bringing standardization of measurement to the field of land use plan assessment, and paving the way for systematic inclusion of health-related design principles, policies, and requirements in land use plans and their corresponding regulations.
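
    Cronbach's alpha, one of the reliability statistics reported, can be computed directly from an item-score matrix. The sketch below uses synthetic scorecard data; the real tool's items, raters, and focus areas differ.

    ```python
    import numpy as np

    def cronbach_alpha(item_scores):
        """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
        item_scores = np.asarray(item_scores, float)
        k = item_scores.shape[1]
        item_var = item_scores.var(axis=0, ddof=1).sum()
        total_var = item_scores.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_var / total_var)

    # Hypothetical data: 42 plans scored on 5 correlated items.
    rng = np.random.default_rng(1)
    base = rng.normal(2, 1, (42, 1))
    scores = base + rng.normal(0, 0.5, (42, 5))
    print(f"alpha = {cronbach_alpha(scores):.2f}")
    ```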

  10. A singular value decomposition linear programming (SVDLP) optimization technique for circular cone based robotic radiotherapy.

    PubMed

    Liang, Bin; Li, Yongbao; Wei, Ran; Guo, Bin; Xu, Xuang; Liu, Bo; Li, Jiafeng; Wu, Qiuwen; Zhou, Fugen

    2018-01-05

    With robot-controlled linac positioning, robotic radiotherapy systems such as CyberKnife significantly increase freedom of radiation beam placement, but also impose more challenges on treatment plan optimization. The resampling mechanism in the vendor-supplied treatment planning system (MultiPlan) cannot fully explore the increased beam direction search space. Besides, a sparse treatment plan (using fewer beams) is desired to improve treatment efficiency. This study proposes a singular value decomposition linear programming (SVDLP) optimization technique for circular collimator based robotic radiotherapy. The SVDLP approach initializes the input beams by simulating the process of covering the entire target volume with equivalent beam tapers. The requirements on dosimetry distribution are modeled as hard and soft constraints, and the sparsity of the treatment plan is achieved by compressive sensing. The proposed linear programming (LP) model optimizes beam weights by minimizing the deviation of soft constraints subject to hard constraints, with a constraint on the l1 norm of the beam weight. A singular value decomposition (SVD) based acceleration technique was developed for the LP model. Based on the degeneracy of the influence matrix, the model is first compressed into a lower dimension for optimization, and then back-projected to reconstruct the beam weight. After beam weight optimization, the number of beams is reduced by removing the beams with low weight, and optimizing the weights of the remaining beams using the same model. This beam reduction technique is further validated by a mixed integer programming (MIP) model. The SVDLP approach was tested on a lung case. The results demonstrate that the SVD acceleration technique speeds up the optimization by a factor of 4.8. Furthermore, the beam reduction achieves a similar plan quality to the globally optimal plan obtained by the MIP model, but is one to two orders of magnitude faster. The SVDLP approach was also tested and compared with MultiPlan on three clinical cases of varying complexity. In general, the plans generated by the SVDLP achieve steeper dose gradients, better conformity and less damage to normal tissues. In conclusion, the SVDLP approach effectively improves the quality of the treatment plan due to the use of the complete beam search space. This challenging optimization problem with the complete beam search space is effectively handled by the proposed SVD acceleration.
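
    A heavily simplified sketch of the SVDLP ingredients follows: a low-rank SVD surrogate of a (nearly degenerate) influence matrix feeds a linear program that minimizes target underdose under an l1 budget on the beam weights. The paper's actual compression/back-projection scheme and constraint set are richer than this; matrix sizes, dose levels, and the budget are invented.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    rng = np.random.default_rng(2)
    n_vox, n_beams, rank = 200, 50, 10
    # Nearly degenerate influence matrix (dose per unit beam weight).
    D = rng.uniform(0, 1, (n_vox, rank)) @ rng.uniform(0, 1, (rank, n_beams))

    U, s, Vt = np.linalg.svd(D, full_matrices=False)
    Dk = (U[:, :rank] * s[:rank]) @ Vt[:rank]        # low-rank surrogate of D

    d_min, l1_budget = 1.0, 25.0
    # Variables x = [w (beam weights), u (per-voxel underdose slack)].
    c = np.r_[np.zeros(n_beams), np.ones(n_vox)]     # minimize total underdose
    A_ub = np.vstack([np.hstack([-Dk, -np.eye(n_vox)]),           # Dk w + u >= d_min
                      np.r_[np.ones(n_beams), np.zeros(n_vox)]])  # l1(w) <= budget
    b_ub = np.r_[-d_min * np.ones(n_vox), l1_budget]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=(0, None), method="highs")
    w = res.x[:n_beams]
    print(f"active beams: {(w > 1e-6).sum()} of {n_beams}")
    ```

    The l1 bound plays the compressive-sensing role the abstract mentions: tightening it drives more beam weights to zero, which is what makes the subsequent low-weight beam removal effective.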

  11. A singular value decomposition linear programming (SVDLP) optimization technique for circular cone based robotic radiotherapy

    NASA Astrophysics Data System (ADS)

    Liang, Bin; Li, Yongbao; Wei, Ran; Guo, Bin; Xu, Xuang; Liu, Bo; Li, Jiafeng; Wu, Qiuwen; Zhou, Fugen

    2018-01-01

    With robot-controlled linac positioning, robotic radiotherapy systems such as CyberKnife significantly increase freedom of radiation beam placement, but also impose more challenges on treatment plan optimization. The resampling mechanism in the vendor-supplied treatment planning system (MultiPlan) cannot fully explore the increased beam direction search space. Besides, a sparse treatment plan (using fewer beams) is desired to improve treatment efficiency. This study proposes a singular value decomposition linear programming (SVDLP) optimization technique for circular collimator based robotic radiotherapy. The SVDLP approach initializes the input beams by simulating the process of covering the entire target volume with equivalent beam tapers. The requirements on dosimetry distribution are modeled as hard and soft constraints, and the sparsity of the treatment plan is achieved by compressive sensing. The proposed linear programming (LP) model optimizes beam weights by minimizing the deviation of soft constraints subject to hard constraints, with a constraint on the l1 norm of the beam weight. A singular value decomposition (SVD) based acceleration technique was developed for the LP model. Based on the degeneracy of the influence matrix, the model is first compressed into a lower dimension for optimization, and then back-projected to reconstruct the beam weight. After beam weight optimization, the number of beams is reduced by removing the beams with low weight, and optimizing the weights of the remaining beams using the same model. This beam reduction technique is further validated by a mixed integer programming (MIP) model. The SVDLP approach was tested on a lung case. The results demonstrate that the SVD acceleration technique speeds up the optimization by a factor of 4.8. Furthermore, the beam reduction achieves a similar plan quality to the globally optimal plan obtained by the MIP model, but is one to two orders of magnitude faster. The SVDLP approach was also tested and compared with MultiPlan on three clinical cases of varying complexity. In general, the plans generated by the SVDLP achieve steeper dose gradients, better conformity and less damage to normal tissues. In conclusion, the SVDLP approach effectively improves the quality of the treatment plan due to the use of the complete beam search space. This challenging optimization problem with the complete beam search space is effectively handled by the proposed SVD acceleration.

  12. The Development and Validation of a School Unit Output Model for Recognizing Merit Schools.

    ERIC Educational Resources Information Center

    Rouse, Albert H.

    The current success and acceptability of the Design for School Excellence (DSE) in the Cincinnati (Ohio) Public Schools is due largely to the seriousness in which the Board of Education and the Superintendent communicated with school employees and the entire community in planning the final outcome of this program. From the original suggestion of a…

  13. Factors Affecting Acceptance & Use of ReWIND: Validating the Extended Unified Theory of Acceptance and Use of Technology

    ERIC Educational Resources Information Center

    Nair, Pradeep Kumar; Ali, Faizan; Leong, Lim Chee

    2015-01-01

    Purpose: This study aims to explain the factors affecting students' acceptance and usage of a lecture capture system (LCS)--ReWIND--in a Malaysian university based on the extended unified theory of acceptance and use of technology (UTAUT2) model. Technological advances have become an important feature of universities' plans to improve the…

  14. Adaptation of Teachers' Conceptions and Practices of Formative Assessment Scale into Turkish Culture and a Structural Equation Modeling

    ERIC Educational Resources Information Center

    Karaman, Pinar; Sahin, Çavus

    2017-01-01

    The purpose of this study was to adapt Teachers' Conceptions and Practices of Formative Assessment Scale (TCPFS) based on the Theory of Planned Behavior (TPB) into Turkish culture and apply the TPB to examine teachers' intentions and behaviors regarding formative assessment. After examining linguistic validity of the scale, Turkish scale was…

  15. Top-Mounted Propulsion Test Plans (TMP17)

    NASA Technical Reports Server (NTRS)

    Bridges, James; Henderson, Brenda; Huff, Dennis

    2017-01-01

    NASA recently completed a study of propulsion cycles and nozzle types applicable to a 70-passenger, M1.6 supersonic airliner, paying special attention to the noise produced during landing and take-off. The results of the study were validated in a model-scale test at NASA Glenn last summer. The findings of that study and test, along with other studies, have resulted in a new strategy for achieving the Commercial Supersonic Technology's goals for noise and performance. Key to that strategy is moving the propulsion to the top-side of the vehicle and modifying the nozzle and inlet to maximally shield the propulsion noise while maintaining efficient operation. Installed exhaust configurations have been designed to minimize the exhaust noise using new acoustic design tools. A test planned for the fall of 2017 will validate both the new design tools and the low-noise concept using a new translating phased array. During the test, questions regarding modifications of convected waves in the jet near-field that are key to new understandings of aft jet noise will be addressed. Also, to better tie rig results to real-world measurements, a model-scale version of a nozzle that was flight tested by Glenn Research Center in 2001 will be tested.

  16. Proton and helium ion radiotherapy for meningioma tumors: a Monte Carlo-based treatment planning comparison.

    PubMed

    Tessonnier, Thomas; Mairani, Andrea; Chen, Wenjing; Sala, Paola; Cerutti, Francesco; Ferrari, Alfredo; Haberer, Thomas; Debus, Jürgen; Parodi, Katia

    2018-01-09

    Due to their favorable physical and biological properties, helium ion beams are increasingly considered a promising alternative to proton beams for radiation therapy. Hence, this work aims at comparing in-silico the treatment of brain and ocular meningiomas with protons and helium ions, using for the first time a dedicated Monte Carlo (MC) based treatment planning engine (MCTP) thoroughly validated both in terms of physical and biological models. Starting from clinical treatment plans of four patients undergoing proton therapy with a fixed relative biological effectiveness (RBE) of 1.1 and a fraction dose of 1.8 Gy(RBE), new treatment plans were optimized with MCTP for both protons (with variable and fixed RBE) and helium ions (with variable RBE) under the same constraints derived from the initial clinical plans. The resulting dose distributions were dosimetrically compared in terms of dose volume histogram (DVH) parameters for the planning target volume (PTV) and the organs at risk (OARs), as well as dose difference maps. In most of the cases helium ion plans provided a similar PTV coverage as protons with a consistent trend of superior OAR sparing. The latter finding was attributed to the ability of helium ions to offer sharper distal and lateral dose fall-offs, as well as a more favorable differential RBE variation in target and normal tissue. Although more studies are needed to investigate the clinical potential of helium ions for different tumour entities, the results of this work based on an experimentally validated MC engine support the promise of this modality with state-of-the-art pencil beam scanning delivery, especially in the case of tumours growing in close proximity to multiple OARs such as meningiomas.

  17. Effect of Radiotherapy Planning Complexity on Survival of Elderly Patients With Unresected Localized Lung Cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, Chang H.; Bonomi, Marcelo; Cesaretti, Jamie

    2011-11-01

    Purpose: To evaluate whether complex radiotherapy (RT) planning was associated with improved outcomes in a cohort of elderly patients with unresected Stage I-II non-small-cell lung cancer (NSCLC). Methods and Materials: Using the Surveillance, Epidemiology, and End Results registry linked to Medicare claims, we identified 1998 patients aged >65 years with histologically confirmed, unresected stage I-II NSCLC. Patients were classified into an intermediate or complex RT planning group using Medicare physician codes. To address potential selection bias, we used propensity score modeling. Survival of patients who received intermediate and complex simulation was compared using Cox regression models adjusting for propensity scores and in a stratified and matched analysis according to propensity scores. Results: Overall, 25% of patients received complex RT planning. Complex RT planning was associated with better overall (hazard ratio 0.84; 95% confidence interval, 0.75-0.95) and lung cancer-specific (hazard ratio 0.81; 95% confidence interval, 0.71-0.93) survival after controlling for propensity scores. Similarly, stratified and matched analyses showed better overall and lung cancer-specific survival of patients treated with complex RT planning. Conclusions: The use of complex RT planning is associated with improved survival among elderly patients with unresected Stage I-II NSCLC. These findings should be validated in prospective randomized controlled trials.
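
    The analysis pattern described, a propensity model for treatment assignment followed by a propensity-adjusted Cox regression, can be sketched on synthetic data as below. The covariates, effect sizes, and the scikit-learn/lifelines stack are illustrative assumptions, not the SEER-Medicare workflow.

    ```python
    import numpy as np
    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(3)
    n = 500
    age = rng.uniform(66, 90, n)
    stage = rng.integers(1, 3, n)
    # Treatment assignment depends on covariates (the selection bias to correct).
    complex_rt = rng.binomial(1, 1 / (1 + np.exp(-0.05 * (age - 75))))

    # Propensity score: probability of receiving complex planning given covariates.
    ps = LogisticRegression().fit(np.column_stack([age, stage]), complex_rt)
    score = ps.predict_proba(np.column_stack([age, stage]))[:, 1]

    # Survival times with a modest treatment benefit baked in.
    time = rng.exponential(scale=np.where(complex_rt == 1, 30, 25))
    event = rng.binomial(1, 0.8, n)
    df = pd.DataFrame({"time": time, "event": event,
                       "complex_rt": complex_rt, "propensity": score})

    # Cox model adjusting for the propensity score.
    cph = CoxPHFitter().fit(df, duration_col="time", event_col="event")
    print(cph.summary[["exp(coef)", "p"]])  # hazard ratio for complex planning
    ```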

  18. Development and construct validation of the Client-Centredness of Goal Setting (C-COGS) scale.

    PubMed

    Doig, Emmah; Prescott, Sarah; Fleming, Jennifer; Cornwell, Petrea; Kuipers, Pim

    2015-07-01

    Client-centred philosophy is integral to occupational therapy practice and client-centred goal planning is considered fundamental to rehabilitation. Evaluation of whether goal-planning practices are client-centred requires an understanding of the client's perspective about goal-planning processes and practices. The Client-Centredness of Goal Setting (C-COGS) was developed for use by practitioners who seek to be more client-centred and who require a scale to guide and evaluate individually orientated practice, especially with adults with cognitive impairment related to acquired brain injury. To describe development of the C-COGS scale and examine its construct validity. The C-COGS was administered to 42 participants with acquired brain injury after multidisciplinary goal planning. C-COGS scores were correlated with the Canadian Occupational Performance Measure (COPM) importance scores, and measures of therapeutic alliance, motivation, and global functioning to establish construct validity. The C-COGS scale has three subscales evaluating goal alignment, goal planning participation, and client-centredness of goals. The C-COGS subscale items demonstrated moderately significant correlations with scales measuring similar constructs. Findings provide preliminary evidence to support the construct validity of the C-COGS scale, which is intended to be used to evaluate and reflect on client-centred goal planning in clinical practice, and to highlight factors contributing to best practice rehabilitation.

  19. Power, Performance, and Perception (P3): Integrating Usability Metrics and Technology Acceptance Determinants to Validate a New Model for Predicting System Usage

    DTIC Science & Technology

    1999-12-01

    Ajzen, I. and M. Fishbein. Understanding Attitudes and Predicting Social Behavior. Prentice-Hall, Englewood Cliffs, NJ: 1980. Alwang, Greg. "Speech...Decline: Computer Introduction in the Financial Industry." Technology Forecasting and Social Change. 31: 143-154. Fishbein, M. and I. Ajzen. Belief...Theory of Reasoned Action (TRA) (Fishbein and Ajzen, 1980) 13 3. Theory of Planned Behavior (TPB) (Ajzen, 1991) 15 4. Technology Acceptance Model

  20. Development of a pharmacogenetic-guided warfarin dosing algorithm for Puerto Rican patients.

    PubMed

    Ramos, Alga S; Seip, Richard L; Rivera-Miranda, Giselle; Felici-Giovanini, Marcos E; Garcia-Berdecia, Rafael; Alejandro-Cowan, Yirelia; Kocherla, Mohan; Cruz, Iadelisse; Feliu, Juan F; Cadilla, Carmen L; Renta, Jessica Y; Gorowski, Krystyna; Vergara, Cunegundo; Ruaño, Gualberto; Duconge, Jorge

    2012-12-01

    This study was aimed at developing a pharmacogenetic-driven warfarin-dosing algorithm in 163 admixed Puerto Rican patients on stable warfarin therapy. A multiple linear-regression analysis was performed using log-transformed effective warfarin dose as the dependent variable, and combining CYP2C9 and VKORC1 genotyping with other relevant nongenetic clinical and demographic factors as independent predictors. The model explained more than two-thirds of the observed variance in the warfarin dose among Puerto Ricans, and also produced significantly better 'ideal dose' estimates than two pharmacogenetic models and clinical algorithms published previously, with the greatest benefit seen in patients ultimately requiring <7 mg/day. We also assessed the clinical validity of the model using an independent validation cohort of 55 Puerto Rican patients from Hartford, CT, USA (R² = 51%). Our findings provide the basis for planning prospective pharmacogenetic studies to demonstrate the clinical utility of genotyping warfarin-treated Puerto Rican patients.
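
    The algorithm's general form, a multiple linear regression on log-transformed effective dose with genotype indicators and clinical covariates, might look like the sketch below. The variable coding and coefficients are invented for illustration, not the published Puerto Rican algorithm.

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(4)
    n = 163
    cyp2c9_variant = rng.binomial(1, 0.25, n)   # any *2/*3 allele (assumed coding)
    vkorc1_ag = rng.binomial(1, 0.5, n)         # -1639 G>A genotype dummy (assumed)
    age = rng.uniform(30, 85, n)
    # Synthetic log-dose with variant carriers needing lower doses.
    log_dose = (1.8 - 0.35 * cyp2c9_variant - 0.25 * vkorc1_ag
                - 0.006 * age + rng.normal(0, 0.2, n))

    X = sm.add_constant(np.column_stack([cyp2c9_variant, vkorc1_ag, age]))
    fit = sm.OLS(log_dose, X).fit()
    predicted_dose = np.exp(fit.predict(X))     # back-transform to mg/day
    print(f"R-squared: {fit.rsquared:.2f}")
    ```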

  1. Development and Validation of a Gender Ideology Scale for Family Planning Services in Rural China

    PubMed Central

    Yang, Xueyan; Li, Shuzhuo; Feldman, Marcus W.

    2013-01-01

    The objectives of this study are to develop a scale of gender role ideology appropriate for assessing Quality of Care in family planning services for rural China. Literature review, focus-group discussions and in-depth interviews with service providers and clients from two counties in eastern and western China, as well as experts’ assessments, were used to develop a scale for family planning services. Psychometric methodologies were applied to samples of 601 service clients and 541 service providers from a survey in a district in central China to validate its internal consistency, reliability, and construct validity with realistic and strategic dimensions. This scale is found to be reliable and valid, and has prospects for application both academically and practically in the field. PMID:23573222

  2. Ocean Observations with EOS/MODIS: Algorithm Development and Post Launch Studies

    NASA Technical Reports Server (NTRS)

    Gordon, Howard R.; Conboy, Barbara (Technical Monitor)

    1999-01-01

    This separation has been logical thus far; however, as launch of AM-1 approaches, it must be recognized that many of these activities will shift emphasis from algorithm development to validation. For example, the second, third, and fifth bullets will become almost totally validation-focussed activities in the post-launch era, providing the core of our experimental validation effort. Work under the first bullet will continue into the post-launch time frame, driven in part by algorithm deficiencies revealed as a result of validation activities. Prior to the start of the 1999 fiscal year (FY99) we were requested to prepare a brief plan for our FY99 activities. This plan is included as Appendix 1. The present report describes the progress made on our planned activities.

  3. A Particle Model for Prediction of Cement Infiltration of Cancellous Bone in Osteoporotic Bone Augmentation.

    PubMed

    Basafa, Ehsan; Murphy, Ryan J; Kutzer, Michael D; Otake, Yoshito; Armand, Mehran

    2013-01-01

    Femoroplasty is a potential preventive treatment for osteoporotic hip fractures. It involves augmenting mechanical properties of the femur by injecting Polymethylmethacrylate (PMMA) bone cement. To reduce the risks involved and maximize the outcome, however, the procedure needs to be carefully planned and executed. An important part of the planning system is predicting infiltration of cement into the porous medium of cancellous bone. We used the method of Smoothed Particle Hydrodynamics (SPH) to model the flow of PMMA inside porous media. We modified the standard formulation of SPH to incorporate the extreme viscosities associated with bone cement. Darcy creeping flow of fluids through isotropic porous media was simulated and the results were compared with those reported in the literature. Further validation involved injecting PMMA cement inside porous foam blocks - osteoporotic cancellous bone surrogates - and simulating the injections using our proposed SPH model. Millimeter accuracy was obtained in comparing the simulated and actual cement shapes. Also, strong correlations were found between the simulated and the experimental data of spreading distance (R² = 0.86) and normalized pressure (R² = 0.90). Results suggest that the proposed model is suitable for use in an osteoporotic femoral augmentation planning framework.
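
    For orientation, Darcy's law, the creeping-flow regime the simulations were checked against, is a one-line computation. The permeability and viscosity below are generic order-of-magnitude guesses, not the paper's measured foam or cement parameters.

    ```python
    # Darcy's law for an isotropic porous block: q = (k / mu) * (dp / L), SI units.
    def darcy_flux(k, mu, dp, length):
        """Superficial (Darcy) velocity through the medium."""
        return k / mu * dp / length

    k = 1e-9      # permeability, m^2 (open-cell foam scale; assumed)
    mu = 100.0    # PMMA cement viscosity, Pa*s (orders above water; assumed)
    q = darcy_flux(k, mu, dp=2e5, length=0.05)   # 200 kPa over 5 cm
    print(f"infiltration velocity ~ {q * 1e3:.2f} mm/s")
    ```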

  4. Revalidation of the NASA Ames 11-by 11-Foot Transonic Wind Tunnel with a Commercial Airplane Model

    NASA Technical Reports Server (NTRS)

    Kmak, Frank J.; Hudgins, M.; Hergert, D.; George, Michael W. (Technical Monitor)

    2001-01-01

    The 11-By 11-Foot Transonic leg of the Unitary Plan Wind Tunnel (UPWT) was modernized to improve tunnel performance, capability, productivity, and reliability. Wind tunnel tests to demonstrate the readiness of the tunnel for a return to production operations included an Integrated Systems Test (IST), calibration tests, and airplane validation tests. One of the two validation tests used a 0.037-scale Boeing 777 model that was previously tested in the 11-By 11-Foot tunnel in 1991. The objective of the validation tests was to compare pre-modernization and post-modernization results from the same airplane model in order to substantiate the operational readiness of the facility. Evaluations of within-test, test-to-test, and tunnel-to-tunnel data repeatability were made to study the effects of the tunnel modifications. Tunnel productivity was also evaluated to determine the readiness of the facility for production operations. The operation of the facility, including model installation, tunnel operations, and the performance of tunnel systems, was observed and facility deficiency findings generated. The data repeatability studies and tunnel-to-tunnel comparisons demonstrated outstanding data repeatability and a high overall level of data quality. Despite some operational and facility problems, the validation test was successful in demonstrating the readiness of the facility to perform production airplane wind tunnel tests.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hall, D; Trofimov, A; Winey, B

    Purpose: We developed a knowledge-based model that can predict the patient-specific benefits of proton therapy based upon geometric considerations. The model could also aid patient selection in model-based clinical trials or help justify clinical decisions to insurance companies. Methods: The knowledge-based method trains a model upon existing proton treatment plans, exploiting correlations between dose and distance-to-target. Each OAR is split into concentric subvolumes surrounding the target volume, and a skew-normal PDF is fit to the dose distribution found within each shell. The model learns from shared trends in how the best-fit skew-normal parameters depend upon distance-to-target. It can then predict feasible OAR DVHs for a new patient (without a proton plan) based upon their geometry. The expected benefits of proton therapy are assessed by comparing the predicted DVHs to those of an IMRT plan, using a metric such as the equivalent uniform dose (EUD). Results: A model was trained for clival chordoma, owing to its geometric complexity and the multitude of nearby OARs. The model was trained using 20 patients and validated with a further 20 patients, and considers several different OARs. The predicted EUD was in good agreement with that of the actual proton plan. The coefficient of determination (R-squared) was 85% overall, 92% for cochleas, 80% for optic chiasm and 79% for spinal cord. The model exhibited no signs of bias or overfitting. When compared to an IMRT plan, the model could classify whether a patient will experience a gain or a loss with an accuracy between 75% and 95%, depending upon the OAR. Conclusion: We developed a model that can quickly and accurately predict the patient-specific benefits of proton therapy in clival chordoma patients, though models could be trained for other tumor sites. This work is funded by National Cancer Institute grant U19 CA 021239.
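
    The shell-wise skew-normal fitting at the heart of the model is straightforward with scipy; the sketch below fits skew-normal parameters to synthetic dose samples in a few distance-to-target shells. Shell widths, doses, and sample counts are invented; the real model would learn how the fitted (shape, loc, scale) triples trend with distance.

    ```python
    import numpy as np
    from scipy.stats import skewnorm

    rng = np.random.default_rng(5)
    shell_edges_mm = [0, 5, 10, 20]
    for i, (r0, r1) in enumerate(zip(shell_edges_mm[:-1], shell_edges_mm[1:])):
        # Synthetic dose samples within the shell, falling off with distance.
        dose = skewnorm.rvs(a=4, loc=60 - 10 * i, scale=5, size=400,
                            random_state=rng)
        a, loc, scale = skewnorm.fit(dose)   # per-shell skew-normal fit
        print(f"shell {r0}-{r1} mm: shape={a:.2f}, loc={loc:.1f} Gy, "
              f"scale={scale:.1f} Gy")
    ```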

  6. Predicting objective function weights from patient anatomy in prostate IMRT treatment planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Taewoo, E-mail: taewoo.lee@utoronto.ca; Hammad, Muhannad; Chan, Timothy C. Y.

    2013-12-15

    Purpose: Intensity-modulated radiation therapy (IMRT) treatment planning typically combines multiple criteria into a single objective function by taking a weighted sum. The authors propose a statistical model that predicts objective function weights from patient anatomy for prostate IMRT treatment planning. This study provides a proof of concept for geometry-driven weight determination. Methods: A previously developed inverse optimization method (IOM) was used to generate optimal objective function weights for 24 patients using their historical treatment plans (i.e., dose distributions). These IOM weights were around 1% for each of the femoral heads, while bladder and rectum weights varied greatly between patients. A regression model was developed to predict a patient's rectum weight using the ratio of the overlap volume of the rectum and bladder with the planning target volume at a 1 cm expansion as the independent variable. The femoral head weights were fixed to 1% each and the bladder weight was calculated as one minus the rectum and femoral head weights. The model was validated using leave-one-out cross validation. Objective values and dose distributions generated through inverse planning using the predicted weights were compared to those generated using the original IOM weights, as well as an average of the IOM weights across all patients. Results: The IOM weight vectors were on average six times closer to the predicted weight vectors than to the average weight vector, using l2 distance. Likewise, the bladder and rectum objective values achieved by the predicted weights were more similar to the objective values achieved by the IOM weights. The difference in objective value performance between the predicted and average weights was statistically significant according to a one-sided sign test. For all patients, the difference in rectum V54.3 Gy, rectum V70.0 Gy, bladder V54.3 Gy, and bladder V70.0 Gy values between the dose distributions generated by the predicted weights and IOM weights was less than 5 percentage points. Similarly, the difference in femoral head V54.3 Gy values between the two dose distributions was less than 5 percentage points for all but one patient. Conclusions: This study demonstrates a proof of concept that patient anatomy can be used to predict appropriate objective function weights for treatment planning. In the long term, such geometry-driven weights may serve as a starting point for iterative treatment plan design or may provide information about the most clinically relevant region of the Pareto surface to explore.
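
    The single-feature regression with leave-one-out cross validation described above can be sketched as follows; the overlap ratios and weights are synthetic stand-ins for the 24-patient IOM data.

    ```python
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import LeaveOneOut, cross_val_predict

    rng = np.random.default_rng(6)
    # Independent variable: rectum/bladder overlap ratio with the expanded PTV.
    overlap_ratio = rng.uniform(0.2, 3.0, 24).reshape(-1, 1)
    rectum_weight = 0.2 + 0.15 * overlap_ratio.ravel() + rng.normal(0, 0.03, 24)

    # Leave-one-out predictions, as in the study's validation.
    pred = cross_val_predict(LinearRegression(), overlap_ratio,
                             rectum_weight, cv=LeaveOneOut())
    # Remaining budget goes to the bladder after fixing femoral heads at 1% each.
    bladder_weight = 1.0 - pred - 0.02
    rmse = np.sqrt(np.mean((pred - rectum_weight) ** 2))
    print(f"LOO RMSE: {rmse:.3f}; mean bladder weight: {bladder_weight.mean():.2f}")
    ```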

  7. Predicting objective function weights from patient anatomy in prostate IMRT treatment planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Taewoo, E-mail: taewoo.lee@utoronto.ca; Hammad, Muhannad; Chan, Timothy C. Y.

    Purpose: Intensity-modulated radiation therapy (IMRT) treatment planning typically combines multiple criteria into a single objective function by taking a weighted sum. The authors propose a statistical model that predicts objective function weights from patient anatomy for prostate IMRT treatment planning. This study provides a proof of concept for geometry-driven weight determination. Methods: A previously developed inverse optimization method (IOM) was used to generate optimal objective function weights for 24 patients using their historical treatment plans (i.e., dose distributions). These IOM weights were around 1% for each of the femoral heads, while bladder and rectum weights varied greatly between patients. A regression model was developed to predict a patient's rectum weight using the ratio of the overlap volume of the rectum and bladder with the planning target volume at a 1 cm expansion as the independent variable. The femoral head weights were fixed to 1% each and the bladder weight was calculated as one minus the rectum and femoral head weights. The model was validated using leave-one-out cross validation. Objective values and dose distributions generated through inverse planning using the predicted weights were compared to those generated using the original IOM weights, as well as an average of the IOM weights across all patients. Results: The IOM weight vectors were on average six times closer to the predicted weight vectors than to the average weight vector, using l2 distance. Likewise, the bladder and rectum objective values achieved by the predicted weights were more similar to the objective values achieved by the IOM weights. The difference in objective value performance between the predicted and average weights was statistically significant according to a one-sided sign test. For all patients, the difference in rectum V54.3 Gy, rectum V70.0 Gy, bladder V54.3 Gy, and bladder V70.0 Gy values between the dose distributions generated by the predicted weights and IOM weights was less than 5 percentage points. Similarly, the difference in femoral head V54.3 Gy values between the two dose distributions was less than 5 percentage points for all but one patient. Conclusions: This study demonstrates a proof of concept that patient anatomy can be used to predict appropriate objective function weights for treatment planning. In the long term, such geometry-driven weights may serve as a starting point for iterative treatment plan design or may provide information about the most clinically relevant region of the Pareto surface to explore.

  8. CME Arrival-time Validation of Real-time WSA-ENLIL+Cone Simulations at the CCMC/SWRC

    NASA Astrophysics Data System (ADS)

    Wold, A. M.; Mays, M. L.; Taktakishvili, A.; Jian, L.; Odstrcil, D.; MacNeice, P. J.

    2016-12-01

    The Wang-Sheeley-Arge (WSA)-ENLIL+Cone model is used extensively in space weather operations worldwide to model CME propagation; as such, it is important to assess its performance. We present validation results of the WSA-ENLIL+Cone model installed at the Community Coordinated Modeling Center (CCMC) and executed in real-time by the CCMC/Space Weather Research Center (SWRC). The SWRC is a CCMC sub-team that provides space weather services to NASA robotic mission operators and science campaigns, and also prototypes new forecasting models and techniques. CCMC/SWRC uses the WSA-ENLIL+Cone model to predict CME arrivals at NASA missions throughout the inner heliosphere. In this work we compare model-predicted CME arrival times to in-situ ICME shock observations near Earth (ACE, Wind), STEREO-A, and STEREO-B for simulations completed between March 2010 and July 2016 (over 1500 runs). We report hit, miss, false alarm, and correct rejection statistics for all three spacecraft. For hits we compute the bias, RMSE, and average absolute CME arrival-time error, and the dependence of these errors on CME input parameters. We compare the predicted geomagnetic storm strength (Kp index) to the CME arrival-time error for Earth-directed CMEs. The predicted Kp index is computed using the WSA-ENLIL+Cone plasma parameters at Earth with a modified Newell et al. (2007) coupling function. We also explore the impact of multi-spacecraft observations on the CME parameters used to initialize the model by comparing model validation results before and after the STEREO-B communication loss (since September 2014) and STEREO-A side-lobe operations (August 2014-December 2015). This model validation exercise has significance for future space weather mission planning, such as L5 missions.
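
    The hit/miss/false-alarm bookkeeping and the error metrics reported for hits reduce to a few array operations. The sketch below uses made-up arrival times and a simplistic matching rule (NaN meaning no arrival predicted or observed); the real study's event matching is more involved.

    ```python
    import numpy as np

    pred = np.array([10.2, 34.5, np.nan, 55.0, 71.3])   # predicted arrival, hours
    obs = np.array([8.1, 36.0, np.nan, np.nan, 70.0])   # observed ICME shock, hours

    hit = ~np.isnan(pred) & ~np.isnan(obs)
    miss = np.isnan(pred) & ~np.isnan(obs)
    false_alarm = ~np.isnan(pred) & np.isnan(obs)
    correct_rejection = np.isnan(pred) & np.isnan(obs)

    err = pred[hit] - obs[hit]                          # arrival-time errors for hits
    print(f"hits={hit.sum()} misses={miss.sum()} "
          f"FA={false_alarm.sum()} CR={correct_rejection.sum()}")
    print(f"bias={err.mean():.1f} h, RMSE={np.sqrt((err ** 2).mean()):.1f} h, "
          f"mean |error|={np.abs(err).mean():.1f} h")
    ```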

  9. Experimental Validation Plan for the Xolotl Plasma-Facing Component Simulator Using Tokamak Sample Exposures

    NASA Astrophysics Data System (ADS)

    Chan, V. S.; Wong, C. P. C.; McLean, A. G.; Luo, G. N.; Wirth, B. D.

    2013-10-01

    The Xolotl code under development by PSI-SciDAC will enhance predictive modeling capability of plasma-facing materials under burning plasma conditions. The availability and application of experimental data to compare to code-calculated observables are key requirements to validate the breadth and content of physics included in the model and ultimately gain confidence in its results. A dedicated effort has been in progress to collect and organize a) a database of relevant experiments and their publications as previously carried out at sample exposure facilities in US and Asian tokamaks (e.g., DIII-D DiMES, and EAST MAPES), b) diagnostic and surface analysis capabilities available at each device, and c) requirements for future experiments with code validation in mind. The content of this evolving database will serve as a significant resource for the plasma-material interaction (PMI) community. Work supported in part by the US Department of Energy under GA-DE-SC0008698, DE-AC52-07NA27344 and DE-AC05-00OR22725.

  10. Measuring infrastructure: A key step in program evaluation and planning

    PubMed Central

    Schmitt, Carol L.; Glasgow, LaShawn; Lavinghouze, S. Rene; Rieker, Patricia P.; Fulmer, Erika; McAleer, Kelly; Rogers, Todd

    2016-01-01

    State tobacco prevention and control programs (TCPs) require a fully functioning infrastructure to respond effectively to the Surgeon General’s call for accelerating the national reduction in tobacco use. The literature describes common elements of infrastructure; however, a lack of valid and reliable measures has made it difficult for program planners to monitor relevant infrastructure indicators and address observed deficiencies, or for evaluators to determine the association among infrastructure, program efforts, and program outcomes. The Component Model of Infrastructure (CMI) is a comprehensive, evidence-based framework that facilitates TCP program planning efforts to develop and maintain their infrastructure. Measures of CMI components were needed to evaluate the model’s utility and predictive capability for assessing infrastructure. This paper describes the development of CMI measures and results of a pilot test with nine state TCP managers. Pilot test findings indicate that the tool has good face validity and is clear and easy to follow. The CMI tool yields data that can enhance public health efforts in a funding-constrained environment and provides insight into program sustainability. Ultimately, the CMI measurement tool could facilitate better evaluation and program planning across public health programs. PMID:27037655

  11. AQUATOOL, a generalized decision-support system for water-resources planning and operational management

    NASA Astrophysics Data System (ADS)

    Andreu, J.; Capilla, J.; Sanchís, E.

    1996-04-01

    This paper describes a generic decision-support system (DSS) which was originally designed for the planning stage of decision-making associated with complex river basins. Subsequently, it was expanded to incorporate modules relating to the operational stage of decision-making. Computer-assisted design modules allow any complex water-resource system to be represented in graphical form, giving access to geographically referenced databases and knowledge bases. The modelling capability includes basin simulation and optimization modules, an aquifer flow modelling module and two modules for risk assessment. The Segura and Tagus river basins have been used as case studies in the development and validation phases. The value of this DSS is demonstrated by the fact that both River Basin Agencies currently use a version for the efficient management of their water resources.

  12. TRANSPORT PLANNING MODEL FOR WIDE AREA RECYCLING SYSTEM OF INDUSTRIAL WASTE PLASTIC

    NASA Astrophysics Data System (ADS)

    Arai, Yasuhiro; Kawamura, Hisashi; Koizumi, Akira; Mogi, Satoshi

    To date, the majority of industrial waste plastic generated in urban cities has been sent to landfill. However, it is now necessary to actively utilize that plastic as a useful resource to create a recycling society with low environmental impact. In order to construct a reasonable recycling system, it is necessary to address the "transportation problem," that is, determining how much industrial waste plastic is to be transported to which location. With the goal of eliminating landfill processing, this study develops a transport planning model for industrial waste plastic using linear programming. The results of running optimized calculations under given scenarios clarified not only the possibilities for recycling in the Metropolitan area, but also the validity of a wide-area recycling system.
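
    The transportation problem the study formulates is the textbook linear program sketched below: minimize shipping cost subject to source generation and facility capacity constraints. The costs, tonnages, and capacities are invented for illustration.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    cost = np.array([[4.0, 6.0, 9.0],      # cost[i, j]: source i -> facility j
                     [5.0, 3.0, 7.0]])
    supply = np.array([120.0, 80.0])       # tonnes generated at each source
    capacity = np.array([90.0, 70.0, 60.0])

    n_s, n_f = cost.shape
    A_eq = np.zeros((n_s, n_s * n_f))      # each source ships out all its waste
    for i in range(n_s):
        A_eq[i, i * n_f:(i + 1) * n_f] = 1
    A_ub = np.zeros((n_f, n_s * n_f))      # facility capacity limits
    for j in range(n_f):
        A_ub[j, j::n_f] = 1

    res = linprog(cost.ravel(), A_ub=A_ub, b_ub=capacity,
                  A_eq=A_eq, b_eq=supply, bounds=(0, None), method="highs")
    print(res.x.reshape(n_s, n_f))         # optimal tonnage plan
    ```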

  13. Patient feature based dosimetric Pareto front prediction in esophageal cancer radiotherapy.

    PubMed

    Wang, Jiazhou; Jin, Xiance; Zhao, Kuaike; Peng, Jiayuan; Xie, Jiang; Chen, Junchao; Zhang, Zhen; Studenski, Matthew; Hu, Weigang

    2015-02-01

    To investigate the feasibility of dosimetric Pareto front (PF) prediction based on patients' anatomic and dosimetric parameters for esophageal cancer patients. Eighty esophageal cancer patients in the authors' institution were enrolled in this study. A total of 2928 intensity-modulated radiotherapy plans were obtained and used to generate the PF for each patient. On average, each patient had 36.6 plans. The anatomic and dosimetric features were extracted from these plans. The mean lung dose (MLD), mean heart dose (MHD), spinal cord max dose, and PTV homogeneity index were recorded for each plan. Principal component analysis was used to extract overlap volume histogram (OVH) features between the PTV and other organs at risk. The full dataset was separated into two parts: a training dataset and a validation dataset. The prediction outcomes were the MHD and MLD. Spearman's rank correlation coefficient was used to evaluate the correlation between the anatomical features and dosimetric features. The stepwise multiple regression method was used to fit the PF. The cross-validation method was used to evaluate the model. With 1000 repetitions, the mean prediction error of the MHD was 469 cGy. The most correlated factors were the first principal component of the OVH between heart and PTV and the overlap between heart and PTV in the Z-axis. The mean prediction error of the MLD was 284 cGy. The most correlated factors were the first principal component of the OVH between heart and PTV and the overlap between lung and PTV in the Z-axis. It is feasible to use patients' anatomic and dosimetric features to generate a predicted Pareto front. Additional samples and further studies are required to improve the prediction model.
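
    One plausible reading of the feature step, PCA over OVH curves followed by a Spearman correlation against the dosimetric outcome, is sketched below on synthetic curves. The bin counts, dose values, and the OVH construction itself are assumptions for illustration.

    ```python
    import numpy as np
    from scipy.stats import spearmanr
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(7)
    n_patients, n_bins = 80, 50
    proximity = rng.uniform(0.5, 2.0, n_patients)
    # Fake heart-PTV OVH curves: slower decay when organ and PTV overlap more.
    ovh = np.array([np.clip(1 - np.linspace(0, 1, n_bins) / p, 0, 1)
                    for p in proximity])
    mhd = 2000 * proximity + rng.normal(0, 150, n_patients)   # mean heart dose, cGy

    pc1 = PCA(n_components=3).fit_transform(ovh)[:, 0]        # first OVH component
    rho, pval = spearmanr(pc1, mhd)
    print(f"Spearman rho = {rho:.2f} (p = {pval:.1e})")
    ```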

  14. TU-C-17A-10: Patient Features Based Dosimetric Pareto Front Prediction In Esophagus Cancer Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, J; Zhao, K; Peng, J

    2014-06-15

    Purpose: The purpose of this study is to study the feasibility of dosimetric Pareto front (PF) prediction based on patient anatomic and dosimetric parameters for esophageal cancer patients. Methods: Sixty esophageal cancer patients in our institution were enrolled in this study. A total of 2920 IMRT plans were created to generate the PF for each patient. On average, each patient had 48 plans. The anatomic and dosimetric features were extracted from those plans. The mean lung dose (MLD), mean heart dose (MHD), spinal cord max dose and PTV homogeneity index (PTVHI) were recorded for each plan. Principal component analysis (PCA) was used to extract overlap volume histogram (OVH) features between the PTV and other critical organs. The full dataset was separated into two parts: the training dataset and the validation dataset. The prediction outcomes were the MHD and MLD for the current study. The Spearman rank correlation coefficient was used to evaluate the correlation between the anatomical features and dosimetric features. The PF was fit by the stepwise multiple regression method. The cross-validation method was used to evaluate the model. Results: The mean prediction error of the MHD was 465 cGy with 100 repetitions. The most correlated factors were the first principal component of the OVH between heart and PTV, and the overlap between heart and PTV in the Z-axis. The mean prediction error of the MLD was 195 cGy. The most correlated factors were the first principal component of the OVH between lung and PTV, and the overlap between lung and PTV in the Z-axis. Conclusion: It is feasible to use patients' anatomic and dosimetric features to generate a predicted PF. Additional samples and further studies are required to obtain a better prediction model.

  15. Patient feature based dosimetric Pareto front prediction in esophageal cancer radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Jiazhou; Zhao, Kuaike; Peng, Jiayuan

    2015-02-15

    Purpose: To investigate the feasibility of the dosimetric Pareto front (PF) prediction based on patient’s anatomic and dosimetric parameters for esophageal cancer patients. Methods: Eighty esophageal cancer patients in the authors’ institution were enrolled in this study. A total of 2928 intensity-modulated radiotherapy plans were obtained and used to generate PF for each patient. On average, each patient had 36.6 plans. The anatomic and dosimetric features were extracted from these plans. The mean lung dose (MLD), mean heart dose (MHD), spinal cord max dose, and PTV homogeneity index were recorded for each plan. Principal component analysis was used to extract overlap volume histogram (OVH) features between PTV and other organs at risk. The full dataset was separated into two parts; a training dataset and a validation dataset. The prediction outcomes were the MHD and MLD. Spearman’s rank correlation coefficient was used to evaluate the correlation between the anatomical features and dosimetric features. The stepwise multiple regression method was used to fit the PF. The cross validation method was used to evaluate the model. Results: With 1000 repetitions, the mean prediction error of the MHD was 469 cGy. The most correlated factor was the first principal components of the OVH between heart and PTV and the overlap between heart and PTV in Z-axis. The mean prediction error of the MLD was 284 cGy. The most correlated factors were the first principal components of the OVH between heart and PTV and the overlap between lung and PTV in Z-axis. Conclusions: It is feasible to use patients’ anatomic and dosimetric features to generate a predicted Pareto front. Additional samples and further studies are required to improve the prediction model.

  16. On the experimental validation of model-based dose calculation algorithms for 192Ir HDR brachytherapy treatment planning

    NASA Astrophysics Data System (ADS)

    Pappas, Eleftherios P.; Zoros, Emmanouil; Moutsatsos, Argyris; Peppa, Vasiliki; Zourari, Kyveli; Karaiskos, Pantelis; Papagiannis, Panagiotis

    2017-05-01

    There is an acknowledged need for the design and implementation of physical phantoms appropriate for the experimental validation of model-based dose calculation algorithms (MBDCA) introduced recently in 192Ir brachytherapy treatment planning systems (TPS), and this work investigates whether it can be met. A PMMA phantom was prepared to accommodate material inhomogeneities (air and Teflon), four plastic brachytherapy catheters, as well as 84 LiF TLD dosimeters (MTS-100M 1 × 1 × 1 mm3 microcubes), two radiochromic films (Gafchromic EBT3) and a plastic 3D dosimeter (PRESAGE). An irradiation plan consisting of 53 source dwell positions was prepared on phantom CT images using a commercially available TPS and taking into account the calibration dose range of each detector. Irradiation was performed using an 192Ir high dose rate (HDR) source. Dose to medium in medium, Dmm, was calculated using the MBDCA option of the same TPS as well as Monte Carlo (MC) simulation with the MCNP code and a benchmarked methodology. Measured and calculated dose distributions were spatially registered and compared. The total standard (k = 1) spatial uncertainties for TLD, film and PRESAGE were: 0.71, 1.58 and 2.55 mm. Corresponding percentage total dosimetric uncertainties were: 5.4-6.4%, 2.5-6.4% and 4.85%, owing mainly to the absorbed dose sensitivity correction and the relative energy dependence correction (position dependent) for TLD, the film sensitivity calibration (dose dependent) and the dependencies of PRESAGE sensitivity. Results imply a LiF over-response due to a relative intrinsic energy dependence between 192Ir and megavoltage calibration energies, and a dose rate dependence of PRESAGE sensitivity at low dose rates (<1 Gy min-1). Calculations were experimentally validated within uncertainties except for MBDCA results for points in the phantom periphery and dose levels <20%. Experimental MBDCA validation is laborious, yet feasible. Further work is required for the full characterization of dosimeter response for 192Ir and the reduction of experimental uncertainties.

  17. Cooling Technology for Large Space Telescopes

    NASA Technical Reports Server (NTRS)

    DiPirro, Michael; Cleveland, Paul; Durand, Dale; Klavins, Andy; Muheim, Daniella; Paine, Christopher; Petach, Mike; Tenerelli, Domenick; Tolomeo, Jason; Walyus, Keith

    2007-01-01

    NASA's New Millennium Program funded an effort to develop a system cooling technology, which is applicable to all future infrared, sub-millimeter and millimeter cryogenic space telescopes. In particular, this technology is necessary for the proposed large space telescope Single Aperture Far-Infrared Telescope (SAFIR) mission. This technology will also enhance the performance and lower the risk and cost for other cryogenic missions. The new paradigm for cooling to low temperatures will involve passive cooling using lightweight deployable membranes that serve both as sunshields and V-groove radiators, in combination with active cooling using mechanical coolers operating down to 4 K. The Cooling Technology for Large Space Telescopes (LST) mission planned to develop and demonstrate a multi-layered sunshield, which is actively cooled by a multi-stage mechanical cryocooler, and further the models and analyses critical to scaling to future missions. The outer four layers of the sunshield cool passively by radiation, while the innermost layer is actively cooled to enable the sunshield to decrease the incident solar irradiance by a factor of more than one million. The cryocooler cools the inner layer of the sunshield to 20 K, and provides cooling to 6 K at a telescope mounting plate. The technology readiness level (TRL) of 7 will be achieved by the active cooling technology following the technology validation flight in Low Earth Orbit. In accordance with the New Millennium charter, tests and modeling are tightly integrated to advance the technology and the flight design for "ST-class" missions. Commercial off-the-shelf engineering analysis products are used to develop validated modeling capabilities to allow the techniques and results from LST to apply to a wide variety of future missions. The LST mission plans to "rewrite the book" on cryo-thermal testing and modeling techniques, and validate modeling techniques to scale to future space telescopes such as SAFIR.

  18. Automating an integrated spatial data-mining model for landfill site selection

    NASA Astrophysics Data System (ADS)

    Abujayyab, Sohaib K. M.; Ahamad, Mohd Sanusi S.; Yahya, Ahmad Shukri; Ahmad, Siti Zubaidah; Aziz, Hamidi Abdul

    2017-10-01

    An integrated programming environment represents a robust approach to building a valid model for landfill site selection. One of the main challenges in an integrated model is the complicated processing and modelling due to the programming stages and several limitations. An automation process helps avoid these limitations and improves the interoperability between integrated programming environments. This work targets the automation of a spatial data-mining model for landfill site selection by integrating a spatial programming environment (Python-ArcGIS) with a non-spatial environment (MATLAB). The model was constructed using neural networks and is divided into nine stages distributed between MATLAB and Python-ArcGIS. A case study was taken from the northern part of Peninsular Malaysia. Twenty-two criteria were selected as input data and used to build the training and testing datasets. The outcomes show a high performance accuracy of 98.2% on the testing dataset using 10-fold cross validation. The automated spatial data-mining model provides a solid platform for decision makers performing landfill site selection and planning operations on a regional scale.
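
    The evaluation stage, a neural-network classifier over the 22 criteria scored with 10-fold cross validation, has the pipeline shape sketched below. Features and labels are random placeholders, so the printed accuracy is meaningless; only the pipeline structure is the point.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(8)
    X = rng.uniform(0, 1, (500, 22))                  # 22 suitability criteria per cell
    y = (X[:, :5].mean(axis=1) > 0.5).astype(int)     # suitable / not suitable

    clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
    scores = cross_val_score(clf, X, y, cv=10)        # 10-fold cross validation
    print(f"10-fold accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
    ```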

  19. 16 CFR 310.4 - Abusive telemarketing acts or practices.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... to a settlement agreement, debt management plan, or other such valid contractual agreement executed by the customer; (B) The customer has made at least one payment pursuant to that settlement agreement, debt management plan, or other valid contractual agreement between the customer and the creditor or...

  20. 16 CFR 310.4 - Abusive telemarketing acts or practices.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... to a settlement agreement, debt management plan, or other such valid contractual agreement executed by the customer; (B) The customer has made at least one payment pursuant to that settlement agreement, debt management plan, or other valid contractual agreement between the customer and the creditor or...

  1. 16 CFR 310.4 - Abusive telemarketing acts or practices.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... to a settlement agreement, debt management plan, or other such valid contractual agreement executed by the customer; (B) The customer has made at least one payment pursuant to that settlement agreement, debt management plan, or other valid contractual agreement between the customer and the creditor or...

  2. An Update on the VAMOS Extremes Working Group Activities

    NASA Technical Reports Server (NTRS)

    Schubert, Siegfried; Cavalcanti, Iracema

    2011-01-01

    We review here the progress of the Variability of the American MOnsoon Systems (VAMOS) extremes working group since it was formed in February of 2010. The goals of the working group are to 1) develop an atlas of warm-season extremes over the Americas, 2) evaluate existing and planned simulations, and 3) suggest new model runs to address mechanisms and predictability of extremes. Substantial progress has been made in the development of an extremes atlas based on gridded observations and several reanalysis products, including the Modern Era Retrospective-Analysis for Research and Applications (MERRA) and the Climate Forecast System Reanalysis (CFSR). The status of the atlas, remaining issues and plans for its expansion to include model data will be discussed. This includes the possibility of adding a companion atlas based on station observations, using the software developed under the World Climate Research Programme (WCRP) Expert Team on Climate Change Detection and Indices (ETCCDI) activity. We will also review progress on relevant research and plans for the use and validation of the atlas results.

  3. Dynamic Modeling and Soil Mechanics for Path Planning of the Mars Exploration Rovers

    NASA Technical Reports Server (NTRS)

    Trease, Brian; Arvidson, Raymond; Lindemann, Randel; Bennett, Keith; Zhou, Feng; Iagnemma, Karl; Senatore, Carmine; Van Dyke, Lauren

    2011-01-01

    To help minimize risk of high sinkage and slippage during drives and to better understand soil properties and rover terramechanics from drive data, a multidisciplinary team was formed under the Mars Exploration Rover (MER) project to develop and utilize dynamic computer-based models for rover drives over realistic terrains. The resulting tool, named ARTEMIS (Adams-based Rover Terramechanics and Mobility Interaction Simulator), consists of the dynamic model, a library of terramechanics subroutines, and high-resolution digital elevation maps of the Mars surface. A 200-element model of the rovers was developed and validated against drop tests before launch, using MSC-Adams dynamic modeling software. Newly modeled terrain-rover interactions include the rut-formation effect of deformable soils, using the classical Bekker-Wong implementation of compaction resistances and bulldozing effects. The paper presents the details and implementation of the model, with two case studies based on actual MER telemetry data. In its final form, ARTEMIS will be used in a predictive manner to assess terrain navigability and will become part of the overall effort in path planning and navigation for both Martian and lunar rovers.
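
    As a worked illustration of the Bekker-Wong terramechanics the abstract names, the sketch below evaluates the classical pressure-sinkage relation p(z) = (k_c/b + k_phi) z^n and the resulting compaction resistance; the soil constants are generic textbook-style values, not MER or ARTEMIS parameters.

    ```python
    # Illustrative Bekker-Wong compaction-resistance calculation; the soil
    # parameters below are generic loose-sand values, not mission data.
    def sinkage_pressure(z, b, k_c, k_phi, n):
        """Bekker pressure-sinkage relation p(z) = (k_c/b + k_phi) * z**n."""
        return (k_c / b + k_phi) * z ** n

    def compaction_resistance(z0, b, k_c, k_phi, n):
        """Resistance from compacting a rut of depth z0 under a wheel of width b."""
        return b * (k_c / b + k_phi) * z0 ** (n + 1) / (n + 1)

    b, k_c, k_phi, n = 0.16, 1400.0, 820000.0, 1.0   # wheel width [m] and assumed soil constants
    z0 = 0.02                                        # 2 cm sinkage
    print(f"p(z0) = {sinkage_pressure(z0, b, k_c, k_phi, n):.0f} Pa")
    print(f"R_c   = {compaction_resistance(z0, b, k_c, k_phi, n):.1f} N")
    ```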

  4. Risk assessment of storm surge disaster based on numerical models and remote sensing

    NASA Astrophysics Data System (ADS)

    Liu, Qingrong; Ruan, Chengqing; Zhong, Shan; Li, Jian; Yin, Zhonghui; Lian, Xihu

    2018-06-01

    Storm surge is one of the most serious ocean disasters in the world. Risk assessment of storm surge disaster for coastal areas has important implications for planning economic development and reducing disaster losses. Based on risk assessment theory, this paper uses coastal hydrological observations, a numerical storm surge model, and multi-source remote sensing data to propose methods for evaluating storm surge hazard and vulnerability, and builds a storm surge risk assessment model. Storm surges for different recurrence periods are simulated with the numerical model, and the flooded areas and flooding depths are calculated to assess the hazard; remote sensing data and GIS technology are used to extract key coastal objects and classify coastal land use for the vulnerability assessment. The risk assessment model is applied to a typical coastal city, and the results show its reliability and validity. The construction and application of the storm surge risk assessment model provide a reference for city development planning and strengthen disaster prevention and mitigation.
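
    A toy sketch of the final risk-combination step, assuming the hazard (flooding-depth class) and vulnerability (land-use class) grids are combined through a lookup risk matrix; the grids and the matrix itself are invented for demonstration.

    ```python
    # Combine a modelled flooding-depth (hazard) grid with a land-use
    # (vulnerability) grid via an assumed risk matrix; values are toy data.
    import numpy as np

    hazard = np.array([[0, 1, 2],
                       [1, 2, 3],
                       [2, 3, 3]])          # 0 = dry ... 3 = deep flooding
    vulnerability = np.array([[1, 1, 2],
                              [2, 3, 3],
                              [1, 2, 3]])   # 1 = farmland ... 3 = dense urban

    risk_matrix = np.array([                # rows: hazard class, cols: vulnerability class
        [0, 0, 0],
        [1, 1, 2],
        [1, 2, 3],
        [2, 3, 3],
    ])
    risk = risk_matrix[hazard, vulnerability - 1]
    print(risk)                             # per-cell risk class, 0 (none) to 3 (high)
    ```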

  5. Reconstructing liver shape and position from MR image slices using an active shape model

    NASA Astrophysics Data System (ADS)

    Fenchel, Matthias; Thesen, Stefan; Schilling, Andreas

    2008-03-01

    We present an algorithm for fully automatic reconstruction of the 3D position, orientation, and shape of the human liver from a sparsely covering set of n 2D MR slice images. Reconstructing the shape of an organ from slice images can be used for scan planning, surgical planning, or other purposes where 3D anatomical knowledge must be inferred from sparse slices. The algorithm is based on adapting an active shape model of the liver surface to a given set of slice images. The active shape model is created from a training set of liver segmentations from a group of volunteers; the training set is built from semi-manual segmentations of T1-weighted volumetric MR images. The search for the shape model that best fits the image data maximizes a similarity measure based on local appearance at the surface. Two algorithms for the active shape model search are proposed and compared: both seek to maximize the a-posteriori probability of the grey-level appearance around the surface while constraining the surface to the space of valid shapes. The first algorithm uses grey-value profile statistics in the normal direction; the second uses average and variance images to calculate the local surface appearance on the fly. Both algorithms are validated by fitting the active shape model to abdominal 2D slice images and comparing the reconstructed shapes to manual segmentations and to the results of active shape model searches from 3D image data. The results are promising and competitive with active shape model segmentations from 3D data.
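
    A minimal sketch of the "constrain to the space of valid shapes" step common to both search algorithms, assuming a PCA shape basis and a +/-3-sigma clamp on the mode coefficients; the training shapes here are synthetic.

    ```python
    # Project a candidate surface onto the PCA shape basis and clamp each
    # mode to +/-3 standard deviations; data are synthetic stand-ins.
    import numpy as np

    rng = np.random.default_rng(1)
    training = rng.normal(size=(20, 300))        # 20 training shapes, 100 3-D points each
    mean_shape = training.mean(axis=0)
    U, S, Vt = np.linalg.svd(training - mean_shape, full_matrices=False)
    P = Vt[:5].T                                 # first 5 modes of variation
    eigvals = (S[:5] ** 2) / (len(training) - 1)

    def constrain(shape, mean_shape, P, eigvals, n_sigma=3.0):
        b = P.T @ (shape - mean_shape)           # mode coefficients of the candidate
        b = np.clip(b, -n_sigma * np.sqrt(eigvals), n_sigma * np.sqrt(eigvals))
        return mean_shape + P @ b                # nearest plausible shape

    candidate = mean_shape + rng.normal(size=300)
    print(np.linalg.norm(constrain(candidate, mean_shape, P, eigvals) - mean_shape))
    ```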

  6. Cup Implant Planning Based on 2-D/3-D Radiographic Pelvis Reconstruction-First Clinical Results.

    PubMed

    Schumann, Steffen; Sato, Yoshinobu; Nakanishi, Yuki; Yokota, Futoshi; Takao, Masaki; Sugano, Nobuhiko; Zheng, Guoyan

    2015-11-01

    In the following, we present a newly developed X-ray calibration phantom and its integration into 2-D/3-D pelvis reconstruction and subsequent automatic cup planning. Two different cup planning strategies were applied and evaluated with clinical data. The first planning strategy is based on a combined pelvis-and-cup statistical atlas: the pelvis part of the combined atlas is matched to the reconstructed pelvis model, resulting in an optimized cup planning. The second planning strategy analyzes the morphology of the reconstructed pelvis model to determine the best-fitting cup implant. The first strategy was compared to 3-D CT-based planning; digitally reconstructed radiographs of THA patients with pathologies of differing severity were used to evaluate the accuracy of predicting the cup size and position. Within a discrepancy of one cup size, the size was correctly identified in 100% of the Crowe type I datasets and in 77.8% of the Crowe type II, III, and IV datasets. The second strategy was analyzed with respect to the eventually implanted cup size: in seven patients the estimated cup diameter was correct within one cup size, while the estimate for the remaining five patients differed by two cup sizes. While both planning strategies showed the same prediction rate within a discrepancy of one cup size (87.5%), the prediction of the exact cup size was higher for the statistical atlas-based strategy (56%) than for the anatomically driven approach (37.5%). The proposed approach demonstrates the clinical validity of using a 2-D/3-D reconstruction technique for cup planning.

  7. Reverse-translational biomarker validation of Abnormal Repetitive Behaviors in mice: an illustration of the 4P's modeling approach.

    PubMed

    Garner, Joseph P; Thogerson, Collette M; Dufour, Brett D; Würbel, Hanno; Murray, James D; Mench, Joy A

    2011-06-01

    The NIMH's new strategic plan, with its emphasis on the "4P's" (Prediction, Pre-emption, Personalization, and Populations) and biomarker-based medicine, requires a radical shift in animal modeling methodology. In particular, 4P's models will be non-determinant (i.e. disease severity will depend on secondary environmental and genetic factors) and validated by reverse-translation of animal homologues to human biomarkers. A powerful consequence of the biomarker approach is that different closely related disorders have a unique fingerprint of biomarkers. Animals can be validated as a highly specific model of a single disorder by matching this 'fingerprint', or as a model of a symptom seen in multiple disorders by matching common biomarkers. Here we illustrate this approach with two Abnormal Repetitive Behaviors (ARBs) in mice: stereotypies and barbering (hair pulling). We developed animal versions of the neuropsychological biomarkers that distinguish human ARBs and tested the fingerprint of the different mouse ARBs. As predicted, the two mouse ARBs were associated with different biomarkers. Both barbering and stereotypy could be discounted as models of OCD (even though they are widely used as such), owing to the absence of the limbic biomarkers that are characteristic of OCD and hence necessary for a valid model. Conversely, barbering matched the fingerprint of trichotillomania (i.e. selective deficits in set-shifting), suggesting it may be a highly specific model of this disorder. In contrast, stereotypies matched only a biomarker (deficits in response shifting) associated with stereotypies in multiple disorders, suggesting that mouse stereotypies model stereotypies in multiple disorders. Copyright © 2011 Elsevier B.V. All rights reserved.

  8. Multi-criteria, personalized route planning using quantifier-guided ordered weighted averaging operators

    NASA Astrophysics Data System (ADS)

    Nadi, S.; Delavar, M. R.

    2011-06-01

    This paper presents a generic model for using different decision strategies in multi-criteria, personalized route planning. Some researchers have considered user preferences in navigation systems; however, these prior studies typically employed a high-tradeoff decision strategy based on a weighted linear aggregation rule and neglected other decision strategies. The proposed model integrates a pairwise comparison method and quantifier-guided ordered weighted averaging (OWA) aggregation operators to form a personalized route planning method that incorporates different decision strategies. The model can be used to calculate the impedance of each link with regard to the user's route criteria, the criteria importance, and the selected decision strategy. Depending on the decision strategy, the calculated impedance lies between aggregation with a logical "and" (which requires all the criteria to be satisfied) and a logical "or" (which requires at least one criterion to be satisfied), and includes taking the average of the criteria scores. The model produces multiple alternative routes, which apply different decision strategies and give users the flexibility to select one of them en route based on the real-world situation. The model also identifies the robust personalized route under different decision strategies. The influence of different decision strategies on the results is investigated in an illustrative example. The model was implemented in a web-based geographical information system (GIS) for Isfahan, Iran, and verified in a tourist routing scenario; the results demonstrated the validity of the model's route planning in real-world situations.
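
    A small sketch of quantifier-guided OWA aggregation for a single link, assuming the common RIM quantifier Q(r) = r^alpha; alpha near 0 approaches the logical "or", alpha = 1 the average, and large alpha the logical "and". The per-criterion scores are invented.

    ```python
    # Quantifier-guided OWA aggregation with Q(r) = r**alpha.
    import numpy as np

    def owa_weights(n, alpha):
        r = np.arange(n + 1) / n
        q = r ** alpha                      # regular increasing monotone quantifier
        return np.diff(q)                   # w_i = Q(i/n) - Q((i-1)/n)

    def owa(scores, alpha):
        ordered = np.sort(scores)[::-1]     # reorder scores descending before weighting
        return float(owa_weights(len(scores), alpha) @ ordered)

    link_scores = np.array([0.9, 0.6, 0.3])   # hypothetical per-criterion scores for one link
    for alpha in (0.1, 1.0, 10.0):            # "or"-like, average, "and"-like strategies
        print(alpha, round(owa(link_scores, alpha), 3))
    ```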

  9. Patient-specific atrium models for training and pre-procedure surgical planning

    NASA Astrophysics Data System (ADS)

    Laing, Justin; Moore, John; Bainbridge, Daniel; Drangova, Maria; Peters, Terry

    2017-03-01

    Minimally invasive cardiac procedures requiring a trans-septal puncture, such as atrial ablation and MitraClip® mitral valve repair, are becoming increasingly common. These procedures are performed on the beating heart and require clinicians to rely on image-guided techniques. For cases of complex or diseased anatomy, in which fluoroscopic and echocardiography images can be difficult to interpret, clinicians may benefit from patient-specific atrial models that can be used for training, surgical planning, and the validation of new devices and guidance techniques. Computed tomography (CT) images of a patient's heart were segmented and used to generate geometric models for a patient-specific atrial phantom. Using rapid prototyping, the geometric models were converted into physical representations and used to build a mold. The atria were then molded using tissue-mimicking materials and imaged using CT. The resulting images were segmented and used to generate a point-cloud data set that could be registered to the original patient data. The absolute distances between the two point clouds were compared to evaluate the model's accuracy. Comparing the molded-model point cloud to the original data set yielded a maximum Euclidean distance error of 4.5 mm, an average error of 0.5 mm, and a standard deviation of 0.6 mm. Using our workflow for creating atrial models, potential complications, particularly for complex repairs, may be accounted for in pre-operative planning. The information gained by clinicians involved in planning and performing the procedure should lead to shorter procedural times and better outcomes for patients.
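
    A short sketch of the reported accuracy evaluation, computing nearest-neighbour Euclidean distances between two point clouds with a k-d tree; the clouds below are random stand-ins for the segmented CT data.

    ```python
    # Nearest-neighbour distances from the molded-model cloud to the
    # original patient cloud; synthetic points stand in for segmented CT.
    import numpy as np
    from scipy.spatial import cKDTree

    rng = np.random.default_rng(2)
    original = rng.random((5000, 3)) * 100.0           # mm
    molded = original + rng.normal(scale=0.5, size=original.shape)

    dist, _ = cKDTree(original).query(molded)          # distance to closest original point
    print(f"max {dist.max():.2f} mm, mean {dist.mean():.2f} mm, std {dist.std():.2f} mm")
    ```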

  10. Validation of US3D for Capsule Aerodynamics using 05-CA Wind Tunnel Test Data

    NASA Technical Reports Server (NTRS)

    Schwing, Alan

    2012-01-01

    Several comparisons of computational fluid dynamics results to wind tunnel test data are shown for the purpose of code validation. The wind tunnel test, 05-CA, uses a 7.66% model of NASA's Multi-Purpose Crew Vehicle in the 11-foot test section of the Ames Unitary Plan Wind Tunnel. A variety of freestream conditions over four Mach numbers and three angles of attack are considered. Test data comparisons include time-averaged integrated forces and moments, time-averaged static pressures at ports on the surface, and Strouhal number. The applicability of the US3D code to subsonic and transonic flow over a bluff body is assessed against a comprehensive data set. The close comparison validates US3D for highly separated flows similar to those examined here.

  11. Regression and statistical shape model based substitute CT generation for MRI alone external beam radiation therapy from standard clinical MRI sequences.

    PubMed

    Ghose, Soumya; Greer, Peter B; Sun, Jidi; Pichler, Peter; Rivest-Henault, David; Mitra, Jhimli; Richardson, Haylea; Wratten, Chris; Martin, Jarad; Arm, Jameen; Best, Leah; Dowling, Jason A

    2017-10-27

    In MR-only radiation therapy planning, generation of the tissue-specific HU map directly from the MRI would eliminate the need for CT image acquisition and may improve radiation therapy planning. The aim of this work is to generate and validate substitute CT (sCT) scans from standard T2-weighted MR pelvic scans for prostate radiation therapy dose planning. A Siemens Skyra 3T MRI scanner with laser bridge, flat couch, and pelvic coil mounts was used to scan 39 patients scheduled for external beam radiation therapy for localized prostate cancer. For sCT generation a whole-pelvis MRI (1.6 mm 3D isotropic T2w SPACE sequence) was acquired. Patients received a routine planning CT scan. Co-registered whole-pelvis CT and T2w MRI pairs were used as training images. Advanced tissue-specific non-linear regression models to predict HU for fat, muscle, bladder, and air were created from the co-registered CT-MRI image pairs. On a test-case T2w MRI, the bones and bladder were automatically segmented using a novel statistical shape and appearance model, while other soft tissues were separated using an Expectation-Maximization-based clustering model. The CT bone in the training database most 'similar' to the segmented bone was then transformed with deformable registration to create the sCT component of the test-case bone tissue. Predictions for bone, air, and soft tissue from the separate regression models were successively combined to generate a whole-pelvis sCT. The change in monitor units between the sCT-based plans and the gold-standard CT plan for the same IMRT dose plan was 0.3% ± 0.9% (mean ± standard deviation) for the 39 patients. The 3D gamma pass rate was 99.8 ± 0.0% (2 mm/2%). The novel hybrid model is computationally efficient, generating an sCT in 20 min from standard T2w images for prostate cancer radiation therapy dose planning and DRR generation.

  12. Regression and statistical shape model based substitute CT generation for MRI alone external beam radiation therapy from standard clinical MRI sequences

    NASA Astrophysics Data System (ADS)

    Ghose, Soumya; Greer, Peter B.; Sun, Jidi; Pichler, Peter; Rivest-Henault, David; Mitra, Jhimli; Richardson, Haylea; Wratten, Chris; Martin, Jarad; Arm, Jameen; Best, Leah; Dowling, Jason A.

    2017-11-01

    In MR-only radiation therapy planning, generation of the tissue-specific HU map directly from the MRI would eliminate the need for CT image acquisition and may improve radiation therapy planning. The aim of this work is to generate and validate substitute CT (sCT) scans from standard T2-weighted MR pelvic scans for prostate radiation therapy dose planning. A Siemens Skyra 3T MRI scanner with laser bridge, flat couch, and pelvic coil mounts was used to scan 39 patients scheduled for external beam radiation therapy for localized prostate cancer. For sCT generation a whole-pelvis MRI (1.6 mm 3D isotropic T2w SPACE sequence) was acquired. Patients received a routine planning CT scan. Co-registered whole-pelvis CT and T2w MRI pairs were used as training images. Advanced tissue-specific non-linear regression models to predict HU for fat, muscle, bladder, and air were created from the co-registered CT-MRI image pairs. On a test-case T2w MRI, the bones and bladder were automatically segmented using a novel statistical shape and appearance model, while other soft tissues were separated using an Expectation-Maximization-based clustering model. The CT bone in the training database most 'similar' to the segmented bone was then transformed with deformable registration to create the sCT component of the test-case bone tissue. Predictions for bone, air, and soft tissue from the separate regression models were successively combined to generate a whole-pelvis sCT. The change in monitor units between the sCT-based plans and the gold-standard CT plan for the same IMRT dose plan was 0.3% ± 0.9% (mean ± standard deviation) for the 39 patients. The 3D gamma pass rate was 99.8 ± 0.0% (2 mm/2%). The novel hybrid model is computationally efficient, generating an sCT in 20 min from standard T2w images for prostate cancer radiation therapy dose planning and DRR generation.
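
    Since both records above report a 2 mm/2% gamma pass rate, here is a deliberately simplified 1-D global gamma sketch; clinical gamma tools operate on interpolated 3-D dose grids, so treat this strictly as an illustration.

    ```python
    # Simplified global gamma analysis on a 1-D dose profile (2 mm / 2%).
    import numpy as np

    def gamma_pass_rate(x, ref, ev, dta=2.0, dd=0.02):
        """Fraction of reference points with gamma <= 1 (global dose normalisation)."""
        norm = dd * ref.max()
        passed = 0
        for xi, di in zip(x, ref):
            # gamma at this point: minimise combined distance/dose-difference metric
            g2 = ((x - xi) / dta) ** 2 + ((ev - di) / norm) ** 2
            passed += g2.min() <= 1.0
        return passed / len(ref)

    x = np.linspace(0, 100, 501)                      # positions in mm
    ref = np.exp(-((x - 50) / 20) ** 2)               # reference (e.g. CT-plan) profile
    ev = np.exp(-((x - 50.5) / 20) ** 2) * 1.01       # evaluated (e.g. sCT-plan) profile
    print(f"gamma pass rate: {100 * gamma_pass_rate(x, ref, ev):.1f}%")
    ```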

  13. Spatial distribution of nymphs of Scaphoideus titanus (Homoptera: Cicadellidae) in grapes, and evaluation of sequential sampling plans.

    PubMed

    Lessio, Federico; Alma, Alberto

    2006-04-01

    The spatial distribution of the nymphs of Scaphoideus titanus Ball (Homoptera: Cicadellidae), the vector of grapevine flavescence dorée (Candidatus Phytoplasma vitis, 16Sr-V), was studied by applying Taylor's power law. Studies were conducted from 2002 to 2005 in organic and conventional vineyards of Piedmont, northern Italy. Minimum sample sizes and fixed-precision-level stop lines were calculated to develop appropriate sampling plans. Model validation was performed on independent field data by means of the Resampling Validation of Sample Plans (RVSP) software. The nymphal distribution, analyzed via Taylor's power law, was aggregated, with b = 1.49. A sample of 32 plants was adequate at low pest densities with a precision level of D0 = 0.30, but for a more accurate estimate (D0 = 0.10) the required sample size rises to 292 plants. Green's fixed-precision stop lines seem more suitable for field sampling: RVSP simulations of this sampling plan showed precision levels very close to the desired ones. However, at a prefixed precision level of 0.10, sampling becomes too time-consuming, whereas a precision level of 0.25 is easily achievable. How these results could influence the correct application of the compulsory control of S. titanus and flavescence dorée in Italy is discussed.
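
    A sketch of the Taylor's power law workflow behind these numbers: fit log(variance) against log(mean) across field counts, then derive the minimum sample size n = a·m^(b−2)/D0² for a chosen precision D0. The counts below are synthetic; only b ≈ 1.49 echoes the study.

    ```python
    # Fit Taylor's power law and compute a fixed-precision minimum sample size.
    import numpy as np

    rng = np.random.default_rng(3)
    means = np.array([0.2, 0.5, 1.0, 2.0, 5.0, 10.0])
    variances = 2.0 * means ** 1.49 * rng.uniform(0.8, 1.2, means.size)  # synthetic samples

    b_slope, log_a = np.polyfit(np.log(means), np.log(variances), 1)
    a = np.exp(log_a)
    print(f"Taylor's power law: a = {a:.2f}, b = {b_slope:.2f}")

    def min_sample_size(m, a, b, D0):
        """Plants to count so the SE-to-mean ratio equals D0 at mean density m."""
        return a * m ** (b - 2.0) / D0 ** 2

    print(f"n at m = 0.5 nymphs/plant, D0 = 0.30: {min_sample_size(0.5, a, b_slope, 0.30):.0f}")
    ```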

  14. Validation experiments to determine radiation partitioning of heat flux to an object in a fully turbulent fire.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ricks, Allen; Blanchat, Thomas K.; Jernigan, Dann A.

    2006-06-01

    It is necessary to improve understanding of, and to develop validation data for, the heat flux incident to an object located within the fire plume for the validation of SIERRA/FUEGO/SYRINX fire and SIERRA/CALORE. One key aspect of the validation data sets is the determination of the relative contributions of the radiative and convective heat fluxes. To meet this objective, a cylindrical calorimeter with sufficient instrumentation to measure total and radiative heat flux has been designed and fabricated. This calorimeter will be tested both in the controlled radiative environment of the Penlight facility and in a fire environment in the FLAME/Radiant Heat (FRH) facility. Validation experiments are specifically designed for direct comparison with the computational predictions. Making meaningful comparisons between computational and experimental results requires careful characterization and control of the experimental features or parameters used as inputs to the computational model. Validation experiments must be designed to capture the essential physical phenomena, including all relevant initial and boundary conditions. A significant question in modeling the heat flux incident to an object in or near a fire is the contribution of the radiation and convection modes of heat transfer. The series of experiments documented in this test plan is designed to provide data on the radiation partitioning, defined as the fraction of the total heat flux that is due to radiation.

  15. Using the Theory of Planned Behavior to predict HPV vaccination intentions of college men.

    PubMed

    Catalano, Hannah Priest; Knowlden, Adam P; Birch, David A; Leeper, James D; Paschal, Angelia M; Usdan, Stuart L

    2017-04-01

    The purpose of this study was to test Theory of Planned Behavior (TPB) constructs in predicting human papillomavirus (HPV) vaccination behavioral intentions of vaccine-eligible college men. Participants were unvaccinated college men aged 18-26 years attending a large public university in the southeastern United States during Spring 2015. A nonexperimental, cross-sectional study design was employed. Instrumentation comprised a qualitative elicitation study, expert panel review, pilot test, test-retest, and internal consistency, construct validity, and predictive validity assessments using data collected from an online self-report questionnaire. The sample consisted of 256 college men, and the final structural model exhibited acceptable fit to the data. Attitude toward the behavior (β = 0.169) and subjective norm (β = 0.667) were significant predictors of behavioral intention, accounting for 58% of its variance. Practitioners may utilize this instrument for the development and evaluation of TPB-based interventions to increase HPV vaccination intentions of undergraduate college men.

  16. Roadway management plan based on rockfall modelling calibration and validation. Application along the Ma-10 road in Mallorca (Spain)

    NASA Astrophysics Data System (ADS)

    Mateos, Rosa Maria; Garcia, Inmaculada; Reichenbach, Paola; Herrera, Gerardo; Sarro, Roberto; Rius, Joan; Aguilo, Raul

    2016-04-01

    The Tramuntana range, in the northwestern sector of the island of Mallorca (Spain), is frequently affected by rockfalls, which have caused significant damage, mainly along the road network. The Ma-10 road constitutes the main transportation corridor of the range, with heavy traffic estimated at 7,200 vehicles per day on average. With a length of 111 km and a tortuous path, the road links 12 municipalities and constitutes a strategic route on the island for many tourist resorts. From 1995 to the present, 63 rockfalls have affected the Ma-10 road, with volumes ranging from 0.3 m3 to 30,000 m3. Fortunately, no fatalities occurred, but numerous road blockages caused significant economic losses, valued at around 11 million euros (Mateos et al., 2013). In this work we present the procedure applied to calibrate and validate rockfall modelling in the Tramuntana region, using 103 cases from the available detailed rockfall inventory (Mateos, 2006). We exploited STONE (Guzzetti et al., 2002), a GIS-based rockfall simulation software which computes 2D and 3D rockfall trajectories starting from a DTM and maps of the dynamic rolling friction coefficient and of the normal and tangential energy restitution coefficients. The appropriate identification of these parameters determines the accuracy of the simulation. To calibrate them, we selected 40 rockfalls along the range covering a wide variety of outcropping lithologies. Coefficient values were varied over numerous runs in order to select those for which the extent and shape of the simulation matched the field mapping. Best results were summarized as average statistical values for each parameter and each geotechnical unit, with mode values representing the data most precisely. For the validation stage, 10 well-known rockfalls used in the calibration phase were initially selected, and confidence tests were applied that account not only for successes but also for errors. We further validated the calibrated parameters along the Ma-10 road using the 63 rockfalls recorded during the past 18 years along the road; 81.5% of the rockfalls are well represented by the STONE modelling. The results have been exploited by the Road Maintenance Service of Mallorca to design the following road management plan: (1) Phase 1, short-term: a specific plan for the road sections where rockfalls were registered and modelling results were obtained; a large investment will be made to implement retention and protection measures. (2) Phase 2, medium-term: a specific plan for the road sections where rockfalls were registered but no modelling results were obtained; these cases require new studies at local scale and other modelling software with higher-resolution input data. (3) Phase 3, long-term: a specific plan for the road sections where no rockfalls were registered but modelling results were obtained; these are potential rockfall areas, and local, specific ground studies are necessary.
    References
    Mateos RM (2006) Los movimientos de ladera en la Serra de Tramuntana (Mallorca). Caracterización geomecánica y análisis de peligrosidad. PhD thesis, Servicio de Publicaciones de la Universidad Complutense de Madrid, Madrid, 299 p.
    Mateos RM, García-Moreno I, Herrera G, Mulas J (2013) Damage caused by recent mass-movements in Majorca (Spain), a region with a high risk due to tourism. In: Margottini C, Canuti P, Sassa K (eds) Landslide Science and Practice, Volume 7: Social and Economic Impact and Policies, pp 105-113.
    Guzzetti F, Crosta G, Detti R, Agliardi F (2002) STONE: A computer program for the three-dimensional simulation of rock-falls. Computers & Geosciences 28:1079-1093.

  17. Model-based calculations of surface mass balance of mountain glaciers for the purpose of water consumption planning: focus on Djankuat Glacier (Central Caucasus)

    NASA Astrophysics Data System (ADS)

    Rybak, O. O.; Rybak, E. A.

    2018-01-01

    Mountain glaciers act as regulators of run-off in the summer period, which is crucial for the economy, especially in dynamically developing regions with rapidly growing populations such as Central Asia or the Northern Caucasus in Russia. Overall, glaciers stabilize the water supply in comparatively arid areas and provide conditions for sustainable economic development in mountainous regions and the surrounding territories. Reliable prediction of glacial run-off is required to elaborate regional development strategies, a goal that can be achieved by incorporating mathematical modeling methods into planning methodologies. In this paper, we consider one of the first steps in glacier dynamical modeling: surface mass balance simulation. We focus on the Djankuat Glacier in the Central Caucasus, where regular observations have been conducted over the last fifty years, providing an exceptional opportunity to calibrate and validate a mathematical model.
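
    As a hedged illustration of the surface-mass-balance step, here is a minimal positive-degree-day sketch; the degree-day factor, precipitation, and temperature series are assumptions, not Djankuat observations.

    ```python
    # Positive-degree-day surface mass balance with synthetic daily forcing.
    import numpy as np

    rng = np.random.default_rng(4)
    days = np.arange(365)
    temp = 8.0 * np.sin(2 * np.pi * (days - 110) / 365) + rng.normal(0, 2, 365)  # deg C
    precip = np.full(365, 2.5)                      # mm w.e. per day, assumed constant

    ddf_ice = 7.0                                   # mm w.e. per positive degree-day (assumed)
    melt = ddf_ice * np.clip(temp, 0.0, None)       # ablation on positive-degree days
    accum = np.where(temp < 1.0, precip, 0.0)       # snowfall when near/below freezing

    balance = (accum - melt).sum() / 1000.0         # annual balance, m w.e.
    print(f"modelled annual surface mass balance: {balance:+.2f} m w.e.")
    ```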

  18. The role of computer-aided 3D surgery and stereolithographic modelling for vector orientation in premaxillary and trans-sinusoidal maxillary distraction osteogenesis.

    PubMed

    Varol, Altan; Basa, Selçuk

    2009-06-01

    Maxillary distraction osteogenesis is a challenging procedure when performed with internal submerged distractors because accurate distraction vectors must be set. Five patients with severe maxillary retrognathy were planned with Mimics 10.01 CMF and Simplant 10.01 software. Distraction vectors and distractor rods were arranged in a 3D environment and on STL models. All patients were operated on under general anaesthesia, and a complete Le Fort I downfracture was performed. All distractions were performed according to the orientated vectors. All patients achieved stable occlusion and a satisfactory aesthetic outcome at the end of the treatment period. Preoperative bending of internal maxillary distractors prevents significant loss of operating time. 3D computer-aided surgical simulation and model surgery provide accurate orientation of distraction vectors for premaxillary and internal trans-sinusoidal maxillary distraction. The combination of virtual surgical simulation and stereolithographic model surgery can be validated as an effective method of preoperative planning for complicated maxillofacial surgery cases.

  19. Independent dose verification system with Monte Carlo simulations using TOPAS for passive scattering proton therapy at the National Cancer Center in Korea

    NASA Astrophysics Data System (ADS)

    Shin, Wook-Geun; Testa, Mauro; Kim, Hak Soo; Jeong, Jong Hwi; Byeong Lee, Se; Kim, Yeon-Joo; Min, Chul Hee

    2017-10-01

    For the independent validation of treatment plans, we developed a fully automated Monte Carlo (MC)-based patient dose calculation system with the tool for particle simulation (TOPAS) and the proton therapy machine installed at the National Cancer Center in Korea, enabling routine, automatic dose recalculation for each patient. The proton beam nozzle was modeled with TOPAS to simulate the therapeutic beam, and MC commissioning was performed by comparing percent depth dose against measurement. Beam set-up from the prescribed beam range and modulation width was automated by modifying the vendor-specific method. The CT phantom was modeled from the DICOM CT files with a TOPAS built-in function, and in-house C++ code directly imports the CT files for positioning the CT phantom, the RT-plan file for simulating the treatment plan, and the RT-structure file for applying the Hounsfield unit (HU) assignment. The developed system was validated by comparing dose distributions with those calculated by the treatment planning system (TPS) for a lung phantom and two patient cases (abdomen and internal mammary node). The beam commissioning results agreed to within 0.8 mm for the B8 option in both the beam range and the modulation width of the spread-out Bragg peaks. The beam set-up technique predicts the range and modulation width with accuracies of 0.06% and 0.51%, respectively, relative to the prescribed range and modulation at arbitrary points of the B5 option (ranges of 128.3, 132.0, and 141.2 mm). The dose distributions showed better than 99% passing rates for the 3D gamma index (3 mm distance-to-agreement, 3% dose difference) between the MC simulations and the clinical TPS in the target volume. In normal tissues, however, less favorable agreement was obtained for the lung phantom and internal mammary node cases. The discrepancies might come from limitations of the clinical TPS, namely an inaccurate dose calculation algorithm for scattering in the range compensator and in inhomogeneous material. The steep slope of the compensator, the conversion of HU values to the human phantom, and the dose calculation algorithm for the HU assignment could also contribute to the discrepancies. The system could be used for independent dose validation of treatment plans involving strong inhomogeneities, steep compensators, and risk-sensitive sites such as lung and head & neck cases. Depending on the treatment policy, the dose discrepancies predicted with MC could inform the decision to accept the original treatment plan.

  20. SeaQuaKE: Sea-optimized Quantum Key Exchange

    DTIC Science & Technology

    2014-06-01

    ... is led by Applied Communications Sciences under the ONR Free Space Optical Quantum Key Distribution Special Notice (13-SN-0004 under ONRBAA13-001) ... In addition, we discuss our initial progress towards the free-space quantum channel model and planning for the experimental validation effort. Subject terms: quantum communications, free-space optical communications.

  1. A systematic plan for the continued study of dimensional stability of metallic alloys considered for the fabrication of cryogenic wind tunnel models

    NASA Technical Reports Server (NTRS)

    Wigley, D. A.

    1985-01-01

    Interrelated research and development activities and the phased development of the stepped-specimen program are documented, and a sequence of machining, validation, and heat-treatment cycles for a specific program on one material is described. Proposed work for the next phase of dimensional-stability research is presented, and further technology development activities are proposed.

  2. Personality and organizational influences on aerospace human performance

    NASA Technical Reports Server (NTRS)

    Helmreich, Robert L.

    1989-01-01

    Individual and organizational influences on performance in aerospace environments are discussed. A model of personality with demonstrated validity is described along with reasons why personality's effects on performance have been underestimated. Organizational forces including intergroup conflict and coercive pressures are also described. It is suggested that basic and applied research in analog situations is needed to provide necessary guidance for planning future space missions.

  3. Evaluation of Advanced Stirling Convertor Net Heat Input Correlation Methods Using a Thermal Standard

    NASA Technical Reports Server (NTRS)

    Briggs, Maxwell; Schifer, Nicholas

    2011-01-01

    Test hardware was used to validate net-heat-input prediction models. The problem: net heat input cannot be measured directly during operation, yet it is a key parameter in predicting convertor efficiency. Efficiency = electrical power output (measured) divided by net heat input (calculated). Efficiency is used to compare convertor designs and to trade technology advantages for mission planning.

  4. A survey of ground operations tools developed to simulate the pointing of space telescopes and the design for WISE

    NASA Technical Reports Server (NTRS)

    Fabinsky, Beth

    2006-01-01

    WISE, the Wide Field Infrared Survey Explorer, is scheduled for launch in June 2010. The mission operations system for WISE requires a software modeling tool to help plan, integrate and simulate all spacecraft pointing and verify that no attitude constraints are violated. In the course of developing the requirements for this tool, an investigation was conducted into the design of similar tools for other space-based telescopes. This paper summarizes the ground software and processes used to plan and validate pointing for a selection of space telescopes; with this information as background, the design for WISE is presented.

  5. Extrapolation of enalapril efficacy from adults to children using pharmacokinetic/pharmacodynamic modelling.

    PubMed

    Kechagia, Irene-Ariadne; Kalantzi, Lida; Dokoumetzidis, Aristides

    2015-11-01

    To extrapolate enalapril efficacy to children 0-6 years old, a pharmacokinetic/pharmacodynamic (PKPD) model was built using literature data, with blood pressure as the PD endpoint. A PK model of enalapril was developed from literature paediatric data up to 16 years of age. A PD model of enalapril was fitted to adult literature response-versus-time data at various doses. The final PKPD model was validated against literature paediatric efficacy observations (diastolic blood pressure (DBP) drop after 2 weeks of treatment) in children aged 6 years and older. The model was then used to predict enalapril efficacy for ages 0-6 years. A two-compartment PK model was chosen, with weight (indirectly reflecting age) as a covariate on clearance and central volume. An indirect-link PD model was chosen to describe the drug effect. External validation of the model's capability to predict efficacy in children was successful. Enalapril efficacy was extrapolated to ages 1-11 months and 1-6 years, finding mean DBP drops of 11.2 and 11.79 mmHg, respectively. Mathematical modelling was used to extrapolate enalapril efficacy to young children to support a paediatric investigation plan targeting a paediatric-use marketing authorization application. © 2015 Royal Pharmaceutical Society.
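
    A hedged sketch of the stated model structure: a two-compartment PK model with first-order absorption driving an indirect-response model of DBP. All rate constants and the dose are illustrative, not the fitted literature values.

    ```python
    # Two-compartment PK plus indirect-response PD, with placeholder parameters.
    import numpy as np
    from scipy.integrate import odeint

    def pkpd(y, t, ka, ke, k12, k21, kin, kout, imax, ic50):
        gut, c1, c2, dbp = y
        dgut = -ka * gut
        dc1 = ka * gut - (ke + k12) * c1 + k21 * c2      # central compartment
        dc2 = k12 * c1 - k21 * c2                        # peripheral compartment
        inhibition = 1.0 - imax * c1 / (ic50 + c1)       # drug inhibits DBP "production"
        ddbp = kin * inhibition - kout * dbp             # indirect-response link
        return [dgut, dc1, dc2, ddbp]

    t = np.linspace(0, 24, 241)                          # hours after one oral dose
    y0 = [10.0, 0.0, 0.0, 80.0]                          # dose units and baseline DBP (mmHg)
    params = (1.0, 0.3, 0.2, 0.1, 80.0 * 0.1, 0.1, 0.3, 1.0)  # ka, ke, k12, k21, kin, kout, imax, ic50
    sol = odeint(pkpd, y0, t, args=params)
    print(f"max DBP drop: {80.0 - sol[:, 3].min():.1f} mmHg")
    ```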

  6. Drift-Scale Coupled Processes (DST and THC Seepage) Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    E. Sonnenthal; N. Spycher

    The purpose of this Analysis/Model Report (AMR) is to document the Near-Field Environment (NFE) and Unsaturated Zone (UZ) models used to evaluate the potential effects of coupled thermal-hydrologic-chemical (THC) processes on unsaturated zone flow and transport. This is in accordance with the "Technical Work Plan (TWP) for Unsaturated Zone Flow and Transport Process Model Report", Addendum D, Attachment D-4 (Civilian Radioactive Waste Management System (CRWMS) Management and Operating Contractor (M and O) 2000 [153447]) and "Technical Work Plan for Nearfield Environment Thermal Analyses and Testing" (CRWMS M and O 2000 [153309]). These models include the Drift Scale Test (DST) THC Model and several THC seepage models. These models provide the framework to evaluate THC coupled processes at the drift scale, predict flow and transport behavior for specified thermal loading conditions, and predict the chemistry of waters and gases entering potential waste-emplacement drifts. The intended use of this AMR is to provide input for the following: (1) Performance Assessment (PA); (2) Abstraction of Drift-Scale Coupled Processes AMR (ANL-NBS-HS-000029); (3) UZ Flow and Transport Process Model Report (PMR); and (4) Near-Field Environment (NFE) PMR. The work scope for this activity is presented in the TWPs cited above, and summarized as follows: continue development of the repository drift-scale THC seepage model used in support of the TSPA in-drift geochemical model; incorporate heterogeneous fracture property realizations; study sensitivity of results to changes in input data and mineral assemblage; validate the DST model by comparison with field data; perform simulations to predict mineral dissolution and precipitation and their effects on fracture properties and chemistry of water (but not flow rates) that may seep into drifts; submit modeling results to the TDMS and document the models. The model development, input data, sensitivity and validation studies described in this AMR are required to fully document and address the requirements of the TWPs.

  7. Drift-Scale Coupled Processes (DST and THC Seepage) Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    E. Sonnenthal

    The purpose of this Analysis/Model Report (AMR) is to document the Near-Field Environment (NFE) and Unsaturated Zone (UZ) models used to evaluate the potential effects of coupled thermal-hydrologic-chemical (THC) processes on unsaturated zone flow and transport. This is in accordance with the "Technical Work Plan (TWP) for Unsaturated Zone Flow and Transport Process Model Report", Addendum D, Attachment D-4 (Civilian Radioactive Waste Management System (CRWMS) Management and Operating Contractor (M&O) 2000 [153447]) and "Technical Work Plan for Nearfield Environment Thermal Analyses and Testing" (CRWMS M&O 2000 [153309]). These models include the Drift Scale Test (DST) THC Model and several THC seepage models. These models provide the framework to evaluate THC coupled processes at the drift scale, predict flow and transport behavior for specified thermal loading conditions, and predict the chemistry of waters and gases entering potential waste-emplacement drifts. The intended use of this AMR is to provide input for the following: Performance Assessment (PA); Near-Field Environment (NFE) PMR; Abstraction of Drift-Scale Coupled Processes AMR (ANL-NBS-HS-000029); and UZ Flow and Transport Process Model Report (PMR). The work scope for this activity is presented in the TWPs cited above, and summarized as follows: Continue development of the repository drift-scale THC seepage model used in support of the TSPA in-drift geochemical model; incorporate heterogeneous fracture property realizations; study sensitivity of results to changes in input data and mineral assemblage; validate the DST model by comparison with field data; perform simulations to predict mineral dissolution and precipitation and their effects on fracture properties and chemistry of water (but not flow rates) that may seep into drifts; submit modeling results to the TDMS and document the models. The model development, input data, sensitivity and validation studies described in this AMR are required to fully document and address the requirements of the TWPs.

  8. A NASTRAN model of a large flexible swing-wing bomber. Volume 3: NASTRAN model development-wing structure

    NASA Technical Reports Server (NTRS)

    Mock, W. D.; Latham, R. A.

    1982-01-01

    The NASTRAN model plan for the wing structure was expanded in detail to generate the NASTRAN model for this substructure. The grid point coordinates were coded for each element. The material properties and sizing data for each element were specified. The wing substructure model was thoroughly checked out for continuity, connectivity, and constraints. This substructure was processed for structural influence coefficients (SIC) point loadings and the deflections were compared to those computed for the aircraft detail model. Finally, a demonstration and validation processing of this substructure was accomplished using the NASTRAN finite element program. The bulk data deck, stiffness matrices, and SIC output data were delivered.

  9. A NASTRAN model of a large flexible swing-wing bomber. Volume 2: NASTRAN model development-horizontal stabilizer, vertical stabilizer and nacelle structures

    NASA Technical Reports Server (NTRS)

    Mock, W. D.; Latham, R. A.; Tisher, E. D.

    1982-01-01

    The NASTRAN model plans for the horizontal stabilizer, vertical stabilizer, and nacelle structure were expanded in detail to generate the NASTRAN model for each of these substructures. The grid point coordinates were coded for each element. The material properties and sizing data for each element were specified. Each substructure model was thoroughly checked out for continuity, connectivity, and constraints. These substructures were processed for structural influence coefficients (SIC) point loadings and the deflections were compared to those computed for the aircraft detail models. Finally, a demonstration and validation processing of these substructures was accomplished using the NASTRAN finite element program installed at NASA/DFRC facility.

  10. Visualization of the variability of 3D statistical shape models by animation.

    PubMed

    Lamecker, Hans; Seebass, Martin; Lange, Thomas; Hege, Hans-Christian; Deuflhard, Peter

    2004-01-01

    Models of the 3D shape of anatomical objects and knowledge of their statistical variability are of great benefit in many computer-assisted medical applications such as image analysis and therapy or surgery planning. Statistical shape models have successfully been applied to automate the task of image segmentation. The generation of 3D statistical shape models requires the identification of corresponding points on two shapes, which remains a difficult problem, especially for shapes of complicated topology. In order to interpret and validate the variations encoded in a statistical shape model, visual inspection is of great importance. This work describes the generation and interpretation of statistical shape models of the liver and the pelvic bone.
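
    A minimal sketch of the animation idea: sweep one statistical mode between −3 and +3 standard deviations and emit the intermediate shapes. The shapes here are synthetic rather than liver or pelvic-bone meshes.

    ```python
    # Oscillate the first PCA mode of a synthetic shape model within +/-3 sigma.
    import numpy as np

    rng = np.random.default_rng(5)
    shapes = rng.normal(size=(15, 80))                   # 15 training shapes, 40 2-D points each
    mean = shapes.mean(axis=0)
    U, S, Vt = np.linalg.svd(shapes - mean, full_matrices=False)
    lam = (S ** 2) / (len(shapes) - 1)

    mode = 0                                             # animate the first mode
    for t in np.linspace(0.0, 2.0 * np.pi, 8):
        b = 3.0 * np.sqrt(lam[mode]) * np.sin(t)         # oscillate within +/-3 sigma
        frame = mean + b * Vt[mode]
        print(f"b = {b:+.2f}, first point = ({frame[0]:.2f}, {frame[1]:.2f})")
    ```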

  11. A NASTRAN model of a large flexible swing-wing bomber. Volume 4: NASTRAN model development-fuselage structure

    NASA Technical Reports Server (NTRS)

    Mock, W. D.; Latham, R. A.

    1982-01-01

    The NASTRAN model plan for the fuselage structure was expanded in detail to generate the NASTRAN model for this substructure. The grid point coordinates were coded for each element. The material properties and sizing data for each element were specified. The fuselage substructure model was thoroughly checked out for continuity, connectivity, and constraints. This substructure was processed for structural influence coefficients (SIC) point loadings and the deflections were compared to those computed for the aircraft detail model. Finally, a demonstration and validation processing of this substructure was accomplished using the NASTRAN finite element program. The bulk data deck, stiffness matrices, and SIC output data were delivered.

  12. Comparison of statistical and theoretical habitat models for conservation planning: the benefit of ensemble prediction

    USGS Publications Warehouse

    Jones-Farrand, D. Todd; Fearer, Todd M.; Thogmartin, Wayne E.; Thompson, Frank R.; Nelson, Mark D.; Tirpak, John M.

    2011-01-01

    Selection of a modeling approach is an important step in the conservation planning process, but little guidance is available. We compared two statistical and three theoretical habitat modeling approaches representing those currently used for avian conservation planning at landscape and regional scales: hierarchical spatial count (HSC), classification and regression tree (CRT), habitat suitability index (HSI), forest structure database (FS), and habitat association database (HA). We focused our comparison on models for five priority forest-breeding species in the Central Hardwoods Bird Conservation Region: Acadian Flycatcher, Cerulean Warbler, Prairie Warbler, Red-headed Woodpecker, and Worm-eating Warbler. Lacking complete knowledge of the distribution and abundance of each species with which to illuminate differences between approaches and provide strong grounds for recommending one approach over another, we compared models in two ways: rank correlations among model outputs and comparison of spatial correspondence. In general, rank correlations were significantly positive among models for each species, indicating general agreement among the models. Worm-eating Warblers had the highest pairwise correlations, all of which were significant (P < 0.05). Red-headed Woodpeckers had the lowest agreement among models, suggesting greater uncertainty in the relative conservation value of areas within the region. We assessed model uncertainty by mapping the spatial congruence in priorities (i.e., top ranks) resulting from each model for each species and calculating the coefficient of variation across model ranks for each location. This allowed identification of areas more likely to be good targets of conservation effort for a species, areas that were least likely, and those in between, where uncertainty is higher and conservation action thus incorporates more risk. Based on our results, models developed independently for the same purpose (conservation planning for a particular species in a particular geography) yield different answers and thus different conservation strategies. We assert that using only one habitat model (even if validated) as the foundation of a conservation plan is risky. Using multiple models (i.e., ensemble prediction) can reduce uncertainty and increase the efficacy of conservation action when models corroborate one another, and increase understanding of the system when they do not.
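
    A small sketch of the two comparison steps, assuming SciPy's Spearman correlation and a per-cell coefficient of variation across model ranks; the five "model" outputs below are random stand-ins, not the published habitat models.

    ```python
    # Pairwise rank correlations among models plus per-cell rank-CV mapping.
    import numpy as np
    from scipy.stats import spearmanr, rankdata

    rng = np.random.default_rng(6)
    n_cells = 1000
    base = rng.random(n_cells)
    models = {name: base + rng.normal(0, s, n_cells)     # five correlated "models"
              for name, s in zip(["HSC", "CRT", "HSI", "FS", "HA"],
                                 [0.1, 0.2, 0.3, 0.4, 0.5])}

    names = list(models)
    for i in range(len(names)):
        for j in range(i + 1, len(names)):
            rho, p = spearmanr(models[names[i]], models[names[j]])
            print(f"{names[i]} vs {names[j]}: rho = {rho:.2f} (p = {p:.3g})")

    ranks = np.vstack([rankdata(v) for v in models.values()])
    cv = ranks.std(axis=0) / ranks.mean(axis=0)          # high CV = high model disagreement
    print(f"cells with CV > 0.3 (uncertain priorities): {(cv > 0.3).mean():.1%}")
    ```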

  13. Repeatability of dose painting by numbers treatment planning in prostate cancer radiotherapy based on multiparametric magnetic resonance imaging

    NASA Astrophysics Data System (ADS)

    van Schie, Marcel A.; Steenbergen, Peter; Viet Dinh, Cuong; Ghobadi, Ghazaleh; van Houdt, Petra J.; Pos, Floris J.; Heijmink, Stijn W. T. J. P.; van der Poel, Henk G.; Renisch, Steffen; Vik, Torbjørn; van der Heide, Uulke A.

    2017-07-01

    Dose painting by numbers (DPBN) refers to a voxel-wise prescription of radiation dose modelled from functional image characteristics, in contrast to dose painting by contours, which requires delineations to define the target for dose escalation. The direct relation between functional imaging characteristics and DPBN implies that random variations in the images may propagate into the dose distribution. The stability of MR-only prostate cancer treatment planning based on DPBN with respect to these variations is as yet unknown. We conducted a test-retest study to investigate the stability of DPBN for prostate cancer in a semi-automated MR-only treatment planning workflow. Twelve patients received a multiparametric MRI on two separate days prior to prostatectomy. The tumor probability (TP) within the prostate was derived from image features with a logistic regression model. Dose-mapping functions were applied to acquire a DPBN prescription map that served to generate an intensity modulated radiation therapy (IMRT) treatment plan. Dose calculations were done on a pseudo-CT derived from the MRI. The TP map, the DPBN map, and the IMRT dose distribution were compared between both MRI sessions, using the intraclass correlation coefficient (ICC) to quantify the repeatability of the planning pipeline. The quality of each treatment plan was measured with a quality factor (QF). Median ICC values for the TP map, the DPBN map, and the IMRT dose distribution were 0.82, 0.82, and 0.88, respectively, for linear dose mapping, and 0.82, 0.84, and 0.94 for square-root dose mapping. A median QF of 3.4% was found among all treatment plans. We demonstrated the stability of DPBN radiotherapy treatment planning in prostate cancer, with excellent overall repeatability and acceptable treatment plan quality. Using validated tumor probability modelling and simple dose-mapping techniques, it was shown that consistent treatment plans were obtained despite day-to-day variations in the imaging data.
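
    A minimal sketch of the two dose-mapping functions compared in the study, mapping voxel-wise tumor probability (TP) to a prescribed dose between assumed bounds; the bounds are placeholders, not the study's prescription.

    ```python
    # Linear and square-root TP-to-dose mapping with assumed dose bounds.
    import numpy as np

    D_LOW, D_HIGH = 68.0, 93.0        # assumed minimum / maximum voxel dose in Gy

    def dose_linear(tp):
        return D_LOW + (D_HIGH - D_LOW) * tp

    def dose_sqrt(tp):
        return D_LOW + (D_HIGH - D_LOW) * np.sqrt(tp)

    tp = np.array([0.0, 0.1, 0.5, 0.9])
    print("linear:", dose_linear(tp))   # square-root mapping escalates low-TP voxels harder
    print("sqrt:  ", dose_sqrt(tp))
    ```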

  14. World Ocean Circulation Experiment

    NASA Technical Reports Server (NTRS)

    Clarke, R. Allyn

    1992-01-01

    The oceans are an equal partner with the atmosphere in the global climate system. The World Ocean Circulation Experiment (WOCE) is presently being implemented to improve ocean models that are useful for climate prediction, both by encouraging more model development and, more importantly, by providing quality data sets that can be used to force or validate such models. WOCE is the first oceanographic experiment that plans to generate and use multiparameter global ocean data sets. For WOCE to succeed, oceanographers must establish and learn to use more effective methods of assembling, quality-controlling, manipulating, and distributing oceanographic data.

  15. Validation of the Oncentra Brachy Advanced Collapsed cone Engine for a commercial (192)Ir source using heterogeneous geometries.

    PubMed

    Ma, Yunzhi; Lacroix, Fréderic; Lavallée, Marie-Claude; Beaulieu, Luc

    2015-01-01

    To validate the Advanced Collapsed cone Engine (ACE) dose calculation engine of the Oncentra Brachy (OcB) treatment planning system using an (192)Ir source. Two levels of validation were performed, conformant to the model-based dose calculation algorithm commissioning guidelines of the American Association of Physicists in Medicine TG-186 report. Level 1 uses all-water phantoms, and the validation is against the TG-43 methodology. Level 2 uses real patient cases, and the validation is against Monte Carlo (MC) simulations. For each case, the ACE and TG-43 calculations were performed in the OcB treatment planning system; the ALGEBRA MC system was used to perform MC simulations. In Level 1, the ray effect depends on both the accuracy mode and the number of dwell positions. The volume fraction with dose error ≥2% quickly reduces from 23% (13%) for a single dwell to 3% (2%) for eight dwell positions in the standard (high) accuracy mode. In Level 2, the 10% and higher isodose lines were observed to overlap between ACE (both standard and high-resolution modes) and MC. Major clinical indices (V100, V150, V200, D90, D50, and D2cc) were investigated and validated against MC. For example, among the Level 2 cases, the maximum deviation in V100 of ACE from MC is 2.75%, but up to ~10% for TG-43. Similarly, the maximum deviation in D90 is 0.14 Gy between ACE and MC, but up to 0.24 Gy for TG-43. ACE demonstrated good agreement with MC in most clinically relevant regions in the cases tested. Departure from MC is significant for specific situations but limited to low-dose (<10% isodose) regions. Copyright © 2015 American Brachytherapy Society. Published by Elsevier Inc. All rights reserved.
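
    For context, here is a sketch of the TG-43 1-D point-source formalism that ACE is benchmarked against; the radial dose function, anisotropy factor, and dose-rate constant below are rough placeholders, not consensus data for a specific (192)Ir source model.

    ```python
    # TG-43 1-D point-source dose rate: Sk * Lambda * (r0/r)**2 * g(r) * phi_an(r).
    import numpy as np

    r_tab   = np.array([0.5, 1.0, 2.0, 3.0, 5.0])        # cm
    g_tab   = np.array([1.00, 1.00, 0.99, 0.98, 0.95])   # radial dose function (placeholder)
    phi_tab = np.array([0.97, 0.97, 0.98, 0.98, 0.98])   # 1-D anisotropy factor (placeholder)

    def dose_rate(r, sk=40000.0, Lambda=1.108, r0=1.0):
        """Dose rate in cGy/h at distance r (cm); sk in U, Lambda in cGy/(h*U)."""
        g = np.interp(r, r_tab, g_tab)
        phi = np.interp(r, r_tab, phi_tab)
        return sk * Lambda * (r0 / r) ** 2 * g * phi

    for r in (1.0, 2.0, 5.0):
        print(f"r = {r} cm: {dose_rate(r):,.0f} cGy/h")
    ```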

  16. Developing Statistical Physics Course Handout on Distribution Function Materials Based on Science, Technology, Engineering, and Mathematics

    NASA Astrophysics Data System (ADS)

    Riandry, M. A.; Ismet, I.; Akhsan, H.

    2017-09-01

    This study aims to produce a valid and practical statistical physics course handout on distribution function materials based on STEM. The Rowntree development model was used to produce the handout; it consists of three stages: planning, development, and evaluation. In this study, the evaluation stage used Tessmer's formative evaluation, which consists of five stages: self-evaluation, expert review, one-to-one evaluation, small-group evaluation, and field test. However, the handout was tested only on validity and practicality aspects, so the field test stage was not implemented. Data were collected through walkthroughs and questionnaires. The subjects of this study were students of the 6th and 8th semesters of the 2016/2017 academic year in the Physics Education Study Program of Sriwijaya University. The average result of the expert review was 87.31% (very valid category); the one-to-one evaluation gave an average result of 89.42%, and the small-group evaluation 85.92%. From the one-to-one and small-group evaluation stages, the average student response to the handout was 87.67% (very practical category). Based on these results, it can be concluded that the handout is valid and practical.

  17. Methods for Validation and Intercomparison of Remote Sensing and In situ Ice Water Measurements: Case Studies from CRYSTAL-FACE and Model Results

    NASA Technical Reports Server (NTRS)

    Sayres, D.S.; Pittman, J. V.; Smith, J. B.; Weinstock, E. M.; Anderson, J. G.; Heymsfield, G.; Li, L.; Fridlind, A.; Ackerman, A. S.

    2004-01-01

    Remote sensing observations, such as those from AURA, are necessary to understand the role of cirrus in determining the radiative and humidity budgets of the upper troposphere. Using these measurements quantitatively requires comparisons with in situ measurements that have previously been validated. However, a direct comparison of remote and in situ measurements is difficult because the spatial and temporal overlap must be sufficient to guarantee that both instruments are measuring the same air parcel. As difficult as this might be for gas-phase intercomparisons, cloud inhomogeneities significantly exacerbate the problem for cloud ice water content measurements. The CRYSTAL-FACE mission provided an opportunity to assess how well such intercomparisons can be performed and to establish flight plans that will be necessary for validation of future satellite instruments. During CRYSTAL-FACE, remote and in situ instruments were placed on different aircraft (NASA's ER-2 and WB-57), and the two planes flew in tandem so that the in situ payload flew in the field of view of the remote instruments. We show here that, even with this type of careful flight planning, it is not always possible to guarantee that remote and in situ instruments are viewing the same air parcel. We use ice water data derived from the in situ Harvard Total Water (HV-TW) instrument and the remote Goddard Cloud Radar System (CRS) and show that agreement between HV-TW and CRS is a strong function of the horizontal separation and the time delay between the aircraft transects. We also use a cloud model to simulate possible trajectories through a cloud and evaluate the use of statistical analysis in determining the agreement between the two instruments. This type of analysis should guide flight planning for future intercomparison efforts, whether for aircraft or satellite-borne instrumentation.

  18. The MADE Reference Information Model for Interoperable Pervasive Telemedicine Systems.

    PubMed

    Fung, Nick L S; Jones, Valerie M; Hermens, Hermie J

    2017-03-23

    The main objective is to develop and validate a reference information model (RIM) to support semantic interoperability of pervasive telemedicine systems. The RIM is one component within a larger, computer-interpretable "MADE language" developed by the authors in the context of the MobiGuide project. To validate our RIM, we applied it to a clinical guideline for patients with gestational diabetes mellitus (GDM). The RIM is derived from a generic data flow model of disease management which comprises a network of four types of concurrent processes: Monitoring (M), Analysis (A), Decision (D) and Effectuation (E). The resulting MADE RIM, which was specified using the formal Vienna Development Method (VDM), includes six main, high-level data types representing measurements, observations, abstractions, action plans, action instructions and control instructions. The authors applied the MADE RIM to the complete GDM guideline and derived from it a domain information model (DIM) comprising 61 archetypes, specifically 1 measurement, 8 observation, 10 abstraction, 18 action plan, 3 action instruction and 21 control instruction archetypes. It was observed that there are six generic patterns for transforming different guideline elements into MADE archetypes, although a direct mapping does not exist in some cases. The most notable examples are notifications to the patient and/or clinician, as well as decision conditions which pertain to specific stages in the therapy. The results provide evidence that the MADE RIM is suitable for modelling clinical data in the design of pervasive telemedicine systems. Together with the other components of the MADE language, the MADE RIM supports development of pervasive telemedicine systems that are interoperable and independent of particular clinical applications.
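
    To make the six MADE data types concrete, the following is a minimal, hypothetical sketch of them as Python dataclasses. The field names are invented for illustration and are not the published VDM specification.

        from dataclasses import dataclass
        from datetime import datetime
        from typing import List

        @dataclass
        class Measurement:            # raw sensor or self-reported value
            quantity: str
            value: float
            timestamp: datetime

        @dataclass
        class Observation:            # a clinically meaningful reading
            label: str
            source: Measurement

        @dataclass
        class Abstraction:            # higher-level state derived over time
            label: str
            evidence: List[Observation]

        @dataclass
        class ActionPlan:             # what should happen, and when
            goal: str
            schedule: str

        @dataclass
        class ActionInstruction:      # concrete instruction for patient or device
            plan: ActionPlan
            directive: str

        @dataclass
        class ControlInstruction:     # reconfigures the M-A-D-E processes themselves
            target_process: str       # "M", "A", "D" or "E"
            directive: str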

  19. Development of a Common Research Model for Applied CFD Validation Studies

    NASA Technical Reports Server (NTRS)

    Vassberg, John C.; Dehaan, Mark A.; Rivers, S. Melissa; Wahls, Richard A.

    2008-01-01

    The development of a wing/body/nacelle/pylon/horizontal-tail configuration for a common research model is presented, with focus on the aerodynamic design of the wing. Here, a contemporary transonic supercritical wing design is developed with aerodynamic characteristics that are well behaved and of high performance for configurations with and without the nacelle/pylon group. The horizontal tail is robustly designed for dive Mach number conditions and is suitably sized for typical stability and control requirements. The fuselage is representative of a wide-body commercial transport aircraft; it includes a wing-body fairing, as well as a scrubbing seal for the horizontal tail. The nacelle is a single-cowl, high-bypass-ratio, flow-through design with an exit area sized to achieve a natural unforced mass-flow ratio typical of commercial aircraft engines at cruise. The simplicity of this un-bifurcated nacelle geometry will facilitate grid generation efforts of subsequent CFD validation exercises. Detailed aerodynamic performance data have been generated for this model; however, this information is presented in such a manner as to not bias CFD predictions planned for the fourth AIAA CFD Drag Prediction Workshop, which incorporates this common research model into its blind test cases. The CFD results presented include wing pressure distributions with and without the nacelle/pylon, ML/D trend lines, and drag-divergence curves; the design point for the wing/body configuration is within 1% of its max-ML/D. Plans to test the common research model in the National Transonic Facility and the Ames 11-ft wind tunnels are also discussed.

  20. Application of a neuro-fuzzy model to landslide-susceptibility mapping for shallow landslides in a tropical hilly area

    NASA Astrophysics Data System (ADS)

    Oh, Hyun-Joo; Pradhan, Biswajeet

    2011-09-01

    This paper presents landslide-susceptibility mapping using an adaptive neuro-fuzzy inference system (ANFIS) in a geographic information system (GIS) environment. In the first stage, landslide locations in the study area were identified by interpreting aerial photographs, supported by an extensive field survey. In the second stage, landslide-related conditioning factors such as altitude, slope angle, plan curvature, distance to drainage, distance to road, soil texture and stream power index (SPI) were extracted from the topographic and soil maps. Then, landslide-susceptible areas were analyzed by the ANFIS approach and mapped using the landslide-conditioning factors. In particular, various membership functions (MFs) were applied for the landslide-susceptibility mapping, and their results were compared with the field-verified landslide locations. Additionally, the receiver operating characteristic (ROC) curves for all landslide susceptibility maps were drawn and the area-under-the-curve values were calculated. The ROC curve technique is based on plotting model sensitivity (true positive fraction values calculated for different threshold values) versus model specificity (true negative fraction values) on a graph. Landslide test locations that were not used during ANFIS modeling were used to validate the landslide susceptibility maps. The validation results revealed that the susceptibility maps constructed by the ANFIS predictive models using triangular, trapezoidal, generalized bell and polynomial MFs produced reasonable results (84.39%), which can be used for preliminary land-use planning. Finally, the authors concluded that ANFIS is a very useful and effective tool in regional landslide susceptibility assessment.
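
    A minimal sketch of the kind of ROC/AUC validation described above, using scikit-learn. The arrays here are synthetic stand-ins: in the study, the scores would be ANFIS susceptibility values and the labels field-verified landslide locations. Note that the conventional ROC plot uses the false positive fraction (1 - specificity) on the horizontal axis.

        import numpy as np
        from sklearn.metrics import roc_curve, auc

        # Synthetic stand-ins: y_true marks landslide cells (1) vs. stable cells (0);
        # y_score is a susceptibility index for the same cells.
        rng = np.random.default_rng(0)
        y_true = rng.integers(0, 2, size=500)
        y_score = np.clip(y_true * 0.4 + rng.normal(0.4, 0.2, size=500), 0, 1)

        fpr, tpr, thresholds = roc_curve(y_true, y_score)
        print(f"Area under the ROC curve: {auc(fpr, tpr):.3f}")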

  1. SeaSat-A Satellite Scatterometer (SASS) Validation and Experiment Plan

    NASA Technical Reports Server (NTRS)

    Schroeder, L. C. (Editor)

    1978-01-01

    This plan was generated by the SeaSat-A satellite scatterometer experiment team to define the pre- and post-launch activities necessary to conduct sensor validation and geophysical evaluation. Details included are an instrument and experiment description/performance requirements, success criteria, constraints, mission requirements, data processing requirements and data analysis responsibilities.

  2. Validity of the Butcher Treatment Planning Inventory as a Measure of Negative Treatment Attitudes

    ERIC Educational Resources Information Center

    Hatchett, Gregory T.

    2007-01-01

    This study evaluated the validity of the Butcher Treatment Planning Inventory (BTPI) as a measure of negative expectations and attitudes toward counseling. Undergraduate students completed the BTPI, the Attitudes Toward Seeking Professional Psychological Help Scale-Abbreviated Version, and the Expectations About Counseling-Brief Form during one…

  3. Two-Method Planned Missing Designs for Longitudinal Research

    ERIC Educational Resources Information Center

    Garnier-Villarreal, Mauricio; Rhemtulla, Mijke; Little, Todd D.

    2014-01-01

    We examine longitudinal extensions of the two-method measurement design, which uses planned missingness to optimize cost-efficiency and validity of hard-to-measure constructs. These designs use a combination of two measures: a "gold standard" that is highly valid but expensive to administer, and an inexpensive (e.g., survey-based)…

  4. Targeting Low Career Confidence Using the Career Planning Confidence Scale

    ERIC Educational Resources Information Center

    McAuliffe, Garrett; Jurgens, Jill C.; Pickering, Worth; Calliotte, James; Macera, Anthony; Zerwas, Steven

    2006-01-01

    The authors describe the development and validation of a test of career planning confidence that makes possible the targeting of specific problem issues in employment counseling. The scale, developed using a rational process and the authors' experience with clients, was tested for criterion-related validity against 2 other measures. The scale…

  5. Poster — Thur Eve — 30: 4D VMAT dose calculation methodology to investigate the interplay effect: experimental validation using TrueBeam Developer Mode and Gafchromic film

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Teke, T; Milette, MP; Huang, V

    2014-08-15

    The interplay effect between the tumor motion and the radiation beam modulation during a VMAT treatment delivery alters the delivered dose distribution from the planned one. This work presents and validates a method to accurately calculate the dose distribution in 4D, taking into account the tumor motion, the field modulation and the treatment starting phase. A QUASAR™ respiratory motion phantom was 4D scanned with a motion amplitude of 3 cm and a 3 second period. A static scan was also acquired with the lung insert and the tumor contained in it centered. A VMAT plan with a 6XFFF beam was created on the averaged CT and delivered on a Varian TrueBeam, and the trajectory log file was saved. From the trajectory log file, 10 VMAT plans (one for each breathing phase) and a Developer Mode XML file were created. For the 10 VMAT plans, the tumor motion was modeled by moving the isocentre on the static scan; the plans were re-calculated and summed in the treatment planning system. In Developer Mode, the tumor motion was simulated by moving the couch dynamically during the treatment. Gafchromic films were placed in the static QUASAR phantom and irradiated using Developer Mode. Different treatment starting phases were investigated (no phase shift, maximum inhalation and maximum exhalation). Calculated and measured isodose lines and profiles are in very good agreement. For each starting phase, the dose distributions exhibit significant differences but are accurately calculated with the methodology presented in this work.

  6. How much is a word? Predicting ease of articulation planning from apraxic speech error patterns.

    PubMed

    Ziegler, Wolfram; Aichert, Ingrid

    2015-08-01

    According to intuitive concepts, 'ease of articulation' is influenced by factors like word length or the presence of consonant clusters in an utterance. Imaging studies of speech motor control use these factors to systematically tax the speech motor system. Evidence from apraxia of speech, a disorder supposed to result from speech motor planning impairment after lesions to speech motor centers in the left hemisphere, supports the relevance of these and other factors in disordered speech planning and the genesis of apraxic speech errors. Yet, there is no unified account of the structural properties rendering a word easy or difficult to pronounce. Our aim was to model the motor planning demands of word articulation with a nonlinear regression model trained to predict the likelihood of accurate word production in apraxia of speech. We used a tree-structure model in which vocal tract gestures are embedded in hierarchically nested prosodic domains to derive a recursive set of terms for the computation of the likelihood of accurate word production. The model was trained with accuracy data from a set of 136 words averaged over 66 samples from apraxic speakers. In a second step, the model coefficients were used to predict a test dataset of accuracy values for 96 new words, averaged over 120 samples produced by a different group of apraxic speakers. Accurate modeling of the first dataset was achieved in the training study (R²adj = .71). In the cross-validation, the test dataset was predicted with high accuracy as well (R²adj = .67). The model shape, as reflected by the coefficient estimates, was consistent with current phonetic theories and with clinical evidence. In accordance with phonetic and psycholinguistic work, a strong influence of word stress on articulation errors was found. The proposed model provides a unified and transparent account of the motor planning requirements of word articulation. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. Methods for landslide susceptibility modelling in Lower Austria

    NASA Astrophysics Data System (ADS)

    Bell, Rainer; Petschko, Helene; Glade, Thomas; Leopold, Philip; Heiss, Gerhard; Proske, Herwig; Granica, Klaus; Schweigl, Joachim; Pomaroli, Gilbert

    2010-05-01

    Landslide susceptibility modelling and implementation of the resulting maps is still a challenge for geoscientists, spatial and infrastructure planners. Particularly on a regional scale landslide processes and their dynamics are poorly understood. Furthermore, the availability of appropriate spatial data in high resolution is often a limiting factor for modelling high quality landslide susceptibility maps for large study areas. However, these maps form an important basis for preventive spatial planning measures. Thus, new methods have to be developed, especially focussing on the implementation of final maps into spatial planning processes. The main objective of the project "MoNOE" (Method development for landslide susceptibility modelling in Lower Austria) is to design a method for landslide susceptibility modelling for a large study area (about 10,200 km²) and to produce landslide susceptibility maps which are finally implemented in the spatial planning strategies of the Federal state of Lower Austria. The project focuses primarily on the landslide types fall and slide. To enable susceptibility modelling, landslide inventories for the respective landslide types must be compiled and relevant data has to be gathered, prepared and homogenized. Based on this data new methods must be developed to tackle the needs of the spatial planning strategies. Considerable efforts will also be spent on the validation of the resulting maps for each landslide type. A great challenge will be the combination of the susceptibility maps for slides and falls in just one single susceptibility map (which is requested by the government) and the definition of the final visualisation. Since numerous landslides have been favoured or even triggered by human impact, the human influence on landslides will also have to be investigated. Furthermore possibilities to integrate respective findings in regional susceptibility modelling will be explored. According to these objectives the project is structured in four work packages namely data preparation and homogenization (WP1), susceptibility modelling and validation (WP2), integrative susceptibility assessment (WP3) and human impact (WP4). The expected results are a landslide inventory map covering all endangered parts of the Federal state of Lower Austria, a land cover map of Lower Austria with high spatial resolution, processed spatial input data and an optimized integrative susceptibility map visualized at a scale of 1:25,000. The structure of the research project, research strategies as well as first results will be presented at the conference. The project is funded by the Federal state government of Lower Austria.

  8. Using face validity to recognize empirical community observations.

    PubMed

    Gaber, John; Gaber, Sharon L

    2010-05-01

    There is a growing interest among international planning scholars in exploring community participation in the plan-making process from a qualitative research approach. In this paper the research assessment tool "face validity" is discussed as one way to help planners decipher when the community is sharing empirically grounded observations that can advance the applicability of the plan-making process. Face validity provides a common-sense assessment of research conclusions. It allows the assessor to look at an entire research project and ask: "on the face of things, does this research make sense?" When planners listen to citizen comments with an ear for face validity observations, they hold open the opportunity for government to learn empirically from the community and to see whether they "got it right" and, if not, to chart a course for how they can get it right. Copyright 2009 Elsevier Ltd. All rights reserved.

  9. Draft Plan for Characterizing Commercial Data Products in Support of Earth Science Research

    NASA Technical Reports Server (NTRS)

    Ryan, Robert E.; Terrie, Greg; Berglund, Judith

    2006-01-01

    This presentation introduces a draft plan for characterizing commercial data products for Earth science research. The general approach to the commercial product verification and validation includes focused selection of readily available commercial remote sensing products that support Earth science research. Ongoing product verification and characterization will question whether the product meets specifications and will examine its fundamental properties, potential and limitations. Validation will encourage product evaluation for specific science research and applications. Specific commercial products included in the characterization plan include high-spatial-resolution multispectral (HSMS) imagery and LIDAR data products. Future efforts in this process will include briefing NASA headquarters and modifying plans based on feedback, increased engagement with the science community and refinement of details, coordination with commercial vendors and the Joint Agency Commercial Imagery Evaluation (JACIE) for HSMS satellite acquisitions, acquiring waveform LIDAR data, and performing verification and validation.

  10. Bridging groundwater models and decision support with a Bayesian network

    USGS Publications Warehouse

    Fienen, Michael N.; Masterson, John P.; Plant, Nathaniel G.; Gutierrez, Benjamin T.; Thieler, E. Robert

    2013-01-01

    Resource managers need to make decisions to plan for future environmental conditions, particularly sea level rise, in the face of substantial uncertainty. Many interacting processes factor into the decisions they face. Advances in process models and the quantification of uncertainty have made models a valuable tool for this purpose. Long simulation runtimes and, often, numerical instability make linking process models impractical in many cases. A method for emulating the important connections between model input and forecasts, while propagating uncertainty, has the potential to provide a bridge between complicated numerical process models and the efficiency and stability needed for decision making. We explore this using a Bayesian network (BN) to emulate a groundwater flow model. We expand on previous approaches to validating a BN by calculating forecasting skill using cross validation of a groundwater model of Assateague Island in Virginia and Maryland, USA. This BN emulation was shown to capture the important groundwater-flow characteristics and uncertainty of the groundwater system because of its connection to island morphology and sea level. Forecast power metrics associated with the validation of multiple alternative BN designs guided the selection of an optimal level of BN complexity. Assateague Island is an ideal test case for exploring a forecasting tool based on current conditions because the unique hydrogeomorphological variability of the island includes a range of settings indicative of past, current, and future conditions. The resulting BN is a valuable tool for exploring the response of groundwater conditions to sea level rise in decision support.
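
    A rough, hypothetical sketch of the kind of cross-validated forecast skill used to compare alternative BN designs: a tiny conditional-probability-table "emulator" is scored by held-out log-likelihood. The variables, discretizations and data are all invented for illustration; the paper's BN and groundwater model are far richer.

        import numpy as np
        from sklearn.model_selection import KFold

        # Hypothetical training table: discretized inputs (e.g., recharge class,
        # distance-to-shore class, sea level class) paired with a discretized
        # water-table response simulated by the groundwater model.
        rng = np.random.default_rng(1)
        X = rng.integers(0, 3, size=(300, 3))                      # 3 inputs, 3 classes each
        y = (X.sum(axis=1) + rng.integers(0, 2, size=300)) // 3    # response class, 0..2

        def fit_cpt(X, y, n_classes=4):
            """Estimate P(response | inputs) as a conditional probability table."""
            cpt = {}
            for xi, yi in zip(map(tuple, X), y):
                cpt.setdefault(xi, np.ones(n_classes))   # Laplace smoothing
                cpt[xi][yi] += 1
            return {k: v / v.sum() for k, v in cpt.items()}

        # k-fold cross-validated log-likelihood as a simple "forecast skill" metric,
        # in the spirit of the paper's comparison of alternative BN designs.
        skill = []
        for tr, te in KFold(n_splits=5, shuffle=True, random_state=0).split(X):
            cpt = fit_cpt(X[tr], y[tr])
            ll = [np.log(cpt.get(tuple(x), np.full(4, 0.25))[t])
                  for x, t in zip(X[te], y[te])]
            skill.append(np.mean(ll))
        print(f"mean held-out log-likelihood: {np.mean(skill):.3f}")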

  11. Uniting statistical and individual-based approaches for animal movement modelling.

    PubMed

    Latombe, Guillaume; Parrott, Lael; Basille, Mathieu; Fortin, Daniel

    2014-01-01

    The dynamic nature of their internal states and the environment directly shapes animals' spatial behaviours and gives rise to emergent properties at broader scales in natural systems. However, integrating these dynamic features into habitat selection studies remains challenging, due to the practical impossibility of accessing internal states through field work and the inability of current statistical models to produce dynamic outputs. To address these issues, we developed a robust method which combines statistical modelling and an individual-based model (IBM). Using a statistical technique for forward modelling of the IBM has the advantage of being faster to parameterize than a pure inverse modelling technique and allows for robust selection of parameters. Using GPS locations from caribou monitored in Québec, caribou movements were modelled based on generative mechanisms accounting for dynamic variables at a low level of emergence. These variables were accessed by replicating real individuals' movements in parallel sub-models, and movement parameters were then empirically parameterized using Step Selection Functions. The final IBM was validated using both k-fold cross-validation and emergent-pattern validation, and was tested for two different scenarios with varying hardwood encroachment. Our results highlighted a functional response in habitat selection, which suggests that our method was able to capture the complexity of the natural system and adequately provide projections on future possible states of the system in response to different management plans. This is especially relevant for testing the long-term impact of scenarios corresponding to environmental configurations that have yet to be observed in real systems.

  13. Measuring the levels of noise at the İstanbul Atatürk Airport and comparisons with model simulations.

    PubMed

    Sari, Deniz; Ozkurt, Nesimi; Akdag, Ali; Kutukoglu, Murat; Gurarslan, Aliye

    2014-06-01

    Airport noise and its impact on the surrounding areas are major issues in the aviation industry. The İstanbul Atatürk Airport is a major global airport whose passenger numbers are increasing rapidly each year. The noise levels for day, evening and night times were modeled around the İstanbul Atatürk Airport according to the European Noise Directive, using the actual data records for the year 2011. The "ECAC Doc. 29-Interim" method was used for the computation of the aircraft traffic noise. In setting up the noise model, the local airport topography was taken into consideration together with the noise source data, the airport loadings, the features of the aircraft and the actual air traffic data. Model results were compared with long-term noise measurement values for calibration. Based on the calibration results, the classifications of the aircraft types and flight tracks were revised. For noise model validation, daily noise measurements at four additional locations were used during the verification period. The input data were re-edited only for these periods and the model was validated. A successful model performance was obtained in several zones around the airport. The validated noise model of the İstanbul Atatürk Airport can now be used both for determining future noise levels and for producing new strategies for land use planning, operational considerations in air traffic management and noise abatement procedures. Crown Copyright © 2013. All rights reserved.

  14. Cross-cultural validity of the theory of planned behavior for predicting healthy food choice in secondary school students of Inner Mongolia.

    PubMed

    Shimazaki, Takashi; Bao, Hugejiletu; Deli, Geer; Uechi, Hiroaki; Lee, Ying-Hua; Miura, Kayo; Takenaka, Koji

    2017-11-01

    Unhealthy eating behavior is a serious health concern among secondary school students in Inner Mongolia. To predict their healthy food choices and devise methods of correcting unhealthy choices, we sought to confirm the cross-cultural validity of the theory of planned behavior among Inner Mongolian students. A cross-sectional study was conducted between November and December 2014. Overall, 3047 students were enrolled. We devised a questionnaire based on the theory of planned behavior to measure its components (intentions, attitudes, subjective norms, and perceived behavioral control) in relation to healthy food choices; we also assessed their current engagement in healthy food choices. A principal component analysis revealed high contribution rates for the components (69.32%-88.77%). A confirmatory factor analysis indicated that the components of the questionnaire had adequate model fit (goodness-of-fit index = 0.997, adjusted goodness-of-fit index = 0.984, comparative fit index = 0.998, and root mean square error of approximation = 0.049). Notably, data from participants within the suburbs did not support the theory of planned behavior structure, and several paths did not predict the hypothesized variables. However, attitudes toward healthy food choices strongly predicted behavioral intention (path coefficients 0.49-0.77, p < 0.01), regardless of demographic characteristics. Our results indicate that the theory of planned behavior can apply to secondary school students in urban areas. Furthermore, attitudes toward healthy food choices were the best predictor of behavioral intentions to engage in such choices among Inner Mongolian students. Copyright © 2017 Diabetes India. Published by Elsevier Ltd. All rights reserved.
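
    The study fits a confirmatory factor/path model; as a rough, hypothetical illustration of estimating the attitude-to-intention path the abstract highlights, an ordinary least squares regression on simulated component scores is sketched below. Column names, weights and data are all invented.

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        # Hypothetical standardized scores on the TPB components.
        rng = np.random.default_rng(2)
        n = 300
        df = pd.DataFrame({
            "attitude": rng.normal(0, 1, n),
            "subjective_norm": rng.normal(0, 1, n),
            "perceived_control": rng.normal(0, 1, n),
        })
        # Simulate the headline finding: attitude dominates intention.
        df["intention"] = (0.6 * df["attitude"] + 0.2 * df["subjective_norm"]
                           + 0.1 * df["perceived_control"] + rng.normal(0, 0.5, n))

        predictors = sm.add_constant(df[["attitude", "subjective_norm", "perceived_control"]])
        model = sm.OLS(df["intention"], predictors).fit()
        print(model.params)   # path-style coefficients for each TPB component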

  15. A Comprehensive Plan for the Long-Term Calibration and Validation of Oceanic Biogeochemical Satellite Data

    NASA Technical Reports Server (NTRS)

    Hooker, Stanford B.; McClain, Charles R.; Mannino, Antonio

    2007-01-01

    The primary objective of this planning document is to establish a long-term capability for calibrating and validating oceanic biogeochemical satellite data. It is a pragmatic solution to a practical problem, based primarily on the lessons learned from prior satellite missions. All of the plan's elements are seen to be interdependent, so a horizontal organizational scheme is anticipated wherein the overall leadership comes from the NASA Ocean Biology and Biogeochemistry (OBB) Program Manager and the entire enterprise is split into two components of equal stature: calibration and validation, plus satellite data processing. The detailed elements of the activity are based on the basic tasks of the two main components plus the current objectives of the Carbon Cycle and Ecosystems Roadmap. The former is distinguished by an internal core set of responsibilities, and the latter is facilitated through an external connecting-core ring of competed or contracted activities. The core elements for the calibration and validation component include a) publish protocols and performance metrics; b) verify uncertainty budgets; c) manage the development and evaluation of instrumentation; and d) coordinate international partnerships. The core elements for the satellite data processing component are e) process and reprocess multisensor data; f) acquire, distribute, and archive data products; and g) implement new data products. Both components have shared responsibilities for initializing and temporally monitoring satellite calibration. Connecting-core elements include (but are not restricted to) atmospheric correction and characterization, standards and traceability, instrument and analysis round robins, field campaigns and vicarious calibration sites, in situ databases, bio-optical algorithm (and product) validation, satellite characterization and vicarious calibration, and image processing software. The plan also includes an accountability process, the creation of a Calibration and Validation Team (to help manage the activity), and a discussion of issues associated with the plan's scientific focus.

  16. SU-E-T-626: Accuracy of Dose Calculation Algorithms in MultiPlan Treatment Planning System in Presence of Heterogeneities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moignier, C; Huet, C; Barraux, V

    Purpose: Advanced stereotactic radiotherapy (SRT) treatments require accurate dose calculation for treatment planning, especially for treatment sites involving heterogeneous patient anatomy. The purpose of this study was to evaluate the accuracy of the dose calculation algorithms, Raytracing and Monte Carlo (MC), implemented in the MultiPlan treatment planning system (TPS) in the presence of heterogeneities. Methods: First, the LINAC of a CyberKnife radiotherapy facility was modeled with the PENELOPE MC code. A protocol for the measurement of dose distributions with EBT3 films was established and validated through comparison between experimental dose distributions and calculated dose distributions obtained with the MultiPlan Raytracing and MC algorithms, as well as with the PENELOPE MC model, for treatments planned with the homogeneous Easycube phantom. Finally, bone and lung inserts were used to set up a heterogeneous Easycube phantom. Treatment plans with 10, 7.5 or 5 mm field sizes were generated in the MultiPlan TPS with different tumor localizations (in the lung and at the lung/bone/soft tissue interface). Experimental dose distributions were compared to the PENELOPE MC and MultiPlan calculations using the gamma index method. Results: Regarding the experiment in the homogeneous phantom, 100% of the points passed the 3%/3 mm tolerance criteria. These criteria include the global error of the method (CT-scan resolution, EBT3 dosimetry, LINAC positioning, etc.) and were used afterwards to estimate the accuracy of the MultiPlan algorithms in heterogeneous media. Comparison of the dose distributions obtained in the heterogeneous phantom is in progress. Conclusion: This work has led to the development of numerical and experimental dosimetric tools for small beam dosimetry. The Raytracing and MC algorithms implemented in the MultiPlan TPS were evaluated in heterogeneous media.
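
    A minimal, educational sketch of a global gamma-index comparison (3%/3 mm) of the kind used above, on a toy 1-D profile. Clinical implementations interpolate finely in 2-D/3-D and handle normalization, thresholds and search radii more carefully; the profiles below are invented.

        import numpy as np

        def gamma_index_1d(dose_ref, dose_eval, positions, dta_mm=3.0, dd_pct=3.0):
            """Simplified global 1-D gamma: for each reference point, take the
            minimum combined dose-difference / distance-to-agreement metric over
            the evaluated profile. Educational sketch, not clinical code."""
            dd = dd_pct / 100.0 * dose_ref.max()      # global dose criterion
            gammas = np.empty_like(dose_ref)
            for i, (x_r, d_r) in enumerate(zip(positions, dose_ref)):
                dist2 = ((positions - x_r) / dta_mm) ** 2
                dose2 = ((dose_eval - d_r) / dd) ** 2
                gammas[i] = np.sqrt((dist2 + dose2).min())
            return gammas

        x = np.linspace(-20, 20, 201)                 # mm
        ref = np.exp(-(x / 8.0) ** 2)                 # toy "measured" profile
        ev = np.exp(-((x - 0.5) / 8.2) ** 2)          # toy "calculated" profile
        g = gamma_index_1d(ref, ev, x)
        print(f"gamma pass rate (3%/3 mm): {100 * (g <= 1).mean():.1f}%")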

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dinh, Nam; Athe, Paridhi; Jones, Christopher

    The Virtual Environment for Reactor Applications (VERA) code suite is assessed in terms of capability and credibility against the Consortium for Advanced Simulation of Light Water Reactors (CASL) Verification and Validation Plan (presented herein) in the context of three selected challenge problems: CRUD-Induced Power Shift (CIPS), Departure from Nucleate Boiling (DNB), and Pellet-Clad Interaction (PCI). Capability refers to evidence of the required functionality for capturing phenomena of interest, while credibility refers to the evidence that provides confidence in the calculated results. For this assessment, each challenge problem defines a set of phenomenological requirements against which the VERA software is assessed. This approach, in turn, enables the focused assessment of only those capabilities relevant to the challenge problem. The evaluation of VERA against the challenge problem requirements represents a capability assessment. The mechanism for assessment is the Sandia-developed Predictive Capability Maturity Model (PCMM) that, for this assessment, evaluates VERA on 8 major criteria: (1) Representation and Geometric Fidelity, (2) Physics and Material Model Fidelity, (3) Software Quality Assurance and Engineering, (4) Code Verification, (5) Solution Verification, (6) Separate Effects Model Validation, (7) Integral Effects Model Validation, and (8) Uncertainty Quantification. For each attribute, a maturity score from zero to three is assigned in the context of each challenge problem. The evaluation of these eight elements constitutes the credibility assessment for VERA.

  18. Evaluation of the AnnAGNPS Model for Predicting Runoff and Nutrient Export in a Typical Small Watershed in the Hilly Region of Taihu Lake.

    PubMed

    Luo, Chuan; Li, Zhaofu; Li, Hengpeng; Chen, Xiaomin

    2015-09-02

    The application of hydrological and water quality models is an efficient approach to better understand the processes of environmental deterioration. This study evaluated the ability of the Annualized Agricultural Non-Point Source (AnnAGNPS) model to predict runoff, total nitrogen (TN) and total phosphorus (TP) loading in a typical small watershed of a hilly region near Taihu Lake, China. Runoff was calibrated and validated at both annual and monthly scales, and parameter sensitivity analysis was performed for TN and TP before the two water quality components were calibrated. The results showed that the model satisfactorily simulated runoff at annual and monthly scales, during both the calibration and validation processes. Additionally, the parameter sensitivity analysis showed that TN output was most sensitive to the parameters Fertilizer rate, Fertilizer organic, Canopy cover and Fertilizer inorganic. For TP, the parameters Residue mass ratio, Fertilizer rate, Fertilizer inorganic and Canopy cover were the most sensitive. Calibration was performed on the basis of these sensitive parameters. TN loading produced satisfactory results for both the calibration and validation processes, whereas the performance of TP loading was slightly poorer. The simulation results showed that AnnAGNPS has the potential to be used as a valuable tool for the planning and management of watersheds.
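
    The abstract does not name its goodness-of-fit criterion; the Nash-Sutcliffe efficiency sketched below is a standard choice for judging whether runoff is "satisfactorily simulated" and is shown only as a plausible stand-in, on toy data.

        import numpy as np

        def nash_sutcliffe(observed, simulated):
            """Nash-Sutcliffe efficiency: 1 = perfect agreement; values below 0
            mean the model predicts worse than the observed mean."""
            observed, simulated = np.asarray(observed), np.asarray(simulated)
            return 1.0 - (np.sum((observed - simulated) ** 2)
                          / np.sum((observed - observed.mean()) ** 2))

        # Toy monthly runoff series (mm), purely illustrative.
        obs = np.array([12.0, 30.5, 55.2, 80.1, 60.3, 25.4])
        sim = np.array([10.8, 33.0, 50.9, 77.5, 64.0, 28.1])
        print(f"NSE = {nash_sutcliffe(obs, sim):.2f}")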

  19. On the validity of the incremental approach to estimate the impact of cities on air quality

    NASA Astrophysics Data System (ADS)

    Thunis, Philippe

    2018-01-01

    The question of how much cities are the sources of their own air pollution is not only theoretical: it is critical to the design of effective strategies for urban air quality planning. In this work, we assess the validity of the commonly used incremental approach to estimate the likely impact of cities on their air pollution. With the incremental approach, the city impact (i.e. the concentration change generated by the city emissions) is estimated as the concentration difference between a rural background and an urban background location, also known as the urban increment. We show that the city impact is in reality made up of the urban increment and two additional components, and consequently two assumptions need to be fulfilled for the urban increment to be representative of the urban impact. The first assumption is that the rural background location is not influenced by emissions from within the city, whereas the second requires that background concentration levels, obtained with zero city emissions, are equal at both locations. Because the urban impact is not measurable, the SHERPA modelling approach, based on a full air quality modelling system, is used in this work to assess the validity of these assumptions for some European cities. Results indicate that for PM2.5 these two assumptions are far from being fulfilled for many large and medium-sized cities. For such cities, urban increments largely underestimate city impacts. Although results are in better agreement for NO2, similar issues arise. In many situations the incremental approach is therefore not an adequate estimate of the urban impact on air pollution. This poses issues of interpretation when these increments are used to define strategic options for air quality planning. We finally illustrate the interest of comparing modelled and measured increments to improve our confidence in the model results.
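
    The decomposition the abstract argues for can be written compactly. The notation below is assumed here (it is not the paper's): C_u and C_r are the concentrations at the urban and rural background locations, and a superscript 0 marks the counterfactual with city emissions set to zero.

        % Decomposition of the city impact into the urban increment plus two residual terms:
        \[
        \underbrace{C_u - C_u^0}_{\text{city impact}}
          = \underbrace{(C_u - C_r)}_{\text{urban increment}}
          + \underbrace{(C_r - C_r^0)}_{\text{city influence on rural site}}
          + \underbrace{(C_r^0 - C_u^0)}_{\text{zero-emission background difference}}
        \]
        % The increment equals the impact only when the last two terms vanish,
        % i.e. exactly under the abstract's two assumptions.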

  20. Predicting the Operational Acceptability of Route Advisories

    NASA Technical Reports Server (NTRS)

    Evans, Antony; Lee, Paul

    2017-01-01

    NASA envisions a future Air Traffic Management system that allows safe, efficient growth in global operations, enabled by increasing levels of automation and autonomy. In a safety-critical system, the introduction of increasing automation and autonomy has to be done in stages, making human-system integrated concepts critical for the foreseeable future. One example where this is relevant is for tools that generate more efficient flight routings or reroute advisories. If these routes are not operationally acceptable, they will be rejected by human operators, and the associated benefits will not be realized. Operational acceptance is therefore required to enable the increased efficiency and reduced workload benefits associated with these tools. In this paper, the authors develop a predictor of operational acceptability for reroute advisories. Such a capability has applications in tools that identify more efficient routings around weather and congestion and that better meet airline preferences. The capability is based on applying data mining techniques to flight plan amendment data reported by the Federal Aviation Administration and to data on requested reroutes collected from a field trial of the NASA-developed Dynamic Weather Routes tool, which advised efficient route changes to American Airlines dispatchers in 2014. Ten-fold cross-validation was used for feature, model and parameter selection, while nested cross-validation was used to validate the model. The model performed well in predicting controller acceptance or rejection of a route change, as indicated by the chosen performance metrics. Features identified as relevant to controller acceptance included the historical usage of the advised route, the location of the maneuver start point relative to the boundaries of the airspace sector containing the maneuver start (the maneuver start sector), the reroute deviation from the original flight plan, and the demand level in the maneuver start sector. A random forest with forty trees was the best performing of the five models evaluated in this paper.
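
    A minimal sketch of this kind of classifier evaluation: a forty-tree random forest scored by 10-fold cross-validation over synthetic stand-ins for the four features named above. The real study also used nested cross-validation for selection; everything below is invented for illustration.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(3)
        n = 1000
        X = np.column_stack([
            rng.poisson(5, n),        # historical usage of the advised route
            rng.uniform(0, 50, n),    # start point distance to sector boundary
            rng.uniform(0, 100, n),   # deviation from the original flight plan
            rng.integers(0, 20, n),   # demand level in the maneuver start sector
        ])
        # Synthetic labels: 1 = accepted, 0 = rejected.
        y = ((X[:, 0] > 3) & (X[:, 2] < 60) & (rng.random(n) > 0.1)).astype(int)

        # Forty trees, mirroring the best-performing model reported above.
        clf = RandomForestClassifier(n_estimators=40, random_state=0)
        scores = cross_val_score(clf, X, y, cv=10)
        print(f"10-fold CV accuracy: {scores.mean():.3f}")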

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liang, Bin; Li, Yongbao; Liu, Bo

    Purpose: The CyberKnife system was initially equipped with fixed circular cones for stereotactic radiosurgery. Two dose calculation algorithms, Ray-Tracing and Monte Carlo, are available in the supplied treatment planning system. A multileaf collimator system was recently introduced in the latest generation of the system, enabling arbitrarily shaped treatment fields. The purpose of this study is to develop a model-based dose calculation algorithm to better handle the lateral scatter in an irregularly shaped small field for the CyberKnife system. Methods: A pencil beam dose calculation algorithm widely used in linac-based treatment planning systems was modified. The kernel parameters and intensity profile were systematically determined by fitting to the commissioning data. The model was tuned using only a subset of the measured data (4 out of 12 cones) and applied to all fixed circular cones for evaluation. The root mean square (RMS) of the difference between the measured and calculated tissue-phantom ratios (TPRs) and off-center ratios (OCRs) was compared. Three cone size correction techniques were developed to better fit the OCRs in the penumbra region, and these were further evaluated against the output factors (OFs). The pencil beam model was further validated against measurement data on the variable dodecagon-shaped Iris collimators and a half-beam-blocked field. Comparison with the Ray-Tracing and Monte Carlo methods was also performed on a lung SBRT case. Results: The RMS between the measured and calculated TPRs is 0.7% averaged over all cones, with the descending region at 0.5%. The RMSs of the OCR in the infield and outfield regions are both at 0.5%. The distance to agreement (DTA) in the OCR penumbra region is 0.2 mm. All three cone size correction models achieve the same improvement in OCR agreement, with the effective source shift model (SSM) preferred due to its ability to predict more accurately the OF variations with the source-to-axis distance (SAD). In the noncircular field validation, the pencil beam calculated results agreed well with the film measurements of both the Iris collimators and the half-beam-blocked field, and fared much better than the Ray-Tracing calculation. Conclusions: The authors have developed a pencil beam dose calculation model for the CyberKnife system. The dose calculation accuracy is better than that of the standard linac-based system because the model parameters were specifically tuned to the CyberKnife system and geometry correction factors. The model better handles the lateral scatter and has the potential to be used for irregularly shaped fields. Comprehensive validation on MLC-equipped systems is necessary for its clinical implementation. It is fast enough to be used during plan optimization.
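
    A toy illustration of the pencil-beam idea described above: dose is approximated by convolving the fluence transmitted through the aperture with a lateral scatter kernel. The Gaussian kernel, field shape and all numbers are assumptions for the sketch; the paper's kernel parameters are fitted to commissioning data.

        import numpy as np
        from scipy.signal import fftconvolve

        grid = np.linspace(-30, 30, 301)              # mm
        xx, yy = np.meshgrid(grid, grid)

        # Fluence through a square aperture standing in for an MLC-shaped field.
        fluence = ((np.abs(xx) < 10) & (np.abs(yy) < 10)).astype(float)

        # Radially symmetric lateral scatter kernel (width assumed).
        sigma = 2.5                                   # mm
        kernel = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
        kernel /= kernel.sum()

        # Pencil-beam step: dose = fluence convolved with the kernel.
        dose = fftconvolve(fluence, kernel, mode="same")
        print(f"central-axis dose: {dose[150, 150]:.3f} (relative units)")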

  2. Markov Jump-Linear Performance Models for Recoverable Flight Control Computers

    NASA Technical Reports Server (NTRS)

    Zhang, Hong; Gray, W. Steven; Gonzalez, Oscar R.

    2004-01-01

    Single event upsets in digital flight control hardware induced by atmospheric neutrons can reduce system performance and possibly introduce a safety hazard. One method currently under investigation to help mitigate the effects of these upsets is NASA Langley's Recoverable Computer System. In this paper, a Markov jump-linear model is developed for a recoverable flight control system, which will be validated using data from future experiments with simulated and real neutron environments. The method of tracking error analysis and the plan for the experiments are also described.
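
    For readers unfamiliar with the formalism, a minimal sketch of a Markov jump-linear system: the state evolves under a mode-dependent linear map while the mode follows a Markov chain, here loosely standing in for upset arrival and recovery. All matrices and probabilities below are invented for illustration.

        import numpy as np

        A = {0: np.array([[0.9, 0.1], [0.0, 0.8]]),    # nominal closed-loop dynamics
             1: np.array([[1.0, 0.3], [0.0, 0.95]])}   # degraded/recovery-mode dynamics
        P = np.array([[0.99, 0.01],                     # P[i, j] = Pr(next mode = j | mode = i)
                      [0.20, 0.80]])

        rng = np.random.default_rng(4)
        x, mode = np.array([1.0, 0.0]), 0
        trajectory = []
        for _ in range(200):
            x = A[mode] @ x                            # jump-linear state update
            trajectory.append(np.linalg.norm(x))       # proxy for tracking-error magnitude
            mode = rng.choice(2, p=P[mode])            # Markov mode transition
        print(f"final error norm: {trajectory[-1]:.4f}")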

  3. Investigating the Mechanisms of Action and the Identification of Breast Carcinogens by Computational Analysis of Female Rodent Carcinogens

    DTIC Science & Technology

    2005-08-01

    QSAR in Environmental Research was accepted and published in April of 2005. The manuscript described the cat-SAR program in detail. We note the...analysis of this data yielded a very good model. As such, this was a suitable dataset on which to develop and test the cat-SAR program. A copy of the...developed and validated (i.e., a-c) as planned in MCASE and then with the cat-SAR program. We have also updated rodent carcinogenicity models so that

  4. Simulation Modeling for Off-Nominal Conditions - Where Are We Today?

    NASA Technical Reports Server (NTRS)

    Shah, Gautam H.; Foster, John V.; Cunningham, Kevin

    2010-01-01

    The modeling of aircraft flight characteristics in off-nominal or otherwise adverse conditions has become increasingly important for simulation in the loss-of-control arena. Adverse conditions include environmentally-induced upsets such as wind shear or wake vortex encounters; off-nominal flight conditions, such as stall or departure; on-board systems failures; and structural failures or aircraft damage. Spirited discussions in the research community are taking place as to the fidelity and data requirements for adequate representation of vehicle dynamics under such conditions for a host of research areas, including recovery training, flight controls development, trajectory guidance/planning, and envelope limiting. The increasing need for multiple sources of data (empirical, computational, experimental) for modeling across a larger flight envelope leads to challenges in developing methods of appropriately applying or combining such data, particularly in a dynamic flight environment with a physically and/or aerodynamically asymmetric vehicle. Traditional simplifications and symmetry assumptions in current modeling methodology may no longer be valid. Furthermore, once modeled, challenges abound in the validation of flight dynamics characteristics in adverse flight regimes.

  5. Evaluation of the RUSLE and Disturbed WEPP erosion models for predicting soil loss in the first year after wildfire in NW Spain.

    PubMed

    Fernández, Cristina; Vega, José A

    2018-05-04

    Severe fire greatly increases soil erosion rates and overland flow in forest land. Soil erosion prediction models are essential for estimating fire impacts and planning post-fire emergency responses. We evaluated the performance of a) the Revised Universal Soil Loss Equation (RUSLE), modified by inclusion of an alternative equation for the soil erodibility factor, and b) the Disturbed WEPP model, by comparing the soil loss predicted by the models with the soil loss measured in the first year after wildfire in 44 experimental field plots in NW Spain. The Disturbed WEPP model has not previously been validated with field data for use in NW Spain; validation studies are also very scarce in other areas. We found that both models underestimated the erosion rates. The accuracy of the RUSLE model was low, even after inclusion of a modified soil erodibility factor accounting for high contents of soil organic matter. We conclude that neither model is suitable for predicting soil erosion in the first year after fire in NW Spain and suggest that soil burn severity should be given greater weighting in post-fire soil erosion modelling. Copyright © 2018 Elsevier Inc. All rights reserved.
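
    For reference, RUSLE predicts mean annual soil loss as the product of six factors; the sketch below encodes the standard equation, with the example factor values assumed for a post-fire setting (they are not the study's data).

        def rusle_soil_loss(R, K, LS, C, P):
            """RUSLE: A = R * K * LS * C * P, where A is the mean annual soil
            loss, R rainfall erosivity, K soil erodibility, LS the slope
            length/steepness factor, C cover-management and P support practice."""
            return R * K * LS * C * P

        # Illustrative post-fire values: high erosivity, steep slope, little cover.
        A = rusle_soil_loss(R=1500, K=0.03, LS=4.0, C=0.35, P=1.0)
        print(f"A = {A:.1f} t/ha/yr")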

  6. Modular modeling system for building distributed hydrologic models with a user-friendly software package

    NASA Astrophysics Data System (ADS)

    Wi, S.; Ray, P. A.; Brown, C.

    2015-12-01

    A software package developed to facilitate building distributed hydrologic models in a modular modeling system is presented. The software package provides a user-friendly graphical user interface that eases its practical use in water resources-related research and practice. The modular modeling system organizes the options available to users when assembling models according to the stages of hydrological cycle, such as potential evapotranspiration, soil moisture accounting, and snow/glacier melting processes. The software is intended to be a comprehensive tool that simplifies the task of developing, calibrating, validating, and using hydrologic models through the inclusion of intelligent automation to minimize user effort, and reduce opportunities for error. Processes so far automated include the definition of system boundaries (i.e., watershed delineation), climate and geographical input generation, and parameter calibration. Built-in post-processing toolkits greatly improve the functionality of the software as a decision support tool for water resources system management and planning. Example post-processing toolkits enable streamflow simulation at ungauged sites with predefined model parameters, and perform climate change risk assessment by means of the decision scaling approach. The software is validated through application to watersheds representing a variety of hydrologic regimes.

  7. Clinical decision tool for optimal delivery of liver stereotactic body radiation therapy: Photons versus protons.

    PubMed

    Gandhi, Saumil J; Liang, Xing; Ding, Xuanfeng; Zhu, Timothy C; Ben-Josef, Edgar; Plastaras, John P; Metz, James M; Both, Stefan; Apisarnthanarax, Smith

    2015-01-01

    Stereotactic body radiation therapy (SBRT) for treatment of liver tumors is often limited by liver dose constraints. Protons offer potential for more liver sparing, but clinical situations in which protons may be superior to photons are not well described. We developed and validated a treatment decision model to determine whether liver tumors of certain sizes and locations are more suited for photon versus proton SBRT. Six spherical mock tumors from 1 to 6 cm in diameter were contoured on computed tomography images of 1 patient at 4 locations: dome, caudal, left medial, and central. Photon and proton plans were generated to deliver 50 Gy in 5 fractions to each tumor and optimized to deliver equivalent target coverage and maximal liver sparing. Using these plans, we developed a hypothesis-generating model to predict the optimal modality for maximal liver sparing based on tumor size and location. We then validated this model in 10 patients with liver tumors. Protons spared significantly more liver than photons for dome or central tumors ≥3 cm (dome: 134 ± 21 cm³, P = .03; central: 108 ± 4 cm³, P = .01). Our model correctly predicted the optimal SBRT modality for all 10 patients. For patients with dome or central tumors ≥3 cm, protons significantly increased the volume of liver spared (176 ± 21 cm³, P = .01) and decreased the mean liver dose (8.4 vs 12.2 Gy, P = .01) while offering no significant advantage for tumors <3 cm at any location or for caudal and left medial tumors of any size. When feasible, protons should be considered as the radiation modality of choice for dome and central tumors >3 cm to allow maximal liver sparing and potentially reduce radiation toxicity. Protons should also be considered for any tumor >5 cm if photon plans fail to achieve adequate coverage or exceed the mean liver threshold. Copyright © 2015 American Society for Radiation Oncology. Published by Elsevier Inc. All rights reserved.
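
    A reading of the reported findings as a simple decision rule; this is a distillation for illustration, not the authors' published tool, and the 3 cm threshold follows the results section (the conclusion phrases it as >3 cm).

        def preferred_sbrt_modality(tumor_size_cm: float, location: str) -> str:
            """Protons spared more liver only for dome or central tumors >= 3 cm
            in the study; photons otherwise (per the abstract's results)."""
            if location in {"dome", "central"} and tumor_size_cm >= 3.0:
                return "protons"
            return "photons"

        for size, loc in [(2.0, "dome"), (4.0, "central"), (4.0, "caudal")]:
            print(f"{size} cm, {loc} -> {preferred_sbrt_modality(size, loc)}")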

  8. Integrated Modeling and Participatory Scenario Planning for Climate Adaptation: the Maui Groundwater Project

    NASA Astrophysics Data System (ADS)

    Keener, V. W.; Finucane, M.; Brewington, L.

    2014-12-01

    For the last century, the island of Maui, Hawaii, has been the center of environmental, agricultural, and legal conflict with respect to surface and groundwater allocation. Planning for adequate future freshwater resources requires flexible and adaptive policies that emphasize partnerships and knowledge transfer between scientists and non-scientists. In 2012 the Hawai'i state legislature passed the Climate Change Adaptation Priority Guidelines (Act 286) law, requiring county and state policy makers to include island-wide climate change scenarios in their planning processes. This research details the ongoing work by researchers in the NOAA-funded Pacific RISA to support the development of Hawaii's first island-wide water use plan under the new climate adaptation directive. This integrated project combines several models with participatory future scenario planning. The dynamically downscaled, triply nested Hawaii Regional Climate Model (HRCM) was modified from the WRF community model and calibrated to simulate the many microclimates of the Hawaiian archipelago. For the island of Maui, the HRCM was validated using 20 years of hindcast data, and daily projections were created at a 1 km scale to capture the steep topography and diverse rainfall regimes. Downscaled climate data are input into a USGS hydrological model to quantify groundwater recharge. This model was previously used for groundwater management and is being expanded utilizing future climate projections, current land use maps, and future scenario maps informed by stakeholder input. Participatory scenario planning began in 2012 to bring together a diverse group of over 50 decision-makers in government, conservation, and agriculture to 1) determine the type of information they would find helpful in planning for climate change, and 2) develop a set of scenarios that represent alternative climate/management futures. This is an iterative process, resulting in flexible and transparent narratives at multiple scales. The resulting climate, land use, and groundwater recharge maps give stakeholders a common set of future scenarios that they understand through the participatory scenario process, and identify the vulnerabilities, trade-offs, and adaptive priorities for different groundwater management and land uses in an uncertain future.

  9. US Navy Global and Regional Wave Modeling

    DTIC Science & Technology

    2014-09-01

    Future plans call for increasing the resolution to 0.5 degree, upgrading to WW3 version 4, and including the ...NAVOCEANO WW3 system is in the early stages, and a number of key shortcomings have been identified for future improvement. The multigrid system...J. Shriver, R. Helber, P. Spence, S. Carroll, O.M. Smedstad, and B. Lunde. 2011. Validation Test Report for the Navy Coupled Ocean

  10. Verification and Validation of Rural Propagation in the Sage 2.0 Simulation

    DTIC Science & Technology

    2016-08-01

    position unless so designated by other authorized documents. Citation of manufacturer's or trade names does not constitute an official endorsement or...Approved for public release; distribution unlimited. 1. Introduction The System of Systems Survivability Simulation (S4) is designed to be...materiel developers. The Sage model, part of the S4 simulation suite, has been developed primarily to support SLAD analysts in pretest planning and

  11. Modeling and Simulation Roadmap to Enhance Electrical Energy Security of U.S. Naval Bases

    DTIC Science & Technology

    2012-03-01

    evaluating power system architectures and technologies and, therefore, can become a valuable tool for the implementation of the described plan for Navy...a well validated and consistent process for evaluating power system architectures and technologies and, therefore, can be a valuable tool for the...process for evaluating power system architectures and component technologies is needed to support the development and implementation of these new

  12. Validation of BlueSky Smoke Prediction System using surface and satellite observations during major wildland fire events in Northern California

    Treesearch

    Lesley Fusina; Sharon Zhong; Julide Koracin; Tim Brown; Annie Esperanza; Leland Tarney; Haiganoush Preisler

    2007-01-01

    The BlueSky Smoke Prediction System developed by the U.S. Department of Agriculture, Forest Service, AirFire Team under the National Fire Plan is a modeling framework that integrates tools, knowledge of fuels, moisture, combustion, emissions, plume dynamics, and weather to produce real-time predictions of the cumulative impacts of smoke from wildfires, prescribed fires...

  13. A neural network-based algorithm for predicting stone-free status after ESWL therapy

    PubMed Central

    Seckiner, Ilker; Seckiner, Serap; Sen, Haluk; Bayrak, Omer; Dogan, Kazım; Erturhan, Sakip

    2017-01-01

    Objective: The prototype artificial neural network (ANN) model was developed using data from patients with renal stones, in order to predict stone-free status and to help in planning treatment with Extracorporeal Shock Wave Lithotripsy (ESWL) for kidney stones. Materials and Methods: Data on eleven variables were collected from the 203 patients, including gender, single or multiple nature of the stone, location of the stone, infundibulopelvic angle, primary or secondary nature of the stone, status of hydronephrosis, stone size after ESWL, age, size, skin-to-stone distance, stone density and creatinine. Regression analysis and the ANN method were applied to predict treatment success using the same series of data. Results: Subsequently, patients were divided into three groups by the neural network software in order to implement the ANN: a training group (n=139), a validation group (n=32), and a test group (n=32). ANN analysis demonstrated that the prediction accuracy of the stone-free rate was 99.25% in the training group, 85.48% in the validation group, and 88.70% in the test group. Conclusions: Successful results were obtained in predicting the stone-free rate with the help of the ANN model, which was designed using a series of data collected from real patients in whom ESWL was implemented, to help in planning treatment for kidney stones. PMID:28727384
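
    A minimal sketch of the same train/validation/test protocol (139/32/32) with a small neural network in scikit-learn. The data here are synthetic, and the network architecture is an assumption, not the paper's.

        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.neural_network import MLPClassifier
        from sklearn.preprocessing import StandardScaler
        from sklearn.pipeline import make_pipeline

        # Synthetic stand-in for the 203-patient dataset: 11 predictors,
        # binary stone-free outcome.
        rng = np.random.default_rng(5)
        X = rng.normal(size=(203, 11))
        y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.8, 203) > 0).astype(int)

        # Mirror the paper's split: 139 training, 32 validation, 32 test.
        X_tr, X_tmp, y_tr, y_tmp = train_test_split(X, y, test_size=64, random_state=0)
        X_val, X_te, y_val, y_te = train_test_split(X_tmp, y_tmp, test_size=32, random_state=0)

        ann = make_pipeline(StandardScaler(),
                            MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000,
                                          random_state=0))
        ann.fit(X_tr, y_tr)
        print(f"validation accuracy: {ann.score(X_val, y_val):.2%}")
        print(f"test accuracy:       {ann.score(X_te, y_te):.2%}")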

  14. Application of Psychological Theories in Agent-Based Modeling: The Case of the Theory of Planned Behavior.

    PubMed

    Scalco, Andrea; Ceschi, Andrea; Sartori, Riccardo

    2018-01-01

    It is likely that computer simulations will assume a greater role in the near future in investigating and understanding reality (Rand & Rust, 2011). In particular, agent-based models (ABMs) represent a method of investigating social phenomena that blends the knowledge of the social sciences with the advantages of virtual simulations. Within this context, the development of algorithms able to recreate the reasoning engine of autonomous virtual agents is one of the most fragile aspects, and it is crucial to ground such models in well-supported psychological theoretical frameworks. For this reason, the present work discusses the application of the theory of planned behavior (TPB; Ajzen, 1991) in the context of agent-based modeling: it is argued that this framework may be more helpful than others in developing a valid representation of human behavior in computer simulations. Accordingly, the current contribution considers issues related to the application of the TPB model inside computer simulations and suggests potential solutions, in the hope of helping to shorten the distance between the fields of psychology and computer science.
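    As a concrete illustration of how the TPB can drive an agent's reasoning engine, Ajzen's standard formulation combines attitude, subjective norm, and perceived behavioral control (PBC) linearly into a behavioral intention, with PBC also moderating actual performance. The sketch below is a toy construction rather than the authors' implementation; the weights and the stochastic execution rule are hypothetical:

      from dataclasses import dataclass
      import random

      @dataclass
      class TPBAgent:
          attitude: float          # evaluation of the behavior, 0..1
          subjective_norm: float   # perceived social pressure, 0..1
          pbc: float               # perceived behavioral control, 0..1
          w_att: float = 0.4       # hypothetical regression-style weights
          w_sn: float = 0.3
          w_pbc: float = 0.3

          def intention(self) -> float:
              # TPB: intention as a weighted sum of the three antecedents
              return (self.w_att * self.attitude
                      + self.w_sn * self.subjective_norm
                      + self.w_pbc * self.pbc)

          def acts(self) -> bool:
              # PBC gates actual performance of the behavior (assumed rule)
              return random.random() < self.intention() * self.pbc

      agent = TPBAgent(attitude=0.8, subjective_norm=0.6, pbc=0.7)
      print(agent.intention(), agent.acts())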

  15. Factors Influencing the Career Planning and Development of University Students in Jordan

    ERIC Educational Resources Information Center

    Khasawneh, Samer

    2010-01-01

    The purpose of this study was to translate and validate an Arabic version of the career influence inventory for use in Jordan. The study also investigated university students' perceptions of the factors that have influenced their career planning and development. The validated career influence inventory was administered to 558…

  16. The Unified Language Testing Plan: Speaking Proficiency Test. Russian Pilot Validation Studies. Report Number 2.

    ERIC Educational Resources Information Center

    Thornton, Julie A.

    The report describes one segment of the Federal Language Testing Board's Unified Language Testing Plan (ULTP), the validation of the speaking proficiency test in Russian. The ULTP is a project to increase standardization of foreign language proficiency measurement and promote sharing of resources among testing programs in the federal government.…

  17. The Unified Language Testing Plan: Speaking Proficiency Test. Spanish and English Pilot Validation Studies. Report Number 1.

    ERIC Educational Resources Information Center

    Thornton, Julie A.

    This report describes one segment of the Federal Language Testing Board's Unified Language Testing Plan (ULTP), the validation of speaking proficiency tests in Spanish and English. The ULTP is a project to increase standardization of foreign language proficiency measurement and promote sharing of resources among testing programs in the federal…

  18. Using the Theory of Planned Behavior to Explain and Predict Behavior Intentions in Taiwan

    ERIC Educational Resources Information Center

    Wu, Cheng-Lung

    2015-01-01

    This study uses the theory of planned behavior to examine undergraduates' behavioral intentions regarding participation in aquatic sports. Undergraduates in Taiwan served as the research subjects, and data were collected through a questionnaire survey. A total of 200 valid questionnaires were received out of 230, giving a valid response rate of…

  19. Streamlining Transportation Corridors Planning Processes and Validating the Application of Commercial Remote Sensing and Spatial Information (CRS-SI) Technologies for Environmental Impact Assessments

    DOT National Transportation Integrated Search

    2008-02-05

    The new US DOT RITA program has selected MSU for addressing corridor planning and environmental assessment in new and innovative ways that can be compared to traditional approaches. Our primary focus is on the application and validation of new and in...

  20. Automation of block assignment planning using a diagram-based scenario modeling method

    NASA Astrophysics Data System (ADS)

    Hwang, In Hyuck; Kim, Youngmin; Lee, Dong Kun; Shin, Jong Gye

    2014-03-01

    Most shipbuilding scheduling research so far has focused on the load level on the dock plan. This is because the dock is the least extendable resource in shipyards, and its overloading is difficult to resolve. However, once dock scheduling is completed, making a plan that makes the best use of the rest of the resources in the shipyard to minimize any additional cost is also important. Block assignment planning is one of the midterm planning tasks; it assigns a block to the facility (factory/shop or surface plate) that will actually manufacture the block according to the block characteristics and current situation of the facility. It is one of the most heavily loaded midterm planning tasks and is carried out manually by experienced workers. In this study, a method of representing the block assignment rules using a diagram was suggested through analysis of the existing manual process. A block allocation program was developed which automated the block assignment process according to the rules represented by the diagram. The planning scenario was validated through a case study that compared the manual assignment and two automated block assignment results.
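    The core idea, encoding the planners' assignment rules as data that a program walks through, can be sketched as a priority-ordered rule table with capacity checks. Everything below (block attributes, facility names, and the rules themselves) is hypothetical, standing in for the paper's diagram notation:

      # Each rule: (predicate over block attributes, preferred facility), checked in priority order.
      rules = [
          (lambda b: b["weight_t"] > 300,                 "dock-side surface plate"),
          (lambda b: b["curved"] and b["weight_t"] > 80,  "curved-block shop"),
          (lambda b: b["curved"],                         "panel shop B"),
          (lambda b: True,                                "panel shop A"),  # fallback
      ]

      # Remaining capacity per facility (hypothetical numbers).
      capacity = {"dock-side surface plate": 2, "curved-block shop": 3,
                  "panel shop B": 4, "panel shop A": 5}

      def assign(blocks):
          plan = {}
          for blk in blocks:  # blocks assumed pre-sorted by schedule priority
              for pred, fac in rules:
                  if pred(blk) and capacity[fac] > 0:
                      capacity[fac] -= 1
                      plan[blk["id"]] = fac
                      break
          return plan

      blocks = [{"id": "B01", "weight_t": 350, "curved": False},
                {"id": "B02", "weight_t": 90,  "curved": True}]
      print(assign(blocks))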

  1. Analysis and prediction of agricultural pest dynamics with Tiko'n, a generic tool to develop agroecological food web models

    NASA Astrophysics Data System (ADS)

    Malard, J. J.; Rojas, M.; Adamowski, J. F.; Anandaraja, N.; Tuy, H.; Melgar-Quiñonez, H.

    2016-12-01

    While several well-validated crop growth models are currently widely used, very few crop pest models of the same caliber have been developed or applied, and pest models that take trophic interactions into account are even rarer. This may be due to several factors, including 1) the difficulty of representing complex agroecological food webs in a quantifiable model, and 2) the general belief that pesticides effectively remove insect pests from immediate concern. However, pests currently claim a substantial amount of harvests every year (and account for additional control costs), and the impact of insects and of their trophic interactions on agricultural crops cannot be ignored, especially in the context of changing climates and increasing pressures on crops across the globe. Unfortunately, most integrated pest management frameworks rely on very simple models (if at all), and most examples of successful agroecological management remain more anecdotal than scientifically replicable. In light of this, there is a need for validated and robust agroecological food web models that allow users to predict the response of these webs to changes in management, crops, or climate, both to anticipate future pest problems under a changing climate and to develop effective integrated management plans. Here we present Tiko'n, a Python-based software whose API allows users to rapidly build and validate trophic web agroecological models that predict pest dynamics in the field. The programme uses a Bayesian inference approach to calibrate the models according to field data, allowing for the reuse of literature data from various sources and reducing the need for extensive field data collection. We apply the model to the coconut black-headed caterpillar (Opisina arenosella) and associated parasitoid data from Sri Lanka, showing how the modeling framework can be used to rapidly develop, calibrate and validate models that elucidate how the internal structures of food webs determine their behaviour and allow users to evaluate different integrated management options.
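    The calibration step described here, Bayesian inference of model parameters from field counts, can be illustrated independently of Tiko'n's actual API, which the abstract does not detail. The sketch below fits the attack-rate parameter of a classic Nicholson-Bailey host-parasitoid model to noisy observations with a plain Metropolis sampler; all numbers are illustrative:

      import numpy as np

      def nicholson_bailey(r, a, h0, p0, steps):
          """Discrete host-parasitoid dynamics; returns host densities over time."""
          h, p, out = h0, p0, []
          for _ in range(steps):
              esc = np.exp(-a * p)              # fraction of hosts escaping parasitism
              h, p = r * h * esc, h * (1.0 - esc)
              out.append(h)
          return np.array(out)

      rng = np.random.default_rng(1)
      true_a = 0.07
      obs = nicholson_bailey(2.0, true_a, h0=10.0, p0=5.0, steps=20) + rng.normal(0, 0.5, 20)

      def log_post(a):
          if not 0.0 < a < 1.0:                 # flat prior on (0, 1)
              return -np.inf
          pred = nicholson_bailey(2.0, a, 10.0, 5.0, 20)
          return -0.5 * np.sum((obs - pred) ** 2 / 0.5 ** 2)  # Gaussian likelihood

      a, samples = 0.2, []
      for _ in range(5000):                     # Metropolis random walk over the attack rate
          prop = a + rng.normal(0, 0.01)
          if np.log(rng.random()) < log_post(prop) - log_post(a):
              a = prop
          samples.append(a)
      print("posterior mean attack rate:", np.mean(samples[1000:]))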

  2. WE-B-304-03: Biological Treatment Planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Orton, C.

    The ultimate goal of radiotherapy treatment planning is to find a treatment that will yield a high tumor control probability (TCP) with an acceptable normal tissue complication probability (NTCP). Yet most treatment planning today is not based upon optimization of TCPs and NTCPs, but rather upon meeting physical dose and volume constraints defined by the planner. It has been suggested that treatment planning evaluation and optimization would be more effective if they were biologically and not dose/volume based, and this is the claim debated in this month’s Point/Counterpoint. After a brief overview of biologically and DVH based treatment planning by the Moderator Colin Orton, Joseph Deasy (for biological planning) and Charles Mayo (against biological planning) will begin the debate. Some of the arguments in support of biological planning include: (1) this will result in more effective dose distributions for many patients; (2) DVH-based measures of plan quality are known to have little predictive value; (3) there is little evidence that either D95 or D98 of the PTV is a good predictor of tumor control; and (4) sufficient validated outcome prediction models are now becoming available and should be used to drive planning and optimization. Some of the arguments against biological planning include: (1) several decades of experience with DVH-based planning should not be discarded; (2) we do not know enough about the reliability and errors associated with biological models; (3) the radiotherapy community in general has little direct experience with side-by-side comparisons of DVH vs. biological metrics and outcomes; and (4) it is unlikely that a clinician would accept extremely cold regions in a CTV or hot regions in a PTV, despite having acceptable TCP values. Learning Objectives: To understand dose/volume based treatment planning and its potential limitations; to understand biological metrics such as EUD, TCP, and NTCP; to understand biologically based treatment planning and its potential limitations.
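    For readers unfamiliar with the biological metrics named in the learning objectives, the standard textbook forms are easy to state: the generalized EUD over a DVH is gEUD = (Σ_i v_i d_i^a)^(1/a), the Lyman-Kutcher-Burman NTCP is a probit function of gEUD, and a Poisson TCP for a uniform dose D to N0 clonogens is exp(-N0·e^(-αD)). A minimal sketch follows; all parameter values are purely illustrative, not clinical:

      import numpy as np
      from math import erf, exp, sqrt

      def geud(doses, volumes, a):
          """Generalized EUD from a differential DVH (volumes normalized to sum to 1)."""
          v = np.asarray(volumes) / np.sum(volumes)
          return float(np.sum(v * np.asarray(doses) ** a) ** (1.0 / a))

      def ntcp_lkb(g, td50, m):
          """Lyman-Kutcher-Burman NTCP: probit in (gEUD - TD50) / (m * TD50)."""
          t = (g - td50) / (m * td50)
          return 0.5 * (1.0 + erf(t / sqrt(2.0)))

      def tcp_poisson(dose, n0, alpha):
          """Poisson TCP for a uniform dose to n0 clonogens with radiosensitivity alpha."""
          return exp(-n0 * exp(-alpha * dose))

      doses, volumes = [20.0, 40.0, 60.0], [0.2, 0.5, 0.3]   # toy 3-bin DVH
      g = geud(doses, volumes, a=8.0)                        # large 'a' mimics a serial organ
      print(g, ntcp_lkb(g, td50=50.0, m=0.15), tcp_poisson(60.0, n0=1e7, alpha=0.3))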

  3. WE-B-304-02: Treatment Planning Evaluation and Optimization Should Be Biologically and Not Dose/volume Based

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deasy, J.


  4. WE-B-304-01: Treatment Planning Evaluation and Optimization Should Be Dose/volume and Not Biologically Based

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mayo, C.


  5. Internal validity of a household food security scale is consistent among diverse populations participating in a food supplement program in Colombia

    PubMed Central

    Hackett, Michelle; Melgar-Quinonez, Hugo; Uribe, Martha C Alvarez

    2008-01-01

    Objective We assessed the validity of a locally adapted Colombian Household Food Security Scale (CHFSS) used as a part of the 2006 evaluation of the food supplement component of the Plan for Improving Food and Nutrition in Antioquia, Colombia (MANA – Plan Departamental de Seguridad Alimentaria y Nutricional de Antioquia). Methods Subjects included low-income families with pre-school age children in MANA that responded affirmatively to at least one CHFSS item (n = 1,319). Rasch Modeling was used to evaluate the psychometric characteristics of the items through measure and INFIT values. Differences in CHFSS performance were assessed by area of residency, socioeconomic status and number of children enrolled in MANA. Unidimensionality of the scale by group was further assessed using Differential Item Functioning (DIF). Results Most CHFSS items presented good fit, with most INFIT values within the adequate range of 0.8 to 1.2. Consistency in item measure values between groups was found for all but two items in the comparison by area of residency. Only two adult items exhibited DIF between urban and rural households. Conclusion The results indicate that the adapted CHFSS is a valid tool to assess the household food security of participants in food assistance programs like MANA. PMID:18500988
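    The INFIT statistic used here is the information-weighted mean-square residual of the Rasch model: for a dichotomous item with success probability P = exp(θ−b)/(1+exp(θ−b)), INFIT = Σ(x−P)² / ΣP(1−P), with values near 1 indicating good fit (the study's acceptance band is 0.8 to 1.2). A minimal sketch with made-up abilities and difficulty:

      import numpy as np

      def rasch_p(theta, b):
          """Probability of an affirmative response under the dichotomous Rasch model."""
          return 1.0 / (1.0 + np.exp(-(theta - b)))

      def infit(x, theta, b):
          """Information-weighted mean-square fit for one item across respondents."""
          p = rasch_p(theta, b)
          w = p * (1.0 - p)                    # binomial variance of each response
          return np.sum((x - p) ** 2) / np.sum(w)

      rng = np.random.default_rng(0)
      theta = rng.normal(0.0, 1.0, 500)        # hypothetical household severity scores
      b = 0.3                                  # hypothetical item difficulty
      x = (rng.random(500) < rasch_p(theta, b)).astype(float)  # simulated responses
      print(infit(x, theta, b))                # lands near 1.0 for model-consistent data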

  6. Internal validity of a household food security scale is consistent among diverse populations participating in a food supplement program in Colombia.

    PubMed

    Hackett, Michelle; Melgar-Quinonez, Hugo; Uribe, Martha C Alvarez

    2008-05-23

    We assessed the validity of a locally adapted Colombian Household Food Security Scale (CHFSS) used as a part of the 2006 evaluation of the food supplement component of the Plan for Improving Food and Nutrition in Antioquia, Colombia (MANA - Plan Departamental de Seguridad Alimentaria y Nutricional de Antioquia). Subjects included low-income families with pre-school age children in MANA that responded affirmatively to at least one CHFSS item (n = 1,319). Rasch Modeling was used to evaluate the psychometric characteristics of the items through measure and INFIT values. Differences in CHFSS performance were assessed by area of residency, socioeconomic status and number of children enrolled in MANA. Unidimensionality of the scale by group was further assessed using Differential Item Functioning (DIF). Most CHFSS items presented good fit, with most INFIT values within the adequate range of 0.8 to 1.2. Consistency in item measure values between groups was found for all but two items in the comparison by area of residency. Only two adult items exhibited DIF between urban and rural households. The results indicate that the adapted CHFSS is a valid tool to assess the household food security of participants in food assistance programs like MANA.

  7. New Activities of the U.S. National Tsunami Hazard Mitigation Program, Mapping and Modeling Subcommittee

    NASA Astrophysics Data System (ADS)

    Wilson, R. I.; Eble, M. C.

    2013-12-01

    The U.S. National Tsunami Hazard Mitigation Program (NTHMP) comprises representatives from coastal states and federal agencies who, under the guidance of NOAA, work together to develop protocols and products to help communities prepare for and mitigate tsunami hazards. Within the NTHMP are several subcommittees responsible for complementary aspects of tsunami assessment, mitigation, education, warning, and response. The Mapping and Modeling Subcommittee (MMS) comprises state and federal scientists who specialize in tsunami source characterization, numerical tsunami modeling, inundation map production, and warning forecasting. Until September 2012, much of the work of the MMS was authorized through the Tsunami Warning and Education Act, an Act that has since expired but the spirit of which is being adhered to in parallel with reauthorization efforts. Over the past several years, the MMS has developed guidance and best practices for states and territories to produce accurate and consistent tsunami inundation maps for community level evacuation planning, and has conducted benchmarking of numerical inundation models. Recent tsunami events have highlighted the need for other types of tsunami hazard analyses and products for improving evacuation planning, vertical evacuation, maritime planning, land-use planning, building construction, and warning forecasts. As the program responsible for producing accurate and consistent tsunami products nationally, the NTHMP-MMS is initiating a multi-year plan to accomplish the following: 1) Create and build on existing demonstration projects that explore new tsunami hazard analysis techniques and products, such as maps identifying areas of strong currents and potential damage within harbors as well as probabilistic tsunami hazard analysis for land-use planning. 2) Develop benchmarks for validating new numerical modeling techniques related to current velocities and landslide sources. 3) Generate guidance and protocols for the production and use of new tsunami hazard analysis products. 4) Identify multistate collaborations and funding partners interested in these new products. Application of these new products will improve the overall safety and resilience of coastal communities exposed to tsunami hazards.

  8. SAGE ground truth plan: Correlative measurements for the Stratospheric Aerosol and Gas Experiment (SAGE) on the AEM-B satellite

    NASA Technical Reports Server (NTRS)

    Russell, P. B. (Editor); Cunnold, D. M.; Grams, G. W.; Laver, J.; Mccormick, M. P.; Mcmaster, L. R.; Murcray, D. G.; Pepin, T. J.; Perry, T. W.; Planet, W. G.

    1979-01-01

    The ground truth plan is outlined for correlative measurements to validate the Stratospheric Aerosol and Gas Experiment (SAGE) sensor data. SAGE will fly aboard the Applications Explorer Mission-B satellite scheduled for launch in early 1979 and measure stratospheric vertical profiles of aerosol, ozone, nitrogen dioxide, and molecular extinction between 79°N and 79°S latitude. The plan gives details of the location and times for the simultaneous satellite/correlative measurements for the nominal launch time, the rationale and choice of the correlative sensors, their characteristics and expected accuracies, and the conversion of their data to extinction profiles. In addition, an overview of the expected SAGE instrument performance and data inversion results is presented. Various atmospheric models representative of stratospheric aerosols and ozone are used in the SAGE and correlative sensor analyses.

  9. 2012 Annual Summary Report for the Area 3 and Area 5 Radioactive Waste Management Sites at the Nevada National Security Site, Nye County, Nevada: Review of the Performance Assessments and Composite Analyses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shott, G.

    2013-03-18

    The Maintenance Plan for the Performance Assessments and Composite Analyses for the Area 3 and Area 5 Radioactive Waste Management Sites at the Nevada Test Site (National Security Technologies, LLC 2007a) requires an annual review to assess the adequacy of the performance assessments (PAs) and composite analyses (CAs), with the results submitted to the U.S. Department of Energy (DOE) Office of Environmental Management. The Disposal Authorization Statements for the Area 3 and Area 5 Radioactive Waste Management Sites (RWMSs) also require that such reviews be made and that secondary or minor unresolved issues be tracked and addressed as part of the maintenance plan (DOE 1999a, 2000). The U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office performed an annual review of the Area 3 and Area 5 RWMS PAs and CAs for fiscal year (FY) 2012. This annual summary report presents data and conclusions from the FY 2012 review, and determines the adequacy of the PAs and CAs. Operational factors (e.g., waste forms and containers, facility design, and waste receipts), closure plans, monitoring results, and research and development (R&D) activities were reviewed to determine the adequacy of the PAs. Likewise, the environmental restoration activities at the Nevada National Security Site (NNSS) relevant to the sources of residual radioactive material that are considered in the CAs, the land-use planning, and the results of the environmental monitoring and R&D activities were reviewed to determine the adequacy of the CAs. Important developments in FY 2012 include the following: release of a special analysis for the Area 3 RWMS assessing the continuing validity of the PA and CA; development of a new Area 5 RWMS closure inventory estimate based on disposals through FY 2012; evaluation of new or revised waste streams by special analysis; and development of version 4.114 of the Area 5 RWMS GoldSim PA model. The Area 3 RWMS has been in inactive status since July 1, 2006, with the last shipment received in April 2006. The FY 2012 review of operations, facility design, closure plans, monitoring results, and R&D results for the Area 3 RWMS indicates no changes that would impact PA validity. A special analysis using the Area 3 RWMS v2.102 GoldSim PA model was prepared to update the PA results for the Area 3 RWMS in FY 2012. The special analysis concludes that all performance objectives can be met and the Area 3 RWMS PA remains valid. There is no need to revise the Area 3 RWMS PA. Review of Area 5 RWMS operations, design, closure plans, monitoring results, and R&D activities indicates no significant changes other than an increase in the inventory disposed. The FY 2012 PA results, generated with the Area 5 RWMS v4.114 GoldSim PA model, indicate that there continues to be a reasonable expectation of meeting all performance objectives. The results and conclusions of the Area 5 RWMS PA are judged valid, and there is no need to revise the PA. A review of changes potentially impacting the CAs indicates that no significant changes occurred in FY 2012. The continuing adequacy of the CAs was evaluated with the new models, and no significant changes that would alter CA results or conclusions were found. The revision of the Area 3 RWMS CA, which will include the Underground Test Area source term (Corrective Action Unit [CAU] 97), is scheduled for FY 2024, following the completion of the Yucca Flat CAU 97 Corrective Action Decision Document/Corrective Action Plan in FY 2016. Inclusion of the Frenchman Flat CAU 98 results in the Area 5 RWMS CA is scheduled for FY 2016, pending the completion of the CAU 98 closure report in FY 2015. Near-term R&D efforts will focus on continuing development of the Area 3 and Area 5 RWMS GoldSim PA/CA and inventory models.

  10. [Risk factor analysis of the patients with solitary pulmonary nodules and establishment of a prediction model for the probability of malignancy].

    PubMed

    Wang, X; Xu, Y H; Du, Z Y; Qian, Y J; Xu, Z H; Chen, R; Shi, M H

    2018-02-23

    Objective: This study aims to analyze the relationship among the clinical features, radiologic characteristics and pathological diagnosis in patients with solitary pulmonary nodules (SPNs), and to establish a prediction model for the probability of malignancy. Methods: Clinical data of 372 patients with solitary pulmonary nodules who underwent surgical resection with definite postoperative pathological diagnosis were retrospectively analyzed. In these cases, we collected clinical and radiologic features including gender, age, smoking history, history of tumor, family history of cancer, location of the lesion, ground-glass opacity, maximum diameter, calcification, vessel convergence sign, vacuole sign, pleural indentation, spiculation and lobulation. The cases were divided into a modeling group (268 cases) and a validation group (104 cases). A new prediction model was established by logistic regression analysis of the data from the modeling group. The data of the validation group were then used to test the efficiency of the new model against three classical models (Mayo model, VA model and LiYun model). With the probability values calculated from each model for the validation group, SPSS 22.0 was used to draw receiver operating characteristic curves to assess the predictive value of the new model. Results: 112 benign SPNs and 156 malignant SPNs were included in the modeling group. Multivariable logistic regression analysis showed that gender, age, history of tumor, ground-glass opacity, maximum diameter, and spiculation were independent predictors of malignancy in patients with SPN (P<0.05). The resulting prediction model for the probability of malignancy is p = e^x / (1 + e^x), where x = -4.8029 - 0.743×gender + 0.057×age + 1.306×history of tumor + 1.305×ground-glass opacity + 0.051×maximum diameter + 1.043×spiculation. When the validation group data were applied to the four prediction models, the area under the curve of our model was 0.742, greater than that of the other models (Mayo 0.696, VA 0.634, LiYun 0.681), although the differences between any two of the four models were not significant (P>0.05). Conclusions: Age, gender, history of tumor, ground-glass opacity, maximum diameter and spiculation are independent predictors of malignancy in patients with a solitary pulmonary nodule. This logistic regression model is not inferior to the classical models in estimating the probability of malignancy of SPNs.
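    Because the abstract gives the fitted coefficients explicitly, the model can be written down directly. One detail the abstract leaves open is the coding of the binary predictors (e.g., whether gender = 1 denotes male or female), so the 0/1 encoding below is an assumption:

      from math import exp

      def malignancy_probability(gender, age, tumor_history, ggo, diameter_mm, spiculation):
          """Published logistic model; binary inputs coded 0/1 (coding convention assumed)."""
          x = (-4.8029
               - 0.743 * gender
               + 0.057 * age
               + 1.306 * tumor_history
               + 1.305 * ggo
               + 0.051 * diameter_mm
               + 1.043 * spiculation)
          return exp(x) / (1.0 + exp(x))

      # e.g., a 65-year-old with a 20 mm spiculated ground-glass nodule and no cancer history
      print(malignancy_probability(gender=1, age=65, tumor_history=0,
                                   ggo=1, diameter_mm=20, spiculation=1))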

  11. Using GOMS and Bayesian plan recognition to develop recognition models of operator behavior

    NASA Astrophysics Data System (ADS)

    Zaientz, Jack D.; DeKoven, Elyon; Piegdon, Nicholas; Wood, Scott D.; Huber, Marcus J.

    2006-05-01

    Trends in combat technology research point to an increasing role for uninhabited vehicles in modern warfare tactics. To support increased span of control over these vehicles, human responsibilities need to be transformed from tedious, error-prone and cognition-intensive operations into tasks that are more supervisory and manageable, even under intensely stressful conditions. The goal is to move away from only supporting human command of low-level system functions to intention-level human-system dialogue about the operator's tasks and situation. A critical element of this process is developing the means to identify when human operators need automated assistance and to identify what assistance they need. Toward this goal, we are developing an unmanned vehicle operator task recognition system that combines work in human behavior modeling and Bayesian plan recognition. Traditionally, human behavior models have been considered generative, meaning they describe all possible valid behaviors. Basing behavior recognition on models designed for behavior generation can offer advantages in improved model fidelity and reuse. It is not clear, however, how to reconcile the structural differences between behavior recognition and behavior modeling approaches. Our current work demonstrates that by pairing a cognitive psychology derived human behavior modeling approach, GOMS, with a Bayesian plan recognition engine, ASPRN, we can translate a behavior generation model into a recognition model. We will discuss the implications for using human performance models in this manner as well as suggest how this kind of modeling may be used to support the real-time control of multiple, uninhabited battlefield vehicles and other semi-autonomous systems.
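    The Bayesian half of this pairing reduces, at its simplest, to maintaining a posterior over candidate plans as observations of operator actions arrive: P(plan | o_1..o_n) ∝ P(plan) · Π_i P(o_i | plan). The sketch below is a toy filter, not ASPRN; the plans, observations, and likelihoods are hypothetical:

      # Hypothetical plans and per-action likelihoods P(action | plan).
      likelihood = {
          "recon_route":   {"zoom_map": 0.5, "mark_waypoint": 0.4, "arm_payload": 0.1},
          "engage_target": {"zoom_map": 0.2, "mark_waypoint": 0.2, "arm_payload": 0.6},
      }
      posterior = {"recon_route": 0.5, "engage_target": 0.5}   # uniform prior

      def update(posterior, action):
          """One Bayesian update step; returns the normalized posterior over plans."""
          unnorm = {plan: p * likelihood[plan][action] for plan, p in posterior.items()}
          z = sum(unnorm.values())
          return {plan: p / z for plan, p in unnorm.items()}

      for action in ["zoom_map", "mark_waypoint", "mark_waypoint"]:
          posterior = update(posterior, action)
      print(posterior)   # mass shifts toward the route-reconnaissance plan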

  12. Fixed-Precision Sequential Sampling Plans for Estimating Alfalfa Caterpillar, Colias lesbia, Egg Density in Alfalfa, Medicago sativa, Fields in Córdoba, Argentina

    PubMed Central

    Serra, Gerardo V.; Porta, Norma C. La; Avalos, Susana; Mazzuferi, Vilma

    2013-01-01

    The alfalfa caterpillar, Colias lesbia (Fabricius) (Lepidoptera: Pieridae), is a major pest of alfalfa, Medicago sativa L. (Fabales: Fabaceae), crops in Argentina. Its management is based mainly on chemical control of larvae whenever the larvae exceed the action threshold. To develop and validate fixed-precision sequential sampling plans, an intensive sampling programme for C. lesbia eggs was carried out in two alfalfa plots located in the Province of Córdoba, Argentina, from 1999 to 2002. Using Resampling for Validation of Sampling Plans software, 12 additional independent data sets were used to validate the sequential sampling plan with precision levels of 0.10 and 0.25 (SE/mean), respectively. For a range of mean densities of 0.10 to 8.35 eggs/sample, an average sample size of only 27 and 26 sample units was required to achieve a desired precision level of 0.25 for the sampling plans of Green and Kuno, respectively. As the precision level was increased to 0.10, average sample size increased to 161 and 157 sample units for the sampling plans of Green and Kuno, respectively. We recommend using Green's sequential sampling plan because it is less sensitive to changes in egg density. These sampling plans are a valuable tool for researchers to study population dynamics and to evaluate integrated pest management strategies. PMID:23909840
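    Green's plan, referenced here, derives its stop line from Taylor's power law s² = a·m^b: sampling continues until the cumulative count T_n over n sample units reaches T_n = (D²·n^(b−1)/a)^(1/(b−2)) for the chosen precision D (SE/mean). A sketch of the decision rule follows, with a and b as illustrative Taylor coefficients rather than the paper's estimates:

      def green_stop_line(n, a, b, D):
          """Cumulative-count stop line for Green's fixed-precision sequential plan."""
          return (D ** 2 * n ** (b - 1.0) / a) ** (1.0 / (b - 2.0))

      def sample_until_precise(counts, a=2.0, b=1.4, D=0.25):
          """Walk through per-unit counts; stop once the running total crosses the line."""
          total = 0
          for n, c in enumerate(counts, start=1):
              total += c
              if total >= green_stop_line(n, a, b, D):
                  return n, total          # sample units needed and eggs counted
          return len(counts), total        # precision not reached with these units

      # Coarser precision (D=0.5) so this short demo actually triggers the stop rule.
      print(sample_until_precise([3, 1, 4, 2, 5, 0, 3, 2, 6, 1], D=0.5))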

  13. Incorporating High-Frequency Physiologic Data Using Computational Dictionary Learning Improves Prediction of Delayed Cerebral Ischemia Compared to Existing Methods.

    PubMed

    Megjhani, Murad; Terilli, Kalijah; Frey, Hans-Peter; Velazquez, Angela G; Doyle, Kevin William; Connolly, Edward Sander; Roh, David Jinou; Agarwal, Sachin; Claassen, Jan; Elhadad, Noemie; Park, Soojin

    2018-01-01

    Accurate prediction of delayed cerebral ischemia (DCI) after subarachnoid hemorrhage (SAH) can be critical for planning interventions to prevent poor neurological outcome. This paper presents a model using convolution dictionary learning to extract features from physiological data available from bedside monitors. We develop and validate a prediction model for DCI after SAH, demonstrating improved precision over standard methods alone. 488 consecutive SAH admissions from 2006 to 2014 to a tertiary care hospital were included. Models were trained on 80%, while 20% were set aside for validation testing. Modified Fisher Scale was considered the standard grading scale in clinical use; baseline features also analyzed included age, sex, Hunt-Hess, and Glasgow Coma Scales. An unsupervised approach using convolution dictionary learning was used to extract features from physiological time series (systolic blood pressure and diastolic blood pressure, heart rate, respiratory rate, and oxygen saturation). Classifiers (partial least squares and linear and kernel support vector machines) were trained on feature subsets of the derivation dataset. Models were applied to the validation dataset. The performances of the best classifiers on the validation dataset are reported by feature subset. Standard grading scale (mFS): AUC 0.54. Combined demographics and grading scales (baseline features): AUC 0.63. Kernel derived physiologic features: AUC 0.66. Combined baseline and physiologic features with redundant feature reduction: AUC 0.71 on derivation dataset and 0.78 on validation dataset. Current DCI prediction tools rely on admission imaging and are advantageously simple to employ. However, using an agnostic and computationally inexpensive learning approach for high-frequency physiologic time series data, we demonstrated that we could incorporate individual physiologic data to achieve higher classification accuracy.
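    In outline, the pipeline pairs an unsupervised sparse-coding step with a conventional classifier. Below is a compressed sketch using scikit-learn on synthetic vitals; the window length, dictionary size, pooling, and classifier settings are all stand-ins, since the paper's implementation details are not given in the abstract:

      import numpy as np
      from sklearn.decomposition import MiniBatchDictionaryLearning
      from sklearn.model_selection import train_test_split
      from sklearn.svm import SVC
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(0)
      n_patients, series_len, win = 120, 240, 24

      # Synthetic heart-rate-like series; class 1 gets an injected oscillatory pattern.
      y = rng.integers(0, 2, n_patients)
      series = rng.normal(70, 5, (n_patients, series_len))
      series[y == 1] += 3 * np.sin(np.linspace(0, 12 * np.pi, series_len))

      def windows(s):   # overlapping windows, stride win // 2
          return np.stack([s[i:i + win] for i in range(0, len(s) - win + 1, win // 2)])

      # Learn a dictionary over all training windows, then sparse-code and mean-pool.
      tr, te = train_test_split(np.arange(n_patients), test_size=0.2, stratify=y, random_state=0)
      dico = MiniBatchDictionaryLearning(n_components=16, transform_algorithm="omp",
                                         transform_n_nonzero_coefs=3, random_state=0)
      dico.fit(np.vstack([windows(series[i]) for i in tr]))

      def features(idx):
          return np.stack([np.abs(dico.transform(windows(series[i]))).mean(axis=0) for i in idx])

      clf = SVC(kernel="rbf", probability=True, random_state=0).fit(features(tr), y[tr])
      print("AUC:", roc_auc_score(y[te], clf.predict_proba(features(te))[:, 1]))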

  14. Validation of the "Security Needs Assessment Profile" for measuring the profiles of security needs of Chinese forensic psychiatric inpatients.

    PubMed

    Siu, B W M; Au-Yeung, C C Y; Chan, A W L; Chan, L S Y; Yuen, K K; Leung, H W; Yan, C K; Ng, K K; Lai, A C H; Davies, S; Collins, M

    Mapping forensic psychiatric services with the security needs of patients is a salient step in service planning, audit and review. A valid and reliable instrument for measuring the security needs of Chinese forensic psychiatric inpatients was not yet available. This study aimed to develop and validate the Chinese version of the Security Needs Assessment Profile for measuring the profiles of security needs of Chinese forensic psychiatric inpatients. The Security Needs Assessment Profile by Davis was translated into Chinese. Its face validity, content validity, construct validity and internal consistency reliability were assessed by measuring the security needs of 98 Chinese forensic psychiatric inpatients. Principal factor analysis for construct validity provided a six-factor security needs model explaining 68.7% of the variance. Based on the Cronbach's alpha coefficient, the internal consistency reliability was rated as acceptable for procedural security (0.73), and fair for both physical security (0.62) and relational security (0.58). A significant sex difference (p=0.002) in total security score was found. The Chinese version of the Security Needs Assessment Profile is a valid and reliable instrument for assessing the security needs of Chinese forensic psychiatric inpatients.
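    The reliability figures quoted here are Cronbach's alpha, α = k/(k−1) · (1 − Σσ²_item / σ²_total). A minimal computation on a hypothetical item-response matrix:

      import numpy as np

      def cronbach_alpha(items):
          """items: (n_respondents, k_items) matrix of item scores."""
          items = np.asarray(items, dtype=float)
          k = items.shape[1]
          item_var = items.var(axis=0, ddof=1).sum()       # sum of item variances
          total_var = items.sum(axis=1).var(ddof=1)        # variance of scale totals
          return (k / (k - 1.0)) * (1.0 - item_var / total_var)

      rng = np.random.default_rng(0)
      latent = rng.normal(size=(98, 1))                     # 98 inpatients, as in the study
      items = latent + rng.normal(scale=1.0, size=(98, 6))  # 6 items loading on one factor
      print(cronbach_alpha(items))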

  15. Multi-Evaporator Miniature Loop Heat Pipe for Small Spacecraft Thermal Control

    NASA Technical Reports Server (NTRS)

    Ku, Jentung; Ottenstein, Laura; Douglas, Donya

    2008-01-01

    This paper presents the development of the Thermal Loop experiment under NASA's New Millennium Program Space Technology 8 (ST8) Project. The Thermal Loop experiment was originally planned for validating in space an advanced heat transport system consisting of a miniature loop heat pipe (MLHP) with multiple evaporators and multiple condensers. Details of the thermal loop concept, technical advances and benefits, Level 1 requirements and the technology validation approach are described. An MLHP breadboard has been built and tested in the laboratory and thermal vacuum environments, and has demonstrated excellent performance that met or exceeded the design requirements. The MLHP retains all features of state-of-the-art loop heat pipes and offers additional advantages to enhance the functionality, performance, versatility, and reliability of the system. In addition, an analytical model has been developed to simulate the steady state and transient operation of the MLHP, and the model predictions agreed very well with experimental results. A protoflight MLHP has been built and is being tested in a thermal vacuum chamber to validate its performance and technical readiness for a flight experiment.

  16. Evaluating characteristics of dry spell changes in Lake Urmia Basin using an ensemble CMIP5 GCM models

    NASA Astrophysics Data System (ADS)

    Fazel, Nasim; Berndtsson, Ronny; Bertacchi Uvo, Cintia; Klove, Bjorn; Madani, Kaveh

    2015-04-01

    Drought is a natural phenomenon that can cause significant environmental, ecological, and socio-economic losses in water-scarce regions. Studies of drought under climate change are essential for water resources planning and management. Dry spells and the number of consecutive days with precipitation below a certain threshold can be used to identify the severity of hydrological drought. In this study, we analyzed the projected changes in the number of dry days in two future periods, 2011-2040 and 2071-2100, on both seasonal and annual time scales in the Lake Urmia Basin. The lake and its wetlands, located in northwestern Iran, have invaluable environmental, social, and economic importance for the region. The lake level has been shrinking dramatically since 1995, and the water volume is now less than 30% of its original value. Moreover, frequent dry spells have struck the region and affected its water resources and lake ecosystem, as in other parts of Iran. Analyzing future drought and dry spell characteristics in the region is crucial for sustainable water management and lake restoration plans. We used daily projected precipitation from 20 climate models of CMIP5 (Coupled Model Inter-comparison Project Phase 5) driven by three representative concentration pathways: RCP2.6, RCP4.5, and RCP8.5. The model outputs were statistically downscaled and validated against the historical observation period 1980-2010. We defined days with precipitation of less than 1 mm as dry days for both the observation period and the model projections. The model validation showed that all models underestimated the number of dry days. An ensemble of the five models in best agreement with observations was used to assess the changes in the number of future dry days in the Lake Urmia Basin. The entire ensemble showed an increase in the number of dry days for all seasons. The projected changes in winter and spring were larger than those for summer and autumn. All models projected drier winter and spring periods in the near and far future periods. The ensemble mean of future annual dry days increased by 6.5% to 7.3% across RCP2.6, RCP4.5, and RCP8.5.
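    The dry-day metric used here is simple to operationalize: count days with precipitation below 1 mm, aggregated by year or season. A sketch with pandas on synthetic daily precipitation (the gamma parameters are arbitrary, not fitted to Lake Urmia data):

      import numpy as np
      import pandas as pd

      rng = np.random.default_rng(0)
      dates = pd.date_range("1980-01-01", "2010-12-31", freq="D")
      precip = pd.Series(rng.gamma(shape=0.4, scale=3.0, size=len(dates)), index=dates)

      dry = precip < 1.0                                   # the study's dry-day threshold

      annual = dry.groupby(dry.index.year).sum()           # dry days per year

      season_of = {12: "DJF", 1: "DJF", 2: "DJF", 3: "MAM", 4: "MAM", 5: "MAM",
                   6: "JJA", 7: "JJA", 8: "JJA", 9: "SON", 10: "SON", 11: "SON"}
      seasonal = dry.groupby([dry.index.year, dry.index.month.map(season_of)]).sum()

      print(annual.mean(), seasonal.groupby(level=1).mean())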

  17. Assessing Assessment Texts: Where Is Planning?

    ERIC Educational Resources Information Center

    Fives, Helenrose; Barnes, Nicole; Dacey, Charity; Gillis, Anna

    2016-01-01

    We conducted a content analysis of 27 assessment textbooks to determine how assessment planning was framed in texts for preservice teachers. We identified eight assessment planning themes: alignment, assessment purpose and types, reliability and validity, writing goals and objectives, planning specific assessments, unpacking, overall assessment…

  18. Printed three-dimensional anatomic templates for virtual preoperative planning before reconstruction of old pelvic injuries: initial results.

    PubMed

    Wu, Xin-Bao; Wang, Jun-Qiang; Zhao, Chun-Peng; Sun, Xu; Shi, Yin; Zhang, Zi-An; Li, Yu-Neng; Wang, Man-Yi

    2015-02-20

    Old pelvic fractures are among the most challenging fractures to treat because of their complex anatomy, difficult-to-access surgical sites, and the relatively low incidence of such cases. Proper evaluation and surgical planning are necessary to achieve pelvic ring symmetry and stable fixation of the fracture. The goal of this study was to assess the use of three-dimensional (3D) printing techniques for surgical management of old pelvic fractures. First, 16 dried human cadaveric pelvises were used to confirm the anatomical accuracy of the 3D models printed based on radiographic data. Next, nine clinical cases between January 2009 and April 2013 were used to evaluate the surgical reconstruction based on the 3D printed models. The pelvic injuries were all type C, and the average time from injury to reconstruction was 11 weeks (range: 8-17 weeks). The workflow consisted of: (1) Printing patient-specific bone models based on preoperative computed tomography (CT) scans, (2) virtual fracture reduction using the printed 3D anatomic template, (3) virtual fracture fixation using Kirschner wires, and (4) preoperatively measuring the osteotomy and implant position relative to landmarks using the virtually defined deformation. These models aided communication between surgical team members during the procedure. This technique was validated by comparing the preoperative planning to the intraoperative procedure. The accuracy of the 3D printed models was within specification. Production of a model from standard CT DICOM data took 7 hours (range: 6-9 hours). Preoperative planning using the 3D printed models was feasible in all cases. Good correlation was found between the preoperative planning and postoperative follow-up X-ray in all nine cases. The patients were followed for 3-29 months (median: 5 months). The fracture healing time was 9-17 weeks (mean: 10 weeks). No delayed incision healing, wound infection, or nonunions occurred. The results were excellent in two cases, good in five, and poor in two based on the Majeed score. The 3D printing planning technique for pelvic surgery was successfully integrated into a clinical workflow to improve patient-specific preoperative planning by providing a visual and haptic model of the injury and allowing patient-specific adaptation of each osteosynthesis implant to the virtually reduced pelvis.

  19. Nanoparticle-enabled, image-guided treatment planning of target specific RNAi therapeutics in an orthotopic prostate cancer model.

    PubMed

    Lin, Qiaoya; Jin, Cheng S; Huang, Huang; Ding, Lili; Zhang, Zhihong; Chen, Juan; Zheng, Gang

    2014-08-13

    The abilities to deliver siRNA to its intended action site and to assess the delivery efficiency are challenges for current RNAi therapy, where effective siRNA delivery will join forces with patient genetic profiling to achieve optimal treatment outcomes. Imaging could become a critical enabler to maximize RNAi efficacy in the context of tracking siRNA delivery, rational dosimetry and treatment planning. Several imaging modalities have been used to visualize nanoparticle-based siRNA delivery, but they have rarely guided treatment planning. We report a multimodal theranostic lipid-nanoparticle, HPPS(NIR)-chol-siRNA, which has a near-infrared (NIR) fluorescent core, enveloped by a phospholipid monolayer, intercalated with siRNA payloads, and constrained by apoA-I mimetic peptides to give ultra-small particle size (<30 nm). Using fluorescence imaging, we demonstrated its cytosolic delivery capability for both NIR-core and dye-labeled siRNAs and its structural integrity in mice through intravenous administration, validating the usefulness of the NIR core as an imaging surrogate for non-labeled therapeutic siRNAs. Next, we validated the targeting specificity of HPPS(NIR)-chol-siRNA to orthotopic tumor using sequential four-step (in vivo, in situ, ex vivo and frozen-tissue) fluorescence imaging. The image co-registration of computed tomography and fluorescence molecular tomography enabled non-invasive assessment and treatment planning of siRNA delivery into the orthotopic tumor, achieving efficacious RNAi therapy.

  20. Spatial Distribution and Sampling Plans With Fixed Level of Precision for Citrus Aphids (Hom., Aphididae) on Two Orange Species.

    PubMed

    Kafeshani, Farzaneh Alizadeh; Rajabpour, Ali; Aghajanzadeh, Sirous; Gholamian, Esmaeil; Farkhari, Mohammad

    2018-04-02

    Aphis spiraecola Patch, Aphis gossypii Glover, and Toxoptera aurantii Boyer de Fonscolombe are three important aphid pests of citrus orchards. In this study, the spatial distributions of the aphids on two orange species, Satsuma mandarin and Thomson navel, were evaluated using Taylor's power law and Iwao's patchiness regression. In addition, a fixed-precision sequential sampling plan was developed for each species on each host plant using Green's model at precision levels of 0.25 and 0.1. The results revealed that the spatial distribution parameters, and therefore the sampling plan, differed significantly with aphid and host plant species. Taylor's power law provided a better fit to the data than Iwao's patchiness regression. Except for T. aurantii on Thomson navel orange, which showed a regular dispersion pattern, the spatial distribution patterns of the aphids were aggregated on both citrus species. The optimum sample size of the aphids varied from 30-2061 and 1-1622 shoots on Satsuma mandarin and Thomson navel orange, respectively, depending on aphid species and desired precision level. Calculated stop lines of the aphid species on Satsuma mandarin and Thomson navel orange ranged from 0.48 to 19 and from 0.19 to 80.4 aphids per 24 shoots, according to aphid species and desired precision level. The performance of the sampling plan was validated by resampling analysis using Resampling for Validation of Sampling Plans (RVSP) software. This sampling program is useful for IPM programs targeting these aphids in citrus orchards.
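    Both distribution indices used here are regressions on per-field means and variances; for Taylor's power law, log s² = log a + b·log m, with b > 1 indicating aggregation, b ≈ 1 a random pattern, and b < 1 a regular one. A minimal fit on made-up counts:

      import numpy as np

      def taylor_power_law(means, variances):
          """Fit log(s^2) = log(a) + b*log(m); returns (a, b)."""
          b, log_a = np.polyfit(np.log(means), np.log(variances), 1)
          return np.exp(log_a), b

      # Hypothetical per-orchard mean/variance pairs of aphids per shoot
      means = np.array([0.5, 1.2, 2.8, 5.1, 9.6])
      variances = np.array([0.9, 2.9, 9.5, 24.0, 70.0])
      a, b = taylor_power_law(means, variances)
      print(a, b, "aggregated" if b > 1 else "random/regular")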
