Sample records for values calculated based

  1. 40 CFR 600.206-08 - Calculation and use of FTP-based and HFET-based fuel economy values for vehicle configurations.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... fuel economy values from the tests performed using gasoline or diesel test fuel. (ii) Calculate the city, highway, and combined fuel economy values from the tests performed using alcohol or natural gas test fuel. (b) If only one equivalent petroleum-based fuel economy value exists for an electric...

  2. 40 CFR 600.206-08 - Calculation and use of FTP-based and HFET-based fuel economy values for vehicle configurations.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... fuel economy values from the tests performed using gasoline or diesel test fuel. (ii) Calculate the city, highway, and combined fuel economy values from the tests performed using alcohol or natural gas test fuel. (b) If only one equivalent petroleum-based fuel economy value exists for an electric...

  3. 40 CFR 600.206-08 - Calculation and use of FTP-based and HFET-based fuel economy values for vehicle configurations.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ..., highway, and combined fuel economy values from the tests performed using gasoline or diesel test fuel. (ii) Calculate the city, highway, and combined fuel economy values from the tests performed using alcohol or natural gas test fuel. (b) If only one equivalent petroleum-based fuel economy value exists for an...

  4. Using 3d Bim Model for the Value-Based Land Share Calculations

    NASA Astrophysics Data System (ADS)

    Çelik Şimşek, N.; Uzun, B.

    2017-11-01

    Under the Turkish condominium ownership system, 3D physical buildings and their condominium units are registered in the condominium ownership books via 2D survey plans. Currently, 2D representations of 3D physical objects cause inaccurate and deficient implementations when land shares are determined. Condominium ownership and easement rights are established with a clear indication of land shares (condominium ownership law, article no. 3), so the land share of each condominium unit has to be determined with the value differences among the units taken into account. The main problem, however, is that land shares have often been determined on an area basis from the project documents before the building is constructed. The objective of this study is to propose a new approach to value-based land-share calculation for condominium units subject to condominium ownership. The current approaches to determining land shares, and their failures, are examined, and the factors that affect the values of condominium units are identified according to the legal decisions. This study shows that 3D BIM models can provide important approaches to the valuation problems involved in determining land shares.

  5. 40 CFR 600.208-08 - Calculation of FTP-based and HFET-based fuel economy values for a model type.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 29 2010-07-01 2010-07-01 false Calculation of FTP-based and HFET-based fuel economy values for a model type. 600.208-08 Section 600.208-08 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) ENERGY POLICY FUEL ECONOMY AND CARBON-RELATED EXHAUST EMISSIONS OF MOTOR VEHICLES Fuel Economy Regulations fo...

  6. 40 CFR 600.206-08 - Calculation and use of FTP-based and HFET-based fuel economy values for vehicle configurations.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 29 2010-07-01 2010-07-01 false Calculation and use of FTP-based and HFET-based fuel economy values for vehicle configurations. 600.206-08 Section 600.206-08 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) ENERGY POLICY FUEL ECONOMY AND CARBON-RELATED EXHAUST EMISSIONS OF MOTOR VEHICLES Fuel...

  7. 40 CFR 600.208-12 - Calculation of FTP-based and HFET-based fuel economy and carbon-related exhaust emission values...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 29 2010-07-01 2010-07-01 false Calculation of FTP-based and HFET-based fuel economy and carbon-related exhaust emission values for a model type. 600.208-12 Section 600.208-12 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) ENERGY POLICY FUEL ECONOMY AND CARBON-RELATED EXHAUST EMISSIONS OF MOTOR...

  8. 40 CFR 600.207-08 - Calculation and use of vehicle-specific 5-cycle-based fuel economy values for vehicle...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... economy values from the tests performed using gasoline or diesel test fuel. (ii)(A) Calculate the 5-cycle city and highway fuel economy values from the tests performed using alcohol or natural gas test fuel...-specific 5-cycle-based fuel economy values for vehicle configurations. 600.207-08 Section 600.207-08...

  9. 40 CFR 600.207-08 - Calculation and use of vehicle-specific 5-cycle-based fuel economy values for vehicle...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... economy values from the tests performed using gasoline or diesel test fuel. (ii)(A) Calculate the 5-cycle city and highway fuel economy values from the tests performed using alcohol or natural gas test fuel...-specific 5-cycle-based fuel economy values for vehicle configurations. 600.207-08 Section 600.207-08...

  10. [Calculation on ecological security baseline based on the ecosystem services value and the food security].

    PubMed

    He, Ling; Jia, Qi-jian; Li, Chao; Xu, Hao

    2016-01-01

    The rapid development of the coastal economy in Hebei Province caused a rapid transition in coastal land use structure, which has threatened land ecological security. Calculating the ecosystem service value of land use and exploring the ecological security baseline can therefore provide a basis for regional ecological protection and rehabilitation. Taking Huanghua, a city in the southeast of Hebei Province, as an example, this study explored the joint point, joint path and joint method between ecological security and food security, and then calculated the ecological security baseline of Huanghua City based on the ecosystem service value and the food safety standard. The results showed that per-unit-area ecosystem service values ranked, from highest to lowest: wetland, water, garden, cultivated land, meadow, other land, salt pans, saline and alkaline land, and constructive land. The contribution rates of the ecological function values ranked, from highest to lowest: nutrient recycling, water conservation, entertainment and culture, material production, biodiversity maintenance, gas regulation, climate regulation and environmental purification. The security baseline of grain production was 0.21 kg · m⁻², the security baseline of grain output value was 0.41 yuan · m⁻², the baseline of ecosystem service value was 21.58 yuan · m⁻², and the total ecosystem service value in the research area was 4.244 billion yuan. By 2081, ecological security will reach the bottom line, and the human-dominated ecological system will be on the verge of collapse. According to its ecological security status, Huanghua can be divided into 4 zones: an ecological core protection zone, an ecological buffer zone, an ecological restoration zone and a human activity core zone.

  11. 40 CFR 600.208-08 - Calculation of FTP-based and HFET-based fuel economy values for a model type.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... may use fuel economy data from tests conducted on these vehicle configuration(s) at high altitude to...) Calculate the city, highway, and combined fuel economy values from the tests performed using gasoline or diesel test fuel. (ii) Calculate the city, highway, and combined fuel economy values from the tests...

  12. 40 CFR 600.208-08 - Calculation of FTP-based and HFET-based fuel economy values for a model type.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... tests conducted on these vehicle configuration(s) at high altitude to calculate the fuel economy for the... combined fuel economy values from the tests performed using gasoline or diesel test fuel. (ii) Calculate the city, highway, and combined fuel economy values from the tests performed using alcohol or natural...

  13. 40 CFR 600.208-08 - Calculation of FTP-based and HFET-based fuel economy values for a model type.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... tests conducted on these vehicle configuration(s) at high altitude to calculate the fuel economy for the... combined fuel economy values from the tests performed using gasoline or diesel test fuel. (ii) Calculate the city, highway, and combined fuel economy values from the tests performed using alcohol or natural...

  14. 21 CFR 868.1890 - Predictive pulmonary-function value calculator.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Predictive pulmonary-function value calculator. 868.1890 Section 868.1890 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN... pulmonary-function value calculator. (a) Identification. A predictive pulmonary-function value calculator is...

  15. 21 CFR 868.1890 - Predictive pulmonary-function value calculator.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Predictive pulmonary-function value calculator. 868.1890 Section 868.1890 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN... pulmonary-function value calculator. (a) Identification. A predictive pulmonary-function value calculator is...

  16. The pKa Cooperative: A Collaborative Effort to Advance Structure-Based Calculations of pKa values and Electrostatic Effects in Proteins

    PubMed Central

    Nielsen, Jens E.; Gunner, M. R.; Bertrand García-Moreno, E.

    2012-01-01

    The pKa Cooperative http://www.pkacoop.org was organized to advance development of accurate and useful computational methods for structure-based calculation of pKa values and electrostatic energy in proteins. The Cooperative brings together laboratories with expertise and interest in theoretical, computational and experimental studies of protein electrostatics. To improve structure-based energy calculations it is necessary to better understand the physical character and molecular determinants of electrostatic effects. The Cooperative thus intends to foment experimental research into fundamental aspects of proteins that depend on electrostatic interactions. It will maintain a depository for experimental data useful for critical assessment of methods for structure-based electrostatics calculations. To help guide the development of computational methods the Cooperative will organize blind prediction exercises. As a first step, computational laboratories were invited to reproduce an unpublished set of experimental pKa values of acidic and basic residues introduced in the interior of staphylococcal nuclease by site-directed mutagenesis. The pKa values of these groups are unique and challenging to simulate owing to the large magnitude of their shifts relative to normal pKa values in water. Many computational methods were tested in this 1st Blind Prediction Challenge and critical assessment exercise. A workshop was organized in the Telluride Science Research Center to assess objectively the performance of many computational methods tested on this one extensive dataset. This volume of PROTEINS: Structure, Function, and Bioinformatics introduces the pKa Cooperative, presents reports submitted by participants in the blind prediction challenge, and highlights some of the problems in structure-based calculations identified during this exercise. PMID:22002877

  17. Calculation of weighted averages approach for the estimation of Ping tolerance values

    USGS Publications Warehouse

    Silalom, S.; Carter, J.L.; Chantaramongkol, P.

    2010-01-01

    A biotic index was created and proposed as a tool to assess water quality in the Upper Mae Ping sub-watersheds. The Ping biotic index was calculated by utilizing Ping tolerance values. This paper presents the calculation of Ping tolerance values of the collected macroinvertebrates. Ping tolerance values were estimated by a weighted averages approach based on the abundance of macroinvertebrates and six chemical constituents that include conductivity, dissolved oxygen, biochemical oxygen demand, ammonia nitrogen, nitrate nitrogen and orthophosphate. Ping tolerance values range from 0 to 10. Macroinvertebrates assigned a 0 are very sensitive to organic pollution while macroinvertebrates assigned 10 are highly tolerant to pollution.
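The weighted-averages approach described above can be sketched in a few lines. This is a minimal illustration of abundance-weighted averaging along a single chemical gradient, not the authors' code; the taxa, site counts, and BOD values are invented, and the 0-10 rescaling mirrors the Ping tolerance scale.

```python
# Weighted-averages estimation of taxon tolerance values (illustrative sketch).
import numpy as np

def weighted_average_optima(abundance, gradient):
    """Abundance-weighted mean of an environmental gradient for each taxon.

    abundance: (n_sites, n_taxa) counts; gradient: (n_sites,) chemical values.
    """
    abundance = np.asarray(abundance, dtype=float)
    gradient = np.asarray(gradient, dtype=float)
    return abundance.T @ gradient / abundance.sum(axis=0)

def rescale_0_10(optima):
    """Linearly rescale so the most sensitive taxon scores 0 and the most
    tolerant scores 10, matching the 0-10 Ping tolerance scale."""
    lo, hi = optima.min(), optima.max()
    return 10.0 * (optima - lo) / (hi - lo)

# Two hypothetical taxa sampled at three sites along a BOD gradient (mg/L).
abundance = [[10, 0],   # clean site: only taxon A present
             [5, 5],
             [0, 10]]   # polluted site: only taxon B present
bod = [1.0, 3.0, 9.0]

optima = weighted_average_optima(abundance, bod)
tolerance = rescale_0_10(optima)
print(tolerance)  # taxon A near 0 (sensitive), taxon B near 10 (tolerant)
```

In the full method the averaging would be repeated over all six constituents and the results combined, which this sketch does not attempt.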

  18. 31 CFR 351.16 - What do I need to know about the base denomination for redemption value calculations?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 31 Money and Finance: Treasury 2 2010-07-01 2010-07-01 false What do I need to know about the base denomination for redemption value calculations? 351.16 Section 351.16 Money and Finance: Treasury Regulations Relating to Money and Finance (Continued) FISCAL SERVICE, DEPARTMENT OF THE TREASURY BUREAU OF THE PUBLIC...

  19. A program for calculation of intrapulmonary shunts, blood-gas and acid-base values with a programmable calculator.

    PubMed

    Ruiz, B C; Tucker, W K; Kirby, R R

    1975-01-01

    With a desk-top programmable calculator, it is now possible to do complex, previously time-consuming computations in the blood-gas laboratory. The authors have developed a program with the necessary algorithms for temperature correction of blood gases and calculation of acid-base variables and intrapulmonary shunt. It was necessary to develop formulas for the Po2 temperature-correction coefficient, the oxyhemoglobin-dissociation curve for adults (with the necessary adjustments for fetal blood), and changes in water vapor pressure due to variation in body temperature. Using this program in conjunction with a Monroe 1860-21 statistical programmable calculator, it is possible to temperature-correct pH, Pco2, and Po2. The machine will compute the alveolar-arterial oxygen tension gradient, oxygen saturation (So2), oxygen content (Co2), actual HCO3- and a modified base excess. If arterial blood and mixed venous blood are obtained, the calculator will print out intrapulmonary shunt data (Qs/Qt) and arteriovenous oxygen differences (a-vDo2). There is also a formula to compute P50 if pH, Pco2, Po2, and measured So2 from two samples of tonometered blood (one above 50 per cent and one below 50 per cent saturation) are put into the calculator.
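The shunt computation such a program performs can be illustrated with the standard shunt equation. This is a hedged sketch using textbook oxygen-content constants (1.34 mL O2/g Hb, 0.0031 mL O2/dL/mmHg), not a reconstruction of the 1975 Monroe program; the patient values are invented.

```python
# Intrapulmonary shunt fraction from the standard shunt equation (sketch).

def o2_content(hb_g_dl, sat_frac, po2_mmhg):
    """Oxygen content (mL O2/dL): hemoglobin-bound plus dissolved."""
    return 1.34 * hb_g_dl * sat_frac + 0.0031 * po2_mmhg

def shunt_fraction(cc_o2, ca_o2, cv_o2):
    """Qs/Qt = (Cc'O2 - CaO2) / (Cc'O2 - CvO2)."""
    return (cc_o2 - ca_o2) / (cc_o2 - cv_o2)

hb = 15.0  # g/dL (illustrative)
cc = o2_content(hb, 1.00, 600.0)   # end-capillary blood on 100% O2
ca = o2_content(hb, 0.98, 100.0)   # arterial sample
cv = o2_content(hb, 0.75, 40.0)    # mixed venous sample
qs_qt = shunt_fraction(cc, ca, cv)
print(f"Qs/Qt = {qs_qt:.2%}")
```

The arteriovenous oxygen difference mentioned in the abstract is simply `ca - cv` in the same units.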

  20. 40 CFR 600.207-08 - Calculation and use of vehicle-specific 5-cycle-based fuel economy values for vehicle...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 29 2010-07-01 2010-07-01 false Calculation and use of vehicle-specific 5-cycle-based fuel economy values for vehicle configurations. 600.207-08 Section 600.207-08 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) ENERGY POLICY FUEL ECONOMY AND CARBON-RELATED EXHAUST EMISSIONS OF MOTOR VEHICLES Fue...

  1. An Algorithm for the Calculation of Exact Term Discrimination Values.

    ERIC Educational Resources Information Center

    Willett, Peter

    1985-01-01

    Reports algorithm for calculation of term discrimination values that is sufficiently fast in operation to permit use of exact values. Evidence is presented to show that relationship between term discrimination and term frequency is crucially dependent upon type of inter-document similarity measure used for calculation of discrimination values. (13…
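The quantity behind this entry can be made concrete: a term's discrimination value is the change in average inter-document similarity when the term is removed, so good discriminators increase document separation. Below is a brute-force sketch of that definition (the paper's contribution is a faster exact algorithm, which is not reproduced here); the tiny term-frequency corpus is invented.

```python
# Brute-force term discrimination values over a toy corpus (sketch).
import math
from itertools import combinations

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def avg_similarity(docs):
    """Average pairwise cosine similarity (the collection 'density')."""
    pairs = list(combinations(docs, 2))
    return sum(cosine(a, b) for a, b in pairs) / len(pairs)

def discrimination_value(docs, k):
    """DV_k = (density with term k removed) - (density with it present).
    Positive DV: removing k makes documents more alike, so k discriminates."""
    without_k = [[w for i, w in enumerate(d) if i != k] for d in docs]
    return avg_similarity(without_k) - avg_similarity(docs)

docs = [[2, 0, 1],   # rows: documents, columns: term frequencies
        [0, 3, 1],
        [1, 1, 1]]
dvs = [discrimination_value(docs, k) for k in range(3)]
print(dvs)  # term 2, present everywhere, should discriminate least
```

The abstract's point, that the result depends on the similarity measure, can be checked by swapping `cosine` for another measure.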

  2. Influence of dose calculation algorithms on the predicted dose distribution and NTCP values for NSCLC patients.

    PubMed

    Nielsen, Tine B; Wieslander, Elinore; Fogliata, Antonella; Nielsen, Morten; Hansen, Olfred; Brink, Carsten

    2011-05-01

    ... used for plan evaluation. The NTCP values for heart complication are, in this study, not very sensitive to the choice of algorithm. Dose calculations based on density corrections result in quite different NTCP values than calculations without density corrections. It is therefore important when working with NTCP planning to use NTCP parameter values based on calculations and treatments similar to those for which the NTCP is of interest.

  3. Calculation for simulation of archery goal value using a web camera and ultrasonic sensor

    NASA Astrophysics Data System (ADS)

    Rusjdi, Darma; Abdurrasyid, Wulandari, Dewi Arianti

    2017-08-01

    An indoor digital archery simulator based on embedded systems was developed as a solution to the limited availability of adequate fields or open space, especially in big cities. Developing the device requires a simulation that calculates the score achieved on the target, based on a parabolic-motion model parameterized by the arrow's initial velocity and its direction of motion toward the target. The simulator device is complemented by an initial-velocity measuring device using ultrasonic sensors and a target-direction measuring device using a digital camera. The methodology follows research and development of application software using a modeling and simulation approach. The research objective is to create a simulation application that calculates the score achieved by the arrows, as a preliminary stage in the development of the archery simulator device. Implementing the score calculation in an application program yields an archery simulation game that can serve as a reference for developing an indoor digital archery simulator with embedded systems, ultrasonic sensors, and web cameras. The application compares the simulated impact point against the outer radius of the circle captured by a camera at a distance of three meters.
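The parabolic-motion scoring step can be sketched as below. This is a drag-free projectile model with invented launch values, not the paper's implementation; the 10-ring scoring with a ~6.1 cm ring width (a 122 cm target face) is an assumption for illustration.

```python
# Drag-free parabolic arrow flight and ring scoring (illustrative sketch).
import math

G = 9.81  # m/s^2

def height_at_range(v0, angle_deg, x):
    """Arrow height (m) relative to launch height after travelling x metres
    horizontally, for launch speed v0 (m/s) at angle_deg above horizontal."""
    theta = math.radians(angle_deg)
    t = x / (v0 * math.cos(theta))            # time to reach the target plane
    return v0 * math.sin(theta) * t - 0.5 * G * t * t

def score_from_radius(r_m, ring_width_m=0.061):
    """Map radial miss distance to a 10-ring score (assumed ring width)."""
    return max(0, 10 - int(r_m // ring_width_m))

# Bullseye at launch height, target 3 m away (the camera distance above).
h = height_at_range(v0=20.0, angle_deg=0.5, x=3.0)
s = score_from_radius(abs(h))
print(s)
```

In the real device, `v0` would come from the ultrasonic sensor and the horizontal miss component from the web camera image.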

  4. 40 CFR 600.208-12 - Calculation of FTP-based and HFET-based fuel economy and carbon-related exhaust emission values...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... emission data from tests conducted on these vehicle configuration(s) at high altitude to calculate the fuel... values from the tests performed using alcohol or natural gas test fuel. (b) For each model type, as..., highway, and combined fuel economy and carbon-related exhaust emission values from the tests performed...

  5. 40 CFR 600.207-08 - Calculation and use of vehicle-specific 5-cycle-based fuel economy values for vehicle...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    .... (i) Calculate the 5-cycle city and highway fuel economy values from the tests performed using gasoline or diesel test fuel. (ii)(A) Calculate the 5-cycle city and highway fuel economy values from the tests performed using alcohol or natural gas test fuel, if 5-cycle testing has been performed. Otherwise...

  6. 19 CFR 351.405 - Calculation of normal value based on constructed value.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... value. 351.405 Section 351.405 Customs Duties INTERNATIONAL TRADE ADMINISTRATION, DEPARTMENT OF COMMERCE... constructed value as the basis for normal value where: neither the home market nor a third country market is... a fictitious market are disregarded; no contemporaneous sales of comparable merchandise are...

  7. 31 CFR Appendix A to Part 359 - Redemption Value Calculations

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 31 Money and Finance:Treasury 2 2012-07-01 2012-07-01 false Redemption Value Calculations A... of a State, except for estate or inheritance taxes. (See 31 U.S.C. 3124.) 2. What is an example of a book-entry Series I savings bonds redemption value calculation? Assume a New Treasury Direct par...

  8. 31 CFR Appendix A to Part 359 - Redemption Value Calculations

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 31 Money and Finance:Treasury 2 2011-07-01 2011-07-01 false Redemption Value Calculations A... of a State, except for estate or inheritance taxes. (See 31 U.S.C. 3124.) 2. What is an example of a book-entry Series I savings bonds redemption value calculation? Assume a New Treasury Direct par...

  9. 31 CFR Appendix A to Part 359 - Redemption Value Calculations

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 31 Money and Finance:Treasury 2 2013-07-01 2013-07-01 false Redemption Value Calculations A... of a State, except for estate or inheritance taxes. (See 31 U.S.C. 3124.) 2. What is an example of a book-entry Series I savings bonds redemption value calculation? Assume a New Treasury Direct par...

  10. 31 CFR Appendix A to Part 359 - Redemption Value Calculations

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 31 Money and Finance: Treasury 2 2014-07-01 2014-07-01 false Redemption Value Calculations A... of a State, except for estate or inheritance taxes. (See 31 U.S.C. 3124.) 2. What is an example of a book-entry Series I savings bonds redemption value calculation? Assume a New Treasury Direct par...

  11. 12 CFR 997.4 - Calculation of the quarterly present-value determination.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 7 2010-01-01 2010-01-01 false Calculation of the quarterly present-value determination. 997.4 Section 997.4 Banks and Banking FEDERAL HOUSING FINANCE BOARD NON-BANK SYSTEM ENTITIES RESOLUTION FUNDING CORPORATION OBLIGATIONS OF THE BANKS § 997.4 Calculation of the quarterly present-value...

  12. Sensitivity of NTCP parameter values against a change of dose calculation algorithm.

    PubMed

    Brink, Carsten; Berg, Martin; Nielsen, Morten

    2007-09-01

    Optimization of radiation treatment planning requires estimations of the normal tissue complication probability (NTCP). A number of models exist that estimate NTCP from a calculated dose distribution. Since different dose calculation algorithms use different approximations the dose distributions predicted for a given treatment will in general depend on the algorithm. The purpose of this work is to test whether the optimal NTCP parameter values change significantly when the dose calculation algorithm is changed. The treatment plans for 17 breast cancer patients have retrospectively been recalculated with a collapsed cone algorithm (CC) to compare the NTCP estimates for radiation pneumonitis with those obtained from the clinically used pencil beam algorithm (PB). For the PB calculations the NTCP parameters were taken from previously published values for three different models. For the CC calculations the parameters were fitted to give the same NTCP as for the PB calculations. This paper demonstrates that significant shifts of the NTCP parameter values are observed for three models, comparable in magnitude to the uncertainties of the published parameter values. Thus, it is important to quote the applied dose calculation algorithm when reporting estimates of NTCP parameters in order to ensure correct use of the models.
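The sensitivity being tested above can be illustrated with the Lyman-Kutcher-Burman model, one widely used NTCP formulation (the abstract does not name its three models, and the parameter values below are illustrative, not the published radiation-pneumonitis fits).

```python
# LKB NTCP model: a small dose shift between algorithms changes NTCP (sketch).
import math

def lkb_ntcp(geud, td50, m):
    """LKB: NTCP = Phi((gEUD - TD50) / (m * TD50)), Phi = std normal CDF."""
    t = (geud - td50) / (m * td50)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

# The same plan evaluated with two algorithms yields slightly different
# effective doses; refitting TD50 and m can absorb such a shift.
ntcp_pb = lkb_ntcp(geud=18.0, td50=30.0, m=0.35)  # pencil-beam-like dose
ntcp_cc = lkb_ntcp(geud=20.0, td50=30.0, m=0.35)  # collapsed-cone-like dose
print(f"{ntcp_pb:.3f} -> {ntcp_cc:.3f}")
```

Refitting (TD50, m) on the collapsed-cone doses so that `ntcp_cc` matches `ntcp_pb` is exactly the parameter shift the paper quantifies.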

  13. 25 CFR 39.206 - How does OIEP calculate the value of one WSU?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 25 Indians 1 2011-04-01 2011-04-01 false How does OIEP calculate the value of one WSU? 39.206 Section 39.206 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR EDUCATION THE INDIAN SCHOOL... calculate the value of one WSU? (a) To calculate the appropriated dollar value of one WSU, OIEP divides the...

  14. 25 CFR 39.206 - How does OIEP calculate the value of one WSU?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 25 Indians 1 2010-04-01 2010-04-01 false How does OIEP calculate the value of one WSU? 39.206 Section 39.206 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR EDUCATION THE INDIAN SCHOOL... calculate the value of one WSU? (a) To calculate the appropriated dollar value of one WSU, OIEP divides the...

  15. 19 CFR 351.403 - Sales used in calculating normal value; transactions between affiliated parties.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 19 Customs Duties 3 2010-04-01 2010-04-01 false Sales used in calculating normal value... ADMINISTRATION, DEPARTMENT OF COMMERCE ANTIDUMPING AND COUNTERVAILING DUTIES Calculation of Export Price, Constructed Export Price, Fair Value, and Normal Value § 351.403 Sales used in calculating normal value...

  16. Sensitivity of NTCP parameter values against a change of dose calculation algorithm

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brink, Carsten; Berg, Martin; Nielsen, Morten

    2007-09-15

    Optimization of radiation treatment planning requires estimations of the normal tissue complication probability (NTCP). A number of models exist that estimate NTCP from a calculated dose distribution. Since different dose calculation algorithms use different approximations the dose distributions predicted for a given treatment will in general depend on the algorithm. The purpose of this work is to test whether the optimal NTCP parameter values change significantly when the dose calculation algorithm is changed. The treatment plans for 17 breast cancer patients have retrospectively been recalculated with a collapsed cone algorithm (CC) to compare the NTCP estimates for radiation pneumonitis with those obtained from the clinically used pencil beam algorithm (PB). For the PB calculations the NTCP parameters were taken from previously published values for three different models. For the CC calculations the parameters were fitted to give the same NTCP as for the PB calculations. This paper demonstrates that significant shifts of the NTCP parameter values are observed for three models, comparable in magnitude to the uncertainties of the published parameter values. Thus, it is important to quote the applied dose calculation algorithm when reporting estimates of NTCP parameters in order to ensure correct use of the models.

  17. Correct fair market value calculation needed to avoid regulatory challenges.

    PubMed

    Dietrich, M O

    1997-09-01

    In valuing a physician practice for acquisition, it is important for buyers and sellers to distinguish between fair market value and strategic value. Although many buyers would willingly pay for the strategic value of a practice, tax-exempt buyers are required by law to consider only the fair market value in setting a bid price. Valuators must adjust group earnings to exclude items that do not apply to any willing seller and include items that do apply to any willing seller to arrive at the fair market value of the practice. In addition, the weighted average cost of capital (WACC), which becomes the discount rate in the valuation model, is critical to the measure of value of the practice. Small medical practices are assumed to have few hard assets and little long-term debt, and the WACC is calculated on the basis of those assumptions. When a small practice has considerable debt, however, this calculated WACC may be inappropriate for valuing the practice. In every case, evidence that shows that a transaction has been negotiated "at arm's length" should stave off any regulatory challenge.

  18. S-values calculated from a tomographic head/brain model for brain imaging

    NASA Astrophysics Data System (ADS)

    Chao, Tsi-chian; Xu, X. George

    2004-11-01

    A tomographic head/brain model was developed from the Visible Human images and used to calculate S-values for brain imaging procedures. This model contains 15 segmented sub-regions including caudate nucleus, cerebellum, cerebral cortex, cerebral white matter, corpus callosum, eyes, lateral ventricles, lenses, lentiform nucleus, optic chiasma, optic nerve, pons and middle cerebellar peduncle, skull CSF, thalamus and thyroid. S-values for C-11, O-15, F-18, Tc-99m and I-123 have been calculated using this model and a Monte Carlo code, EGS4. Comparison of the calculated S-values with those calculated from the MIRD (1999) stylized head/brain model shows significant differences. In many cases, the stylized head/brain model resulted in smaller S-values (as much as 88%), suggesting that the doses to a specific patient similar to the Visible Man could have been underestimated using the existing clinical dosimetry.
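Once the Monte Carlo absorbed fractions are in hand, assembling an S-value follows the MIRD formalism: S is the sum over emissions of mean energy per decay times absorbed fraction, divided by target mass. The numbers below are invented for illustration, not results from this paper.

```python
# MIRD-style S-value assembly from absorbed fractions (illustrative sketch).

def s_value(emissions, m_target_kg):
    """emissions: list of (delta, phi) pairs, where delta is the mean energy
    emitted per decay (J) and phi the absorbed fraction in the target.
    Returns S in Gy per decay (J/kg per decay)."""
    return sum(delta * phi for delta, phi in emissions) / m_target_kg

# Hypothetical two-emission radionuclide and a 20 g target sub-region.
emissions = [(1.0e-13, 0.60),   # charged-particle emission, mostly absorbed
             (2.0e-14, 0.05)]   # photon emission, mostly escapes the region
s = s_value(emissions, 0.020)
print(s)  # Gy per decay
```

The paper's point is that `phi` (and hence S) depends strongly on whether the anatomy comes from a stylized or a tomographic model.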

  19. Critical analysis of fragment-orbital DFT schemes for the calculation of electronic coupling values

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schober, Christoph; Reuter, Karsten; Oberhofer, Harald, E-mail: harald.oberhofer@ch.tum.de

    2016-02-07

    We present a critical analysis of the popular fragment-orbital density-functional theory (FO-DFT) scheme for the calculation of electronic coupling values. We discuss the characteristics of different possible formulations or “flavors” of the scheme which differ by the number of electrons in the calculation of the fragments and the construction of the Hamiltonian. In addition to two previously described variants based on neutral fragments, we present a third version taking a different route to the approximate diabatic state by explicitly considering charged fragments. In applying these FO-DFT flavors to the two molecular test sets HAB7 (electron transfer) and HAB11 (hole transfer), we find that our new scheme gives improved electronic couplings for HAB7 (−6.2% decrease in mean relative signed error) and greatly improved electronic couplings for HAB11 (−15.3% decrease in mean relative signed error). A systematic investigation of the influence of exact exchange on the electronic coupling values shows that the use of hybrid functionals in FO-DFT calculations improves the electronic couplings, giving values close to or even better than more sophisticated constrained DFT calculations. Comparing the accuracy and computational cost of each variant, we devise simple rules to choose the best possible flavor depending on the task. For accuracy, our new scheme with charged-fragment calculations performs best, while numerically more efficient at reasonable accuracy is the variant with neutral fragments.

  20. 40 CFR 600.209-85 - Calculation of fuel economy values for labeling.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 30 2011-07-01 2011-07-01 false Calculation of fuel economy values for... (CONTINUED) ENERGY POLICY FUEL ECONOMY AND CARBON-RELATED EXHAUST EMISSIONS OF MOTOR VEHICLES Procedures for Calculating Fuel Economy and Carbon-Related Exhaust Emission Values for 1977 and Later Model Year Automobiles...

  1. Intensity of emission lines of the quiescent solar corona: comparison between calculated and observed values

    NASA Astrophysics Data System (ADS)

    Krissinel, Boris

    2018-03-01

    The paper reports the results of calculations of the center-to-limb intensity of optically thin line emission in the EUV and FUV wavelength ranges. The calculations employ a multicomponent model of the quiescent solar corona that includes a collection of loops of various sizes, spicules, and free (inter-loop) matter. Theoretical intensity values are found from the probabilities of encountering parts of loops in the line of sight, weighted by the probability that other coronal components are absent. The model uses 12 loops with sizes from 3200 to 210000 km and different values of the rarefaction index and of the pressure at the loop base and apex; the temperature at the loop apices is 1 400 000 K. The calculations utilize the CHIANTI database. The comparison between theoretical and observed emission intensity values for coronal and transition-region lines obtained by the SUMER, CDS, and EIS telescopes shows quite satisfactory agreement, particularly at the solar disk center. For the data acquired above the limb, the larger discrepancies are attributed, after analysis, to errors in the EIS measurements.

  2. 31 CFR Appendix A to Part 359 - Redemption Value Calculations

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... book-entry Series I savings bonds redemption value calculation? Assume a New Treasury Direct par investment amount in a book-entry Series I savings bonds of $34.59, with an issue date of May, 2001, and a..., 2001 and redeemed December, 2001 = $101.96. Calculation: [(Book-entry par investment) ÷ (100)] × CRV...
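The redemption-value formula quoted in this excerpt, [(Book-entry par investment) ÷ 100] × CRV, can be sketched directly. A minimal illustration; rounding to the cent with half-up rounding is an assumption, not a rule stated in the excerpt.

```python
# Sketch of the 31 CFR part 359 App. A formula quoted above:
# book-entry value = [(par investment) / 100] * CRV per $100,
# rounded to the cent (half-up rounding assumed here).
from decimal import Decimal, ROUND_HALF_UP

def book_entry_redemption_value(par_investment, crv_per_100):
    value = (Decimal(par_investment) / Decimal("100")) * Decimal(crv_per_100)
    return value.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)

# Figures from the appendix example: $34.59 par, CRV of $101.96 per $100.
print(book_entry_redemption_value("34.59", "101.96"))  # → 35.27
```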

  3. Development of MATLAB Scripts for the Calculation of Thermal Manikin Regional Resistance Values

    DTIC Science & Technology

    2016-01-01

    CALCULATION OF THERMAL MANIKIN REGIONAL RESISTANCE VALUES DISCLAIMER The opinions or assertions contained herein are the private views of the...USARIEM TECHNICAL NOTE TN16-1 DEVELOPMENT OF MATLAB® SCRIPTS FOR THE CALCULATION OF THERMAL MANIKIN REGIONAL RESISTANCE VALUES...performed by thermal manikin and modeling personnel. Steps to operate the scripts as well as the underlying calculations are outlined in detail

  4. Metrix Matrix: A Cloud-Based System for Tracking Non-Relative Value Unit Value-Added Work Metrics.

    PubMed

    Kovacs, Mark D; Sheafor, Douglas H; Thacker, Paul G; Hardie, Andrew D; Costello, Philip

    2018-03-01

    In the era of value-based medicine, it will become increasingly important for radiologists to provide metrics that demonstrate their value beyond clinical productivity. In this article the authors describe their institution's development of an easy-to-use system for tracking value-added but non-relative value unit (RVU)-based activities. Metrix Matrix is an efficient cloud-based system for tracking value-added work. A password-protected home page contains links to web-based forms created using Google Forms, with collected data populating Google Sheets spreadsheets. Value-added work metrics selected for tracking included interdisciplinary conferences, hospital committee meetings, consulting on nonbilled outside studies, and practice-based quality improvement. Over a period of 4 months, value-added work data were collected for all clinical attending faculty members in a university-based radiology department (n = 39). Time required for data entry was analyzed for 2 faculty members over the same time period. Thirty-nine faculty members (equivalent to 36.4 full-time equivalents) reported a total of 1,223.5 hours of value-added work time (VAWT). A formula was used to calculate "value-added RVUs" (vRVUs) from VAWT. VAWT amounted to 5,793.6 vRVUs or 6.0% of total work performed (vRVUs plus work RVUs [wRVUs]). Were vRVUs considered equivalent to wRVUs for staffing purposes, this would require an additional 2.3 full-time equivalents, on the basis of average wRVU calculations. Mean data entry time was 56.1 seconds per day per faculty member. As health care reimbursement evolves with an emphasis on value-based medicine, it is imperative that radiologists demonstrate the value they add to patient care beyond wRVUs. This free and easy-to-use cloud-based system allows the efficient quantification of value-added work activities. Copyright © 2017 American College of Radiology. Published by Elsevier Inc. All rights reserved.
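The arithmetic reported in this abstract can be reproduced from its own totals. The paper's actual vRVU conversion formula is not given here, so the conversion rate below is merely the one implied by the reported figures.

```python
# Hedged sketch of the reported arithmetic: 1,223.5 hours of value-added
# work time (VAWT) amounted to 5,793.6 vRVUs, which was 6.0% of total work
# (vRVUs + wRVUs). The rate and wRVU total below are implied, not stated.

vawt_hours = 1223.5   # reported value-added work time (hours)
vrvus = 5793.6        # reported "value-added RVUs"

implied_rate = vrvus / vawt_hours   # vRVUs credited per VAWT hour
print(f"implied conversion: {implied_rate:.2f} vRVU/hour")

# vRVU / (vRVU + wRVU) = 0.06  =>  wRVU = vRVU * (1/0.06 - 1)
wrvus = vrvus * (1 / 0.06 - 1)
print(f"implied wRVU total: {wrvus:,.1f}")
```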

  5. Calculation of Expectation Values of Operators in the Complex Scaling Method

    DOE PAGES

    Papadimitriou, G.

    2016-06-14

    The complex scaling method (CSM) provides a way to obtain resonance parameters of particle-unstable states by rotating the coordinates and momenta of the original Hamiltonian. It is convenient to use an L² integrable basis to resolve the complex rotated or complex scaled Hamiltonian H θ , with θ being the angle of rotation in the complex energy plane. Within the CSM, resonance and scattering solutions have fall-off asymptotics. One consequence is that expectation values of operators in a resonance or scattering complex scaled solution are calculated by complex rotating the operators. In this work we explore applications of the CSM to calculations of expectation values of quantum mechanical operators by using the regularized back-rotation technique, which allows the expectation value to be computed with the unrotated operator. The test cases involve a schematic two-body Gaussian model as well as applications using realistic interactions.

  6. 19 CFR 351.408 - Calculation of normal value of merchandise from nonmarket economy countries.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 19 Customs Duties 3 2010-04-01 2010-04-01 false Calculation of normal value of merchandise from nonmarket economy countries. 351.408 Section 351.408 Customs Duties INTERNATIONAL TRADE ADMINISTRATION, DEPARTMENT OF COMMERCE ANTIDUMPING AND COUNTERVAILING DUTIES Calculation of Export Price, Constructed Export Price, Fair Value, and Normal Value §...

  7. Characterization of protein folding by a Φ-value calculation with a statistical-mechanical model.

    PubMed

    Wako, Hiroshi; Abe, Haruo

    2016-01-01

    The Φ-value analysis approach provides information about transition-state structures along the folding pathway of a protein by measuring the effects of an amino acid mutation on folding kinetics. Here we compared the theoretically calculated Φ values of 27 proteins with their experimentally observed Φ values; the theoretical values were calculated using a simple statistical-mechanical model of protein folding. The theoretically calculated Φ values reflected the corresponding experimentally observed Φ values with reasonable accuracy for many of the proteins, but not for all. The correlation between the theoretically calculated and experimentally observed Φ values strongly depends on whether the protein-folding mechanism assumed in the model holds true in real proteins. In other words, the correlation coefficient can be expected to illuminate the folding mechanisms of proteins, providing the answer to the question of which model more accurately describes protein folding: the framework model or the nucleation-condensation model. In addition, we tried to characterize protein folding with respect to various properties of each protein apart from the size and fold class, such as the free-energy profile, contact-order profile, and sensitivity to the parameters used in the Φ-value calculation. The results showed that any one of these properties alone was not enough to explain protein folding, although each one played a significant role in it. We have confirmed the importance of characterizing protein folding from various perspectives. Our findings have also highlighted that protein folding is highly variable and unique across different proteins, and this should be considered while pursuing a unified theory of protein folding.
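The per-protein comparison described in this abstract reduces to a correlation coefficient between calculated and observed Φ values. A minimal sketch; the Φ values below are invented, only the procedure mirrors the text.

```python
# Illustrative sketch: Pearson correlation between theoretically calculated
# and experimentally observed Φ values for one protein. Data are made up.

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    varx = sum((x - mx) ** 2 for x in xs)
    vary = sum((y - my) ** 2 for y in ys)
    return cov / (varx * vary) ** 0.5

phi_calc = [0.1, 0.4, 0.5, 0.8]    # hypothetical calculated Φ values
phi_obs = [0.2, 0.35, 0.6, 0.7]    # hypothetical observed Φ values
print(f"r = {pearson_r(phi_calc, phi_obs):.2f}")
```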

  8. Characterization of protein folding by a Φ-value calculation with a statistical-mechanical model

    PubMed Central

    Wako, Hiroshi; Abe, Haruo

    2016-01-01

    The Φ-value analysis approach provides information about transition-state structures along the folding pathway of a protein by measuring the effects of an amino acid mutation on folding kinetics. Here we compared the theoretically calculated Φ values of 27 proteins with their experimentally observed Φ values; the theoretical values were calculated using a simple statistical-mechanical model of protein folding. The theoretically calculated Φ values reflected the corresponding experimentally observed Φ values with reasonable accuracy for many of the proteins, but not for all. The correlation between the theoretically calculated and experimentally observed Φ values strongly depends on whether the protein-folding mechanism assumed in the model holds true in real proteins. In other words, the correlation coefficient can be expected to illuminate the folding mechanisms of proteins, providing the answer to the question of which model more accurately describes protein folding: the framework model or the nucleation-condensation model. In addition, we tried to characterize protein folding with respect to various properties of each protein apart from the size and fold class, such as the free-energy profile, contact-order profile, and sensitivity to the parameters used in the Φ-value calculation. The results showed that any one of these properties alone was not enough to explain protein folding, although each one played a significant role in it. We have confirmed the importance of characterizing protein folding from various perspectives. Our findings have also highlighted that protein folding is highly variable and unique across different proteins, and this should be considered while pursuing a unified theory of protein folding. PMID:28409079

  9. 31 CFR 359.55 - How are redemption values calculated for book-entry Series I savings bonds?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... for book-entry Series I savings bonds? 359.55 Section 359.55 Money and Finance: Treasury Regulations... DEBT OFFERING OF UNITED STATES SAVINGS BONDS, SERIES I Book-Entry Series I Savings Bonds § 359.55 How are redemption values calculated for book-entry Series I savings bonds? We base current redemption...

  10. 31 CFR 351.70 - How are redemption values calculated for book-entry Series EE savings bonds?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... for book-entry Series EE savings bonds? 351.70 Section 351.70 Money and Finance: Treasury Regulations... DEBT OFFERING OF UNITED STATES SAVINGS BONDS, SERIES EE Book-Entry Series EE Savings Bonds § 351.70 How are redemption values calculated for book-entry Series EE savings bonds? We base current redemption...

  11. 31 CFR 351.70 - How are redemption values calculated for book-entry Series EE savings bonds?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... for book-entry Series EE savings bonds? 351.70 Section 351.70 Money and Finance: Treasury Regulations... DEBT OFFERING OF UNITED STATES SAVINGS BONDS, SERIES EE Book-Entry Series EE Savings Bonds § 351.70 How are redemption values calculated for book-entry Series EE savings bonds? We base current redemption...

  12. 40 CFR 600.207-86 - Calculation of fuel economy values for a model type.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 30 2011-07-01 2011-07-01 false Calculation of fuel economy values for... AGENCY (CONTINUED) ENERGY POLICY FUEL ECONOMY AND CARBON-RELATED EXHAUST EMISSIONS OF MOTOR VEHICLES Procedures for Calculating Fuel Economy and Carbon-Related Exhaust Emission Values for 1977 and Later Model...

  13. 40 CFR 600.207-93 - Calculation of fuel economy values for a model type.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... economy data from tests conducted on these vehicle configuration(s) at high altitude to calculate the fuel... city, highway, and combined fuel economy values from the tests performed using gasoline or diesel test fuel. (ii) Calculate the city, highway, and combined fuel economy values from the tests performed using...

  14. Time-dependent importance sampling in semiclassical initial value representation calculations for time correlation functions.

    PubMed

    Tao, Guohua; Miller, William H

    2011-07-14

    An efficient time-dependent importance sampling method is developed for the Monte Carlo calculation of time correlation functions via the initial value representation (IVR) of semiclassical (SC) theory. A prefactor-free time-dependent sampling function weights the importance of a trajectory based on the magnitude of its contribution to the time correlation function, and global trial moves are used to facilitate efficient sampling of the phase space of initial conditions. The method can be applied generally to sample rare events efficiently while avoiding becoming trapped in a local region of phase space. Results presented in the paper for two system-bath models demonstrate the efficiency of this new importance sampling method for full SC-IVR calculations.
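The general importance-sampling idea behind the method can be illustrated in a generic setting. This is not the SC-IVR sampling function itself, just a sketch of reweighting draws from a proposal that concentrates where the integrand contributes most.

```python
# Generic importance-sampling illustration (not the paper's method):
# estimate E_p[f(x)] for p = N(0,1) and f(x) = e^x by drawing from a
# proposal q = N(2,1) shifted toward where f(x)*p(x) is large, then
# reweighting each sample by p(x)/q(x).
import math
import random

random.seed(1)

def estimate(n=100_000):
    total = 0.0
    for _ in range(n):
        x = random.gauss(2.0, 1.0)                               # sample from q
        p = math.exp(-x * x / 2) / math.sqrt(2 * math.pi)        # target density
        q = math.exp(-(x - 2) ** 2 / 2) / math.sqrt(2 * math.pi) # proposal density
        total += (p / q) * math.exp(x)
    return total / n

# Analytically, E_p[e^x] = e^{1/2} ≈ 1.6487 for p = N(0,1).
print(estimate())
```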

  15. Programmable calculator programs to solve softwood volume and value equations.

    Treesearch

    Janet K. Ayer Sachet

    1982-01-01

    This paper presents product value and product volume equations as programs for handheld calculators. These tree equations are for inland Douglas-fir, young-growth Douglas-fir, western white pine, ponderosa pine, and western larch. Operating instructions and an example are included.

  16. Assessing value-based health care delivery for haemodialysis.

    PubMed

    Parra, Eduardo; Arenas, María Dolores; Alonso, Manuel; Martínez, María Fernanda; Gamen, Ángel; Aguarón, Juan; Escobar, María Teresa; Moreno-Jiménez, José María; Alvarez-Ude, Fernando

    2017-06-01

    Disparities in haemodialysis outcomes among centres have been well documented, and previous attempts to assess haemodialysis results have been based on non-comprehensive methodologies. This study aimed to develop a comprehensive methodology for assessing haemodialysis centres based on the value of health care, defined as the patient benefit from a specific medical intervention per monetary unit invested (Value = Patient Benefit/Cost). This study assessed the value of health care and ranked different haemodialysis centres. A nephrology quality management group identified the criteria for the assessment. An expert group composed of stakeholders (patients, clinicians and managers) agreed on the weighting of each variable, considering values and preferences. Multi-criteria methodology was used to analyse the data. Four criteria and their weights were identified: evidence-based clinical performance measures = 43 points; yearly mortality = 27 points; patient satisfaction = 13 points; and health-related quality of life = 17 points (100-point scale). Evidence-based clinical performance measures included five sub-criteria, with respective weights: dialysis adequacy; haemoglobin concentration; mineral and bone disorders; type of vascular access; and hospitalization rate. The patient benefit was determined from co-morbidity-adjusted results and the corresponding weights. The cost of each centre was calculated as the average amount expended per patient per year. The study was conducted in five centres (1-5). After adjusting for co-morbidity, the value of health care was calculated and the centres were ranked. A multi-way sensitivity analysis that considered different weights (10-60% changes) and costs (changes of 10% in direct and 30% in allocated costs) showed that the methodology was robust. The rankings 4-5-3-2-1 and 4-3-5-2-1 were observed in 62.21% and 21.55% of simulations, respectively, when weights were varied by 60
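The weighted multi-criteria score described in this abstract can be sketched directly from the published weights. The weights (43/27/13/17 on a 100-point scale) come from the abstract; the centre scores and scoring scale below are invented for illustration.

```python
# Sketch of the value-of-care calculation: patient benefit is a weighted
# sum of the four criteria (weights from the abstract); value is benefit
# divided by cost per patient-year. Centre scores (0-1) are hypothetical.

WEIGHTS = {"clinical": 43, "mortality": 27, "satisfaction": 13, "hrqol": 17}

def patient_benefit(scores):
    """Weighted benefit on a 0-100 scale; each criterion scored 0-1."""
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

def value_of_care(scores, cost_per_patient_year):
    return patient_benefit(scores) / cost_per_patient_year

centre = {"clinical": 0.8, "mortality": 0.7, "satisfaction": 0.9, "hrqol": 0.6}
print(f"benefit = {patient_benefit(centre):.1f} / 100")
```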

  17. Development of a web-based CT dose calculator: WAZA-ARI.

    PubMed

    Ban, N; Takahashi, F; Sato, K; Endo, A; Ono, K; Hasegawa, T; Yoshitake, T; Katsunuma, Y; Kai, M

    2011-09-01

    A web-based computed tomography (CT) dose calculation system (WAZA-ARI) is being developed based on the modern techniques for the radiation transport simulation and for software implementation. Dose coefficients were calculated in a voxel-type Japanese adult male phantom (JM phantom), using the Particle and Heavy Ion Transport code System. In the Monte Carlo simulation, the phantom was irradiated with a 5-mm-thick, fan-shaped photon beam rotating in a plane normal to the body axis. The dose coefficients were integrated into the system, which runs as Java servlets within Apache Tomcat. Output of WAZA-ARI for GE LightSpeed 16 was compared with the dose values calculated similarly using MIRD and ICRP Adult Male phantoms. There are some differences due to the phantom configuration, demonstrating the significance of the dose calculation with appropriate phantoms. While the dose coefficients are currently available only for limited CT scanner models and scanning options, WAZA-ARI will be a useful tool in clinical practice when development is finalised.

  18. Using Calculators for Assessing Pupils' Conceptualization on Place-Value

    ERIC Educational Resources Information Center

    Papadopoulos, Ioannis

    2013-01-01

    In this paper a two-stage research study is described focused on problem solving relevant to place-value and on the use of the operations within the calculator environment. The findings show that in this specific environment and via appropriate tasks teachers are provided with a context to better understand what year 5 or 6 pupils know or do not…

  19. Calculation of the surface tension of liquid Ga-based alloys

    NASA Astrophysics Data System (ADS)

    Dogan, Ali; Arslan, Hüseyin

    2018-05-01

    As is known, Eyring and his collaborators applied structure theory to the properties of binary liquid mixtures. In this work, the Eyring model has been extended to calculate the surface tension of liquid Ga-Bi, Ga-Sn and Ga-In binary alloys. It was found that the addition of Sn, In and Bi to Ga leads to a significant decrease in the surface tension of the three Ga-based alloy systems, especially for Ga-Bi alloys. The calculated surface tension values of these alloys exhibit negative deviation from the corresponding ideal mixing isotherms. Moreover, a comparison between the calculated results and the corresponding literature data indicates good agreement.

  20. Proposed equations and reference values for calculating bone health in children and adolescent based on age and sex

    PubMed Central

    Gómez-Campos, Rossana; Andruske, Cynthia Lee; de Arruda, Miguel; Urra Albornoz, Camilo; Cossio-Bolaños, Marco

    2017-01-01

    Background: The Dual Energy X-Ray Absorptiometry (DXA) is the gold standard for measuring BMD and bone mineral content (BMC). In general, DXA is ideal for pediatric use. However, the development of specific standards for particular geographic regions limits its use and application in certain socio-cultural contexts. Additionally, anthropometry may be a low-cost and easy-to-use alternative method in epidemiological contexts. The goal of our study was to develop regression equations for predicting the bone health of children and adolescents based on anthropometric indicators and to propose reference values based on age and sex. Methods: 3020 students (1567 males and 1453 females) ranging in age from 4.0 to 18.9 years were studied from the Maule Region (Chile). Anthropometric variables evaluated included weight, standing height, sitting height, forearm length, and femur diameter. A total body scan (without the head) was conducted by means of Dual Energy X-Ray Absorptiometry, and bone mineral density (BMD) and bone mineral content (BMC) were determined. Calcium consumption was controlled for by recording intake for the three days prior to the evaluation. Body Mass Index (BMI) was calculated, and somatic maturation was determined by using the age at peak height velocity (APHV). Results: Four regression models were generated to calculate bone health: for males BMD (R2 = 0.79) and BMC (R2 = 0.84), and for females BMD (R2 = 0.76) and BMC (R2 = 0.83). Percentiles were developed by using the LMS method (p3, p5, p15, p25, p50, p75, p85, p95 and p97). Conclusions: Regression equations and reference curves were developed to assess the bone health of Chilean children and adolescents. These instruments help identify children with potential underlying problems in bone mineralization during growth and biological maturation. PMID:28759569

  1. Proposed equations and reference values for calculating bone health in children and adolescent based on age and sex.

    PubMed

    Gómez-Campos, Rossana; Andruske, Cynthia Lee; Arruda, Miguel de; Urra Albornoz, Camilo; Cossio-Bolaños, Marco

    2017-01-01

    The Dual Energy X-Ray Absorptiometry (DXA) is the gold standard for measuring BMD and bone mineral content (BMC). In general, DXA is ideal for pediatric use. However, the development of specific standards for particular geographic regions limits its use and application in certain socio-cultural contexts. Additionally, anthropometry may be a low-cost and easy-to-use alternative method in epidemiological contexts. The goal of our study was to develop regression equations for predicting the bone health of children and adolescents based on anthropometric indicators and to propose reference values based on age and sex. 3020 students (1567 males and 1453 females) ranging in age from 4.0 to 18.9 years were studied from the Maule Region (Chile). Anthropometric variables evaluated included weight, standing height, sitting height, forearm length, and femur diameter. A total body scan (without the head) was conducted by means of Dual Energy X-Ray Absorptiometry, and bone mineral density (BMD) and bone mineral content (BMC) were determined. Calcium consumption was controlled for by recording intake for the three days prior to the evaluation. Body Mass Index (BMI) was calculated, and somatic maturation was determined by using the age at peak height velocity (APHV). Four regression models were generated to calculate bone health: for males BMD (R2 = 0.79) and BMC (R2 = 0.84), and for females BMD (R2 = 0.76) and BMC (R2 = 0.83). Percentiles were developed by using the LMS method (p3, p5, p15, p25, p50, p75, p85, p95 and p97). Regression equations and reference curves were developed to assess the bone health of Chilean children and adolescents. These instruments help identify children with potential underlying problems in bone mineralization during growth and biological maturation.
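The kind of anthropometric regression described above can be sketched with ordinary least squares. The coefficients, variables, and data below are hypothetical; the paper's actual equations are not reproduced here.

```python
# Hypothetical sketch: fit BMD from weight, height and femur diameter by
# ordinary least squares on simulated data, then report R^2, mirroring
# the kind of model (and fit statistic) described in the abstract.
import numpy as np

rng = np.random.default_rng(0)
n = 50
weight = rng.uniform(20, 70, n)    # kg (simulated)
height = rng.uniform(110, 180, n)  # cm (simulated)
femur = rng.uniform(6, 10, n)      # cm (simulated)
bmd = 0.002 * weight + 0.003 * height + 0.02 * femur + rng.normal(0, 0.02, n)

X = np.column_stack([weight, height, femur, np.ones(n)])  # with intercept
coef, *_ = np.linalg.lstsq(X, bmd, rcond=None)
pred = X @ coef
r2 = 1 - np.sum((bmd - pred) ** 2) / np.sum((bmd - bmd.mean()) ** 2)
print(f"R^2 = {r2:.2f}")
```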

  2. Calculating p-values and their significances with the Energy Test for large datasets

    NASA Astrophysics Data System (ADS)

    Barter, W.; Burr, C.; Parkes, C.

    2018-04-01

    The energy test method is a multi-dimensional test of whether two samples are consistent with arising from the same underlying population, through the calculation of a single test statistic (called the T-value). The method has recently been used in particle physics to search for samples that differ due to CP violation. The generalised extreme value function has previously been used to describe the distribution of T-values under the null hypothesis that the two samples are drawn from the same underlying population. We show that, in a simple test case, the distribution is not sufficiently well described by the generalised extreme value function. We present a new method, where the distribution of T-values under the null hypothesis when comparing two large samples can be found by scaling the distribution found when comparing small samples drawn from the same population. This method can then be used to quickly calculate the p-values associated with the results of the test.
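The final p-value step described above can be illustrated generically: given a set of T-values obtained under the null hypothesis (however they are generated), the p-value of an observed T is the fraction of null T-values at least as large. The numbers below are invented.

```python
# Illustrative sketch of the p-value calculation from a null distribution
# of T-values (e.g. from sample permutations). The add-one correction is a
# common convention to avoid reporting p = 0; it is an assumption here.

def p_value(t_observed, t_null):
    exceed = sum(1 for t in t_null if t >= t_observed)
    return (exceed + 1) / (len(t_null) + 1)

t_null = [0.1, 0.3, 0.2, 0.5, 0.15, 0.25, 0.4, 0.35, 0.05, 0.45]
print(f"p = {p_value(0.42, t_null):.3f}")
```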

  3. [Gas Concentration Measurement Based on the Integral Value of Absorptance Spectrum].

    PubMed

    Liu, Hui-jun; Tao, Shao-hua; Yang, Bing-chu; Deng, Hong-gui

    2015-12-01

    The absorptance spectrum of a gas is the basis for the qualitative and quantitative analysis of the gas by the Lambert-Beer law. The integral value of the absorptance spectrum is an important parameter describing the characteristics of the gas absorption. Based on the measured absorptance spectrum of a gas, we collected the required data from the HITRAN database, chose one of the spectral lines, calculated the integral value of the absorptance spectrum in the frequency domain, and then substituted the integral value into the Lambert-Beer law to obtain the concentration of the detected gas. By calculating the integral value of the absorptance spectrum we can avoid the more complicated calculation of the spectral line function and a series of standard gases for calibration, so the gas concentration measurement is simpler and faster. We studied the changing trends of the integral values of the absorptance spectra versus temperature. Since a temperature variation causes a corresponding variation in pressure, we studied these trends in two cases: with pressure held constant as temperature changed, and with pressure varying with temperature. In both cases, the integral values of the absorptance spectra first increase, then decrease, and finally stabilize with increasing temperature, but the ranges of the specific changing trends differ between the two cases. In the experiments, we found that the relative errors of the integrated values of the absorptance spectrum were much higher than 1% and still increased with temperature when we considered only the change of temperature and completely ignored the effect of the temperature variation on pressure, whereas the relative errors were almost constant at about 1% when the temperature dependence of pressure was taken into account.
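The concentration step described above can be sketched in integral form of the Lambert-Beer law: integrate the measured absorbance over the line, then divide by line strength times path length. All numbers below are illustrative, and the Lorentzian line shape stands in for a measured spectrum.

```python
# Hedged sketch: n = A_int / (S * L), where A_int is the integrated
# absorbance over the line (cm^-1), S the HITRAN line strength
# (cm^-1 / (molecule cm^-2)) and L the path length (cm). With these units
# the result is a number density in molecules/cm^3. Data are synthetic.
import numpy as np

def concentration_from_integral(nu, absorbance, line_strength, path_length):
    # trapezoidal integration of the absorbance over wavenumber
    a_int = float(np.sum((absorbance[1:] + absorbance[:-1]) * np.diff(nu)) / 2)
    return a_int / (line_strength * path_length)

nu = np.linspace(-0.5, 0.5, 201)   # wavenumber offset from line center, cm^-1
gamma = 0.05                       # Lorentzian half-width, cm^-1
absorbance = 0.1 * (gamma / np.pi) / (nu**2 + gamma**2)   # synthetic line

n = concentration_from_integral(nu, absorbance, line_strength=1e-20, path_length=10.0)
print(f"n ≈ {n:.3e} molecules/cm^3")
```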

  4. Critical Values for Lawshe's Content Validity Ratio: Revisiting the Original Methods of Calculation

    ERIC Educational Resources Information Center

    Ayre, Colin; Scally, Andrew John

    2014-01-01

    The content validity ratio originally proposed by Lawshe is widely used to quantify content validity and yet methods used to calculate the original critical values were never reported. Methods for original calculation of critical values are suggested along with tables of exact binomial probabilities.

  5. Study of activity based costing implementation for palm oil production using value-added and non-value-added activity consideration in PT XYZ palm oil mill

    NASA Astrophysics Data System (ADS)

    Sembiring, M. T.; Wahyuni, D.; Sinaga, T. S.; Silaban, A.

    2018-02-01

    Cost allocation in the manufacturing industry, particularly in palm oil mills, is still widely practiced based on estimation, which leads to cost distortion. Besides, the processing time determined by the company is not in accordance with the actual processing time at each work station. Hence, the purpose of this study is to eliminate non-value-added activities so that processing time can be shortened and production cost reduced. The Activity Based Costing method is used in this research to calculate production cost with consideration of value-added and non-value-added activities. The results of this study are processing-time reductions of 35.75% at the Weighting Bridge Station, 29.77% at the Sorting Station, 5.05% at the Loading Ramp Station, and 0.79% at the Sterilizer Station. The Cost of Manufactured for Crude Palm Oil is IDR 5.236,81/kg calculated by the traditional method, IDR 4.583,37/kg calculated by the Activity Based Costing method before implementation of activity improvement, and IDR 4.581,71/kg after implementation. Meanwhile, the Cost of Manufactured for Palm Kernel is IDR 2.159,50/kg calculated by the traditional method, IDR 4.584,63/kg calculated by the Activity Based Costing method before implementation of activity improvement, and IDR 4.582,97/kg after implementation.
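The core mechanics of activity-based costing can be sketched generically: overhead is pooled per activity and assigned to products via activity drivers rather than a single volume base. The stations, cost pools, rates, and driver quantities below are invented, not the study's figures.

```python
# Simplified ABC illustration: each activity has a cost pool and a total
# driver quantity; a product is charged (pool / total drivers) per unit of
# driver it consumes. All numbers are hypothetical.

activities = {   # activity -> (cost pool in IDR, total driver units)
    "weighting_bridge": (50_000_000, 2_000),   # driver: weighings
    "sorting":          (30_000_000, 1_500),   # driver: sorting hours
    "sterilizer":       (90_000_000, 3_000),   # driver: sterilizer cycles
}

def abc_cost(consumption):
    """Overhead allocated to a product from its activity-driver usage."""
    total = 0.0
    for name, units in consumption.items():
        pool, driver_total = activities[name]
        total += units * (pool / driver_total)   # activity rate x usage
    return total

cpo = {"weighting_bridge": 1_200, "sorting": 900, "sterilizer": 2_100}
print(f"allocated overhead: IDR {abc_cost(cpo):,.0f}")
```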

  6. Calculation of day and night emittance values

    NASA Technical Reports Server (NTRS)

    Kahle, Anne B.

    1986-01-01

    In July 1983, the Thermal Infrared Multispectral Scanner (TIMS) was flown over Death Valley, California, on both a midday and a predawn flight within a two-day period. The availability of calibrated digital data permitted the calculation of day and night surface temperature and surface spectral emittance. Image processing of the data included panorama correction and calibration to radiance using the on-board black bodies and the measured spectral response of each channel. Scene-dependent isolated-point noise, due to bit drops, was located by its relatively discontinuous values and replaced by the average of the surrounding data values. A method was developed to separate the spectral and temperature information contained in the TIMS data, and night and day data sets were processed. The TIMS is unique in allowing collection of both spectral emittance and thermal information in digital format with the same airborne scanner. For the first time it was possible to produce coregistered day and night emittance images of the same area. These data add to an understanding of the physical basis for the discrimination of differences in surface materials afforded by TIMS.

  7. A relative-value-based system for calculating faculty productivity in teaching, research, administration, and patient care.

    PubMed

    Hilton, C; Fisher, W; Lopez, A; Sanders, C

    1997-09-01

    To design and test a simple, easily modifiable system for calculating faculty productivity in teaching, research, administration, and patient care in which all areas of endeavor would be recognized and high productivity in one area would produce results similar to high productivity in another at the Louisiana State University School of Medicine in New Orleans. A relative-value and time-based system was designed in 1996 so that similar efforts in the four areas would produce similar scores, and a profile reflecting the authors' estimates of high productivity ("super faculty") was developed for each area. The activity profiles of 17 faculty members were used to test the system. "Super-faculty" scores in all areas were similar. The faculty members' mean scores were higher for teaching and research than for administration and patient care, and all four mean scores were substantially lower than the respective totals for the "super faculty". In each category the scores of those faculty members who scored above the mean in that category were used to calculate new mean scores. The mean scores for these faculty members were similar to those for the "super faculty" in teaching and research but were substantially lower for administration and patient care. When the mean total score of the eight faculty members predicted to have total scores below the group mean was compared with the mean total score of the nine faculty members predicted to have total scores above the group mean, the difference was significant (p < .0001). For the former, every score in each category was below the mean, with the exception of one faculty member's score in one category. Of the latter, eight had higher scores in teaching and four had higher scores in teaching and research combined. This system provides a quantitative method for the equal recognition of faculty productivity in a number of areas, and it may be useful as a starting point for other academic units exploring similar issues.
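The relative-value idea described above can be sketched generically: each activity carries a relative value per unit of time, chosen so that similar effort in any of the four areas yields a similar score. The values and hours below are hypothetical, not the LSU system's weights.

```python
# Hedged sketch of a relative-value-based productivity score across the
# four areas named in the abstract. Equal per-hour values are assumed here
# purely to illustrate the "similar effort, similar score" design goal.

RELATIVE_VALUES = {   # points per hour; hypothetical
    "teaching": 2.0,
    "research": 2.0,
    "administration": 2.0,
    "patient_care": 2.0,
}

def productivity_scores(hours_by_area):
    return {area: RELATIVE_VALUES[area] * h for area, h in hours_by_area.items()}

faculty = {"teaching": 120, "research": 200, "administration": 40, "patient_care": 300}
scores = productivity_scores(faculty)
print(scores, "total =", sum(scores.values()))
```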

  8. 40 CFR 600.211-08 - Sample calculation of fuel economy values for labeling.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 30 2011-07-01 2011-07-01 false Sample calculation of fuel economy... AGENCY (CONTINUED) ENERGY POLICY FUEL ECONOMY AND CARBON-RELATED EXHAUST EMISSIONS OF MOTOR VEHICLES Procedures for Calculating Fuel Economy and Carbon-Related Exhaust Emission Values for 1977 and Later Model...

  9. 40 CFR 600.207-12 - Calculation and use of vehicle-specific 5-cycle-based fuel economy and CO2 emission values for...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... economy and CO2 emission values from the tests performed using gasoline or diesel test fuel. (ii) Calculate the 5-cycle city and highway fuel economy and CO2 emission values from the tests performed using alcohol or natural gas test fuel, if 5-cycle testing has been performed. Otherwise, the procedure in § 600...

  10. 40 CFR 600.207-12 - Calculation and use of vehicle-specific 5-cycle-based fuel economy and CO2 emission values for...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... economy and CO2 emission values from the tests performed using gasoline or diesel test fuel. (ii) Calculate the 5-cycle city and highway fuel economy and CO2 emission values from the tests performed using alcohol or natural gas test fuel, if 5-cycle testing has been performed. Otherwise, the procedure in § 600...

  11. 40 CFR 600.207-12 - Calculation and use of vehicle-specific 5-cycle-based fuel economy and CO2 emission values for...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... economy and CO2 emission values from the tests performed using gasoline or diesel test fuel. (ii) Calculate the 5-cycle city and highway fuel economy and CO2 emission values from the tests performed using alcohol or natural gas test fuel, if 5-cycle testing has been performed. Otherwise, the procedure in § 600...

  12. Reference Value Advisor: a new freeware set of macroinstructions to calculate reference intervals with Microsoft Excel.

    PubMed

    Geffré, Anne; Concordet, Didier; Braun, Jean-Pierre; Trumel, Catherine

    2011-03-01

    International recommendations for determination of reference intervals have been recently updated, especially for small reference sample groups, and use of the robust method and Box-Cox transformation is now recommended. Unfortunately, these methods are not included in most software programs used for data analysis by clinical laboratories. We have created a set of macroinstructions, named Reference Value Advisor, for use in Microsoft Excel to calculate reference limits applying different methods. For any series of data, Reference Value Advisor calculates reference limits (with 90% confidence intervals [CI]) using a nonparametric method when n≥40 and by parametric and robust methods from native and Box-Cox transformed values; tests normality of distributions using the Anderson-Darling test and outliers using Tukey and Dixon-Reed tests; displays the distribution of values in dot plots and histograms and constructs Q-Q plots for visual inspection of normality; and provides minimal guidelines in the form of comments based on international recommendations. The critical steps in determination of reference intervals are correct selection of as many reference individuals as possible and analysis of specimens in controlled preanalytical and analytical conditions. Computing tools cannot compensate for flaws in selection and size of the reference sample group and handling and analysis of samples. However, if those steps are performed properly, Reference Value Advisor, available as freeware at http://www.biostat.envt.fr/spip/spip.php?article63, permits rapid assessment and comparison of results calculated using different methods, including currently unavailable methods. This allows for selection of the most appropriate method, especially as the program provides the CI of limits. It should be useful in veterinary clinical pathology when only small reference sample groups are available. ©2011 American Society for Veterinary Clinical Pathology.
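The nonparametric branch the abstract describes (recommended when n ≥ 40) can be sketched as follows. This is an illustrative stand-in, not Reference Value Advisor's actual macro code; in particular, the bootstrap method for the confidence intervals of the limits is an assumption, and the function name and data are hypothetical.

```python
import numpy as np

def nonparametric_reference_interval(values, lower_pct=2.5, upper_pct=97.5,
                                     n_boot=2000, ci=0.90, seed=0):
    """Nonparametric reference interval (central 95%) with bootstrap
    confidence intervals for each reference limit. A simplified sketch of
    what tools like Reference Value Advisor compute for n >= 40."""
    x = np.sort(np.asarray(values, dtype=float))
    lo = np.percentile(x, lower_pct)
    hi = np.percentile(x, upper_pct)

    # Bootstrap resampling to estimate the CI of each reference limit.
    rng = np.random.default_rng(seed)
    boots = rng.choice(x, size=(n_boot, x.size), replace=True)
    lo_b = np.percentile(boots, lower_pct, axis=1)
    hi_b = np.percentile(boots, upper_pct, axis=1)
    a = (1 - ci) / 2 * 100
    return {
        "interval": (lo, hi),
        "lower_ci": tuple(np.percentile(lo_b, [a, 100 - a])),
        "upper_ci": tuple(np.percentile(hi_b, [a, 100 - a])),
    }

# Example with simulated analyte values (hypothetical data, n = 120)
rng = np.random.default_rng(42)
result = nonparametric_reference_interval(rng.normal(5.0, 0.6, size=120))
print(result["interval"], result["lower_ci"], result["upper_ci"])
```

As the abstract stresses, such computations cannot compensate for flaws in the selection of reference individuals or in preanalytical handling; the code only automates the final statistical step.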

  13. Pseudospectral calculation of helium wave functions, expectation values, and oscillator strength

    NASA Astrophysics Data System (ADS)

    Grabowski, Paul E.; Chernoff, David F.

    2011-10-01

We show that the pseudospectral method is a powerful tool for finding precise solutions of Schrödinger’s equation for two-electron atoms with general angular momentum. Realizing the method’s full promise for atomic calculations requires special handling of singularities due to two-particle Coulomb interactions. We give a prescription for choosing coordinates and subdomains whose efficacy we illustrate by solving several challenging problems. One test centers on the determination of the nonrelativistic electric dipole oscillator strength for the helium 1¹S→2¹P transition. The result achieved, 0.27616499(27), is comparable to the best in the literature. The formally equivalent length, velocity, and acceleration expressions for the oscillator strength all yield roughly the same accuracy. We also calculate a diverse set of helium ground-state expectation values, reaching near state-of-the-art accuracy without the necessity of implementing any special-purpose numerics. These successes imply that general matrix elements are directly and reliably calculable with pseudospectral methods. A striking result is that all the relevant quantities tested in this paper—energy eigenvalues, S-state expectation values and a bound-bound dipole transition between the lowest energy S and P states—converge exponentially with increasing resolution and at roughly the same rate. Each individual calculation samples and weights the configuration space wave function uniquely but all behave in a qualitatively similar manner. These results suggest that the method has great promise for similarly accurate treatment of few-particle systems.

  14. Decay hazard (Scheffer) index values calculated from 1971-2000 climate normal data

    Treesearch

    Charles G. Carll

    2009-01-01

    Climate index values for estimating decay hazard to wood exposed outdoors above ground (commonly known as Scheffer index values) were calculated for 280 locations in the United States (270 locations in the conterminous United States) using the most current climate normal data available from the National Climatic Data Center. These were data for the period 1971–2000. In...

  15. 30 CFR 206.105 - What records must I keep to support my calculations of value under this subpart?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... you to use a different value if it determines that the reported value is inconsistent with the... calculations of value under this subpart? 206.105 Section 206.105 Mineral Resources MINERALS MANAGEMENT SERVICE... must I keep to support my calculations of value under this subpart? If you determine the value of your...

  16. What Is the Value of Value-Based Purchasing?

    PubMed

    Tanenbaum, Sandra J

    2016-10-01

    Value-based purchasing (VBP) is a widely favored strategy for improving the US health care system. The meaning of value that predominates in VBP schemes is (1) conformance to selected process and/or outcome metrics, and sometimes (2) such conformance at the lowest possible cost. In other words, VBP schemes choose some number of "quality indicators" and financially incent providers to meet them (and not others). Process measures are usually based on clinical science that cannot determine the effects of a process on individual patients or patients with comorbidities, and do not necessarily measure effects that patients value; additionally, there is no provision for different patients valuing different things. Proximate outcome measures may or may not predict distal ones, and the more distal the outcome, the less reliably it can be attributed to health care. Outcome measures may be quite rudimentary, such as mortality rates, or highly contestable: survival or function after prostate surgery? When cost is an element of value-based purchasing, it is the cost to the value-based payer and not to other payers or patients' families. The greatest value of value-based purchasing may not be to patients or even payers, but to policy makers seeking a morally justifiable alternative to politically contested regulatory policies. Copyright © 2016 by Duke University Press.

  17. Regional potential evapotranspiration in arid climates based on temperature, topography and calculated solar radiation

    NASA Astrophysics Data System (ADS)

    Shevenell, Lisa

    1999-03-01

Values of evapotranspiration are required for a variety of water planning activities in arid and semi-arid climates, yet data requirements are often large, and it is costly to obtain this information. This work presents a method in which only a few readily available data (temperature, elevation) are required to estimate potential evapotranspiration (PET). A method using measured temperature and the calculated ratio of total to vertical radiation (after the work of Behnke and Maxey, 1969) to estimate monthly PET was applied for the months of April-October and compared with pan evaporation measurements. The test area used in this work was in Nevada, which has 124 weather stations that record sufficient amounts of temperature data. The calculated PET values were found to be well correlated (R²=0.940-0.983, slopes near 1.0) with mean monthly pan evaporation measurements at eight weather stations. In order to extrapolate these calculated PET values to areas without temperature measurements and to sites at differing elevations, the state was divided into five regions based on latitude, and linear regressions of PET versus elevation were calculated for each of these regions. These extrapolated PET values generally compare well with the pan evaporation measurements (R²=0.926-0.988, slopes near 1.0). The estimated values are generally somewhat lower than the pan measurements, in part because the effects of wind are not explicitly considered in the calculations, and near-freezing temperatures result in a calculated PET of zero at higher elevations in the spring months. The calculated PET values for April-October are 84-100% of the measured pan evaporation values. Using digital elevation models in a geographical information system, calculated values were adjusted for slope and aspect, and the data were used to construct a series of maps of monthly PET. The resultant maps show a realistic distribution of regional variations in PET throughout Nevada which inversely mimics
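The regional extrapolation step the abstract describes (a linear regression of PET on elevation for each latitude region, with PET clamped to zero under near-freezing conditions) can be sketched as follows. The station data here are hypothetical, not the Nevada measurements.

```python
import numpy as np

# Hypothetical station data for one latitude region: elevation (m) and
# calculated July PET (mm). PET generally decreases with elevation.
elev = np.array([900.0, 1200.0, 1500.0, 1800.0, 2100.0, 2400.0])
pet_july = np.array([210.0, 195.0, 178.0, 160.0, 145.0, 128.0])

# Least-squares linear regression of PET on elevation for this region.
slope, intercept = np.polyfit(elev, pet_july, 1)

def predict_pet(elevation_m):
    """Extrapolate monthly PET to a site without temperature measurements,
    clamping at zero (the near-freezing, high-elevation case)."""
    return max(0.0, slope * elevation_m + intercept)

print(round(predict_pet(1650.0), 1))  # prints 169.3
```

The fitted slope is negative, as expected for PET decreasing with elevation; a site far above the station network gets a clamped PET of zero, mirroring the behavior noted in the abstract for high elevations in spring.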

  18. 31 CFR 359.55 - How are redemption values calculated for book-entry Series I savings bonds?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 31 Money and Finance:Treasury 2 2013-07-01 2013-07-01 false How are redemption values calculated... prorated to the book-entry par investment amount for the corresponding issue and redemption dates... to $25.04; calculated value of $25.045 rounds to $25.05. [Book-entry par investment ÷ 100] × [CRV...

  19. 31 CFR 359.55 - How are redemption values calculated for book-entry Series I savings bonds?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 31 Money and Finance:Treasury 2 2012-07-01 2012-07-01 false How are redemption values calculated... prorated to the book-entry par investment amount for the corresponding issue and redemption dates... to $25.04; calculated value of $25.045 rounds to $25.05. [Book-entry par investment ÷ 100] × [CRV...

  20. 31 CFR 359.55 - How are redemption values calculated for book-entry Series I savings bonds?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 31 Money and Finance: Treasury 2 2014-07-01 2014-07-01 false How are redemption values calculated... prorated to the book-entry par investment amount for the corresponding issue and redemption dates... to $25.04; calculated value of $25.045 rounds to $25.05. [Book-entry par investment ÷ 100] × [CRV...
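The excerpted rule prorates the $100-denomination current redemption value (CRV) to the book-entry par investment amount and rounds to the cent as in the quoted examples ($25.044 rounds to $25.04; $25.045 rounds to $25.05, i.e. half-up rounding). A minimal sketch of that arithmetic, using a hypothetical CRV since the actual CRV tables for each issue and redemption date are not part of the excerpt:

```python
from decimal import Decimal, ROUND_HALF_UP

def book_entry_redemption_value(par_investment, crv_per_100):
    """Prorate the $100-denomination CRV to the book-entry par amount:
    [Book-entry par investment / 100] x [CRV], rounded half-up to the
    cent (25.044 -> 25.04, 25.045 -> 25.05)."""
    value = (Decimal(str(par_investment)) / Decimal("100")) * Decimal(str(crv_per_100))
    return value.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)

# Hypothetical CRV of $125.22 per $100 of par for some issue/redemption date
print(book_entry_redemption_value(75.50, "125.22"))  # a $75.50 par holding
```

Using `decimal` with `ROUND_HALF_UP` rather than binary floats is what makes the regulation's two boundary examples come out exactly as stated.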

  1. 40 CFR 600.206-12 - Calculation and use of FTP-based and HFET-based fuel economy, CO2 emissions, and carbon-related...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... values from the tests performed using gasoline or diesel test fuel. (ii) Calculate the city, highway, and combined fuel economy, CO2 emissions, and carbon-related exhaust emission values from the tests performed using alcohol or natural gas test fuel. (b) If only one equivalent petroleum-based fuel economy value...

  2. 40 CFR 600.206-12 - Calculation and use of FTP-based and HFET-based fuel economy, CO2 emissions, and carbon-related...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... values from the tests performed using gasoline or diesel test fuel. (ii) Calculate the city, highway, and combined fuel economy, CO2 emissions, and carbon-related exhaust emission values from the tests performed using alcohol or natural gas test fuel. (b) If only one equivalent petroleum-based fuel economy value...

  3. 40 CFR 600.206-12 - Calculation and use of FTP-based and HFET-based fuel economy, CO2 emissions, and carbon-related...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... values from the tests performed using gasoline or diesel test fuel. (ii) Calculate the city, highway, and combined fuel economy, CO2 emissions, and carbon-related exhaust emission values from the tests performed using alcohol or natural gas test fuel. (b) If only one equivalent petroleum-based fuel economy value...

  4. 19 CFR 351.408 - Calculation of normal value of merchandise from nonmarket economy countries.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... nonmarket economy countries. 351.408 Section 351.408 Customs Duties INTERNATIONAL TRADE ADMINISTRATION... economy countries. (a) Introduction. In identifying dumping from a nonmarket economy country, the Secretary normally will calculate normal value by valuing the nonmarket economy producers' factors of...

  5. 19 CFR 351.408 - Calculation of normal value of merchandise from nonmarket economy countries.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... nonmarket economy countries. 351.408 Section 351.408 Customs Duties INTERNATIONAL TRADE ADMINISTRATION... economy countries. (a) Introduction. In identifying dumping from a nonmarket economy country, the Secretary normally will calculate normal value by valuing the nonmarket economy producers' factors of...

  6. 19 CFR 351.408 - Calculation of normal value of merchandise from nonmarket economy countries.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... nonmarket economy countries. 351.408 Section 351.408 Customs Duties INTERNATIONAL TRADE ADMINISTRATION... economy countries. (a) Introduction. In identifying dumping from a nonmarket economy country, the Secretary normally will calculate normal value by valuing the nonmarket economy producers' factors of...

  7. 19 CFR 351.408 - Calculation of normal value of merchandise from nonmarket economy countries.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... nonmarket economy countries. 351.408 Section 351.408 Customs Duties INTERNATIONAL TRADE ADMINISTRATION... economy countries. (a) Introduction. In identifying dumping from a nonmarket economy country, the Secretary normally will calculate normal value by valuing the nonmarket economy producers' factors of...

  8. 19 CFR 351.406 - Calculation of normal value if sales are made at less than cost of production.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 19 Customs Duties 3 2010-04-01 2010-04-01 false Calculation of normal value if sales are made at less than cost of production. 351.406 Section 351.406 Customs Duties INTERNATIONAL TRADE ADMINISTRATION... Price, Fair Value, and Normal Value § 351.406 Calculation of normal value if sales are made at less than...

  9. An Approach for Calculating Student-Centered Value in Education - A Link between Quality, Efficiency, and the Learning Experience in the Health Professions.

    PubMed

    Nicklen, Peter; Rivers, George; Ooi, Caryn; Ilic, Dragan; Reeves, Scott; Walsh, Kieran; Maloney, Stephen

    2016-01-01

Health professional education is experiencing a cultural shift towards student-centered education. Although we are now challenging our traditional training methods, our methods for evaluating the impact of the training on the learner remain largely unchanged. What is not typically measured is student-centered value: whether it was 'worth' what the learner paid. The primary aim of this study was to apply a method of calculating student-centered value in the context of a change in teaching methods within a health professional program. This study took place over the first semester of the third year of the Bachelor of Physiotherapy at Monash University, Victoria, Australia, in 2014. The entire third-year cohort (n = 78) was invited to participate. A survey-based design was used to collect the appropriate data. A blended learning model was implemented; students were subsequently required to attend campus only three days per week, with the remaining two days comprising online learning. This was compared with the previous year's format, a campus-based face-to-face approach in which students attended campus five days per week. The primary outcome was value to student, which incorporates user costs associated with transportation and equipment, the amount of time saved, the price paid, and perceived gross benefit. Of the 78 students invited to participate, 76 completed the post-unit survey (non-participation rate 2.6%). Based on value to student, the blended learning approach provided a $1,314.93 net benefit to students. Another significant finding was that the perceived gross benefit for the blended learning approach was $4,014.84, compared with $3,651.72 for the campus-based face-to-face approach, indicating that students would pay more for the blended learning approach. This paper successfully applied a novel method of calculating student-centered value. This is the first step in validating the value to student outcome. Measuring economic value to the student may
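The abstract names the ingredients of "value to student" (user costs for transportation and equipment, time saved, price paid, perceived gross benefit) but not the exact formula, so the following is only an illustrative net-benefit sketch; the function, the monetization of time saved, and all figures are assumptions, not the paper's calculation.

```python
def value_to_student(gross_benefit, time_saved_hours, hourly_value,
                     price_paid, transport_costs, equipment_costs):
    """Illustrative net benefit: perceived gross benefit plus the
    monetized value of time saved, minus price paid and user costs."""
    return (gross_benefit + time_saved_hours * hourly_value
            - price_paid - transport_costs - equipment_costs)

# Hypothetical semester figures for one blended-learning student
net = value_to_student(gross_benefit=4014.84, time_saved_hours=40,
                       hourly_value=22.0, price_paid=3200.0,
                       transport_costs=450.0, equipment_costs=120.0)
print(round(net, 2))  # prints 1124.84
```

A positive result means the learner perceived more value than the combined monetary and time cost of the unit, which is the comparison the study's primary outcome is built on.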

  10. Value-based medicine: evidence-based medicine and beyond.

    PubMed

    Brown, Gary C; Brown, Melissa M; Sharma, Sanjay

    2003-09-01

Value-based medicine is the practice of medicine emphasizing the value received from an intervention. Value is measured by objectively quantifying: 1) the improvement in quality of life and/or 2) the improvement in length of life conferred by an intervention. Evidence-based medicine often measures the improvement gained in length of life, but generally ignores the importance of quality of life improvement or loss. Value-based medicine incorporates the best features of evidence-based medicine and takes evidence-based data to a higher level by incorporating the quality-of-life perceptions of patients with a disease concerning the value of an intervention. Inherent in value-based medicine are the costs associated with an intervention. The resources expended for the value gained in value-based medicine are measured with cost-utility analysis in terms of US dollars/QALY (money spent per quality-adjusted life-year gained). The current status and likely future of value-based medicine are reviewed herein.

  11. A generally applicable lightweight method for calculating a value structure for tools and services in bioinformatics infrastructure projects.

    PubMed

    Mayer, Gerhard; Quast, Christian; Felden, Janine; Lange, Matthias; Prinz, Manuel; Pühler, Alfred; Lawerenz, Chris; Scholz, Uwe; Glöckner, Frank Oliver; Müller, Wolfgang; Marcus, Katrin; Eisenacher, Martin

    2017-10-30

Sustainable noncommercial bioinformatics infrastructures are a prerequisite for exploiting the potential of big-data analysis for research and the economy. Consequently, funders, universities and institutes, as well as users, ask for a transparent value model for the tools and services offered. In this article, a generally applicable lightweight method is described by which bioinformatics infrastructure projects can estimate the value of tools and services offered without determining exactly the total cost of ownership. Five representative scenarios for value estimation, ranging from a rough estimate to a detailed breakdown of costs, are presented. To account for the diversity in bioinformatics applications and services, the notion of service-specific 'service provision units' is introduced, together with the factors influencing them and the main underlying assumptions for these 'value influencing factors'. Special attention is given to how to handle personnel costs and indirect costs such as electricity. Four examples are presented for the calculation of the value of tools and services provided by the German Network for Bioinformatics Infrastructure (de.NBI): one for tool usage, one for (Web-based) database analyses, one for consulting services and one for bioinformatics training events. Finally, from the discussed values, the costs of direct funding and the costs of payment of services by funded projects are calculated and compared. © The Author 2017. Published by Oxford University Press.

  12. 40 CFR 600.206-12 - Calculation and use of FTP-based and HFET-based fuel economy and carbon-related exhaust emission...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... exhaust emission values from the tests performed using gasoline or diesel test fuel. (ii) Calculate the city, highway, and combined fuel economy and carbon-related exhaust emission values from the tests performed using alcohol or natural gas test fuel. (b) If only one equivalent petroleum-based fuel economy...

  13. DFT calculation of pKa’s for dimethoxypyrimidinylsalicylic based herbicides

    NASA Astrophysics Data System (ADS)

    Delgado, Eduardo J.

    2009-03-01

Dimethoxypyrimidinylsalicylic-derived compounds show potent herbicidal activity as a result of the inhibition of acetohydroxyacid synthase, the first common enzyme in the biosynthetic pathway of the branched-chain amino acids (valine, leucine and isoleucine) in plants, bacteria and fungi. Despite its practical importance, this family of compounds has been poorly characterized from a physico-chemical point of view; for instance, their pKa's have not previously been reported, either experimentally or theoretically. In this study, the acid-dissociation constants of 39 dimethoxypyrimidinylsalicylic-derived herbicides are calculated by DFT methods at the B3LYP/6-31G(d,p) level of theory. The calculated values are validated by two checking tests based on the Hammett equation.
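A Hammett-equation consistency check of the kind the abstract mentions can be sketched as a linear free-energy fit, pKa = pKa₀ − ρσ, against substituent constants. The σ values and calculated pKa's below are illustrative, not the paper's data.

```python
import numpy as np

# Hypothetical Hammett substituent constants (sigma) and DFT-calculated
# pKa's for a substituted series -- illustrative values only.
sigma = np.array([-0.27, -0.17, 0.00, 0.23, 0.45, 0.66])
pka_calc = np.array([3.85, 3.74, 3.55, 3.30, 3.06, 2.83])

# Hammett relation: pKa = pKa0 - rho * sigma; fit by least squares.
slope, pka0 = np.polyfit(sigma, pka_calc, 1)
rho = -slope  # reaction constant (positive: EWGs lower the pKa)

# A high R^2 indicates the calculated pKa's obey the Hammett relation.
r_squared = np.corrcoef(sigma, pka_calc)[0, 1] ** 2
print(round(rho, 2), round(pka0, 2), round(r_squared, 3))
```

A strong linear correlation of the DFT pKa's with σ is what such a validation test would look for; scatter or curvature would flag inconsistent calculated values.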

  14. Pcetk: A pDynamo-based Toolkit for Protonation State Calculations in Proteins.

    PubMed

    Feliks, Mikolaj; Field, Martin J

    2015-10-26

Pcetk (a pDynamo-based continuum electrostatic toolkit) is an open-source, object-oriented toolkit for the calculation of proton binding energetics in proteins. The toolkit is a module of the pDynamo software library, combining the versatility of the Python scripting language and the efficiency of the compiled languages C and Cython. In the toolkit, we have connected pDynamo to the external Poisson-Boltzmann solver, extended-MEAD. Our goal was to provide a modern and extensible environment for the calculation of protonation states, electrostatic energies, titration curves, and other electrostatics-dependent properties of proteins. Pcetk is freely available under the CeCILL license, which is compatible with the GNU General Public License. The toolkit can be found on the Web at the address http://github.com/mfx9/pcetk. The calculation of protonation states in proteins requires knowledge of the pKa values of protonatable groups in aqueous solution. However, for some groups, such as protonatable ligands bound to a protein, the aqueous pKa (pKa(aq)) values are often difficult to obtain from experiment. As a complement to Pcetk, we revisit an earlier computational method for the estimation of pKa(aq) values that has an accuracy of ±0.5 pKa units or better. Finally, we verify the Pcetk module and the method for estimating pKa(aq) values with different model cases.

  15. Geometric constraints in semiclassical initial value representation calculations in Cartesian coordinates: accurate reduction in zero-point energy.

    PubMed

    Issack, Bilkiss B; Roy, Pierre-Nicholas

    2005-08-22

    An approach for the inclusion of geometric constraints in semiclassical initial value representation calculations is introduced. An important aspect of the approach is that Cartesian coordinates are used throughout. We devised an algorithm for the constrained sampling of initial conditions through the use of multivariate Gaussian distribution based on a projected Hessian. We also propose an approach for the constrained evaluation of the so-called Herman-Kluk prefactor in its exact log-derivative form. Sample calculations are performed for free and constrained rare-gas trimers. The results show that the proposed approach provides an accurate evaluation of the reduction in zero-point energy. Exact basis set calculations are used to assess the accuracy of the semiclassical results. Since Cartesian coordinates are used, the approach is general and applicable to a variety of molecular and atomic systems.

  16. An Approach for Calculating Student-Centered Value in Education – A Link between Quality, Efficiency, and the Learning Experience in the Health Professions

    PubMed Central

    Ooi, Caryn; Reeves, Scott; Walsh, Kieran

    2016-01-01

Health professional education is experiencing a cultural shift towards student-centered education. Although we are now challenging our traditional training methods, our methods for evaluating the impact of the training on the learner remain largely unchanged. What is not typically measured is student-centered value: whether it was ‘worth’ what the learner paid. The primary aim of this study was to apply a method of calculating student-centered value in the context of a change in teaching methods within a health professional program. This study took place over the first semester of the third year of the Bachelor of Physiotherapy at Monash University, Victoria, Australia, in 2014. The entire third-year cohort (n = 78) was invited to participate. A survey-based design was used to collect the appropriate data. A blended learning model was implemented; students were subsequently required to attend campus only three days per week, with the remaining two days comprising online learning. This was compared with the previous year’s format, a campus-based face-to-face approach in which students attended campus five days per week. The primary outcome was value to student, which incorporates user costs associated with transportation and equipment, the amount of time saved, the price paid, and perceived gross benefit. Of the 78 students invited to participate, 76 completed the post-unit survey (non-participation rate 2.6%). Based on value to student, the blended learning approach provided a $1,314.93 net benefit to students. Another significant finding was that the perceived gross benefit for the blended learning approach was $4,014.84, compared with $3,651.72 for the campus-based face-to-face approach, indicating that students would pay more for the blended learning approach. This paper successfully applied a novel method of calculating student-centered value. This is the first step in validating the value to student outcome. Measuring economic value to the

  17. 40 CFR 1066.610 - Mass-based and molar-based exhaust emission calculations.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 34 2012-07-01 2012-07-01 false Mass-based and molar-based exhaust... (CONTINUED) AIR POLLUTION CONTROLS VEHICLE-TESTING PROCEDURES Calculations § 1066.610 Mass-based and molar-based exhaust emission calculations. (a) Calculate your total mass of emissions over a test cycle as...

  18. 40 CFR 1066.610 - Mass-based and molar-based exhaust emission calculations.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 34 2013-07-01 2013-07-01 false Mass-based and molar-based exhaust... (CONTINUED) AIR POLLUTION CONTROLS VEHICLE-TESTING PROCEDURES Calculations § 1066.610 Mass-based and molar-based exhaust emission calculations. (a) Calculate your total mass of emissions over a test cycle as...

  19. 40 CFR 600.210-08 - Calculation of fuel economy values for labeling.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... electric vehicles, fuel cell vehicles, plug-in hybrid electric vehicles and vehicles equipped with hydrogen... 40 Protection of Environment 31 2012-07-01 2012-07-01 false Calculation of fuel economy values for... (CONTINUED) ENERGY POLICY FUEL ECONOMY AND GREENHOUSE GAS EXHAUST EMISSIONS OF MOTOR VEHICLES Procedures for...

  20. 40 CFR 600.210-08 - Calculation of fuel economy values for labeling.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... electric vehicles, fuel cell vehicles, plug-in hybrid electric vehicles and vehicles equipped with hydrogen... 40 Protection of Environment 31 2013-07-01 2013-07-01 false Calculation of fuel economy values for... (CONTINUED) ENERGY POLICY FUEL ECONOMY AND GREENHOUSE GAS EXHAUST EMISSIONS OF MOTOR VEHICLES Procedures for...

  1. 40 CFR 600.210-08 - Calculation of fuel economy values for labeling.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... including, but not limited to battery electric vehicles, fuel cell vehicles, plug-in hybrid electric... 40 Protection of Environment 30 2011-07-01 2011-07-01 false Calculation of fuel economy values for... (CONTINUED) ENERGY POLICY FUEL ECONOMY AND CARBON-RELATED EXHAUST EMISSIONS OF MOTOR VEHICLES Procedures for...

  2. Value-based genomics.

    PubMed

    Gong, Jun; Pan, Kathy; Fakih, Marwan; Pal, Sumanta; Salgia, Ravi

    2018-03-20

    Advancements in next-generation sequencing have greatly enhanced the development of biomarker-driven cancer therapies. The affordability and availability of next-generation sequencers have allowed for the commercialization of next-generation sequencing platforms that have found widespread use for clinical-decision making and research purposes. Despite the greater availability of tumor molecular profiling by next-generation sequencing at our doorsteps, the achievement of value-based care, or improving patient outcomes while reducing overall costs or risks, in the era of precision oncology remains a looming challenge. In this review, we highlight available data through a pre-established and conceptualized framework for evaluating value-based medicine to assess the cost (efficiency), clinical benefit (effectiveness), and toxicity (safety) of genomic profiling in cancer care. We also provide perspectives on future directions of next-generation sequencing from targeted panels to whole-exome or whole-genome sequencing and describe potential strategies needed to attain value-based genomics.

  3. Value-based genomics

    PubMed Central

    Gong, Jun; Pan, Kathy; Fakih, Marwan; Pal, Sumanta; Salgia, Ravi

    2018-01-01

    Advancements in next-generation sequencing have greatly enhanced the development of biomarker-driven cancer therapies. The affordability and availability of next-generation sequencers have allowed for the commercialization of next-generation sequencing platforms that have found widespread use for clinical-decision making and research purposes. Despite the greater availability of tumor molecular profiling by next-generation sequencing at our doorsteps, the achievement of value-based care, or improving patient outcomes while reducing overall costs or risks, in the era of precision oncology remains a looming challenge. In this review, we highlight available data through a pre-established and conceptualized framework for evaluating value-based medicine to assess the cost (efficiency), clinical benefit (effectiveness), and toxicity (safety) of genomic profiling in cancer care. We also provide perspectives on future directions of next-generation sequencing from targeted panels to whole-exome or whole-genome sequencing and describe potential strategies needed to attain value-based genomics. PMID:29644010

  4. Calculating a Continuous Metabolic Syndrome Score Using Nationally Representative Reference Values.

    PubMed

    Guseman, Emily Hill; Eisenmann, Joey C; Laurson, Kelly R; Cook, Stephen R; Stratbucker, William

    2018-02-26

The prevalence of metabolic syndrome in youth varies on the basis of the classification system used, prompting implementation of continuous scores; however, the use of these scores is limited to the sample from which they were derived. We sought to describe the derivation of the continuous metabolic syndrome score using nationally representative reference values in a sample of obese adolescents and a national sample obtained from the National Health and Nutrition Examination Survey (NHANES) 2011-2012. Clinical data were collected from 50 adolescents seeking obesity treatment at a stage 3 weight management center. A second analysis relied on data from adolescents included in NHANES 2011-2012, performed for illustrative purposes. The continuous metabolic syndrome score was calculated by regressing individual values onto nationally representative age- and sex-specific standards (NHANES III). The resultant z scores were summed to create a total score. The final sample included 42 obese adolescents (15 male and 35 female subjects; mean age, 14.8 ± 1.9 years) and an additional 445 participants from NHANES 2011-2012. Among the clinical sample, the mean continuous metabolic syndrome score was 4.16 ± 4.30, while the NHANES sample mean was markedly lower, at -0.24 ± 2.8. We provide a method to calculate the continuous metabolic syndrome score by comparing individual risk factor values to age- and sex-specific percentiles from a nationally representative sample. Copyright © 2018 Academic Pediatric Association. Published by Elsevier Inc. All rights reserved.
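The scoring procedure the abstract describes (standardize each component against age- and sex-specific reference values, then sum the z scores) can be sketched as follows. The reference means and SDs here are hypothetical placeholders, not NHANES III standards, and the HDL inversion is a common convention assumed for illustration.

```python
# Hypothetical age/sex-specific reference means and SDs (NOT NHANES III values)
REFERENCE = {
    "waist_cm":      (76.0, 10.5),
    "triglycerides": (90.0, 35.0),
    "hdl":           (52.0, 11.0),  # higher HDL is protective
    "systolic_bp":   (110.0, 9.0),
    "glucose":       (88.0, 7.5),
}

def continuous_mets_score(measurements):
    """Sum of component z scores against reference standards.
    The HDL z score is inverted so that higher HDL lowers the score."""
    total = 0.0
    for name, value in measurements.items():
        mean, sd = REFERENCE[name]
        z = (value - mean) / sd
        if name == "hdl":
            z = -z
        total += z
    return total

score = continuous_mets_score({
    "waist_cm": 97.0, "triglycerides": 160.0, "hdl": 38.0,
    "systolic_bp": 126.0, "glucose": 103.0,
})
print(round(score, 2))  # prints 9.05
```

Because each component is standardized before summing, a score near zero represents average risk relative to the reference population, which is consistent with the near-zero NHANES sample mean reported in the abstract.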

  5. The Principal Axis Approach to Value-Added Calculation

    ERIC Educational Resources Information Center

    He, Qingping; Tymms, Peter

    2014-01-01

    The assessment of the achievement of students and the quality of schools has drawn increasing attention from educational researchers, policy makers, and practitioners. Various test-based accountability and feedback systems involving the use of value-added techniques have been developed for evaluating the effectiveness of individual teaching…

  6. Overstating values: medical facts, diverse values, bioethics and values-based medicine.

    PubMed

    Parker, Malcolm

    2013-02-01

    Fulford has argued that (1) the medical concepts illness, disease and dysfunction are inescapably evaluative terms, (2) illness is conceptually prior to disease, and (3) a model conforming to (2) has greater explanatory power and practical utility than the conventional value-free medical model. This 'reverse' model employs Hare's distinction between description and evaluation, and the sliding relationship between descriptive and evaluative meaning. Fulford's derivative 'Values Based Medicine' (VBM) readjusts the imbalance between the predominance of facts over values in medicine. VBM allegedly responds to the increased choices made available by, inter alia, the progress of medical science itself. VBM attributes appropriate status to evaluative meaning, where strong consensus about descriptive meaning is lacking. According to Fulford, quasi-legal bioethics, while it can be retained as a kind of deliberative framework, is outcome-based and pursues 'the right answer', while VBM approximates a democratic, process-oriented method for dealing with diverse values, in partnership with necessary contributions from evidence-based medicine (EBM). I support the non-cognitivist underpinnings of VBM, and its emphasis on the importance of values in medicine. But VBM overstates the complexity and diversity of values, misrepresents EBM and VBM as responses to scientific and evaluative complexity, and mistakenly depicts 'quasi-legal bioethics' as a space of settled descriptive meaning. Bioethical reasoning can expose strategies that attempt to reduce authentic values to scientific facts, illustrating that VBM provides no advantage over bioethics in delineating the connections between facts and values in medicine. © 2011 Blackwell Publishing Ltd.

  7. 40 CFR 600.209-95 - Calculation of fuel economy values for labeling.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 29 2010-07-01 2010-07-01 false Calculation of fuel economy values for labeling. 600.209-95 Section 600.209-95 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) ENERGY POLICY FUEL ECONOMY AND CARBON-RELATED EXHAUST EMISSIONS OF MOTOR VEHICLES Fuel Economy Regulations for 1977 and Later Model Year...

  8. 40 CFR 600.209-85 - Calculation of fuel economy values for labeling.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 29 2010-07-01 2010-07-01 false Calculation of fuel economy values for labeling. 600.209-85 Section 600.209-85 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) ENERGY POLICY FUEL ECONOMY AND CARBON-RELATED EXHAUST EMISSIONS OF MOTOR VEHICLES Fuel Economy Regulations for 1977 and Later Model Year...

  9. 40 CFR 600.210-08 - Calculation of fuel economy values for labeling.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 29 2010-07-01 2010-07-01 false Calculation of fuel economy values for labeling. 600.210-08 Section 600.210-08 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) ENERGY POLICY FUEL ECONOMY AND CARBON-RELATED EXHAUST EMISSIONS OF MOTOR VEHICLES Fuel Economy Regulations for 1977 and Later Model Year...

  10. Value-based metrics and Internet-based enterprises

    NASA Astrophysics Data System (ADS)

    Gupta, Krishan M.

    2001-10-01

    Within the last few years, a host of value-based metrics like EVA, MVA, TBR, CFROI, and TSR have evolved. This paper attempts to analyze the validity and applicability of EVA and the Balanced Scorecard for Internet-based organizations. Despite the collapse of the dot-com model, firms engaged in e-commerce continue to struggle to find new ways to account for customer base, technology, employees, knowledge, etc., as part of the value of the firm. While some metrics, like the Balanced Scorecard, are geared towards internal use, others, like EVA, are for external use. Value-based metrics are used for performing internal audits as well as comparing firms against one another; they can also be effectively utilized by individuals outside the firm looking to determine whether the firm is creating value for its stakeholders.

  11. The HackensackUMC Value-Based Care Model: Building Essentials for Value-Based Purchasing.

    PubMed

    Douglas, Claudia; Aroh, Dianne; Colella, Joan; Quadri, Mohammed

    2016-01-01

    The Affordable Care Act, 2010, and the subsequent shift from a quantity-focus to a value-centric reimbursement model led our organization to create the HackensackUMC Value-Based Care Model to improve our process capability and performance to meet and sustain the triple aims of value-based purchasing: higher quality, lower cost, and consumer perception. This article describes the basics of our model and illustrates how we used it to reduce the costs of our patient sitter program.

  12. 40 CFR 600.206-12 - Calculation and use of FTP-based and HFET-based fuel economy and carbon-related exhaust emission...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 29 2010-07-01 2010-07-01 false Calculation and use of FTP-based and HFET-based fuel economy and carbon-related exhaust emission values for vehicle configurations. 600.206-12 Section 600.206-12 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) ENERGY POLICY FUEL ECONOMY AND CARBON-RELATED EXHAUST...

  13. Calculation and affection of pH value of different desulfurization and dehydration rates in the filling station based on Aspen Plus

    NASA Astrophysics Data System (ADS)

    Lv, J. X.; Wang, B. F.; Nie, L. H.; Xu, R. R.; Zhou, J. Y.; Hao, Y. J.

    2018-01-01

    The simulation process of the whole CNG filling station are established using Aspen Plus V7.2. The separator (Sep) was used to simulate the desulfurization and dehydration equipment in the gas station, and the flash module separator Flash 2 was used to simulate the gas storage well with proper temperature and environmental pressure. Furthermore, the sensitivity module was used to analyse the behaviour of the dehydration and desulfurization rate, and the residual pH value of the gas storage wells was between 2.2 and 3.3. The results indicated that the effect of water content on pH value is higher than that of hydrogen sulphide in the environment of gas storage wells, and the calculation process of the pH value is feasible. Additionally, the simulation process provides basic data for the subsequent anticorrosive mechanism and work of gas storage well and has great potential for practical applications.

  14. 25 CFR 179.102 - How does the Secretary calculate the value of a remainder and a life estate?

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 25 Indians 1 2014-04-01 2014-04-01 false How does the Secretary calculate the value of a remainder... How does the Secretary calculate the value of a remainder and a life estate? (a) If income is subject to division, the Secretary will use Actuarial Table S, Valuation of Annuities, found at 26 CFR 20...

  15. 25 CFR 179.102 - How does the Secretary calculate the value of a remainder and a life estate?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 25 Indians 1 2011-04-01 2011-04-01 false How does the Secretary calculate the value of a remainder... How does the Secretary calculate the value of a remainder and a life estate? (a) If income is subject to division, the Secretary will use Actuarial Table S, Valuation of Annuities, found at 26 CFR 20...

  16. 25 CFR 179.102 - How does the Secretary calculate the value of a remainder and a life estate?

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 25 Indians 1 2012-04-01 2011-04-01 true How does the Secretary calculate the value of a remainder... How does the Secretary calculate the value of a remainder and a life estate? (a) If income is subject to division, the Secretary will use Actuarial Table S, Valuation of Annuities, found at 26 CFR 20...

  17. 25 CFR 179.102 - How does the Secretary calculate the value of a remainder and a life estate?

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 25 Indians 1 2013-04-01 2013-04-01 false How does the Secretary calculate the value of a remainder... How does the Secretary calculate the value of a remainder and a life estate? (a) If income is subject to division, the Secretary will use Actuarial Table S, Valuation of Annuities, found at 26 CFR 20...

  18. 25 CFR 179.102 - How does the Secretary calculate the value of a remainder and a life estate?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 25 Indians 1 2010-04-01 2010-04-01 false How does the Secretary calculate the value of a remainder... How does the Secretary calculate the value of a remainder and a life estate? (a) If income is subject to division, the Secretary will use Actuarial Table S, Valuation of Annuities, found at 26 CFR 20...

  19. What Is Professional Development Worth? Calculating the Value of Onboarding Programs in Extension

    ERIC Educational Resources Information Center

    Harder, Amy; Hodges, Alan; Zelaya, Priscilla

    2017-01-01

    Return on investment (ROI) is a commonly used metric for organizations concerned with demonstrating the value of their investments; it can be used to determine whether funds spent providing professional development programs for Extension professionals are good investments. This article presents a method for calculating ROI for an onboarding…

  20. Online plasma calculator

    NASA Astrophysics Data System (ADS)

    Wisniewski, H.; Gourdain, P.-A.

    2017-10-01

    APOLLO is an online, Linux-based plasma calculator. Users can input variables that correspond to their specific plasma, such as ion and electron densities, temperatures, and external magnetic fields. The system is based on a webserver where a FastCGI protocol computes key plasma parameters including frequencies, lengths, velocities, and dimensionless numbers. FastCGI was chosen to overcome security problems caused by Java-based plugins. FastCGI also speeds up calculations over PHP-based systems. APOLLO is built upon the Wt library, which turns any web browser into a versatile, fast graphic user interface. All values with units are expressed in SI units except temperature, which is in electron-volts. SI units were chosen over cgs units because of the gradual shift to using SI units within the plasma community. APOLLO is intended to be a fast calculator that also provides the user with the proper equations used to calculate the plasma parameters. This system is intended to be used by undergraduates taking plasma courses as well as graduate students and researchers who need a quick reference calculation.
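    Two of the quantities such a calculator returns can be sketched directly from their SI-unit definitions (temperature in electron-volts, as the abstract specifies). These are the textbook formulas, not APOLLO's actual source code:

```python
import math

E = 1.602176634e-19      # elementary charge, C
EPS0 = 8.8541878128e-12  # vacuum permittivity, F/m
ME = 9.1093837015e-31    # electron mass, kg

def electron_plasma_frequency(n_e):
    """Angular electron plasma frequency in rad/s; n_e in m^-3 (SI)."""
    return math.sqrt(n_e * E**2 / (EPS0 * ME))

def debye_length(n_e, t_e_ev):
    """Electron Debye length in m; temperature given in electron-volts
    (multiplying by E converts eV to joules)."""
    return math.sqrt(EPS0 * t_e_ev * E / (n_e * E**2))
```

    For example, a plasma with n_e = 10^18 m^-3 and T_e = 1 eV gives a plasma frequency near 5.6 x 10^10 rad/s and a Debye length of a few micrometers.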

  1. Evaluation of steam sterilization processes: comparing calculations using temperature data and biointegrator reduction data and calculation of theoretical temperature difference.

    PubMed

    Lundahl, Gunnel

    2007-01-01

    When calculating the physical F(121.1 °C)-value by the equation F(121.1 °C) = t x 10^((T - 121.1)/z), the temperature (T), in combination with the z-value, influences the F(121.1 °C)-value exponentially. Because the z-value for spores of Geobacillus stearothermophilus often varies between 6 and 9, the biological F-value (F_Bio) will not always correspond to the F0-value based on temperature records from the sterilization process calculated with a z-value of 10, even if the calibration of both is correct. Consequently, errors in thermocouple calibration and differences in z-values influence the F(121.1 °C)-value logarithmically. The paper describes how results from measurements with different z-values can be compared. The first part describes the mathematics of a calculation program, which makes it easy to compare F0-values based on temperature records with F_Bio-values based on analysis of bioindicators such as glycerin-water-suspension sensors. For biological measurements, a suitable bioindicator with a high D(121)-value can be used (such a bioindicator can be manufactured as described in the article "A Method of Increasing Test Range and Accuracy of Bioindicators-Geobacillus stearothermophilus Spores"). With the mathematics and calculations described in this macro program it is possible to calculate, for every position, the theoretical temperature difference (ΔT_th) needed to explain the difference in results between the thermocouple and the biointegrator. Since the temperature difference is a linear function and constant over the whole process, this value is an indication of the magnitude of an error. A graph and table from these calculations give a picture of the run. The second part deals with product characteristics, the sterilization processes, and loading patterns. Appropriate safety margins have to be chosen in the development phase of a sterilization process to achieve acceptable safety limits. Case studies are
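    The lethality integral at the core of the abstract fits in a few lines of code. This is a generic sketch of the standard F-value calculation (each sample of the temperature log contributes dt x 10^((T - 121.1)/z) equivalent minutes), not the author's macro program:

```python
def f_value(temps_c, dt_min, z=10.0, t_ref=121.1):
    """Physical F-value (equivalent minutes at t_ref) of a sterilization
    run, from a temperature log sampled every dt_min minutes: each sample
    contributes dt * 10**((T - t_ref) / z)."""
    return sum(dt_min * 10.0 ** ((t - t_ref) / z) for t in temps_c)
```

    With z = 10 this is the conventional F0; re-evaluating the same log with z between 6 and 9 shows how the biological F-value can diverge from F0, which is exactly the comparison the paper formalizes.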

  2. Update on value-based medicine.

    PubMed

    Brown, Melissa M; Brown, Gary C

    2013-05-01

    To update concepts in Value-Based Medicine, especially in view of the Patient Protection and Affordable Care Act. The Patient Protection and Affordable Care Act assures that some variant of Value-Based Medicine cost-utility analysis will play a key role in the healthcare system. It identifies the highest quality care, thereby maximizing the most efficacious use of healthcare resources and empowering patients and physicians. Standardization is critical for the creation and acceptance of a Value-Based Medicine, cost-utility analysis, information system, since 27 million different input variants can go into a cost-utility analysis. Key among such standards is the use of patient preferences (utilities), as patients best understand the quality of life associated with their health states. The inclusion of societal costs, versus direct medical costs alone, demonstrates that medical interventions are more cost effective and, in many instances, provide a net financial return-on-investment to society referent to the direct medical costs expended. Value-Based Medicine provides a standardized methodology, integrating critical, patient, quality-of-life preferences, and societal costs, to allow the highest quality, most cost-effective care. Central to Value-Based Medicine is the concept that all patients deserve the interventions that provide the greatest patient value (improvement in quality of life and/or length of life).

  3. Accuracy of radiotherapy dose calculations based on cone-beam CT: comparison of deformable registration and image correction based methods

    NASA Astrophysics Data System (ADS)

    Marchant, T. E.; Joshi, K. D.; Moore, C. J.

    2018-03-01

    Radiotherapy dose calculations based on cone-beam CT (CBCT) images can be inaccurate due to unreliable Hounsfield units (HU) in the CBCT. Deformable image registration of planning CT images to CBCT, and direct correction of CBCT image values are two methods proposed to allow heterogeneity corrected dose calculations based on CBCT. In this paper we compare the accuracy and robustness of these two approaches. CBCT images for 44 patients were used including pelvis, lung and head & neck sites. CBCT HU were corrected using a ‘shading correction’ algorithm and via deformable registration of planning CT to CBCT using either Elastix or Niftyreg. Radiotherapy dose distributions were re-calculated with heterogeneity correction based on the corrected CBCT and several relevant dose metrics for target and OAR volumes were calculated. Accuracy of CBCT based dose metrics was determined using an ‘override ratio’ method where the ratio of the dose metric to that calculated on a bulk-density assigned version of the same image is assumed to be constant for each patient, allowing comparison to the patient’s planning CT as a gold standard. Similar performance is achieved by shading corrected CBCT and both deformable registration algorithms, with mean and standard deviation of dose metric error less than 1% for all sites studied. For lung images, use of deformed CT leads to slightly larger standard deviation of dose metric error than shading corrected CBCT with more dose metric errors greater than 2% observed (7% versus 1%).
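    The 'override ratio' accuracy measure described above reduces to a simple ratio of ratios. A minimal sketch, with illustrative argument names (the abstract does not give the authors' exact formula):

```python
def override_ratio_error(d_cbct, d_cbct_bulk, d_ct, d_ct_bulk):
    """Relative error of a CBCT-based dose metric under the override-ratio
    assumption: the ratio of a metric to the same metric computed on a
    bulk-density-assigned copy of the same image is constant per patient,
    so the planning CT serves as the gold standard."""
    return (d_cbct / d_cbct_bulk) / (d_ct / d_ct_bulk) - 1.0
```

    A value of zero means the corrected CBCT reproduces the planning-CT heterogeneity effect exactly; the paper reports this error staying under 1% on average for all sites studied.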

  4. Establishing values-based leadership and value systems in healthcare organizations.

    PubMed

    Graber, David R; Kilpatrick, Anne Osborne

    2008-01-01

    The importance of values in organizations is often discussed in management literature. Possessing strong or inspiring values is increasingly considered to be a key quality of successful leaders. Another common theme is that organizational values contribute to the culture and ultimate success of organizations. These conceptions or expectations are clearly applicable to healthcare organizations in the United States. However, healthcare organizations have unique structures and are subject to societal expectations that must be accommodated within an organizational values system. This article describes theoretical literature on organizational values. Cultural and religious influences on Americans and how they may influence expectations from healthcare providers are discussed. Organizational cultures and the training and socialization of the numerous professional groups in healthcare also add to the considerable heterogeneity of value systems within healthcare organizations. These contribute to another challenge confronting healthcare managers--competing or conflicting values within a unit or the entire organization. Organizations often fail to reward members who uphold or enact the organization's values, which can lead to lack of motivation and commitment to the organization. Four key elements of values-based leadership are presented for healthcare managers who seek to develop as values-based leaders. 1) Recognize your personal and professional values, 2) Determine what you expect from the larger organization and what you can implement within your sphere of influence, 3) Understand and incorporate the values of internal stakeholders, and 4) Commit to values-based leadership.

  5. Measurement-based model of a wide-bore CT scanner for Monte Carlo dosimetric calculations with GMCTdospp software.

    PubMed

    Skrzyński, Witold

    2014-11-01

    The aim of this work was to create a model of a wide-bore Siemens Somatom Sensation Open CT scanner for use with GMCTdospp, an EGSnrc-based software tool dedicated to Monte Carlo calculations of dose in CT examinations. The method was based on matching spectrum and filtration to half value layer and dose profile, and thus was similar to the method of Turner et al. (Med. Phys. 36, pp. 2154-2164). Input data on unfiltered beam spectra were taken from two sources: the TASMIP model and IPEM Report 78. Two sources of HVL data were also used, namely measurements and documentation. The dose profile along the fan-beam was measured with Gafchromic RTQA-1010 (QA+) film. A two-component model of filtration was assumed: a bow-tie filter made of aluminum with 0.5 mm thickness on the central axis, and a flat filter made of one of four materials: aluminum, graphite, lead, or titanium. Good agreement between calculations and measurements was obtained for models based on the measured values of HVL. Doses calculated with GMCTdospp differed from the doses measured with a pencil ion chamber placed in a PMMA phantom by less than 5%, and the root mean square difference for four tube potentials and three positions in the phantom did not exceed 2.5%. The differences for models based on HVL values from documentation exceeded 10%. Models based on TASMIP spectra and IPEM78 spectra performed equally well. Copyright © 2014 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  6. Tree value system: description and assumptions.

    Treesearch

    D.G. Briggs

    1989-01-01

    TREEVAL is a microcomputer model that calculates tree or stand values and volumes based on product prices, manufacturing costs, and predicted product recovery. It was designed as an aid in evaluating management regimes. TREEVAL calculates values in either of two ways, one based on optimized tree bucking using dynamic programming and one simulating the results of user-...

  7. Neurocognitive mechanisms underlying value-based decision-making: from core values to economic value

    PubMed Central

    Brosch, Tobias; Sander, David

    2013-01-01

    Value plays a central role in practically every aspect of human life that requires a decision: whether we choose between different consumer goods, whether we decide which person we marry or which political candidate gets our vote, we choose the option that has more value to us. Over the last decade, neuroeconomic research has mapped the neural substrates of economic value, revealing that activation in brain regions such as ventromedial prefrontal cortex (VMPFC), ventral striatum or posterior cingulate cortex reflects how much an individual values an option and which of several options he/she will choose. However, while great progress has been made exploring the mechanisms underlying concrete decisions, neuroeconomic research has been less concerned with the questions of why people value what they value, and why different people value different things. Social psychologists and sociologists have long been interested in core values, motivational constructs that are intrinsically linked to the self-schema and are used to guide actions and decisions across different situations and different time points. Core value may thus be an important determinant of individual differences in economic value computation and decision-making. Based on a review of recent neuroimaging studies investigating the neural representation of core values and their interactions with neural systems representing economic value, we outline a common framework that integrates the core value concept and neuroeconomic research on value-based decision-making. PMID:23898252

  8. Effect of blood sampling schedule and method of calculating the area under the curve on validity and precision of glycaemic index values.

    PubMed

    Wolever, Thomas M S

    2004-02-01

    To evaluate whether blood sampling schedules and methods of calculating the area under the curve (AUC) different from those recommended are suitable for glycaemic index (GI) calculations, the GI values of five foods were determined by recommended methods (capillary blood glucose measured seven times over 2.0 h) in forty-seven normal subjects, and different calculations were performed on the same data set. The AUC was calculated in four ways: incremental AUC (iAUC; recommended method), iAUC above the minimum blood glucose value (AUCmin), net AUC (netAUC) and iAUC including area only before the glycaemic response curve cuts the baseline (AUCcut). In addition, iAUC was calculated using four different sets of less than seven blood samples. GI values were derived using each AUC calculation. The mean GI values of the foods varied significantly according to the method of calculating GI. The standard deviation of GI values calculated using iAUC (20.4) was lower than six of the seven other methods, and significantly less (P<0.05) than that using netAUC (24.0). To be a valid index of food glycaemic response independent of subject characteristics, GI values in subjects should not be related to their AUC after oral glucose. However, calculating GI using AUCmin or less than seven blood samples resulted in significant (P<0.05) relationships between GI and mean AUC. It is concluded that, in subjects without diabetes, the recommended blood sampling schedule and method of AUC calculation yields more valid and/or more precise GI values than the seven other methods tested here. The only method whose results agreed reasonably well with the recommended method (i.e. within ±5%) was AUCcut.
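    The recommended iAUC (trapezoidal area above the fasting baseline, ignoring area below it) can be sketched as below. Note this sketch clamps negative deviations to zero at each sample point, a close approximation to the recommended geometric treatment of segments that cross the baseline, not the exact published algorithm:

```python
def incremental_auc(times_min, glucose):
    """Incremental AUC (iAUC): trapezoidal area above the fasting baseline
    (the first sample), ignoring any area below the baseline."""
    base = glucose[0]
    auc = 0.0
    for i in range(1, len(times_min)):
        dt = times_min[i] - times_min[i - 1]
        a = max(glucose[i - 1] - base, 0.0)  # clamp deviations below baseline
        b = max(glucose[i] - base, 0.0)
        auc += dt * (a + b) / 2.0
    return auc

def glycaemic_index(food_iauc, reference_iauc):
    """GI: the food's iAUC as a percentage of the same subject's iAUC
    after the reference glucose load."""
    return 100.0 * food_iauc / reference_iauc
```

    For a response rising from 5 to 7 mmol/L at 30 min and back to 5 at 60 min, the iAUC is 60 mmol·min/L; relative to a glucose reference iAUC of 120, that food would have a GI of 50.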

  9. 40 CFR 600.211-08 - Sample calculation of fuel economy values for labeling.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 29 2010-07-01 2010-07-01 false Sample calculation of fuel economy values for labeling. 600.211-08 Section 600.211-08 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) ENERGY POLICY FUEL ECONOMY AND CARBON-RELATED EXHAUST EMISSIONS OF MOTOR VEHICLES Fuel Economy Regulations for 1977 and Later Model...

  10. Calculating Measurement Uncertainty of the “Conventional Value of the Result of Weighing in Air”

    DOE PAGES

    Flicker, Celia J.; Tran, Hy D.

    2016-04-02

    The conventional value of the result of weighing in air is frequently used in commercial calibrations of balances. The guidance in OIML D-028 for reporting uncertainty of the conventional value is too terse. When calibrating mass standards at low measurement uncertainties, it is necessary to perform a buoyancy correction before reporting the result. When calculating the conventional result after calibrating true mass, the uncertainty due to calculating the conventional result is correlated with the buoyancy correction. We show through Monte Carlo simulations that the measurement uncertainty of the conventional result is less than the measurement uncertainty when reporting true mass. The Monte Carlo simulation tool is available in the online version of this article.
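    The conventional value itself has a closed form: per OIML D-028, it is the mass of a hypothetical weight of density 8000 kg/m³ that would balance the piece in air of density 1.2 kg/m³ at 20 °C. A minimal sketch of that conversion, without the uncertainty propagation the authors analyze:

```python
RHO_AIR_REF = 1.2   # reference air density, kg/m^3
RHO_CONV = 8000.0   # conventional reference density of the weight, kg/m^3

def conventional_mass(true_mass_g, density_kg_m3):
    """Conventional value of the result of weighing in air: equate the
    buoyancy-corrected force of the piece with that of a hypothetical
    8000 kg/m^3 weight in 1.2 kg/m^3 air and solve for its mass."""
    return (true_mass_g * (1.0 - RHO_AIR_REF / density_kg_m3)
            / (1.0 - RHO_AIR_REF / RHO_CONV))
```

    For a weight of exactly 8000 kg/m³ the conventional and true mass coincide; lighter materials (e.g. aluminum) have a conventional mass below their true mass, denser ones above it, which is why the buoyancy correction and the conventional-value calculation are correlated.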

  11. Modeling and Ab initio Calculations of Thermal Transport in Si-Based Clathrates and Solar Perovskites

    NASA Astrophysics Data System (ADS)

    He, Yuping

    2015-03-01

    We present calculations of the thermal transport coefficients of Si-based clathrates and solar perovskites, as obtained from ab initio calculations and models, where all input parameters derived from first principles. We elucidated the physical mechanisms responsible for the measured low thermal conductivity in Si-based clatherates and predicted their electronic properties and mobilities, which were later confirmed experimentally. We also predicted that by appropriately tuning the carrier concentration, the thermoelectric figure of merit of Sn and Pb based perovskites may reach values ranging between 1 and 2, which could possibly be further increased by optimizing the lattice thermal conductivity through engineering perovskite superlattices. Work done in collaboration with Prof. G. Galli, and supported by DOE/BES Grant No. DE-FG0206ER46262.

  12. TrackEtching - A Java based code for etched track profile calculations in SSNTDs

    NASA Astrophysics Data System (ADS)

    Muraleedhara Varier, K.; Sankar, V.; Gangadathan, M. P.

    2017-09-01

    A Java code incorporating a user-friendly GUI has been developed to calculate the parameters of chemically etched track profiles of ion-irradiated solid state nuclear track detectors. Huygens' construction of wavefronts based on secondary wavelets has been used to numerically calculate the etched track profile as a function of the etching time. Provision for normal incidence and oblique incidence on the detector surface has been incorporated. Results in typical cases are presented and compared with experimental data. Different expressions for the variation of track etch rate as a function of the ion energy have been utilized. The best set of values of the parameters in the expressions can be obtained by comparing with available experimental data. The critical angle for track development can also be calculated using the present code.
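    The critical angle mentioned at the end follows from the standard SSNTD etch-rate relation sin(θc) = V_B/V_T (bulk over track etch rate). A one-line sketch of that textbook relation, not the TrackEtching code's actual implementation:

```python
import math

def critical_angle_deg(v_bulk, v_track):
    """Critical registration angle in degrees: for dip angles below this,
    bulk etching outpaces track etching and no etched track develops,
    from the standard relation sin(theta_c) = V_B / V_T."""
    return math.degrees(math.asin(v_bulk / v_track))
```

    For example, a track etch rate twice the bulk rate gives a critical angle of 30 degrees.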

  13. Value-based management of design reuse

    NASA Astrophysics Data System (ADS)

    Carballo, Juan Antonio; Cohn, David L.; Belluomini, Wendy; Montoye, Robert K.

    2003-06-01

    Effective design reuse in electronic products has the potential to provide very large cost savings, substantial time-to-market reduction, and extra sources of revenue. Unfortunately, critical reuse opportunities are often missed because, although they provide clear value to the corporation, they may not benefit the business performance of an internal organization. It is therefore crucial to provide tools to help reuse partners participate in a reuse transaction when the transaction provides value to the corporation as a whole. Value-based Reuse Management (VRM) addresses this challenge by (a) ensuring that all parties can quickly assess the business performance impact of a reuse opportunity, and (b) encouraging high-value reuse opportunities by supplying value-based rewards to potential parties. In this paper we introduce the Value-Based Reuse Management approach and we describe key results on electronic designs that demonstrate its advantages. Our results indicate that Value-Based Reuse Management has the potential to significantly increase the success probability of high-value electronic design reuse.

  14. Strategies for defining traits when calculating economic values for livestock breeding: a review.

    PubMed

    Wolfová, M; Wolf, J

    2013-09-01

    The objective of the present review was (i) to survey different approaches for choosing the complex of traits for which economic values (EVs) are calculated, (ii) to call attention to the proper definition of traits and (iii) to discuss the manner and extent to which relationships among traits have been considered in the calculation of EVs. For this purpose, papers dealing with the estimation of EVs of traits in livestock were reviewed. The most important reasons for incompatibility of EVs for similar traits estimated in different countries and by different authors were found to be inconsistencies in trait definitions and in assumptions being made about relationships among traits. An important problem identified was how to choose the most appropriate criterion to characterise production or functional ability for a particular class of animals. Accordingly, the review covered the following three topics: (i) which trait(s) would best characterise the growth ability of an animal; (ii) how to define traits expressed repeatedly in subsequent reproductive cycles of breeding females and (iii) how to deal with traits that differ in average value between sexes or among animal groups. Various approaches that have been used to solve these problems were discussed. Furthermore, the manner in which diverse authors chose one or more traits from a group of alternatives for describing a specific biological potential were reviewed and commented on. The consequences of including or excluding relationships among economically important traits when estimating the EV for a specific trait were also examined. An important conclusion of the review is that, for a better comparability and interpretability of estimated EVs in the literature, it is desirable that clear and unique definitions of the traits, complete information on assumptions used in analytical models and details on inter-relationships between traits are documented. Furthermore, the method and the model used for the genetic

  15. New approach to CT pixel-based photon dose calculations in heterogeneous media

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wong, J.W.; Henkelman, R.M.

    The effects of small cavities on dose in water and the dose in a homogeneous nonunit density medium illustrate that inhomogeneities do not act independently in photon dose perturbation, and serve as two constraints which should be satisfied by approximate methods of computed tomography (CT) pixel-based dose calculations. Current methods at best satisfy only one of the two constraints and show inadequacies in some intermediate geometries. We have developed an approximate method that satisfies both these constraints and treats much of the synergistic effect of multiple inhomogeneities correctly. The method calculates primary and first-scatter doses by first-order ray tracing with the first-scatter contribution augmented by a component of second scatter that behaves like first scatter. Multiple-scatter dose perturbation values extracted from small cavity experiments are used in a function which approximates the small residual multiple-scatter dose. For a wide range of geometries tested, our method agrees very well with measurements. The average deviation is less than 2% with a maximum of 3%. In comparison, calculations based on existing methods can have errors larger than 10%.

  16. Values-based recruitment in health care.

    PubMed

    Miller, Sam Louise

    2015-01-27

    Values-based recruitment is a process being introduced to student selection for nursing courses and appointment to registered nurse posts. This article discusses the process of values-based recruitment and demonstrates why it is important in health care today. It examines the implications of values-based recruitment for candidates applying to nursing courses and to newly qualified nurses applying for their first posts in England. To ensure the best chance of success, candidates should understand the principles and process of values-based recruitment and how to prepare for this type of interview.

  17. Interval MULTIMOORA method with target values of attributes based on interval distance and preference degree: biomaterials selection

    NASA Astrophysics Data System (ADS)

    Hafezalkotob, Arian; Hafezalkotob, Ashkan

    2017-06-01

    A target-based MADM method covers beneficial and non-beneficial attributes besides target values for some attributes. Such techniques are considered the comprehensive forms of MADM approaches. Target-based MADM methods can also be used in traditional decision-making problems in which only beneficial and non-beneficial attributes exist. In many practical selection problems, some attributes have given target values. The values of the decision matrix and target-based attributes can be provided as intervals in some such problems. Some target-based decision-making methods have recently been developed; however, a research gap exists in the area of MADM techniques with target-based attributes under uncertainty of information. We extend the MULTIMOORA method for solving practical material selection problems in which material properties and their target values are given as interval numbers. We employ various concepts of interval computations to reduce degeneration of uncertain data. In this regard, we use interval arithmetic and introduce an innovative formula for the interval distance of interval numbers to create an interval target-based normalization technique. Furthermore, we use a pairwise preference matrix based on the concept of degree of preference of interval numbers to calculate the maximum, minimum, and ranking of these numbers. Two decision-making problems regarding biomaterials selection of hip and knee prostheses are discussed. Preference degree-based ranking lists for subordinate parts of the extended MULTIMOORA method are generated by calculating the relative degrees of preference for the arranged assessment values of the biomaterials. The resultant rankings for the problem are compared with the outcomes of other target-based models in the literature.
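    The pairwise preference matrix mentioned above compares interval numbers by a degree of preference. A minimal sketch, using one common possibility-degree formula for interval comparison (the paper's exact formula may differ):

    ```python
    def preference_degree(a, b):
        """Degree to which interval a = (a_lo, a_hi) is preferred to b.

        Returns a value in [0, 1]; 0.5 means indifference. Uses a common
        possibility-degree formula, an assumption, not the paper's own.
        """
        a_lo, a_hi = a
        b_lo, b_hi = b
        width = (a_hi - a_lo) + (b_hi - b_lo)
        if width == 0:  # both intervals degenerate to points
            return 1.0 if a_lo > b_lo else (0.5 if a_lo == b_lo else 0.0)
        p = (max(0.0, a_hi - b_lo) - max(0.0, a_lo - b_hi)) / width
        return min(1.0, max(0.0, p))

    def rank_intervals(intervals):
        """Rank interval numbers by total preference over all others,
        best first, as in a pairwise preference matrix."""
        scores = [sum(preference_degree(a, b)
                      for j, b in enumerate(intervals) if i != j)
                  for i, a in enumerate(intervals)]
        return sorted(range(len(intervals)), key=lambda i: -scores[i])
    ```

    For example, `preference_degree((2, 4), (1, 3))` is 0.75: the first interval is mostly, but not certainly, larger, which is exactly the graded comparison a crisp ordering cannot express.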

  18. SU-E-T-02: 90Y Microspheres Dosimetry Calculation with Voxel-S-Value Method: A Simple Use in the Clinic

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maneru, F; Gracia, M; Gallardo, N

    2015-06-15

    Purpose: To present a simple and feasible method of voxel-S-value (VSV) dosimetry calculation for daily clinical use in radioembolization (RE) with 90Y microspheres. Dose distributions are obtained and visualized over CT images. Methods: Spatial dose distributions and doses in liver and tumor are calculated for RE patients treated with Sirtex Medical microspheres at our center. Data obtained from the previous simulation of treatment were the basis for the calculations: a Tc-99m macroaggregated albumin SPECT-CT study in a gamma camera (Infinia, General Electric Healthcare). Attenuation correction and an ordered-subsets expectation maximization (OSEM) algorithm were applied. For VSV calculations, both SPECT and CT were exported from the gamma camera workstation and registered with the radiotherapy treatment planning system (Eclipse, Varian Medical Systems). Convolution of the activity matrix with a local dose deposition kernel (S values) was implemented with in-house software based on Python code. The kernel was downloaded from www.medphys.it. The final dose distribution was evaluated with the free software Dicompyler. Results: Liver mean dose is consistent with Partition method calculations (accepted as a good standard). Tumor dose has not been evaluated due to its high dependence on contouring: small lesion size, hot spots in healthy tissue and blurred margins can strongly affect the dose distribution in tumors. Extra work includes export and import of images and other DICOM files, creating and calculating a dummy plan of external radiotherapy, the convolution calculation, and evaluation of the dose distribution with Dicompyler. Total time spent is less than 2 hours. Conclusion: VSV calculations do not require any extra appointment or any uncomfortable process for the patient. The total process is short enough to carry it out on the same day as simulation and to contribute to prescription decisions prior to treatment. Three-dimensional dose knowledge provides much more
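    The core of the VSV method is the convolution of the cumulated-activity matrix with a dose deposition kernel. A minimal direct-convolution sketch on nested lists (illustrative only; the authors' Python tool, grid sizes, and kernel values are not reproduced here, and clinical grids would use FFT-based convolution):

    ```python
    def vsv_dose(activity, kernel):
        """Dose grid from a cumulated-activity grid by voxel-S-value
        convolution.

        activity : 3-D nested list, cumulated activity per voxel
        kernel   : 3-D nested list of S values (dose per unit activity),
                   odd size in each dimension, centred on the source voxel
        """
        nz, ny, nx = len(activity), len(activity[0]), len(activity[0][0])
        kz, ky, kx = len(kernel), len(kernel[0]), len(kernel[0][0])
        cz, cy, cx = kz // 2, ky // 2, kx // 2
        dose = [[[0.0] * nx for _ in range(ny)] for _ in range(nz)]
        for z in range(nz):
            for y in range(ny):
                for x in range(nx):
                    a = activity[z][y][x]
                    if a == 0.0:
                        continue
                    # spread this voxel's activity through the S-value kernel
                    for dz in range(kz):
                        for dy in range(ky):
                            for dx in range(kx):
                                tz = z + dz - cz
                                ty = y + dy - cy
                                tx = x + dx - cx
                                if 0 <= tz < nz and 0 <= ty < ny and 0 <= tx < nx:
                                    dose[tz][ty][tx] += a * kernel[dz][dy][dx]
        return dose
    ```

    A 1×1×1 kernel reduces this to simple scaling of the activity map; larger kernels deposit part of each voxel's dose in its neighbours, which is what makes the S-value approach three-dimensional.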

  19. Value-based medicine: concepts and application.

    PubMed

    Bae, Jong-Myon

    2015-01-01

    Global healthcare in the 21st century is characterized by evidence-based medicine (EBM), patient-centered care, and cost effectiveness. EBM involves clinical decisions being made by integrating patient preference with medical treatment evidence and physician experiences. The Center for Value-Based Medicine suggested value-based medicine (VBM) as the practice of medicine based upon the patient-perceived value conferred by an intervention. VBM starts with the best evidence-based data and converts it to patient value-based data, so that it allows clinicians to deliver higher quality patient care than EBM alone. The final goals of VBM are improving quality of healthcare and using healthcare resources efficiently. This paper introduces the concepts and application of VBM and suggests some strategies for promoting related research.

  20. Value-based medicine: concepts and application

    PubMed Central

    Bae, Jong-Myon

    2015-01-01

    Global healthcare in the 21st century is characterized by evidence-based medicine (EBM), patient-centered care, and cost effectiveness. EBM involves clinical decisions being made by integrating patient preference with medical treatment evidence and physician experiences. The Center for Value-Based Medicine suggested value-based medicine (VBM) as the practice of medicine based upon the patient-perceived value conferred by an intervention. VBM starts with the best evidence-based data and converts it to patient value-based data, so that it allows clinicians to deliver higher quality patient care than EBM alone. The final goals of VBM are improving quality of healthcare and using healthcare resources efficiently. This paper introduces the concepts and application of VBM and suggests some strategies for promoting related research. PMID:25773441

  1. Data base to compare calculations and observations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tichler, J.L.

    Meteorological and climatological data bases were compared with known tritium release points and diffusion calculations to determine if calculated concentrations could replace measured concentrations at the monitoring stations. Daily tritium concentrations were monitored at 8 stations and 16 possible receptors. Automated data retrieval strategies are listed. (PSB)

  2. Calculations of Hubbard U from first-principles

    NASA Astrophysics Data System (ADS)

    Aryasetiawan, F.; Karlsson, K.; Jepsen, O.; Schönberger, U.

    2006-09-01

    The Hubbard U of the 3d transition metal series as well as SrVO3, YTiO3, Ce, and Gd has been estimated using a recently proposed scheme based on the random-phase approximation. The values obtained are generally in good accord with the values often used in model calculations, but for some cases the estimated values are somewhat smaller than those used in the literature. We have also calculated the frequency-dependent U for some of the materials. The strong frequency dependence of U in some of the cases considered in this paper suggests that the static value of U may not be the most appropriate one to use in model calculations. We have also made comparison with the constrained local density approximation (LDA) method and found some discrepancies in a number of cases. We emphasize that our scheme and the constrained LDA method theoretically ought to give similar results, and the discrepancies may be attributed to technical difficulties in performing calculations based on currently implemented constrained LDA schemes.

  3. Simplified approach to the mixed time-averaging semiclassical initial value representation for the calculation of dense vibrational spectra

    NASA Astrophysics Data System (ADS)

    Buchholz, Max; Grossmann, Frank; Ceotto, Michele

    2018-03-01

    We present and test an approximate method for the semiclassical calculation of vibrational spectra. The approach is based on the mixed time-averaging semiclassical initial value representation method, which is simplified to a form that contains a filter to remove contributions from approximately harmonic environmental degrees of freedom. This filter comes at no additional numerical cost, and it has no negative effect on the accuracy of peaks from the anharmonic system of interest. The method is successfully tested for a model Hamiltonian and then applied to the study of the frequency shift of iodine in a krypton matrix. Using a hierarchic model with up to 108 normal modes included in the calculation, we show how the dynamical interaction between iodine and krypton yields results for the lowest excited iodine peaks that reproduce experimental findings to a high degree of accuracy.

  4. Valuing Trial Designs from a Pharmaceutical Perspective Using Value-Based Pricing.

    PubMed

    Breeze, Penny; Brennan, Alan

    2015-11-01

    Our aim was to adapt the traditional framework for expected net benefit of sampling (ENBS) to be more compatible with drug development trials from the pharmaceutical perspective. We modify the traditional framework for conducting ENBS and assume that the price of the drug is conditional on the trial outcomes. We use a value-based pricing (VBP) criterion to determine price conditional on trial data using Bayesian updating of cost-effectiveness (CE) model parameters. We assume that there is a threshold price below which the company would not market the new intervention. We present a case study in which a phase III trial sample size and trial duration are varied. For each trial design, we sampled 10,000 trial outcomes and estimated VBP using a CE model. The expected commercial net benefit is calculated as the expected profits minus the trial costs. A clinical trial with shorter follow-up, and larger sample size, generated the greatest expected commercial net benefit. Increasing the duration of follow-up had a modest impact on profit forecasts. Expected net benefit of sampling can be adapted to value clinical trials in the pharmaceutical industry to optimise the expected commercial net benefit. However, the analyses can be very time consuming for complex CE models. © 2014 The Authors. Health Economics published by John Wiley & Sons Ltd.
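    The workflow above (simulate trial outcomes, set a value-based price from the observed effect, subtract trial costs) can be sketched with a toy Monte Carlo model. Everything below is an assumption for illustration: the distribution of the trial effect, the linear value-based pricing rule, the threshold price, and all parameter names and values. It is not the authors' cost-effectiveness model.

    ```python
    import random

    def expected_commercial_net_benefit(n_patients, trial_cost_per_patient,
                                        true_effect=0.5, effect_sd=1.0,
                                        price_per_effect=1000.0,
                                        price_floor=200.0,
                                        market_size=10000,
                                        n_sims=2000, seed=1):
        """Toy ENBS-style calculation from the pharmaceutical perspective:
        average over simulated trial outcomes of (profit at the value-based
        price) minus (trial cost). Illustrative assumptions throughout."""
        rng = random.Random(seed)
        trial_cost = n_patients * trial_cost_per_patient
        total = 0.0
        for _ in range(n_sims):
            # sampling distribution of the trial's mean treatment effect
            obs = rng.gauss(true_effect, effect_sd / n_patients ** 0.5)
            price = obs * price_per_effect  # value-based price (VBP)
            # below the threshold price the company does not market the drug
            profit = price * market_size if price >= price_floor else 0.0
            total += profit - trial_cost
        return total / n_sims
    ```

    Varying `n_patients` (or a follow-up-duration parameter) and comparing the resulting expectations is the trial-design optimisation the abstract describes.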

  5. Tree value system: users guide.

    Treesearch

    J.K. Ayer Sachet; D.G. Briggs; R.D. Fight

    1989-01-01

    This paper instructs resource analysts on use of the Tree Value System (TREEVAL). TREEVAL is a microcomputer system of programs for calculating tree or stand values and volumes based on predicted product recovery. Designed for analyzing silvicultural decisions, the system can also be used for appraisals and for evaluating log bucking. The system calculates results...

  6. Electrostatic potential calculation for biomolecules--creating a database of pre-calculated values reported on a per residue basis for all PDB protein structures.

    PubMed

    Rocchia, W; Neshich, G

    2007-10-05

    STING and Java Protein Dossier provide a collection of physical-chemical parameters describing protein structure, stability, function, and interaction, considered among the most comprehensive of the available protein databases of this type. Particular attention in STING is paid to the electrostatic potential. It makes use of DelPhi, a well-known tool that calculates this physical-chemical quantity for biomolecules by solving the Poisson-Boltzmann equation. In this paper, we describe a modification to the DelPhi program aimed at integrating it within the STING environment. We also outline how the "amino acid electrostatic potential" and the "surface amino acid electrostatic potential" are calculated (over all Protein Data Bank (PDB) content) and how the corresponding values are made searchable in STING_DB. In addition, we show that STING and Java Protein Dossier are also capable of providing these parameter values for the analysis of protein structures modeled in computers or being experimentally solved, but not yet deposited in the PDB. Furthermore, we compare the calculated electrostatic potential values obtained by using the earlier version of DelPhi and those by STING, for the biologically relevant case of lysozyme-antibody interaction. Finally, we describe the STING capacity to make queries (at both residue and atomic levels) across the whole PDB, by looking at a specific case where the electrostatic potential parameter plays a crucial role in a particular protein function, such as ligand binding. BlueStar STING is available at http://www.cbi.cnptia.embrapa.br.

  7. Values for digestible indispensable amino acid scores (DIAAS) for some dairy and plant proteins may better describe protein quality than values calculated using the concept for protein digestibility-corrected amino acid scores (PDCAAS).

    PubMed

    Mathai, John K; Liu, Yanhong; Stein, Hans H

    2017-02-01

    An experiment was conducted to compare values for digestible indispensable amino acid scores (DIAAS) for four animal proteins and four plant proteins with values calculated as recommended for protein digestibility-corrected amino acid scores (PDCAAS), but determined in pigs instead of in rats. Values for standardised total tract digestibility (STTD) of crude protein (CP) and standardised ileal digestibility (SID) of amino acids (AA) were calculated for whey protein isolate (WPI), whey protein concentrate (WPC), milk protein concentrate (MPC), skimmed milk powder (SMP), pea protein concentrate (PPC), soya protein isolate (SPI), soya flour and whole-grain wheat. The PDCAAS-like values were calculated using the STTD of CP to estimate AA digestibility and values for DIAAS were calculated from values for SID of AA. Results indicated that values for SID of most indispensable AA in WPI, WPC and MPC were greater (P<0·05) than for SMP, PPC, SPI, soya flour and wheat. With the exception of arginine and tryptophan, the SID of all indispensable AA in SPI was greater (P<0·05) than in soya flour, and with the exception of threonine, the SID of all indispensable AA in wheat was less (P<0·05) than in all other ingredients. If the same scoring pattern for children between 6 and 36 months was used to calculate PDCAAS-like values and DIAAS, PDCAAS-like values were greater (P<0·05) than DIAAS values for SMP, PPC, SPI, soya flour and wheat indicating that PDCAAS-like values estimated in pigs may overestimate the quality of these proteins.
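    The two scores compared above differ in how digestibility enters: DIAAS applies an amino-acid-specific ileal digestibility to each indispensable amino acid before scoring, while PDCAAS applies a single crude-protein digestibility to the lowest amino acid ratio and truncates at 100. A minimal sketch with hypothetical input values (not the paper's data):

    ```python
    def diaas(aa_content, sid_digestibility, reference):
        """DIAAS = 100 x lowest digestible indispensable amino acid ratio.
        aa_content        : mg of each amino acid per g protein
        sid_digestibility : standardised ileal digestibility per AA (0-1)
        reference         : mg AA per g protein in the scoring pattern"""
        ratios = [aa_content[aa] * sid_digestibility[aa] / reference[aa]
                  for aa in reference]
        return 100.0 * min(ratios)

    def pdcaas_like(aa_content, sttd_cp, reference):
        """PDCAAS-like score: lowest AA ratio x one crude-protein
        digestibility (STTD of CP), truncated at 100 by convention."""
        ratios = [aa_content[aa] / reference[aa] for aa in reference]
        return min(100.0, 100.0 * sttd_cp * min(ratios))
    ```

    When the limiting amino acid is less digestible than crude protein as a whole, the PDCAAS-like score exceeds DIAAS, which is the overestimation the abstract reports for several plant proteins.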

  8. Absorbed fractions in a voxel-based phantom calculated with the MCNP-4B code.

    PubMed

    Yoriyaz, H; dos Santos, A; Stabin, M G; Cabezas, R

    2000-07-01

    A new approach for calculating internal dose estimates was developed through the use of a more realistic computational model of the human body. The present technique shows the capability to build a patient-specific phantom with tomography data (a voxel-based phantom) for the simulation of radiation transport and energy deposition using Monte Carlo methods such as in the MCNP-4B code. MCNP-4B absorbed fractions for photons in the mathematical phantom of Snyder et al. agreed well with reference values. Results obtained through radiation transport simulation in the voxel-based phantom, in general, agreed well with reference values. Considerable discrepancies, however, were found in some cases due to two major causes: differences in the organ masses between the phantoms and the occurrence of organ overlap in the voxel-based phantom, which is not considered in the mathematical phantom.

  9. Value-based medicine and vitreoretinal diseases.

    PubMed

    Brown, Melissa M; Brown, Gary C; Sharma, Sanjay

    2004-06-01

    The purpose of the review is to examine the role of value-based medicine and its impact, or potential impact, on vitreoretinal interventions. Value-based medicine integrates evidence-based data from clinical trials with the patient-perceived improvement in quality of life conferred by an intervention. Cost-utility analysis, the healthcare economic instrument used to create a value-based medicine database, is being increasingly used to study the cost-effectiveness of vitreoretinal interventions. Vitreoretinal interventions are generally cost-effective because of the great value they impart to patients. Laser surgical procedures, such as for diabetic retinopathy, threshold retinopathy of prematurity, and exudative macular degeneration appear to be especially cost-effective as a group.

  10. Environmental flow allocation and statistics calculator

    USGS Publications Warehouse

    Konrad, Christopher P.

    2011-01-01

    The Environmental Flow Allocation and Statistics Calculator (EFASC) is a computer program that calculates hydrologic statistics based on a time series of daily streamflow values. EFASC will calculate statistics for daily streamflow in an input file or will generate synthetic daily flow series from an input file based on rules for allocating and protecting streamflow and then calculate statistics for the synthetic time series. The program reads dates and daily streamflow values from input files. The program writes statistics out to a series of worksheets and text files. Multiple sites can be processed in series as one run. EFASC is written in Microsoft® Visual Basic® for Applications and implemented as a macro in Microsoft® Office Excel 2007. EFASC is intended as a research tool for users familiar with computer programming. The code for EFASC is provided so that it can be modified for specific applications. All users should review how output statistics are calculated and recognize that the algorithms may not comply with conventions used to calculate streamflow statistics published by the U.S. Geological Survey.
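    The kind of hydrologic statistics EFASC computes from a daily series can be sketched as follows. These are generic low-flow and flow-duration statistics, not EFASC's actual algorithms (which, as the abstract notes, may not follow USGS conventions either):

    ```python
    def flow_statistics(daily_flows):
        """A few common hydrologic statistics from daily streamflow:
        mean flow, the lowest 7-day moving average (the "7Q" low-flow
        statistic), and the flow exceeded 90% of the time (Q90).
        Requires at least 7 daily values."""
        n = len(daily_flows)
        mean_flow = sum(daily_flows) / n
        seven_q = min(sum(daily_flows[i:i + 7]) / 7 for i in range(n - 6))
        # flow-duration: sort descending, take the 90th-percentile rank
        ranked = sorted(daily_flows, reverse=True)
        q90 = ranked[min(n - 1, int(0.9 * n))]
        return {"mean": mean_flow, "7Q": seven_q, "Q90": q90}
    ```

    Running the same statistics on an observed series and on a synthetic series generated under an allocation rule, then comparing them, mirrors EFASC's intended use.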

  11. Environment-based pin-power reconstruction method for homogeneous core calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leroyer, H.; Brosselard, C.; Girardi, E.

    2012-07-01

    Core calculation schemes are usually based on a classical two-step approach associated with assembly and core calculations. During the first step, infinite lattice assembly calculations relying on a fundamental mode approach are used to generate cross-section libraries for PWR core calculations. This fundamental mode hypothesis may be questioned when dealing with loading patterns involving several types of assemblies (UOX, MOX), burnable poisons, control rods and burn-up gradients. This paper proposes a calculation method able to take into account the heterogeneous environment of the assemblies when using homogeneous core calculations and an appropriate pin-power reconstruction. This methodology is applied to MOX assemblies, computed within an environment of UOX assemblies. The new environment-based pin-power reconstruction is then used on various clusters of 3x3 assemblies showing burn-up gradients and UOX/MOX interfaces, and compared to reference calculations performed with APOLLO-2. The results show that UOX/MOX interfaces are much better calculated with the environment-based calculation scheme when compared to the usual pin-power reconstruction method. The power peak is always better located and calculated with the environment-based pin-power reconstruction method on every cluster configuration studied. This study shows that taking into account the environment in transport calculations can significantly improve the pin-power reconstruction so far as it is consistent with the core loading pattern. (authors)

  12. Value-based care in hepatology.

    PubMed

    Strazzabosco, Mario; Allen, John I; Teisberg, Elizabeth O

    2017-05-01

    The migration from legacy fee-for-service reimbursement to payments linked to high-value health care is accelerating in the United States because of new legislation and redesign of payments from the Centers for Medicare and Medicaid Services. Because patients with chronic diseases account for substantial use of health care resources, payers and health systems are focusing on maximizing the value of care for these patients. Because chronic liver diseases impose a major health burden worldwide affecting the health and lives of many individuals and families as well as substantial costs for individuals and payers, hepatologists must understand how they can improve their practices. Hepatologists practice a high-intensity cognitive subspecialty, using complex and costly procedures and medications. High-value patient care requires multidisciplinary coordination, labor-intensive support for critically ill patients, and effective chronic disease management. Under current fee-for-service reimbursement, patient values, medical success, and financial success can all be misaligned. Many current attempts to link health outcomes to reimbursement are based on compliance with process measures, with less emphasis on outcomes that matter most to patients, thus slowing transformation to higher-value team-based care. Outcome measures that reflect the entire cycle of care are needed to assist both clinicians and administrators in improving the quality and value of care. A comprehensive set of outcome measures for liver diseases is not currently available. Numerous researchers now are attempting to fill this gap by devising and testing outcome indicators and patient-reported outcomes for the major liver conditions. These indicators will provide tools to implement a value-based approach for patients with chronic liver diseases to compare results and value of care between referral centers, to perform health technology assessment, and to guide decision-making processes for health

  13. NHS constitution values for values-based recruitment: a virtue ethics perspective.

    PubMed

    Groothuizen, Johanna Elise; Callwood, Alison; Gallagher, Ann

    2018-05-17

    Values-based recruitment is used in England to select healthcare staff, trainees and students on the basis that their values align with those stated in the Constitution of the UK National Health Service (NHS). However, it is unclear whether the extensive body of existing literature within the field of moral philosophy was taken into account when developing these values. Although most values have a long historical tradition, a tendency to assume that they have just been invented, and to approach them uncritically, exists within the healthcare sector. Reflection is necessary. We are of the opinion that selected virtue ethics writings, which are underpinned by historical literature as well as practical analysis of the healthcare professions, provide a helpful framework for evaluation of the NHS Constitution values, to determine whether gaps exist and improvements can be made. Based on this evaluation, we argue that the definitions of certain NHS Constitution values are ambiguous. In addition to this, we argue that 'integrity' and 'practical wisdom', two important concepts in the virtue ethics literature, are not sufficiently represented within the NHS Constitution values. We believe that the NHS Constitution values could be strengthened by providing clearer definitions, and by integrating 'integrity' and 'practical wisdom'. This will benefit values-based recruitment strategies. Should healthcare policy-makers in other countries wish to develop a similar values-based recruitment framework, we advise that they proceed reflectively, and take previously published virtue ethics literature into consideration. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  14. International comparison of experience-based health state values at the population level.

    PubMed

    Heijink, Richard; Reitmeir, Peter; Leidl, Reiner

    2017-07-07

    Decision makers need to know whether health state values, an important component of summary measures of health, are valid for their target population. A key outcome is the individuals' valuation of their current health. This experience-based perspective is increasingly used to derive health state values. This study is the first to compare such experience-based valuations at the population level across countries. We examined the relationship between respondents' self-rated health as measured by the EQ-VAS, and the different dimensions and levels of the EQ-5D-3L. The dataset included almost 32,000 survey respondents from 15 countries. We estimated generalized linear models with logit link function, including country-specific models and pooled-data models with country effects. The results showed significant and meaningful differences in the valuation of health states and individual health dimensions between countries, even though similarities were present too. Between countries, coefficients correlated positively for the values of mobility, self-care and usual activities, but not for the values of pain and anxiety, thus underlining structural differences. The findings indicate that, ideally, population-specific experience-based value sets should be developed and used for the calculation of health outcomes. Otherwise, sensitivity analyses are needed. Furthermore, transferring the results of foreign studies into the national context should be performed with caution. We recommend that future studies investigate the causes of differences in experience-based health state values through a single international study, possibly complemented with qualitative research on the determinants of valuation.

  15. Calculations of atomic magnetic nuclear shielding constants based on the two-component normalized elimination of the small component method

    NASA Astrophysics Data System (ADS)

    Yoshizawa, Terutaka; Zou, Wenli; Cremer, Dieter

    2017-04-01

    A new method for calculating nuclear magnetic resonance shielding constants of relativistic atoms based on the two-component (2c), spin-orbit coupling including Dirac-exact NESC (Normalized Elimination of the Small Component) approach is developed where each term of the diamagnetic and paramagnetic contribution to the isotropic shielding constant σiso is expressed in terms of analytical energy derivatives with regard to the magnetic field B and the nuclear magnetic moment 𝝁. The picture change caused by renormalization of the wave function is correctly described. 2c-NESC/HF (Hartree-Fock) results for the σiso values of 13 atoms with a closed shell ground state reveal a deviation from 4c-DHF (Dirac-HF) values by 0.01%-0.76%. Since the 2-electron part is effectively calculated using a modified screened nuclear shielding approach, the calculation is efficient and based on a series of matrix manipulations scaling with (2M)³ (M: number of basis functions).

  16. 40 CFR 600.207-93 - Calculation of fuel economy values for a model type.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 29 2010-07-01 2010-07-01 false Calculation of fuel economy values for a model type. 600.207-93 Section 600.207-93 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) ENERGY POLICY FUEL ECONOMY AND CARBON-RELATED EXHAUST EMISSIONS OF MOTOR VEHICLES Fuel Economy Regulations for 1977 and Later Model...

  17. 40 CFR 600.207-86 - Calculation of fuel economy values for a model type.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 29 2010-07-01 2010-07-01 false Calculation of fuel economy values for a model type. 600.207-86 Section 600.207-86 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) ENERGY POLICY FUEL ECONOMY AND CARBON-RELATED EXHAUST EMISSIONS OF MOTOR VEHICLES Fuel Economy Regulations for 1977 and Later Model...

  18. HbA1c values calculated from blood glucose levels using truncated Fourier series and implementation in standard SQL database language.

    PubMed

    Temsch, W; Luger, A; Riedl, M

    2008-01-01

    This article presents a mathematical model to calculate HbA1c values based on self-measured blood glucose and past HbA1c levels, thereby enabling patients to monitor diabetes therapy between scheduled checkups. This method could help physicians to make treatment decisions if implemented in a system where glucose data are transferred to a remote server. The method, however, cannot replace HbA1c measurements; past HbA1c values are needed to gauge the method. The mathematical model of HbA1c formation was developed based on biochemical principles. Unlike an existing HbA1c formula, the new model respects the decreasing contribution of older glucose levels to current HbA1c values. About 12 standard SQL statements embedded in a PHP program were used to perform the Fourier transform. Regression analysis was used to gauge results against previous HbA1c values. The method can be readily implemented in any SQL database. The predicted HbA1c values thus obtained were in accordance with measured values. They also matched the results of the HbA1c formula in the elevated range. By contrast, the formula was too "optimistic" in the range of better glycemic control. Individual analysis of two subjects improved the accuracy of values and reflected the bias introduced by different glucometers and individual measurement habits.
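    The model's key feature, older glucose values contributing less to the current HbA1c, can be illustrated with a simple weighted-mean sketch. This is only the qualitative behaviour, not the paper's truncated-Fourier-series method: weights decline linearly over an assumed red-cell lifespan, and the widely used ADAG regression converts mean glucose (mg/dL) to HbA1c (%).

    ```python
    def estimated_hba1c(daily_glucose_mg_dl, lifespan_days=120):
        """Estimate HbA1c (%) from daily glucose, weighting recent days
        most heavily (linear decline over the red-cell lifespan).
        Illustrative model; the cited article uses a Fourier-series fit
        gauged against measured HbA1c instead."""
        recent = daily_glucose_mg_dl[-lifespan_days:]
        n = len(recent)
        # weight ~0 for the oldest included day up to 1.0 for today
        weights = [(i + 1) / n for i in range(n)]
        mean_glucose = (sum(w * g for w, g in zip(weights, recent))
                        / sum(weights))
        return (mean_glucose + 46.7) / 28.7  # ADAG study regression
    ```

    With constant glucose the weighting cancels out; with a recent rise the estimate is higher than with an equally large rise months ago, which is the asymmetry a plain average misses.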

  19. Method of calculation of critical values of financial indicators for developing food security strategy

    NASA Astrophysics Data System (ADS)

    Aigyl Ilshatovna, Sabirova; Svetlana Fanilevna, Khasanova; Vildanovna, Nagumanova Regina

    2018-05-01

    Based on decision-making theory (minimax and maximin approaches), the authors propose a technique and present calculated critical values of effectiveness indicators of agricultural producers in the Republic of Tatarstan for 2013-2015. The necessity of monitoring the effectiveness of state support, and the directions for its improvement, is justified.

  20. Value and limitations of transpulmonary pressure calculations during intra-abdominal hypertension.

    PubMed

    Cortes-Puentes, Gustavo A; Gard, Kenneth E; Adams, Alexander B; Faltesek, Katherine A; Anderson, Christopher P; Dries, David J; Marini, John J

    2013-08-01

    To clarify the effect of progressively increasing intra-abdominal pressure on esophageal pressure, transpulmonary pressure, and functional residual capacity. Controlled application of increased intra-abdominal pressure at two positive end-expiratory pressure levels (1 and 10 cm H2O) in an anesthetized porcine model of controlled ventilation. Large animal laboratory of a university-affiliated hospital. Eleven deeply anesthetized swine (weight 46.2 ± 6.2 kg). Air-regulated intra-abdominal hypertension (0-25 mm Hg). Esophageal pressure, tidal compliance, bladder pressure, and end-expiratory lung aeration by gas dilution. Functional residual capacity was significantly reduced by increasing intra-abdominal pressure at both positive end-expiratory pressure levels (p ≤ 0.0001) without corresponding changes of end-expiratory esophageal pressure. Above intra-abdominal pressure 5 mm Hg, plateau airway pressure increased linearly by ~ 50% of the applied intra-abdominal pressure value, associated with commensurate changes of esophageal pressure. With tidal volume held constant, negligible changes occurred in transpulmonary pressure due to intra-abdominal pressure. Driving pressures calculated from airway pressures alone (plateau airway pressure minus positive end-expiratory pressure) did not equate to those computed from transpulmonary pressure (tidal changes in transpulmonary pressure). Increasing positive end-expiratory pressure shifted the predominantly negative end-expiratory transpulmonary pressure at positive end-expiratory pressure 1 cm H2O (mean -3.5 ± 0.4 cm H2O) into the positive range at positive end-expiratory pressure 10 cm H2O (mean 0.58 ± 1.2 cm H2O). Despite its insensitivity to changes in functional residual capacity, measuring transpulmonary pressure may be helpful in explaining how different levels of positive end-expiratory pressure influence recruitment and collapse during tidal ventilation in the presence of increased intra-abdominal pressure and in

  1. [From evidence-based medicine to value-based medicine].

    PubMed

    Zhang, Shao-dan; Liang, Yuan-bo; Li, Si-zhen

    2006-11-01

    Evidence-based medicine (EBM) is based on objective evidence, which provides the best available knowledge for physicians to make medical and therapeutic decisions scientifically for the care of individual patients, in order to improve the effectiveness of treatment and to prolong the life of patients. EBM has made significant progress in clinical practice. But medical therapies cannot always bring a better quality of life, and clinically, patients' preferences should always be taken into account. Value-based medicine (VBM) is the practice of medicine that emphasizes the value received from an intervention. It takes evidence-based data to a higher level by combining the parameters of patient-perceived value with the cost of an intervention. The fundamental instrument of VBM is cost-utility analysis. VBM will provide a better practice model to evaluate the therapeutic package and cost effectiveness for individual and general health care.

  2. 40 CFR 600.206-86 - Calculation and use of fuel economy values for gasoline-fueled, diesel, and electric vehicle...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 30 2011-07-01 2011-07-01 false Calculation and use of fuel economy values for gasoline-fueled, diesel, and electric vehicle configurations. 600.206-86 Section 600.206-86... economy values for gasoline-fueled, diesel, and electric vehicle configurations. (a) Fuel economy values...

  3. A proposed selection index for feedlot profitability based on estimated breeding values.

    PubMed

    van der Westhuizen, R R; van der Westhuizen, J

    2009-04-22

    It is generally accepted that feed intake and growth (gain) are the most important economic components when calculating profitability in a growth test or feedlot. We developed a single post-weaning growth (feedlot) index based on the economic values of different components. Variance components, heritabilities and genetic correlations for and between initial weight (IW), final weight (FW), feed intake (FI), and shoulder height (SHD) were estimated by multitrait restricted maximum likelihood procedures. The estimated breeding values (EBVs) and the economic values for IW, FW and FI were used in a selection index to estimate a post-weaning or feedlot profitability value. Heritabilities for IW, FW, FI, and SHD were 0.41, 0.40, 0.33, and 0.51, respectively. The highest genetic correlations were 0.78 (between IW and FW) and 0.70 (between FI and FW). EBVs were used in a selection index to calculate a single economic value for each animal. This economic value is an indication of the gross profitability value or the gross test value (GTV) of the animal in a post-weaning growth test. GTVs varied between -R192.17 and R231.38 with an average of R9.31 and a standard deviation of R39.96. The Pearson correlations between EBVs (for production and efficiency traits) and GTV ranged from -0.51 to 0.68. The lowest correlation (closest to zero) was 0.26 between the Kleiber ratio and GTV. Correlations of 0.68 and -0.51 were estimated between average daily gain and GTV and between feed conversion ratio and GTV, respectively. These results showed that it is possible to select for GTV. The selection index can benefit feedlot operations by selecting offspring of bulls with high GTVs to maximize profitability.
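The gross test value described above is a linear selection index; a minimal sketch, with hypothetical economic weights and EBVs (not the study's fitted values):

```python
# Minimal sketch of a linear selection index: gross test value (GTV) as the
# sum of each trait's EBV weighted by its economic value (Rand per unit).
# The weights and EBVs below are hypothetical, not from the study.

economic_values = {"IW": -1.5, "FW": 2.0, "FI": -0.8}   # R per unit of trait
ebvs = {"IW": 4.0, "FW": 12.0, "FI": 6.0}               # estimated breeding values

gtv = sum(economic_values[t] * ebvs[t] for t in economic_values)
print(round(gtv, 2))  # 13.2
```

Heavier final weight raises the index while extra feed intake and initial weight lower it, matching the sign pattern implied by the reported correlations.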

  4. A value-based medicine cost-utility analysis of idiopathic epiretinal membrane surgery.

    PubMed

    Gupta, Omesh P; Brown, Gary C; Brown, Melissa M

    2008-05-01

    To perform a reference case, cost-utility analysis of epiretinal membrane (ERM) surgery using current literature on outcomes and complications. Computer-based, value-based medicine analysis. Decision analyses were performed under two scenarios: ERM surgery in better-seeing eye and ERM surgery in worse-seeing eye. The models applied long-term published data primarily from the Blue Mountains Eye Study and the Beaver Dam Eye Study. Visual acuity and major complications were derived from 25-gauge pars plana vitrectomy studies. Patient-based, time trade-off utility values, Markov modeling, sensitivity analysis, and net present value adjustments were used in the design and calculation of results. Main outcome measures included the number of discounted quality-adjusted-life-years (QALYs) gained and dollars spent per QALY gained. ERM surgery in the better-seeing eye compared with observation resulted in a mean gain of 0.755 discounted QALYs (3% annual rate) per patient treated. This model resulted in $4,680 per QALY for this procedure. When sensitivity analysis was performed, utility values varied from $6,245 to $3,746/QALY gained, medical costs varied from $3,510 to $5,850/QALY gained, and ERM recurrence rate increased to $5,524/QALY. ERM surgery in the worse-seeing eye compared with observation resulted in a mean gain of 0.27 discounted QALYs per patient treated. The $/QALY was $16,146 with a range of $20,183 to $12,110 based on sensitivity analyses. Utility values ranged from $21,520 to $12,916/QALY and ERM recurrence rate increased to $16,846/QALY based on sensitivity analysis. ERM surgery is a very cost-effective procedure when compared with other interventions across medical subspecialties.
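The core value-based-medicine quantity used here, dollars per discounted QALY gained, can be sketched as follows (the annual utility gain, horizon and cost are illustrative, not the study's inputs):

```python
# Sketch: discount a stream of annual utility gains into QALYs at a 3%
# annual rate (as in the record), then compute dollars per QALY gained.
# The utility gain, horizon and intervention cost are illustrative.

def discounted_qalys(annual_utility_gain, years, rate=0.03):
    return sum(annual_utility_gain / (1 + rate) ** t for t in range(1, years + 1))

qalys = discounted_qalys(0.07, 13)   # e.g. +0.07 utility per year for 13 years
cost = 3500.0                        # illustrative total intervention cost ($)
print(round(cost / qalys))
```

Sensitivity analysis, as in the record, amounts to re-running this ratio over plausible ranges of utilities and costs.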

  5. Impact of dietary fiber energy on the calculation of food total energy value in the Brazilian Food Composition Database.

    PubMed

    Menezes, Elizabete Wenzel de; Grande, Fernanda; Giuntini, Eliana Bistriche; Lopes, Tássia do Vale Cardoso; Dan, Milana Cara Tanasov; Prado, Samira Bernardino Ramos do; Franco, Bernadette Dora Gombossy de Melo; Charrondière, U Ruth; Lajolo, Franco Maria

    2016-02-15

    Dietary fiber (DF) contributes to the energy value of foods, and including it in the calculation of total food energy has been recommended for food composition databases. The present study aimed to investigate the impact of including the energy provided by DF fermentation in the calculation of food energy. Total energy values of 1753 foods from the Brazilian Food Composition Database were calculated with or without the inclusion of DF energy. The energy values were compared, through the use of percentage difference (D%), in individual foods and in daily menus. An appreciable energy difference (D% ⩾ 10) was observed in 321 foods, mainly in the group of vegetables, legumes and fruits. However, in typical Brazilian menus containing foods from all groups, only D% < 3 was observed. In mixed diets, the DF energy may cause slight variations in total energy; on the other hand, there is an appreciable energy D% for certain foods when individually considered.
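The D% screen described above can be sketched as follows, assuming dietary fiber contributes 8 kJ/g (the FAO/INFOODS convention) on top of Atwater macronutrient energy; the food composition values are illustrative:

```python
# Sketch of the energy percentage-difference (D%) comparison, assuming DF
# fermentation energy of 8 kJ/g added to Atwater macronutrient factors.
# The food composition values (per 100 g) are illustrative.

ATWATER_KJ = {"protein": 17, "fat": 37, "carb_available": 17}
FIBER_KJ_PER_G = 8  # assumed fermentation energy of dietary fiber

def energy_kj(food, include_fiber):
    e = sum(ATWATER_KJ[k] * food[k] for k in ATWATER_KJ)
    return e + (FIBER_KJ_PER_G * food["fiber"] if include_fiber else 0)

def d_percent(food):
    e_with, e_without = energy_kj(food, True), energy_kj(food, False)
    return 100 * (e_with - e_without) / e_with

carrot = {"protein": 0.9, "fat": 0.2, "carb_available": 6.7, "fiber": 2.8}
print(round(d_percent(carrot), 1))  # → 14.1
```

A fiber-rich, low-energy vegetable easily crosses the D% ⩾ 10 threshold, while a mixed menu dominated by fat and available carbohydrate stays under 3%.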

  6. StakeMeter: value-based stakeholder identification and quantification framework for value-based software systems.

    PubMed

    Babar, Muhammad Imran; Ghazali, Masitah; Jawawi, Dayang N A; Bin Zaheer, Kashif

    2015-01-01

    Value-based requirements engineering plays a vital role in the development of value-based software (VBS). Stakeholders are the key players in the requirements engineering process, and the selection of critical stakeholders for the VBS systems is highly desirable. Based on the stakeholder requirements, the innovative or value-based idea is realized. The quality of the VBS system is associated with the concrete set of valuable requirements, and the valuable requirements can only be obtained if all the relevant valuable stakeholders participate in the requirements elicitation phase. The existing value-based approaches focus on the design of the VBS systems. However, the focus on the valuable stakeholders and requirements is inadequate. The current stakeholder identification and quantification (SIQ) approaches are neither state-of-the-art nor systematic for the VBS systems. The existing approaches are time-consuming, complex and inconsistent which makes the initiation process difficult. Moreover, the main motivation of this research is that the existing SIQ approaches do not provide the low level implementation details for SIQ initiation and stakeholder metrics for quantification. Hence, keeping in view the existing SIQ problems, this research contributes in the form of a new SIQ framework called 'StakeMeter'. The StakeMeter framework is verified and validated through case studies. The proposed framework provides low-level implementation guidelines, attributes, metrics, quantification criteria and application procedure as compared to the other methods. The proposed framework solves the issues of stakeholder quantification or prioritization, higher time consumption, complexity, and process initiation. The framework helps in the selection of highly critical stakeholders for the VBS systems with less judgmental error.

  7. StakeMeter: Value-Based Stakeholder Identification and Quantification Framework for Value-Based Software Systems

    PubMed Central

    Babar, Muhammad Imran; Ghazali, Masitah; Jawawi, Dayang N. A.; Zaheer, Kashif Bin

    2015-01-01

    Value-based requirements engineering plays a vital role in the development of value-based software (VBS). Stakeholders are the key players in the requirements engineering process, and the selection of critical stakeholders for the VBS systems is highly desirable. Based on the stakeholder requirements, the innovative or value-based idea is realized. The quality of the VBS system is associated with the concrete set of valuable requirements, and the valuable requirements can only be obtained if all the relevant valuable stakeholders participate in the requirements elicitation phase. The existing value-based approaches focus on the design of the VBS systems. However, the focus on the valuable stakeholders and requirements is inadequate. The current stakeholder identification and quantification (SIQ) approaches are neither state-of-the-art nor systematic for the VBS systems. The existing approaches are time-consuming, complex and inconsistent which makes the initiation process difficult. Moreover, the main motivation of this research is that the existing SIQ approaches do not provide the low level implementation details for SIQ initiation and stakeholder metrics for quantification. Hence, keeping in view the existing SIQ problems, this research contributes in the form of a new SIQ framework called ‘StakeMeter’. The StakeMeter framework is verified and validated through case studies. The proposed framework provides low-level implementation guidelines, attributes, metrics, quantification criteria and application procedure as compared to the other methods. The proposed framework solves the issues of stakeholder quantification or prioritization, higher time consumption, complexity, and process initiation. The framework helps in the selection of highly critical stakeholders for the VBS systems with less judgmental error. PMID:25799490

  8. Assessment of adult body composition using bioelectrical impedance: comparison of researcher calculated to machine outputted values

    PubMed Central

    Franco-Villoria, Maria; Wright, Charlotte M; McColl, John H; Sherriff, Andrea; Pearce, Mark S

    2016-01-01

    Objectives To explore the usefulness of bioelectrical impedance analysis (BIA) for general use by identifying the best-evidenced formulae to calculate lean and fat mass, comparing these to historical gold-standard data, and comparing the results with machine-generated output. In addition, we explored how best to adjust lean and fat estimates for height and how these overlapped with body mass index (BMI). Design Cross-sectional observational study within a population-representative cohort study. Setting Urban community, North East England. Participants Sample of 506 mothers of children aged 7–8 years, mean age 36.3 years. Methods Participants were measured at a home visit using a portable height measure and a leg-to-leg BIA machine (Tanita TBF-300MA). Measures Height, weight, bioelectrical impedance (BIA). Outcome measures Lean and fat mass calculated using the best-evidenced published formulae, as well as machine-calculated lean and fat mass data. Results Estimates of lean mass were similar to historical results using gold-standard methods. When compared with the machine-generated values, there were wide limits of agreement for fat mass and a large relative bias for lean mass that varied with size. Lean and fat residuals adjusted for height differed little from indices of lean (or fat)/height2. Of 112 women with BMI >30 kg/m2, 100 (91%) also had high fat, but of the 16 with low BMI (<19 kg/m2) only 5 (31%) also had low fat. Conclusions Lean and fat mass calculated from BIA using published formulae produce plausible values and demonstrate good concordance between high BMI and high fat, but these differ substantially from the machine-generated values. Bioelectrical impedance can supply a robust and useful field measure of body composition, so long as the machine-generated output is not used. PMID:26743700
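The height-adjusted indices the record compares are analogous to BMI; a minimal sketch with illustrative measurements:

```python
# Sketch of the height-adjusted indices discussed in the record: fat mass
# index (FMI) and lean mass index (LMI), analogous to BMI = weight/height^2.
# The measurements below are illustrative, not from the cohort.

def index_kg_m2(mass_kg, height_m):
    return mass_kg / height_m ** 2

height, weight = 1.65, 70.0          # m, kg
fat_mass, lean_mass = 25.0, 45.0     # kg, e.g. from a BIA prediction formula

bmi = index_kg_m2(weight, height)
fmi = index_kg_m2(fat_mass, height)
lmi = index_kg_m2(lean_mass, height)
print(round(bmi, 1), round(fmi, 1), round(lmi, 1))  # 25.7 9.2 16.5
```

Splitting BMI into FMI + LMI is what lets the study show that a high BMI usually means high fat, while a low BMI does not reliably mean low fat.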

  9. ESR concept paper on value-based radiology.

    PubMed

    2017-10-01

    The European Society of Radiology (ESR) established a Working Group on Value-Based Imaging (VBI WG) in August 2016 in response to developments in European healthcare systems in general, and the trend within radiology to move from volume- to value-based practice in particular. The value-based healthcare (VBH) concept defines "value" as health outcomes achieved for patients relative to the costs of achieving them. Within this framework, value measurements start at the beginning of therapy; the whole diagnostic process is disregarded, and is considered only if it is the cause of errors or complications. Making the case for a new, multidisciplinary organisation of healthcare delivery centred on the patient, this paper establishes the diagnosis of disease as a first outcome in the interrelated activities of the healthcare chain. Metrics are proposed for measuring the quality of radiologists' diagnoses and the various ways in which radiologists provide value to patients, other medical specialists and healthcare systems at large. The ESR strongly believes value-based radiology (VBR) is a necessary complement to existing VBH concepts. The Society is determined to establish a holistic VBR programme to help European radiologists deal with changes in the evolution from volume- to value-based evaluation of radiological activities. Main Messages • Value-based healthcare defines value as patient's outcome over costs. • The VBH framework disregards the diagnosis as an outcome. • VBH considers diagnosis only if wrong or a cause of complications. • A correct diagnosis is the first outcome that matters to patients. • Metrics to measure radiologists' impacts on patient outcomes are key. • The value provided by radiology is multifaceted, going beyond exam volumes.

  10. 31 CFR 351.13 - What do I need to know about the savings bond rate to understand redemption value calculations in...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 31 Money and Finance: Treasury 2 2010-07-01 2010-07-01 false What do I need to know about the savings bond rate to understand redemption value calculations in this subpart? 351.13 Section 351.13 Money... What do I need to know about the savings bond rate to understand redemption value calculations in this...

  11. Value-based resource management: a model for best value nursing care.

    PubMed

    Caspers, Barbara A; Pickard, Beth

    2013-01-01

    With the health care environment shifting to a value-based payment system, Catholic Health Initiatives nursing leadership spearheaded an initiative with 14 hospitals to establish best nursing care at a lower cost. The implementation of technology-enabled business processes at point of care led to a new model for best value nursing care: Value-Based Resource Management. The new model integrates clinical patient data from the electronic medical record and embeds the new information in care team workflows for actionable real-time decision support and predictive forecasting. The participating hospitals reported increased patient satisfaction and cost savings in the reduction of overtime and improvement in length of stay management. New data generated by the initiative on nursing hours and cost by patient and by population (Medicare severity diagnosis-related groups), and patient health status outcomes across the acute care continuum expanded business intelligence for a value-based population health system.

  12. 31 CFR 359.39 - How are redemption values calculated for definitive Series I savings bonds?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 31 Money and Finance: Treasury 2 2014-07-01 2014-07-01 false How are redemption values calculated for definitive Series I savings bonds? 359.39 Section 359.39 Money and Finance: Treasury Regulations Relating to Money and Finance (Continued) FISCAL SERVICE, DEPARTMENT OF THE TREASURY BUREAU OF THE FISCAL...

  13. 31 CFR 359.39 - How are redemption values calculated for definitive Series I savings bonds?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 31 Money and Finance:Treasury 2 2012-07-01 2012-07-01 false How are redemption values calculated for definitive Series I savings bonds? 359.39 Section 359.39 Money and Finance: Treasury Regulations Relating to Money and Finance (Continued) FISCAL SERVICE, DEPARTMENT OF THE TREASURY BUREAU OF THE PUBLIC...

  14. 31 CFR 359.39 - How are redemption values calculated for definitive Series I savings bonds?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 31 Money and Finance:Treasury 2 2013-07-01 2013-07-01 false How are redemption values calculated for definitive Series I savings bonds? 359.39 Section 359.39 Money and Finance: Treasury Regulations Relating to Money and Finance (Continued) FISCAL SERVICE, DEPARTMENT OF THE TREASURY BUREAU OF THE PUBLIC...

  15. 31 CFR 359.39 - How are redemption values calculated for definitive Series I savings bonds?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 31 Money and Finance:Treasury 2 2011-07-01 2011-07-01 false How are redemption values calculated for definitive Series I savings bonds? 359.39 Section 359.39 Money and Finance: Treasury Regulations Relating to Money and Finance (Continued) FISCAL SERVICE, DEPARTMENT OF THE TREASURY BUREAU OF THE PUBLIC...

  16. 31 CFR 359.39 - How are redemption values calculated for definitive Series I savings bonds?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 31 Money and Finance: Treasury 2 2010-07-01 2010-07-01 false How are redemption values calculated for definitive Series I savings bonds? 359.39 Section 359.39 Money and Finance: Treasury Regulations Relating to Money and Finance (Continued) FISCAL SERVICE, DEPARTMENT OF THE TREASURY BUREAU OF THE PUBLIC...

  17. Value redefined for inflammatory bowel disease patients: a choice-based conjoint analysis of patients' preferences.

    PubMed

    van Deen, Welmoed K; Nguyen, Dominic; Duran, Natalie E; Kane, Ellen; van Oijen, Martijn G H; Hommes, Daniel W

    2017-02-01

    Value-based healthcare is an emerging field. The core idea is to evaluate care based on achieved outcomes divided by the costs. Unfortunately, the optimal way to evaluate outcomes is ill-defined. In this study, we aim to develop a single, preference-based outcome metric, which can be used to quantify overall health value in inflammatory bowel disease (IBD). IBD patients filled out a choice-based conjoint (CBC) questionnaire in which patients chose preferable outcome scenarios with different levels of disease control (DC), quality of life (QoL), and productivity (Pr). A CBC analysis was performed to estimate the relative value of DC, QoL, and Pr. A patient-centered composite score was developed which was weighted based on the stated preferences. We included 210 IBD patients. Large differences in stated preferences were observed. Increases from low to intermediate outcome levels were valued more than increases from intermediate to high outcome levels. Overall, QoL was more important to patients than DC or Pr. Individual outcome scores were calculated based on the stated preferences. This score was significantly different from a score not weighted based on patient preferences in patients with active disease. We showed the feasibility of creating a single outcome metric in IBD which incorporates patients' values using a CBC. Because this metric changes significantly when weighted according to patients' values, we propose that success in healthcare should be measured accordingly.
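A preference-weighted composite score of the kind described can be sketched as follows; the CBC-derived weights and outcome levels are hypothetical:

```python
# Sketch of a preference-weighted composite outcome score, assuming the CBC
# analysis yields relative weights for disease control (DC), quality of
# life (QoL) and productivity (Pr). Weights and levels are hypothetical.

weights = {"DC": 0.30, "QoL": 0.45, "Pr": 0.25}   # stated preferences, sum to 1
levels = {"DC": 0.6, "QoL": 0.8, "Pr": 0.5}       # patient outcomes, 0-1 scale

composite = sum(weights[k] * levels[k] for k in weights)
unweighted = sum(levels.values()) / len(levels)    # naive equal-weight score

print(round(composite, 3), round(unweighted, 3))  # 0.665 0.633
```

The gap between the two scores is the record's point: equal weighting misstates value whenever patients weight QoL more heavily than the other domains.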

  18. An analytical method based on multipole moment expansion to calculate the flux distribution in Gammacell-220

    NASA Astrophysics Data System (ADS)

    Rezaeian, P.; Ataenia, V.; Shafiei, S.

    2017-12-01

    In this paper, the flux of photons inside the irradiation cell of the Gammacell-220 is calculated using an analytical method based on a multipole moment expansion. The flux of the photons inside the irradiation cell is expressed as a function of monopole, dipole and quadrupole terms in the Cartesian coordinate system. For the source distribution of the Gammacell-220, the values of the multipole moments are specified by direct integration. To validate the presented method, the flux distribution inside the irradiation cell was also determined using MCNP simulations as well as experimental measurements. To measure the flux inside the irradiation cell, Amber dosimeters were employed. The calculated values of the flux were in agreement with the values obtained by simulations and measurements, especially in the central zones of the irradiation cell. To show that the present method is a good approximation for determining the flux in the irradiation cell, the values of the multipole moments were also obtained by fitting the simulation and experimental data using the Levenberg-Marquardt algorithm. The present method yields reasonable results for any source distribution, even one without symmetry, which makes it a powerful tool for source load planning.
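As a schematic only (the record does not reproduce its exact kernel or normalisation), a Cartesian multipole expansion of the photon flux truncated after the quadrupole term might be written as:

```latex
% Schematic multipole expansion of the flux from a source density S(r'),
% truncated after the quadrupole term; M, p and Q denote the monopole,
% dipole and quadrupole moments obtained by direct integration over the
% source distribution.
\phi(\mathbf{r}) \approx \frac{1}{4\pi}\left[
    \frac{M}{r^{2}}
  + \frac{\mathbf{p}\cdot\hat{\mathbf{r}}}{r^{3}}
  + \frac{\hat{\mathbf{r}}^{\mathsf{T}} Q\,\hat{\mathbf{r}}}{r^{4}}
\right],
\qquad
M = \int S(\mathbf{r}')\,d^{3}r', \quad
\mathbf{p} = \int \mathbf{r}'\,S(\mathbf{r}')\,d^{3}r'
```

The fitting step the record mentions then treats $M$, $\mathbf{p}$ and $Q$ as free parameters adjusted to the MCNP and dosimeter data via Levenberg-Marquardt.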

  19. Methods of developing core collections based on the predicted genotypic value of rice ( Oryza sativa L.).

    PubMed

    Li, C T; Shi, C H; Wu, J G; Xu, H M; Zhang, H Z; Ren, Y L

    2004-04-01

    The selection of an appropriate sampling strategy and a clustering method is important in the construction of core collections based on predicted genotypic values in order to retain the greatest degree of genetic diversity of the initial collection. In this study, methods of developing rice core collections were evaluated based on the predicted genotypic values for 992 rice varieties with 13 quantitative traits. The genotypic values of the traits were predicted by the adjusted unbiased prediction (AUP) method. Based on the predicted genotypic values, Mahalanobis distances were calculated and employed to measure the genetic similarities among the rice varieties. Six hierarchical clustering methods, including the single linkage, median linkage, centroid, unweighted pair-group average, weighted pair-group average and flexible-beta methods, were combined with random, preferred and deviation sampling to develop 18 core collections of rice germplasm. The results show that the deviation sampling strategy in combination with the unweighted pair-group average method of hierarchical clustering retains the greatest degree of genetic diversities of the initial collection. The core collections sampled using predicted genotypic values had more genetic diversity than those based on phenotypic values.

  20. Calculating the dermal flux of chemicals with OELs based on their molecular structure: An attempt to assign the skin notation.

    PubMed

    Kupczewska-Dobecka, Małgorzata; Jakubowski, Marek; Czerczak, Sławomir

    2010-09-01

    Our objectives included calculating the permeability coefficient and dermal penetration rates (flux values) for 112 chemicals with occupational exposure limits (OELs) according to the LFER (linear free-energy relationship) model, developed using published methods. We also attempted to assign skin notations based on each chemical's molecular structure. There are many studies available in which formulae for coefficients of permeability from saturated aqueous solutions (K(p)) have been related to the physicochemical characteristics of chemicals. The LFER model is based on the solvation equation, which contains five main descriptors predicted from chemical structure: solute excess molar refractivity, dipolarity/polarisability, summation hydrogen bond acidity and basicity, and the McGowan characteristic volume. Descriptor values, available for about 5000 compounds in the Pharma Algorithms Database, were used to calculate permeability coefficients. The dermal penetration rate was estimated as the product of the permeability coefficient and the concentration of the chemical in saturated aqueous solution. Finally, estimated dermal penetration rates were used to assign skin notations to chemicals. Critical fluxes defined in the literature were recommended as reference values for skin notation. The application of Abraham descriptors predicted from chemical structure and LFER analysis to the calculation of permeability coefficients and flux values for chemicals with OELs was successful. Comparison of calculated K(p) values with data obtained earlier from other models showed that LFER predictions were comparable to those obtained by some previously published models, but the differences were much more significant for others. It seems reasonable to conclude that skin should not be characterised as a simple lipophilic barrier alone. Both lipophilic and polar pathways of permeation exist across the stratum corneum. It is feasible to predict skin notation on the basis of the LFER and other published
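The solvation equation referenced above is the Abraham-type LFER; as a sketch (the record's fitted coefficients c, e, s, a, b, v are not reproduced here):

```latex
% Abraham-type LFER for the skin permeability coefficient K_p, with
% descriptors E (excess molar refractivity), S (dipolarity/polarisability),
% A and B (hydrogen-bond acidity and basicity) and V (McGowan volume):
\log K_p = c + e\,E + s\,S + a\,A + b\,B + v\,V
% Steady-state dermal flux from a saturated aqueous solution (Fick's law):
J_{\max} = K_p \, C_{\mathrm{sat}}
```

Skin notation then follows by comparing the estimated $J_{\max}$ against a critical flux threshold taken from the literature.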

  1. The hounsfield unit value calculated with the aid of non-contrast computed tomography and its effect on the outcome of percutaneous nephrolithotomy.

    PubMed

    Gok, Alper; Polat, Haci; Cift, Ali; Yucel, Mehmet Ozgur; Gok, Bahri; Sirik, Mehmet; Benlioglu, Can; Kalyenci, Bedreddin

    2015-06-01

    To evaluate the effect of the Hounsfield unit (HU) value, calculated with the aid of non-contrast computed tomography, on the outcome of percutaneous nephrolithotomy (PCNL). Data for 83 patients evaluated in our clinic between November 2011 and February 2014 who had similar stone sizes, localizations, and radio-opacities were retrospectively reviewed. The patients were grouped according to their HU value, into a low HU group (HU ≤ 1000) or a high HU group (HU > 1000). The two groups were compared based on their PCNL success rates, complications, duration of surgery, duration of fluoroscopy, and decrease in hematocrit. There were no significant differences in terms of mean age, female-male ratio, or mean body mass index between the two groups (p > 0.05). The stone size and stone surface area did not differ significantly between the groups (p = 0.820 and p = 0.394, respectively). The unsuccessful PCNL rate and the prevalence of complications did not differ significantly between the two groups (p > 0.05). The duration of surgery, duration of fluoroscopy, and decrease in hematocrit were significantly greater in the high HU group compared to the low HU group (p < 0.001). Calculating the HU value using this imaging method may predict cases with longer surgery durations, longer fluoroscopy durations, and greater decreases in hematocrit levels, but this value is not related to the success rate of PCNL.

  2. Singular value decomposition based feature extraction technique for physiological signal analysis.

    PubMed

    Chang, Cheng-Ding; Wang, Chien-Chih; Jiang, Bernard C

    2012-06-01

    Multiscale entropy (MSE) is one of the popular techniques to calculate and describe the complexity of the physiological signal. Many studies use this approach to detect changes in the physiological conditions in the human body. However, MSE results are easily affected by noise and trends, leading to incorrect estimation of MSE values. In this paper, singular value decomposition (SVD) is adopted to replace MSE to extract the features of physiological signals, and adopt the support vector machine (SVM) to classify the different physiological states. A test data set based on the PhysioNet website was used, and the classification results showed that using SVD to extract features of the physiological signal could attain a classification accuracy rate of 89.157%, which is higher than that using the MSE value (71.084%). The results show the proposed analysis procedure is effective and appropriate for distinguishing different physiological states. This promising result could be used as a reference for doctors in diagnosis of congestive heart failure (CHF) disease.
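One common way to implement SVD-based feature extraction on a 1-D physiological signal is to embed it in a trajectory (Hankel) matrix and keep the leading singular values; a sketch under that assumption (the paper's exact embedding and window size are not given):

```python
# Sketch of SVD-based feature extraction for a 1-D physiological signal:
# embed the series into a trajectory (Hankel-style) matrix of overlapping
# windows and use its leading singular values as features, e.g. as SVM
# inputs. The embedding choice is an assumption, not the paper's pipeline.
import numpy as np

def svd_features(signal, window=4, n_features=3):
    x = np.asarray(signal, dtype=float)
    rows = len(x) - window + 1
    H = np.stack([x[i:i + window] for i in range(rows)])  # (rows, window)
    s = np.linalg.svd(H, compute_uv=False)  # singular values, descending
    return s[:n_features]

rng = np.random.default_rng(0)
sig = np.sin(np.linspace(0, 8 * np.pi, 200)) + 0.1 * rng.standard_normal(200)
feats = svd_features(sig)
print(feats.shape)  # (3,)
```

Unlike entropy estimates, the dominant singular values are driven by the signal's main oscillatory structure, which is why they are less sensitive to noise and trends.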

  3. Lift calculations based on accepted wake models for animal flight are inconsistent and sensitive to vortex dynamics.

    PubMed

    Gutierrez, Eric; Quinn, Daniel B; Chin, Diana D; Lentink, David

    2016-12-06

    There are three common methods for calculating the lift generated by a flying animal based on the measured airflow in the wake. However, these methods might not be accurate according to computational and robot-based studies of flapping wings. Here we test this hypothesis for the first time for a slowly flying Pacific parrotlet in still air using stereo particle image velocimetry recorded at 1000 Hz. The bird was trained to fly between two perches through a laser sheet wearing laser safety goggles. We found that the wingtip vortices generated during mid-downstroke advected down and broke up quickly, contradicting the frozen-turbulence hypothesis typically assumed in animal flight experiments. The quasi-steady lift at mid-downstroke was estimated based on the velocity field by applying the widely used Kutta-Joukowski theorem, vortex ring model, and actuator disk model. The calculated lift was found to be sensitive to the applied model and its different parameters, including vortex span and distance between the bird and the laser sheet, rendering these three accepted ways of calculating weight support inconsistent. The three models predict different aerodynamic force values at mid-downstroke compared to independent direct measurements with an aerodynamic force platform that we had available for the same species flying over a similar distance. Whereas the lift predictions of the Kutta-Joukowski theorem and the vortex ring model stayed relatively constant despite vortex breakdown, their values were too low. In contrast, the actuator disk model predicted lift reasonably accurately before vortex breakdown, but predicted almost no lift during and after vortex breakdown. Some of these limitations might be better understood, and partially reconciled, if future animal flight studies report lift calculations based on all three quasi-steady lift models instead. This would also enable much-needed meta-studies of animal flight to derive bioinspired design principles for quasi-steady lift
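Of the three wake models, the Kutta-Joukowski estimate is the simplest to sketch: lift ≈ ρUΓb, with circulation Γ and vortex span b taken from the wake measurements (all values below are illustrative, not the parrotlet's):

```python
# Sketch of the Kutta-Joukowski quasi-steady lift estimate, one of the
# three wake models compared in the record: L = rho * U * Gamma * b, with
# circulation Gamma and vortex span b measured from wake PIV. All numbers
# are illustrative, not the study's data.

rho = 1.225    # air density, kg/m^3
U = 3.0        # flight speed, m/s
gamma = 0.06   # circulation, m^2/s
b = 0.20       # vortex span, m

lift_N = rho * U * gamma * b
print(round(lift_N, 4))  # ≈ 0.0441
```

The record's sensitivity finding follows directly from this form: the estimate scales linearly with both the assumed vortex span and the measured circulation, so small changes in either shift the computed weight support.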

  4. Research on the Value Evaluation of Used Pure Electric Car Based on the Replacement Cost Method

    NASA Astrophysics Data System (ADS)

    Tan, zhengping; Cai, yun; Wang, yidong; Mao, pan

    2018-03-01

    In this paper, the value evaluation of used pure electric cars is carried out using the replacement cost method, which fills a gap in the value evaluation of electric vehicles. Based on the basic principle of the replacement cost method, combined with the actual cost of pure electric cars, a method is proposed for calculating the depreciation ("newness") rate of a used electric car; the AHP method is used to construct a weight matrix giving a comprehensive adjustment coefficient for the related factors, yielding an improved value evaluation method for second-hand cars.
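A heavily simplified sketch of the approach as described, with hypothetical AHP weights (the paper's actual factors and weight matrix are not given):

```python
# Heavily simplified sketch of the replacement-cost approach: appraised
# value = replacement cost x composite "newness" rate, where the rate
# blends adjustment factors via AHP-style weights. All numbers are
# hypothetical, not from the paper.

replacement_cost = 150_000  # current price of an equivalent new EV

# AHP-derived weights for adjustment factors (assumed; must sum to 1)
weights = {"battery_health": 0.5, "mileage": 0.3, "condition": 0.2}
scores = {"battery_health": 0.85, "mileage": 0.70, "condition": 0.90}

newness_rate = sum(weights[k] * scores[k] for k in weights)
value = replacement_cost * newness_rate
print(round(newness_rate, 3), round(value))  # 0.815 122250
```

For EVs the battery-health factor would plausibly dominate the weight matrix, which is the kind of adjustment the AHP step is meant to formalise.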

  5. 40 CFR 600.209-08 - Calculation of vehicle-specific 5-cycle fuel economy values for a model type.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... vehicle configuration 5-cycle fuel economy values as determined in § 600.207-08 for low-altitude tests. (1... economy data from tests conducted on these vehicle configuration(s) at high altitude to calculate the fuel... city and highway fuel economy values from the tests performed using gasoline or diesel test fuel. (ii...

  6. [Value-based medicine in ophthalmology].

    PubMed

    Hirneiss, C; Neubauer, A S; Tribus, C; Kampik, A

    2006-06-01

    Value-based medicine (VBM) unifies costs and patient-perceived value (improvement in quality of life, length of life, or both) of an intervention. Value-based ophthalmology is of increasing importance for decisions in eye care. The methods of VBM are explained and definitions for a specific terminology in this field are given. The cost-utility analysis as part of health care economic analyses is explained. VBM exceeds evidence-based medicine by incorporating parameters of cost and benefits from an ophthalmological intervention. The benefit of the intervention is defined as an increase or maintenance of visual quality of life and can be determined by utility analysis. The time trade-off method is valid and reliable for utility analysis. The resources expended for the value gained in VBM are measured with cost-utility analysis in terms of cost per quality-adjusted life years gained (euros/QALY). Numerous cost-utility analyses of different ophthalmological interventions have been published. The fundamental instrument of VBM is cost-utility analysis. The results in cost per QALY allow estimation of cost effectiveness of an ophthalmological intervention. Using the time trade-off method for utility analysis allows the comparison of ophthalmological cost-utility analyses with those of other medical interventions. VBM is important for individual medical decision making and for general health care.
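The two calculations the abstract describes (time trade-off utility and cost per QALY) can be sketched in a few lines; the numbers in the test are illustrative, not from the article.

```python
def time_tradeoff_utility(years_remaining, years_traded):
    # Time trade-off: a patient would trade `years_traded` of
    # `years_remaining` life years for perfect ocular health.
    # Utility = (T - x) / T, so trading nothing gives utility 1.0.
    return (years_remaining - years_traded) / years_remaining

def cost_per_qaly(cost_eur, utility_gain, benefit_years):
    # Cost-utility ratio in euros/QALY: QALYs gained are the utility
    # gain of the intervention times the duration of the benefit.
    return cost_eur / (utility_gain * benefit_years)
```

Expressing every intervention in euros/QALY is what makes ophthalmological cost-utility analyses comparable with those of other medical interventions.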

  7. Value-Based Emergency Management.

    PubMed

    Corrigan, Zachary; Winslow, Walter; Miramonti, Charlie; Stephens, Tim

    2016-02-01

    This article touches on the complex and decentralized network that is the US health care system and how important it is to include emergency management in this network. By aligning the overarching incentives of opposing health care organizations, emergency management can become resilient to up-and-coming changes in reimbursement, staffing, and network ownership. Coalitions must grasp the opportunity created by changes in value-based purchasing and impending Centers for Medicare and Medicaid Services emergency management rules to engage payers, physicians, and executives. Hope and faith in doing good is no longer enough for preparedness and health care coalitions; understanding how physicians are employed and health care is delivered and paid for is now necessary. Incentivizing preparedness through value-based compensation systems will become the new standard for emergency management.

  8. A calculation for radial expectation values of helium like actinide ions (Z=89-93)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ürer, G., E-mail: gurer@sakarya.edu.tr; Arslan, M., E-mail: murat.arslan4@ogr.sakarya.edu.tr; Balkaya, E., E-mail: eda.balkaya@ogr.sakarya.edu.tr

    2016-03-25

    Radial expectation values ⟨r⟩ for helium-like actinides (Z = 89 (Ac), 90 (Th), 91 (Pa), 92 (U), and 93 (Np)) are reported using the Multiconfiguration Hartree-Fock (MCHF) method within the framework of Breit-Pauli corrections. Atomic data such as energy levels, wavelengths, weighted oscillator strengths, and transition probabilities for allowed and forbidden transitions require these calculations. The obtained results are compared with available works.

  9. Evidence-Based and Value-Based Decision Making About Healthcare Design: An Economic Evaluation of the Safety and Quality Outcomes.

    PubMed

    Zadeh, Rana; Sadatsafavi, Hessam; Xue, Ryan

    2015-01-01

    This study describes a vision and framework that can facilitate the implementation of evidence-based design (EBD), a scientific knowledge base, into the process of the design, construction, and operation of healthcare facilities, and clarify the related safety and quality outcomes for the stakeholders. The proposed framework pairs EBD with value-driven decision making and aims to improve communication among stakeholders by providing a common analytical language. Recent EBD research indicates that the design and operation of healthcare facilities contribute to an organization's operational success by improving safety, quality, and efficiency. However, because little information is available about the financial returns of evidence-based investments, such investments are readily eliminated during the capital-investment decision-making process. To model the proposed framework, we used engineering economy tools to evaluate the return on investment in six successful cases, identified by a literature review, in which facility design and operation interventions resulted in reductions in hospital-acquired infections, patient falls, staff injuries, and patient anxiety. In the evidence-based cases, the calculated net present values, internal rates of return, and payback periods indicated that the long-term benefits of the interventions substantially outweighed their costs. This article explains a framework for developing a research-based and value-based communication language covering specific interventions along the planning, design and construction, operation, and evaluation stages. Evidence-based and value-based design frameworks can be applied to communicate the life-cycle costs and savings of EBD interventions to stakeholders, thereby contributing to more informed decision making and the optimization of healthcare infrastructure. © The Author(s) 2015.
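The engineering economy tools the abstract names (net present value and payback period) reduce to short formulas; a minimal sketch, with illustrative cash flows in the test rather than the study's data:

```python
def npv(rate, cashflows):
    # Net present value: cashflows[0] occurs now (a negative value is
    # the intervention cost); later entries are annual savings.
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

def payback_period(cashflows):
    # First year in which cumulative (undiscounted) cash flow
    # turns non-negative; None if the investment never pays back.
    total = 0.0
    for t, cf in enumerate(cashflows):
        total += cf
        if total >= 0.0:
            return t
    return None
```

A positive NPV at the organization's discount rate is the usual criterion for keeping an evidence-based intervention in the capital budget.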

  10. Countervailing incentives in value-based payment.

    PubMed

    Arnold, Daniel R

    2017-09-01

    Payment reform has been at the forefront of the movement toward higher-value care in the U.S. health care system. A common belief is that volume-based incentives embedded in fee-for-service need to be replaced with value-based payments. While this belief is well-intended, value-based payment also contains perverse incentives. In particular, behavioral economists have identified several features of individual decision making that reverse some of the typical recommendations for inducing desirable behavior through financial incentives. This paper discusses the countervailing incentives associated with four behavioral economic concepts: loss aversion, relative social ranking, inertia or status quo bias, and extrinsic vs. intrinsic motivation. Copyright © 2016 Elsevier Inc. All rights reserved.

  11. A Calculation Method of Electric Distance and Subarea Division Application Based on Transmission Impedance

    NASA Astrophysics Data System (ADS)

    Fang, G. J.; Bao, H.

    2017-12-01

    The widely used method of calculating electric distances is the sensitivity method. The sensitivity matrix is the result of linearization and is based on the hypothesis that active power and reactive power are decoupled, so it is inaccurate. In addition, it calculates the ratio of two partial derivatives as the relationship between two dependent variables, so it has no physical meaning. This paper presents a new method for calculating electrical distance, namely the transmission impedance method. It forms power supply paths based on power flow tracing, then establishes generalized branches to calculate transmission impedances. In this paper, the target of power flow tracing is S instead of Q: Q itself has no direction, and the grid delivers complex power, so S contains more electrical information than Q. By describing the power transmission relationship of the branch and drawing block diagrams in both the forward and reverse directions, it can be found that the numerators of the feedback parts of the two block diagrams are all transmission impedances. To ensure the distance is a scalar, the absolute value of the transmission impedance is defined as the electrical distance. Dividing the network according to these electric distances and comparing with the results of the sensitivity method proves that the transmission impedance method adapts better to dynamic changes of the system and reaches a reasonable subarea division scheme.

  12. What is the value of Values Based Recruitment for nurse education programmes?

    PubMed

    Groothuizen, Johanna E; Callwood, Alison; Gallagher, Ann

    2018-05-01

    A discussion of issues associated with Values Based Recruitment (VBR) for nurse education programmes. Values Based Recruitment is a mandatory element in selection processes of students for Higher Education healthcare courses in England, including all programmes across nursing. Students are selected on the basis that their individual values align with those presented in the Constitution of the National Health Service. However, there are issues associated with the use of values as selection criteria that have been insufficiently addressed. These are discussed. Discussion paper. This article is based on documents published on the website of the executive body responsible for the implementation of a policy regarding VBR in Higher Education Institutions up until June 2017 and our evaluation of the conceptualisation of VBR, underpinned by contemporary theory and literature. Values Based Recruitment influences who is accepted onto a nurse education programme, but there has been limited critical evaluation regarding the effectiveness of employing values as selection criteria. Values are subject to interpretation and evidence regarding whether or how VBR will improve practice and care is lacking. The issues discussed in this article show that Higher Education Institutions offering nursing courses, whether in England or in other countries, should be critical and reflective regarding the implementation of VBR methods. We call for a debate regarding the meaning and implications of VBR and further research regarding its validity and effectiveness. © 2017 John Wiley & Sons Ltd.

  13. PVWatts ® Calculator: India (Fact Sheet)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None, None

    The PVWatts® Calculator for India was released by the National Renewable Energy Laboratory in 2013. The online tool estimates the electricity production, and the monetary value of that production, of grid-connected roof- or ground-mounted crystalline silicon photovoltaic systems based on a few simple inputs. This fact sheet provides a broad overview of the PVWatts® Calculator for India.

  14. Custom auroral electrojet indices calculated by using MANGO value-added services

    NASA Astrophysics Data System (ADS)

    Bargatze, L. F.; Moore, W. B.; King, T. A.

    2009-12-01

    A set of computational routines called MANGO, Magnetogram Analysis for the Network of Geophysical Observatories, is utilized to calculate customized versions of the auroral electrojet indices AE, AL, and AU. MANGO is part of an effort to enhance data services available to users of the Heliophysics VxOs, specifically for the Virtual Magnetospheric Observatory (VMO). The MANGO value-added service package is composed of a set of IDL routines that decompose ground magnetic field observations to isolate secular, diurnal, and disturbance variations of the magnetic field, station by station. Each MANGO subroutine has been written in modular fashion to allow "plug and play"-style flexibility, and each has been designed to account for failure modes and noisy data so that the programs run to completion, producing as much derived data as possible. The capabilities of the MANGO service package will be demonstrated through their application to the study of auroral electrojet current flow during magnetic substorms. Traditionally, the AE indices are calculated by using data from about twelve ground stations located at northern auroral-zone latitudes spread longitudinally around the world. Magnetogram data are corrected for secular variation prior to calculating the standard version of the indices, but the data are not corrected for diurnal variations. A custom version of the AE indices will be created by using the MANGO routines, including a step to subtract diurnal curves from the magnetic field data at each station. The custom AE indices provide more accurate measures of auroral electrojet activity due to isolation of the substorm electrojet magnetic field signature. The improvements in accuracy of the custom AE indices over the traditional indices are largest during northern hemisphere summer, when the range of diurnal variation reaches its maximum.
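Once the secular (and, for the custom index, diurnal) corrections described above have been removed, the index construction itself is an envelope calculation across stations: AU is the upper envelope of the H-component disturbance, AL the lower envelope, and AE = AU - AL. A minimal sketch, with the corrected disturbances assumed as input:

```python
import numpy as np

def auroral_electrojet_indices(h_disturbance):
    # h_disturbance: array of shape (n_stations, n_times) holding the
    # H-component disturbance (nT) at each station, already corrected
    # for secular variation (and diurnal variation, for the custom index).
    au = h_disturbance.max(axis=0)  # upper envelope across stations
    al = h_disturbance.min(axis=0)  # lower envelope across stations
    ae = au - al                    # total electrojet activity
    return au, al, ae
```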

  15. Assessment of adult body composition using bioelectrical impedance: comparison of researcher calculated to machine outputted values.

    PubMed

    Franco-Villoria, Maria; Wright, Charlotte M; McColl, John H; Sherriff, Andrea; Pearce, Mark S

    2016-01-07

    To explore the usefulness of Bioelectrical Impedance Analysis (BIA) for general use by identifying the best-evidenced formulae to calculate lean and fat mass, comparing these to historical gold-standard data and comparing these results with machine-generated output. In addition, we explored how best to adjust lean and fat estimates for height and how these overlapped with body mass index (BMI). Cross-sectional observational study within a population-representative cohort study. Urban community, North East England. Sample of 506 mothers of children aged 7-8 years, mean age 36.3 years. Participants were measured at a home visit using a portable height measure and a leg-to-leg BIA machine (Tanita TBF-300MA). Height, weight, bioelectrical impedance (BIA). Lean and fat mass calculated using best-evidenced published formulae as well as machine-calculated lean and fat mass data. Estimates of lean mass were similar to historical results using gold-standard methods. When compared with the machine-generated values, there were wide limits of agreement for fat mass and a large relative bias for lean mass that varied with size. Lean and fat residuals adjusted for height differed little from indices of lean (or fat)/height². Of 112 women with BMI >30 kg/m², 100 (91%) also had high fat, but of the 16 with low BMI (<19 kg/m²), only 5 (31%) also had low fat. Lean and fat mass calculated from BIA using published formulae produce plausible values and demonstrate good concordance between high BMI and high fat, but these differ substantially from the machine-generated values. Bioelectrical impedance can supply a robust and useful field measure of body composition, so long as the machine-generated output is not used. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
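Published BIA formulae of the kind the study compares typically regress lean mass on the impedance index height²/Z plus weight. A sketch of that general shape only; the coefficients a, b, c below are hypothetical placeholders, not the study's best-evidenced values.

```python
def lean_mass_kg(height_cm, impedance_ohm, weight_kg,
                 a=0.4, b=0.2, c=5.0):
    # Generic published-formula shape: lean mass regressed on the
    # impedance index height^2 / Z and body weight.
    # NOTE: a, b, c are HYPOTHETICAL coefficients for illustration.
    return a * (height_cm ** 2) / impedance_ohm + b * weight_kg + c

def fat_mass_kg(weight_kg, lean_kg):
    # Fat mass follows by subtraction from total body weight.
    return weight_kg - lean_kg
```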

  16. 40 CFR 600.206-93 - Calculation and use of fuel economy values for gasoline-fueled, diesel-fueled, electric, alcohol...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... tests performed using gasoline or diesel test fuel. (ii) Calculate the city, highway, and combined fuel economy values from the tests performed using alcohol or natural gas test fuel. (b) If only one equivalent... 40 Protection of Environment 30 2011-07-01 2011-07-01 false Calculation and use of fuel economy...

  17. Uncertainty modelling and analysis of volume calculations based on a regular grid digital elevation model (DEM)

    NASA Astrophysics Data System (ADS)

    Li, Chang; Wang, Qing; Shi, Wenzhong; Zhao, Sisi

    2018-05-01

    The accuracy of earthwork calculations that compute terrain volume is critical to digital terrain analysis (DTA). The uncertainties in volume calculations (VCs) based on a DEM are primarily related to three factors: 1) model error (ME), which is caused by the algorithm adopted for the VC model; 2) discrete error (DE), which is usually caused by DEM resolution and terrain complexity; and 3) propagation error (PE), which is caused by errors in the variables. Based on these factors, the uncertainty modelling and analysis of VCs based on a regular grid DEM are investigated in this paper. In particular, a method for quantifying the uncertainty of VCs with a confidence interval based on truncation error (TE) is proposed. In the experiments, the trapezoidal double rule (TDR) and Simpson's double rule (SDR) were used to calculate volume, where the TE is the major ME, and six simulated regular grid DEMs with different terrain complexity and resolution (i.e. DE) were generated by a Gauss synthetic surface to easily obtain the theoretical true value and eliminate the interference of data errors. For PE, Monte-Carlo simulation techniques and spatial autocorrelation were used to represent DEM uncertainty. This study can enrich uncertainty modelling and analysis-related theories of geographic information science.
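The trapezoidal double rule (TDR) used in the experiments integrates a regular-grid DEM by weighting corner nodes 1/4, edge nodes 1/2, and interior nodes 1, times the cell area. A minimal sketch (the rule is exact for planar surfaces, which the second test checks):

```python
import numpy as np

def volume_trapezoidal(dem, cell_size):
    # Trapezoidal double rule over a regular grid DEM:
    # corner weight 1/4, edge weight 1/2, interior weight 1.
    w = np.ones_like(dem, dtype=float)
    w[0, :] *= 0.5
    w[-1, :] *= 0.5
    w[:, 0] *= 0.5
    w[:, -1] *= 0.5
    return cell_size ** 2 * float(np.sum(w * dem))
```

The truncation error the paper analyzes is the difference between this discrete sum and the true integral for non-planar terrain.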

  18. Poster - 08: Preliminary Investigation into Collapsed-Cone based Dose Calculations for COMS Eye Plaques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morrison, Hali; Menon, Geetha; Sloboda, Ron

    Purpose: To investigate the accuracy of model-based dose calculations using a collapsed-cone algorithm for COMS eye plaques loaded with I-125 seeds. Methods: The Nucletron SelectSeed 130.002 I-125 seed and the 12 mm COMS eye plaque were incorporated into a research version of the Oncentra® Brachy v4.5 treatment planning system which uses the Advanced Collapsed-cone Engine (ACE) algorithm. Comparisons of TG-43 and high-accuracy ACE doses were performed for a single seed in a 30×30×30 cm³ water box, as well as with one seed in the central slot of the 12 mm COMS eye plaque. The doses along the plaque central axis (CAX) were used to calculate the carrier correction factor, T(r), and were compared to tabulated and MCNP6 simulated doses for both the SelectSeed and IsoAid IAI-125A seeds. Results: The ACE calculated dose for the single seed in water was on average within 0.62 ± 2.2% of the TG-43 dose, with the largest differences occurring near the end-welds. The ratio of ACE to TG-43 calculated doses along the CAX (T(r)) of the 12 mm COMS plaque for the SelectSeed was on average within 3.0% of previously tabulated data, and within 2.9% of the MCNP6 simulated values. The IsoAid and SelectSeed T(r) values agreed within 0.3%. Conclusions: Initial comparisons show good agreement between ACE and MC doses for a single seed in a 12 mm COMS eye plaque; more complicated scenarios are being investigated to determine the accuracy of this calculation method.

  19. A new task scheduling algorithm based on value and time for cloud platform

    NASA Astrophysics Data System (ADS)

    Kuang, Ling; Zhang, Lichen

    2017-08-01

    Task scheduling, a key part of increasing resource utilization and enhancing system performance, is a never-outdated problem, especially on cloud platforms. Based on the value density algorithm of real-time task scheduling systems and the characteristics of distributed systems, this paper presents a new task scheduling algorithm, developed by further studying cloud technology and real-time systems: Least Level Value Density First (LLVDF). The algorithm not only introduces attributes of time and value for tasks, it can also describe the weighting relationships between these properties mathematically. This feature allows it to distinguish between different tasks more dynamically and more reasonably. When the scheme is used in the priority calculation of dynamic task scheduling on a cloud platform, it can schedule and distinguish tasks arriving in large numbers and many kinds more efficiently. The paper designs experiments, based on distributed server simulation models built on the M/M/C queuing model with negative arrivals, to compare the algorithm against a traditional algorithm and to observe and show its characteristics and advantages.
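The abstract does not specify the LLVDF weighting itself, so as a baseline sketch only, here is the classic value-density scheduler that LLVDF extends: tasks are dispatched in descending order of value density (value divided by execution time).

```python
import heapq

def schedule_by_value_density(tasks):
    # tasks: list of (name, value, exec_time) tuples.
    # Classic value-density priority: density = value / exec_time,
    # highest density dispatched first (max-heap via negated keys).
    heap = [(-value / exec_time, name) for name, value, exec_time in tasks]
    heapq.heapify(heap)
    order = []
    while heap:
        _, name = heapq.heappop(heap)
        order.append(name)
    return order
```

LLVDF, as described, would replace the single density key with a weighted combination of time and value attributes.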

  20. A free software for the calculation of T2* values for iron overload assessment.

    PubMed

    Fernandes, Juliano Lara; Fioravante, Luciana Andrea Barozi; Verissimo, Monica P; Loggetto, Sandra R

    2017-06-01

    Background: Iron overload assessment with magnetic resonance imaging (MRI) using T2* has become a key diagnostic method in the management of many diseases. Quantitative analysis of the MRI images with a cost-effective tool has been a limitation to increased use of the method. Purpose: To provide a free software solution for this purpose, comparing the results with a commercial solution. Material and Methods: The free tool was developed as a standalone program to be directly downloaded and run on a common personal computer platform without the need for a dedicated workstation. Liver and cardiac T2* values were calculated using both tools, and the values obtained were compared between them in a group of 56 patients with suspected iron overload using Bland-Altman plots and concordance correlation coefficients (CCC). Results: In the heart, the mean T2* difference between the two methods was 0.46 ms (95% confidence interval [CI], -0.037 to 0.965) and in the liver 0.49 ms (95% CI, 0.257-0.722). The CCCs for both the heart and the liver were significantly high (0.98 [95% CI, 0.966-0.988], with a Pearson ρ of 0.9811, and 0.991 [95% CI, 0.986-0.994], with a Pearson ρ of 0.996, respectively). No significant differences were observed when analyzing only patients with abnormal concentrations of iron in both organs compared to the whole cohort. Conclusion: The proposed free software tool is accurate for calculation of T2* values of the liver and heart and might be a solution for centers that cannot use paid commercial solutions.
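T2* quantification of this kind fits a mono-exponential decay S(TE) = S0 · exp(-TE/T2*) to the signal measured at several echo times. A minimal sketch using a log-linear least-squares fit; the article's exact fitting procedure (e.g., how low-SNR late echoes are handled) is not specified, so this is an assumed implementation of the standard model.

```python
import numpy as np

def fit_t2star_ms(te_ms, signal):
    # Mono-exponential decay: S(TE) = S0 * exp(-TE / T2*).
    # Linearize as ln S = ln S0 - TE / T2*, then least-squares fit;
    # the slope is -1 / T2*.
    slope, _intercept = np.polyfit(np.asarray(te_ms, dtype=float),
                                   np.log(np.asarray(signal, dtype=float)), 1)
    return -1.0 / slope
```

Low T2* values (shorter decay) correspond to higher tissue iron concentration, which is why the heart and liver values are the clinically reported quantities.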

  1. 7 CFR 1437.301 - Value loss.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 10 2010-01-01 2010-01-01 false Value loss. 1437.301 Section 1437.301 Agriculture... Coverage Using Value § 1437.301 Value loss. (a) Special provisions are required to assess losses and.... Assistance for these commodities is calculated based on the loss of value at the time of disaster. The agency...

  2. On the validity of microscopic calculations of double-quantum-dot spin qubits based on Fock-Darwin states

    NASA Astrophysics Data System (ADS)

    Chan, GuoXuan; Wang, Xin

    2018-04-01

    We consider two typical approximations used in microscopic calculations of double-quantum-dot spin qubits, namely the Heitler-London (HL) and Hund-Mulliken (HM) approximations, which use linear combinations of Fock-Darwin states to approximate the two-electron states under the double-well confinement potential. We compared these results to a case in which the solution to a one-dimensional Schrödinger equation is exactly known and found that typical microscopic calculations based on Fock-Darwin states substantially underestimate the value of the exchange interaction, the key parameter that controls quantum dot spin qubits. This underestimation originates from the lack of tunneling of Fock-Darwin states, which are accurate only in the case of a single potential well. Our results suggest that the accuracy of current two-dimensional molecular-orbital-theoretical calculations based on Fock-Darwin states should be revisited, since the underestimation could only deteriorate in dimensions higher than one.

  3. The Lα (λ = 121.6 nm) solar plage contrasts calculations.

    NASA Astrophysics Data System (ADS)

    Bruevich, E. A.

    1991-06-01

    The results of calculations of Lα plage contrasts based on experimental data are presented. A three-component model of the Lα solar flux is applied, using "Prognoz-10" and SME daily smoothed values of the Lα solar flux. The values of contrast are discussed and compared with experimental values based on "Skylab" data.

  4. Value-Based Leadership Approach: A Way for Principals to Revive the Value of Values in Schools

    ERIC Educational Resources Information Center

    van Niekerk, Molly; Botha, Johan

    2017-01-01

    The qualitative research discussed in this article is based on the assumption that school principals as leaders need to establish, develop and maintain a core of shared values in their schools. Our focus is on principals' current perceptions of values in their schools. This is important because values underpin their decisions and actions and thus…

  5. Image phase shift invariance based cloud motion displacement vector calculation method for ultra-short-term solar PV power forecasting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Fei; Zhen, Zhao; Liu, Chun

    Irradiance received on the earth's surface is the main factor that affects the output power of solar PV plants, and is chiefly determined by the cloud distribution seen in a ground-based sky image at the corresponding moment in time. It is the foundation for those linear extrapolation-based ultra-short-term solar PV power forecasting approaches to obtain the cloud distribution in future sky images from the accurate calculation of cloud motion displacement vectors (CMDVs) by using historical sky images. Theoretically, the CMDV can be obtained from the coordinate of the peak pulse calculated from a Fourier phase correlation theory (FPCT) method through the frequency domain information of sky images. The peak pulse is significant and unique only when the cloud deformation between two consecutive sky images is slight enough, which is likely possible for a very short time interval (such as 1 min or shorter) with common changes in the speed of cloud. Sometimes, there will be more than one pulse with similar values when the deformation of the clouds between two consecutive sky images is comparatively obvious under fast changing cloud speeds. This would probably lead to significant errors if the CMDVs were still only obtained from the single coordinate of the peak value pulse. However, the deformation estimation of clouds between two images and its influence on FPCT-based CMDV calculations are highly complex and difficult because the motion of clouds is complicated to describe and model. Therefore, to improve the accuracy and reliability under these circumstances in a simple manner, an image-phase-shift-invariance (IPSI) based CMDV calculation method using FPCT is proposed for minute time scale solar power forecasting. First, multiple different CMDVs are calculated from the corresponding consecutive images pairs obtained through different synchronous rotation angles compared to the original images by using the FPCT method. Second, the final CMDV is generated
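The FPCT step the abstract describes is standard phase correlation: form the normalized cross-power spectrum of two consecutive images, inverse-transform it, and read the displacement off the coordinate of the peak pulse. A minimal sketch assuming an ideal circular shift between frames (real sky images add the deformation problems discussed above):

```python
import numpy as np

def cmdv_phase_correlation(img1, img2):
    # Normalized cross-power spectrum of two frames; its inverse FFT
    # is a pulse whose peak coordinate is the displacement of img2
    # relative to img1 (circular shift assumed).
    f1, f2 = np.fft.fft2(img1), np.fft.fft2(img2)
    cross = f2 * np.conj(f1)
    cross /= np.abs(cross) + 1e-12          # keep phase, drop magnitude
    corr = np.fft.ifft2(cross).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap peak coordinates to signed displacements.
    h, w = corr.shape
    if dy > h // 2:
        dy -= h
    if dx > w // 2:
        dx -= w
    return int(dy), int(dx)
```

The IPSI idea in the abstract repeats this calculation on synchronously rotated image pairs and fuses the resulting CMDVs, rather than trusting the single peak of one correlation surface.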

  6. Image phase shift invariance based cloud motion displacement vector calculation method for ultra-short-term solar PV power forecasting

    DOE PAGES

    Wang, Fei; Zhen, Zhao; Liu, Chun; ...

    2017-12-18


  7. Is Earth-based scaling a valid procedure for calculating heat flows for Mars?

    NASA Astrophysics Data System (ADS)

    Ruiz, Javier; Williams, Jean-Pierre; Dohm, James M.; Fernández, Carlos; López, Valle

    2013-09-01

    Heat flow is a very important parameter for constraining the thermal evolution of a planetary body. Several procedures for calculating heat flows for Mars from geophysical or geological proxies have been used, which are valid for the time when the structures used as indicators were formed. The more common procedures are based on estimates of lithospheric strength (the effective elastic thickness of the lithosphere or the depth to the brittle-ductile transition). On the other hand, several works by Kargel and co-workers have estimated martian heat flows from scaling the present-day terrestrial heat flow to Mars, but the so-obtained values are much higher than those deduced from lithospheric strength. In order to explain the discrepancy, a recent paper by Rodriguez et al. (Rodriguez, J.A.P., Kargel, J.S., Tanaka, K.L., Crown, D.A., Berman, D.C., Fairén, A.G., Baker, V.R., Furfaro, R., Candelaria, P., Sasaki, S. [2011]. Icarus 213, 150-194) criticized the heat flow calculations for ancient Mars presented by Ruiz et al. (Ruiz, J., Williams, J.-P., Dohm, J.M., Fernández, C., López, V. [2009]. Icarus 207, 631-637) and other studies calculating ancient martian heat flows from lithospheric strength estimates, and cast doubt on the validity of the results obtained by these works. Here however we demonstrate that the discrepancy is due to computational and conceptual errors made by Kargel and co-workers, and we conclude that the scaling from terrestrial heat flow values is not a valid procedure for estimating reliable heat flows for Mars.

  8. MRI-Based Computed Tomography Metal Artifact Correction Method for Improving Proton Range Calculation Accuracy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, Peter C.; Schreibmann, Eduard; Roper, Justin

    2015-03-15

    Purpose: Computed tomography (CT) artifacts can severely degrade dose calculation accuracy in proton therapy. Prompted by the recently increased popularity of magnetic resonance imaging (MRI) in the radiation therapy clinic, we developed an MRI-based CT artifact correction method for improving the accuracy of proton range calculations. Methods and Materials: The proposed method replaces corrupted CT data by mapping CT Hounsfield units (HU number) from a nearby artifact-free slice, using a coregistered MRI. MRI and CT volumetric images were registered with use of 3-dimensional (3D) deformable image registration (DIR). The registration was fine-tuned on a slice-by-slice basis by using 2D DIR. Based on the intensity of paired MRI pixel values and HU from an artifact-free slice, we performed a comprehensive analysis to predict the correct HU for the corrupted region. For a proof-of-concept validation, metal artifacts were simulated on a reference data set. Proton range was calculated using reference, artifactual, and corrected images to quantify the reduction in proton range error. The correction method was applied to 4 unique clinical cases. Results: The correction method resulted in substantial artifact reduction, both quantitatively and qualitatively. On respective simulated brain and head and neck CT images, the mean error was reduced from 495 and 370 HU to 108 and 92 HU after correction. Correspondingly, the absolute mean proton range errors of 2.4 cm and 1.7 cm were reduced to less than 2 mm in both cases. Conclusions: Our MRI-based CT artifact correction method can improve CT image quality and proton range calculation accuracy for patients with severe CT artifacts.

  9. View Estimation Based on Value System

    NASA Astrophysics Data System (ADS)

    Takahashi, Yasutake; Shimada, Kouki; Asada, Minoru

    Estimation of a caregiver's view is one of the most important capabilities for a child to understand the behavior demonstrated by the caregiver, that is, to infer the intention behind the behavior and/or to learn the observed behavior efficiently. We hypothesize that the child develops this ability in the same way as behavior learning motivated by an intrinsic reward: he/she updates the model of his/her own estimated view while imitating the behavior observed from the caregiver, so as to minimize the estimation error of the reward during the behavior. From this viewpoint, this paper presents a method for acquiring such a capability based on a value system whose values are obtained by reinforcement learning. The parameters of the view estimation are updated based on the temporal difference error (hereafter TD error: the estimation error of the state value), analogous to the way the parameters of the state value of the behavior are updated based on the TD error. Experiments with simple humanoid robots show the validity of the method, and the developmental process, parallel to young children's estimation of their own view during imitation of the observed behavior of the caregiver, is discussed.

  10. 31 CFR 359.55 - How are redemption values calculated for book-entry Series I savings bonds?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... prorated to the book-entry par investment amount for the corresponding issue and redemption dates... to $25.04; calculated value of $25.045 rounds to $25.05. [Book-entry par investment ÷ 100] × [CRV... for book-entry Series I savings bonds? 359.55 Section 359.55 Money and Finance: Treasury Regulations...

  11. Standard Operating Procedure for Using the NAFTA Guidance to Calculate Representative Half-life Values and Characterizing Pesticide Degradation

    EPA Pesticide Factsheets

    Describes the results of the degradation kinetics project and a general approach for calculating and selecting representative half-life values from soil and aquatic transformation studies for risk assessment and exposure modeling purposes.

  12. Correlation Between Analytical and Thermodynamically Calculated Values of Density For Chloride-sodium Brines

    NASA Astrophysics Data System (ADS)

    Dudukalov, A.

    Leakage of highly mineralized chloride-sodium brines, incidentally produced during oil field exploitation, from pipelines, nonhermetic wells, and other industrial equipment is one of the main sources of fresh groundwater contamination in the Arlan oil field. A thermodynamic calculation, aimed at defining more exactly the brines' chemical composition and density, was carried out with the FREZCHEM2 program (Mironenko M.V. et al. 1997). Five brine types with mineralization of 137.9, 181.2, 217.4, 243.7, and 267.8 g/l and density of 1.176, 1.09, 1.135, 1.153, and 1.167 g/cm3, correspondingly, were used. It is necessary to note that the chemical compositions of the two last brines were preliminarily corrected according to their mineralization. The calculations yielded the following brine density values: 1.082, 1.114, 1.131, 1.146, and 1.158 g/cm3, consequently. The obtained results demonstrate a significant discrepancy between the experimental and model estimates. A significant excess of anions over cations in the experimental data indicates a major problem with the analytical measurements. The calculations also analyzed the possibility of changes in brine density depending on adding to cations, or deducting from anions, the requisite amount of agent to keep the charge balance equal to zero. The results demonstrate that in this case brine density can change by 0.004-0.011 g/cm3.

  13. Correcting power and p-value calculations for bias in diffusion tensor imaging.

    PubMed

    Lauzon, Carolyn B; Landman, Bennett A

    2013-07-01

    Diffusion tensor imaging (DTI) provides quantitative parametric maps sensitive to tissue microarchitecture (e.g., fractional anisotropy, FA). These maps are estimated through computational processes and subject to random distortions including variance and bias. Traditional statistical procedures commonly used for study planning (including power analyses and p-value/alpha-rate thresholds) specifically model variability, but neglect potential impacts of bias. Herein, we quantitatively investigate the impacts of bias in DTI on hypothesis test properties (power and alpha-rate) using a two-sided hypothesis testing framework. We present theoretical evaluation of bias on hypothesis test properties, evaluate the bias estimation technique SIMEX for DTI hypothesis testing using simulated data, and evaluate the impacts of bias on spatially varying power and alpha rates in an empirical study of 21 subjects. Bias is shown to inflate alpha rates, distort the power curve, and cause significant power loss even in empirical settings where the expected difference in bias between groups is zero. These adverse effects can be attenuated by properly accounting for bias in the calculation of power and p-values. Copyright © 2013 Elsevier Inc. All rights reserved.
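
The alpha-rate inflation described above can be illustrated with a small calculation: for a two-sided z-test, a bias of d standard errors shifts the null distribution of the test statistic, so the actual rejection rate under H0 exceeds the nominal level. A minimal sketch (not the paper's SIMEX implementation; the numbers are purely illustrative):

```python
from statistics import NormalDist

def actual_alpha(nominal_alpha, standardized_bias):
    """Rejection rate under H0 for a two-sided z-test when the test
    statistic carries a bias of `standardized_bias` standard errors."""
    nd = NormalDist()
    z = nd.inv_cdf(1 - nominal_alpha / 2)       # critical value, e.g. 1.96
    d = standardized_bias
    return nd.cdf(-z - d) + (1 - nd.cdf(z - d))

# With zero bias the actual alpha equals the nominal 5%;
# a bias of one standard error more than triples it.
print(round(actual_alpha(0.05, 0.0), 3))   # → 0.05
print(round(actual_alpha(0.05, 1.0), 3))   # → 0.17
```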

  14. Simulation-based power calculations for planning a two-stage individual participant data meta-analysis.

    PubMed

    Ensor, Joie; Burke, Danielle L; Snell, Kym I E; Hemming, Karla; Riley, Richard D

    2018-05-18

    Researchers and funders should consider the statistical power of planned Individual Participant Data (IPD) meta-analysis projects, as they are often time-consuming and costly. We propose simulation-based power calculations utilising a two-stage framework, and illustrate the approach for a planned IPD meta-analysis of randomised trials with continuous outcomes where the aim is to identify treatment-covariate interactions. The simulation approach has four steps: (i) specify an underlying (data generating) statistical model for trials in the IPD meta-analysis; (ii) use readily available information (e.g. from publications) and prior knowledge (e.g. number of studies promising IPD) to specify model parameter values (e.g. control group mean, intervention effect, treatment-covariate interaction); (iii) simulate an IPD meta-analysis dataset of a particular size from the model, and apply a two-stage IPD meta-analysis to obtain the summary estimate of interest (e.g. interaction effect) and its associated p-value; (iv) repeat the previous step (e.g. thousands of times), then estimate the power to detect a genuine effect by the proportion of summary estimates with a significant p-value. In a planned IPD meta-analysis of lifestyle interventions to reduce weight gain in pregnancy, 14 trials (1183 patients) promised their IPD to examine a treatment-BMI interaction (i.e. whether baseline BMI modifies intervention effect on weight gain). Using our simulation-based approach, a two-stage IPD meta-analysis has < 60% power to detect a reduction of 1 kg weight gain for a 10-unit increase in BMI. Additional IPD from ten other published trials (containing 1761 patients) would improve power to over 80%, but only if a fixed-effect meta-analysis was appropriate. Pre-specified adjustment for prognostic factors would increase power further. Incorrect dichotomisation of BMI would reduce power by over 20%, similar to immediately throwing away IPD from ten trials. Simulation-based power
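
The four steps above can be sketched in miniature. The following is a simplified illustration, not the authors' code: it uses a binary covariate rather than continuous BMI, hypothetical parameter values, and a fixed-effect inverse-variance second stage:

```python
import random

def simulate_power(n_trials=14, n_per_trial=84, interaction=0.4,
                   sd=1.0, reps=300, seed=1):
    """Steps (i)-(iv): simulate IPD, run a two-stage meta-analysis of a
    treatment-covariate interaction, count significant pooled results."""
    random.seed(seed)
    hits = 0
    for _ in range(reps):
        ests, weights = [], []
        for _ in range(n_trials):                    # stage 1, per trial
            groups = {(t, x): [] for t in (0, 1) for x in (0, 1)}
            for i in range(n_per_trial):
                t, x = i % 2, random.getrandbits(1)  # treatment, covariate
                y = 0.5 * t + interaction * t * x + random.gauss(0.0, sd)
                groups[(t, x)].append(y)             # steps (i)-(ii): model
            mean = {k: sum(v) / len(v) for k, v in groups.items()}
            sq_se = {k: sum((yi - mean[k]) ** 2 for yi in v)
                        / (len(v) - 1) / len(v)
                     for k, v in groups.items()}
            est = (mean[1, 1] - mean[0, 1]) - (mean[1, 0] - mean[0, 0])
            ests.append(est)
            weights.append(1.0 / sum(sq_se.values()))
        pooled = sum(w * e for w, e in zip(weights, ests)) / sum(weights)
        se = (1.0 / sum(weights)) ** 0.5             # stage 2: fixed effect
        if abs(pooled / se) > 1.96:                  # step (iv): count hits
            hits += 1
    return hits / reps
```

With these illustrative numbers the estimated power is high; setting `interaction=0.0` recovers roughly the 5% type I error rate.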

  15. 31 CFR 351.32 - How are redemption values calculated for Series EE bonds with issue dates of May 1, 1997, through...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 31 Money and Finance:Treasury 2 2012-07-01 2012-07-01 false How are redemption values calculated... Money and Finance: Treasury Regulations Relating to Money and Finance (Continued) FISCAL SERVICE... formula: FV = PV × [1 + (i ÷ 2)]^(m/6), where FV (future value) = redemption value on redemption date...
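
Restated, the accrual formula compounds semiannually over m/6 half-year periods: FV = PV × [1 + (i/2)]^(m/6). A hedged sketch (the rate and term are hypothetical; actual Series EE rates and rounding conventions are set by Treasury):

```python
def redemption_value(par_investment, annual_rate, months):
    """FV = PV * (1 + i/2) ** (m/6), rounded to the nearest cent."""
    fv = par_investment * (1.0 + annual_rate / 2.0) ** (months / 6.0)
    return round(fv, 2)

# Hypothetical example: $25 par, 4% annual rate, held 12 months
# (two full half-year periods at 2% each).
print(redemption_value(25, 0.04, 12))   # → 26.01
```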

  16. Grey-Markov prediction model based on background value optimization and central-point triangular whitenization weight function

    NASA Astrophysics Data System (ADS)

    Ye, Jing; Dang, Yaoguo; Li, Bingjun

    2018-01-01

    The Grey-Markov forecasting model is a combination of the grey prediction model and a Markov chain, and it shows clear advantages for data sequences that are non-stationary and volatile. However, the state division process in the traditional Grey-Markov forecasting model is mostly based on subjectively chosen real numbers, which directly affects the accuracy of the forecast values. To address this, this paper introduces the central-point triangular whitenization weight function into state division to calculate the possibility of the research values lying in each state, reflecting preference degrees for different states in an objective way. On the other hand, background value optimization is applied in the traditional grey model to generate better-fitting data. By these means, the improved Grey-Markov forecasting model is built. Finally, taking grain production in Henan Province as an example, the model's validity is verified by comparison with GM(1,1) based on background value optimization and with the traditional Grey-Markov forecasting model.
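
For context, the baseline GM(1,1) model that the paper optimizes works as follows: cumulate the series (1-AGO), form the background value z1(k) = w·x1(k) + (1−w)·x1(k−1) with the conventional weight w = 0.5 (this weight is what background-value optimization tunes), estimate the develop coefficient a and grey input b by least squares, then forecast from the exponential response. A minimal sketch of the classical model only; the paper's optimized weight and Markov state division are not reproduced here:

```python
import math

def gm11_forecast(x0, steps=1, w=0.5):
    """Classical GM(1,1) forecast of the next `steps` values.
    Assumes a non-constant, positive series (so the develop
    coefficient a is nonzero)."""
    n = len(x0)
    x1 = [sum(x0[:k + 1]) for k in range(n)]                 # 1-AGO series
    z1 = [w * x1[k] + (1 - w) * x1[k - 1] for k in range(1, n)]
    y = x0[1:]
    # Closed-form least squares for x0(k) = -a * z1(k) + b.
    zm, ym = sum(z1) / len(z1), sum(y) / len(y)
    szz = sum((z - zm) ** 2 for z in z1)
    szy = sum((z - zm) * (v - ym) for z, v in zip(z1, y))
    slope = szy / szz                                        # equals -a
    a, b = -slope, ym - slope * zm
    # Response: x1_hat(j) = (x0(1) - b/a) * exp(-a*(j-1)) + b/a.
    f = lambda j: (x0[0] - b / a) * math.exp(-a * (j - 1)) + b / a
    return [f(n + s) - f(n + s - 1) for s in range(1, steps + 1)]
```

On a near-exponential series (e.g. 10% growth per step) the one-step-ahead forecast tracks the true continuation closely.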

  17. An economic-research-based approach to calculate community health-staffing requirements in Xicheng District, Beijing.

    PubMed

    Yin, Delu; Yin, Tao; Yang, Huiming; Xin, Qianqian; Wang, Lihong; Li, Ninyan; Ding, Xiaoyan; Chen, Bowen

    2016-12-07

    A shortage of community health professionals has been a crucial issue hindering the development of community health services (CHS). Various methods have been established to calculate health workforce requirements. This study aimed to use an economic-research-based approach to calculate the number of community health professionals required to provide community health services in the Xicheng District of Beijing and then assess current staffing levels against this ideal. Using questionnaires, we collected relevant data from 14 community health centers in the Xicheng District, including resident population, number of different health services provided, and service volumes. Through 36 interviews with family doctors, nurses, and public health workers, and six focus groups, we were able to calculate the person-time (equivalent value) required for each community health service. Field observations were conducted to verify the durations. In the 14 community health centers in Xicheng District, 1752 health workers were found in our four categories, serving a population of 1.278 million. Total demand for community health services outstripped supply for doctors, nurses, and public health workers, but not for other professionals. The method suggested that to properly serve the study population an additional 64 family doctors, 40 nurses, and 753 public health workers would be required. Our calculations indicate that significant numbers of new health professionals are required to deliver community health services. We established time standards in minutes (equivalent value) for each community health service activity, which could be applied elsewhere in China by government planners and civil society advocates.
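
The staffing arithmetic implied by such time standards is straightforward: multiply each service's annual volume by its standard minutes, sum, and divide by one worker's available annual minutes. A sketch with hypothetical numbers (the study's actual time standards are not given in the abstract):

```python
import math

def required_staff(services, annual_minutes_per_worker):
    """services: list of (annual_volume, standard_minutes_per_service).
    Returns the whole number of workers needed to cover the workload."""
    workload = sum(volume * minutes for volume, minutes in services)
    return math.ceil(workload / annual_minutes_per_worker)

# Hypothetical: 10,000 consultations at 15 min plus 5,000 home visits
# at 30 min, against ~100,000 working minutes per worker per year.
print(required_staff([(10000, 15), (5000, 30)], 100000))   # → 3
```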

  18. Steganography based on pixel intensity value decomposition

    NASA Astrophysics Data System (ADS)

    Abdulla, Alan Anwar; Sellahewa, Harin; Jassim, Sabah A.

    2014-05-01

    This paper focuses on steganography based on pixel intensity value decomposition. A number of existing schemes such as binary, Fibonacci, Prime, Natural, Lucas, and Catalan-Fibonacci (CF) are evaluated in terms of payload capacity and stego quality. A new technique based on a specific representation is proposed to decompose pixel intensity values into 16 (virtual) bit-planes suitable for embedding purposes. The proposed decomposition has a desirable property whereby the sum of all bit-planes does not exceed the maximum pixel intensity value, i.e. 255. Experimental results demonstrate that the proposed technique offers an effective compromise between payload capacity and stego quality of existing embedding techniques based on pixel intensity value decomposition. Its capacity is equal to that of binary and Lucas, while it offers a higher capacity than Fibonacci, Prime, Natural, and CF when the secret bits are embedded in 1st Least Significant Bit (LSB). When the secret bits are embedded in higher bit-planes, i.e., 2nd LSB to 8th Most Significant Bit (MSB), the proposed scheme has more capacity than Natural numbers based embedding. However, from the 6th bit-plane onwards, the proposed scheme offers better stego quality. In general, the proposed decomposition scheme has less effect in terms of quality on pixel value when compared to most existing pixel intensity value decomposition techniques when embedding messages in higher bit-planes.
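
Of the decompositions compared above, the Fibonacci scheme is easy to sketch: each pixel value is written as a sum of non-consecutive Fibonacci numbers (Zeckendorf form), giving 12 virtual bit-planes for 8-bit pixels. A minimal illustration of that baseline only, not the authors' proposed 16-plane decomposition:

```python
FIBS = [1, 2, 3, 5, 8, 13, 21, 34, 55, 89, 144, 233]   # 12 virtual bit-planes

def fib_decompose(value):
    """Greedy Zeckendorf decomposition of a pixel value (0-255):
    bits[i] == 1 means FIBS[i] contributes to the sum."""
    bits = [0] * len(FIBS)
    for i in range(len(FIBS) - 1, -1, -1):   # largest Fibonacci first
        if FIBS[i] <= value:
            bits[i] = 1
            value -= FIBS[i]
    return bits

def fib_compose(bits):
    """Recompose the pixel value from its virtual bit-planes."""
    return sum(f for f, b in zip(FIBS, bits) if b)
```

Embedding flips bits in a chosen virtual plane and recomposes; lower planes perturb the pixel less, which is the capacity/stego-quality trade-off the paper measures.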

  19. Space resection model calculation based on Random Sample Consensus algorithm

    NASA Astrophysics Data System (ADS)

    Liu, Xinzhu; Kang, Zhizhong

    2016-03-01

    Resection has been one of the most important topics in photogrammetry. It aims to recover the position and attitude of the camera at the shooting point. In some cases, however, the observed values used in the calculation contain gross errors. This paper presents a robust algorithm in which the RANSAC method with a DLT model effectively avoids the difficulty of determining initial values when using the collinearity equations. The results also show that our strategy can exclude gross errors and leads to an accurate and efficient way to obtain the elements of exterior orientation.

  20. Value-Based Argumentation for Justifying Compliance

    NASA Astrophysics Data System (ADS)

    Burgemeestre, Brigitte; Hulstijn, Joris; Tan, Yao-Hua

    Compliance is often achieved 'by design' through a coherent system of controls consisting of information systems and procedures. This system-based control requires a new approach to auditing in which companies must demonstrate to the regulator that they are 'in control'. They must determine the relevance of a regulation for their business, justify which set of control measures they have taken to comply with it, and demonstrate that the control measures are operationally effective. In this paper we show how value-based argumentation theory can be applied to the compliance domain. Corporate values motivate the selection of control measures (actions) which aim to fulfill control objectives, i.e. adopted norms (goals). In particular, we show how to formalize the dialogue in which companies justify their compliance decisions to regulators using value-based argumentation. The approach is illustrated by a case study of the safety and security measures adopted in the context of EU customs regulation.

  1. SU-F-BRA-10: Fricke Dosimetry: Determination of the G-Value for Ir-192 Energy Based On the NRC Methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Salata, C; David, M; Rosado, P

    Purpose: Use the methodology developed by the National Research Council Canada (NRC) for Fricke dosimetry to determine the G-value used at Ir-192 energies. Methods: In this study the Radiology Science Laboratory of Rio de Janeiro State University (LCR) based the G-value determination on the NRC method, using polyethylene bags. Briefly, this method consists of interpolating the G-values calculated for Co-60 and 250 kV x-rays to the average energy of Ir-192 (380 keV). As the Co-60 G-value is well described in the literature, and associated with low uncertainties, it wasn't measured in the present study. The G-values for 150 kV (effective energy of 68 keV), 250 kV (effective energy of 132 keV), and 300 kV (effective energy of 159 keV) were calculated using the air kerma given by a calibrated ion chamber, making it equivalent to the dose absorbed in the Fricke solution by means of a Monte Carlo calculated conversion factor. Instead of interpolations, as described by the NRC, we plotted the G-value points in a graph and used the line equation to determine the G-value for Ir-192 (380 keV). Results: The measured G-values were 1.436 ± 0.002 µmol/J for 150 kV, 1.472 ± 0.002 µmol/J for 250 kV, and 1.497 ± 0.003 µmol/J for 300 kV. The G-value used for Co-60 (1.25 MeV) was 1.613 µmol/J. The R-square of the fitted regression line among those G-value points was 0.991. Using the line equation, the calculated G-value for 380 keV was 1.542 µmol/J. Conclusion: The result found for the Ir-192 G-value is 3.1% lower than the NRC value, but it agrees with previous literature results using different methodologies to calculate this parameter. We will continue this experiment by measuring the G-value for Co-60 in order to compare with the NRC method and better understand the reasons for the differences found.
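
The "line equation" step amounts to fitting a least-squares line through (energy, G-value) points and evaluating it at 380 keV. The sketch below pairs the reported G-values with the effective energies quoted in the abstract plus Co-60 at 1250 keV; since the abstract does not state exactly which energy axis was regressed, the fitted value here need not match the reported 1.542 µmol/J:

```python
def fit_line(points):
    """Ordinary least squares for y = slope * x + intercept."""
    n = len(points)
    xm = sum(x for x, _ in points) / n
    ym = sum(y for _, y in points) / n
    sxx = sum((x - xm) ** 2 for x, _ in points)
    sxy = sum((x - xm) * (y - ym) for x, y in points)
    slope = sxy / sxx
    return slope, ym - slope * xm

# (assumed effective energy in keV, reported G-value in umol/J)
pts = [(68, 1.436), (132, 1.472), (159, 1.497), (1250, 1.613)]
slope, intercept = fit_line(pts)
g_ir192 = slope * 380 + intercept   # G-value estimate at 380 keV
print(round(g_ir192, 3))
```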

  2. Calculating meal glycemic index by using measured and published food values compared with directly measured meal glycemic index.

    PubMed

    Dodd, Hayley; Williams, Sheila; Brown, Rachel; Venn, Bernard

    2011-10-01

    Glycemic index (GI) testing is normally based on individual foods, whereas GIs for meals or diets are based on a formula using a weighted sum of the constituents. The accuracy with which the formula can predict a meal or diet GI is questionable. Our objective was to compare the GI of meals, obtained by using the formula and by using both measured food GI and published values, with directly measured meal GIs. The GIs of 7 foods were tested in 30 healthy people. The foods were combined into 3 meals, each of which provided 50 g available carbohydrate, including a staple (potato, rice, or spaghetti), vegetables, sauce, and pan-fried chicken. The mean (95% CI) meal GIs determined from individual food GI values and by direct measurement were as follows: potato meal [predicted, 63 (56, 70); measured, 53 (46, 62)], rice meal [predicted, 51 (45, 56); measured, 38 (33, 45)], and spaghetti meal [predicted, 54 (49, 60); measured, 38 (33, 44)]. The predicted meal GIs were all higher than the measured GIs (P < 0.001). The extent of the overestimation depended on the particular food, ie, 12, 15, and 19 GI units (or 22%, 40%, and 50%) for the potato, rice, and spaghetti meals, respectively. The formula overestimated the GI of the meals by between 22% and 50%. The use of published food values also overestimated the measured meal GIs. Investigators using the formula to calculate a meal or diet GI should be aware of limitations in the method. This trial is registered with the Australian and New Zealand Clinical Trials Registry as ACTRN12611000210976.
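
The formula under test is the carbohydrate-weighted mean: GI_meal = sum over foods of (carb_i / carb_total) × GI_i. A sketch with hypothetical food values (not the study's measured data):

```python
def meal_gi(components):
    """components: list of (food_gi, available_carbohydrate_g) tuples.
    Returns the carbohydrate-weighted mean GI of the meal."""
    total_carb = sum(carb for _, carb in components)
    return sum(gi * carb / total_carb for gi, carb in components)

# Hypothetical meal: staple GI 80 / 40 g, vegetables GI 40 / 5 g,
# sauce GI 50 / 5 g  ->  carbohydrate-weighted meal GI.
print(meal_gi([(80, 40), (40, 5), (50, 5)]))   # → 73.0
```

As the study shows, this predicted value can substantially overestimate a directly measured meal GI.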

  3. Values based practice: a framework for thinking with.

    PubMed

    Mohanna, Kay

    2017-07-01

    Values are those principles that govern behaviours, and values-based practice has been described as a theory and skills base for effective healthcare decision-making where different (and hence potentially conflicting) values are in play. The emphasis is on good process rather than pre-set right outcomes, aiming to achieve balanced decision-making. In this article we will consider the utility of this model by looking at leadership development, a current area of much interest and investment in healthcare. Copeland points out that 'values based leadership behaviors are styles with a moral, authentic and ethical dimension', important qualities in healthcare decision-making.

  4. A simplified calculation procedure for mass isotopomer distribution analysis (MIDA) based on multiple linear regression.

    PubMed

    Fernández-Fernández, Mario; Rodríguez-González, Pablo; García Alonso, J Ignacio

    2016-10-01

    We have developed a novel, rapid and easy calculation procedure for Mass Isotopomer Distribution Analysis based on multiple linear regression which allows the simultaneous calculation of the precursor pool enrichment and the fraction of newly synthesized labelled proteins (fractional synthesis) using linear algebra. To test this approach, we used the peptide RGGGLK as a model tryptic peptide containing three subunits of glycine. We selected glycine labelled with two ¹³C atoms (¹³C₂-glycine) as the labelled amino acid to demonstrate that spectral overlap is not a problem in the proposed methodology. The developed methodology was tested first in vitro by changing the precursor pool enrichment from 10 to 40% of ¹³C₂-glycine. Secondly, a simulated in vivo synthesis of proteins was designed by combining the natural abundance RGGGLK peptide and 10 or 20% ¹³C₂-glycine at 1 : 1, 1 : 3 and 3 : 1 ratios. Precursor pool enrichments and fractional synthesis values were calculated with satisfactory precision and accuracy using a simple spreadsheet. This novel approach can provide a relatively rapid and easy means to measure protein turnover based on stable isotope tracers. Copyright © 2016 John Wiley & Sons, Ltd.

  5. Microstructure-Sensitive Extreme Value Probabilities for High Cycle Fatigue of Ni-Base Superalloy IN100 (Preprint)

    DTIC Science & Technology

    2009-03-01

    transition fatigue regimes; however, microplasticity (i.e., heterogeneous plasticity at the scale of microstructure) is relevant to understanding fatigue...and Socie [57] considered the effect of microplasticity...considers the local stress state as affected by intergranular interactions and microplasticity. For the calculations given below, the volumes over which

  6. [Normative guidelines for allocating human resources in child and adolescent psychiatry using average values under convergence conditions instead of price determination - analysis of the data of university hospitals in Germany concerning the costs of calculating day and minute values according to Psych-PV and PEPP-System].

    PubMed

    Barufka, Steffi; Heller, Michael; Prayon, Valeria; Fegert, Jörg M

    2015-11-01

    Despite substantial opposition in the practical field, based on an amendment to the Hospital Financing Act (KHG), the so-called PEPP-System was introduced in child and adolescent psychiatry as a new calculation model. The 2-year moratorium, combined with the rescheduling of the repeal of the psychiatry personnel regulation (Psych-PV) and a convergence phase, provided the German Federal Ministry of Health with additional time to enter a structured dialogue with professional associations. The perspective concerning the regulatory framework in particular is presently unclear. In light of this debate, this article provides calculations to illustrate the transformation of the previous personnel regulation into the PEPP-System by means of the data of §21 KHEntgG stemming from the 22 university hospitals of child and adolescent psychiatry and psychotherapy in Germany. In 2013 there was a total of 7,712 cases and 263,694 calculation days. In order to identify a necessary basic reimbursement value that would guarantee a constant quality of patient care, the authors utilize outcomes, cost structures, calculation days, and minute values for individual professional groups according to both systems (Psych-PV and PEPP) based on data from 2013 and the InEK's analysis of the calculation datasets. The authors propose a normative agreement on the basic reimbursement value between 270 and 285 EUR. This takes into account the concentration phenomenon and the expansion of services that has occurred since the introduction of the Psych-PV system. Such a normative agreement on structural quality could provide a verifiable framework for the allocation of human resources corresponding to the previous regulations of Psych-PV.

  7. Comparison of primary zone combustor liner wall temperatures with calculated predictions

    NASA Technical Reports Server (NTRS)

    Norgren, C. T.

    1973-01-01

    Calculated liner temperatures based on a steady-state radiative and convective heat balance at the liner wall were compared with experimental values. Calculated liner temperatures were approximately 8 percent higher than experimental values. A radiometer was used to experimentally determine values of flame temperature and flame emissivity. Film cooling effectiveness was calculated from an empirical turbulent mixing expression assuming a turbulent mixing level of 2 percent. Liner wall temperatures were measured in a rectangular combustor segment 6 by 12 in. and tested at pressures up to 26.7 atm and inlet temperatures up to 922 K.

  8. Software-Based Visual Loan Calculator For Banking Industry

    NASA Astrophysics Data System (ADS)

    Isizoh, A. N.; Anazia, A. E.; Okide, S. O.; Onyeyili, T. I.; Okwaraoka, C. A. P.

    2012-03-01

    Software for the banking industry is very necessary in the modern banking system, which uses many design techniques for security reasons. This paper thus presents the software-based design and implementation of a Visual Loan Calculator for the banking industry using Visual Basic .Net (VB.Net). The fundamental approach is to develop a Graphical User Interface (GUI) using VB.Net operating tools, and then to develop a working program which calculates the interest on any loan obtained. The VB.Net programming was done and implemented, and the software proved satisfactory.
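
The paper does not reproduce its interest formulas; as a hedged illustration, a loan calculator of this kind typically computes the fixed monthly payment from the standard amortization formula P = L·r / (1 − (1 + r)^(−n)), shown here in Python rather than VB.Net:

```python
def monthly_payment(principal, annual_rate, months):
    """Fixed payment for a fully amortizing loan; r is the monthly rate."""
    r = annual_rate / 12.0
    if r == 0:
        return principal / months          # zero-interest edge case
    return principal * r / (1.0 - (1.0 + r) ** -months)

def total_interest(principal, annual_rate, months):
    """Total interest paid over the life of the loan."""
    return monthly_payment(principal, annual_rate, months) * months - principal

# Hypothetical loan: $1,000 at 12% annual interest over 12 months.
print(round(monthly_payment(1000, 0.12, 12), 2))   # → 88.85
```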

  9. The effects of calculator-based laboratories on standardized test scores

    NASA Astrophysics Data System (ADS)

    Stevens, Charlotte Bethany Rains

    Nationwide, the goal of providing a productive science and math education in today's educational institutions centers on the technology being utilized in these classrooms. In this age of digital technology, educational software and calculator-based laboratories (CBL) have become significant devices in the teaching of science and math in many states across the United States. The Texas Instruments graphing calculator and Vernier LabPro interface are among the calculator-based laboratories becoming increasingly popular among middle and high school science and math teachers in many school districts across this country. In Tennessee, however, it is reported that this type of technology is not regularly utilized at the student level in most high school science classrooms, especially in the area of Physical Science (Vernier, 2006). This research explored the effect of calculator-based laboratory instruction on standardized test scores. The purpose of this study was to determine the effect of traditional teaching methods versus graphing calculator teaching methods on the state-mandated End-of-Course (EOC) Physical Science exam based on ability, gender, and ethnicity. The sample included 187 total tenth and eleventh grade physical science students, 101 of whom belonged to a control group and 87 of whom belonged to the experimental group. Physical Science End-of-Course scores obtained from the Tennessee Department of Education during the spring of 2005 and the spring of 2006 were used to examine the hypotheses. The findings of this research study suggested that the type of teaching method, traditional or calculator-based, did not have an effect on standardized test scores. However, the students' ability level, as demonstrated on the End-of-Course test, had a significant effect on End-of-Course test scores. This study focused on a limited population of high school physical science students in the middle Tennessee

  10. Ultrasoft pseudopotentials and Hubbard U values for rare-earth elements (Re=La-Lu) guided by HSE06 calculations

    NASA Astrophysics Data System (ADS)

    Topsakal, Mehmet; Umemoto, Koichiro; Wentzcovitch, Renata

    2014-03-01

    The lanthanide series of the periodic table comprises fifteen members ranging from La to Lu - the rare-earth (Re) elements. They exhibit unique (and mostly unexplored) chemical properties depending on the filling of the 4f orbitals. Due to strong electronic correlation, 4f valence electrons are incorrectly described by standard DFT functionals. In order to cope with these deficiencies, the DFT+U method is often employed, where a Hubbard-type U is introduced into standard DFT. Another approach is to use hybrid functionals. Both improve the treatment of strongly correlated electrons. However, DFT+U suffers from the ambiguity of U, while hybrid functionals suffer from extremely demanding computational costs. Here we provide Vanderbilt-type ultrasoft pseudopotentials for Re elements with suggested U values, allowing efficient plane-wave calculations. Hubbard U values are determined from HSE06 calculations on Re-nitrides (ReN). The generated pseudopotentials were further tested on some Re-cobaltite (Re-CoO3) perovskites. Alternative pseudopotentials with the f-electrons kept frozen in the core are also provided, and possible outcomes are addressed. We believe that these new pseudopotentials with suggested U values will enable further studies on rare-earth materials.

  11. SU-F-J-109: Generate Synthetic CT From Cone Beam CT for CBCT-Based Dose Calculation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, H; Barbee, D; Wang, W

    Purpose: The use of CBCT for dose calculation is limited by its HU inaccuracy from increased scatter. This study presents a method to generate synthetic CT images from CBCT data by a probabilistic classification that may be robust to CBCT noise. The feasibility of using the synthetic CT for dose calculation is evaluated in IMRT for unilateral H&N cancer. Methods: In the training phase, a fuzzy c-means classification was performed on HU vectors (CBCT, CT) of planning CT and registered day-1 CBCT image pair. Using the resulting centroid CBCT and CT values for five classified “tissue” types, a synthetic CT for a daily CBCT was created by classifying each CBCT voxel to obtain its probability belonging to each tissue class, then assigning a CT HU with a probability-weighted summation of the classes’ CT centroids. Two synthetic CTs from a CBCT were generated: s-CT using the centroids from classification of individual patient CBCT/CT data; s2-CT using the same centroids for all patients to investigate the applicability of group-based centroids. IMRT dose calculations for five patients were performed on the synthetic CTs and compared with CT-planning doses by dose-volume statistics. Results: DVH curves of PTVs and critical organs calculated on s-CT and s2-CT agree with those from planning-CT within 3%, while doses calculated with heterogeneity off or on raw CBCT show DVH differences up to 15%. The differences in PTV D95% and spinal cord max are 0.6±0.6% and 0.6±0.3% for s-CT, and 1.6±1.7% and 1.9±1.7% for s2-CT. Gamma analysis (2%/2mm) shows 97.5±1.6% and 97.6±1.6% pass rates for using s-CTs and s2-CTs compared with CT-based doses, respectively. Conclusion: CBCT-synthesized CTs using individual or group-based centroids resulted in dose calculations that are comparable to CT-planning dose for unilateral H&N cancer. The method may provide a tool for accurate dose calculation based on daily CBCT.
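
The probability-weighted HU assignment can be sketched with fuzzy c-means-style memberships. The centroid values below are hypothetical, and the real method classifies paired (CBCT, CT) HU vectors rather than a single scalar per voxel:

```python
def synthetic_hu(cbct_value, centroids, m=2.0):
    """centroids: list of (cbct_centroid, ct_centroid) pairs, one per
    tissue class. Returns a probability-weighted CT HU for the voxel,
    using FCM-style memberships with fuzzifier m."""
    d = [abs(cbct_value - c) for c, _ in centroids]
    if any(di == 0 for di in d):   # voxel sits exactly on a centroid
        return next(ct for (c, ct), di in zip(centroids, d) if di == 0)
    p = 2.0 / (m - 1.0)
    u = [1.0 / sum((dk / dj) ** p for dj in d) for dk in d]   # memberships
    return sum(uk * ct for uk, (_, ct) in zip(u, centroids))

# Hypothetical 3-class centroid table: (CBCT HU, CT HU) per class.
classes = [(-700, -800), (0, 30), (300, 250)]
print(synthetic_hu(-700, classes))   # → -800
```

A voxel between two centroids receives a blend of their CT values, weighted by how close it is to each.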

  12. The effects of variations in parameters and algorithm choices on calculated radiomics feature values: initial investigations and comparisons to feature variability across CT image acquisition conditions

    NASA Astrophysics Data System (ADS)

    Emaminejad, Nastaran; Wahi-Anwar, Muhammad; Hoffman, John; Kim, Grace H.; Brown, Matthew S.; McNitt-Gray, Michael

    2018-02-01

    Translation of radiomics into clinical practice requires confidence in its interpretations. This may be obtained via understanding and overcoming the limitations in current radiomic approaches. Currently there is a lack of standardization in radiomic feature extraction. In this study we examined a few factors that are potential sources of inconsistency in characterizing lung nodules, such as 1) different choices of parameters and algorithms in feature calculation, 2) two CT image dose levels, and 3) different CT reconstruction algorithms (WFBP, denoised WFBP, and iterative). We investigated the effect of variation of these factors on the entropy textural features of lung nodules. CT images of 19 lung nodules identified through our lung cancer screening program were identified by a CAD tool and contours provided. The radiomics features were extracted by calculating 36 GLCM-based and 4 histogram-based entropy features, in addition to 2 intensity-based features. A robustness index was calculated across different image acquisition parameters to illustrate the reproducibility of features. Most GLCM-based and all histogram-based entropy features were robust across the two CT image dose levels. Denoising of images slightly improved the robustness of some entropy features at WFBP. Iterative reconstruction improved robustness less often and caused more variation in entropy feature values and their robustness. Across different choices of parameters and algorithms, texture features showed a wide range of variation, as much as 75% for individual nodules. The results indicate the need for harmonization of feature calculations and identification of optimum parameters and algorithms in a radiomics study.
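
As an example of the parameter sensitivity at issue, even the simplest histogram-based entropy feature depends on the bin width chosen for the histogram. A minimal illustration (illustrative only, not the study's feature definitions):

```python
import math
from collections import Counter

def histogram_entropy(pixels, bin_width=1):
    """Shannon entropy (bits) of the intensity histogram; bin_width is
    one of the extraction parameters that changes the feature value."""
    counts = Counter(int(p // bin_width) for p in pixels)
    n = len(pixels)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

patch = [0, 1, 2, 3] * 25            # toy 100-pixel patch
print(histogram_entropy(patch, 1))   # → 2.0 (four equally filled bins)
print(histogram_entropy(patch, 2))   # → 1.0 (coarser bins halve the value)
```

The same patch yields different "entropy" values purely from the binning choice, which is why harmonized parameter settings matter.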

  13. Study of fatigue crack propagation in Ti-1Al-1Mn based on the calculation of cold work evolution

    NASA Astrophysics Data System (ADS)

    Plekhov, O. A.; Kostina, A. A.

    2017-05-01

    This work proposes a numerical method for lifetime assessment of metallic materials based on consideration of the energy balance at the crack tip. The method evaluates the stored energy per loading cycle. To calculate the stored and dissipated parts of the deformation energy, an elasto-plastic phenomenological model of the energy balance in metals under deformation and failure was proposed. The key point of the model is a strain-type internal variable describing the energy storage process. This parameter is introduced, based on a statistical description of defect evolution in metals, as a second-order tensor and has the meaning of an additional strain due to the initiation and growth of defects. The fatigue crack rate was calculated in the framework of a stationary crack approach (several loading cycles were considered for every crack length to estimate the energy balance at the crack tip). The application of the proposed algorithm is illustrated by calculating the lifetime of a Ti-1Al-1Mn compact tension specimen under cyclic loading.

  14. Creative Uses for Calculator-based Laboratory (CBL) Technology in Chemistry.

    ERIC Educational Resources Information Center

    Sales, Cynthia L.; Ragan, Nicole M.; Murphy, Maureen Kendrick

    1999-01-01

    Reviews three projects that use a graphing calculator linked to a calculator-based laboratory device as a portable data-collection system for students in chemistry classes. Projects include Isolation, Purification and Quantification of Buckminsterfullerene from Woodstove Ashes; Determination of the Activation Energy Associated with the…

  15. Current State of Value-Based Purchasing Programs

    PubMed Central

    Chee, Tingyin T.; Ryan, Andrew M.; Wasfy, Jason H.; Borden, William B.

    2016-01-01

    The United States healthcare system is rapidly moving toward rewarding value. Recent legislation, such as the Affordable Care Act and the Medicare Access and CHIP Reauthorization Act (MACRA), solidified the role of value-based payment in Medicare. Many private insurers are following Medicare’s lead. Much of the policy attention has been on programs such as accountable care organizations and bundled payments; yet, value-based purchasing (VBP) or pay-for-performance, defined as providers being paid fee-for-service with payment adjustments up or down based on value metrics, remains a core element of value payment in MACRA and will likely remain so for the foreseeable future. This review article summarizes the current state of VBP programs and provides analysis of the strengths, weaknesses, and opportunities for the future. Multiple inpatient and outpatient VBP programs have been implemented and evaluated, with the impact of those programs being marginal. Opportunities to enhance the performance of VBP programs include improving the quality measurement science, strengthening both the size and design of incentives, reducing health disparities, establishing broad outcome measurement, choosing appropriate comparison targets, and determining the optimal role of VBP relative to alternative payment models. VBP programs will play a significant role in healthcare delivery for years to come, and they serve as an opportunity for providers to build the infrastructure needed for value-oriented care. PMID:27245648

  16. Value-based attentional capture influences context-dependent decision-making

    PubMed Central

    Cha, Kexin; Rangsipat, Napat; Serences, John T.

    2015-01-01

    Normative theories posit that value-based decision-making is context independent. However, decisions between two high-value options can be suboptimally biased by the introduction of a third low-value option. This context-dependent modulation is consistent with the divisive normalization of the value of each stimulus by the total value of all stimuli. In addition, an independent line of research demonstrates that pairing a stimulus with a high-value outcome can lead to attentional capture that can mediate the efficiency of visual information processing. Here we tested the hypothesis that value-based attentional capture interacts with value-based normalization to influence the optimality of decision-making. We used a binary-choice paradigm in which observers selected between two targets and the color of each target indicated the magnitude of their reward potential. Observers also had to simultaneously ignore a task-irrelevant distractor rendered in a color that was previously associated with a specific reward magnitude. When the color of the task-irrelevant distractor was previously associated with a high reward, observers responded more slowly and less optimally. Moreover, as the learned value of the distractor increased, electrophysiological data revealed an attenuation of the lateralized N1 and N2Pc responses evoked by the relevant choice stimuli and an attenuation of the late positive deflection (LPD). Collectively, these behavioral and electrophysiological data suggest that value-based attentional capture and value-based normalization jointly mediate the influence of context on free-choice decision-making. PMID:25995350

  17. Shrinkage regression-based methods for microarray missing value imputation.

    PubMed

    Wang, Hsiuying; Chiu, Chia-Chun; Wu, Yi-Ching; Wu, Wei-Sheng

    2013-01-01

    Missing values commonly occur in microarray data, which usually contain more than 5% missing values, with up to 90% of genes affected. Inaccurate missing value estimation reduces the power of downstream microarray data analyses. Many types of methods have been developed to estimate missing values. Among them, regression-based methods are very popular and have been shown to outperform other types of methods on many testing microarray datasets. To further improve their performance, we propose shrinkage regression-based methods. Our methods take advantage of the correlation structure in the microarray data and select genes similar to the target gene by Pearson correlation coefficients. In addition, our methods apply the least squares principle, use a shrinkage estimation approach to adjust the coefficients of the regression model, and then use the new coefficients to estimate missing values. Simulation results show that the proposed methods provide more accurate missing value estimation on six testing microarray datasets than the existing regression-based methods do. Imputation of missing values is an important aspect of microarray data analysis because most downstream analyses require a complete dataset, so accurate and efficient estimation methods are essential. Since the proposed shrinkage regression-based methods provide accurate missing value estimation, they are competitive alternatives to the existing regression-based methods.
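
    A minimal sketch of the pipeline described here (select similar genes by Pearson correlation, fit by least squares, then shrink the coefficients) might look as follows. The fixed `shrink` factor is an assumption for illustration; the paper derives the shrinkage adjustment from the data.

```python
import numpy as np

def impute_shrinkage(X, target, missing_col, k=5, shrink=0.9):
    """Estimate the missing entry X[target, missing_col] from the k genes
    (rows) most correlated with the target gene, using least squares with
    the regression coefficients shrunk toward zero."""
    obs = [j for j in range(X.shape[1]) if j != missing_col]
    y = X[target, obs]
    # Pearson correlation of every other gene with the target gene
    corr = np.array([abs(np.corrcoef(X[i, obs], y)[0, 1]) if i != target else -1.0
                     for i in range(X.shape[0])])
    sim = np.argsort(corr)[::-1][:k]           # k most similar genes
    A = X[sim][:, obs].T                       # their observed expression values
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    beta *= shrink                             # shrinkage-adjusted coefficients
    return float(X[sim, missing_col] @ beta)
```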

  18. Value-based recruitment in midwifery: do the values align with what women say is important to them?

    PubMed

    Callwood, Alison; Cooke, Debbie; Allan, Helen

    2016-10-01

    The aim of this study was to discuss the theoretical conceptualization and definition of values and value-based recruitment in the context of women's views about what they would like from their midwife. Value-based recruitment received headline status in the UK government's response to pervasive deficiencies in compassionate care identified in the health service. Core values intended to inform service users' experience are defined in the National Health Service Constitution, but clarity is needed about whether these encompass all that women say is important to them. This is a discussion paper. A literature search included published papers written in English relating to values, value-based recruitment, and women's views of a 'good' midwife, with no date limiters. Definitions of values and value-based recruitment are examined, and congruence is explored between what women say is important to them and key government and professional regulatory documentation. The importance of a 'sustainable emotional' dimension in the midwife-mother relationship is suggested. Inconsistencies are identified between government and professional documentation and what women say they want, including the omission of any reference to emotions or emotionality in value-based recruitment policy and in professional recruitment and selection guidance documentation. A review of key professional documentation, in relation to selection for 'values', is proposed. We argue for clarity and revision so that the values embedded in value-based recruitment are consistent with health service users' views. An enhancement of the 'values' in the value-based recruitment framework is recommended to include the emotionality that women state is a fundamental part of their relationship with their midwife. © 2016 John Wiley & Sons Ltd.

  19. Pricing for Higher Education Institutions: A Value-Based Approach

    ERIC Educational Resources Information Center

    Amir, Amizawati Mohd; Auzair, Sofiah Md; Maelah, Ruhanita; Ahmad, Azlina

    2016-01-01

    Purpose: The purpose of this paper is to propose the concept of higher education institutions (HEIs) offering educational services based on value for money. The value is determined based on customers' (i.e. students) expectations of the service and the costs in comparison to the competitors. Understanding the value and creating customer value are…

  20. 'What the patient wants': an investigation of the methods of ascertaining patient values in evidence-based medicine and values-based practice.

    PubMed

    Wieten, Sarah

    2018-02-01

    Evidence-Based Medicine (EBM), Values-Based Practice (VBP) and Person-Centered Healthcare (PCH) are all concerned with the values in play in the clinical encounter. However, these recent movements are not in agreement about how to discover these relevant values. In some parts of EBM textbooks, the prescribed method for discovering values is through social science research on the average values in a particular population. VBP by contrast always investigates the individually held values of the different stakeholders in the particular clinical encounter, although the account has some other difficulties. I argue that although average values for populations might be very useful in informing questions of resource distribution and policy making, their use cannot replace the individual solicitation of patient (and other stakeholder) values in the clinical encounter. Because of the inconsistency of the EBM stance on values, the incompatibility of some versions of the EBM treatment of values with PCH, and EBM's attempt to transplant research methods from science into the realm of values, I must recommend the use of the VBP account of values discovery. © 2015 John Wiley & Sons, Ltd.

  1. Development of a quantum mechanics-based free-energy perturbation method: use in the calculation of relative solvation free energies.

    PubMed

    Reddy, M Rami; Singh, U C; Erion, Mark D

    2004-05-26

    Free-energy perturbation (FEP) is considered the most accurate computational method for calculating relative solvation and binding free-energy differences. Despite some success in applying FEP methods to both drug design and lead optimization, FEP calculations are rarely used in the pharmaceutical industry. One factor limiting the use of FEP is its low throughput, which is attributed in part to the dependence of conventional methods on the user's ability to develop accurate molecular mechanics (MM) force field parameters for individual drug candidates and the time required to complete the process. In an attempt to find an FEP method that could eventually be automated, we developed a method that uses quantum mechanics (QM) for treating the solute, MM for treating the solute surroundings, and the FEP method for computing free-energy differences. The thread technique was used in all transformations and proved to be essential for the successful completion of the calculations. Relative solvation free energies for 10 structurally diverse molecular pairs were calculated, and the results were in close agreement with both the calculated results generated by conventional FEP methods and the experimentally derived values. While considerably more CPU demanding than conventional FEP methods, this method (QM/MM-based FEP) alleviates the need for development of molecule-specific MM force field parameters and therefore may enable future automation of FEP-based calculations. Moreover, calculation accuracy should be improved over conventional methods, especially for calculations reliant on MM parameters derived in the absence of experimental data.
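
    Whether the solute is treated by MM or QM/MM, the underlying free-energy estimator in FEP is the standard Zwanzig perturbation formula, ΔA = -kT ln⟨exp(-ΔU/kT)⟩₀, averaged over configurations sampled in the reference state. A minimal sketch (kT ≈ 0.596 kcal/mol at 300 K, assumed here for illustration):

```python
import math

def fep_delta_a(dU, kT=0.596):
    """Zwanzig free-energy perturbation: dU is a list of energy differences
    U1 - U0 (kcal/mol) over configurations sampled in state 0."""
    avg = sum(math.exp(-d / kT) for d in dU) / len(dU)
    return -kT * math.log(avg)
```

    By Jensen's inequality the FEP estimate never exceeds the mean of the sampled energy differences, which is one reason bidirectional or multi-window protocols are used in practice.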

  2. Calculation of the acid-base equilibrium constants at the alumina/electrolyte interface from the pH dependence of the adsorption of singly charged ions (Na+, Cl-)

    NASA Astrophysics Data System (ADS)

    Gololobova, E. G.; Gorichev, I. G.; Lainer, Yu. A.; Skvortsova, I. V.

    2011-05-01

    A procedure was proposed for calculating the acid-base equilibrium constants at an alumina/electrolyte interface from experimental data on the adsorption of singly charged ions (Na+, Cl-) at various pH values. The calculated constants (pK1^0 = 4.1, pK2^0 = 11.9, pK3^0 = 8.3, and pK4^0 = 7.7) are shown to agree with the values obtained from the experimental pH dependence of the electrokinetic potential and from potentiometric titration of Al2O3 suspensions.
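
    As a worked aside (not stated in this record): in the standard 2-pK surface-complexation model, the point of zero charge sits at the mean of the two protonation constants, which for the first pair of values above lands near pH 8, consistent with typical reported values for alumina.

```python
# Point of zero charge from the two protonation constants (2-pK model)
pK1, pK2 = 4.1, 11.9
pH_pzc = 0.5 * (pK1 + pK2)
```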

  3. Value-based attentional capture influences context-dependent decision-making.

    PubMed

    Itthipuripat, Sirawaj; Cha, Kexin; Rangsipat, Napat; Serences, John T

    2015-07-01

    Normative theories posit that value-based decision-making is context independent. However, decisions between two high-value options can be suboptimally biased by the introduction of a third low-value option. This context-dependent modulation is consistent with the divisive normalization of the value of each stimulus by the total value of all stimuli. In addition, an independent line of research demonstrates that pairing a stimulus with a high-value outcome can lead to attentional capture that can mediate the efficiency of visual information processing. Here we tested the hypothesis that value-based attentional capture interacts with value-based normalization to influence the optimality of decision-making. We used a binary-choice paradigm in which observers selected between two targets and the color of each target indicated the magnitude of their reward potential. Observers also had to simultaneously ignore a task-irrelevant distractor rendered in a color that was previously associated with a specific reward magnitude. When the color of the task-irrelevant distractor was previously associated with a high reward, observers responded more slowly and less optimally. Moreover, as the learned value of the distractor increased, electrophysiological data revealed an attenuation of the lateralized N1 and N2Pc responses evoked by the relevant choice stimuli and an attenuation of the late positive deflection (LPD). Collectively, these behavioral and electrophysiological data suggest that value-based attentional capture and value-based normalization jointly mediate the influence of context on free-choice decision-making. Copyright © 2015 the American Physiological Society.

  4. Value-Based Medicine and Integration of Tumor Biology.

    PubMed

    Brooks, Gabriel A; Bosserman, Linda D; Mambetsariev, Isa; Salgia, Ravi

    2017-01-01

    Clinical oncology is in the midst of a genomic revolution, as molecular insights redefine our understanding of cancer biology. Greater awareness of the distinct aberrations that drive carcinogenesis is also contributing to a growing armamentarium of genomically targeted therapies. Although much work remains to better understand how to combine and sequence these therapies, improved outcomes for patients are becoming manifest. As we welcome this genomic revolution in cancer care, oncologists also must grapple with a number of practical problems. Costs of cancer care continue to grow, with targeted therapies responsible for an increasing proportion of spending. Rising costs are bringing the concept of value into sharper focus and challenging the oncology community with implementation of value-based cancer care. This article explores the ways that the genomic revolution is transforming cancer care, describes various frameworks for considering the value of genomically targeted therapies, and outlines key challenges for delivering on the promise of personalized cancer care. It highlights practical solutions for the implementation of value-based care, including investment in biomarker development and clinical trials to improve the efficacy of targeted therapy, the use of evidence-based clinical pathways, team-based care, computerized clinical decision support, and value-based payment approaches.

  5. Value-Based Care in the Worldwide Battle Against Cancer.

    PubMed

    Johansen, Niloufer J; Saunders, Christobel M

    2017-02-17

    Globally, an increasing and aging population is contributing to the prevalence of cancer. To be effective, cancer care needs to involve the coordination of multidisciplinary specialties, and also needs to be affordable, accessible, and capable of producing optimal patient outcomes. Porter and Teisberg (2006) have postulated that shifting current healthcare strategies from volume-based to patient-centric care redirects economic competition to providing treatments which promote the best patient outcomes while driving down costs. Therefore, the value in value-based healthcare (VBH) is defined as patient outcome per currency spent on providing care. Based on the experiences of healthcare organizations currently transitioning to the value-based system, this review details actionable guidelines to transition current cancer care practices to the value-based system in four main steps: by defining universal clinical and patient-reported measures, creating cancer-specific units that provide the full care cycle, establishing a data capture model to routinely determine the value of the care delivered, and continually improving treatment strategies through research. As healthcare providers in more developed countries move to value-based care, those located in less developed countries should also be assisted in their transition to relieve the cancer burden globally.

  6. Current State of Value-Based Purchasing Programs.

    PubMed

    Chee, Tingyin T; Ryan, Andrew M; Wasfy, Jason H; Borden, William B

    2016-05-31

    The US healthcare system is rapidly moving toward rewarding value. Recent legislation, such as the Affordable Care Act and the Medicare Access and CHIP Reauthorization Act, solidified the role of value-based payment in Medicare. Many private insurers are following Medicare's lead. Much of the policy attention has been on programs such as accountable care organizations and bundled payments; yet, value-based purchasing (VBP) or pay-for-performance, defined as providers being paid fee-for-service with payment adjustments up or down based on value metrics, remains a core element of value payment in Medicare Access and CHIP Reauthorization Act and will likely remain so for the foreseeable future. This review article summarizes the current state of VBP programs and provides analysis of the strengths, weaknesses, and opportunities for the future. Multiple inpatient and outpatient VBP programs have been implemented and evaluated; the impact of those programs has been marginal. Opportunities to enhance the performance of VBP programs include improving the quality measurement science, strengthening both the size and design of incentives, reducing health disparities, establishing broad outcome measurement, choosing appropriate comparison targets, and determining the optimal role of VBP relative to alternative payment models. VBP programs will play a significant role in healthcare delivery for years to come, and they serve as an opportunity for providers to build the infrastructure needed for value-oriented care. © 2016 American Heart Association, Inc.

  7. Formulae Based on Biomathematics to Estimate the Standard Value of Fetal Growth of Japanese.

    PubMed

    Miyagi, Yasunari; Tada, Katsuhiko; Takayoshi, Riko; Oguni, Nobutsugu; Sato, Yasushi; Shibata, Maki; Kiyokawa, Machiko; Hashimoto, Tadashi; Takada, Tomoyoshi; Oda, Takashi; Miyake, Takahito

    2018-04-01

    We devised biomathematics-based formulae to estimate the standard values of fetal growth of Japanese fetuses after 22 weeks' gestation. The growth rates of bi-parietal diameter (BPD), abdominal circumference (AC), femur length (FL), and estimated fetal body weight (EFBW) at a given gestational time were assumed to be proportional to the product of the value at that time and the remaining fraction of an unknown maximum value. The EFBW was also assumed to follow a multiple logistic function of BPD, AC, and FL, fitted to the standard values of Japanese fetuses published by the Japan Society of Ultrasonics in Medicine. The Mann-Whitney test was used for statistical analysis. The values as functions of the gestational day t were: BPD(t) = 99.6/(1 + exp(2.725 - 0.01837*t)) (mm); AC(t) = 39.7/(1 + exp(2.454 - 0.01379*t)) (cm); FL(t) = 79.6/(1 + exp(2.851 - 0.01710*t)) (mm); EFBW(t) = 8045.1/(1 + exp(6.028 - 0.06582*BPD(t) - 0.1469*AC(t) + 0.07377*FL(t))) (g). EFBW as a function of BPD, AC, and FL was: EFBW = 8045.1/(1 + exp(4.747 + 0.02584*BPD + 0.1010*AC - 0.1416*FL)) (g). When BPD, AC, and FL were at -2 standard deviations (SD), -1 SD, the mean, and +2 SD, the EFBW values calculated by the formula were statistically closer to the standard values than those from conventional formulae, with p-values of 4.871×10^-7, 4.228×10^-7, 9.777×10^-7, and 0.028, respectively. The formulae based on biomathematics might be useful for estimating standard values of fetal growth.
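
    The reported formulae transcribe directly into code. The sketch below assumes t is the gestational age in days (consistent with the logistic coefficients giving term-like values near day 280):

```python
from math import exp

def bpd(t):
    """Bi-parietal diameter (mm) at gestational day t."""
    return 99.6 / (1 + exp(2.725 - 0.01837 * t))

def ac(t):
    """Abdominal circumference (cm) at gestational day t."""
    return 39.7 / (1 + exp(2.454 - 0.01379 * t))

def fl(t):
    """Femur length (mm) at gestational day t."""
    return 79.6 / (1 + exp(2.851 - 0.01710 * t))

def efbw_t(t):
    """Estimated fetal body weight (g) from gestational day t."""
    return 8045.1 / (1 + exp(6.028 - 0.06582 * bpd(t)
                             - 0.1469 * ac(t) + 0.07377 * fl(t)))

def efbw(bpd_mm, ac_cm, fl_mm):
    """Estimated fetal body weight (g) from measured biometry."""
    return 8045.1 / (1 + exp(4.747 + 0.02584 * bpd_mm
                             + 0.1010 * ac_cm - 0.1416 * fl_mm))
```

    At t = 280 (40 weeks) these give BPD ≈ 91 mm, AC ≈ 32 cm, FL ≈ 70 mm, and an EFBW near 3100 g, in line with expected term biometry.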

  8. QSPR models for various physical properties of carbohydrates based on molecular mechanics and quantum chemical calculations.

    PubMed

    Dyekjaer, Jane Dannow; Jónsdóttir, Svava Osk

    2004-01-22

    Quantitative structure-property relationships (QSPR) have been developed for a series of monosaccharides for the physical properties of partial molar heat capacity, heat of solution, melting point, heat of fusion, glass-transition temperature, and solid-state density. The models were based on molecular descriptors obtained from molecular mechanics and quantum chemical calculations, combined with other types of descriptors. Saccharides exhibit a large degree of conformational flexibility; therefore, a methodology for selecting the energetically most favorable conformers was developed and used in building the QSPR models. In most cases good correlations were obtained for the monosaccharides. For five of the properties, predictions were made for disaccharides, and the predicted values for the partial molar heat capacities were in excellent agreement with experimental values.

  9. An approach to value-based simulator selection: The creation and evaluation of the simulator value index tool.

    PubMed

    Rooney, Deborah M; Hananel, David M; Covington, Benjamin J; Dionise, Patrick L; Nykamp, Michael T; Pederson, Melvin; Sahloul, Jamal M; Vasquez, Rachael; Seagull, F Jacob; Pinsky, Harold M; Sweier, Domenica G; Cooke, James M

    2018-04-01

    Currently there is no reliable, standardized mechanism to support health care professionals during the evaluation and procurement of simulators. A tool founded on best practices could facilitate simulator purchase processes. In a 3-phase process, we identified the top factors considered during the simulator purchase process through expert consensus (n = 127), created the Simulator Value Index (SVI) tool, evaluated targeted validity evidence, and evaluated the practical value of the SVI. A web-based survey was sent to simulation professionals. Participants (n = 79) used the SVI and provided feedback. We evaluated the practical value of 4 tool variations by calculating their sensitivity to predict a preferred simulator. Seventeen top factors were identified and ranked. The top 2 were technical stability/reliability of the simulator and customer service, with no practical differences in rank across institution or stakeholder role. Full SVI variations successfully predicted the preferred simulator with good (87%) sensitivity, whereas the sensitivity of variations limited to cost and customer service or to cost and technical stability decreased (≤54%). The majority (73%) of participants agreed that the SVI was helpful in guiding simulator purchase decisions, and 88% agreed the SVI tool would help facilitate discussion with peers and leadership. Our findings indicate the SVI supports the simulator purchase process within a standardized framework. Sensitivity of the tool improved when factors extended beyond those traditionally targeted. We propose that the tool will facilitate discussion amongst simulation professionals, provide essential information for finance and procurement professionals, and improve the long-term value of simulation solutions. Limitations and applications of the tool are discussed. Copyright © 2017 Elsevier Inc. All rights reserved.

  10. Value-at-risk estimation with wavelet-based extreme value theory: Evidence from emerging markets

    NASA Astrophysics Data System (ADS)

    Cifter, Atilla

    2011-06-01

    This paper introduces wavelet-based extreme value theory (EVT) for univariate value-at-risk estimation. Wavelets and EVT are combined into a hybrid model for volatility forecasting. In the first stage, wavelets are used as a threshold in the generalized Pareto distribution, and in the second stage, EVT is applied with the wavelet-based threshold. The new model is applied to two major emerging stock markets: the Istanbul Stock Exchange (ISE) and the Budapest Stock Exchange (BUX). The relative performance of wavelet-based EVT is benchmarked against the Riskmetrics-EWMA, ARMA-GARCH, generalized Pareto distribution, and conditional generalized Pareto distribution models. The empirical results show that wavelet-based EVT increases the predictive performance of financial forecasting according to the number of violations and tail-loss tests. The superior forecasting performance of the wavelet-based EVT model is also consistent with Basel II requirements, and the new model can be used by financial institutions as well.
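
    Once the generalized Pareto tail has been fitted above a threshold (here chosen via wavelets), value-at-risk follows from the standard peaks-over-threshold quantile formula. A sketch, with the GPD parameters and exceedance counts assumed already estimated:

```python
def var_pot(u, xi, sigma, n, n_u, q):
    """Value-at-risk at confidence level q from a generalized Pareto fit:
    u     threshold,
    xi    GPD shape, sigma GPD scale (xi != 0),
    n     total sample size, n_u number of exceedances over u."""
    return u + (sigma / xi) * ((n / n_u * (1 - q)) ** (-xi) - 1)
```

    With heavy tails (xi > 0) the estimate grows rapidly as q approaches 1, which is what distinguishes EVT-based VaR from Gaussian approximations.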

  11. 30 CFR 1206.105 - What records must I keep to support my calculations of value under this subpart?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 30 Mineral Resources 3 2014-07-01 2014-07-01 false What records must I keep to support my calculations of value under this subpart? 1206.105 Section 1206.105 Mineral Resources OFFICE OF NATURAL RESOURCES REVENUE, DEPARTMENT OF THE INTERIOR NATURAL RESOURCES REVENUE PRODUCT VALUATION Federal Oil § 1206...

  12. 30 CFR 1206.105 - What records must I keep to support my calculations of value under this subpart?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 30 Mineral Resources 3 2012-07-01 2012-07-01 false What records must I keep to support my calculations of value under this subpart? 1206.105 Section 1206.105 Mineral Resources OFFICE OF NATURAL RESOURCES REVENUE, DEPARTMENT OF THE INTERIOR NATURAL RESOURCES REVENUE PRODUCT VALUATION Federal Oil § 1206...

  13. 30 CFR 1206.105 - What records must I keep to support my calculations of value under this subpart?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 30 Mineral Resources 3 2013-07-01 2013-07-01 false What records must I keep to support my calculations of value under this subpart? 1206.105 Section 1206.105 Mineral Resources OFFICE OF NATURAL RESOURCES REVENUE, DEPARTMENT OF THE INTERIOR NATURAL RESOURCES REVENUE PRODUCT VALUATION Federal Oil § 1206...

  14. Using force-based adaptive resolution simulations to calculate solvation free energies of amino acid sidechain analogues

    NASA Astrophysics Data System (ADS)

    Fiorentini, Raffaele; Kremer, Kurt; Potestio, Raffaello; Fogarty, Aoife C.

    2017-06-01

    The calculation of free energy differences is a crucial step in the characterization and understanding of the physical properties of biological molecules. In the development of efficient methods to compute these quantities, a promising strategy is that of employing a dual-resolution representation of the solvent, specifically using an accurate model in the proximity of a molecule of interest and a simplified description elsewhere. One such concurrent multi-resolution simulation method is the Adaptive Resolution Scheme (AdResS), in which particles smoothly change their resolution on-the-fly as they move between different subregions. Before using this approach in the context of free energy calculations, however, it is necessary to make sure that the dual-resolution treatment of the solvent does not cause undesired effects on the computed quantities. Here, we show how AdResS can be used to calculate solvation free energies of small polar solutes using Thermodynamic Integration (TI). We discuss how the potential-energy-based TI approach combines with the force-based AdResS methodology, in which no global Hamiltonian is defined. The AdResS free energy values agree with those calculated from fully atomistic simulations to within a fraction of kBT. This is true even for small atomistic regions whose size is on the order of the correlation length, or when the properties of the coarse-grained region are extremely different from those of the atomistic region. These accurate free energy calculations are possible because AdResS allows the sampling of solvation shell configurations which are equivalent to those of fully atomistic simulations. The results of the present work thus demonstrate the viability of the use of adaptive resolution simulation methods to perform free energy calculations and pave the way for large-scale applications where a substantial computational gain can be attained.
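
    The TI estimate itself, whatever resolution scheme generates the samples, reduces to a one-dimensional quadrature of the ensemble-averaged ∂U/∂λ over the coupling parameter: ΔF = ∫₀¹ ⟨∂U/∂λ⟩ dλ. A minimal trapezoidal-rule sketch:

```python
def ti_free_energy(lambdas, dudl):
    """Thermodynamic integration by the trapezoidal rule: lambdas is the
    grid of coupling values in [0, 1], dudl the ensemble average of
    dU/dlambda sampled at each grid point."""
    total = 0.0
    for i in range(1, len(lambdas)):
        total += 0.5 * (dudl[i] + dudl[i - 1]) * (lambdas[i] - lambdas[i - 1])
    return total
```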

  15. Value-Based Medicine and Pharmacoeconomics.

    PubMed

    Brown, Gary C; Brown, Melissa M

    2016-01-01

    Pharmacoeconomics is assuming increasing importance in the pharmaceutical field since it is entering the public policy arena in many countries. Among the variants of pharmacoeconomic analysis are cost-minimization, cost-benefit, cost-effectiveness and cost-utility analyses. The latter is the most versatile and sophisticated in that it integrates the patient benefit (patient value) conferred by a drug in terms of improvement in length and/or quality of life. It also incorporates the costs expended for that benefit, as well as the dollars returned to patients and society from the use of a drug (financial value). Unfortunately, one cost-utility analysis in the literature is generally not comparable to another because of the lack of standardized formats and standardized input variables (costs, cost perspective, quality-of-life measurement instruments, quality-of-life respondents, discounting and so forth). Thus, millions of variants can be used. Value-based medicine® (VBM) cost-utility analysis standardizes these variants so that one VBM analysis is comparable to another. This system provides a highly rational methodology that allows providers and patients to quantify and compare the patient value and financial value gains associated with the use of pharmaceutical agents for example. © 2016 S. Karger AG, Basel.

  16. How can activity-based costing methodology be performed as a powerful tool to calculate costs and secure appropriate patient care?

    PubMed

    Lin, Blossom Yen-Ju; Chao, Te-Hsin; Yao, Yuh; Tu, Shu-Min; Wu, Chun-Ching; Chern, Jin-Yuan; Chao, Shiu-Hsiung; Shaw, Keh-Yuong

    2007-04-01

Previous studies have shown the advantages of using activity-based costing (ABC) methodology in the health care industry. The potential value of ABC methodology in health care derives from its more accurate cost calculation compared to traditional step-down costing, and from its potential to evaluate the quality or effectiveness of health care based on health care activities. This project used ABC methodology to profile the cost structure of inpatients with surgical procedures at the Department of Colorectal Surgery in a public teaching hospital, and to identify missing or inappropriate clinical procedures. We found that ABC methodology was able to accurately calculate costs and to identify several missing pre- and post-surgical nursing education activities in the course of treatment.

  17. The transition to value-based care.

    PubMed

    Ray, Jordan C; Kusumoto, Fred

    2016-10-01

    Delivery of medical care is evolving rapidly worldwide. Over the past several years in the USA, there has been a rapid shift in reimbursement from a simple fee-for-service model to more complex models that attempt to link payment to quality and value. Change in any large system can be difficult, but with medicine, the transition to a value-based system has been particularly hard to implement because both quality and cost are difficult to quantify. Professional societies and other medical groups are developing different programs in an attempt to define high value care. However, applying a national standard of value for any treatment is challenging, since value varies from person to person, and the individual benefit must remain the central tenet for delivering best patient-centered medical care. Regardless of the specific operational features of the rapidly changing healthcare environment, physicians must first and foremost always remain patient advocates.

  18. 40 CFR 600.209-08 - Calculation of vehicle-specific 5-cycle fuel economy values for a model type.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... intended for sale at high altitude, the Administrator may use fuel economy data from tests conducted on... from the tests performed using gasoline or diesel test fuel. (ii) If 5-cycle testing was performed on the alcohol or natural gas test fuel, calculate the city and highway fuel economy values from the...

  19. 40 CFR 600.209-08 - Calculation of vehicle-specific 5-cycle fuel economy values for a model type.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... intended for sale at high altitude, the Administrator may use fuel economy data from tests conducted on... from the tests performed using gasoline or diesel test fuel. (ii) If 5-cycle testing was performed on the alcohol or natural gas test fuel, calculate the city and highway fuel economy values from the...

  20. 40 CFR 600.210-12 - Calculation of fuel economy and CO2 emission values for labeling.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... not qualify for the second method as described in § 600.115 (other than electric vehicles). The second... values for electric vehicles. Determine FTP-based city and HFET-based highway fuel economy label values for electric vehicles as described in § 600.116. Convert W-hour/mile results to miles per kW-hr and...

  1. 40 CFR 600.210-12 - Calculation of fuel economy and CO2 emission values for labeling.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... not qualify for the second method as described in § 600.115 (other than electric vehicles). The second... values for electric vehicles. Determine FTP-based city and HFET-based highway fuel economy label values for electric vehicles as described in § 600.116. Convert W-hour/mile results to miles per kW-hr and...

  2. 40 CFR 600.210-12 - Calculation of fuel economy and CO2 emission values for labeling.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... not qualify for the second method as described in § 600.115 (other than electric vehicles). The second... values for electric vehicles. Determine FTP-based city and HFET-based highway fuel economy label values for electric vehicles as described in § 600.116. Convert W-hour/mile results to miles per kW-hr and...

  3. SU-E-T-769: T-Test Based Prior Error Estimate and Stopping Criterion for Monte Carlo Dose Calculation in Proton Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hong, X; Gao, H; Schuemann, J

    2015-06-15

Purpose: The Monte Carlo (MC) method is a gold standard for dose calculation in radiotherapy. However, it is not a priori clear how many particles need to be simulated to achieve a given dose accuracy. Prior error estimates and stopping criteria are not well established for MC. This work aims to fill this gap. Methods: Due to the statistical nature of MC, our approach is based on the one-sample t-test. We design the prior error estimate method based on the t-test, and then use this t-test-based error estimate to develop a simulation stopping criterion. The three major components are as follows. First, the source particles are randomized in energy, space and angle, so that the dose deposition from a particle to the voxel is independent and identically distributed (i.i.d.). Second, a sample under consideration in the t-test is the mean value of dose deposition to the voxel by a sufficiently large number of source particles. Then, according to the central limit theorem, the sample, as the mean value of i.i.d. variables, is normally distributed with expectation equal to the true deposited dose. Third, the t-test is performed with the null hypothesis that the difference between the sample expectation (the same as the true deposited dose) and the on-the-fly calculated mean sample dose from MC is larger than a given error threshold; in addition, users have the freedom to specify the confidence probability and region of interest in the t-test-based stopping criterion. Results: The method is validated for proton dose calculation. The difference between the MC result based on the t-test prior error estimate and the statistical result obtained by repeating numerous MC simulations is within 1%. Conclusion: The t-test based prior error estimate and stopping criterion are developed for MC and validated for proton dose calculation. Xiang Hong and Hao Gao were partially supported by the NSFC (#11405105), the 973 Program (#2015CB856000) and the Shanghai Pujiang Talent Program (#14PJ1404500).
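The batchwise stopping rule the abstract describes can be sketched as follows. This is an illustrative reimplementation using the large-sample normal approximation to the one-sample t-test, with a toy per-particle dose sampler standing in for a real MC transport code:

```python
import math
import random

def mc_dose_with_stopping(sample_dose, tol, z=1.96, batch=1000, max_particles=10**6):
    """Accumulate i.i.d. per-particle dose deposits and stop once the
    confidence-interval half-width on the mean falls below a relative
    tolerance (large-sample normal approximation to the one-sample t-test)."""
    n, s, s2 = 0, 0.0, 0.0
    while n < max_particles:
        for _ in range(batch):
            d = sample_dose()
            n += 1
            s += d
            s2 += d * d
        mean = s / n
        var = max(s2 / n - mean * mean, 0.0)      # running sample variance
        half_width = z * math.sqrt(var / n)
        if half_width < tol * mean:               # relative tolerance on the mean dose
            break
    return mean, half_width, n

random.seed(0)
# Toy stand-in for the per-particle dose sampler of a real MC transport code.
mean_dose, hw, n_used = mc_dose_with_stopping(lambda: random.expovariate(1.0), tol=0.02)
```

A production implementation would apply this test per voxel over the user's region of interest, with the confidence level (here z = 1.96, about 95%) exposed as a parameter as the abstract describes.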

  4. Optimal policy for value-based decision-making.

    PubMed

    Tajima, Satohiro; Drugowitsch, Jan; Pouget, Alexandre

    2016-08-18

    For decades now, normative theories of perceptual decisions, and their implementation as drift diffusion models, have driven and significantly improved our understanding of human and animal behaviour and the underlying neural processes. While similar processes seem to govern value-based decisions, we still lack the theoretical understanding of why this ought to be the case. Here, we show that, similar to perceptual decisions, drift diffusion models implement the optimal strategy for value-based decisions. Such optimal decisions require the models' decision boundaries to collapse over time, and to depend on the a priori knowledge about reward contingencies. Diffusion models only implement the optimal strategy under specific task assumptions, and cease to be optimal once we start relaxing these assumptions, by, for example, using non-linear utility functions. Our findings thus provide the much-needed theory for value-based decisions, explain the apparent similarity to perceptual decisions, and predict conditions under which this similarity should break down.

  5. Optimal policy for value-based decision-making

    PubMed Central

    Tajima, Satohiro; Drugowitsch, Jan; Pouget, Alexandre

    2016-01-01

    For decades now, normative theories of perceptual decisions, and their implementation as drift diffusion models, have driven and significantly improved our understanding of human and animal behaviour and the underlying neural processes. While similar processes seem to govern value-based decisions, we still lack the theoretical understanding of why this ought to be the case. Here, we show that, similar to perceptual decisions, drift diffusion models implement the optimal strategy for value-based decisions. Such optimal decisions require the models' decision boundaries to collapse over time, and to depend on the a priori knowledge about reward contingencies. Diffusion models only implement the optimal strategy under specific task assumptions, and cease to be optimal once we start relaxing these assumptions, by, for example, using non-linear utility functions. Our findings thus provide the much-needed theory for value-based decisions, explain the apparent similarity to perceptual decisions, and predict conditions under which this similarity should break down. PMID:27535638

  6. Variation Among Internet Based Calculators in Predicting Spontaneous Resolution of Vesicoureteral Reflux

    PubMed Central

    Routh, Jonathan C.; Gong, Edward M.; Cannon, Glenn M.; Yu, Richard N.; Gargollo, Patricio C.; Nelson, Caleb P.

    2010-01-01

Purpose An increasing number of parents and practitioners use the Internet for health-related purposes, and an increasing number of models are available on the Internet for predicting spontaneous resolution rates for children with vesicoureteral reflux. We sought to determine whether currently available Internet-based calculators for vesicoureteral reflux resolution produce systematically different results. Materials and Methods Following a systematic Internet search we identified 3 Internet-based calculators of spontaneous resolution rates for children with vesicoureteral reflux, of which 2 were academic-affiliated and 1 was industry-affiliated. We generated a random cohort of 100 hypothetical patients with a wide range of clinical characteristics and entered the data on each patient into each calculator. We then compared the results from the calculators in terms of mean predicted resolution probability and number of cases deemed likely to resolve at various cutoff probabilities. Results Mean predicted resolution probabilities were 41% and 36% (range 31% to 41%) for the 2 academic-affiliated calculators and 33% for the industry-affiliated calculator (p = 0.02). For some patients the calculators produced markedly different probabilities of spontaneous resolution, in some instances ranging from 24% to 89% for the same patient. At thresholds greater than 5%, 10% and 25% probability of spontaneous resolution the calculators differed significantly regarding whether cases would resolve (all p < 0.0001). Conclusions Predicted probabilities of spontaneous resolution of vesicoureteral reflux differ significantly among Internet-based calculators. For certain patients, particularly those with a lower probability of spontaneous resolution, these differences can significantly influence clinical decision making. PMID:20172550

  7. Calculation of Derivative Thermodynamic Hydration and Aqueous Partial Molar Properties of Ions Based on Atomistic Simulations.

    PubMed

    Dahlgren, Björn; Reif, Maria M; Hünenberger, Philippe H; Hansen, Niels

    2012-10-09

    The raw ionic solvation free energies calculated on the basis of atomistic (explicit-solvent) simulations are extremely sensitive to the boundary conditions and treatment of electrostatic interactions used during these simulations. However, as shown recently [Kastenholz, M. A.; Hünenberger, P. H. J. Chem. Phys.2006, 124, 224501 and Reif, M. M.; Hünenberger, P. H. J. Chem. Phys.2011, 134, 144104], the application of an appropriate correction scheme allows for a conversion of the methodology-dependent raw data into methodology-independent results. In this work, methodology-independent derivative thermodynamic hydration and aqueous partial molar properties are calculated for the Na(+) and Cl(-) ions at P° = 1 bar and T(-) = 298.15 K, based on the SPC water model and on ion-solvent Lennard-Jones interaction coefficients previously reoptimized against experimental hydration free energies. The hydration parameters considered are the hydration free energy and enthalpy. The aqueous partial molar parameters considered are the partial molar entropy, volume, heat capacity, volume-compressibility, and volume-expansivity. Two alternative calculation methods are employed to access these properties. Method I relies on the difference in average volume and energy between two aqueous systems involving the same number of water molecules, either in the absence or in the presence of the ion, along with variations of these differences corresponding to finite pressure or/and temperature changes. Method II relies on the calculation of the hydration free energy of the ion, along with variations of this free energy corresponding to finite pressure or/and temperature changes. Both methods are used considering two distinct variants in the application of the correction scheme. In variant A, the raw values from the simulations are corrected after the application of finite difference in pressure or/and temperature, based on correction terms specifically designed for derivative parameters at

  8. Identifying Intraplate Mechanism by B-Value Calculations in the South of Java Island

    NASA Astrophysics Data System (ADS)

    Bagus Suananda Y., Ida; Aufa, Irfan; Harlianti, Ulvienin

    2018-03-01

Java is the most populous island in Indonesia, with 50 million people living there. The island formed geologically at the Eurasian plate margin by the subduction of the Australian oceanic crust. In the south of Java, besides interplate earthquakes generated by the convergence of the two plates, intraplate earthquakes also occur. Distinguishing these two earthquake types is necessary for estimating the behaviour of earthquakes that may occur. The aim of this research is to map the b-value in the south of Java using earthquake data from 1963 to 2008. The research area is divided into clusters based on epicenter mapping results for magnitudes greater than 4 at three depth ranges (0-30 km, 30-60 km, 60-100 km); each cluster indicates a group of earthquakes generated by the same structure or mechanism. For some clusters in the south of Java, the b-values obtained lie between 0.8 and 1.25, largely within the 0.72-1.2 range that indicates an intraplate earthquake zone. The final validation determines the mechanism of a segment by correlating the epicenter and b-value plots with the available structural geology data. Based on this research, we find that the earthquakes occurring in Java are not only interplate but also intraplate events. By identifying the mechanism of each segment in the south of Java, the characteristics of earthquakes that may occur can be established for developing an accurate earthquake disaster mitigation system.
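b-value estimation of this kind is commonly done with Aki's maximum-likelihood estimator; the sketch below uses that standard formula (not necessarily the authors' exact procedure) on a synthetic catalog:

```python
import math
import random

def b_value_mle(magnitudes, mc):
    """Aki (1965) maximum-likelihood b-value for events at or above the
    magnitude of completeness mc: b = log10(e) / (mean(M) - mc)."""
    above = [m for m in magnitudes if m >= mc]
    return math.log10(math.e) / (sum(above) / len(above) - mc)

# Synthetic catalog: magnitude excesses above mc = 4.0 drawn exponentially,
# consistent with a true b-value of 1.0 (mean excess = log10(e) ~ 0.434).
random.seed(1)
beta = math.log10(math.e)                       # mean excess for b = 1
cat = [4.0 + random.expovariate(1.0 / beta) for _ in range(5000)]
b = b_value_mle(cat, 4.0)                       # close to 1.0
```

Applied per cluster, this yields the b-value map the abstract describes; in practice the completeness magnitude mc must be assessed per catalog before the formula is trusted.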

  9. A table of semiempirical gf values. Part 1: Wavelengths: 5.2682 nm to 272.3380 nm. [to calculate line-blanketed model atmospheres for solar and stellar spectra

    NASA Technical Reports Server (NTRS)

    Kurucz, R. L.; Peytremann, E.

    1975-01-01

    The gf values for 265,587 atomic lines selected from the line data used to calculate line-blanketed model atmospheres are tabulated. These data are especially useful for line identification and spectral synthesis in solar and stellar spectra. The gf values are calculated semiempirically by using scaled Thomas-Fermi-Dirac radial wavefunctions and eigenvectors found through least-squares fits to observed energy levels. Included in the calculation are the first five or six stages of ionization for sequences up through nickel. Published gf values are included for elements heavier than nickel. The tabulation is restricted to lines with wavelengths less than 10 micrometers.

  10. 40 CFR 600.209-08 - Calculation of vehicle-specific 5-cycle fuel economy values for a model type.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 29 2010-07-01 2010-07-01 false Calculation of vehicle-specific 5-cycle fuel economy values for a model type. 600.209-08 Section 600.209-08 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) ENERGY POLICY FUEL ECONOMY AND CARBON-RELATED EXHAUST EMISSIONS OF MOTOR VEHICLES Fuel Economy Regulations fo...

  11. Expanding CyberShake Physics-Based Seismic Hazard Calculations to Central California

    NASA Astrophysics Data System (ADS)

    Silva, F.; Callaghan, S.; Maechling, P. J.; Goulet, C. A.; Milner, K. R.; Graves, R. W.; Olsen, K. B.; Jordan, T. H.

    2016-12-01

    As part of its program of earthquake system science, the Southern California Earthquake Center (SCEC) has developed a simulation platform, CyberShake, to perform physics-based probabilistic seismic hazard analysis (PSHA) using 3D deterministic wave propagation simulations. CyberShake performs PSHA by first simulating a tensor-valued wavefield of Strain Green Tensors. CyberShake then takes an earthquake rupture forecast and extends it by varying the hypocenter location and slip distribution, resulting in about 500,000 rupture variations. Seismic reciprocity is used to calculate synthetic seismograms for each rupture variation at each computation site. These seismograms are processed to obtain intensity measures, such as spectral acceleration, which are then combined with probabilities from the earthquake rupture forecast to produce a hazard curve. Hazard curves are calculated at seismic frequencies up to 1 Hz for hundreds of sites in a region and the results interpolated to obtain a hazard map. In developing and verifying CyberShake, we have focused our modeling in the greater Los Angeles region. We are now expanding the hazard calculations into Central California. Using workflow tools running jobs across two large-scale open-science supercomputers, NCSA Blue Waters and OLCF Titan, we calculated 1-Hz PSHA results for over 400 locations in Central California. For each location, we produced hazard curves using both a 3D central California velocity model created via tomographic inversion, and a regionally averaged 1D model. These new results provide low-frequency exceedance probabilities for the rapidly expanding metropolitan areas of Santa Barbara, Bakersfield, and San Luis Obispo, and lend new insights into the effects of directivity-basin coupling associated with basins juxtaposed to major faults such as the San Andreas. Particularly interesting are the basin effects associated with the deep sediments of the southern San Joaquin Valley. We will compare hazard
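The hazard-curve assembly step described above, combining rupture-forecast rates with exceedance fractions over the rupture variations, can be sketched as follows; the rupture rates and intensity measures are hypothetical toy values, not CyberShake outputs:

```python
def hazard_curve(ruptures, im_levels):
    """Annual exceedance rate at each intensity-measure (IM) level: the sum
    over ruptures of (annual rate) x (fraction of rupture variations whose
    simulated IM exceeds the level)."""
    curve = []
    for level in im_levels:
        rate = 0.0
        for annual_rate, ims in ruptures:
            frac = sum(1 for im in ims if im > level) / len(ims)
            rate += annual_rate * frac
        curve.append(rate)
    return curve

# Two hypothetical ruptures: (annual rate, spectral acceleration in g for
# each rupture variation).
rups = [(0.01, [0.1, 0.2, 0.4, 0.8]), (0.002, [0.5, 1.0, 1.5, 2.0])]
curve = hazard_curve(rups, [0.3, 1.0])   # exceedance rates at 0.3 g and 1.0 g
```

A full PSHA code would then convert these annual rates to exceedance probabilities over a design lifetime (e.g. via a Poisson assumption) and interpolate curves across sites into a hazard map.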

  12. Ethics education for health professionals: a values based approach.

    PubMed

    Godbold, Rosemary; Lees, Amanda

    2013-11-01

    It is now widely accepted that ethics is an essential part of educating health professionals. Despite a clear mandate to educators, there are differing approaches, in particular, how and where ethics is positioned in training programmes, underpinning philosophies and optimal modes of assessment. This paper explores varying practices and argues for a values based approach to ethics education. It then explores the possibility of using a web-based technology, the Values Exchange, to facilitate a values based approach. It uses the findings of a small scale study to signal the potential of the Values Exchange for engaging, meaningful and applied ethics education. Copyright © 2013 Elsevier Ltd. All rights reserved.

  13. The importance of values in evidence-based medicine.

    PubMed

    Kelly, Michael P; Heath, Iona; Howick, Jeremy; Greenhalgh, Trisha

    2015-10-12

    Evidence-based medicine (EBM) has always required integration of patient values with 'best' clinical evidence. It is widely recognized that scientific practices and discoveries, including those of EBM, are value-laden. But to date, the science of EBM has focused primarily on methods for reducing bias in the evidence, while the role of values in the different aspects of the EBM process has been almost completely ignored. In this paper, we address this gap by demonstrating how a consideration of values can enhance every aspect of EBM, including: prioritizing which tests and treatments to investigate, selecting research designs and methods, assessing effectiveness and efficiency, supporting patient choice and taking account of the limited time and resources available to busy clinicians. Since values are integral to the practice of EBM, it follows that the highest standards of EBM require values to be made explicit, systematically explored, and integrated into decision making. Through 'values based' approaches, EBM's connection to the humanitarian principles upon which it was founded will be strengthened.

  14. Redefining Health: Implication for Value-Based Healthcare Reform.

    PubMed

    Putera, Ikhwanuliman

    2017-03-02

The definition of health consists of three domains, namely physical, mental, and social health, all of which should be prioritized in delivering healthcare. The emergence of chronic diseases in aging populations has been a barrier to the realization of a healthier society. The value-based healthcare concept seems in line with the true health objective: increasing value. Value is created from health outcomes which matter to patients relative to the cost of achieving those outcomes. The health outcomes should include all domains of health in a full cycle of care. To implement value-based healthcare, transformations are needed from both health providers and patients: establishing true health outcomes, strengthening primary care, building integrated health systems, implementing appropriate health payment schemes that promote value and reduce moral hazards, enabling health information technology, and creating a policy that fits well with a community.

  15. Simulation and analysis of main steam control system based on heat transfer calculation

    NASA Astrophysics Data System (ADS)

    Huang, Zhenqun; Li, Ruyan; Feng, Zhongbao; Wang, Songhan; Li, Wenbo; Cheng, Jiwei; Jin, Yingai

    2018-05-01

In this paper, a 300 MW boiler of a thermal power plant was studied. Matlab was used to write a calculation program for the heat transfer process between the main steam and the boiler flue gas, and the amount of water required to keep the main steam at the target temperature was calculated. The heat transfer calculation program was then introduced into the Simulink simulation platform as part of a control system based on multiple-model switching and heat transfer calculation. The results show that the multiple-model switching control system based on heat transfer calculation not only overcomes the large inertia and large hysteresis characteristics of the main steam temperature, but also adapts to changes in boiler load.

  16. Evidence-based medicine: the value of vision screening.

    PubMed

    Beauchamp, George R; Ellepola, Chalani; Beauchamp, Cynthia L

    2010-01-01

To review the literature for evidence-based medicine (EBM), to assess the evidence for effectiveness of vision screening, and to propose moving toward value-based medicine (VBM) as a preferred basis for comparative effectiveness research. Literature-based evidence is applied to five core questions concerning vision screening: (1) Is vision valuable (an inherent good)?; (2) Is screening effective (finding amblyopia)?; (3) What are the costs of screening?; (4) Is treatment effective?; and (5) Is amblyopia detection beneficial? Based on EBM literature and clinical experience, the answers to the five questions are: (1) yes; (2) based on the literature, not definitively so; (3) relatively inexpensive, although some claim benefits for more expensive options such as mandatory exams; (4) yes, for compliant care, although treatment processes may have negative aspects such as "bullying"; and (5) economic productive values are likely very high, with returns on investment on the order of 10:1, while human value returns need further elucidation. Additional evidence is required to ascertain the degree to which vision screening is effective. The processes of screening are multiple, sequential, and complicated. The disease is complex, and good visual outcomes require compliance. The value of outcomes is appropriately analyzed in clinical, human, and economic terms.

  17. Value-based medicine and interventions for macular degeneration.

    PubMed

    Brown, Melissa M; Brown, Gary C; Brown, Heidi

    2007-05-01

    The aim of this article is to review the patient value conferred by interventions for neovascular macular degeneration. Value-based medicine is the practice of medicine based upon the patient value (improvement in quality of life and length of life) conferred by an intervention. For ophthalmologic interventions, in which length-of-life is generally unaffected, the value gain is equivalent to the improvement in quality of life. Photodynamic therapy delivers a value gain (improvement in quality of life) of 8.1% for the average person with classic subfoveal choroidal neovascularization, while laser photocoagulation for the same entity confers a 4.4% improvement in quality of life. Preliminary data suggest the value gain for the treatment of occult/minimally classic choroidal neovascularization with ranibizumab is greater than 15%. The average value gain for statins for the treatment of hyperlipidemia is 3.9%, while that for the use of biphosphonates for the treatment of osteoporosis is 1.1% and that for drugs to treat benign prostatic hyperplasia is 1-2%. Interventions, especially ranibizumab therapy, for neovascular macular degeneration appear to deliver an extraordinary degree of value compared with many other interventions across healthcare.

  18. Choosing a Values-Based Leader: An Experiential Exercise

    ERIC Educational Resources Information Center

    Reilly, Anne H.; Ehlinger, Sara

    2007-01-01

    Scandals throughout corporate America have encouraged companies to seek leaders who can sustain profitability and embody positive values within the organization. This group exercise highlights some of the key challenges involved in choosing a values-based leader. Participants assess three hypothetical companies' values during a period of change…

  19. Finding the 'sweet spot' in value-based contracts.

    PubMed

    Eggbeer, Bill; Sears, Kevin; Homer, Ken

    2015-08-01

Health systems pursuing value-based contracts should address six important considerations: The definition of value. Contracting goals. Cost of implementation. Risk exposure. Contract structure and design. Essential contractual protections.

  20. Dosimetry for nonuniform activity distributions: a method for the calculation of 3D absorbed-dose distribution without the use of voxel S-values, point kernels, or Monte Carlo simulations.

    PubMed

    Traino, A C; Marcatili, S; Avigo, C; Sollini, M; Erba, P A; Mariani, G

    2013-04-01

    Nonuniform activity within the target lesions and the critical organs constitutes an important limitation for dosimetric estimates in patients treated with tumor-seeking radiopharmaceuticals. The tumor control probability and the normal tissue complication probability are affected by the distribution of the radionuclide in the treated organ/tissue. In this paper, a straightforward method for calculating the absorbed dose at the voxel level is described. This new method takes into account a nonuniform activity distribution in the target/organ. The new method is based on the macroscopic S-values (i.e., the S-values calculated for the various organs, as defined in the MIRD approach), on the definition of the number of voxels, and on the raw-count 3D array, corrected for attenuation, scatter, and collimator resolution, in the lesion/organ considered. Starting from these parameters, the only mathematical operation required is to multiply the 3D array by a scalar value, thus avoiding all the complex operations involving the 3D arrays. A comparison with the MIRD approach, fully described in the MIRD Pamphlet No. 17, using S-values at the voxel level, showed a good agreement between the two methods for (131)I and for (90)Y. Voxel dosimetry is becoming more and more important when performing therapy with tumor-seeking radiopharmaceuticals. The method presented here does not require calculating the S-values at the voxel level, and thus bypasses the mathematical problems linked to the convolution of 3D arrays and to the voxel size. In the paper, the results obtained with this new simplified method as well as the possibility of using it for other radionuclides commonly employed in therapy are discussed. The possibility of using the correct density value of the tissue/organs involved is also discussed.
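The paper's central operation, multiplying the corrected raw-count 3D array by a single scalar, can be illustrated as below. The specific normalization shown (matching the mean voxel dose to an organ-level MIRD dose) is one plausible reading of the method, and the numbers are toy values:

```python
import numpy as np

def voxel_dose(counts, organ_dose):
    """Scale the attenuation/scatter-corrected raw-count 3D array by a single
    scalar so that the mean voxel dose reproduces the organ-level MIRD dose.
    No 3D convolution with voxel S-values is needed."""
    scalar = organ_dose * counts.size / counts.sum()
    return counts * scalar

counts = np.array([[[1.0, 3.0], [2.0, 2.0]]])   # toy 3D count array
dose = voxel_dose(counts, organ_dose=10.0)      # mean voxel dose becomes 10
```

The appeal of the approach is visible here: the only array operation is an element-wise multiplication by a scalar, so voxel size and 3D-convolution issues never arise.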

  1. [Value-based cancer care. From traditional evidence-based decision making to balanced decision making within frameworks of shared values].

    PubMed

    Palazzo, Salvatore; Filice, Aldo; Mastroianni, Candida; Biamonte, Rosalbino; Conforti, Serafino; Liguori, Virginia; Turano, Salvatore; De Simone, Rosanna; Rovito, Antonio; Manfredi, Caterina; Minardi, Stefano; Vilardo, Emmanuelle; Loizzo, Monica; Oriolo, Carmela

    2016-04-01

Clinical decision making in oncology has so far been based on evidence of efficacy from high-quality clinical research. Data collection and analysis from experimental studies provide valuable insight into response rates and progression-free or overall survival. Data processing generates valuable information for medical professionals involved in cancer patient care, enabling them to make objective and unbiased choices. The increased attention of many scientific associations toward more rational resource consumption in clinical decision making is mirrored in the Choosing Wisely campaign against the overuse or misuse of exams and procedures of little or no benefit for the patient. This cultural movement has been actively promoting care solutions based on the concept of "value". As a result, the value-based decision-making process for cancer care should not be dissociated from economic sustainability or from the ethics of affordability, especially given the growing average cost of the most recent cancer drugs. In support of this orientation, the National Comprehensive Cancer Network (NCCN) has developed innovative and "complex" guidelines based on values, defined as "evidence blocks", with the aim of assisting the medical community in making overall sustainable choices.

  2. The use of elements of the Stewart model (Strong Ion Approach) for the diagnostics of respiratory acidosis on the basis of the calculation of a value of a modified anion gap (AGm) in brachycephalic dogs.

    PubMed

    Sławuta, P; Glińska-Suchocka, K; Cekiera, A

    2015-01-01

    Apart from the HH equation, the acid-base balance of an organism is also described by the Stewart model, which assumes that the proper insight into the ABB of the organism is given by an analysis of: pCO2, the difference of concentrations of strong cations and anions in the blood serum - SID, and the total concentration of nonvolatile weak acids - Acid total. The notion of an anion gap (AG), or the apparent lack of ions, is closely related to the acid-base balance described according to the HH equation. Its value mainly consists of negatively charged proteins, phosphates, and sulphates in blood. In the human medicine, a modified anion gap is used, which, including the concentration of the protein buffer of blood, is, in fact, the combination of the apparent lack of ions derived from the classic model and the Stewart model. In brachycephalic dogs, respiratory acidosis often occurs, which is caused by an overgrowth of the soft palate, making it impossible for a free air flow and causing an increase in pCO2--carbonic acid anhydride The aim of the present paper was an attempt to answer the question whether, in the case of systemic respiratory acidosis, changes in the concentration of buffering ions can also be seen. The study was carried out on 60 adult dogs of boxer breed in which, on the basis of the results of endoscopic examination, a strong overgrowth of the soft palate requiring a surgical correction was found. For each dog, the value of the anion gap before and after the palate correction procedure was calculated according to the following equation: AG = ([Na+ mmol/l] + [K+ mmol/l])--([Cl- mmol/l]+ [HCO3- mmol/l]) as well as the value of the modified AG--according to the following equation: AGm = calculated AG + 2.5 x (albumins(r)--albumins(d)). 
The values of AG calculated for the dogs before and after the procedure fell within the limits of the reference values and did not differ significantly whereas the values of AGm calculated for the dogs before and after
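The two equations above translate directly into code. A minimal sketch in Python; the electrolyte and albumin numbers below are illustrative only, and the assumption that albumin concentrations enter in g/dl (reference value minus the dog's measured value) follows the usual albumin-corrected anion gap convention:

```python
def anion_gap(na, k, cl, hco3):
    # classic anion gap, all concentrations in mmol/l:
    # AG = ([Na+] + [K+]) - ([Cl-] + [HCO3-])
    return (na + k) - (cl + hco3)

def modified_anion_gap(ag, albumin_ref, albumin_measured):
    # modified AG combining the classic and Stewart models:
    # AGm = AG + 2.5 * (albumin(reference) - albumin(measured))
    return ag + 2.5 * (albumin_ref - albumin_measured)

# illustrative (not measured) values
ag = anion_gap(na=145.0, k=4.5, cl=110.0, hco3=21.0)            # 18.5 mmol/l
agm = modified_anion_gap(ag, albumin_ref=3.5, albumin_measured=2.8)
```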

  3. A practical approach for calculating the settlement and storage capacity of landfills based on the space and time discretization of the landfilling process.

    PubMed

    Gao, Wu; Xu, Wenjie; Bian, Xuecheng; Chen, Yunmin

    2017-11-01

The settlement of any position of the municipal solid waste (MSW) body during the landfilling process and after closure affects the integrity of the internal structure and the storage capacity of the landfill. This paper proposes a practical approach for calculating the settlement and storage capacity of landfills based on the space and time discretization of the landfilling process. The MSW body in the landfill was divided into independent column units, and the filling process of each column unit was determined by a simplified complete landfilling process. The settlement of a position in the landfill was calculated from the compression of each MSW layer in every column unit. The simultaneous settlement of all the column units was then integrated to obtain the settlement of the landfill and the storage capacity of all the column units, which allowed the storage capacity of the landfill to be obtained by the layer-wise summation method. When the compression of each MSW layer was calculated, the effects of the fluctuation of the main leachate level and the variation in the unit weight of the MSW on the overburden effective stress were taken into consideration by introducing the main leachate level's proportion and the unit weight versus buried depth curve. This approach is especially significant for MSW with a high kitchen waste content and for landfills in developing countries. The stress-biodegradation compression model was used to calculate the compression of each MSW layer. A software program, Settlement and Storage Capacity Calculation System for Landfills, was developed by integrating the space and time discretization of the landfilling process with the settlement and storage capacity algorithms. The landfilling process of phase IV of the Shanghai Laogang Landfill was simulated using this software. The maximum error between the calculated and measured geometric volume of the landfill is only 2.02%, and the accumulated filling weight error between the
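The layer-wise summation at the core of the approach can be sketched as follows. This is a deliberately reduced illustration: the column-unit layout, the uniform plan area, and the numbers are assumptions of this sketch, and the actual method additionally feeds the leachate-level proportion and unit-weight curve into each layer's computed compression:

```python
def column_settlement(layer_compressions_m):
    # settlement of one column unit: layer-wise sum of the compression
    # of its MSW layers (metres)
    return sum(layer_compressions_m)

def settlement_volume(column_units, plan_area_m2):
    # integrate the settlement of all column units into a recovered-storage
    # volume, assuming each unit covers the same plan area
    return sum(column_settlement(layers) * plan_area_m2
               for layers in column_units)

units = [[0.5, 0.3], [0.2, 0.1]]   # two column units, two MSW layers each
vol = settlement_volume(units, plan_area_m2=100.0)   # ~110 m^3
```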

  4. N values estimation based on photon flux simulation with Geant4 toolkit.

    PubMed

    Sun, Z J; Danjaji, M; Kim, Y

    2018-06-01

N values are routinely introduced in photon activation analysis (PAA) as the ratio of specific activities of product nuclides to compare the relative intensities of different reaction channels. They determine the individual activities of each radioisotope and the total activity of the sample, which are the primary concerns of radiation safety. Traditionally, N values are calculated from gamma spectroscopy of real measurements by normalizing the activities of individual nuclides to the reference reaction [58Ni(γ,n)57Ni] of the nickel monitor simultaneously irradiated in photon activation. Is it possible to use photon flux simulated by Monte Carlo software to calculate N values even before the actual irradiation starts? This study applied the Geant4 toolkit, a popular platform for simulating the passage of particles through matter, to generate the photon flux in the samples. Assisted by photonuclear cross sections from the IAEA database, it is feasible to predict N values in different experimental setups for a simulated target material. We have validated this method and its consistency with Geant4. Results also show that N values are highly correlated with the beam parameters of the incoming electrons and the setup of the electron-photon converter. Copyright © 2018 Elsevier Ltd. All rights reserved.
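The underlying prediction step folds the simulated photon flux with photonuclear cross sections to obtain per-atom reaction rates, then normalises to the monitor reaction. A hedged sketch, assuming a trapezoid-rule integration over an energy grid; the toy arrays stand in for Geant4 flux tallies and IAEA cross-section tables:

```python
def trapezoid(y, x):
    # simple trapezoid-rule integral of y(x) on an arbitrary grid
    return sum((y[i] + y[i + 1]) * (x[i + 1] - x[i]) / 2.0
               for i in range(len(x) - 1))

def reaction_rate(energy_mev, flux, cross_section):
    # per-atom reaction rate ~ integral of flux(E) * sigma(E) dE
    folded = [f * s for f, s in zip(flux, cross_section)]
    return trapezoid(folded, energy_mev)

def n_value(rate_product, rate_monitor):
    # N value: product-channel rate normalised to the nickel monitor reaction
    return rate_product / rate_monitor
```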

  5. Independent Monte-Carlo dose calculation for MLC based CyberKnife radiotherapy

    NASA Astrophysics Data System (ADS)

    Mackeprang, P.-H.; Vuong, D.; Volken, W.; Henzen, D.; Schmidhalter, D.; Malthaner, M.; Mueller, S.; Frei, D.; Stampanoni, M. F. M.; Dal Pra, A.; Aebersold, D. M.; Fix, M. K.; Manser, P.

    2018-01-01

This work aims to develop, implement and validate a Monte Carlo (MC)-based independent dose calculation (IDC) framework to perform patient-specific quality assurance (QA) for multi-leaf collimator (MLC)-based CyberKnife® (Accuray Inc., Sunnyvale, CA) treatment plans. The IDC framework uses an XML-format treatment plan as exported from the treatment planning system (TPS) and DICOM-format patient CT data, an MC beam model using phase spaces, CyberKnife MLC beam modifier transport using the EGS++ class library, a beam sampling and coordinate transformation engine, and dose scoring using DOSXYZnrc. The framework is validated against dose profiles and depth dose curves of single beams with varying field sizes in a water tank in units of cGy/Monitor Unit and against a 2D dose distribution of a full prostate treatment plan measured with Gafchromic EBT3 (Ashland Advanced Materials, Bridgewater, NJ) film in a homogeneous water-equivalent slab phantom. The film measurement is compared to IDC results by gamma analysis using 2% (global)/2 mm criteria. Further, the dose distribution of the clinical treatment plan in the patient CT is compared to the TPS calculation by gamma analysis using the same criteria. Dose profiles from the IDC calculation in a homogeneous water phantom agree within 2.3% of the global max dose or 1 mm distance to agreement with measurements for all except the smallest field size. Comparing the film measurement to calculated dose, 99.9% of all voxels pass gamma analysis; comparing dose calculated by the IDC framework to the TPS-calculated dose for the clinical prostate plan shows a 99.0% passing rate. The IDC-calculated dose is found to be up to 5.6% lower than the dose calculated by the TPS in this case near metal fiducial markers. An MC-based modular IDC framework was successfully developed, implemented and validated against measurements and is now available to perform patient-specific QA by IDC.

  6. Calculation of Lung Cancer Volume of Target Based on Thorax Computed Tomography Images using Active Contour Segmentation Method for Treatment Planning System

    NASA Astrophysics Data System (ADS)

    Patra Yosandha, Fiet; Adi, Kusworo; Edi Widodo, Catur

    2017-06-01

In this research, the lung cancer target volume was calculated based on computed tomography (CT) thorax images. The target volume was calculated for the purpose of treatment planning in radiotherapy. The calculation of the target volume comprises the gross tumor volume (GTV), clinical target volume (CTV), planning target volume (PTV), and organs at risk (OAR). The target volume was calculated by adding the target areas on each slice and then multiplying the result by the slice thickness. Areas were calculated using digital image processing techniques with the active contour segmentation method; this segmentation provides the contours from which the target volume is obtained. The calculated volumes are 577.2 cm3 for GTV, 769.9 cm3 for CTV, 877.8 cm3 for PTV, 618.7 cm3 for OAR 1, 1,162 cm3 for OAR 2 right, and 1,597 cm3 for OAR 2 left. These values indicate that the image processing techniques developed can be implemented to calculate the lung cancer target volume based on CT thorax images. This research is expected to help doctors and medical physicists determine and contour the target volume quickly and precisely.
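The volume step described above reduces to summing the contoured area on each slice and multiplying by the slice thickness. A sketch with made-up slice areas (the segmentation itself, which produces these areas, is out of scope here):

```python
def target_volume_cm3(slice_areas_cm2, slice_thickness_cm):
    # volume = (sum of the contoured target area on each slice) * thickness
    return sum(slice_areas_cm2) * slice_thickness_cm

# illustrative areas from three consecutive slices, 0.5 cm slice thickness
vol = target_volume_cm3([10.0, 12.0, 14.0], 0.5)   # 18.0 cm^3
```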

  7. Dose calculation accuracy of different image value to density tables for cone-beam CT planning in head & neck and pelvic localizations.

    PubMed

    Barateau, Anaïs; Garlopeau, Christopher; Cugny, Audrey; De Figueiredo, Bénédicte Henriques; Dupin, Charles; Caron, Jérôme; Antoine, Mikaël

    2015-03-01

    We aimed to identify the most accurate combination of phantom and protocol for image value to density table (IVDT) on volume-modulated arc therapy (VMAT) dose calculation based on kV-Cone-beam CT imaging, for head and neck (H&N) and pelvic localizations. Three phantoms (Catphan(®)600, CIRS(®)062M (inner phantom for head and outer phantom for body), and TomoTherapy(®) "Cheese" phantom) were used to create IVDT curves of CBCT systems with two different CBCT protocols (Standard-dose Head and Standard Pelvis). Hounsfield Unit (HU) time stability and repeatability for a single On-Board-Imager (OBI) and compatibility of two distinct devices were assessed with Catphan(®)600. Images from the anthropomorphic phantom CIRS ATOM(®) for both CT and CBCT modalities were used for VMAT dose calculation from different IVDT curves. Dosimetric indices from CT and CBCT imaging were compared. IVDT curves from CBCT images were highly different depending on phantom used (up to 1000 HU for high densities) and protocol applied (up to 200 HU for high densities). HU time stability was verified over seven weeks. A maximum difference of 3% on the dose calculation indices studied was found between CT and CBCT VMAT dose calculation across the two localizations using appropriate IVDT curves. One IVDT curve per localization can be established with a bi-monthly verification of IVDT-CBCT. The IVDT-CBCTCIRS-Head phantom with the Standard-dose Head protocol was the most accurate combination for dose calculation on H&N CBCT images. For pelvic localizations, the IVDT-CBCTCheese established with the Standard Pelvis protocol provided the best accuracy. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  8. Development of Quantum Chemical Method to Calculate Half Maximal Inhibitory Concentration (IC50 ).

    PubMed

    Bag, Arijit; Ghorai, Pradip Kr

    2016-05-01

To date, theoretical calculation of the half maximal inhibitory concentration (IC50) of a compound has been based on different Quantitative Structure Activity Relationship (QSAR) models, which are empirical methods. By using the Cheng-Prusoff equation it may be possible to compute IC50, but this would be computationally very expensive, as it requires explicit calculation of the binding free energy of an inhibitor with the respective protein or enzyme. In this article, for the first time we report an ab initio method to compute the IC50 of a compound based only on the inhibitor itself, where the effect of the protein is reflected through a proportionality constant. Using basic enzyme inhibition kinetics and thermodynamic relations, we derive an expression for IC50 in terms of the hydrophobicity, electric dipole moment (μ) and reactivity descriptor (ω) of an inhibitor. We implement this theory to compute the IC50 of 15 HIV-1 capsid inhibitors and compare them with experimental results and other available QSAR-based empirical results. Values calculated using our method are in very good agreement with the experimental values compared to the values calculated using other methods. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
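For reference, the Cheng-Prusoff equation mentioned above links IC50 to the inhibition constant; in its common form for a competitive inhibitor it reads IC50 = Ki (1 + [S]/Km). A sketch with illustrative numbers (this is the textbook relation, not the paper's new ab initio expression):

```python
def ic50_competitive(ki, substrate_conc, km):
    # Cheng-Prusoff equation for a competitive inhibitor:
    # IC50 = Ki * (1 + [S] / Km), all concentrations in the same unit
    return ki * (1.0 + substrate_conc / km)

# e.g. Ki = 10 nM, [S] = 50 uM, Km = 25 uM
ic50 = ic50_competitive(ki=10.0, substrate_conc=50.0, km=25.0)   # 30.0
```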

  9. EuroFIR Guideline on calculation of nutrient content of foods for food business operators.

    PubMed

    Machackova, Marie; Giertlova, Anna; Porubska, Janka; Roe, Mark; Ramos, Carlos; Finglas, Paul

    2018-01-01

    This paper presents a Guideline for calculating nutrient content of foods by calculation methods for food business operators and presents data on compliance between calculated values and analytically determined values. In the EU, calculation methods are legally valid to determine the nutrient values of foods for nutrition labelling (Regulation (EU) No 1169/2011). However, neither a specific calculation method nor rules for use of retention factors are defined. EuroFIR AISBL (European Food Information Resource) has introduced a Recipe Calculation Guideline based on the EuroFIR harmonized procedure for recipe calculation. The aim is to provide food businesses with a step-by-step tool for calculating nutrient content of foods for the purpose of nutrition declaration. The development of this Guideline and use in the Czech Republic is described and future application to other Member States is discussed. Limitations of calculation methods and the importance of high quality food composition data are discussed. Copyright © 2017. Published by Elsevier Ltd.
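As an illustration of the kind of step-by-step recipe calculation such a guideline formalises, here is a highly simplified sketch. The single yield factor, the ingredient tuple layout, and the numbers are assumptions of this example, not the EuroFIR procedure itself, which prescribes nutrient- and method-specific retention factors:

```python
def nutrient_per_100g(ingredients, yield_factor):
    # ingredients: list of (raw grams, nutrient per 100 g raw, retention factor)
    raw_weight = sum(g for g, _, _ in ingredients)
    # nutrient surviving cooking, after applying retention factors
    retained = sum(g * n / 100.0 * rf for g, n, rf in ingredients)
    # cooked weight after applying the recipe yield factor
    cooked_weight = raw_weight * yield_factor
    return retained / cooked_weight * 100.0

# one ingredient: 100 g raw, 20 mg/100 g, 90 % retention; 20 % weight loss
value = nutrient_per_100g([(100.0, 20.0, 0.9)], yield_factor=0.8)   # ~22.5
```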

  10. 40 CFR 600.209-12 - Calculation of vehicle-specific 5-cycle fuel economy and CO2 emission values for a model type.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... city and highway fuel economy and CO2 emission values from the tests performed using gasoline or diesel test fuel. (ii) If 5-cycle testing was performed on the alcohol or natural gas test fuel, calculate the city and highway fuel economy and CO2 emission values from the tests performed using alcohol or natural...

  11. 40 CFR 600.209-12 - Calculation of vehicle-specific 5-cycle fuel economy and CO2 emission values for a model type.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... city and highway fuel economy and CO2 emission values from the tests performed using gasoline or diesel test fuel. (ii) If 5-cycle testing was performed on the alcohol or natural gas test fuel, calculate the city and highway fuel economy and CO2 emission values from the tests performed using alcohol or natural...

  12. 40 CFR 600.209-12 - Calculation of vehicle-specific 5-cycle fuel economy and CO2 emission values for a model type.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... city and highway fuel economy and CO2 emission values from the tests performed using gasoline or diesel test fuel. (ii) If 5-cycle testing was performed on the alcohol or natural gas test fuel, calculate the city and highway fuel economy and CO2 emission values from the tests performed using alcohol or natural...

  13. SU-E-T-226: Correction of a Standard Model-Based Dose Calculator Using Measurement Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, M; Jiang, S; Lu, W

Purpose: To propose a hybrid method that combines the advantages of model-based and measurement-based methods for independent dose calculation. Model-based dose calculation, such as collapsed-cone-convolution/superposition (CCCS) or the Monte Carlo method, models dose deposition in the patient body accurately; however, due to lack of detailed knowledge about the linear accelerator (LINAC) head, commissioning for an arbitrary machine is tedious and challenging in case of hardware changes. On the contrary, a measurement-based method characterizes the beam properties accurately but lacks the capability of dose deposition modeling in heterogeneous media. Methods: We used a standard CCCS calculator, commissioned with published data, as the standard model calculator. For a given machine, water phantom measurements were acquired. A set of dose distributions was also calculated using the CCCS for the same setup. The differences between the measurements and the CCCS results were tabulated and used as the commissioning data for a measurement-based calculator; here we used a direct-ray-tracing calculator (ΔDRT). The proposed independent dose calculation consists of the following steps: 1. calculate D-model using CCCS; 2. calculate D-ΔDRT using ΔDRT; 3. combine: D = D-model + D-ΔDRT. Results: The hybrid dose calculation was tested on digital phantoms and patient CT data for standard fields and an IMRT plan. The results were compared to dose calculated by the treatment planning system (TPS). The agreement of the hybrid method and the TPS was within 3%, 3 mm for over 98% of the volume for the phantom studies and lung patients. Conclusion: The proposed hybrid method uses the same commissioning data as the measurement-based method and can be easily extended to any non-standard LINAC. The results meet the accuracy, independence, and simple commissioning criteria for an independent dose calculator.
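The three steps reduce to a subtraction at commissioning time and an addition at calculation time, once the two component doses are on matched grids. A list-based sketch (the CCCS and ΔDRT engines themselves are the heavy components and are not reproduced here):

```python
def commission_correction(measured, model_dose):
    # commissioning data for the measurement-based calculator:
    # tabulated difference between water-phantom measurement and CCCS result
    return [m - c for m, c in zip(measured, model_dose)]

def hybrid_dose(d_model, d_delta_drt):
    # step 3: D = D-model + D-deltaDRT, element-wise on the same dose grid
    return [a + b for a, b in zip(d_model, d_delta_drt)]
```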

  14. Patient Experience-based Value Sets: Are They Stable?

    PubMed

    Pickard, A Simon; Hung, Yu-Ting; Lin, Fang-Ju; Lee, Todd A

    2017-11-01

    Although societal preference weights are desirable to inform resource-allocation decision-making, patient experienced health state-based value sets can be useful for clinical decision-making, but context may matter. To estimate EQ-5D value sets using visual analog scale (VAS) ratings for patients undergoing knee replacement surgery and compare the estimates before and after surgery. We used the Patient Reported Outcome Measures data collected by the UK National Health Service on patients undergoing knee replacement from 2009 to 2012. Generalized least squares regression models were used to derive value sets based on the EQ-5D-3 level using a development sample before and after surgery, and model performance was examined using a validation sample. A total of 90,450 preoperative and postoperative valuations were included. For preoperative valuations, the largest decrement in VAS values was associated with the dimension of anxiety/depression, followed by self-care, mobility, usual activities, and pain/discomfort. However, pain/discomfort had a greater impact on VAS value decrement in postoperative valuations. Compared with preoperative health problems, postsurgical health problems were associated with larger value decrements, with significant differences in several levels and dimensions, including level 2 of mobility, level 2/3 of usual activities, level 3 of pain/discomfort, and level 3 of anxiety/depression. Similar results were observed across subgroups stratified by age and sex. Findings suggest patient experience-based value sets are not stable (ie, context such as timing matters). However, the knowledge that lower values are assigned to health states postsurgery compared with presurgery may be useful for the patient-doctor decision-making process.

  15. 40 CFR 600.206-93 - Calculation and use of fuel economy values for gasoline-fueled, diesel-fueled, electric, alcohol...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 29 2010-07-01 2010-07-01 false Calculation and use of fuel economy values for gasoline-fueled, diesel-fueled, electric, alcohol-fueled, natural gas-fueled, alcohol dual fuel, and natural gas dual fuel vehicle configurations. 600.206-93 Section 600.206-93 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY ...

  16. 40 CFR 600.206-86 - Calculation and use of fuel economy values for gasoline-fueled, diesel, and electric vehicle...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 29 2010-07-01 2010-07-01 false Calculation and use of fuel economy values for gasoline-fueled, diesel, and electric vehicle configurations. 600.206-86 Section 600.206-86 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) ENERGY POLICY FUEL ECONOMY AND CARBON-RELATED EXHAUST EMISSIONS OF MOTOR...

  17. Machine learning assisted first-principles calculation of multicomponent solid solutions: estimation of interface energy in Ni-based superalloys

    NASA Astrophysics Data System (ADS)

    Chandran, Mahesh; Lee, S. C.; Shim, Jae-Hyeok

    2018-02-01

A disordered configuration of atoms in a multicomponent solid solution presents a computational challenge for first-principles calculations using density functional theory (DFT). The challenge is in identifying the few probable (low energy) configurations from a large configurational space before the DFT calculation can be performed. The search for these probable configurations is possible if the configurational energy E(σ) can be calculated accurately and rapidly (with a negligibly small computational cost). In this paper, we demonstrate such a possibility by constructing a machine learning (ML) model for E(σ) trained with DFT-calculated energies. The feature vector for the ML model is formed by concatenating histograms of pair and triplet (only equilateral triangle) correlation functions, g(2)(r) and g(3)(r,r,r), respectively. These functions are a quantitative 'fingerprint' of the spatial arrangement of atoms, familiar in the field of amorphous materials and liquids. The ML model is used to generate an accurate distribution P(E(σ)) by rapidly spanning a large number of configurations. The P(E) contains the full configurational information of the solid solution and can be selectively sampled to choose a few configurations for targeted DFT calculations. This new framework is employed to estimate the (100) interface energy σIE between γ and γ′ at 700 °C in Alloy 617, a Ni-based superalloy, with the composition reduced to five components. The estimated σIE ≈ 25.95 mJ m-2 is in good agreement with the value inferred by the precipitation model fit to experimental data. The proposed ML-based ab initio framework can be applied to calculate the parameters and properties of alloys with any number of components, thus widening the reach of first-principles calculation to realistic compositions of industrially relevant materials and alloys.
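The g(2)(r) part of the feature vector is essentially a normalised histogram of interatomic pair distances. A sketch (bin count and cutoff radius are arbitrary choices of this example, and the paper additionally concatenates a triplet-correlation histogram; `math.dist` requires Python 3.8+):

```python
import math

def pair_histogram(positions, r_max=6.0, n_bins=12):
    # normalised histogram of interatomic pair distances: a simple stand-in
    # for the g(2)(r) 'fingerprint' concatenated into the ML feature vector
    dists = [math.dist(p, q)
             for i, p in enumerate(positions)
             for q in positions[i + 1:]]
    hist = [0] * n_bins
    for d in dists:
        if d < r_max:
            hist[int(d * n_bins / r_max)] += 1
    return [h / max(len(dists), 1) for h in hist]
```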

  18. Determination of plasma osmolality and agreement between measured and calculated values in healthy adult Hispaniolan Amazon parrots (Amazona ventralis).

    PubMed

    Acierno, Mark J; Mitchell, Mark A; Freeman, Diana M; Schuster, Patricia J; Guzman, David Sanchez-Migallon; Tully, Thomas N

    2009-09-01

    To determine plasma osmolality in healthy adult Hispaniolan Amazon parrots (Amazona ventralis) and validate osmolality equations in these parrots. 20 healthy adult Hispaniolan Amazon parrots. A blood sample (0.5 mL) was collected from the right jugular vein of each parrot and placed into a lithium heparin microtainer tube. Samples were centrifuged, and plasma was harvested and frozen at -30 degrees C. Samples were thawed, and plasma osmolality was measured in duplicate with a freezing-point depression osmometer. The mean value was calculated for the 2 osmolality measurements. Plasma osmolality values were normally distributed, with a mean +/- SD of 326.0 +/- 6.878 mOsm/kg. The equations (2 x [Na(+) + K(+)]) + (glucose/18), which resulted in bias of 2.3333 mOsm/kg and limits of agreement of -7.0940 to 11.7606 mOsm/kg, and (2 x [Na(+) + K(+)]) + (uric acid concentration/16.8) + (glucose concentration/18), which resulted in bias of 5.8117 mOsm/kg and limits of agreement of -14.6640 to 3.0406 mOsm/kg, yielded calculated values that were in good agreement with the measured osmolality. IV administration of large amounts of hypotonic fluids can have catastrophic consequences. Osmolality of the plasma from parrots in this study was significantly higher than that of commercially available prepackaged fluids. Therefore, such fluids should be used with caution in Hispaniolan Amazon parrots as well as other psittacines. Additional studies are needed to determine whether the estimation of osmolality has the same clinical value in psittacines as it does in other animals.
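The two estimation equations above can be written directly. Sodium and potassium enter in mmol/l; the divisors 18 and 16.8 convert glucose and uric acid from mg/dl to mmol/l. The input values below are illustrative, not taken from the study:

```python
def osm_glucose(na, k, glucose_mg_dl):
    # (2 x [Na+ + K+]) + (glucose / 18), result in mOsm/kg
    return 2.0 * (na + k) + glucose_mg_dl / 18.0

def osm_glucose_uric(na, k, glucose_mg_dl, uric_acid_mg_dl):
    # (2 x [Na+ + K+]) + (uric acid / 16.8) + (glucose / 18)
    return 2.0 * (na + k) + uric_acid_mg_dl / 16.8 + glucose_mg_dl / 18.0

osm = osm_glucose(na=150.0, k=3.0, glucose_mg_dl=180.0)   # 316.0 mOsm/kg
```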

  19. IR Imager Exposure Calculator

    Science.gov Websites

    CTIO Infrared Imager Exposure Time Calculator. Note: ISPI throughput values updated 12 March 2005. The calculator computes the S/N achieved for a specified total integration time, or the total integration time required to reach a desired S/N.

  20. [Value-based medicine for glaucoma].

    PubMed

    Hirneiss, C; Kampik, A; Neubauer, A S

    2010-03-01

    The application of value-based medicine (VBM) tenets in the area of glaucoma research requires valid and reliable data concerning the quality of life with glaucoma. A multitude of instruments for measuring quality of life of patients with glaucoma have been employed in the past. Any instrument used would need to capture peripheral vision loss and its influence on patient-reported quality of life as this is one of the hallmarks of this disease. Cost-utility analyses can then be based on the reported quality of life and the cost of glaucoma therapy. Several cost-utility analyses have been applied in the field of glaucoma screening as well as treating ocular hypertension and based on this a recommendation regarding population subgroups which can be treated cost efficiently can be made.

  1. Density functional theory calculations of III-N based semiconductors with mBJLDA

    NASA Astrophysics Data System (ADS)

    Gürel, Hikmet Hakan; Akıncı, Özden; Ünlü, Hilmi

    2017-02-01

In this work, we present first-principles calculations based on a full-potential linear augmented plane-wave method (FP-LAPW) of the structural and electronic properties of III-V nitrides such as GaN, AlN, and InN in the zinc-blende cubic structure. First-principles calculations using the local density approximation (LDA) and the generalized gradient approximation (GGA) underestimate the band gap. We propose a potential called the modified Becke-Johnson local density approximation (MBJLDA), which combines the modified Becke-Johnson exchange potential with the LDA correlation potential, to obtain band gaps in better agreement with experiment. We compare various exchange-correlation potentials (LSDA, GGA, HSE, and MBJLDA) in determining the band gaps and structural properties of these semiconductors. We show that using the MBJLDA potential gives better agreement with experimental data for the band gaps of III-V nitride-based semiconductors.

  2. Building a values-based culture in nurse education.

    PubMed

    Tetley, Josie; Dobson, Fiona; Jack, Kirsten; Pearson, Beryl; Walker, Elaine

    2016-01-01

Nurse education has found itself challenged to select and educate nurses who, on completion of their programme, have excellent technical skills, an ability to critically analyse care, and the capacity to work compassionately in ways that support the values of care that are important to service users. Recent reports of care suggest that nursing still needs to develop the values base of its student selection and education processes. Against this backdrop, this paper presents two examples from pre-registration nurse education that illustrate how a values-based approach is used as part of the selection process in one university and to inform the development of a reflective poetry initiative in another. Having presented the two examples, the authors debate some of the wider benefits and challenges linked to these ways of working. For example, the importance of connecting nurses' personal beliefs, attitudes and assumptions to service user values in recruitment is discussed. The use of poetry as a way of thinking about practice that moves beyond traditional models of reflection in nursing is also considered. However, the authors recognise that if developments in nurse education are to have a real impact on nursing practice and patient care, values-based initiatives need to be more directly connected to the delivery of healthcare. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. Employers' use of value-based purchasing strategies.

    PubMed

    Rosenthal, Meredith B; Landon, Bruce E; Normand, Sharon-Lise T; Frank, Richard G; Ahmad, Thaniyyah S; Epstein, Arnold M

    2007-11-21

    Value-based purchasing by employers has often been portrayed as the lynchpin to quality improvement in a market-based health care system. Although a small group of the largest national employers has been actively engaged in promoting quality measurement, reporting, and pay for performance, it is unknown whether these ideas have significantly permeated employer-sponsored health benefit purchasing. To provide systematic descriptions and analyses of value-based purchasing and related efforts to improve quality of care by health care purchasers. We conducted telephone interviews with executives at 609 of the largest employers across 41 US markets between July 2005 and March 2006. The 41 randomly selected markets have at least 100,000 persons enrolled in health maintenance organizations, include approximately 91% of individuals enrolled in health maintenance organizations nationally, and represent roughly 78% of the US metropolitan population. Using the Dun & Bradstreet database of US employers, we identified the 26 largest firms in each market. Firms ranged in size from 60 to 250,000 employees. The degree to which value-based purchasing and related strategies are reported being used by employers. Percentages were weighted by number of employees. Of 1041 companies contacted, 609 employer representatives completed the survey (response rate, 64%). A large percentage of surveyed executives reported that they examine health plan quality data (269 respondents; 65% [95% confidence interval {CI}, 57%-74%]; P<.001), but few reported using it for performance rewards (49 respondents; 17% [95% CI, 7%-27%]; P=.008) or to influence employees (71 respondents; 23% [95% CI, 13%-33%]). Physician quality information is even less commonly examined (71 respondents; 16% [95% CI, 9%-23%]) or used by employers to reward performance (8 respondents; 2% [95% CI, 0%-3%]) or influence employee choice of providers (34 respondents; 8% [95% CI, 3%-12%]). 
Surveyed employers as a whole do not appear to

  4. Errors in the Calculation of 27Al Nuclear Magnetic Resonance Chemical Shifts

    PubMed Central

    Wang, Xianlong; Wang, Chengfei; Zhao, Hui

    2012-01-01

    Computational chemistry is an important tool for signal assignment of 27Al nuclear magnetic resonance spectra in order to elucidate the species of aluminum(III) in aqueous solutions. The accuracy of the popular theoretical models for computing the 27Al chemical shifts was evaluated by comparing the calculated and experimental chemical shifts in more than one hundred aluminum(III) complexes. In order to differentiate the error due to the chemical shielding tensor calculation from that due to the inadequacy of the molecular geometry prediction, single-crystal X-ray diffraction determined structures were used to build the isolated molecule models for calculating the chemical shifts. The results were compared with those obtained using the calculated geometries at the B3LYP/6-31G(d) level. The isotropic chemical shielding constants computed at different levels have strong linear correlations even though the absolute values differ in tens of ppm. The root-mean-square difference between the experimental chemical shifts and the calculated values is approximately 5 ppm for the calculations based on the X-ray structures, but more than 10 ppm for the calculations based on the computed geometries. The result indicates that the popular theoretical models are adequate in calculating the chemical shifts while an accurate molecular geometry is more critical. PMID:23203134

  5. Many-body calculations with deuteron based single-particle bases and their associated natural orbits

    NASA Astrophysics Data System (ADS)

    Puddu, G.

    2018-06-01

We use the recently introduced single-particle states obtained from localized deuteron wave-functions as a basis for nuclear many-body calculations. We show that energies can be substantially lowered if the natural orbits (NOs) obtained from this basis are used. We use this modified basis for 10B, 16O and 24Mg, employing the bare NNLOopt nucleon–nucleon interaction. The lowering of the energies increases with the mass. Although in principle NOs require a full-scale preliminary many-body calculation, we found that an approximate preliminary many-body calculation, with a marginal increase in the computational cost, is sufficient. The use of natural orbits based on a harmonic oscillator basis leads to a much smaller lowering of the energies for a comparable computational cost.

  6. Time to stabilization in single leg drop jump landings: an examination of calculation methods and assessment of differences in sample rate, filter settings and trial length on outcome values.

    PubMed

    Fransz, Duncan P; Huurnink, Arnold; de Boode, Vosse A; Kingma, Idsart; van Dieën, Jaap H

    2015-01-01

    Time to stabilization (TTS) is the time it takes for an individual to return to a baseline or stable state following a jump or hop landing. A large variety exists in methods to calculate the TTS. These methods can be described based on four aspects: (1) the input signal used (vertical, anteroposterior, or mediolateral ground reaction force), (2) signal processing (smoothing by sequential averaging, applying a moving root-mean-square window, or fitting an unbounded third order polynomial), (3) the stable state (threshold), and (4) the definition of when the (processed) signal is considered stable. Furthermore, differences exist with regard to the sample rate, filter settings and trial length. Twenty-five healthy volunteers performed ten 'single leg drop jump landing' trials. For each trial, TTS was calculated according to 18 previously reported methods. Additionally, the effects of sample rate (1000, 500, 200 and 100 samples/s), filter settings (no filter, 40, 15 and 10 Hz), and trial length (20, 14, 10, 7, 5 and 3 s) were assessed. The TTS values varied considerably across the calculation methods. The maximum effects of alterations in the processing settings, averaged over calculation methods, were 2.8% (SD 3.3%) for sample rate, 8.8% (SD 7.7%) for filter settings, and 100.5% (SD 100.9%) for trial length. Differences in TTS calculation methods are affected differently by sample rate, filter settings and trial length. The effects of differences in sample rate and filter settings are generally small, while trial length has a large effect on TTS values. Copyright © 2014 Elsevier B.V. All rights reserved.
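One of the many TTS variants described above can be sketched concretely. This is a minimal illustration, not the study's exact method: it assumes vertical ground reaction force input, moving-RMS smoothing, a 5%-of-body-weight threshold, and "never leaves the band again" as the stability definition; the window length and all numbers are our choices:

```python
import math

def moving_rms(x, window):
    # Moving root-mean-square window, one of the smoothing options above.
    half = window // 2
    out = []
    for i in range(len(x)):
        seg = x[max(0, i - half): i + half + 1]
        out.append(math.sqrt(sum(v * v for v in seg) / len(seg)))
    return out

def time_to_stabilization(force, body_weight, fs, window=25, frac=0.05):
    # Deviation of vertical GRF from body weight, smoothed by a moving RMS.
    smoothed = moving_rms([f - body_weight for f in force], window)
    threshold = frac * body_weight  # 'stable state': within 5% of body weight
    # TTS = first sample after which the signal never leaves the band again.
    tts_idx = 0
    for i in range(len(smoothed) - 1, -1, -1):
        if smoothed[i] > threshold:
            tts_idx = i + 1
            break
    return tts_idx / fs

# A toy landing: 0.1 s of large deviation, then a steady plateau at body weight.
force = [900.0] * 100 + [700.0] * 400
print(time_to_stabilization(force, body_weight=700.0, fs=1000))
```

Because the "never again" criterion scans to the end of the recording, shortening `force` (the trial length) or changing `fs`, the window, or the filter directly moves the result, which is consistent with the large trial-length effect reported above.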

  7. 21 CFR 868.1880 - Pulmonary-function data calculator.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Pulmonary-function data calculator. 868.1880 Section 868.1880 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES...-function values based on actual physical data obtained during pulmonary-function testing. (b...

  8. 21 CFR 868.1880 - Pulmonary-function data calculator.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Pulmonary-function data calculator. 868.1880 Section 868.1880 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES...-function values based on actual physical data obtained during pulmonary-function testing. (b...

  9. Group additivity calculations of the thermodynamic properties of unfolded proteins in aqueous solution: a critical comparison of peptide-based and HKF models.

    PubMed

    Hakin, A W; Hedwig, G R

    2001-02-15

    A recent paper in this journal [Amend and Helgeson, Biophys. Chem. 84 (2000) 105] presented a new group additivity model to calculate various thermodynamic properties of unfolded proteins in aqueous solution. The parameters given for the revised Helgeson-Kirkham-Flowers (HKF) equations of state for all the constituent groups of unfolded proteins can be used, in principle, to calculate the partial molar heat capacity, C°p,2, and volume, V°2, at infinite dilution of any polypeptide. Calculations of the values of C°p,2 and V°2 for several polypeptides have been carried out to test the predictive utility of the HKF group additivity model. The results obtained are in very poor agreement with experimental data, and also with results calculated using a peptide-based group additivity model. A critical assessment of these two additivity models is presented.

  10. Model-based coefficient method for calculation of N leaching from agricultural fields applied to small catchments and the effects of leaching reducing measures

    NASA Astrophysics Data System (ADS)

    Kyllmar, K.; Mårtensson, K.; Johnsson, H.

    2005-03-01

    A method to calculate N leaching from arable fields using model-calculated N leaching coefficients (NLCs) was developed. Using the process-based modelling system SOILNDB, leaching of N was simulated for four leaching regions in southern Sweden with 20-year climate series and a large number of randomised crop sequences based on regional agricultural statistics. To obtain N leaching coefficients, mean values of annual N leaching were calculated for each combination of main crop, following crop and fertilisation regime for each leaching region and soil type. The field-NLC method developed could be useful for following up water quality goals in e.g. small monitoring catchments, since it allows normal leaching from actual crop rotations and fertilisation to be determined regardless of the weather. The method was tested using field data from nine small intensively monitored agricultural catchments. The agreement between calculated field N leaching and measured N transport in catchment stream outlets, 19-47 and 8-38 kg ha⁻¹ yr⁻¹, respectively, was satisfactory in most catchments when contributions from land uses other than arable land and uncertainties in groundwater flows were considered. The possibility of calculating effects of crop combinations (crop and following crop) is of considerable value since changes in crop rotation constitute a large potential for reducing N leaching. When the effect of a number of potential measures to reduce N leaching (i.e. applying manure in spring instead of autumn; postponing ploughing-in of ley and green fallow in autumn; undersowing a catch crop in cereals and oilseeds; and increasing the area of catch crops by substituting winter cereals and winter oilseeds with corresponding spring crops) was calculated for the arable fields in the catchments using field-NLCs, N leaching was reduced by between 34 and 54% for the separate catchments when the best possible effect on the entire potential area was assumed.
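Once the coefficients exist, the field-NLC bookkeeping amounts to a table lookup keyed by (main crop, following crop, fertilisation) followed by an area-weighted average. A minimal sketch with invented coefficients for one hypothetical region and soil type (not the paper's values):

```python
# Leaching coefficients (kg N/ha/yr) keyed by (main crop, following crop,
# fertilisation regime) -- hypothetical numbers for one region/soil type.
NLC = {
    ("spring barley", "winter wheat", "mineral"): 28.0,
    ("winter wheat", "catch crop", "mineral"): 14.0,
    ("ley", "spring barley", "manure"): 22.0,
}

def field_n_leaching(fields):
    # Area-weighted mean N leaching (kg/ha/yr) over a catchment's arable fields.
    total_load = sum(area * NLC[key] for area, key in fields)
    total_area = sum(area for area, _ in fields)
    return total_load / total_area

fields = [
    (12.0, ("spring barley", "winter wheat", "mineral")),
    (8.0, ("winter wheat", "catch crop", "mineral")),
    (5.0, ("ley", "spring barley", "manure")),
]
print(round(field_n_leaching(fields), 1))
```

A mitigation measure is then evaluated by swapping keys (e.g. replacing a winter cereal with a spring crop plus catch crop) and recomputing the weighted mean.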

  11. Validation of a program for supercritical power plant calculations

    NASA Astrophysics Data System (ADS)

    Kotowicz, Janusz; Łukowicz, Henryk; Bartela, Łukasz; Michalski, Sebastian

    2011-12-01

    This article describes the validation of a supercritical steam cycle model. The model was created with the commercial program GateCycle and validated using in-house code of the Institute of Power Engineering and Turbomachinery. The Institute's in-house code has been used extensively for industrial power plant calculations with good results. In the first step of the validation process, assumptions were made about the live steam temperature and pressure, net power, characteristic quantities for high- and low-pressure regenerative heat exchangers and pressure losses in heat exchangers. These assumptions were then used to develop a steam cycle model in GateCycle and a model based on the code developed in-house at the Institute of Power Engineering and Turbomachinery. Properties, such as thermodynamic parameters at characteristic points of the steam cycle, net power values and efficiencies, heat provided to the steam cycle and heat taken from the steam cycle, were compared. The last step of the analysis was calculation of relative errors of the compared values. The method used for relative error calculations is presented in the paper. The resulting relative errors are very small, generally not exceeding 0.1%. Based on our analysis, it can be concluded that using the GateCycle software for calculations of supercritical power plants is possible.
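The final comparison step is a straightforward relative-error computation over matched quantities from the two models. A sketch with illustrative numbers (not the paper's actual results):

```python
def relative_error(model_value, reference_value):
    # Relative error (%) of one model's result against a reference code.
    return abs(model_value - reference_value) / abs(reference_value) * 100.0

# Hypothetical (GateCycle, in-house code) pairs -- illustrative numbers only.
pairs = {
    "net power [MW]": (600.02, 600.00),
    "thermal efficiency [%]": (45.503, 45.510),
    "live steam temperature [deg C]": (600.0, 600.0),
}
for name, (gc, ref) in pairs.items():
    print(f"{name}: {relative_error(gc, ref):.4f}%")
```

Each quantity staying under the 0.1% bound quoted above is the acceptance criterion for declaring the two models equivalent.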

  12. Multi-step Monte Carlo calculations applied to nuclear reactor instrumentation - source definition and renormalization to physical values

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Radulovic, Vladimir; Barbot, Loic; Fourmentel, Damien

    , which were recently irradiated in the Jozef Stefan Institute TRIGA Mark II reactor in Ljubljana, Slovenia, and provides recommendations on how they can be overcome. The paper concludes with a discussion on the renormalization of the results from the second step calculations, to obtain accurate physical values. (authors)

  13. Principals' Leadership Behaviour: Values-Based, Contingent or Both?

    ERIC Educational Resources Information Center

    Warwas, Julia

    2015-01-01

    Purpose: Concepts of values-based leadership posit that school principals' professional practice must be informed by values to ensure coherently purposeful activities. Contingency models stress the contextual dependency of professional practice and the need to match activities to local opportunities and constraints. The purpose of this paper is to…

  14. Research on BIM-based building information value chain reengineering

    NASA Astrophysics Data System (ADS)

    Hui, Zhao; Weishuang, Xie

    2017-04-01

    The achievement of value and value-added factor to the building engineering information is accomplished through a chain-flow, that is, building the information value chain. Based on the deconstruction of the information chain on the construction information in the traditional information mode, this paper clarifies the value characteristics and requirements of each stage of the construction project. In order to achieve building information value-added, the paper deconstructs the traditional building information value chain, reengineer the information value chain model on the basis of the theory and techniques of BIM, to build value-added management model and analyse the value of the model.

  15. Effect-based trigger values for in vitro bioassays: Reading across from existing water quality guideline values.

    PubMed

    Escher, Beate I; Neale, Peta A; Leusch, Frederic D L

    2015-09-15

    Cell-based bioassays are becoming increasingly popular in water quality assessment. The new generations of reporter-gene assays are very sensitive and effects are often detected in very clean water types such as drinking water and recycled water. For monitoring applications it is therefore imperative to derive trigger values that differentiate between acceptable and unacceptable effect levels. In this proof-of-concept paper, we propose a statistical method to read directly across from chemical guideline values to trigger values without the need to perform in vitro to in vivo extrapolations. The derivation is based on matching effect concentrations with existing chemical guideline values and filtering out appropriate chemicals that are responsive in the given bioassays at concentrations in the range of the guideline values. To account for the mixture effects of many chemicals acting together in a complex water sample, we propose bioanalytical equivalents that integrate the effects of groups of chemicals with the same mode of action that act in a concentration-additive manner. Statistical distribution methods are proposed to derive a specific effect-based trigger bioanalytical equivalent concentration (EBT-BEQ) for each bioassay of environmental interest that targets receptor-mediated toxicity. Even bioassays that are indicative of the same mode of action have slightly different numeric trigger values due to differences in their inherent sensitivity. The algorithm was applied to 18 cell-based bioassays and 11 provisional effect-based trigger bioanalytical equivalents were derived as an illustrative example using the 349 chemical guideline values protective for human health of the Australian Guidelines for Water Recycling. We illustrate the applicability using the example of a diverse set of water samples including recycled water. Most recycled water samples were compliant with the proposed triggers while wastewater effluent would not have been compliant with a few

  16. Evidence-Based and Values-Based Practices for People with Severe Disabilities

    ERIC Educational Resources Information Center

    Singer, George H. S.; Agran, Martin; Spooner, Fred

    2017-01-01

    This article discusses the relationship between evidence-based practices (EBPs) and values in research and practice pertaining to people with severe disabilities. The importance of basing educational and habilitation practices on substantial scientific evidence for practical, moral, and legal reasons is acknowledged given the prevalence of…

  17. Calculation of thermal expansion coefficient of glasses based on topological constraint theory

    NASA Astrophysics Data System (ADS)

    Zeng, Huidan; Ye, Feng; Li, Xiang; Wang, Ling; Yang, Bin; Chen, Jianding; Zhang, Xianghua; Sun, Luyi

    2016-10-01

    In this work, the thermal expansion behavior and the structure configuration evolution of glasses were studied. Degree of freedom based on the topological constraint theory is correlated with configuration evolution; considering the chemical composition and the configuration change, the analytical equation for calculating the thermal expansion coefficient of glasses from degree of freedom was derived. The thermal expansion of typical silicate and chalcogenide glasses was examined by calculating their thermal expansion coefficients (TEC) using the approach stated above. The results showed that this approach was energetically favorable for glass materials and revealed the corresponding underlying essence from the viewpoint of configuration entropy. This work establishes a configuration-based methodology to calculate the thermal expansion coefficient of glasses that lack periodic order.

  18. Parallel Representation of Value-Based and Finite State-Based Strategies in the Ventral and Dorsal Striatum

    PubMed Central

    Ito, Makoto; Doya, Kenji

    2015-01-01

    Previous theoretical studies of animal and human behavioral learning have focused on the dichotomy of the value-based strategy using action value functions to predict rewards and the model-based strategy using internal models to predict environmental states. However, animals and humans often take simple procedural behaviors, such as the “win-stay, lose-switch” strategy without explicit prediction of rewards or states. Here we consider another strategy, the finite state-based strategy, in which a subject selects an action depending on its discrete internal state and updates the state depending on the action chosen and the reward outcome. By analyzing choice behavior of rats in a free-choice task, we found that the finite state-based strategy fitted their behavioral choices more accurately than value-based and model-based strategies did. When fitted models were run autonomously with the same task, only the finite state-based strategy could reproduce the key feature of choice sequences. Analyses of neural activity recorded from the dorsolateral striatum (DLS), the dorsomedial striatum (DMS), and the ventral striatum (VS) identified significant fractions of neurons in all three subareas for which activities were correlated with individual states of the finite state-based strategy. The signal of individual internal states at the time of choice was found in DMS, and that of clusters of states in VS. In addition, action values and state values of the value-based strategy were encoded in DMS and VS, respectively. These results suggest that both the value-based strategy and the finite state-based strategy are implemented in the striatum. PMID:26529522
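The finite state-based strategy is procedurally trivial compared with value-based learning: the discrete internal state simply encodes the next intended action, with no value estimates anywhere. A sketch of its "win-stay, lose-switch" special case in a two-option bandit task; the reward probabilities are arbitrary illustrative choices:

```python
import random

def win_stay_lose_switch(n_trials, p_reward=(0.7, 0.3), seed=0):
    # Finite state-based strategy: the internal state *is* the next action,
    # updated from (action, reward) alone -- no reward prediction involved.
    rng = random.Random(seed)
    state = 0  # discrete internal state: which of the two options to pick
    choices, rewards = [], []
    for _ in range(n_trials):
        action = state
        reward = rng.random() < p_reward[action]
        choices.append(action)
        rewards.append(reward)
        if not reward:  # lose -> switch; win -> stay (state unchanged)
            state = 1 - state
    return choices, rewards

choices, rewards = win_stay_lose_switch(1000)
print(sum(choices) / len(choices))  # fraction of trials spent on option 1
```

Fitting such a model to data means finding the state set and transition rule that maximize the likelihood of the observed choice sequence, which is how it was compared against value-based and model-based alternatives above.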

  19. Utility and Value of Satellite-Based Frost Forecasting for Kenya's Tea Farming Sector

    NASA Astrophysics Data System (ADS)

    Morrison, I.

    2016-12-01

    Frost damage regularly inflicts millions of dollars of crop losses in the tea-growing highlands of western Kenya, a problem that the USAID/NASA Regional Visualization and Monitoring System (SERVIR) program is working to mitigate through a frost monitoring and forecasting product that uses satellite-based temperature and soil moisture data to generate up to three days of advanced warning before frost events. This paper presents the findings of a value of information (VOI) study assessing the value of this product based on Kenyan tea farmers' experiences with frost and frost-damage mitigation. Value was calculated based on historic trends of frost frequency, severity, and extent; likelihood of warning receipt and response; and subsequent frost-related crop-loss aversion. Quantification of these factors was derived through inferential analysis of survey data from 400 tea-farming households across the tea-growing regions of Kericho and Nandi, supplemented with key informant interviews with decision-makers at large estate tea plantations, historical frost incident and crop-loss data from estate tea plantations and agricultural insurance companies, and publicly available demographic and economic data. At this time, the product provides a forecasting window of up to three days, and no other frost-prediction methods are used by the large or small-scale farmers of Kenya's tea sector. This represents a significant opportunity for preemptive loss-reduction via Earth observation data. However, the tea-growing community has only two realistic options for frost-damage mitigation: preemptive harvest of available tea leaves to minimize losses, or skiving (light pruning) to facilitate fast recovery from frost damage. Both options are labor-intensive and require a minimum of three days of warning to be viable. 
As a result, the frost forecasting system has a very narrow margin of usefulness, making its value highly dependent on rapid access to the warning messages and flexible access

  20. Calculation Package: Derivation of Facility-Specific Derived Air Concentration (DAC) Values in Support of Spallation Neutron Source Operations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McLaughlin, David A

    Derived air concentration (DAC) values for 175 radionuclides produced at the Oak Ridge National Laboratory (ORNL) Spallation Neutron Source (SNS), but not listed in Appendix A of 10 CFR 835 (01/01/2009 version), are presented. The proposed DAC values, ranging between 1E-07 µCi/mL and 2E-03 µCi/mL, were calculated in accordance with the recommendations of the International Commission on Radiological Protection (ICRP), and are intended to support an exemption request seeking regulatory relief from the 10 CFR 835, Appendix A, requirement to apply restrictive DACs of 2E-13 µCi/mL and 4E-11 µCi/mL for non-listed alpha and non-alpha-emitting radionuclides, respectively.
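A DAC of this kind is conventionally derived by dividing the annual limit on intake (ALI) by the volume of air a reference worker inhales in a working year (2000 h at 1.2 m³/h, i.e. 2.4×10⁹ mL). A sketch of that conversion; the ALI value below is hypothetical, not one of the report's radionuclides:

```python
def dac_uci_per_ml(ali_uci, hours_per_year=2000.0, breathing_rate_m3_per_h=1.2):
    # DAC = ALI / (air inhaled in a working year); 2000 h * 1.2 m^3/h = 2.4e9 mL.
    air_intake_ml = hours_per_year * breathing_rate_m3_per_h * 1.0e6  # m^3 -> mL
    return ali_uci / air_intake_ml

# A hypothetical ALI of 2400 uCi corresponds to a DAC of 1e-06 uCi/mL.
print(dac_uci_per_ml(2400.0))
```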

  1. The value of improved (ERS) information based on domestic distribution effects of U.S. agriculture crops

    NASA Technical Reports Server (NTRS)

    Bradford, D. F.; Kelejian, H. H.; Brusch, R.; Gross, J.; Fishman, H.; Feenberg, D.

    1974-01-01

    The value of improving information for forecasting future crop harvests was investigated. Emphasis was placed upon establishing practical evaluation procedures firmly based in economic theory. The analysis was applied to the case of U.S. domestic wheat consumption. Estimates for a cost of storage function and a demand function for wheat were calculated. A model of market determinations of wheat inventories was developed for inventory adjustment. The carry-over horizon is computed by the solution of a nonlinear programming problem, and related variables such as spot and future price at each stage are determined. The model is adaptable to other markets. Results are shown to depend critically on the accuracy of current and proposed measurement techniques. The quantitative results are presented parametrically, in terms of various possible values of current and future accuracies.

  2. Sampling of Stochastic Input Parameters for Rockfall Calculations and for Structural Response Calculations Under Vibratory Ground Motion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    M. Gross

    2004-09-01

    The purpose of this scientific analysis is to define the sampled values of stochastic (random) input parameters for (1) rockfall calculations in the lithophysal and nonlithophysal zones under vibratory ground motions, and (2) structural response calculations for the drip shield and waste package under vibratory ground motions. This analysis supplies: (1) Sampled values of ground motion time history and synthetic fracture pattern for analysis of rockfall in emplacement drifts in nonlithophysal rock (Section 6.3 of ''Drift Degradation Analysis'', BSC 2004 [DIRS 166107]); (2) Sampled values of ground motion time history and rock mechanical properties category for analysis of rockfall in emplacement drifts in lithophysal rock (Section 6.4 of ''Drift Degradation Analysis'', BSC 2004 [DIRS 166107]); (3) Sampled values of ground motion time history and metal to metal and metal to rock friction coefficient for analysis of waste package and drip shield damage to vibratory motion in ''Structural Calculations of Waste Package Exposed to Vibratory Ground Motion'' (BSC 2004 [DIRS 167083]) and in ''Structural Calculations of Drip Shield Exposed to Vibratory Ground Motion'' (BSC 2003 [DIRS 163425]). The sampled values are indices representing the number of ground motion time histories, number of fracture patterns and rock mass properties categories. These indices are translated into actual values within the respective analysis and model reports or calculations. This report identifies the uncertain parameters and documents the sampled values for these parameters. The sampled values are determined by GoldSim V6.04.007 [DIRS 151202] calculations using appropriate distribution types and parameter ranges. No software development or model development was required for these calculations. The calculation of the sampled values allows parameter uncertainty to be incorporated into the rockfall and structural response calculations that support development of the seismic scenario for

  3. Value-Based Delivery of Education: MOOCs as Messengers

    ERIC Educational Resources Information Center

    Gilfoil, David M.; Focht, Jeffrey W.

    2015-01-01

    Value-based delivery of healthcare has been discussed in the literature for almost a decade. The concept focuses on the patient and defines value as the improvement of patient outcomes divided by healthcare costs. Further refinements, called the Triple Aim model, focus on improving patient outcomes, reducing treatment costs, and improving patient…

  4. Estimation of ΔR/R values by benchmark study of the Mössbauer isomer shifts for Ru, Os complexes using relativistic DFT calculations

    NASA Astrophysics Data System (ADS)

    Kaneko, Masashi; Yasuhara, Hiroki; Miyashita, Sunao; Nakashima, Satoru

    2017-11-01

    The present study applies all-electron relativistic DFT calculations with the Douglas-Kroll-Hess (DKH) Hamiltonian to ten sets each of Ru and Os compounds. We perform a benchmark investigation of three density functionals (BP86, B3LYP and B2PLYP) using the segmented all-electron relativistically contracted (SARC) basis set against the experimental Mössbauer isomer shifts for the 99Ru and 189Os nuclides. Geometry optimizations at the BP86 level of theory locate the structures in local minima. We calculate the contact density from the wavefunction obtained by a single-point calculation. All functionals show a good linear correlation with the experimental isomer shifts for both 99Ru and 189Os. In particular, the B3LYP functional gives a stronger correlation than the BP86 and B2PLYP functionals. The comparison of contact density between SARC and the well-tempered basis set (WTBS) indicated that numerical convergence of the contact density cannot be obtained, but that the reproducibility is less sensitive to the choice of basis set. We also estimate the values of ΔR/R, an important nuclear constant, for the 99Ru and 189Os nuclides using the benchmark results. The sign of the calculated ΔR/R values is consistent with the predicted data for 99Ru and 189Os. At the B3LYP level with the SARC basis set, we obtain ΔR/R values of 2.35×10⁻⁴ for 99Ru and -0.20×10⁻⁴ for 189Os (36.2 keV).

  5. An organ-based approach to dose calculation in the assessment of dose-dependent biological effects of ionising radiation in Arabidopsis thaliana.

    PubMed

    Biermans, Geert; Horemans, Nele; Vanhoudt, Nathalie; Vandenhove, Hildegarde; Saenen, Eline; Van Hees, May; Wannijn, Jean; Vives i Batlle, Jordi; Cuypers, Ann

    2014-07-01

    There is a need for a better understanding of biological effects of radiation exposure in non-human biota. Correct description of these effects requires a more detailed model of dosimetry than that available in current risk assessment tools, particularly for plants. In this paper, we propose a simple model for dose calculations in roots and shoots of Arabidopsis thaliana seedlings exposed to radionuclides in a hydroponic exposure setup. This model is used to compare absorbed doses for three radionuclides, (241)Am (α-radiation), (90)Sr (β-radiation) and (133)Ba (γ radiation). Using established dosimetric calculation methods, dose conversion coefficient values were determined for each organ separately based on uptake data from the different plant organs. These calculations were then compared to the DCC values obtained with the ERICA tool under equivalent geometry assumptions. When comparing with our new method, the ERICA tool appears to overestimate internal doses and underestimate external doses in the roots for all three radionuclides, though each to a different extent. These observations might help to refine dose-response relationships. The DCC values for (90)Sr in roots are shown to deviate the most. A dose-effect curve for (90)Sr β-radiation has been established on biomass and photosynthesis endpoints, but no significant dose-dependent effects are observed. This indicates the need for use of endpoints at the molecular and physiological scale. Copyright © 2013 Elsevier Ltd. All rights reserved.

  6. Mandatory bundled payment getting into formation for value-based care.

    PubMed

    Fink, John

    2015-10-01

    Succeeding under Medicare's Comprehensive Care for Joint Replacement model will require collaboration among caregivers and financial arrangements to align incentives. Priorities for most organizations' transition to becoming value-based hospitals will be care redesign, supply-purchasing strategy, and post-acute care provider partnering. Pursuing value for your joint replacement program will chart a path for other service lines and lead your organization's transition to becoming a value-based enterprise.

  7. Medication calculation: the potential role of digital game-based learning in nurse education.

    PubMed

    Foss, Brynjar; Mordt Ba, Petter; Oftedal, Bjørg F; Løkken, Atle

    2013-12-01

    Medication dose calculation is one of several medication-related activities that are conducted by nurses daily. However, medication calculation skills appear to be an area of global concern, possibly because of low numeracy skills, test anxiety, low self-confidence, and low self-efficacy among student nurses. Various didactic strategies have been developed for student nurses who still lack basic mathematical competence. However, we suggest that the critical nature of these skills demands the investigation of alternative and/or supplementary didactic approaches to improve medication calculation skills and to reduce failure rates. Digital game-based learning is a possible solution because of the following reasons. First, mathematical drills may improve medication calculation skills. Second, games are known to be useful during nursing education. Finally, mathematical drill games appear to improve the attitudes of students toward mathematics. The aim of this article was to discuss common challenges of medication calculation skills in nurse education, and we highlight the potential role of digital game-based learning in this area.

  8. Sensor Based Engine Life Calculation: A Probabilistic Perspective

    NASA Technical Reports Server (NTRS)

    Guo, Ten-Huei; Chen, Philip

    2003-01-01

    It is generally known that an engine component will accumulate damage (life usage) during its lifetime of use in a harsh operating environment. The commonly used cycle count for engine component usage monitoring has an inherent range of uncertainty which can be overly costly or potentially less safe from an operational standpoint. With the advance of computer technology, engine operation modeling, and the understanding of damage accumulation physics, it is possible (and desirable) to use the available sensor information to make a more accurate assessment of engine component usage. This paper describes a probabilistic approach to quantify the effects of engine operating parameter uncertainties on the thermomechanical fatigue (TMF) life of a selected engine part. A closed-loop engine simulation with a TMF life model is used to calculate the life consumption of different mission cycles. A Monte Carlo simulation approach is used to generate the statistical life usage profile for different operating assumptions. The probabilities of failure of different operating conditions are compared to illustrate the importance of the engine component life calculation using sensor information. The results of this study clearly show that a sensor-based life cycle calculation can greatly reduce the risk of component failure as well as extend on-wing component life by avoiding unnecessary maintenance actions.
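The probabilistic approach can be illustrated with a toy model: sample the uncertain operating parameter, evaluate a life model for each sample, and count the fraction of samples whose computed life falls short of the mission usage. The power-law life model and all numbers below are invented stand-ins for the paper's engine simulation and TMF model:

```python
import random

def tmf_life_cycles(delta_t):
    # Toy stand-in for a TMF life model: cycles to failure fall off as a
    # power law in the thermal excursion (illustrative, not the engine model).
    return 1.0e12 * delta_t ** -3.0

def prob_of_failure(mission_cycles, mean_dt=600.0, sigma_dt=25.0, n=20000, seed=1):
    # Monte Carlo over the uncertain operating parameter: the fraction of
    # sampled conditions whose life is exceeded by the mission usage.
    rng = random.Random(seed)
    failures = sum(
        1 for _ in range(n)
        if mission_cycles > tmf_life_cycles(rng.gauss(mean_dt, sigma_dt))
    )
    return failures / n

print(prob_of_failure(3000), prob_of_failure(7000))
```

The same mission count can carry a negligible or a near-certain failure probability depending on the spread of the operating parameter, which is the argument for sensor-based rather than fixed cycle-count life tracking.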

  9. The MiAge Calculator: a DNA methylation-based mitotic age calculator of human tissue types.

    PubMed

    Youn, Ahrim; Wang, Shuang

    2018-01-01

    Cell division is important in human aging and cancer. The estimation of the number of cell divisions (mitotic age) of a given tissue type in individuals is of great interest as it allows not only the study of biological aging (using a new molecular aging target) but also the stratification of prospective cancer risk. Here, we introduce the MiAge Calculator, a mitotic age calculator based on a novel statistical framework, the MiAge model. MiAge is designed to quantitatively estimate mitotic age (total number of lifetime cell divisions) of a tissue using the stochastic replication errors accumulated in the epigenetic inheritance process during cell divisions. With the MiAge model, the MiAge Calculator was built using the training data of DNA methylation measures of 4,020 tumor and adjacent normal tissue samples from eight TCGA cancer types and was tested using the testing data of DNA methylation measures of 2,221 tumor and adjacent normal tissue samples of five other TCGA cancer types. We showed that within each of the thirteen cancer types studied, the estimated mitotic age is universally accelerated in tumor tissues compared to adjacent normal tissues. Across the thirteen cancer types, we showed that worse cancer survivals are associated with more accelerated mitotic age in tumor tissues. Importantly, we demonstrated the utility of mitotic age by showing that the integration of mitotic age and clinical information leads to improved survival prediction in six out of the thirteen cancer types studied. The MiAge Calculator is available at http://www.columbia.edu/~sw2206/softwares.htm.

  10. Phenol-quinone tautomerism in (arylazo)naphthols and the analogous Schiff bases: benchmark calculations.

    PubMed

    Ali, S Tahir; Antonov, Liudmil; Fabian, Walter M F

    2014-01-30

    Tautomerization energies of a series of isomeric [(4-R-phenyl)azo]naphthols and the analogous Schiff bases (R = N(CH3)2, OCH3, H, CN, NO2) are calculated by LPNO-CEPA/1-CBS using the def2-TZVPP and def2-QZVPP basis sets for extrapolation. The performance of various density functionals (B3LYP, M06-2X, PW6B95, B2PLYP, mPW2PLYP, PWPB95) as well as MP2 and SCS-MP2 is evaluated against these results. M06-2X and SCS-MP2 yield results close to the LPNO-CEPA/1-CBS values. Solvent effects (CCl4, CHCl3, CH3CN, and CH3OH) are treated by a variety of bulk solvation models (SM8, IEFPCM, COSMO, PBF, and SMD) as well as explicit solvation (Monte Carlo free energy perturbation using the OPLSAA force field).

  11. Hybrid dose calculation: a dose calculation algorithm for microbeam radiation therapy

    NASA Astrophysics Data System (ADS)

    Donzelli, Mattia; Bräuer-Krisch, Elke; Oelfke, Uwe; Wilkens, Jan J.; Bartzsch, Stefan

    2018-02-01

    Microbeam radiation therapy (MRT) is still a preclinical approach in radiation oncology that uses planar micrometre wide beamlets with extremely high peak doses, separated by a few hundred micrometre wide low dose regions. Abundant preclinical evidence demonstrates that MRT spares normal tissue more effectively than conventional radiation therapy, at equivalent tumour control. In order to launch first clinical trials, accurate and efficient dose calculation methods are an indispensable prerequisite. In this work a hybrid dose calculation approach is presented that is based on a combination of Monte Carlo and kernel-based dose calculation. In various examples the performance of the algorithm is compared to purely Monte Carlo and purely kernel-based dose calculations. The accuracy of the developed algorithm is comparable to conventional pure Monte Carlo calculations. In particular for inhomogeneous materials the hybrid dose calculation algorithm outperforms purely convolution-based dose calculation approaches. It is demonstrated that the hybrid algorithm can efficiently calculate even complicated pencil beam and cross firing beam geometries. The required calculation times are substantially lower than for pure Monte Carlo calculations.

  12. Numerical calculation of thermo-mechanical problems at large strains based on complex step derivative approximation of tangent stiffness matrices

    NASA Astrophysics Data System (ADS)

    Balzani, Daniel; Gandhi, Ashutosh; Tanaka, Masato; Schröder, Jörg

    2015-05-01

    In this paper a robust approximation scheme for the numerical calculation of tangent stiffness matrices is presented in the context of nonlinear thermo-mechanical finite element problems and its performance is analyzed. The scheme extends the approach proposed in Kim et al. (Comput Methods Appl Mech Eng 200:403-413, 2011) and Tanaka et al. (Comput Methods Appl Mech Eng 269:454-470, 2014) and is based on applying the complex-step-derivative approximation to the linearizations of the weak forms of the balance of linear momentum and the balance of energy. By incorporating consistent perturbations along the imaginary axis to the displacement as well as thermal degrees of freedom, we demonstrate that numerical tangent stiffness matrices can be obtained with accuracy up to computer precision leading to quadratically converging schemes. The main advantage of this approach is that contrary to the classical forward difference scheme no round-off errors due to floating-point arithmetic exist within the calculation of the tangent stiffness. This enables arbitrarily small perturbation values and therefore leads to robust schemes even when choosing small values. An efficient algorithmic treatment is presented which enables a straightforward implementation of the method in any standard finite-element program. By means of thermo-elastic and thermo-elastoplastic boundary value problems at finite strains the performance of the proposed approach is analyzed.
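
The complex-step idea described above is easy to demonstrate outside the finite-element setting. The sketch below (plain Python, not the authors' implementation) differentiates a smooth test function via an imaginary-axis perturbation and contrasts the result with a classical forward difference:

```python
import cmath

def complex_step(f, x, h=1e-30):
    # Perturb along the imaginary axis; the imaginary part of the result
    # approximates f'(x) with no subtractive cancellation, so h can be tiny.
    return f(x + 1j * h).imag / h

def forward_diff(f, x, h=1e-8):
    # Classical forward difference: round-off limits accuracy as h shrinks.
    return (f(x + h) - f(x)).real / h

f = lambda z: cmath.exp(z) * cmath.sin(z)                    # smooth test function
exact = (cmath.exp(1) * (cmath.sin(1) + cmath.cos(1))).real  # analytic f'(1)

err_cs = abs(complex_step(f, 1.0) - exact)   # near machine precision
err_fd = abs(forward_diff(f, 1.0) - exact)   # orders of magnitude larger
```

Because no difference of nearly equal numbers is formed, `h` can be made arbitrarily small without round-off, which is exactly the robustness property the paper exploits.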

  13. A computer-based matrix for rapid calculation of pulmonary hemodynamic parameters in congenital heart disease

    PubMed Central

    Lopes, Antonio Augusto; dos Anjos Miranda, Rogério; Gonçalves, Rilvani Cavalcante; Thomaz, Ana Maria

    2009-01-01

    BACKGROUND: In patients with congenital heart disease undergoing cardiac catheterization for hemodynamic purposes, parameter estimation by the indirect Fick method using a single predicted value of oxygen consumption has been a matter of criticism. OBJECTIVE: We developed a computer-based routine for rapid estimation of replicate hemodynamic parameters using multiple predicted values of oxygen consumption. MATERIALS AND METHODS: Using Microsoft® Excel facilities, we constructed a matrix containing 5 models (equations) for prediction of oxygen consumption, and all additional formulas needed to obtain replicate estimates of hemodynamic parameters. RESULTS: By entering data from 65 patients with ventricular septal defects, aged 1 month to 8 years, it was possible to obtain multiple predictions for oxygen consumption, with clear between-age-group (P < .001) and between-method (P < .001) differences. Using these predictions in the individual patient, it was possible to obtain the upper and lower limits of a likely range for any given parameter, which made estimation more realistic. CONCLUSION: The organized matrix allows for rapid generation of replicate parameter estimates, without the errors that can accompany exhaustive manual calculations. PMID:19641642
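
The core of such a matrix is the indirect Fick principle, flow = VO2 / arteriovenous oxygen content difference, applied once per predicted oxygen consumption value. A minimal sketch, with entirely hypothetical patient numbers and dissolved oxygen neglected:

```python
def o2_content(hb_g_dl, sat):
    # Oxygen content in mL O2 per litre of blood: 1.36 mL O2 binds per gram
    # of haemoglobin; the factor 10 converts g/dL to g/L. Dissolved O2 is
    # neglected in this sketch.
    return 1.36 * hb_g_dl * sat * 10.0

def fick_flow(vo2_ml_min, content_out, content_in):
    # Indirect Fick principle: flow (L/min) = VO2 / content difference.
    return vo2_ml_min / (content_out - content_in)

hb = 12.0                      # g/dL, hypothetical
sat_art, sat_ven = 0.98, 0.65  # arterial and mixed-venous saturations, hypothetical

# Replicate estimates from several predicted-VO2 models, as in the matrix
vo2_predictions = [110.0, 125.0, 140.0, 155.0, 170.0]  # mL/min, hypothetical
flows = [fick_flow(v, o2_content(hb, sat_art), o2_content(hb, sat_ven))
         for v in vo2_predictions]
low, high = min(flows), max(flows)   # likely range for systemic blood flow
```

Spanning the prediction models in this way yields the upper and lower limits of the likely range for the parameter, rather than a single point estimate.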

  14. Moving toward integrated value-based planning: The issues

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chamberlin, J.H.; Braithwait, C.L.

    1988-07-01

    Integrated Value-Based Planning (IVP) is a promising new planning approach that uses value, as well as cost, as the common denominator for evaluating supply and demand resource options. Planning based on value yields an "apples to apples" comparison of utility and customer options. The IVP approach can form the cornerstone of a successful market-driven utility planning strategy. This conference will raise questions, discuss issues, and further the exchange of information regarding the tools, concepts, and techniques needed to put IVP into the utility planner's toolbox. This proceedings is more than a compendium of papers. It is designed to let both participants and non-participants exchange information. To this end, listings and cross-listings of papers, speakers, and participant interest areas, along with the ever-invaluable phone numbers, have been included.

  15. Nonmarket economic user values of the Florida Keys/Key West

    Treesearch

    Vernon R. Leeworthy; J. Michael Bowker

    1997-01-01

    This report provides estimates of the nonmarket economic user values for recreating visitors to the Florida Keys/Key West that participated in natural resource-based activities. Results from estimated travel cost models are presented, including visitors’ responses to prices and estimated per-person-trip user values. Annual user values are also calculated and presented...

  16. On the accuracy and reproducibility of a novel probabilistic atlas-based generation for calculation of head attenuation maps on integrated PET/MR scanners.

    PubMed

    Chen, Kevin T; Izquierdo-Garcia, David; Poynton, Clare B; Chonde, Daniel B; Catana, Ciprian

    2017-03-01

    To propose an MR-based method for generating continuous-valued head attenuation maps and to assess its accuracy and reproducibility. Demonstrating that novel MR-based photon attenuation correction methods are both accurate and reproducible is essential prior to using them routinely in research and clinical studies on integrated PET/MR scanners. Continuous-valued linear attenuation coefficient maps ("μ-maps") were generated by combining atlases that provided the prior probability of voxel positions belonging to a certain tissue class (air, soft tissue, or bone) and an MR intensity-based likelihood classifier to produce posterior probability maps of tissue classes. These probabilities were used as weights to generate the μ-maps. The accuracy of this probabilistic atlas-based continuous-valued μ-map ("PAC-map") generation method was assessed by calculating the voxel-wise absolute relative change (RC) between the MR-based and scaled CT-based attenuation-corrected PET images. To assess reproducibility, we performed pair-wise comparisons of the RC values obtained from the PET images reconstructed using the μ-maps generated from the data acquired at three time points. The proposed method produced continuous-valued μ-maps that qualitatively reflected the variable anatomy in patients with brain tumor and agreed well with the scaled CT-based μ-maps. The absolute RC comparing the resulting PET volumes was 1.76 ± 2.33 %, quantitatively demonstrating that the method is accurate. Additionally, we also showed that the method is highly reproducible, the mean RC value for the PET images reconstructed using the μ-maps obtained at the three visits being 0.65 ± 0.95 %. Accurate and highly reproducible continuous-valued head μ-maps can be generated from MR data using a probabilistic atlas-based approach.
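
The weighting scheme the abstract describes can be sketched in a few lines: atlas priors and MR-intensity likelihoods are multiplied and normalized per voxel, and the resulting posteriors weight per-class attenuation coefficients. The class LAC values below are approximate literature-style figures, not the paper's, and the voxel probabilities are hypothetical:

```python
def posterior_weights(priors, likelihoods):
    # Per-voxel posterior over tissue classes: atlas prior times MR-intensity
    # likelihood, normalized to sum to one.
    unnorm = [p * l for p, l in zip(priors, likelihoods)]
    z = sum(unnorm)
    return [u / z for u in unnorm]

# Approximate 511 keV linear attenuation coefficients (cm^-1) for the classes
# (air, soft tissue, bone) -- illustrative values only
mu_class = [0.0, 0.096, 0.143]

# Hypothetical voxel: atlas priors and intensity likelihoods per class
w = posterior_weights([0.10, 0.70, 0.20], [0.05, 0.80, 0.15])
mu_voxel = sum(wi * m for wi, m in zip(w, mu_class))  # continuous-valued LAC
```

Using the posteriors as weights rather than a hard classification is what makes the resulting μ-map continuous-valued.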

  17. Bayesian model aggregation for ensemble-based estimates of protein pKa values

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gosink, Luke J.; Hogan, Emilie A.; Pulsipher, Trenton C.

    2014-03-01

    This paper investigates an ensemble-based technique called Bayesian Model Averaging (BMA) to improve the performance of protein amino acid pKa predictions. Structure-based pKa calculations play an important role in the mechanistic interpretation of protein structure and are also used to determine a wide range of protein properties. A diverse set of methods currently exists for pKa prediction, ranging from empirical statistical models to ab initio quantum mechanical approaches. However, each of these methods is based on a set of assumptions that have inherent bias and sensitivities that can affect a model's accuracy and generalizability for pKa prediction in complicated biomolecular systems. We use BMA to combine eleven diverse prediction methods that each estimate pKa values of amino acids in staphylococcal nuclease. These methods are based on work conducted for the pKa Cooperative, and the pKa measurements are based on experimental work conducted by the García-Moreno lab. Our study demonstrates that the aggregated estimate obtained from BMA outperforms all individual prediction methods in our cross-validation study, with improvements of 40-70% over other method classes. This work illustrates a new possible mechanism for improving the accuracy of pKa prediction and lays the foundation for future work on aggregate models that balance computational cost with prediction accuracy.
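
A toy illustration of the aggregation step: member predictions are combined as a weighted average, with weights reflecting each model's track record. The real BMA weights are fit by a proper statistical procedure; the exponential residual scoring below is a simplified stand-in, and all numbers are hypothetical:

```python
import math

def bma_weights(residuals_per_model):
    # Toy posterior weights: models with smaller training residuals receive
    # exponentially larger weight (a simplified stand-in for the fitted
    # Bayesian Model Averaging weights).
    scores = [math.exp(-0.5 * sum(r * r for r in res)) for res in residuals_per_model]
    total = sum(scores)
    return [s / total for s in scores]

def bma_predict(weights, predictions):
    # Aggregated point estimate: weighted average of member predictions.
    return sum(w * p for w, p in zip(weights, predictions))

# Hypothetical pKa residuals of three predictors on training residues
train_residuals = [[0.2, -0.1, 0.3], [0.8, 0.7, -0.9], [0.4, -0.5, 0.2]]
w = bma_weights(train_residuals)
ensemble_pka = bma_predict(w, [4.6, 5.4, 4.9])  # aggregate for a new residue
```

Because the weights form a convex combination, the ensemble estimate always lies within the span of the member predictions while down-weighting historically poor models.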

  18. A value-based taxonomy of improvement approaches in healthcare.

    PubMed

    Colldén, Christian; Gremyr, Ida; Hellström, Andreas; Sporraeus, Daniella

    2017-06-19

    Purpose: The concept of value is becoming increasingly fashionable in healthcare and various improvement approaches (IAs) have been introduced with the aim of increasing value. The purpose of this paper is to construct a taxonomy that supports the management of parallel IAs in healthcare. Design/methodology/approach: Based on previous research, this paper proposes a taxonomy that includes the dimensions of view on value and organizational focus; three contemporary IAs - lean, value-based healthcare, and patient-centered care - are related to the taxonomy. An illustrative qualitative case study in the context of psychiatric (psychosis) care is then presented that contains data from 23 interviews and focuses on the value concept, IAs, and the proposed taxonomy. Findings: Respondents recognized the dimensions of the proposed taxonomy and indicated its usefulness as support for choosing and combining different IAs into a coherent management model, and for facilitating dialog about IAs. The findings also suggested that the view of value as "health outcomes" is widespread, but healthcare professionals are less likely than managers to also view value as a process. Originality/value: The conceptual contribution of this paper is to delineate some important characteristics of IAs in relation to the emerging "value era". It also highlights the coexistence of different IAs in healthcare management practice. A taxonomy is proposed that can help managers choose, adapt, and combine IAs in local management models.

  19. Validation of light water reactor calculation methods and JEF-1-based data libraries by TRX and BAPL critical experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paratte, J.M.; Pelloni, S.; Grimm, P.

    1991-04-01

    This paper analyzes the capability of various code systems and JEF-1-based nuclear data libraries to compute light water reactor lattices by comparing calculations with results from thermal reactor benchmark experiments TRX and BAPL and with previously published values. With the JEF-1 evaluation, eigenvalues are generally well predicted within 8 mk (1 mk = 0.001) or less by all code systems, and all methods give reasonable results for the measured reaction rate ratios within, or not too far from, the experimental uncertainty.

  20. Ontology-Based Exchange and Immediate Application of Business Calculation Definitions for Online Analytical Processing

    NASA Astrophysics Data System (ADS)

    Kehlenbeck, Matthias; Breitner, Michael H.

    Business users define calculated facts based on the dimensions and facts contained in a data warehouse. These business calculation definitions contain necessary knowledge regarding quantitative relations for deep analyses and for the production of meaningful reports. The business calculation definitions are largely independent of implementation and organization, but no automated procedures exist to facilitate their exchange across organization and implementation boundaries, so each organization currently has to map its own business calculations to analysis and reporting tools separately. This paper presents an innovative approach based on standard Semantic Web technologies. This approach facilitates the exchange of business calculation definitions and allows for their automatic linking to specific data warehouses through semantic reasoning. A novel standard proxy server which enables the immediate application of exchanged definitions is introduced. Benefits of the approach are shown in a comprehensive case study.

  1. Confidence assignment for mass spectrometry based peptide identifications via the extreme value distribution.

    PubMed

    Alves, Gelio; Yu, Yi-Kuo

    2016-09-01

    There is a growing trend for biomedical researchers to extract evidence and draw conclusions from mass spectrometry based proteomics experiments, the cornerstone of which is peptide identification. Inaccurate assignments of peptide identification confidence thus may have far-reaching and adverse consequences. Although some peptide identification methods report accurate statistics, they have been limited to certain types of scoring function. The extreme value statistics based method, while more general in the scoring functions it allows, demands accurate parameter estimates and requires, at least in its original design, excessive computational resources. Improving the parameter estimate accuracy and reducing the computational cost for this method has two advantages: it provides another feasible route to accurate significance assessment, and it could provide reliable statistics for scoring functions yet to be developed. We have formulated and implemented an efficient algorithm for calculating the extreme value statistics for peptide identification applicable to various scoring functions, bypassing the need for searching large random databases. The source code, implemented in C++ on a Linux system, is available for download at ftp://ftp.ncbi.nlm.nih.gov/pub/qmbp/qmbp_ms/RAId/RAId_Linux_64Bit. Contact: yyu@ncbi.nlm.nih.gov. Supplementary data are available at Bioinformatics online. Published by Oxford University Press 2016. This work is written by US Government employees and is in the public domain in the US.
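
The extreme value statistics in question take the Gumbel form for score maxima. A minimal sketch (not the RAId implementation) of fitting the Gumbel location and scale by the method of moments and converting a peptide score into a tail p-value, using hypothetical null scores:

```python
import math, random

EULER_GAMMA = 0.5772156649015329

def fit_gumbel(null_scores):
    # Method-of-moments fit of the Gumbel location (mu) and scale (beta):
    # beta = s * sqrt(6) / pi, mu = mean - gamma * beta.
    n = len(null_scores)
    mean = sum(null_scores) / n
    var = sum((x - mean) ** 2 for x in null_scores) / (n - 1)
    beta = math.sqrt(6.0 * var) / math.pi
    return mean - EULER_GAMMA * beta, beta

def gumbel_pvalue(score, mu, beta):
    # Tail probability P(S >= score) = 1 - exp(-exp(-(score - mu) / beta)).
    return 1.0 - math.exp(-math.exp(-(score - mu) / beta))

random.seed(0)
# Hypothetical null scores: best of 50 random peptide-spectrum matches each
null = [max(random.gauss(10, 2) for _ in range(50)) for _ in range(1000)]
mu, beta = fit_gumbel(null)
p = gumbel_pvalue(18.0, mu, beta)   # small p supports the identification
```

In practice the efficiency gain comes from estimating (mu, beta) without exhaustively searching large random databases, which is the contribution the abstract describes.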

  2. 42 CFR 403.253 - Calculation of benefits.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... calculated on a net level reserve basis, using appropriate values to account for lapse, mortality, morbidity, and interest, that on the valuation date represents— (A) The present value of expected incurred benefits over the loss ratio calculation period; less— (B) The present value of expected net premiums over...

  3. 42 CFR 403.253 - Calculation of benefits.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... calculated on a net level reserve basis, using appropriate values to account for lapse, mortality, morbidity, and interest, that on the valuation date represents— (A) The present value of expected incurred benefits over the loss ratio calculation period; less— (B) The present value of expected net premiums over...

  4. 42 CFR 403.253 - Calculation of benefits.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... calculated on a net level reserve basis, using appropriate values to account for lapse, mortality, morbidity, and interest, that on the valuation date represents— (A) The present value of expected incurred benefits over the loss ratio calculation period; less— (B) The present value of expected net premiums over...

  5. 42 CFR 403.253 - Calculation of benefits.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... calculated on a net level reserve basis, using appropriate values to account for lapse, mortality, morbidity, and interest, that on the valuation date represents— (A) The present value of expected incurred benefits over the loss ratio calculation period; less— (B) The present value of expected net premiums over...

  6. Calculation of exchange coupling constants in triply-bridged dinuclear Cu(II) compounds based on spin-flip constricted variational density functional theory.

    PubMed

    Seidu, Issaka; Zhekova, Hristina R; Seth, Michael; Ziegler, Tom

    2012-03-08

    The performance of the second-order spin-flip constricted variational density functional theory (SF-CV(2)-DFT) for the calculation of the exchange coupling constant (J) is assessed by application to a series of triply bridged Cu(II) dinuclear complexes. A comparison of the J values based on SF-CV(2)-DFT with those obtained by the broken symmetry (BS) DFT method and experiment is provided. It is demonstrated that our methodology constitutes a viable alternative to the BS-DFT method. The strong dependence of the calculated exchange coupling constants on the applied functionals is demonstrated. Both SF-CV(2)-DFT and BS-DFT afford the best agreement with experiment for hybrid functionals.

  7. 31 CFR 351.32 - How are redemption values calculated for Series EE bonds with issue dates of May 1, 1997, through...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 31 Money and Finance: Treasury 2 2010-07-01 2010-07-01 false How are redemption values calculated... Money and Finance: Treasury Regulations Relating to Money and Finance (Continued) FISCAL SERVICE, DEPARTMENT OF THE TREASURY BUREAU OF THE PUBLIC DEBT OFFERING OF UNITED STATES SAVINGS BONDS, SERIES EE...

  8. Value-based purchasing of medical devices.

    PubMed

    Obremskey, William T; Dail, Teresa; Jahangir, A Alex

    2012-04-01

    Health care in the United States is known for its continued innovation and production of new devices and techniques. While the intention of these devices is to improve the delivery and outcome of patient care, they do not always achieve this goal. As new technologies enter the market, hospitals and physicians must determine which of these new devices to incorporate into practice, and it is important that these devices bring value to patient care. We provide a model of a physician-engaged process to decrease cost and increase review of physician preference items. We describe the challenges, implementation, and outcomes of cost reduction and product stabilization of a value-based process for purchasing medical devices at a major academic medical center. We implemented a physician-driven committee that standardized and utilized evidence-based, clinically sound, and financially responsible methods for introducing or consolidating new supplies, devices, and technology for patient care. This committee worked with institutional finance and administrative leaders to accomplish its goals. Utilizing this physician-driven committee, we provided access to new products, standardized some products, decreased costs of physician preference items by 11% to 26% across service lines, and achieved savings of greater than $8 million per year. The implementation of a facility-based technology assessment committee that critically evaluates new technology can decrease hospital costs on implants and standardize some product lines.

  9. ARS-Media for Excel: A Spreadsheet Tool for Calculating Media Recipes Based on Ion-Specific Constraints

    PubMed Central

    Niedz, Randall P.

    2016-01-01

    ARS-Media for Excel is an ion solution calculator that uses “Microsoft Excel” to generate recipes of salts for complex ion mixtures specified by the user. Generating salt combinations (recipes) that result in pre-specified target ion values is a linear programming problem. Excel’s Solver add-on solves the linear programming equation to generate a recipe. Calculating a mixture of salts to generate exact solutions of complex ionic mixtures is required for at least 2 types of problems– 1) formulating relevant ecological/biological ionic solutions such as those from a specific lake, soil, cell, tissue, or organ and, 2) designing ion confounding-free experiments to determine ion-specific effects where ions are treated as statistical factors. Using ARS-Media for Excel to solve these two problems is illustrated by 1) exactly reconstructing a soil solution representative of a loamy agricultural soil and, 2) constructing an ion-based experiment to determine the effects of substituting Na+ for K+ on the growth of a Valencia sweet orange nonembryogenic cell line. PMID:27812202

  10. ARS-Media for Excel: A Spreadsheet Tool for Calculating Media Recipes Based on Ion-Specific Constraints.

    PubMed

    Niedz, Randall P

    2016-01-01

    ARS-Media for Excel is an ion solution calculator that uses "Microsoft Excel" to generate recipes of salts for complex ion mixtures specified by the user. Generating salt combinations (recipes) that result in pre-specified target ion values is a linear programming problem. Excel's Solver add-on solves the linear programming equation to generate a recipe. Calculating a mixture of salts to generate exact solutions of complex ionic mixtures is required for at least 2 types of problems- 1) formulating relevant ecological/biological ionic solutions such as those from a specific lake, soil, cell, tissue, or organ and, 2) designing ion confounding-free experiments to determine ion-specific effects where ions are treated as statistical factors. Using ARS-Media for Excel to solve these two problems is illustrated by 1) exactly reconstructing a soil solution representative of a loamy agricultural soil and, 2) constructing an ion-based experiment to determine the effects of substituting Na+ for K+ on the growth of a Valencia sweet orange nonembryogenic cell line.
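
When the number of candidate salts equals the number of target ions, the recipe problem the two records above describe reduces to a square linear system rather than a full linear program. A toy sketch with two hypothetical salts (ARS-Media itself relies on Excel's Solver add-on):

```python
# Each salt contributes a fixed stoichiometric ratio of ions; hitting target
# ion concentrations means solving A x = b for nonnegative salt amounts x.
# Toy case: KCl and KNO3 jointly supplying K+ and NO3- targets.

def solve_2x2(a, b):
    # Cramer's rule for a 2x2 linear system.
    det = a[0][0] * a[1][1] - a[0][1] * a[1][0]
    x0 = (b[0] * a[1][1] - a[0][1] * b[1]) / det
    x1 = (a[0][0] * b[1] - b[0] * a[1][0]) / det
    return x0, x1

# Rows: ions (K+, NO3-); columns: salts (KCl, KNO3);
# entries: mol of ion released per mol of salt
A = [[1.0, 1.0],
     [0.0, 1.0]]
targets = [20.0, 5.0]              # mmol/L of K+ and NO3-, hypothetical
kcl, kno3 = solve_2x2(A, targets)  # mmol of each salt per litre
```

With more salts than ions the system is underdetermined, which is where linear programming earns its keep: an objective (e.g., minimizing total salt mass) selects one recipe among the feasible ones.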

  11. The value of innovation under value-based pricing.

    PubMed

    Moreno, Santiago G; Ray, Joshua A

    2016-01-01

    The role of cost-effectiveness analysis (CEA) in incentivizing innovation is controversial. Critics of CEA argue that its use for pricing purposes disregards the 'value of innovation' reflected in new drug development, whereas supporters of CEA highlight that the value of innovation is already accounted for. Our objective in this article is to outline the limitations of the conventional CEA approach, while proposing an alternative method of evaluation that captures the value of innovation more accurately. The adoption of a new drug benefits present and future patients (with cost implications) for as long as the drug is part of clinical practice. Incident patients and off-patent prices are identified as two key missing features preventing the conventional CEA approach from capturing 1) benefit to future patients and 2) future savings from off-patent prices. The proposed CEA approach incorporates these two features to derive the total lifetime value of an innovative drug (i.e., the value of innovation). The conventional CEA approach tends to underestimate the value of innovative drugs by disregarding the benefit to future patients and savings from off-patent prices. As a result, innovative drugs are underpriced, only allowing manufacturers to capture approximately 15% of the total value of innovation during the patent protection period. In addition to including the incidence population and off-patent price, the alternative approach proposes pricing new drugs by first negotiating the share of value of innovation to be appropriated by the manufacturer (>15%?) and payer (<85%?), in order to then identify the drug price that satisfies this condition. We argue for a modification to the conventional CEA approach that integrates the total lifetime value of innovative drugs into CEA, by taking into account off-patent pricing and future patients. The proposed approach derives a price that allows manufacturers to capture an agreed share of this value, thereby incentivizing innovation.
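
The argument can be made concrete with a discounted cash-flow toy: the manufacturer's captured share is the net present value of patent-window margins divided by the net present value of lifetime benefits to incident patients. All numbers below are hypothetical illustrations, not the paper's calibration:

```python
def npv(cashflows, rate):
    # Net present value of a stream of yearly amounts (year 0 undiscounted).
    return sum(c / (1 + rate) ** t for t, c in enumerate(cashflows))

years, patent_years, rate = 30, 10, 0.035        # all hypothetical
benefit = [100.0] * years                        # yearly value to incident patients
margin = [60.0] * patent_years + [0.0] * (years - patent_years)

share = npv(margin, rate) / npv(benefit, rate)   # manufacturer's captured share
```

Extending the benefit stream past patent expiry (while margins stop) is exactly why the conventional approach, which ignores future incident patients and off-patent savings, understates total lifetime value.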

  12. A values-based approach to medical leadership.

    PubMed

    Moen, Charlotte; Prescott, Patricia

    2016-11-02

    Integrity, trust and authenticity are essential characteristics of an effective leader, demonstrated through a values-based approach to leadership. This article explores whether Covey's (1989) principle-centred leadership model is a useful approach to developing doctors' leadership qualities and skills.

  13. ARS-Media: A spreadsheet tool for calculating media recipes based on ion-specific constraints

    USDA-ARS?s Scientific Manuscript database

    ARS-Media is an ion solution calculator that uses Microsoft Excel to generate recipes of salts for complex ion mixtures specified by the user. Generating salt combinations (recipes) that result in pre-specified target ion values is a linear programming problem. Thus, the recipes are generated using ...

  14. Age-related Associative Memory Deficits in Value-based Remembering: The Contribution of Agenda-based Regulation and Strategy Use

    PubMed Central

    Ariel, Robert; Price, Jodi; Hertzog, Christopher

    2015-01-01

    Value-based remembering in free recall tasks may be spared from the typical age-related cognitive decline observed for episodic memory. However, it is unclear whether value-based remembering for associative information is also spared from age-related cognitive decline. The current experiments evaluated the contribution of agenda-based regulation and strategy use during study to age differences and similarities in value-based remembering of associative information. Participants studied word pairs (Experiments 1-2) or single words (Experiment 2) assigned different point values by moving a mouse-controlled cursor to different spatial locations to reveal either items for study or the point value associated with remembering each item. Some participants also provided strategy reports for each item. Younger and older adults allocated greater time to studying high than low valued information, reported using normatively effective encoding strategies to learn high-valued pairs, and avoided study of low-valued pairs. As a consequence, both age groups selectively remembered more high than low-valued items. Despite nearly identical regulatory behavior, an associative memory deficit for older adults was present for high valued pairs. Age differences in value-based remembering did not occur when the materials were word lists. Fluid intelligence also moderated the effectiveness of older adults’ strategy use for high valued pairs (Experiment 2). These results suggest that age differences in associative value-based remembering may be due to some older adults’ gleaning less benefit from using normatively effective encoding strategies rather than age differences in metacognitive self-regulation per se. PMID:26523692

  15. Calculating pKa values for substituted phenols and hydration energies for other compounds with the first-order Fuzzy-Border continuum solvation model

    PubMed Central

    Sharma, Ity; Kaminski, George A.

    2012-01-01

    We have computed pKa values for eleven substituted phenol compounds using the continuum Fuzzy-Border (FB) solvation model. Hydration energies for 40 other compounds, including alkanes, alkenes, alkynes, ketones, amines, alcohols, ethers, aromatics, amides, heterocycles, thiols, sulfides and acids have been calculated. The overall average unsigned error in the calculated acidity constant values was equal to 0.41 pH units and the average error in the solvation energies was 0.076 kcal/mol. We have also reproduced pKa values of propanoic and butanoic acids within ca. 0.1 pH units from the experimental values by fitting the solvation parameters for carboxylate ion carbon and oxygen atoms. The FB model combines two distinguishing features. First, it limits the amount of noise which is common in numerical treatment of continuum solvation models by using fixed-position grid points. Second, it employs either second- or first-order approximation for the solvent polarization, depending on a particular implementation. These approximations are similar to those used for solute and explicit solvent fast polarization treatment which we developed previously. This article describes results of employing the first-order technique. This approximation places the presented methodology between the Generalized Born and Poisson-Boltzmann continuum solvation models with respect to their accuracy of reproducing the many-body effects in modeling a continuum solvent. PMID:22815192
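
For context, pKa errors and solvation free-energy errors are linked by the standard thermodynamic relation pKa = ΔG / (RT ln 10), about 1.364 kcal/mol per pKa unit at 298 K. A small sketch of the conversion (illustrative arithmetic, not part of the FB model itself):

```python
import math

R_KCAL = 1.987204e-3          # gas constant, kcal/(mol*K)
T = 298.15                    # standard temperature, K

def pka_from_dg(dg_kcal):
    # pKa contribution of a deprotonation free energy:
    # pKa = dG / (RT ln 10), i.e. ~1.364 kcal/mol per pKa unit at 298 K.
    return dg_kcal / (R_KCAL * T * math.log(10))

def dg_from_pka(pka):
    # Inverse relation: free-energy equivalent of a pKa error.
    return pka * R_KCAL * T * math.log(10)

# The reported mean pKa error of 0.41 units corresponds to roughly
# half a kcal/mol in free energy
dg_equiv = dg_from_pka(0.41)
```

This conversion is why sub-kcal/mol solvation accuracy is needed before acidity constants can be predicted to within a few tenths of a pH unit.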

  16. OrthoANI: An improved algorithm and software for calculating average nucleotide identity.

    PubMed

    Lee, Imchang; Ouk Kim, Yeong; Park, Sang-Cheol; Chun, Jongsik

    2016-02-01

    Species demarcation in Bacteria and Archaea is mainly based on overall genome relatedness, which serves as a framework for modern microbiology. Current practice for obtaining these measures between two strains is shifting from experimentally determined similarity obtained by DNA-DNA hybridization (DDH) to genome-sequence-based similarity. Average nucleotide identity (ANI) is a simple algorithm that mimics DDH. Like DDH, ANI values between two genome sequences may be different from each other when reciprocal calculations are compared. We compared 63 690 pairs of genome sequences and found that the differences in reciprocal ANI values are significantly high, exceeding 1 % in some cases. To resolve this asymmetry, a new algorithm, named OrthoANI, was developed to accommodate the concept of orthology, in which both genome sequences are fragmented and only orthologous fragment pairs are taken into consideration for calculating nucleotide identities. OrthoANI is highly correlated with ANI (using BLASTn) and the former showed approximately 0.1 % higher values than the latter. In conclusion, OrthoANI provides a more robust and faster means of calculating average nucleotide identity for taxonomic purposes. The standalone software tools are freely available at http://www.ezbiocloud.net/sw/oat.
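
The orthology-aware averaging can be illustrated with a toy position-matching stand-in for BLASTn (not the OrthoANI implementation): identities are computed only over reciprocal-best fragment pairs, which removes the query/subject asymmetry of plain ANI:

```python
def fragments(seq, size):
    # Cut a sequence into non-overlapping fragments, dropping the remainder.
    return [seq[i:i + size] for i in range(0, len(seq) - size + 1, size)]

def identity(a, b):
    # Fraction of matching positions between two equal-length fragments
    # (a crude stand-in for the BLASTn alignment identity OrthoANI uses).
    return sum(x == y for x, y in zip(a, b)) / len(a)

def ortho_ani_toy(g1, g2, size=8):
    # Average identity over reciprocal-best ("orthologous") fragment pairs.
    f1, f2 = fragments(g1, size), fragments(g2, size)
    best12 = {i: max(range(len(f2)), key=lambda j: identity(f1[i], f2[j]))
              for i in range(len(f1))}
    best21 = {j: max(range(len(f1)), key=lambda i: identity(f1[i], f2[j]))
              for j in range(len(f2))}
    pairs = [(i, j) for i, j in best12.items() if best21.get(j) == i]
    return sum(identity(f1[i], f2[j]) for i, j in pairs) / len(pairs)

g1 = "ACGTACGTTTGGCCAAACGTACGT"   # toy "genomes", hypothetical
g2 = "ACGAACGTTTGGCCTAACGTACGA"
ani = ortho_ani_toy(g1, g2)       # same value whichever genome is the query
```

Restricting the average to reciprocal best hits is the key move: one-directional best hits can differ between the two search directions, which is exactly what makes plain ANI asymmetric.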

  17. Communication: importance sampling including path correlation in semiclassical initial value representation calculations for time correlation functions.

    PubMed

    Pan, Feng; Tao, Guohua

    2013-03-07

    Full semiclassical (SC) initial value representation (IVR) for time correlation functions involves a double phase space average over a set of two phase points, each of which evolves along a classical path. Conventionally, the two initial phase points are sampled independently for all degrees of freedom (DOF) in the Monte Carlo procedure. Here, we present an efficient importance sampling scheme by including the path correlation between the two initial phase points for the bath DOF, which greatly improves the performance of the SC-IVR calculations for large molecular systems. Satisfactory convergence in the study of quantum coherence in vibrational relaxation has been achieved for a benchmark system-bath model with up to 21 DOF.

  18. Development of an evidence-based approach to external quality assurance for breast cancer hormone receptor immunohistochemistry: comparison of reference values.

    PubMed

    Makretsov, Nikita; Gilks, C Blake; Alaghehbandan, Reza; Garratt, John; Quenneville, Louise; Mercer, Joel; Palavdzic, Dragana; Torlakovic, Emina E

    2011-07-01

External quality assurance and proficiency testing programs for breast cancer predictive biomarkers are based largely on traditional ad hoc design; at present there is no universal consensus on the definition of a standard reference value for samples used in external quality assurance programs. To explore reference values for estrogen receptor and progesterone receptor immunohistochemistry in order to develop an evidence-based analytic platform for external quality assurance. There were 31 participating laboratories, 4 of which were previously designated as "expert" laboratories. Each participant tested a tissue microarray slide with 44 breast carcinomas for estrogen receptor and progesterone receptor and submitted it to the Canadian Immunohistochemistry Quality Control Program for analysis. Nuclear staining in 1% or more of the tumor cells was scored as positive. Five methods for determining reference values were compared. All reference values showed 100% agreement for estrogen receptor and progesterone receptor scores when indeterminate results were excluded. Individual laboratory performance (agreement rates, test sensitivity, test specificity, positive predictive value, negative predictive value, and κ value) was very similar for all reference values. Identification of suboptimal performance by all methods was identical for 30 of 31 laboratories. Estrogen receptor assessment by 1 laboratory was discordant: agreement was less than 90% for 3 of 5 reference values and greater than 90% with the 2 other reference values. The various reference values provide equivalent laboratory ratings. In addition to descriptive feedback, our approach allows calculation of technical test sensitivity and specificity, positive and negative predictive values, agreement rates, and κ values to guide corrective actions.

  19. 31 CFR 351.28 - How are redemption values calculated for bonds with issue dates from May 1, 1995, through April 1...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 31 Money and Finance: Treasury 2 2010-07-01 2010-07-01 false How are redemption values calculated for bonds with issue dates from May 1, 1995, through April 1, 1997? 351.28 Section 351.28 Money and Finance: Treasury Regulations Relating to Money and Finance (Continued) FISCAL SERVICE, DEPARTMENT OF THE...

  20. Metric Scale Calculation for Visual Mapping Algorithms

    NASA Astrophysics Data System (ADS)

    Hanel, A.; Mitschke, A.; Boerner, R.; Van Opdenbosch, D.; Hoegner, L.; Brodie, D.; Stilla, U.

    2018-05-01

Visual SLAM algorithms allow localizing the camera by mapping its environment as a point cloud based on visual cues. To obtain the camera locations in a metric coordinate system, the metric scale of the point cloud has to be known. This contribution describes a method to calculate the metric scale for a point cloud of an indoor environment, like a parking garage, by fusing multiple individual scale values. The individual scale values are calculated from structures and objects with a priori known metric extension, which can be identified in the unscaled point cloud. Extensions of building structures, like the driving lane or the room height, are derived from density peaks in the point distribution. The extensions of objects, like traffic signs with a known metric size, are derived using projections of their detections in images onto the point cloud. The method is tested with synthetic image sequences of a drive with a front-looking mono camera through a virtual 3D model of a parking garage. It has been shown that each individual scale value either improves the robustness of the fused scale value or reduces its error. The error of the fused scale is comparable to other recent works.
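Fusing several individual scale estimates into one value can be sketched with inverse-variance weighting, a common fusion rule (the paper's exact weighting scheme may differ). It also shows why adding a measurement never increases the fused variance:

```python
def fuse_scales(scales, variances):
    """Inverse-variance weighted fusion of individual metric-scale estimates.

    Each estimate is weighted by 1/variance; the fused variance
    1/sum(weights) shrinks with every additional estimate.
    """
    weights = [1.0 / v for v in variances]
    fused = sum(w * s for w, s in zip(weights, scales)) / sum(weights)
    fused_var = 1.0 / sum(weights)
    return fused, fused_var

# Two hypothetical scale estimates: one from lane width, one from a sign size
fused, var = fuse_scales([1.00, 1.20], [0.01, 0.04])
print(fused, var)  # fused lies between the inputs; var is below both
```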

  1. Investigation of the acid-base and electromigration properties of 5-azacytosine derivatives using capillary electrophoresis and density functional theory calculations.

    PubMed

    Geffertová, Denisa; Ali, Syed Tahir; Šolínová, Veronika; Krečmerová, Marcela; Holý, Antonín; Havlas, Zdeněk; Kašička, Václav

    2017-01-06

Capillary electrophoresis (CE) and quantum mechanical density functional theory (DFT) were applied to the investigation of the acid-base and electromigration properties of newly synthesized derivatives of 5-azacytosine, analogs of the efficient antiviral drug cidofovir. These compounds exhibit strong antiviral activity and are considered potential new antiviral agents. For their characterization and application, it is necessary to know their acid-base properties, particularly the acidity constants (pKa) of their ionogenic groups (the basic N3 atom of the triazine ring and the acidic phosphonic acid group in the alkyl chain). First, the mixed acidity constants (pKa,mix) of these ionogenic groups and the ionic mobilities of these compounds were determined by nonlinear regression analysis of the pH dependence of their effective electrophoretic mobilities. Effective mobilities were measured by CE in a series of background electrolytes in a wide pH range (2.0-10.5), at constant ionic strength (25 mM) and constant temperature (25 °C). Subsequently, the pKa,mix values were recalculated to thermodynamic pKa values using the Debye-Hückel theory. The thermodynamic pKa value of the NH+ moiety at the N3 atom of the triazine ring was found to be in the range 2.82-3.30, whereas the pKa of the hydrogenphosphonate group reached values from 7.19 to 7.47, depending on the structure of the analyzed compounds. These experimentally determined pKa values were in good agreement with those calculated by quantum mechanical DFT. In addition, DFT calculations revealed that, of the four nitrogen atoms in the 5-azacytosine moiety, the N3 atom of the triazine ring is preferentially protonated. Effective charges of the analyzed compounds ranged from zero or close-to-zero values at pH 2 to -2 elementary charges at pH ≥ 9. Ionic mobilities were in the range (-16.7 to -19.1)×10⁻⁹ m² V⁻¹ s⁻¹ for univalent anions and in the interval (-26.9 to -30.3)×10⁻⁹ m
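The recalculation of a mixed acidity constant to a thermodynamic one can be sketched with the extended Debye-Hückel law. The charge assignments, the example pKa,mix, and the Debye-Hückel constant A = 0.509 (water, 25 °C) below are standard textbook values used for illustration; the authors' exact correction may differ in detail:

```python
import math

def log10_gamma(z, ionic_strength, A=0.509):
    """Extended Debye-Hueckel estimate of log10 of the activity coefficient
    for an ion of charge z in water at 25 degrees C."""
    s = math.sqrt(ionic_strength)
    return -A * z**2 * s / (1 + s)

def thermodynamic_pka(pka_mix, z_acid, z_base, ionic_strength=0.025):
    """pKa = pKa_mix - log10(gamma_base / gamma_acid).

    A mixed constant already uses the hydrogen-ion activity, so only the
    activity coefficients of the conjugate acid/base pair enter the correction.
    """
    return pka_mix - (log10_gamma(z_base, ionic_strength)
                      - log10_gamma(z_acid, ionic_strength))

# Hydrogenphosphonate group, HA(-) <-> A(2-) + H(+), at I = 25 mM:
print(round(thermodynamic_pka(7.30, z_acid=-1, z_base=-2), 2))  # -> 7.51
```

As expected, the thermodynamic value lies slightly above the mixed one at this ionic strength, consistent in direction with the reported 7.19-7.47 range.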

  2. Implementation of Online Promethee Method for Poor Family Change Rate Calculation

    NASA Astrophysics Data System (ADS)

    Aji, Dhady Lukito; Suryono; Widodo, Catur Edi

    2018-02-01

In this research, an online calculation of the poor family change rate has been implemented using the Preference Ranking Organization Method for Enrichment Evaluation (PROMETHEE). The system is very useful for monitoring poverty in a region as well as for administrative services related to the poverty rate. The system consists of client computers and servers connected via the internet. Poor family residence data were obtained from the government. In addition, survey data are entered through the client computer in each administrative village, using the 23 input criteria established by the government. The PROMETHEE method is used to evaluate the poverty value, and its weight is used to determine poverty status. PROMETHEE output can also be used to rank the poverty of the registered population on the server based on the net flow value. The poverty change rate is calculated by comparing the current poverty rate with the previous poverty rate. The results can be viewed online and in real time on the server through numbers and graphs. The test results show that the system can classify poverty status, calculate the poverty change rate, and determine the poverty value and ranking of each resident.
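The net-flow ranking step can be sketched as follows. This is a minimal PROMETHEE II illustration with the "usual" (step) preference function and made-up criterion values and weights; the actual system uses the 23 government-defined criteria and its own preference functions:

```python
def net_flows(table, weights):
    """table[i] = criterion values of alternative i (higher is better here);
    weights = criterion weights. Returns the PROMETHEE II net flow phi."""
    n = len(table)

    def pref(a, b):
        # "usual" preference function: full criterion weight wherever a beats b
        return sum(w for w, x, y in zip(weights, a, b) if x > y)

    phi = []
    for i, a in enumerate(table):
        plus = sum(pref(a, b) for j, b in enumerate(table) if j != i) / (n - 1)
        minus = sum(pref(b, a) for j, b in enumerate(table) if j != i) / (n - 1)
        phi.append(plus - minus)   # net flow: positive minus negative flow
    return phi

# Three families scored on two criteria; ranking follows descending net flow
phi = net_flows([[3, 3], [2, 2], [1, 1]], [0.6, 0.4])
print(phi)
```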

  3. Comparison of lysimeter based and calculated ASCE reference evapotranspiration in a subhumid climate

    NASA Astrophysics Data System (ADS)

    Nolz, Reinhard; Cepuder, Peter; Eitzinger, Josef

    2016-04-01

The standardized form of the well-known FAO Penman-Monteith equation, published by the Environmental and Water Resources Institute of the American Society of Civil Engineers (ASCE-EWRI), is recommended as a standard procedure for calculating reference evapotranspiration (ETref) and subsequently plant water requirements. Applied and validated under different climatic conditions, it has generally achieved good results compared to other methods. However, several studies have documented deviations between measured and calculated reference evapotranspiration depending on environmental and weather conditions. Therefore, it seems generally advisable to evaluate the model under local environmental conditions. In this study, reference evapotranspiration was determined at a subhumid site in northeastern Austria from 2005 to 2010 using a large weighing lysimeter (ETlys). The measured data were compared with ETref calculations. Daily values differed slightly over the course of a year: ETref was generally overestimated at small values, whereas it was rather underestimated when ET was large, which is also supported by other studies. In our case, advection of sensible heat proved to have an impact, but it could not explain the differences exclusively. Obviously, there were also other influences, such as seasonally varying surface resistance or albedo. Generally, the ASCE-EWRI equation for daily time steps performed best under average weather conditions. The outcomes should help to correctly interpret ETref data in the region and in similar environments and improve knowledge of the dynamics of the influencing factors causing deviations.

  4. The value of innovation under value-based pricing

    PubMed Central

    Moreno, Santiago G.; Ray, Joshua A.

    2016-01-01

    Objective The role of cost-effectiveness analysis (CEA) in incentivizing innovation is controversial. Critics of CEA argue that its use for pricing purposes disregards the ‘value of innovation’ reflected in new drug development, whereas supporters of CEA highlight that the value of innovation is already accounted for. Our objective in this article is to outline the limitations of the conventional CEA approach, while proposing an alternative method of evaluation that captures the value of innovation more accurately. Method The adoption of a new drug benefits present and future patients (with cost implications) for as long as the drug is part of clinical practice. Incidence patients and off-patent prices are identified as two key missing features preventing the conventional CEA approach from capturing 1) benefit to future patients and 2) future savings from off-patent prices. The proposed CEA approach incorporates these two features to derive the total lifetime value of an innovative drug (i.e., the value of innovation). Results The conventional CEA approach tends to underestimate the value of innovative drugs by disregarding the benefit to future patients and savings from off-patent prices. As a result, innovative drugs are underpriced, only allowing manufacturers to capture approximately 15% of the total value of innovation during the patent protection period. In addition to including the incidence population and off-patent price, the alternative approach proposes pricing new drugs by first negotiating the share of value of innovation to be appropriated by the manufacturer (>15%?) and payer (<85%?), in order to then identify the drug price that satisfies this condition. Conclusion We argue for a modification to the conventional CEA approach that integrates the total lifetime value of innovative drugs into CEA, by taking into account off-patent pricing and future patients. The proposed approach derives a price that allows manufacturers to capture an agreed share

  5. Development of an efficient procedure for calculating the aerodynamic effects of planform variation

    NASA Technical Reports Server (NTRS)

    Mercer, J. E.; Geller, E. W.

    1981-01-01

Numerical procedures to compute gradients in aerodynamic loading due to planform shape changes using panel-method codes were studied. Two procedures were investigated: one computed the aerodynamic perturbation directly; the other computed the aerodynamic loading on the perturbed planform and on the base planform and then differenced these values to obtain the perturbation in loading. It is indicated that computing the perturbed values directly cannot be done satisfactorily without proper aerodynamic representation of the pressure singularity at the leading edge of a thin wing. For the alternative procedure, a technique was developed which saves most of the time-consuming computations from a panel-method calculation for the base planform. Using this procedure, the perturbed loading can be calculated in about one-tenth of the time required for the base solution.

  6. Zinc finger protein binding to DNA: an energy perspective using molecular dynamics simulation and free energy calculations on mutants of both zinc finger domains and their specific DNA bases.

    PubMed

    Hamed, Mazen Y; Arya, Gaurav

    2016-05-01

Energy calculations based on MM-GBSA were employed to study various zinc finger protein (ZF) motifs binding to DNA. Mutants of both the ZF domains and the DNA bases bound to their specific amino acids were studied. Calculated energies gave evidence for a relationship between binding energy and the affinity of ZF motifs for their sites on DNA. ΔG values were -15.82(12), -3.66(12), and -12.14(11.6) kcal/mol for finger one, finger two, and finger three, respectively. Mutations in the DNA bases reduced the magnitude of the negative binding energies (maximum ΔΔG = 42 kcal/mol for F1 when GCG mutated to GGG, and ΔΔG = 22 kcal/mol for F2); the loss in total binding energy originated in the loss of electrostatic energy upon mutation (r = .98). Mutations of key amino acids in the ZF motif at positions -1, 2, 3, and 6 showed reduced binding energies to DNA; the correlation coefficient between total free energy and electrostatic energy was .99, and with van der Waals energy it was .93. The results agree with experimentally found selectivity, which showed that arginine at position -1 is specific to G, while aspartic acid (D) at position 2 plays a complicated role in binding. There is a correlation between the MD-calculated free energies of binding and those obtained experimentally in other reports for prepared ZF motifs bound to triplet bases. Our results may help in the design of ZF motifs based on recognition codes grounded in binding energies and their contributing terms.

  7. SU-E-T-538: Evaluation of IMRT Dose Calculation Based on Pencil-Beam and AAA Algorithms.

    PubMed

    Yuan, Y; Duan, J; Popple, R; Brezovich, I

    2012-06-01

To evaluate the accuracy of dose calculation for intensity modulated radiation therapy (IMRT) based on the Pencil Beam (PB) and Analytical Anisotropic Algorithm (AAA) computation algorithms. IMRT plans of twelve patients with different treatment sites, including head/neck, lung, and pelvis, were investigated. For each patient, dose calculations with the PB and AAA algorithms using dose grid sizes of 0.5 cm, 0.25 cm, and 0.125 cm were compared with composite-beam ion chamber and film measurements in patient-specific QA. Discrepancies between calculation and measurement were evaluated by the percentage error for ion chamber dose and by the γ>1 failure rate in gamma analysis (3%/3 mm) for film dosimetry. For 9 patients, the ion chamber dose calculated with the AAA algorithm was closer to the ion chamber measurement than that calculated with the PB algorithm at a grid size of 2.5 mm, though all calculated ion chamber doses were within 3% of the measurements. For head/neck patients and other patients with large treatment volumes, the γ>1 failure rate was significantly reduced (to within 5%) with AAA-based treatment planning, compared to generally more than 10% with PB-based treatment planning (grid size = 2.5 mm). For lung and brain cancer patients with medium and small treatment volumes, γ>1 failure rates were typically within 5% for both AAA- and PB-based treatment planning (grid size = 2.5 mm). For both PB- and AAA-based treatment planning, improvements in dose calculation accuracy with finer dose grids were observed in the film dosimetry of 11 patients and in the ion chamber measurements of 3 patients. AAA-based treatment planning provides more accurate dose calculation for head/neck patients and other patients with large treatment volumes. Compared with film dosimetry, a γ>1 failure rate within 5% can be achieved with AAA-based treatment planning. © 2012 American Association of Physicists in Medicine.
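The gamma-analysis failure criterion used here can be sketched in one dimension. This is a simple global-gamma illustration (dose criterion as a fraction of the maximum reference dose, distance-to-agreement in mm) on synthetic profiles; clinical gamma analysis works on 2D/3D dose distributions and typically interpolates the evaluated distribution:

```python
import numpy as np

def gamma_1d(pos_ref, dose_ref, pos_eval, dose_eval, dd=0.03, dta=3.0):
    """Global gamma index per reference point: minimum over the evaluated
    distribution of the combined dose/distance metric. gamma > 1 fails."""
    norm = dd * dose_ref.max()
    g = np.empty(len(pos_ref))
    for k, (p, d) in enumerate(zip(pos_ref, dose_ref)):
        g[k] = np.sqrt(((pos_eval - p) / dta) ** 2
                       + ((dose_eval - d) / norm) ** 2).min()
    return g

pos = np.linspace(0, 100, 201)                    # positions in mm
measured = 100 * np.exp(-((pos - 50) / 20) ** 2)  # synthetic dose profile
calculated = measured * 1.02                      # uniform 2% overdose
fail_rate = np.mean(gamma_1d(pos, measured, pos, calculated) > 1)
print(fail_rate)  # 0.0 -- a uniform 2% dose difference passes 3%/3 mm
```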

  8. The experimental and calculated characteristics of 22 tapered wings

    NASA Technical Reports Server (NTRS)

    Anderson, Raymond F

    1938-01-01

The experimental and calculated aerodynamic characteristics of 22 tapered wings are compared, using tests made in the variable-density wind tunnel. The wings had aspect ratios from 6 to 12 and taper ratios from 1.6:1 to 5:1. The compared characteristics are the pitching moment, the aerodynamic-center position, the lift-curve slope, the maximum lift coefficient, and the curves of drag. The method of obtaining the calculated values is based on the use of wing theory and experimentally determined airfoil section data. In general, the experimental and calculated characteristics are in sufficiently good agreement that the method may be applied to many problems of airplane design.

  9. The Role of Value-Based Care in Patients with Cirrhosis.

    PubMed

    Volk, Michael L

    2017-02-01

Value-based care means delivering high-quality care while keeping costs at a reasonable level. Many physicians have long viewed quality care and the responsible use of resources as an integral part of their professional responsibilities. As the health care system changes, however, physicians are increasingly being asked to objectively demonstrate value. In this review article, the author describes the reimbursement and regulatory shift toward value-based care and provides specific strategies for meeting this challenge.

  10. Adding glycaemic index and glycaemic load functionality to DietPLUS, a Malaysian food composition database and diet intake calculator.

    PubMed

    Shyam, Sangeetha; Wai, Tony Ng Kock; Arshad, Fatimah

    2012-01-01

This paper outlines the methodology used to add glycaemic index (GI) and glycaemic load (GL) functionality to DietPLUS, a Microsoft Excel-based Malaysian food composition database and diet intake calculator. Locally determined GI values and published international GI databases were used as the source of GI values. Previously published methodology for GI value assignment was modified to add GI and GL calculators to the database. Two popular local low-GI foods were added to the DietPLUS database, bringing the total number of foods in the database to 838. Overall, in relation to the 539 major carbohydrate foods in the Malaysian Food Composition Database, 243 (45%) food items had local Malaysian values or were directly matched to the international GI database, and another 180 (33%) of the foods were linked to closely related foods in the GI databases used. The mean ± SD dietary GI and GL of the dietary intake of 63 women with previous gestational diabetes mellitus, calculated using DietPLUS version 3, were 62 ± 6 and 142 ± 45, respectively. These values were comparable to those reported in other local studies. DietPLUS version 3, a simple Microsoft Excel-based programme, aids calculation of diet GI and GL for Malaysian diets based on food records.
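The standard calculations behind such a tool are simple: each food's glycaemic load is GI × available carbohydrate / 100, and the diet's overall GI is the carbohydrate-weighted mean of the food GIs. A sketch (the food GI and carbohydrate figures are illustrative, not DietPLUS entries):

```python
def dietary_gi_gl(items):
    """items: (gi, available_carbohydrate_g) per food eaten in a day.
    Returns (dietary GI, dietary GL)."""
    total_carb = sum(carb for _, carb in items)
    gl = sum(gi * carb / 100 for gi, carb in items)   # daily glycaemic load
    gi = 100 * gl / total_carb                        # carb-weighted mean GI
    return gi, gl

# e.g. a rice dish (GI 73, 60 g carb) plus lentils (GI 32, 20 g carb)
day_gi, day_gl = dietary_gi_gl([(73, 60), (32, 20)])
print(round(day_gi, 1), round(day_gl, 1))
```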

  11. Green Net Value Added as a Sustainability Metric Based on ...

    EPA Pesticide Factsheets

    Sustainability measurement in economics involves evaluation of environmental and economic impact in an integrated manner. In this study, system level economic data are combined with environmental impact from a life cycle assessment (LCA) of a common product. We are exploring a costing approach that captures traditional costs but also incorporates externality costs to provide a convenient, easily interpretable metric. Green Net Value Added (GNVA) is a type of full cost accounting that incorporates total revenue, the cost of materials and services, depreciation, and environmental externalities. Two, but not all, of the potential environmental impacts calculated by the standard LCIA method (TRACI) could be converted to externality cost values. We compute externality costs disaggregated by upstream sectors, full cost, and GNVA to evaluate the relative sustainability of Bounty® paper towels manufactured at two production facilities. We found that the longer running, more established line had a higher GNVA than the newer line. The dominant factors contributing to externality costs are calculated to come from the stationary sources in the supply chain: electricity generation (27-35%), refineries (20-21%), pulp and paper making (15-23%). Health related externalities from Particulate Matter (PM2.5) and Carbon Dioxide equivalent (CO2e) emissions appear largely driven by electricity usage and emissions by the facilities, followed by pulp processing and transport. Supply
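As defined in the abstract, GNVA subtracts the cost of materials and services, depreciation, and monetized environmental externalities from total revenue. A minimal sketch with made-up numbers:

```python
def green_net_value_added(revenue, materials_and_services, depreciation,
                          externality_costs):
    """GNVA = revenue - intermediate costs - depreciation - externalities,
    where externality_costs maps an impact category to a monetized cost."""
    return (revenue - materials_and_services - depreciation
            - sum(externality_costs.values()))

# Hypothetical production-line figures (currency units are arbitrary)
gnva = green_net_value_added(
    revenue=1_000_000, materials_and_services=600_000, depreciation=50_000,
    externality_costs={"PM2.5": 40_000, "CO2e": 25_000})
print(gnva)  # 285000
```

Comparing the GNVA of two lines with the same revenue then amounts to comparing their combined internal and externality costs, which is how the two facilities are ranked in the study.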

  12. Values-based practice in mental health and psychiatry.

    PubMed

    Woodbridge-Dodd, Kim

    2012-11-01

Values-based practice (VBP) challenges traditions of clinical practice and moral decision-making, and the literature published over the last 18 months demonstrates a growing momentum for its use. The VBP model has become part of the narrative of health and social care practice. It features in a range of publications and has been subject to philosophical analysis in relation to its theoretical rigour, and applied to clinical practice through education and training, implementation in the field, and policy development. From a philosophical perspective the model faces several challenges; from a practice perspective it is welcomed as a necessary partner to evidence-based practice. Both perspectives suggest VBP requires significant adjustment to professional ideas of good practice, to expectations of the clinician-service user relationship, and to notions of what constitutes good care. VBP is both a solution and a problem for clinicians. Whether VBP is seen as providing a much-needed clinical skill for working with the complexity of mental health and psychiatry, which is steeped in values, or as a response to a dominant sociopolitical neoliberal ideology's demand for choice and personalization of care, it will require clinicians to make changes to their values base that reach to the depths of their professional identity.

  13. Dose Calculation For Accidental Release Of Radioactive Cloud Passing Over Jeddah

    NASA Astrophysics Data System (ADS)

    Alharbi, N. D.; Mayhoub, A. B.

    2011-12-01

For the evaluation of doses after a reactor accident, in particular the inhalation dose, thorough knowledge of the concentrations of the various radionuclides in air during the passage of the plume is required. In this paper we present an application of the Gaussian Plume Model (GPM) to calculate the atmospheric dispersion and airborne radionuclide concentrations resulting from a radioactive cloud over the city of Jeddah (KSA). The radioactive cloud is assumed to be emitted from a reactor of 10 MW power in a postulated accidental release. Committed effective doses (CEDs) to the public at different distances from the source to the receptor are calculated. The calculations were based on the meteorological conditions and data of the Jeddah site: Pasquill atmospheric stability class B and a wind speed of 2.4 m/s at 10 m height in the N direction. The residence times of the radionuclides considered in this study were calculated. The results indicate that the dose values first increase with distance, reach a maximum, and then gradually decrease. The total dose received by humans is estimated using the calculated residence times of each radioactive pollutant at different distances.
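The underlying GPM concentration formula (with ground reflection) can be sketched as follows. The Briggs open-country dispersion coefficients for Pasquill class B and the 30 m release height are illustrative assumptions, not values from the paper; the wind speed follows the stated 2.4 m/s:

```python
import math

def plume_concentration(q, u, x, y, z, h):
    """Gaussian plume with ground reflection: q source term (Bq/s), u wind
    speed (m/s), x downwind / y crosswind / z receptor height (m), h release
    height (m). Returns the air concentration in Bq/m^3."""
    # Briggs open-country sigma parameterization, Pasquill stability class B
    sig_y = 0.16 * x / math.sqrt(1 + 0.0001 * x)
    sig_z = 0.12 * x
    return (q / (2 * math.pi * u * sig_y * sig_z)
            * math.exp(-y**2 / (2 * sig_y**2))
            * (math.exp(-(z - h)**2 / (2 * sig_z**2))
               + math.exp(-(z + h)**2 / (2 * sig_z**2))))

# Ground-level centreline concentration first rises with distance, peaks,
# then falls -- the same pattern the dose results show.
c = [plume_concentration(1e9, 2.4, x, 0, 0, 30) for x in (100, 200, 2000)]
print(c)
```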

  14. Family Physician Readiness for Value-Based Payments: Does Ownership Status Matter?

    PubMed

    Robertson-Cooper, Heidy; Neaderhiser, Bradley; Happe, Laura E; Beveridge, Roy A

    2017-10-01

Value-based payments are rapidly replacing fee-for-service arrangements, necessitating advancements in physician practice capabilities and functions. The objective of this study was to examine potential differences between family physicians who are owners and those who are employed with respect to their readiness for value-based payment models. The authors surveyed more than 550 family physicians from the American Academy of Family Physicians' membership; nearly 75% had made changes to participate in value-based payments. However, owners were significantly more likely to report that their practices had made no changes in value-based payment capabilities than employed physicians (owners 35.2% vs. employed 18.1%, P < 0.05). This study identified 3 key areas in which physician owners' value-based practice capabilities were not as advanced as the employed physician group: (1) quality improvement strategies, (2) human capital investment, and (3) identification of high-risk patients. Specifically, the employed physician group reported more quality improvement strategies, including quality measures, Plan-Do-Study-Act, root cause analysis, and Lean Six Sigma (P < 0.05 for all). More employed physicians reported that their practices had full-time care management staff (19.8% owners vs. 30.8% employed, P < 0.05), while owners were more likely to report that they had no resources/capacity to hire care managers or care coordinators (31.4% owners vs. 19.4% employed, P < 0.05). Owners were significantly more likely to respond that they do not have the resources/capacity to identify high-risk patients (23.1% owners vs. 19.3% employed, P < 0.05). As public and private payers transition to value-based payments, consideration of different population health management needs according to ownership status has the potential to support the adoption of value-based care delivery for family physicians.

  15. Considering Spine Surgery: A Web-Based Calculator for Communicating Estimates of Personalized Treatment Outcomes.

    PubMed

    Moulton, Haley; Tosteson, Tor D; Zhao, Wenyan; Pearson, Loretta; Mycek, Kristina; Scherer, Emily; Weinstein, James N; Pearson, Adam; Abdu, William; Schwarz, Susan; Kelly, Michael; McGuire, Kevin; Milam, Alden; Lurie, Jon D

    2018-06-05

    Prospective evaluation of an informational web-based calculator for communicating estimates of personalized treatment outcomes. To evaluate the usability, effectiveness in communicating benefits and risks, and impact on decision quality of a calculator tool for patients with intervertebral disc herniations, spinal stenosis, and degenerative spondylolisthesis who are deciding between surgical and non-surgical treatments. The decision to have back surgery is preference-sensitive and warrants shared decision-making. However, more patient-specific, individualized tools for presenting clinical evidence on treatment outcomes are needed. Using Spine Patient Outcomes Research Trial (SPORT) data, prediction models were designed and integrated into a web-based calculator tool: http://spinesurgerycalc.dartmouth.edu/calc/. Consumer Reports subscribers with back-related pain were invited to use the calculator via email, and patient participants were recruited to use the calculator in a prospective manner following an initial appointment at participating spine centers. Participants completed questionnaires before and after using the calculator. We randomly assigned previously validated questions that tested knowledge about the treatment options to be asked either before or after viewing the calculator. 1,256 Consumer Reports subscribers and 68 patient participants completed the calculator and questionnaires. Knowledge scores were higher in the post-calculator group compared to the pre-calculator group, indicating that calculator usage successfully informed users. Decisional conflict was lower when measured following calculator use, suggesting the calculator was beneficial in the decision-making process. Participants generally found the tool helpful and easy to use. While the calculator is not a comprehensive decision aid, it does focus on communicating individualized risks and benefits for treatment options. Moreover, it appears to be helpful in achieving the goals of more

  16. AUI&GIV: Recommendation with Asymmetric User Influence and Global Importance Value

    PubMed Central

    Zhao, Zhi-Lin; Wang, Chang-Dong; Lai, Jian-Huang

    2016-01-01

The user-based collaborative filtering (CF) algorithm is one of the most popular approaches for making recommendations. Despite its success, the traditional user-based CF algorithm suffers from a serious problem: it only measures the influence between two users based on their symmetric similarities, calculated from their consumption histories. This means that, for a pair of users, the influences on each other are the same, which may not be true. Intuitively, an expert may have an impact on a novice user, but a novice user may not affect an expert at all. Besides, each user may possess a global importance factor that affects his/her influence on the remaining users. To this end, in this paper, we propose an asymmetric user influence model to measure the directed influence between two users and adopt the PageRank algorithm to calculate the global importance value of each user. The directed influence values and the global importance values are then integrated to deduce the final influence values between two users. Finally, we use the final influence values to improve the performance of the traditional user-based CF algorithm. Extensive experiments have been conducted, the results of which have confirmed that both the asymmetric user influence model and the global importance value play key roles in improving recommendation accuracy; hence the proposed method significantly outperforms existing recommendation algorithms, in particular the user-based CF algorithm, on datasets of high rating density. PMID:26828803

  17. AUI&GIV: Recommendation with Asymmetric User Influence and Global Importance Value.

    PubMed

    Zhao, Zhi-Lin; Wang, Chang-Dong; Lai, Jian-Huang

    2016-01-01

The user-based collaborative filtering (CF) algorithm is one of the most popular approaches for making recommendations. Despite its success, the traditional user-based CF algorithm suffers from a serious problem: it only measures the influence between two users based on their symmetric similarities, calculated from their consumption histories. This means that, for a pair of users, the influences on each other are the same, which may not be true. Intuitively, an expert may have an impact on a novice user, but a novice user may not affect an expert at all. Besides, each user may possess a global importance factor that affects his/her influence on the remaining users. To this end, in this paper, we propose an asymmetric user influence model to measure the directed influence between two users and adopt the PageRank algorithm to calculate the global importance value of each user. The directed influence values and the global importance values are then integrated to deduce the final influence values between two users. Finally, we use the final influence values to improve the performance of the traditional user-based CF algorithm. Extensive experiments have been conducted, the results of which have confirmed that both the asymmetric user influence model and the global importance value play key roles in improving recommendation accuracy; hence the proposed method significantly outperforms existing recommendation algorithms, in particular the user-based CF algorithm, on datasets of high rating density.
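The two ingredients can be sketched together. The asymmetry rule below (weighting a symmetric similarity by the users' relative history lengths) and the toy similarity matrix are assumptions made for illustration; the paper derives its directed influence differently, but it likewise combines directed influences with PageRank-style global importance values:

```python
import numpy as np

def directed_influence(sim, history_len):
    """Turn a symmetric similarity matrix into directed influences:
    a user with a longer consumption history influences a shorter-history
    user more than the reverse (illustrative asymmetry rule)."""
    n = len(history_len)
    inf = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if i != j:
                inf[i, j] = (sim[i, j] * history_len[i]
                             / (history_len[i] + history_len[j]))
    return inf

def global_importance(inf, d=0.85, iters=50):
    """PageRank-style importance over the directed influence graph."""
    n = inf.shape[0]
    col = inf / np.maximum(inf.sum(axis=0, keepdims=True), 1e-12)
    r = np.full(n, 1 / n)
    for _ in range(iters):
        r = (1 - d) / n + d * col @ r
    return r

sim = np.array([[1.0, 0.8, 0.4],
                [0.8, 1.0, 0.5],
                [0.4, 0.5, 1.0]])
hist = np.array([100, 10, 50])          # items consumed per user (assumed)
inf = directed_influence(sim, hist)
final = inf * global_importance(inf)[:, None]   # influence of user i on j
```

The `final` matrix then replaces the plain similarity weights in the user-based CF prediction step.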

  18. IOL calculation using paraxial matrix optics.

    PubMed

    Haigis, Wolfgang

    2009-07-01

    Matrix methods have a long tradition in paraxial physiological optics. They are especially suited to describing and handling optical systems in a simple and intuitive manner. While these methods are increasingly applied to calculate the refractive power(s) of toric intraocular lenses (IOLs), they are hardly used in routine IOL power calculations for cataract and refractive surgery, where analytical formulae are commonly utilized. Since these algorithms are also based on paraxial optics, matrix optics can offer rewarding approaches to standard IOL calculation tasks, as will be shown here. Some basic concepts of matrix optics are introduced, the system matrix for the eye is defined, and its application to typical IOL calculation problems is illustrated. Explicit expressions are derived to determine: the predicted refraction for a given IOL power; the necessary IOL power for a given target refraction; the refractive power for a phakic IOL (PIOL); and the predicted refraction for a thick-lens system. Numerical examples with typical clinical values are given for each of these expressions. It is shown that matrix optics can be applied in a straightforward and intuitive way to most problems of modern routine IOL calculation, in thick or thin lens approximation, for aphakic or phakic eyes.
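    As a sketch of the matrix formalism described here, the following builds the paraxial system matrix of a thin-cornea/thin-IOL pseudophakic eye and reads the equivalent power off one matrix element. The powers, effective lens position, and refractive index are illustrative assumptions, not values from the paper.

```python
import numpy as np

N_AQUEOUS = 1.336   # refractive index behind the cornea (assumed)

def refraction(power_d):
    """Paraxial refraction matrix for a thin element of given power (dioptres),
    acting on the ray vector (height y, reduced angle n*u)."""
    return np.array([[1.0, 0.0], [-power_d, 1.0]])

def translation(dist_m, n):
    """Free propagation over a reduced distance dist/n."""
    return np.array([[1.0, dist_m / n], [0.0, 1.0]])

# Thin-cornea + thin-IOL pseudophakic eye (all values illustrative):
cornea_power = 43.0   # D
iol_power = 21.0      # D
elp = 0.004           # effective lens position behind cornea, metres

system = refraction(iol_power) @ translation(elp, N_AQUEOUS) @ refraction(cornea_power)
equivalent_power = -system[1, 0]   # Gullstrand: D1 + D2 - (d/n)*D1*D2
```

    The (2,1) element reproduces the Gullstrand equivalent-power formula, and the determinant of any such paraxial system matrix is 1, which is a useful sanity check.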

  19. Episodic memories predict adaptive value-based decision-making

    PubMed Central

    Murty, Vishnu; FeldmanHall, Oriel; Hunter, Lindsay E.; Phelps, Elizabeth A; Davachi, Lila

    2016-01-01

    Prior research illustrates that memory can guide value-based decision-making. For example, previous work has implicated both working memory and procedural memory (i.e., reinforcement learning) in guiding choice. However, other types of memories, such as episodic memory, may also influence decision-making. Here we test the role of episodic memory—specifically item versus associative memory—in supporting value-based choice. Participants completed a task where they first learned the value associated with trial-unique lotteries. After a short delay, they completed a decision-making task where they could choose to re-engage with previously encountered lotteries, or new, never-before-seen lotteries. Finally, participants completed a surprise memory test for the lotteries and their associated values. Results indicate that participants chose to re-engage more often with lotteries that resulted in high versus low rewards. Critically, participants not only formed detailed, associative memories for the reward values coupled with individual lotteries, but also exhibited adaptive decision-making only when they had intact associative memory. We further found that the relationship between adaptive choice and associative memory generalized to more complex, ecologically valid choice behavior, such as social decision-making. However, individuals more strongly encode experiences of social violations, such as being treated unfairly, suggesting a bias in how individuals form associative memories within social contexts. Together, these findings provide an important integration of the episodic memory and decision-making literatures to better understand key mechanisms supporting adaptive behavior. PMID:26999046

  20. Dose equivalent rate constants and barrier transmission data for nuclear medicine facility dose calculations and shielding design.

    PubMed

    Kusano, Maggie; Caldwell, Curtis B

    2014-07-01

    A primary goal of nuclear medicine facility design is to keep public and worker radiation doses As Low As Reasonably Achievable (ALARA). To estimate dose and shielding requirements, one needs to know both the dose equivalent rate constants for soft tissue and barrier transmission factors (TFs) for all radionuclides of interest. Dose equivalent rate constants are most commonly calculated using published air kerma or exposure rate constants, while transmission factors are most commonly calculated using published tenth-value layers (TVLs). Values can be calculated more accurately using the radionuclide's photon emission spectrum and the physical properties of lead, concrete, and/or tissue at these energies. These calculations may be non-trivial due to the polyenergetic nature of the radionuclides used in nuclear medicine. In this paper, the effects of dose equivalent rate constant and transmission factor on nuclear medicine dose and shielding calculations are investigated, and new values based on up-to-date nuclear data and thresholds specific to nuclear medicine are proposed. To facilitate practical use, transmission curves were fitted to the three-parameter Archer equation. Finally, the results of this work were applied to the design of a sample nuclear medicine facility and compared to doses calculated using common methods to investigate the effects of these values on dose estimates and shielding decisions. Dose equivalent rate constants generally agreed well with those derived from the literature with the exception of those from NCRP 124. Depending on the situation, Archer fit TFs could be significantly more accurate than TVL-based TFs. These results were reflected in the sample shielding problem, with unshielded dose estimates agreeing well, with the exception of those based on NCRP 124, and Archer fit TFs providing a more accurate alternative to TVL TFs and a simpler alternative to full spectral-based calculations. The data provided by this paper should assist
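    The three-parameter Archer equation used to fit the transmission curves can be evaluated directly. A minimal sketch follows; the α, β, γ values below are hypothetical placeholders, since real fit parameters are radionuclide- and material-specific and must come from fitted data such as this paper's.

```python
import math

def archer_transmission(x, alpha, beta, gamma):
    """Three-parameter Archer model for broad-beam barrier transmission:
    B(x) = [(1 + b/a) * exp(a*g*x) - b/a]**(-1/g),
    where x is the barrier thickness in the units the fit was made in."""
    r = beta / alpha
    return ((1.0 + r) * math.exp(alpha * gamma * x) - r) ** (-1.0 / gamma)

# Hypothetical fit parameters for illustration only:
alpha, beta, gamma = 0.5, 0.8, 0.6
tf_unshielded = archer_transmission(0.0, alpha, beta, gamma)   # should be 1.0
tf_thick = archer_transmission(5.0, alpha, beta, gamma)
```

    By construction B(0) = 1 (no barrier, full transmission) and B decreases monotonically with thickness for positive parameters.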

  1. Reference Values for Human Posture Measurements Based on Computerized Photogrammetry: A Systematic Review.

    PubMed

    Macedo Ribeiro, Ana Freire; Bergmann, Anke; Lemos, Thiago; Pacheco, Antônio Guilherme; Mello Russo, Maitê; Santos de Oliveira, Laura Alice; de Carvalho Rodrigues, Erika

    The main objective of this study was to review the literature to identify reference values for angles and distances of body segments related to upright posture in healthy adult women with the Postural Assessment Software (PAS/SAPO). Electronic databases (BVS, PubMed, SciELO and Scopus) were searched using the following descriptors: evaluation, posture, photogrammetry, physical therapy, postural alignment, postural assessment, and physiotherapy. Studies that performed postural evaluation in healthy adult women with PAS/SAPO and were published in English, Portuguese or Spanish between the years 2005 and 2014 were included. Four studies met the inclusion criteria. Data from the included studies were grouped to establish the statistical descriptors (mean, variance, and standard deviation) of the body angles and distances. A total of 29 variables were assessed (10 in the anterior view, 16 in the right and left lateral views, and 3 in the posterior view), and their respective means and standard deviations were calculated. Reference values for the anterior and posterior views showed no symmetry between the right and left sides of the body in the frontal plane. There were also small differences in the calculated reference values for the lateral view. The proposed reference values for quantitative evaluation of upright posture in healthy adult women estimated in the present study using PAS/SAPO could guide future studies and help clinical practice. Copyright © 2017. Published by Elsevier Inc.

  2. Accurate pKa calculation of the conjugate acids of alkanolamines, alkaloids and nucleotide bases by quantum chemical methods.

    PubMed

    Gangarapu, Satesh; Marcelis, Antonius T M; Zuilhof, Han

    2013-04-02

    The pKa of the conjugate acids of alkanolamines, neurotransmitters, alkaloid drugs and nucleotide bases are calculated with density functional methods (B3LYP, M08-HX and M11-L) and ab initio methods (SCS-MP2, G3). Implicit solvent effects are included with a conductor-like polarizable continuum model (CPCM) and universal solvation models (SMD, SM8). G3, SCS-MP2 and M11-L methods coupled with SMD and SM8 solvation models perform well for alkanolamines with mean unsigned errors below 0.20 pKa units, in all cases. Extending this method to the pKa calculation of 35 nitrogen-containing compounds spanning 12 pKa units showed an excellent correlation between experimental and computational pKa values of these 35 amines with the computationally low-cost SM8/M11-L density functional approach. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Photolysis Rate Coefficient Calculations in Support of SOLVE II

    NASA Technical Reports Server (NTRS)

    Swartz, William H.

    2005-01-01

    A quantitative understanding of photolysis rate coefficients (or "j-values") is essential to determining the photochemical reaction rates that define ozone loss and other crucial processes in the atmosphere. j-Values can be calculated with radiative transfer models, derived from actinic flux observations, or inferred from trace gas measurements. The primary objective of the present effort was the accurate calculation of j-values in the Arctic twilight along NASA DC-8 flight tracks during the second SAGE III Ozone Loss and Validation Experiment (SOLVE II), based in Kiruna, Sweden (68 degrees N, 20 degrees E) during January-February 2003. The JHU/APL radiative transfer model was utilized to produce a large suite of j-values for photolysis processes (over 70 reactions) relevant to the upper troposphere and lower stratosphere. The calculations take into account the actual changes in ozone abundance and apparent albedo of clouds and the Earth surface along the aircraft flight tracks as observed by in situ and remote sensing platforms (e.g., EP-TOMS). A secondary objective was to analyze solar irradiance data from NCAR's Direct beam Irradiance Atmospheric Spectrometer (DIAS) on-board the NASA DC-8 and to start the development of a flexible, multi-species spectral fitting technique for the independent retrieval of O3, O2·O2, and aerosol optical properties.
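    A photolysis rate coefficient is the wavelength integral of actinic flux, absorption cross-section, and quantum yield. A minimal trapezoidal-quadrature sketch follows; all spectral values below are coarse placeholders for illustration, not SOLVE II data.

```python
import numpy as np

def photolysis_rate(wl_nm, actinic_flux, cross_section, quantum_yield):
    """j = integral of F(lambda) * sigma(lambda) * phi(lambda) d(lambda),
    evaluated by trapezoidal quadrature. Flux in photons cm^-2 s^-1 nm^-1
    and sigma in cm^2 give j in s^-1."""
    integrand = actinic_flux * cross_section * quantum_yield
    return float(np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(wl_nm)))

# Placeholder spectra on a coarse 5-nm grid (illustration only):
wl = np.array([295.0, 300.0, 305.0, 310.0])
flux = np.array([1e12, 5e12, 1e13, 2e13])     # actinic flux
sigma = np.array([3e-19, 2e-19, 1e-19, 5e-20])  # absorption cross-section
phi = np.array([0.9, 0.8, 0.6, 0.4])          # quantum yield
j = photolysis_rate(wl, flux, sigma, phi)
```

    In practice the grid is much finer and the integrand comes from measured cross-sections and modelled or measured actinic flux, but the quadrature step is exactly this.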

  4. AMCP Partnership Forum: Advancing Value-Based Contracting.

    PubMed

    2017-11-01

    During the past decade, payment models for the delivery of health care have undergone a dramatic shift from focusing on volume to focusing on value. This shift began with the Affordable Care Act and was reinforced by the Medicare Access and CHIP Reauthorization Act of 2015 (MACRA), which increased the emphasis on payment for delivery of quality care. Today, value-based care is a primary strategy for improving patient care while managing costs. This shift in payment models is expanding beyond the delivery of health care services to encompass models of compensation between payers and biopharmaceutical manufacturers. Value-based contracts (VBCs) have emerged as a mechanism that payers may use to better align their contracting structures with broader changes in the health care system. While pharmaceuticals represent a small share of total health care spending, it is one of the fastest-growing segments of the health care marketplace, and the increasing costs of pharmaceuticals necessitate more flexibility to contract in new ways based on the value of these products. Although not all products or services are appropriate for these types of contracts, VBCs could be a part of the solution to address increasing drug prices and overall drug spending. VBCs encompass a variety of different contracting strategies for biopharmaceutical products that do not base payment rates on volume. These contracts instead may include payment on the achievement of specific goals in a predetermined patient population and offer innovative solutions for quantifying and rewarding positive outcomes or otherwise reducing payer risk associated with pharmaceutical costs. To engage national stakeholders in a discussion of current practices, barriers, and potential benefits of VBCs, the Academy of Managed Care Pharmacy (AMCP) convened a Partnership Forum on Advancing Value-Based Contracting in Arlington, Virginia, on June 20-21, 2017. The goals of the VBC forum were as follows: (a) agree to a definition

  5. 40 CFR 600.209-95 - Calculation of fuel economy values for labeling.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...) Multiply the city model type fuel economy calculated from the tests performed using gasoline or diesel test... (B) Multiply the city model type fuel economy calculated from the tests performed using alcohol or natural gas test fuel as determined in § 600.207 (b)(5)(ii) by 0.90, rounding the product to the nearest...

  6. Value-based cost sharing in the United States and elsewhere can increase patients' use of high-value goods and services.

    PubMed

    Thomson, Sarah; Schang, Laura; Chernew, Michael E

    2013-04-01

    This article reviews efforts in the United States and several other member countries of the Organization for Economic Cooperation and Development to encourage patients, through cost sharing, to use goods such as medications, services, and providers that offer better value than other options--an approach known as value-based cost sharing. Among the countries we reviewed, we found that value-based approaches were most commonly applied to drug cost sharing. A few countries, including the United States, employed financial incentives, such as lower copayments, to encourage use of preferred providers or preventive services. Evidence suggests that these efforts can increase patients' use of high-value services--although they may also be associated with high administrative costs and could exacerbate health inequalities among various groups. With careful design, implementation, and evaluation, value-based cost sharing can be an important tool for aligning patient and provider incentives to pursue high-value care.

  7. Determination of water pH using absorption-based optical sensors: evaluation of different calculation methods

    NASA Astrophysics Data System (ADS)

    Wang, Hongliang; Liu, Baohua; Ding, Zhongjun; Wang, Xiangxin

    2017-02-01

    Absorption-based optical sensors have been developed for the determination of water pH. In this paper, based on the preparation of a transparent sol-gel thin film with a phenol red (PR) indicator, several calculation methods, including simple linear regression analysis, quadratic regression analysis and dual-wavelength absorbance ratio analysis, were used to calculate water pH. Results in terms of the MSSRR show that dual-wavelength absorbance ratio analysis can improve the accuracy of water pH calculation in long-term measurements.
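    Dual-wavelength absorbance ratio analysis is commonly reduced to a ratiometric calibration equation of Henderson-Hasselbalch form. A minimal sketch follows, assuming the standard functional form with the instrument-dependent calibration term folded into the apparent pKa'; the calibration constants below (including the phenol-red pKa' of 7.9) are illustrative assumptions, not values from the paper.

```python
import math

def ph_from_ratio(R, R_acid, R_base, pKa_app):
    """pH = pKa' + log10((R - R_acid) / (R_base - R)), with R = A(l1)/A(l2).
    R_acid and R_base are the ratio limits of the fully protonated and fully
    deprotonated indicator; pKa' is the apparent constant from calibration."""
    return pKa_app + math.log10((R - R_acid) / (R_base - R))

# Illustrative calibration for a phenol-red film (all values assumed):
R_ACID, R_BASE, PKA_APP = 0.2, 1.8, 7.9
```

    At the midpoint ratio the logarithm vanishes and the computed pH equals the apparent pKa', which gives a quick self-check on any calibration.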

  8. Desired emotions across cultures: A value-based account.

    PubMed

    Tamir, Maya; Schwartz, Shalom H; Cieciuch, Jan; Riediger, Michaela; Torres, Claudio; Scollon, Christie; Dzokoto, Vivian; Zhou, Xiaolu; Vishkin, Allon

    2016-07-01

    Values reflect how people want to experience the world; emotions reflect how people actually experience the world. Therefore, we propose that across cultures people desire emotions that are consistent with their values. Whereas prior research focused on the desirability of specific affective states or 1 or 2 target emotions, we offer a broader account of desired emotions. After reporting initial evidence for the potential causal effects of values on desired emotions in a preliminary study (N = 200), we tested the predictions of our proposed model in 8 samples (N = 2,328) from distinct world cultural regions. Across cultural samples, we found that people who endorsed values of self-transcendence (e.g., benevolence) wanted to feel more empathy and compassion, people who endorsed values of self-enhancement (e.g., power) wanted to feel more anger and pride, people who endorsed values of openness to change (e.g., self-direction) wanted to feel more interest and excitement, and people who endorsed values of conservation (e.g., tradition) wanted to feel more calmness and less fear. These patterns were independent of differences in emotional experience. We discuss the implications of our value-based account of desired emotions for understanding emotion regulation, culture, and other individual differences. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  9. First field-based observations of δ2H and δ18O values of precipitation and other water bodies in the Mongolian Gobi desert

    NASA Astrophysics Data System (ADS)

    Burnik Šturm, Martina; Ganbaatar, Oyunsaikhan; Voigt, Christian C.; Kaczensky, Petra

    2017-04-01

    Hydrogen (δ2H) and oxygen (δ18O) isotope values of water are widely used to track the global hydrological cycle, and the global δ2H and δ18O patterns of precipitation are increasingly used in studies on animal migration, forensics, food authentication and traceability. However, δ2H and δ18O values of precipitation spanning one or more years are available for only a few hundred locations worldwide, and for many remote areas such as Mongolia data are still scarce. We obtained the first field-based δ2H and δ18O isotope data of event-based precipitation, rivers and other water bodies in the extreme environment of the Dzungarian Gobi desert in SW Mongolia, covering a period of 16 months (1). Our study area is located over 450 km north-east of the nearest IAEA GNIP station (Fukang station, China), from which it is separated by a mountain range at the international border between China and Mongolia. Isotope values of the collected event-based precipitation showed an extreme range and high seasonal variability, with higher and more variable values in summer and lower values in winter. The high variability could not be explained by the different origins of air masses alone (i.e., NW polar winds over Russia or westerlies over Central Asia; analyzed using the HYSPLIT back-trajectory model), but is likely the result of a combination of different processes affecting the isotope values of precipitation in this area. The calculated field-based local meteoric water line (LMWL, δ2H=(7.42±0.16)δ18O-(23.87±3.27)) showed the isotopic characteristics of precipitation in an arid region. We observed a slight discrepancy between the field-based and modelled (Online Isotope in Precipitation Calculator, OIPC) LMWLs, which highlights the difficulty of modelling δ2H and δ18O values for areas with extreme climatic conditions and thus emphasizes the importance of collecting long-term field-based data. The collected isotopic data of precipitation and other water bodies provide a basis for future
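    A local meteoric water line is an ordinary least-squares fit of δ2H against δ18O. As a minimal sketch, the following recovers the reported LMWL slope and intercept from synthetic points placed exactly on that line (the δ18O sample values are illustrative, not measurements from the study):

```python
import numpy as np

# Illustrative delta-18O values spanning an arid-region range (per mil):
d18O = np.array([-20.0, -12.0, -5.0, 0.0, 4.0])
# Points placed exactly on the reported LMWL: d2H = 7.42 * d18O - 23.87
d2H = 7.42 * d18O - 23.87

# Ordinary least squares: np.polyfit returns [slope, intercept] for degree 1.
slope, intercept = np.polyfit(d18O, d2H, 1)

# Deuterium excess relative to the GMWL (d2H = 8 * d18O + 10):
d_excess = d2H - 8.0 * d18O
```

    With real event-based data the points scatter around the line and the fit also yields the quoted uncertainties on slope and intercept; the uniformly negative d-excess here reflects the sub-8 slope and low intercept typical of arid regions.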

  10. Acceleration of intensity-modulated radiotherapy dose calculation by importance sampling of the calculation matrices.

    PubMed

    Thieke, Christian; Nill, Simeon; Oelfke, Uwe; Bortfeld, Thomas

    2002-05-01

    In inverse planning for intensity-modulated radiotherapy, the dose calculation is a crucial element limiting both the maximum achievable plan quality and the speed of the optimization process. One way to integrate accurate dose calculation algorithms into inverse planning is to precalculate the dose contribution of each beam element to each voxel for unit fluence. These precalculated values are stored in a big dose calculation matrix. Then the dose calculation during the iterative optimization process consists merely of matrix look-up and multiplication with the actual fluence values. However, because the dose calculation matrix can become very large, this ansatz requires a lot of computer memory and is still very time consuming, making it impractical for clinical routine without further modifications. In this work we present a new method to significantly reduce the number of entries in the dose calculation matrix. The method utilizes the fact that a photon pencil beam has a rapid radial dose falloff, and has very small dose values for the most part. In this low-dose part of the pencil beam, the dose contribution to a voxel is only integrated into the dose calculation matrix with a certain probability. Normalization with the reciprocal of this probability preserves the total energy, even though many matrix elements are omitted. Three probability distributions were tested to find the most accurate one for a given memory size. The sampling method is compared with the use of a fully filled matrix and with the well-known method of just cutting off the pencil beam at a certain lateral distance. A clinical example of a head and neck case is presented. It turns out that a sampled dose calculation matrix with only 1/3 of the entries of the fully filled matrix does not sacrifice the quality of the resulting plans, whereas the cutoff method results in a suboptimal treatment plan.
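    The sampling idea (keep every high-dose entry; keep each low-dose entry only with probability p, rescaled by 1/p so the expected row sum, i.e. total energy, is preserved) can be sketched as follows. The single constant keep-probability below is a simplification; the paper tested three different probability distributions.

```python
import random

def sample_dose_row(dose_row, threshold, p_keep):
    """Sparsify one pencil-beam row of the dose calculation matrix.
    Entries at or above `threshold` are always stored; entries below it are
    stored only with probability `p_keep` and rescaled by 1/p_keep, so the
    expected row sum (deposited energy) is unchanged."""
    sparse = {}
    for voxel, dose in enumerate(dose_row):
        if dose >= threshold:
            sparse[voxel] = dose
        elif random.random() < p_keep:
            sparse[voxel] = dose / p_keep
    return sparse
```

    Averaged over many rows (or many beamlets), the sparse representation deposits the same total energy as the full matrix while storing only a fraction of the low-dose tail.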

  11. 40 CFR 600.113-78 - Fuel economy calculations.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... economy calculations. The calculations of vehicle fuel economy values require the weighted grams/mile values for HC, CO, and CO2 for the city fuel economy test and the grams/mile values for HC, CO, and CO2... weighted grams/mile values for the city fuel economy test for HC, CO, and CO2 as specified in § 86.144 of...

  12. An accurate density functional theory based estimation of pK(a) values of polar residues combined with experimental data: from amino acids to minimal proteins.

    PubMed

    Matsui, Toru; Baba, Takeshi; Kamiya, Katsumasa; Shigeta, Yasuteru

    2012-03-28

    We report a scheme for estimating the acid dissociation constant (pK(a)) based on quantum-chemical calculations combined with a polarizable continuum model, where a parameter is determined for small reference molecules. We calculated the pK(a) values of variously sized molecules ranging from an amino acid to a protein consisting of 300 atoms. This scheme enabled us to derive a semiquantitative pK(a) value of specific chemical groups and discuss the influence of the surroundings on the pK(a) values. As applications, we have derived the pK(a) value of the side chain of an amino acid and almost reproduced the experimental value. By using our computing schemes, we showed the influence of hydrogen bonds on the pK(a) values in the case of tripeptides, which decreases the pK(a) value by 3.0 units for serine in comparison with those of the corresponding monopeptides. Finally, with some assumptions, we derived the pK(a) values of tyrosines and serines in chignolin and a tryptophan cage. We obtained quite different pK(a) values of adjacent serines in the tryptophan cage; the pK(a) value of the OH group of Ser13 exposed to bulk water is 14.69, whereas that of Ser14 not exposed to bulk water is 20.80 because of the internal hydrogen bonds.
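    The thermodynamic relation behind such schemes is pKa = ΔG_deprot / (RT ln 10), with the deprotonation free energy taken from the quantum-chemical calculation plus solvation model. A minimal sketch follows; the 13.0 kcal/mol input is a hypothetical value for illustration, not a number from the paper.

```python
import math

R_KCAL = 1.987204e-3   # gas constant, kcal mol^-1 K^-1
T = 298.15             # temperature, K

def pka_from_deltaG(dG_kcal_mol):
    """pKa = dG_aq(HA -> A- + H+) / (R * T * ln 10), dG in kcal/mol."""
    return dG_kcal_mol / (R_KCAL * T * math.log(10.0))

# Hypothetical aqueous deprotonation free energy:
pka = pka_from_deltaG(13.0)
```

    At 298 K the denominator RT ln 10 is about 1.364 kcal/mol, so every 1.36 kcal/mol of computed free-energy error shifts the predicted pKa by one unit, which is why the sub-0.2-unit errors reported above are demanding.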

  13. Value-based insurance design: embracing value over cost alone.

    PubMed

    Fendrick, A Mark; Chernew, Michael E; Levi, Gary W

    2009-12-01

    The US healthcare system is in crisis, with documented gaps in quality, safety, access, and affordability. Many believe the solution to unsustainable cost increases is increased patient cost-sharing. From an overall cost perspective, reduced consumption of certain essential services may yield short-term savings but lead to worse health and markedly higher costs down the road--in complications, hospitalizations, and increased utilization. Value-based insurance design (VBID) can help plug the inherent shortfalls in "across-the-board" patient cost-sharing. Instead of focusing on cost or quality alone, VBID focuses on value, aligning the financial and nonfinancial incentives of the various stakeholders and complementing other current initiatives to improve quality and subdue costs, such as high-deductible consumer-directed health plans, pay-for-performance programs, and disease management. Mounting evidence, both peer-reviewed and empirical, indicates not only that VBID can be implemented, but also leads to desired changes in behavior. For all its documented successes and recognized promise, VBID is in its infancy and is not a panacea for the current healthcare crisis. However, the available research and documented experiences indicate that as an overall approach, and in its fully evolved and widely adopted form, VBID will promote a healthier population and therefore support cost-containment efforts by producing better health at any price point.

  14. Enhancing the Value of Population-Based Risk Scores for Institutional-Level Use.

    PubMed

    Raza, Sajjad; Sabik, Joseph F; Rajeswaran, Jeevanantham; Idrees, Jay J; Trezzi, Matteo; Riaz, Haris; Javadikasgari, Hoda; Nowicki, Edward R; Svensson, Lars G; Blackstone, Eugene H

    2016-07-01

    We hypothesized that factors associated with an institution's residual risk unaccounted for by population-based models may be identifiable and used to enhance the value of population-based risk scores for quality improvement. From January 2000 to January 2010, 4,971 patients underwent aortic valve replacement (AVR), either isolated (n = 2,660) or with concomitant coronary artery bypass grafting (AVR+CABG; n = 2,311). Operative mortality and major morbidity and mortality predicted by The Society of Thoracic Surgeons (STS) risk models were compared with observed values. After adjusting for patients' STS score, additional and refined risk factors were sought to explain residual risk. Differences between STS model coefficients (risk-factor strength) and those specific to our institution were calculated. Observed operative mortality was less than predicted for AVR (1.6% [42 of 2,660] vs 2.8%, p < 0.0001) and AVR+CABG (2.6% [59 of 2,311] vs 4.9%, p < 0.0001). Observed major morbidity and mortality was also lower than predicted for isolated AVR (14.6% [389 of 2,660] vs 17.5%, p < 0.0001) and AVR+CABG (20.0% [462 of 2,311] vs 25.8%, p < 0.0001). Shorter height, higher bilirubin, and lower albumin were identified as additional institution-specific risk factors, and body surface area, creatinine, glomerular filtration rate, blood urea nitrogen, and heart failure across all levels of functional class were identified as refined risk-factor variables associated with residual risk. In many instances, risk-factor strength differed substantially from that of STS models. Scores derived from population-based models can be enhanced for institutional level use by adjusting for institution-specific additional and refined risk factors. Identifying these and measuring differences in institution-specific versus population-based risk-factor strength can identify areas to target for quality improvement initiatives. Copyright © 2016 The Society of Thoracic Surgeons. Published by Elsevier
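    The comparison of observed with STS-predicted outcomes reduces to an observed-to-expected ratio. A minimal sketch, using the isolated-AVR mortality figures quoted in this record (42 deaths in 2,660 patients against a predicted 2.8%):

```python
def observed_to_expected(observed_events, n_patients, predicted_rate):
    """O/E ratio: values below 1 indicate outcomes better than the
    population-based model predicts for this case mix."""
    expected_events = n_patients * predicted_rate
    return observed_events / expected_events

# Isolated AVR mortality from the record above:
oe_avr = observed_to_expected(42, 2660, 0.028)
```

    An O/E well below 1, as here, is exactly the institutional residual risk the authors then try to explain with additional and refined risk factors.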

  15. Ab initio calculations of the lattice dynamics of silver halides

    NASA Astrophysics Data System (ADS)

    Gordienko, A. B.; Kravchenko, N. G.; Sedelnikov, A. N.

    2010-12-01

    Based on ab initio pseudopotential calculations, the results of investigations of the lattice dynamics of silver halides AgHal (Hal = Cl, Br, I) are presented. Equilibrium lattice parameters, phonon spectra, frequency densities and effective atomic-charge values are obtained for all types of crystals under study.

  16. Calculated quantum yield of photosynthesis of phytoplankton in the Marine Light-Mixed Layers (59 deg N, 21 deg W)

    NASA Technical Reports Server (NTRS)

    Carder, K. L.; Lee, Z. P.; Marra, John; Steward, R. G.; Perry, M. J.

    1995-01-01

    The quantum yield of photosynthesis (mol C/mol photons) was calculated at six depths for the waters of the Marine Light-Mixed Layer (MLML) cruise of May 1991. As there were photosynthetically available radiation (PAR) measurements but no spectral irradiance measurements for the primary production incubations, three ways are presented here for calculating the photons absorbed (AP) by phytoplankton for the purpose of calculating phi. The first is based on a simple, nonspectral model; the second is based on a nonlinear regression using measured PAR values with depth; and the third is derived through remote sensing measurements. We show that the results of phi calculated using the nonlinear regression method and those using remote sensing are in good agreement with each other, and are consistent with the values reported in other studies. In deep waters, however, the simple nonspectral model may produce quantum yield values much higher than is theoretically possible.
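    The quantum yield is simply carbon fixed per photon absorbed, with AP estimated from, e.g., the simple nonspectral model mentioned above. A minimal sketch follows; the input values are illustrative assumptions, and 0.125 mol C per mol photons is the commonly cited theoretical maximum used as a sanity bound.

```python
PHI_MAX = 0.125   # theoretical maximum, mol C fixed per mol photons absorbed

def absorbed_photons_nonspectral(a_ph, par):
    """Simple nonspectral estimate AP = a_ph * PAR; the spectral version
    would integrate a_ph(lambda) * E(lambda) across the PAR band instead.
    a_ph in m^-1 and PAR in mol photons m^-2 d^-1 give AP in mol m^-3 d^-1."""
    return a_ph * par

def quantum_yield(carbon_fixed, photons_absorbed):
    """phi = mol C fixed per mol photons absorbed by phytoplankton."""
    return carbon_fixed / photons_absorbed

# Illustrative values: a_ph = 0.02 m^-1, PAR = 10 mol photons m^-2 d^-1,
# primary production = 0.012 mol C m^-3 d^-1 (all assumed)
ap = absorbed_photons_nonspectral(0.02, 10.0)
phi = quantum_yield(0.012, ap)
```

    Values of phi above PHI_MAX, as the abstract notes can happen with the nonspectral model in deep water, signal that AP has been underestimated rather than that photosynthesis is impossibly efficient.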

  17. Calculating evidence-based renal replacement therapy - Introducing an excel-based calculator to improve prescribing and delivery in renal replacement therapy - A before and after study.

    PubMed

    Cottle, Daniel; Mousdale, Stephen; Waqar-Uddin, Haroon; Tully, Redmond; Taylor, Benjamin

    2016-02-01

    Transferring the theoretical aspect of continuous renal replacement therapy to the bedside and delivering a given "dose" can be difficult. In research, the "dose" of renal replacement therapy is given as effluent flow rate in ml kg⁻¹ h⁻¹. Unfortunately, most machines require other information when they are initiating therapy, including blood flow rate, pre-blood pump flow rate, dialysate flow rate, etc. This can lead to confusion, resulting in patients receiving inappropriate doses of renal replacement therapy. Our aim was to design an Excel calculator which would personalise patients' treatment and deliver an effective, evidence-based dose of renal replacement therapy without large variations in practice, while prolonging filter life. Our calculator prescribes a haemodiafiltration dose of 25 ml kg⁻¹ h⁻¹ whilst limiting the filtration fraction to 15%. We compared the episodes of renal replacement therapy received by a historical group of patients, by retrieving their data stored on the haemofiltration machines, to a group where the calculator was used. In the second group, the data were gathered prospectively. The median delivered dose reduced from 41.0 ml kg⁻¹ h⁻¹ to 26.8 ml kg⁻¹ h⁻¹, with reduced variability that was significantly closer to the aim of 25 ml kg⁻¹ h⁻¹ (p < 0.0001). The median treatment time increased from 8.5 h to 22.2 h (p = 0.00001). Our calculator significantly reduces variation in prescriptions of continuous veno-venous haemodiafiltration and provides an evidence-based dose. It is easy to use and provides personal care for patients whilst optimizing continuous veno-venous haemodiafiltration delivery and treatment times.
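    The calculator's core arithmetic (an effluent dose of 25 ml kg⁻¹ h⁻¹ with filtration fraction capped at 15%) can be sketched as below. The 50:50 dialysate-to-ultrafiltrate split and the filtration-fraction definition FF = Q_uf / (Q_blood × (1 − Hct)) are assumptions for illustration; the paper does not publish its spreadsheet formulas.

```python
def crrt_prescription(weight_kg, hct, dose_ml_kg_h=25.0, ff_max=0.15):
    """Translate an effluent-based CVVHDF dose into machine settings.
    Assumes a 50:50 dialysate:ultrafiltrate split (hypothetical) and
    FF = Q_uf / (Q_blood * (1 - Hct))."""
    effluent_ml_h = dose_ml_kg_h * weight_kg
    dialysate_ml_h = effluent_ml_h / 2.0       # diffusive half (assumption)
    ultrafiltrate_ml_h = effluent_ml_h / 2.0   # convective half (assumption)
    # Minimum blood flow (ml/min) that keeps FF at or below ff_max:
    plasma_flow_min_ml_h = ultrafiltrate_ml_h / ff_max
    blood_flow_min_ml_min = plasma_flow_min_ml_h / (1.0 - hct) / 60.0
    return {
        "effluent_ml_h": effluent_ml_h,
        "dialysate_ml_h": dialysate_ml_h,
        "ultrafiltrate_ml_h": ultrafiltrate_ml_h,
        "min_blood_flow_ml_min": blood_flow_min_ml_min,
    }

# 80 kg patient with a haematocrit of 0.30:
rx = crrt_prescription(weight_kg=80.0, hct=0.30)
```

    For this hypothetical patient the target effluent is 2000 ml/h, and the filtration-fraction cap, not the dose, is what drives the minimum blood-pump speed.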

  18. Dominance-based ranking functions for interval-valued intuitionistic fuzzy sets.

    PubMed

    Chen, Liang-Hsuan; Tu, Chien-Cheng

    2014-08-01

    The ranking of interval-valued intuitionistic fuzzy sets (IvIFSs) is difficult since they include the interval values of membership and nonmembership. This paper proposes ranking functions for IvIFSs based on the dominance concept. The proposed ranking functions consider the degree to which an IvIFS dominates and is not dominated by other IvIFSs. Based on the bivariate framework and the dominance concept, the functions incorporate not only the boundary values of membership and nonmembership, but also the relative relations among IvIFSs in comparisons. The dominance-based ranking functions include bipolar evaluations with a parameter that allows the decision-maker to reflect his actual attitude in allocating the various kinds of dominance. The relationship for two IvIFSs that satisfy the dual couple is defined based on four proposed ranking functions. Importantly, the proposed ranking functions can achieve a full ranking for all IvIFSs. Two examples are used to demonstrate the applicability and distinctiveness of the proposed ranking functions.

  19. [Modeling in value-based medicine].

    PubMed

    Neubauer, A S; Hirneiss, C; Kampik, A

    2010-03-01

    Modeling plays an important role in value-based medicine (VBM). It allows decision support by predicting potential clinical and economic consequences, frequently combining different sources of evidence. Based on relevant publications and examples focusing on ophthalmology the key economic modeling methods are explained and definitions are given. The most frequently applied model types are decision trees, Markov models, and discrete event simulation (DES) models. Model validation includes besides verifying internal validity comparison with other models (external validity) and ideally validation of its predictive properties. The existing uncertainty with any modeling should be clearly stated. This is true for economic modeling in VBM as well as when using disease risk models to support clinical decisions. In economic modeling uni- and multivariate sensitivity analyses are usually applied; the key concepts here are tornado plots and cost-effectiveness acceptability curves. Given the existing uncertainty, modeling helps to make better informed decisions than without this additional information.
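    Of the model types named here, a Markov cohort model is the easiest to sketch: propagate a state-occupancy vector through an annual transition matrix while accumulating discounted costs and QALYs. All transition probabilities, costs, utilities, and the discount rate below are hypothetical values for illustration.

```python
import numpy as np

# Hypothetical 3-state model with annual cycles: Well, Sick, Dead
P = np.array([[0.90, 0.07, 0.03],
              [0.00, 0.80, 0.20],
              [0.00, 0.00, 1.00]])
state_cost = np.array([100.0, 2000.0, 0.0])    # cost per state-year (assumed)
state_utility = np.array([0.95, 0.60, 0.00])   # QALY weight per state (assumed)
discount = 0.03                                # annual discount rate (assumed)

cohort = np.array([1.0, 0.0, 0.0])  # everyone starts in "Well"
total_cost = 0.0
total_qaly = 0.0
for year in range(20):
    df = 1.0 / (1.0 + discount) ** year
    total_cost += df * float(cohort @ state_cost)   # discounted cycle cost
    total_qaly += df * float(cohort @ state_utility)  # discounted cycle QALYs
    cohort = cohort @ P                             # advance one cycle
```

    Sensitivity analysis of the kind described above then amounts to re-running this loop while varying the entries of P, the costs, or the utilities, and plotting the resulting cost-effectiveness spread.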

  20. Vibrational spectroscopic studies of Isoleucine by quantum chemical calculations.

    PubMed

    Moorthi, P P; Gunasekaran, S; Ramkumaar, G R

    2014-04-24

In this work, we report a combined experimental and theoretical study on the molecular structure, vibrational spectra and NBO analysis of isoleucine (2-amino-3-methylpentanoic acid). The optimized molecular structure, vibrational frequencies, corresponding vibrational assignments, thermodynamic properties, NBO analyses, NMR chemical shifts and ultraviolet-visible spectral interpretation of isoleucine have been studied by performing MP2 and DFT calculations at the cc-pVDZ level of theory. The FTIR and FT-Raman spectra were recorded in the regions 4000-400 cm(-1) and 3500-50 cm(-1), respectively. The UV-visible absorption spectra of the compound were recorded in the range of 200-800 nm. Computational calculations at the MP2 and B3LYP levels with the cc-pVDZ basis set were employed in the complete assignment of the isoleucine molecule on the basis of the potential energy distribution (PED) of the vibrational modes, calculated using the VEDA-4 program. The calculated wavenumbers are compared with the experimental values; the difference between the observed and calculated wavenumbers is very small for most of the fundamentals. (13)C and (1)H nuclear magnetic resonance chemical shifts of the molecule were calculated using the gauge-independent atomic orbital (GIAO) method and compared with experimental results. The formation of hydrogen bonds was investigated in terms of the charge density by the NBO calculations. Based on the UV spectra and TD-DFT calculations, the electronic structure and the assignments of the absorption bands were carried out. In addition, the molecular electrostatic potential (MEP) was investigated using theoretical calculations. Copyright © 2014 Elsevier B.V. All rights reserved.

  1. Allan Variance Calculation for Nonuniformly Spaced Input Data

    DTIC Science & Technology

    2015-01-01

τ (tau). First, the set of gyro values is partitioned into bins of duration τ. For example, if the sampling duration τ is 2 sec and there are 4,000 … Variance Calculation: For each value of τ, the conventional AV calculation partitions the gyro data sets into bins with approximately τ/Δt … value of Δt. Therefore, a new way must be found to partition the gyro data sets into bins. The basic concept behind the modified AV calculation is …

  2. Comparison of measurement- and proxy-based Vs30 values in California

    USGS Publications Warehouse

    Yong, Alan K.

    2016-01-01

    This study was prompted by the recent availability of a significant amount of openly accessible measured VS30 values and the desire to investigate the trend of using proxy-based models to predict VS30 in the absence of measurements. Comparisons between measured and model-based values were performed. The measured data included 503 VS30 values collected from various projects for 482 seismographic station sites in California. Six proxy-based models—employing geologic mapping, topographic slope, and terrain classification—were also considered. Included was a new terrain class model based on the Yong et al. (2012) approach but recalibrated with updated measured VS30 values. Using the measured VS30 data as the metric for performance, the predictive capabilities of the six models were determined to be statistically indistinguishable. This study also found three models that tend to underpredict VS30 at lower velocities (NEHRP Site Classes D–E) and overpredict at higher velocities (Site Classes B–C).

  3. Calculating the return on investment of mobile healthcare.

    PubMed

    Oriol, Nancy E; Cote, Paul J; Vavasis, Anthony P; Bennet, Jennifer; Delorenzo, Darien; Blanc, Philip; Kohane, Isaac

    2009-06-02

Mobile health clinics provide an alternative portal into the healthcare system for the medically disenfranchised, that is, people who are underinsured, uninsured or otherwise outside of mainstream healthcare due to issues of trust, language, immigration status or simply location. Mobile health clinics, as providers of last resort, are an essential component of the healthcare safety net, providing prevention, screening, and appropriate triage into mainstream services. Despite the face value of providing services to underserved populations, a focused analysis of the relative value of the mobile health clinic model has not been elucidated. The question that the return-on-investment algorithm has been designed to answer is: can the value of the services provided by mobile health programs be quantified in terms of quality-adjusted life years saved and estimated emergency department expenditures avoided? Using a sample mobile health clinic and published research that quantifies health outcomes, we developed and tested an algorithm to calculate the return on investment of a typical broad-service mobile health clinic: the relative value of mobile health clinic services = annual projected emergency department costs avoided + value of potential life years saved from the services provided. Return-on-investment ratio = the relative value of the mobile health clinic services / annual cost to run the mobile health clinic. Based on service data provided by The Family Van for 2008, we calculated the annual cost savings from preventing emergency room visits, US$3,125,668, plus the relative value of providing 7 of the top 25 priority prevention services during the same period, US$17,780,000, for a total annual value of US$20,339,968. Given that the annual cost to run the program was US$567,700, the calculated return on investment of The Family Van was 36:1. By using published data that quantify the value of prevention practices and the value of preventing unnecessary use of emergency
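The ROI algorithm above is simple arithmetic and can be sketched directly. The formulas come from the abstract; the figures used in the demonstration are the totals reported for The Family Van (2008).

```python
# A minimal sketch of the return-on-investment algorithm described above:
# relative value = ED costs avoided + value of life years saved,
# ROI ratio = relative value / annual cost of running the program.

def relative_value(ed_costs_avoided, life_years_value):
    """Relative value of mobile clinic services (US$/year)."""
    return ed_costs_avoided + life_years_value

def roi_ratio(total_annual_value, annual_cost):
    """Return-on-investment ratio of the program."""
    return total_annual_value / annual_cost

# Reported Family Van totals for 2008 (from the abstract):
print(roi_ratio(20_339_968, 567_700))  # approximately 36
```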

  4. Public Report on Health: Development of a Nutritive Value Calculator for Indian Foods and Analysis of Food Logs and Nutrient Intake in six States.

    PubMed

    Sathyamala, C; Kurian, Nj; DE, Anuradha; Saxena, Kb; Priya, Ritu; Baru, Rama; Srivastava, Ravi; Mittal, Onkar; Noronha, Claire; Samson, Meera; Khalsa, Sneh; Puliyel, Ashish; Puliyel, Jacob

    2014-05-01

The Public Report on Health (PRoH) was initiated in 2005 to understand public health issues for people from diverse backgrounds living in different region-specific contexts. States were selected purposively to capture a diversity of situations from better-performing states and not-so-well-performing states. Based on these considerations, six states - the better-performing states of Tamil Nadu (TN), Maharashtra (MH) and Himachal Pradesh (HP), and the not-so-well-performing states of Madhya Pradesh (MP), Uttar Pradesh (UP) and Orissa (OR) - were selected. This is a report of a study using food diaries to assess food intake in sample households from the six states. Food diaries were maintained, and all the raw food items that went into making the food in the household were measured using a measuring cup that converted volumes into dry weights for each item. The proportion consumed by individual adults was recorded. A nutrient calculator that computes the total nutrients in the food items consumed, using the 'Nutritive Value of Indian Foods' by Gopalan et al., was developed to analyze the data and has now been made available as freeware (http://bit.ly/ncalculator). The total nutrients consumed by the adults, men and women, were calculated. Identifying details having been removed, the raw data are available open access on the internet at http://bit.ly/foodlogxls. The energy consumption in our study was 2379 kcal per capita per day. According to the Summary Report World Agriculture, the per capita food consumption in 1997-99 was 2803 kcal, which is higher than that in the best state in India. The consumption for developing countries a decade ago was 2681 kcal, and in Sub-Saharan Africa it was 2195 kcal. Our 2005 data are comparable with the South Asia consumption of 2403 kcal per capita per day in 1997-99. For comparison, in industrialized countries it was 3380 kcal. In Tamil Nadu it was a mere 1817 kcal. The nutrient consumption in this study suggests that food security in the villages

  5. Alternate approach for calculating hardness based on residual indentation depth: Comparison with experiments

    NASA Astrophysics Data System (ADS)

    Ananthakrishna, G.; K, Srikanth

    2018-03-01

It is well known that plastic deformation is a highly nonlinear, dissipative, irreversible phenomenon of considerable complexity. As a consequence, little progress has been made in modeling some well-known size-dependent properties of plastic deformation, for instance, calculating hardness as a function of indentation depth independently. Here, we devise a method of calculating hardness by computing the residual indentation depth and then taking the hardness as the ratio of the load to the residual imprint area. Recognizing that dislocations are the basic defects controlling the plastic component of the indentation depth, we set up a system of coupled nonlinear time evolution equations for the mobile, forest, and geometrically necessary dislocation densities. Within our approach, we consider the geometrically necessary dislocations to be immobile, since they contribute additional hardness. The model includes dislocation multiplication, storage, and recovery mechanisms. The growth of the geometrically necessary dislocation density is controlled by the number of loops that can be activated under the contact area and the mean strain gradient. These equations are then coupled to the load rate equation. Our approach can incorporate experimental parameters such as the indentation rate and the geometrical parameters defining the Berkovich indenter, including the nominal tip radius. The residual indentation depth is obtained by integrating the Orowan expression for the plastic strain rate, which is then used to calculate the hardness. Consistent with experimental observations, the increasing hardness with decreasing indentation depth in our model arises from limited dislocation sources at small indentation depths, and therefore avoids the divergence in the limit of small depths reported in the Nix-Gao model. We demonstrate that for a range of parameter values that physically represent different materials, the model predicts the three characteristic

  6. AtomicChargeCalculator: interactive web-based calculation of atomic charges in large biomolecular complexes and drug-like molecules.

    PubMed

    Ionescu, Crina-Maria; Sehnal, David; Falginella, Francesco L; Pant, Purbaj; Pravda, Lukáš; Bouchal, Tomáš; Svobodová Vařeková, Radka; Geidl, Stanislav; Koča, Jaroslav

    2015-01-01

    Partial atomic charges are a well-established concept, useful in understanding and modeling the chemical behavior of molecules, from simple compounds, to large biomolecular complexes with many reactive sites. This paper introduces AtomicChargeCalculator (ACC), a web-based application for the calculation and analysis of atomic charges which respond to changes in molecular conformation and chemical environment. ACC relies on an empirical method to rapidly compute atomic charges with accuracy comparable to quantum mechanical approaches. Due to its efficient implementation, ACC can handle any type of molecular system, regardless of size and chemical complexity, from drug-like molecules to biomacromolecular complexes with hundreds of thousands of atoms. ACC writes out atomic charges into common molecular structure files, and offers interactive facilities for statistical analysis and comparison of the results, in both tabular and graphical form. Due to high customizability and speed, easy streamlining and the unified platform for calculation and analysis, ACC caters to all fields of life sciences, from drug design to nanocarriers. ACC is freely available via the Internet at http://ncbr.muni.cz/ACC.

  7. Recurrence quantity analysis based on singular value decomposition

    NASA Astrophysics Data System (ADS)

    Bian, Songhan; Shang, Pengjian

    2017-05-01

Recurrence plots (RPs) have become a powerful tool in many different sciences over the last three decades. To quantify the complexity and structure of an RP, recurrence quantification analysis (RQA) has been developed based on measures of recurrence density, diagonal lines, vertical lines and horizontal lines. This paper studies the RP based on singular value decomposition, which is a new perspective for RP study. The principal singular value proportion (PSVP) is proposed as a new RQA measure: a larger PSVP indicates higher complexity of a system, while a smaller PSVP reflects a regular and stable system. Considering the advantage of this method in detecting the complexity and periodicity of systems, several simulated and real data experiments are chosen to examine the performance of this new RQA measure.
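The PSVP measure lends itself to a compact sketch: build a thresholded recurrence matrix, take its singular values, and report the share of the largest one. The recurrence threshold and test signal below are illustrative assumptions, not the paper's data.

```python
# Sketch of the principal singular value proportion (PSVP): the share of
# the largest singular value in the total singular value spectrum of a
# recurrence matrix. Threshold eps and the sine test series are assumed.
import numpy as np

def recurrence_matrix(x, eps):
    """Binary recurrence plot: R[i, j] = 1 if |x[i] - x[j]| < eps."""
    x = np.asarray(x, dtype=float)
    return (np.abs(x[:, None] - x[None, :]) < eps).astype(float)

def psvp(R):
    """Principal singular value proportion of a recurrence matrix."""
    s = np.linalg.svd(R, compute_uv=False)
    return s[0] / s.sum()

t = np.linspace(0, 8 * np.pi, 200)
print(psvp(recurrence_matrix(np.sin(t), 0.2)))
```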

  8. Methodology of full-core Monte Carlo calculations with leakage parameter evaluations for benchmark critical experiment analysis

    NASA Astrophysics Data System (ADS)

    Sboev, A. G.; Ilyashenko, A. S.; Vetrova, O. A.

    1997-02-01

The method of buckling evaluation, realized in the Monte Carlo code MCS, is described. This method was applied to the calculational analysis of the well-known light water experiments TRX-1 and TRX-2. The analysis of this comparison shows that there is no coincidence between Monte Carlo calculations obtained in different ways: the MCS calculations with given experimental bucklings; the MCS calculations with bucklings evaluated on the basis of full-core MCS direct simulations; the full-core MCNP and MCS direct simulations; and the MCNP and MCS calculations in which the results of cell calculations are corrected by coefficients taking into account the leakage from the core. Also, the buckling values evaluated by full-core MCS calculations differed from the experimental ones, especially in the case of TRX-1, where this difference corresponded to a 0.5 percent increase of the Keff value.

  9. A comparison of the prognostic value of preoperative inflammation-based scores and TNM stage in patients with gastric cancer.

    PubMed

    Pan, Qun-Xiong; Su, Zi-Jian; Zhang, Jian-Hua; Wang, Chong-Ren; Ke, Shao-Ying

    2015-01-01

The People's Republic of China is one of the countries with the highest incidence of gastric cancer, accounting for 45% of all new gastric cancer cases in the world. Therefore, strong prognostic markers are critical for the diagnosis and survival of Chinese patients suffering from gastric cancer. Recent studies have begun to unravel the mechanisms linking the host inflammatory response to tumor growth, invasion and metastasis in gastric cancers. Based on this relationship between inflammation and cancer progression, several inflammation-based scores have been demonstrated to have prognostic value in many types of malignant solid tumors. The aim of this study was to compare the prognostic value of inflammation-based prognostic scores and tumor node metastasis (TNM) stage in patients undergoing gastric cancer resection. The inflammation-based prognostic scores were calculated for 207 patients with gastric cancer who underwent surgery. The Glasgow prognostic score (GPS), neutrophil-lymphocyte ratio (NLR), platelet-lymphocyte ratio (PLR), prognostic nutritional index (PNI), and prognostic index (PI) were analyzed. The linear trend chi-square test, likelihood ratio chi-square test, and receiver operating characteristic analysis were performed to compare the prognostic value of the selected scores and TNM stage. In univariate analysis, preoperative serum C-reactive protein (P<0.001), serum albumin (P<0.001), GPS (P<0.001), PLR (P=0.002), NLR (P<0.001), PI (P<0.001), PNI (P<0.001), and TNM stage (P<0.001) were significantly associated with both overall survival and disease-free survival of patients with gastric cancer. In multivariate analysis, GPS (P=0.024), NLR (P=0.012), PI (P=0.001), TNM stage (P<0.001), and degree of differentiation (P=0.002) were independent predictors of gastric cancer survival. GPS and TNM stage had comparable prognostic value, with a higher linear trend chi-square value, likelihood ratio chi-square value, and larger area under the receiver operating characteristic curve as compared to other
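The ratio-based scores compared above are simple to compute. A sketch follows, using the commonly cited GPS cutoffs (CRP 10 mg/L, albumin 35 g/L); the exact definitions used in the study should be checked against the original scoring papers before any clinical use.

```python
# Sketch of inflammation-based prognostic scores. The GPS cutoffs below
# (CRP > 10 mg/L elevated, albumin < 35 g/L low) are the commonly cited
# ones, assumed here rather than taken from the paper.

def glasgow_prognostic_score(crp_mg_l, albumin_g_l):
    """GPS: 0 = both normal, 1 = one abnormal, 2 = both abnormal."""
    return int(crp_mg_l > 10) + int(albumin_g_l < 35)

def nlr(neutrophils, lymphocytes):
    """Neutrophil-to-lymphocyte ratio."""
    return neutrophils / lymphocytes

def plr(platelets, lymphocytes):
    """Platelet-to-lymphocyte ratio."""
    return platelets / lymphocytes

print(glasgow_prognostic_score(20.0, 30.0), nlr(6.0, 2.0), plr(300.0, 2.0))
```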

  10. Monte Carlo-based evaluation of S-values in mouse models for positron-emitting radionuclides

    NASA Astrophysics Data System (ADS)

    Xie, Tianwu; Zaidi, Habib

    2013-01-01

In addition to being a powerful clinical tool, positron emission tomography (PET) is also used in small laboratory animal research to visualize and track certain molecular processes associated with diseases such as cancer, heart disease and neurological disorders in living small animal models of disease. However, dosimetric characteristics in small animal PET imaging are usually overlooked, though the radiation dose may not be negligible. In this work, we constructed 17 mouse models of different body mass and size based on the realistic four-dimensional MOBY mouse model. Particle (photon, electron and positron) transport using the Monte Carlo method was performed to calculate the absorbed fractions and S-values for eight positron-emitting radionuclides (C-11, N-13, O-15, F-18, Cu-64, Ga-68, Y-86 and I-124). Among these radionuclides, O-15 emits positrons with high energy and frequency and produces the highest self-absorbed S-values in each organ, while Y-86 emits γ-rays with high energy and frequency, which results in the highest cross-absorbed S-values for non-neighbouring organs. Differences between S-values for self-irradiated organs were between 2% and 3% per gram of difference in body weight for most organs. For organs irradiating other organs outside the splanchnocoele (i.e. brain, testis and bladder), differences between S-values were lower than 1% per gram. These appealing results can be used to assess variations in small animal dosimetry as a function of total-body mass. The generated database of S-values for various radionuclides can be used in the assessment of radiation dose to mice from different radiotracers in small animal PET experiments, thus offering quantitative figures for comparative dosimetry research in small animal models.
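The S-value underlying this kind of study follows the MIRD formalism: S(target←source) = Σ_i E_i · Y_i · φ_i / m_target. A minimal sketch is below; the emission data and absorbed fraction in the example are placeholders, not values from the paper.

```python
# Sketch of the MIRD S-value: sum over emissions of (energy per decay ×
# yield × absorbed fraction), divided by target mass. Units: Gy/(Bq·s).
MEV_TO_J = 1.602176634e-13  # exact conversion, MeV -> joules

def s_value(emissions, absorbed_fractions, target_mass_kg):
    """S-value in Gy/(Bq·s).

    emissions: list of (energy_MeV, yield_per_decay) pairs
    absorbed_fractions: fraction of each emission's energy absorbed in target
    """
    e_mev = sum(E * y * phi
                for (E, y), phi in zip(emissions, absorbed_fractions))
    return e_mev * MEV_TO_J / target_mass_kg

# Illustrative placeholder: a single 0.25 MeV mean-energy positron,
# yield 1.0, fully absorbed (phi = 1) in a 0.1 g mouse organ.
print(s_value([(0.25, 1.0)], [1.0], 1e-4))
```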

  11. The Value of the Energy Data Base.

    ERIC Educational Resources Information Center

    King, Donald W.; And Others

    A study was conducted to assess the value of the Energy Data Base (EDB), which is produced by the Technical Information Center (TIC) of the Department of Energy (DOE) in order to provide a means of identifying primary energy information sources, particularly journal articles and technical reports. The volume of energy information distributed to…

  12. Flexibility and Project Value: Interactions and Multiple Real Options

    NASA Astrophysics Data System (ADS)

    Čulík, Miroslav

    2010-06-01

This paper is focused on project valuation with an embedded portfolio of real options, including their interactions. Valuation is based on the Net Present Value criterion, computed by simulation. The portfolio includes selected types of European-type real options: the option to expand, contract, abandon, and temporarily shut down and restart a project. Because in reality most managerial flexibility takes the form of a portfolio of real options, the selected types of options are valued not only individually but also in combination. The paper is structured as follows: first, diffusion models for forecasting output prices and variable costs are derived. Second, the project value is estimated on the assumption that no real options are present. Next, the project value is calculated with the presence of selected European-type options; these options and their impact on project value are valued first in isolation and subsequently in different combinations. Moreover, the evolution of the intrinsic value of the given real options with respect to the time of exercising is analysed. Finally, the results are presented graphically; selected statistics and risk measures (Value at Risk, Expected Shortfall) of the NPV distributions are calculated and commented on.
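The simulation-based valuation described above can be sketched compactly for one of the listed options: Monte Carlo NPV without flexibility versus with an option to temporarily shut down in loss-making years. The GBM price dynamics and every parameter value are illustrative assumptions, not the paper's case data.

```python
# Monte Carlo NPV sketch: a project valued without flexibility (must
# produce every year) and with a shutdown option (skip loss-making
# years). Price follows geometric Brownian motion; all parameters are
# illustrative assumptions.
import math
import random

def simulate_npv(n_paths=10_000, years=5, p0=100.0, drift=0.02, sigma=0.25,
                 unit_cost=95.0, invest=30.0, r=0.05, seed=42):
    """Return (mean NPV without flexibility, mean NPV with shutdown option)."""
    rng = random.Random(seed)
    rigid = flexible = 0.0
    for _ in range(n_paths):
        p, pv_rigid, pv_flex = p0, 0.0, 0.0
        for t in range(1, years + 1):
            # One annual GBM step for the output price.
            p *= math.exp(drift - 0.5 * sigma**2 + sigma * rng.gauss(0, 1))
            margin = p - unit_cost
            d = (1.0 + r) ** -t
            pv_rigid += d * margin            # must produce every year
            pv_flex += d * max(margin, 0.0)   # may shut down at a loss
        rigid += pv_rigid - invest
        flexible += pv_flex - invest
    return rigid / n_paths, flexible / n_paths

npv_rigid, npv_flex = simulate_npv()
print(f"NPV rigid: {npv_rigid:.1f}, with shutdown option: {npv_flex:.1f}")
```

The difference between the two NPVs is the simulated value of the flexibility, mirroring the paper's comparison of the project with and without embedded options.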

  13. G4DARI: Geant4/GATE based Monte Carlo simulation interface for dosimetry calculation in radiotherapy.

    PubMed

    Slimani, Faiçal A A; Hamdi, Mahdjoub; Bentourkia, M'hamed

    2018-05-01

Monte Carlo (MC) simulation is widely recognized as an important technique for studying the physics of particle interactions in nuclear medicine and radiation therapy. Different codes dedicated to dosimetry applications are widely used today in research and in clinical applications, such as MCNP, EGSnrc and Geant4. However, while such codes make the physics easier, the programming remains a tedious task even for physicists familiar with computer programming. In this paper we report the development of a new interface, GEANT4 Dose And Radiation Interactions (G4DARI), based on GEANT4 for absorbed dose calculation and particle tracking in humans, small animals and complex phantoms. The calculation of the absorbed dose is performed based on 3D CT human or animal images in DICOM format, on images of phantoms, or on solid volumes which can be made from any pure or composite material specified by its molecular formula. G4DARI offers menus to the user and tabs to be filled with values or chemical formulas. The interface is described and, as an application, we show results obtained for a lung tumor in a digital mouse irradiated with seven energy beams and for a patient with glioblastoma irradiated with five photon beams. In conclusion, G4DARI can be easily used by any researcher without the need to be familiar with computer programming, and it will be freely available as an application package. Copyright © 2018 Elsevier Ltd. All rights reserved.

  14. 40 CFR 600.113-93 - Fuel economy calculations.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... calculations of the weighted fuel economy values require input of the weighted grams/mile values for total... this section. A sample appears in appendix II to this part. (a) Calculate the weighted grams/mile... the grams/mile values for the highway fuel economy test for HC, CO and CO2, and where applicable CH3...

  15. Recognition of Values-Based Constructs in a Summer Physical Activity Program.

    ERIC Educational Resources Information Center

    Watson, Doris L.; Newton, Maria; Kim, Mi-Sook

    2003-01-01

    Examined the extent to which participants in a summer sports camp embraced values-based constructs, noting the relationship between perceptions of values-based constructs and affect and attitude. Data on ethnically diverse 10-13-year-olds indicated that care for others/goal setting, self-responsibility, and self-control/respect positively related…

  16. Applying ISO 11929:2010 Standard to detection limit calculation in least-squares based multi-nuclide gamma-ray spectrum evaluation

    NASA Astrophysics Data System (ADS)

    Kanisch, G.

    2017-05-01

The concepts of ISO 11929 (2010) are applied to the evaluation of radionuclide activities from more complex multi-nuclide gamma-ray spectra. From net peak areas estimated by peak fitting, activities and their standard uncertainties are calculated by a weighted linear least-squares method with an additional step in which uncertainties of the design matrix elements are taken into account. A numerical treatment of the standard's uncertainty function, based on ISO 11929 Annex C.5, leads to a procedure for deriving decision threshold and detection limit values. The methods shown allow resolving interferences between radionuclide activities, also when calculating detection limits, which can be improved by including more than one gamma line per radionuclide. The common single-nuclide weighted mean is extended to an interference-corrected (generalized) weighted mean, which, combined with the least-squares method, allows faster detection limit calculations. In addition, a new grouped uncertainty budget was inferred, which for each radionuclide gives uncertainty budgets for seven main variables, such as net count rates, peak efficiencies, gamma emission intensities and others; grouping refers to summation over lists of peaks per radionuclide.
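The characteristic limits of ISO 11929 follow a generic scheme: the decision threshold is y* = k(1-α) · ũ(0), and the detection limit solves y# = y* + k(1-β) · ũ(y#), typically by fixed-point iteration. The sketch below implements that scheme; the counting-statistics uncertainty function and its parameters are simple illustrative assumptions, not the paper's spectrum model.

```python
# Sketch of the ISO 11929 characteristic-limit scheme: decision threshold
# y* = k_alpha * u(0); detection limit solved by fixed-point iteration of
# y# = y* + k_beta * u(y#). The uncertainty function u(y) below is a
# simple counting-statistics example, assumed for illustration.
import math

def decision_threshold(u_tilde, k_alpha=1.645):
    """Decision threshold y* for the net quantity."""
    return k_alpha * u_tilde(0.0)

def detection_limit(u_tilde, k_alpha=1.645, k_beta=1.645, tol=1e-10):
    """Detection limit y# via fixed-point iteration."""
    y_star = decision_threshold(u_tilde, k_alpha)
    y = 2.0 * y_star  # starting guess
    for _ in range(200):
        y_new = y_star + k_beta * u_tilde(y)
        if abs(y_new - y) < tol:
            return y_new
        y = y_new
    return y

# Assumed example: net count rate with counting time t and background rate b.
t, b = 1000.0, 0.05  # s, counts/s

def u(y):
    """Standard uncertainty of the net rate as a function of its true value."""
    return math.sqrt(y / t + 2.0 * b / t)

print(decision_threshold(u), detection_limit(u))
```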

  17. Social value and individual choice: The value of a choice-based decision-making process in a collectively funded health system.

    PubMed

    Espinoza, Manuel Antonio; Manca, Andrea; Claxton, Karl; Sculpher, Mark

    2018-02-01

    Evidence about cost-effectiveness is increasingly being used to inform decisions about the funding of new technologies that are usually implemented as guidelines from centralized decision-making bodies. However, there is also an increasing recognition for the role of patients in determining their preferred treatment option. This paper presents a method to estimate the value of implementing a choice-based decision process using the cost-effectiveness analysis toolbox. This value is estimated for 3 alternative scenarios. First, it compares centralized decisions, based on population average cost-effectiveness, against a decision process based on patient choice. Second, it compares centralized decision based on patients' subgroups versus an individual choice-based decision process. Third, it compares a centralized process based on average cost-effectiveness against a choice-based process where patients choose according to a different measure of outcome to that used by the centralized decision maker. The methods are applied to a case study for the management of acute coronary syndrome. It is concluded that implementing a choice-based process of treatment allocation may be an option in collectively funded health systems. However, its value will depend on the specific health problem and the social values considered relevant to the health system. Copyright © 2017 John Wiley & Sons, Ltd.

  18. [Value(s)].

    PubMed

    Vanbelle, G

    2006-01-01

After a short explanation of the word 'value', the (cultural) value of teeth, the economic evaluation of dentistry and the payment criteria are presented. The specific situation of health care as a service that deviates in quite a few aspects from the standard supply-and-demand model is pointed out. Attention is drawn to key characteristics of the liberal professions, such as the obligation to perform to the best of one's ability rather than to deliver a specific result. Function classification appears to offer possibilities for cataloguing the wide variation in practice settings. The well-known wage calculation of Professor De Lembre is reviewed. Subsequently, the analysis of cost variation and of demand induced by extra services and profile shaping is elaborated. A cost-benefit analysis is the concluding item.

  19. Quantifying Physician Teaching Productivity Using Clinical Relative Value Units

    PubMed Central

    Yeh, Michael M; Cahill, Daniel F

    1999-01-01

    OBJECTIVE To design and test a customizable system for calculating physician teaching productivity based on clinical relative value units (RVUs). SETTING/PARTICIPANTS A 550-bed community teaching hospital with 11 part-time faculty general internists. DESIGN Academic year 1997–98 educational activities were analyzed with an RVU-based system using teaching value multipliers (TVMs). The TVM is the ratio of the value of a unit of time spent teaching to the equivalent time spent in clinical practice. We assigned TVMs to teaching tasks based on their educational value and complexity. The RVUs of a teaching activity would be equal to its TVM multiplied by its duration and by the regional median clinical RVU production rate. MEASUREMENTS The faculty members' total annual RVUs for teaching were calculated and compared with the RVUs they would have earned had they spent the same proportion of time in clinical practice. MAIN RESULTS For the same proportion of time, the faculty physicians would have generated 29,806 RVUs through teaching or 27,137 RVUs through clinical practice (Absolute difference = 2,669 RVUs; Relative excess = 9.8%). CONCLUSIONS We describe an easily customizable method of quantifying physician teaching productivity in terms of clinical RVUs. This system allows equitable recognition of physician efforts in both the educational and clinical arenas. PMID:10571707
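The RVU formula described above is directly computable: teaching RVUs = TVM × hours × regional median clinical RVU production rate. The TVM values and hourly rate in the sketch are hypothetical, not the study's figures.

```python
# Sketch of the teaching-RVU formula: RVUs for a teaching activity equal
# its TVM times its duration times the regional median clinical RVU
# production rate. TVMs and the rate below are hypothetical.

def teaching_rvus(tvm, hours, clinical_rvus_per_hour):
    """Clinical-RVU equivalent of a teaching activity."""
    return tvm * hours * clinical_rvus_per_hour

activities = [   # (TVM, annual hours) - hypothetical examples
    (1.5, 100),  # e.g. attending rounds, valued above clinical time
    (1.0, 40),   # e.g. lectures, valued at parity with clinical time
]
rate = 6.0  # assumed regional median clinical RVUs per hour
total = sum(teaching_rvus(tvm, h, rate) for tvm, h in activities)
print(total)
```

The study's comparison then sets this total against the RVUs the same hours would have produced clinically (here, 140 h × 6.0 = 840 RVUs).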

  20. Introducing Value-Based Purchasing into TRICARE Reform

    PubMed Central

    Hosek, Susan D.; Sorbero, Melony E.; Martsolf, Grant; Kandrack, Ryan

    2017-01-01

TRICARE, the health benefits program created for beneficiaries of the U.S. Department of Defense, covers health care provided in military treatment facilities and by civilian providers. Congress is now considering how to update TRICARE, which was first developed in the 1980s drawing on managed care concepts from civilian health plans. This article places TRICARE's current managed care strategy in historical context and describes recent innovations by private insurers and Medicare intended to enhance the value (cost and quality) of the care they purchase for their members. With this movement toward value-based purchasing as background, the authors evaluate two existing proposals for reform and describe an alternative approach that blends the existing proposals. PMID:28845347

  1. Experimental Demonstration of Higher Precision Weak-Value-Based Metrology Using Power Recycling

    NASA Astrophysics Data System (ADS)

    Wang, Yi-Tao; Tang, Jian-Shun; Hu, Gang; Wang, Jian; Yu, Shang; Zhou, Zong-Quan; Cheng, Ze-Di; Xu, Jin-Shi; Fang, Sen-Zhi; Wu, Qing-Lin; Li, Chuan-Feng; Guo, Guang-Can

    2016-12-01

    The weak-value-based metrology is very promising and has attracted a lot of attention in recent years because of its remarkable ability in signal amplification. However, it is suggested that the upper limit of the precision of this metrology cannot exceed that of classical metrology because of the low sample size caused by the probe loss during postselection. Nevertheless, a recent proposal shows that this probe loss can be reduced by the power-recycling technique, and thus enhance the precision of weak-value-based metrology. Here we experimentally realize the power-recycled interferometric weak-value-based beam-deflection measurement and obtain the amplitude of the detected signal and white noise by discrete Fourier transform. Our results show that the detected signal can be strengthened by power recycling, and the power-recycled weak-value-based signal-to-noise ratio can surpass the upper limit of the classical scheme, corresponding to the shot-noise limit. This work sheds light on higher precision metrology and explores the real advantage of the weak-value-based metrology over classical metrology.

  2. Numericware i: Identical by State Matrix Calculator

    PubMed Central

    Kim, Bongsong; Beavis, William D

    2017-01-01

We introduce software, Numericware i, to compute an identical-by-state (IBS) matrix from genotypic data. Calculating an IBS matrix for a large dataset requires substantial computer memory and lengthy processing time. Numericware i addresses these challenges with two algorithmic methods: multithreading and forward chopping. Multithreading allows computational routines to run concurrently on multiple central processing unit (CPU) processors. Forward chopping addresses the memory limitation by dividing a dataset into appropriately sized subsets. Numericware i allows calculation of the IBS matrix for a large genotypic dataset using a laptop or a desktop computer. For comparison with different software, we calculated genetic relationship matrices using Numericware i, SPAGeDi, and TASSEL with the same genotypic dataset. Numericware i calculates IBS coefficients between 0 and 2, whereas SPAGeDi and TASSEL produce different ranges of values, including negative values. The Pearson correlation coefficient between the matrices from Numericware i and TASSEL was high at .9972, whereas SPAGeDi showed low correlation with Numericware i (.0505) and TASSEL (.0587). With a high-dimensional dataset of 500 entities by 10,000,000 SNPs, Numericware i spent 382 minutes using 19 CPU threads and 64 GB of memory by dividing the dataset into 3 pieces, whereas SPAGeDi and TASSEL failed with the same dataset. Numericware i is freely available for Windows and Linux under a CC-BY 4.0 license at https://figshare.com/s/f100f33a8857131eb2db. PMID:28469375
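An IBS coefficient on the 0-2 scale mentioned above can be sketched for genotypes coded as minor-allele counts (0/1/2): per-locus sharing is 2 - |g1 - g2|, averaged over loci. This reproduces the 0-2 range the abstract describes; whether Numericware i uses exactly this formula is an assumption here.

```python
# Sketch of an identical-by-state (IBS) coefficient on the 0..2 scale for
# genotypes coded 0/1/2 (minor-allele counts). The per-locus formula
# 2 - |g1 - g2|, averaged over loci, is an assumed illustration of the
# 0-2 coefficients the tool reports.

def ibs(g1, g2):
    """Mean identical-by-state sharing between two genotype vectors."""
    assert len(g1) == len(g2), "genotype vectors must have equal length"
    return sum(2 - abs(a - b) for a, b in zip(g1, g2)) / len(g1)

def ibs_matrix(genotypes):
    """Symmetric IBS matrix for a list of genotype vectors."""
    n = len(genotypes)
    return [[ibs(genotypes[i], genotypes[j]) for j in range(n)]
            for i in range(n)]

M = ibs_matrix([[0, 1, 2, 1], [0, 1, 2, 2], [2, 1, 0, 0]])
print(M)
```

Multithreading and forward chopping, as described, would parallelize the outer loop and split the SNP dimension into chunks whose per-chunk sums are accumulated.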

  3. Comparison of Polar Cap (PC) index calculations.

    NASA Astrophysics Data System (ADS)

    Stauning, P.

    2012-04-01

    The Polar Cap (PC) index introduced by Troshichev and Andrezen (1985) is derived from polar magnetic variations and is mainly a measure of the intensity of the transpolar ionospheric currents. These currents relate to the polar cap antisunward ionospheric plasma convection driven by the dawn-dusk electric field, which in turn is generated by the interaction of the solar wind with the Earth's magnetosphere. Coefficients to calculate PCN and PCS index values from polar magnetic variations recorded at Thule and Vostok, respectively, have been derived by several different procedures in the past. The first published set of coefficients for Thule was derived by Vennerstrøm (1991) and is still in use for calculations of PCN index values by DTU Space. Errors in the program used to calculate index values were corrected in 1999 and again in 2001. In 2005 DMI adopted a unified procedure proposed by Troshichev for calculations of the PCN index. Thus there exist four different series of PCN index values. Similarly, at AARI three different sets of coefficients have been used to calculate PCS indices in the past. The presentation discusses the principal differences between the various PC index procedures and provides comparisons between index values derived from the same magnetic data sets using the different procedures. Examples from published papers are examined to illustrate the differences.

  4. allantools: Allan deviation calculation

    NASA Astrophysics Data System (ADS)

    Wallin, Anders E. E.; Price, Danny C.; Carson, Cantwell G.; Meynadier, Frédéric

    2018-04-01

    allantools calculates the Allan deviation and related time and frequency statistics. The library is written in Python and has a GPL v3+ license. It takes evenly spaced input observations of either fractional frequency, or phase in seconds. Deviations are calculated for given tau values in seconds. Several noise generators for creating synthetic datasets are also included.
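    For illustration, a minimal non-overlapping Allan deviation over fractional-frequency data can be written as below. allantools itself provides more estimators (overlapping, modified, Hadamard, etc.); this sketch is not its actual API.

```python
import numpy as np

def adev(y, rate=1.0, m=1):
    """Non-overlapping Allan deviation of fractional-frequency data y,
    sampled at `rate` Hz, at averaging factor m (tau = m / rate)."""
    y = np.asarray(y, dtype=float)
    n = len(y) // m
    ybar = y[: n * m].reshape(n, m).mean(axis=1)   # averages over each tau interval
    d = np.diff(ybar)                              # first differences of the averages
    return float(np.sqrt(0.5 * np.mean(d ** 2)))   # sigma_y(tau)
```

A perfectly stable input gives zero deviation, while alternating frequency averages give the maximum difference between adjacent intervals.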

  5. Activity-based costing: a practical model for cost calculation in radiotherapy.

    PubMed

    Lievens, Yolande; van den Bogaert, Walter; Kesteloot, Katrien

    2003-10-01

    The activity-based costing method was used to compute radiotherapy costs. This report describes the model developed, the calculated costs, and possible applications for the Leuven radiotherapy department. Activity-based costing is an advanced cost calculation technique that allocates resource costs to products based on activity consumption. In the Leuven model, a complex allocation principle with a large diversity of cost drivers was avoided by introducing an extra allocation step between activity groups and activities. A straightforward principle of time consumption, weighted by factors of treatment complexity, was used. The model was developed in an iterative way, progressively defining the constituting components (costs, activities, products, and cost drivers). Radiotherapy costs are predominantly determined by personnel and equipment costs. Treatment-related activities consume the greatest proportion of the resource costs, with treatment delivery the most important component. As a result, products with a prolonged total or daily treatment time are the most costly. The model was also used to illustrate the impact of changes in resource costs and in practice patterns. The presented activity-based costing model is a practical tool to evaluate the actual cost structure of a radiotherapy department and to evaluate possible resource or practice changes.
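    The two-step allocation idea (resources to activities, then activities to products by time consumption weighted for complexity) can be sketched as follows. All figures, category names, and weights are invented for illustration; they are not the Leuven model's values.

```python
# Illustrative resource costs (per year) and resource shares per activity
resource_cost = {"personnel": 600000.0, "equipment": 400000.0}
activity_share = {
    "simulation":         {"personnel": 0.2, "equipment": 0.3},
    "treatment_delivery": {"personnel": 0.8, "equipment": 0.7},
}
# Minutes each product consumes per activity, plus a complexity weight
minutes = {"standard_course": {"simulation": 60,  "treatment_delivery": 300},
           "complex_course":  {"simulation": 120, "treatment_delivery": 900}}
complexity = {"standard_course": 1.0, "complex_course": 1.3}

# Step 1: allocate resource costs to activities
activity_cost = {a: sum(resource_cost[r] * s for r, s in shares.items())
                 for a, shares in activity_share.items()}
# Step 2: cost rate per activity-minute, then product cost by time consumed
total_minutes = {a: sum(p[a] for p in minutes.values()) for a in activity_cost}
rate = {a: activity_cost[a] / total_minutes[a] for a in activity_cost}
product_cost = {p: complexity[p] * sum(rate[a] * t for a, t in acts.items())
                for p, acts in minutes.items()}
```

With these numbers the course with the longer treatment time comes out most costly, mirroring the finding in the abstract.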

  6. CODATA recommended values of the fundamental constants

    NASA Astrophysics Data System (ADS)

    Mohr, Peter J.; Taylor, Barry N.

    2000-11-01

    A review is given of the latest Committee on Data for Science and Technology (CODATA) adjustment of the values of the fundamental constants. The new set of constants, referred to as the 1998 values, replaces the values recommended for international use by CODATA in 1986. The values of the constants, and particularly the Rydberg constant, are of relevance to the calculation of precise atomic spectra. The standard uncertainty (estimated standard deviation) of the new recommended value of the Rydberg constant, which is based on precision frequency metrology and a detailed analysis of the theory, is approximately 1/160 times the uncertainty of the 1986 value. The new set of recommended values as well as a searchable bibliographic database that gives citations to the relevant literature is available on the World Wide Web at physics.nist.gov/constants and physics.nist.gov/constantsbib, respectively.

  7. TH-C-BRD-06: A Novel MRI Based CT Artifact Correction Method for Improving Proton Range Calculation in the Presence of Severe CT Artifacts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, P; Schreibmann, E; Fox, T

    2014-06-15

    Purpose: Severe CT artifacts can impair our ability to accurately calculate proton range, thereby resulting in a clinically unacceptable treatment plan. In this work, we investigated a novel CT artifact correction method based on a coregistered MRI and investigated its ability to estimate CT HU and proton range in the presence of severe CT artifacts. Methods: The proposed method corrects corrupted CT data using a coregistered MRI to guide the mapping of CT values from a nearby artifact-free region. First, patient MRI and CT images were registered using 3D deformable image registration software based on B-spline and mutual information. The CT slice with severe artifacts was selected, as well as a nearby slice free of artifacts (e.g. 1 cm away from the artifact). The two sets of paired MRI and CT images at different slice locations were further registered by applying 2D deformable image registration. Based on the artifact-free paired MRI and CT images, a comprehensive geospatial analysis was performed to predict the correct CT HU of the CT image with severe artifacts. As a proof of concept, a known artifact was introduced that changed the ground-truth CT HU value by up to 30% and introduced up to 5 cm of error in proton range. The ability of the proposed method to recover the ground truth was quantified using a selected head and neck case. Results: A significant improvement in image quality was observed visually. Our proof-of-concept study showed that 90% of the area that had 30% errors in CT HU was corrected to within 3% of its ground-truth value. Furthermore, the maximum proton range error of up to 5 cm was reduced to a 4 mm error. Conclusion: The MRI-based CT artifact correction method can improve CT image quality and proton range calculation for patients with severe CT artifacts.

  8. Technical Note: On the calculation of stopping-power ratio for stoichiometric calibration in proton therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ödén, Jakob; Zimmerman, Jens; Nowik, Patrik

    2015-09-15

    Purpose: The quantitative effects of assumptions made in the calculation of stopping-power ratios (SPRs) are investigated for stoichiometric CT calibration in proton therapy. The assumptions investigated include the use of the Bethe formula without correction terms, Bragg additivity, the choice of I-value for water, and the data source for elemental I-values. Methods: The predictions of the Bethe formula for SPR (no correction terms) were validated against more sophisticated calculations using the SRIM software package for 72 human tissues. A stoichiometric calibration was then performed at our hospital. SPR was calculated for the human tissues using either the assumption of simple Bragg additivity or the Seltzer-Berger rule (as used in ICRU Reports 37 and 49). In each case, the calculation was performed twice: first, by assuming the I-value of water was an experimentally based value of 78 eV (the value proposed in the Errata and Addenda for ICRU Report 73), and second, by recalculating the I-value theoretically. The discrepancy between predictions using ICRU elemental I-values and the commonly used tables of Janni was also investigated. Results: Errors due to neglecting the correction terms to the Bethe formula were calculated at less than 0.1% for biological tissues. Discrepancies greater than 1%, however, were estimated due to departures from simple Bragg additivity when a fixed I-value for water was imposed. When the I-value for water was calculated in a manner consistent with that for tissue, this disagreement was substantially reduced. The difference between SPR predictions when using Janni's or ICRU tables for I-values was up to 1.6%. Experimental data used for materials of relevance to proton therapy suggest that the ICRU-derived values provide somewhat more accurate results (root-mean-square error: 0.8% versus 1.6%). Conclusions: The conclusions from this study are that (1) the Bethe formula can be safely used for SPR calculations without correction terms
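    A hedged sketch of an SPR calculation from the uncorrected Bethe formula, using the 78 eV I-value for water mentioned in the abstract, is shown below. The proton energy, the function names, and the relative electron density argument are illustrative assumptions, not values taken from the paper; no shell, density, or Barkas corrections are included.

```python
import math

MEC2 = 0.511e6  # electron rest energy, eV

def bethe_log(beta2, I_eV):
    # logarithmic term of the Bethe formula, no correction terms:
    # ln(2 m_e c^2 beta^2 / (I (1 - beta^2))) - beta^2
    return math.log(2 * MEC2 * beta2 / (I_eV * (1 - beta2))) - beta2

def spr(rho_e_rel, I_med_eV, I_w_eV=78.0, E_MeV=150.0):
    """Stopping-power ratio of a medium to water for a proton of kinetic
    energy E_MeV, given the medium's electron density relative to water.
    Illustrative only: uncorrected Bethe formula."""
    mp = 938.272  # proton rest energy, MeV
    gamma = 1.0 + E_MeV / mp
    beta2 = 1.0 - 1.0 / gamma ** 2
    return rho_e_rel * bethe_log(beta2, I_med_eV) / bethe_log(beta2, I_w_eV)
```

A medium with water's electron density and I-value yields an SPR of exactly 1, and a lower I-value (or higher electron density) raises the SPR, as the Bethe formula implies.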

  9. Professional Values Competency Evaluation for Students Enrolled in a Concept-Based Curriculum.

    PubMed

    Elliott, Annette M

    2017-01-01

    Although many nursing programs have transitioned toward the use of concept-based curricula, the evaluation of student learning associated with the curricular approach has been limited. An evaluation of student learning related to professional values for programs offering concept-based curricula was not evident in the literature. The purpose was to determine how a course competency related to professional values was addressed by nursing students studying in a concept-based nursing curriculum. The qualitative methodology of framework analysis was used to evaluate written assignments (N = 75). The core concept appreciation for professional values and the core concept disillusionment with unprofessional behaviors were identified in students' written reflections. The core concept of appreciation for professional values contributes to an evidence base of contemporary professional values identified in nursing. The core concept of disillusionment with unprofessional behaviors can inform curricular planning and research on how to advocate for professional behaviors. [J Nurs Educ. 2017;56(1):12-21.]. Copyright 2017, SLACK Incorporated.

  10. Target virus log10 reduction values determined for two reclaimed wastewater irrigation scenarios in Japan based on tolerable annual disease burden.

    PubMed

    Ito, Toshihiro; Kitajima, Masaaki; Kato, Tsuyoshi; Ishii, Satoshi; Segawa, Takahiro; Okabe, Satoshi; Sano, Daisuke

    2017-11-15

    Multiple barriers are widely employed for managing microbial risks in water reuse, in which different types of wastewater treatment units (biological treatment, disinfection, etc.) and health protection measures (use of personal protective gear, vegetable washing, etc.) are combined to achieve a performance target value of log10 reduction (LR) of viruses. The LR virus target value needs to be calculated based on the data obtained from monitoring the viruses of concern and the water reuse scheme in the context of the countries/regions where water reuse is implemented. In this study, we calculated the virus LR target values under two exposure scenarios for reclaimed wastewater irrigation in Japan, using the concentrations of indigenous viruses in untreated wastewater and a defined tolerable annual disease burden (10^-4 or 10^-6 disability-adjusted life years per person per year (DALY pppy)). Three genogroups of norovirus (norovirus genogroup I (NoV GI), genogroup II (NoV GII), and genogroup IV (NoV GIV)) in untreated wastewater were quantified as model viruses using reverse transcription-microfluidic quantitative PCR, and only NoV GII was present in quantifiable concentration. The probabilistic distribution of NoV GII concentration in untreated wastewater was then estimated from its concentration dataset, and used to calculate the LR target values of NoV GII for wastewater treatment. When accidental ingestion of reclaimed wastewater by Japanese farmers was assumed, the NoV GII LR target values corresponding to the tolerable annual disease burden of 10^-6 DALY pppy were 3.2, 4.4, and 5.7 at the 95, 99, and 99.9%tile, respectively. These percentile values, defined as "reliability," represent the cumulative probability that the NoV GII concentration distribution in untreated wastewater stays below the corresponding tolerable annual disease burden after wastewater reclamation. An approximate 1-log10 difference of LR target values was observed between 10^-4 and 10^-6 DALY pppy
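    The percentile-based LR target calculation can be sketched as below. The lognormal concentration parameters and the tolerable concentration are invented placeholders: deriving them properly requires the dose-response and DALY models used in the study, so this only illustrates how higher-percentile "reliability" demands a larger log reduction.

```python
import math
import random

random.seed(1)
# Assumed lognormal NoV GII concentration in untreated wastewater (copies/L);
# mu and sigma are illustrative, not fitted values from the study.
mu, sigma = math.log(1e6), 1.0
# Assumed maximum concentration after treatment that meets the tolerable
# disease burden (placeholder; the study derives this from QMRA).
c_tolerable = 1e2

samples = sorted(random.lognormvariate(mu, sigma) for _ in range(100_000))

def lr_target(percentile):
    """LR needed so the given percentile of raw concentrations is treated
    down to the tolerable level."""
    c = samples[int(len(samples) * percentile / 100) - 1]
    return max(0.0, math.log10(c / c_tolerable))
```

As in the abstract, demanding 99% rather than 95% reliability pushes the LR target up, because the treatment must cope with a rarer, higher raw concentration.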

  11. Fast CPU-based Monte Carlo simulation for radiotherapy dose calculation.

    PubMed

    Ziegenhein, Peter; Pirner, Sven; Ph Kamerling, Cornelis; Oelfke, Uwe

    2015-08-07

    Monte Carlo (MC) simulations are considered to be the most accurate method for calculating dose distributions in radiotherapy. Their clinical application, however, is still limited by the long runtimes that conventional implementations of MC algorithms require to deliver sufficiently accurate results on high-resolution imaging data. In order to overcome this obstacle we developed the software package PhiMC, which is capable of computing precise dose distributions in a sub-minute time frame by leveraging the potential of modern many- and multi-core CPU-based computers. PhiMC is based on the well-verified dose planning method (DPM). We could demonstrate that PhiMC delivers dose distributions which are in excellent agreement with DPM. The multi-core implementation of PhiMC scales well across different computer architectures and achieves a speed-up of up to 37× compared to the original DPM code executed on a modern system. Furthermore, we could show that our CPU-based implementation on a modern workstation is between 1.25× and 1.95× faster than a well-known GPU implementation of the same simulation method on an NVIDIA Tesla C2050. Since CPUs can work with several hundred GB of RAM, the typical GPU memory limitation does not apply to our implementation, and high-resolution clinical plans can be calculated.

  12. Research implications of science-informed, value-based decision making.

    PubMed

    Dowie, Jack

    2004-01-01

    In 'Hard' science, scientists correctly operate as the 'guardians of certainty', using hypothesis testing formulations and value judgements about error rates and time discounting that make classical inferential methods appropriate. But these methods can neither generate most of the inputs needed by decision makers in their time frame, nor generate them in a form that allows them to be integrated into the decision in an analytically coherent and transparent way. The need for transparent accountability in public decision making under uncertainty and value conflict means the analytical coherence provided by the stochastic Bayesian decision analytic approach, drawing on the outputs of Bayesian science, is needed. If scientific researchers are to play the role they should be playing in informing value-based decision making, they need to see themselves also as 'guardians of uncertainty', ensuring that the best possible current posterior distributions on relevant parameters are made available for decision making, irrespective of the state of the certainty-seeking research. The paper distinguishes the actors employing different technologies in terms of the focus of the technology (knowledge, values, choice); the 'home base' mode of their activity on the cognitive continuum of varying analysis-to-intuition ratios; and the underlying value judgements of the activity (especially error loss functions and time discount rates). Those who propose any principle of decision making other than the banal 'Best Principle', including the 'Precautionary Principle', are properly interpreted as advocates seeking to have their own value judgements and preferences regarding mode location apply. The task for accountable decision makers, and their supporting technologists, is to determine the best course of action under the universal conditions of uncertainty and value difference/conflict.

  13. Quantification of confounding factors in MRI-based dose calculations as applied to prostate IMRT

    NASA Astrophysics Data System (ADS)

    Maspero, Matteo; Seevinck, Peter R.; Schubert, Gerald; Hoesl, Michaela A. U.; van Asselen, Bram; Viergever, Max A.; Lagendijk, Jan J. W.; Meijer, Gert J.; van den Berg, Cornelis A. T.

    2017-02-01

    Magnetic resonance (MR)-only radiotherapy treatment planning requires pseudo-CT (pCT) images to enable MR-based dose calculations. To verify the accuracy of MR-based dose calculations, institutions interested in introducing MR-only planning will have to compare pCT-based and computed tomography (CT)-based dose calculations. However, interpreting such comparison studies may be challenging, since potential differences arise from a range of confounding factors which are not necessarily specific to MR-only planning. Therefore, the aim of this study is to identify and quantify the contribution of factors confounding dosimetric accuracy estimation in comparison studies between CT and pCT. The following factors were distinguished: set-up and positioning differences between imaging sessions, MR-related geometric inaccuracy, pCT generation, use of specific calibration curves to convert pCT into electron density information, and registration errors. The study comprised fourteen prostate cancer patients who underwent CT/MRI-based treatment planning. To enable pCT generation, a commercial solution (MRCAT, Philips Healthcare, Vantaa, Finland) was adopted. IMRT plans were calculated on CT (gold standard) and pCTs. Dose difference maps in a high dose region (CTV) and in the body volume were evaluated, and the contribution to dose errors of the possible confounding factors was individually quantified. We found that the largest confounding factor leading to dose difference was the use of different calibration curves to convert pCT and CT into electron density (0.7%). The second largest factor was the pCT generation, which stratified the pCT into a fixed number of tissue classes (0.16%). Inter-scan differences due to patient repositioning, MR-related geometric inaccuracy, and registration errors did not significantly contribute to dose differences (0.01%). The proposed approach successfully identified and quantified the factors confounding accurate MRI-based dose calculation in

  14. Reward-based training of recurrent neural networks for cognitive and value-based tasks

    PubMed Central

    Song, H Francis; Yang, Guangyu R; Wang, Xiao-Jing

    2017-01-01

    Trained neural network models, which exhibit features of neural activity recorded from behaving animals, may provide insights into the circuit mechanisms of cognitive functions through systematic analysis of network activity and connectivity. However, in contrast to the graded error signals commonly used to train networks through supervised learning, animals learn from reward feedback on definite actions through reinforcement learning. Reward maximization is particularly relevant when optimal behavior depends on an animal’s internal judgment of confidence or subjective preferences. Here, we implement reward-based training of recurrent neural networks in which a value network guides learning by using the activity of the decision network to predict future reward. We show that such models capture behavioral and electrophysiological findings from well-known experimental paradigms. Our work provides a unified framework for investigating diverse cognitive and value-based computations, and predicts a role for value representation that is essential for learning, but not executing, a task. DOI: http://dx.doi.org/10.7554/eLife.21492.001 PMID:28084991

  15. Moving healthcare quality forward with nursing-sensitive value-based purchasing.

    PubMed

    Kavanagh, Kevin T; Cimiotti, Jeannie P; Abusalem, Said; Coty, Mary-Beth

    2012-12-01

    To underscore the need for health system reform and emphasize nursing measures as a key component in our healthcare reimbursement system. Nursing-sensitive value-based purchasing (NSVBP) has been proposed as an initiative that would help to promote optimal staffing and practice environment through financial rewards and transparency of structure, process, and patient outcome measures. This article reviews the medical, governmental, institutional, and lay literature regarding the necessity for, method of implementation of, and potential impact of NSVBP. Research has shown that adverse events and mortality are highly dependent on nurse staffing levels and skill mix. The National Database of Nursing Quality Indicators (NDNQI), along with other well-developed indicators, can be used as nursing-sensitive measurements for value-based purchasing initiatives. Nursing-sensitive measures are an important component of value-based purchasing. Value-based purchasing is in its infancy. Devising an effective system that recognizes and incorporates nursing measures will facilitate the success of this initiative. NSVBP needs to be designed and incentivized to decrease adverse events, hospital stays, and readmission rates, thereby decreasing societal healthcare costs. NSVBP has the potential for improving the quality of nursing care by financially motivating hospitals to have an optimal nurse practice environment capable of producing optimal patient outcomes by aligning cost effectiveness for hospitals to that of the patient and society. © 2012 Sigma Theta Tau International.

  16. Moving Healthcare Quality Forward With Nursing-Sensitive Value-Based Purchasing

    PubMed Central

    Kavanagh, Kevin T; Cimiotti, Jeannie P; Abusalem, Said; Coty, Mary-Beth

    2012-01-01

    Purpose: To underscore the need for health system reform and emphasize nursing measures as a key component in our healthcare reimbursement system. Design and Methods: Nursing-sensitive value-based purchasing (NSVBP) has been proposed as an initiative that would help to promote optimal staffing and practice environment through financial rewards and transparency of structure, process, and patient outcome measures. This article reviews the medical, governmental, institutional, and lay literature regarding the necessity for, method of implementation of, and potential impact of NSVBP. Findings: Research has shown that adverse events and mortality are highly dependent on nurse staffing levels and skill mix. The National Database of Nursing Quality Indicators (NDNQI), along with other well-developed indicators, can be used as nursing-sensitive measurements for value-based purchasing initiatives. Nursing-sensitive measures are an important component of value-based purchasing. Conclusions: Value-based purchasing is in its infancy. Devising an effective system that recognizes and incorporates nursing measures will facilitate the success of this initiative. NSVBP needs to be designed and incentivized to decrease adverse events, hospital stays, and readmission rates, thereby decreasing societal healthcare costs. Clinical Relevance: NSVBP has the potential for improving the quality of nursing care by financially motivating hospitals to have an optimal nurse practice environment capable of producing optimal patient outcomes by aligning cost effectiveness for hospitals to that of the patient and society. PMID:23066956

  17. QED Based Calculation of the Fine Structure Constant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lestone, John Paul

    2016-10-13

    Quantum electrodynamics is complex and its associated mathematics can appear overwhelming for those not trained in this field. Here, semi-classical approaches are used to obtain a more intuitive feel for what causes electrostatics, and the anomalous magnetic moment of the electron. These intuitive arguments lead to a possible answer to the question of the nature of charge. Virtual photons, with a reduced wavelength of λ, are assumed to interact with isolated electrons with a cross section of πλ². This interaction is assumed to generate time-reversed virtual photons that are capable of seeking out and interacting with other electrons. This exchange of virtual photons between particles is assumed to generate and define the strength of electromagnetism. With the inclusion of near-field effects the model presented here gives a fine structure constant of ~1/137 and an anomalous magnetic moment of the electron of ~0.00116. These calculations support the possibility that near-field corrections are the key to understanding the numerical value of the dimensionless fine structure constant.

  18. The Questionable Economic Case for Value-Based Drug Pricing in Market Health Systems.

    PubMed

    Pauly, Mark V

    2017-02-01

    This article investigates the economic theory and interpretation of the concept of "value-based pricing" for new breakthrough drugs with no close substitutes in a context (such as the United States) in which a drug firm with market power sells its product to various buyers. The interpretation is different from that in a country that evaluates medicines for a single public health insurance plan or a set of heavily regulated plans. It is shown that there will not ordinarily be a single value-based price but rather a schedule of prices with different volumes of buyers at each price. Hence, it is incorrect to term a particular price the value-based price, or to argue that the profit-maximizing monopoly price is too high relative to some hypothesized value-based price. When effectiveness of treatment or value of health is heterogeneous, the profit-maximizing price can be higher than that associated with assumed values of quality-adjusted life-years. If the firm sets a price higher than the value-based price for a set of potential buyers, the optimal strategy of the buyers is to decline to purchase that drug. The profit-maximizing price will come closer to a unique value-based price if demand is less heterogeneous. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  19. Precision phase estimation based on weak-value amplification

    NASA Astrophysics Data System (ADS)

    Qiu, Xiaodong; Xie, Linguo; Liu, Xiong; Luo, Lan; Li, Zhaoxue; Zhang, Zhiyou; Du, Jinglei

    2017-02-01

    In this letter, we propose a precision method for phase estimation based on the weak-value amplification (WVA) technique using a monochromatic light source. The anomalous WVA significantly suppresses the technical noise with respect to the intensity-difference signal induced by the phase delay when the post-selection procedure comes into play. The phase-measurement precision of this method is proportional to the weak value of a polarization operator in the experimental range. Our results compete well with wide-spectrum-light phase weak measurements and outperform the standard homodyne phase-detection technique.

  20. Improvements of the Ray-Tracing Based Method Calculating Hypocentral Loci for Earthquake Location

    NASA Astrophysics Data System (ADS)

    Zhao, A. H.

    2014-12-01

    Hypocentral loci are very useful for reliable and visual earthquake location. However, they can hardly be expressed analytically when the velocity model is complex. One method for calculating them numerically is based on a minimum traveltime tree algorithm for tracing rays: a focal locus is represented in terms of ray paths in its residual field from the minimum point (namely, the initial point) to low-residual points (referred to as reference points of the focal locus). The method has no restrictions on the complexity of the velocity model but cannot correctly handle multi-segment loci. Additionally, it is rather laborious to set calculation parameters that yield loci with satisfying completeness and fineness. In this study, we improve the ray-tracing based numerical method to overcome these shortcomings. (1) Reference points of a hypocentral locus are selected from nodes of the model cells that it passes through, by means of a so-called peeling method. (2) The calculation domain of a hypocentral locus is defined as a low-residual area whose connected regions each include one segment of the locus; all the focal locus segments are then calculated separately with the minimum traveltime tree algorithm for tracing rays, by repeatedly assigning the minimum-residual reference point among those not yet traced as an initial point. (3) Short ray paths without branching are removed to make the calculated locus finer. Numerical tests show that the improved method is capable of efficiently calculating complete and fine hypocentral loci of earthquakes in a complex model.
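    The core primitive, a minimum traveltime tree, can be sketched as a Dijkstra-style expansion on a gridded slowness model. This is an illustrative reconstruction of the idea, not the authors' implementation; the 4-neighbour graph and the edge-cost rule (average slowness of the two nodes) are simplifying assumptions.

```python
import heapq

def traveltime_tree(grid_slowness, src):
    """Minimum traveltimes from src to every node of a 2-D slowness grid,
    via Dijkstra expansion over 4-neighbour edges (illustrative sketch)."""
    rows, cols = len(grid_slowness), len(grid_slowness[0])
    INF = float("inf")
    t = [[INF] * cols for _ in range(rows)]
    t[src[0]][src[1]] = 0.0
    pq = [(0.0, src)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if d > t[r][c]:
            continue                      # stale queue entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                # edge traveltime: unit spacing times mean slowness of the nodes
                nd = d + 0.5 * (grid_slowness[r][c] + grid_slowness[nr][nc])
                if nd < t[nr][nc]:
                    t[nr][nc] = nd
                    heapq.heappush(pq, (nd, (nr, nc)))
    return t
```

Rooting such a tree at successive low-residual reference points, as the improved method does, traces each locus segment in turn.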

  1. 40 CFR 600.113-88 - Fuel economy calculations.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... grams/mile values for HC, CO and CO2 for both the city fuel economy test and the highway fuel economy...) Calculate the weighted grams/mile values for the city fuel economy test for HC, CO, and CO2 as specified in... paragraph (c) of this section. (2) Calculate the grams/mile values for the highway fuel economy test for HC...

  2. Polarizability calculations on water, hydrogen, oxygen, and carbon dioxide

    NASA Technical Reports Server (NTRS)

    Nir, S.; Adams, S.; Rein, R.

    1973-01-01

    A semiclassical model of damped oscillators is used as a basis for the calculation of the dispersion of the refractive index, polarizability, and dielectric permeability in water, hydrogen, and oxygen in liquid and gaseous states, and in gaseous carbon dioxide. The absorption coefficient and the imaginary part of the refractive index are also calculated at corresponding wavelengths. A good agreement is obtained between the observed and calculated values of refractive indices, and between those of absorption coefficients in the region of absorption bands. The calculated values of oscillator strengths and damping factors are also discussed. The value of the polarizability of liquid water was about 2.8 times that of previous calculations.
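    The damped-oscillator (Lorentz) model underlying such calculations can be sketched as below. The oscillator strengths, resonance frequencies, and damping factors are free parameters here, not the fitted values from the paper; frequencies are in arbitrary consistent units.

```python
import cmath

def refractive_index(w, oscillators, wp):
    """Complex refractive index from a sum of damped Lorentz oscillators:
    eps(w) = 1 + sum_k f_k * wp^2 / (wk^2 - w^2 - i*g_k*w), n = sqrt(eps).

    oscillators: list of (f_k, wk, g_k) tuples; wp: plasma frequency.
    The real part gives dispersion, the imaginary part absorption."""
    eps = 1 + sum(f * wp ** 2 / (wk ** 2 - w ** 2 - 1j * g * w)
                  for f, wk, g in oscillators)
    return cmath.sqrt(eps)
```

Far below resonance with no damping the index is purely real and above 1; switching on damping produces a positive imaginary part, i.e. absorption, which is how the model yields the absorption coefficients discussed in the abstract.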

  3. A hypersonic aeroheating calculation method based on inviscid outer edge of boundary layer parameters

    NASA Astrophysics Data System (ADS)

    Meng, ZhuXuan; Fan, Hu; Peng, Ke; Zhang, WeiHua; Yang, HuiXin

    2016-12-01

    This article presents a rapid and accurate aeroheating calculation method for hypersonic vehicles. The main innovation is combining the accuracy of a numerical method with the efficiency of an engineering method, which makes aeroheating simulation both more precise and faster. Based on Prandtl boundary layer theory, the entire flow field is divided at the outer edge of the boundary layer into inviscid and viscid flow. The parameters at the outer edge of the boundary layer are calculated numerically by assuming inviscid flow. The thermodynamic parameters of constant-volume specific heat, constant-pressure specific heat, and the specific heat ratio are calculated, the streamlines on the vehicle surface are derived, and the heat flux is then obtained. The results for the double cone show that, at 0° and 10° angles of attack, the aeroheating calculation method based on inviscid outer-edge-of-boundary-layer parameters reproduces the experimental data better than the engineering method. The proposed method also reproduces viscid numerical results well for the flight vehicle. Hence, this method provides a promising way to overcome the high cost of numerical calculation while improving precision.
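    As an example of the "engineering method" class the article compares against, a classical stagnation-point heating correlation of the Sutton-Graves form can be sketched. The constant shown is the commonly cited Earth-atmosphere value and is an assumption here, not taken from the article.

```python
import math

def stagnation_heat_flux(rho, v, rn, k=1.7415e-4):
    """Sutton-Graves-type correlation: q = k * sqrt(rho / rn) * v**3.

    rho: freestream density [kg/m^3], v: velocity [m/s], rn: nose radius [m].
    k is the commonly quoted Earth-air constant (assumed here), for which
    the correlation is usually stated to give q in W/cm^2."""
    return k * math.sqrt(rho / rn) * v ** 3
```

The cubic velocity dependence and square-root density dependence make the scaling behaviour easy to check even without trusting the absolute units.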

  4. Application of the resource-based relative value scale system to pediatrics.

    PubMed

    Gerstle, Robert S; Molteni, Richard A; Andreae, Margie C; Bradley, Joel F; Brewer, Eileen D; Calabrese, Jamie; Krug, Steven E; Liechty, Edward A; Linzer, Jeffrey F; Pillsbury, Julia M; Tuli, Sanjeev Y

    2014-06-01

    The majority of public and private payers in the United States currently use the Medicare Resource-Based Relative Value Scale as the basis for physician payment. Many large group and academic practices have adopted this objective system of physician work to benchmark physician productivity, including using it, wholly or in part, to determine compensation. The Resource-Based Relative Value Scale survey instrument, used to value physician services, was designed primarily for procedural services, leading to current concerns that American Medical Association/Specialty Society Relative Value Scale Update Committee (RUC) surveys may undervalue nonprocedural evaluation and management services. The American Academy of Pediatrics is represented on the RUC, the committee charged with maintaining accurate physician work values across specialties and age groups. The Academy, working closely with other primary care and subspecialty societies, actively pursues a balanced RUC membership and a survey instrument that will ensure appropriate work relative value unit assignments, thereby allowing pediatricians to receive appropriate payment for their services relative to other services.

  5. Value-Based Standards Guide Sexism Inferences for Self and Others.

    PubMed

    Mitamura, Chelsea; Erickson, Lynnsey; Devine, Patricia G

    2017-09-01

    People often disagree about what constitutes sexism, and these disagreements can be both socially and legally consequential. It is unclear, however, why or how people come to different conclusions about whether something or someone is sexist. Previous research on judgments about sexism has focused on the perceiver's gender and attitudes, but neither of these variables identifies comparative standards that people use to determine whether any given behavior (or person) is sexist. Extending Devine and colleagues' values framework (Devine, Monteith, Zuwerink, & Elliot, 1991; Plant & Devine, 1998), we argue that, when evaluating others' behavior, perceivers rely on the morally-prescriptive values that guide their own behavior toward women. In a series of 3 studies we demonstrate that (1) people's personal standards for sexism in their own and others' behavior are each related to their values regarding sexism, (2) these values predict how much behavioral evidence people need to infer sexism, and (3) people with stringent, but not lenient, value-based standards get angry and try to regulate a sexist perpetrator's behavior to reduce sexism. Furthermore, these personal values are related to all outcomes in the present work above and beyond other person characteristics previously used to predict sexism inferences. We discuss the implications of differing value-based standards for explaining and reconciling disputes over what constitutes sexist behavior.

  6. Interactive value-based curriculum: a pilot study.

    PubMed

    Bowman Peterson, Jill M; Duffy, Briar; Duran, Alisa; Gladding, Sophia P

    2018-03-06

    Current health care costs are unsustainable, with a large percentage of waste attributed to doctor practices. Medical educators are developing curricula to address value-based care (VBC) in education. There is, however, a paucity of curricula and assessments addressing levels higher than 'knows' at the base of Miller's pyramid of assessment. Our objective was to: (1) teach residents the principles of VBC using active learning strategies; and (2) develop and pilot a tool to assess residents' ability to apply principles of VBC at the higher level of 'knows how' on Miller's pyramid. Residents in medicine, medicine-paediatrics and medicine-dermatology participated in a 5-week VBC morning report curriculum using active learning techniques. Early sessions targeted knowledge and later sessions emphasised the application of VBC principles. Thirty residents attended at least one session and completed both pre- and post-intervention tests, using a newly developed case-based assessment tool featuring a 'waste score' balanced with 'standard of care'. Residents, on average, reduced their waste score from pre-intervention to post-intervention [mean 8.8 (SD 6.3) versus mean 4.7 (SD 4.6), p = 0.001]. For those who reduced their waste score, most maintained or improved their standard of care. Our results suggest that residents may be able to decrease health care waste, with the majority maintaining or improving their management of care in a case-based assessment after participation in the curriculum. We are working to further incorporate VBC principles into more morning reports, and to develop further interventions and assessments to evaluate our residents at higher levels on Miller's pyramid of assessment. © 2018 John Wiley & Sons Ltd and The Association for the Study of Medical Education.

  7. 19 CFR 10.536 - Value of materials.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 19 Customs Duties 1 2010-04-01 2010-04-01 false Value of materials. 10.536 Section 10.536 Customs... Rules of Origin § 10.536 Value of materials. (a) Calculating the value of materials. Except as provided in § 10.541, for purposes of calculating the regional value content of a good under General Note 25(o...

  8. 19 CFR 10.536 - Value of materials.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 19 Customs Duties 1 2011-04-01 2011-04-01 false Value of materials. 10.536 Section 10.536 Customs... Rules of Origin § 10.536 Value of materials. (a) Calculating the value of materials. Except as provided in § 10.541, for purposes of calculating the regional value content of a good under General Note 25(o...

  9. 19 CFR 10.536 - Value of materials.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 19 Customs Duties 1 2013-04-01 2013-04-01 false Value of materials. 10.536 Section 10.536 Customs... Rules of Origin § 10.536 Value of materials. (a) Calculating the value of materials. Except as provided in § 10.541, for purposes of calculating the regional value content of a good under General Note 25(o...

  10. 19 CFR 10.536 - Value of materials.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 19 Customs Duties 1 2012-04-01 2012-04-01 false Value of materials. 10.536 Section 10.536 Customs... Rules of Origin § 10.536 Value of materials. (a) Calculating the value of materials. Except as provided in § 10.541, for purposes of calculating the regional value content of a good under General Note 25(o...

  11. 19 CFR 10.536 - Value of materials.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 19 Customs Duties 1 2014-04-01 2014-04-01 false Value of materials. 10.536 Section 10.536 Customs... Rules of Origin § 10.536 Value of materials. (a) Calculating the value of materials. Except as provided in § 10.541, for purposes of calculating the regional value content of a good under General Note 25(o...

  12. Applying Activity Based Costing (ABC) Method to Calculate Cost Price in Hospital and Remedy Services

    PubMed Central

    Rajabi, A; Dabiri, A

    2012-01-01

    Background: Activity Based Costing (ABC) is one of the costing methodologies that began appearing in the 1990s. It calculates cost price by determining the usage of resources. In this study, the ABC method was used for calculating the cost price of remedial services in hospitals. Methods: To apply the ABC method, Shahid Faghihi Hospital was selected. First, hospital units were divided into three main departments: administrative, diagnostic, and hospitalization. Second, activity centers were defined by the activity analysis method. Third, costs of administrative activity centers were allocated to diagnostic and operational departments based on cost drivers. Finally, with regard to the usage of cost objectives from services of activity centers, the cost price of medical services was calculated. Results: The cost price from the ABC method differs significantly from that of the tariff method. In addition, the high amount of indirect costs in the hospital indicates that resource capacities are not used properly. Conclusion: The cost price of remedial services is not properly calculated with the tariff method when compared with the ABC method. ABC calculates cost price through suitable allocation mechanisms, whereas the tariff method is based on fixed prices. In addition, ABC provides useful information about the amount and composition of the cost price of services. PMID:23113171
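The allocation steps described in the Methods can be sketched as follows; all figures, center names, and cost drivers are hypothetical, not data from the study:

```python
def allocate_overhead(admin_costs, driver_shares):
    """Third step of the ABC procedure: allocate each administrative activity
    center's cost to operational centers in proportion to its cost-driver
    shares (each admin center's shares sum to 1)."""
    allocated = {}
    for center, cost in admin_costs.items():
        for op, share in driver_shares[center].items():
            allocated[op] = allocated.get(op, 0.0) + cost * share
    return allocated

# Hypothetical figures (not from Shahid Faghihi Hospital):
admin_costs = {"administration": 120000.0, "housekeeping": 30000.0}
driver_shares = {
    "administration": {"radiology": 0.4, "laboratory": 0.6},  # driver: staff headcount
    "housekeeping": {"radiology": 0.5, "laboratory": 0.5},    # driver: floor area
}
overhead = allocate_overhead(admin_costs, driver_shares)

# Final step: cost price of a service = (direct cost + allocated overhead) / volume
xray_cost_price = (80000.0 + overhead["radiology"]) / 5000  # 5000 X-ray exams
```

The point of contrast with tariff pricing is visible in the last line: the unit cost is driven by resource consumption rather than being fixed in advance.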

  13. The goal of value-based medicine analyses: comparability. The case for neovascular macular degeneration.

    PubMed

    Brown, Gary C; Brown, Melissa M; Brown, Heidi C; Kindermann, Sylvia; Sharma, Sanjay

    2007-01-01

    To evaluate the comparability of articles in the peer-reviewed literature assessing the (1) patient value and (2) cost-utility (cost-effectiveness) associated with interventions for neovascular age-related macular degeneration (ARMD). A search was performed in the National Library of Medicine database of 16 million peer-reviewed articles using the key words cost-utility, cost-effectiveness, value, verteporfin, pegaptanib, laser photocoagulation, ranibizumab, and therapy. All articles that used an outcome of quality-adjusted life-years (QALYs) were studied in regard to (1) percent improvement in quality of life, (2) utility methodology, (3) utility respondents, (4) types of costs included (eg, direct healthcare, direct nonhealthcare, indirect), (5) cost bases (eg, Medicare, National Health Service in the United Kingdom), and (6) study cost perspective (eg, government, societal, third-party insurer). To qualify as a value-based medicine analysis, the patient value had to be measured using the outcome of the QALYs conferred by respective interventions. As with value-based medicine analyses, patient-based time tradeoff utility analysis had to be utilized, patient utility respondents were necessary, and direct medical costs were used. Among 21 cost-utility analyses performed on interventions for neovascular macular degeneration, 15 (71%) met value-based medicine criteria. The 6 others (29%) were not comparable owing to (1) varying utility methodology, (2) varying utility respondents, (3) differing costs utilized, (4) differing cost bases, and (5) varying study perspectives. Among value-based medicine studies, laser photocoagulation confers a 4.4% value gain (improvement in quality of life) for the treatment of classic subfoveal choroidal neovascularization. 
Intravitreal pegaptanib confers a 5.9% value gain (improvement in quality of life) for classic, minimally classic, and occult subfoveal choroidal neovascularization, and photodynamic therapy with verteporfin confers

  14. THE GOAL OF VALUE-BASED MEDICINE ANALYSES: COMPARABILITY. THE CASE FOR NEOVASCULAR MACULAR DEGENERATION

    PubMed Central

    Brown, Gary C.; Brown, Melissa M.; Brown, Heidi C.; Kindermann, Sylvia; Sharma, Sanjay

    2007-01-01

    Purpose To evaluate the comparability of articles in the peer-reviewed literature assessing the (1) patient value and (2) cost-utility (cost-effectiveness) associated with interventions for neovascular age-related macular degeneration (ARMD). Methods A search was performed in the National Library of Medicine database of 16 million peer-reviewed articles using the key words cost-utility, cost-effectiveness, value, verteporfin, pegaptanib, laser photocoagulation, ranibizumab, and therapy. All articles that used an outcome of quality-adjusted life-years (QALYs) were studied in regard to (1) percent improvement in quality of life, (2) utility methodology, (3) utility respondents, (4) types of costs included (eg, direct healthcare, direct nonhealthcare, indirect), (5) cost bases (eg, Medicare, National Health Service in the United Kingdom), and (6) study cost perspective (eg, government, societal, third-party insurer). To qualify as a value-based medicine analysis, the patient value had to be measured using the outcome of the QALYs conferred by respective interventions. As with value-based medicine analyses, patient-based time tradeoff utility analysis had to be utilized, patient utility respondents were necessary, and direct medical costs were used. Results Among 21 cost-utility analyses performed on interventions for neovascular macular degeneration, 15 (71%) met value-based medicine criteria. The 6 others (29%) were not comparable owing to (1) varying utility methodology, (2) varying utility respondents, (3) differing costs utilized, (4) differing cost bases, and (5) varying study perspectives. Among value-based medicine studies, laser photocoagulation confers a 4.4% value gain (improvement in quality of life) for the treatment of classic subfoveal choroidal neovascularization. 
Intravitreal pegaptanib confers a 5.9% value gain (improvement in quality of life) for classic, minimally classic, and occult subfoveal choroidal neovascularization, and photodynamic therapy

  15. Missing Value Imputation Approach for Mass Spectrometry-based Metabolomics Data.

    PubMed

    Wei, Runmin; Wang, Jingye; Su, Mingming; Jia, Erik; Chen, Shaoqiu; Chen, Tianlu; Ni, Yan

    2018-01-12

    Missing values exist widely in mass spectrometry (MS)-based metabolomics data. Various methods have been applied for handling missing values, but the choice of method can significantly affect downstream data analyses. Typically, there are three types of missing values: missing not at random (MNAR), missing at random (MAR), and missing completely at random (MCAR). Our study comprehensively compared eight imputation methods (zero, half minimum (HM), mean, median, random forest (RF), singular value decomposition (SVD), k-nearest neighbors (kNN), and quantile regression imputation of left-censored data (QRILC)) for the different types of missing values using four metabolomics datasets. Normalized root mean squared error (NRMSE) and NRMSE-based sum of ranks (SOR) were applied to evaluate imputation accuracy. Principal component analysis (PCA)/partial least squares (PLS)-Procrustes analysis was used to evaluate the overall sample distribution. Student's t-test followed by correlation analysis was conducted to evaluate the effects on univariate statistics. Our findings demonstrated that RF performed best for MCAR/MAR and QRILC was favored for left-censored MNAR. Finally, we proposed a comprehensive strategy and developed a publicly accessible web tool for the application of missing value imputation in metabolomics ( https://metabolomics.cc.hawaii.edu/software/MetImp/ ).
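A minimal sketch of the left-censored (MNAR) setting and of two of the compared imputers, half-minimum (HM) and mean, on simulated data; the NRMSE normalization shown is one common convention and may differ from the study's exact definition:

```python
import numpy as np

def half_min_impute(X):
    """MNAR-oriented imputation: fill NaNs in each column (metabolite) with
    half that column's observed minimum (the HM method in the comparison)."""
    X = X.copy()
    for j in range(X.shape[1]):
        fill = 0.5 * np.nanmin(X[:, j])
        X[np.isnan(X[:, j]), j] = fill
    return X

def mean_impute(X):
    """MCAR/MAR-oriented baseline: fill NaNs with the column mean."""
    X = X.copy()
    for j in range(X.shape[1]):
        X[np.isnan(X[:, j]), j] = np.nanmean(X[:, j])
    return X

def nrmse(truth, imputed, mask):
    """RMSE over the imputed entries, normalized by the standard deviation
    of the true values (one common NRMSE convention)."""
    return np.sqrt(np.mean((truth[mask] - imputed[mask]) ** 2)) / np.std(truth[mask])

# Simulate left-censored metabolomics-like data: lowest 20% per column missing.
rng = np.random.default_rng(0)
truth = rng.lognormal(size=(200, 10))
mask = truth < np.quantile(truth, 0.2, axis=0)
observed = np.where(mask, np.nan, truth)

err_hm = nrmse(truth, half_min_impute(observed), mask)
err_mean = nrmse(truth, mean_impute(observed), mask)
```

With left-censored missingness, the HM imputer lands near the censored values while the column mean overshoots them, which is the intuition behind preferring censoring-aware methods (HM, QRILC) for MNAR.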

  16. WE-A-17A-07: Evaluation of a Grid-Based Boltzmann Solver for Nuclear Medicine Voxel-Based Dose Calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mikell, J; Kappadath, S; Wareing, T

    Purpose: Grid-based Boltzmann solvers (GBBS) have been successfully implemented in radiation oncology clinics for dose calculations of external photon beams and 192Ir sealed-source brachytherapy. We report on the evaluation of a GBBS for nuclear medicine voxel-based absorbed doses. Methods: Voxel S values were calculated for monoenergetic betas and photons (1, 0.1, 0.01 MeV), 90Y, and 131I for 3 mm voxel sizes using Monte Carlo (DOSXYZnrc) and GBBS (Attila 8.1-beta5, Transpire). The source distribution was uniform throughout a single voxel. The material was an infinite 1.04 g/cc soft tissue slab. To explore convergence properties of the GBBS, 3 tetrahedral meshes, 3 energy group structures, 3 different square Chebyshev-Legendre quadrature set orders (Sn), and 4-7 spherical harmonic expansion terms (Pn) were investigated, for a total of 168 discretizations per source. The mesh, energy group, and quadrature sets are 8x, 3x, and 16x, respectively, finer than the corresponding coarse discretization. GBBS cross sections were generated with full electron-photon coupling using the vendor's extended CEPXS code. For accuracy, percent differences (%Δ) in source voxel absorbed doses between MC and GBBS are reported for the coarsest and finest discretizations. For convergence, ratios of the two finest discretization solutions are reported along each variable. Results: For 1 MeV, 0.1 MeV, 0.01 MeV, 90Y, and 131I beta sources the %Δ in the source voxel for the (coarsest, finest) discretization were (+2.0, −6.4), (−8.0, −7.5), (−13.8, −13.4), (+0.9, −5.5), and (−10.1, −9.0), respectively. The corresponding %Δ for photons were (+33.7, −7.1), (−9.4, −9.8), (−17.4, −15.2), and (−1.7, −7.7), respectively. For betas, the convergence ratio of mesh, energy, Sn, and Pn ranged from 0.991–1.000. For gammas, the convergence ratio of mesh, Sn, and Pn ranged from 0.998–1.003 while the ratio for energy ranged from 0.964–1.001. Conclusions

  17. Size Reduction of Hamiltonian Matrix for Large-Scale Energy Band Calculations Using Plane Wave Bases

    NASA Astrophysics Data System (ADS)

    Morifuji, Masato

    2018-01-01

    We present a method of reducing the size of the Hamiltonian matrix used in calculations of electronic states. In electronic structure calculations using plane wave basis functions, a large number of plane waves is often required to obtain precise results; even with state-of-the-art techniques, the Hamiltonian matrix often becomes very large. The computational time and memory required for diagonalization limit the widespread use of band calculations. We show a procedure for deriving a reduced Hamiltonian constructed from a small number of low-energy bases by renormalizing the high-energy bases. We demonstrate numerically that a significant speedup in evaluating eigenstates is achieved without losing accuracy.
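One standard way to renormalize high-energy bases into a small low-energy block is Löwdin partitioning, sketched below on a toy matrix (an assumption for illustration; the paper's specific reduction scheme may differ):

```python
import numpy as np

def downfold(H, n_low, energy):
    """Löwdin-partitioned effective Hamiltonian: the high-energy block is
    folded into the low-energy block via

        H_eff(E) = H_LL + H_LH (E*I - H_HH)^{-1} H_HL.

    If E is an exact eigenvalue of H, it is also an eigenvalue of H_eff(E)."""
    HLL = H[:n_low, :n_low]
    HLH = H[:n_low, n_low:]
    HHH = H[n_low:, n_low:]
    return HLL + HLH @ np.linalg.solve(energy * np.eye(HHH.shape[0]) - HHH, HLH.T)

# Toy symmetric Hamiltonian with a well-separated high-energy block:
rng = np.random.default_rng(1)
A = rng.normal(size=(8, 8))
H = (A + A.T) / 2 + np.diag([0, 0, 0, 10, 11, 12, 13, 14])

E0 = np.linalg.eigvalsh(H)[0]        # exact lowest eigenvalue of the full 8x8
Heff = downfold(H, 3, energy=E0)     # reduced 3x3 effective Hamiltonian
```

The reduced matrix is diagonalized at a fraction of the cost of the full one, which is the source of the speedup the abstract reports; in practice the energy argument is fixed or iterated self-consistently rather than taken from the known answer.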

  18. Neural Signature of Value-Based Sensorimotor Prioritization in Humans.

    PubMed

    Blangero, Annabelle; Kelly, Simon P

    2017-11-01

    In situations in which impending sensory events demand fast action choices, we must be ready to prioritize higher-value courses of action to avoid missed opportunities. When such a situation first presents itself, stimulus-action contingencies and their relative value must be encoded to establish a value-biased state of preparation for an impending sensorimotor decision. Here, we sought to identify neurophysiological signatures of such processes in the human brain (both female and male). We devised a task requiring fast action choices based on the discrimination of a simple visual cue in which the differently valued sensory alternatives were presented 750-800 ms before as peripheral "targets" that specified the stimulus-action mapping for the upcoming decision. In response to the targets, we identified a discrete, transient, spatially selective signal in the event-related potential (ERP), which scaled with relative value and strongly predicted the degree of behavioral bias in the upcoming decision both across and within subjects. This signal is not compatible with any hitherto known ERP signature of spatial selection and also bears novel distinctions with respect to characterizations of value-sensitive, spatially selective activity found in sensorimotor areas of nonhuman primates. Specifically, a series of follow-up experiments revealed that the signal was reliably invoked regardless of response laterality, response modality, sensory feature, and reward valence. It was absent, however, when the response deadline was relaxed and the strategic need for biasing removed. Therefore, more than passively representing value or salience, the signal appears to play a versatile and active role in adaptive sensorimotor prioritization. SIGNIFICANCE STATEMENT In many situations such as fast-moving sports, we must be ready to act fast in response to sensory events and, in our preparation, prioritize courses of action that lead to greater rewards. Although behavioral effects of

  19. Neural Signature of Value-Based Sensorimotor Prioritization in Humans

    PubMed Central

    Blangero, Annabelle

    2017-01-01

    In situations in which impending sensory events demand fast action choices, we must be ready to prioritize higher-value courses of action to avoid missed opportunities. When such a situation first presents itself, stimulus–action contingencies and their relative value must be encoded to establish a value-biased state of preparation for an impending sensorimotor decision. Here, we sought to identify neurophysiological signatures of such processes in the human brain (both female and male). We devised a task requiring fast action choices based on the discrimination of a simple visual cue in which the differently valued sensory alternatives were presented 750–800 ms before as peripheral “targets” that specified the stimulus–action mapping for the upcoming decision. In response to the targets, we identified a discrete, transient, spatially selective signal in the event-related potential (ERP), which scaled with relative value and strongly predicted the degree of behavioral bias in the upcoming decision both across and within subjects. This signal is not compatible with any hitherto known ERP signature of spatial selection and also bears novel distinctions with respect to characterizations of value-sensitive, spatially selective activity found in sensorimotor areas of nonhuman primates. Specifically, a series of follow-up experiments revealed that the signal was reliably invoked regardless of response laterality, response modality, sensory feature, and reward valence. It was absent, however, when the response deadline was relaxed and the strategic need for biasing removed. Therefore, more than passively representing value or salience, the signal appears to play a versatile and active role in adaptive sensorimotor prioritization. SIGNIFICANCE STATEMENT In many situations such as fast-moving sports, we must be ready to act fast in response to sensory events and, in our preparation, prioritize courses of action that lead to greater rewards. Although behavioral

  20. Value-Based Reimbursement: Impact of Curtailing Physician Autonomy in Medical Decision Making.

    PubMed

    Gupta, Dipti; Karst, Ingolf; Mendelson, Ellen B

    2016-02-01

    In this article, we define value in the context of reimbursement and explore the effect of shifting reimbursement paradigms on the decision-making autonomy of a women's imaging radiologist. The current metrics used for value-based reimbursement such as report turnaround time are surrogate measures that do not measure value directly. The true measure of a physician's value in medicine is accomplishment of better health outcomes, which, in breast imaging, are best achieved with a physician-patient relationship. Complying with evidence-based medicine, which includes data-driven best clinical practices, a physician's clinical expertise, and the patient's values, will improve our science and preserve the art of medicine.

  1. Comprehending the multiple 'values' of green infrastructure - Valuing nature-based solutions for urban water management from multiple perspectives.

    PubMed

    Wild, T C; Henneberry, J; Gill, L

    2017-10-01

    The valuation of urban water management practices and associated nature-based solutions (NBS) is highly contested, and is becoming increasingly important to cities seeking to increase their resilience to climate change whilst at the same time facing budgetary pressures. Different conceptions of 'values' exist, each being accompanied by a set of potential measures ranging from calculative practices (closely linked to established market valuation techniques) - through to holistic assessments that seek to address wider concerns of sustainability. Each has the potential to offer important insights that often go well beyond questions of balancing the costs and benefits of the schemes concerned. However, the need to address - and go beyond - economic considerations presents policy-makers, practitioners and researchers with difficult methodological, ethical and practical challenges, especially when considered without the benefit of a broader theoretical framework or in the absence of well-established tools (as might apply within more traditional infrastructural planning contexts, such as the analysis of transport interventions). Drawing on empirical studies undertaken in Sheffield over a period of 10 years, and delivered in partnership with several other European cities and regions, we compare and examine different attempts to evaluate the benefits of urban greening options and future development scenarios. Comparing these different approaches to the valuation of nature-based solutions alongside other, more conventional forms of infrastructure - and indeed integrating both 'green and grey' interventions within a broader framework of infrastructures - throws up some surprising results and conclusions, as well as providing important sign-posts for future research in this rapidly emerging field. Copyright © 2017 Elsevier Inc. All rights reserved.

  2. Stochastic-analytic approach to the calculation of multiply scattered lidar returns

    NASA Astrophysics Data System (ADS)

    Gillespie, D. T.

    1985-08-01

    The problem of calculating the nth-order backscattered power of a laser firing short pulses at time zero into a homogeneous cloud with specified scattering and absorption parameters is discussed. In the problem, backscattered power is measured at any time greater than zero by a small receiver colocated with the laser and fitted with a forward-looking conical baffle. Theoretical calculations are made on the premise that the laser pulse is composed of propagating photons which are scattered and absorbed by the cloud particles in a probabilistic manner. The effect of polarization is not taken into account in the calculations. An exact formula is derived for the backscattered power, based on direct physical arguments together with a rigorous analysis of random variables. It is shown that, for values of n greater than or equal to 2, the obtained formula is a well-behaved (3n-4)-dimensional integral. The computational feasibility of the integral formula is demonstrated for a model cloud of isotropically scattering particles: an analytical formula is obtained for n = 2, and a Monte Carlo program is used to obtain numerical results for n = 3, . . ., 6.
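A minimal Monte Carlo sketch in the spirit of the paper's photon picture, tallying returns by scattering order for a semi-infinite cloud; receiver geometry, time gating, and anisotropic phase functions are all omitted, so this illustrates the sampling idea rather than the paper's exact formulation:

```python
import numpy as np

def backscatter_orders(n_photons, albedo=0.9, max_order=6, seed=0):
    """Tally photons escaping back through the entry plane (z < 0) of a
    semi-infinite homogeneous cloud, grouped by scattering order.
    Simplifications: isotropic scattering, unit mean free path, and escape
    checked only at flight endpoints."""
    rng = np.random.default_rng(seed)
    counts = np.zeros(max_order + 1, dtype=int)  # counts[n] = escapes after n scatters
    for _ in range(n_photons):
        pos = np.zeros(3)
        direction = np.array([0.0, 0.0, 1.0])    # fired straight into the cloud
        for n_scat in range(max_order + 1):
            pos = pos + direction * rng.exponential(1.0)
            if pos[2] < 0.0:                     # back out toward the lidar
                counts[n_scat] += 1
                break
            if n_scat == max_order or rng.random() > albedo:
                break                            # order cap reached, or absorbed
            mu = 2.0 * rng.random() - 1.0        # isotropic redirection
            phi = 2.0 * np.pi * rng.random()
            s = np.sqrt(1.0 - mu * mu)
            direction = np.array([s * np.cos(phi), s * np.sin(phi), mu])
    return counts

counts = backscatter_orders(20000)
```

Each entry of `counts` is the raw-count analogue of one order's contribution; a real lidar calculation would additionally weight by the receiver's acceptance cone and bin arrivals by path length (time).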

  3. A web-based normative calculator for the uniform data set (UDS) neuropsychological test battery.

    PubMed

    Shirk, Steven D; Mitchell, Meghan B; Shaughnessy, Lynn W; Sherman, Janet C; Locascio, Joseph J; Weintraub, Sandra; Atri, Alireza

    2011-11-11

    With the recent publication of new criteria for the diagnosis of preclinical Alzheimer's disease (AD), there is a need for neuropsychological tools that take premorbid functioning into account in order to detect subtle cognitive decline. Using demographic adjustments is one method for increasing the sensitivity of commonly used measures. We sought to provide a useful online z-score calculator that yields estimates of percentile ranges and adjusts individual performance based on sex, age and/or education for each of the neuropsychological tests of the National Alzheimer's Coordinating Center Uniform Data Set (NACC UDS). In addition, we aimed to provide an easily accessible method for other clinical researchers to create norms for their own, unique data sets. Data from 3,268 clinically cognitively-normal older UDS subjects from a cohort reported by Weintraub and colleagues (2009) were included. For all neuropsychological tests, z-scores were estimated by subtracting the predicted mean from the raw score and then dividing this difference by the root mean squared error term (RMSE) of the corresponding linear regression model. For each neuropsychological test, an estimated z-score was calculated for any raw score based on five different models that adjust for the demographic predictors of SEX, AGE and EDUCATION, either concurrently, individually or without covariates. The interactive online calculator allows the entry of a raw score and provides five corresponding estimated z-scores based on predictions from each corresponding linear regression model. The calculator produces percentile ranks and graphical output. 
An interactive, regression-based, normative score online calculator was created to serve as an additional resource for UDS clinical researchers, especially in guiding interpretation of individual performances that appear to fall in borderline realms and may be of particular utility for operationalizing subtle cognitive impairment present according to the newly
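The z-score construction described above amounts to a few lines of code; the regression coefficients below are hypothetical stand-ins, not the published NACC UDS models:

```python
from statistics import NormalDist

def adjusted_zscore(raw, intercept, coefs, covariates, rmse):
    """Regression-based normative z-score: z = (raw - predicted) / RMSE,
    with 'predicted' from a linear model in the chosen demographic covariates."""
    predicted = intercept + sum(coefs[k] * covariates[k] for k in coefs)
    return (raw - predicted) / rmse

# Hypothetical model: predicted = 30.2 - 0.05*AGE + 0.3*EDUCATION, RMSE = 2.1
z = adjusted_zscore(raw=27.0, intercept=30.2,
                    coefs={"age": -0.05, "education": 0.3},
                    covariates={"age": 72, "education": 16}, rmse=2.1)
percentile = 100.0 * NormalDist().cdf(z)  # percentile rank reported alongside z
```

Dropping a covariate from `coefs` reproduces the calculator's behavior of offering models that adjust for sex, age, and education individually, concurrently, or not at all.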

  4. Evaluation of students' knowledge about paediatric dosage calculations.

    PubMed

    Özyazıcıoğlu, Nurcan; Aydın, Ayla İrem; Sürenler, Semra; Çinar, Hava Gökdere; Yılmaz, Dilek; Arkan, Burcu; Tunç, Gülseren Çıtak

    2018-01-01

    Medication errors are common and may jeopardize patient safety. As paediatric dosages are calculated based on the child's age and weight, the risk of error in dosage calculations is increased. In paediatric patients, overdoses prescribed without regard to the child's weight, age and clinical picture may lead to excessive toxicity and mortality, while underdosing may delay treatment. This study was carried out to evaluate the knowledge of nursing students about paediatric dosage calculations. This retrospective study covered all 148 third-year bachelor's degree students in May 2015. Exam papers containing 3 open-ended drug dosage calculation problems, addressing 5 variables, were distributed to the students, and their responses were evaluated by the researchers. In the evaluation of the data, figures and percentage distributions were calculated and Spearman correlation analysis was applied. The exam question on dosage calculation based on the child's age, which is the most common method in paediatrics and which ensures correct dosages and drug dilution, was answered correctly by 87.1% of the students, while 9.5% answered it incorrectly and 3.4% left it blank. 69.6% of the students were successful in finding the safe dose range, and 79.1% in finding the right ratio/proportion. 65.5% of the answers with regard to Ml/dzy calculation were correct. Moreover, the students' four-operation (arithmetic) skills were assessed, and 68.2% of the students were determined to have found the correct answer. When the relations among the medication questions were examined, a significant correlation was determined between them. It is seen that in dosage calculations the students failed mostly in calculating ml/dzy (decimal). This result means that, as dosage calculations are based on decimal values, calculations may be off by a factor of ten when the decimal point is misplaced. 
Moreover, it
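The safe-dose-range calculation the exam questions target, and the tenfold effect of a misplaced decimal point, can be sketched as follows (all figures hypothetical; real dosing must follow a formulary):

```python
def weight_based_dose_check(weight_kg, mg_per_kg_low, mg_per_kg_high, prescribed_mg):
    """Check a prescribed paediatric dose against a weight-based safe range.
    Returns (is_within_range, (low_mg, high_mg))."""
    low = weight_kg * mg_per_kg_low
    high = weight_kg * mg_per_kg_high
    return low <= prescribed_mg <= high, (low, high)

# Hypothetical 18 kg child, hypothetical 10-20 mg/kg range:
ok, (low, high) = weight_based_dose_check(18.0, 10.0, 20.0, prescribed_mg=250.0)

# A misplaced decimal point turns the same order into a tenfold overdose:
ok_tenfold, _ = weight_based_dose_check(18.0, 10.0, 20.0, prescribed_mg=2500.0)
```

The second call is exactly the decimal-point failure mode the study highlights: the arithmetic is identical except for a factor of ten, and the range check is what catches it.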

  5. Calculation of surface enthalpy of solids from an ab initio electronegativity based model: case of ice.

    PubMed

    Douillard, J M; Henry, M

    2003-07-15

    A very simple route to calculation of the surface energy of solids is proposed because this value is very difficult to determine experimentally. The first step is the calculation of the attractive part of the electrostatic energy of crystals. The partial charges used in this calculation are obtained by using electronegativity equalization and scales of electronegativity and hardness deduced from physical characteristics of the atom. The lattice energies of the infinite crystal and of semi-infinite layers are then compared. The difference is related to the energy of cohesion and then to the surface energy. Very good results are obtained with ice, if one compares with the surface energy of liquid water, which is generally considered a good approximation of the surface energy of ice.
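The electronegativity-equalization step that yields the partial charges can be sketched as a small linear system; the chi/eta parameters and geometry below are illustrative, not the calibrated set used in the paper:

```python
import numpy as np

def eem_charges(chi, eta, R, total_charge=0.0):
    """Partial charges by electronegativity equalization (EEM): solve the
    stationarity conditions

        2*eta_i*q_i + sum_{j != i} q_j / r_ij + lam = -chi_i,   sum_i q_i = Q

    as one augmented linear system (lam is the Lagrange multiplier enforcing
    charge conservation, i.e. the equalized electronegativity). Atomic units."""
    n = len(chi)
    M = np.zeros((n + 1, n + 1))
    for i in range(n):
        M[i, i] = 2.0 * eta[i]
        for j in range(n):
            if i != j:
                M[i, j] = 1.0 / R[i, j]
    M[:n, n] = 1.0   # multiplier column
    M[n, :n] = 1.0   # charge-conservation row
    rhs = np.concatenate([-np.asarray(chi, dtype=float), [total_charge]])
    return np.linalg.solve(M, rhs)[:n]

# Water-like geometry in bohr (O at index 0, two H); parameters are illustrative:
R = np.array([[0.0, 1.8, 1.8],
              [1.8, 0.0, 2.9],
              [1.8, 2.9, 0.0]])
q = eem_charges(chi=[8.5, 4.5, 4.5], eta=[6.0, 6.4, 6.4], R=R)
```

Charges of this kind feed directly into the attractive electrostatic lattice sums whose bulk-versus-slab difference gives the surface energy in the abstract's scheme.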

  6. Experimental verification of a CT-based Monte Carlo dose-calculation method in heterogeneous phantoms.

    PubMed

    Wang, L; Lovelock, M; Chui, C S

    1999-12-01

    To further validate the Monte Carlo dose-calculation method [Med. Phys. 25, 867-878 (1998)] developed at the Memorial Sloan-Kettering Cancer Center, we have performed experimental verification in various inhomogeneous phantoms. The phantom geometries included simple layered slabs, a simulated bone column, a simulated missing-tissue hemisphere, and an anthropomorphic head geometry (Alderson Rando Phantom). The densities of the inhomogeneity range from 0.14 to 1.86 g/cm3, simulating both clinically relevant lunglike and bonelike materials. The data are reported as central axis depth doses, dose profiles, dose values at points of interest, such as points at the interface of two different media and in the "nasopharynx" region of the Rando head. The dosimeters used in the measurement included dosimetry film, TLD chips, and rods. The measured data were compared to that of Monte Carlo calculations for the same geometrical configurations. In the case of the Rando head phantom, a CT scan of the phantom was used to define the calculation geometry and to locate the points of interest. The agreement between the calculation and measurement is generally within 2.5%. This work validates the accuracy of the Monte Carlo method. While Monte Carlo, at present, is still too slow for routine treatment planning, it can be used as a benchmark against which other dose calculation methods can be compared.

  7. Modeling Alkyl p-Methoxy Cinnamate (APMC) as UV absorber based on electronic transition using semiempirical quantum mechanics ZINDO/s calculation

    NASA Astrophysics Data System (ADS)

    Salmahaminati; Azis, Muhlas Abdul; Purwiandono, Gani; Arsyik Kurniawan, Muhammad; Rubiyanto, Dwiarso; Darmawan, Arif

    2017-11-01

    In this research, several alkyl p-methoxy cinnamate (APMC) compounds are modeled on the basis of their electronic transitions using semiempirical quantum mechanical ZINDO/s calculations. Alkyl cinnamates from the C1 (methyl) to the C7 (heptyl) homolog, with 1-5 example structures per homolog, are used as materials. The quantum chemistry package Hyperchem 8.0 is used to draw the structures, optimize geometries with the semiempirical Austin Model 1 (AM1) algorithm, and perform single-point calculations with the semiempirical ZINDO/s technique. The ZINDO/s calculations use singly excited configuration interaction (CI), with the HOMO-LUMO energy-gap transition criterion and the maximum degeneracy level set to 7 and 2, respectively. Analysis of the theoretical spectra focuses on the UV-B (290-320 nm) and UV-C (200-290 nm) regions. The results show that modeling these compounds can predict the type of UV protection activity, which depends on the electronic transitions in the UV region. Modifying the alkyl homolog does not appreciably change the absorption wavelengths that indicate UV protection activity. The alkyl cinnamate compounds are predicted to act as UV-B and UV-C sunscreens.

  8. Arginine: Its pKa value revisited

    PubMed Central

    Fitch, Carolyn A; Platzer, Gerald; Okon, Mark; Garcia-Moreno E, Bertrand; McIntosh, Lawrence P

    2015-01-01

    Using the complementary approaches of potentiometry and NMR spectroscopy, we have determined that the equilibrium acid dissociation constant (pKa value) of the arginine guanidinium group is 13.8 ± 0.1. This is substantially higher than the value of ∼12 often used in structure-based electrostatics calculations and cited in biochemistry textbooks. The revised intrinsic pKa value helps explain why arginine side chains in proteins are always predominantly charged, even at pH values as great as 10. The high pKa value also reinforces the observation that arginine side chains are invariably protonated under physiological conditions of near-neutral pH. This occurs even when the guanidinium moiety is buried in a hydrophobic micro-environment, such as that inside a protein or a lipid membrane, thought to be incompatible with the presence of a charged group. PMID:25808204
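    The practical import of the revised constant can be checked with the Henderson-Hasselbalch relation. The sketch below is an illustration added here, not part of the study; it computes the protonated fraction of a basic group at pH 10 for the textbook pKa of ∼12 and the revised 13.8:

```python
# Fraction of a basic group in its protonated (charged) form at a given pH,
# from the Henderson-Hasselbalch relation. Illustrative check, not study code.
def protonated_fraction(pH: float, pKa: float) -> float:
    return 1.0 / (1.0 + 10.0 ** (pH - pKa))

for pKa in (12.0, 13.8):   # textbook value vs. revised value
    print(f"pKa {pKa}: {protonated_fraction(10.0, pKa):.4%} protonated at pH 10")
```

    Either value leaves the side chain overwhelmingly charged at pH 10, but the revised pKa pushes the residual neutral fraction down by almost two orders of magnitude.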

  9. Value Added Based on Educational Positions in Dutch Secondary Education

    ERIC Educational Resources Information Center

    Timmermans, Anneke C.; Bosker, Roel J.; de Wolf, Inge F.; Doolaard, Simone; van der Werf, Margaretha P. C.

    2014-01-01

    Estimating added value as an indicator of school effectiveness in the context of educational accountability often occurs using test or examination scores of students. This study investigates the possibilities for using scores of educational positions as an alternative indicator. A number of advantages of a value added indicator based on…

  10. The Theory of Value-Based Payment Incentives and Their Application to Health Care.

    PubMed

    Conrad, Douglas A

    2015-12-01

    To present the implications of agency theory in microeconomics, augmented by behavioral economics, for different methods of value-based payment in health care; and to derive a set of future research questions and policy recommendations based on that conceptual analysis. Original literature of agency theory, and secondarily behavioral economics, combined with applied research and empirical evidence on the application of those principles to value-based payment. Conceptual analysis and targeted review of theoretical research and empirical literature relevant to value-based payment in health care. Agency theory and secondarily behavioral economics have powerful implications for the design of value-based payment in health care. To achieve improved value (better patient experience, clinical quality, health outcomes, and lower costs of care), high-powered incentives should directly target improved care processes and enhanced patient experience, and create achievable benchmarks for improved outcomes. Differing forms of value-based payment (e.g., shared savings and risk, reference pricing, capitation, and bundled payment), coupled with adjunct incentives for quality and efficiency, can be tailored to different market conditions and organizational settings. Payment contracts that are "incentive compatible," which directly encourage better care and reduced cost, mitigate gaming, and selectively induce clinically efficient providers to participate, will focus differentially on evidence-based care processes, will right-size and structure incentives to avoid crowd-out of providers' intrinsic motivation, and will align patient incentives with value. Future research should address the details of putting these and related principles into practice; further, by deploying these insights in payment design, policy makers will improve health care value for patients and purchasers. © Health Research and Educational Trust.

  11. Tailoring Agility: Promiscuous Pair Story Authoring and Value Calculation

    NASA Astrophysics Data System (ADS)

    Tendon, Steve

    This chapter describes how a multi-national software organization created a business plan involving business units from eight countries that followed an agile way, after two previously failed attempts with traditional approaches. The case is told by the consultant who initiated the implementation of agility into requirements gathering, estimation, and planning processes in an international setting. The agile approach was inspired by XP, but was then tailored to meet the peculiar requirements. Two innovations were critical. The first was promiscuous pair story authoring, in which user stories were written by two people (similarly to pair programming) and the pairing changed very often (as frequently as every 15-20 minutes) to achieve promiscuity and cater for diverse points of view. The second was attributing an economic value (and not a cost) to stories. Continuous recalculation of the financial value of the stories made it possible to assess the project's financial return. In this case, the implementation of agility in the international context allowed the involved team members to reach consensus and unanimity of decisions, vision, and purpose.

  12. GPU-based ultra-fast dose calculation using a finite size pencil beam model.

    PubMed

    Gu, Xuejun; Choi, Dongju; Men, Chunhua; Pan, Hubert; Majumdar, Amitava; Jiang, Steve B

    2009-10-21

    Online adaptive radiation therapy (ART) is an attractive concept that promises the ability to deliver an optimal treatment in response to inter-fraction variability in patient anatomy. However, it has yet to be realized due to technical limitations. Fast calculation of dose deposition coefficients is a critical component of the online planning process required for plan optimization of intensity-modulated radiation therapy (IMRT). Computer graphics processing units (GPUs) are well suited to provide the requisite fast performance for the data-parallel nature of dose calculation. In this work, we develop a dose calculation engine based on a finite-size pencil beam (FSPB) algorithm and a GPU parallel computing framework. The developed framework can accommodate any FSPB model. We test our implementation on a water phantom and on a prostate cancer patient case with varying beamlet and voxel sizes. All testing scenarios achieved speedups ranging from 200 to 400 times when using an NVIDIA Tesla C1060 card in comparison with a 2.27 GHz Intel Xeon CPU. The computational time for calculating dose deposition coefficients for a nine-field prostate IMRT plan with this new framework is less than 1 s. This indicates that the GPU-based FSPB algorithm is well suited for online re-planning for adaptive radiotherapy.
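    The core superposition in a finite-size pencil beam engine is simple and embarrassingly parallel, which is what makes it GPU-friendly. The sketch below is a toy CPU version with an invented Gaussian lateral kernel and exponential depth attenuation; it is not the published FSPB model, only the shape of the computation:

```python
import numpy as np

# Toy FSPB-style superposition: dose(voxel) = sum_j w_j * lateral(r) * depth(d).
# Kernel forms and parameters (sigma, mu) are illustrative placeholders.
def fspb_dose(voxels, beamlets, weights, sigma=0.5, mu=0.05):
    dose = np.zeros(len(voxels))
    for (bx, by), w in zip(beamlets, weights):
        r2 = (voxels[:, 0] - bx) ** 2 + (voxels[:, 1] - by) ** 2
        dose += w * np.exp(-r2 / (2 * sigma ** 2)) * np.exp(-mu * voxels[:, 2])
    return dose

# voxels as (x, y, depth) triples; one unit-weight beamlet on the axis
voxels = np.array([[0.0, 0.0, 0.0], [0.0, 0.0, 5.0], [2.0, 0.0, 0.0]])
print(fspb_dose(voxels, [(0.0, 0.0)], [1.0]))
```

    Because each voxel's dose is an independent sum over beamlets, the loop maps directly onto one GPU thread per voxel (or per voxel-beamlet pair), which is the parallelization the paper exploits.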

  13. The value of EHR-based assessment of physician competency: An investigative effort with internal medicine physicians.

    PubMed

    Venta, Kimberly; Baker, Erin; Fidopiastis, Cali; Stanney, Kay

    2017-12-01

    The purpose of this study was to investigate the potential of developing an EHR-based model of physician competency, named the Skill Deficiency Evaluation Toolkit for Eliminating Competency-loss Trends (Skill-DETECT), which presents the opportunity to use EHR-based models to inform the selection of Continuing Medical Education (CME) opportunities specifically targeted at maintaining proficiency. The IBM Explorys platform provided outpatient Electronic Health Records (EHRs) representing 76 physicians with over 5000 patients combined. These data were used to develop the Skill-DETECT model, a predictive hybrid model composed of a rule-based model, a logistic regression model, and a thresholding model, which predicts cognitive clinical skill deficiencies in internal medicine physicians. A three-phase approach was then used to statistically validate the model's performance. Subject Matter Expert (SME) panel reviews resulted in a 100% overall approval rate for the rule-based model. Areas under the receiver-operating characteristic curves calculated for each logistic regression curve were between 0.76 and 0.92, indicating exceptional performance. Normality, skewness, and kurtosis were determined and confirmed that the distribution of values output from the thresholding model was unimodal and peaked, which confirmed effectiveness and generalizability. The validation confirmed that the Skill-DETECT model has a strong ability to evaluate EHR data and support the identification of internal medicine cognitive clinical skills that are deficient, or have a higher likelihood of becoming deficient, and thus require remediation, allowing both physicians and medical organizations to fine-tune training efforts. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. Overcoming misconceptions of graph interpretation of kinematics motion using calculator based rangers

    NASA Astrophysics Data System (ADS)

    Olson, John R.

    This is a quasi-experimental study of 261 first-year high school students that analyzes gains made through the use of calculator-based rangers (CBRs) attached to graphing calculators. The study has qualitative components but is based on quantitative tests. Beichner's TUG-K test was used for the pretest, posttest, and post-posttest. The population was divided into one group that predicted the results before using the CBRs and another that did not predict first but completed the same activities. The data for the groups were further disaggregated by learning style (based on Kolb's Learning Styles Inventory), type of class (advanced vs. general physics), and gender. Four instructors used the labs developed by the author for this study, and significant differences between the groups by instructor were found based on interviews, participant observation, and one-way ANOVA. No significant differences were found between learning styles based on MANOVA. No significant differences were found between the predict and non-predict groups in the one-way ANOVAs or MANOVA; however, some differences do exist as measured by a survey and participant observation. Significant differences do exist by gender and type of class (advanced/general) based on one-way ANOVA and MANOVA. The males outscored the females on all tests, and the advanced physics students scored higher than the general physics students on all tests. The advanced physics students scoring higher was expected, but the difference between genders was not.

  15. Calculation of the exchange coupling constants of copper binuclear systems based on spin-flip constricted variational density functional theory.

    PubMed

    Zhekova, Hristina R; Seth, Michael; Ziegler, Tom

    2011-11-14

    We have recently developed a methodology for the calculation of exchange coupling constants J in weakly interacting polynuclear metal clusters. The method is based on unrestricted and restricted second-order spin-flip constricted variational density functional theory (SF-CV(2)-DFT) and is here applied to eight binuclear copper systems. Comparison of the SF-CV(2)-DFT results with experiment and with results obtained from other DFT and wave-function-based methods has been made. Restricted SF-CV(2)-DFT with the BH&HLYP functional consistently yields J values in excellent agreement with experiment. The results acquired from this scheme are comparable in quality to those obtained by accurate multi-reference wave function methodologies such as difference dedicated configuration interaction and the complete active space with second-order perturbation theory. © 2011 American Institute of Physics.

  16. Public Report on Health: Development of a Nutritive Value Calculator for Indian Foods and Analysis of Food Logs and Nutrient Intake in six States

    PubMed Central

    Sathyamala, C; Kurian, NJ; DE, Anuradha; Saxena, KB; Priya, Ritu; Baru, Rama; Srivastava, Ravi; Mittal, Onkar; Noronha, Claire; Samson, Meera; Khalsa, Sneh; Puliyel, Ashish

    2014-01-01

    The Public Report on Health (PRoH) was initiated in 2005 to understand public health issues for people from diverse backgrounds living in different region-specific contexts. States were selected purposively to capture a diversity of situations from better-performing and not-so-well-performing states. Based on these considerations, six states were selected: the better-performing states of Tamil Nadu (TN), Maharashtra (MH), and Himachal Pradesh (HP), and the not-so-well-performing states of Madhya Pradesh (MP), Uttar Pradesh (UP), and Orissa (OR). This is a report of a study using food diaries to assess food intake in sample households from six states of India. Method: Food diaries were maintained, and all the raw food items that went into making the food in the household were measured using a measuring cup that converted volumes into dry weights for each item. The proportion consumed by individual adults was recorded. A nutrient calculator that computes the total nutrients in the food items consumed, using 'Nutritive Value of Indian Foods' by Gopalan et al., was developed to analyze the data and has now been made available as freeware (http://bit.ly/ncalculator). The total nutrients consumed by the adults, men and women, were calculated. Results: Identifying details having been removed, the raw data are available open access on the internet (http://bit.ly/foodlogxls). The energy consumption in our study was 2379 kcal per capita per day. According to the Summary Report on World Agriculture, per capita food consumption in 1997-99 was 2803 kcal, which is higher than that in the best state in India. The consumption for developing countries a decade ago was 2681 kcal, and in Sub-Saharan Africa it was 2195 kcal. Our 2005 data are compatible with the South Asia consumption of 2403 kcal per capita per day in 1997-99. For comparison, in industrialized countries it was 3380 kcal. In Tamil Nadu it was a mere 1817 kcal. Discussion: The nutrient consumption in this study suggests that
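    The calculator's core arithmetic is a weighted sum of per-100 g composition values over the measured dry weights. A minimal sketch follows; the composition figures are illustrative placeholders, not values from the Gopalan et al. tables:

```python
# Per-100 g composition of raw items: (energy kcal, protein g).
# Figures are illustrative, not taken from 'Nutritive Value of Indian Foods'.
COMPOSITION = {
    "rice": (345.0, 6.8),
    "dal":  (335.0, 22.3),
}

def totals(intake_g):
    """Total (kcal, protein g) for a dict of item -> grams consumed."""
    kcal = protein = 0.0
    for item, grams in intake_g.items():
        k, p = COMPOSITION[item]
        kcal += k * grams / 100.0
        protein += p * grams / 100.0
    return kcal, protein

print(totals({"rice": 400, "dal": 80}))  # one adult's share of the day's raw items
```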

  17. Health technology assessment, value-based decision making, and innovation.

    PubMed

    Henshall, Chris; Schuller, Tara

    2013-10-01

    Identifying treatments that offer value and value for money is becoming increasingly important, with interest in how health technology assessment (HTA) and decision makers can take appropriate account of what is of value to patients and to society, and in the relationship between innovation and assessments of value. This study summarizes points from a Health Technology Assessment International (HTAi) Policy Forum discussion, drawing on presentations, discussions among attendees, and background papers. Various perspectives on value were considered; most place patient health at the core of value. Wider elements of value comprise other benefits for: patients; caregivers; the health and social care systems; and society. Most decision-making systems seek to take account of similar elements of value, although they are assessed and combined in different ways. Judgment in decisions remains important and cannot be replaced by mathematical approaches. There was discussion of the value of innovation and of the effects of value assessments on innovation. Discussion also included moving toward "progressive health system decision making," an ongoing process whereby evidence-based decisions on use would be made at various stages in the technology lifecycle. Five actions are identified: (i) development of a general framework for the definition and assessment of value; development by HTA/coverage bodies and regulators of (ii) disease-specific guidance and (iii) further joint scientific advice for industry on demonstrating value; (iv) development of a framework for progressive licensing, usage, and reimbursement; and (v) promoting work to better adapt HTA, coverage, and procurement approaches to medical devices.

  18. Quantum chemical calculations of glycine glutaric acid

    NASA Astrophysics Data System (ADS)

    Arıoğlu, Çağla; Tamer, Ömer; Avci, Davut; Atalay, Yusuf

    2017-02-01

    Density functional theory (DFT) calculations of glycine glutaric acid were performed at the B3LYP level with the 6-311++G(d,p) basis set. The theoretical structural parameters, such as bond lengths and bond angles, are in good agreement with the experimental values for the title compound. HOMO and LUMO energies were calculated, and the obtained energy gap shows that charge transfer occurs in the title compound. Vibrational frequencies were calculated and compared with experimental ones. 3D molecular surfaces of the title compound were simulated using the same level and basis set. Finally, the 13C and 1H NMR chemical shift values were calculated by application of the gauge-independent atomic orbital (GIAO) method.

  19. A comparison of the prognostic value of preoperative inflammation-based scores and TNM stage in patients with gastric cancer

    PubMed Central

    Pan, Qun-Xiong; Su, Zi-Jian; Zhang, Jian-Hua; Wang, Chong-Ren; Ke, Shao-Ying

    2015-01-01

    Background People’s Republic of China is one of the countries with the highest incidence of gastric cancer, accounting for 45% of all new gastric cancer cases in the world. Therefore, strong prognostic markers are critical for the diagnosis and survival of Chinese patients suffering from gastric cancer. Recent studies have begun to unravel the mechanisms linking the host inflammatory response to tumor growth, invasion, and metastasis in gastric cancers. Based on this relationship between inflammation and cancer progression, several inflammation-based scores have been demonstrated to have prognostic value in many types of malignant solid tumors. Objective To compare the prognostic value of inflammation-based prognostic scores and tumor node metastasis (TNM) stage in patients undergoing gastric cancer resection. Methods The inflammation-based prognostic scores were calculated for 207 patients with gastric cancer who underwent surgery. The Glasgow prognostic score (GPS), neutrophil-lymphocyte ratio (NLR), platelet-lymphocyte ratio (PLR), prognostic nutritional index (PNI), and prognostic index (PI) were analyzed. Linear trend chi-square tests, likelihood ratio chi-square tests, and receiver operating characteristic (ROC) analysis were performed to compare the prognostic value of the selected scores and TNM stage. Results In univariate analysis, preoperative serum C-reactive protein (P<0.001), serum albumin (P<0.001), GPS (P<0.001), PLR (P=0.002), NLR (P<0.001), PI (P<0.001), PNI (P<0.001), and TNM stage (P<0.001) were significantly associated with both overall survival and disease-free survival of patients with gastric cancer. In multivariate analysis, GPS (P=0.024), NLR (P=0.012), PI (P=0.001), TNM stage (P<0.001), and degree of differentiation (P=0.002) were independent predictors of gastric cancer survival. GPS and TNM stage had comparable prognostic value and higher linear trend chi-square values, likelihood ratio chi-square values, and larger area under the receiver operating

  20. Implications to Postsecondary Faculty of Alternative Calculation Methods of Gender-Based Wage Differentials.

    ERIC Educational Resources Information Center

    Hagedorn, Linda Serra

    1998-01-01

    A study explored two distinct methods of calculating a precise measure of gender-based wage differentials among college faculty. The first estimation considered wage differences using a formula based on human capital; the second included compensation for past discriminatory practices. Both measures were used to predict three specific aspects of…

  1. Handling the procurement of prostheses for total hip replacement: description of an original value based approach and application to a real-life dataset reported in the UK

    PubMed Central

    Messori, Andrea; Trippoli, Sabrina; Marinai, Claudio

    2017-01-01

    Objectives In most European countries, innovative medical devices are not managed according to cost–utility methods, the reason being that national agencies do not generally evaluate these products. The objective of our study was to investigate the cost-utility profile of prostheses for hip replacement and to calculate a value-based score to be used in the process of procurement and tendering for these devices. Methods The first phase of our study was aimed at retrieving the studies reporting the values of QALYs, direct cost, and net monetary benefit (NMB) from patients undergoing total hip arthroplasty (THA) with different brands of hip prosthesis. The second phase was aimed at calculating, on the basis of the results of cost–utility analysis, a tender score for each device (defined according to standard tendering equations and adapted to a 0–100 scale). This allowed us to determine the ranking of each device in the simulated tender. Results We identified a single study as the source of information for our analysis. Nine device brands (cemented, cementless, or hybrid) were evaluated. The cemented prosthesis Exeter V40/Elite Plus Ogee, the cementless device Taperloc/Exceed, and the hybrid device Exeter V40/Trident had the highest NMB (£152 877, £156 356, and £156 210, respectively) and the best value-based tender score. Conclusions The incorporation of value-based criteria in the procurement process can contribute to optimising the value for money for THA devices. According to the approach described herein, the acquisition of these devices does not necessarily converge on the product with the lowest cost; in fact, more costly devices should be preferred when their increased cost is offset by the monetary value of the increased clinical benefit. PMID:29259062
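    The tender score described above rests on net monetary benefit, NMB = WTP x QALYs - cost, rescaled across the competing devices. The sketch below uses an illustrative willingness-to-pay threshold and invented device figures, not the study's data:

```python
# Value-based tender scoring sketch: NMB = wtp * QALYs - cost, then rescale
# to a 0-100 scale across devices. All numbers below are illustrative.
def tender_scores(devices, wtp=20000.0):
    nmb = {name: wtp * qalys - cost for name, (qalys, cost) in devices.items()}
    lo, hi = min(nmb.values()), max(nmb.values())
    return {name: 100.0 * (v - lo) / (hi - lo) for name, v in nmb.items()}

scores = tender_scores({"A": (8.1, 6200.0), "B": (8.0, 5400.0), "C": (7.9, 5900.0)})
print(scores)   # highest-NMB device scores 100, lowest scores 0
```

    Note that the highest-scoring device is not necessarily the cheapest one, which is the study's central point.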

  2. Missing value imputation in DNA microarrays based on conjugate gradient method.

    PubMed

    Dorri, Fatemeh; Azmi, Paeiz; Dorri, Faezeh

    2012-02-01

    Analysis of gene expression profiles needs a complete matrix of gene array values; consequently, imputation methods have been suggested. In this paper, an algorithm based on the conjugate gradient (CG) method is proposed to estimate missing values. The k-nearest neighbors of the missing entry are first selected based on the absolute values of their Pearson correlation coefficients. Then a subset of genes among the k-nearest neighbors is labeled as the best similar ones. The CG algorithm, with this subset as its input, is then used to estimate the missing values. Our proposed CG-based algorithm (CGimpute) is evaluated on different data sets. The results are compared with sequential local least squares (SLLSimpute), Bayesian principal component analysis (BPCAimpute), local least squares imputation (LLSimpute), iterated local least squares imputation (ILLSimpute), and adaptive k-nearest neighbors imputation (KNNKimpute) methods. The average normalized root mean square error (NRMSE) and relative NRMSE on different data sets with various missing rates show that CGimpute outperforms the other methods. Copyright © 2011 Elsevier Ltd. All rights reserved.
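    The two stages of the algorithm (neighbour selection by |Pearson r|, then a least-squares estimate solved with conjugate gradient) can be sketched as follows. This is a simplified illustration; the paper's best-subset selection step and stopping rules are not reproduced:

```python
import numpy as np

# Plain conjugate gradient for a symmetric positive-definite system Ax = b.
def cg_solve(A, b, iters=50, tol=1e-10):
    x = np.zeros_like(b)
    r = b - A @ x
    p = r.copy()
    for _ in range(iters):
        Ap = A @ p
        alpha = (r @ r) / (p @ Ap)
        x = x + alpha * p
        r_new = r - alpha * Ap
        if np.linalg.norm(r_new) < tol:
            break
        p = r_new + ((r_new @ r_new) / (r @ r)) * p
        r = r_new
    return x

# Impute data[row, col] from the k rows most correlated with the target row.
def impute(data, row, col, k=3):
    obs = [j for j in range(data.shape[1]) if j != col]
    target = data[row, obs]
    others = [i for i in range(data.shape[0]) if i != row]
    corrs = [abs(np.corrcoef(target, data[i, obs])[0, 1]) for i in others]
    nbrs = [others[i] for i in np.argsort(corrs)[-k:]]
    X = data[nbrs][:, obs].T                 # observed values of the neighbours
    w = cg_solve(X.T @ X, X.T @ target)      # CG on the normal equations
    return float(data[nbrs, col] @ w)
```

    For a matrix whose target row is an exact scaled copy of one neighbour, the imputed entry reproduces the true value.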

  3. Application of Risk within Net Present Value Calculations for Government Projects

    NASA Technical Reports Server (NTRS)

    Grandl, Paul R.; Youngblood, Alisha D.; Componation, Paul; Gholston, Sampson

    2007-01-01

    In January 2004, President Bush announced a new vision for space exploration. This included retirement of the current Space Shuttle fleet by 2010 and the development of a new set of launch vehicles. The President's vision did not include significant increases in the NASA budget, so these development programs need to be cost-conscious. Current trade study procedures address factors such as performance, reliability, safety, manufacturing, maintainability, operations, and costs. It would be desirable, however, to have increased insight into the cost factors behind each of the proposed system architectures. This paper reports on a set of component trade studies completed on the upper-stage engine for the new launch vehicles. Increased insight into architecture costs was developed by including a Net Present Value (NPV) method and applying a set of associated risks to the base parametric cost data. The use of the NPV method along with the risks was found to add fidelity to the trade study and provide additional information to support the selection of a more robust design architecture.
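    The NPV machinery referred to above is standard discounted cash flow, with risk entering as probability weights on the period cash flows. A minimal sketch; the discount rate, probabilities, and cash flows are all illustrative, not the trade-study data:

```python
# Risk-adjusted NPV sketch: probability-weighted cash flows, then standard
# discounting. All figures are illustrative assumptions.
def npv(rate, cash_flows):
    """cash_flows[0] occurs now (t = 0); cash_flows[t] at end of period t."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

# each year: 80% chance of the nominal saving, 20% chance of a degraded one
expected = [-1000.0] + [0.8 * 500.0 + 0.2 * 300.0] * 3
print(round(npv(0.07, expected), 2))
```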

  4. Opioid Modulation of Value-Based Decision-Making in Healthy Humans.

    PubMed

    Eikemo, Marie; Biele, Guido; Willoch, Frode; Thomsen, Lotte; Leknes, Siri

    2017-08-01

    Modifying behavior to maximize reward is integral to adaptive decision-making. In rodents, the μ-opioid receptor (MOR) system encodes motivation and preference for high-value rewards. Yet it remains unclear whether and how human MORs contribute to value-based decision-making. We reasoned that if the human MOR system modulates value-based choice, this would be reflected by opposite effects of agonist and antagonist drugs. In a double-blind pharmacological cross-over study, 30 healthy men received morphine (10 mg), placebo, and the opioid antagonist naltrexone (50 mg). They completed a two-alternative decision-making task known to induce a considerable bias towards the most frequently rewarded response option. To quantify MOR involvement in this bias, we fitted accuracy and reaction time data with the drift-diffusion model (DDM) of decision-making. The DDM analysis revealed the expected bidirectional drug effects for two decision subprocesses. MOR stimulation with morphine increased the preference for the stimulus with high-reward probability (shift in starting point). Compared to placebo, morphine also increased, and naltrexone reduced, the efficiency of evidence accumulation. Since neither drug affected motor-coordination, speed-accuracy trade-off, or subjective state (indeed participants were still blinded after the third session), we interpret the MOR effects on evidence accumulation efficiency as a consequence of changes in effort exerted in the task. Together, these findings support a role for the human MOR system in value-based choice by tuning decision-making towards high-value rewards across stimulus domains.
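    The drift-diffusion decomposition used in the analysis can be illustrated with a toy simulation: a starting point shifted toward the frequently rewarded boundary (the bias morphine increased) and a drift rate standing in for evidence-accumulation efficiency. Parameters are invented for illustration, not fitted values from the study:

```python
import math
import random

# Toy drift-diffusion trial: evidence x drifts between boundaries at +/-1.
# start_bias > 0 shifts the starting point toward the high-reward boundary.
random.seed(0)

def ddm_trial(drift=0.3, start_bias=0.2, dt=0.01, noise=1.0):
    x, t = start_bias, 0.0
    while abs(x) < 1.0:
        x += drift * dt + noise * math.sqrt(dt) * random.gauss(0.0, 1.0)
        t += dt
    return x >= 1.0, t          # (chose high-reward option, decision time)

choices = [ddm_trial()[0] for _ in range(2000)]
frac_high = sum(choices) / len(choices)
print(frac_high)                # well above 0.5 with positive bias and drift
```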

  5. Empirical determination of collimator scatter data for use in Radcalc commercial monitor unit calculation software: Implication for prostate volumetric modulated-arc therapy calculations.

    PubMed

    Richmond, Neil; Tulip, Rachael; Walker, Chris

    2016-01-01

    The aim of this work was to determine, by measurement and independent monitor unit (MU) check, the optimum method for determining collimator scatter for an Elekta Synergy linac with an Agility multileaf collimator (MLC) within Radcalc, a commercial MU calculation software package. Collimator scatter factors were measured for 13 field shapes defined by an Elekta Agility MLC on a Synergy linac with 6 MV photons. The collimator scatter associated with each field was also calculated according to the equation Sc = Sc(mlc) + Sc(corr) × (Sc(open) - Sc(mlc)), with Sc(corr) varied between 0 and 1, where Sc(open) is the collimator scatter calculated from the rectangular collimator-defined field and Sc(mlc) the value using only the MLC-defined field shape by applying sector integration. From this, the optimum value of the correction was determined as that which gives the minimum difference between measured and calculated Sc. Single-arc (simple fluence modulation) and dual-arc (complex fluence modulation) treatment plans were generated on the Monaco system for prostate volumetric modulated-arc therapy (VMAT) delivery. The planned MUs were verified by absolute dose measurement in a phantom and by an independent MU calculation. The MU calculations were repeated with values of Sc(corr) between 0 and 1, and the values of the correction yielding the minimum MU difference between the treatment planning system (TPS) and check MUs were established. The empirically derived value of Sc(corr) giving the best fit to the measured collimator scatter factors was 0.49. This figure, however, was not found to be optimal for either the single- or dual-arc prostate VMAT plans, which required 0.80 and 0.34, respectively, to minimize the differences between the TPS and independent-check MUs. Point dose measurements of the VMAT plans demonstrated that the TPS MUs were appropriate for the delivered dose. Although the value of Sc(corr) may be obtained by direct comparison of calculation with measurement
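    The interpolation quoted in the abstract is a one-liner, which makes the sensitivity to Sc(corr) easy to inspect directly. A sketch with illustrative scatter-factor values; only the formula and the empirically derived 0.49, 0.80, and 0.34 factors come from the abstract:

```python
# Sc = Sc_mlc + Sc_corr * (Sc_open - Sc_mlc), as quoted in the abstract.
# The two scatter factors below are illustrative, not measured values.
def collimator_scatter(sc_mlc, sc_open, sc_corr):
    return sc_mlc + sc_corr * (sc_open - sc_mlc)

for sc_corr in (0.0, 0.34, 0.49, 0.80, 1.0):
    print(sc_corr, round(collimator_scatter(0.970, 0.998, sc_corr), 4))
```

    Sc(corr) = 0 reduces to the MLC-defined field shape alone; Sc(corr) = 1 reduces to the open, jaw-defined field.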

  6. Modeling of an industrial environment: external dose calculations based on Monte Carlo simulations of photon transport.

    PubMed

    Kis, Zoltán; Eged, Katalin; Voigt, Gabriele; Meckbach, Reinhard; Müller, Heinz

    2004-02-01

    External gamma exposure from radionuclides deposited on surfaces usually makes the major contribution to the total dose to the public living in urban-industrial environments. The aim of the paper is to give an example of a calculation of the collective dose and the averted collective dose due to the contamination and decontamination of deposition surfaces in a complex environment, based on the results of Monte Carlo simulations. The shielding effects of the structures in complex and realistic industrial environments (where productive and/or commercial activity is carried out) were computed using the Monte Carlo method. Several types of deposition areas (walls, roofs, windows, streets, lawns) were considered. Moreover, this paper gives a summary of the time dependence of the source strengths relative to a reference surface and a short overview of the mechanical and chemical intervention techniques that can be applied in this area. An exposure scenario was designed based on a survey of average German and Hungarian supermarkets. In the first part of the paper, the air kermas per photon per unit area due to each specific deposition area contaminated by 137Cs were determined at several arbitrary locations in the whole environment, relative to a reference value of 8.39 x 10(-4) pGy per gamma m(-2). The calculations make it possible to assess separately the contribution of each specific deposition area to the collective dose. According to the current results, the roof and the paved area contribute the largest share (approximately 92%) of the total dose in the first year, taking into account the relative contamination of the deposition areas. When integrating over 10 or 50 y, these two surfaces remain the most important contributors, but the ratio is increasingly shifted in favor of the roof. Decontamination of the roof and the paved area accounts for about 80-90% of the total averted collective dose in each calculated time period (1, 10, and 50 y).

  7. Experience-based consulting: the value proposition.

    PubMed

    Pliner, Nicole; Thrall, James; Boland, Giles; Palumbo, Denise

    2004-11-01

    Consulting is a profession universally accepted and well entrenched throughout the business world. Whether it is providing objective analysis, supplying specific expertise, managing a project, or simply adding extra manpower, consultants can add value. But what are the attributes of a good consultant? In health care, with the rapid pace of emerging technologies, economic intricacies, and the complexities of clinical care, hands-on experience is key. Recognizing the power of consultants with hands-on experience, the Department of Radiology at Massachusetts General Hospital launched the Radiology Consulting Group, an "experience-based" model for consulting that may shift the profession's paradigm.

  8. Methods for calculating the absolute entropy and free energy of biological systems based on ideas from polymer physics.

    PubMed

    Meirovitch, Hagai

    2010-01-01

    The commonly used simulation techniques, Metropolis Monte Carlo (MC) and molecular dynamics (MD), are of a dynamical type that samples system configurations i correctly with the Boltzmann probability, P_i^B, but does not provide the value of P_i^B directly; it is therefore difficult to obtain the absolute entropy, S ~ -ln P_i^B, and the Helmholtz free energy, F. With a different simulation approach developed in polymer physics, a chain is grown step by step with transition probabilities (TPs), and the product of these TPs is the construction probability; therefore, the entropy is known. Because all exact simulation methods are equivalent, i.e. they lead to the same averages and fluctuations of physical properties, one can treat an MC or MD sample as if its members had been generated step by step. Thus, each configuration i of the sample can be reconstructed (from nothing) by calculating the TPs with which it could have been constructed. This idea also applies to bulk systems such as fluids or magnets. This approach led earlier to the "local states" (LS) and the "hypothetical scanning" (HS) methods, which are approximate in nature. A recent development is the hypothetical scanning Monte Carlo (HSMC) (or molecular dynamics, HSMD) method, which is based on stochastic TPs in which all interactions are taken into account. In this respect, HSMC(D) can be viewed as exact, and the only approximation involved is due to insufficient MC(MD) sampling for calculating the TPs. The validity of HSMC was established by applying it first to liquid argon, TIP3P water, self-avoiding walks (SAW), and polyglycine models, where the results for F were found to agree with those obtained by other methods. Subsequently, HSMD was applied to mobile loops of the enzymes porcine pancreatic alpha-amylase and acetylcholinesterase in explicit water, where the difference in F between the bound and free states of the loop was calculated.
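The core idea, entropy from a product of construction transition probabilities, can be illustrated with a toy self-avoiding-walk example. This is a minimal kinetic-growth sampler on a square lattice, not the HSMC/HSMD method itself: each step's TP is 1/(number of vacant neighbors), so -ln of the product of TPs estimates the entropy of the generated ensemble.

```python
import math
import random

random.seed(1)
MOVES = [(1, 0), (-1, 0), (0, 1), (0, -1)]

def grow_saw(n_steps):
    """Grow a self-avoiding walk step by step; return the walk and the log of
    the product of transition probabilities (TPs) used to build it, or
    (None, None) if the walk hits a dead end."""
    walk = [(0, 0)]
    occupied = {(0, 0)}
    log_p = 0.0
    for _ in range(n_steps):
        x, y = walk[-1]
        free = [(x + dx, y + dy) for dx, dy in MOVES
                if (x + dx, y + dy) not in occupied]
        if not free:
            return None, None          # dead end: growth cannot continue
        nxt = random.choice(free)      # TP = 1/len(free) for the chosen step
        log_p += -math.log(len(free))
        walk.append(nxt)
        occupied.add(nxt)
    return walk, log_p

# Entropy estimate in k_B units: S ~ <-ln P_construction> over grown walks.
samples = [grow_saw(10) for _ in range(2000)]
logs = [lp for _, lp in samples if lp is not None]
S_est = -sum(logs) / len(logs)
print(f"entropy estimate per walk (k_B units): {S_est:.3f}")
```

Because the probability of every configuration is known by construction, the entropy comes for free, which is exactly the property the HS/HSMC family exploits by reconstructing MC/MD configurations step by step.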

  9. 30 CFR 206.102 - How do I calculate royalty value for oil that I or my affiliate sell(s) under an arm's-length...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... or my affiliate sell(s) under an arm's-length contract? 206.102 Section 206.102 Mineral Resources... Federal Oil § 206.102 How do I calculate royalty value for oil that I or my affiliate sell(s) under an arm... seller under the arm's-length contract, less applicable allowances determined under §§ 206.110 or 206.111...

  10. Can value-based insurance impose societal costs?

    PubMed

    Koenig, Lane; Dall, Timothy M; Ruiz, David; Saavoss, Josh; Tongue, John

    2014-09-01

    Among policy alternatives considered to reduce health care costs and improve outcomes, value-based insurance design (VBID) has emerged as a promising option. Most applications of VBID, however, have not used higher cost sharing to discourage specific services. In April 2011, the state of Oregon introduced a policy for public employees that required additional cost sharing for high-cost procedures such as total knee arthroplasty (TKA). Our objectives were to estimate the societal impact of higher co-pays for TKA using Oregon as a case study and building on recent work demonstrating the effects of knee osteoarthritis and surgical treatment on employment and disability outcomes. We used a Markov model to estimate the societal impact in terms of quality of life, direct costs, and indirect costs of higher co-pays for TKA using Oregon as a case study. We found that TKA for a working population can generate societal benefits that offset the direct medical costs of the procedure. Delay in receiving surgical care, because of higher co-payment or other reasons, reduced the societal savings from TKA. We conclude that payers moving toward value-based cost sharing should consider consequences beyond direct medical expenses.

  11. Measurement Theory Based on the Truth Values Violates Local Realism

    NASA Astrophysics Data System (ADS)

    Nagata, Koji

    2017-02-01

    We investigate the violation factor of the Bell-Mermin inequality. Until now, it has been assumed that the results of measurement are ±1. In this case, the maximum violation factor is 2^((n-1)/2). The quantum predictions of the n-partite Greenberger-Horne-Zeilinger (GHZ) state violate the Bell-Mermin inequality by an amount that grows exponentially with n. Recently, a new measurement theory based on truth values was proposed (Nagata and Nakamura, Int. J. Theor. Phys. 55:3616, 2016), in which the values of a measurement outcome are either +1 or 0. Here we use this new measurement theory and consider the multipartite GHZ state. It turns out that the Bell-Mermin inequality is violated by the amount 2^((n-1)/2). The measurement theory based on truth values thus provides the maximum violation of the Bell-Mermin inequality.
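The 2^((n-1)/2) factor for ±1 outcomes can be checked numerically for small n. The sketch below builds the Mermin operator in its 1990 product form, F_n = (1/2i)[prod_j(sx_j + i sy_j) - prod_j(sx_j - i sy_j)]; for odd n the local-realism bound on |<F_n>| is 2^((n-1)/2), while the quantum maximum (attained by a GHZ state) is 2^(n-1), giving the violation factor 2^((n-1)/2).

```python
import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]])

def kron_all(ops):
    """Tensor product of a list of single-qubit operators."""
    out = np.array([[1.0 + 0j]])
    for op in ops:
        out = np.kron(out, op)
    return out

def mermin_operator(n):
    """F_n = (1/2i)[prod(sx + i sy) - prod(sx - i sy)] (Mermin 1990 form)."""
    plus = kron_all([sx + 1j * sy] * n)
    minus = kron_all([sx - 1j * sy] * n)
    return (plus - minus) / 2j

n = 3
F = mermin_operator(n)
q_max = np.linalg.eigvalsh(F).max()     # quantum maximum, attained by a GHZ state
lhv_bound = 2 ** ((n - 1) / 2)          # local-realism bound for odd n
print(q_max, q_max / lhv_bound)         # 4.0 and violation factor 2.0 = 2^((n-1)/2)
```

For n = 3 the quantum maximum is 4 against a local bound of 2, and the ratio 2 = 2^((3-1)/2) grows exponentially as larger odd n are substituted.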

  12. Value-based insurance plus disease management increased medication use and produced savings.

    PubMed

    Gibson, Teresa B; Mahoney, John; Ranghell, Karlene; Cherney, Becky J; McElwee, Newell

    2011-01-01

    We evaluated the effects of implementing a value-based insurance design program for patients with diabetes in two groups within a single firm. One group participated in disease management; the other did not. We matched members of the two groups to similar enrollees within the company that did not offer the value-based program. We found that participation in both value-based insurance design and disease management resulted in sustained improvement over time. Use of diabetes medications increased 6.5 percent over three years. Adherence to diabetes medical guidelines also increased, producing a return on investment of $1.33 saved for every dollar spent during a three-year follow-up period.

  13. Market-Based Health Care in Specialty Surgery: Finding Patient-Centered Shared Value.

    PubMed

    Smith, Timothy R; Rambachan, Aksharananda; Cote, David; Cybulski, George; Laws, Edward R

    2015-10-01

    The US health care system is struggling with rising costs, poor outcomes, waste, and inefficiency. The Patient Protection and Affordable Care Act represents a substantial effort to improve access and emphasizes value-based care. Value in health care has been defined as health outcomes for the patient per dollar spent. However, given the opacity of health outcomes and cost, the identification and quantification of patient-centered value is problematic. These problems are magnified by highly technical, specialized care (eg, neurosurgery). This is further complicated by potentially competing interests of the 5 major stakeholders in health care: patients, doctors, payers, hospitals, and manufacturers. These stakeholders are watching with great interest as health care in the United States moves toward a value-based system. Market principles can be harnessed to drive costs down, improve outcomes, and improve overall value to patients. However, there are many caveats to a market-based, value-driven system that must be identified and addressed. Many excellent neurosurgical efforts are already underway to nudge health care toward increased efficiency, decreased costs, and improved quality. Patient-centered shared value can provide a philosophical mooring for the development of health care policies that utilize market principles without losing sight of the ultimate goals of health care, to care for patients.

  14. Do volunteer community-based preceptors value students' feedback?

    PubMed

    Dent, M Marie; Boltri, John; Okosun, Ike S

    2004-11-01

    A key component of educational practice is to provide feedback and evaluation to teachers and learners to improve the teaching and learning process. The purpose of this study was to determine whether volunteer community preceptors value evaluation and feedback by students as much as they value other resources or rewards. In Fall 1999, a questionnaire concerning the resources and rewards of preceptorship was mailed to 236 community preceptors affiliated with the Mercer University School of Medicine, Macon, Georgia. Preceptors were asked to rate 20 factors on a five-point Likert scale (5 = very important to 1 = not very important). The mean values were compared using t-tests. One hundred sixty-eight preceptors (71%) completed questionnaires. Preceptors rated evaluation and feedback from students significantly higher (p < .001) than all other factors (mean = 4.02, standard deviation [SD] = .87). Continuing medical education for teaching was the next most highly valued factor (mean = 3.67, SD = 1.14). Preceptors rated financial compensation the lowest (mean = 2.01, SD = 1.19) of all factors. The high rank of feedback and evaluation from students persisted across gender, specialty, length of time as a preceptor, practice location, and years practicing medicine. This study demonstrates that feedback and evaluation from students is highly valued. The knowledge that community-based preceptors highly value feedback and evaluation from students should stimulate medical school programs to provide feedback and evaluation to preceptors that will enhance the educational outcomes for both faculty and learners.
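The reported comparison of mean ratings can be reproduced approximately from the summary statistics alone. The sketch below treats the two sets of ratings as independent samples, which the study's within-respondent design does not guarantee (a paired test on the raw data would typically yield a smaller p-value), so this is only a rough consistency check on the published means and SDs.

```python
import math

# Reported summary statistics: feedback/evaluation from students rated
# mean 4.02 (SD 0.87) vs CME for teaching mean 3.67 (SD 1.14), n = 168.
n = 168
m1, s1 = 4.02, 0.87
m2, s2 = 3.67, 1.14

se = math.sqrt(s1 ** 2 / n + s2 ** 2 / n)   # standard error of the difference
t = (m1 - m2) / se

# Two-sided p-value via the normal approximation (df is large here).
p = 2.0 * (1.0 - 0.5 * (1.0 + math.erf(t / math.sqrt(2.0))))
print(f"t = {t:.2f}, p ~ {p:.4f}")
```

Even under this conservative independent-samples assumption, the difference between the two most highly rated factors is clearly significant at conventional thresholds.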

  15. A Multi-Attribute Pheromone Ant Secure Routing Algorithm Based on Reputation Value for Sensor Networks

    PubMed Central

    Zhang, Lin; Yin, Na; Fu, Xiong; Lin, Qiaomin; Wang, Ruchuan

    2017-01-01

    With the development of wireless sensor networks, certain network problems have become more prominent, such as limited node resources, low data transmission security, and short network life cycles. To solve these problems effectively, it is important to design an efficient and trusted secure routing algorithm for wireless sensor networks. Traditional ant-colony optimization algorithms exhibit only local convergence, without considering the residual energy of the nodes and many other problems. This paper introduces a multi-attribute pheromone ant secure routing algorithm based on reputation value (MPASR). This algorithm can reduce the energy consumption of a network and improve the reliability of the nodes’ reputations by filtering nodes with higher coincidence rates and improving the method used to update the nodes’ communication behaviors. At the same time, the node reputation value, the residual node energy and the transmission delay are combined to formulate a synthetic pheromone that is used in the formula for calculating the random proportion rule in traditional ant-colony optimization to select the optimal data transmission path. Simulation results show that the improved algorithm can increase both the security of data transmission and the quality of routing service. PMID:28282894
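The abstract's combination of reputation, residual energy, and delay into a synthetic pheromone used by the classic random proportional rule can be sketched as follows. The weights, attribute values, and the linear combining formula are illustrative assumptions, not the MPASR paper's actual formula; only the p ~ tau^alpha * eta^beta selection rule is standard ant-colony optimization.

```python
import random

random.seed(7)

ALPHA, BETA = 1.0, 2.0                  # pheromone vs heuristic influence (classic ACO)
W_REP, W_ENERGY, W_DELAY = 0.5, 0.3, 0.2   # hypothetical combining weights

# Hypothetical neighbor attributes: reputation and residual energy in [0, 1],
# transmission delay in milliseconds.
neighbors = {
    "A": {"reputation": 0.9, "energy": 0.6, "delay_ms": 12.0},
    "B": {"reputation": 0.7, "energy": 0.9, "delay_ms": 8.0},
    "C": {"reputation": 0.4, "energy": 0.8, "delay_ms": 5.0},
}

def synthetic_pheromone(attrs):
    """Combine reputation, residual energy, and inverse delay into one value."""
    return (W_REP * attrs["reputation"]
            + W_ENERGY * attrs["energy"]
            + W_DELAY * (1.0 / attrs["delay_ms"]))

def next_hop_probabilities(neigh):
    """Random proportional rule of ant-colony optimization: p ~ tau^a * eta^b."""
    weights = {}
    for node, attrs in neigh.items():
        tau = synthetic_pheromone(attrs)
        eta = 1.0 / attrs["delay_ms"]   # heuristic desirability: shorter delay
        weights[node] = (tau ** ALPHA) * (eta ** BETA)
    total = sum(weights.values())
    return {node: w / total for node, w in weights.items()}

probs = next_hop_probabilities(neighbors)
choice = random.choices(list(probs), weights=list(probs.values()))[0]
print(probs, "->", choice)
```

Folding trust and energy into the pheromone means low-reputation or energy-depleted nodes are selected less often, which is the mechanism the algorithm uses to extend network lifetime and harden routing.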

  16. Value Based Care and Patient-Centered Care: Divergent or Complementary?

    PubMed

    Tseng, Eric K; Hicks, Lisa K

    2016-08-01

    Two distinct but overlapping care philosophies have emerged in cancer care: patient-centered care (PCC) and value-based care (VBC). Value in healthcare has been defined as the quality of care (measured typically by healthcare outcomes) modified by cost. In this conception of value, patient-centeredness is one important but not necessarily dominant quality measure. In contrast, PCC includes multiple domains of patient-centeredness and places the patient and family central to all decisions and evaluations of quality. The alignment of PCC and VBC is complicated by several tensions, including a relative lack of patient experience and preference measures, and conceptions of cost that are payer-focused instead of patient-focused. Several strategies may help to align these two philosophies, including the use of patient-reported outcomes in clinical trials and value determinations, and the purposeful integration of patient preference in clinical decisions and guidelines. Innovative models of care, including accountable care organizations and oncology patient-centered medical homes, may also facilitate alignment through improved care coordination and quality-based payment incentives. Ultimately, VBC and PCC will only be aligned if patient-centered outcomes, perspectives, and preferences are explicitly incorporated into the definitions and metrics of quality, cost, and value that will increasingly influence the delivery of cancer care.

  17. The Band Structure of Polymers: Its Calculation and Interpretation. Part 2. Calculation.

    ERIC Educational Resources Information Center

    Duke, B. J.; O'Leary, Brian

    1988-01-01

    Details ab initio crystal orbital calculations using all-trans-polyethylene as a model. Describes calculations based on various forms of translational symmetry. Compares these calculations with ab initio molecular orbital calculations discussed in a preceding article. Discusses three major approximations made in the crystal case. (CW)

  18. Computerized tomography-assisted calculation of sinus augmentation volume.

    PubMed

    Krennmair, Gerald; Krainhöfner, Martin; Maier, Harald; Weinländer, Michael; Piehslinger, Eva

    2006-01-01

    This study was intended to calculate the augmentation volume for a sinus lift procedure based on cross-sectional computerized tomography (CT) scans for 2 different augmentation heights. Based on area calculations of cross-sectional CT scans, the volume of additional bone needed was calculated for 44 sinus lift procedures. The amount of bone volume needed to raise the sinus floor to heights of both 12 and 17 mm was calculated. To achieve a sinus floor height of 12 mm, it was necessary to increase the height by a mean of 7.2+/-2.1 mm (range, 3.0 to 10.5 mm), depending on the residual ridge height; to achieve a height of 17 mm, a mean of 12.4+/-2.0 mm (range, 8.5 to 15.5 mm) was required (P < .01). The calculated augmentation volume for an augmentation height of 12 mm was 1.7+/-0.9 cm3; for an augmentation height of 17 mm, the volume required was 3.6+/-1.5 cm3. Increasing the height of the sinus lift by 5 mm, i.e., from 12 mm to 17 mm augmentation height, increased the augmentation volume by 100%. A significant correlation was found between augmentation height and the calculated sinus lift augmentation volume (r = 0.78, P < .01). Detailed preoperative knowledge of sinus lift augmentation volume is helpful as a predictive value in deciding on a donor site for harvesting autogenous bone and on the ratio of bone to bone substitute to use. Calculation of the augmentation size can help determine the surgical approach and thus perioperative treatment and the costs of the surgery for both patients and clinicians.
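Volume estimation from cross-sectional areas reduces to the Cavalieri principle: sum the per-slice graft areas and multiply by the slice spacing. The slice spacing and areas below are invented for illustration, not taken from the study.

```python
# Sketch of cross-section-based volume estimation (Cavalieri principle):
# augmentation volume = sum of per-slice graft areas * slice spacing.
slice_spacing_mm = 2.0

# Illustrative area (mm^2) of the planned graft on each cross-sectional CT slice.
graft_areas_mm2 = [55.0, 90.0, 130.0, 150.0, 140.0, 110.0, 70.0, 30.0]

volume_mm3 = sum(a * slice_spacing_mm for a in graft_areas_mm2)
volume_cm3 = volume_mm3 / 1000.0       # 1 cm^3 = 1000 mm^3
print(f"estimated augmentation volume: {volume_cm3:.2f} cm^3")
```

Raising the target sinus floor height enlarges every slice's graft area, which is why the study sees the volume roughly double when the augmentation height goes from 12 to 17 mm.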

  19. Osmotic potential calculations of inorganic and organic aqueous solutions over wide solute concentration levels and temperatures.

    PubMed

    Cochrane, T T; Cochrane, T A

    2016-01-01

    To demonstrate that the authors' new "aqueous solution vs pure water" equation for osmotic potential may be used to calculate the osmotic potentials of inorganic and organic aqueous solutions over wide ranges of solute concentrations and temperatures. Currently, the osmotic potentials of solutions used for medical purposes are calculated from equations based on the thermodynamics of the gas laws, which are accurate only at low temperatures and solute concentration levels. Some solutions used in medicine may need their osmotic potentials calculated more accurately to take solute concentrations and temperatures into account. The authors experimented with their new equation for calculating the osmotic potentials of inorganic and organic aqueous solutions up to and beyond body temperature by adjusting three of its factors: (a) the volume property of pure water, (b) the number of "free" water molecules per unit volume of solution, "Nf," and (c) the "t" factor expressing the cooperative structural relaxation time of pure water at given temperatures. Adequate information on the volume property of pure water at different temperatures is available in the literature. However, because little information was available on the relative densities of inorganic and organic solutions at varying temperatures, which is needed to calculate Nf, provisional equations were formulated to approximate those values. These values, together with tentative t values for different temperatures chosen from values calculated by different workers, were substituted into the authors' equation to demonstrate how osmotic potentials can be estimated at temperatures up to and beyond bodily temperatures. The provisional equations formulated to calculate Nf, the number of free water molecules per unit volume of inorganic and organic solute solutions, respectively, over wide concentration ranges compared well with calculations of Nf using recorded relative density data at 20 °C.
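The "gas law" baseline the authors aim to improve on is the van't Hoff relation, Psi = -i c R T, which is accurate only for dilute solutions. A minimal sketch, using the standard gas constant and assuming full dissociation for NaCl:

```python
# Van't Hoff "gas law" approximation for osmotic potential:
# Psi = -i * c * R * T, negative by water-potential convention.
R_MPA_L = 0.008314        # gas constant, MPa * L / (mol * K)

def vant_hoff_osmotic_potential(molarity, temp_k, ions_per_formula=1):
    """Osmotic potential in MPa for a dilute solution of given molarity."""
    return -ions_per_formula * molarity * R_MPA_L * temp_k

# 0.15 mol/L NaCl (i = 2, full dissociation assumed) at body temperature 310 K.
psi = vant_hoff_osmotic_potential(0.15, 310.0, ions_per_formula=2)
print(f"{psi:.3f} MPa")   # about -0.77 MPa for physiological saline
```

At high solute concentrations real solutions deviate from this linear relation, which is the regime the authors' Nf- and t-adjusted equation is intended to cover.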

  20. Can and should value-based pricing be applied to molecular diagnostics?

    PubMed

    Garau, Martina; Towse, Adrian; Garrison, Louis; Housman, Laura; Ossa, Diego

    2013-01-01

    Current pricing and reimbursement systems for diagnostics are not efficient. Prices for diagnostics are often driven by administrative practices and expected production cost. The purpose of this paper is to discuss how a value-based pricing framework, already used to ensure the efficient use and pricing of medicines, could also be applied to diagnostics. Diagnostics provide not only health gain and cost savings but also information to guide patients' decisions on interventions and their future 'behaviors'. For value assessment processes we recommend a two-part approach. Companion diagnostics introduced at the launch of a drug should be assessed through new-drug assessment processes, considering a broad range of value elements and a balanced analysis of diagnostic impacts. A separate diagnostics-dedicated committee using value-based pricing principles should review other diagnostics lying outside the companion diagnostic-and-drug 'at-launch' situation.