ERIC Educational Resources Information Center
Stolberg, Charles G., Ed.
To help improve school district financial management, the Association of School Business Officials at its 1980 annual meeting held a special session consisting of 20 "mini-workshops" about successful, field-proven practices in school budgeting, accounting, auditing, and other financial tasks. This document provides summaries of the…
Rethinking Twenty-First Century Acquisition: Emerging Trends for Efficiency Ends
1997-01-01
improve the quality of their services. However, disparities in current accounting methods and rules between the two sectors make evaluating costs ...are facilitated by the growing use of ABC methods in the accounting community. These methods tie the total costs of production or services more...Acquisition program managers can learn by studying recent developments in the private sector accounting community. Knowledge of relevant total costs and
Leister, Jan Eric; Stausberg, Jürgen
2005-09-28
Diagnosis related groups (DRGs) are a well-established provider payment system. Because of their potential for cost reduction, they have been widely introduced. In addition to cost cutting, several social objectives - e.g., improving overall health care quality - feed into the DRG system. The WHO compared different provider payment systems with regard to the following objectives: prevention of further health problems, providing services and solving health problems, and responsiveness to people's legitimate expectations. However, no study has been published that takes the impact of different cost accounting systems across the DRG systems into account. We compared the impact of different cost accounting methods within DRG-like systems by developing six criteria: integration of patients' health risk into pricing practice, incentives for quality improvement and innovation, availability of high-class evidence-based therapy, prohibition of economically founded exclusions, reduction of fragmentation incentives, and improvement of patient-oriented treatment. We set up a first overview of potential and actual impacts of the pricing practices within Yale-DRGs, AR-DRGs, G-DRGs, the Swiss AP-DRG adoption and the Swiss MIPP. We demonstrate that DRGs are not a homogeneous group of similar provider payment systems but differ considerably in how well they fulfill major health care objectives, depending on the cost accounting methods used. If DRG-based provider payment systems are to be advocated on more than their possible cost reduction, maximum accuracy concerning the method of cost accounting should prevail when implementing a new DRG-based provider payment system.
Accounting for partiality in serial crystallography using ray-tracing principles.
Kroon-Batenburg, Loes M J; Schreurs, Antoine M M; Ravelli, Raimond B G; Gros, Piet
2015-09-01
Serial crystallography generates `still' diffraction data sets that are composed of single diffraction images obtained from a large number of crystals arbitrarily oriented in the X-ray beam. Estimation of the reflection partialities, which accounts for the expected observed fractions of diffraction intensities, has so far been problematic. In this paper, a method is derived for modelling the partialities by making use of the ray-tracing diffraction-integration method EVAL. The method estimates partialities based on crystal mosaicity, beam divergence, wavelength dispersion, crystal size and the interference function, accounting for crystallite size. It is shown that modelling of each reflection by a distribution of interference-function weighted rays yields a `still' Lorentz factor. Still data are compared with a conventional rotation data set collected from a single lysozyme crystal. Overall, the presented still integration method improves the data quality markedly. The R factor of the still data compared with the rotation data decreases from 26% using a Monte Carlo approach to 12% after applying the Lorentz correction, to 5.3% when estimating partialities by EVAL and finally to 4.7% after post-refinement. The merging R(int) factor of the still data improves from 105 to 56% but remains high. This suggests that the accuracy of the model parameters could be further improved. However, with a multiplicity of around 40 and an R(int) of ∼50% the merged still data approximate the quality of the rotation data. The presented integration method suitably accounts for the partiality of the observed intensities in still diffraction data, which is a critical step to improve data quality in serial crystallography.
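The core of the correction described above is that each observed still intensity records only a fraction (the partiality) of the full reflection intensity. Below is a minimal Python sketch of that rescaling step; the function name, the p_min cutoff, and the treatment of the Lorentz factor as a separate per-reflection array are illustrative assumptions, not the EVAL implementation.

```python
import numpy as np

def correct_still_intensities(i_obs, partiality, lorentz, p_min=0.1):
    """Scale observed still intensities to full-reflection equivalents.

    i_obs      : observed integrated intensities from still images
    partiality : estimated fraction of each full reflection recorded (0-1]
    lorentz    : per-reflection 'still' Lorentz factors
    p_min      : reflections below this partiality are discarded as too
                 poorly determined to rescale reliably (assumed cutoff)
    """
    i_obs = np.asarray(i_obs, dtype=float)
    partiality = np.asarray(partiality, dtype=float)
    lorentz = np.asarray(lorentz, dtype=float)

    keep = partiality >= p_min
    # Full-intensity estimate: divide out the recorded fraction and the
    # geometric (Lorentz) factor before merging across crystals.
    i_full = i_obs[keep] / (partiality[keep] * lorentz[keep])
    return i_full, keep
```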
Improving School Accountability Measures. NBER Working Paper Series.
ERIC Educational Resources Information Center
Kane, Thomas J.; Staiger, Douglas O.
A growing number of states are using annual school-level test scores as part of their school accountability systems. This paper highlights an under-appreciated weakness of that approach, the imprecision of school-level test score means, and proposes a method for discerning signal from noise in annual school report cards. Using methods developed in…
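Kane and Staiger's point is that a small school's annual mean is a noisy estimate of its true performance. A standard way to discern signal from noise in this setting is empirical-Bayes shrinkage, sketched below; the variance components are treated as known inputs here, whereas in practice they would be estimated from the data, and this is not the authors' exact filtering method.

```python
import numpy as np

def shrink_school_means(school_means, n_students, sigma2_student, tau2_signal):
    """Empirical-Bayes shrinkage of noisy school-level test-score means.

    Each school mean has sampling variance sigma2_student / n, so small
    schools are pulled harder toward the overall mean.
    """
    school_means = np.asarray(school_means, dtype=float)
    n_students = np.asarray(n_students, dtype=float)
    grand_mean = school_means.mean()
    noise_var = sigma2_student / n_students        # per-school sampling noise
    reliability = tau2_signal / (tau2_signal + noise_var)
    return grand_mean + reliability * (school_means - grand_mean)
```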
Cartwright, William S
2008-04-01
Researchers have been at the forefront of applying new costing methods to drug abuse treatment programs and innovations. The motivation for such work has been to improve costing accuracy. Recent work has seen applications initiated in establishing charts of account and cost accounting for service delivery. As a result, researchers now have available five methods to apply to the costing of drug abuse treatment programs. In all areas of costing, there is room for more research on costing concepts and measurement applications. Additional work would be useful in establishing studies with activity-based costing for both research and managerial purposes. Studies of economies of scope are particularly relevant because of the integration of social services and criminal justice in drug abuse treatment. In the long run, managerial initiatives to improve the administration and quality of drug abuse treatment will benefit directly from research with new information on costing techniques.
Stewart, Sarah; Pearson, Janet; Rome, Keith; Dalbeth, Nicola; Vandal, Alain C
2018-01-01
Statistical techniques currently used in musculoskeletal research often inefficiently account for paired-limb measurements or the relationship between measurements taken from multiple regions within limbs. This study compared three commonly used analysis methods with a mixed-models approach that appropriately accounted for the association between limbs, regions, and trials and that utilised all information available from repeated trials. Four analysis methods were applied to an existing data set containing plantar pressure data, which was collected for seven masked regions on right and left feet, over three trials, across three participant groups. Methods 1-3 averaged data over trials and analysed right foot data (Method 1), data from a randomly selected foot (Method 2), and averaged right and left foot data (Method 3). Method 4 used all available data in a mixed-effects regression that accounted for repeated measures taken for each foot, foot region and trial. Confidence interval widths for the mean differences between groups for each foot region were used as a criterion for comparison of statistical efficiency. Mean differences in pressure between groups were similar across methods for each foot region, while the confidence interval widths were consistently smaller for Method 4. Method 4 also revealed significant between-group differences that were not detected by Methods 1-3. A mixed-effects linear model approach generates improved efficiency and power by producing more precise estimates compared to alternative approaches that discard information in the process of accounting for paired-limb measurements. This approach is recommended for generating more clinically sound and statistically efficient research outputs. Copyright © 2017 Elsevier B.V. All rights reserved.
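A minimal sketch of a "Method 4"-style analysis using statsmodels follows. The column names are assumptions, and the random-effects structure is simplified to a per-participant random intercept rather than the full foot/region/trial nesting described in the abstract.

```python
import pandas as pd
import statsmodels.formula.api as smf

# df is assumed to have one row per (participant, foot, region, trial) with
# columns: pressure, group, region, foot, trial, participant_id.
def fit_plantar_pressure_model(df: pd.DataFrame):
    """Mixed-effects model using all trials and both feet: fixed effects for
    group and region, random intercept per participant, so repeated measures
    within a person are not treated as independent."""
    model = smf.mixedlm(
        "pressure ~ group * region",   # between-group differences by region
        data=df,
        groups="participant_id",       # random intercept for each participant
        re_formula="1",
    )
    return model.fit()
```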
School Improvement Model to Foster Student Learning
ERIC Educational Resources Information Center
Rulloda, Rudolfo Barcena
2011-01-01
Many classroom teachers are still using the traditional teaching methods. The traditional teaching methods are one-way learning process, where teachers would introduce subject contents such as language arts, English, mathematics, science, and reading separately. However, the school improvement model takes into account that all students have…
Refinement of Fread's Method for improved tracking of stream discharges during unsteady flows
Lee, Kyutae; Muste, Marian
2017-02-07
There is a plethora of analytical approaches to account for the effect of unsteady flow (a.k.a. hysteretic behavior) on the conventionally built steady rating curves (RCs) used to continuously estimate discharges in open channel flow. One of the most complete correction methods is Fread's method (Fread, 1975), which is based on the fully dynamic one-dimensional wave equation. Proposed herein is a modified Fread's method that is adjusted to account for the actual geometry of the cross section. This method improves the accuracy of the estimated conveyance factor and energy slope, so it is particularly useful for small- to mid-size streams/rivers where the original method's assumptions do not properly hold. The modified Fread's method is tested at sites in Clear Creek (Iowa, USA) and the Ebro River (Spain) to illustrate the significance of its improvement in discharge estimation. While the degree of improvement is apparent for the conveyance factor, because the hydraulic depth is replaced by the hydraulic radius, the improvement in the energy slope term depends on the site and event conditions.
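For context, the simplest member of this family of unsteady corrections is the Jones-type adjustment of a steady rating curve, sketched below. Fread's method and the modification described here are more complete (fully dynamic, with cross-section geometry), so this is background rather than the paper's algorithm.

```python
import numpy as np

def jones_corrected_discharge(q_steady, dh_dt, celerity, s0):
    """Classic Jones-type unsteady correction to a steady rating curve:

        Q = Q_rc * sqrt(1 + (1 / (S0 * c)) * dh/dt)

    q_steady : discharge from the steady rating curve [m^3/s]
    dh_dt    : rate of stage change [m/s] (positive on the rising limb)
    celerity : kinematic wave celerity c [m/s]
    s0       : energy slope, often approximated by the bed slope [-]
    """
    factor = 1.0 + dh_dt / (s0 * celerity)
    return q_steady * np.sqrt(np.maximum(factor, 0.0))
```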
Just Culture: A Foundation for Balanced Accountability and Patient Safety
Boysen, Philip G.
2013-01-01
Background: The framework of a just culture ensures balanced accountability for both individuals and the organization responsible for designing and improving systems in the workplace. Engineering principles and human factors analysis influence the design of these systems so they are safe and reliable. Methods: Approaches for improving patient safety introduced here are (1) analysis of error, (2) specific tools to enhance safety, and (3) outcome engineering. Conclusion: The just culture is a learning culture that is constantly improving and oriented toward patient safety. PMID:24052772
2017-01-01
Unique Molecular Identifiers (UMIs) are random oligonucleotide barcodes that are increasingly used in high-throughput sequencing experiments. Through a UMI, identical copies arising from distinct molecules can be distinguished from those arising through PCR amplification of the same molecule. However, bioinformatic methods to leverage the information from UMIs have yet to be formalized. In particular, sequencing errors in the UMI sequence are often ignored or else resolved in an ad hoc manner. We show that errors in the UMI sequence are common and introduce network-based methods to account for these errors when identifying PCR duplicates. Using these methods, we demonstrate improved quantification accuracy both under simulated conditions and on real iCLIP and single-cell RNA-seq data sets. Reproducibility between iCLIP replicates and single-cell RNA-seq clustering are both improved using our proposed network-based method, demonstrating the value of properly accounting for errors in UMIs. These methods are implemented in the open source UMI-tools software package. PMID:28100584
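The directional network idea can be sketched compactly: treat a low-count UMI one mismatch away from a high-count UMI as a likely error copy and merge them. The sketch follows the published UMI-tools heuristic (count(a) >= 2*count(b) - 1), but the naive all-pairs neighbour search is illustrative and far less efficient than the real implementation.

```python
from collections import Counter

def hamming1(a: str, b: str) -> bool:
    """True if equal-length sequences differ at exactly one position."""
    return sum(x != y for x, y in zip(a, b)) == 1

def directional_clusters(umi_counts: Counter) -> int:
    """Count unique molecules with a directional network, in the spirit of
    UMI-tools: an edge a -> b is drawn when the UMIs differ by one base and
    count(a) >= 2 * count(b) - 1, i.e. b is plausibly a PCR/sequencing error
    copy of a. Each connected cluster counts as one molecule."""
    umis = sorted(umi_counts, key=umi_counts.get, reverse=True)
    assigned = set()
    n_molecules = 0
    for u in umis:                      # seeds in decreasing-count order
        if u in assigned:
            continue
        n_molecules += 1
        stack = [u]
        assigned.add(u)
        while stack:                    # absorb reachable lower-count copies
            cur = stack.pop()
            for v in umis:
                if v not in assigned and hamming1(cur, v) and \
                        umi_counts[cur] >= 2 * umi_counts[v] - 1:
                    assigned.add(v)
                    stack.append(v)
    return n_molecules

# Example: three reads of AAAA plus one likely error copy AAAT -> 1 molecule.
print(directional_clusters(Counter({"AAAA": 3, "AAAT": 1})))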
Generalized contact and improved frictional heating in the material point method
NASA Astrophysics Data System (ADS)
Nairn, J. A.; Bardenhagen, S. G.; Smith, G. D.
2017-09-01
The material point method (MPM) has proved to be an effective particle method for computational mechanics modeling of problems involving contact, but all prior applications have been limited to Coulomb friction. This paper generalizes the MPM approach for contact to handle any friction law, with examples given for friction with adhesion or with a velocity-dependent coefficient of friction. Accounting for adhesion requires an extra calculation to evaluate contact area. Implementation of velocity-dependent laws usually needs numerical methods to find contact forces. The friction process involves work, which can be converted into heat. This paper provides a new method for calculating frictional heating that accounts for interfacial acceleration during the time step. The acceleration term is small for many problems, but temporal convergence of heating effects for problems involving vibrations and high contact forces is improved by the new method. Fortunately, the new method needs few extra calculations and therefore is recommended for all simulations.
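To illustrate why velocity-dependent friction laws usually need a numerical solve: the frictional force depends on the post-step slip velocity, which in turn depends on the force. Below is a self-contained bisection sketch under an assumed exponential mu(v) law; all parameter values and the simplified momentum balance are illustrative, not from the paper.

```python
from math import exp

def slip_velocity(v_in, normal_force, mass, dt,
                  mu_s=0.5, mu_k=0.3, decay=10.0, tol=1e-10):
    """Solve the implicit slip condition for a velocity-dependent law

        m * (v_in - v_out) = mu(v_out) * N * dt,
        mu(v) = mu_k + (mu_s - mu_k) * exp(-decay * v)

    by bisection. Returns the post-step slip velocity (0 means sticking).
    Parameter values are assumptions for illustration."""
    mu = lambda v: mu_k + (mu_s - mu_k) * exp(-decay * v)
    g = lambda v: v_in - mu(v) * normal_force * dt / mass - v
    if g(0.0) <= 0.0:           # friction can absorb all momentum: stick
        return 0.0
    lo, hi = 0.0, v_in          # g(lo) > 0 and g(hi) < 0 bracket the root
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if g(mid) > 0.0 else (lo, mid)
    return 0.5 * (lo + hi)
```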
Billing and accounts receivable: fundamentals for improvement.
Bizon, M M
1993-07-01
If a healthcare facility's accounts receivable operation is experiencing problems, the patient accounts manager should survey all areas of his or her responsibility to determine the best method of resolving the difficulties. One effective technique to reduce billing problems is to take a proactive--not reactive--approach. If mistakes can be corrected before they get out of control, and if the patient accounts manager can ensure that claims will not be denied, a healthcare facility's accounts receivable should remain in good condition.
Accounting for Co-Teaching: A Guide for Policymakers and Developers of Value-Added Models
ERIC Educational Resources Information Center
Isenberg, Eric; Walsh, Elias
2015-01-01
We outline the options available to policymakers for addressing co-teaching in a value-added model. Building on earlier work, we propose an improvement to a method of accounting for co-teaching that treats co-teachers as teams, with each teacher receiving equal credit for co-taught students. Hock and Isenberg (2012) described a method known as the…
Jewett, Ethan M.; Steinrücken, Matthias; Song, Yun S.
2016-01-01
Many approaches have been developed for inferring selection coefficients from time series data while accounting for genetic drift. These approaches have been motivated by the intuition that properly accounting for the population size history can significantly improve estimates of selective strengths. However, the improvement in inference accuracy that can be attained by modeling drift has not been characterized. Here, by comparing maximum likelihood estimates of selection coefficients that account for the true population size history with estimates that ignore drift by assuming allele frequencies evolve deterministically in a population of infinite size, we address the following questions: how much can modeling the population size history improve estimates of selection coefficients? How much can mis-inferred population sizes hurt inferences of selection coefficients? We conduct our analysis under the discrete Wright–Fisher model by deriving the exact probability of an allele frequency trajectory in a population of time-varying size and we replicate our results under the diffusion model. For both models, we find that ignoring drift leads to estimates of selection coefficients that are nearly as accurate as estimates that account for the true population history, even when population sizes are small and drift is high. This result is of interest because inference methods that ignore drift are widely used in evolutionary studies and can be many orders of magnitude faster than methods that account for population sizes. PMID:27550904
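A sketch of the "ignore drift" estimator the authors evaluate: propagate the deterministic haploid selection recursion and maximize a binomial sampling likelihood over s. The recursion, grid search, and example data are illustrative simplifications, not the authors' exact likelihood machinery.

```python
import numpy as np
from scipy.stats import binom

def loglik_s(s, p0, generations, counts, sample_sizes):
    """Log-likelihood of selection coefficient s when drift is ignored: the
    allele frequency follows the deterministic haploid recursion
        p_{t+1} = p_t (1 + s) / (1 + p_t s)
    and observed allele counts are binomial draws from that trajectory."""
    p, traj = p0, []
    for _ in range(generations + 1):
        traj.append(p)
        p = p * (1 + s) / (1 + p * s)
    traj = np.array(traj)
    sampled = traj[list(counts.keys())]           # sampled generations
    k = np.array(list(counts.values()))
    n = np.array([sample_sizes[t] for t in counts])
    return binom.logpmf(k, n, sampled).sum()

# Grid-search MLE on invented data: allele rising from 10% to ~30%.
counts = {0: 10, 25: 22, 50: 31}                  # derived-allele counts
sizes = {0: 100, 25: 100, 50: 100}                # sample sizes
grid = np.linspace(-0.1, 0.2, 301)
s_hat = grid[np.argmax([loglik_s(s, 0.1, 50, counts, sizes) for s in grid])]
print(f"MLE ignoring drift: s = {s_hat:.3f}")
```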
Broadening the Educational Evaluation Lens with Communicative Evaluation
ERIC Educational Resources Information Center
Brooks-LaRaviere, Margaret; Ryan, Katherine; Miron, Luis; Samuels, Maurice
2009-01-01
Outcomes-based accountability in the form of test scores and performance indicators are a primary lever for improving student achievement in the current educational landscape. The article presents communicative evaluation as a complementary evaluation approach that may be used along with the primary methods of school accountability to provide a…
Practical Considerations when Using Benchmarking for Accountability in Higher Education
ERIC Educational Resources Information Center
Achtemeier, Sue D.; Simpson, Ronald D.
2005-01-01
The qualitative study on which this article is based examined key individuals' perceptions, both within a research university community and beyond in its external governing board, of how to improve benchmarking as an accountability method in higher education. Differing understanding of benchmarking revealed practical implications for using it as…
Accountability Indicators from the Viewpoint of Statistical Method.
ERIC Educational Resources Information Center
Jordan, Larry
Few people seriously regard students as "products" coming off an educational assembly line, but notions about accountability and quality improvement in higher education are pervaded by manufacturing ideas and metaphors. Because numerical indicators of quality are inevitably expressed by trend lines or statistical control charts of some kind, they…
The need for monetary information within corporate water accounting.
Burritt, Roger L; Christ, Katherine L
2017-10-01
A conceptual discussion is provided about the need to add monetary data to water accounting initiatives and how best to achieve this if companies are to become aware of the water crisis and to take actions to improve water management. Analysis of current water accounting initiatives reveals the monetary business case for companies to improve water management is rarely considered, there being a focus on physical information about water use. Three possibilities emerge for mainstreaming the integration of monetization into water accounting: add-on to existing water accounting frameworks and tools, develop new tools which include physical and monetary information from the start, and develop environmental management accounting (EMA) into a water-specific application and set of tools. The paper appraises these three alternatives and concludes that development of EMA would be the best way forward. Suggestions for further research include the need to examine the use of a transdisciplinary method to address the complexities of water accounting. Copyright © 2017 Elsevier Ltd. All rights reserved.
Setting of index system of environmental and economic accounting of water
NASA Astrophysics Data System (ADS)
Tan, Yarong
2017-10-01
To advance the quality of integrated water management in China, a scientific and complete index system of environmental and economic accounting should be built. At present, the water shortage in China is becoming increasingly serious, which further highlights the importance of efficient water management and of improving the index system of water economic accounting. Based on the internal structure of the new statistical method of environmental and economic accounting, this paper focuses on analyzing and discussing the index system that such accounting should include.
Kassahun, Aron; Braka, Fiona; Gallagher, Kathleen; Gebriel, Aregai Wolde; Nsubuga, Peter; M’pele-Kilebou, Pierre
2017-01-01
Introduction: The World Health Organization (WHO), Ethiopia country office, introduced an accountability framework into its Polio Eradication Program in 2014 with the aim of improving the program's performance. Our study aims to evaluate staff performance and key program indicators following the introduction of the accountability framework. Methods: The impact of the WHO accountability framework was reviewed after its first year of implementation, from June 2014 to June 2015. We analyzed selected program and staff performance indicators associated with acute flaccid paralysis (AFP) surveillance from a database available at WHO. Data on managerial actions taken were also reviewed. Performance of a total of 38 staff was evaluated during our review. Results: Our review of results for the first four quarters of implementation of the polio eradication accountability framework showed improvement both at the program and individual level when compared with the previous year. Managerial actions taken during the study period based on the results from the monitoring tool included eleven written acknowledgments, six discussions regarding performance improvement, six rotations of staff, four written first-warning letters and nine non-renewals of contracts. Conclusion: The introduction of the accountability framework resulted in improvement in staff performance and overall program indicators for AFP surveillance. PMID:28890753
Measurement Error and Environmental Epidemiology: A Policy Perspective
Edwards, Jessie K.; Keil, Alexander P.
2017-01-01
Purpose of review: Measurement error threatens public health by producing bias in estimates of the population impact of environmental exposures. Quantitative methods to account for measurement bias can improve public health decision making. Recent findings: We summarize traditional and emerging methods to improve inference under a standard perspective, in which the investigator estimates an exposure response function, and a policy perspective, in which the investigator directly estimates the population impact of a proposed intervention. Summary: Under a policy perspective, the analysis must be sensitive to errors in measurement of factors that modify the effect of exposure on outcome, must consider whether policies operate on the true or measured exposures, and may increasingly need to account for potentially dependent measurement error of two or more exposures affected by the same policy or intervention. Incorporating approaches to account for measurement error into such a policy perspective will increase the impact of environmental epidemiology. PMID:28138941
Parameter estimation using weighted total least squares in the two-compartment exchange model.
Garpebring, Anders; Löfstedt, Tommy
2018-01-01
The linear least squares (LLS) estimator provides a fast approach to parameter estimation in the linearized two-compartment exchange model. However, the LLS method may introduce a bias through correlated noise in the system matrix of the model. The purpose of this work is to present a new estimator for the linearized two-compartment exchange model that takes this noise into account. To account for the noise in the system matrix, we developed an estimator based on the weighted total least squares (WTLS) method. Using simulations, the proposed WTLS estimator was compared, in terms of accuracy and precision, to an LLS estimator and a nonlinear least squares (NLLS) estimator. The WTLS method improved the accuracy compared to the LLS method to levels comparable to the NLLS method. This improvement came at the expense of increased computational time; however, the WTLS was still faster than the NLLS method. At high signal-to-noise ratio all methods provided similar precision, while inconclusive results were observed at low signal-to-noise ratio. The proposed method provides improvements in accuracy compared to the LLS method, although at an increased computational cost. Magn Reson Med 79:561-567, 2017. © 2017 International Society for Magnetic Resonance in Medicine.
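The motivation for (W)TLS is that ordinary least squares is biased when the system matrix itself is noisy. A minimal unweighted TLS sketch via the SVD is shown below against OLS on simulated errors-in-variables data; the paper's WTLS estimator additionally weights by the noise structure, which this sketch omits.

```python
import numpy as np

def total_least_squares(X, y):
    """Unweighted total least squares for y ~ X @ beta, treating X as noisy:
    the solution comes from the right singular vector of [X | y] associated
    with the smallest singular value."""
    Z = np.column_stack([X, y])
    _, _, vt = np.linalg.svd(Z, full_matrices=False)
    v = vt[-1]                 # singular vector for smallest singular value
    return -v[:-1] / v[-1]

# Example: noise in X attenuates the OLS slope; TLS stays near the truth.
rng = np.random.default_rng(0)
x_true = rng.normal(size=(500, 1))
y = 2.0 * x_true[:, 0] + rng.normal(scale=0.3, size=500)
X_noisy = x_true + rng.normal(scale=0.3, size=x_true.shape)
ols = np.linalg.lstsq(X_noisy, y, rcond=None)[0]
print("OLS:", ols, " TLS:", total_least_squares(X_noisy, y))
```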
Accountability Groups to Enhance Language Learning in a University Intensive English Program
ERIC Educational Resources Information Center
Lippincott, Dianna
2017-01-01
This mixed methods classroom research examined if accountability groups in the lower proficiency levels of a university intensive English program would improve students' language acquisition. Students were assigned partners for the study period with whom they completed assignments inside and outside of class, as well as set goals for use of…
A framework for simulating map error in ecosystem models
Sean P. Healey; Shawn P. Urbanski; Paul L. Patterson; Chris Garrard
2014-01-01
The temporal depth and spatial breadth of observations from platforms such as Landsat provide unique perspective on ecosystem dynamics, but the integration of these observations into formal decision support will rely upon improved uncertainty accounting. Monte Carlo (MC) simulations offer a practical, empirical method of accounting for potential map errors in broader...
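A minimal Monte Carlo sketch of the idea: repeatedly re-draw the true class composition behind a map using a confusion matrix, and propagate each draw into the quantity of interest. All numbers and the carbon-density framing are illustrative assumptions, not the authors' data.

```python
import numpy as np

def mc_carbon_uncertainty(area_ha, carbon_per_ha, confusion, n_sims=10_000):
    """Monte Carlo propagation of classification error into a carbon total.

    area_ha       : mapped area per class [ha]
    carbon_per_ha : assumed carbon density for each class [t C / ha]
    confusion     : row-normalized matrix; confusion[i, j] is the probability
                    that a hectare mapped as class i is truly class j
    """
    rng = np.random.default_rng(42)
    totals = np.empty(n_sims)
    for k in range(n_sims):
        total = 0.0
        for i, a in enumerate(area_ha):
            # Re-draw the true class composition of the area mapped as i.
            true_mix = rng.multinomial(int(a), confusion[i])
            total += true_mix @ carbon_per_ha
        totals[k] = total
    return totals.mean(), np.percentile(totals, [2.5, 97.5])

# Two classes (forest, non-forest) with assumed cross-contamination rates.
mean, ci = mc_carbon_uncertainty(
    area_ha=np.array([1000, 4000]),
    carbon_per_ha=np.array([120.0, 10.0]),
    confusion=np.array([[0.9, 0.1], [0.05, 0.95]]),
)
print(f"{mean:.0f} t C, 95% interval {ci}")
```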
The March to Accountable Care Organizations--How Will Rural Fare?
ERIC Educational Resources Information Center
MacKinney, A. Clinton; Mueller, Keith J.; McBride, Timothy D.
2011-01-01
Purpose: This article describes a strategy for rural providers, communities, and policy makers to support or establish accountable care organizations (ACOs). Methods: ACOs represent a new health care delivery and provider payment system designed to improve clinical quality and control costs. The Patient Protection and Affordable Care Act (ACA)…
Tegegne, Sisay G.; MKanda, Pascal; Yehualashet, Yared G.; Erbeto, Tesfaye B.; Touray, Kebba; Nsubuga, Peter; Banda, Richard; Vaz, Rui G.
2016-01-01
Background. An accountability framework is a central feature of managing human and financial resources. One of its primary goals is to improve program performance through close monitoring of selected priority activities. The principal objective of this study was to determine the contribution of a systematic accountability framework to improving the performance of the World Health Organization (WHO)–Nigeria polio program staff, as well as the program itself. Methods. The effect of implementation of the accountability framework was evaluated using data on administrative actions and select process indicators associated with acute flaccid paralysis (AFP) surveillance, routine immunization, and polio supplemental immunization activities. Data were collected in 2014 during supportive supervision, using Magpi software (a company that provides service to collect data using mobile phones). A total of 2500 staff were studied. Results. Data on administrative actions and process indicators from quarters 2–4 in 2014 were compared. With respect to administrative actions, 1631 personnel (74%) received positive feedback (written or verbal commendation) in quarter 4 through the accountability framework, compared with 1569 (73%) and 1152 (61%) during quarters 3 and 2, respectively. These findings accorded with data on process indicators associated with AFP surveillance and routine immunization, showing statistically significant improvements in staff performance at the end of quarter 4, compared with other quarters. Conclusions. Improvements in staff performance and process indicators were observed for the WHO-Nigeria polio program after implementation of a systematic accountability framework. PMID:26823334
Quality improvement in neonatal care - a new paradigm for developing countries.
Chawla, Deepak; Suresh, Gautham K
2014-12-01
Infrastructure for facility-based neonatal care has rapidly grown in India over last few years. Experience from developed countries indicates that different health facilities have varying clinical outcomes despite accounting for differences in illness severity of admitted neonates and random variation. Variation in quality of care provided at different neonatal units may account for variable clinical outcomes. Monitoring quality of care, comparing outcomes across different centers and conducting collaborative quality improvement projects can improve outcome of neonates in health facilities. Top priority should be given to establishing quality monitoring and improvement procedures at special care neonatal units and neonatal intensive care units of the country. This article presents an overview of methods of quality improvement. Literature reports of successful collaborative quality improvement projects in neonatal health are also reviewed.
Computation of subsonic flow around airfoil systems with multiple separation
NASA Technical Reports Server (NTRS)
Jacob, K.
1982-01-01
A numerical method for computing the subsonic flow around multi-element airfoil systems was developed, allowing for flow separation at one or more elements. Besides multiple rear separation, short bubbles on the upper surface and cove bubbles can also be taken into account approximately, as can compressibility effects in purely subsonic flow. The method is presented, applied to several examples, and improved in some details. Finally, the present limitations and desirable extensions are discussed.
ERIC Educational Resources Information Center
Ylimaki, Rose M.; Brunderman, Lynnette; Bennett, Jeffrey V.; Dugan, Thad
2014-01-01
Today's accountability policies and changing demographics have created conditions in which leaders must rapidly build school capacity and improve outcomes in culturally diverse schools. This article presents findings from a mixed-methods evaluation of an Arizona Turnaround Leadership Development Project. The project drew on studies of turnaround…
A systematic framework for Monte Carlo simulation of remote sensing errors map in carbon assessments
S. Healey; P. Patterson; S. Urbanski
2014-01-01
Remotely sensed observations can provide unique perspective on how management and natural disturbance affect carbon stocks in forests. However, integration of these observations into formal decision support will rely upon improved uncertainty accounting. Monte Carlo (MC) simulations offer a practical, empirical method of accounting for potential remote sensing errors...
NASA Technical Reports Server (NTRS)
Ruo, S. Y.
1978-01-01
A computer program was developed to account approximately for the effects of finite wing thickness in transonic potential flow over an oscillating wing of finite span. The program is based on the original sonic box computer program for planar wings, which was extended to account for the effect of wing thickness. Computational efficiency and accuracy were improved, and swept trailing edges were accounted for. The nonuniform flow caused by finite thickness is accounted for by applying the local linearization concept with an appropriate coordinate transformation. A brief description of each computer routine and of the cubic spline and spline-surface data-fitting techniques used in the program is given, and the method of input is shown in detail. Sample calculations and a complete listing of the computer program are presented.
Improving environmental impact and cost assessment for supplier evaluation
NASA Astrophysics Data System (ADS)
Beucker, Severin; Lang, Claus
2004-02-01
Improving a company's environmental and financial performance necessitates the evaluation of environmental impacts deriving from the production and cost effects of corporate actions. These effects have to be made transparent, and concrete targets have to be developed. Such an evaluation has to be done on a regular basis but with limited expense. To achieve this, different instruments of environmental controlling, such as LCA and environmental performance indicators, have to be combined with methods from cost accounting. Within the research project CARE (Computer Aided Resource Efficiency Accounting for Medium-Sized Enterprises), the method Resource Efficiency Accounting (REA) is used to give the participating companies new insights into hidden costs and environmental effects of their production and products. The method combines process-based cost accounting with environmental impact assessment methodology and offers results that can be integrated into a company's environmental controlling system and business processes like cost accounting, supplier assessment, etc. Much of the data necessary for the combined assessment can be available within a company's IT system and can therefore be used efficiently for the assessment process. The project CARE puts a strong focus on the use of company data and information systems for the described assessment process and offers a methodological background for the evaluation and structuring of such data. Besides the general approach of the project CARE, the paper presents results from a case study in which the described approach is used for the evaluation of suppliers.
Beyond Measurement and Reward: Methods of Motivating Quality Improvement and Accountability.
Berenson, Robert A; Rice, Thomas
2015-12-01
The article examines public policies designed to improve quality and accountability that do not rely on financial incentives and public reporting of provider performance. Payment policy should help temper the current "more is better" attitude of physicians and provider organizations. Incentive neutrality would better support health professionals' intrinsic motivation to act in their patients' best interests to improve overall quality than would pay-for-performance plans targeted to specific areas of clinical care. Public policy can support clinicians' intrinsic motivation through approaches that support systematic feedback to clinicians and provide concrete opportunities to collaborate to improve care. Some programs administered by the Centers for Medicare & Medicaid Services, including Partnership for Patients and Conditions of Participation, deserve more attention; they represent available, but largely ignored, approaches to support providers to improve quality and protect beneficiaries against substandard care. Public policies related to quality improvement should focus more on methods of enhancing professional intrinsic motivation, while recognizing the potential role of organizations to actively promote and facilitate that motivation. Actually achieving improvement, however, will require a reexamination of the role played by financial incentives embedded in payments and the unrealistic expectations placed on marginal incentives in pay-for-performance schemes. © Health Research and Educational Trust.
Yue Xu, Selene; Nelson, Sandahl; Kerr, Jacqueline; Godbole, Suneeta; Patterson, Ruth; Merchant, Gina; Abramson, Ian; Staudenmayer, John; Natarajan, Loki
2018-04-01
Physical inactivity is a recognized risk factor for many chronic diseases. Accelerometers are increasingly used as an objective means to measure daily physical activity. One challenge in using these devices is missing data due to device nonwear. We used a well-characterized cohort of 333 overweight postmenopausal breast cancer survivors to examine missing data patterns of accelerometer outputs over the day. Based on these observed missingness patterns, we created pseudo-simulated datasets with realistic missing data patterns. We developed statistical methods to design imputation and variance weighting algorithms to account for missing data effects when fitting regression models. Bias and precision of each method were evaluated and compared. Our results indicated that not accounting for missing data in the analysis yielded unstable estimates in the regression analysis. Incorporating variance weights and/or subject-level imputation improved precision by >50%, compared to ignoring missing data. We recommend that these simple easy-to-implement statistical tools be used to improve analysis of accelerometer data.
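One simple way to realize the variance-weighting idea is to down-weight subject-days with heavy nonwear when fitting a regression. The linear wear-time weight below is an assumed form for illustration, not the weighting the authors derived.

```python
import numpy as np
import statsmodels.api as sm

def weighted_activity_regression(X, y, wear_minutes, full_day=600):
    """Weight each subject-day by how completely the device was worn, so
    days with more nonwear (noisier activity estimates) count for less.
    The linear weight and the 600-minute full day are assumptions."""
    w = np.clip(np.asarray(wear_minutes) / full_day, 0.1, 1.0)
    model = sm.WLS(y, sm.add_constant(X), weights=w)
    return model.fit()
```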
The missing link in Aboriginal care: resource accounting.
Ashton, C W; Duffie-Ashton, Denise
2008-01-01
Resource accounting principles provide more effective planning for Aboriginal healthcare delivery through driving best management practices, efficacious techniques for long-term resource allocation, transparency of information and performance measurement. Major improvements to Aboriginal health in New Zealand and Australia were facilitated in the context of this public finance paradigm, rather than cash accounting systems that remain the current method for public departments in Canada. Multiple funding sources and fragmented delivery of Aboriginal healthcare can be remedied through similar adoption of such principles.
Sustaining Reliability on Accountability Measures at The Johns Hopkins Hospital.
Pronovost, Peter J; Holzmueller, Christine G; Callender, Tiffany; Demski, Renee; Winner, Laura; Day, Richard; Austin, J Matthew; Berenholtz, Sean M; Miller, Marlene R
2016-02-01
In 2012 Johns Hopkins Medicine leaders challenged their health system to reliably deliver best practice care linked to nationally vetted core measures and achieve The Joint Commission Top Performer on Key Quality Measures® program recognition and the Delmarva Foundation award. Thus, the Armstrong Institute for Patient Safety and Quality implemented an initiative to ensure that ≥96% of patients received care linked to measures. Nine low-performing process measures were targeted for improvement: eight Joint Commission accountability measures and one Delmarva Foundation core measure. In the initial evaluation at The Johns Hopkins Hospital, all accountability measures for the Top Performer program reached the required ≥95% performance, gaining them recognition by The Joint Commission in 2013. Efforts were then made to sustain performance of accountability measures at The Johns Hopkins Hospital. Improvements were sustained through 2014 using the following conceptual framework: declare and communicate goals, create an enabling infrastructure, engage clinicians and connect them in peer learning communities, report transparently, and create accountability systems. One part of the accountability system was for teams to create a sustainability plan, which they presented to senior leaders. To support sustained improvements, Armstrong Institute leaders added a project management office for all externally reported quality measures and concurrent reviewers to audit performance on care processes for certain measure sets. The Johns Hopkins Hospital sustained performance on all accountability measures, and now more than 96% of patients receive recommended care consistent with nationally vetted quality measures. The initiative methods enabled the transition of quality improvement from an isolated project to a way of leading an organization.
ERIC Educational Resources Information Center
Johnstone, Christopher; Thurlow, Martha; Moore, Michael; Altman, Jason
2006-01-01
The No Child Left Behind Act of 2001 (NCLB) and other recent changes in federal legislation have placed greater emphasis on accountability in large-scale testing. Included in this emphasis are regulations that require assessments to be accessible. States are accountable for the success of all students, and tests should be designed in a way that…
Multiple Testing of Gene Sets from Gene Ontology: Possibilities and Pitfalls.
Meijer, Rosa J; Goeman, Jelle J
2016-09-01
The use of multiple testing procedures in the context of gene-set testing is an important but relatively underexposed topic. If a multiple testing method is used, this is usually a standard familywise error rate (FWER) or false discovery rate (FDR) controlling procedure in which the logical relationships that exist between the different (self-contained) hypotheses are not taken into account. Taking those relationships into account, however, can lead to more powerful variants of existing multiple testing procedures and can make summarizing and interpreting the final results easier. We will show that, from the perspective of interpretation as well as from the perspective of power improvement, FWER controlling methods are more suitable than FDR controlling methods. As an example of a possible power improvement, we suggest a modified version of the popular method by Holm, which we also implemented in the R package cherry. © The Author 2015. Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.
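For reference, the unmodified Holm step-down procedure that the cherry variant sharpens can be stated in a few lines:

```python
import numpy as np

def holm_rejections(p_values, alpha=0.05):
    """Holm's step-down procedure: controls the FWER at level alpha.

    Compare the i-th smallest p-value against alpha / (m - i) and stop at
    the first failure. (The paper's variant gains power by additionally
    exploiting logical relations between the hypotheses.)"""
    p = np.asarray(p_values, dtype=float)
    m = len(p)
    order = np.argsort(p)
    reject = np.zeros(m, dtype=bool)
    for i, idx in enumerate(order):
        if p[idx] <= alpha / (m - i):
            reject[idx] = True
        else:
            break
    return reject

print(holm_rejections([0.001, 0.015, 0.04, 0.30]))  # [ True  True False False]
```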
The Role of Leadership and Culture in Creating Meaningful Assessment: A Mixed Methods Case Study
ERIC Educational Resources Information Center
Guetterman, Timothy C.; Mitchell, Nancy
2016-01-01
With increased demands for institutional accountability and improved student learning, involvement in assessment has become a fundamental role of higher education faculty (Rhodes, 2010). However, faculty members and administrators often question whether assessment efforts do indeed improve student learning (Hutchings, 2010). This mixed methods…
ERIC Educational Resources Information Center
Lindle, Jane Clark; Stalion, Nancy; Young, Lu
2005-01-01
Kentucky's accountability system includes a school-processes audit known as Standards and Indicators for School Improvement (SISI), which is in a nascent stage of validation. Content validity methods include comparison to instruments measuring similar constructs as well as other techniques such as job analysis. This study used a two-phase process…
FOOTPRINTS FOR SUSTAINABILITY: THE NEXT STEPS
This paper discusses the strengths and weaknesses of the ecological footprint as an ecological accounting method, points out research needs for improvement of the analysis, and suggests potential new applications.
NASA Technical Reports Server (NTRS)
Parrott, T. L.
1973-01-01
An improved method for the design of expansion-chamber mufflers is described and applied to the task of reducing exhaust noise generated by a helicopter. The method is an improvement of standard transmission-line theory in that it accounts for the effect of the mean exhaust-gas flow on the acoustic-transmission properties of a muffler system, including the termination boundary condition. The method has been computerized, and the computer program includes an optimization procedure that adjusts muffler component lengths to achieve a minimum specified desired transmission loss over a specified frequency range. A printout of the program is included together with a user-oriented description.
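For background, the no-flow transmission loss of a single expansion chamber from standard transmission-line theory is a one-line formula; the paper's contribution is precisely the mean-flow and termination corrections that this classic sketch omits.

```python
import numpy as np

def expansion_chamber_tl(freq_hz, length_m, area_ratio, c=343.0):
    """Transmission loss of a single expansion chamber, classic theory
    without mean-flow effects:

        TL = 10 log10(1 + 0.25 (m - 1/m)^2 sin^2(kL)),  k = 2 pi f / c

    where m is the chamber-to-pipe area ratio and L the chamber length."""
    k = 2.0 * np.pi * np.asarray(freq_hz) / c
    m = area_ratio
    return 10.0 * np.log10(1.0 + 0.25 * (m - 1.0 / m) ** 2
                           * np.sin(k * length_m) ** 2)

# Peaks occur where kL = pi/2, 3pi/2, ...; TL falls to 0 where kL = n*pi.
print(expansion_chamber_tl([200, 400, 800], length_m=0.3, area_ratio=9.0))
```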
Matrix model of the grinding process of cement clinker in the ball mill
NASA Astrophysics Data System (ADS)
Sharapov, Rashid R.
2018-02-01
The article addresses improving the efficiency of production of fine powders, in particular Portland cement clinker, and considers the grinding of Portland cement clinker in closed-circuit ball mills. It is noted that the main task of modeling the grinding process is predicting the granulometric composition of the finished product, taking into account the design and technological parameters of the ball mill and separator used. It is shown that the most complete and informative characterization of the grinding process in a ball mill is a grinding matrix that takes into account the transformation of the grain composition inside the mill drum, showing what relative mass fraction of the particles of crushed material passes into each corresponding size fraction. A practical task is the reconstruction of the grinding matrix from experimental data obtained on real operating installations. On the basis of experimental data obtained on industrial installations, the matrix method is used to determine the kinetics of the grinding process in closed-circuit ball mills, and a calculation method for the conversion of the grain composition of the crushed material along the mill drum is developed. The proposed approach allows processing methods to be optimized to improve the manufacturing process of Portland cement clinker.
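The matrix model can be illustrated directly: a single pass through the mill maps the feed size distribution to the product distribution via a lower-triangular grinding matrix whose columns conserve mass. The 3-class matrix below is an invented example, not a reconstructed industrial matrix.

```python
import numpy as np

def apply_grinding_matrix(feed, G):
    """One pass of the matrix model of grinding: the product particle-size
    distribution is the grinding matrix applied to the feed distribution.

    feed : mass fractions per size class (coarsest first), summing to 1
    G    : lower-triangular grinding matrix; G[i, j] is the mass fraction
           of class-j material ending up in (finer or equal) class i
    """
    product = np.asarray(G, dtype=float) @ np.asarray(feed, dtype=float)
    return product / product.sum()   # re-normalize against rounding drift

# Invented 3-class example: half of each class survives each pass, the
# rest breaks one class finer (columns sum to 1, conserving mass).
G = np.array([[0.5, 0.0, 0.0],
              [0.5, 0.5, 0.0],
              [0.0, 0.5, 1.0]])
print(apply_grinding_matrix([0.6, 0.3, 0.1], G))   # -> [0.3, 0.45, 0.25]
```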
Technology as an Instrument to Improve Quality, Accountability, and Reflection in Academic Medicine
ERIC Educational Resources Information Center
Wilkes, Michael S.; Howell, Lydia
2006-01-01
Objective: This article describes two complementary technology systems used in academic medicine to 1) improve the quality of learning and teaching, and 2) describe the barriers and obstacles encountered in implementing these systems. Method: The literature was integrated with in-depth, case-based experience with technology related to student…
Cheng, Dunlei; Branscum, Adam J; Stamey, James D
2010-07-01
To quantify the impact of ignoring misclassification of a response variable and measurement error in a covariate on statistical power, and to develop software for sample size and power analysis that accounts for these flaws in epidemiologic data. A Monte Carlo simulation-based procedure is developed to illustrate the differences in design requirements and inferences between analytic methods that properly account for misclassification and measurement error and those that do not, in regression models for cross-sectional and cohort data. We found that failure to account for these flaws in epidemiologic data can lead to a substantial reduction in statistical power, over 25% in some cases. The proposed method substantially reduced bias, by up to a ten-fold margin, compared to naive estimates obtained by ignoring misclassification and mismeasurement. We recommend as routine practice that researchers account for errors in measurement of both response and covariate data when determining sample size, performing power calculations, or analyzing data from epidemiological studies. 2010 Elsevier Inc. All rights reserved.
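A minimal version of such a simulation-based power calculation: generate data from a true logistic model, misclassify the outcome with given sensitivity and specificity, fit the naive model, and count rejections. Parameter values are illustrative.

```python
import numpy as np
import statsmodels.api as sm

def power_with_misclassification(n, beta1, sens=0.9, spec=0.9,
                                 n_sims=500, alpha=0.05, seed=1):
    """Monte Carlo power for logistic regression when the binary outcome
    is misclassified with given sensitivity/specificity but the analysis
    naively ignores the misclassification."""
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(n_sims):
        x = rng.normal(size=n)
        p = 1.0 / (1.0 + np.exp(-(-1.0 + beta1 * x)))   # true model
        y = rng.binomial(1, p)
        # Flip true positives with prob 1-sens, true negatives with 1-spec.
        flip = np.where(y == 1, rng.random(n) > sens, rng.random(n) > spec)
        y_obs = np.where(flip, 1 - y, y)
        fit = sm.Logit(y_obs, sm.add_constant(x)).fit(disp=0)
        hits += fit.pvalues[1] < alpha
    return hits / n_sims

# Power erodes relative to a perfectly measured outcome (sens = spec = 1).
print(power_with_misclassification(300, beta1=0.5, sens=0.8, spec=0.9))
```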
Informal payments and the financing of health care in developing and transition countries.
Lewis, Maureen
2007-01-01
Informal, under-the-table payments to public health care providers are increasingly viewed as a critically important source of health care financing in developing and transition countries. With minimal funding levels and limited accountability, publicly financed and delivered care falls prey to illegal payments, which can exceed 100 percent of a country's median income. Methods to address the abuse include establishing official fees, combined with improved oversight and accountability for public health care providers, and a role for communities in holding providers accountable.
Patient Accounting Systems: Are They Fit with the Users' Requirements?
Ayatollahi, Haleh; Nazemi, Zahra
2016-01-01
Objectives: A patient accounting system is a subsystem of a hospital information system. This system, like other information systems, should be carefully designed to be able to meet users' requirements. The main aim of this research was to investigate users' requirements and to determine whether current patient accounting systems meet users' needs or not. Methods: This was a survey study, and the participants were the users of six patient accounting systems used in 24 teaching hospitals. A stratified sampling method was used to select the participants (n = 216). The research instruments were a questionnaire and a checklist. A mean value of ≥3 indicated the importance of each data element and the capability of the system. Results: Generally, the findings showed that the current patient accounting systems had some weaknesses and were able to meet between 70% and 80% of users' requirements. Conclusions: The current patient accounting systems need to be improved to be able to meet users' requirements. This approach can also help to provide hospitals with more usable and reliable financial information. PMID:26893945
Mshana, Simon; Shemilu, Haji; Ndawi, Benedict; Momburi, Roman; Olsen, Oystein Evjen; Byskov, Jens; Martin, Douglas K
2007-01-01
Background: Priority setting in every health system is complex and difficult. In less wealthy countries the dominant approach to priority setting has been Burden of Disease (BOD) and cost-effectiveness analysis (CEA), which is helpful but insufficient because it focuses on a narrow range of values - need and efficiency - and not the full range of relevant values, including legitimacy and fairness. 'Accountability for reasonableness' is a conceptual framework for legitimate and fair priority setting that is empirically based and ethically justified. It connects priority setting to broader, more fundamental, democratic deliberative processes that have an impact on social justice and equity. Can 'accountability for reasonableness' be helpful for improving priority setting in less wealthy countries? Methods: In 2005, Tanzanian scholars from the Primary Health Care Institute (PHCI) conducted 6 capacity-building workshops with senior health staff, district planners and managers, and representatives of the Tanzanian Ministry of Health to discuss improving priority setting in Tanzania using 'accountability for reasonableness'. The purpose of this paper is to describe this initiative and the participants' views about the approach. Results: The approach to improving priority setting using 'accountability for reasonableness' was viewed by district decision makers with enthusiastic favour because it was the first framework that directly addressed their priority-setting concerns. High-level Ministry of Health participants were also very supportive of the approach. Conclusion: Both Tanzanian district and governmental health planners viewed the 'accountability for reasonableness' approach with enthusiastic favour because it was the first framework that directly addressed their concerns. PMID:17997824
NASA Astrophysics Data System (ADS)
Olson, R.; An, S. I.
2016-12-01
Atlantic Meridional Overturning Circulation (AMOC) in the ocean might slow down in the future, which could lead to a host of climatic effects in the North Atlantic and throughout the world. Despite improvements in climate models and the availability of new observations, AMOC projections remain uncertain. Here we constrain CMIP5 multi-model ensemble output with observations of a recently developed AMOC index to provide improved Bayesian predictions of future AMOC. Specifically, we first calculate a yearly AMOC index loosely based on Rahmstorf et al. (2015) for the years 1880-2004 for both the observations and the CMIP5 models for which relevant output is available. We then assign a weight to each model based on a Bayesian Model Averaging method that accounts for differential model skill in terms of both mean state and variability. We include the temporal autocorrelation in climate model errors and account for the uncertainty in the parameters of our statistical model. We use the weights to provide future weighted projections of AMOC and compare them to un-weighted ones. Our projections use bootstrapping to account for uncertainty in internal AMOC variability. We also perform spectral and other statistical analyses to show that AMOC index variability, both in models and in observations, is consistent with red noise. Our results improve on and complement previous work by using a new ensemble of climate models, a different observational metric, and an improved Bayesian weighting method that accounts for differential model skill at reproducing internal variability. Reference: Rahmstorf, S., Box, J. E., Feulner, G., Mann, M. E., Robinson, A., Rutherford, S., & Schaffernicht, E. J. (2015). Exceptional twentieth-century slowdown in Atlantic Ocean overturning circulation. Nature Climate Change, 5(5), 475-480. doi:10.1038/nclimate2554
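A minimal sketch of the weighting idea, assuming equal prior model probabilities and independent Gaussian errors (the study additionally models temporal autocorrelation and statistical-parameter uncertainty, which this sketch omits); all arrays are hypothetical stand-ins:

```python
import numpy as np

# Hypothetical data: observed AMOC index (obs) and simulated indices from
# M climate models (sims), each of length T years.
rng = np.random.default_rng(0)
T, M = 125, 5
obs = rng.normal(0.0, 0.5, T)
sims = obs + rng.normal(0.0, 0.5, (M, T)) + rng.normal(0, 0.3, (M, 1))

def log_likelihood(sim, obs, sigma):
    # Independent Gaussian errors; the paper's method also models
    # autocorrelated errors.
    resid = obs - sim
    return -0.5 * np.sum((resid / sigma) ** 2 + np.log(2 * np.pi * sigma ** 2))

# One likelihood per model; BMA weights are normalized likelihoods
# under equal prior model probabilities.
logL = np.array([log_likelihood(s, obs, sigma=0.5) for s in sims])
weights = np.exp(logL - logL.max())
weights /= weights.sum()

# Weighted projection: average of (hypothetical) per-model future changes.
future = rng.normal(-1.0, 0.4, M)
print("weights:", np.round(weights, 3))
print("weighted projection:", weights @ future)
```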
Automated Transition State Theory Calculations for High-Throughput Kinetics.
Bhoorasingh, Pierre L; Slakman, Belinda L; Seyedzadeh Khanshan, Fariba; Cain, Jason Y; West, Richard H
2017-09-21
A scarcity of known chemical kinetic parameters leads to the use of many reaction rate estimates, which are not always sufficiently accurate, in the construction of detailed kinetic models. To reduce the reliance on these estimates and improve the accuracy of predictive kinetic models, we have developed a high-throughput, fully automated reaction rate calculation method, AutoTST. The algorithm integrates automated saddle-point geometry search methods and a canonical transition state theory kinetics calculator. The automatically calculated reaction rates compare favorably to existing estimated rates. Comparisons against high-level theoretical calculations show that the new automated method performs better than rate estimates when the estimate is made by a poor analogy. The method can be further improved by accounting for internal rotor contributions and by improving methods for determining molecular symmetry.
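For reference, the canonical transition state theory rate constant takes the Eyring form k(T) = (k_B T / h)(Q‡/Q_reactants) exp(-E0/RT). The sketch below evaluates it with illustrative inputs; AutoTST derives the partition functions and barrier height from automated quantum chemistry calculations, which are not reproduced here:

```python
import numpy as np

kB = 1.380649e-23    # Boltzmann constant, J/K
h  = 6.62607015e-34  # Planck constant, J*s
R  = 8.314462618     # gas constant, J/(mol*K)

def tst_rate(T, Q_ratio, E0_kJ_per_mol):
    """Canonical TST: k(T) = (kB*T/h) * (Q_ts/Q_react) * exp(-E0/(R*T)).
    Q_ratio and E0 are illustrative placeholders for quantities AutoTST
    obtains from saddle-point and frequency calculations."""
    return (kB * T / h) * Q_ratio * np.exp(-E0_kJ_per_mol * 1e3 / (R * T))

for T in (500.0, 1000.0, 1500.0):
    print(f"{T:6.0f} K -> k = {tst_rate(T, 1e-2, 120.0):.3e} s^-1")
```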
Discussion on water resources value accounting and its application
NASA Astrophysics Data System (ADS)
Guo, Biying; Huang, Xiaorong; Ma, Kai; Gao, Linyun; Wang, Yanqiu
2018-06-01
The compilation of a natural resources balance sheet has been under exploration since it was proposed in 2013. Several elements of the water resources balance sheet have been actively discussed in China, including the basic concept, framework, and accounting methods, which have focused on calculating the amount of water resources with statistical methods but have lacked an analysis of the interrelationship between physical volume and value. Building on the physical accounting of the water resources balance sheet, the connotation of water resources value is analyzed in combination with international research on the value of water resources. The theoretical framework, form of measurement, and research methods of water resources value accounting are then further explored. Taking Chengdu, China as an example, an index system for the water resources balance sheet of Chengdu, covering both physical and value terms, is established to account for the depletion of water resources, environmental damage, and ecological water occupation caused by economic and social water use. Moreover, the water resources balance sheet for the region, which reflects the negative impact of the economy on the environment, is established. This provides a reference for advancing water resources management, improving government and social investment, and realizing the scientific and rational allocation of water resources.
Advances in Applications of Hierarchical Bayesian Methods with Hydrological Models
NASA Astrophysics Data System (ADS)
Alexander, R. B.; Schwarz, G. E.; Boyer, E. W.
2017-12-01
Mechanistic and empirical watershed models are increasingly used to inform water resource decisions. Growing access to historical stream measurements and data from in-situ sensor technologies has increased the need for improved techniques for coupling models with hydrological measurements. Techniques that account for the intrinsic uncertainties of both models and measurements are especially needed. Hierarchical Bayesian methods provide an efficient modeling tool for quantifying model and prediction uncertainties, including those associated with measurements. Hierarchical methods can also be used to explore spatial and temporal variations in model parameters and uncertainties that are informed by hydrological measurements. We used hierarchical Bayesian methods to develop a hybrid (statistical-mechanistic) SPARROW (SPAtially Referenced Regression On Watershed attributes) model of long-term mean annual streamflow across diverse environmental and climatic drainages in 18 U.S. hydrological regions. Our application illustrates the use of a new generation of Bayesian methods that offer more advanced computational efficiencies than the prior generation. Evaluations of the effects of hierarchical (regional) variations in model coefficients and uncertainties on model accuracy indicate improved prediction accuracies (median of 10-50%), but primarily in humid eastern regions, where model uncertainties are one-third of those in arid western regions. Generally moderate regional variability is observed for most hierarchical coefficients. Accounting for measurement and structural uncertainties, using hierarchical state-space techniques, revealed the effects of spatially heterogeneous, latent hydrological processes in the "localized" drainages between calibration sites; this improved model precision, with only minor changes in regional coefficients. Our study can inform advances in the use of hierarchical methods with hydrological models to improve their integration with stream measurements.
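A minimal sketch of the partial-pooling behavior at the heart of hierarchical methods, using a conjugate normal-normal update to shrink noisy regional coefficient estimates toward a common mean (hypothetical numbers, not the SPARROW model itself):

```python
import numpy as np

# Hypothetical per-region coefficient estimates (e.g., a runoff coefficient)
# and their sampling variances from independent regional fits.
theta_hat = np.array([0.8, 1.2, 0.5, 1.0, 1.6])   # regional estimates
se2 = np.array([0.05, 0.20, 0.10, 0.02, 0.30])    # sampling variances

mu, tau2 = theta_hat.mean(), theta_hat.var()      # crude hyperparameters

# Conjugate normal-normal posterior mean: each region is shrunk toward
# the pooled mean, more strongly where its own estimate is noisy.
shrink = tau2 / (tau2 + se2)
posterior_mean = shrink * theta_hat + (1 - shrink) * mu
print(np.round(posterior_mean, 3))
```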
Funk, Eric; Riddell, Jeff; Ankel, Felix; Cabrera, Daniel
2018-06-12
Health professions educators face multiple challenges, among them the need to adapt educational methods to new technologies. In recent decades multiple new digital platforms have appeared in the learning arena, including massive open online courses and social media-based education. The major critique of these novel methods is the inability to ascertain the origin, validity, and accountability of the knowledge that is created, shared, and acquired. Recently, a novel technology based on secured data storage and transmission, called blockchain, has emerged as a way to generate networks where validity, trust, and accountability can be created. Conceptually, blockchain is an open, public, distributed, and secure digital registry where information transactions are secured and have a clear origin, explicit pathways, and concrete value. Health professions education based on the blockchain will potentially allow improved tracking of content and the individuals who create it, quantify educational impact on multiple generations of learners, and build a relative value of educational interventions. Furthermore, institutions adopting blockchain technology would be able to provide certification and credentialing of healthcare professionals with no intermediaries. There is potential for blockchain to significantly change the future of health professions education and radically transform how patients, professionals, educators, and learners interact around safe, valid, and accountable information.
Principles and methods of managerial cost-accounting systems.
Suver, J D; Cooper, J C
1988-01-01
An introduction to cost-accounting systems for pharmacy managers is provided; terms are defined and examples of specific applications are given. Cost-accounting systems determine, record, and report the resources consumed in providing services. An effective cost-accounting system must provide the information needed for both internal and external reports. In accounting terms, cost is the value given up to secure an asset. In determining how volumes of activity affect costs, fixed costs and variable costs are calculated; applications include pricing strategies, cost determinations, and break-even analysis. Also discussed are the concepts of direct and indirect costs, opportunity costs, and incremental and sunk costs. For most pharmacy department services, process costing, an accounting of intermediate outputs and homogeneous units, is used; in determining the full cost of providing a product or service (e.g., patient stay), job-order costing is used. Development of work-performance standards is necessary for monitoring productivity and determining product costs. In allocating pharmacy department costs, a ratio of costs to charges can be used; this method is convenient, but microcosting (specific identification of the costs of products) is more accurate. Pharmacy managers can use cost-accounting systems to evaluate the pharmacy's strategies, policies, and services and to improve budgets and reports.
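As a concrete illustration of one application named above, break-even analysis follows directly from the fixed/variable cost split; the figures below are hypothetical, not drawn from the article:

```python
# Break-even analysis: the volume at which revenue covers fixed plus
# variable costs. All figures are hypothetical, for illustration only.
fixed_costs = 120_000.0        # annual fixed costs of a pharmacy service ($)
price_per_unit = 25.0          # charge per dispensed unit ($)
variable_cost_per_unit = 10.0  # drug and supply cost per unit ($)

contribution_margin = price_per_unit - variable_cost_per_unit
break_even_volume = fixed_costs / contribution_margin
print(f"break-even volume: {break_even_volume:.0f} units/year")  # 8000
```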
Measurement of the complex permittivity of low loss polymer powders in the millimeter-wave range.
Kapilevich, Boris; Litvak, Boris; Wainstein, Vladimir; Moshe, Danny
2007-01-01
An improved method for measuring the complex permittivity of low-loss polymer powders is suggested. The measurements are made in the mm-wave range using a quasi-optical resonator. A 2-D corrugated mode exciter is employed to improve the suppression of undesirable higher-order modes. The model used for reconstructing the complex permittivity takes into account the ohmic losses of the metal mesh coupling, which improves the accuracy of the reconstruction procedure. An example illustrating the method is reported.
ERIC Educational Resources Information Center
Jorgensen, Frances; Kofoed, Lise Busk
2007-01-01
In this paper, a study is presented in which engineering students at a Danish university developed Continuous Improvement (CI) and innovation capabilities through action research and experiential learning methods. The paper begins with a brief overview of the literature on CI and innovation, followed by an account of how the students designed and…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ovacik, Meric A.; Androulakis, Ioannis P., E-mail: yannis@rci.rutgers.edu; Biomedical Engineering Department, Rutgers University, Piscataway, NJ 08854
2013-09-15
Pathway-based information has become an important source of information both for establishing evolutionary relationships and for understanding the mode of action of a chemical or pharmaceutical among species. Cross-species comparison of pathways can address two broad questions: informing evolutionary relationships, and extrapolating species differences for a number of applications including drug and toxicity testing. Cross-species comparison of metabolic pathways is complex, as there are multiple features of a pathway that can be modeled and compared. Among the various methods that have been proposed, reaction alignment has emerged as the most successful at predicting phylogenetic relationships based on NCBI taxonomy. We propose an improvement of the reaction alignment method by accounting for enzyme sequence similarity in addition to reaction alignment. Using nine species, including human and some model organisms and test species, we evaluate the standard and improved comparison methods by analyzing the conservation of the glycolysis and citrate cycle pathways. In addition, we demonstrate how organism comparison can be conducted by accounting for the cumulative information retrieved from nine pathways in central metabolism, as well as in a more complete study involving 36 pathways common to all nine species. Our results indicate that reaction alignment with enzyme sequence similarity results in a more accurate representation of pathway-specific cross-species similarities and differences based on NCBI taxonomy.
How to Improve Quality If You're Not in Manufacturing.
ERIC Educational Resources Information Center
Johnson, Gary K.; Dumas, Roland A.
1992-01-01
Discusses the problems of applying quality methods to jobs that are not directly involved with manufacturing such as sales, merchandising, law, health care, accounting, and food service. Presents a model for nonmanufacturing organizations. (JOW)
Armsworth, Paul R; Jackson, Heather B; Cho, Seong-Hoon; Clark, Melissa; Fargione, Joseph E; Iacona, Gwenllian D; Kim, Taeyoung; Larson, Eric R; Minney, Thomas; Sutton, Nathan A
2017-12-21
Conservation organizations must redouble efforts to protect habitat given continuing biodiversity declines. Prioritization of future areas for protection is hampered by disagreements over what the ecological targets of conservation should be. Here we test the claim that such disagreements will become less important as conservation moves away from prioritizing areas for protection based only on ecological considerations and accounts for varying costs of protection using return-on-investment (ROI) methods. We combine a simulation approach with a case study of forests in the eastern United States, paying particular attention to how covariation between ecological benefits and economic costs influences agreement levels. For many conservation goals, agreement over spatial priorities improves with ROI methods. However, we also show that a reliance on ROI-based prioritization can sometimes exacerbate disagreements over priorities. As such, accounting for costs in conservation planning does not enable society to sidestep careful consideration of the ecological goals of conservation.
Wentlandt, Kirsten; Bracaglia, Andrea; Drummond, James; Handren, Lindsay; McCann, Joshua; Clarke, Catherine; Degendorfer, Niki; Chan, Charles K
2015-12-22
The Physician Quality Improvement Initiative (PQII) uses a well-established multi-source feedback program and incorporates an additional facilitated feedback review with the participant's department chief. The purpose of this mixed-methods study was to examine the value of the PQII by eliciting feedback from various stakeholders. All participants and department chiefs (n = 45) were invited to provide feedback on the project implementation and outcomes via survey and/or an interview. The survey consisted of 12 questions focused on the value of the PQII, its influence on practice, and the promotion of quality improvement and accountability. A total of 5 chiefs and 12 physician participants completed semi-structured interviews. Participants found the PQII process, report, and review session helpful, self-affirming or an opportunity for self-reflection, and an opportunity to engage their leaders about their practice. Chiefs indicated the sessions strengthened their understanding of, and ability to communicate and engage physicians about, their practice, best practices, quality improvement, and accountability. Thirty participants (66.7%) completed the survey; of the responders, 75.9%, 89.7%, and 86.7% found patient, co-worker, and physician colleague feedback valuable, respectively. A total of 67.9% valued their facilitated review with their chief, and 55.2% indicated they were contemplating change because of their feedback. Participants believed the PQII promoted quality improvement (27/30, 90.0%) and accountability (28/30, 93.3%). The PQII provides an opportunity for physician development, affirmation, and reflection, but also a structure for furthering departmental quality improvement and best practices, and, finally, an opportunity to enhance communication, accountability, and relationships between the organization, department chiefs, and their staff.
Using soil surveys to target riparian buffers in the Chesapeake Bay watershed
Michael G. Dosskey
2008-01-01
The efficacy of vegetative buffers for improving water quality could be enhanced by distinguishing differences in buffer capability across watersheds and accounting for them in buffer planning. A soil survey-based method was applied to riparian areas in the Chesapeake Bay watershed. The method is based on soil attributes that are important in determining buffer...
Model of medicines sales forecasting taking into account factors of influence
NASA Astrophysics Data System (ADS)
Kravets, A. G.; Al-Gunaid, M. A.; Loshmanov, V. I.; Rasulov, S. S.; Lempert, L. B.
2018-05-01
The article describes a method for forecasting sales of medicines under conditions where the available data sample is insufficient for building a model from historical data alone. The developed method is applicable mainly to new drugs that are already licensed and released for sale but do not yet have stable sales performance in the market. The purpose of this study is to demonstrate the effectiveness of the suggested method for forecasting drug sales, taking into account selected factors of influence identified during a review of existing solutions and an analysis of the specifics of the area under study. Three experiments were performed on samples of different volumes, which showed an improvement in the accuracy of sales forecasts on small samples.
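A minimal sketch of the general idea of regressing sales on influence factors; the factor names, coefficients, and data below are hypothetical stand-ins, since the article's actual factors and model are not reproduced here:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
n = 30                                  # a deliberately small sample
season = rng.uniform(0, 1, n)           # hypothetical influence factors:
promo = rng.integers(0, 2, n)           # seasonality index, promotion flag,
price = rng.uniform(5, 15, n)           # and unit price
sales = 100 + 40 * season + 25 * promo - 3 * price + rng.normal(0, 5, n)

# Influence factors enter as extra regressors alongside the (scarce) history.
X = np.column_stack([season, promo, price])
model = LinearRegression().fit(X, sales)
print("R^2 on training data:", round(model.score(X, sales), 3))
print("forecast:", model.predict([[0.7, 1, 9.5]]))
```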
Computational Fluid Dynamics Simulation Study of Active Power Control in Wind Plants
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fleming, Paul; Aho, Jake; Gebraad, Pieter
2016-08-01
This paper presents an analysis of a wind plant's ability to provide active power control services, using a high-fidelity computational fluid dynamics-based wind plant simulator. This approach allows examination of the impact of wind turbine wake interactions within a wind plant on the performance of the wind plant controller. The paper investigates several control methods for improving performance in waked conditions. One method uses wind plant wake controls, an active field of research in which wind turbine control systems are coordinated to account for their wakes, to improve overall performance. Results demonstrate the challenge of providing active power control in waked conditions but also identify potential methods for improving this performance.
Governance Practices in an Era of Healthcare Transformation: Achieving a Successful Turnaround.
Sondheim, Samuel E; Patel, Dipesh M; Chin, Nicole; Barwis, Kurt; Werner, Julie; Barclay, Alexus; Mattie, Angela
This article illustrates the successful application of principles established by the American Hospital Association (AHA) to foster hospital transformations. We examined a small community hospital's successful transition from one emergency care center (ECC) physician group to another and the methods by which significant improvements in outcomes were achieved. The foundation of this transformation included a generative governance style at the board level, a shared governance model at the employee level, a renewed sense of employee and physician engagement, and a sense of individual accountability. Outcomes included improved communication, a more unified vision throughout the ECC (which led to improved efficiency and accountability among staff), improved metrics, and a positive impact on the community's perception of care. Press Ganey scores and ECC operational metrics demonstrated significant increases in patient satisfaction and decreases in wait times across seven operational metrics. These data serve as a proxy for the transformation's success. Structured interviews revealed an increase in employee satisfaction associated with the transition. The positive outcomes demonstrate the importance of the AHA-articulated governance principles. The AHA recommendations for a superior value-based care model closely align with the methods illustrated by Bristol Hospital's successful transformation. Other institutions can apply the lessons from this case study to drive positive change and improve patient care.
Latent variable method for automatic adaptation to background states in motor imagery BCI
NASA Astrophysics Data System (ADS)
Dagaev, Nikolay; Volkova, Ksenia; Ossadtchi, Alexei
2018-02-01
Objective. Brain-computer interface (BCI) systems are known to be vulnerable to variability in the background states of a user. Usually, no detailed information on these states is available, even during the training stage. Thus there is a need for a method that is capable of taking background states into account in an unsupervised way. Approach. We propose a latent variable method that is based on a probabilistic model with a discrete latent variable. To estimate the model's parameters, we suggest using the expectation-maximization algorithm. The proposed method is aimed at assessing characteristics of background states without any corresponding data labeling. In the context of an asynchronous motor imagery paradigm, we applied this method to real data from twelve able-bodied subjects, with open/closed eyes serving as background states. Main results. We found that the latent variable method improved classification of target states compared to the baseline method (in seven of twelve subjects). In addition, we found that our method was also capable of background state recognition (in six of twelve subjects). Significance. Without any supervised information on background states, the latent variable method provides a way to improve classification in BCI by taking background states into account at the training stage and then by making decisions on target states weighted by the posterior probabilities of background states at the prediction stage.
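The following is a minimal sketch of the EM machinery such an approach relies on, reduced to a one-dimensional two-state Gaussian mixture over a hypothetical background feature; the paper's actual probabilistic model and features are not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical 1-D background feature (e.g., occipital alpha power) with two
# unlabeled background states, standing in for eyes open vs. eyes closed.
x = np.concatenate([rng.normal(1.0, 0.3, 200), rng.normal(2.5, 0.4, 200)])

mu = np.array([0.5, 3.0])        # initial state means
sigma = np.array([1.0, 1.0])     # initial state std devs
pi = np.array([0.5, 0.5])        # initial mixing weights

def normal_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

for _ in range(50):  # EM iterations
    # E-step: posterior responsibility of each latent state for each sample.
    dens = pi * normal_pdf(x[:, None], mu, sigma)
    resp = dens / dens.sum(axis=1, keepdims=True)
    # M-step: re-estimate state parameters from responsibility-weighted data.
    nk = resp.sum(axis=0)
    mu = (resp * x[:, None]).sum(axis=0) / nk
    sigma = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
    pi = nk / len(x)

print("state means:", np.round(mu, 2))
# resp[:, k] gives per-sample posterior state probabilities, which could
# weight per-state classifier decisions at training and prediction time.
```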
Cederberg, C; Henriksson, M; Berglund, M
2013-06-01
The last decade has seen an increase in environmental systems analysis of livestock production, resulting in a significant number of studies with a holistic approach, often based on life-cycle assessment (LCA) methodology. The growing public interest in global warming has added to this development; guidelines for carbon footprint (CF) accounting have been developed, including for greenhouse gas (GHG) accounting of animal products. Here we give an overview of the methods for estimating GHG emissions, with emphasis on nitrous oxide, methane, and carbon from land use change, presently used in LCA/CF studies of animal products. We discuss where methods and data availability for GHGs and nitrogen (N) compounds most urgently need to be improved in order to produce more accurate environmental assessments of livestock production. We conclude that the top priority is to improve models for N fluxes and emissions from soils and to implement soil carbon change models in LCA/CF studies of animal products. We also point to the need for more farm data and for studies measuring emissions from soils, manure, and livestock in developing countries.
Integrating Chemistry: Crossing the Millennium Divide.
Housecroft, Catherine E
2018-02-01
A personal account of the development of two university-level chemistry books is presented. The account focuses on ways to integrate the traditional branches of chemistry into a textbook that captures the imagination of students and relates chemical principles and fundamental topics to environmental, medicinal, biological, and industrial applications. The ways in which teaching methods have changed over two decades and how web-based resources can be used to improve the communication of chemical (in particular structural) concepts are highlighted.
Air Force Personnel Can Improve Compliance With the Berry Amendment and Buy American Act
2016-02-24
Van Driel, Robin; Trask, Catherine; Johnson, Peter W; Callaghan, Jack P; Koehoorn, Mieke; Teschke, Kay
2013-01-01
Measuring trunk posture in the workplace commonly involves subjective observation or self-report methods or the use of costly and time-consuming motion analysis systems (current gold standard). This work compared trunk inclination measurements using a simple data-logging inclinometer with trunk flexion measurements using a motion analysis system, and evaluated adding measures of subject anthropometry to exposure prediction models to improve the agreement between the two methods. Simulated lifting tasks (n=36) were performed by eight participants, and trunk postures were simultaneously measured with each method. There were significant differences between the two methods, with the inclinometer initially explaining 47% of the variance in the motion analysis measurements. However, adding one key anthropometric parameter (lower arm length) to the inclinometer-based trunk flexion prediction model reduced the differences between the two systems and accounted for 79% of the motion analysis method's variance. Although caution must be applied when generalizing lower-arm length as a correction factor, the overall strategy of anthropometric modeling is a novel contribution. In this lifting-based study, by accounting for subject anthropometry, a single, simple data-logging inclinometer shows promise for trunk posture measurement and may have utility in larger-scale field studies where similar types of tasks are performed.
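A minimal sketch of the anthropometric-modeling idea on synthetic data: compare the variance explained by the inclinometer alone with that of a model that adds lower-arm length (the coefficients and R² values below are illustrative, not the study's):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(3)
n = 288  # e.g., 36 lifting tasks x 8 participants
incl = rng.uniform(0, 60, n)               # inclinometer inclination (deg)
arm = np.repeat(rng.normal(25, 2, 8), 36)  # lower-arm length per subject (cm)
# Synthetic "gold standard" flexion depends on both posture and anthropometry.
flexion = 0.8 * incl + 1.5 * (arm - 25) + rng.normal(0, 4, n)

# Inclinometer-only model vs. model with the anthropometric covariate added.
r2_incl = LinearRegression().fit(incl[:, None], flexion).score(incl[:, None], flexion)
X2 = np.column_stack([incl, arm])
r2_both = LinearRegression().fit(X2, flexion).score(X2, flexion)
print(f"inclinometer only:  R^2 = {r2_incl:.2f}")
print(f"+ lower-arm length: R^2 = {r2_both:.2f}")
```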
SPIPS: Spectro-Photo-Interferometry of Pulsating Stars
NASA Astrophysics Data System (ADS)
Mérand, Antoine
2017-10-01
SPIPS (Spectro-Photo-Interferometry of Pulsating Stars) combines radial velocimetry, interferometry, and photometry to estimate physical parameters of pulsating stars, including the presence of infrared excess, color excess, Teff, and the distance/p-factor ratio. This global model-based parallax-of-pulsation method is implemented in Python. The derived parameters have a high level of confidence: statistical precision is improved compared with other methods owing to the large number of data taken into account; accuracy is improved by using consistent physical modeling; and the reliability of the derived parameters is strengthened by redundancy in the data.
[Legal and methodical aspects of occupational risk management].
2011-01-01
Legal and methodical aspects of occupational risk management (ORM) are considered, taking into account new official documents. The introduction of the notions of risk and risk management into the Labor Code reflects a change in the forms of occupational health and safety. The role of hygienists and occupational medicine professionals in workplace conditions certification (WCC) and periodic medical examinations (PME) is strengthened. ORM could be improved by introducing a prognosis-and-causation module based on IT technologies that links the WCC and PME systems, thereby improving the effectiveness of prophylaxis.
An Innovative Method for Estimating Soil Retention at a Continental Scale
Planning for a sustainable future should include an accounting of services currently provided by ecosystems, such as erosion control. Retention of soil improves fertility, increases water retention, and decreases sedimentation in streams and rivers. Landscape patterns that fac...
Improving Bedload Transport Predictions by Incorporating Hysteresis
NASA Astrophysics Data System (ADS)
Crowe Curran, J.; Gaeuman, D.
2015-12-01
The influence of unsteady flow on sediment transport rates has long been recognized. However, the majority of sediment transport models were developed under steady flow conditions that did not account for changing bed morphologies and sediment transport during flood events. More recent research has used laboratory and field data to quantify the influence of hysteresis on bedload transport and to adjust transport models. In this research, these new methods are combined to further improve the accuracy of bedload transport rate quantification and prediction. The first approach defined reference shear stresses for hydrograph rising and falling limbs and used these values to predict total and fractional transport rates during a hydrograph. From this research, a parameter for improving transport predictions during unsteady flows was developed. The second approach applied a maximum likelihood procedure to fit a bedload rating curve to measurements from a number of different coarse-bed rivers. Parameters defining the rating curve were optimized for values that maximized the conditional probability of producing the measured bedload transport rate. Bedload sample magnitude was fit to a gamma distribution, and the probability of collecting N particles in a sampler during a given time step was described with a Poisson probability density function. Both approaches improved estimates of total transport during large flow events when compared to existing methods and transport models. Recognizing and accounting for the changes in transport parameters over time frames on the order of a flood or flood sequence influences the choice of method for parameter calculation in sediment transport calculations. Methods that more tightly link the changing flow rate and bed mobility have the potential to improve bedload transport rate predictions.
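A minimal sketch of the second approach's likelihood machinery, assuming a power-law rating curve and Poisson-distributed particle counts on synthetic data (the study additionally models sample magnitude with a gamma distribution, which this sketch omits):

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(4)
Q = rng.uniform(5, 50, 80)        # discharge (m^3/s), synthetic
a_true, b_true = 0.02, 2.0
lam_true = a_true * Q ** b_true   # expected particle count per sample
counts = rng.poisson(lam_true)    # sampled bedload particle counts

def neg_log_lik(params):
    # Rating curve lambda = a*Q^b with Poisson counts; the constant
    # log(counts!) term is dropped since it does not depend on (a, b).
    log_a, b = params
    lam = np.exp(log_a) * Q ** b
    return np.sum(lam - counts * np.log(lam + 1e-12))

res = minimize(neg_log_lik, x0=[np.log(0.1), 1.0], method="Nelder-Mead")
a_fit, b_fit = np.exp(res.x[0]), res.x[1]
print(f"fitted rating curve: lambda = {a_fit:.3f} * Q^{b_fit:.2f}")
```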
NASA Astrophysics Data System (ADS)
Khe Sun, Pak; Vorona-Slivinskaya, Lubov; Voskresenskay, Elena
2017-10-01
The article highlights the necessity of a complex approach to assessing the economic security of municipalities, one that considers the specifics of municipal management. The approach allows comparing the economic security levels of municipalities, but it does not describe parameter differences between the compared municipalities. Therefore, a second method is suggested: the parameter rank order method. Applying these methods made it possible to identify the leaders and laggards in economic security among municipalities and to rank all economic security parameters by significance level. A complex assessment of the economic security of municipalities, based on the combination of the two approaches, allowed the security level to be assessed more accurately. In order to assure economic security and equalize its threshold values, special attention should be paid to transportation system development in municipalities. Strategic aims of projects in the area of transportation infrastructure development in municipalities include contributing to the creation and elaboration of transportation logistics and manufacturing transport complexes, developing transportation infrastructure with account taken of the internal and external functions of the region, developing public transport, and improving transport security while reducing its negative influence on the environment.
NASA Technical Reports Server (NTRS)
Trujillo, Anna C.
1996-01-01
In 1993, fuel accounted for approximately 15 percent of an airline's expenses. Fuel consumption increases as fuel reserves increase because of the added weight to the aircraft. Calculating fuel reserves is a function of Federal Aviation Regulations, airline company policy, and factors that impact or are impacted by fuel usage en route. This research studied how pilots and dispatchers determined the fuel needed for a flight and identified areas where improvements in methods may yield measurable fuel savings by (1) listing the uncertainties that contribute to adding contingency fuel, (2) obtaining the pilots' and dispatchers' perspective on how often each uncertainty occurred, and (3) obtaining the pilots' and dispatchers' perspective on the fuel used for each occurrence. This study found that, the majority of the time, pilots felt that dispatchers included enough fuel. As for the uncertainties that flight crews and dispatchers account for, air traffic control accounts for 28 percent and weather uncertainties account for 58 percent. If improvements can be made in these two areas, a great potential exists to decrease the required reserves, and therefore fuel usage, without jeopardizing safety.
One-step methods for the prediction of orbital motion, taking its periodic components into account
NASA Astrophysics Data System (ADS)
Lavrov, K. N.
1988-03-01
The paper examines the design and the analysis of the properties of implicit one-step integration methods that use trigonometric approximation for ordinary differential equations containing periodic components. With reference to an orbital-motion prediction example, it is shown that the proposed schemes are more efficient in terms of computer memory than Everhart's (1974) approach. The results obtained make it possible to improve Everhart's method.
Organizational responses to accountability requirements: Do we get what we expect?
Gray, Carolyn Steele; Berta, Whitney; Deber, Raisa; Lum, Janet
In health care, accountability is being championed as a promising approach to meeting the dual imperatives of improving care quality while managing constrained budgets. Few studies focus on public sector organizations' responsiveness to government imperatives for accountability. We applied and adapted a theory of organizational responsiveness to community care agencies operating in Ontario, Canada, asking the question: What is the array of realized organizational responses to government-imposed accountability requirements among community agencies that receive public funds to provide home and community care? A sequential complementary mixed methods approach was used. It gathered data through a survey of 114 home and community care organizations in Ontario and interviews with 20 key informants representing 13 home and community care agencies and four government agencies. It generated findings using a parallel mixed analysis technique. In addition to responses predicted by the theory, we found that organizations engage in active, as well as passive, forms of compliance; we refer to this response as internal modification in which internal policies, practices, and/or procedures are changed to meet accountability requirements. We also found that environmental factors, such as the presence of an association representing organizational interests, can influence bargaining tactics. Our study helps us to better understand the range of likely responses to accountability requirements and is a first step toward encouraging the development of accountability frameworks that favor positive outcomes for organizations and those holding them to account. Tailoring agreements to organizational environments, aligning perceived compliance with behaviors that encourage improved performance, and allowing for flexibility in accountability arrangements are suggested strategies to support beneficial outcomes.
NASA Astrophysics Data System (ADS)
Hori, Toshikazu; Mohri, Yoshiyuki; Matsushima, Kenichi; Ariyoshi, Mitsuru
In recent years, the increased frequency of heavy rainfall events, such as unpredictable cloudbursts, has made it necessary to improve the safety of the embankments of small earth dams. However, the severe financial condition of national and local governments requires that the cost of improving them be reduced. This study concerns the development of a method for evaluating the life cycle cost of small earth dams considered to pose a risk, in order to improve the safety of the areas downstream of small earth dams at minimal cost. A safety evaluation method based on a combination of runoff analysis, saturated and unsaturated seepage analysis, and slope stability analysis enables the probability of a dam breach, and the life cycle cost with the risk of heavy rainfall taken into account, to be calculated. Moreover, the life cycle cost evaluation method will lead to a technique for selecting the optimal improvement or countermeasures against heavy rainfall.
Evaluating the effectiveness of air quality interventions.
van Erp, Annemoon M M; O'Keefe, Robert; Cohen, Aaron J; Warren, Jane
2008-01-01
Evaluating the extent to which air quality regulations improve public health--sometimes referred to as accountability--is part of an emerging effort to assess the effectiveness of environmental regulatory policies. Air quality has improved substantially in the United States and Western Europe in recent decades, with far less visible pollution and decreasing concentrations of several major pollutants. In large part, these gains were achieved through increasingly stringent air quality regulations. The costs associated with compliance and, importantly, the need to ensure that the regulations are achieving the intended public health benefits underscore the importance of accountability research. To date, accountability research has emphasized measuring the effects of actions already taken to improve air quality. Such research may also contribute to estimating the burden of disease that might be avoided in the future if certain actions are taken. The Health Effects Institute (HEI) currently funds eight ongoing studies on accountability, which cover near-term interventions to improve air quality including (1) a ban on the sale of coal, (2) replacing old wood stoves with cleaner ones, (3) decreasing sulfur content in fuel, (4) measures to reduce traffic, and (5) longer term, wide-ranging actions or events (such as complex changes associated with the reunification of Germany). HEI is also funding the development of methods and research to assess regulations that are implemented incrementally over extended periods of time, such as Title IV of the 1990 Clean Air Act Amendments, which reduces sulfur dioxide emissions from power plants in the eastern United States.
Community structure detection based on the neighbor node degree information
NASA Astrophysics Data System (ADS)
Tang, Li-Ying; Li, Sheng-Nan; Lin, Jian-Hong; Guo, Qiang; Liu, Jian-Guo
2016-11-01
Community structure detection is of great significance for better understanding network topology. By taking the neighbor node degree information of the network into account as the link weight, we present an improved Nonnegative Matrix Factorization (NMF) method for detecting community structure. Results for empirical networks show that the largest improvement in the Normalized Mutual Information value reaches 63.21%. Meanwhile, for synthetic networks, the highest Normalized Mutual Information value closely approaches 1, which suggests that the improved method with the optimal λ can detect the community structure more accurately. This work is helpful for understanding the interplay between link weights and community structure detection.
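A minimal sketch of the pipeline on a standard test graph, assuming an illustrative neighbor-degree link weight (the paper's exact weight definition and NMF formulation may differ):

```python
import numpy as np
import networkx as nx
from sklearn.decomposition import NMF

G = nx.karate_club_graph()
n = G.number_of_nodes()
deg = dict(G.degree())

# Weight each edge by its endpoints' degrees (illustrative choice; the
# paper's neighbor-degree weighting may take a different form).
W = np.zeros((n, n))
for u, v in G.edges():
    W[u, v] = W[v, u] = deg[u] + deg[v]

k = 2  # number of communities
model = NMF(n_components=k, init="nndsvda", max_iter=500, random_state=0)
H = model.fit_transform(W)   # node-by-community membership strengths
labels = H.argmax(axis=1)    # hard assignment: strongest component
print(labels)
```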
A review of health resource tracking in developing countries.
Powell-Jackson, Timothy; Mills, Anne
2007-11-01
Timely, reliable and complete information on financial resources in the health sector is critical for sound policy making and planning, particularly in developing countries where resources are both scarce and unpredictable. Health resource tracking has a long history and has seen renewed interest more recently as pressure has mounted to improve accountability for the attainment of the health Millennium Development Goals. We review the methods used to track health resources and recent experiences of their application, with a view to identifying the major challenges that must be overcome if data availability and reliability are to improve. At the country level, there have been important advances in the refinement of the National Health Accounts (NHA) methodology, which is now regarded as the international standard. Significant efforts have also been put into the development of methods to track disease-specific expenditures. However, NHA as a framework can do little to address the underlying problem of weak government public expenditure management and information systems that provide much of the raw data. The experience of institutionalizing NHA suggests progress has been uneven and there is a potential for stand-alone disease accounts to make the situation worse by undermining capacity and confusing technicians. Global level tracking of donor assistance to health relies to a large extent on the OECD's Creditor Reporting System. Despite improvements in its coverage and reliability, the demand for estimates of aid to control of specific diseases is resulting in multiple, uncoordinated data requests to donor agencies, placing additional workload on the providers of information. The emergence of budget support aid modalities poses a methodological challenge to health resource tracking, as such support is difficult to attribute to any particular sector or health programme. Attention should focus on improving underlying financial and information systems at the country level, which will facilitate more reliable and timely reporting of NHA estimates. Effective implementation of a framework to make donors more accountable to recipient countries and the international community will improve the availability of financial data on their activities.
Bridging the gap between finance and clinical operations with activity-based cost management.
Storfjell, J L; Jessup, S
1996-12-01
Activity-based cost management (ABCM) is an exciting management tool that links financial information with operations. By determining the costs of specific activities and processes, nurse managers can determine the true costs of services more accurately than with traditional cost accounting methods, and can then target processes for improvement and monitor them for change. The authors describe the ABCM process applied to nursing management situations.
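A minimal sketch of the ABC mechanics described above, with hypothetical activity cost pools, cost drivers, and a per-stay cost roll-up (none of these figures come from the article):

```python
# Activity-based costing sketch with hypothetical nursing-unit figures.
# Each activity has a cost pool and a cost-driver volume.
activities = {
    #  activity          (pool cost $, driver volume, driver unit)
    "medication pass": (180_000, 60_000, "doses"),
    "admission":       (90_000,  3_000,  "admissions"),
    "documentation":   (120_000, 40_000, "chart-hours"),
}

# Activity rate = cost pool / cost-driver volume.
rates = {a: pool / vol for a, (pool, vol, _) in activities.items()}

# Driver consumption for one patient stay (hypothetical).
stay_usage = {"medication pass": 24, "admission": 1, "documentation": 6}
stay_cost = sum(rates[a] * q for a, q in stay_usage.items())
print({a: round(r, 2) for a, r in rates.items()})
print(f"ABC cost of this stay: ${stay_cost:.2f}")
```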
Continuous Blood Pressure Monitoring in Daily Life
NASA Astrophysics Data System (ADS)
Lopez, Guillaume; Shuzo, Masaki; Ushida, Hiroyuki; Hidaka, Keita; Yanagimoto, Shintaro; Imai, Yasushi; Kosaka, Akio; Delaunay, Jean-Jacques; Yamada, Ichiro
Continuous monitoring of blood pressure in daily life could improve early detection of cardiovascular disorders, as well as promote healthcare. Conventional ambulatory blood pressure monitoring (ABPM) equipment can measure blood pressure at regular intervals for 24 hours but is limited by long measuring times, a low sampling rate, and a constrained measuring posture. In this paper, we demonstrate a new method for continuous real-time measurement of blood pressure during daily activities. Our method is based on blood pressure estimation from the pulse wave velocity (PWV), whose calculation formula we improved to take into account changes in the inner diameter of blood vessels. Blood pressure estimation results using the new method showed greater precision during exercise and better accuracy than the conventional PWV method.
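For context, conventional PWV-based estimation typically calibrates a simple per-subject relation between PWV and cuff blood pressure. The sketch below fits such a linear relation to hypothetical calibration data; it does not reproduce the authors' improved formula with the vessel inner-diameter term:

```python
import numpy as np

# Hypothetical per-subject calibration: cuff BP readings paired with PWV.
pwv = np.array([6.1, 6.5, 7.0, 7.8, 8.4])   # pulse wave velocity (m/s)
sbp = np.array([108, 115, 122, 134, 142])   # cuff systolic BP (mmHg)

# Simple linear calibration BP = a*PWV + b, one common conventional choice.
a, b = np.polyfit(pwv, sbp, 1)
print(f"BP = {a:.1f} * PWV + {b:.1f}")
print("estimate at PWV = 7.4 m/s:", round(a * 7.4 + b, 1), "mmHg")
```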
ACCOUNTING FOR CALIBRATION UNCERTAINTIES IN X-RAY ANALYSIS: EFFECTIVE AREAS IN SPECTRAL FITTING
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Hyunsook; Kashyap, Vinay L.; Drake, Jeremy J.
2011-04-20
While considerable advances have been made in accounting for statistical uncertainties in astronomical analyses, systematic instrumental uncertainties have been generally ignored. This can be crucial to a proper interpretation of analysis results because instrumental calibration uncertainty is a form of systematic uncertainty. Ignoring it can underestimate error bars and introduce bias into the fitted values of model parameters. Accounting for such uncertainties currently requires extensive case-specific simulations if using existing analysis packages. Here, we present general statistical methods that incorporate calibration uncertainties into spectral analysis of high-energy data. We first present a method based on multiple imputation that can be applied with any fitting method, but is necessarily approximate. We then describe a more exact Bayesian approach that works in conjunction with Markov chain Monte Carlo based fitting. We explore methods for improving computational efficiency, and in particular detail a method of summarizing calibration uncertainties with a principal component analysis of samples of plausible calibration files. This method is implemented using recently codified Chandra effective area uncertainties for low-resolution spectral analysis and is verified using both simulated and actual Chandra data. Our procedure for incorporating effective area uncertainty is easily generalized to other types of calibration uncertainties.
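A minimal sketch of the PCA summarization step, assuming a synthetic ensemble of plausible effective-area curves in place of sampled Chandra calibration files:

```python
import numpy as np

rng = np.random.default_rng(5)
# Hypothetical ensemble: 1000 plausible effective-area curves over 500
# energy bins, standing in for sampled calibration files.
n_samples, n_bins = 1000, 500
energy = np.linspace(0.3, 8.0, n_bins)
mean_arf = 400 * np.exp(-0.3 * energy)
basis = np.array([np.sin(energy), np.exp(-energy), np.ones(n_bins)])
curves = mean_arf + 5 * rng.normal(0, 1, (n_samples, 3)) @ basis

# PCA of the ensemble via SVD of the mean-centered curves.
mu = curves.mean(axis=0)
U, S, Vt = np.linalg.svd(curves - mu, full_matrices=False)
n_comp = 3                          # keep the leading components
pcs = Vt[:n_comp]                   # principal component curves
scales = S[:n_comp] / np.sqrt(n_samples - 1)

# Draw a new plausible effective-area curve: mean + random PC combination.
draw = mu + (rng.normal(0, 1, n_comp) * scales) @ pcs
print(draw[:5])
```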
Open Rotor Tone Shielding Methods for System Noise Assessments Using Multiple Databases
NASA Technical Reports Server (NTRS)
Bahr, Christopher J.; Thomas, Russell H.; Lopes, Leonard V.; Burley, Casey L.; Van Zante, Dale E.
2014-01-01
Advanced aircraft designs such as the hybrid wing body, in conjunction with open rotor engines, may allow for significant improvements in the environmental impact of aviation. System noise assessments allow for the prediction of the aircraft noise of such designs while they are still in the conceptual phase. Because of the significant computational requirements of such predictions, they still rely on experimental data to account for the interaction of the open rotor tones with the hybrid wing body airframe. Recently, multiple aircraft system noise assessments have been conducted for hybrid wing body designs with open rotor engines. These assessments utilized measured benchmark data from a Propulsion Airframe Aeroacoustic interaction effects test. The measured data demonstrated airframe shielding of open rotor tonal and broadband noise with legacy F7/A7 open rotor blades. Two methods are proposed for improving the use of these data for general open rotor designs in a system noise assessment. The first, direct difference, is a simple octave-band subtraction that does not account for tone distribution within the rotor acoustic signal. The second, tone matching, is a higher-fidelity process incorporating additional physical aspects of the problem, in which isolated rotor tones are matched by their directivity to determine tone-by-tone shielding. A case study is conducted with the two methods to assess how well each reproduces the measured data and to identify the merits of each. Both methods perform similarly for system-level results and successfully approach the experimental data for the case study. The tone matching method provides additional tools for assessing the quality of the match to the data set. Additionally, a potential path to improve the tone matching method is provided.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vandewouw, Marlee M., E-mail: marleev@mie.utoronto
Purpose: Continuous dose delivery in radiation therapy treatments has been shown to decrease total treatment time while improving dose conformity and distribution homogeneity over the conventional step-and-shoot approach. The authors develop an inverse treatment planning method for Gamma Knife® Perfexion™ that continuously delivers dose along a path in the target. Methods: The authors' method comprises two steps: find a path within the target, then solve a mixed integer optimization model to find the optimal collimator configurations and durations along the selected path. Robotic path-finding techniques, specifically simultaneous localization and mapping (SLAM) using an extended Kalman filter, are used to obtain a path that travels sufficiently close to selected isocentre locations. SLAM is novelly extended to explore a 3D, discrete environment, namely the target discretized into voxels. Further novel extensions are incorporated into the steering mechanism to account for target geometry. Results: The SLAM method was tested on seven clinical cases and compared to clinical, Hamiltonian path continuous delivery, and inverse step-and-shoot treatment plans. The SLAM approach improved dose metrics compared to the clinical plans and Hamiltonian path continuous delivery plans. Beam-on times improved over clinical plans and showed mixed performance compared to Hamiltonian path continuous plans. The SLAM method is also shown to be robust to path selection inaccuracies, isocentre selection, and dose distribution. Conclusions: The SLAM method for continuous delivery provides decreased total treatment time and increased treatment quality compared to both clinical and inverse step-and-shoot plans, and outperforms existing path methods in treatment quality. It also accounts for uncertainty in treatment planning by accommodating inaccuracies.
[Andragogy: reality or utopia].
Wautier, J L; Vileyn, F
2004-07-01
The education of adults differs from that of children, and the methods used should take into account that adults have specific goals and diverse prior knowledge. Whereas the teaching of children is called pedagogy, the teaching of adults is now known as andragogy. Andragogy has led to the development of several approaches to improve continuing education. Several tools and methodologies have been created for adult education.
An improved method for polarimetric image restoration in interferometry
NASA Astrophysics Data System (ADS)
Pratley, Luke; Johnston-Hollitt, Melanie
2016-11-01
Interferometric radio astronomy data require the effects of limited coverage in the Fourier plane to be accounted for via a deconvolution process. For the last 40 years this process, known as `cleaning', has been performed almost exclusively on all Stokes parameters individually, as if they were independent scalar images. However, here we demonstrate that, for the case of the linear polarization P, this approach fails to properly account for the complex vector nature of the emission, resulting in a process that depends on the axes under which the deconvolution is performed. We present an improved method, `Generalized Complex CLEAN', which properly accounts for the complex vector nature of polarized emission and is invariant under rotations of the deconvolution axes. We use two Australia Telescope Compact Array data sets to test standard and complex CLEAN versions of the Högbom and SDI (Steer-Dewdney-Ito) CLEAN algorithms. We show that in general the complex CLEAN version of each algorithm produces more accurate clean components with fewer spurious detections and lower computational cost, due to reduced iterations, than the current methods. In particular, we find that the complex SDI CLEAN produces the best results for diffuse polarized sources compared with standard CLEAN algorithms and other complex CLEAN algorithms. Given the move to wide-field, high-resolution polarimetric imaging with future telescopes such as the Square Kilometre Array, we suggest that Generalized Complex CLEAN should be adopted as the deconvolution method for all future polarimetric surveys, and in particular that the complex version of an SDI CLEAN should be used.
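A minimal one-dimensional sketch of a complex Högbom CLEAN on P = Q + iU: peaks are located in |P| and subtracted with their full complex value, which is what makes the result invariant under rotation of the (Q, U) axes (toy data; real implementations work on 2D images):

```python
import numpy as np

def complex_hogbom_clean(dirty, beam, gain=0.1, niter=200, threshold=1e-3):
    """Minimal 1-D Högbom CLEAN on complex P = Q + iU.
    Peaks are located in |P| and subtracted with their complex value."""
    res = dirty.astype(complex).copy()
    model = np.zeros_like(res)
    half = len(beam) // 2
    for _ in range(niter):
        k = np.argmax(np.abs(res))
        if np.abs(res[k]) < threshold:
            break
        comp = gain * res[k]          # complex clean component
        model[k] += comp
        lo, hi = max(0, k - half), min(len(res), k + half + 1)
        res[lo:hi] -= comp * beam[half - (k - lo): half + (hi - k)]
    return model, res

# Toy example: two polarized point sources convolved with a Gaussian beam.
x = np.arange(-10, 11)
beam = np.exp(-0.5 * (x / 2.0) ** 2)
sky = np.zeros(128, complex)
sky[40], sky[80] = 1.0 + 0.5j, -0.3 + 0.8j    # Q + iU amplitudes
dirty = np.convolve(sky, beam, mode="same")
model, residual = complex_hogbom_clean(dirty, beam)
print(np.round(model[[40, 80]], 2))           # recovered complex components
```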
Ketz, Alison C; Johnson, Therese L; Monello, Ryan J; Mack, John A; George, Janet L; Kraft, Benjamin R; Wild, Margaret A; Hooten, Mevin B; Hobbs, N Thompson
2018-04-01
Accurate assessment of abundance forms a central challenge in population ecology and wildlife management. Many statistical techniques have been developed to estimate population sizes because populations change over time and space, and to correct for the bias resulting from animals that are present in a study area but not observed. The mobility of individuals makes it difficult to design sampling procedures that account for movement into and out of areas with fixed jurisdictional boundaries. Aerial surveys are the gold standard used to obtain data on large mobile species in geographic regions with harsh terrain, but these surveys can be prohibitively expensive and dangerous. Estimating abundance with ground-based census methods has practical advantages, but it can be difficult to simultaneously account for temporary emigration and observer error to avoid biased results. Contemporary research in population ecology increasingly relies on telemetry observations of the states and locations of individuals to gain insight on vital rates, animal movements, and population abundance. Analytical models that use observations of movements to improve estimates of abundance have not been developed. Here we build upon existing multi-state mark-recapture methods using a hierarchical N-mixture model with multiple sources of data, including telemetry data on locations of individuals, to improve estimates of population sizes. We used a state-space approach to model animal movements to approximate the number of marked animals present within the study area at any observation period, thereby accounting for a frequently changing number of marked individuals. We illustrate the approach using data on a population of elk (Cervus elaphus nelsoni) in Northern Colorado, USA. We demonstrate substantial improvement compared to existing abundance estimation methods and corroborate our results from the ground-based surveys with estimates from aerial surveys during the same seasons. We develop a hierarchical Bayesian N-mixture model using multiple sources of data on abundance, movement, and survival to estimate the population size of a mobile species that uses remote conservation areas. The model improves the accuracy of inference relative to previous methods for estimating the abundance of open populations. © 2018 by the Ecological Society of America.
Accounting for the costs of quality.
Suver, J D; Neumann, B R; Boles, K E
1992-09-01
Total quality management (TQM) represents a paradigm shift in the organizational values that shape every aspect of a healthcare provider's activities. The TQM approach to quality management subscribes to the theory that it is not the work of employees of an organization that leads to poor quality; rather, it is the poor design of systems and procedures. In a book recently published by HFMA, Management Accounting for Healthcare Organizations, third edition, authors Suver, Neumann and Boles point out that the changes in behavioral focus and organizational climate brought about by TQM will have a major impact on the management accounting function in healthcare organizations. TQM will require new methods of accounting that will enable the effects of declining quality to be recognized and evaluated. It also will require new types of management accounting reports that will identify opportunities for quality improvement and will monitor the effectiveness of quality management endeavors. The following article has been adapted from the book cited above.
Pflueger, Dane
2015-04-23
Accounting-that is, standardized measurement, public reporting, performance evaluation and managerial control-is commonly seen to provide the core infrastructure for quality improvement in healthcare. Yet, accounting successfully for quality has been a problematic endeavor, often producing dysfunctional effects. This has raised questions about the appropriate role for accounting in achieving quality improvement. This paper contributes to this debate by contrasting the specific way in which accounting is understood and operationalized for quality improvement in the UK National Health Service (NHS) with findings from the broadly defined 'social studies of accounting' literature and illustrative examples. This paper highlights three significant differences between the way that accounting is understood to operate in the dominant health policy discourse and recent healthcare reforms, and in the social studies of accounting literature. It shows that accounting does not just find things out, but makes them up. It shows that accounting is not simply a matter of substance, but of style. And it shows that accounting does not just facilitate, but displaces, control. The illumination of these differences in the way that accounting is conceptualized helps to diagnose why accounting interventions often fail to produce the quality improvements that were envisioned. This paper concludes that accounting is not necessarily incompatible with the ambition of quality improvement, but that it would need to be understood and operationalized in new ways in order to contribute to this end. Proposals for this new way of advancing accounting are discussed. They include the cultivation of overlapping and even conflicting measures of quality, the evaluation of accounting regimes in terms of what they do to practice, and the development of distinctively skeptical calculative cultures.
Walker, Martin; Basáñez, María-Gloria; Ouédraogo, André Lin; Hermsen, Cornelus; Bousema, Teun; Churcher, Thomas S
2015-01-16
Quantitative molecular methods (QMMs) such as quantitative real-time polymerase chain reaction (q-PCR), reverse-transcriptase PCR (qRT-PCR) and quantitative nucleic acid sequence-based amplification (QT-NASBA) are increasingly used to estimate pathogen density in a variety of clinical and epidemiological contexts. These methods are often classified as semi-quantitative, yet estimates of reliability or sensitivity are seldom reported. Here, a statistical framework is developed for assessing the reliability (uncertainty) of pathogen densities estimated using QMMs and the associated diagnostic sensitivity. The method is illustrated with quantification of Plasmodium falciparum gametocytaemia by QT-NASBA. The reliability of pathogen (e.g. gametocyte) densities, and the accompanying diagnostic sensitivity, estimated by two contrasting statistical calibration techniques, are compared; a traditional method and a mixed model Bayesian approach. The latter accounts for statistical dependence of QMM assays run under identical laboratory protocols and permits structural modelling of experimental measurements, allowing precision to vary with pathogen density. Traditional calibration cannot account for inter-assay variability arising from imperfect QMMs and generates estimates of pathogen density that have poor reliability, are variable among assays and inaccurately reflect diagnostic sensitivity. The Bayesian mixed model approach assimilates information from replica QMM assays, improving reliability and inter-assay homogeneity, providing an accurate appraisal of quantitative and diagnostic performance. Bayesian mixed model statistical calibration supersedes traditional techniques in the context of QMM-derived estimates of pathogen density, offering the potential to improve substantially the depth and quality of clinical and epidemiological inference for a wide variety of pathogens.
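The "traditional" calibration the Bayesian mixed model is compared against is, in broad strokes, a single standard-curve fit inverted for unknown samples. A minimal sketch under that reading, with hypothetical standards and signal values:

```python
# Hedged sketch of traditional single-assay calibration by inverse prediction.
# The paper's Bayesian mixed model would instead pool replica assays with
# assay-level random effects and density-dependent precision.
import numpy as np

log10_density = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # standards (hypothetical)
signal = np.array([14.2, 12.1, 9.8, 7.9, 6.1])        # e.g. assay time-to-positivity

slope, intercept = np.polyfit(log10_density, signal, 1)  # signal = a + b*log10(density)

def estimate_density(obs_signal):
    """Invert the fitted standard curve for an unknown sample."""
    return 10.0 ** ((obs_signal - intercept) / slope)

print(f"estimated density = {estimate_density(10.5):.0f} parasites/µL")
```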
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nimbalkar, Sachin U.; Wenning, Thomas J.; Guo, Wei
In the United States, manufacturing facilities accounted for about 32% of total domestic energy consumption in 2014. Robust energy tracking methodologies are critical to understanding energy performance in manufacturing facilities. Due to its simplicity and intuitiveness, the classic energy intensity method (i.e., the ratio of total energy use to total production) is the most widely adopted. However, the classic energy intensity method does not take into account the variation of other relevant parameters (e.g., product type, feedstock type, weather). Furthermore, the energy intensity method assumes that a facility's base energy consumption (energy use at zero production) is zero, which rarely holds true. Therefore, it is commonly recommended to utilize regression models rather than the energy intensity approach for tracking improvements at the facility level. Unfortunately, many energy managers have difficulty understanding why regression models are statistically better than the classic energy intensity method. While anecdotes and qualitative information may convince some, many have major reservations about the accuracy of regression models and whether it is worth the time and effort to gather data and build quality regression models. This paper will explain why regression models are theoretically and quantitatively more accurate for tracking energy performance improvements. Based on the analysis of data from 114 manufacturing plants over 12 years, this paper will present quantitative results on the importance of utilizing regression models over the energy intensity methodology. This paper will also document scenarios where regression models do not have significant relevance over the energy intensity method.
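The base-load argument can be made concrete in a few lines: with linear hypothetical monthly data, the intensity ratio folds the fixed base load into the per-unit figure, while a simple regression separates the two.

```python
# Classic energy intensity vs. a one-variable regression; data are hypothetical.
import numpy as np

production = np.array([80.0, 100.0, 120.0, 150.0, 170.0])  # units/month
energy = np.array([460.0, 540.0, 620.0, 740.0, 820.0])     # MWh/month

intensity = energy.sum() / production.sum()        # assumes zero base load
slope, base_load = np.polyfit(production, energy, 1)

print(f"energy intensity: {intensity:.2f} MWh/unit")
print(f"regression: {slope:.2f} MWh/unit marginal + {base_load:.0f} MWh/month base load")
```

On these numbers the intensity metric reports 5.13 MWh/unit, while the regression recovers a 4.00 MWh/unit marginal rate plus a 140 MWh/month base load, which is the distinction the paper draws.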
Evaluation of accountability measurements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cacic, C.G.
The New Brunswick Laboratory (NBL) is programmatically responsible to the U.S. Department of Energy (DOE) Office of Safeguards and Security (OSS) for providing independent review and evaluation of accountability measurement technology in DOE nuclear facilities. This function is addressed in part through the NBL Safeguards Measurement Evaluation (SME) Program. The SME Program utilizes both on-site review of measurement methods and material-specific measurement evaluation studies to provide information concerning the adequacy of subject accountability measurements. This paper reviews SME Program activities for the 1986-87 time period, with emphasis on noted improvements in measurement capabilities. Continued evolution of the SME Program to respond to changing safeguards concerns is discussed.
Performance Indicators for Accountability and Improvement.
ERIC Educational Resources Information Center
Banta, Trudy W.; Borden, Victor M. H.
1994-01-01
Five criteria for judging college or university performance indicators (PIs) used to guide strategic decision making are outlined. The criteria address: purpose; alignment of PIs throughout the organization or system; alignment of PIs across inputs, processes, and outcomes; capacity to accommodate a variety of evaluation methods; and utility in…
Web-Based Honorarium Confirmation System Prototype
NASA Astrophysics Data System (ADS)
Wisswani, N. W.; Catur Bawa, I. G. N. B.
2018-01-01
Improving services in an academic environment can be achieved by regulating the salary payment process for all employees. As a form of control to maintain financial transparency, employees should have information concerning the salary payment process. Currently, employees are notified of committee honoraria manually. Payments are deposited to employee bank accounts, and to learn the details employees must visit the accounting unit. Even then, employees often have difficulty obtaining detailed information about the honoraria credited to their accounts, in part because of the volume of data that must be collected and managed. Based on this issue, this research designs a prototype web-based system for the accounting unit that provides detailed confirmation of the financial transactions posted to employee bank accounts and reported through the mobile banking system. The prototype is developed with the Waterfall method, implemented in PHP with MySQL as the DBMS, and tested with end users.
Patient Accounting Systems: Are They Fit with the Users' Requirements?
Ayatollahi, Haleh; Nazemi, Zahra; Haghani, Hamid
2016-01-01
A patient accounting system is a subsystem of a hospital information system. This system, like other information systems, should be carefully designed to be able to meet users' requirements. The main aim of this research was to investigate users' requirements and to determine whether current patient accounting systems meet users' needs. This was a survey study, and the participants were the users of six patient accounting systems used in 24 teaching hospitals. A stratified sampling method was used to select the participants (n = 216). The research instruments were a questionnaire and a checklist. A mean value of ≥3 indicated the importance of a data element or the capability of the system. Generally, the findings showed that the current patient accounting systems had some weaknesses and were able to meet between 70% and 80% of users' requirements. The current patient accounting systems need to be improved to meet users' requirements. This approach can also help to provide hospitals with more usable and reliable financial information.
Slepton pair production at the LHC in NLO+NLL with resummation-improved parton densities
NASA Astrophysics Data System (ADS)
Fiaschi, Juri; Klasen, Michael
2018-03-01
Novel PDFs taking into account resummation-improved matrix elements, albeit only in the fit of a reduced data set, allow for consistent NLO+NLL calculations of slepton pair production at the LHC. We apply a factorisation method to this process that minimises the effect of the data set reduction, avoids the problem of outlier replicas in the NNPDF method for PDF uncertainties and preserves the reduction of the scale uncertainty. For Run II of the LHC, left-handed selectron/smuon, right-handed and maximally mixed stau production, we confirm that the consistent use of threshold-improved PDFs partially compensates the resummation contributions in the matrix elements. Together with the reduction of the scale uncertainty at NLO+NLL, the described method further increases the reliability of slepton pair production cross sections at the LHC.
Freiman, Moti; Nickisch, Hannes; Prevrhal, Sven; Schmitt, Holger; Vembar, Mani; Maurovich-Horvat, Pál; Donnelly, Patrick; Goshen, Liran
2017-03-01
The goal of this study was to assess the potential added benefit of accounting for partial volume effects (PVE) in an automatic coronary lumen segmentation algorithm that is used to determine the hemodynamic significance of a coronary artery stenosis from coronary computed tomography angiography (CCTA). Two sets of data were used in our work: (a) multivendor CCTA datasets of 18 subjects from the MICCAI 2012 challenge with automatically generated centerlines and 3 reference segmentations of 78 coronary segments and (b) additional CCTA datasets of 97 subjects with 132 coronary lesions that had invasive reference standard FFR measurements. We extracted the coronary artery centerlines for the 97 datasets by an automated software program followed by manual correction if required. An automatic machine-learning-based algorithm segmented the coronary tree with and without accounting for the PVE. We obtained CCTA-based FFR measurements using a flow simulation in the coronary trees that were generated by the automatic algorithm with and without accounting for PVE. We assessed the potential added value of PVE integration as a part of the automatic coronary lumen segmentation algorithm by means of segmentation accuracy using the MICCAI 2012 challenge framework and by means of flow simulation overall accuracy, sensitivity, specificity, negative and positive predictive values, and the receiver operating characteristic (ROC) area under the curve. We also evaluated the potential benefit of accounting for PVE in automatic segmentation for flow simulation for lesions that were diagnosed as obstructive based on CCTA, which could have indicated a need for an invasive exam and revascularization. Our segmentation algorithm improves the maximal surface distance error by ~39% compared to a previously published method on the 18 datasets from the MICCAI 2012 challenge, with comparable Dice and mean surface distance. Results with and without accounting for PVE were comparable. In contrast, integrating PVE analysis into an automatic coronary lumen segmentation algorithm improved the flow simulation specificity from 0.6 to 0.68 at the same sensitivity of 0.83. Also, accounting for PVE improved the area under the ROC curve for detecting hemodynamically significant CAD from 0.76 to 0.8 compared to automatic segmentation without PVE analysis, with an invasive FFR threshold of 0.8 as the reference standard. Accounting for PVE in flow simulation to support the detection of hemodynamically significant disease in CCTA-based obstructive lesions improved specificity from 0.51 to 0.73 at the same sensitivity of 0.83, and the area under the curve from 0.69 to 0.79. The improvement in the AUC was statistically significant (N = 76, DeLong's test, P = 0.012). Accounting for the partial volume effects in automatic coronary lumen segmentation algorithms has the potential to improve the accuracy of CCTA-based hemodynamic assessment of coronary artery lesions. © 2017 American Association of Physicists in Medicine.
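A hedged sketch of this style of evaluation: AUC for detecting hemodynamic significance (invasive FFR <= 0.8 as reference) with and without a PVE-aware score. Labels and scores are simulated; the paper's DeLong comparison is not reproduced here, as it has no standard SciPy implementation.

```python
# Compare ROC AUCs of two hypothetical CCTA-based scores against a binary reference.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
significant = rng.integers(0, 2, size=132)                    # invasive FFR <= 0.8 (simulated)
score_no_pve = 0.4 * significant + rng.normal(0.5, 0.25, 132)
score_pve = 0.6 * significant + rng.normal(0.5, 0.25, 132)    # sharper separation

print(f"AUC without PVE: {roc_auc_score(significant, score_no_pve):.2f}")
print(f"AUC with PVE:    {roc_auc_score(significant, score_pve):.2f}")
```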
NASA Astrophysics Data System (ADS)
Adrich, Przemysław
2016-05-01
In Part I of this work, existing methods and problems in dual foil electron beam forming system design are presented. On this basis, a new method of designing these systems is introduced. The motivation behind this work is to eliminate the shortcomings of the existing design methods and improve the overall efficiency of the dual foil design process. The existing methods are based on approximate analytical models applied in an unrealistically simplified geometry. Designing a dual foil system with these methods is a rather labor-intensive task, as corrections to account for the effects not included in the analytical models have to be calculated separately and accounted for in an iterative procedure. To eliminate these drawbacks, the new design method is based entirely on Monte Carlo modeling in a realistic geometry, using physics models that include all relevant processes. In our approach, an optimal configuration of the dual foil system is found by means of a systematic, automated scan of the system performance as a function of the foil parameters. The new method, while being computationally intensive, minimizes the involvement of the designer and considerably shortens the overall design time. The results are of high quality, as all the relevant physics and geometry details are naturally accounted for. To demonstrate the feasibility of practical implementation of the new method, specialized software tools were developed and applied to solve a real-life design problem, as described in Part II of this work.
Improving awareness, accountability, and access through health coaching
Liddy, Clare; Johnston, Sharon; Irving, Hannah; Nash, Kate; Ward, Natalie
2015-01-01
Abstract Objective To assess patients’ experiences with and perceptions of health coaching as part of their ongoing care. Design A qualitative research design using semistructured interviews that were recorded and transcribed verbatim. Setting Ottawa, Ont. Participants Eleven patients (> 18 years of age) enrolled in a health coaching pilot program who were at risk of or diagnosed with type 2 diabetes. Methods Patients’ perspectives were assessed with semistructured interviews. Interviews were conducted with 11 patients at the end of the pilot program, using a stratified sampling approach to ensure maximum variation. Main findings All patients found the overall experience with the health coaching program to be positive. Patients believed the health coaching program was effective in increasing awareness of how diabetes affected their bodies and health, in building accountability for their health-related actions, and in improving access to care and other health resources. Conclusion Patients perceive one-on-one health coaching as an acceptable intervention in their ongoing care. Patients enrolled in the health coaching pilot program believed that there was an improvement in access to care, health literacy, and accountability, all factors considered to be precursors to behavioural change. PMID:25932483
Improved battery parameter estimation method considering operating scenarios for HEV/EV applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, Jufeng; Xia, Bing; Shang, Yunlong
This study presents an improved battery parameter estimation method based on typical operating scenarios in hybrid electric vehicles and pure electric vehicles. Compared with the conventional estimation methods, the proposed method takes both the constant-current charging and the dynamic driving scenarios into account, and two separate sets of model parameters are estimated through different parts of the pulse-rest test. The model parameters for the constant-charging scenario are estimated from the data in the pulse-charging periods, while the model parameters for the dynamic driving scenario are estimated from the data in the rest periods, and the length of the fitted dataset is determined by the spectrum analysis of the load current. In addition, the unsaturated phenomenon caused by the long-term resistor-capacitor (RC) network is analyzed, and the initial voltage expressions of the RC networks in the fitting functions are improved to ensure a higher model fidelity. Simulation and experiment results validated the feasibility of the developed estimation method.
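One reading of the rest-period step: fit the relaxation voltage with RC terms whose initial voltages are free parameters rather than assumed zero, which is the unsaturated-RC issue the authors address. A minimal one-RC sketch on synthetic data (the two-scenario split and the spectrum-based window selection are not reproduced):

```python
# Fit OCV, initial RC voltage, and time constant from a rest-period relaxation.
import numpy as np
from scipy.optimize import curve_fit

def relaxation(t, ocv, v1, tau1):
    return ocv - v1 * np.exp(-t / tau1)     # 1-RC model; a 2-RC model adds a second term

t = np.linspace(0.0, 600.0, 121)            # rest period, seconds
v_meas = relaxation(t, 3.65, 0.04, 90.0) \
         + np.random.default_rng(1).normal(0.0, 1e-4, t.size)   # synthetic measurement

(ocv, v1, tau1), _ = curve_fit(relaxation, t, v_meas, p0=[3.6, 0.05, 60.0])
print(f"OCV = {ocv:.3f} V, initial RC voltage = {v1 * 1e3:.1f} mV, tau = {tau1:.0f} s")
```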
Epidemiologic methods in mastitis treatment and control.
Thurmond, M C
1993-11-01
Methods and concepts of epidemiology offer means whereby udder health can be monitored and evaluated. Prerequisite to a sound epidemiologic approach is development of measures of mastitis that minimize biases and that account for sensitivity and specificity of diagnostic tests. Mastitis surveillance offers an ongoing and passive system for evaluation of udder health, whereas clinical and observational trials offer a more proactive and developmental approach to improving udder health.
On the analysis of very small samples of Gaussian repeated measurements: an alternative approach.
Westgate, Philip M; Burchett, Woodrow W
2017-03-15
The analysis of very small samples of Gaussian repeated measurements can be challenging. First, due to a very small number of independent subjects contributing outcomes over time, statistical power can be quite small. Second, nuisance covariance parameters must be appropriately accounted for in the analysis in order to maintain the nominal test size. However, available statistical strategies that ensure valid statistical inference may lack power, whereas more powerful methods may have the potential for inflated test sizes. Therefore, we explore an alternative approach to the analysis of very small samples of Gaussian repeated measurements, with the goal of maintaining valid inference while also improving statistical power relative to other valid methods. This approach uses generalized estimating equations with a bias-corrected empirical covariance matrix that accounts for all small-sample aspects of nuisance correlation parameter estimation in order to maintain valid inference. Furthermore, the approach utilizes correlation selection strategies with the goal of choosing the working structure that will result in the greatest power. In our study, we show that when accurate modeling of the nuisance correlation structure impacts the efficiency of regression parameter estimation, this method can improve power relative to existing methods that yield valid inference. Copyright © 2017 John Wiley & Sons, Ltd.
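statsmodels ships one readily available instance of this kind of correction: a bias-reduced empirical covariance for GEE. The sketch below shows the mechanics on simulated data; the paper's particular bias correction and its correlation-selection strategy are not reproduced.

```python
# GEE with a small-sample bias-reduced empirical covariance (statsmodels).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
n_subj, n_rep = 8, 4                         # a very small sample of repeated measures
df = pd.DataFrame({
    "id": np.repeat(np.arange(n_subj), n_rep),
    "time": np.tile(np.arange(n_rep), n_subj).astype(float),
})
df["y"] = 0.5 * df["time"] + rng.normal(0.0, 1.0, len(df))

model = sm.GEE.from_formula("y ~ time", groups="id", data=df,
                            cov_struct=sm.cov_struct.Exchangeable())
fit = model.fit(cov_type="bias_reduced")     # small-sample corrected standard errors
print(fit.summary())
```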
Methods of equipment choice in shotcreting
NASA Astrophysics Data System (ADS)
Sharapov, R. R.; Yadykina, V. V.; Stepanov, M. A.; Kitukov, B. A.
2018-03-01
Shotcrete is widely used in architecture, hydraulic engineering structures, tunnel finishing works, arched covers and ceilings. The choice of equipment for shotcreting is therefore an important problem, and the main considerations driving it are quality improvement and intensification of the shotcreting process. The main parameters and rational limits of the technological characteristics of machines used for different shotcreting tasks are described. It is suggested to take into account the peculiarities of shotcrete mixing and of applying the mixtures using the kinetic energy of compressed air. The described method selects a mixer with account taken of energy capacity, Reynolds number and rotational frequency of the mixing drum. The suggested procedure for choosing the equipment reduces operating costs and increases the quality of shotcrete and of shotcreting in general.
Quantitative properties of clustering within modern microscopic nuclear models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Volya, A.; Tchuvil’sky, Yu. M., E-mail: tchuvl@nucl-th.sinp.msu.ru
2016-09-15
A method for studying cluster spectroscopic properties of nuclear fragmentation, such as spectroscopic amplitudes, cluster form factors, and spectroscopic factors, is developed on the basis of modern precision nuclear models that take into account the mixing of large-scale shell-model configurations. Alpha-cluster channels are considered as an example. A mathematical proof of the need for taking into account the channel-wave-function renormalization generated by exchange terms of the antisymmetrization operator (Fliessbach effect) is given. Examples where this effect is confirmed by a high quality of the description of experimental data are presented. By and large, the method in question extends substantially the possibilities for studying clustering phenomena in nuclei and for improving the quality of their description.
Path finding methods accounting for stoichiometry in metabolic networks
2011-01-01
Graph-based methods have been widely used for the analysis of biological networks. Their application to metabolic networks has been much discussed, in particular noting that an important weakness in such methods is that reaction stoichiometry is neglected. In this study, we show that reaction stoichiometry can be incorporated into path-finding approaches via mixed-integer linear programming. This major advance at the modeling level results in improved prediction of topological and functional properties in metabolic networks. PMID:21619601
Improving the risk assessment of lipophilic persistent environmental chemicals in breast milk
BACKGROUND: A breastfeeding infant’s intake of persistent organic pollutants (POPs) may be much greater than his/her mother’s average daily POP exposure. In many cases, current human health risk assessment methods do not account for differences between maternal and infant POP exp...
Using Cognitive Load Theory to Tailor Instruction to Levels of Accounting Students' Expertise
ERIC Educational Resources Information Center
Blayney, Paul; Kalyuga, Slava; Sweller, John
2015-01-01
Tailoring of instructional methods to learner levels of expertise may reduce extraneous cognitive load and improve learning. Contemporary technology-based learning environments have the potential to substantially enable learner-adapted instruction. This paper investigates the effects of adaptive instruction based on using the isolated-interactive…
Transparency and Its Determinants at Colombian Universities
ERIC Educational Resources Information Center
Flórez-Parra, Jesús Mauricio; López-Pérez, María Victoria; López-Hernández, Antonio Manuel
2017-01-01
Over the past decade, one of the demands upon public institutions, among which we find universities, has been for transparency and improvement of accountability. In this context, Colombian universities are introducing different methods of management and governance aimed at addressing the demands of society generally in relation to transparency and…
ERIC Educational Resources Information Center
Wrigley, William J.; Emmerson, Stephen B.
2013-01-01
This study investigated ways to improve the quality of music performance evaluation in an effort to address the accountability imperative in tertiary music education. An enhanced scientific methodology was employed incorporating ecological validity and using recognized qualitative methods involving grounded theory and quantitative methods…
The Motivating Language of Principals: A Sequential Transformative Strategy
ERIC Educational Resources Information Center
Holmes, William Tobias
2012-01-01
This study implemented a Sequential Transformative Mixed Methods design with teachers (as recipients) and principals (to give voice) in the examination of principal talk in two different school accountability contexts (Continuously Improving and Continuously Zigzag) using the conceptual framework of Motivating Language Theory. In phase one,…
Accounting for discovery bias in genomic prediction
USDA-ARS?s Scientific Manuscript database
Our objective was to evaluate an approach to mitigating discovery bias in genomic prediction. Accuracy may be improved by placing greater emphasis on regions of the genome expected to be more influential on a trait. Methods emphasizing regions result in a phenomenon known as “discovery bias” if info...
Learning about Teachers' Literacy Instruction from Classroom Observations
ERIC Educational Resources Information Center
Kelcey, Ben; Carlisle, Joanne F.
2013-01-01
The purpose of this study is to contribute to efforts to improve methods for gathering and analyzing data from classroom observations in early literacy. The methodological approach addresses current problems of reliability and validity of classroom observations by taking into account differences in teachers' uses of instructional actions (e.g.,…
Improvement of tritium accountancy technology for ITER fuel cycle safety enhancement
NASA Astrophysics Data System (ADS)
O'hira, S.; Hayashi, T.; Nakamura, H.; Kobayashi, K.; Tadokoro, T.; Nakamura, H.; Itoh, T.; Yamanishi, T.; Kawamura, Y.; Iwai, Y.; Arita, T.; Maruyama, T.; Kakuta, T.; Konishi, S.; Enoeda, M.; Yamada, M.; Suzuki, T.; Nishi, M.; Nagashima, T.; Ohta, M.
2000-03-01
In order to improve the safe handling and control of tritium for the ITER fuel cycle, effective in situ tritium accounting methods have been developed at the Tritium Process Laboratory of the Japan Atomic Energy Research Institute under one of the ITER-EDA R&D tasks. The remote, multilocation analysis of process gases by laser Raman spectroscopy that was developed and tested could provide measurement of hydrogen isotope gases with a detection limit of 0.3 kPa and analytical periods of 120 s. An in situ tritium inventory measurement applying a `self-assaying' storage bed with 25 g tritium capacity could provide a measurement with the required detection limit of less than 1%, along with a design proof of a bed with 100 g tritium capacity.
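A `self-assaying' bed is commonly read as one that infers its tritium inventory from the decay heat of the stored gas. As an illustrative calculation only (the calorimetric reading and the numbers below are assumptions, not the paper's design):

```python
# Back-of-envelope inventory from decay heat; tritium's specific power is ~0.324 W/g.
measured_decay_heat_W = 7.9              # hypothetical calorimetric reading from the bed
SPECIFIC_POWER_W_PER_G = 0.324           # approximate decay heat of tritium

inventory_g = measured_decay_heat_W / SPECIFIC_POWER_W_PER_G
print(f"estimated tritium inventory = {inventory_g:.1f} g")
```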
Karimli, Leyla; Ssewamala, Fred M.
2015-01-01
Purpose This present study tests the proposition that an economic strengthening intervention for families caring for AIDS-orphaned adolescents would positively affect adolescent future orientation and psychosocial outcomes through increased asset accumulation (in this case, by increasing family savings). Methods Using longitudinal data from the cluster-randomized experiment, we ran generalized estimating equation (GEE) models with robust standard errors clustering on individual observations. To examine whether family savings mediate the effect of the intervention on adolescents' future orientation and psychosocial outcomes, analyses were conducted in three steps: (1) testing the effect of the intervention on the mediator; (2) testing the effect of the mediator on outcomes, controlling for the intervention; and (3) testing the significance of the mediating effect using the Sobel-Goodman method. Asymmetric confidence intervals for the mediated effect were obtained through bootstrapping to address the assumption of normality. Results Results indicate that participation in a matched Child Savings Account program improved adolescents' future orientation and psychosocial outcomes by reducing hopelessness, enhancing self-concept, and improving adolescents' confidence about their educational plans. However, the positive intervention effect on adolescent future orientation and psychosocial outcomes was not transmitted through saving. In other words, participation in the matched Child Savings Account program improved adolescent future orientation and psychosocial outcomes regardless of its impact on reported savings. Conclusions Further research is necessary to understand exactly how participation in economic strengthening interventions, for example, those that employ matched Child Savings Accounts, shapes adolescent future orientation and psychosocial outcomes: what, if not savings, transmits the treatment effect and how? PMID:26271162
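A hedged sketch of the two mediation ingredients named above: the Sobel z statistic for the indirect effect a*b, and a percentile bootstrap CI for it. Data are simulated, and the study's GEE clustering and covariates are omitted.

```python
# Sobel test plus percentile bootstrap for an indirect (mediated) effect.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 300
treat = rng.integers(0, 2, n).astype(float)
savings = 0.4 * treat + rng.normal(0.0, 1.0, n)            # mediator
outcome = 0.5 * treat + 0.3 * savings + rng.normal(0.0, 1.0, n)

m_a = sm.OLS(savings, sm.add_constant(treat)).fit()        # a-path: treatment -> mediator
X = sm.add_constant(np.column_stack([treat, savings]))
m_b = sm.OLS(outcome, X).fit()                             # b-path: mediator -> outcome
a, se_a = m_a.params[1], m_a.bse[1]
b, se_b = m_b.params[2], m_b.bse[2]
sobel_z = (a * b) / np.sqrt(a**2 * se_b**2 + b**2 * se_a**2)

boot = []                                                  # percentile bootstrap for a*b
for _ in range(2000):
    idx = rng.integers(0, n, n)
    a_s = sm.OLS(savings[idx], sm.add_constant(treat[idx])).fit().params[1]
    Xs = sm.add_constant(np.column_stack([treat[idx], savings[idx]]))
    b_s = sm.OLS(outcome[idx], Xs).fit().params[2]
    boot.append(a_s * b_s)

lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"Sobel z = {sobel_z:.2f}, bootstrap 95% CI for a*b = [{lo:.3f}, {hi:.3f}]")
```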
Forecasting biodiversity in breeding birds using best practices
Taylor, Shawn D.; White, Ethan P.
2018-01-01
Biodiversity forecasts are important for conservation, management, and evaluating how well current models characterize natural systems. While the number of forecasts for biodiversity is increasing, there is little information available on how well these forecasts work. Most biodiversity forecasts are not evaluated to determine how well they predict future diversity, fail to account for uncertainty, and do not use time-series data that captures the actual dynamics being studied. We addressed these limitations by using best practices to explore our ability to forecast the species richness of breeding birds in North America. We used hindcasting to evaluate six different modeling approaches for predicting richness. Hindcasts for each method were evaluated annually for a decade at 1,237 sites distributed throughout the continental United States. All models explained more than 50% of the variance in richness, but none of them consistently outperformed a baseline model that predicted constant richness at each site. The best practices implemented in this study directly influenced the forecasts and evaluations. Stacked species distribution models and “naive” forecasts produced poor estimates of uncertainty and accounting for this resulted in these models dropping in the relative performance compared to other models. Accounting for observer effects improved model performance overall, but also changed the rank ordering of models because it did not improve the accuracy of the “naive” model. Considering the forecast horizon revealed that the prediction accuracy decreased across all models as the time horizon of the forecast increased. To facilitate the rapid improvement of biodiversity forecasts, we emphasize the value of specific best practices in making forecasts and evaluating forecasting methods. PMID:29441230
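The central evaluation step, comparing each model against a baseline that predicts constant richness at every site, can be sketched in a few lines with simulated data:

```python
# Hindcast-style check of a forecast model against a constant-richness baseline.
import numpy as np

rng = np.random.default_rng(4)
last_obs = rng.poisson(60, size=200).astype(float)      # richness at the last observed year
future = last_obs + rng.normal(0.0, 3.0, 200)           # richness at the forecast horizon
model_pred = future + rng.normal(0.0, 4.0, 200)         # some model's (noisy) forecasts

mae_model = np.mean(np.abs(model_pred - future))
mae_baseline = np.mean(np.abs(last_obs - future))       # "no change" baseline
print(f"model MAE = {mae_model:.2f}, constant-richness baseline MAE = {mae_baseline:.2f}")
```

With these settings the baseline wins, mirroring the paper's finding that no model consistently beat constant richness.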
26 CFR 1.446-2 - Method of accounting for interest.
Code of Federal Regulations, 2010 CFR
2010-04-01
… account by a taxpayer under the taxpayer's regular method of accounting (e.g., an accrual method or the… 26 CFR (Income Taxes, Methods of Accounting), § 1.446-2 Method of accounting for interest. (a)…
Martin, Emma C; Aarons, Leon; Yates, James W T
2016-07-01
Xenograft studies are commonly used to assess the efficacy of new compounds and characterise their dose-response relationship. Analysis often involves comparing the final tumour sizes across dose groups. This can cause bias, as often in xenograft studies a tumour burden limit (TBL) is imposed for ethical reasons, leading to the animals with the largest tumours being excluded from the final analysis. This means the average tumour size, particularly in the control group, is underestimated, leading to an underestimate of the treatment effect. Four methods to account for dropout due to the TBL are proposed, which use all the available data instead of only final observations: modelling, pattern mixture models, treating dropouts as censored using the M3 method and joint modelling of tumour growth and dropout. The methods were applied to both a simulated data set and a real example. All four proposed methods led to an improvement in the estimate of treatment effect in the simulated data. The joint modelling method performed most strongly, with the censoring method also providing a good estimate of the treatment effect, but with higher uncertainty. In the real data example, the dose-response estimated using the censoring and joint modelling methods was higher than the very flat curve estimated from average final measurements. Accounting for dropout using the proposed censoring or joint modelling methods allows the treatment effect to be recovered in studies where it may have been obscured due to dropout caused by the TBL.
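The censoring idea behind the M3 method can be sketched as a likelihood in which animals removed at the tumour burden limit contribute P(Y >= limit) rather than a density. The exponential-growth model and the data below are hypothetical.

```python
# Censored (Tobit-style) likelihood for tumour growth with a burden limit (TBL).
import numpy as np
from scipy import stats, optimize

t = np.array([0.0, 7.0, 14.0, 21.0, 28.0])              # days
y = np.array([100.0, 180.0, 310.0, 560.0, 1000.0])      # mm^3; last point hit the TBL
TBL = 1000.0
censored = y >= TBL

def neg_log_lik(theta):
    y0, k, sigma = np.exp(theta)                        # log-parameterized for positivity
    mu = y0 * np.exp(k * t)                             # exponential growth
    ll = stats.norm.logpdf(y[~censored], mu[~censored], sigma).sum()
    ll += stats.norm.logsf(TBL, mu[censored], sigma).sum()   # P(Y >= TBL) for dropouts
    return -ll

res = optimize.minimize(neg_log_lik, np.log([100.0, 0.05, 20.0]), method="Nelder-Mead")
y0, k, sigma = np.exp(res.x)
print(f"estimated growth rate k = {k:.3f} per day")
```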
Nurse-directed care model in a psychiatric hospital: a model for clinical accountability.
E-Morris, Marlene; Caldwell, Barbara; Mencher, Kathleen J; Grogan, Kimberly; Judge-Gorny, Margaret; Patterson, Zelda; Christopher, Terrian; Smith, Russell C; McQuaide, Teresa
2010-01-01
The focus on recovery for persons with severe and persistent mental illness is leading state psychiatric hospitals to transform their method of care delivery. This article describes a quality improvement project involving a hospital's administration and multidisciplinary state-university affiliation that collaborated in the development and implementation of a nursing care delivery model in a state psychiatric hospital. The quality improvement project team instituted a new model to promote the hospital's vision of wellness and recovery through utilization of the therapeutic relationship and greater clinical accountability. Implementation of the model was accomplished in 2 phases: first, the establishment of a structure to lay the groundwork for accountability and, second, the development of a mechanism to provide a clinical supervision process for staff in their work with clients. Effectiveness of the model was assessed by surveys conducted at baseline and after implementation. Results indicated improvement in clinical practices and client living environment. As a secondary outcome, these improvements appeared to be associated with increased safety on the units evidenced by reduction in incidents of seclusion and restraint. Restructuring of the service delivery system of care so that clients are the center of clinical focus improves safety and can enhance the staff's attention to work with clients on their recovery. The role of the advanced practice nurse can influence the recovery of clients in state psychiatric hospitals. Future research should consider the impact on clients and their perceptions of the new service models.
NASA Astrophysics Data System (ADS)
Ivanov, M. P.; Tolmachev, Yu. A.
2018-05-01
We consider the most feasible ways to significantly improve the sensitivity of spectroscopic methods for detection and measurement of trace concentrations of greenhouse gas molecules in the atmosphere. The proposed methods are based on combining light fluxes from a number of spectral components of the specified molecule on the same photodetector, taking into account the characteristic features of the transmission spectrum of devices utilizing multipath interference effects.
Bezombes, Lucie; Gaucherand, Stéphanie; Kerbiriou, Christian; Reinert, Marie-Eve; Spiegelberger, Thomas
2017-08-01
In many countries, biodiversity compensation is required to counterbalance negative impacts of development projects on biodiversity by carrying out ecological measures, called offsets when the goal is to reach "no net loss" of biodiversity. One main issue is to ensure that offset gains are equivalent to impact-related losses. Ecological equivalence is assessed with ecological equivalence assessment methods that take into account a range of key considerations that we summarize as ecological, spatial, temporal, and uncertainty-related. When equivalence assessment methods take all considerations into account, we call them "comprehensive". Equivalence assessment methods should also aim to be science-based and operational, which is challenging. Many equivalence assessment methods have been developed worldwide, but none is fully satisfying. In the present study, we examine 13 equivalence assessment methods in order to identify (i) their general structure and (ii) the synergies and trade-offs between characteristics related to operationality, scientific basis and comprehensiveness (called "challenges" in this paper). We evaluate each equivalence assessment method on the basis of 12 criteria describing the level of achievement of each challenge. We observe that all equivalence assessment methods share a general structure, with possible improvements in the choice of target biodiversity, the indicators used, the integration of landscape context and the multipliers reflecting time lags and uncertainties. We show that no equivalence assessment method combines all challenges perfectly. There are trade-offs between and within the challenges: operationality tends to be favored, while scientific bases are integrated heterogeneously in equivalence assessment method development. One way of improving the combination of challenges would be the use of offset-dedicated databases providing scientific feedback on previous offset measures.
An Analysis of Performance-Based Funding Policies and Recommendations for the Florida College System
ERIC Educational Resources Information Center
Balog, Scott E.
2016-01-01
Nearly 30 states have adopted or are transitioning to performance-based funding programs for community colleges that allocate funding based on institutional performance according to defined metrics. While embraced by state lawmakers and promoted by outside advocacy groups as a method to improve student outcomes, enhance accountability and ensure…
A Quality Scorecard for the Administration of Online Education Programs: A Delphi Study
ERIC Educational Resources Information Center
Shelton, Kaye
2010-01-01
As the demands for public accountability increase for the higher education industry, institutions are seeking methods for continuous improvement in order to demonstrate quality within programs and processes, including those provided through online education. Because of the rapid growth of online education programs, institutions are further called…
Homogeneous Grouping in the Context of High-Stakes Testing: Does It Improve Reading Achievement?
ERIC Educational Resources Information Center
Salcedo-Gonzalez, Trena
2012-01-01
As accountability reform intensifies, urban school districts strive to meet No Child Left Behind mandates to avoid severe penalties. This study investigated the resurgence of homogeneous grouping methods as a means to increase reading achievement and meet English Language Arts Adequate Yearly Progress requirements. Specifically, this study…
USDA-ARS?s Scientific Manuscript database
Conservation tillage methods are beneficial as they disturb the soil less and leave increased crop residue cover (CRC) on the soil surface after planting. CRC helps reduce soil erosion, evaporation, and the need for tillage operations in fields. Greenhouse gas emissions are reduced due to less fos...
Rigid Response in an Age of Accountability: The Potential of Leadership and Trust
ERIC Educational Resources Information Center
Daly, Alan J.
2009-01-01
Purpose: The No Child Left Behind Act laudably brings social justice and equity issues to the forefront; however, the act's threat- and sanction-driven methods are not only increasing stress levels but potentially causing a rigid response, especially in the growing population of schools labeled "program improvement" (PI). Specifically,…
Mixed Methods Research: Taking a Broader View
ERIC Educational Resources Information Center
Stewart, Tricia J.; Palermo-Biggs, Michelle
2013-01-01
For school districts, the increasing importance of using data for continuous improvement has become part of the educational landscape under accountability. In many ways, educators have become inundated with data but not always in ways that provide them with a full picture to adequately weigh decisions for their specific context. One way to use…
Interacting multiple model forward filtering and backward smoothing for maneuvering target tracking
NASA Astrophysics Data System (ADS)
Nandakumaran, N.; Sutharsan, S.; Tharmarasa, R.; Lang, Tom; McDonald, Mike; Kirubarajan, T.
2009-08-01
The Interacting Multiple Model (IMM) estimator has been proven to be effective in tracking agile targets. Smoothing, or retrodiction, which uses measurements beyond the current estimation time, provides better estimates of target states. Various methods have been proposed for multiple model smoothing in the literature. In this paper, a new smoothing method, which involves forward filtering followed by backward smoothing while maintaining the fundamental spirit of the IMM, is proposed. The forward filtering is performed using the standard IMM recursion, while the backward smoothing is performed using a novel interacting smoothing recursion. This backward recursion mimics the IMM estimator in the backward direction, where each mode-conditioned smoother uses the standard Kalman smoothing recursion. The resulting algorithm provides improved but delayed estimates of target states. Simulation studies demonstrate the improved performance on a maneuvering target scenario. The comparison with existing methods confirms the improved smoothing accuracy. This improvement results from avoiding the augmented state vector used by other algorithms. In addition, the new technique for accounting for model switching in smoothing is key to improving the performance.
Writing across the Accounting Curriculum: An Experiment.
ERIC Educational Resources Information Center
Riordan, Diane A.; Riordan, Michael P.; Sullivan, M. Cathy
2000-01-01
Develops a structured writing effectiveness program across three junior level courses in the accounting major (tax, cost, and financial accounting) to improve the writing skills of accounting students. Provides evidence that the writing across the curriculum project significantly improved the students' writing skills. (SC)
NASA Astrophysics Data System (ADS)
Paganelli, Chiara; Lee, Danny; Greer, Peter B.; Baroni, Guido; Riboldi, Marco; Keall, Paul
2015-09-01
The quantification of tumor motion in sites affected by respiratory motion is of primary importance to improve treatment accuracy. To account for motion, different studies analyzed the translational component only, without focusing on the rotational component, which was quantified in a few studies on the prostate with implanted markers. The aim of our study was to propose a tool able to quantify lung tumor rotation without the use of internal markers, thus providing accurate motion detection close to critical structures such as the heart or liver. Specifically, we propose the use of an automatic feature extraction method in combination with the acquisition of fast orthogonal cine MRI images of nine lung patients. As a preliminary test, we evaluated the performance of the feature extraction method by applying it on regions of interest around (i) the diaphragm and (ii) the tumor and comparing the estimated motion with that obtained by (i) the extraction of the diaphragm profile and (ii) the segmentation of the tumor, respectively. The results confirmed the capability of the proposed method in quantifying tumor motion. Then, a point-based rigid registration was applied to the extracted tumor features between all frames to account for rotation. The median lung rotation values were -0.6 ± 2.3° and -1.5 ± 2.7° in the sagittal and coronal planes respectively, confirming the need to account for tumor rotation along with translation to improve radiotherapy treatment.
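The point-based rigid registration step can be carried out with the standard Kabsch/Procrustes SVD solution; below is a sketch on synthetic 2D feature points (not the authors' exact pipeline), recovering the in-plane rotation angle between two frames.

```python
# Kabsch/Procrustes rigid registration of matched 2D feature points via SVD.
import numpy as np

rng = np.random.default_rng(5)
src = rng.normal(0.0, 5.0, (40, 2))                     # features in frame A
theta = np.deg2rad(-1.5)                                # "true" in-plane rotation
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
dst = src @ R_true.T + np.array([0.8, -0.3])            # rotated + translated frame B

A = src - src.mean(axis=0)                              # center both point sets
B = dst - dst.mean(axis=0)
U, _, Vt = np.linalg.svd(A.T @ B)
d = np.sign(np.linalg.det(Vt.T @ U.T))                  # guard against a reflection
R = Vt.T @ np.diag([1.0, d]) @ U.T

angle = np.rad2deg(np.arctan2(R[1, 0], R[0, 0]))
print(f"estimated rotation = {angle:.2f} degrees")
```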
Messan Setodji, Claude; Le, Vi-Nhuan; Schaack, Diana
2012-01-01
Child care studies that have examined links between teachers' qualifications and children's outcomes often ignore teachers' and children's transitions between classrooms at a center throughout the day and only take into account head teacher qualifications. The objective of this investigation was to examine these traditional assumptions and to compare inferences made from these traditional models to methods accounting for transitions between classrooms and multiple teachers in a classroom. The study examined the receptive language, letter-word identification, and passage comprehension skills of 307 children enrolled in 49 community-based childcare centers serving primarily low-income families in Colorado. Results suggest that nearly one-third of children and over 80% of teachers moved daily between classrooms. Findings also reveal that failure to account for daily transitions between classrooms can affect interpretations of the relationship between teacher qualifications and child outcomes, with the model accounting for movement providing significant improvements in model fit and inference. PMID:22389546
Treatment of constraints in the stochastic quantization method and covariantized Langevin equation
NASA Astrophysics Data System (ADS)
Ikegami, Kenji; Kimura, Tadahiko; Mochizuki, Riuji
1993-04-01
We study the treatment of the constraints in the stochastic quantization method. We improve the treatment of the stochastic consistency condition proposed by Namiki et al. by suitably taking into account the Ito calculus. We then obtain an improved Langevin equation and the Fokker-Planck equation which naturally leads to the correct path-integral quantization of the constrained system as the stochastic equilibrium state. This treatment is applied to an O(N) non-linear σ model, and it is shown that singular terms appearing in the improved Langevin equation cancel out the δ^n(0) divergences at one-loop order. We also ascertain that the above Langevin equation, rewritten in terms of independent variables, is actually equivalent to the one in the general-coordinate-transformation-covariant and vielbein-rotation-invariant formalism.
Erchick, Daniel J.; George, Asha S.; Umeh, Chukwunonso; Wonodi, Chizoba
2017-01-01
Background: Routine immunization coverage in Nigeria has remained low, and studies have identified a lack of accountability as a barrier to high performance in the immunization system. Accountability lies at the heart of various health systems strengthening efforts recently launched in Nigeria, including those related to immunization. Our aim was to understand the views of health officials on the accountability challenges hindering immunization service delivery at various levels of government. Methods: A semi-structured questionnaire was used to interview immunization and primary healthcare (PHC) officials from national, state, local, and health facility levels in Niger State in north central Nigeria. Individuals were selected to represent a range of roles and responsibilities in the immunization system. The questionnaire explored concepts related to internal accountability using a framework that organizes accountability into three axes based upon how they drive change in the health system. Results: Respondents highlighted accountability challenges across multiple components of the immunization system, including vaccine availability, financing, logistics, human resources, and data management. A major focus was the lack of clear roles and responsibilities both within institutions and between levels of government. Delays in funding, especially at lower levels of government, disrupted service delivery. Supervision occurred less frequently than necessary, and the limited decision space of managers prevented problems from being resolved. Motivation was affected by the inability of officials to fulfill their responsibilities. Officials posited numerous suggestions to improve accountability, including clarifying roles and responsibilities, ensuring timely release of funding, and formalizing processes for supervision, problem solving, and data reporting. Conclusion: Weak accountability presents a significant barrier to performance of the routine immunization system and high immunization coverage in Nigeria. As one stakeholder in ensuring the performance of health systems, routine immunization officials reveal critical areas that need to be prioritized if emerging interventions to improve accountability in routine immunization are to have an effect. PMID:28812836
Lymeus, Freddie; Lindberg, Per; Hartig, Terry
2018-03-01
Mindfulness courses conventionally use effortful, focused meditation to train attention. In contrast, natural settings can effortlessly support state mindfulness and restore depleted attention resources, which could facilitate meditation. We performed two studies that compared conventional training with restoration skills training (ReST) that taught low-effort open monitoring meditation in a garden over five weeks. Assessments before and after meditation on multiple occasions showed that ReST meditation increasingly enhanced attention performance. Conventional meditation enhanced attention initially but increasingly incurred effort, reflected in performance decrements toward the course end. With both courses, attentional improvements generalized in the first weeks of training. Against established accounts, the generalized improvements thus occurred before any effort was incurred by the conventional exercises. We propose that restoration rather than attention training can account for early attentional improvements with meditation. ReST holds promise as an undemanding introduction to mindfulness and as a method to enhance restoration in nature contacts. Copyright © 2018 Elsevier Inc. All rights reserved.
2013-07-02
in streamer discharge afterglow in a variety of fuel/air mixtures in order to account for the O reaction pathways in transient plasma ignition. It is... plasma ignition (TPI), the use of streamers for ignition in combustion engines, holds great promise for improving performance. TPI has been tested...standard spark gap or arc ignition methods [1-4]. These improvements to combustion allow increasing power and efficiency in existing engines such as
DRG systems in Europe: variations in cost accounting systems among 12 countries.
Tan, Siok Swan; Geissler, Alexander; Serdén, Lisbeth; Heurgren, Mona; van Ineveld, B Martin; Redekop, W Ken; Hakkaart-van Roijen, Leona
2014-12-01
Diagnosis-related group (DRG)-based hospital payment systems have gradually become the principal means of reimbursing hospitals in many European countries. Owing to the absence or inaccuracy of costs related to DRGs, these countries have started to routinely collect cost accounting data. The aim of the present article was to compare the cost accounting systems of 12 European countries. A standardized questionnaire was developed to guide comprehensive cost accounting system descriptions for each of the 12 participating countries. The cost accounting systems of European countries vary widely by the share of hospital costs reimbursed through DRG payment, the presence of mandatory cost accounting and/or costing guidelines, the share of cost collecting hospitals, costing methods and data checks on reported cost data. Each of these aspects entails a trade-off between accuracy of the cost data and feasibility constraints. Although a 'best' cost accounting system does not exist, our cross-country comparison gives insight into international differences and may help regulatory authorities and hospital managers to identify and improve areas of weakness in their cost accounting systems. Moreover, it may help health policymakers to underpin the development of a cost accounting system. © The Author 2014. Published by Oxford University Press on behalf of the European Public Health Association. All rights reserved.
NASA Technical Reports Server (NTRS)
Padavala, Satyasrinivas; Palazzolo, Alan B.; Vallely, Pat; Ryan, Steve
1994-01-01
An improved dynamic analysis for liquid annular seals with arbitrary profile, based on a method first proposed by Nelson and Nguyen, is presented. An improved first-order solution that incorporates a continuous interpolation of perturbed quantities in the circumferential direction is presented. The original method uses an approximation scheme for circumferential gradients based on Fast Fourier Transforms (FFT). A simpler scheme based on cubic splines is found to be computationally more efficient, with better convergence at higher eccentricities. A new approach to computing dynamic coefficients based on an externally specified load is introduced. This improved analysis is extended to account for an arbitrarily varying seal profile in both the axial and circumferential directions. An example case of an elliptical seal with varying degrees of axial curvature is analyzed. A case study based on actual operating clearances of an interstage seal of the Space Shuttle Main Engine High Pressure Oxygen Turbopump is presented.
Accounting for estimated IQ in neuropsychological test performance with regression-based techniques.
Testa, S Marc; Winicki, Jessica M; Pearlson, Godfrey D; Gordon, Barry; Schretlen, David J
2009-11-01
Regression-based normative techniques account for variability in test performance associated with multiple predictor variables and generate expected scores based on algebraic equations. Using this approach, we show that estimated IQ, based on oral word reading, accounts for 1-9% of the variability beyond that explained by individual differences in age, sex, race, and years of education for most cognitive measures. These results confirm that adding estimated "premorbid" IQ to demographic predictors in multiple regression models can incrementally improve the accuracy with which regression-based norms (RBNs) benchmark expected neuropsychological test performance in healthy adults. It remains to be seen whether the incremental variance in test performance explained by estimated "premorbid" IQ translates to improved diagnostic accuracy in patient samples. We describe these methods, and illustrate the step-by-step application of RBNs with two cases. We also discuss the rationale, assumptions, and caveats of this approach. More broadly, we note that adjusting test scores for age and other characteristics might actually decrease the accuracy with which test performance predicts absolute criteria, such as the ability to drive or live independently.
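The arithmetic of an RBN is simple once the normative regression exists: predict the expected score from the demographic predictors plus estimated IQ, then standardize the observed discrepancy by the model's residual error. All coefficients below are hypothetical, for illustration only.

```python
# Regression-based norm: expected score and standardized discrepancy (z).
predicted = (42.0           # intercept of the (hypothetical) normative regression
             - 0.15 * 68    # age in years
             + 1.8 * 1      # sex (1 = female)
             + 0.6 * 12     # years of education
             + 0.10 * 104)  # estimated "premorbid" IQ from oral word reading

observed = 47.0             # examinee's raw test score
rmse = 6.2                  # residual SD of the normative regression
z = (observed - predicted) / rmse
print(f"expected score = {predicted:.1f}, z = {z:.2f}")
```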
An improved method for measuring the magnetic inhomogeneity shift in hydrogen masers
NASA Technical Reports Server (NTRS)
Reinhardt, V. S.; Peters, H. E.
1975-01-01
The reported method makes it possible to conduct all maser frequency measurements under conditions of low magnetic field intensity for which the hydrogen maser is most stable. Aspects concerning the origin of the magnetic inhomogeneity shift are examined and the available approaches for measuring this shift are considered, taking into account certain drawbacks of currently used methods. An approach free of these drawbacks can be based on the measurement of changes in a parameter representing the difference between the number of atoms in the involved states.
Method for measuring multiple scattering corrections between liquid scintillators
DOE Office of Scientific and Technical Information (OSTI.GOV)
Verbeke, J. M.; Glenn, A. M.; Keefer, G. J.
2016-04-11
In this study, a time-of-flight method is proposed to experimentally quantify the fractions of neutrons scattering between scintillators. An array of scintillators is characterized in terms of crosstalk with this method by measuring a californium source, for different neutron energy thresholds. The spectral information recorded by the scintillators can be used to estimate the fractions of neutrons multiple scattering. With the help of a correction to Feynman's point model theory to account for multiple scattering, these fractions can in turn improve the mass reconstruction of fissile materials under investigation.
Johnson, Kevin K.; Goodwin, Greg E.
2013-01-01
Lake Michigan diversion accounting is the process used by the U.S. Army Corps of Engineers to quantify the amount of water that is diverted from the Lake Michigan watershed into the Illinois and Mississippi River Basins. A network of streamgages within the Chicago area waterway system monitors tributary river flows and the major river flow on the Chicago Sanitary and Ship Canal near Lemont as one of the instrumental tools used for Lake Michigan diversion accounting. The mean annual discharges recorded by these streamgages are used as additions or deductions to the mean annual discharge recorded by the main stream gaging station currently used in the Lake Michigan diversion accounting process, the Chicago Sanitary and Ship Canal near Lemont, Illinois (station number 05536890). A new stream gaging station, Summit Conduit near Summit, Illinois (station number 414757087490401), was installed on September 23, 2010, to monitor stage, velocity, and discharge through the Summit Conduit for the U.S. Army Corps of Engineers in accordance with Lake Michigan diversion accounting. Summit Conduit conveys flow from a small part of the lower Des Plaines River watershed underneath the Des Plaines River directly into the Chicago Sanitary and Ship Canal. Because the Summit Conduit discharges into the Chicago Sanitary and Ship Canal upstream from the stream gaging station at Lemont, Illinois, but does not contain flow diverted from the Lake Michigan watershed, it is considered a flow deduction from the discharge measured by the Lemont stream gaging station in the Lake Michigan diversion accounting process. This report offers a technical summary of the techniques and methods used for the collection and computation of the stage, velocity, and discharge data at the Summit Conduit near Summit, Illinois stream gaging station for the 2011 and 2012 water years. The stream gaging station Summit Conduit near Summit, Illinois (station number 414757087490401) is an example of a nonstandard stream gage, where traditional methods of equating stage to discharge have not been effective. Examples of the nonstandard conditions include the converging tributary flows directly upstream of the gage; the trash rack and walkway near the opening of the conduit, which introduce turbulence and occasionally entrain air bubbles into the flow; and debris within the conduit, which creates conditions of variable backwater, along with the constant influx of smaller debris that escapes the trash rack and catches or settles in the conduit and on the equipment. An acoustic Doppler velocity meter was installed to measure stage and velocity in order to compute discharge. The stage is used to calculate area based on the stage-area rating. The index velocity from the acoustic Doppler velocity meter is applied to the index-velocity rating to obtain a mean velocity, and the product of the two rated values is the discharge computed by the index-velocity method. The nonstandard site conditions prevalent at the Summit Conduit stream gaging station are generally overcome through the index-velocity method. Despite the difficulties in gaging and measurement, improvements continue to be made in data collection, transmission, and measurements. Ongoing efforts to improve the site and the ratings continue to increase the quality and quantity of the data available for Lake Michigan diversion accounting.
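The index-velocity computation itself reduces to two ratings and a product. A sketch with hypothetical rating coefficients (not the Summit Conduit ratings):

```python
# Index-velocity method: Q = area(stage) * mean_velocity(index velocity).
def discharge_cfs(stage_ft, index_velocity_fps):
    area_sqft = 14.0 * stage_ft + 3.0                     # stage-area rating (assumed linear)
    mean_velocity_fps = 0.92 * index_velocity_fps + 0.05  # index-velocity rating (assumed)
    return area_sqft * mean_velocity_fps

print(f"Q = {discharge_cfs(2.4, 1.3):.1f} cubic feet per second")
```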
Wu, Zhihong; Lu, Ke; Zhu, Yuan
2015-01-01
The torque output accuracy of the IPMSM in electric vehicles using a state-of-the-art MTPA strategy depends strongly on the accuracy of the machine parameters; a torque estimation method is therefore necessary for the safety of the vehicle. In this paper, a torque estimation method based on a flux estimator with a modified low-pass filter is presented. Moreover, by taking into account the non-ideal characteristics of the inverter, the torque estimation accuracy is improved significantly. The effectiveness of the proposed method is demonstrated through MATLAB/Simulink simulation and experiment.
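As a rough illustration of flux-estimator-based torque estimation, the sketch below integrates the back EMF through a first-order low-pass filter (a stand-in for the paper's modified LPF) and forms torque from the cross product of flux and current in the stationary frame. The resistance, pole pairs, and cutoff are assumed values, and the inverter-nonideality compensation central to the paper is omitted.

```python
import numpy as np

R_S, POLE_PAIRS, W_C, DT = 0.05, 4, 5.0, 1e-4  # assumed: ohm, -, rad/s, s

def estimate_torque(v_ab, i_ab):
    """v_ab, i_ab: arrays of shape (N, 2) with alpha/beta voltages and currents."""
    psi = np.zeros(2)
    torque = np.empty(len(v_ab))
    for k, (v, i) in enumerate(zip(v_ab, i_ab)):
        emf = v - R_S * i
        psi += DT * (emf - W_C * psi)     # LPF in place of a pure integrator
        torque[k] = 1.5 * POLE_PAIRS * (psi[0] * i[1] - psi[1] * i[0])
    return torque
```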
Professional Accountability for Improving Life, College, and Career Readiness
ERIC Educational Resources Information Center
Snyder, Jon; Bristol, Travis J.
2015-01-01
This article builds on Darling-Hammond, Wilhoit, and Pittenger's (2014) new paradigm on "Accountability for College and Career Readiness" by focusing on one of its three pillars--professional accountability. The article begins by offering a conceptual framework for professional accountability for improvement. Next, it highlights slices…
26 CFR 1.6655-6 - Methods of accounting.
Code of Federal Regulations, 2010 CFR
2010-04-01
... of accounting method. Corporation ABC, a calendar year taxpayer, uses an accrual method of accounting... § 1.6655-6 Methods of accounting. (a) In general. In computing any required installment, a corporation must use the...
Mathematical Modeling For Control Of A Flexible Manipulator
NASA Technical Reports Server (NTRS)
Hu, Anren
1996-01-01
Improved method of mathematical modeling of dynamics of flexible robotic manipulators developed for use in controlling motions of manipulators. Involves accounting for effect, upon modes of vibration of manipulator, of changes in configuration of manipulator and manipulated payload(s). Flexible manipulator has one or more long, slender articulated link(s), like those used in outer space; method also applicable to terrestrial industrial robotic manipulators with relatively short, stiff links, or to such terrestrial machines as construction cranes.
NASA Astrophysics Data System (ADS)
Pan, Shijia; Mirshekari, Mostafa; Fagert, Jonathon; Ramirez, Ceferino Gabriel; Chung, Albert Jin; Hu, Chih Chi; Shen, John Paul; Zhang, Pei; Noh, Hae Young
2018-02-01
Many human activities induce excitations on ambient structures with various objects, causing the structures to vibrate. Accurate vibration excitation source detection and characterization enable human activity information inference, hence allowing human activity monitoring for various smart building applications. By utilizing structural vibrations, we can achieve sparse and non-intrusive sensing, unlike pressure- and vision-based methods. Many approaches have been presented on vibration-based source characterization, and they often either focus on one excitation type or have limited performance due to the dispersion and attenuation effects of the structures. In this paper, we present our method to characterize two main types of excitations induced by human activities (impulse and slip-pulse) on multiple structures. By understanding the physical properties of waves and their propagation, the system can achieve accurate excitation tracking on different structures without large-scale labeled training data. Specifically, our algorithm takes properties of surface waves generated by impulse and of body waves generated by slip-pulse into account to handle the dispersion and attenuation effects when different types of excitations happen on various structures. We then evaluate the algorithm through multiple scenarios. Our method achieves up to a six times improvement in impulse localization accuracy and a three times improvement in slip-pulse trajectory length estimation compared to existing methods that do not take wave properties into account.
Ross, Simone J; Preston, Robyn; Lindemann, Iris C; Matte, Marie C; Samson, Rex; Tandinco, Filedito D; Larkins, Sarah L; Palsdottir, Bjorg; Neusy, Andre-Jacques
2014-01-01
The Training for Health Equity Network (THEnet), a group of diverse health professional schools aspiring toward social accountability, developed and pilot tested a comprehensive evaluation framework to assess progress toward socially accountable health professions education. The evaluation framework provides criteria for schools to assess their level of social accountability within their organization and planning; education, research and service delivery; and the direct and indirect impacts of the school and its graduates on the community and health system. This paper describes the pilot implementation and testing of the evaluation framework across five THEnet schools, and examines whether the framework was practical and feasible across contexts for the purposes of critical reflection and continuous improvement of progress towards social accountability. In this pilot study, schools applied the evaluation framework using a mixed-methods approach to data collection, comprising workshops, qualitative interviews and focus group discussions, document review, and collation and analysis of existing quantitative data. The evaluation framework allowed each school to contextually gather evidence on how it was meeting the aspirational goals of social accountability across a range of school activities, and to identify strengths and areas for improvement and development. The pilot study demonstrated how social accountability can be assessed through a critically reflective and comprehensive process. As social accountability focuses on the relationship between health professions schools and health system and population health outcomes, each school was able to demonstrate to students, health professionals, governments, accrediting bodies, communities and other stakeholders how the current and future health care needs of populations are addressed in terms of education, research, and service learning.
Lodenstein, Elsbet; Dieleman, Marjolein; Gerretsen, Barend; Broerse, Jacqueline E W
2017-02-01
Social accountability in the health sector has been promoted as a strategy to improve the quality and performance of health providers in low- and middle-income countries. Whether improvements occur, however, depends on the willingness and ability of health providers to respond to societal pressure for better care. This article uses a realist approach to review cases of collective citizen action and advocacy with the aim to identify key mechanisms of provider responsiveness. Purposeful searches for cases were combined with a systematic search in four databases. To be included in the review, the initiatives needed to describe at least one outcome at the level of frontline service provision. Some 37 social accountability initiatives in 15 countries met these criteria. Using a realist approach, retroductive analysis and triangulation of methods and sources were performed to construct Context-Mechanism-Outcome configurations that explain potential pathways to provider responsiveness. The findings suggest that health provider receptivity to citizens' demands for better health care is mediated by health providers' perceptions of the legitimacy of citizen groups and by the extent to which citizen groups provide personal and professional support to health providers. Some citizen groups activated political or formal bureaucratic accountability channels but the effect on provider responsiveness of such strategies was more mixed. Favourable contexts for health provider responsiveness comprise socio-political contexts in which providers self-identify as activists, health system contexts in which health providers depend on citizens' expertise and capacities, and health system contexts where providers have the self-perceived ability to change the system in which they operate. Rather than providing recipes for successful social accountability initiatives, the synthesis proposes a programme theory that can support reflections on the theories of change underpinning social accountability initiatives and interventions to improve the quality of primary health care in different settings. © The Author 2016. Published by Oxford University Press in association with The London School of Hygiene and Tropical Medicine.
Neutron crosstalk between liquid scintillators
DOE Office of Scientific and Technical Information (OSTI.GOV)
Verbeke, J. M.; Prasad, M. K.; Snyderman, N. J.
2015-05-01
We propose a method to quantify the fractions of neutrons scattering between liquid scintillators. Using a spontaneous fission source, this method can be utilized to quickly characterize an array of liquid scintillators in terms of crosstalk. The point model theory due to Feynman is corrected to account for these multiple scatterings. Using spectral information measured by the liquid scintillators, fractions of multiple scattering can be estimated, and mass reconstruction of fissile materials under investigation can be improved. Monte Carlo simulations of mono-energetic neutron sources were performed to estimate neutron crosstalk. A californium source in an array of liquid scintillators was modeled to illustrate the improvement of the mass reconstruction.
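For context, the Feynman point model the authors correct is built on time-gate statistics such as the variance-to-mean excess sketched below; an estimated crosstalk fraction would be used to deflate the correlated-count contribution before mass reconstruction. This is a hedged sketch of the statistic only, not of the correction itself.

```python
import numpy as np

def feynman_y(event_times, gate_width):
    """Counts per fixed time gate; excess variance signals correlated neutrons."""
    t = np.asarray(event_times)
    n_gates = int(t.max() // gate_width)
    counts, _ = np.histogram(t, bins=n_gates, range=(0, n_gates * gate_width))
    return counts.var() / counts.mean() - 1.0  # Y = var/mean - 1

# Uncorrelated (Poisson) events give Y close to zero:
rng = np.random.default_rng(0)
print(feynman_y(np.sort(rng.uniform(0.0, 1.0, 1000)), 0.01))
```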
Efficient Unsteady Flow Visualization with High-Order Access Dependencies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Jiang; Guo, Hanqi; Yuan, Xiaoru
We present a novel high-order access dependencies based model for efficient pathline computation in unsteady flow visualization. By taking longer access sequences into account to model more sophisticated data access patterns in particle tracing, our method greatly improves the accuracy and reliability in data access prediction. In our work, high-order access dependencies are calculated by tracing uniformly-seeded pathlines in both forward and backward directions in a preprocessing stage. The effectiveness of our proposed approach is demonstrated through a parallel particle tracing framework with high-order data prefetching. Results show that our method achieves higher data locality and hence improves the efficiency of pathline computation.
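A minimal sketch of the high-order access-dependency idea: learn, from pre-traced pathlines, which data block tends to follow each length-k history of block accesses, then prefetch the most likely successors at run time. The function names and the order k are assumptions for illustration.

```python
from collections import Counter, defaultdict

def build_dependencies(traces, k=3):
    """traces: sequences of block ids visited by pre-traced pathlines."""
    model = defaultdict(Counter)
    for trace in traces:
        for i in range(len(trace) - k):
            model[tuple(trace[i:i + k])][trace[i + k]] += 1
    return model

def predict_prefetch(model, history, top=2):
    """Return the most likely next blocks given a length-k access history."""
    return [blk for blk, _ in model.get(tuple(history), Counter()).most_common(top)]

deps = build_dependencies([[1, 2, 3, 4], [1, 2, 3, 5], [1, 2, 3, 4]])
print(predict_prefetch(deps, [1, 2, 3]))  # -> [4, 5]
```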
Hybrid electro-optics and chipscale integration of electronics and photonics
NASA Astrophysics Data System (ADS)
Dalton, L. R.; Robinson, B. H.; Elder, D. L.; Tillack, A. F.; Johnson, L. E.
2017-08-01
Taken together, theory-guided nano-engineering of organic electro-optic materials and hybrid device architectures have permitted dramatic improvement of the performance of electro-optic devices. For example, the voltage-length product has been improved by nearly a factor of 10⁴, bandwidths have been extended to nearly 200 GHz, device footprints reduced to less than 200 μm², and femtojoule energy efficiency achieved. This presentation discusses the utilization of new coarse-grained theoretical methods and advanced quantum mechanical methods to quantitatively simulate the physical properties of new classes of organic electro-optic materials and to evaluate their performance in nanoscopic device architectures, accounting for the effect on chromophore ordering at interfaces in nanoscopic waveguides.
Study of Fuze Structure and Reliability Design Based on the Direct Search Method
NASA Astrophysics Data System (ADS)
Lin, Zhang; Ning, Wang
2017-03-01
Redundant design is one of the important methods to improve the reliability of a system, but mutual coupling of multiple factors is often involved in the design. In this study, the Direct Search Method is introduced into optimum redundancy configuration for design optimization, in which reliability, cost, structural weight and other factors can be taken into account simultaneously, and the redundancy allocation and reliability design of an aircraft critical system are computed. The results show that this method is convenient and workable, applicable to the redundancy configuration and optimization of various designs upon appropriate modifications, and of good practical value.
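As a toy stand-in for this kind of optimization (the paper uses the Direct Search Method; a brute-force search over the small discrete space is shown here for brevity), the sketch below picks per-subsystem redundancy levels that maximize series-parallel reliability under invented cost and weight budgets.

```python
from itertools import product

# Invented single-unit reliabilities, costs, and weights per subsystem.
r, cost, weight = [0.90, 0.95, 0.85], [2.0, 3.0, 1.5], [1.0, 0.8, 1.2]
COST_MAX, WEIGHT_MAX = 20.0, 10.0

def system_reliability(n):
    rel = 1.0
    for ri, ni in zip(r, n):
        rel *= 1.0 - (1.0 - ri) ** ni   # ni parallel units per subsystem
    return rel

best = max(
    (n for n in product(range(1, 5), repeat=3)
     if sum(c * k for c, k in zip(cost, n)) <= COST_MAX
     and sum(w * k for w, k in zip(weight, n)) <= WEIGHT_MAX),
    key=system_reliability,
)
print(best, round(system_reliability(best), 5))
```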
Accounting for GC-content bias reduces systematic errors and batch effects in ChIP-seq data.
Teng, Mingxiang; Irizarry, Rafael A
2017-11-01
The main application of ChIP-seq technology is the detection of genomic regions that bind to a protein of interest. A large part of functional genomics' public catalogs is based on ChIP-seq data. These catalogs rely on peak calling algorithms that infer protein-binding sites by detecting genomic regions associated with more mapped reads (coverage) than expected by chance, as a result of the experimental protocol's lack of perfect specificity. We find that GC-content bias accounts for substantial variability in the observed coverage for ChIP-seq experiments and that this variability leads to false-positive peak calls. More concerning is that the GC effect varies across experiments, with the effect strong enough to result in a substantial number of peaks called differently when different laboratories perform experiments on the same cell line. However, accounting for GC-content bias in ChIP-seq is challenging because the binding sites of interest tend to be more common in high GC-content regions, which confounds real biological signals with unwanted variability. To account for this challenge, we introduce a statistical approach that accounts for GC effects on both nonspecific noise and signal induced by the binding site. The method can be used to account for this bias in binding quantification as well as to improve existing peak calling algorithms. We use this approach to show a reduction in false-positive peaks as well as improved consistency across laboratories. © 2017 Teng and Irizarry; Published by Cold Spring Harbor Laboratory Press.
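The basic stratify-and-rescale idea behind GC-content normalization can be sketched as follows; the published method goes further by modeling GC effects on nonspecific noise and binding signal separately, which this toy version does not attempt.

```python
import numpy as np

def gc_normalize(coverage, gc, n_bins=20):
    """Rescale bin-level coverage by the expected coverage of its GC stratum."""
    coverage, gc = np.asarray(coverage, float), np.asarray(gc, float)
    strata = np.minimum((gc * n_bins).astype(int), n_bins - 1)
    expected = np.array([
        coverage[strata == s].mean() if np.any(strata == s) else np.nan
        for s in range(n_bins)
    ])
    return coverage * (np.nanmean(coverage) / expected[strata])

cov = np.array([10.0, 12.0, 30.0, 28.0, 11.0])   # toy coverage values
gc = np.array([0.35, 0.36, 0.62, 0.61, 0.34])    # GC fraction per bin
print(gc_normalize(cov, gc).round(2))            # GC-driven spread is flattened
```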
Can Early Intervention Improve Maternal Well-Being? Evidence from a Randomized Controlled Trial
Doyle, Orla; Delaney, Liam; O’Farrelly, Christine; Fitzpatrick, Nick; Daly, Michael
2017-01-01
Objective This study estimates the effect of a targeted early childhood intervention program on global and experienced measures of maternal well-being utilizing a randomized controlled trial design. The primary aim of the intervention is to improve children’s school readiness skills by working directly with parents to improve their knowledge of child development and parenting behavior. One potential externality of the program is well-being benefits for parents given its direct focus on improving parental coping, self-efficacy, and problem solving skills, as well as generating an indirect effect on parental well-being by targeting child developmental problems. Methods Participants from a socio-economically disadvantaged community are randomly assigned during pregnancy to an intensive 5-year home visiting parenting program or a control group. We estimate and compare treatment effects on multiple measures of global and experienced well-being using permutation testing to account for small sample size and a stepdown procedure to account for multiple testing. Results The intervention has no impact on global well-being as measured by life satisfaction and parenting stress or experienced negative affect using episodic reports derived from the Day Reconstruction Method (DRM). Treatment effects are observed on measures of experienced positive affect derived from the DRM and a measure of mood yesterday. Conclusion The limited treatment effects suggest that early intervention programs may produce some improvements in experienced positive well-being, but no effects on negative aspects of well-being. Different findings across measures may result as experienced measures of well-being avoid the cognitive biases that impinge upon global assessments. PMID:28095505
Ethnic Group Attitudes: A Behavioral Model for the Study of Attitude Intensity.
ERIC Educational Resources Information Center
Gaither, Gerald; And Others
The attitude assessment model presented here is intended to be an improvement over methods traditionally used to study attitudes. It takes into account findings by Astin (1969) and Berkowitz (1968), calling for a model expressing the covert behavior of a subject in terms equivalent to those used to anticipate overt behavior. This paper presents…
Measuring School Performance To Improve Student Achievement and To Reward Effective Programs.
ERIC Educational Resources Information Center
Heistad, Dave; Spicuzza, Rick
This paper describes the method that the Minneapolis Public School system (MPS), Minnesota, uses to measure school and student performance. MPS uses a multifaceted system that both captures and accounts for the complexity of a large urban school district. The system incorporates: (1) a hybrid model of critical indicators that report on level of…
ERIC Educational Resources Information Center
Agnant Rogers, Myriam
2013-01-01
As a result of poor student performance, professional development has emerged as a key strategy for improving instruction and achievement. In times of reduced resources and increased accountability, schools must evaluate their efforts in order to make sound decisions about policy and practice. This mixed method study was designed to investigate…
A multi-source data assimilation framework for flood forecasting: Accounting for runoff routing lags
NASA Astrophysics Data System (ADS)
Meng, S.; Xie, X.
2015-12-01
In flood forecasting practice, model performance is usually degraded by various sources of uncertainty, including uncertainties from input data, model parameters, model structures and output observations. Data assimilation is a useful methodology to reduce uncertainties in flood forecasting. For short-term flood forecasting, an accurate estimate of the initial soil moisture condition will improve forecasting performance, and the time delay of runoff routing is another important factor. Moreover, observations of hydrological variables (including ground observations and satellite observations) are becoming readily available, so the reliability of short-term flood forecasting can be improved by assimilating multi-source data. The objective of this study is to develop a multi-source data assimilation framework for real-time flood forecasting. In this framework, the first step assimilates upper-layer soil moisture observations to update the model state and generated runoff based on the ensemble Kalman filter (EnKF) method, and the second step assimilates discharge observations to update the model state and runoff within a fixed time window based on the ensemble Kalman smoother (EnKS) method. The smoothing technique is adopted to account for the runoff routing lag. Using such an assimilation framework for soil moisture and discharge observations is expected to improve flood forecasting. To distinguish the effectiveness of this dual-step assimilation framework, we designed a dual-EnKF algorithm in which the observed soil moisture and discharge are assimilated separately without accounting for the runoff routing lag. The results show that the multi-source data assimilation framework can effectively improve flood forecasting, especially when the runoff routing has a distinct time lag. Thus, this new data assimilation framework holds great potential in operational flood forecasting by merging observations from ground measurements and remote sensing retrievals.
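A minimal stochastic EnKF analysis step of the kind used in the first assimilation stage might look like the sketch below; the EnKS step over the routing-lag window is omitted, and the state layout and observation operator are generic assumptions.

```python
import numpy as np

def enkf_update(X, y, H, obs_var, rng=np.random.default_rng(0)):
    """X: state ensemble (n_state, n_ens); y: observations; H: obs operator."""
    n_ens = X.shape[1]
    Y = H @ X + rng.normal(0.0, np.sqrt(obs_var), (len(y), n_ens))  # perturbed obs
    Xp = X - X.mean(axis=1, keepdims=True)
    Yp = H @ Xp
    K = (Xp @ Yp.T) @ np.linalg.inv(Yp @ Yp.T + (n_ens - 1) * obs_var * np.eye(len(y)))
    return X + K @ (np.asarray(y, float).reshape(-1, 1) - Y)

rng = np.random.default_rng(1)
X = rng.normal(0.3, 0.05, (2, 50))      # toy two-layer soil-moisture ensemble
H = np.array([[1.0, 0.0]])              # observe the surface layer only
Xa = enkf_update(X, y=[0.35], H=H, obs_var=1e-4)
print(X.mean(axis=1).round(3), Xa.mean(axis=1).round(3))
```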
25 CFR 170.605 - When may BIA use force account methods in the IRR Program?
Code of Federal Regulations, 2010 CFR
2010-04-01
§ 170.605 When may BIA use force account methods in the IRR Program? BIA may use force account methods... account project activities. ...
Information Filtering via Heterogeneous Diffusion in Online Bipartite Networks
Zhang, Fu-Guo; Zeng, An
2015-01-01
The rapid expansion of the Internet brings us overwhelming online information, which is impossible for an individual to go through in full. Therefore, recommender systems were created to help people dig through this abundance of information. In networks composed of users and objects, recommender algorithms based on diffusion have been proven to be among the best performing methods. Previous works considered the diffusion process from user to object, and from object to user, to be equivalent. We show in this work that this is not the case, and we improve the quality of the recommendation by taking into account the asymmetrical nature of this process. We apply this idea to modify state-of-the-art recommendation methods. The simulation results show that the new methods can outperform the existing methods in both recommendation accuracy and diversity. Finally, this modification is shown to improve recommendation in a realistic case. PMID:26125631
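Here is a hedged sketch of mass-diffusion recommendation on a user-object bipartite network, with an exponent on the object-to-user step so the two directions need not be treated identically. This is a generic stand-in for the authors' modification; lam = 1 recovers the standard symmetric method.

```python
import numpy as np

def recommend(A, user, lam=1.0, top=2):
    """A[u, o] = 1 if user u collected object o; returns top unseen objects."""
    A = np.asarray(A, float)
    k_user = np.maximum(A.sum(axis=1), 1.0)
    k_obj = np.maximum(A.sum(axis=0), 1.0)
    f = A[user]                            # unit resource on collected objects
    r_users = A @ (f / k_obj ** lam)       # objects spread resource to users
    scores = A.T @ (r_users / k_user)      # users spread it back to objects
    scores[A[user] > 0] = -np.inf          # do not re-recommend seen objects
    return np.argsort(scores)[::-1][:top]

A = np.array([[1, 1, 0, 0], [1, 0, 1, 0], [0, 1, 1, 1]])
print(recommend(A, user=0))  # -> objects 2 and 3, ranked by diffused resource
```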
IT investments can add business value.
Williams, Terry G
2002-05-01
Investment in information technology (IT) is costly, but necessary to enable healthcare organizations to improve their infrastructure and achieve other improvement initiatives. Such an investment is even more costly, however, if the technology does not appropriately enable organizations to perform business processes that help them accomplish their mission of providing safe, high-quality care cost-effectively. Before committing to a costly IT investment, healthcare organizations should implement a decision-making process that can help them choose, implement, and use technology that will provide sustained business value. A seven-step decision-making process that can help healthcare organizations achieve this result involves performing a gap analysis, assessing and aligning organizational goals, establishing distributed accountability, identifying linked organizational-change initiatives, determining measurement methods, establishing appropriate teams to ensure systems are integrated with multidisciplinary improvement methods, and developing a plan to accelerate adoption of the IT product.
NASA Astrophysics Data System (ADS)
Song, Yanpo; Peng, Xiaoqi; Tang, Ying; Hu, Zhikun
2013-07-01
To improve the operation level of the copper converter, an approach to optimal decision-making modeling for the copper matte converting process based on data mining is studied. In view of the characteristics of the process data, such as noise contamination and small sample size, a new robust improved ANN (artificial neural network) modeling method is proposed; taking into account the application purpose of the decision-making model, three new evaluation indexes named support, confidence and relative confidence are proposed; and using real production data and the methods mentioned above, an optimal decision-making model for the blowing time of the S1 period (the first slag-producing period) is developed. Simulation results show that this model can significantly improve the converting quality of the S1 period, increasing the optimal probability from about 70% to about 85%.
NECAP 4.1: NASA's energy-cost analysis program user's manual
NASA Technical Reports Server (NTRS)
Jensen, R. N.; Henninger, R. H.; Miner, D. L.
1983-01-01
The Energy Cost Analysis Program (NECAP) is a powerful computerized method to determine and to minimize building energy consumption. The program calculates hourly heat gains or losses, taking into account the building's thermal resistance and mass, using hourly weather data and a "response factor" method. Internal temperatures are allowed to vary in accordance with thermostat settings and equipment capacity. A simplified input procedure and numerous other technical improvements are presented. This user's manual describes the program and provides examples.
Levin-Rector, Alison; Wilson, Elisha L; Fine, Annie D; Greene, Sharon K
2015-02-01
Since the early 2000s, the Bureau of Communicable Disease of the New York City Department of Health and Mental Hygiene has analyzed reportable infectious disease data weekly by using the historical limits method to detect unusual clusters that could represent outbreaks. This method typically produced too many signals for each to be investigated with available resources while possibly failing to signal during true disease outbreaks. We made method refinements that improved the consistency of case inclusion criteria and accounted for data lags and trends and aberrations in historical data. During a 12-week period in 2013, we prospectively assessed these refinements using actual surveillance data. The refined method yielded 74 signals, a 45% decrease from what the original method would have produced. Fewer and less biased signals included a true citywide increase in legionellosis and a localized campylobacteriosis cluster subsequently linked to live-poultry markets. Future evaluations using simulated data could complement this descriptive assessment.
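A simplified version of the historical limits comparison reads as follows; the bureau's refinements (adjusting for reporting lags, trends, and aberrant baselines) are not reproduced, and the 15-count baseline layout is the conventional one of five comparable periods across three prior years.

```python
import numpy as np

def historical_limits_signal(current, historical):
    """Signal when the current count exceeds the historical mean + 2 SD."""
    hist = np.asarray(historical, float)   # e.g., 15 baseline counts:
    mean, sd = hist.mean(), hist.std(ddof=1)  # 5 periods x 3 prior years
    return current > mean + 2.0 * sd, round(mean, 2), round(sd, 2)

print(historical_limits_signal(19, [6, 8, 7, 9, 5, 7, 8, 6, 10, 7, 6, 8, 9, 7, 8]))
```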
Coherence penalty functional: A simple method for adding decoherence in Ehrenfest dynamics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Akimov, Alexey V.; Long, Run
2014-05-21
We present a new semiclassical approach for description of decoherence in electronically non-adiabatic molecular dynamics. The method is formulated on the grounds of the Ehrenfest dynamics and the Meyer-Miller-Thoss-Stock mapping of the time-dependent Schrödinger equation onto a fully classical Hamiltonian representation. We introduce a coherence penalty functional (CPF) that accounts for decoherence effects by randomizing the wavefunction phase and penalizing development of coherences in regions of strong non-adiabatic coupling. The performance of the method is demonstrated with several model and realistic systems. Compared to other semiclassical methods tested, the CPF method eliminates artificial interference and improves agreement with the fully quantum calculations on the models. When applied to study electron transfer dynamics in the nanoscale systems, the method shows an improved accuracy of the predicted time scales. The simplicity and high computational efficiency of the CPF approach make it a perfect practical candidate for applications in realistic systems.
Using accountability to improve reproductive health care.
George, Asha
2003-05-01
Accountability is best understood as a referee of the dynamics in two-way relationships, often between unequal partners. The literature on accountability distinguishes between political, fiscal, administrative, legal and constitutional accountability. This paper focuses on accountability mechanisms in health care and how they mediate between service providers and communities and between different kinds of health personnel at the primary health care level. It refers to case studies of participatory processes for improving sexual and reproductive health service delivery. Information, dialogue and negotiation are important elements that enable accountability mechanisms to address problems by supporting change and engagement between participants. In order to succeed, however, efforts towards better accountability that broaden the participation of users must take into account the social contexts and the policy and service delivery systems in which they are applied, address power relations and improve the representation of marginalised groups within communities and service delivery systems.
26 CFR 1.818-2 - Accounting provisions.
Code of Federal Regulations, 2010 CFR
2010-04-01
§ 1.818-2 Accounting provisions. (a) Method of accounting. (1... accounting. Thus, the over-all method of accounting for life insurance companies shall be the accrual method...
NASA Astrophysics Data System (ADS)
Tamboli, Prakash Kumar; Duttagupta, Siddhartha P.; Roy, Kallol
2015-08-01
The paper deals with dynamic compensation of delayed Self-Powered Flux Detectors (SPFDs) using a discrete-time H∞ filtering method to improve the response of SPFDs with significant delayed components, such as platinum and vanadium SPFDs. We also present a comparative study between Linear Matrix Inequality (LMI) based H∞ filtering and Algebraic Riccati Equation (ARE) based Kalman filtering with respect to their delay compensation capabilities. Finally, an improved recursive H∞ filter based on the adaptive fading memory technique is proposed, which provides improved performance over existing methods. Existing delay compensation algorithms do not account for the rate of change in the signal when determining the filter gain and therefore add significant noise during the delay compensation process. The proposed adaptive fading memory H∞ filter minimizes the overall noise very effectively while keeping the response time to a minimum. The recursive algorithm is easy to implement in real time compared with the LMI (or ARE) based solutions.
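The fading-memory idea can be illustrated with a one-dimensional toy filter in which the prediction covariance is inflated by a factor lam >= 1 that adapts to the innovation, so the filter forgets old data faster when the signal changes quickly. All tuning values are invented, and this is not the paper's H∞ recursion.

```python
def fading_memory_filter(z_seq, q=1e-4, r=1e-2):
    """Toy scalar filter with an innovation-adapted fading factor."""
    x, p, lam = 0.0, 1.0, 1.0
    out = []
    for z in z_seq:
        p = lam * p + q                    # inflated prediction covariance
        innov = z - x
        lam = min(1.0 + abs(innov), 5.0)   # crude adaptation of the fading factor
        k = p / (p + r)
        x, p = x + k * innov, (1.0 - k) * p
        out.append(x)
    return out

print(fading_memory_filter([0.0, 0.1, 0.5, 1.0, 1.0, 1.0])[-1])
```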
Iglesias, Juan Eugenio; Sabuncu, Mert Rory; Van Leemput, Koen
2013-10-01
Many segmentation algorithms in medical image analysis use Bayesian modeling to augment local image appearance with prior anatomical knowledge. Such methods often contain a large number of free parameters that are first estimated and then kept fixed during the actual segmentation process. However, a faithful Bayesian analysis would marginalize over such parameters, accounting for their uncertainty by considering all possible values they may take. Here we propose to incorporate this uncertainty into Bayesian segmentation methods in order to improve the inference process. In particular, we approximate the required marginalization over model parameters using computationally efficient Markov chain Monte Carlo techniques. We illustrate the proposed approach using a recently developed Bayesian method for the segmentation of hippocampal subfields in brain MRI scans, showing a significant improvement in an Alzheimer's disease classification task. As an additional benefit, the technique also allows one to compute informative "error bars" on the volume estimates of individual structures. Copyright © 2013 Elsevier B.V. All rights reserved.
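Generically, marginalizing over model parameters theta with Metropolis-Hastings rather than fixing a point estimate looks like the sketch below, where `log_post` and `seg_prob` are placeholders for a segmentation model's parameter posterior and per-voxel label probabilities (the paper applies this to a hippocampal-subfield model).

```python
import numpy as np

def mh_marginal_segmentation(log_post, seg_prob, theta0, n_samples=500,
                             step=0.1, rng=np.random.default_rng(0)):
    """Average segmentation posteriors over MH samples of theta."""
    theta = np.asarray(theta0, float)
    lp, acc = log_post(theta), None
    for _ in range(n_samples):
        prop = theta + step * rng.normal(size=theta.shape)
        lp_prop = log_post(prop)
        if np.log(rng.random()) < lp_prop - lp:     # MH accept/reject
            theta, lp = prop, lp_prop
        s = seg_prob(theta)                         # label probs for this theta
        acc = s if acc is None else acc + s
    return acc / n_samples                          # Monte Carlo marginalization

demo = mh_marginal_segmentation(
    log_post=lambda th: -0.5 * float(th @ th),      # dummy Gaussian posterior
    seg_prob=lambda th: 1.0 / (1.0 + np.exp(-th)),  # dummy per-voxel probability
    theta0=np.zeros(3),
)
print(demo.round(3))
```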
NASA Astrophysics Data System (ADS)
Hirata, Masafumi; Yamamoto, Tatsuo; Yasui, Toshiaki; Hayashi, Mayu; Takebe, Atsuji; Funahashi, Masashi
On construction sites, the light oil used by construction vehicles such as dump trucks accounts for 70 percent of energy use. Eco-driving education for construction vehicle operators is therefore effective for improving fuel economy and reducing CO2 emissions; it can be carried out cheaply and easily, and a high effect can be expected. However, to sustain the educational effect over the long term, the eco-driving performance of construction vehicles must be evaluated accurately. In this paper, a method for evaluating the fuel economy improvement effect is examined using the vehicle speed and engine rotational speed of a dump truck. In this method, an "ideal eco-driving model" is constructed that accounts for differences between vehicle models and running conditions (traffic congestion, etc.). As a result, the fuel consumption improvement effect of a dump truck can be evaluated with a common index.
Leveraging Human and Fiscal Resources for School Improvement.
ERIC Educational Resources Information Center
Kelley, Carolyn
1999-01-01
Critiques "Handbook" chapters on achieving educational accountability and managing resources for school improvement. Accountability and finance research have shifted focus from resource allocation to organizational results. Principal-agent, expectancy, and risk theories provide lenses for viewing accountability. Research on how these…
Quality, patient safety, and professional values.
Skarda, David; Barnhart, Doug
2015-12-01
From the time of Ernest Codman until recently, measuring and improving quality has variably been viewed as a supportive group within the hospital or as an irritating "fringe" movement in health care. A more thoughtful view of quality improvement (QI) is that it is a central tenet of surgical professionalism, and really what we signed up for when we accepted the responsibility of healing patients using surgery as our methodology. The following article uses a patient safety event to highlight the successful use of a well-known method of improving care, while engaging trainees in the principles of physician engagement, accountability, and professionalism. Copyright © 2015 Elsevier Inc. All rights reserved.
An improved target velocity sampling algorithm for free gas elastic scattering
DOE Office of Scientific and Technical Information (OSTI.GOV)
Romano, Paul K.; Walsh, Jonathan A.
We present an improved algorithm for sampling the target velocity when simulating elastic scattering in a Monte Carlo neutron transport code that correctly accounts for the energy dependence of the scattering cross section. The algorithm samples the relative velocity directly, thereby avoiding a potentially inefficient rejection step based on the ratio of cross sections. Here, we have shown that this algorithm requires only one rejection step, whereas other methods of similar accuracy require two rejection steps. The method was verified against stochastic and deterministic reference results for upscattering percentages in 238U. Simulations of a light water reactor pin cell problem demonstrate that using this algorithm results in a 3% or less penalty in performance when compared with an approximate method that is used in most production Monte Carlo codes.
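For contrast with the improvement described, here is the conventional constant-cross-section free-gas target sampler with its single kinematic rejection v_rel/(v_n + V); energy-dependent methods of comparable accuracy add a second rejection on a cross-section ratio, which the paper's direct relative-velocity sampling avoids. The sketch below is the standard textbook scheme, not the authors' algorithm.

```python
import numpy as np

def sample_target_speed(v_n, beta, rng=np.random.default_rng(1)):
    """Sample target speed V and cosine mu; beta = sqrt(A*m_n / (2*k*T))."""
    y = beta * v_n
    while True:
        if rng.random() < np.sqrt(np.pi) * y / (np.sqrt(np.pi) * y + 2.0):
            x = np.sqrt(rng.gamma(1.5))   # x distributed as x^2 * exp(-x^2)
        else:
            x = np.sqrt(rng.gamma(2.0))   # x distributed as x^3 * exp(-x^2)
        V = x / beta
        mu = 2.0 * rng.random() - 1.0
        v_rel = np.sqrt(v_n**2 + V**2 - 2.0 * v_n * V * mu)
        if rng.random() < v_rel / (v_n + V):  # single kinematic rejection
            return V, mu

print(sample_target_speed(2200.0, beta=1.0 / 2200.0))
```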
Image Reconstruction for a Partially Collimated Whole Body PET Scanner
Alessio, Adam M.; Schmitz, Ruth E.; MacDonald, Lawrence R.; Wollenweber, Scott D.; Stearns, Charles W.; Ross, Steven G.; Ganin, Alex; Lewellen, Thomas K.; Kinahan, Paul E.
2008-01-01
Partially collimated PET systems have less collimation than conventional 2-D systems and have been shown to offer count rate improvements over 2-D and 3-D systems. Despite this potential, previous efforts have not established image-based improvements with partial collimation and have not customized the reconstruction method for partially collimated data. This work presents an image reconstruction method tailored for partially collimated data. Simulated and measured sensitivity patterns are presented and provide a basis for modification of a fully 3-D reconstruction technique. The proposed method uses a measured normalization correction term to account for the unique sensitivity to true events. This work also proposes a modified scatter correction based on simulated data. Measured image quality data supports the use of the normalization correction term for true events, and suggests that the modified scatter correction is unnecessary. PMID:19096731
Single image super-resolution reconstruction algorithm based on edge selection
NASA Astrophysics Data System (ADS)
Zhang, Yaolan; Liu, Yijun
2017-05-01
Super-resolution (SR) has become more important because it can generate high-quality high-resolution (HR) images from low-resolution (LR) input images. At present, much work concentrates on developing sophisticated image priors to improve image quality, while paying much less attention to estimating and incorporating the blur model, which can also impact the reconstruction results. We present a new reconstruction method based on edge selection. This method takes full account of the factors that affect blur kernel estimation and accurately estimates the blur process. When compared with state-of-the-art methods, our method has comparable performance.
Dynamic Stark broadening as the Dicke narrowing effect
NASA Astrophysics Data System (ADS)
Calisti, A.; Mossé, C.; Ferri, S.; Talin, B.; Rosmej, F.; Bureyeva, L. A.; Lisitsa, V. S.
2010-01-01
A very fast method to account for charged-particle dynamics effects in calculations of spectral line shapes emitted by plasmas is presented. This method is based on a formulation of the frequency fluctuation model (FFM), which provides an expression for the dynamic line shape as a functional of the static distribution of frequencies. Thus, the main numerical work rests on the calculation of the quasistatic Stark profile. This method for taking into account ion dynamics allows a very fast and accurate calculation of the Stark broadening of atomic hydrogen high-n series emission lines, and it is not limited to hydrogen spectra. Results for helium-β and Lyman-α lines emitted by argon under microballoon implosion experiment conditions, compared with experimental data and simulation results, are also presented. The present approach reduces the computer time by more than 2 orders of magnitude compared with the original FFM while improving the calculation precision, and it opens broad possibilities for its application in spectral line-shape codes.
An improved predictive functional control method with application to PMSM systems
NASA Astrophysics Data System (ADS)
Li, Shihua; Liu, Huixian; Fu, Wenshu
2017-01-01
In the common design of prediction-model-based control methods, disturbances are usually considered neither in the prediction model nor in the control design. For control systems subject to large-amplitude or strong disturbances, it is difficult to precisely predict the future outputs with a conventional prediction model, and the desired optimal closed-loop performance is degraded to some extent. To this end, an improved predictive functional control (PFC) method is developed in this paper by embedding disturbance information into the system model. A composite prediction model is obtained by embedding the estimated value of the disturbances, where a disturbance observer (DOB) is employed to estimate the lumped disturbances, so the influence of disturbances on the system is taken into account in the optimisation procedure. Finally, considering the speed control problem for the permanent magnet synchronous motor (PMSM) servo system, a control scheme based on the improved PFC method is designed to ensure an optimal closed-loop performance even in the presence of disturbances. Simulation and experimental results based on a hardware platform are provided to confirm the effectiveness of the proposed algorithm.
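The disturbance-embedding idea can be shown on a toy first-order plant: an observer tracks the lumped disturbance from the prediction mismatch, and the one-step control is solved from the composite model. All gains and the plant are invented; the paper's PFC uses basis functions and a PMSM model.

```python
# Toy plant: x+ = a*x + b*u + d, with an unknown constant disturbance d.
a, b, gain = 0.9, 0.5, 0.7
x, d_hat, x_pred, ref, d_true = 0.0, 0.0, 0.0, 1.0, 0.3

for k in range(30):
    d_hat += gain * (x - x_pred)      # observer: correct d from model mismatch
    u = (ref - a * x - d_hat) / b     # composite-model one-step control law
    x_pred = a * x + b * u + d_hat    # prediction with disturbance embedded
    x = a * x + b * u + d_true        # true plant step
print(round(x, 3))  # converges to ref despite the unmodeled disturbance
```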
Lee, Hwa-Young; Yang, Bong-Ming; Kang, Minah
2016-01-01
Despite continued global efforts, HIV/AIDS outcomes in developing countries have not made much progress. Poor governance in recipient countries is often seen as one of the reasons aid efforts fail to achieve stated objectives and desired outcomes. This study examines the impact of two important dimensions of governance - control of corruption and democratic accountability - on the effectiveness of HIV/AIDS official development assistance. An empirical analysis using dynamic panel Generalized Method of Moments estimation was conducted on 2001-2010 datasets. Control of corruption showed an independent effect on the incidence of HIV/AIDS, and democratic accountability an interaction with the amount of HIV/AIDS aid, while neither governance variable had a significant effect on HIV/AIDS prevalence. Specifically, in countries with an accountability level below -2.269, aid has a detrimental effect on the incidence of HIV/AIDS. The study findings suggest that aid programs need to be preceded, or at least accompanied, by serious efforts to improve governance in recipient countries, and that democratic accountability ought to receive more critical attention.
Methods to Improve the Maintenance of the Earth Catalog of Satellites During Severe Solar Storms
NASA Technical Reports Server (NTRS)
Wilkin, Paul G.; Tolson, Robert H.
1998-01-01
The objective of this thesis is to investigate methods to improve the ability to maintain the inventory of orbital elements of Earth satellites during periods of atmospheric disturbance brought on by severe solar activity. Existing techniques do not account for such atmospheric dynamics, resulting in tracking errors of several seconds in predicted crossing time. Two techniques are examined to reduce these tracking errors. First, density predicted from various atmospheric models is fit to the orbital decay rate for a number of satellites. An orbital decay model is then developed that could be used to reduce tracking errors by accounting for atmospheric changes. The second approach utilizes a Kalman filter to estimate the orbital decay rate of a satellite after every observation; the new information is used to predict the next observation. Results from the first approach demonstrated the feasibility of building an orbital decay model based on predicted atmospheric density, with correlation of atmospheric density to orbital decay as high as 0.88. However, it is clear that contemporary atmospheric models need further improvement in modeling density perturbations in the polar region brought on by solar activity. The second approach resulted in a dramatic reduction in tracking errors for certain satellites during severe solar storms: in the limited cases studied, the reduction in tracking errors ranged from 25 to 79 percent.
Nuclear techniques for the on-line bulk analysis of carbon in coal-fired power stations.
Sowerby, B D
2009-09-01
Carbon trading schemes usually require large emitters of CO2, such as coal-fired power stations, to monitor, report and be audited on their CO2 emissions. The emission price provides a significant additional incentive for power stations to improve efficiency. In the present paper, previous work on the bulk determination of carbon in coal is reviewed and assessed. The most favourable method is that based on neutron inelastic scattering. The potential role of on-line carbon analysers in improving boiler efficiency and in carbon accounting is discussed.
ERIC Educational Resources Information Center
Krupa, Erin Elizabeth
2011-01-01
In this era of high-stakes testing and accountability, curricula are viewed as catalysts to improve high school students' mathematics performances and a critical question is whether single subject or integrated curricula produce stronger student outcomes. This study was designed to investigate the effects of an integrated reform-based curriculum,…
Missing Measures of the Who and Why of School Dropouts: Implications for Policy and Research.
ERIC Educational Resources Information Center
Bloch, Deborah Perlmutter
1991-01-01
Presents five goals of a policy and research agenda for dropout prevention: (1) to develop commonly accepted definition of dropout; (2) to improve methods of pupil accounting; (3) to separate causes and identifiers of at-risk behavior; (4) to analyze relevance of questions asked about at-risk youth; and (5) to engage in institutional…
Single-cell analysis of population context advances RNAi screening at multiple levels
Snijder, Berend; Sacher, Raphael; Rämö, Pauli; Liberali, Prisca; Mench, Karin; Wolfrum, Nina; Burleigh, Laura; Scott, Cameron C; Verheije, Monique H; Mercer, Jason; Moese, Stefan; Heger, Thomas; Theusner, Kristina; Jurgeit, Andreas; Lamparter, David; Balistreri, Giuseppe; Schelhaas, Mario; De Haan, Cornelis A M; Marjomäki, Varpu; Hyypiä, Timo; Rottier, Peter J M; Sodeik, Beate; Marsh, Mark; Gruenberg, Jean; Amara, Ali; Greber, Urs; Helenius, Ari; Pelkmans, Lucas
2012-01-01
Isogenic cells in culture show strong variability, which arises from dynamic adaptations to the microenvironment of individual cells. Here we study the influence of the cell population context, which determines a single cell's microenvironment, in image-based RNAi screens. We developed a comprehensive computational approach that employs Bayesian and multivariate methods at the single-cell level. We applied these methods to 45 RNA interference screens of various sizes, including 7 druggable genome and 2 genome-wide screens, analysing 17 different mammalian virus infections and four related cell physiological processes. Analysing cell-based screens at this depth reveals widespread RNAi-induced changes in the population context of individual cells leading to indirect RNAi effects, as well as perturbations of cell-to-cell variability regulators. We find that accounting for indirect effects improves the consistency between siRNAs targeted against the same gene, and between replicate RNAi screens performed in different cell lines, in different labs, and with different siRNA libraries. In an era where large-scale RNAi screens are increasingly performed to reach a systems-level understanding of cellular processes, we show that this is often improved by analyses that account for and incorporate the single-cell microenvironment. PMID:22531119
Yu, Hao; Solvang, Wei Deng
2016-01-01
Hazardous waste location-routing problems are of importance due to the potential risk for nearby residents and the environment. In this paper, an improved mathematical formulation is developed based upon a multi-objective mixed integer programming approach. The model aims at assisting decision makers in selecting locations for different facilities including treatment plants, recycling plants and disposal sites, providing appropriate technologies for hazardous waste treatment, and routing transportation. In the model, two critical factors are taken into account: system operating costs and risk imposed on local residents, and a compensation factor is introduced to the risk objective function in order to account for the fact that the risk level imposed by one type of hazardous waste or treatment technology may significantly vary from that of other types. Besides, the policy instruments for promoting waste recycling are considered, and their influence on the costs and risk of hazardous waste management is also discussed. The model is coded and calculated in Lingo optimization solver, and the augmented ε-constraint method is employed to generate the Pareto optimal curve of the multi-objective optimization problem. The trade-off between different objectives is illustrated in the numerical experiment. PMID:27258293
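A sketch of the augmented ε-constraint scheme on a toy bi-objective LP (cost vs. risk) is given below, standing in for the paper's mixed-integer model solved in Lingo; the slack term keeps each ε-constrained solution efficient, and all coefficients are invented.

```python
import numpy as np
from scipy.optimize import linprog

# Toy problem: choose x1, x2 >= 0 with x1 + x2 >= 1, trading off
# cost = 3*x1 + x2 against risk = x1 + 4*x2. Sweep eps on risk.
cost, risk, delta = np.array([3.0, 1.0]), np.array([1.0, 4.0]), 1e-3

for eps in np.linspace(4.0, 1.0, 4):
    res = linprog(
        c=np.append(cost, -delta),                 # augmented objective
        A_eq=[np.append(risk, 1.0)], b_eq=[eps],   # risk + slack = eps
        A_ub=[[-1.0, -1.0, 0.0]], b_ub=[-1.0],     # demand: x1 + x2 >= 1
        bounds=[(0, None)] * 3,
    )
    x = res.x[:2]
    print(round(eps, 2), x.round(3), round(cost @ x, 3), round(risk @ x, 3))
```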
Cortical thickness measurement from magnetic resonance images using partial volume estimation
NASA Astrophysics Data System (ADS)
Zuluaga, Maria A.; Acosta, Oscar; Bourgeat, Pierrick; Hernández Hoyos, Marcela; Salvado, Olivier; Ourselin, Sébastien
2008-03-01
Measurement of the cortical thickness from 3D Magnetic Resonance Imaging (MRI) can aid diagnosis and longitudinal studies of a wide range of neurodegenerative diseases. We estimate the cortical thickness using a Laplacian approach whereby equipotentials analogous to layers of tissue are computed. The thickness is then obtained using an Eulerian approach in which partial differential equations (PDE) are solved, avoiding the explicit tracing of trajectories along the streamline gradients. This method has the advantages of being relatively fast and of ensuring unique correspondence points between the inner and outer boundaries of the cortex. The original method is challenged when the thickness of the cortex is of the same order of magnitude as the image resolution, since the partial volume (PV) effect is not taken into account at the gray matter (GM) boundaries. We propose a novel way to take PV into account that substantially improves accuracy and robustness. We model PV by computing a mixture of pure Gaussian probability distributions and use this estimate to initialize the cortical thickness estimation. In experiments on synthetic phantoms, the errors were divided by three, and reproducibility was improved when the same patients were scanned three consecutive times.
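The Laplacian step can be sketched on a 2-D grid: fix the potential to 0 on the inner (WM) boundary and 1 on the outer (CSF) boundary and relax inside gray matter; thickness then follows from the Eulerian PDE pair integrated along the normalized gradient of the potential, which is not shown here. The masks and iteration count are toy assumptions.

```python
import numpy as np

def laplace_potential(gm, wm, csf, n_iter=500):
    """Boolean masks of equal shape; returns the equipotential field phi."""
    phi = np.where(csf, 1.0, 0.0)
    for _ in range(n_iter):
        nb = (np.roll(phi, 1, 0) + np.roll(phi, -1, 0) +
              np.roll(phi, 1, 1) + np.roll(phi, -1, 1)) / 4.0
        phi = np.where(gm, nb, phi)        # Jacobi update inside GM only
        phi[wm], phi[csf] = 0.0, 1.0       # re-impose boundary conditions
    return phi

wm = np.zeros((5, 9), bool); wm[:, :2] = True    # toy inner boundary
csf = np.zeros((5, 9), bool); csf[:, -2:] = True  # toy outer boundary
gm = ~wm & ~csf
print(laplace_potential(gm, wm, csf)[2].round(2))  # monotone ramp across GM
```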
EMAS statement: benign accountability or wishful thinking? Insights from the Greek EMAS registry.
Skouloudis, Antonis; Jones, Keith; Sfakianaki, Eleni; Lazoudi, Eugenia; Evangelinos, Konstantinos
2013-10-15
Do organizations certified under the Eco-Management and Audit Scheme (EMAS) effectively discharge their environmental accountability through their statements? Is the EMAS statement a step forward for the transparency of environmental management and the empowerment of organizational stakeholders' decision-making? Drawing from the Greek EMAS registry we apply an evaluation method for the completeness and materiality of environmental statements. While the latest version of the EMAS Regulation has introduced a set of forward-looking - yet challenging - improvements, the application of the standard should be closely examined. With this in mind, the key objective of this research note is to provide - from a descriptive standpoint - insights on the content of EMAS-based environmental accountability and a basis for future research as well as fruitful policy debate. Copyright © 2013 Elsevier Ltd. All rights reserved.
4 CFR 28.41 - Explanation, scope and methods.
Code of Federal Regulations, 2010 CFR
2010-01-01
§ 28.41 Explanation, scope and methods. (a) Explanation. Discovery is...
26 CFR 1.446-1 - General rule for methods of accounting.
Code of Federal Regulations, 2011 CFR
2011-04-01
... books. For requirement respecting the adoption or change of accounting method, see section 446(e) and... taxpayer to adopt or change to a method of accounting permitted by this chapter although the method is not..., which require the prior approval of the Commissioner in the case of changes in accounting method. (iii...
Myers, Mary; Parchen, Debra; Geraci, Marilla; Brenholtz, Roger; Knisely-Carrigan, Denise; Hastings, Clare
2013-10-01
Sustaining change in the behaviors and habits of experienced practicing nurses can be frustrating and daunting, even when changes are based on evidence. Partnering with an active shared governance structure to communicate change and elicit feedback is an established method to foster partnership, equity, accountability, and ownership. Few recent exemplars in the literature link shared governance, change management, and evidence-based practice to transitions in care models. This article describes an innovative staff-driven approach used by nurses in a shared governance performance improvement committee to use evidence-based practice in determining the best methods to evaluate the implementation of a new model of care.
Deformable image registration with local rigidity constraints for cone-beam CT-guided spine surgery
NASA Astrophysics Data System (ADS)
Reaungamornrat, S.; Wang, A. S.; Uneri, A.; Otake, Y.; Khanna, A. J.; Siewerdsen, J. H.
2014-07-01
Image-guided spine surgery (IGSS) is associated with reduced co-morbidity and improved surgical outcome. However, precise localization of target anatomy and adjacent nerves and vessels relative to planning information (e.g., device trajectories) can be challenged by anatomical deformation. Rigid registration alone fails to account for deformation associated with changes in spine curvature, and conventional deformable registration fails to account for rigidity of the vertebrae, causing unrealistic distortions in the registered image that can confound high-precision surgery. We developed and evaluated a deformable registration method capable of preserving rigidity of bones while resolving the deformation of surrounding soft tissue. The method aligns preoperative CT to intraoperative cone-beam CT (CBCT) using free-form deformation (FFD) with constraints on rigid body motion imposed according to a simple intensity threshold of bone intensities. The constraints enforced three properties of a rigid transformation, namely constraints on affinity (AC), orthogonality (OC), and properness (PC). The method also incorporated an injectivity constraint (IC) to preserve topology. Physical experiments involving phantoms, an ovine spine, and a human cadaver as well as digital simulations were performed to evaluate the sensitivity to registration parameters, preservation of rigid body morphology, and overall registration accuracy of constrained FFD in comparison to conventional unconstrained FFD (uFFD) and Demons registration. FFD with orthogonality and injectivity constraints (denoted FFD+OC+IC) demonstrated improved performance compared to uFFD and Demons. Affinity and properness constraints offered little or no additional improvement. The FFD+OC+IC method preserved rigid body morphology at near-ideal values of zero dilatation (D = 0.05, compared to 0.39 and 0.56 for uFFD and Demons, respectively) and shear (S = 0.08, compared to 0.36 and 0.44 for uFFD and Demons, respectively). Target registration error (TRE) was similarly improved for FFD+OC+IC (0.7 mm), compared to 1.4 and 1.8 mm for uFFD and Demons. Results were validated in human cadaver studies using CT and CBCT images, with FFD+OC+IC providing excellent preservation of rigid morphology and equivalent or improved TRE. The approach therefore overcomes distortions intrinsic to uFFD and could better facilitate high-precision IGSS.
A Multi-Channel Method for Detecting Periodic Forced Oscillations in Power Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Follum, James D.; Tuffner, Francis K.
2016-11-14
Forced oscillations in electric power systems are often symptomatic of equipment malfunction or improper operation. Detecting and addressing the cause of the oscillations can improve overall system operation. In this paper, a multi-channel method of detecting forced oscillations and estimating their frequencies is proposed. The method operates by comparing the sum of scaled periodograms from various channels to a threshold. A method of setting the threshold to specify the detector's probability of false alarm while accounting for the correlation between channels is also presented. Results from simulated and measured power system data indicate that the method outperforms its single-channel counterpart and is suitable for real-world applications.
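As a rough illustration of the detector structure described above, the sketch below sums per-channel scaled periodograms and compares the result to a threshold chosen for a target false-alarm probability. The scaling by mean power and the chi-square threshold are simplifying assumptions, not the authors' exact statistic or threshold derivation.

```python
# Multi-channel periodogram-sum detector sketch (illustrative only).
import numpy as np
from scipy.signal import periodogram
from scipy.stats import chi2

def detect_forced_oscillation(X, fs, p_fa=1e-4):
    """X: (n_channels, n_samples) array of synchrophasor measurements."""
    stat = 0.0
    for x in X:
        f, pxx = periodogram(x - x.mean(), fs=fs, window="hann")
        stat = stat + pxx / pxx.mean()   # scale each channel by its mean power
    # Under a rough white-noise null, each scaled periodogram bin behaves
    # like chi2_2 / 2, so the sum over N channels is roughly chi2_{2N} / 2.
    thresh = chi2.ppf(1.0 - p_fa, df=2 * X.shape[0]) / 2.0
    return f[stat > thresh], stat, thresh   # candidate oscillation frequencies
```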
On Bayesian Testing of Additive Conjoint Measurement Axioms Using Synthetic Likelihood.
Karabatsos, George
2018-06-01
This article introduces a Bayesian method for testing the axioms of additive conjoint measurement. The method is based on an importance sampling algorithm that performs likelihood-free, approximate Bayesian inference using a synthetic likelihood to overcome the analytical intractability of this testing problem. This new method improves upon previous methods because it provides an omnibus test of the entire hierarchy of cancellation axioms, beyond double cancellation. It does so while accounting for the posterior uncertainty that is inherent in the empirical orderings that are implied by these axioms, together. The new method is illustrated through a test of the cancellation axioms on a classic survey data set, and through the analysis of simulated data.
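For readers unfamiliar with synthetic likelihood, the core computation is compact: simulate summary statistics under a candidate parameter, fit a Gaussian to them, and evaluate the observed summaries under that Gaussian. The sketch below is generic; the simulator, summaries, and sample sizes are placeholder assumptions, not the article's axiom-testing machinery.

```python
# Generic synthetic log-likelihood evaluation (illustrative sketch).
import numpy as np
from scipy.stats import multivariate_normal

def synthetic_loglik(theta, s_obs, simulate, n_sim=200, rng=None):
    """simulate(theta, rng) -> vector of summary statistics."""
    rng = rng or np.random.default_rng()
    S = np.array([simulate(theta, rng) for _ in range(n_sim)])  # (n_sim, d)
    mu, cov = S.mean(axis=0), np.cov(S, rowvar=False)
    # Gaussian approximation to the distribution of the summaries.
    return multivariate_normal.logpdf(s_obs, mean=mu, cov=cov)
```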
D'Amours, Michel; Pouliot, Jean; Dagnault, Anne; Verhaegen, Frank; Beaulieu, Luc
2011-12-01
Brachytherapy planning software relies on the Task Group report 43 dosimetry formalism. This formalism, based on a water approximation, neglects various heterogeneous materials present during treatment. Various studies have suggested that these heterogeneities should be taken into account to improve the treatment quality. The present study sought to demonstrate the feasibility of incorporating Monte Carlo (MC) dosimetry within an inverse planning algorithm to improve the dose conformity and increase the treatment quality. The method was based on precalculated dose kernels in full patient geometries, representing the dose distribution of a brachytherapy source at a single dwell position using MC simulations and the Geant4 toolkit. These dose kernels are used by the inverse planning by simulated annealing tool to produce a fast MC-based plan. A test was performed for an interstitial brachytherapy breast treatment using two different high-dose-rate brachytherapy sources: the microSelectron iridium-192 source and the electronic brachytherapy source Axxent operating at 50 kVp. A research version of the inverse planning by simulated annealing algorithm was combined with MC to provide a method to fully account for the heterogeneities in dose optimization, using the MC method. The effect of the water approximation was found to depend on photon energy, with greater dose attenuation for the lower energies of the Axxent source compared with iridium-192. For the latter, an underdosage of 5.1% for the dose received by 90% of the clinical target volume was found. A new method to optimize afterloading brachytherapy plans that uses MC dosimetric information was developed. Including computed tomography-based information in MC dosimetry in the inverse planning process was shown to take into account the full range of scatter and heterogeneity conditions. This led to significant dose differences compared with the Task Group report 43 approach for the Axxent source. Copyright © 2011 Elsevier Inc. All rights reserved.
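The precalculated-kernel idea lends itself to a very small sketch: the total dose is a dwell-time-weighted sum of single-dwell-position MC dose kernels, so only the weights change during optimization. Array shapes and values below are illustrative assumptions, not the study's data.

```python
# Dose as a weighted superposition of precomputed MC kernels (sketch).
import numpy as np

rng = np.random.default_rng(0)
kernels = rng.random((10, 64, 64, 64))  # MC dose per unit time, 10 dwell positions
dwell_times = np.full(10, 1.0)          # seconds: the inverse-planning variables
dose = np.tensordot(dwell_times, kernels, axes=1)  # (64, 64, 64) total dose grid
```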
A single-stage flux-corrected transport algorithm for high-order finite-volume methods
Chaplin, Christopher; Colella, Phillip
2017-05-08
We present a new limiter method for solving the advection equation using a high-order, finite-volume discretization. The limiter is based on the flux-corrected transport algorithm. Here, we modify the classical algorithm by introducing a new computation for solution bounds at smooth extrema, as well as improving the preconstraint on the high-order fluxes. We compute the high-order fluxes via a method-of-lines approach with fourth-order Runge-Kutta as the time integrator. For computing low-order fluxes, we select the corner-transport upwind method due to its improved stability over donor-cell upwind. Several spatial differencing schemes are investigated for the high-order flux computation, including centered-difference and upwind schemes. We show that the upwind schemes perform well on account of the dissipation of high-wavenumber components. The new limiter method retains high-order accuracy for smooth solutions and accurately captures fronts in discontinuous solutions. Further, we need only apply the limiter once per complete time step.
Chambert, Thierry A.; Waddle, J. Hardin; Miller, David A.W.; Walls, Susan; Nichols, James D.
2018-01-01
The development and use of automated species-detection technologies, such as acoustic recorders, for monitoring wildlife are rapidly expanding. Automated classification algorithms provide a cost- and time-effective means to process information-rich data, but often at the cost of additional detection errors. Appropriate methods are necessary to analyse such data while dealing with the different types of detection errors. We developed a hierarchical modelling framework for estimating species occupancy from automated species-detection data. We explore design and optimization of data post-processing procedures to account for detection errors and generate accurate estimates. Our proposed method accounts for both imperfect detection and false positive errors and utilizes information about both occurrence and abundance of detections to improve estimation. Using simulations, we show that our method provides much more accurate estimates than models ignoring the abundance of detections. The same findings are reached when we apply the methods to two real datasets on North American frogs surveyed with acoustic recorders. When false positives occur, estimator accuracy can be improved when a subset of detections produced by the classification algorithm is post-validated by a human observer. We use simulations to investigate the relationship between accuracy and effort spent on post-validation, and found that very accurate occupancy estimates can be obtained with as little as 1% of data being validated. Automated monitoring of wildlife provides opportunities and challenges. Our methods for analysing automated species-detection data help to meet key challenges unique to these data and will prove useful for many wildlife monitoring programs.
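The data-generating process such a model must untangle can be sketched in a few lines: occupied sites yield true detections at one rate, while all sites accumulate false positives at another, so raw counts alone confound occupancy with classifier error. Rates and sizes below are illustrative assumptions, not the fitted model.

```python
# Simulated acoustic-detection counts with false positives (sketch).
import numpy as np

rng = np.random.default_rng(2)
n_sites, psi = 100, 0.6                  # number of sites, occupancy probability
z = rng.random(n_sites) < psi            # latent occupancy state per site
# 5 true detections per visit at occupied sites, 0.5 false positives everywhere;
# a hierarchical model estimates psi and both rates jointly from such counts.
counts = rng.poisson(lam=5.0 * z + 0.5)
```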
Cappon, Giacomo; Marturano, Francesca; Vettoretti, Martina; Facchinetti, Andrea; Sparacino, Giovanni
2018-05-01
The standard formula (SF) used in bolus calculators (BCs) determines the meal insulin bolus using a "static" measurement of blood glucose concentration (BG) obtained by a self-monitoring of blood glucose (SMBG) fingerprick device. Some methods have been proposed to improve the efficacy of SF using "dynamic" information provided by continuous glucose monitoring (CGM), in particular the glucose rate of change (ROC). This article compares, in silico and in an ideal framework limiting exposure to possibly confounding factors (such as CGM noise), the performance of three popular techniques devised for this purpose, that is, the methods of Buckingham et al (BU), Scheiner (SC), and Pettus and Edelman (PE). Using the UVa/Padova Type 1 diabetes simulator, we generated data for 100 virtual subjects in noise-free, single-meal scenarios having different preprandial BG and ROC values. The meal insulin bolus was computed using SF, BU, SC, and PE. Performance was assessed with the blood glucose risk index (BGRI) over the 9 hours after the meal. On average, BU, SC, and PE improve BGRI compared to SF. When BG is rapidly decreasing, PE obtains the best performance. In the other ROC scenarios, none of the considered methods prevails in all the preprandial BG conditions tested. Our study showed that, at least in the considered ideal framework, none of the methods for correcting SF according to ROC is globally better than the others. Critical analysis of the results also suggests that further investigations are needed to develop more effective formulas to account for ROC information in BCs.
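For orientation, the sketch below combines a generic bolus-calculator standard formula with a Scheiner-style percentage adjustment for the glucose rate of change. The breakpoints and percentages are illustrative assumptions, not the exact published rules of any of the three compared methods.

```python
# Standard formula plus a rate-of-change adjustment (illustrative sketch).
def meal_bolus(cho_g, bg, icr, cf, bg_target, roc=0.0):
    """cho_g: meal carbs [g]; bg: glucose [mg/dl]; icr: insulin-to-carb
    ratio [g/U]; cf: correction factor [mg/dl per U]; roc: glucose rate
    of change [mg/dl/min]."""
    bolus = cho_g / icr + (bg - bg_target) / cf   # standard formula (SF)
    if roc >= 2.0:        # rising fast: increase the bolus
        bolus *= 1.20
    elif roc >= 1.0:
        bolus *= 1.10
    elif roc <= -2.0:     # falling fast: decrease the bolus
        bolus *= 0.80
    elif roc <= -1.0:
        bolus *= 0.90
    return max(bolus, 0.0)

print(meal_bolus(cho_g=60, bg=180, icr=10, cf=50, bg_target=110, roc=1.5))
```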
An improved method for bivariate meta-analysis when within-study correlations are unknown.
Hong, Chuan; D Riley, Richard; Chen, Yong
2018-03-01
Multivariate meta-analysis, which jointly analyzes multiple and possibly correlated outcomes in a single analysis, is becoming increasingly popular in recent years. An attractive feature of the multivariate meta-analysis is its ability to account for the dependence between multiple estimates from the same study. However, standard inference procedures for multivariate meta-analysis require the knowledge of within-study correlations, which are usually unavailable. This limits standard inference approaches in practice. Riley et al proposed a working model and an overall synthesis correlation parameter to account for the marginal correlation between outcomes, where the only data needed are those required for a separate univariate random-effects meta-analysis. As within-study correlations are not required, the Riley method is applicable to a wide variety of evidence synthesis situations. However, the standard variance estimator of the Riley method is not entirely correct under many important settings. As a consequence, the coverage of a function of pooled estimates may not reach the nominal level even when the number of studies in the multivariate meta-analysis is large. In this paper, we improve the Riley method by proposing a robust variance estimator, which is asymptotically correct even when the model is misspecified (ie, when the likelihood function is incorrect). Simulation studies of a bivariate meta-analysis, in a variety of settings, show a function of pooled estimates has improved performance when using the proposed robust variance estimator. In terms of individual pooled estimates themselves, the standard variance estimator and robust variance estimator give similar results to the original method, with appropriate coverage. The proposed robust variance estimator performs well when the number of studies is relatively large. Therefore, we recommend the use of the robust method for meta-analyses with a relatively large number of studies (eg, m≥50). When the sample size is relatively small, we recommend the use of the robust method under the working independence assumption. We illustrate the proposed method through 2 meta-analyses. Copyright © 2017 John Wiley & Sons, Ltd.
26 CFR 1.9002-1 - Purpose, applicability, and definitions.
Code of Federal Regulations, 2010 CFR
2010-04-01
... compute, taxable income under an accrual method of accounting, and (2) treated dealer reserve income (or portions thereof) which should have been taken into account (under the accrual method of accounting) for... accounting or who was not required to compute taxable income under the accrual method of accounting. An...
2007-12-01
DOD Schools: Additional Reporting Could Improve Accountability for Academic Achievement of Students with Dyslexia. Highlights of GAO-08-70, a report to the Chairman, Committee on Science and Technology, House of Representatives.
Identifying the location of the OMP separatrix in DIII-D using power accounting
Stangeby, Peter C.; Canik, John M.; Elder, J. D.; ...
2015-08-07
In order to identify reliable scalings for the scrape-off layer (SOL) power width it is necessary to know the location of the separatrix in divertor tokamaks as accurately as possible, specifically its location at the outside midplane (OMP), the standard reference location. Two methods are described which use power accounting to improve the accuracy of identifying the location of the OMP separatrix. The first uses the infrared-measured deposited power profile at the outer target as the primary input (the $P_{\rm SOL}^{\rm exhaust}$ method). The second uses the measured power input to the SOL, obtained by subtracting the power radiated from inside the separatrix from the total heating power (the $P_{\rm SOL}^{\rm input}$ method). These two power-accounting methods are illustrated using 21 H-mode DIII-D discharges. High-spatial-resolution Thomson scattering profiles of ne and Te for the main SOL near the OMP are also used as primary input to the analysis; only between-ELM (edge-localized mode) data are used here. The Thomson profiles are used to calculate the electron parallel conducted heat flux profiles, which are then matched to the measured $P_{\rm SOL}^{\rm exhaust}$ and $P_{\rm SOL}^{\rm input}$ by adjusting the location of the OMP separatrix relative to that of the Thomson data. For these attached discharges, the values of $R_{\rm sep}^{\rm omp}$ given by the two power-accounting methods agree to within ~1 mm of each other and also to within ~1 mm of the values given by the 'standard DIII-D method' described by Porter et al (1998 Phys. Plasmas 5 1410). The shifted $R_{\rm sep}^{\rm omp}$ results in only modest changes to the values of ne and Te at the OMP separatrix relative to the 'standard' values, increasing $n_{\rm e}^{\rm sep}$ by 8% and $T_{\rm e}^{\rm sep}$ by 20%.
NASA Astrophysics Data System (ADS)
Borowik, Piotr; Thobel, Jean-Luc; Adamowicz, Leszek
2017-07-01
Standard computational methods for incorporating the Pauli exclusion principle into Monte Carlo (MC) simulations of electron transport in semiconductors may give unphysical results in the low-field regime, where the obtained electron distribution function takes values exceeding unity. Modified algorithms have been proposed that correctly account for electron scattering on phonons or impurities. The present paper extends this approach and proposes an improved simulation scheme that includes the Pauli exclusion principle for electron-electron (e-e) scattering in MC simulations. The simulations reproduce correct values of the electron distribution function at significantly reduced computational cost. The proposed algorithm is applied to study the transport properties of degenerate electrons in graphene with e-e interactions, which required adapting the treatment of e-e scattering to a linear band dispersion relation; this part of the simulation algorithm is described in detail.
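The standard way to enforce the Pauli exclusion principle in ensemble MC transport is a rejection step: a scattering event into final state k' is accepted with probability 1 - f(k'), where f is the current estimate of the distribution function. A minimal sketch, with an assumed k-space occupancy grid:

```python
# Pauli-blocking rejection step for MC scattering (illustrative sketch).
import numpy as np

rng = np.random.default_rng(0)

def accept_scattering(f_grid, k_prime_index):
    """f_grid: occupation estimate per k-space cell, clipped to [0, 1]."""
    occupancy = min(max(f_grid[k_prime_index], 0.0), 1.0)
    return rng.random() < 1.0 - occupancy   # reject if the final state is full
```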
Lu, Tao
2017-01-01
The joint modeling of mean and variance for longitudinal data is an active research area. This type of model has the advantage of accounting for the heteroscedasticity commonly observed in between- and within-subject variation. Most research focuses on improving estimation efficiency but ignores many data features frequently encountered in practice. In this article, we develop a mixed-effects location-scale joint model that concurrently accounts for longitudinal data with multiple features. Specifically, our joint model handles heterogeneity, skewness, limit of detection, and measurement errors in covariates, all of which are typically observed in longitudinal data collected in many studies. We employ a Bayesian approach for making inference on the joint model. The proposed model and method are applied to an AIDS study. Simulation studies are performed to assess the performance of the proposed method. Alternative models under different conditions are compared.
User Experience Evaluation in the Mobile Context
NASA Astrophysics Data System (ADS)
Obrist, Marianna; Meschtscherjakov, Alexander; Tscheligi, Manfred
Multimedia services on mobile devices are becoming increasingly popular. Whereas the mobile phone is the most likely platform for mobile TV, PDAs, portable game consoles, and music players are attractive alternatives. Mobile TV consumption on mobile phones allows new kinds of user experiences, but it also confronts designers and researchers with new challenges. On the one hand, designers have to take these novel experience potentials into account. On the other hand, the right methods must be applied to collect user feedback and further improve services for the mobile context. In this chapter the importance of user experience research for mobile TV within the mobile context is highlighted. We present how different experience levels can be evaluated, taking different mobile context categories into account. In particular, we discuss the Experience Sampling Method (ESM), which appears to be a fruitful approach for investigating users' TV experiences.
26 CFR 1.267(a)-1 - Deductions disallowed.
Code of Federal Regulations, 2010 CFR
2010-04-01
... accrual method of accounting. For example, if the accrued expenses or interest are paid after the... an accrual method of accounting. A uses a combination of accounting methods permitted under section... disbursements method of accounting with respect to such items of gross income for his taxable year in which or...
NASA Astrophysics Data System (ADS)
Tripathi, Anjan Kumar
Electrically charged particles are found in a wide range of applications, from electrostatic powder coating, mineral processing, and powder handling to rain-producing cloud formation in atmospheric turbulent flows. In turbulent flows, particle dynamics is influenced by the electric force due to particle charge generation. Quantifying particle charges in such systems will help in better predicting and controlling particle clustering, relative motion, collision, and growth. However, there is a lack of noninvasive techniques to measure particle charges. Recently, a non-invasive method for particle charge measurement using an in-line Digital Holographic Particle Tracking Velocimetry (DHPTV) technique was developed in our lab, in which the charged particles to be measured were introduced into a uniform electric field and their movement towards the oppositely charged electrode was deemed proportional to the amount of charge on the particles (Fan Yang, 2014 [1]). However, the inherent speckle noise associated with reconstructed images was not adequately removed, and the particle tracking data were therefore contaminated. Furthermore, the particle charge calculation based on particle deflection velocity neglected the particle drag force and the rebound of highly charged particles from the electrodes. We improved upon the existing particle charge measurement method by (1) post-processing the holograms, (2) taking the drag force into account in the charge calculation, and (3) considering the rebound effect. The improved method was first fine-tuned through a calibration experiment. The complete method was then applied to two further experiments, namely conduction charging and an enclosed fan-driven turbulence chamber, to measure particle charges. In all three experiments, the particle charge was found to follow a non-central t location-scale family of distributions. The charge distribution was insensitive to changes in the voltage applied between the electrodes. The range of applied voltage over which reliable particle charges can be measured was also quantified by taking into account the rebound of highly charged particles. Finally, in the enclosed chamber experiment, it was found that a carbon conductive coating on the inner walls of the chamber minimized charge generation inside the chamber when glass bubble particles were used. The charges obtained in the calibration experiment with the improved method were of the same order as reported in existing work (Y.C. Ahn et al. 2004 [2]), indicating that the method is indeed effective.
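The drag correction can be made concrete with a force balance: at the terminal deflection velocity the electric force equals Stokes drag, qE = 3πμdv, giving q = 3πμdv/E. The sketch below uses illustrative values, not the thesis data.

```python
# Drag-corrected charge estimate from a Stokes force balance (sketch).
import math

def particle_charge(v_deflect, diameter, E_field, mu_air=1.81e-5):
    """v_deflect [m/s], diameter [m], E_field [V/m], mu_air [Pa s]."""
    return 3.0 * math.pi * mu_air * diameter * v_deflect / E_field

q = particle_charge(v_deflect=2e-3, diameter=50e-6, E_field=1e5)
print(f"{q:.2e} C")   # about 1.7e-16 C for these illustrative values
```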
Sainju, Upendra M
2016-01-01
Management practices, such as tillage, crop rotation, and N fertilization, may affect net global warming potential (GWP) and greenhouse gas intensity (GHGI), but their global impact on cropland soils under different soil and climatic conditions needs further evaluation. Available global data from 57 experiments and 225 treatments were evaluated for individual and combined effects of tillage, cropping systems, and N fertilization rates on GWP and GHGI, which accounted for CO2 equivalents from N2O and CH4 emissions with or without equivalents from soil C sequestration rate (ΔSOC), farm operations, and N fertilization. The GWP and GHGI were 66 to 71% lower with no-till than conventional till and 168 to 215% lower with perennial than annual cropping systems, but 41 to 46% greater with crop rotation than monocropping. With no-till vs. conventional till, GWP and GHGI were 2.6- to 7.4-fold lower when partial rather than full accounting of all sources and sinks of greenhouse gases (GHGs) was considered. With 100 kg N ha-1, GWP and GHGI were 3.2 to 11.4 times greater with partial than full accounting. Both GWP and GHGI increased curvilinearly with increased N fertilization rate. Net GWP and GHGI were 70 to 87% lower in the improved combined management that included no-till, crop rotation/perennial crop, and reduced N rate than in the traditional combined management that included conventional till, monocropping/annual crop, and recommended N rate. An alternative soil respiration method, which replaces ΔSOC by soil respiration and crop residue returned to soil in the previous year, similarly reduced GWP and GHGI by 133 to 158% in the improved vs. the traditional combined management. Changes in GWP and GHGI due to improved vs. traditional management varied with the duration of the experiment, and inclusion of soil and climatic factors in multiple linear regressions improved their relationships. Improved management practices reduced GWP and GHGI compared with traditional management practices, and combined management practices were even more effective than individual management practices in reducing net GHG emissions from cropland soils. Partial accounting overestimated GWP and GHGI values as sinks or sources of net GHGs compared with full accounting when evaluating the effect of management practices.
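The full-accounting arithmetic described above reduces to converting each source and sink to CO2 equivalents. The sketch below uses common 100-year GWP factors and illustrative inputs; the study's exact factors and terms may differ.

```python
# Net GWP in CO2 equivalents under full accounting (illustrative sketch).
def net_gwp(n2o_kg_ha, ch4_kg_ha, dsoc_kg_c_ha, farm_ops_co2, n_fert_co2):
    gwp_n2o, gwp_ch4 = 298.0, 25.0   # kg CO2-eq per kg of gas (100-yr factors)
    c_to_co2 = 44.0 / 12.0           # convert sequestered C to CO2
    return (n2o_kg_ha * gwp_n2o + ch4_kg_ha * gwp_ch4
            + farm_ops_co2 + n_fert_co2
            - dsoc_kg_c_ha * c_to_co2)   # kg CO2-eq per ha

# GHGI: net GWP divided by grain yield (kg CO2-eq per kg grain).
ghgi = net_gwp(2.0, 1.0, 300.0, 150.0, 400.0) / 5000.0
print(ghgi)
```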
As Easy as ABC: Re-engineering the Cost Accounting System.
ERIC Educational Resources Information Center
Trussel, John M.; Bitner, Larry N.
1996-01-01
To be useful for management decision making, the college or university's cost accounting system must capture and measure improvements. Activity-based costing (ABC), which determines more accurately the full costs of services and products, tracks improvements and should proceed alongside reengineering of institutional accounting. Guidelines are…
26 CFR 1.381(c)(4)-1 - Method of accounting.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 26 Internal Revenue 4 2012-04-01 2012-04-01 false Method of accounting. 1.381(c)(4)-1 Section 1... TAX (CONTINUED) INCOME TAXES (Continued) Insolvency Reorganizations § 1.381(c)(4)-1 Method of accounting. (a) Introduction—(1) Purpose. This section provides guidance regarding the method of accounting...
26 CFR 1.1502-17 - Methods of accounting.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 26 Internal Revenue 12 2010-04-01 2010-04-01 false Methods of accounting. 1.1502-17 Section 1.1502... (CONTINUED) INCOME TAXES Computation of Separate Taxable Income § 1.1502-17 Methods of accounting. (a) General rule. The method of accounting to be used by each member of the group shall be determined in...
Indiana's New and (Somewhat) Improved K-12 School Finance System. School Choice Issues in the State
ERIC Educational Resources Information Center
Aud, Susan L.
2005-01-01
Education finance policy has become an urgent concern in many state legislatures. Demands for greater equity and accountability have forced states to review, and in many cases to revise, the method by which schools are funded. This study sheds light on Indiana's financing of public K-12 education by providing a clear explanation of the components…
ERIC Educational Resources Information Center
Van Horn, Mark Louis
2012-01-01
In 1999, California was among the first states in the nation to initiate an accountability model for public education, using a method for system measurement of academic improvement constructed on the bedrock of standards-based education. The State also included a new twist...sanctions. Schools that failed to make expected progress, as measured…
ERIC Educational Resources Information Center
Matsushita, Ryohei
2017-01-01
In the field of education, evidence means an objective ground for setting or judging an educational policy, plan or method, as an effective means to attain a given political end or educational objective. Evidence-based education has been regarded as a decisive device to pursue the accountability and improve the quality of education by connecting…
Alkema, Leontine; New, Jin Rou; Pedersen, Jon; You, Danzhen
2014-01-01
Background: In September 2013, the United Nations Inter-agency Group for Child Mortality Estimation (UN IGME) published an update of the estimates of the under-five mortality rate (U5MR) and under-five deaths for all countries. Compared to the UN IGME estimates published in 2012, updated data inputs and a new method for estimating the U5MR were used. Methods: We summarize the new U5MR estimation method, which is a Bayesian B-spline Bias-reduction model, and highlight differences with the previously used method. Differences in UN IGME U5MR estimates as published in 2012 and those published in 2013 are presented and decomposed into differences due to the updated database and differences due to the new estimation method to explain and motivate changes in estimates. Findings: Compared to the previously used method, the new UN IGME estimation method is based on a different trend fitting method that can track (recent) changes in U5MR more closely. The new method provides U5MR estimates that account for data quality issues. Resulting differences in U5MR point estimates between the UN IGME 2012 and 2013 publications are small for the majority of countries but greater than 10 deaths per 1,000 live births for 33 countries in 2011 and 19 countries in 1990. These differences can be explained by the updated database used, the curve fitting method as well as accounting for data quality issues. Changes in the number of deaths were less than 10% on the global level and for the majority of MDG regions. Conclusions: The 2013 UN IGME estimates provide the most recent assessment of levels and trends in U5MR based on all available data and an improved estimation method that allows for closer-to-real-time monitoring of changes in the U5MR and takes account of data quality issues. PMID:25013954
Eisen, Lars; Eisen, Rebecca J
2007-12-01
Improved methods for collection and presentation of spatial epidemiologic data are needed for vectorborne diseases in the United States. Lack of reliable data for probable pathogen exposure site has emerged as a major obstacle to the development of predictive spatial risk models. Although plague case investigations can serve as a model for how to ideally generate needed information, this comprehensive approach is cost-prohibitive for more common and less severe diseases. New methods are urgently needed to determine probable pathogen exposure sites that will yield reliable results while taking into account economic and time constraints of the public health system and attending physicians. Recent data demonstrate the need for a change from use of the county spatial unit for presentation of incidence of vectorborne diseases to more precise ZIP code or census tract scales. Such fine-scale spatial risk patterns can be communicated to the public and medical community through Web-mapping approaches.
A general panel sizing computer code and its application to composite structural panels
NASA Technical Reports Server (NTRS)
Anderson, M. S.; Stroud, W. J.
1978-01-01
A computer code for obtaining the dimensions of optimum (least mass) stiffened composite structural panels is described. The procedure, which is based on nonlinear mathematical programming and a rigorous buckling analysis, is applicable to general cross sections under general loading conditions causing buckling. A simplified method of accounting for bow-type imperfections is also included. Design studies in the form of structural efficiency charts for axial compression loading are made with the code for blade and hat stiffened panels. The effects on panel mass of imperfections, material strength limitations, and panel stiffness requirements are also examined. Comparisons with previously published experimental data show that accounting for imperfections improves correlation between theory and experiment.
The basis function approach for modeling autocorrelation in ecological data
Hefley, Trevor J.; Broms, Kristin M.; Brost, Brian M.; Buderman, Frances E.; Kay, Shannon L.; Scharf, Henry; Tipton, John; Williams, Perry J.; Hooten, Mevin B.
2017-01-01
Analyzing ecological data often requires modeling the autocorrelation created by spatial and temporal processes. Many seemingly disparate statistical methods used to account for autocorrelation can be expressed as regression models that include basis functions. Basis functions also enable ecologists to modify a wide range of existing ecological models in order to account for autocorrelation, which can improve inference and predictive accuracy. Furthermore, understanding the properties of basis functions is essential for evaluating the fit of spatial or time-series models, detecting a hidden form of collinearity, and analyzing large data sets. We present important concepts and properties related to basis functions and illustrate several tools and techniques ecologists can use when modeling autocorrelation in ecological data.
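A minimal sketch of the basis-function idea: temporal autocorrelation in the residual is absorbed by appending smooth basis columns (here Gaussian radial basis functions over time) to an ordinary regression design matrix. Data, knots, and bandwidth are illustrative assumptions.

```python
# Regression with basis functions absorbing temporal autocorrelation (sketch).
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0, 10, 200)                            # survey times
x = rng.normal(size=t.size)                            # environmental covariate
y = 1.5 * x + np.sin(t) + rng.normal(0, 0.3, t.size)   # autocorrelated residual

knots = np.linspace(0, 10, 8)
basis = np.exp(-0.5 * ((t[:, None] - knots[None, :]) / 1.0) ** 2)
X = np.column_stack([np.ones_like(t), x, basis])       # intercept + covariate + basis

beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print("covariate effect:", beta[1])                    # close to the true 1.5
```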
Liao, Wenjie; van der Werf, Hayo M G; Salmon-Monviola, Jordy
2015-09-15
One of the major challenges in environmental life cycle assessment (LCA) of crop production is the nonlinearity between nitrogen (N) fertilizer inputs and on-site N emissions resulting from complex biogeochemical processes. A few studies have addressed this nonlinearity by combining process-based N simulation models with LCA, but none accounted for nitrate (NO3(-)) flows across fields. In this study, we present a new method, TNT2-LCA, that couples the topography-based simulation of nitrogen transfer and transformation (TNT2) model with LCA, and compare the new method with a current LCA method based on a French life cycle inventory database. Application of the two methods to a case study of crop production in a catchment in France showed that, compared to the current method, TNT2-LCA allows delineation of more appropriate temporal limits when developing data for on-site N emissions associated with specific crops in this catchment. It also improves estimates of NO3(-) emissions by better consideration of agricultural practices, soil-climatic conditions, and spatial interactions of NO3(-) flows across fields, and by providing predicted crop yield. The new method presented in this study provides improved LCA of crop production at the catchment scale.
Cost accounting for end-of-life care: recommendations to the field by the Cost Accounting Workgroup.
Seninger, Stephen; Smith, Dean G
2004-01-01
Accurate measurement of economic costs is prerequisite to progress in improving the care delivered to Americans during the last stage of life. The Robert Wood Johnson Excellence in End-of-Life Care national program assembled a Cost Accounting Workgroup to identify accurate and meaningful methods to measure palliative and end-of-life health care use and costs. Eight key issues were identified: (1) planning the cost analysis; (2) identifying the perspective for cost analysis; (3) describing the end-of-life care program; (4) identifying the appropriate comparison group; (5) defining the period of care to be studied; (6) identifying the units of health care services; (7) assigning monetary values to health care service units; and (8) calculating costs. Economic principles of cost measurement and cost measurement issues encountered by practitioners were reviewed and incorporated into a set of recommendations.
26 CFR 1.448-1 - Limitation on the use of the cash receipts and disbursements method of accounting.
Code of Federal Regulations, 2014 CFR
2014-04-01
... disbursements method of accounting. 1.448-1 Section 1.448-1 Internal Revenue INTERNAL REVENUE SERVICE, DEPARTMENT OF THE TREASURY (CONTINUED) INCOME TAX (CONTINUED) INCOME TAXES (CONTINUED) Methods of Accounting § 1.448-1 Limitation on the use of the cash receipts and disbursements method of accounting. (a)-(f...
26 CFR 1.448-1 - Limitation on the use of the cash receipts and disbursements method of accounting.
Code of Federal Regulations, 2011 CFR
2011-04-01
... disbursements method of accounting. 1.448-1 Section 1.448-1 Internal Revenue INTERNAL REVENUE SERVICE, DEPARTMENT OF THE TREASURY (CONTINUED) INCOME TAX (CONTINUED) INCOME TAXES (CONTINUED) Methods of Accounting § 1.448-1 Limitation on the use of the cash receipts and disbursements method of accounting. (a)-(f...
26 CFR 1.448-1 - Limitation on the use of the cash receipts and disbursements method of accounting.
Code of Federal Regulations, 2012 CFR
2012-04-01
... disbursements method of accounting. 1.448-1 Section 1.448-1 Internal Revenue INTERNAL REVENUE SERVICE, DEPARTMENT OF THE TREASURY (CONTINUED) INCOME TAX (CONTINUED) INCOME TAXES (CONTINUED) Methods of Accounting § 1.448-1 Limitation on the use of the cash receipts and disbursements method of accounting. (a)-(f...
26 CFR 1.448-1 - Limitation on the use of the cash receipts and disbursements method of accounting.
Code of Federal Regulations, 2013 CFR
2013-04-01
... disbursements method of accounting. 1.448-1 Section 1.448-1 Internal Revenue INTERNAL REVENUE SERVICE, DEPARTMENT OF THE TREASURY (CONTINUED) INCOME TAX (CONTINUED) INCOME TAXES (CONTINUED) Methods of Accounting § 1.448-1 Limitation on the use of the cash receipts and disbursements method of accounting. (a)-(f...
26 CFR 1.381(c)(4)-1 - Method of accounting.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 26 Internal Revenue 4 2010-04-01 2010-04-01 false Method of accounting. 1.381(c)(4)-1 Section 1... TAX (CONTINUED) INCOME TAXES Insolvency Reorganizations § 1.381(c)(4)-1 Method of accounting. (a... section 381(a) applies, an acquiring corporation shall use the same method of accounting used by the...
26 CFR 1.985-4 - Method of accounting.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 26 Internal Revenue 10 2010-04-01 2010-04-01 false Method of accounting. 1.985-4 Section 1.985-4...) INCOME TAXES Export Trade Corporations § 1.985-4 Method of accounting. (a) Adoption of election. The adoption of, or the election to use, a functional currency shall be treated as a method of accounting. The...
Woody debris volume depletion through decay: implications for biomass and carbon accounting
Fraver, Shawn; Milo, Amy M.; Bradford, John B.; D'Amato, Anthony W.; Kenefic, Laura; Palik, Brian J.; Woodall, Christopher W.; Brissette, John
2013-01-01
Woody debris decay rates have recently received much attention because of the need to quantify temporal changes in forest carbon stocks. Published decay rates, available for many species, are commonly used to characterize deadwood biomass and carbon depletion. However, decay rates are often derived from reductions in wood density through time, which, when used to model biomass and carbon depletion, are known to underestimate the rate of loss because they fail to account for volume reduction (changes in log shape) as decay progresses. We present a method for estimating changes in log volume through time and illustrate the method using a chronosequence approach. The method is based on the observation, confirmed herein, that decaying logs have a collapse ratio (cross-sectional height/width) that can serve as a surrogate for the volume remaining. Combining the resulting volume loss with concurrent changes in wood density from the same logs then allowed us to quantify biomass and carbon depletion for three study species. Results show that volume, density, and biomass follow distinct depletion curves during decomposition. Volume showed an initial lag period (log dimensions remained unchanged), even while wood density was being reduced. However, once volume depletion began, biomass loss (the product of density and volume depletion) occurred much more rapidly than density loss alone. At the temporal limit of our data, the proportion of the biomass remaining was roughly half that of the density remaining. Accounting for log volume depletion, as demonstrated in this study, provides a comprehensive characterization of deadwood decomposition, thereby improving biomass-loss and carbon-accounting models.
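The accounting point is easy to make numerically: biomass remaining is the product of density remaining and volume remaining, with the collapse ratio serving as the volume surrogate. Values below are illustrative, not the study's measurements.

```python
# Biomass remaining = density remaining * volume remaining (sketch).
density_remaining = 0.50    # fraction of initial wood density
collapse_ratio = 0.55       # cross-section height / width
volume_remaining = collapse_ratio            # surrogate used in the study
biomass_remaining = density_remaining * volume_remaining
print(biomass_remaining)    # 0.275: roughly half the density-only estimate
```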
Simulating the nasal cycle with computational fluid dynamics
Patel, Ruchin G.; Garcia, Guilherme J. M.; Frank-Ito, Dennis O.; Kimbell, Julia S.; Rhee, John S.
2015-01-01
Objectives: (1) Develop a method to account for the confounding effect of the nasal cycle when comparing pre- and post-surgery objective measures of nasal patency. (2) Illustrate this method by reporting objective measures derived from computational fluid dynamics (CFD) models spanning the full range of mucosal engorgement associated with the nasal cycle in two subjects. Study Design: Retrospective. Setting: Academic tertiary medical center. Subjects and Methods: A cohort of 24 nasal airway obstruction patients was reviewed to select the two patients with the greatest reciprocal change in mucosal engorgement between pre- and post-surgery computed tomography (CT) scans. Three-dimensional anatomic models were created based on the pre- and post-operative CT scans. Nasal cycling models were also created by gradually changing the thickness of the inferior turbinate, middle turbinate, and septal swell body. CFD was used to simulate airflow and to calculate nasal resistance and average heat flux. Results: Before accounting for the nasal cycle, Patient A appeared to have a paradoxical worsening of nasal obstruction in the right cavity postoperatively. After accounting for the nasal cycle, Patient A had small improvements in objective measures postoperatively. The magnitude of the surgical effect also differed in Patient B after accounting for the nasal cycle. Conclusion: By simulating the nasal cycle and comparing models in similar congestive states, surgical changes in nasal patency can be distinguished from physiological changes associated with the nasal cycle. This ability can lead to more precise comparisons of pre- and post-surgery objective measures and potentially more accurate virtual surgery planning. PMID:25450411
ERIC Educational Resources Information Center
Ryan, Katherine E.
2004-01-01
Today, educational evaluation theory and practice face a critical juncture with the kind of educational accountability evaluation legislated by No Child Left Behind. While the goal of this kind of educational accountability is to improve education, it is characterized by a hierarchical, top-down approach to improving educational achievement…
ERIC Educational Resources Information Center
Gasoi, Emily
2012-01-01
Educational accountability has come to be defined almost exclusively in terms of schools meeting external standards of improvement. Building on a body of scholarship that presents schools as complex organizations, this research proposes that a more robust understanding of educational accountability must be grounded in practitioners' perceptions of…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kurnik, Charles W.; Tiessen, Alex
Retrocommissioning (RCx) is a systematic process for optimizing energy performance in existing buildings. It specifically focuses on improving the control of energy-using equipment (e.g., heating, ventilation, and air conditioning [HVAC] equipment and lighting) and typically does not involve equipment replacement. Field results have shown proper RCx can achieve energy savings ranging from 5 percent to 20 percent, with a typical payback of two years or less (Thorne 2003). The method presented in this protocol provides direction regarding: (1) how to account for each measure's specific characteristics and (2) how to choose the most appropriate savings verification approach.
NECAP 4.1: NASA's Energy-Cost Analysis Program fast input manual and example
NASA Technical Reports Server (NTRS)
Jensen, R. N.; Miner, D. L.
1982-01-01
NASA's Energy-Cost Analysis Program (NECAP) is a powerful computerized method for determining and minimizing building energy consumption. The program calculates hourly heat gains or losses, taking into account the building's thermal resistance and mass, using hourly weather data and a response-factor method. Internal temperatures are allowed to vary in accordance with thermostat settings and equipment capacity. NECAP 4.1 has a simplified input procedure and numerous other technical improvements. A very short input method is provided; it is limited to a single-zone building. The user must still describe the building's outside geometry and select the type of system to be used.
Error analysis in inverse scatterometry. I. Modeling.
Al-Assaad, Rayan M; Byrne, Dale M
2007-02-01
Scatterometry is an optical technique that has been studied and tested in recent years in semiconductor fabrication metrology for critical dimensions. Previous work presented an iterative linearized method to retrieve surface-relief profile parameters from reflectance measurements upon diffraction. With the iterative linear solution model in this work, rigorous models are developed to represent the random and deterministic or offset errors in scatterometric measurements. The propagation of different types of error from the measurement data to the profile parameter estimates is then presented. The improvement in solution accuracies is then demonstrated with theoretical and experimental data by adjusting for the offset errors. In a companion paper (in process) an improved optimization method is presented to account for unknown offset errors in the measurements based on the offset error model.
2HOT: An Improved Parallel Hashed Oct-Tree N-Body Algorithm for Cosmological Simulation
Warren, Michael S.
2014-01-01
We report on improvements made over the past two decades to our adaptive treecode N-body method (HOT). A mathematical and computational approach to the cosmological N-body problem is described, with performance and scalability measured up to 256k (2^18) processors. We present error analysis and scientific application results from a series of more than ten 69-billion-particle (4096^3) cosmological simulations, accounting for 4×10^20 floating point operations. These results include the first simulations using the new constraints on the standard model of cosmology from the Planck satellite. Our simulations set a new standard for accuracy and scientific throughput, while meeting or exceeding the computational efficiency of the latest generation of hybrid TreePM N-body methods.
Analytical N beam position monitor method
NASA Astrophysics Data System (ADS)
Wegscheider, A.; Langner, A.; Tomás, R.; Franchi, A.
2017-11-01
Measurement and correction of focusing errors is of great importance for performance and machine protection of circular accelerators. Furthermore LHC needs to provide equal luminosities to the experiments ATLAS and CMS. High demands are also set on the speed of the optics commissioning, as the foreseen operation with β*-leveling on luminosity will require many operational optics. A fast measurement of the β -function around a storage ring is usually done by using the measured phase advance between three consecutive beam position monitors (BPMs). A recent extension of this established technique, called the N-BPM method, was successfully applied for optics measurements at CERN, ALBA, and ESRF. We present here an improved algorithm that uses analytical calculations for both random and systematic errors and takes into account the presence of quadrupole, sextupole, and BPM misalignments, in addition to quadrupolar field errors. This new scheme, called the analytical N-BPM method, is much faster, further improves the measurement accuracy, and is applicable to very pushed beam optics where the existing numerical N-BPM method tends to fail.
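For context, one common form of the three-BPM β measurement that the N-BPM method generalizes is shown below; the notation (measured and model phase advances φ between BPMs 1, 2, 3) is assumed for illustration and is not quoted from the paper.

```latex
% Three-BPM beta measurement from phase advances (assumed notation):
\beta_1^{\mathrm{meas}} = \beta_1^{\mathrm{mdl}}\,
  \frac{\cot\varphi_{12}^{\mathrm{meas}} - \cot\varphi_{13}^{\mathrm{meas}}}
       {\cot\varphi_{12}^{\mathrm{mdl}}  - \cot\varphi_{13}^{\mathrm{mdl}}}
```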
Detecting breast microcalcifications using super-resolution ultrasound imaging: a clinical study
NASA Astrophysics Data System (ADS)
Huang, Lianjie; Labyed, Yassin; Hanson, Kenneth; Sandoval, Daniel; Pohl, Jennifer; Williamson, Michael
2013-03-01
Imaging breast microcalcifications is crucial for early detection and diagnosis of breast cancer. It is challenging for current clinical ultrasound to image breast microcalcifications. However, new imaging techniques using data acquired with a synthetic-aperture ultrasound system have the potential to significantly improve ultrasound imaging. We recently developed a super-resolution ultrasound imaging method termed the phase-coherent multiple-signal classification (PC-MUSIC). This signal subspace method accounts for the phase response of transducer elements to improve image resolution. In this paper, we investigate the clinical feasibility of our super-resolution ultrasound imaging method for detecting breast microcalcifications. We use our custom-built, real-time synthetic-aperture ultrasound system to acquire breast ultrasound data for 40 patients whose mammograms show the presence of breast microcalcifications. We apply our super-resolution ultrasound imaging method to the patient data, and produce clear images of breast calcifications. Our super-resolution ultrasound PC-MUSIC imaging with synthetic-aperture ultrasound data can provide a new imaging modality for detecting breast microcalcifications in clinic without using ionizing radiation.
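The signal-subspace step at the heart of MUSIC-type imaging can be sketched generically: project candidate steering vectors onto the noise subspace of the array covariance and invert the projected power. This plain narrowband MUSIC is an illustration only, not the authors' phase-coherent variant.

```python
# Generic MUSIC pseudospectrum from a signal-subspace decomposition (sketch).
import numpy as np

def music_pseudospectrum(R, steering_vectors, n_sources):
    """R: (M, M) array covariance; steering_vectors: (M, K) candidate vectors."""
    eigvals, eigvecs = np.linalg.eigh(R)        # eigenvalues in ascending order
    En = eigvecs[:, : R.shape[0] - n_sources]   # noise-subspace eigenvectors
    proj = En.conj().T @ steering_vectors       # projection onto noise subspace
    denom = np.sum(np.abs(proj) ** 2, axis=0)
    return 1.0 / denom                          # peaks mark scatterer locations
```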
A geostatistical state-space model of animal densities for stream networks.
Hocking, Daniel J; Thorson, James T; O'Neil, Kyle; Letcher, Benjamin H
2018-06-21
Population dynamics are often correlated in space and time due to correlations in environmental drivers as well as synchrony induced by individual dispersal. Many statistical analyses of populations ignore potential autocorrelations and assume that survey methods (distance and time between samples) eliminate these correlations, allowing samples to be treated independently. If these assumptions are incorrect, results and therefore inference may be biased and uncertainty under-estimated. We developed a novel statistical method to account for spatio-temporal correlations within dendritic stream networks, while accounting for imperfect detection in the surveys. Through simulations, we found this model decreased predictive error relative to standard statistical methods when data were spatially correlated based on stream distance and performed similarly when data were not correlated. We found that increasing the number of years surveyed substantially improved the model accuracy when estimating spatial and temporal correlation coefficients, especially from 10 to 15 years. Increasing the number of survey sites within the network improved the performance of the non-spatial model but only marginally improved the density estimates in the spatio-temporal model. We applied this model to Brook Trout data from the West Susquehanna Watershed in Pennsylvania collected over 34 years, from 1981 to 2014. We found the model including temporal and spatio-temporal autocorrelation best described young-of-the-year (YOY) and adult density patterns. YOY densities were positively related to forest cover and negatively related to spring temperatures, with low temporal autocorrelation and moderately high spatio-temporal correlation. Adult densities were less strongly affected by climatic conditions and less temporally variable than YOY, but with similar spatio-temporal correlation and higher temporal autocorrelation. This article is protected by copyright. All rights reserved.
Evaluation of Aeroservoelastic Effects on Flutter
NASA Technical Reports Server (NTRS)
Nagaraja, K. S.; Felt, Larry R.; Kraft, Raymond
1998-01-01
This report presents work performed by The Boeing Company to satisfy the deliverable "Evaluation of Aeroservoelastic Effects on Symmetric Flutter" for Subtask 7 of Reference 1. The objective of this report is to incorporate the improved methods for studying the effects of a closed-loop control system on the aeroservoelastic behavior of the airplane planned under NASA HSR Technical Integration Task 20. A preliminary evaluation of the existing pitch control laws on symmetric flutter of the TCA configuration is also addressed. The goal is to develop an improved modeling methodology and perform design studies that account for the aero-structures-systems interaction effects.
Byskov, Jens; Bloch, Paul; Blystad, Astrid; Hurtig, Anna-Karin; Fylkesnes, Knut; Kamuzora, Peter; Kombe, Yeri; Kvåle, Gunnar; Marchal, Bruno; Martin, Douglas K; Michelo, Charles; Ndawi, Benedict; Ngulube, Thabale J; Nyamongo, Isaac; Olsen, Oystein E; Onyango-Ouma, Washington; Sandøy, Ingvild F; Shayo, Elizabeth H; Silwamba, Gavin; Songstad, Nils Gunnar; Tuba, Mary
2009-10-24
Despite multiple efforts to strengthen health systems in low and middle income countries, intended sustainable improvements in health outcomes have not been shown. To date most priority setting initiatives in health systems have mainly focused on technical approaches involving information derived from burden of disease statistics, cost effectiveness analysis, and published clinical trials. However, priority setting involves value-laden choices and these technical approaches do not equip decision-makers to address a broader range of relevant values - such as trust, equity, accountability and fairness - that are of concern to other partners and, not least, the populations concerned. A new focus for priority setting is needed. Accountability for Reasonableness (AFR) is an explicit ethical framework for legitimate and fair priority setting that provides guidance for decision-makers who must identify and consider the full range of relevant values. AFR consists of four conditions: i) relevance to the local setting, decided by agreed criteria; ii) publicizing priority-setting decisions and the reasons behind them; iii) the establishment of revisions/appeal mechanisms for challenging and revising decisions; iv) the provision of leadership to ensure that the first three conditions are met. REACT - "REsponse to ACcountable priority setting for Trust in health systems" - is an EU-funded five-year intervention study started in 2006, which is testing the application and effects of the AFR approach in one district each in Kenya, Tanzania and Zambia. The objectives of REACT are to describe and evaluate district-level priority setting, to develop and implement improvement strategies guided by AFR and to measure their effect on quality, equity and trust indicators. Effects are monitored within selected disease and programme interventions and services and within human resources and health systems management. Qualitative and quantitative methods are being applied in an action research framework to examine the potential of AFR to support sustainable improvements to health systems performance. This paper reports on the project design and progress and argues that there is a high need for research into legitimate and fair priority setting to improve the knowledge base for achieving sustainable improvements in health outcomes.
Greenhouse gases accounting and reporting for waste management--a South African perspective.
Friedrich, Elena; Trois, Cristina
2010-11-01
This paper investigates how greenhouse gases are accounted for and reported in the waste sector in South Africa. Developing countries (including South Africa) do not have binding emission reduction targets, but many of them publish greenhouse gas emissions data that have been accounted for and reported in different ways. Results show that for South Africa, inventories at national and municipal level are the most important tools in the process of accounting for and reporting greenhouse gases from waste. For the development of these inventories, international initiatives were important catalysts at national and municipal levels and assisted in developing local expertise, resulting in increased output quality. However, discrepancies between inventories in the methodology used to account for greenhouse gases from waste remain a concern. This is a challenging issue for developing countries, especially African ones, since higher-accuracy methods are more data intensive. Analysis of the South African inventories shows that results from the recent inventories cannot be compared with older ones due to the use of different accounting methodologies. More recently, the use of Clean Development Mechanism (CDM) procedures in Africa, geared towards direct measurements of greenhouse gases from landfill sites, has increased and resulted in an improvement of the quality of greenhouse gas inventories at municipal level. Copyright © 2010 Elsevier Ltd. All rights reserved.
26 CFR 1.448-1 - Limitation on the use of the cash receipts and disbursements method of accounting.
Code of Federal Regulations, 2010 CFR
2010-04-01
... disbursements method of accounting. Section 1.448-1, Internal Revenue Service, Department of the Treasury (Continued); Income Taxes; Methods of Accounting; § 1.448-1 Limitation on the use of the cash receipts and disbursements method of accounting. (a)-(f) [Reserved] (g...
Lee, Hwa-Young; Yang, Bong-Ming; Kang, Minah
2016-01-01
Background Despite continued global efforts, HIV/AIDS outcomes in developing countries have not made much progress. Poor governance in recipient countries is often seen as one of the reasons why aid efforts fail to achieve stated objectives and desired outcomes. Objective This study examines the impact of two important dimensions of governance – control of corruption and democratic accountability – on the effectiveness of HIV/AIDS official development assistance. Design An empirical analysis using dynamic panel Generalized Method of Moments estimation was conducted on 2001–2010 datasets. Results Control of corruption had an independent effect on the incidence of HIV/AIDS, while democratic accountability interacted with the amount of HIV/AIDS aid; neither governance variable had a significant effect on HIV/AIDS prevalence. Specifically, in countries with an accountability level below −2.269, aid has a detrimental effect on the incidence of HIV/AIDS. Conclusion The study findings suggest that aid programs need to be preceded, or at least accompanied, by serious efforts to improve governance in recipient countries and that democratic accountability ought to receive more critical attention. PMID:27189199
Davidov, Ori; Rosen, Sophia
2011-04-01
In medical studies, endpoints are often measured for each patient longitudinally. The mixed-effects model has been a useful tool for the analysis of such data. There are situations in which the parameters of the model are subject to some restrictions or constraints. For example, in hearing loss studies, we expect hearing to deteriorate with time. This means that hearing thresholds, which reflect hearing acuity, will on average increase over time. Therefore, the regression coefficients associated with the mean effect of time on hearing ability will be constrained. Such constraints should be accounted for in the analysis. We propose maximum likelihood estimation procedures, based on the expectation/conditional maximization either (ECME) algorithm, to estimate the parameters of the model while accounting for the constraints on them. The proposed methods improve, in terms of mean squared error, on the unconstrained estimators. In some settings, the improvement may be substantial. Hypothesis testing procedures that incorporate the constraints are developed. Specifically, likelihood ratio, Wald, and score tests are proposed and investigated. Their empirical significance levels and power are studied using simulations. It is shown that incorporating the constraints improves the mean squared error of the estimates and the power of the tests. These improvements may be substantial. The methodology is used to analyze a hearing loss study.
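A minimal illustration of constrained maximum likelihood in this spirit follows. This is a sketch under simplifying assumptions: a plain linear model with independent errors rather than a full mixed-effects model, and a simple bound constraint enforced by the optimizer rather than the ECME algorithm; all data and values are toy.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
t = np.tile(np.arange(5.0), 20)                  # visit times for 20 subjects (toy)
y = 10 + 0.8 * t + rng.normal(0, 2, t.size)      # hearing thresholds, worsening with time

def negloglik(theta):
    # Gaussian negative log-likelihood for y = b0 + b1*t + e, e ~ N(0, s^2)
    b0, b1, log_s = theta
    r = y - (b0 + b1 * t)
    return 0.5 * np.sum(r**2) / np.exp(2 * log_s) + t.size * log_s

# Constrain the time slope to be non-negative: hearing acuity cannot improve
fit = minimize(negloglik, x0=[0.0, 0.0, 0.0],
               bounds=[(None, None), (0.0, None), (None, None)])
print(fit.x)   # intercept, constrained slope, log residual SD
```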
Freeman, T; Walshe, K
2004-01-01
Background: A national cross sectional study was undertaken to explore the perceptions concerning the importance of, and progress in, aspects of clinical governance among board level and directorate managers in English acute, ambulance, and mental health/learning disabilities (MH/LD) trusts. Participants: A stratified sample of acute, ambulance, and mental health/learning disabilities trusts in England (n = 100), from each of which up to 10 board level and 10 directorate level managers were randomly sampled. Methods: Fieldwork was undertaken between April and July 2002 using the Organisational Progress in Clinical Governance (OPCG) schedule to explore managers' perceptions of the importance of, and organisational achievement in, 54 clinical governance competency items in five aggregated domains: improving quality; managing risks; improving staff performance; corporate accountability; and leadership and collaboration. The difference between ratings of importance and achievement was termed a shortfall. Results: Of 1916 individuals surveyed, 1177 (61.4%) responded. The competency items considered most important and recording highest perceived achievement related to corporate accountability structures and clinical risks. The highest shortfalls between perceived importance and perceived achievement were reported in joint working across local health communities, feedback of performance data, and user involvement. When aggregated into domains, greatest achievement was perceived in the assurance related areas of corporate accountability and risk management, with considerably less perceived achievement and consequently higher shortfalls in quality improvement and leadership and collaboration. Directorate level managers' perceptions of achievement were found to be significantly lower than those of their board level colleagues on all domains other than improving performance. No differences were found in perceptions of achievement between different types of trusts, or between trusts at different stages in the Commission for Health Improvement (CHI) review cycle. Conclusions: While structures and systems for clinical governance seem well established, there is more perceived progress in areas concerned with quality assurance than quality improvement. This study raises some uncomfortable questions about the impact of CHI review visits. PMID:15465936
Review and assessment of the HOST turbine heat transfer program
NASA Technical Reports Server (NTRS)
Gladden, Herbert J.
1988-01-01
The objectives of the HOST Turbine Heat Transfer subproject were to obtain a better understanding of the physics of the aerothermodynamic phenomena occurring in high-performance gas turbine engines and to assess and improve the analytical methods used to predict the fluid dynamics and heat transfer phenomena. At the time the HOST project was initiated, an across-the-board improvement in turbine design technology was needed. Therefore, a building-block approach was utilized, with research ranging from the study of fundamental phenomena and analytical modeling to experiments in simulated real-engine environments. Experimental research accounted for 75 percent of the project, and analytical efforts accounted for approximately 25 percent. Extensive experimental datasets were created depicting the three-dimensional flow field, high free-stream turbulence, boundary-layer transition, blade tip region heat transfer, film cooling effects in a simulated engine environment, rough-wall cooling enhancement in a rotating passage, and rotor-stator interaction effects. In addition, analytical modeling of these phenomena was initiated using boundary-layer assumptions as well as Navier-Stokes solutions.
An Improved BLE Indoor Localization with Kalman-Based Fusion: An Experimental Study
Röbesaat, Jenny; Zhang, Peilin; Abdelaal, Mohamed; Theel, Oliver
2017-01-01
Indoor positioning has grasped great attention in recent years. A number of efforts have been exerted to achieve high positioning accuracy. However, there exists no technology that proves its efficacy in various situations. In this paper, we propose a novel positioning method based on fusing trilateration and dead reckoning. We employ Kalman filtering as a position fusion algorithm. Moreover, we adopt an Android device with Bluetooth Low Energy modules as the communication platform to avoid excessive energy consumption and to improve the stability of the received signal strength. To further improve the positioning accuracy, we take the environmental context information into account while generating the position fixes. Extensive experiments in a testbed are conducted to examine the performance of three approaches: trilateration, dead reckoning and the fusion method. Additionally, the influence of the knowledge of the environmental context is also examined. Finally, our proposed fusion method outperforms both trilateration and dead reckoning in terms of accuracy: experimental results show that the Kalman-based fusion, for our settings, achieves a positioning accuracy of less than one meter. PMID:28445421
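The fusion step can be illustrated with a one-dimensional constant-velocity Kalman filter: dead reckoning drives the prediction and each trilateration fix drives the update. This is a sketch, not the authors' implementation; the time step, noise covariances and measurements below are assumed for the example.

```python
import numpy as np

dt = 0.5
F = np.array([[1, dt], [0, 1]])   # state transition for [position, velocity]
H = np.array([[1.0, 0.0]])        # trilateration observes position only
Q = np.diag([0.05, 0.1])          # dead-reckoning (process) noise, assumed
R = np.array([[1.5]])             # trilateration (measurement) noise, assumed

x = np.array([0.0, 0.0])          # initial state
P = np.eye(2)                     # initial covariance

def kalman_step(x, P, z):
    # Predict using the dead-reckoning motion model
    x = F @ x
    P = F @ P @ F.T + Q
    # Update with the trilateration position fix z
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(2) - K @ H) @ P
    return x, P

for z in (np.array([0.4]), np.array([0.9]), np.array([1.7])):
    x, P = kalman_step(x, P, z)
print(x)   # fused position and velocity estimate
```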
Hedlund, Ann; Ateg, Mattias; Andersson, Ing-Marie; Rosén, Gunnar
2010-04-01
Workers' motivation to actively take part in improvements to the work environment is assumed to be important for the efficiency of investments for that purpose. This gives rise to the need for a tool to measure such motivation. A questionnaire to measure motivation for improvements to the work environment has been designed. Internal consistency and test-retest reliability of the domains of the questionnaire have been measured, and the factorial structure has been explored, from the answers of 113 employees. Internal consistency is high (0.94), as is the test-retest correlation for the total score (0.84). Three factors are identified, accounting for 61.6% of the total variance. The questionnaire can be a useful tool in improving intervention methods. The expectation is that the tool can be useful, particularly for improving the efficiency of companies' investments in work environment improvements. Copyright 2010 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Bush-Mecenas, Susan; Marsh, Julie A.; Montes de Oca, David; Hough, Heather
2018-01-01
School accountability and improvement policy are on the precipice of a paradigm shift. While the multiple-measure dashboard accountability approach holds great promise for promoting more meaningful learning opportunities for all students, our research indicates that this can come with substantial challenges in practice. We reflect upon the lessons…
Graduation Matters: Improving Accountability for High School Graduation
ERIC Educational Resources Information Center
Hall, Daria
2007-01-01
An analysis of accountability for high school graduation rates under the federal No Child Left Behind Act (NCLB) reveals two major problems: (1) State goals for raising graduation rates are far too low to spur needed improvement and (2) Gaps between student groups are allowed to persist by an accountability system that looks only at average…
Analysis of documentary support for environmental restoration programs in Russia
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nechaev, A.F.; Projaev, V.V.
1995-12-31
Taking into account the importance of adequate regulations for ensuring the radiological safety of the biosphere and for the successful implementation of environmental restoration projects, the contents of legislative and methodological documents, as well as their comprehensiveness and substantiation, are subjected to critical analysis. It is shown that there is much scope for further optimization of, and improvement in, the regulatory basis at both the Federal and regional levels.
ERIC Educational Resources Information Center
Tyson, Deonte Rashawn
2017-01-01
This multiple case study examined the methods by which school leaders determined and planned teacher professional development, as well as what teachers perceived as their professional development needs and how they believe school leaders take those needs into account. The study took place at two suburban elementary schools (1 traditional public, 1…
Irimia, Andrei; Goh, S.-Y. Matthew; Torgerson, Carinna M.; Stein, Nathan R.; Chambers, Micah C.; Vespa, Paul M.; Van Horn, John D.
2013-01-01
Objective To inverse-localize epileptiform cortical electrical activity recorded from severe traumatic brain injury (TBI) patients using electroencephalography (EEG). Methods Three acute TBI cases were imaged using computed tomography (CT) and multimodal magnetic resonance imaging (MRI). Semi-automatic segmentation was performed to partition the complete TBI head into 25 distinct tissue types, including 6 tissue types accounting for pathology. Segmentations were employed to generate a finite element method model of the head, and EEG activity generators were modeled as dipolar currents distributed over the cortical surface. Results We demonstrate anatomically faithful localization of EEG generators responsible for epileptiform discharges in severe TBI. By accounting for injury-related tissue conductivity changes, our work offers the most realistic implementation currently available for the inverse estimation of cortical activity in TBI. Conclusion Whereas standard localization techniques are available for electrical activity mapping in uninjured brains, they are rarely applied to acute TBI. Modern models of TBI-induced pathology can inform the localization of epileptogenic foci, improve surgical efficacy, contribute to the improvement of critical care monitoring and provide guidance for patient-tailored treatment. With approaches such as this, neurosurgeons and neurologists can study brain activity in acute TBI and obtain insights regarding injury effects upon brain metabolism and clinical outcome. PMID:24011495
Verdú, Miguel; Traveset, Anna
2004-02-01
Most studies using meta-analysis try to establish relationships between traits across taxa from interspecific databases, and thus the phylogenetic relatedness among these taxa should be taken into account to avoid pseudoreplication derived from common ancestry. This paper illustrates, with a representative example of the relationship between seed size and the effect of frugivores' gut passage on seed germination, that meta-analytic procedures can also be phylogenetically corrected by means of the comparative method. The conclusions obtained from the meta-analytical and phylogenetic approaches are very different. The meta-analysis revealed that the positive effect of gut passage on seed germination increased with seed size for passage through birds, whereas it decreased for passage through non-flying mammals. However, once the phylogenetic relatedness among plant species was taken into account, the effects of gut passage on seed germination did not depend on seed size and were similar between birds and non-flying mammals. Some methodological considerations are given to improve the bridge between meta-analysis and the comparative method.
Trends in Accounting Education: Decreasing Accounting Anxiety and Promoting New Methods
ERIC Educational Resources Information Center
Buckhaults, Jessica; Fisher, Diane
2011-01-01
In this paper, authors (a) identified accounting anxiety for the educator and the student as a possible explanation for the decline in accounting education and (b) investigated new methods for teaching accounting at the secondary and postsecondary levels that will increase interest in accounting education as well as decrease educator and student…
Wykes, Til; Reeder, Clare; Huddy, Vyv; Taylor, Rumina; Wood, Helen; Ghirasim, Natalia; Kontis, Dimitrios; Landau, Sabine
2012-01-01
Background Cognitive remediation (CRT) affects functioning, but the extent and type of cognitive improvements necessary are unknown. Aim To develop and test models of how cognitive improvement transfers to work behaviour using data from a current service. Method Participants (N = 49) with a support worker and a paid or voluntary job were offered CRT in a Phase 2 single-group design with three assessments: baseline, post therapy and follow-up. Working memory, cognitive flexibility, planning and work outcomes were assessed. Results Three models were tested (mediation — cognitive improvements drive functioning improvement; moderation — post-treatment cognitive level affects the impact of CRT on functioning; moderated mediation — cognition drives functioning improvements only after a certain level is achieved). There was evidence of mediation (planning improvement associated with improved work quality). There was no evidence that cognitive flexibility (total Wisconsin Card Sorting Test errors) and working memory (Wechsler Adult Intelligence Scale III digit span) mediated work functioning despite significant effects. There was some evidence of moderated mediation for planning improvement if participants had poorer memory and/or made fewer WCST errors. The total CRT effect on work quality was d = 0.55, but the indirect (planning-mediated) CRT effect was d = 0.082. Conclusion Planning improvements led to better work quality but only accounted for a small proportion of the total effect on work outcome. Other specific and non-specific effects of CRT and the work programme are likely to account for some of the remaining effect. This is the first time such complex models have been tested, and future Phase 3 studies need to further test mediation and moderated mediation models. PMID:22503640
Improved cosine similarity measures of simplified neutrosophic sets for medical diagnoses.
Ye, Jun
2015-03-01
In pattern recognition and medical diagnosis, the similarity measure is an important mathematical tool. To overcome some disadvantages of existing cosine similarity measures of simplified neutrosophic sets (SNSs) in vector space, this paper proposes improved cosine similarity measures of SNSs based on the cosine function, including single-valued neutrosophic cosine similarity measures and interval neutrosophic cosine similarity measures. Weighted cosine similarity measures of SNSs are then introduced by taking into account the importance of each element. Further, a medical diagnosis method using the improved cosine similarity measures is proposed to solve medical diagnosis problems with simplified neutrosophic information. The improved measures are compared with existing cosine similarity measures of SNSs by numerical examples to demonstrate their effectiveness and rationality in overcoming some shortcomings of the existing measures in certain cases. In the medical diagnosis method, a proper diagnosis is found from the cosine similarity measures between the symptoms and the considered diseases, each represented by SNSs. The diagnosis method based on the improved cosine similarity measures is applied to two medical diagnosis problems to show its applicability and effectiveness. Both numerical examples demonstrated that the improved cosine similarity measures based on the cosine function can overcome the shortcomings of the existing cosine similarity measures between two vectors in some cases. In the two medical diagnosis problems, diagnoses using the various similarity measures of SNSs produced identical results, demonstrating the effectiveness and rationality of the proposed diagnosis method. The improved cosine measures of SNSs based on the cosine function can thus overcome some drawbacks of existing cosine similarity measures of SNSs in vector space, and the resulting method is well suited to handling medical diagnosis problems with simplified neutrosophic information. Copyright © 2014 Elsevier B.V. All rights reserved.
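For concreteness, here is a sketch of one published single-valued variant of a cosine-function-based similarity measure between SNSs, applied to a toy patient-versus-disease comparison. The exact functional form in the paper may differ; the profiles and weights below are illustrative assumptions.

```python
import numpy as np

def svn_cosine_similarity(A, B, w=None):
    """Cosine-function similarity between single-valued neutrosophic sets.

    A, B: arrays of shape (n, 3) holding (truth, indeterminacy, falsity)
    degrees in [0, 1] for n elements; w: optional weights summing to 1.
    Sketch of one published variant: cos(pi/2 * worst component difference).
    """
    A, B = np.asarray(A, float), np.asarray(B, float)
    d = np.max(np.abs(A - B), axis=1)       # worst-case membership gap per element
    s = np.cos(np.pi * d / 2.0)             # 1 when identical, 0 when maximally apart
    w = np.full(len(s), 1.0 / len(s)) if w is None else np.asarray(w, float)
    return float(np.sum(w * s))

# Symptom profile of a patient vs. a stored disease profile (toy numbers)
patient = [(0.8, 0.2, 0.1), (0.6, 0.3, 0.1), (0.2, 0.1, 0.8)]
disease = [(0.7, 0.2, 0.2), (0.7, 0.2, 0.1), (0.1, 0.2, 0.8)]
print(svn_cosine_similarity(patient, disease))
```

A diagnosis is then chosen by computing this score against each candidate disease profile and selecting the maximum.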
Rethinking the Quest for School Improvement: Some Findings from the DESSI Study.
ERIC Educational Resources Information Center
Huberman, A. Michael; Miles, Matthew B.
1984-01-01
A review of the Study of Dissemination Efforts Supporting School Improvement (DESSI) field study indicated a need for reorganization of the conceptual paradigms used to account for school improvement. Current paradigms do not account for the rational and conflict theories of social change. (DF)
An improved approximate-Bayesian model-choice method for estimating shared evolutionary history
2014-01-01
Background To understand biological diversification, it is important to account for large-scale processes that affect the evolutionary history of groups of co-distributed populations of organisms. Such events predict temporally clustered divergence times, a pattern that can be estimated using genetic data from co-distributed species. I introduce a new approximate-Bayesian method for comparative phylogeographical model-choice that estimates the temporal distribution of divergences across taxa from multi-locus DNA sequence data. The model is an extension of that implemented in msBayes. Results By reparameterizing the model, introducing more flexible priors on demographic and divergence-time parameters, and implementing a non-parametric Dirichlet-process prior over divergence models, I improved the robustness, accuracy, and power of the method for estimating shared evolutionary history across taxa. Conclusions The results demonstrate that the improved performance of the new method is due to (1) more appropriate priors on divergence-time and demographic parameters that avoid prohibitively small marginal likelihoods for models with more divergence events, and (2) the Dirichlet process providing a flexible prior on divergence histories that does not strongly disfavor models with intermediate numbers of divergence events. The new method yields more robust estimates of posterior uncertainty, and thus greatly reduces the tendency to incorrectly estimate models of shared evolutionary history with strong support. PMID:24992937
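The Dirichlet-process prior over divergence models can be sampled with the Chinese restaurant process. The sketch below (not the msBayes extension itself; the taxon count and concentration parameter are illustrative assumptions) draws partitions of taxon pairs into shared divergence-time classes and shows the induced prior on the number of divergence events, which stays diffuse rather than piling up on the extremes.

```python
import numpy as np

def sample_crp_partition(n_pairs, alpha, rng):
    """Chinese-restaurant-process draw of a divergence model: an assignment
    of n_pairs taxon pairs to shared divergence-time classes (sketch only;
    alpha is the Dirichlet-process concentration parameter)."""
    labels = [0]
    for i in range(1, n_pairs):
        counts = np.bincount(labels)
        probs = np.append(counts, alpha) / (i + alpha)   # join a class, or open a new one
        labels.append(rng.choice(len(probs), p=probs))
    return np.asarray(labels)

rng = np.random.default_rng(7)
# Induced prior over the number of divergence events among 9 co-distributed taxa
draws = [len(set(sample_crp_partition(9, alpha=1.5, rng=rng))) for _ in range(5000)]
print(np.bincount(draws)[1:] / 5000.0)
```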
Introducing visual participatory methods to develop local knowledge on HIV in rural South Africa
Brooks, Chloe; Kazimierczak, Karolina; Ngobeni, Sizzy; Twine, Rhian; Tollman, Stephen; Kahn, Kathleen; Byass, Peter
2017-01-01
Introduction South Africa is a country faced with complex health and social inequalities, in which HIV/AIDS has had devastating impacts. The study aimed to gain insights into the perspectives of rural communities on HIV-related mortality. Methods A participatory action research (PAR) process, inclusive of a visual participatory method (Photovoice), was initiated to elicit and organise local knowledge and to identify priorities for action in a rural subdistrict underpinned by the Agincourt Health and Socio-Demographic Surveillance System (HDSS). We convened three village-based discussion groups, presented HDSS data on HIV-related mortality, elicited subjective perspectives on HIV/AIDS, systematised these into collective accounts and identified priorities for action. Framework analysis was performed on narrative and visual data, and practice theory was used to interpret the findings. Findings A range of social and health systems factors were identified as causes and contributors of HIV mortality. These included alcohol use/abuse, gender inequalities, stigma around disclosure of HIV status, problems with informal care, poor sanitation, harmful traditional practices, delays in treatment, problems with medications and problematic staff–patient relationships. To address these issues, developing youth facilities in communities, improving employment opportunities, timely treatment and extending community outreach for health education and health promotion were identified. Discussion Addressing social practices of blame, stigma and mistrust around HIV-related mortality may be a useful focus for policy and planning. Research that engages communities and authorities to coproduce evidence can capture these practices, improve communication and build trust. Conclusion Actions to reduce HIV should go beyond individual agency and structural forces to focus on how social practices embody these elements. Initiating PAR inclusive of visual methods can build shared understandings of disease burdens in social and health systems contexts. This can develop shared accountability and improve staff–patient relationships, which, over time, may address the issues identified, here related to stigma and blame. PMID:29071128
Teaching Elementary Accounting to Non-Accounting Majors
ERIC Educational Resources Information Center
Lloyd, Cynthia B.; Abbey, Augustus
2009-01-01
A central recurring theme in business education is the optimal strategy for improving introductory accounting, the gateway subject of business education. For many students, especially non-accounting majors, who are required to take introductory accounting as a requirement of the curriculum, introductory accounting has become a major obstacle for…
NASA Astrophysics Data System (ADS)
Subotnik, Joseph E.; Shenvi, Neil
2011-06-01
Fewest-switches surface hopping (FSSH) is a popular nonadiabatic dynamics method which treats nuclei with classical mechanics and electrons with quantum mechanics. In order to simulate the motion of a wave packet as accurately as possible, standard FSSH requires a stochastic sampling of the trajectories over a distribution of initial conditions corresponding, e.g., to the Wigner distribution of the initial quantum wave packet. Although it is well known that FSSH does not properly account for decoherence effects, there is some confusion in the literature about whether or not this averaging over a distribution of initial conditions can approximate some of the effects of decoherence. In this paper, we not only show that averaging over initial conditions does not generally account for decoherence, but also show why it fails to do so. We also show how an apparent improvement in accuracy can be obtained for a fortuitous choice of model problems, even though this improvement is not possible in general. For a basic set of one-dimensional and two-dimensional examples, we find significantly improved results using our recently introduced augmented FSSH algorithm.
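The initial-condition sampling referred to here is standard practice and easy to sketch. For a Gaussian wave packet (the ground state of a harmonic well), the Wigner distribution is Gaussian in both position and momentum; the snippet below samples a trajectory swarm from it. Mass, frequency and packet center are illustrative values in atomic units, not parameters from the paper.

```python
import numpy as np

# Sampling FSSH initial conditions from the Wigner distribution of a
# Gaussian wave packet (harmonic-oscillator ground state; atomic units).
hbar, m, omega = 1.0, 2000.0, 0.004      # illustrative mass and frequency
x0, p0 = -5.0, 15.0                      # packet center in phase space

sigma_x = np.sqrt(hbar / (2.0 * m * omega))   # position spread of the packet
sigma_p = np.sqrt(hbar * m * omega / 2.0)     # momentum spread of the packet

rng = np.random.default_rng(0)
n_traj = 2000
xs = rng.normal(x0, sigma_x, n_traj)     # initial positions, one per trajectory
ps = rng.normal(p0, sigma_p, n_traj)     # initial momenta, one per trajectory

# Each (x, p) pair seeds one independent surface-hopping trajectory;
# observables are then averaged over the swarm.
print(xs.mean(), ps.mean())
```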
1983-09-27
Value Engineering Should Be Improved as Part of the Defense Department's... (U) General Accounting Office, Washington DC, Accounting and... AD-A134 967. Report by the U.S. General Accounting Office. ...monitoring subcontractor activity, DOD officials must take into account practical and cost considerations as well as legal constraints...
Trosman, Julia R.; Weldon, Christine B.; Douglas, Michael P.; Deverka, Patricia A.; Watkins, John; Phillips, Kathryn A.
2016-01-01
Background New payment and care organization approaches, such as the Accountable Care Organization (ACO), are reshaping accountability and shifting risk, as well as decision-making, from payers to providers, under the Triple Aim of health reform. The Triple Aim calls for improving experience of care, improving health of populations and reducing healthcare costs. In the era of accelerating scientific advancement of personalized medicine and other innovations, it is critical to understand how the transition to the ACO model impacts decision-making on adoption and utilization of innovative technologies. Methods We interviewed representatives from ten private payers and six provider institutions involved in implementing the ACO model (i.e. ACOs) to understand changes, challenges and facilitators of decision-making on medical innovations, including personalized medicine. We used the framework approach of qualitative research for study design and thematic analysis. Results We found that representatives from the participating payer companies and ACOs perceive similar challenges to ACOs’ decision-making in terms of achieving a balance between the components of the Triple Aim – improving care experience, improving population health and reducing costs. The challenges include the prevalence of cost over care quality considerations in ACOs’ decisions and ACOs’ insufficient analytical and technology assessment capacity to evaluate complex innovations such as personalized medicine. Decision-making facilitators included increased competition across ACOs and patients’ interest in personalized medicine. Conclusions As new payment models evolve, payers, ACOs and other stakeholders should address challenges and leverage opportunities to arm ACOs with robust, consistent, rigorous and transparent approaches to decision-making on medical innovations. PMID:28212967
Andronis, Lazaros; Barton, Pelham M
2016-04-01
Value of information (VoI) calculations give the expected benefits of decision making under perfect information (EVPI) or sample information (EVSI), typically on the premise that any treatment recommendations made in light of this information will be implemented instantly and fully. This assumption is unlikely to hold in health care; evidence shows that obtaining further information typically leads to "improved" rather than "perfect" implementation. To present a method of calculating the expected value of further research that accounts for the reality of improved implementation. This work extends an existing conceptual framework by introducing additional states of the world regarding information (sample information, in addition to current and perfect information) and implementation (improved implementation, in addition to current and optimal implementation). The extension allows calculating the "implementation-adjusted" EVSI (IA-EVSI), a measure that accounts for different degrees of implementation. Calculations of implementation-adjusted estimates are illustrated under different scenarios through a stylized case study in non-small cell lung cancer. In the particular case study, the population values for EVSI and IA-EVSI were £25 million and £8 million, respectively; thus, a decision assuming perfect implementation would have overestimated the expected value of research by about £17 million. IA-EVSI was driven by the assumed time horizon and, importantly, the specified rate of change in implementation: the higher the rate, the greater the IA-EVSI and the lower the difference between IA-EVSI and EVSI. Traditionally calculated measures of population VoI rely on unrealistic assumptions about implementation. This article provides a simple framework that accounts for improved, rather than perfect, implementation and offers more realistic estimates of the expected value of research. © The Author(s) 2015.
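The gap between EVSI and IA-EVSI can be sketched with a simple population calculation. Everything below is an illustrative assumption rather than the paper's case-study inputs: the exponential uptake curve, the current and optimal implementation levels, the discount rate, horizon and per-person EVSI.

```python
import numpy as np

def population_evsi(evsi_pp, n_per_year, horizon, disc, uptake):
    # Discounted sum of per-person EVSI, scaled by the implementation fraction
    years = np.arange(horizon)
    value = evsi_pp * n_per_year * uptake(years) / (1 + disc) ** years
    return value.sum()

evsi_pp, n_per_year, horizon, disc = 100.0, 25_000, 10, 0.035   # assumed inputs

# Perfect implementation: everyone switches immediately and fully
evsi = population_evsi(evsi_pp, n_per_year, horizon, disc,
                       lambda t: np.ones_like(t, float))

# Improved implementation: uptake grows from 20% toward 90% at rate r
r = 0.3
ia_evsi = population_evsi(evsi_pp, n_per_year, horizon, disc,
                          lambda t: 0.9 - (0.9 - 0.2) * np.exp(-r * t))

print(round(evsi), round(ia_evsi))   # perfect-implementation value vs adjusted value
```

Raising the rate r pushes the adjusted value toward the unadjusted one, mirroring the sensitivity reported in the abstract.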
DOE Office of Scientific and Technical Information (OSTI.GOV)
Minelli, Annalisa, E-mail: Annalisa.Minelli@univ-brest.fr; Marchesini, Ivan, E-mail: Ivan.Marchesini@irpi.cnr.it; Taylor, Faith E., E-mail: Faith.Taylor@kcl.ac.uk
Although there are clear economic and environmental incentives for producing energy from solar and wind power, there can be local opposition to their installation due to their impact upon the landscape. To date, no international guidelines exist to guide quantitative visual impact assessment of these facilities, making the planning process somewhat subjective. In this paper we demonstrate the development of a method and an Open Source GIS tool to quantitatively assess the visual impact of these facilities using line-of-site techniques. The methods here build upon previous studies by (i) more accurately representing the shape of energy producing facilities, (ii) taking into account the distortion of the perceived shape and size of facilities caused by the location of the observer, (iii) calculating the possible obscuring of facilities caused by terrain morphology and (iv) allowing the combination of various facilities to more accurately represent the landscape. The tool has been applied to real and synthetic case studies and compared to recently published results from other models, and demonstrates an improvement in accuracy of the calculated visual impact of facilities. The tool is named r.wind.sun and is freely available from GRASS GIS AddOns. - Highlights: • We develop a tool to quantify wind turbine and photovoltaic panel visual impact. • The tool is freely available to download and edit as a module of GRASS GIS. • The tool takes into account visual distortion of the shape and size of objects. • The accuracy of calculation of visual impact is improved over previous methods.
Cheung, C Y Maurice; Williams, Thomas C R; Poolman, Mark G; Fell, David A; Ratcliffe, R George; Sweetlove, Lee J
2013-09-01
Flux balance models of metabolism generally utilize synthesis of biomass as the main determinant of intracellular fluxes. However, the biomass constraint alone is not sufficient to predict realistic fluxes in central heterotrophic metabolism of plant cells because of the major demand on the energy budget due to transport costs and cell maintenance. This major limitation can be addressed by incorporating transport steps into the metabolic model and by implementing a procedure that uses Pareto optimality analysis to explore the trade-off between ATP and NADPH production for maintenance. This leads to a method for predicting cell maintenance costs on the basis of the measured flux ratio between the oxidative steps of the oxidative pentose phosphate pathway and glycolysis. We show that accounting for transport and maintenance costs substantially improves the accuracy of fluxes predicted from a flux balance model of heterotrophic Arabidopsis cells in culture, irrespective of the objective function used in the analysis. Moreover, when the new method was applied to cells under control, elevated temperature and hyper-osmotic conditions, only elevated temperature led to a substantial increase in cell maintenance costs. It is concluded that the hyper-osmotic conditions tested did not impose a metabolic stress, in as much as the metabolic network is not forced to devote more resources to cell maintenance. © 2013 The Authors The Plant Journal © 2013 John Wiley & Sons Ltd.
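The role of a fixed maintenance ATP drain in a flux balance model can be seen in a toy linear program. The stoichiometry, yields and bounds below are illustrative assumptions, far simpler than the Arabidopsis model, and the Pareto analysis of the ATP/NADPH trade-off is not reproduced; the point is only that charging maintenance against the ATP budget lowers predicted growth.

```python
import numpy as np
from scipy.optimize import linprog

# Rows: glucose, ATP, precursor. Columns: uptake, glycolysis-like reaction,
# biomass, maintenance drain, respiration. All coefficients are toy values.
S = np.array([[1, -1,  0,  0, -1],    # glucose balance
              [0,  2, -3, -1, 15],    # ATP balance
              [0,  1, -1,  0,  0]],   # precursor balance
             dtype=float)

def max_biomass(maintenance_atp):
    # Maximize biomass flux (column 2) subject to steady state S v = 0,
    # a glucose uptake cap, and a fixed maintenance ATP requirement.
    bounds = [(0, 10), (0, None), (0, None),
              (maintenance_atp, maintenance_atp), (0, None)]
    res = linprog(c=[0, 0, -1, 0, 0], A_eq=S, b_eq=np.zeros(3), bounds=bounds)
    return -res.fun

print(max_biomass(0.0), max_biomass(20.0))   # maintenance cost lowers predicted growth
```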
Improved methods for the measurement and analysis of stellar magnetic fields
NASA Technical Reports Server (NTRS)
Saar, Steven H.
1988-01-01
The paper presents several improved methods for the measurement of magnetic fields on cool stars which take into account simple radiative transfer effects and the exact Zeeman patterns. Using these methods, high-resolution, low-noise data can be fitted with theoretical line profiles to determine the mean magnetic field strength in stellar active regions and a model-dependent fraction of the stellar surface (filling factor) covered by these regions. Random errors in the derived field strength and filling factor are parameterized in terms of signal-to-noise ratio, wavelength, spectral resolution, stellar rotation rate, and the magnetic parameters themselves. Weak line blends, if left uncorrected, can have significant systematic effects on the derived magnetic parameters, and thus several methods are developed to compensate partially for them. The magnetic parameters determined by previous methods likely have systematic errors because of such line blends and because of line saturation effects. Other sources of systematic error are explored in detail. These sources of error currently make it difficult to determine the magnetic parameters of individual stars to better than about ±20 percent.
Sluiter, Amie; Sluiter, Justin; Wolfrum, Ed; ...
2016-05-20
Accurate and precise chemical characterization of biomass feedstocks and process intermediates is a requirement for successful technical and economic evaluation of biofuel conversion technologies. The uncertainty in primary measurements of the fraction insoluble solid (FIS) content of dilute acid pretreated corn stover slurry is the major contributor to uncertainty in yield calculations for enzymatic hydrolysis of cellulose to glucose. This uncertainty is propagated through process models and impacts modeled fuel costs. The challenge in measuring FIS is obtaining an accurate measurement of insoluble matter in the pretreated materials, while appropriately accounting for all biomass derived components. Three methods were tested to improve this measurement. One used physical separation of liquid and solid phases, and two utilized direct determination of dry matter content in two fractions. We offer a comparison of drying methods. Lastly, our results show utilizing a microwave dryer to directly determine dry matter content is the optimal method for determining FIS, based on the low time requirements and the method optimization done using model slurries.
NASA Astrophysics Data System (ADS)
Shaw, Stephen B.; Walter, M. Todd
2009-03-01
The Soil Conservation Service curve number (SCS-CN) method is widely used to predict storm runoff for hydraulic design purposes, such as sizing culverts and detention basins. As traditionally used, the probability of calculated runoff is equated to the probability of the causative rainfall event, an assumption that fails to account for the influence of variations in soil moisture on runoff generation. We propose a modification to the SCS-CN method that explicitly incorporates rainfall return periods and the frequency of different soil moisture states to quantify storm runoff risks. Soil moisture status is assumed to be correlated to stream base flow. Fundamentally, this approach treats runoff as the outcome of a bivariate process instead of dictating a 1:1 relationship between causative rainfall and resulting runoff volumes. Using data from the Fall Creek watershed in western New York and the headwaters of the French Broad River in the mountains of North Carolina, we show that our modified SCS-CN method improves frequency discharge predictions in medium-sized watersheds in the eastern United States in comparison to the traditional application of the method.
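For reference, the standard SCS-CN runoff equations that the modification builds on can be stated in a few lines. This is a minimal sketch (depths in inches); the proposed method would treat the soil-moisture state, here proxied crudely by the choice of CN, as a second random variable sampled jointly with rainfall rather than fixing a single CN per design storm.

```python
def scs_cn_runoff(P_in, CN):
    """SCS curve number runoff depth (inches): Q = (P - 0.2S)^2 / (P + 0.8S),
    with potential retention S = 1000/CN - 10; Q = 0 until P exceeds the
    initial abstraction Ia = 0.2S."""
    S = 1000.0 / CN - 10.0
    Ia = 0.2 * S
    return 0.0 if P_in <= Ia else (P_in - Ia) ** 2 / (P_in + 0.8 * S)

# Same 4-inch storm over drier to wetter antecedent conditions (toy CN values)
for CN in (60, 75, 90):
    print(CN, round(scs_cn_runoff(4.0, CN), 2))
```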
ERIC Educational Resources Information Center
General Accounting Office, Washington, DC.
This report discusses federally sponsored research at educational institutions and suggests ways to improve accountability for these funds. The following suggestions are made for minimizing problems presented in this report: (1) development of more definitive cost principles for both the institutions and the Federal auditors to follow; (2) more…
ERIC Educational Resources Information Center
van der Gaag, Jacques; Abetti, Pauline
2011-01-01
This policy brief outlines how national education accounts (NEAs) are created, and why they are a vast improvement over current financial tracking systems in the education sector. Examples from the health sector illustrate the benefits of national accounts for improving public services, and their ubiquity highlights the poor state of affairs of…
California School Accounting Manual. 1984 Edition.
ERIC Educational Resources Information Center
Lundin, Janet, Ed.
California's official school accounting procedures, amended in 1984 to clarify definitions and improve program cost accounting, are presented. Following an introduction that discusses general characteristics of school accounting, the manual explains the following areas of accounting practice: (1) financial reporting; (2) income; (3) expenditures;…
Fish Passage though Hydropower Turbines: Simulating Blade Strike using the Discrete Element Method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Richmond, Marshall C.; Romero Gomez, Pedro DJ
Among the hazardous hydraulic conditions affecting anadromous and resident fish during their passage through turbine flows, two are believed to cause considerable injury and mortality: collision on moving blades and decompression. Several methods are currently available to evaluate these stressors in installed turbines, i.e., using live fish or autonomous sensor devices, and in reduced-scale physical models, i.e., registering collisions from plastic beads. However, a priori estimates with computational modeling approaches applied early in the process of turbine design can facilitate the development of fish-friendly turbines. In the present study, we evaluated the frequency of blade strike and the nadir pressure environment by modeling potential fish trajectories with the Discrete Element Method (DEM) applied to fish-like composite particles. In the DEM approach, particles are subjected to realistic hydraulic conditions simulated with computational fluid dynamics (CFD), and particle-structure interactions—representing fish collisions with turbine blades—are explicitly recorded and accounted for in the calculation of particle trajectories. We conducted transient CFD simulations by setting the runner in motion and allowing for better turbulence resolution, a modeling improvement over the conventional practice of simulating the system in steady state, which was also done here. While both schemes yielded comparable bulk hydraulic performance, transient conditions exhibited a visual improvement in describing flow variability. We released streamtraces (steady flow solution) and DEM particles (transient solution) at the same location from which sensor fish (SF) have been released in field studies of the modeled turbine unit. The streamtrace-based results showed better agreement with SF data than the DEM-based nadir pressures did, because the former accounted for turbulent dispersion at the intake while the latter did not. However, the DEM-based strike frequency is more representative of blade-strike probability than the steady solution, mainly because DEM particles accounted for the full fish length, thus resolving (instead of modeling) the collision event.
Tiedeman, C.R.; Hill, M.C.; D'Agnese, F. A.; Faunt, C.C.
2003-01-01
Calibrated models of groundwater systems can provide substantial information for guiding data collection. This work considers using such models to guide hydrogeologic data collection for improving model predictions by identifying model parameters that are most important to the predictions. Identification of these important parameters can help guide collection of field data about parameter values and associated flow system features and can lead to improved predictions. Methods for identifying parameters important to predictions include prediction scaled sensitivities (PSS), which account for uncertainty on individual parameters as well as prediction sensitivity to parameters, and a new "value of improved information" (VOII) method presented here, which includes the effects of parameter correlation in addition to individual parameter uncertainty and prediction sensitivity. In this work, the PSS and VOII methods are demonstrated and evaluated using a model of the Death Valley regional groundwater flow system. The predictions of interest are advective transport paths originating at sites of past underground nuclear testing. Results show that for two paths evaluated the most important parameters include a subset of five or six of the 23 defined model parameters. Some of the parameters identified as most important are associated with flow system attributes that do not lie in the immediate vicinity of the paths. Results also indicate that the PSS and VOII methods can identify different important parameters. Because the methods emphasize somewhat different criteria for parameter importance, it is suggested that parameters identified by both methods be carefully considered in subsequent data collection efforts aimed at improving model predictions.
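The intuition behind prediction scaled sensitivities can be sketched with finite differences. The form below (prediction change per one-standard-deviation parameter change, expressed as a percentage of the prediction) is an illustrative assumption and not the exact statistic or the VOII method used in the study; the toy prediction function and parameter values are likewise invented.

```python
import numpy as np

def prediction_scaled_sensitivity(predict, theta, sd_theta, eps=1e-6):
    """Finite-difference sketch: percent change in a prediction caused by a
    one-standard-deviation change in each parameter. Illustrative form only."""
    z0 = predict(theta)
    pss = np.empty(len(theta))
    for j in range(len(theta)):
        step = eps * max(1.0, abs(theta[j]))
        tp = theta.copy()
        tp[j] += step
        dz_dtheta = (predict(tp) - z0) / step
        pss[j] = 100.0 * dz_dtheta * sd_theta[j] / abs(z0)
    return pss

# Toy "advective travel time" prediction from two log-conductivity parameters
predict = lambda th: 50.0 / np.exp(th[0]) + 20.0 / np.exp(th[1])
print(prediction_scaled_sensitivity(predict, np.array([0.5, -1.0]),
                                    sd_theta=np.array([0.3, 0.8])))
```

Parameters with large magnitudes here would be flagged as priorities for further field data collection.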
Mayne, Terence P; Paskaranandavadivel, Niranchan; Erickson, Jonathan C; O'Grady, Gregory; Cheng, Leo K; Angeli, Timothy R
2018-02-01
High-resolution mapping of gastrointestinal (GI) slow waves is a valuable technique for research and clinical applications. Interpretation of high-resolution GI mapping data relies on animations of slow wave propagation, but current methods remain rudimentary, pixelated electrode-activation animations. This study aimed to develop improved methods of visualizing high-resolution slow wave recordings that increase ease of interpretation. The novel method of "wavefront-orientation" interpolation was created to account for the planar movement of the slow wave wavefront, negate any need for distance calculations, remain robust for atypical wavefronts (i.e., dysrhythmias), and produce an appropriate interpolation boundary. The wavefront-orientation method determines the orthogonal wavefront direction and calculates interpolated values as the mean slow wave activation time (AT) of the pair of linearly adjacent electrodes along that direction. Stairstep upsampling increased smoothness and clarity. Animation accuracy of 17 human high-resolution slow wave recordings (64-256 electrodes) was verified by visual comparison to the prior method, showing a clear improvement in wave smoothness that enabled more accurate interpretation of propagation, as confirmed by an assessment of clinical applicability performed by eight GI clinicians. Quantitatively, the new method produced accurate interpolation values compared to experimental data (mean difference 0.02 ± 0.05 s) and was accurate when applied solely to dysrhythmic data (0.02 ± 0.06 s), both within the error in manual AT marking (mean 0.2 s). Mean interpolation processing time was 6.0 s per wave. These novel methods provide a validated visualization platform that will improve analysis of high-resolution GI mapping in research and clinical translation.
Formulating Spatially Varying Performance in the Statistical Fusion Framework
Landman, Bennett A.
2012-01-01
To date, label fusion methods have primarily relied either on global (e.g. STAPLE, globally weighted vote) or voxelwise (e.g. locally weighted vote) performance models. Optimality of the statistical fusion framework hinges upon the validity of the stochastic model of how a rater errs (i.e., the labeling process model). Hitherto, approaches have tended to focus on the extremes of potential models. Herein, we propose an extension to the STAPLE approach to seamlessly account for spatially varying performance by extending the performance level parameters to account for a smooth, voxelwise performance level field that is unique to each rater. This approach, Spatial STAPLE, provides significant improvements over state-of-the-art label fusion algorithms in both simulated and empirical data sets. PMID:22438513
Accounting for Laminar Run & Trip Drag in Supersonic Cruise Performance Testing
NASA Technical Reports Server (NTRS)
Goodsell, Aga M.; Kennelly, Robert A.
1999-01-01
An improved laminar run and trip drag correction methodology for supersonic cruise performance testing was derived. This method required more careful analysis of the flow visualization images which revealed delayed transition particularly on the inboard upper surface, even for the largest trip disks. In addition, a new code was developed to estimate the laminar run correction. Once the data were corrected for laminar run, the correct approach to the analysis of the trip drag became evident. Although the data originally appeared confusing, the corrected data are consistent with previous results. Furthermore, the modified approach, which was described in this presentation, extends prior historical work by taking into account the delayed transition caused by the blunt leading edges.
The basis function approach for modeling autocorrelation in ecological data.
Hefley, Trevor J; Broms, Kristin M; Brost, Brian M; Buderman, Frances E; Kay, Shannon L; Scharf, Henry R; Tipton, John R; Williams, Perry J; Hooten, Mevin B
2017-03-01
Analyzing ecological data often requires modeling the autocorrelation created by spatial and temporal processes. Many seemingly disparate statistical methods used to account for autocorrelation can be expressed as regression models that include basis functions. Basis functions also enable ecologists to modify a wide range of existing ecological models in order to account for autocorrelation, which can improve inference and predictive accuracy. Furthermore, understanding the properties of basis functions is essential for evaluating the fit of spatial or time-series models, detecting a hidden form of collinearity, and analyzing large data sets. We present important concepts and properties related to basis functions and illustrate several tools and techniques ecologists can use when modeling autocorrelation in ecological data. © 2016 by the Ecological Society of America.
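A small illustration of the basis-function idea follows: Gaussian temporal bases in a plain least-squares regression absorb smooth autocorrelation that a bare intercept model would leave in the residuals. The data, knot spacing and bandwidth are illustrative assumptions, not from any study cited in the abstract.

```python
import numpy as np

rng = np.random.default_rng(3)
t = np.linspace(0, 10, 200)
y = 1.5 + 0.4 * np.sin(t) + rng.normal(0, 0.2, t.size)   # smooth signal + noise

# Gaussian radial basis functions over time; their fitted combination models
# the temporal autocorrelation. Knot count and bandwidth are assumed choices.
knots = np.linspace(0, 10, 12)
B = np.exp(-0.5 * ((t[:, None] - knots[None, :]) / 1.0) ** 2)
X = np.column_stack([np.ones_like(t), B])                # intercept + basis columns

beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
print(resid.std())    # close to the true noise SD once autocorrelation is absorbed
```

The same construction extends to spatial coordinates, which is why so many spatial and time-series models can be written in this common regression form.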
NASA Astrophysics Data System (ADS)
Gvozdev, Alexander S.; Melentjev, Vladimir S.
2018-01-01
In the development of modern gas turbine engines, an urgent task is to improve reliability by preventing fatigue damage to rotor blades. Such damage is largely determined by the level of vibration stresses. In this paper, using the finite element method and transient analysis, we propose a method for calculating the damping characteristics of plates of the pressed-wire material "MR" around the root attachment of the compressor blades of a gas turbine engine. The contact interaction between the blades and the impeller disk is taken into account.
Improving Terminology Mapping in Clinical Text with Context-Sensitive Spelling Correction.
Dziadek, Juliusz; Henriksson, Aron; Duneld, Martin
2017-01-01
The mapping of unstructured clinical text to an ontology facilitates meaningful secondary use of health records but is non-trivial due to lexical variation and the abundance of misspellings in hurriedly produced notes. Here, we apply several spelling correction methods to Swedish medical text and evaluate their impact on SNOMED CT mapping; first in a controlled evaluation using medical literature text with induced errors, followed by a partial evaluation on clinical notes. It is shown that the best-performing method is context-sensitive, taking into account trigram frequencies and utilizing a corpus-based dictionary.
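The context-sensitive idea can be sketched as follows: generate dictionary candidates within one edit of the unknown token and rank them by the frequency of the surrounding word trigram in a reference corpus. The corpus, dictionary and input below are toy stand-ins; a real system would use large n-gram tables and a SNOMED CT mapping stage.

```python
from collections import Counter

corpus = "patient denies chest pain . patient reports chest pain on exertion".split()
trigrams = Counter(zip(corpus, corpus[1:], corpus[2:]))   # word-trigram counts
dictionary = {"chest", "chess", "crest"}

def within_one_edit(a, b):
    # True if a and b differ by at most one substitution, insertion or deletion
    if abs(len(a) - len(b)) > 1:
        return False
    if len(a) == len(b):
        return sum(x != y for x, y in zip(a, b)) <= 1
    short, long_ = (a, b) if len(a) < len(b) else (b, a)
    return any(short == long_[:i] + long_[i + 1:] for i in range(len(long_)))

def correct(prev_word, token, next_word):
    # Keep known words; otherwise rank candidates by trigram context frequency
    if token in dictionary:
        return token
    candidates = [w for w in dictionary if within_one_edit(w, token)]
    return max(candidates, key=lambda w: trigrams[(prev_word, w, next_word)],
               default=token)

print(correct("reports", "chst", "pain"))   # -> "chest"
```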
Wiens, J. David; Weekes, Anne
2011-01-01
A scientific study has determined that survey methods designed for spotted owls do not always detect barred owls that are actually present in spotted owl habitat. The researchers suggest that strategies to address potential interactions between spotted owls and barred owls will require carefully designed surveys that account for response behaviors and imperfect detection of both species. Species-specific sampling methods, which are proposed, can be used by forest managers to determine the occurrence and distribution of barred owls with high confidence. This fact sheet provides highlights of the research (Wiens and others, 2011).
Laboratory Diagnosis of Congenital Toxoplasmosis
Pomares, Christelle
2016-01-01
Recent studies have demonstrated that screening and treatment for toxoplasmosis during gestation result in a decrease of vertical transmission and clinical sequelae. Early treatment was associated with improved outcomes. Thus, laboratory methods should aim for early identification of infants with congenital toxoplasmosis (CT). Diagnostic approaches should include, at least, detection of Toxoplasma IgG, IgM, and IgA and a comprehensive review of maternal history, including the gestational age at which the mother was infected and treatment. Here, we review laboratory methods for the diagnosis of CT, with emphasis on serological tools. A diagnostic algorithm that takes into account maternal history is presented. PMID:27147724
Improving FMEA risk assessment through reprioritization of failures
NASA Astrophysics Data System (ADS)
Ungureanu, A. L.; Stan, G.
2016-08-01
Most of the current methods used to assess failure and to identify industrial equipment defects are based on the determination of the Risk Priority Number (RPN). Although the conventional RPN calculation is easy to understand and use, the methodology presents some limitations, such as the large number of duplicate values and the difficulty of assessing the RPN indices. In order to eliminate the aforementioned shortcomings, this paper puts forward an easy and efficient computing method, called Failure Developing Mode and Criticality Analysis (FDMCA), which takes into account failure and defect evolution over time, from first appearance to breakdown.
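The duplicate problem with the conventional RPN is easy to demonstrate. The sketch below uses the standard 1-10 severity, occurrence and detection scales (a convention, not data from the paper): many distinct score triples collapse onto the same RPN, so ranking by RPN alone cannot separate them.

```python
from itertools import product

# Classic risk priority number: severity x occurrence x detection, each 1-10
rpn = lambda s, o, d: s * o * d

scores = [rpn(s, o, d) for s, o, d in product(range(1, 11), repeat=3)]
print(len(scores), len(set(scores)))   # 1000 triples collapse to 120 distinct RPNs
```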
2017-03-01
Defense Logistics: Improved Performance Measures and Information Needed for Assessing Asset Visibility Initiatives. GAO-17-183, a report to congressional committees. United States Government Accountability Office, March 2017.
Winning performance improvement strategies--linking documentation and accounts receivable.
Braden, J H; Swadley, D
1996-01-01
When the HIM department at The University of Texas Medical Branch set out to improve documentation and accounts receivable management, it established a plan that encompassed a broad spectrum of data management process changes. The department examined and acknowledged the deficiencies in data management processes and used performance improvement tools to achieve successful results.
Improving State Accountability Systems for Postsecondary Vocational Education.
ERIC Educational Resources Information Center
Sheets, Robert G.
This paper makes recommendations for developing the next generation of state accountability systems for postsecondary vocational education (PVE). It focuses on the need to improve the core indicators for PVE; reduce the burden and improve the value of Carl D. Perkins Vocational and Applied Technology Education Amendments of 1998 (Perkins III)…
ERIC Educational Resources Information Center
Hawley, Willis D.; Valli, Linda
The National Partnership for Excellence in Education and Accountability in Teaching (NPEAT) helps place improvement of teaching at the center of reform efforts, addressing two problems that impede the development of systemic reforms to improve teaching quality: (1) absence of agreement about effective strategies for improving teaching among those…
Are hospital quality improvement and public accountability compatible?
Panzer, R J
1994-07-01
The goals of public accountability and quality improvement are compatible in theory but not necessarily in practice. Both concepts emphasize the customer. However, those working toward these two goals design systems with quite different roles and relationships between the providers and consumers of health care. Superficial interactions obstruct meaningful dialogue about how to build a better system meeting both sets of goals. Current practices of public accountability and quality improvement have fundamentally different paradigms concerning the roles and responsibilities of those who provide and those who consume health care. There are at least three ways to improve the current relationship between public accountability and quality improvement. First, optimizing the design and performance of each effort would be an improvement since the goals are highly compatible. Neither ideal currently meets its own expectations, creating distrust among the proponents of each when reality falls short. Second, the two efforts could be coordinated through joint community-level planning and sharing. Finally and optimally, the two concepts could be made part of the same community-level cooperative system, an approach that offers the greatest opportunity for achieving shared goals.
2012-01-01
This paper describes modifications to the main areas of state accounting for and control of radioactive substances and radioactive waste whose implementation would significantly improve the efficiency of the system's operation at the regional level. The selected areas are intended to improve the accounting and control system with respect to the reporting forms submitted by enterprises, the quality of the information contained in them, and the information structures and processes for collecting, analyzing and processing data on radioactive substances and waste.
A Strategy for Improving the Quality of Entry-Level Management Accountants.
ERIC Educational Resources Information Center
Kreuze, Jerry G.; Newell, Gale E.
1996-01-01
Reasons for a lack of management accounting topics in business education include the following: (1) already full curriculum; (2) shortage of qualified teachers (Certified Public Accounting faculty outnumber Certified Management Accounting faculty 7:1); and (3) students prefer public accounting. (SK)
Stockdale, Susan E; Zuchowski, Jessica; Rubenstein, Lisa V; Sapir, Negar; Yano, Elizabeth M; Altman, Lisa; Fickel, Jacqueline J; McDougall, Skye; Dresselhaus, Timothy; Hamilton, Alison B
Although the patient-centered medical home endorses quality improvement principles, methods for supporting ongoing, systematic primary care quality improvement have not been evaluated. We introduced primary care quality councils at six Veterans Health Administration sites as an organizational intervention with three key design elements: (a) fostering interdisciplinary quality improvement leadership, (b) establishing a structured quality improvement process, and (c) facilitating organizationally aligned frontline quality improvement innovation. Our evaluation objectives were to (a) assess design element implementation, (b) describe implementation barriers and facilitators, and (c) assess successful quality improvement project completion and spread. We analyzed administrative records and conducted interviews with 85 organizational leaders. We developed and applied criteria for assessing design element implementation using hybrid deductive/inductive analytic techniques. All quality councils implemented interdisciplinary leadership and a structured quality improvement process, and all but one completed at least one quality improvement project and a toolkit for spreading improvements. Quality councils were perceived as most effective when service line leaders had well-functioning interdisciplinary communication. Matching positions within leadership hierarchies with appropriate supportive roles facilitated frontline quality improvement efforts. Two key resources were (a) a dedicated internal facilitator with project management, data collection, and presentation skills and (b) support for preparing customized data reports for identifying and addressing practice level quality issues. Overall, quality councils successfully cultivated interdisciplinary, multilevel primary care quality improvement leadership with accountability mechanisms and generated frontline innovations suitable for spread. Practice level performance data and quality improvement project management support were critical. In order to successfully facilitate systematic, sustainable primary care quality improvement, regional and executive health care system leaders should engage interdisciplinary practice level leadership in a priority-setting process that encourages frontline innovation and establish local structures such as quality councils to coordinate quality improvement initiatives, ensure accountability, and promote spread of best practices.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-01
... DEPARTMENT OF THE TREASURY Internal Revenue Service 26 CFR Part 1 [TD 9534] RIN 1545-BD81 Methods... regulations relating to the methods of accounting, including the inventory methods, to be used by corporations... liquidations. These regulations clarify and simplify the rules regarding the accounting methods to be used...
Iterative refinement of implicit boundary models for improved geological feature reproduction
NASA Astrophysics Data System (ADS)
Martin, Ryan; Boisvert, Jeff B.
2017-12-01
Geological domains contain non-stationary features that cannot be described by a single direction of continuity. Non-stationary estimation frameworks generate more realistic curvilinear interpretations of subsurface geometries. A radial basis function (RBF) based implicit modeling framework using domain decomposition is developed that permits introduction of locally varying orientations and magnitudes of anisotropy for boundary models to better account for the local variability of complex geological deposits. The interpolation framework is paired with a method to automatically infer the locally predominant orientations, which results in a rapid and robust iterative non-stationary boundary modeling technique that can refine locally anisotropic geological shapes automatically from the sample data. The method also permits quantification of the volumetric uncertainty associated with the boundary modeling. The methodology is demonstrated on a porphyry dataset and shows improved local geological features.
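A stationary, stripped-down version of the implicit-boundary idea can be sketched with an off-the-shelf RBF interpolant: samples are coded +1 inside the domain and -1 outside, the interpolant is fitted to the codes, and the zero level set is read off as the boundary. The domain decomposition and locally varying anisotropy that are the paper's contribution are not reproduced here, and the "geology" is a toy disc.

```python
# A minimal sketch of RBF implicit boundary modelling (no local anisotropy).
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(0)
xy = rng.uniform(-1, 1, size=(200, 2))       # sample locations
inside = (xy**2).sum(axis=1) < 0.5           # toy domain: a disc
codes = np.where(inside, 1.0, -1.0)          # +1 in-domain, -1 out

model = RBFInterpolator(xy, codes, kernel="thin_plate_spline")

grid = np.stack(np.meshgrid(np.linspace(-1, 1, 50),
                            np.linspace(-1, 1, 50)), axis=-1).reshape(-1, 2)
field = model(grid)                          # implicit field; boundary at 0
print("estimated inside fraction:", (field > 0).mean())  # ~ pi*0.5/4 = 0.39
```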
NASA Astrophysics Data System (ADS)
Zou, Bin; Lu, Da; Wu, Zhilu; Qiao, Zhijun G.
2016-05-01
The results of model-based target decomposition are the main features used to discriminate urban and non-urban area in polarimetric synthetic aperture radar (PolSAR) application. Traditional urban-area extraction methods based on modelbased target decomposition usually misclassified ground-trunk structure as urban-area or misclassified rotated urbanarea as forest. This paper introduces another feature named orientation angle to improve urban-area extraction scheme for the accurate mapping in urban by PolSAR image. The proposed method takes randomness of orientation angle into account for restriction of urban area first and, subsequently, implements rotation angle to improve results that oriented urban areas are recognized as double-bounce objects from volume scattering. ESAR L-band PolSAR data of the Oberpfaffenhofen Test Site Area was used to validate the proposed algorithm.
NASA Astrophysics Data System (ADS)
Gibergans-Báguena, J.; Llasat, M. C.
2007-12-01
The objective of this paper is to present an improvement in the quantitative forecasting of daily rainfall in Catalonia (NE Spain) using an analogues technique that takes both synoptic and local data into account. The method is based on an analogues sorting technique: meteorological situations similar to the current one are sought, in terms of the 700 and 1000 hPa geopotential fields at 00 UTC, complemented with some thermodynamic parameters extracted from a historical data file. The thermodynamic analysis acts as a highly discriminating feature in situations where the synoptic situation fails to explain either the atmospheric phenomena or the rainfall distribution. This is the case in heavy rainfall situations, where the existence of instability and high water vapor content is essential. To include these vertical thermodynamic features, information provided by the Palma de Mallorca radiosounding (Spain) has been used. First, a selection of the thermodynamic parameters most discriminating for daily rainfall was made, and the analogues technique was then applied to them. Finally, three analog forecasting methods were applied to quantitative daily rainfall forecasting in Catalonia: the first is based on analogies in the geopotential fields at the synoptic scale; the second is based exclusively on a similarity search in the local thermodynamic information; and the third combines the other two. The results show that this last method provides a substantial improvement in quantitative rainfall estimation.
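The mechanics of analog forecasting reduce to a nearest-neighbour search in a feature space and an average over the rainfall of the retrieved analog days. The sketch below uses random stand-ins for the combined synoptic-plus-thermodynamic feature vectors; the feature choice, similarity measure and averaging are all far simpler than the paper's.

```python
# A minimal analog-forecast sketch: find the k most similar historical days
# and average their observed rainfall. Features and rainfall are synthetic.
import numpy as np
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(1)
n_days = 1000
features = rng.normal(size=(n_days, 6))   # e.g. geopotential PCs + CAPE, PW
rainfall = rng.gamma(2.0, 5.0, size=n_days)

nn = NearestNeighbors(n_neighbors=20).fit(features)
today = rng.normal(size=(1, 6))           # current situation, same features
_, idx = nn.kneighbors(today)
forecast = rainfall[idx[0]].mean()        # quantitative forecast from analogs
print(f"forecast daily rainfall: {forecast:.1f} mm")
```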
Improved Frequency Fluctuation Model for Spectral Line Shape Calculations in Fusion Plasmas
NASA Astrophysics Data System (ADS)
Ferri, S.; Calisti, A.; Mossé, C.; Talin, B.; Lisitsa, V.
2010-10-01
A very fast method to calculate spectral line shapes emitted by plasmas, accounting for charged-particle dynamics and the effects of an external magnetic field, is proposed. This method relies on a new formulation of the Frequency Fluctuation Model (FFM), which yields an expression for the dynamic line profile as a functional of the static frequency distribution function. This highly efficient formalism, not limited to hydrogen-like systems, allows pure Stark and Stark-Zeeman line shapes to be calculated for a wide range of density, temperature and magnetic field values, which is of importance in plasma physics and astrophysics. Various applications of this method are presented for conditions related to fusion plasmas.
The radiation environment of OSO missions from 1974 to 1978
NASA Technical Reports Server (NTRS)
Stassinopoulos, E. G.
1973-01-01
Trapped particle radiation levels on several OSO missions were calculated for nominal trajectories using improved computational methods and new electron environment models. Temporal variations of the electron fluxes were considered and partially accounted for. Magnetic field calculations were performed with a current field model and extrapolated to a later epoch with linear time terms. Orbital flux integration results, which are presented in graphical and tabular form, are analyzed, explained, and discussed.
Statistical Approach To Extraction Of Texture In SAR
NASA Technical Reports Server (NTRS)
Rignot, Eric J.; Kwok, Ronald
1992-01-01
An improved statistical method for the extraction of textural features in synthetic-aperture-radar (SAR) images takes account of the effects of the scheme used to sample the raw SAR data, system noise, the resolution of the radar equipment, and speckle. The treatment of speckle is incorporated into an overall statistical treatment of speckle, system noise, and natural variations in texture. The speckle autocorrelation function is computed from the system transfer function, which expresses the effect of the radar aperture and incorporates the range and azimuth resolutions.
Nozzle Flow with Vibrational Nonequilibrium. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Landry, John Gary
1995-01-01
Flow of nitrogen gas through a converging-diverging nozzle is simulated. The flow is modeled using the Navier-Stokes equations, modified for vibrational nonequilibrium. The energy equation is replaced by two equations: one accounts for energy effects due to the translational and rotational degrees of freedom, and the other accounts for the effects due to the vibrational degree of freedom. The energy equations are coupled by a relaxation time, which measures the time required for the vibrational energy component to equilibrate with the translational and rotational energy components. An improved relaxation time is used in this thesis. The equations are solved numerically using the Steger-Warming flux vector splitting method and the Implicit MacCormack method. The results show that uniform flow is produced outside of the boundary layer. Nonequilibrium exists in both the converging and diverging nozzle sections. The boundary layer region is characterized by a marked increase in translational-rotational temperature. The vibrational temperature remains frozen downstream of the nozzle, except in the boundary layer.
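The vibrational-translational coupling described above, reduced to a single fluid parcel, is the standard Landau-Teller relaxation equation dEv/dt = (Ev_eq(T) - Ev)/tau. The sketch below integrates that one equation in isolation; the thesis couples it to the full Navier-Stokes system, and the values used here are illustrative rather than nitrogen-accurate.

```python
# A minimal Landau-Teller relaxation sketch (assumed simplification of the
# vibrational energy equation; tau and T are placeholder values).
import numpy as np
from scipy.integrate import solve_ivp

THETA_V = 3371.0            # characteristic vibrational temperature of N2, K
R_N2 = 296.8                # specific gas constant of N2, J/(kg K)

def ev_eq(T):
    """Equilibrium vibrational energy per unit mass (harmonic oscillator)."""
    return R_N2 * THETA_V / (np.exp(THETA_V / T) - 1.0)

def rhs(t, Ev, T=6000.0, tau=1e-4):
    return (ev_eq(T) - Ev) / tau   # relax toward equilibrium at rate 1/tau

sol = solve_ivp(rhs, (0.0, 1e-3), [0.0])
print(f"Ev after 1 ms: {sol.y[0, -1]:.0f} J/kg (equilibrium: {ev_eq(6000.0):.0f})")
```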
NASA Technical Reports Server (NTRS)
Pototzky, Anthony S; Murphy, Patrick C.
2014-01-01
Improving aerodynamic models for adverse loss-of-control conditions in flight is an area being researched under the NASA Aviation Safety Program. Aerodynamic models appropriate for loss-of-control conditions require a more general mathematical representation to predict nonlinear unsteady behaviors. As more general aerodynamic models that include nonlinear higher-order effects are studied, measurements that confound aerodynamic and structural responses become probable. In this study an initial step is taken to include structural flexibility in the analysis of rigid-body forced-oscillation testing, accounting for dynamic rig, sting and balance flexibility. Because of the significant testing required and the associated costs of a general study, it makes sense to capitalize on low-cost analytical methods where possible, especially where structural flexibility can be accounted for by a low-cost method. This paper provides an initial look at applying linear lifting-surface theory to rigid-body aircraft roll forced-oscillation tests.
Public reporting of cost and quality information in orthopaedics.
Marjoua, Youssra; Butler, Craig A; Bozic, Kevin J
2012-04-01
Public reporting of patient health outcomes offers the potential to incentivize quality improvement by fostering increased accountability among providers. Voluntary reporting of risk-adjusted outcomes in cardiac surgery, for example, is viewed as a "watershed event" in healthcare accountability. However, public reporting of outcomes, cost, and quality information in orthopaedic surgery remains limited by comparison, attributable in part to the lack of standard assessment methods and metrics, provider fear of inadequate adjustment of health outcomes for patient characteristics (risk adjustment), and historically weak market demand for this type of information. We review the origins of public reporting of outcomes in surgical care, identify existing initiatives specific to orthopaedics, outline the challenges and opportunities, and propose recommendations for public reporting of orthopaedic outcomes. We performed a comprehensive review of the literature through a bibliographic search of MEDLINE and Google Scholar databases from January 1990 to December 2010 to identify articles related to public reporting of surgical outcomes. Orthopaedic-specific quality reporting efforts include the early FDA adverse event reporting MedWatch program and the involvement of surgeons in the Physician Quality Reporting Initiative. Issues that require more work include balancing different stakeholder perspectives on quality reporting measures and methods, defining accountability and attribution for outcomes, and appropriately risk-adjusting outcomes. Given the current limitations associated with public reporting of quality and cost in orthopaedic surgery, valuable contributions can be made in developing specialty-specific evidence-based performance measures. We believe through leadership and involvement in policy formulation and development, orthopaedic surgeons are best equipped to accurately and comprehensively inform the quality reporting process and its application to improve the delivery and outcomes of orthopaedic care.
Salloch, Sabine; Schildmann, Jan; Vollmann, Jochen
2012-04-13
The methodology of medical ethics during the last few decades has shifted from a predominant use of normative-philosophical analyses to an increasing involvement of empirical methods. The articles which have been published in the course of this so-called 'empirical turn' can be divided into conceptual accounts of empirical-normative collaboration and studies which use socio-empirical methods to investigate ethically relevant issues in concrete social contexts. A considered reference to normative research questions can be expected from good quality empirical research in medical ethics. However, a significant proportion of empirical studies currently published in medical ethics lacks such linkage between the empirical research and the normative analysis. In the first part of this paper, we will outline two typical shortcomings of empirical studies in medical ethics with regard to a link between normative questions and empirical data: (1) The complete lack of normative analysis, and (2) cryptonormativity and a missing account with regard to the relationship between 'is' and 'ought' statements. Subsequently, two selected concepts of empirical-normative collaboration will be presented and how these concepts may contribute to improve the linkage between normative and empirical aspects of empirical research in medical ethics will be demonstrated. Based on our analysis, as well as our own practical experience with empirical research in medical ethics, we conclude with a sketch of concrete suggestions for the conduct of empirical research in medical ethics. High quality empirical research in medical ethics is in need of a considered reference to normative analysis. In this paper, we demonstrate how conceptual approaches of empirical-normative collaboration can enhance empirical research in medical ethics with regard to the link between empirical research and normative analysis. PMID:22500496
Xie, Yaoqin; Chao, Ming; Xing, Lei
2009-01-01
Purpose To report a tissue feature-based image registration strategy with explicit inclusion of the differential motions of thoracic structures. Methods and Materials The proposed technique started with auto-identification of a number of corresponding points with distinct tissue features. The tissue feature points were found by using the scale-invariant feature transform (SIFT) method. The control point pairs were then sorted into different “colors” according to the organs in which they reside and used to model the involved organs individually. A thin-plate spline (TPS) method was used to register a structure characterized by the control points with a given “color”. The proposed technique was applied to study a digital phantom case and three lung and three liver cancer patients. Results For the phantom case, a comparison with the conventional TPS method showed that the registration accuracy was markedly improved when the differential motions of the lung and chest wall were taken into account. On average, the registration error and the standard deviation (SD) of the 15 points against the known ground truth were reduced from 3.0 mm to 0.5 mm and from 1.5 mm to 0.2 mm, respectively, when the new method was used. A similar level of improvement was achieved for the clinical cases. Conclusions The segmented deformable approach provides a natural and logical solution for modeling discontinuous organ motions and greatly improves the accuracy and robustness of deformable registration. PMID:19545792
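The "colored" control-point idea can be sketched compactly: feature-point pairs are grouped by organ and a separate thin-plate-spline warp is fitted per group, so each structure deforms independently. The point data below are synthetic stand-ins for real SIFT matches, and the per-organ shifts are invented.

```python
# A minimal per-organ TPS registration sketch using scipy's thin-plate-
# spline RBF; organ names, shifts and point sets are hypothetical.
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(2)

def fit_tps(src, dst):
    """One TPS warp per organ, mapping source points to target points."""
    return RBFInterpolator(src, dst, kernel="thin_plate_spline")

warps = {}
for organ, shift in [("lung", (0.0, 8.0)), ("chest_wall", (0.0, 1.0))]:
    src = rng.uniform(0, 100, size=(30, 2))            # control points (mm)
    dst = src + shift + rng.normal(0, 0.2, src.shape)  # differential motion
    warps[organ] = fit_tps(src, dst)

query = np.array([[50.0, 50.0]])
for organ, warp in warps.items():
    print(organ, "maps (50, 50) to", np.round(warp(query)[0], 1))
```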
26 CFR 1.481-4 - Adjustments taken into account with consent.
Code of Federal Regulations, 2010 CFR
2010-04-01
... effecting a change in method of accounting, including the taxable year or years in which the amount of the... Commissioner's consent to a change in method of accounting. (b) An agreement to the terms and conditions of a change in method of accounting under § 1.446-1(e)(3), including the taxable year or years prescribed by...
Remote sensing of soil moisture content over bare fields at 1.4 GHz frequency
NASA Technical Reports Server (NTRS)
Wang, J. R.; Choudhury, B. J.
1980-01-01
A simple method of estimating moisture content (W) of a bare soil from the observed brightness temperature (T_B) at 1.4 GHz is discussed. The method is based on a radiative transfer model calculation, which has been successfully used in the past to account for many observational results, with some modifications to take into account the effect of surface roughness. Besides the measured T_B's, the three additional inputs required by the method are the effective soil thermodynamic temperature, the precise relation between W and the smooth field brightness temperature T_B and a parameter specifying the surface roughness characteristics. The soil effective temperature can be readily measured and the procedures of estimating surface roughness parameter and obtaining the relation between W and smooth field brightness temperature are discussed in detail. Dual polarized radiometric measurements at an off-nadir incident angle are sufficient to estimate both surface roughness parameter and W, provided that the relation between W and smooth field brightness temperature at the same angle is known. The method of W estimate is demonstrated with two sets of experimental data, one from a controlled field experiment by a mobile tower and the other, from aircraft overflight. The results from both data sets are encouraging when the estimated W's are compared with the acquired ground truth of W's in the top 2 cm layer. An offset between the estimated and the measured W's exists in the results of the analyses, but that can be accounted for by the presently poor knowledge of the relationship between W and smooth field brightness temperature for various types of soils. An approach to quantify this relationship for different soils and thus improve the method of W estimate is suggested.
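The inversion chain outlined above can be sketched in a few lines: T_B and the effective soil temperature give an emissivity, a roughness parameter h corrects the reflectivity (a common exponential damping simplification), and an assumed linear relation between smooth-field emissivity and W is inverted. The coefficients a, b and h below are hypothetical placeholders, not values from the paper.

```python
# A minimal sketch of estimating W from T_B, under an assumed linear
# emissivity-moisture relation e_smooth = a + b * W.
import numpy as np

def estimate_w(tb, t_eff, h, a=0.95, b=-0.6):
    refl_rough = 1.0 - tb / t_eff           # observed rough-field reflectivity
    refl_smooth = refl_rough / np.exp(-h)   # undo the roughness damping
    e_smooth = 1.0 - refl_smooth
    return (e_smooth - a) / b               # invert the linear relation

print(f"estimated W = {estimate_w(tb=250.0, t_eff=295.0, h=0.3):.2f}"
      " (fractional moisture)")
```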
Strengthening the management of ESA - the Inter-Directorate Reform of Corporate and Risk Management
NASA Astrophysics Data System (ADS)
Feustel-Büechl, Jörg; Arend, Harald; Derio, Eric; Infante, Giovanni; Kreiner, Gerhard; Phaler, Jesse; Tabbert, Michael
2007-02-01
ESA has undertaken the Inter-Directorate Reform of Corporate and Risk Management to strengthen the Agency's internal operations. The reform was completed at the end of 2006, encompassing five dedicated projects on Risk Management, Agency-Wide Controlling System, Project Plan and Integrated Project Review, General Budget Structure and Charging Policy, and Corporate Information Systems. It has contributed to improved management of the Agency's internal operations by engaging all internal stakeholders in a common objective, introducing improvements to planning and management methods, elaborating consolidated information structures and tools, contributing to enhanced transparency and accountability, and by providing qualified new policies, processes and tools.
Deep Compaction Control of Sandy Soils
NASA Astrophysics Data System (ADS)
Bałachowski, Lech; Kurek, Norbert
2015-02-01
Vibroflotation, vibratory compaction, micro-blasting and heavy tamping are typical improvement methods for thick cohesionless deposits. The complex mechanism of deep soil compaction involves void ratio decrease with grain rearrangement, lateral stress increase, the prestressing effect of a certain number of load cycles, water pressure dissipation, aging and other effects. Calibration-chamber-based interpretation of CPTU/DMT can be used to take vertical and horizontal stress and void ratio effects into account. Some examples of the interpretation of soundings in pre-treated and compacted sands are given, and some acceptance criteria for compaction control are discussed. The improvement factors are analysed, including the normalised approach based on the soil behaviour type index.
MacPhee, Maura
2002-12-01
The following article is an example of evidence-based practice applied to an institutional Quality Improvement (QI) project. QI originated in the 1980s and is best associated with the work of W. Deming (1986). It is also known as Continuous Quality Improvement, because a major principle of this approach is the constant improvement of services or products. This improvement process contains other critical components: scientific method, employee participation and teamwork, accountable leadership, appropriate training and ongoing education, and client focus (Deming, 1986). QI has been globally successful and has helped transform American industry, including health care services. The following clinically based project illustrates the application of QI concepts and evidence-based practice to enhance outcomes.
Improving the performance of a filling line based on simulation
NASA Astrophysics Data System (ADS)
Jasiulewicz-Kaczmarek, M.; Bartkowiak, T.
2016-08-01
The paper describes a method of improving the performance of a filling line based on simulation. The study concerns a production line located in a manufacturing centre of an FMCG company. A discrete event simulation model was built using data provided by a maintenance data acquisition system. Two types of failures were identified in the system and approximated using continuous statistical distributions. The model was validated against line performance measures. A brief Pareto analysis of line failures was conducted to identify potential areas of improvement. Two improvement scenarios were proposed and tested via simulation, and the outcomes of the simulations formed the basis of a financial analysis. NPV and ROI values were calculated taking into account depreciation, profits, losses, the current CIT rate and inflation. A validated simulation model can be a useful tool in the maintenance decision-making process.
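The financial screening step generalizes well beyond this case study: each scenario's cash flows are discounted to a net present value and compared against the investment. The sketch below uses hypothetical figures and a deliberately simplified tax and inflation treatment, not the paper's numbers.

```python
# A minimal NPV/ROI sketch for an improvement scenario (all values invented).
def npv(rate, cashflows):
    """Discount a series of yearly cash flows, year 0 first."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

investment = 120_000.0                    # scenario cost at year 0
yearly_gain = 45_000.0                    # extra profit from higher throughput
cit_rate, inflation = 0.19, 0.02
real_gain = yearly_gain * (1 - cit_rate)  # after corporate income tax

flows = [-investment] + [real_gain / (1 + inflation) ** t for t in range(1, 6)]
roi = (sum(flows[1:]) - investment) / investment
print(f"NPV at 8% = {npv(0.08, flows):,.0f}, ROI = {roi:.1%}")
```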
Code of Federal Regulations, 2010 CFR
2010-04-01
...) generally constitutes the use of an impermissible method of accounting, requiring a change to a permissible...)(i). (ii) Change in method of accounting; adoption of method of accounting—(A) In general. The annual... change to or from either of these methods is a change in method of accounting that requires the consent...
[Supercomputer investigation of the protein-ligand system low-energy minima].
Oferkin, I V; Sulimov, A V; Katkova, E V; Kutov, D K; Grigoriev, F V; Kondakova, O A; Sulimov, V B
2015-01-01
The accuracy of protein-ligand binding energy calculations and ligand positioning is strongly influenced by the choice of the docking target function. This work evaluates five different target functions used in docking: functions based on the MMFF94 force field and functions based on the PM7 quantum-chemical method, with or without accounting for an implicit solvent model (PCM, COSMO or SGB). For these purposes, the ligand positions corresponding to the minima of the target function were compared with the experimentally known ligand positions in the protein active site (crystal ligand positions). Each function was examined on the same test set of 16 protein-ligand complexes. A new parallelized docking program, FLM, based on a Monte Carlo search algorithm was developed to perform a comprehensive low-energy minima search and to calculate the protein-ligand binding energy. The study demonstrates that the docking target function based on the MMFF94 force field can be used to detect crystal or near-crystal positions of the ligand by finding the low-energy local-minima spectrum of the target function. The importance of accounting for solvent in the docking process for accurate ligand positioning is also shown. The accuracy of ligand positioning, as well as the correlation between calculated and experimentally determined protein-ligand binding energies, improves when the MMFF94 force field is replaced by the new PM7 method with implicit solvent accounting.
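The generic engine behind such a low-energy minima search is Metropolis Monte Carlo: propose a random move, always accept downhill moves, and accept uphill moves with Boltzmann probability. The sketch below runs that loop on a toy one-dimensional "energy" with several local minima; the real program searches ligand poses against force-field or quantum-chemical target functions.

```python
# A minimal Metropolis Monte Carlo sketch for sampling low-energy states
# of a toy landscape (illustrative only; not the FLM program).
import math, random

def energy(x):
    return 0.1 * x**2 + math.sin(3.0 * x)   # toy multi-minimum landscape

random.seed(0)
x, T, visited = 0.0, 1.0, []
for step in range(20_000):
    x_new = x + random.gauss(0.0, 0.5)       # random trial move
    dE = energy(x_new) - energy(x)
    if dE < 0 or random.random() < math.exp(-dE / T):
        x = x_new                            # Metropolis acceptance rule
    if step % 1000 == 0:
        visited.append((round(energy(x), 2), round(x, 2)))

# Sorted low-energy snapshots approximate the local-minima spectrum.
print(sorted(set(visited))[:5])
```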
26 CFR 1.466-2 - Special protective election for certain taxpayers.
Code of Federal Regulations, 2010 CFR
2010-04-01
... method of accounting reasonably similar to the method described in § 1.451-4, to elect to treat that method of accounting as a proper one for those prior years. There are several differences between this... protective election (if treated as deductible under the accounting method for such years), even though such...
Using the Significant Learning Taxonomy and Active Learning to Improve Accounting Education
ERIC Educational Resources Information Center
Killian, Larita J.; Brandon, Christopher D.
2009-01-01
Like other members of the academy, accounting professors are challenged to improve student learning. We must help students move beyond the "bean counter" role and develop higher-level skills such as analysis, synthesis, and problem-solving. The Significant Learning Taxonomy was used as a template to improve learning in an introductory accounting…
ERIC Educational Resources Information Center
Glover, Todd A.; Reddy, Linda A.; Kettler, Ryan J.; Kunz, Alexander; Lekwa, Adam J.
2016-01-01
The accountability movement and high-stakes testing fail to attend to ongoing instructional improvements based on the regular assessment of student skills and teacher practices. Summative achievement data used for high-stakes accountability decisions are collected too late in the school year to inform instruction. This is especially problematic…
ERIC Educational Resources Information Center
Dano, Trine; Stensaker, Bjorn
2007-01-01
The role and function of external quality assurance is of great importance for the development of an internal quality culture in higher education. Research has shown that external quality assurance can stimulate but also create obstacles for institutional improvement. To strike a balance between improvement and accountability is, therefore, a key…
Ensemble stacking mitigates biases in inference of synaptic connectivity.
Chambers, Brendan; Levy, Maayan; Dechery, Joseph B; MacLean, Jason N
2018-01-01
A promising alternative to directly measuring the anatomical connections in a neuronal population is inferring the connections from the activity. We employ simulated spiking neuronal networks to compare and contrast commonly used inference methods that identify likely excitatory synaptic connections using statistical regularities in spike timing. We find that simple adjustments to standard algorithms improve inference accuracy: A signing procedure improves the power of unsigned mutual-information-based approaches and a correction that accounts for differences in mean and variance of background timing relationships, such as those expected to be induced by heterogeneous firing rates, increases the sensitivity of frequency-based methods. We also find that different inference methods reveal distinct subsets of the synaptic network and each method exhibits different biases in the accurate detection of reciprocity and local clustering. To correct for errors and biases specific to single inference algorithms, we combine methods into an ensemble. Ensemble predictions, generated as a linear combination of multiple inference algorithms, are more sensitive than the best individual measures alone, and are more faithful to ground-truth statistics of connectivity, mitigating biases specific to single inference methods. These weightings generalize across simulated datasets, emphasizing the potential for the broad utility of ensemble-based approaches.
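The stacking step itself is a linear combination of per-pair scores learned against simulated ground truth. The sketch below uses random stand-ins for the individual inference scores (e.g. signed mutual information or rate-corrected frequency measures) and ordinary logistic regression as the combiner; it shows the qualitative effect, not the paper's pipeline.

```python
# A minimal ensemble-stacking sketch for connectivity inference
# (synthetic scores; the combiner is plain logistic regression).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(3)
n_pairs = 5000
connected = rng.random(n_pairs) < 0.1          # simulated ground truth

# Three "inference methods", each a noisy view of true connectivity.
scores = np.column_stack([connected + rng.normal(0, s, n_pairs)
                          for s in (0.8, 1.0, 1.3)])

train, test = slice(0, 2500), slice(2500, None)
stack = LogisticRegression().fit(scores[train], connected[train])
ensemble = stack.predict_proba(scores[test])[:, 1]

print("best single AUC:", roc_auc_score(connected[test], scores[test, 0]))
print("ensemble AUC:   ", roc_auc_score(connected[test], ensemble))
```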
NASA Astrophysics Data System (ADS)
Haslauer, C. P.; Allmendinger, M.; Gnann, S.; Heisserer, T.; Bárdossy, A.
2017-12-01
The basic problem of geostatistics is to estimate the primary variable (e.g. groundwater quality, nitrate) at an unsampled location based on point measurements at locations in the vicinity. Typically, models are used that describe the spatial dependence based on the geometry of the observation network. This presentation demonstrates methods that additionally take the following properties into account: the statistical distribution of the measurements, a different degree of dependence in different quantiles, censored measurements, the composition of categorical additional information in the neighbourhood (exhaustive secondary information), and the spatial dependence of a dependent secondary variable, possibly measured with a different observation network (non-exhaustive secondary data). Two modelling approaches are demonstrated individually and combined. First, the non-stationarity in the marginal distribution is accounted for by locally mixed distribution functions that depend on the composition of the categorical variable in the neighbourhood of each interpolation location; this methodology is currently being implemented for operational use at the environmental state agency of Baden-Württemberg. Second, an alternative to co-kriging in copula space with an arbitrary number of secondary parameters is presented; the method performs better than traditional techniques if the primary variable is undersampled and does not produce erroneous negative estimates. Moreover, the quality of the uncertainty estimates is much improved, and the worth of the secondary information is thoroughly evaluated. The improved geostatistical hydrogeological models are analyzed using measurements from a large observation network (approximately 2500 measurement locations) in the state of Baden-Württemberg (approximately 36,000 km²). Typical groundwater quality parameters such as nitrate, chloride, barium, atrazine, and desethylatrazine are assessed, cross-validated, and compared with traditional geostatistical methods. The secondary information on land use is available on a 30 m x 30 m raster. We show that the presented methods are not only better estimators (e.g. in the sense of an average quadratic error) but also exhibit a much more realistic structure of the uncertainty, and hence are improvements over existing methods.
Emergy assessment of ecological compensation of groundwater overexploitation in Xuchang city
NASA Astrophysics Data System (ADS)
Lv, C.; Ling, M.; Cao, Q.; Guo, X.
2017-12-01
In the past 30 years, the amount of groundwater extracted in China has increased at a rate of 2.5 billion m3 per year. This growth has led to predatory exploitation in many areas and caused serious problems such as land subsidence, sea water intrusion, surface runoff reduction, vegetation decline, and groundwater pollution. Ecological compensation for overexploitation has become an important means of adjusting the distribution of environmental benefits related to the groundwater system and of alleviating groundwater overexploitation. Based on emergy value theory and analysis methods from ecological economics, a method is established for calculating the emergy loss value of the eco-environmental problems caused by groundwater overexploitation, such as land subsidence (collapse), salt (sea) water intrusion, surface runoff reduction, vegetation deterioration and groundwater pollution, and an assessment method that takes the emergy loss value as the quantity of ecological compensation for groundwater overexploitation is put forward. This method reflects the degree of loss from groundwater overexploitation more intuitively, and it helps to manage and restore the problems caused by overexploitation, to construct a scientific and reasonable groundwater ecological compensation mechanism, and to provide ecological security for the sustainable and healthy development of the national economy. Taking Xuchang city as an application example, the ecological economic loss from groundwater overexploitation was 109 million in 2015, accounting for 0.3% of the city's GDP. Among the components, the loss from land subsidence was the largest, at 77 million (70.3% of the total), followed by the loss from surface runoff reduction, at 27 million (24.7% of the total); the loss from groundwater pollution was the smallest, at 5 million (only 5% of the total). In sum, ground subsidence is the most serious of the ecological environment problems caused by groundwater overexploitation in Xuchang.
Rogers, Anne; Huxley, Peter; Evans, Sherrill; Gately, Claire
2008-05-01
Urban regeneration initiatives are considered to be one means of contributing to improvements in people's quality of life and mental health. This paper considers the relationship between lay perceptions of locality adversity, mental health and social capital in an area undergoing urban regeneration. Using qualitative methods as part of a larger multi-method study, perceptions of material and non-material aspects of the locality, and the ways in which people vulnerable to mental health problems coped with living in adversity, were identified as being more highly valued than intended or actual changes to structural elements such as the provision of housing or employment. Themes derived from narrative accounts included concerns about the absence of social control in the locality, the reputation of the area, a lack of faith in local agencies to make changes considered important by local residents, a reliance on personal coping strategies to manage adversity, and perceived threats to mental health which reinforced a sense of social isolation. We suggest these elements are implicated in restricting opportunities and enhancing feelings of 'entrapment', contributing to low levels of local collective efficacy. The gap between social capital capacity at an individual level and links with collective community resources may in part account for the absence of improvements in mental health during the early life of the urban regeneration initiative. In order to enhance quality of life or mental health, agencies involved in urban initiatives need as a basic minimum to promote security, increase leisure opportunities, and improve the image of the locality.
NASA Astrophysics Data System (ADS)
Mendizabal, A.; González-Díaz, J. B.; San Sebastián, M.; Echeverría, A.
2016-07-01
This paper describes the implementation of a simple strategy adopted for the inherent shrinkage method (ISM) to predict welding-induced distortion. This strategy not only makes it possible for the ISM to reach accuracy levels similar to the detailed transient analysis method (considered the most reliable technique for calculating welding distortion) but also significantly reduces the time required for these types of calculations. This strategy is based on the sequential activation of welding blocks to account for welding direction and transient movement of the heat source. As a result, a significant improvement in distortion prediction is achieved. This is demonstrated by experimentally measuring and numerically analyzing distortions in two case studies: a vane segment subassembly of an aero-engine, represented with 3D-solid elements, and a car body component, represented with 3D-shell elements. The proposed strategy proves to be a good alternative for quickly estimating the correct behaviors of large welded components and may have important practical applications in the manufacturing industry.
Need for Improved Methods to Collect and Present Spatial Epidemiologic Data for Vectorborne Diseases
Eisen, Rebecca J.
2007-01-01
Improved methods for collection and presentation of spatial epidemiologic data are needed for vectorborne diseases in the United States. Lack of reliable data for probable pathogen exposure site has emerged as a major obstacle to the development of predictive spatial risk models. Although plague case investigations can serve as a model for how to ideally generate needed information, this comprehensive approach is cost-prohibitive for more common and less severe diseases. New methods are urgently needed to determine probable pathogen exposure sites that will yield reliable results while taking into account economic and time constraints of the public health system and attending physicians. Recent data demonstrate the need for a change from use of the county spatial unit for presentation of incidence of vectorborne diseases to more precise ZIP code or census tract scales. Such fine-scale spatial risk patterns can be communicated to the public and medical community through Web-mapping approaches. PMID:18258029
Atela, Martin; Bakibinga, Pauline; Ettarh, Remare; Kyobutungi, Catherine; Cohn, Simon
2015-12-04
Enhancing accountability in health systems is increasingly emphasised as crucial for improving the nature and quality of health service delivery worldwide and particularly in developing countries. Accountability mechanisms include, among others, health facilities committees, suggestion boxes, facility and patient charters. However, there is a dearth of information regarding the nature of and factors that influence the performance of accountability mechanisms, especially in developing countries. We examine community members' experiences of one such accountability mechanism, the health facility charter in Kericho District, Kenya. A household survey was conducted in 2011 among 1,024 respondents (36% male, 64% female) aged 17 years and above stratified by health facility catchment area, situated in a division in Kericho District. In addition, sixteen focus group discussions were conducted with health facility users in the four health facility catchment areas. Quantitative data were analysed through frequency distributions and cross-tabulations. Qualitative data were transcribed and analysed using a thematic approach. The majority (65%) of household survey respondents had seen their local facility service charter, 84% of whom had read the information on the charter. Of these, 83% found the charter to be useful or very useful. According to the respondents, the charters provided useful information about the services offered and their costs, gave users a voice to curb potential overcharging and helped users plan their medical expenses before receiving the service. However, community members cited several challenges with using the charters: non-adherence to charter provisions by health workers; illegibility and language issues; lack of expenditure records; lack of time to read and understand them, often due to pressures around queuing; and socio-cultural limitations. Findings from this study suggest that improving the compliance of health facilities in districts across Kenya with regard to the implementation of the facility service charter is critical for accountability and community satisfaction with service delivery. To improve the compliance of health facilities, attention needs to be focused on mechanisms that help enforce official guidelines, address capacity gaps, and enhance public awareness of the charters and their use.
Advanced, Cost-Based Indices for Forecasting the Generation of Photovoltaic Power
NASA Astrophysics Data System (ADS)
Bracale, Antonio; Carpinelli, Guido; Di Fazio, Annarita; Khormali, Shahab
2014-01-01
Distribution systems are undergoing significant changes as they evolve toward the grids of the future, which are known as smart grids (SGs). The perspective of SGs is to facilitate large-scale penetration of distributed generation using renewable energy sources (RESs), encourage the efficient use of energy, reduce systems' losses, and improve the quality of power. Photovoltaic (PV) systems have become one of the most promising RESs due to the expected cost reduction and the increased efficiency of PV panels and interfacing converters. The ability to forecast power-production information accurately and reliably is of primary importance for the appropriate management of an SG and for making decisions relative to the energy market. Several forecasting methods have been proposed, and many indices have been used to quantify the accuracy of the forecasts of PV power production. Unfortunately, the indices that have been used have deficiencies and usually do not directly account for the economic consequences of forecasting errors in the framework of liberalized electricity markets. In this paper, advanced, more accurate indices are proposed that account directly for the economic consequences of forecasting errors. The proposed indices also were compared to the most frequently used indices in order to demonstrate their different, improved capability. The comparisons were based on the results obtained using a forecasting method based on an artificial neural network. This method was chosen because it was deemed to be one of the most promising methods available due to its capability for forecasting PV power. Numerical applications also are presented that considered an actual PV plant to provide evidence of the forecasting performances of all of the indices that were considered.
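The distinction the paper draws can be made concrete by contrasting a conventional accuracy index with one weighted by asymmetric imbalance prices: a symmetric error measure treats under- and over-forecasts alike, while a cost-based index does not. The sketch below is illustrative only; the prices, data and index form are hypothetical, and the paper's proposed indices are more elaborate.

```python
# A minimal sketch contrasting MAE with a cost-weighted forecast-error index.
import numpy as np

rng = np.random.default_rng(4)
actual = np.clip(rng.normal(50, 15, 336), 0, None)   # kW, two weeks hourly
forecast = actual + rng.normal(0, 8, actual.size)

mae = np.abs(forecast - actual).mean()

# Asymmetric (hypothetical) imbalance penalties: producing less than the
# committed forecast costs more per kWh than producing more.
p_short, p_long = 0.12, 0.04                         # EUR/kWh
err = actual - forecast
cost = np.where(err < 0, -err * p_short, err * p_long).sum()

print(f"MAE = {mae:.1f} kW, imbalance cost = {cost:.0f} EUR")
```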
Lizunov, A Y; Gonchar, A L; Zaitseva, N I; Zosimov, V V
2015-10-26
We analyzed the frequency with which intraligand contacts occurred in a set of 1300 protein-ligand complexes [Plewczynski et al., J. Comput. Chem. 2011, 32, 742-755]. Our analysis showed that flexible ligands often form intraligand hydrophobic contacts, while intraligand hydrogen bonds are rare. The test set was also thoroughly investigated and classified. We suggest a universal method for enhancing a scoring function based on a potential of mean force (PMF-based score) by adding a term accounting for intraligand interactions. The method was implemented via an in-house developed program utilizing the Algo_score scoring function [Ramensky et al., Proteins: Struct., Funct., Genet. 2007, 69, 349-357] based on the Tarasov-Muryshev PMF [Muryshev et al., J. Comput.-Aided Mol. Des. 2003, 17, 597-605]. The enhancement of the scoring function was shown to significantly improve the docking and scoring quality for flexible ligands in the test set of 1300 protein-ligand complexes. We then investigated the correlation of the docking results with two parameters of the intraligand interaction estimation: the weight of the intraligand interactions and the minimum number of bonds between ligand atoms required for their interaction to be taken into account.
Tran, Damien; Fournier, Elodie; Durrieu, Gilles; Massabuau, Jean-Charles
2007-07-01
The objective of the present study was to monitor water-quality assessment by a biological method. The optimum sensitivity to dissolved inorganic mercury of the freshwater bivalve Corbicula fluminea was estimated using a combined approach to determine the potential and limits of these animals in detecting contaminants. Detection by bivalves is based on shell closure, a protective response to a water contaminant. To take the rate of spontaneous closures into account, the stress associated with fixation by one valve in common valvometers was integrated, and the spontaneous rhythm was associated with daily activity; the response was thus considered under conditions where the probability of spontaneous closing is lowest. To develop dose-response curves, impedance valvometry, in which lightweight impedance electrodes are applied to study free-ranging animals under low-stress conditions, was used in combination with a new analytical approach. The logistic regression dose-response curves take into account variations in both response time and metal concentration in the water, significantly improving methods aimed at determining the optimal sensitivity threshold. This approach demonstrates that in C. fluminea, inorganic mercury concentrations below the range of 2.0 to 5.1 microg/L (95% confidence interval) cannot be detected within 5 h of addition.
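Fitting a logistic dose-response curve of the general kind used here takes only a few lines. The sketch below fits a two-parameter logistic to invented closure-response fractions; the data points and parameter values are illustrative, not the study's measurements, and the study's model additionally includes response time.

```python
# A minimal logistic dose-response fit (synthetic data, two parameters).
import numpy as np
from scipy.optimize import curve_fit

conc = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 16.0])           # ug/L Hg
responded = np.array([0.05, 0.10, 0.35, 0.75, 0.95, 1.0])  # fraction closing

def logistic(x, ec50, slope):
    """Probability of a closure response at concentration x."""
    return 1.0 / (1.0 + (ec50 / x) ** slope)

(ec50, slope), _ = curve_fit(logistic, conc, responded, p0=(3.0, 2.0))
print(f"EC50 = {ec50:.1f} ug/L, slope = {slope:.1f}")
```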
Accounting for the environment.
Lutz, E; Munasinghe, M
1991-03-01
Environmental awareness in the 1980s led to efforts to improve the current UN System of National Accounts (SNA) for better measurement of the value of environmental resources when estimating income. National governments, the UN, the International Monetary Fund, and the World Bank are interested in solving this issue. The World Bank relies heavily on national aggregates in income accounts compiled by means of the SNA, which was published in 1968 and stressed gross domestic product (GDP). GDP measures mainly market activity, but it does not consider the consumption of natural capital, and it indirectly inhibits sustained development. The deficiencies of the current method of accounting are the inconsistent treatment of man-made and natural capital, the omission of natural resources and their depletion from balance sheets, and the counting of pollution cleanup costs as national income. In the calculation of GDP, pollution is overlooked, and beneficial environmental inputs are valued at zero. The calculation of environmentally adjusted net domestic product (EDP) and environmentally adjusted net income (ENI) would lower income and growth rates, as the World Resources Institute found for Indonesia in 1971-84: when depreciation for oil, timber, and topsoil was included, growth in net domestic product (NDP) was only 4%, compared with 7.1% for GDP. The World Bank has advocated environmental accounting since 1983 in SNA revisions. The 1989 revised Blue Book of the SNA takes environmental concerns into account. Relevant research is under way in Mexico and Papua New Guinea using the UN Statistical Office framework as a system for environmentally adjusted economic accounts that computes EDP and ENI and integrates environmental data with national accounts while preserving SNA concepts.
Multilevel Context of Depression in Two American Indian Tribes
Kaufman, Carol E.; Beals, Janette; Croy, Calvin; Jiang, Luohua; Novins, Douglas K.
2015-01-01
Objective Depression is a major debilitating disease. For American Indians living in tribal reservations, who endure disproportionately high levels of stress and poverty often associated with depression, determining the patterns and correlates is key to appropriate clinical assessment and intervention development. Yet, little attention has been given to the cultural context of correlates for depression, including the influence of family, cultural traditions or practices, or community conditions. Method We used data from a large representative psychiatric epidemiological study among American Indians in two reservation communities to estimate nested individual and multilevel models of past-year Major Depressive Episode (MDE) accounting for family, cultural, and community conditions. Results We found that models including culturally informed individual-level measures significantly improved the model fit over demographics alone. We found significant community-level variation in the probability of past-year MDE diagnosis in one tribe even after accounting for individual-level characteristics. Conclusions Accounting for culture, family, and community context will facilitate research, clinician assessment, and treatment of depression in diverse settings. PMID:24016293
Buckingham-Jeffery, Elizabeth; Morbey, Roger; House, Thomas; Elliot, Alex J; Harcourt, Sally; Smith, Gillian E
2017-05-19
As service provision and patient behaviour vary by day, healthcare data used for public health surveillance can exhibit large day of the week effects. These regular effects are further complicated by the impact of public holidays. Real-time syndromic surveillance requires the daily analysis of a range of healthcare data sources, including consultations with family doctors (called general practitioners, or GPs, in the UK). Failure to adjust for such reporting biases during analysis of syndromic GP surveillance data could lead to misinterpretations, including false alarms or delays in the detection of outbreaks. The simplest smoothing method to remove a day of the week effect from daily time series data is a 7-day moving average. Public Health England developed the working day moving average in an attempt also to remove public holiday effects from daily GP data. However, neither of these methods adequately accounts for the combination of day of the week and public holiday effects. The extended working day moving average was therefore developed. This is a further data-driven method for adding a smooth trend curve to a time series graph of daily healthcare data that aims to take both public holiday and day of the week effects into account. It is based on the assumption that the number of people seeking healthcare services is a combination of illness levels/severity and the ability or desire of patients to seek healthcare each day. The extended working day moving average was compared to the seven-day and working day moving averages through application to data from two syndromic indicators from the GP in-hours syndromic surveillance system managed by Public Health England. The extended working day moving average successfully smoothed the syndromic healthcare data by taking into account the combined day of the week and public holiday effects. In comparison, the seven-day and working day moving averages were unable to account for all these effects, which led to misleading smoothing curves. The results from this study make it possible to identify trends and unusual activity in syndromic surveillance data from GP services in real time, independently of the effects caused by day of the week and public holidays, thereby improving the public health action resulting from the analysis of these data.
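The two baseline smoothers the paper extends are easy to state in code: a plain centred 7-day moving average, and a variant averaged over non-holiday weekdays only. The sketch below uses stand-in consultation counts and an invented holiday list; the extended working day moving average itself, which models day-of-week and holiday effects jointly, is not reproduced.

```python
# A minimal sketch of the 7-day and working-day moving averages.
import pandas as pd

idx = pd.date_range("2017-04-03", periods=56, freq="D")
consults = pd.Series(range(56), index=idx)               # stand-in GP counts
holidays = pd.to_datetime(["2017-04-14", "2017-04-17"])  # example bank holidays

seven_day = consults.rolling(7, center=True).mean()      # plain 7-day smoother

working = consults[(consults.index.dayofweek < 5)        # weekdays only,
                   & ~consults.index.isin(holidays)]     # excluding holidays
working_day = working.rolling(5, center=True).mean()     # 5 working days/week

print(seven_day.dropna().head(3))
print(working_day.dropna().head(3))
```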
A de-noising method using the improved wavelet threshold function based on noise variance estimation
NASA Astrophysics Data System (ADS)
Liu, Hui; Wang, Weida; Xiang, Changle; Han, Lijin; Nie, Haizhao
2018-01-01
Precise and efficient noise variance estimation is very important when using the wavelet transform to analyze signals and extract signal features. In view of the problem that the accuracy of traditional noise variance estimation is greatly affected by fluctuations in the noise values, this study puts forward the strategy of using a two-state Gaussian mixture model to classify the high-frequency wavelet coefficients at the finest scale, which takes both efficiency and accuracy into account. Building on this noise variance estimate, a novel improved wavelet threshold function is proposed by combining the advantages of the hard and soft threshold functions, and on the basis of the noise variance estimation algorithm and the improved threshold function, a novel wavelet threshold de-noising method is put forward. The method is tested and validated using random signals and bench test data from an electro-mechanical transmission system. The test results indicate that the wavelet threshold de-noising method based on the noise variance estimation performs well in processing the test signals of the electro-mechanical transmission system: it can effectively eliminate the interference of transient signals, including voltage, current, and oil pressure, while favorably maintaining the dynamic characteristics of the signals.
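The standard pipeline the paper improves on is worth seeing end to end: estimate the noise level from the finest-scale detail coefficients (MAD-based), form a universal threshold, and soft-threshold the detail coefficients before reconstruction. The sketch below implements that baseline with PyWavelets; the paper's two-state Gaussian mixture estimate and its blended hard/soft threshold function are not reproduced.

```python
# A minimal wavelet de-noising baseline: MAD noise estimate + universal
# threshold + soft thresholding (standard technique, not the paper's rule).
import numpy as np
import pywt

rng = np.random.default_rng(5)
t = np.linspace(0, 1, 1024)
clean = np.sin(2 * np.pi * 5 * t)
noisy = clean + 0.3 * rng.standard_normal(t.size)

coeffs = pywt.wavedec(noisy, "db4", level=4)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745       # noise std, finest scale
thresh = sigma * np.sqrt(2 * np.log(noisy.size))     # universal threshold

coeffs[1:] = [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
denoised = pywt.waverec(coeffs, "db4")[: t.size]     # trim any padding
print("residual std:", np.std(denoised - clean))
```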
26 CFR 1.446-1 - General rule for methods of accounting.
Code of Federal Regulations, 2013 CFR
2013-04-01
..., as a cost taken into account in computing cost of goods sold, as a cost allocable to a long-term...; section 460, relating to the long-term contract methods. In addition, special methods of accounting for... regulations under sections 471 and 472), a change from the cash or accrual method to a long-term contract...
Government Accounting Standards: Past, Present and Future.
ERIC Educational Resources Information Center
Harmer, W. Gary
1993-01-01
States that government accounting is the product of mixing together budget-oriented and accounting-oriented voices. Presents a history of governmental accounting including the groups involved. An organization chart describes the current standard-setting structure. Accomplishments that improve reporting operations results are listed. (MLF)
Accounting Systems for School Districts.
ERIC Educational Resources Information Center
Atwood, E. Barrett, Jr.
1983-01-01
Advises careful analysis and improvement of existing school district accounting systems prior to investment in new ones. Emphasizes the importance of attracting and maintaining quality financial staffs, developing an accounting policies and procedures manual, and designing a good core accounting system before purchasing computer hardware and…
48 CFR 31.002 - Availability of accounting guide.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Availability of accounting... GENERAL CONTRACTING REQUIREMENTS CONTRACT COST PRINCIPLES AND PROCEDURES 31.002 Availability of accounting guide. Contractors needing assistance in developing or improving their accounting systems and procedures...
Student Preferences for Instructional Methods in an Accounting Curriculum
ERIC Educational Resources Information Center
Abeysekera, Indra
2015-01-01
Student preferences among instructional methods are largely unexplored across the accounting curriculum. The algorithmic rigor of courses and the societal culture can influence these preferences. This study explored students' preferences of instructional methods for learning in six courses of the accounting curriculum that differ in algorithmic…
The idea that the methods and models of accounting and bookkeeping might be useful in describing, understanding, and managing environmental systems is implicit in the title of H.T. Odum's book, Environmental Accounting: Emergy and Environmental Decision Making. In this paper, I ...
Municipal Water Demand: Statistical and Management Issues
NASA Astrophysics Data System (ADS)
Martin, William E.
In the foreword to this volume, Charles W. Howe, general editor of the Westview Press series on water policy and management, states that the goal of this book is to emphasize “the potential for improved water management with reduced economic and environmental costs by utilizing modern methods of demand estimation that take into account user responsiveness to price, conservation measures, and economic-demographic changes.” The authors accomplish their purpose, but the book itself leaves much to be desired.
Occupational medicine. The essentials of finance.
Fallon, J B
1989-01-01
Finance is concerned with the generation and use of funds to support organizational objectives whereas accounting records transactions and summarizes how funds are expended. Money has costs associated with its procurement and use. There are costs associated with maintaining equipment and inventory. Financial analysts have developed methods to evaluate a company's efficiency in using money. While the occupational physician may not be directly involved in financial activities, knowledge of the techniques used should improve an understanding of organizational limitations.
Addition by Subtraction: The Relation Between Dropout Rates and School-Level Academic Achievement.
Glennie, Elizabeth; Bonneau, Kara; Vandellen, Michelle; Dodge, Kenneth A
2012-01-01
Efforts to improve student achievement should increase graduation rates. However, work investigating the effects of student-level accountability has consistently demonstrated that increases in the standards for high school graduation are correlated with increases in dropout rates. The most favored explanation for this finding is that high-stakes testing policies that mandate grade repetition and high school exit exams may be the tipping point for students who are already struggling academically. These extra demands may, in fact, push students out of school. This article examines two hypotheses regarding the relation between school-level accountability and dropout rates. The first posits that improvements in school performance lead to improved success for everyone. If school-level accountability systems improve a school for all students, then the proportion of students performing at grade level increases, and the dropout rate decreases. The second hypothesis posits that schools facing pressure to improve their overall accountability score may pursue this increase at the cost of other student outcomes, including dropout rate. Our approach focuses on the dynamic relation between school-level academic achievement and dropout rates over time; that is, between one year's achievement and the subsequent year's dropout rate, and vice versa. This article employs longitudinal records on all students in North Carolina public schools over an 8-year period. Analyses employ fixed-effects models clustering schools and districts within years and control each year for school size, percentage of students who were free/reduced-price lunch eligible, percentage of students who are ethnic minorities, and locale. This study finds partial evidence that improvements in school-level academic performance will lead to improvements (i.e., decreases) in school-level dropout rates. Schools with improved performance saw decreased dropout rates following these successes. However, we find more evidence of a negative side of the quest for improved academic performance. When dropout rates increase, the performance composites in subsequent years increase. Accountability systems need to remove any indirect benefit a school may receive from increasing its dropout rate. Schools should be held accountable for those who drop out of school. Given the personal and social costs of dropping out, accountability systems need to place more emphasis on dropout prevention. Such an emphasis could encompass increasing the dropout age and having the school's performance composite include scores of zero on end-of-grade tests for those who leave school.
Frailty Models for Familial Risk with Application to Breast Cancer.
Gorfine, Malka; Hsu, Li; Parmigiani, Giovanni
2013-12-01
In evaluating familial risk for disease we have two main statistical tasks: assessing the probability of carrying an inherited genetic mutation conferring higher risk; and predicting the absolute risk of developing diseases over time, for those individuals whose mutation status is known. Despite substantial progress, much remains unknown about the role of genetic and environmental risk factors, about the sources of variation in risk among families that carry high-risk mutations, and about the sources of familial aggregation beyond major Mendelian effects. These sources of heterogeneity contribute substantial variation in risk across families. In this paper we present simple and efficient methods for accounting for this variation in familial risk assessment. Our methods are based on frailty models. We implemented them in the context of generalizing Mendelian models of cancer risk, and compared our approaches to others that do not consider heterogeneity across families. Our extensive simulation study demonstrates that when predicting the risk of developing a disease over time conditional on carrier status, accounting for heterogeneity results in a substantial improvement in the area under the curve of the receiver operating characteristic. On the other hand, the improvement for carriership probability estimation is more limited. We illustrate the utility of the proposed approach through the analysis of BRCA1 and BRCA2 mutation carriers in the Washington Ashkenazi Kin-Cohort Study of Breast Cancer.
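A toy simulation can convey the core idea: a shared gamma frailty multiplies each family's hazard, so absolute risk varies across carrier families even at a fixed baseline. All parameter values below are illustrative assumptions, not estimates from the Washington Ashkenazi Kin-Cohort data.

```python
# Toy shared-frailty simulation: one gamma multiplier per family scales a
# common baseline hazard; all parameter values are assumptions.
import numpy as np

rng = np.random.default_rng(0)
theta = 0.5                      # frailty variance (assumption)
base_hazard = 0.01               # carrier per-year hazard (assumption)

frailty = rng.gamma(shape=1.0 / theta, scale=theta, size=100_000)  # mean 1
risk_30y = 1.0 - np.exp(-base_hazard * frailty * 30.0)   # 30-year risk

# Heterogeneity spreads absolute risk across families around the
# homogeneous value 1 - exp(-0.3).
print(risk_30y.mean(), risk_30y.std())
```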
Babu, Giridhara R.; Sudhir, Paulomi M.; Mahapatra, Tanmay; Das, Aritra; Rathnaiah, Mohanbabu; Anand, Indiresh; Detels, Roger
2016-01-01
Background: There is limited scientific evidence on the relationship of job stress with quality of life (QoL). Purpose: This study aims to explore different domains of job stress affecting IT/ITES professionals and estimate the levels of stress that these professionals endure to reach positive levels of QoL given that other determinants operating between these two variables are accounted for. Materials and Methods: We estimated levels of stress that software professionals would have endured to reach positive levels of QoL considering that other factors operating between these two variables are accounted for. The study participants comprised 1071 software professionals who were recruited using a mixed sampling method. Participants answered a self-administered questionnaire containing questions on job stress, QoL, and confounders. Results: All the domains (physical, psychological, social, and environmental) of QoL showed statistically significant positive associations with increasing stress domains of autonomy, physical infrastructure, work environment, and emotional factors. Conclusions: The respondents clearly found the trade-off of higher stress to be acceptable for the improved QoL they enjoyed. It is also possible that stress might actually be responsible for improvements in QoL either directly or through mediation of variables such as personal values and aspirations. Yerkes-Dodson law and stress appraisal models of Folkman and Lazarus may explain the plausible positive association. PMID:28194085
A diagram retrieval method with multi-label learning
NASA Astrophysics Data System (ADS)
Fu, Songping; Lu, Xiaoqing; Liu, Lu; Qu, Jingwei; Tang, Zhi
2015-01-01
In recent years, the retrieval of plane geometry figures (PGFs) has attracted increasing attention in the fields of mathematics education and computer science. However, the high cost of matching complex PGF features leads to the low efficiency of most retrieval systems. This paper proposes an indirect classification method based on multi-label learning, which improves retrieval efficiency by reducing the scope of the compare operation from the whole database to small candidate groups. Label correlations among PGFs are taken into account for the multi-label classification task. The primitive feature selection for multi-label learning and the feature description of visual geometric elements are conducted individually to match similar PGFs. The experimental results show the competitive performance of the proposed method compared with existing PGF retrieval methods in terms of both time consumption and retrieval quality.
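A minimal sketch of such indirect classification, assuming placeholder features, labels and matcher: a multi-label classifier narrows the database to a candidate group sharing a predicted label, and only those candidates are passed to the expensive matching step.

```python
# Sketch: multi-label pre-filtering before expensive matching.
# Features, labels and the matcher are placeholders (assumptions).
import numpy as np
from sklearn.multiclass import OneVsRestClassifier
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X_db = rng.random((500, 32))            # database PGF feature vectors
Y_db = rng.random((500, 5)) > 0.7       # multi-label tags per figure

clf = OneVsRestClassifier(LogisticRegression(max_iter=1000)).fit(X_db, Y_db)

def retrieve(query_vec, expensive_match):
    labels = clf.predict(query_vec[None, :])[0]
    # Candidates: entries sharing at least one predicted label.
    cand = np.where((Y_db & labels.astype(bool)).any(axis=1))[0]
    if cand.size == 0:                  # fall back to a full scan
        cand = np.arange(len(X_db))
    return min(cand, key=lambda i: expensive_match(query_vec, X_db[i]))
```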
NASA Astrophysics Data System (ADS)
Kuntman, Ertan; Canillas, Adolf; Arteaga, Oriol
2017-11-01
Experimental Mueller matrices contain a certain amount of uncertainty in their elements, and these uncertainties can create difficulties for decomposition methods based on analytic solutions. In an earlier paper [1], we proposed a decomposition method for depolarizing Mueller matrices by using certain symmetry conditions. However, because of experimental error, that method creates over-determined systems with non-unique solutions. Here we propose to use a least squares minimization approach in order to improve the accuracy of our results. In this method, we take into account the number of independent parameters of the corresponding symmetry and the rank constraints on the component matrices to decide on our fitting model. This approach is illustrated with experimental Mueller matrices that include material media with different Mueller symmetries.
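The sketch below illustrates only the generic least-squares idea, assuming a deliberately simple component model (a diagonal depolarizer with four free parameters); the paper's symmetry-specific parametrizations and rank constraints would replace the `residuals` model.

```python
# Generic least-squares fit of a constrained component model to a noisy
# Mueller matrix; the diagonal-depolarizer parametrization is purely
# illustrative, not the paper's symmetry-based decomposition.
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)
M_exp = np.eye(4) + 0.01 * rng.standard_normal((4, 4))  # noisy measurement

def residuals(p):
    model = np.diag(p)          # constrained model: 4 free parameters
    return (model - M_exp).ravel()

fit = least_squares(residuals, x0=np.ones(4))
print(fit.x)                    # fitted diagonal, robust to off-diagonal noise
```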
Chow, Vincent S; Huang, Wenhai; Puterman, Martin L
2009-01-01
Operations research (OR) is playing an increasing role in the support of many health care initiatives. However one of the main challenges facing OR practitioners is the availability and the integrity of operations data. Hospital information systems (HIS) are often designed with a clinical or accounting focus and may lack the data necessary for operational studies. In this paper, we illustrate the data processing methods and data challenges faced by our team during a study of surgical scheduling practices at the Vancouver Island Health Authority. We also provide some general recommendations to improve HIS from an operations perspective. In general, more integration between operations researchers and HIS specialists are required to support ongoing operational improvements in the health care sector.
Effect of recent popularity on heat-conduction based recommendation models
NASA Astrophysics Data System (ADS)
Li, Wen-Jun; Dong, Qiang; Shi, Yang-Bo; Fu, Yan; He, Jia-Lin
2017-05-01
Accuracy and diversity are two important measures in evaluating the performance of recommender systems. It has been demonstrated that the recommendation model inspired by the heat conduction process has high diversity yet low accuracy. Many variants have been introduced to improve the accuracy while keeping high diversity, most of which regard the current node-degree of an item as its popularity. However in this way, a few outdated items of large degree may be recommended to an enormous number of users. In this paper, we take the recent popularity (recently increased item degrees) into account in the heat-conduction based methods, and propose accordingly the improved recommendation models. Experimental results on two benchmark data sets show that the accuracy can be largely improved while keeping the high diversity compared with the original models.
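The following sketch, with toy data and shapes as assumptions, implements heat conduction on a user-item matrix but normalizes by the recently increased item degree rather than the total degree, which is the modification the paper studies.

```python
# Heat-conduction scores normalized by *recent* item degree (toy data).
import numpy as np

rng = np.random.default_rng(0)
A = (rng.random((50, 30)) > 0.8).astype(float)     # user-item adjacency
A_recent = A * (rng.random((50, 30)) > 0.5)        # links in recent window

k_user = np.clip(A.sum(axis=1), 1, None)           # user degrees
k_recent = A_recent.sum(axis=0) + 1e-9             # recent item popularity

def heat_scores(f):
    # f: one user's item vector; average over each neighbour user's items,
    # then normalize by recent rather than total item degree.
    user_avg = (A @ f) / k_user
    return (A.T @ user_avg) / k_recent

scores = heat_scores(A[0])
scores[A[0] > 0] = -np.inf                         # do not re-recommend
print(np.argsort(scores)[::-1][:5])                # top-5 items for user 0
```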
From clinical integration to accountable care.
Shields, Mark
2011-01-01
Four key challenges to reforming health care organizations can be addressed by a clinical integration model patterned after Advocate Physician Partners (APP). These challenges are: predominance of small group practices, dominant fee-for-service reimbursement methods, weaknesses of the traditional hospital medical staff structure and a need to partner with commercial insurance companies. APP has demonstrated teamwork between 3800 physicians and hospitals to improve quality, patient safety and cost-effectiveness. Building on this model, an innovative contract with Blue Cross Blue Shield of Illinois serves as a prototype for a commercial Accountable Care Organization. For this contract to succeed, APP must outperform the market competition. To accomplish this, APP has implemented strategies to reduce readmissions, avoid unnecessary admissions and emergency room visits, expand primary care access, and enhance quality and patient safety.
3D Observations techniques for the solar corona
NASA Astrophysics Data System (ADS)
Portier-Fozzani, F.; Papadopoulo, T.; Fermin, I.; Bijaoui, A.; Stereo/Secchi 3D Team; et al.
In this talk, we will present a review of the different 3D techniques concerning observations of the solar corona made by EUV imagers (such as SOHO/EIT and STEREO/SECCHI) and by coronagraphs (SOHO/LASCO and STEREO/SECCHI). Tomographic reconstructions need magnetic extrapolation to constrain the model (classical triangle mesh reconstruction, or the more evolved pixon method). For 3D reconstruction the other approach is stereovision. Stereoscopic techniques must be built in a specific way to take into account the optically thin medium of the solar corona, which makes most classical stereo methods not directly applicable. To improve such methods we need to take into account how an image is described in computer vision: an image is not only a set of intensities; its description/representation in terms of sub-objects is needed for structure extraction and matching. We will describe optical flow methods to follow the structures, and decomposition into sub-areas depending on the solar cycle. After recalling results obtained with geometric loop reconstructions and their consequences for twist measurement and helicity evaluation, we will describe how we can mix pixel-based and conceptual reconstruction for stereovision. We could then include epipolar geometry and the Multiscale Vision Model (MVM) to enhance the reconstruction. These concepts are under development for STEREO/SECCHI.
Shen, Chung-Wei; Chen, Yi-Hau
2018-03-13
We propose a model selection criterion for semiparametric marginal mean regression based on generalized estimating equations. The work is motivated by a longitudinal study on the physical frailty outcome in the elderly, where the cluster size, that is, the number of the observed outcomes in each subject, is "informative" in the sense that it is related to the frailty outcome itself. The new proposal, called Resampling Cluster Information Criterion (RCIC), is based on the resampling idea utilized in the within-cluster resampling method (Hoffman, Sen, and Weinberg, 2001, Biometrika 88, 1121-1134) and accommodates informative cluster size. The implementation of RCIC, however, is free of performing actual resampling of the data and hence is computationally convenient. Compared with the existing model selection methods for marginal mean regression, the RCIC method incorporates an additional component accounting for variability of the model over within-cluster subsampling, and leads to remarkable improvements in selecting the correct model, regardless of whether the cluster size is informative or not. Applying the RCIC method to the longitudinal frailty study, we identify being female, old age, low income and life satisfaction, and chronic health conditions as significant risk factors for physical frailty in the elderly. © 2018, The International Biometric Society.
Overcoming bottlenecks in the membrane protein structural biology pipeline.
Hardy, David; Bill, Roslyn M; Jawhari, Anass; Rothnie, Alice J
2016-06-15
Membrane proteins account for a third of the eukaryotic proteome, but are greatly under-represented in the Protein Data Bank. Unfortunately, recent technological advances in X-ray crystallography and EM cannot account for the poor solubility and stability of membrane protein samples. A limitation of conventional detergent-based methods is that detergent molecules destabilize membrane proteins, leading to their aggregation. The use of orthologues, mutants and fusion tags has helped improve protein stability, but at the expense of not working with the sequence of interest. Novel detergents such as glucose neopentyl glycol (GNG), maltose neopentyl glycol (MNG) and calixarene-based detergents can improve protein stability without compromising their solubilizing properties. Styrene maleic acid lipid particles (SMALPs) focus on retaining the native lipid bilayer of a membrane protein during purification and biophysical analysis. Overcoming bottlenecks in the membrane protein structural biology pipeline, primarily by maintaining protein stability, will facilitate the elucidation of many more membrane protein structures in the near future. © 2016 The Author(s). published by Portland Press Limited on behalf of the Biochemical Society.
A fictitious domain approach for the Stokes problem based on the extended finite element method
NASA Astrophysics Data System (ADS)
Court, Sébastien; Fournié, Michel; Lozinski, Alexei
2014-01-01
In the present work, we propose to extend to the Stokes problem a fictitious domain approach inspired by the eXtended Finite Element Method and studied for the Poisson problem in [Renard]. The method allows computations in domains whose boundaries do not match the mesh. A mixed finite element method is used for the fluid flow. The interface between the fluid and the structure is localized by a level-set function. Dirichlet boundary conditions are taken into account using a Lagrange multiplier. A stabilization term is introduced to improve the approximation of the normal trace of the Cauchy stress tensor at the interface and to avoid an inf-sup condition between the spaces for the velocity and the Lagrange multiplier. Convergence analysis is given and several numerical tests are performed to illustrate the capabilities of the method.
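For orientation, a schematic of such a stabilized formulation is given below in our own notation (γ > 0 a stabilization parameter, σ(u,p) the Cauchy stress, Γ the level-set interface); it is a sketch of the general Barbosa-Hughes-type construction, not necessarily the paper's exact formulation.

```latex
% Schematic stabilized weak form (notation ours; an assumption-labelled
% sketch, not the paper's exact formulation).
\begin{aligned}
& \text{find } (u, p, \lambda) \in V_h \times Q_h \times \Lambda_h
  \text{ such that for all } (v, q, \mu): \\
& 2\nu \,(\varepsilon(u), \varepsilon(v))_{\Omega}
  - (p, \nabla\!\cdot v)_{\Omega} - (q, \nabla\!\cdot u)_{\Omega}
  + \langle \lambda, v \rangle_{\Gamma} + \langle \mu, u \rangle_{\Gamma} \\
& \quad - \gamma \sum_{K} h_K \int_{\Gamma \cap K}
  \bigl(\lambda + \sigma(u,p)\,n\bigr) \cdot \bigl(\mu + \sigma(v,q)\,n\bigr)\,
  \mathrm{d}s
  = (f, v)_{\Omega} .
\end{aligned}
```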
White, Alexander James; Tretiak, Sergei; Mozyrsky, Dima V.
2016-04-25
Accurate simulation of the non-adiabatic dynamics of molecules in excited electronic states is key to understanding molecular photo-physical processes. Here we present a novel method, based on a semiclassical approximation, that is as efficient as the commonly used mean field Ehrenfest or ad hoc surface hopping methods and properly accounts for interference and decoherence effects. This novel method is an extension of Heller's thawed Gaussian wave-packet dynamics that includes coupling between potential energy surfaces. By studying several standard test problems we demonstrate that the accuracy of the method can be systematically improved while maintaining high efficiency. The method is suitable for investigating the role of quantum coherence in the non-adiabatic dynamics of many-atom molecules.
Historical droughts in Mediterranean regions during the last 500 years: a data/model approach
NASA Astrophysics Data System (ADS)
Brewer, S.; Alleaume, S.; Guiot, J.; Nicault, A.
2007-06-01
We present here a new method for comparing the output of General Circulation Models (GCMs) with proxy-based reconstructions, using time series of reconstructed and simulated climate parameters. The method uses k-means clustering to allow comparison between different periods that have similar spatial patterns, and a fuzzy logic-based distance measure in order to take reconstruction errors into account. The method has been used to test two coupled ocean-atmosphere GCMs over the Mediterranean region for the last 500 years, using an index of drought stress, the Palmer Drought Severity Index. The results showed that, whilst no model exactly simulated the reconstructed changes, all simulations were an improvement over using the mean climate, and a good match was found after 1650 with a model run that took into account changes in volcanic forcing, solar irradiance, and greenhouse gases. A more detailed investigation of the output of this model showed the existence of a set of atmospheric circulation patterns linked to the patterns of drought stress: 1) a blocking pattern over northern Europe linked to dry conditions in the south prior to the Little Ice Age (LIA) and during the 20th century; 2) a NAO-positive like pattern with increased westerlies during the LIA; 3) a NAO-negative like period shown in the model prior to the LIA, but that occurs most frequently in the data during the LIA. The results of the comparison show the improvement in simulated climate as various forcings are included and help to understand the atmospheric changes that are linked to the observed reconstructed climate changes.
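A compressed sketch of the comparison pipeline, with synthetic fields and a simple uncertainty-scaled distance standing in for the paper's fuzzy-logic measure:

```python
# Sketch: k-means patterns from simulated fields, then error-aware
# assignment of reconstructed fields; all data here are synthetic.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
sim = rng.standard_normal((500, 40))        # model years x grid cells (PDSI)
rec = rng.standard_normal((120, 40))        # reconstructed years x grid cells
rec_err = 0.5 + rng.random((120, 40))       # reconstruction standard errors

km = KMeans(n_clusters=5, n_init=10, random_state=0).fit(sim)

def fuzzy_dist(x, err, centre):
    # Squared differences down-weighted where the reconstruction is
    # uncertain: a simple stand-in for the fuzzy-logic distance.
    return np.mean((x - centre) ** 2 / err ** 2)

assigned = [min(range(5), key=lambda k: fuzzy_dist(x, e, km.cluster_centers_[k]))
            for x, e in zip(rec, rec_err)]
```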
7 CFR 1781.21 - Borrower accounting methods, management, reporting, and audits.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 7 Agriculture 12 2010-01-01 2010-01-01 false Borrower accounting methods, management, reporting... DEVELOPMENT (RCD) LOANS AND WATERSHED (WS) LOANS AND ADVANCES § 1781.21 Borrower accounting methods, management, reporting, and audits. These activities will be handled in accordance with the provisions of...
7 CFR 1767.13 - Departures from the prescribed RUS Uniform System of Accounts.
Code of Federal Regulations, 2010 CFR
2010-01-01
... accounting methodologies and principles that depart from the provisions herein; or (2) File with such... borrower's rates, based upon accounting methods and principles inconsistent with the provisions of this... accounting methods or principles for the borrower that are inconsistent with the provisions of this part, the...
26 CFR 1.446-1 - General rule for methods of accounting.
Code of Federal Regulations, 2010 CFR
2010-04-01
... also the accounting treatment of any item. Examples of such over-all methods are the cash receipts and... special items include the accounting treatment prescribed for research and experimental expenditures, soil... books of account and on his return, as for example, a reconciliation of any differences between such...
O'Sullivan, Cormac T; Dexter, Franklin; Lubarsky, David A; Vigoda, Michael M
2007-02-01
A systematic and comprehensive review of the scientific literature revealed 4 evidence-based methods that contribute to a positive return on investment from anesthesia information management systems (AIMS): reducing anesthetic-related drug costs, improving staff scheduling and reducing staffing costs, increasing anesthesia billing and capture of anesthesia-related charges, and increased hospital reimbursement through improved hospital coding. There were common features to these interventions. Whereas an AIMS may be the ideal choice to achieve these cost reductions and revenue increases, alternative existing systems may be satisfactory for the studied applications (i.e., the incremental advantage to the AIMS may be less than predicted from applying each study to each facility). Savings are likely heterogeneous among institutions, making an internal survey using standard accounting methods necessary to perform a valid return on investment analysis. Financial advantages can be marked for the anesthesia providers, although hospitals are more likely to purchase the AIMS.
Impact of sampling strategy on stream load estimates in till landscape of the Midwest
Vidon, P.; Hubbard, L.E.; Soyeux, E.
2009-01-01
Accurately estimating various solute loads in streams during storms is critical to accurately determine maximum daily loads for regulatory purposes. This study investigates the impact of sampling strategy on solute load estimates in streams in the US Midwest. Three different solute types (nitrate, magnesium, and dissolved organic carbon (DOC)) and three sampling strategies are assessed. Regardless of the method, the average error on nitrate loads is higher than for magnesium or DOC loads, and all three methods generally underestimate DOC loads and overestimate magnesium loads. Increasing sampling frequency only slightly improves the accuracy of solute load estimates but generally improves the precision of load calculations. This type of investigation is critical for water management and environmental assessment so error on solute load calculations can be taken into account by landscape managers, and sampling strategies optimized as a function of monitoring objectives. © 2008 Springer Science+Business Media B.V.
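The effect of sampling interval on load estimates can be reproduced on synthetic data; the hydrograph, chemograph and intervals below are illustrative assumptions.

```python
# Synthetic check of load-estimate error versus sampling interval.
import numpy as np

def integrate(y, x):
    # Trapezoidal rule, written out to avoid version-specific numpy names.
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

t = np.arange(0.0, 72.0, 0.25)                    # hours at 15-min steps
flow = 5 + 20 * np.exp(-((t - 24) / 6) ** 2)      # L/s storm hydrograph
conc = 2 + 8 * np.exp(-((t - 22) / 5) ** 2)       # mg/L, peaks on rising limb

true_load = integrate(conc * flow, t * 3600) / 1e6  # kg

for hours in (0.25, 1.0, 6.0):
    idx = np.arange(0, t.size, int(hours / 0.25))
    est = integrate(conc[idx] * flow[idx], t[idx] * 3600) / 1e6
    print(f"{hours:>5} h sampling: {100 * (est - true_load) / true_load:+.1f}% error")
```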
Complementarity and Area-Efficiency in the Prioritization of the Global Protected Area Network.
Kullberg, Peter; Toivonen, Tuuli; Montesino Pouzols, Federico; Lehtomäki, Joona; Di Minin, Enrico; Moilanen, Atte
2015-01-01
Complementarity and cost-efficiency are widely used principles for protected area network design. Despite the wide use and robust theoretical underpinnings, their effects on the performance and patterns of priority areas are rarely studied in detail. Here we compare two approaches for identifying the management priority areas inside the global protected area network: 1) a scoring-based approach, used in recently published analysis and 2) a spatial prioritization method, which accounts for complementarity and area-efficiency. Using the same IUCN species distribution data the complementarity method found an equal-area set of priority areas with double the mean species ranges covered compared to the scoring-based approach. The complementarity set also had 72% more species with full ranges covered, and lacked any coverage only for half of the species compared to the scoring approach. Protected areas in our complementarity-based solution were on average smaller and geographically more scattered. The large difference between the two solutions highlights the need for critical thinking about the selected prioritization method. According to our analysis, accounting for complementarity and area-efficiency can lead to considerable improvements when setting management priorities for the global protected area network.
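The contrast between the two approaches reduces to the following kind of computation, here on a synthetic species-by-cell presence matrix (equal-area cells assumed): scoring keeps the richest cells outright, while greedy complementarity keeps the cells that add the most uncovered species.

```python
# Scoring vs greedy complementarity on a synthetic presence matrix.
import numpy as np

rng = np.random.default_rng(1)
P = rng.random((200, 400)) < 0.05          # species x candidate cells

# Scoring: take the 20 richest cells outright.
score_pick = np.argsort(P.sum(axis=0))[::-1][:20]

# Complementarity: greedily take the cell adding most uncovered species.
covered = np.zeros(P.shape[0], dtype=bool)
greedy_pick = []
for _ in range(20):
    gain = (P & ~covered[:, None]).sum(axis=0)
    gain[greedy_pick] = -1                 # never re-pick a chosen cell
    best = int(np.argmax(gain))
    greedy_pick.append(best)
    covered |= P[:, best]

print(P[:, score_pick].any(axis=1).sum(), covered.sum())  # species covered
```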
Improving Science Education through Accountability Relationships in Schools.
ERIC Educational Resources Information Center
Wildy, Helen; Wallace, John
1997-01-01
Presents a contrast between bureaucratic and professional models of accountability and their impact on the science education enterprise. Topics include improving performance, climate of trust, principles and consequences, demonstrating acceptance of responsibilities, and feedback. Concludes that it is necessary to develop the skills and processes…
Endovascular aneurysm repair delivery redesign leads to quality improvement and cost reduction
Warner, Courtney J.; Horvath, Alexander J.; Powell, Richard J.; Columbo, Jesse A.; Walsh, Teri R.; Goodney, Philip P.; Walsh, Daniel B.; Stone, David H.
2017-01-01
Objective Endovascular aneurysm repair (EVAR) is now a mainstay of therapy for abdominal aortic aneurysm, although it remains associated with significant expense. We performed a comprehensive analysis of EVAR delivery at an academic medical center to identify targets for quality improvement and cost reduction in light of impending health care reform. Methods All infrarenal EVARs performed from April 2011 to March 2012 were identified (N = 127). Procedures were included if they met standard commercial instructions for use guidelines, used a single manufacturer, and were billed to Medicare diagnosis-related group 238 (n = 49). Using DMAIC (define, measure, analyze, improve, and control) quality improvement methodology, targets for EVAR quality improvement were identified and high-yield changes were implemented. Procedure technical costs were calculated before and after process redesign. Results Perioperative services and clinic visits were identified as targets for quality improvement efforts and cost reduction. Mean technical costs before the intervention were $31,672, with endograft implants accounting for 52%. Pricing redesign in collaboration with hospital purchasing reduced mean EVAR technical costs to $28,607, a 10% reduction in overall cost, with endograft implants now accounting for 46%. Perioperative implementation of instrument tray redesign reduced instrument use by 32% (184 vs 132 instruments), saving $50,000 annually. Unnecessary clinic visits were reduced by 39% (1.6 vs 1.1 clinic visits per patient) through implementation of a preclinic imaging protocol. There was no difference in mean length of stay after the intervention. Conclusions Comprehensive EVAR delivery redesign leads to cost reduction and waste elimination while preserving quality. Future efforts to achieve more competitive and transparent device pricing will make EVAR more cost neutral and enhance its financial sustainability for health care systems. PMID:25935271
The structure of liquid metals probed by XAS
NASA Astrophysics Data System (ADS)
Filipponi, Adriano; Di Cicco, Andrea; Iesari, Fabio; Trapananti, Angela
2017-08-01
X-ray absorption spectroscopy (XAS) is a powerful technique to investigate the short-range order around selected atomic species in condensed matter. The theoretical framework and previous applications to undercooled elemental liquid metals are briefly reviewed. Specific results on undercooled liquid Ni obtained using a peak fitting approach validated on the spectra of solid Ni are presented. This method provides clear evidence that a signature from close-packed triangular configurations of nearest neighbors survives in the liquid state and is clearly detectable below k ≈ 5 Å⁻¹, stimulating the improvement of data-analysis methods that properly account for the ensemble average, such as Reverse Monte Carlo.
Laboratory Diagnosis of Congenital Toxoplasmosis.
Pomares, Christelle; Montoya, Jose G
2016-10-01
Recent studies have demonstrated that screening and treatment for toxoplasmosis during gestation result in a decrease of vertical transmission and clinical sequelae. Early treatment was associated with improved outcomes. Thus, laboratory methods should aim for early identification of infants with congenital toxoplasmosis (CT). Diagnostic approaches should include, at least, detection of Toxoplasma IgG, IgM, and IgA and a comprehensive review of maternal history, including the gestational age at which the mother was infected and treatment. Here, we review laboratory methods for the diagnosis of CT, with emphasis on serological tools. A diagnostic algorithm that takes into account maternal history is presented. Copyright © 2016 Pomares and Montoya.
Eco-Material Selection for Auto Bodies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mayyas, Ahmad T; Omar, Mohammed; Hayajneh, Mohammed T.
In the last decades, the majority of automakers started to include lightweight materials in their vehicles to meet strict environmental regulations and to improve the fuel efficiency of their vehicles. As a result, eco-material selection for vehicles emerged as a new discipline under design for environment. This chapter will summarize methods of eco-material selection for automotive applications with more emphasis on auto-bodies. A set of metrics for eco-material selection that takes into account all economic, environmental and social factors will be developed using numerical and qualitative methods. These metrics cover products' environmental impact, functionality and manufacturability, in addition to the economic and societal factors.
Six Lessons We Learned Applying Six Sigma
NASA Technical Reports Server (NTRS)
Carroll, Napoleon; Casleton, Christa H.
2005-01-01
As Chief Financial Officer of Kennedy Space Center (KSC), I'm not only responsible for financial planning and accounting but also for building strong partnerships with the CFO customers, who include Space Shuttle and International Space Station operations as well as all who manage the KSC Spaceport. My never-ending goal is to design, manage and continuously improve our core business processes so that they deliver world class products and services to the CFO's customers. I became interested in Six Sigma as Christa Casleton (KSC's first Six Sigma Black Belt) applied Six Sigma tools and methods to our Plan and Account for Travel Costs Process. Her analysis was fresh, innovative and thorough but, even more impressive, was her approach to ensure ongoing, continuous process improvement. Encouraged by the results, I launched two more process improvement initiatives aimed at applying Six Sigma principles to CFO processes that not only touch most of my employees but also have direct customer impact. As many of you know, Six Sigma is a measurement scale that compares the output of a process with customer requirements. That's straightforward, but demands that you not only understand your processes but also know your products and the critical customer requirements. The objective is to isolate and eliminate the causes of process variation so that the customer sees consistently high quality.
Prioritizing quality improvement in general surgery.
Schilling, Peter L; Dimick, Justin B; Birkmeyer, John D
2008-11-01
Despite growing interest in quality improvement, uncertainty remains about which procedures offer the most room for improvement in general surgery. In this context, we sought to describe the relative contribution of different procedures to overall morbidity, mortality, and excess length of stay in general surgery. Using data from the American College of Surgeons' National Surgery Quality Improvement Program (ACS-NSQIP), we identified all patients undergoing a general surgery procedure in 2005 and 2006 (n=129,233). Patients were placed in 36 distinct procedure groups based on Current Procedural Terminology codes. We first examined procedure groups according to their relative contribution to overall morbidity and mortality. We then assessed procedure groups according to their contribution to overall excess length of stay. Ten procedure groups alone accounted for 62% of complications and 54% of excess hospital days. Colectomy accounted for the greatest share of adverse events, followed by small intestine resection, inpatient cholecystectomy, and ventral hernia repair. In contrast, several common procedures contributed little to overall morbidity and mortality. For example, outpatient cholecystectomy, breast procedures, thyroidectomy, parathyroidectomy, and outpatient inguinal hernia repair together accounted for 34% of procedures, but only 6% of complications (and only 4% of major complications). These same procedures accounted for < 1% of excess hospital days. A relatively small number of procedures account for a disproportionate share of the morbidity, mortality, and excess hospital days in general surgery. Focusing quality improvement efforts on these procedures may be an effective strategy for improving patient care and reducing cost.
Research Assessments and Rankings: Accounting for Accountability in "Higher Education Ltd"
ERIC Educational Resources Information Center
Singh, Geeta
2008-01-01
Over the past two decades, higher education in advanced capitalist societies has undergone a process of radical "reform". A key element of this reform has been the introduction of a number of accounting-based techniques in the pursuit of improved accountability and transparency. While the "old" accounting was to do with stewardship, the "new"…
High Accuracy Human Activity Recognition Based on Sparse Locality Preserving Projections.
Zhu, Xiangbin; Qiu, Huiling
2016-01-01
Human activity recognition (HAR) from temporal streams of sensory data has been applied to many fields, such as healthcare services, intelligent environments and cyber security. However, the classification accuracy of most existing methods is not sufficient for some applications, especially healthcare services. To improve accuracy, it is necessary to develop a novel method that takes full account of the intrinsic sequential characteristics of time-series sensory data. Moreover, each human activity may have correlated feature relationships at different levels. Therefore, in this paper, we propose a three-stage continuous hidden Markov model (TSCHMM) approach to recognize human activities. The proposed method contains coarse, fine and accurate classification. Feature reduction is an important step in classification processing. In this paper, sparse locality preserving projections (SpLPP) is exploited to determine the optimal feature subsets for accurate classification of the stationary-activity data. It can extract more discriminative activity features from the sensor data compared with locality preserving projections. Furthermore, all of the gyro-based features are used for accurate classification of the moving data. Compared with other methods, our method uses significantly fewer features, and the overall accuracy is clearly improved.
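For readers unfamiliar with the projection step, the sketch below implements plain locality preserving projections (not the sparse variant of the paper): heat-kernel weights on a k-nearest-neighbour graph, then a generalized eigenproblem. Data, neighbourhood size and kernel width are placeholders.

```python
# Plain locality preserving projections (LPP), not the sparse variant:
# heat-kernel weights on a k-NN graph, then the generalized eigenproblem
# X^T L X a = lambda X^T D X a solved for the smallest eigenvalues.
import numpy as np
from scipy.linalg import eigh
from sklearn.neighbors import kneighbors_graph

def lpp(X, n_components=10, k=5, t=1.0):
    W = kneighbors_graph(X, k, mode="distance").toarray()
    W = np.exp(-W ** 2 / t) * (W > 0)       # heat-kernel edge weights
    W = np.maximum(W, W.T)                  # symmetrize the graph
    D = np.diag(W.sum(axis=1))
    L = D - W                               # graph Laplacian
    A = X.T @ L @ X
    B = X.T @ D @ X + 1e-9 * np.eye(X.shape[1])   # ridge for stability
    _, vecs = eigh(A, B)                    # ascending eigenvalues
    return vecs[:, :n_components]           # projection matrix

# Usage: reduced = X @ lpp(X), then feed `reduced` to the HMM stage.
```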
Prioritizing chemicals for environmental management in China based on screening of potential risks
NASA Astrophysics Data System (ADS)
Yu, Xiangyi; Mao, Yan; Sun, Jinye; Shen, Yingwa
2014-03-01
The rapid development of China's chemical industry has created increasing pressure to improve the environmental management of chemicals. To bridge the large gap between the use and safe management of chemicals, we performed a comprehensive review of the international methods used to prioritize chemicals for environmental management. By comparing domestic and foreign methods, we confirmed the presence of this gap and identified potential solutions. Based on our literature review, we developed an appropriate screening method that accounts for the unique characteristics of chemical use within China. The proposed method is based on an evaluation using nine indices of the potential hazard posed by a chemical: three environmental hazard indices (persistence, bioaccumulation, and eco-toxicity), four health hazard indices (acute toxicity, carcinogenicity, mutagenicity, and reproductive and developmental toxicity), and two environmental exposure hazard indices (chemical amount and utilization pattern). The results of our screening agree with results of previous efforts from around the world, confirming the validity of the new system. The classification method will help decisionmakers to prioritize and identify the chemicals with the highest environmental risk, thereby providing a basis for improving chemical management in China.
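A deliberately simple scoring sketch over the nine indices named above; the 1-5 scales, the extra weight on the exposure indices, and the linear aggregation are assumptions for illustration, not the proposed Chinese screening formula.

```python
# Toy priority score over the nine indices; scales and weights assumed.
import numpy as np

names = ["persistence", "bioaccumulation", "eco_toxicity",      # environment
         "acute_tox", "carcinogenicity", "mutagenicity",        # health
         "repro_dev_tox",
         "amount", "use_pattern"]                               # exposure
weights = np.array([1, 1, 1, 1, 1, 1, 1, 2, 2], dtype=float)    # assumption

def priority_score(indices):
    # indices: dict mapping each index name to a 1-5 hazard score
    v = np.array([indices[n] for n in names], dtype=float)
    return float(v @ weights / weights.sum())

chem = dict.fromkeys(names, 3)
chem["persistence"] = 5
print(priority_score(chem))     # higher score = higher screening priority
```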
On NUFFT-based gridding for non-Cartesian MRI
NASA Astrophysics Data System (ADS)
Fessler, Jeffrey A.
2007-10-01
For MRI with non-Cartesian sampling, the conventional approach to reconstructing images is to use the gridding method with a Kaiser-Bessel (KB) interpolation kernel. Recently, Sha et al. [L. Sha, H. Guo, A.W. Song, An improved gridding method for spiral MRI using nonuniform fast Fourier transform, J. Magn. Reson. 162(2) (2003) 250-258] proposed an alternative method based on a nonuniform FFT (NUFFT) with least-squares (LS) design of the interpolation coefficients. They described this LS_NUFFT method as shift variant and reported that it yielded smaller reconstruction approximation errors than the conventional shift-invariant KB approach. This paper analyzes the LS_NUFFT approach in detail. We show that when one accounts for a certain linear phase factor, the core of the LS_NUFFT interpolator is in fact real and shift invariant. Furthermore, we find that the KB approach yields smaller errors than the original LS_NUFFT approach. We show that optimizing certain scaling factors can lead to a somewhat improved LS_NUFFT approach, but the high computation cost seems to outweigh the modest reduction in reconstruction error. We conclude that the standard KB approach, with appropriate parameters as described in the literature, remains the practical method of choice for gridding reconstruction in MRI.
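A bare-bones 1-D version of KB gridding conveys the mechanics (spread each non-Cartesian sample onto an oversampled grid with a Kaiser-Bessel kernel, inverse FFT, then de-apodize); the kernel width, beta and oversampling below are typical textbook choices, not values from the paper, and de-apodization is omitted.

```python
# Bare-bones 1-D Kaiser-Bessel gridding; width/beta/oversampling are
# typical choices (assumptions), and de-apodization is omitted.
import numpy as np

def grid_1d(coords, data, n=256, os=2.0, width=4, beta=13.9):
    ng = int(n * os)
    grid = np.zeros(ng, dtype=complex)
    for x, d in zip(coords, data):            # x in [-0.5, 0.5)
        gx = (x + 0.5) * ng
        for j in range(int(gx) - width // 2, int(gx) + width // 2 + 1):
            u = (gx - j) / (width / 2.0)
            if abs(u) <= 1.0:                 # inside the kernel support
                w = np.i0(beta * np.sqrt(1.0 - u * u)) / np.i0(beta)
                grid[j % ng] += w * d
    # Inverse FFT back to image space; de-apodization would divide the
    # result by the kernel's Fourier transform.
    return np.fft.fftshift(np.fft.ifft(np.fft.ifftshift(grid)))
```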
Unique strategies to control reproduction in camels.
Skidmore, J A; Morton, K M; Billah, M
2010-01-01
The reproductive efficiency of camels is low under natural pasture conditions, so the use of artificial insemination and embryo transfer is becoming increasingly important to improve their breeding potential. Methods to control their reproductive cycle are therefore essential. This review describes characteristics of the ovarian follicular wave pattern in camels and exogenous hormonal control of ovulation. It also summarizes the difficulties involved with artificial insemination and analyzing the highly gelatinous semen, and reports on the latest methods used to try to reduce the viscosity and liquefy camel semen. In addition, an account is given of different hormonal and physical methods used to synchronise follicular waves, and various hormone treatments used to broaden the availability of ovulated, asynchronous and non-ovulated recipients are discussed.
Kuzmin, S V; Gurvich, V B; Dikonskaya, O V; Malykh, O L; Yarushin, S V; Romanov, S V; Kornilkov, A S
2013-01-01
The information and analytical framework for the introduction of health risk assessment and risk management methodologies in the Sverdlovsk Region is the system of socio-hygienic monitoring. Risk management techniques have been developed and proposed that support the choice of the most cost-effective and efficient actions for improving the sanitary and epidemiologic situation at the level of a region, municipality, or business entity of the Russian Federation. To assess the efficiency of planning and of health risk management activities, common methodological approaches and the economic methods of "cost-effectiveness" and "cost-benefit" analysis, as provided in methodological recommendations and introduced in the Russian Federation, are applied.
Electronic structure of the Cu + impurity center in sodium chloride
NASA Astrophysics Data System (ADS)
Chermette, H.; Pedrini, C.
1981-08-01
The multiple-scattering Xα method is used to describe the electronic structure of Cu+ in sodium chloride. Several improvements are made to the conventional Xα calculation. In particular, the cluster approximation is used by taking into account the external lattice potential. The "transition state" procedure is applied in order to obtain the various multiplet levels. The fine electronic structure of the impurity centers is obtained after a calculation of the spin-orbit interactions. These results are compared with those given by a modified charge-consistent extended Hückel method (Fenske-type calculation) and the merits of each method are discussed. The present calculation produces good quantitative agreement with experiment concerning mainly the optical excitations and the emission mechanism of the Cu+ luminescent centers in NaCl.
Accurate finite difference methods for time-harmonic wave propagation
NASA Technical Reports Server (NTRS)
Harari, Isaac; Turkel, Eli
1994-01-01
Finite difference methods for solving problems of time-harmonic acoustics are developed and analyzed. Multidimensional inhomogeneous problems with variable, possibly discontinuous, coefficients are considered, accounting for the effects of employing nonuniform grids. A weighted-average representation is less sensitive to transition in wave resolution (due to variable wave numbers or nonuniform grids) than the standard pointwise representation. Further enhancement in method performance is obtained by basing the stencils on generalizations of Pade approximation, or generalized definitions of the derivative, reducing spurious dispersion, anisotropy and reflection, and by improving the representation of source terms. The resulting schemes have fourth-order accurate local truncation error on uniform grids and third order in the nonuniform case. Guidelines for discretization pertaining to grid orientation and resolution are presented.
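One classical instance of such a weighted-average (Pade-type) stencil is the fourth-order compact scheme for the 1-D Helmholtz equation u'' + k²u = f on a uniform grid of spacing h; it is a standard result included for orientation, not necessarily the exact stencil of the paper.

```latex
% Fourth-order compact (Pade-type) scheme for u'' + k^2 u = f,
% uniform grid, constant k (standard result; notation ours).
\frac{u_{j-1} - 2u_j + u_{j+1}}{h^2}
+ k^2\,\frac{u_{j-1} + 10\,u_j + u_{j+1}}{12}
= \frac{f_{j-1} + 10\,f_j + f_{j+1}}{12}
```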
Xu, Jiucheng; Mu, Huiyu; Wang, Yun; Huang, Fangzhou
2018-01-01
The selection of feature genes with high recognition ability from the gene expression profiles has gained great significance in biology. However, most of the existing methods have a high time complexity and poor classification performance. Motivated by this, an effective feature selection method, called supervised locally linear embedding and Spearman's rank correlation coefficient (SLLE-SC2), is proposed which is based on the concept of locally linear embedding and correlation coefficient algorithms. Supervised locally linear embedding takes into account class label information and improves the classification performance. Furthermore, Spearman's rank correlation coefficient is used to remove the coexpression genes. The experiment results obtained on four public tumor microarray datasets illustrate that our method is valid and feasible. PMID:29666661
Modelling the training effects of kinaesthetic acuity measurement in children.
Sims, K; Morton, J
1998-07-01
In previous papers (Sims, Henderson, Hulme, & Morton, 1996a; Sims, Henderson, Morton, & Hulme, 1996b) we have found that the motor skills of clumsy children are capable of significant improvement following relatively brief interventions. Most remarkably, this included a 10-minute intervention while testing the kinaesthetic acuity of the children using a staircase method (Pest). In this paper, we show that Pest testing improves the kinaesthetic acuity of normal children as well. We analyse the available data on the development and improvement of motor skills and kinaesthetic acuity and derive a causal model for the underlying skills. We show that at least three independent cognitive/biological components are required to account for the data. These three components are affected differently by the various interventions that have been tried. We deduce that improvement on a general test of motor impairment can be found as a result of training in kinaesthetic acuity or through other, independent factors.
Jha, Abhinav K.; Mena, Esther; Caffo, Brian; Ashrafinia, Saeed; Rahmim, Arman; Frey, Eric; Subramaniam, Rathan M.
2017-01-01
Recently, a class of no-gold-standard (NGS) techniques has been proposed to evaluate quantitative imaging methods using patient data. These techniques provide figures of merit (FoMs) quantifying the precision of the estimated quantitative value without requiring repeated measurements and without requiring a gold standard. However, applying these techniques to patient data presents several practical difficulties, including assessing the underlying assumptions, accounting for patient-sampling-related uncertainty, and assessing the reliability of the estimated FoMs. To address these issues, we propose statistical tests that provide confidence in the underlying assumptions and in the reliability of the estimated FoMs. Furthermore, the NGS technique is integrated within a bootstrap-based methodology to account for patient-sampling-related uncertainty. The developed NGS framework was applied to evaluate four methods for segmenting lesions from 18F-fluoro-2-deoxyglucose positron emission tomography images of patients with head-and-neck cancer on the task of precisely measuring the metabolic tumor volume. The NGS technique consistently predicted the same segmentation method as the most precise method. The proposed framework provided confidence in these results, even when gold-standard data were not available. The bootstrap-based methodology indicated improved performance of the NGS technique with larger numbers of patient studies, as was expected, and yielded consistent results as long as data from more than 80 lesions were available for the analysis. PMID:28331883
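The bootstrap wrapper itself is simple to sketch: resample patients with replacement and re-run the NGS estimate on each replicate. The `ngs_fom` callable below stands in for the authors' NGS estimation step and is an assumption.

```python
# Bootstrap over patients around an NGS evaluation; `ngs_fom` is an
# assumed stand-in for the no-gold-standard estimation step.
import numpy as np

def bootstrap_fom(measurements, ngs_fom, n_boot=1000, seed=0):
    # measurements: patients x methods array (e.g. estimated volumes)
    rng = np.random.default_rng(seed)
    n = measurements.shape[0]
    reps = np.asarray([ngs_fom(measurements[rng.integers(0, n, n)])
                       for _ in range(n_boot)])
    return reps.mean(axis=0), np.percentile(reps, [2.5, 97.5], axis=0)
```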
NASA Technical Reports Server (NTRS)
Streett, C. L.
1981-01-01
A viscous-inviscid interaction method has been developed by using a three-dimensional integral boundary-layer method which produces results in good agreement with a finite-difference method in a fraction of the computer time. The integral method is stable and robust and incorporates a model for computation in a small region of streamwise separation. A locally two-dimensional wake model, accounting for thickness and curvature effects, is also included in the interaction procedure. Computation time spent in converging an interacted result is, many times, only slightly greater than that required to converge an inviscid calculation. Results are shown from the interaction method, run at experimental angle of attack, Reynolds number, and Mach number, on a wing-body test case for which viscous effects are large. Agreement with experiment is good; in particular, the present wake model improves prediction of the spanwise lift distribution and lower surface cove pressure.
Swinburn, Boyd; Kraak, Vivica; Rutter, Harry; Vandevijvere, Stefanie; Lobstein, Tim; Sacks, Gary; Gomes, Fabio; Marsh, Tim; Magnusson, Roger
2015-06-20
To achieve WHO's target to halt the rise in obesity and diabetes, dramatic actions are needed to improve the healthiness of food environments. Substantial debate surrounds who is responsible for delivering effective actions and what, specifically, these actions should entail. Arguments are often reduced to a debate between individual and collective responsibilities, and between hard regulatory or fiscal interventions and soft voluntary, education-based approaches. Genuine progress lies beyond the impasse of these entrenched dichotomies. We argue for a strengthening of accountability systems across all actors to substantially improve performance on obesity reduction. In view of the industry opposition and government reluctance to regulate for healthier food environments, quasiregulatory approaches might achieve progress. A four step accountability framework (take the account, share the account, hold to account, and respond to the account) is proposed. The framework identifies multiple levers for change, including quasiregulatory and other approaches that involve government-specified and government-monitored progress of private sector performance, government procurement mechanisms, improved transparency, monitoring of actions, and management of conflicts of interest. Strengthened accountability systems would support government leadership and stewardship, constrain the influence of private sector actors with major conflicts of interest on public policy development, and reinforce the engagement of civil society in creating demand for healthy food environments and in monitoring progress towards obesity action objectives. Copyright © 2015 Elsevier Ltd. All rights reserved.
Your Audit and Financial Controls.
ERIC Educational Resources Information Center
Hatch, Mary B.; And Others
Audits should be performed on school accounting systems because they are required by law and they provide independent reviews of school financial procedures and suggestions for improvement. A licensed certified public accountant, public accountant, or an accountant who has met the Continuation of Education requirement should perform the audit.…
An Accounting Writing Proficiency Survey
ERIC Educational Resources Information Center
Firch, Tim; Campbell, Annhenrie; Filling, Steven; Lindsay, David H.
2011-01-01
Although there has been much discussion about improving college student writing with college-level courses, little is known about how accounting programs, in particular, are addressing the writing proficiency challenge. This study surveys the 852 accounting programs in the United States to identify the frequency and types of accounting writing…
Two-point method uncertainty during control and measurement of cylindrical element diameters
NASA Astrophysics Data System (ADS)
Glukhov, V. I.; Shalay, V. V.; Radev, H.
2018-04-01
The topic of the article is devoted to the urgent problem of the reliability of measurements of the geometric specifications of technical products. The purpose of the article is to improve the quality of control of parts' linear sizes by the two-point measurement method. The task of the article is to investigate methodical extended uncertainties in measuring the linear sizes of cylindrical elements. The investigation method is geometric modeling of the element surfaces' shape and location deviations in a rectangular coordinate system. The studies were carried out for elements of various service use, taking into account their informativeness, corresponding to the kinematic pair classes in theoretical mechanics and the number of constrained degrees of freedom in the datum element function. Cylindrical elements with informativeness of 4, 2, 1 and 0 (zero) were investigated. The uncertainties in two-point measurements were estimated by comparing the results of linear dimension measurements with the maximum and minimum functional diameters of the element material. Methodical uncertainty arises when cylindrical elements with maximum informativeness have shape deviations of the cut and curvature types. Methodical uncertainty also arises when measuring the element's average size for all types of shape deviations. The two-point measurement method cannot take into account the location deviations of a dimensional element, so its use for elements with informativeness less than the maximum creates unacceptable methodical uncertainties in measurements of the maximum, minimum and medium linear dimensions. Similar methodical uncertainties also exist in the arbitration control of the linear dimensions of cylindrical elements by limiting two-point gauges.
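The two-point limitation is easy to demonstrate numerically: for an odd-lobed profile such as r(θ) = R + e·cos(3θ), every two-point diameter equals 2R, so the form deviation is invisible to the method while the functional (circumscribed) size is larger. The numbers below are assumed for illustration.

```python
# Demo: a three-lobed profile r(theta) = R + e*cos(3*theta) has constant
# two-point diameter 2R, hiding the form deviation. Numbers assumed.
import numpy as np

R, e = 10.0, 0.05
theta = np.linspace(0.0, 2.0 * np.pi, 3600, endpoint=False)
r1 = R + e * np.cos(3.0 * theta)
r2 = R + e * np.cos(3.0 * (theta + np.pi))    # opposite contact point

d_two_point = r1 + r2
print(d_two_point.min(), d_two_point.max())   # both equal 2R = 20.0

# A simple proxy for the circumscribed functional diameter is larger:
print(2.0 * r1.max())                         # 2R + 2e = 20.1
```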
77 FR 25181 - Agency Information Collection Activities: Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-27
... for use in quality improvement activities, plan accountability, public reporting, and improving health... accountable for the quality of care they are delivering. This reporting requirement allows CMS to obtain the... 0938-0969); Frequency: Occasionally; Affected Public: Private Sector (Business or other for-profit and...
Combined Use of Integral Experiments and Covariance Data
NASA Astrophysics Data System (ADS)
Palmiotti, G.; Salvatores, M.; Aliberti, G.; Herman, M.; Hoblit, S. D.; McKnight, R. D.; Obložinský, P.; Talou, P.; Hale, G. M.; Hiruta, H.; Kawano, T.; Mattoon, C. M.; Nobre, G. P. A.; Palumbo, A.; Pigni, M.; Rising, M. E.; Yang, W.-S.; Kahler, A. C.
2014-04-01
In the frame of a US-DOE sponsored project, ANL, BNL, INL and LANL have performed a joint multidisciplinary research activity exploring the combined use of integral experiments and covariance data, with the twin objectives of giving quantitative indications of possible improvements to the ENDF evaluated data files and of reducing crucial reactor design parameter uncertainties. Methods developed over the last four decades for these purposes have been improved by new developments that also benefited from continuous exchanges with international groups working in similar areas. The major new developments that allowed significant progress fall in several specific domains: a) new science-based covariance data; b) integral experiment covariance data assessment and improved experiment analysis, e.g., of sample irradiation experiments; c) sensitivity analysis, where several improvements were necessary despite the generally good understanding of these techniques, e.g., to account for fission spectrum sensitivity; d) a critical approach to the analysis of statistical adjustment performance, both a priori and a posteriori; e) generalization of the assimilation method, now applied for the first time not only to multigroup cross-section data but also to nuclear model parameters (the "consistent" method). This article describes the major results obtained in each of these areas; a large-scale nuclear data adjustment, based on approximately one hundred high-accuracy integral experiments, is reported along with a significant example of the application of the new "consistent" method of data assimilation.
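The statistical adjustment referred to here is conventionally formulated as a generalized least-squares (GLLS) update of prior nuclear data using integral-experiment discrepancies and both covariance matrices. The sketch below shows one such update step; the dimensions, covariances and sensitivities are stand-in values for illustration, not the project's.

```python
import numpy as np

# Hypothetical GLLS adjustment: prior parameters with covariance M,
# integral experiments with covariance V, sensitivity matrix S
# (d(calculated)/d(parameter)), and discrepancies d = E - C(x0).
rng = np.random.default_rng(0)
n_par, n_exp = 6, 3
M = np.diag([0.02] * n_par) ** 2        # prior (nuclear-data) covariance
V = np.diag([0.01] * n_exp) ** 2        # experiment covariance
S = rng.normal(size=(n_exp, n_par))     # sensitivities (stand-in values)
d = rng.normal(scale=0.01, size=n_exp)  # calculated/experiment discrepancies

# Kalman-like gain, parameter adjustments, and reduced posterior covariance
G = M @ S.T @ np.linalg.inv(S @ M @ S.T + V)
dx = G @ d
M_post = M - G @ S @ M

print("adjustments       :", np.round(dx, 4))
print("posterior std devs:", np.sqrt(np.diag(M_post)).round(4))
```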
26 CFR 1.164-1 - Deduction for taxes.
Code of Federal Regulations, 2010 CFR
2010-04-01
... the taxable year within which paid or accrued, according to the method of accounting used in computing... thereto, during the taxable year even though the taxpayer uses the accrual method of accounting for other... for the taxable year in which paid or accrued, according to the method of accounting used in computing...
7 CFR 1942.128 - Borrower accounting methods, management reports and audits.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 7 Agriculture 13 2010-01-01 2009-01-01 true Borrower accounting methods, management reports and... Rescue and Other Small Community Facilities Projects § 1942.128 Borrower accounting methods, management... under Public Law 103-354 1942-53, “Cash Flow Report,” instead of page one of schedule one and schedule...
Introducing visual participatory methods to develop local knowledge on HIV in rural South Africa.
Brooks, Chloe; D'Ambruoso, Lucia; Kazimierczak, Karolina; Ngobeni, Sizzy; Twine, Rhian; Tollman, Stephen; Kahn, Kathleen; Byass, Peter
2017-01-01
South Africa is a country faced with complex health and social inequalities, in which HIV/AIDS has had devastating impacts. The study aimed to gain insights into the perspectives of rural communities on HIV-related mortality. A participatory action research (PAR) process, inclusive of a visual participatory method (Photovoice), was initiated to elicit and organise local knowledge and to identify priorities for action in a rural subdistrict underpinned by the Agincourt Health and Socio-Demographic Surveillance System (HDSS). We convened three village-based discussion groups, presented HDSS data on HIV-related mortality, elicited subjective perspectives on HIV/AIDS, systematised these into collective accounts and identified priorities for action. Framework analysis was performed on narrative and visual data, and practice theory was used to interpret the findings. A range of social and health systems factors were identified as causes of and contributors to HIV-related mortality. These included alcohol use/abuse, gender inequalities, stigma around disclosure of HIV status, problems with informal care, poor sanitation, harmful traditional practices, delays in treatment, problems with medications and problematic staff-patient relationships. To address these issues, respondents identified developing youth facilities in communities, improving employment opportunities, providing timely treatment and extending community outreach for health education and health promotion. Addressing social practices of blame, stigma and mistrust around HIV-related mortality may be a useful focus for policy and planning. Research that engages communities and authorities to coproduce evidence can capture these practices, improve communication and build trust. Actions to reduce HIV should go beyond individual agency and structural forces to focus on how social practices embody these elements. Initiating PAR inclusive of visual methods can build shared understandings of disease burdens in social and health systems contexts. This can develop shared accountability and improve staff-patient relationships, which, over time, may address the issues identified, here related to stigma and blame.
Wertz, Marcia Stanley; Nosek, Marcianna; McNiesh, Susan; Marlow, Elizabeth
2011-04-12
This paper illustrates the use of composite first person narrative interpretive methods, as described by Todres, across a range of phenomena. This methodology introduces texture into the presently understood structures of phenomena and thereby creates new understandings of the phenomenon, bringing about a form of understanding that is relationally alive and that contributes to improved caring practices. The method is influenced by the work of Gendlin, Heidegger, van Manen, Gadamer, and Merleau-Ponty. The method's applicability to different research topics is demonstrated through the composite narratives of nursing students learning nursing practice in an accelerated and condensed program, obese female adolescents attempting weight control, chronically ill male parolees, and midlife women experiencing distress during menopause. Within current research, these four phenomena have been predominantly described and understood through quantified articulations that give the reader a structural understanding of the phenomena, but the more embodied or "contextual" human qualities of the phenomena are often not visible. The "what is it like" or the "unsaid" aspects of such human phenomena are not clear to the reader when proxies are used to "account for" a variety of situated conditions. This novel method is employed to re-present narrative data and findings from research through first person accounts that blend the voices of the participants with those of the researcher, emphasizing the connectedness, the "we" among all participants, researchers, and listeners. These re-presentations allow readers to develop more embodied understandings of both the texture and structure of each of the phenomena and illustrate the use of the composite account as a way for researchers to better understand and convey the wholeness of the experience of any phenomenon under inquiry.
Using real options analysis to support strategic management decisions
NASA Astrophysics Data System (ADS)
Kabaivanov, Stanimir; Markovska, Veneta; Milev, Mariyan
2013-12-01
Decision making is a complex process that requires taking into consideration multiple heterogeneous sources of uncertainty. Standard valuation and financial analysis techniques often fail to properly account for all these sources of risk, as well as for the additional flexibility available to management. In this paper we explore applications of a modified binomial tree method for real options analysis (ROA) in an effort to improve the decision-making process. Typical use cases for real options are analyzed, with an elaborate study of the applications and advantages that company management can derive from them. Numeric results based on extending the simple binomial tree approach to multiple sources of uncertainty are provided to demonstrate the improvement effects on management decisions.
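For readers unfamiliar with the machinery, here is a compact Cox-Ross-Rubinstein binomial tree valuing an American option to expand a project; it is the textbook single-uncertainty baseline that such work extends, and all parameter values are illustrative, not the paper's cases.

```python
import numpy as np

def expand_option(V0=100.0, sigma=0.3, r=0.05, T=3.0, n=300,
                  scale=0.5, cost=40.0):
    """Value an American option to expand a project by `scale` at `cost`
    on a CRR binomial tree. Illustrative sketch with invented parameters."""
    dt = T / n
    u = np.exp(sigma * np.sqrt(dt))
    d = 1.0 / u
    p = (np.exp(r * dt) - d) / (u - d)      # risk-neutral probability
    disc = np.exp(-r * dt)

    j = np.arange(n + 1)
    V = V0 * u**j * d**(n - j)              # project values at maturity
    opt = np.maximum(scale * V - cost, 0.0) # exercise payoff at maturity
    for _ in range(n, 0, -1):
        V = V[1:] / u                       # project values one step earlier
        cont = disc * (p * opt[1:] + (1 - p) * opt[:-1])
        opt = np.maximum(scale * V - cost, cont)   # early-exercise check
    return float(opt[0])

print(f"expansion option value: {expand_option():.2f}")
```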
A Worst-Case Approach for On-Line Flutter Prediction
NASA Technical Reports Server (NTRS)
Lind, Rick C.; Brenner, Martin J.
1998-01-01
Worst-case flutter margins may be computed for a linear model with respect to a set of uncertainty operators using the structured singular value. This paper considers an on-line implementation to compute these robust margins in a flight test program. Uncertainty descriptions are updated at test points to account for unmodeled time-varying dynamics of the airplane by ensuring the robust model is not invalidated by measured flight data. Robust margins computed with respect to this uncertainty remain conservative to the changing dynamics throughout the flight. A simulation clearly demonstrates this method can improve the efficiency of flight testing by accurately predicting the flutter margin to improve safety while reducing the necessary flight time.
NASA Astrophysics Data System (ADS)
Torres, Veronica C.; Vuong, Victoria D.; Wilson, Todd; Wewel, Joshua; Byrne, Richard W.; Tichauer, Kenneth M.
2017-09-01
Nerve preservation during surgery is critical because damage can result in significant morbidity. This remains a challenge especially for skull base surgeries where cranial nerves (CNs) are involved, because visualization and access are particularly poor in that location. We present a paired-agent imaging method to enhance identification of CNs using nerve-specific fluorophores. Two myelin-targeting imaging agents were evaluated, Oxazine 4 and Rhodamine 800, and coadministered with a control agent, indocyanine green, either intravenously or topically in rats. Fluorescence imaging was performed on excised brains ex vivo, and nerve contrast was evaluated via paired-agent ratiometric data analysis. Although contrast was improved among all experimental groups using paired-agent imaging compared to conventional, solely targeted imaging, Oxazine 4 applied directly exhibited the greatest enhancement, with a minimum 3 times improvement in CN delineation. This work highlights the importance of accounting for the nonspecific signal of targeted agents, and demonstrates that paired-agent imaging is one method capable of doing so. Although staining, rinsing, and imaging protocols need to be optimized, these findings serve as a demonstration of the potential use of paired-agent imaging to improve contrast of CNs, and consequently, surgical outcome.
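Paired-agent ratiometric analysis divides the targeted-agent image by the co-administered control-agent image so that shared nonspecific retention cancels. A toy numpy sketch under invented signal levels (the nerve band geometry, kinetics and noise are not the study's data):

```python
import numpy as np

rng = np.random.default_rng(1)
shape = (64, 64)
nerve = np.zeros(shape, bool)
nerve[28:36, :] = True                          # hypothetical nerve band
nonspecific = rng.gamma(4.0, 0.25, shape)       # shared retention/perfusion
targeted = nonspecific * (1.0 + 1.5 * nerve) + rng.normal(0, 0.05, shape)
control = nonspecific + rng.normal(0, 0.05, shape)

ratio = targeted / control                      # paired-agent ratiometric image

def cnr(img):
    """Contrast-to-noise ratio of the nerve band against the background."""
    sig, bg = img[nerve], img[~nerve]
    return (sig.mean() - bg.mean()) / bg.std()

print(f"CNR, targeted agent alone: {cnr(targeted):5.1f}")
print(f"CNR, paired-agent ratio  : {cnr(ratio):5.1f}")
```

Because the nonspecific component appears in both channels, it divides out of the ratio, which is why the ratiometric contrast-to-noise is markedly higher than that of the targeted channel alone.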
Strategic Accountability Is Key to Making PLCs Effective
ERIC Educational Resources Information Center
Easton, Lois Brown
2017-01-01
Professional Learning Communities (PLCs) are often criticized for failing to focus on real problems of teaching and learning and for failing to deliver improvement. That is where accountability comes into play. Strategic accountability distinguishes PLCs that are effective from those that are not. Everyone knows what accountability is, but the…
A New Approach to Accountability: Creating Effective Learning Environments for Programs
ERIC Educational Resources Information Center
Surr, Wendy
2012-01-01
This article describes a new paradigm for accountability that envisions afterschool programs as learning organizations continually engaged in improving quality. Nearly 20 years into the era of results-based accountability, a new generation of afterschool accountability systems is emerging. Rather than aiming to test whether programs have produced…
Accountability, California Style: Counting or Accounting?
ERIC Educational Resources Information Center
Russell, Michael; Higgins, Jennifer; Raczek, Anastasia
2004-01-01
Across the nation and at nearly all levels of our educational system, efforts to hold schools accountable for student learning dominate strategies for improving the quality of education. At both the national and state level, student testing stands at the center of educational accountability programs, such that schools are effectively held…
34 CFR 200.12 - Single State accountability system.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 34 Education 1 2010-07-01 2010-07-01 false Single State accountability system. 200.12 Section 200... Improving Basic Programs Operated by Local Educational Agencies State Accountability System § 200.12 Single State accountability system. (a)(1) Each State must demonstrate in its State plan that the State has...
Maluka, Stephen; Kamuzora, Peter; San Sebastián, Miguel; Byskov, Jens; Ndawi, Benedict; Hurtig, Anna-Karin
2010-12-01
In 2006, researchers and decision-makers launched a five-year project - Response to Accountable Priority Setting for Trust in Health Systems (REACT) - to improve planning and priority-setting through implementing the Accountability for Reasonableness framework in Mbarali District, Tanzania. The objective of this paper is to explore the acceptability of Accountability for Reasonableness from the perspectives of the Council Health Management Team, local government officials, health workforce and members of user boards and committees. Individual interviews were carried out with different categories of actors and stakeholders in the district. The interview guide consisted of a series of questions, asking respondents to describe their perceptions regarding each condition of the Accountability for Reasonableness framework in terms of priority setting. Interviews were analysed using thematic framework analysis. Documentary data were used to support, verify and highlight the key issues that emerged. Almost all stakeholders viewed Accountability for Reasonableness as an important and feasible approach for improving priority-setting and health service delivery in their context. However, a few aspects of Accountability for Reasonableness were seen as too difficult to implement given the socio-political conditions and traditions in Tanzania. Respondents mentioned: budget ceilings and guidelines, low level of public awareness, unreliable and untimely funding, as well as the limited capacity of the district to generate local resources as the major contextual factors that hampered the full implementation of the framework in their context. This study was one of the first assessments of the applicability of Accountability for Reasonableness in health care priority-setting in Tanzania. The analysis, overall, suggests that the Accountability for Reasonableness framework could be an important tool for improving priority-setting processes in the contexts of resource-poor settings. However, the full implementation of Accountability for Reasonableness would require a proper capacity-building plan, involving all relevant stakeholders, particularly members of the community since public accountability is the ultimate aim, and it is the community that will live with the consequences of priority-setting decisions.
Aerosol Measurements in the Mid-Atlantic: Trends and Uncertainty
NASA Astrophysics Data System (ADS)
Hains, J. C.; Chen, L. A.; Taubman, B. F.; Dickerson, R. R.
2006-05-01
Elevated levels of PM2.5 are associated with cardiovascular and respiratory problems and even increased mortality rates. In 2002 we ran two commonly used PM2.5 speciation samplers (an IMPROVE sampler and an EPA sampler) in parallel at Fort Meade, Maryland (a suburban site located in the Baltimore-Washington urban corridor). The filters were analyzed at different labs. This experiment allowed us to calculate the 'real world' uncertainties associated with these instruments. The EPA method retrieved a January average PM2.5 mass of 9.3 μg/m3 with a standard deviation of 2.8 μg/m3, while the IMPROVE method retrieved an average mass of 7.3 μg/m3 with a standard deviation of 2.1 μg/m3. The EPA method retrieved a July average PM2.5 mass of 26.4 μg/m3 with a standard deviation of 14.6 μg/m3, while the IMPROVE method retrieved an average mass of 23.3 μg/m3 with a standard deviation of 13.0 μg/m3. We calculated a 5% uncertainty associated with the EPA and IMPROVE methods that accounts for uncertainties in flow control strategies and laboratory analysis. The RMS difference between the two methods in January was 2.1 μg/m3, which is about 25% of the monthly average mass and greater than the uncertainty we calculated. In July the RMS difference between the two methods was 5.2 μg/m3, about 20% of the monthly average mass, and greater than the uncertainty we calculated. The EPA methods retrieve consistently higher concentrations of PM2.5 than the IMPROVE methods on a daily basis in January and July. This suggests a systematic bias, possibly resulting from contamination of one of the sampling methods. We reconstructed the mass and found that both samplers show good correlation between reconstructed and gravimetric mass, though the IMPROVE method has slightly better correlation than the EPA method. In January, organic carbon is the largest contributor to PM2.5 mass, and in July both sulfate and organic matter contribute substantially to PM2.5. Source apportionment models suggest that regional and local power plants are the major sources of sulfate, while mobile and vegetative burning factors are the major sources of organic carbon.
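A quick check of the kind described here: compare the RMS difference of paired daily masses against the difference expected if each sampler really carried only ~5% uncertainty. The daily values below are invented placeholders, not the Fort Meade data.

```python
import numpy as np

def compare(epa, improve, frac_unc=0.05):
    """RMS difference between paired daily PM2.5 masses versus the combined
    method uncertainty; an illustrative check, not the study's full analysis."""
    epa, improve = np.asarray(epa), np.asarray(improve)
    rms = np.sqrt(np.mean((epa - improve) ** 2))
    mean_mass = np.mean((epa + improve) / 2)
    # Uncertainty of a difference of two independent 5%-uncertain values:
    combined = np.sqrt(2) * frac_unc * mean_mass
    return rms, 100 * rms / mean_mass, combined

# Hypothetical paired daily values (ug/m3), EPA biased high as observed:
epa     = [8.1, 11.5, 7.2, 12.4, 9.0, 6.6]
improve = [6.4,  9.2, 5.9, 10.1, 7.3, 5.0]
rms, pct, combined = compare(epa, improve)
print(f"RMS diff {rms:.1f} ug/m3 ({pct:.0f}% of mean) vs ~{combined:.1f} expected")
```

An RMS difference far above the combined instrumental uncertainty, as here, is the signature of a systematic bias between methods rather than random error.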
Molyneux, Sassy; Atela, Martin; Angwenyi, Vibian; Goodman, Catherine
2012-01-01
Public accountability has re-emerged as a top priority for health systems all over the world, and particularly in developing countries where governments have often failed to provide adequate public sector services for their citizens. One approach to strengthening public accountability is through direct involvement of clients, users or the general public in health delivery, here termed ‘community accountability’. The potential benefits of community accountability, both as an end in itself and as a means of improving health services, have led to significant resources being invested by governments and non-governmental organizations. Data are now needed on the implementation and impact of these initiatives on the ground. A search of PubMed using a systematic approach, supplemented by a hand search of key websites, identified 21 papers from low- or middle-income countries describing at least one measure to enhance community accountability that was linked with peripheral facilities. Mechanisms covered included committees and groups (n = 19), public report cards (n = 1) and patients’ rights charters (n = 1). In this paper we summarize the data presented in these papers, including impact, and factors influencing impact, and conclude by commenting on the methods used, and the issues they raise. We highlight that the international interest in community accountability mechanisms linked to peripheral facilities has not been matched by empirical data, and present a conceptual framework and a set of ideas that might contribute to future studies. PMID:22279082
Data for Improvement, Data for Accountability
ERIC Educational Resources Information Center
Weiss, Janet A.
2012-01-01
This commentary on the special issue on data use highlights the distinctions between data systems intended to improve the performance of school staff and those intended to hold schools and districts accountable for outcomes. It advises researchers to be alert to the differences in the policy logics connected with each approach.
A blue/green water-based accounting framework for assessment of water security
NASA Astrophysics Data System (ADS)
Rodrigues, Dulce B. B.; Gupta, Hoshin V.; Mendiondo, Eduardo M.
2014-09-01
A comprehensive assessment of water security can incorporate several water-related concepts, while accounting for Blue and Green Water (BW and GW) types defined in accordance with the hydrological processes involved. Here we demonstrate how a quantitative analysis of provision probability and use of BW and GW can be conducted, so as to provide indicators of water scarcity and vulnerability at the basin level. To illustrate the approach, we use the Soil and Water Assessment Tool (SWAT) to model the hydrology of an agricultural basin (291 km2) within the Cantareira Water Supply System in Brazil. To provide a more comprehensive basis for decision making, we analyze the BW and GW-Footprint components against probabilistic levels (50th and 30th percentile) of freshwater availability for human activities, during a 23 year period. Several contrasting situations of BW provision are distinguished, using different hydrological-based methodologies for specifying monthly Environmental Flow Requirements (EFRs), and the risk of natural EFR violation is evaluated by use of a freshwater provision index. Our results reveal clear spatial and temporal patterns of water scarcity and vulnerability levels within the basin. Taking into account conservation targets for the basin, it appears that the more restrictive EFR methods are more appropriate than the method currently employed at the study basin. The blue/green water-based accounting framework developed here provides a useful integration of hydrologic, ecosystem and human needs information on a monthly basis, thereby improving our understanding of how and where water-related threats to human and aquatic ecosystem security can arise.
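One way to read this accounting: for each month, compare a blue-water footprint against the provision available at a chosen percentile after reserving environmental flows. The sketch below is a loose paraphrase with synthetic series and a stand-in EFR rule; the paper itself uses SWAT hydrology and several hydrological EFR methods.

```python
import numpy as np

rng = np.random.default_rng(7)
# Hypothetical 23-year monthly series (hm3/month) for one sub-basin:
years, months = 23, 12
blue_avail = rng.gamma(6.0, 5.0, (years, months))     # blue water provision
bw_footprint = rng.gamma(4.0, 3.0, (years, months))   # blue water use

# Provision at a probabilistic level (e.g. the 30th percentile across years):
bw_30 = np.percentile(blue_avail, 30, axis=0)

# Stand-in hydrological EFR rule: reserve 30% of mean monthly flow
efr = 0.3 * blue_avail.mean(axis=0)

scarcity = bw_footprint.mean(axis=0) / np.maximum(bw_30 - efr, 1e-9)
provision_index = (blue_avail > efr + bw_footprint).mean(axis=0)  # P(no EFR violation)

for m in range(months):
    print(f"month {m + 1:2d}: scarcity {scarcity[m]:4.2f}, "
          f"freshwater provision index {provision_index[m]:.2f}")
```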
NASA Astrophysics Data System (ADS)
Medjkoune, Sofiane; Mouchère, Harold; Petitrenaud, Simon; Viard-Gaudin, Christian
2013-01-01
The work reported in this paper concerns the problem of mathematical expression recognition, a task known to be very hard. We propose to alleviate the difficulties by taking into account two complementary modalities: handwriting and audio. To combine the signals coming from both modalities, various fusion methods are explored. Performances evaluated on the HAMEX dataset show a significant improvement compared to a single-modality (handwriting) based system.
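As one concrete example of a fusion rule (the paper compares several; this particular weighted log-linear combination and its weight are assumptions, not the authors' chosen method):

```python
import numpy as np

def fuse(hw_probs, audio_probs, alpha=0.7):
    """Weighted log-linear late fusion of per-symbol posteriors from a
    handwriting and an audio recognizer; one of many possible fusion rules."""
    hw, au = np.asarray(hw_probs), np.asarray(audio_probs)
    log_score = alpha * np.log(hw + 1e-12) + (1 - alpha) * np.log(au + 1e-12)
    score = np.exp(log_score)
    return score / score.sum()

# Ambiguous handwritten symbol: '1' vs '|' vs '/'; the audio clearly says "one".
symbols = ["1", "|", "/"]
fused = fuse([0.40, 0.35, 0.25], [0.90, 0.05, 0.05])
print(dict(zip(symbols, fused.round(3))))   # audio disambiguates toward "1"
```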
Code of Federal Regulations, 2010 CFR
2010-04-01
... the taxpayer is changed to a method proper under the accrual method of accounting, then the taxpayer may elect to have such change treated as not a change in method of accounting to which the provisions... recomputed under a proper method of accounting for dealer reserve income for each taxable year to which the...
Vision improvement in pilots with presbyopia following perceptual learning.
Sterkin, Anna; Levy, Yuval; Pokroy, Russell; Lev, Maria; Levian, Liora; Doron, Ravid; Yehezkel, Oren; Fried, Moshe; Frenkel-Nir, Yael; Gordon, Barak; Polat, Uri
2017-11-24
Israeli Air Force (IAF) pilots continue flying combat missions after the symptoms of natural near-vision deterioration, termed presbyopia, begin to be noticeable. Because modern pilots rely on the displays of the aircraft control and performance instruments, near visual acuity (VA) is essential in the cockpit. We aimed to apply a method previously shown to improve visual performance of presbyopes, and to test whether presbyopic IAF pilots can overcome the limitation imposed by presbyopia. Participants were selected by the IAF aeromedical unit as having at least initial presbyopia and trained using a structured personalized perceptual learning method (GlassesOff application), based on detecting briefly presented low-contrast Gabor stimuli, under conditions of spatial and temporal constraints, from a distance of 40 cm. Our results show that despite their initial visual advantage over age-matched peers, training resulted in robust improvements in various basic visual functions, including static and temporal VA, stereoacuity, spatial crowding, contrast sensitivity and contrast discrimination. Moreover, improvements generalized to higher-level tasks, such as sentence reading and aerial photography interpretation (specifically designed to reflect IAF pilots' expertise in analyzing noisy low-contrast input). In concert with earlier suggestions, gains in visual processing speed plausibly account, at least partially, for the observed training-induced improvements. Copyright © 2017 Elsevier Ltd. All rights reserved.
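The training stimulus named here is the standard Gabor patch: a sinusoidal grating under a Gaussian envelope. A minimal generator follows, with parameter values that are illustrative rather than those of the GlassesOff protocol.

```python
import numpy as np

def gabor(size=128, wavelength=16, sigma=20, theta=0.0, contrast=0.08):
    """Low-contrast Gabor patch: cosine grating windowed by a Gaussian,
    rendered on a mean-gray background in [0, 1]. Illustrative parameters."""
    half = size // 2
    y, x = np.mgrid[-half:half, -half:half]
    xr = x * np.cos(theta) + y * np.sin(theta)      # grating orientation
    grating = np.cos(2 * np.pi * xr / wavelength)
    envelope = np.exp(-(x**2 + y**2) / (2 * sigma**2))
    return 0.5 + 0.5 * contrast * grating * envelope

patch = gabor()
print(patch.shape, float(patch.min().round(3)), float(patch.max().round(3)))
```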
Code of Federal Regulations, 2011 CFR
2011-01-01
... liability account balances at a failed insured depository institution. 360.8 Section 360.8 Banks and Banking... RECEIVERSHIP RULES § 360.8 Method for determining deposit and other liability account balances at a failed... FDIC will use to determine deposit and other liability account balances for insurance coverage and...
Code of Federal Regulations, 2010 CFR
2010-01-01
... liability account balances at a failed insured depository institution. 360.8 Section 360.8 Banks and Banking... RECEIVERSHIP RULES § 360.8 Method for determining deposit and other liability account balances at a failed... FDIC will use to determine deposit and other liability account balances for insurance coverage and...
26 CFR 1.451-5 - Advance payments for goods and long-term contracts.
Code of Federal Regulations, 2010 CFR
2010-04-01
... accounting for tax purposes if such method results in including advance payments in gross receipts no later... the case of a taxpayer accounting for advance payments for tax purposes pursuant to a long-term contract method of accounting under § 1.460-4, or of a taxpayer accounting for advance payments with...
ERIC Educational Resources Information Center
Abeysekera, Indra
2015-01-01
The role of work-integrated learning in student preferences of instructional methods is largely unexplored across the accounting curriculum. This study conducted six experiments to explore student preferences of instructional methods for learning, in six courses of the accounting curriculum that differed in algorithmic rigor, in the context of a…
Off-Angle Iris Correction Methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Santos-Villalobos, Hector J; Thompson, Joseph T; Karakaya, Mahmut
In many real-world iris recognition systems, obtaining consistent frontal images is problematic due to inexperienced or uncooperative users, untrained operators, or distracting environments. As a result many collected images are unusable by modern iris matchers. In this chapter we present four methods for correcting off-angle iris images to appear frontal, which makes them compatible with existing iris matchers. The methods include an affine correction, a retraced model of the human eye, measured displacements, and a genetic algorithm optimized correction. The affine correction represents a simple way to create an iris image that appears frontal, but it does not account for refractive distortions of the cornea. The other methods account for refraction. The retraced model simulates the optical properties of the cornea. The other two methods are data driven. The first uses optical flow to measure the displacements of the iris texture when compared to frontal images of the same subject. The second uses a genetic algorithm to learn a mapping that optimizes the Hamming Distance scores between off-angle and frontal images. In this paper we hypothesize that the biological model presented in our earlier work does not adequately account for all variations in eye anatomy, and therefore the two data-driven approaches should yield better performance. Results using the commercial VeriEye matcher show that the genetic algorithm method clearly improves over prior work and makes iris recognition possible up to 50 degrees off-angle.
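Of the four corrections, the affine one is simple enough to sketch: treat the off-angle view as a horizontal foreshortening by cos(theta) and stretch it back, ignoring corneal refraction exactly as the chapter notes. The sketch below uses scipy's affine_transform on a synthetic disk; the geometry, angle and test image are illustrative assumptions, not the chapter's data.

```python
import numpy as np
from scipy.ndimage import affine_transform

def frontalize_affine(img, theta_deg):
    """Undo the horizontal foreshortening of an off-angle iris image by
    stretching with 1/cos(theta); refraction is deliberately ignored."""
    c = np.cos(np.radians(theta_deg))
    matrix = np.array([[1.0, 0.0],
                       [0.0, c]])          # maps output (y, x) -> input coords
    center = (np.array(img.shape) - 1) / 2.0
    offset = center - matrix @ center      # keep the image centred
    return affine_transform(img, matrix, offset=offset, order=1)

# Toy example: a disk squashed as if viewed 40 degrees off-angle, then corrected
yy, xx = np.mgrid[0:200, 0:200]
frontal = (((yy - 99.5) ** 2 + (xx - 99.5) ** 2) < 60 ** 2).astype(float)
c = np.cos(np.radians(40))
squash = np.array([[1.0, 0.0], [0.0, 1.0 / c]])
off_angle = affine_transform(frontal, squash,
                             offset=(0.0, 99.5 * (1 - 1.0 / c)), order=1)
corrected = frontalize_affine(off_angle, 40)
print(f"mean abs residual vs frontal: {abs(corrected - frontal).mean():.3f}")
```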
Arnold, Suzanne V.; Lei, Yang; Reynolds, Matthew R.; Magnuson, Elizabeth A.; Suri, Rakesh M.; Tuzcu, E. Murat; Petersen, John L.; Douglas, Pamela S.; Svensson, Lars G.; Gada, Hemal; Thourani, Vinod H.; Kodali, Susheel K.; Mack, Michael J.; Leon, Martin B.; Cohen, David J.
2014-01-01
Background In patients with severe aortic stenosis, transcatheter aortic valve replacement (TAVR) improves survival compared with nonsurgical therapy but with higher in-hospital and lifetime costs. Complications associated with TAVR may decrease with greater experience and improved devices, thereby reducing the overall cost of the procedure. Therefore, we sought to estimate the impact of peri-procedural complications on in-hospital costs and length of stay of TAVR. Methods and Results Using detailed cost data from 406 TAVR patients enrolled in the PARTNER I trial, we developed multivariable models to estimate the incremental cost and length of stay associated with specific peri-procedural complications. Attributable costs and length of stay for each complication were calculated by multiplying the independent cost of each event by its frequency in the treatment group. Mean cost for the initial hospitalization was $79,619 ± 40,570 ($50,891 excluding the valve); 49% of patients had ≥1 complication. Seven complications were independently associated with increased hospital costs, with major bleeding, arrhythmia and death accounting for the largest attributable cost per patient. Renal failure and the need for repeat TAVR, although less frequent, were also associated with substantial incremental and attributable costs. Overall, complications accounted for $12,475/patient in initial hospital costs and 2.4 days of hospitalization. Conclusion In the PARTNER trial, peri-procedural complications were frequent, costly, and accounted for approximately 25% of non-implant related hospital costs. Avoidance of complications should improve the cost-effectiveness of TAVR for inoperable and high-risk patients, but reductions in the cost of uncomplicated TAVR will also be necessary for optimal efficiency. PMID:25336467
Case management redesign in an urban facility.
Almaden, Stefany; Freshman, Brenda; Quaye, Beverly
2011-01-01
To explore strategies for improving patient throughput and to redesign case management processes to facilitate level-of-care transitions and safe discharges. Large urban medical center in South Los Angeles County, with 384 licensed beds, that serves poor, underserved communities. Both qualitative and quantitative methods were applied. Combined theoretical frameworks were used for needs assessment, intervention strategies, and change management. Observations, interviews, surveys, and database extraction methods were used. The sample consisted of case management staff members and several other staff from nursing, social work, and the emergency department. Postintervention measures indicated improvement in reimbursements for services, reduction in length of stay, increased productivity, improved patients' access to care, and avoidance of unnecessary readmissions or emergency department visits. Effective change management strategies must consider multiple factors that influence daily operations and service delivery. Creating accountability by using performance measures associated with patient transitions is highlighted by the case study results. The authors developed a process model to assist in identifying and tracking outcome measures related to patient throughput, front-end assessments, and effective patient care transitions. This model can be used in future research to further investigate best case management practices.
Exploiting domain information for Word Sense Disambiguation of medical documents.
Stevenson, Mark; Agirre, Eneko; Soroa, Aitor
2012-01-01
Current techniques for knowledge-based Word Sense Disambiguation (WSD) of ambiguous biomedical terms rely on relations in the Unified Medical Language System Metathesaurus but do not take into account the domain of the target documents. The authors' goal is to improve these methods by using information about the topic of the document in which the ambiguous term appears. The authors proposed and implemented several methods to extract lists of key terms associated with Medical Subject Heading terms. These key terms are used to represent the document topic in a knowledge-based WSD system. They are applied both alone and in combination with local context. A standard measure of accuracy was calculated over the set of target words in the widely used National Library of Medicine WSD dataset. The authors report a significant improvement when combining those key terms with local context, showing that domain information improves the results of a WSD system based on the Unified Medical Language System Metathesaurus alone. The best results were obtained using key terms obtained by relevance feedback and weighted by inverse document frequency.
Exploiting domain information for Word Sense Disambiguation of medical documents
Agirre, Eneko; Soroa, Aitor
2011-01-01
Objective Current techniques for knowledge-based Word Sense Disambiguation (WSD) of ambiguous biomedical terms rely on relations in the Unified Medical Language System Metathesaurus but do not take into account the domain of the target documents. The authors' goal is to improve these methods by using information about the topic of the document in which the ambiguous term appears. Design The authors proposed and implemented several methods to extract lists of key terms associated with Medical Subject Heading terms. These key terms are used to represent the document topic in a knowledge-based WSD system. They are applied both alone and in combination with local context. Measurements A standard measure of accuracy was calculated over the set of target words in the widely used National Library of Medicine WSD dataset. Results and discussion The authors report a significant improvement when combining those key terms with local context, showing that domain information improves the results of a WSD system based on the Unified Medical Language System Metathesaurus alone. The best results were obtained using key terms obtained by relevance feedback and weighted by inverse document frequency. PMID:21900701
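A stripped-down illustration of the domain feature these two records describe: score each candidate sense by the IDF-weighted overlap between the document's terms and the key terms associated with that sense. The toy data below are invented; the real system derives key terms from MeSH (e.g., by relevance feedback) and combines this score with local context.

```python
import math
from collections import Counter

def idf_weights(docs):
    """Inverse document frequency over a background collection."""
    n = len(docs)
    df = Counter(t for d in docs for t in set(d))
    return {t: math.log(n / df[t]) for t in df}

def sense_score(context_terms, key_terms, idf):
    """IDF-weighted overlap between document terms and one sense's key terms;
    a simplified stand-in for the paper's domain feature."""
    return sum(idf.get(t, 0.0) for t in set(context_terms) & set(key_terms))

background = [["cold", "weather"], ["cold", "virus", "symptom"],
              ["temperature", "weather"], ["virus", "infection"]]
idf = idf_weights(background)

context = ["patient", "virus", "symptom", "cold"]
senses = {"common_cold": ["virus", "symptom", "infection"],
          "cold_temperature": ["weather", "temperature"]}
best = max(senses, key=lambda s: sense_score(context, senses[s], idf))
print(best)   # the clinical domain terms push toward the clinical sense
```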
Automated Fall Detection With Quality Improvement “Rewind” to Reduce Falls in Hospital Rooms
Rantz, Marilyn J.; Banerjee, Tanvi S.; Cattoor, Erin; Scott, Susan D.; Skubic, Marjorie; Popescu, Mihail
2014-01-01
The purpose of this study was to test the implementation of a fall detection and "rewind" privacy-protecting technique using the Microsoft® Kinect™ to not only detect but also prevent falls in hospitalized patients. Kinect sensors were placed in six hospital rooms in a step-down unit and data were continuously logged. Prior to implementation with patients, three researchers performed a total of 18 falls (walking and then falling down or falling from the bed) and 17 non-fall events (crouching down, stooping down to tie shoe laces, and lying on the floor). All falls and non-falls were correctly identified using automated algorithms to process Kinect sensor data. During the first 8 months of data collection, processing methods were perfected to manage data and provide a "rewind" method to view events that led to falls for post-fall quality improvement process analyses. Preliminary data from this feasibility study show that the Microsoft Kinect sensors detect falls and fall risks and facilitate quality improvement after falls in real hospital environments, unobtrusively and with patient privacy taken into account. PMID:24296567
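The abstract does not spell out the detection algorithm itself; as a purely illustrative stand-in, a fall flag can be raised when a tracked centroid height (computed from depth data above the floor plane) drops quickly and stays low. All thresholds below are assumptions, not the study's trained classifier.

```python
import numpy as np

def detect_fall(heights, fps=30, low=0.35, fast_drop=0.7, window_s=1.0):
    """Flag a fall when centroid height (metres above the floor) falls by
    more than fast_drop within window_s and ends below low."""
    h = np.asarray(heights)
    w = int(fps * window_s)
    for t in range(w, len(h)):
        if (h[t - w] - h[t]) > fast_drop and h[t] < low:
            return t / fps            # time of detected fall, seconds
    return None

# Simulated trace: walking (~1 m centroid), fall at t = 3 s, then lying still
t = np.arange(0, 6, 1 / 30)
heights = np.where(t < 3, 1.0 + 0.03 * np.sin(6 * t), 0.15)
print(detect_fall(heights))           # ~3.0 s
```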
Pomeroy, Linda; Burnett, Susan; Anderson, Janet E; Fulop, Naomi J
2017-01-01
Background Health systems worldwide are increasingly holding boards of healthcare organisations accountable for the quality of care that they provide. Previous empirical research has found associations between certain board practices and higher quality patient care; however, little is known about how boards govern for quality improvement (QI). Methods We conducted fieldwork over a 30-month period in 15 healthcare provider organisations in England as part of a wider evaluation of a board-level organisational development intervention. Our data comprised board member interviews (n=65), board meeting observations (60 hours) and documents (30 sets of board meeting papers, 15 board minutes and 15 Quality Accounts). We analysed the data using a framework developed from existing evidence of links between board practices and quality of care. We mapped the variation in how boards enacted governance of QI and constructed a measure of QI governance maturity. We then compared organisations to identify the characteristics of those with mature QI governance. Results We found that boards with higher levels of maturity in relation to governing for QI had the following characteristics: explicitly prioritising QI; balancing short-term (external) priorities with long-term (internal) investment in QI; using data for QI, not just quality assurance; engaging staff and patients in QI; and encouraging a culture of continuous improvement. These characteristics appeared to be particularly enabled and facilitated by board-level clinical leaders. Conclusions This study contributes to a deeper understanding of how boards govern for QI. The identified characteristics of organisations with mature QI governance seemed to be enabled by active clinical leadership. Future research should explore the biographies, identities and work practices of board-level clinical leaders and their role in organisation-wide QI. PMID:28689191
Serial Sonographic Assessment of Pulmonary Edema in Patients With Hypertensive Acute Heart Failure.
Martindale, Jennifer L; Secko, Michael; Kilpatrick, John F; deSouza, Ian S; Paladino, Lorenzo; Aherne, Andrew; Mehta, Ninfa; Conigiliaro, Alyssa; Sinert, Richard
2018-02-01
Objective measures of clinical improvement in patients with acute heart failure (AHF) are lacking. The aim of this study was to determine whether repeated lung sonography could semiquantitatively capture changes in pulmonary edema (B-lines) in patients with hypertensive AHF early in the course of treatment. We conducted a feasibility study in a cohort of adults with acute onset of dyspnea, severe hypertension in the field or at triage (systolic blood pressure ≥ 180 mm Hg), and a presumptive diagnosis of AHF. Patients underwent repeated dyspnea and lung sonographic assessments using a 10-cm visual analog scale (VAS) and an 8-zone scanning protocol. Lung sonographic assessments were performed at the time of triage, initial VAS improvement, and disposition from the emergency department. Sonographic pulmonary edema was independently scored offline in a randomized and blinded fashion by using a scoring method that accounted for both the sum of discrete B-lines and degree of B-line fusion. Sonographic pulmonary edema scores decreased significantly from initial to final sonographic assessments (P < .001). The median percentage decrease among the 20 included patient encounters was 81% (interquartile range, 55%-91%). Although sonographic pulmonary edema scores correlated with VAS scores (ρ = 0.64; P < .001), the magnitude of the change in these scores did not correlate with each other (ρ = -0.04; P = .89). Changes in sonographic pulmonary edema can be semiquantitatively measured by serial 8-zone lung sonography using a scoring method that accounts for B-line fusion. Sonographic pulmonary edema improves in patients with hypertensive AHF during the initial hours of treatment. © 2017 by the American Institute of Ultrasound in Medicine.
NASA Astrophysics Data System (ADS)
Barbieri, L.; Adair, C.; Galford, G. L.; Wyngaard, J.
2017-12-01
We present a full season of low-cost sUAS agricultural monitoring for improved GHG emissions accounting and mitigation. Agriculture contributes 10-12% of global anthropogenic GHG emissions, and roughly half of these are from agricultural soils. A variety of land management strategies can be implemented to reduce GHG emissions, but agricultural lands are complex and heterogeneous. Nutrient cycling processes that ultimately regulate GHG emission rates are affected by environmental and management dynamics that vary spatially and temporally (e.g. soil properties, manure spreading). Thus, GHG mitigation potential is also variable, and determining best practices for mitigation is challenging, especially considering potential conflicting pressure to manage agricultural lands for other objectives (e.g. decreasing agricultural runoff). Monitoring this complexity on agricultural lands is critical for regional GHG accounting and decision making, but current methods (e.g., static chambers) are time intensive, expensive, and rely on in-situ equipment. These methods lack the spatio-temporal flexibility necessary to reduce the high uncertainty in regional emissions estimates, while traditional remote sensing methods often do not provide adequate spatio-temporal resolution for robust field-level monitoring. Small Unmanned Aerial Systems (sUAS) provide the range and the rapid-response data collection needed to monitor key variables on the landscape (imagery) and from the atmosphere (CO2 concentrations), and can provide ways to bridge between in-situ and remote sensing data. Initial results show good agreement between sUAS CO2 sensors and more traditional equipment, at a fraction of the cost. We present results from test flights over managed agricultural landscapes in Vermont, showcasing capabilities from both sUAS imagery and atmospheric data collected from on-board sensors (CO2, PTH). We then compare results from two different in-flight data collection methods: vertical profiles and horizontal surveys. We conclude with results from the integration of these sUAS data with concurrently collected in-field measurements from static chambers and Landsat imagery, demonstrating enhanced understanding of agricultural landscapes and improved GHG emissions monitoring with the addition of sUAS-collected data.
Code of Federal Regulations, 2010 CFR
2010-01-01
... methods and principles of accounting prescribed by the state regulatory body having jurisdiction over the... telecommunications companies (47 CFR part 32), as those methods and principles of accounting are supplemented from... instruments by prescribing accounting principles, methodologies, and procedures applicable to all...
Attitudes and Opinions of Canadian Nephrologists Toward Continuous Quality Improvement Options.
Iskander, Carina; McQuillan, Rory; Nesrallah, Gihad; Rabbat, Christian; Mendelssohn, David C
2017-01-01
A shift to holding individual physicians accountable for patient outcomes, rather than facilities, is intuitively attractive to policy makers and to the public. We were interested in nephrologists' attitudes to, and awareness of, quality metrics and how nephrologists would view a potential switch from the current model of facility-based quality measurement and reporting to publically available reports at the individual physician level. The study was conducted using a web-based survey instrument (Online Appendix 1). The survey was initially pilot tested on a group of 8 nephrologists from across Canada. The survey was then finalized and e-mailed to 330 nephrologists through the Canadian Society of Nephrology (CSN) e-mail distribution list. The 127 respondents were 80% university based, and 33% were medical/dialysis directors. The response rate was 43%. Results demonstrate that 89% of Canadian nephrologists are engaged in efforts to improve the quality of patient care. A minority of those surveyed (29%) had training in quality improvement. They feel accountable for this and would welcome the inclusion of patient-centered metrics of care quality. Support for public reporting as an effective strategy on an individual nephrologist level was 30%. Support for public reporting of individual nephrologist performance was low. The care of nephrology patients will be best served by the continued development of a critical mass of physicians trained in patient safety and quality improvement, by focusing on patient-centered metrics of care delivery, and by validating that all proposed new methods are shown to improve patient care and outcomes.
Jha, Manish K.; Minhajuddin, Abu; Greer, Tracy L.; Carmody, Thomas; Rush, A. John; Trivedi, Madhukar H.
2018-01-01
Objective Depression symptom severity, the most commonly studied outcome in antidepressant treatment trials, accounts for only a small portion of burden related to major depression. While lost work productivity is the biggest contributor to depression’s economic burden, few studies have systematically evaluated the independent effect of treatment on work productivity and the relationship between changes in work productivity and longer-term clinical course. Method Work productivity was measured repeatedly by the Work Productivity and Activity Impairment (WPAI) self-report in 331 employed participants with major depression enrolled in the Combining Medications to Enhance Depression Outcomes (CO-MED) trial. Trajectories of change in work productivity during the first 6 weeks of treatment were identified and used to predict remission at 3 and 7 months. Results Participants reported reduced absence from work and increased work productivity with antidepressant treatment even after controlling for changes in depression severity. Three distinct trajectories of changes in work productivity were identified: 1) robust early improvement (24%), 2) minimal change (49%), and 3) high-impairment slight reduction (27%). As compared to other participants, those with robust improvement had 3–5 times higher remission rates at 3 months and 2–5 times higher remission rates at 7 months, even after controlling for select baseline variables and remission status at week 6. Conclusions In this secondary analysis, self-reported work productivity improved in depressed patients with antidepressant treatment even after accounting for depressive symptom reduction. Early improvement in work productivity is associated with much higher remission rates after 3 and 7 months of treatment. PMID:27523501
Raj, Anil; Shim, Heejung; Gilad, Yoav; Pritchard, Jonathan K; Stephens, Matthew
2015-01-01
Understanding global gene regulation depends critically on accurate annotation of regulatory elements that are functional in a given cell type. CENTIPEDE, a powerful, probabilistic framework for identifying transcription factor binding sites from tissue-specific DNase I cleavage patterns and genomic sequence content, leverages the hypersensitivity of factor-bound chromatin and the information in the DNase I spatial cleavage profile characteristic of each DNA binding protein to accurately infer functional factor binding sites. However, the model for the spatial profile in this framework fails to account for the substantial variation in the DNase I cleavage profiles across different binding sites. Neither does it account for variation in the profiles at the same binding site across multiple replicate DNase I experiments, which are increasingly available. In this work, we introduce new methods, based on multi-scale models for inhomogeneous Poisson processes, to account for such variation in DNase I cleavage patterns both within and across binding sites. These models account for the spatial structure in the heterogeneity in DNase I cleavage patterns for each factor. Using DNase-seq measurements assayed in a lymphoblastoid cell line, we demonstrate the improved performance of this model for several transcription factors by comparing against the Chip-seq peaks for those factors. Finally, we explore the effects of DNase I sequence bias on inference of factor binding using a simple extension to our framework that allows for a more flexible background model. The proposed model can also be easily applied to paired-end ATAC-seq and DNase-seq data. msCentipede, a Python implementation of our algorithm, is available at http://rajanil.github.io/msCentipede.
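To make the modelling concrete, here is a deliberately simplified, single-replicate CENTIPEDE-style score: the Poisson log-likelihood of cleavage counts under a site-shaped rate profile versus a flat background of equal total rate. msCentipede's advance, not shown here, is letting that profile vary across sites and replicates via multiscale models; the profiles and counts below are synthetic.

```python
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(3)
L = 40                                    # window around a candidate motif
pos = np.arange(L)

# Stand-in cleavage rate profile with two hypersensitive flanks
# (the real model learns profiles like this from data):
bound_profile = (0.5 + 2.0 * np.exp(-((pos - 8) ** 2) / 10.0)
                     + 2.0 * np.exp(-((pos - 30) ** 2) / 10.0))
background = np.full(L, bound_profile.mean())   # same total rate, flat shape

def log_lr(counts):
    """Poisson log-likelihood ratio, bound vs unbound, for one site."""
    return (poisson.logpmf(counts, bound_profile)
            - poisson.logpmf(counts, background)).sum()

bound_site = rng.poisson(bound_profile)
unbound_site = rng.poisson(background)
print(f"bound site LLR   {log_lr(bound_site):6.1f}")
print(f"unbound site LLR {log_lr(unbound_site):6.1f}")
```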
NASA Astrophysics Data System (ADS)
Barry, Stephen; O'Regan, Bernadette
2016-08-01
This study describes a new methodology to calculate Non-Methane Volatile Organic Compounds from Domestic Solvent Use including Fungicides over the period 1992-2014. Improved emissions data compiled at a much more refined level can help policy-makers develop more effective policies to address environmental issues. However, a number of problems arise when member states attempt to use national statistics for Domestic Solvent Use including Fungicides. For instance, EMEP/EEA (2013) provides no guidance on which activity data should be used, so emission estimates are potentially inconsistent and incomparable. Also, the previous methods and emission factors described in the EMEP/EEA (2013) guidebook do not exactly match data collected by state agencies, which makes using national statistics difficult. In addition, EMEP/EEA (2013) uses broader categories than necessary (e.g. Cosmetics Aerosol/Non Aerosol) to estimate emissions, while activity data are available at a more refined scale (e.g. Personal Cleaning Products, Hair Products, Cosmetics, Deodorants and Perfumes); this can obscure the drivers of emissions. This study builds upon Tzanidakis et al. (2012): it provides a method for collecting activity data from state statistics, develops country-specific emission factors based on a survey of 177 Irish products and, importantly, uses a new method to account for the volatility of organic compounds found in commonly available domestic solvent-containing products. This is the first study to account for volatility based on the characteristics of the organic compounds themselves, and it is therefore considered a more accurate method of accounting for emissions from this source. The results can also provide a simple method for other member parties to account for the volatility of organic compounds using the sectorial adjustment factors described here. For comparison purposes, emission estimates were calculated using the Tier 1 approach currently used in the emission inventory, and using activity data and emission factors both unadjusted and adjusted for volatility. The unadjusted estimate is useful because it demonstrates that failure to properly account for volatility can significantly overestimate emissions from the Domestic Solvent Use sector: unadjusted emissions were 30% lower than the EMEP/EEA (2013) Tier 1 estimate in 2014, and emissions fell a further 20.9% when the volatility of the organic compounds was included. This new method shows that member parties may be significantly overestimating emissions from Domestic Solvent Use including Fungicides, and further work should include refining the organic compound content and the sectorial adjustment factors of products.
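In outline, the estimate is a sum over product groups of consumption x NMVOC content x a volatility adjustment; without the last factor the inventory overstates emissions. All numbers in this sketch are placeholders, not the Irish survey values or the study's adjustment factors.

```python
# Hypothetical per-product-group NMVOC estimate with a volatility adjustment:
#   emission = consumption x NMVOC mass fraction x fraction volatile enough
#              to actually evaporate in use
products = {
    # group: (consumption, t/yr; NMVOC mass fraction; volatility factor)
    "deodorants":    (1200, 0.60, 0.95),  # mostly propellants: near-fully emitted
    "hair products": ( 800, 0.30, 0.70),
    "perfumes":      ( 150, 0.80, 0.85),
    "fungicides":    ( 300, 0.10, 0.40),  # heavier compounds: much retained
}

unadjusted = sum(q * voc for q, voc, _ in products.values())
adjusted = sum(q * voc * vol for q, voc, vol in products.values())
print(f"unadjusted: {unadjusted:7.1f} t NMVOC/yr")
print(f"adjusted  : {adjusted:7.1f} t NMVOC/yr "
      f"({(1 - adjusted / unadjusted) * 100:.0f}% lower with volatility included)")
```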
Addition by Subtraction: The Relation Between Dropout Rates and School-Level Academic Achievement
GLENNIE, ELIZABETH; BONNEAU, KARA; VANDELLEN, MICHELLE; DODGE, KENNETH A.
2013-01-01
Background/Context Efforts to improve student achievement should increase graduation rates. However, work investigating the effects of student-level accountability has consistently demonstrated that increases in the standards for high school graduation are correlated with increases in dropout rates. The most favored explanation for this finding is that high-stakes testing policies that mandate grade repetition and high school exit exams may be the tipping point for students who are already struggling academically. These extra demands may, in fact, push students out of school. Purpose/Objective/Focus This article examines two hypotheses regarding the relation between school-level accountability and dropout rates. The first posits that improvements in school performance lead to improved success for everyone. If school-level accountability systems improve a school for all students, then the proportion of students performing at grade level increases, and the dropout rate decreases. The second hypothesis posits that schools facing pressure to improve their overall accountability score may pursue this increase at the cost of other student outcomes, including dropout rate. Research Design Our approach focuses on the dynamic relation between school-level academic achievement and dropout rates over time—that is, between one year’s achievement and the subsequent year’s dropout rate, and vice versa. This article employs longitudinal data of records on all students in North Carolina public schools over an 8-year period. Analyses employ fixed-effects models clustering schools and districts within years and controls each year for school size, percentage of students who were free/reduced-price lunch eligible, percentage of students who are ethnic minorities, and locale. Findings/Results This study finds partial evidence that improvements in school-level academic performance will lead to improvements (i.e., decreases) in school-level dropout rates. Schools with improved performance saw decreased dropout rates following these successes. However, we find more evidence of a negative side of the quest for improved academic performance. When dropout rates increase, the performance composites in subsequent years increase. Conclusions/recommendations Accountability systems need to remove any indirect benefit a school may receive from increasing its dropout rate. Schools should be held accountable for those who drop out of school. Given the personal and social costs of dropping out, accountability systems need to place more emphasis on dropout prevention. Such an emphasis could encompass increasing the dropout age and having the school’s performance composite include scores of zero on end-of-grade tests for those who leave school. PMID:24013958
Mutale, Wilbroad; Vardoy-Mutale, Anne-Thora; Kachemba, Arthur; Mukendi, Roman; Clarke, Kupela; Mulenga, Dennis
2017-01-01
Background Research has shown that the modes of leadership and management may influence health outcomes. However, the majority of health leaders and managers in many low-income countries are promoted on account of clinical expertise. It has been recognised that these new managers are often ill-prepared for managing complex health systems. In response to this challenge, the Zambian Ministry of Health (MoH) has developed the Governance and Management Capacity Building (GMCB) Strategic Plan (2012–2016), whose overarching goal is to improve health sector governance and create an environment that is result-oriented, accountable and transparent. This led to the introduction of a new in-service leadership and management course, which has come to be known as the Zambia Management and Leadership Academy (ZMLA). This paper presents the results of an impact evaluation of the ZMLA programme conducted in 2014. Methods This was a cross-sectional mixed-methods study targeting health workers, stakeholders and course implementers. ZMLA trainees were surveyed to gauge the extent to which the programme affected levels of self-confidence resulting from knowledge gained. Perspectives were sought from both ZMLA and non-ZMLA trainees to measure changes in the work environment. Stakeholder perspectives were collected from trainers and key informants involved in providing ZMLA training. Results On average, knowledge levels increased by 38% after each workshop. A comparison of the average self-rated scores from 444 management and leadership survey responses before and after ZMLA training showed a significant increase in the proportion of participants who felt adequately trained to undertake management and leadership, from 63% (before) to 99% (after) in the phase I cohort and from 43% (before) to 98% (after) in the phase II cohort. The calculated before-and-after percentage change for work environment themes ranged from 5.8% to 13.4%. The majority of respondents perceived improvements in the workplace environment, especially in handling human resource management matters. The smallest improvement was noted in ethics and accountability. Qualitative interviews showed improvements in the meeting culture and a greater appreciation for the importance of meetings. Shared vision, teamwork and coordination seemed to have improved more in workplaces where the overall manager had received ZMLA training. Conclusion Leadership and management training will be a key ingredient in health system strengthening in low-income settings. The ZMLA model was found to be acceptable and effective in improving knowledge and skills for health system managers with minimal disruption to health services. PMID:28742853
Narrative methods in quality improvement research
Greenhalgh, T; Russell, J; Swinglehurst, D
2005-01-01
This paper reviews and critiques the different approaches to the use of narrative in quality improvement research. The defining characteristics of narrative are chronology (unfolding over time); emplotment (the literary juxtaposing of actions and events in an implicitly causal sequence); trouble (that is, harm or the risk of harm); and embeddedness (the personal story nests within a particular social, historical and organisational context). Stories are about purposeful action unfolding in the face of trouble and, as such, have much to offer quality improvement researchers. But the quality improvement report (a story about efforts to implement change), which is common, must be distinguished carefully from narrative-based quality improvement research (focused systematic enquiry that uses narrative methods to generate new knowledge), which is currently rare. We distinguish four approaches to the use of narrative in quality improvement research—narrative interview; naturalistic story gathering; organisational case study; and collective sense-making—and offer a rationale, describe how data can be collected and analysed, and discuss the strengths and limitations of each using examples from the quality improvement literature. Narrative research raises epistemological questions about the nature of narrative truth (characterised by sense-making and emotional impact rather than scientific objectivity), which has implications for how rigour should be defined (and how it might be achieved) in this type of research. We offer some provisional guidance for distinguishing high quality narrative research in a quality improvement setting from other forms of narrative account such as report, anecdote, and journalism. PMID:16326792
Benefit transfer and spatial heterogeneity of preferences for water quality improvements.
Martin-Ortega, J; Brouwer, R; Ojea, E; Berbel, J
2012-09-15
The improvement in water quality resulting from the implementation of the EU Water Framework Directive is expected to generate substantial non-market benefits. Widespread estimation of these benefits across Europe will require the application of benefit transfer. We use a spatially explicit valuation design to account for the spatial heterogeneity of preferences and help generate lower transfer errors. A map-based choice experiment is applied in the Guadalquivir River Basin (Spain), accounting simultaneously for the spatial distribution of water quality improvements and beneficiaries. Our results show that accounting for the spatial heterogeneity of preferences generally produces lower transfer errors. Copyright © 2012 Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Santare, Michael H.; Pipes, R. Byron; Beaussart, A. J.; Coffin, D. W.; Otoole, B. J.; Shuler, S. F.
1993-01-01
Flexible manufacturing methods are needed to reduce the cost of using advanced composites in structural applications. One method that allows for this is the stretch forming of long discontinuous fiber materials with thermoplastic matrices. In order to exploit this flexibility in an economical way, a thorough understanding of the relationship between manufacturing and component performance must be developed. This paper reviews some of the recent work geared toward establishing this understanding. Micromechanics models have been developed to predict the formability of the material during processing. The latest improvement of these models includes the viscoelastic nature of the matrix and comparison with experimental data. A finite element scheme is described which can be used to model the forming process. This model uses equivalent anisotropic viscosities from the micromechanics models and predicts the microstructure in the formed part. In addition, structural models have been built to account for the material property gradients that can result from the manufacturing procedures. Recent developments in this area include the analysis of stress concentrations and a failure model, each accounting for the heterogeneous material fields.
Sensor fusion for antipersonnel landmine detection: a case study
NASA Astrophysics Data System (ADS)
den Breejen, Eric; Schutte, Klamer; Cremer, Frank
1999-08-01
In this paper the multisensor fusion results obtained within the European research project GEODE are presented. The layout of the test lane and the individual sensors used are described. The implementation of the SCOOP algorithm improves the ROC curves, as both the false alarm surface and the number of false alarms are taken into account. The confidence grids produced by the sensor manufacturers are used as input for the different sensor fusion methods implemented. The multisensor fusion methods implemented are Bayes, Dempster-Shafer, fuzzy probabilities and rules. The mapping of the confidence grids to the input parameters of the fusion methods is an important step. Due to the limited amount of available data, the entire test lane is used for both training and evaluation. All four sensor fusion methods provide better detection results than the individual sensors.
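As one plausible reading of the Bayes fusion step, the sketch below fuses per-pixel confidence grids under a naive conditional-independence assumption; the prior and the interpretation of the grids as per-sensor posteriors are assumptions, not the GEODE project's actual mapping.

```python
# Minimal sketch of naive-Bayes fusion of per-pixel sensor confidences.
# Assumes each grid holds P(mine | that sensor alone) and that sensors are
# conditionally independent given the true class.
import numpy as np

def bayes_fuse(grids, prior=0.5, eps=1e-6):
    """grids: list of 2-D arrays of per-pixel confidences in (0, 1)."""
    prior_lo = np.log(prior / (1 - prior))
    log_odds = prior_lo
    for g in grids:
        g = np.clip(g, eps, 1 - eps)
        # each sensor contributes its log odds, minus the prior it already
        # embodies, so the prior is not counted more than once
        log_odds = log_odds + np.log(g / (1 - g)) - prior_lo
    return 1.0 / (1.0 + np.exp(-log_odds))

fused = bayes_fuse([np.random.rand(64, 64) for _ in range(3)])
```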
DOE Office of Scientific and Technical Information (OSTI.GOV)
Takeda, T.; Shimazu, Y.; Hibi, K.
2012-07-01
Under the R and D project to improve the modeling accuracy for the design of fast breeder reactors, the authors are developing a neutronics calculation method for designing a large commercial sodium-cooled fast reactor. The calculation method is established by taking into account the special features of the reactor, such as the use of annular fuel pellets, inner duct tubes in large fuel assemblies, and a large core. The verification and validation, and uncertainty quantification (V&V and UQ) of the calculation method is being performed by using measured data from the prototype FBR Monju. The results of this project will be used in the design and analysis of the commercial demonstration FBR, known as the Japanese Sodium-cooled Fast Reactor (JSFR). (authors)
One-dimensional stitching interferometry assisted by a triple-beam interferometer
Xue, Junpeng; Huang, Lei; Gao, Bo; ...
2017-04-13
In this work, we propose using a triple-beam interferometer in stitching interferometry to measure both the distance and the tilt of all sub-apertures before the stitching process. The relative piston between two neighboring sub-apertures is then calculated by using the data in the overlapping area. Comparisons are made between our method and the classical least-squares stitching method. Our method can improve the accuracy and repeatability of the classical stitching method when a large number of sub-aperture topographies are taken into account. Our simulations and experiments on flat and spherical mirrors indicate that the proposed method can decrease the influence of interferometer error on the stitched result. The stitched result agrees with Fizeau interferometry data to about 2 nm root mean square, and the repeatability is within ±2.5 nm peak to valley.
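The relative-piston step described above can be illustrated with a minimal least-squares sketch; the pairwise overlap differences and sub-aperture count below are hypothetical inputs, not the paper's data.

```python
# Minimal sketch of recovering per-sub-aperture pistons from pairwise
# differences over overlap regions by least squares (the classical step that
# the paper augments with triple-beam distance/tilt measurements).
import numpy as np

def solve_pistons(pairs, n_subapertures):
    """pairs: list of (i, j, d_ij), with d_ij the mean of (map_i - map_j)
    over the overlap of sub-apertures i and j."""
    A = np.zeros((len(pairs) + 1, n_subapertures))
    b = np.zeros(len(pairs) + 1)
    for row, (i, j, d) in enumerate(pairs):
        A[row, i], A[row, j], b[row] = 1.0, -1.0, d
    A[-1, 0] = 1.0  # pin the first piston to zero to fix the global offset
    pistons, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pistons

print(solve_pistons([(0, 1, 5.0), (1, 2, -2.0), (0, 2, 3.0)], 3))
```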
Taguchi method of experimental design in materials education
NASA Technical Reports Server (NTRS)
Weiser, Martin W.
1993-01-01
Some of the advantages and disadvantages of the Taguchi Method of experimental design as applied to Materials Science will be discussed. This is a fractional factorial method that employs the minimum number of experimental trials for the information obtained. The analysis is also very simple to use and teach, which is quite advantageous in the classroom. In addition, the Taguchi loss function can be easily incorporated to emphasize that improvements in reproducibility are often at least as important as optimization of the response. The disadvantages of the Taguchi Method include the fact that factor interactions are normally not accounted for, there are zero degrees of freedom if all of the possible factors are used, and randomization is normally not used to prevent environmental biasing. In spite of these disadvantages it is felt that the Taguchi Method is extremely useful for both teaching experimental design and as a research tool, as will be shown with a number of brief examples.
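The loss-function point can be made concrete: Taguchi's quadratic loss L(y) = k(y − m)² penalizes variability around the target m, so two processes with the same mean can carry very different expected losses. The constant k below is an illustrative choice, not a value from the text.

```python
# Minimal sketch of the Taguchi quadratic loss function mentioned above:
# loss grows with squared deviation from the target, so reproducibility
# matters as well as the mean response.
def taguchi_loss(y, target, k):
    """L(y) = k * (y - target)**2; k is often set as A / delta**2, where
    exceeding the tolerance delta incurs cost A."""
    return k * (y - target) ** 2

# Two processes with the same mean: the noisier one carries higher mean loss.
samples_a = [9.9, 10.0, 10.1]
samples_b = [9.5, 10.0, 10.5]
k = 4.0  # illustrative: cost A = 1 at deviation delta = 0.5
for s in (samples_a, samples_b):
    print(sum(taguchi_loss(y, 10.0, k) for y in s) / len(s))
```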
Measures of Kindergarten Spelling and Their Relations to Later Spelling Performance.
Treiman, Rebecca; Kessler, Brett; Pollo, Tatiana Cury; Byrne, Brian; Olson, Richard K
2016-01-01
Learning the orthographic forms of words is important for both spelling and reading. To determine whether some methods of scoring children's early spellings predict later spelling performance better than do other methods, we analyzed data from 374 U.S. and Australian children who took a 10-word spelling test at the end of kindergarten (mean age 6 years, 2 months) and a standardized spelling test approximately two years later. Surprisingly, scoring methods that took account of phonological plausibility did not outperform methods that were based only on orthographic correctness. The scoring method that is most widely used in research with young children, which allots a certain number of points to each word and which considers both orthographic and phonological plausibility, did not rise to the top as a predictor. Prediction of Grade 2 spelling performance was improved to a small extent by considering children's tendency to reverse letters in kindergarten.
ERIC Educational Resources Information Center
Kane, Thomas J.; Staiger, Douglas O.; Geppert, Jeffrey
2002-01-01
The accountability debate tends to devolve into a battle between the pro-testing and anti-testing crowds. When it comes to the design of a school accountability system, the devil is truly in the details. A well-designed accountability plan may go a long way toward giving school personnel the kinds of signals they need to improve performance.…
Accountability for Learning: How Teachers and School Leaders Can Take Charge
ERIC Educational Resources Information Center
Reeves, Douglas B.
2004-01-01
Accountability. The very mention of the word strikes fear in the hearts of many teachers and school leaders, leading to confusion and panic rather than improved student achievement. Author Douglas B. Reeves explains how to transform accountability from destructive and demoralizing accounting drills into a constructive decision-making process that…
ERIC Educational Resources Information Center
Cameron, Robyn Ann; O'Leary, Conor
2015-01-01
Ethical instruction is critical in accounting education. However, does accounting ethics teaching actually instil core ethical values or simply catalogue how students should act when confronted with typical accounting ethical dilemmas? This study extends current literature by distinguishing between moral/ethical and legal/ethical matters and then…
Jha, Manish Kumar; Minhajuddin, Abu; Thase, Michael E.; Jarrett, Robin B.
2014-01-01
Background Major Depressive Disorder is common, often recurrent and/or chronic. Theoretically, assessing quality of life (QoL) in addition to the current practice of assessing depressive symptoms has the potential to offer a more comprehensive evaluation of the effects of treatment interventions and course of illness. Methods Before and after acute-phase cognitive therapy (CT), 492 patients from the Continuation Phase Cognitive Therapy Relapse Prevention trial (Jarrett et al., 2013; Jarrett and Thase, 2010) completed the Quality of Life Enjoyment and Satisfaction Questionnaire (Q-LES-Q), the Inventory of Depressive Symptomatology Self-report (IDS-SR) and the Beck Depression Inventory (BDI); clinicians completed the 17-item Hamilton Rating Scale for Depression. Repeated-measures analysis of variance evaluated the improvement in QoL before/after CT and measured the effect sizes. Change analyses to assess clinical significance (Hageman and Arrindell, 1999) were conducted. Results At the end of acute-phase CT, repeated-measures analysis of variance showed a statistically significant increase in Q-LES-Q scores, with effect sizes of 0.48-1.3; 76.9-91.4% of patients reported clinically significant improvement. Yet only 11-38.2% of QoL scores normalized. An analysis of covariance showed that change in depression severity (covariates = IDS-SR, BDI) completely accounted for the improvement in Q-LES-Q scores. Limitations There were only two time points of observation; clinically significant change analyses lacked matched normal controls; and generalizability is constrained by sampling characteristics. Conclusions Quality of life improves significantly in patients with recurrent MDD after CT; however, this improvement is completely accounted for by change in depression severity. Normalization of QoL in all patients may require targeted, additional, and/or longer treatment. PMID:25082112
Improving patient-level costing in the English and the German 'DRG' system.
Vogl, Matthias
2013-03-01
The purpose of this paper is to develop ways to improve patient-level cost apportioning (PLCA) in the English and German inpatient 'DRG' cost accounting systems, to support regulators in improving costing schemes, and to give clinicians and hospital management sophisticated tools to measure and link their management. The paper analyzes and evaluates the PLCA step in the cost accounting schemes of both countries according to the impact on the key aspects of DRG introduction: transparency and efficiency. The goal is to generate a best available PLCA standard with enhanced accuracy and managerial relevance, the main requirements of cost accounting. A best available PLCA standard in 'DRG' cost accounting uses: (1) the cost-matrix from the German system; (2) a third axis in this matrix, representing service-lines or clinical pathways; (3) a scoring system for key cost drivers with the long-term objective of time-driven activity-based costing and (4) a point of delivery separation. Both systems have elements that the other system can learn from. By combining their strengths, regulators are supported in enhancing PLCA systems, improving the accuracy of national reimbursement and the managerial relevance of inpatient cost accounting systems, in order to reduce costs in health care. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
Kern, Christoph; Deutschmann, Tim; Werner, Cynthia; Sutton, A. Jeff; Elias, Tamar; Kelly, Peter J.
2012-01-01
Sulfur dioxide (SO2) is monitored using ultraviolet (UV) absorption spectroscopy at numerous volcanoes around the world due to its importance as a measure of volcanic activity and a tracer for other gaseous species. Recent studies have shown that failure to take realistic radiative transfer into account during the spectral retrieval of the collected data often leads to large errors in the calculated emission rates. Here, the framework for a new evaluation method which couples a radiative transfer model to the spectral retrieval is described. In it, absorption spectra are simulated, and atmospheric parameters are iteratively updated in the model until a best match to the measurement data is achieved. The evaluation algorithm is applied to two example Differential Optical Absorption Spectroscopy (DOAS) measurements conducted at Kilauea volcano (Hawaii). The resulting emission rates were 20 and 90% higher than those obtained with a conventional DOAS retrieval performed between 305 and 315 nm, respectively, depending on the different SO2 and aerosol loads present in the volcanic plume. The internal consistency of the method was validated by measuring and modeling SO2 absorption features in a separate wavelength region around 375 nm and comparing the results. Although additional information about the measurement geometry and atmospheric conditions is needed in addition to the acquired spectral data, this method for the first time provides a means of taking realistic three-dimensional radiative transfer into account when analyzing UV-spectral absorption measurements of volcanic SO2 plumes.
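For orientation, here is a minimal sketch of the conventional DOAS retrieval that the coupled radiative-transfer method improves upon: the measured optical depth is fit with an SO2 absorption cross section plus a low-order polynomial for broadband extinction. The input arrays are assumed to share one wavelength grid; this is a simplified baseline, not the paper's algorithm.

```python
# Minimal sketch of a conventional DOAS fit for the SO2 slant column.
import numpy as np

def doas_scd(wavelength, I0, I, sigma_so2, poly_order=3):
    """wavelength, I0 (reference), I (measured), sigma_so2: 1-D arrays on a
    common grid; returns the fitted SO2 slant column density."""
    tau = np.log(I0 / I)  # measured optical depth
    # design matrix: SO2 cross section plus polynomial broadband terms
    basis = [sigma_so2] + [wavelength**p for p in range(poly_order + 1)]
    A = np.stack(basis, axis=1)
    coeffs, *_ = np.linalg.lstsq(A, tau, rcond=None)
    return coeffs[0]  # coefficient on sigma_so2 = slant column density
```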
Imputation approaches for animal movement modeling
Scharf, Henry; Hooten, Mevin B.; Johnson, Devin S.
2017-01-01
The analysis of telemetry data is common in animal ecological studies. While the collection of telemetry data for individual animals has improved dramatically, the methods to properly account for inherent uncertainties (e.g., measurement error, dependence, barriers to movement) have lagged behind. Still, many new statistical approaches have been developed to infer unknown quantities affecting animal movement or predict movement based on telemetry data. Hierarchical statistical models are useful to account for some of the aforementioned uncertainties, as well as provide population-level inference, but they often come with an increased computational burden. For certain types of statistical models, it is straightforward to provide inference if the latent true animal trajectory is known, but challenging otherwise. In these cases, approaches related to multiple imputation have been employed to account for the uncertainty associated with our knowledge of the latent trajectory. Despite the increasing use of imputation approaches for modeling animal movement, the general sensitivity and accuracy of these methods have not been explored in detail. We provide an introduction to animal movement modeling and describe how imputation approaches may be helpful for certain types of models. We also assess the performance of imputation approaches in two simulation studies. Our simulation studies suggest that inference for model parameters directly related to the location of an individual may be more accurate than inference for parameters associated with higher-order processes such as velocity or acceleration. Finally, we apply these methods to analyze a telemetry data set involving northern fur seals (Callorhinus ursinus) in the Bering Sea. Supplementary materials accompanying this paper appear online.
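Where a model is fit to each imputed trajectory separately, the per-imputation estimates are commonly pooled with Rubin's rules; the sketch below shows that pooling step for a single scalar parameter, as a generic illustration rather than the paper's implementation.

```python
# Minimal sketch of Rubin's rules for combining estimates across multiply
# imputed animal trajectories.
import numpy as np

def pool_rubin(estimates, variances):
    """estimates, variances: per-imputation point estimates and variances
    of one movement parameter."""
    m = len(estimates)
    qbar = np.mean(estimates)       # pooled point estimate
    ubar = np.mean(variances)       # within-imputation variance
    b = np.var(estimates, ddof=1)   # between-imputation variance
    total_var = ubar + (1 + 1 / m) * b
    return qbar, total_var

print(pool_rubin([0.9, 1.1, 1.0], [0.04, 0.05, 0.045]))
```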
Dowdy, David W; Pai, Madhukar
2012-11-01
Epidemiology occupies a unique role as a knowledge-generating scientific discipline with roots in the knowledge translation of public health practice. As our fund of incompletely-translated knowledge expands and as budgets for health research contract, epidemiology must rediscover and adapt its historical skill set in knowledge translation. The existing incentive structures of academic epidemiology - designed largely for knowledge generation - are ill-equipped to train and develop epidemiologists as knowledge translators. A useful heuristic is the epidemiologist as Accountable Health Advocate (AHA) who enables society to judge the value of research, develops new methods to translate existing knowledge into improved health, and actively engages with policymakers and society. Changes to incentive structures could include novel funding streams (and review), alternative publication practices, and parallel frameworks for professional advancement and promotion.
Applicability of LET to single events in microelectronic structures
NASA Astrophysics Data System (ADS)
Xapsos, Michael A.
1992-12-01
LET is often used as a single parameter to determine the energy deposited in a microelectronic structure by a single event. The accuracy of this assumption is examined for ranges of ion energies and volumes of silicon appropriate for modern microelectronics. It is shown to be accurate only under very restricted conditions. Significant differences arise because (1) LET is related to energy lost by the ion, not energy deposited in the volume; and (2) LET is an average value and does not account for statistical variations in energy deposition. Criteria are suggested for determining when factors other than LET should be considered, and new analytical approaches are presented to account for them. One implication of these results is that improvements can be made in space upset rate predictions by incorporating the new methods into currently used codes such as CREME and CRUP.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Borowik, Piotr, E-mail: pborow@poczta.onet.pl; Thobel, Jean-Luc, E-mail: jean-luc.thobel@iemn.univ-lille1.fr; Adamowicz, Leszek, E-mail: adamo@if.pw.edu.pl
Standard computational methods used to incorporate the Pauli exclusion principle into Monte Carlo (MC) simulations of electron transport in semiconductors may give unphysical results in the low-field regime, where the obtained electron distribution function takes values exceeding unity. Modified algorithms have already been proposed that correctly account for electron scattering on phonons or impurities. The present paper extends this approach and proposes an improved simulation scheme that includes the Pauli exclusion principle for electron–electron (e–e) scattering in MC simulations. Simulations with significantly reduced computational cost recreate correct values of the electron distribution function. The proposed algorithm is applied to study transport properties of degenerate electrons in graphene with e–e interactions. This required adapting the treatment of e–e scattering to the case of a linear band dispersion relation. Hence, this part of the simulation algorithm is described in detail.
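The standard rejection step that such schemes build on can be sketched as follows: a proposed final state is accepted with probability 1 − f(k′), where f is the current estimate of the distribution function. This shows only the textbook Pauli-blocking idea, not the paper's modified algorithm.

```python
# Minimal sketch of Pauli-blocking rejection in ensemble Monte Carlo
# transport: scattering into a nearly full state is almost always blocked.
import numpy as np

rng = np.random.default_rng(0)

def accept_scattering(f_final_state):
    """f_final_state: occupation estimate of the proposed final state, in [0, 1]."""
    return rng.random() < 1.0 - f_final_state

# Empirical acceptance rates match 1 - f as expected.
print(sum(accept_scattering(0.95) for _ in range(10000)) / 10000)  # ~0.05
print(sum(accept_scattering(0.05) for _ in range(10000)) / 10000)  # ~0.95
```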
The Choreography of Accountability
ERIC Educational Resources Information Center
Webb, P. Taylor
2006-01-01
The prevailing performance discourse in education claims school improvements can be achieved through transparent accountability procedures. The article identifies how teachers generate performances of their work in order to satisfy accountability demands. By identifying sources of teachers' knowledge that produce choreographed performances, I…
47 CFR 32.2680 - Amortizable tangible assets.
Code of Federal Regulations, 2010 CFR
2010-10-01
... SYSTEM OF ACCOUNTS FOR TELECOMMUNICATIONS COMPANIES Instructions for Balance Sheet Accounts § 32.2680... acquired under capital leases and the original cost of leasehold improvements of the type of character required of Class A companies in Accounts 2681 and 2682. ...
Nguyen, Phuong Hong; Headey, Derek; Frongillo, Edward A; Tran, Lan Mai; Rawat, Rahul; Ruel, Marie T; Menon, Purnima
2017-01-01
Background: Child linear growth sometimes improves in both intervention and comparison groups in evaluations of nutrition interventions, possibly because of spillover intervention effects to nonintervention areas or improvements in underlying determinants of nutritional change in both areas. Objective: We aimed to understand what changes in underlying socioeconomic characteristics and behavioral factors are important in explaining improvements in child linear growth. Methods: Baseline (2010) and endline (2014) surveys from the Alive & Thrive impact evaluation were used to identify the underlying determinants of height-for-age z scores (HAZs) among children aged 24–48 mo in Bangladesh (n = 4311) and 24–59 mo in Vietnam (n = 4002). Oaxaca-Blinder regression decompositions were used to examine which underlying determinants contributed to HAZ changes over time. Results: HAZs improved significantly between 2010 and 2014 in Bangladesh (∼0.18 SDs) and Vietnam (0.25 SDs). Underlying determinants improved substantially over time and were larger in Vietnam than in Bangladesh. Multiple regression models revealed significant associations between changes in HAZs and socioeconomic status (SES), food security, maternal education, hygiene, and birth weight in both countries. Changes in HAZs were significantly associated with maternal nutrition knowledge and child dietary diversity in Bangladesh, and with prenatal visits in Vietnam. Improvements in maternal nutrition knowledge in Bangladesh accounted for 20% of the total HAZ change, followed by maternal education (13%), SES (12%), hygiene (10%), and food security (9%). HAZ improvements in Vietnam were accounted for by changes in SES (26%), prenatal visits (25%), hygiene (19%), child birth weight (10%), and maternal education (7%). The decomposition models in both countries performed well, explaining >75% of the HAZ changes. Conclusions: Decomposition is a useful and simple technique for analyzing nonintervention drivers of nutritional change in intervention and comparison areas. Improvements in underlying determinants explained rapid improvements in HAZs between 2010 and 2014 in Bangladesh and Vietnam. PMID:28122930
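A minimal sketch of the "explained" component of an Oaxaca-Blinder decomposition of the kind described, using pooled coefficients (one common variant); the data frames, outcome and covariate names are illustrative assumptions, not the study's code.

```python
# Minimal sketch: each determinant's contribution to the HAZ change is its
# regression coefficient times the change in its mean between rounds.
import pandas as pd
import statsmodels.api as sm

def explained_contributions(df_base, df_end, outcome, covariates):
    """df_base, df_end: survey rounds; returns per-covariate contributions."""
    pooled = pd.concat([df_base, df_end])
    X = sm.add_constant(pooled[covariates])
    beta = sm.OLS(pooled[outcome], X).fit().params
    delta_means = df_end[covariates].mean() - df_base[covariates].mean()
    return beta[covariates] * delta_means

# e.g. explained_contributions(base, end, "haz",
#                              ["ses", "maternal_edu", "hygiene"])  # hypothetical names
```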
Improved modeling of clinical data with kernel methods.
Daemen, Anneleen; Timmerman, Dirk; Van den Bosch, Thierry; Bottomley, Cecilia; Kirk, Emma; Van Holsbeke, Caroline; Valentin, Lil; Bourne, Tom; De Moor, Bart
2012-02-01
Despite the rise of high-throughput technologies, clinical data such as age, gender and medical history guide clinical management for most diseases and examinations. To improve clinical management, available patient information should be fully exploited. This requires appropriate modeling of relevant parameters. When kernel methods are used, traditional kernel functions such as the linear kernel are often applied to the set of clinical parameters. These kernel functions, however, have their disadvantages due to the specific characteristics of clinical data, which are a mix of variable types, each with its own range. We propose a new kernel function specifically adapted to the characteristics of clinical data. The clinical kernel function provides a better representation of patients' similarity by equalizing the influence of all variables and taking into account the range r of the variables. Moreover, it is robust with respect to changes in r. Incorporated in a least squares support vector machine, the new kernel function results in significantly improved diagnosis, prognosis and prediction of therapy response. This is illustrated on four clinical data sets within gynecology, with an average increase in test area under the ROC curve (AUC) of 0.023, 0.021, 0.122 and 0.019, respectively. Moreover, when combining clinical parameters and expression data in three case studies on breast cancer, results improved overall with use of the new kernel function and when considering both data types in a weighted fashion, with a larger weight assigned to the clinical parameters. The increase in AUC with respect to a standard kernel function and/or unweighted data combination was at most 0.127, 0.042 and 0.118 for the three case studies. For clinical data consisting of variables of different types, the proposed kernel function, which takes into account the type and range of each variable, has been shown to be a better alternative for linear and non-linear classification problems. Copyright © 2011 Elsevier B.V. All rights reserved.
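The following is a minimal sketch of a clinical kernel of the kind described, under the assumption that each continuous variable contributes a range-normalized similarity, each categorical variable an exact-match indicator, and the per-variable kernels are averaged so no variable dominates; the variable names and values are illustrative.

```python
# Minimal sketch of a range-aware clinical kernel between two patients.
import numpy as np

def clinical_kernel(x, z, ranges, categorical):
    """x, z: 1-D patient vectors; ranges[i]: range of variable i over the
    data (ignored for categorical); categorical[i]: True if nominal."""
    sims = []
    for xi, zi, r, cat in zip(x, z, ranges, categorical):
        if cat:
            sims.append(1.0 if xi == zi else 0.0)  # exact-match indicator
        else:
            sims.append((r - abs(xi - zi)) / r)    # similarity in [0, 1]
    return float(np.mean(sims))

# age (range 60 yr), gender (categorical), a lab value (range 500) -- hypothetical
print(clinical_kernel([55, 1, 35.0], [60, 1, 80.0], [60, None, 500],
                      [False, True, False]))
```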
ERIC Educational Resources Information Center
Hosal-Akman, Nazli; Simga-Mugan, Can
2010-01-01
This study explores the effect of teaching methods on the academic performance of students in accounting courses. The study was carried out over two semesters at a well-known university in Turkey in principles of financial accounting and managerial accounting courses. Students enrolled in the courses were assigned to treatment and control groups.…
ERIC Educational Resources Information Center
Prichard Committee for Academic Excellence, Lexington, KY.
As part of a Kentucky effort to improve public education and to increase taxpayer confidence in expanded educational funding, a citizen committee examined and made recommendations on approaches to school accountability and assessment, particularly student testing in the Kentucky Instructional Results Information System (KIRIS). The recommendations…
Getting tough with home health accounts receivable.
Cantone, L; Bullock, A
2000-04-01
Home health organizations seeking to improve the performance of their accounts receivable should concentrate first on establishing practices that promote the greatest percentage of clean claims and reviewing payments received to be sure rates paid are optimal. They also may find it worthwhile to survey their business office operations for potential improvements to key areas.
Toma, Madalina; Dreischulte, Tobias; Gray, Nicola M; Campbell, Diane; Guthrie, Bruce
2018-07-01
As quality improvement (QI) programmes have become progressively larger scale, the risks of implementation having unintended consequences are increasingly recognised. More routine use of balancing measures to monitor unintended consequences has been proposed to evaluate overall effectiveness, but in practice published improvement interventions hardly ever report identification or measurement of consequences other than intended goals of improvement. We conducted 15 semistructured interviews and two focus groups with 24 improvement experts to explore the current understanding of balancing measures in QI and inform a more balanced accounting of the overall impact of improvement interventions. Data were analysed iteratively using the framework approach. Participants described the consequences of improvement in terms of desirability/undesirability and the extent to which they were expected/unexpected when planning improvement. Four types of consequences were defined: expected desirable consequences (goals); expected undesirable consequences (trade-offs); unexpected undesirable consequences (unpleasant surprises); and unexpected desirable consequences (pleasant surprises). Unexpected consequences were considered important but rarely measured in existing programmes, and an improvement pause to take stock after implementation would allow these to be more actively identified and managed. A balanced accounting of all consequences of improvement interventions can facilitate staff engagement and reduce resistance to change, but has to be offset against the cost of additional data collection. Improvement measurement is usually focused on measuring intended goals, with minimal use of balancing measures, which, when used, typically monitor trade-offs expected before implementation. This paper proposes that improvers and leaders should seek a balanced accounting of all consequences of improvement across the life of an improvement programme, including deliberately pausing after implementation to identify and quantitatively or qualitatively evaluate any pleasant or unpleasant surprises. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Hilderink, Henk B M; Plasmans, Marjanne H D; Snijders, Bianca E P; Boshuizen, Hendriek C; Poos, M J J C René; van Gool, Coen H
2016-01-01
Various Burden of Disease (BoD) studies do not account for multimorbidity in their BoD estimates. Ignoring multimorbidity can lead to inaccuracies in BoD estimations, particularly in ageing populations that include large proportions of persons with two or more health conditions. The objective of this study is to improve BoD estimates for the Netherlands by accounting for multimorbidity. For this purpose, we analyzed different methods for 1) estimating the prevalence of multimorbidity and 2) deriving Disability Weights (DWs) for multimorbidity by using existing data on single health conditions. We included 25 health conditions from the Dutch Burden of Disease study that have a high rate of prevalence and that make a large contribution to the total number of Years Lived with a Disability (YLD). First, we analyzed four methods for estimating the prevalence of multimorbid conditions (i.e. independent, independent age- and sex-specific, dependent, and dependent sex- and age-specific). Secondly, we analyzed three methods for calculating the Combined Disability Weights (CDWs) associated with multimorbid conditions (i.e. additive, multiplicative and maximum limit). A combination of these two approaches was used to recalculate the number of YLDs, which is a component of the Disability-Adjusted Life Years (DALY). This study shows that the YLD estimates for 25 health conditions are 5% lower when calculated using the multiplicative method for Combined Disability Weights, and 14% lower when using the maximum limit method, than when calculated using the additive method. Adjusting for sex- and age-specific dependent co-occurrence of health conditions reduces the number of YLDs by 10% for the multiplicative method and by 26% for the maximum limit method. The adjustment is higher for health conditions with a higher prevalence in old age, like heart failure (up to 43%) and coronary heart diseases (up to 33%). Health conditions with a high prevalence in middle age, such as anxiety disorders, have a moderate adjustment (up to 13%). We conclude that BoD calculations that do not account for multimorbidity can result in an overestimation of the actual BoD. This may affect public health policy strategies that focus on single health conditions if the underlying cost-effectiveness analysis overestimates the intended effects. The methodology used in this study could be further refined to provide greater insight into co-occurrence and the possible consequences of multimorbid conditions in terms of disability for particular combinations of health conditions.
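The three combination rules named above can be written down directly; the multiplicative form treats each condition as acting on the remaining healthy fraction. Capping the additive weight at 1.0 is our assumption for keeping the result in range.

```python
# Minimal sketch of additive, multiplicative and maximum-limit combination
# of single-condition disability weights.
def combined_dw(dws, method="multiplicative"):
    if method == "additive":
        return min(sum(dws), 1.0)  # cap at 1.0 (our assumption)
    if method == "maximum":
        return max(dws)
    # multiplicative: CDW = 1 - prod(1 - DW_i)
    out = 1.0
    for dw in dws:
        out *= 1.0 - dw
    return 1.0 - out

dws = [0.3, 0.2]
for m in ("additive", "multiplicative", "maximum"):
    print(m, combined_dw(dws, m))  # 0.5, 0.44, 0.3
```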
Zhang, Yang; Jiang, Ping; Zhang, Hongyan; Cheng, Peng
2018-01-23
Thermal infrared remote sensing has become one of the main technological methods used for urban heat island research. When applying land surface temperature inversion to the thermal infrared band, problems arise with intensity level division because the method is subjective. However, this method is one of the few that performs heat island intensity level identification. This paper builds an intensity level identifier for the urban heat island by applying weakly supervised learning ideas in an improved restricted Boltzmann machine (RBM) model. The identifier automatically initializes the annotation and optimizes the model parameters sequentially until the target identifier is completed. The algorithm needs very little information about the weak labeling of the target training sample and generates an urban heat island intensity spatial distribution map. This study can provide reliable decision-making support for urban ecological planning and effective protection of urban ecological security. The experimental results showed the following: (1) The heat island effect in Wuhan is existent and intense. Heat island areas are widely distributed. Of the intensity classes, the heat island class covers the largest area in Wuhan, followed by the sub-green island class. The total area encompassed by the heat island and strong heat island levels accounts for 54.16% of the land in Wuhan. (2) Based in part on improved RBM identification, this method meets the research demands of determining the spatial distribution characteristics of the internal heat island effect; its identification accuracy is superior to that of comparable methods.
Improved accuracy of intraocular lens power calculation with the Zeiss IOLMaster.
Olsen, Thomas
2007-02-01
This study aimed to demonstrate how the level of accuracy in intraocular lens (IOL) power calculation can be improved with optical biometry using partial optical coherence interferometry (PCI) (Zeiss IOLMaster) and current anterior chamber depth (ACD) prediction algorithms. Intraocular lens power in 461 consecutive cataract operations was calculated using both PCI and ultrasound and the accuracy of the results of each technique were compared. To illustrate the importance of ACD prediction per se, predictions were calculated using both a recently published 5-variable method and the Haigis 2-variable method and the results compared. All calculations were optimized in retrospect to account for systematic errors, including IOL constants and other off-set errors. The average absolute IOL prediction error (observed minus expected refraction) was 0.65 dioptres with ultrasound and 0.43 D with PCI using the 5-variable ACD prediction method (p < 0.00001). The number of predictions within +/- 0.5 D, +/- 1.0 D and +/- 2.0 D of the expected outcome was 62.5%, 92.4% and 99.9% with PCI, compared with 45.5%, 77.3% and 98.4% with ultrasound, respectively (p < 0.00001). The 2-variable ACD method resulted in an average error in PCI predictions of 0.46 D, which was significantly higher than the error in the 5-variable method (p < 0.001). The accuracy of IOL power calculation can be significantly improved using calibrated axial length readings obtained with PCI and modern IOL power calculation formulas incorporating the latest generation ACD prediction algorithms.
Accounting for ecosystem services in life cycle assessment, Part I: a critical review.
Zhang, Yi; Singh, Shweta; Bakshi, Bhavik R
2010-04-01
If life cycle oriented methods are to encourage sustainable development, they must account for the role of ecosystem goods and services, since these form the basis of planetary activities and human well-being. This article reviews methods that are relevant to accounting for the role of nature and that could be integrated into life cycle oriented approaches. These include methods developed by ecologists for quantifying ecosystem services, by ecological economists for monetary valuation, and life cycle methods such as conventional life cycle assessment, thermodynamic methods for resource accounting such as exergy and emergy analysis, variations of the ecological footprint approach, and human appropriation of net primary productivity. Each approach has its strengths: economic methods are able to quantify the value of cultural services; LCA considers emissions and assesses their impact; emergy accounts for supporting services in terms of cumulative exergy; and ecological footprint is intuitively appealing and considers biocapacity. However, no method is able to consider all the ecosystem services, often due to the desire to aggregate all resources in terms of a single unit. This review shows that comprehensive accounting for ecosystem services in LCA requires greater integration among existing methods, hierarchical schemes for interpreting results via multiple levels of aggregation, and greater understanding of the role of ecosystems in supporting human activities. These present many research opportunities that must be addressed to meet the challenges of sustainability.
2014-01-01
Background Protein-protein docking is an in silico method to predict the formation of protein complexes. Due to limited computational resources, the protein-protein docking approach has been developed under the assumption of rigid docking, in which one of the two protein partners remains rigid during the protein associations and the water contribution is ignored or represented only implicitly. Despite obtaining a number of acceptable complex predictions, most initial rigid docking algorithms to date still find it difficult, or even fail, to discriminate the correct predictions from incorrect or false positive ones. To improve the rigid docking results, re-ranking is one of the effective methods that help relocate the correct predictions into top ranks, discriminating them from the other incorrect ones. In this paper, we propose a new re-ranking technique using a new energy-based scoring function, namely IFACEwat - a combined Interface Atomic Contact Energy (IFACE) and water effect. The IFACEwat aims to further improve the discrimination of the near-native structures of the initial rigid docking algorithm ZDOCK3.0.2. Unlike other re-ranking techniques, the IFACEwat explicitly implements interfacial water into the protein interfaces to account for the water-mediated contacts during the protein interactions. Results Our results showed that the IFACEwat both increased the number of near-native structures and improved their ranks compared with the initial rigid docking algorithm ZDOCK3.0.2. In fact, the IFACEwat achieved a success rate of 83.8% for Antigen/Antibody complexes, which is 10% better than ZDOCK3.0.2. Compared with another re-ranking technique, ZRANK, the IFACEwat obtains success rates of 92.3% (8% better) and 90% (5% better) for medium and difficult cases, respectively. When compared with the latest published re-ranking method F2Dock, the IFACEwat performed equivalently well or even better for several Antigen/Antibody complexes. Conclusions With the inclusion of interfacial water, the IFACEwat improves most results of the initial rigid docking, especially for Antigen/Antibody complexes. The improvement is achieved by explicitly taking into account the contribution of water during the protein interactions, which was ignored or not fully represented by the initial rigid docking and other re-ranking techniques. In addition, the IFACEwat maintains the computational efficiency of the initial docking algorithm, yet improves the ranks as well as the number of near-native structures found. As our implementation has so far targeted improving the results of ZDOCK3.0.2, particularly for Antigen/Antibody complexes, we expect more implementations in the near future to make it applicable to other initial rigid docking algorithms. PMID:25521441
The Impact of WIC on Birth Outcomes: New Evidence from South Carolina.
Sonchak, Lyudmyla
2016-07-01
Objectives To investigate the impact of the Special Supplemental Nutrition Program for Women, Infants and Children (WIC) on a variety of infant health outcomes using recent South Carolina Vital Statistics data (2004-2012). Methods To account for non-random WIC participation, the study relies on a maternal fixed effects estimation, due to the availability of unique maternally linked data. Results The results indicate that WIC participation is associated with an increase in birth weight and length of gestation, decrease in the probability of low birth weight, prematurity, and Neonatal Intensive Care Unit admission. Additionally, addressing gestational bias and accounting for the length of gestation, WIC participation is associated with a decrease in the probability of delivering a low weight infant and a small for gestational age infant among black mothers. Conclusions for Practice Accounting for non-random program participation, the study documents a large improvement in birth outcomes among infants of WIC participating mothers. Even in the context of somewhat restrictive gestation-adjusted specification, the positive impact of WIC remains within the subsample of black mothers.
Regional regression of flood characteristics employing historical information
Tasker, Gary D.; Stedinger, J.R.
1987-01-01
Streamflow gauging networks provide hydrologic information for use in estimating the parameters of regional regression models. The regional regression models can be used to estimate flood statistics, such as the 100 yr peak, at ungauged sites as functions of drainage basin characteristics. A recent innovation in regional regression is the use of a generalized least squares (GLS) estimator that accounts for unequal station record lengths and sample cross correlation among the flows. However, this technique does not account for historical flood information. A method is proposed here to adjust this generalized least squares estimator to account for possible information about historical floods available at some stations in a region. The historical information is assumed to be in the form of observations of all peaks above a threshold during a long period outside the systematic record period. A Monte Carlo simulation experiment was performed to compare the GLS estimator adjusted for historical floods with the unadjusted GLS estimator and the ordinary least squares estimator. Results indicate that using the GLS estimator adjusted for historical information significantly improves the regression model. © 1987.
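A minimal sketch of the GLS estimator underlying this approach, assuming an error covariance matrix W has already been constructed from record lengths and cross correlations (building W, and the historical-flood adjustment itself, are the substantive steps the paper addresses):

```python
# Minimal sketch of generalized least squares:
# beta = (X' W^-1 X)^-1 X' W^-1 y.
import numpy as np

def gls(X, y, W):
    """X: n x k basin characteristics; y: n flood statistics (e.g. log
    100 yr peaks); W: n x n error covariance (sampling + model error)."""
    Winv = np.linalg.inv(W)
    return np.linalg.solve(X.T @ Winv @ X, X.T @ Winv @ y)

# With W = I this reduces to ordinary least squares.
```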
NASA Astrophysics Data System (ADS)
Biegert, Edward; Vowinckel, Bernhard; Meiburg, Eckart
2017-07-01
We present a collision model for phase-resolved Direct Numerical Simulations of sediment transport that couple the fluid and particles by the Immersed Boundary Method. Typically, a contact model for these types of simulations comprises a lubrication force for particles in close proximity to another solid object, a normal contact force to prevent particles from overlapping, and a tangential contact force to account for friction. Our model extends the work of previous authors to improve upon the time integration scheme to obtain consistent results for particle-wall collisions. Furthermore, we account for polydisperse spherical particles and introduce new criteria to account for enduring contact, which occurs in many sediment transport situations. This is done without using arbitrary values for physically-defined parameters and by maintaining the full momentum balance of a particle in enduring contact. We validate our model against several test cases for binary particle-wall collisions as well as the collective motion of a sediment bed sheared by a viscous flow, yielding satisfactory agreement with experimental data by various authors.
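A minimal sketch of the normal lubrication force such collision models typically use (F = −6πμR²v/h for a sphere approaching a wall at small gap h); the activation cutoff and roughness floor below are illustrative values, not the paper's calibrated parameters.

```python
# Minimal sketch of a sphere-wall normal lubrication force, activated below
# a cutoff gap and saturated at a small roughness height to avoid the 1/h
# singularity at contact.
import math

def lubrication_force(mu, R, v_normal, gap, h_cut=0.5, h_min=0.01):
    """mu: viscosity; R: particle radius; v_normal: approach velocity
    (positive toward the wall); h_cut, h_min: fractions of R (illustrative)."""
    if gap > h_cut * R:
        return 0.0
    h = max(gap, h_min * R)  # roughness floor
    return -6.0 * math.pi * mu * R**2 * v_normal / h

print(lubrication_force(mu=1e-3, R=1e-3, v_normal=0.1, gap=5e-5))
```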
Wang, Li; Wang, Xiaoyi; Jin, Xuebo; Xu, Jiping; Zhang, Huiyan; Yu, Jiabin; Sun, Qian; Gao, Chong; Wang, Lingbin
2017-03-01
Current methods describe the formation process of algae inaccurately and predict water blooms with low precision. In this paper, the chemical mechanism of algae growth is analyzed, and a correlation analysis of chlorophyll-a and algal density is conducted by chemical measurement. Taking into account the influence of multiple factors on algae growth and water blooms, a comprehensive prediction method combining multivariate time series analysis with intelligent models is put forward in this paper. Firstly, through the process of photosynthesis, the main factors that affect the reproduction of the algae are analyzed. A compensation prediction method for multivariate time series analysis, based on a neural network and a Support Vector Machine, is put forward; it is combined with Kernel Principal Component Analysis to reduce the dimension of the bloom influence factors. Then, a Genetic Algorithm is applied to improve the generalization ability of the BP network and the Least Squares Support Vector Machine. Experimental results show that this method can better compensate the multivariate time series prediction model and is an effective way to improve the description accuracy of algae growth and the prediction precision of water blooms.
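The dimension-reduction step can be sketched with scikit-learn's KernelPCA; the factor count, kernel settings and random data below are illustrative assumptions, not the study's configuration.

```python
# Minimal sketch: Kernel PCA on bloom influence factors before feeding a
# downstream predictor (here the GA-tuned BP network / LS-SVM the paper uses).
import numpy as np
from sklearn.decomposition import KernelPCA

X = np.random.rand(200, 8)         # 8 hypothetical water-quality factors
kpca = KernelPCA(n_components=3, kernel="rbf", gamma=0.5)
X_reduced = kpca.fit_transform(X)  # reduced inputs for the prediction model
print(X_reduced.shape)             # (200, 3)
```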
Characterization of Triaxial Braided Composite Material Properties for Impact Simulation
NASA Technical Reports Server (NTRS)
Roberts, Gary D.; Goldberg, Robert K.; Biniendak, Wieslaw K.; Arnold, William A.; Littell, Justin D.; Kohlman, Lee W.
2009-01-01
The reliability of impact simulations for aircraft components made with triaxial braided carbon fiber composites is currently limited by inadequate material property data and lack of validated material models for analysis. Improvements to standard quasi-static test methods are needed to account for the large unit cell size and localized damage within the unit cell. The deformation and damage of a triaxial braided composite material was examined using standard quasi-static in-plane tension, compression, and shear tests. Some modifications to standard test specimen geometries are suggested, and methods for measuring the local strain at the onset of failure within the braid unit cell are presented. Deformation and damage at higher strain rates is examined using ballistic impact tests on 61- by 61- by 3.2-mm (24- by 24- by 0.125-in.) composite panels. Digital image correlation techniques were used to examine full-field deformation and damage during both quasi-static and impact tests. An impact analysis method is presented that utilizes both local and global deformation and failure information from the quasi-static tests as input for impact simulations. Improvements that are needed in test and analysis methods for better predictive capability are examined.
van der Waals forces in density functional theory: a review of the vdW-DF method.
Berland, Kristian; Cooper, Valentino R; Lee, Kyuho; Schröder, Elsebeth; Thonhauser, T; Hyldgaard, Per; Lundqvist, Bengt I
2015-06-01
A density functional theory (DFT) that accounts for van der Waals (vdW) interactions in condensed matter, materials physics, chemistry, and biology is reviewed. The insights that led to the construction of the Rutgers-Chalmers van der Waals density functional (vdW-DF) are presented with the aim of giving a historical perspective, while also emphasizing more recent efforts which have sought to improve its accuracy. In addition to technical details, we discuss a range of recent applications that illustrate the necessity of including dispersion interactions in DFT. This review highlights the value of the vdW-DF method as a general-purpose method, not only for dispersion bound systems, but also in densely packed systems where these types of interactions are traditionally thought to be negligible.
Epidemiology and Diagnosis of Helicobacter pylori infection.
Mentis, Andreas; Lehours, Philippe; Mégraud, Francis
2015-09-01
During the period reviewed, prevalence studies were performed mainly in less economically advanced countries, and a high prevalence was found. The traditional risk factors for Helicobacter pylori positivity were confirmed in most studies. Transmission studies using molecular typing showed familial transmission. The possible role of waterborne transmission was explored in several studies, with controversial results. Concerning diagnosis, most of the invasive and noninvasive methods used for the diagnosis of H. pylori infection are long-standing and perform well. The most interesting recent improvements in H. pylori diagnosis include advances in endoscopy, developments in molecular methods, and the introduction of omics-based techniques. Interpretation of both older and newer methods should take into account the pretest probability and the prevalence of H. pylori in the population under investigation. © 2015 John Wiley & Sons Ltd.
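The point about pretest probability can be made concrete with Bayes' rule: the same test yields very different positive predictive values at different prevalences. The sensitivity and specificity below are illustrative numbers, not figures from the review.

```python
# Minimal sketch: positive predictive value as a function of prevalence.
def ppv(sens, spec, prevalence):
    tp = sens * prevalence            # true positives per person tested
    fp = (1 - spec) * (1 - prevalence)  # false positives per person tested
    return tp / (tp + fp)

for prev in (0.1, 0.4, 0.8):  # low- to high-prevalence populations
    print(prev, round(ppv(sens=0.95, spec=0.90, prevalence=prev), 3))
```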
Cross-domain latent space projection for person re-identification
NASA Astrophysics Data System (ADS)
Pu, Nan; Wu, Song; Qian, Li; Xiao, Guoqiang
2018-04-01
In this paper, we study the problem of person re-identification and propose a cross-domain latent space projection (CDLSP) method to address the absence or insufficiency of labeled data in the target domain. Under the assumption that the visual features in the source domain and target domain share a similar geometric structure, we transform the visual features from the source domain and target domain into a common latent space by optimizing the objective function defined in the manifold alignment method. Moreover, the proposed objective function takes into account knowledge specific to re-id, with the aim of improving the performance of re-id under complex situations. Extensive experiments conducted on four benchmark datasets show that the proposed CDLSP outperforms or is competitive with state-of-the-art methods for person re-identification.
Model Calibration with Censored Data
Cao, Fang; Ba, Shan; Brenneman, William A.; ...
2017-06-28
Here, the purpose of model calibration is to make the model predictions closer to reality. The classical Kennedy-O'Hagan approach is widely used for model calibration, which can account for the inadequacy of the computer model while simultaneously estimating the unknown calibration parameters. In many applications, the phenomenon of censoring occurs when the exact outcome of the physical experiment is not observed but is only known to fall within a certain region. In such cases, the Kennedy-O'Hagan approach cannot be used directly, and we propose a method to incorporate the censoring information when performing model calibration. The method is applied to study the compression phenomenon of liquid inside a bottle. The results show significant improvement over traditional calibration methods, especially when the number of censored observations is large.
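A minimal sketch of the censoring idea, assuming a toy polynomial simulator and left-censored Gaussian observations: exact measurements contribute density terms and censored ones contribute CDF terms to the likelihood. This omits the Gaussian-process discrepancy term of the actual Kennedy-O'Hagan framework, so it illustrates the censored likelihood only, not the paper's full method.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def simulator(x, theta):
    """Hypothetical stand-in for the computer model f(x, theta)."""
    return theta[0] * x + theta[1] * x**2

def neg_log_lik(params, x, y, censored, limit):
    theta, sigma = params[:2], np.exp(params[2])  # log-sigma keeps sigma > 0
    mu = simulator(x, theta)
    # Exact observations contribute the usual Gaussian density...
    ll = norm.logpdf(y[~censored], mu[~censored], sigma).sum()
    # ...while censored ones contribute only P(Y <= limit) (left-censoring).
    ll += norm.logcdf(limit, mu[censored], sigma).sum()
    return -ll

# Toy data with a detection limit.
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 40)
y = simulator(x, [1.5, -0.8]) + rng.normal(0, 0.05, x.size)
limit = 0.2
censored = y < limit
y = np.where(censored, limit, y)

res = minimize(neg_log_lik, x0=[1.0, 0.0, np.log(0.1)],
               args=(x, y, censored, limit), method="Nelder-Mead")
print("estimated theta:", res.x[:2])
```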
Xiao, Jian; Cao, Hongyuan; Chen, Jun
2017-09-15
Next-generation sequencing technologies have enabled the study of the human microbiome through direct sequencing of microbial DNA, resulting in an enormous amount of microbiome sequencing data. One unique characteristic of microbiome data is the phylogenetic tree that relates all the bacterial species. Closely related bacterial species tend to exhibit a similar relationship with the environment or disease. Thus, incorporating the phylogenetic tree information can potentially improve the detection power of microbiome-wide association studies, where hundreds or thousands of tests are conducted simultaneously to identify bacterial species associated with a phenotype of interest. Despite much progress in multiple testing procedures such as false discovery rate (FDR) control, methods that take the phylogenetic tree into account remain scarce. We propose a new FDR control procedure that incorporates this prior structural information and apply it to microbiome data. The proposed procedure is based on a hierarchical model, in which a structure-based prior distribution is designed to exploit the phylogenetic tree. By borrowing information from neighboring bacterial species, we improve the statistical power of detecting associated bacterial species while controlling the FDR at desired levels. When the phylogenetic tree is mis-specified or non-informative, our procedure achieves power similar to that of traditional procedures that do not take the tree structure into account. We demonstrate the performance of our method through extensive simulations and real microbiome datasets, identifying far more alcohol-drinking-associated bacterial species than traditional methods. R package StructFDR is available from CRAN. chen.jun2@mayo.edu. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
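The sketch below illustrates only the borrowing-strength idea, not the StructFDR algorithm itself: z-scores are shrunk toward phylogenetic neighbors via a distance kernel and then passed to Benjamini-Hochberg. The toy distance matrix, the kernel, and the lack of any variance renormalization after smoothing are all simplifications.

```python
import numpy as np
from scipy.stats import norm

def bh_reject(pvals, alpha=0.05):
    """Benjamini-Hochberg step-up FDR control."""
    m = len(pvals)
    order = np.argsort(pvals)
    below = np.sort(pvals) <= alpha * np.arange(1, m + 1) / m
    k = np.max(np.nonzero(below)[0]) + 1 if below.any() else 0
    reject = np.zeros(m, bool)
    reject[order[:k]] = True
    return reject

def tree_smooth(z, D, rho=1.0):
    """Borrow strength across the phylogeny: shrink each species'
    z-score toward its neighbours, weighted by patristic distance D."""
    W = np.exp(-rho * D)
    W /= W.sum(axis=1, keepdims=True)
    return W @ z

# Hypothetical toy example: three related species share a weak signal.
D = np.array([[0, 1, 1, 5],
              [1, 0, 1, 5],
              [1, 1, 0, 5],
              [5, 5, 5, 0.0]])
z = np.array([2.0, 1.8, 2.1, 0.1])
p = 2 * norm.sf(np.abs(tree_smooth(z, D)))
print(bh_reject(p, alpha=0.1))  # the three relatives are jointly detected
```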
Henze Bancroft, Leah C; Strigel, Roberta M; Hernando, Diego; Johnson, Kevin M; Kelcz, Frederick; Kijowski, Richard; Block, Walter F
2016-03-01
Chemical shift based fat/water decomposition methods such as IDEAL are frequently used in challenging imaging environments with large B0 inhomogeneity. However, they do not account for the signal modulations introduced by a balanced steady-state free precession (bSSFP) acquisition. Here we demonstrate improved performance when the bSSFP frequency response is properly incorporated into the multipeak spectral fat model used in the decomposition process. Balanced SSFP allows rapid imaging but also introduces a characteristic frequency response featuring periodic nulls and pass bands. Fat spectral components in adjacent pass bands experience bulk phase offsets and magnitude modulations that change the expected constructive and destructive interference between the fat spectral components. A bSSFP signal model was incorporated into the fat/water decomposition process and used to generate images of a fat phantom as well as bilateral breast and knee images in four normal volunteers at 1.5 Tesla. Incorporating the bSSFP signal model improved the performance of the fat/water decomposition and allows rapid bSSFP imaging sequences to use robust fat/water decomposition methods such as IDEAL. While only one set of imaging parameters was presented, the method is compatible with any field strength or repetition time. © 2015 Wiley Periodicals, Inc.
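A hedged sketch of the multipeak least-squares step: the fat peak offsets, amplitudes, and echo times below are illustrative stand-ins, not the paper's protocol, and the field map and iterative IDEAL machinery are omitted. The optional `bssfp_phase` argument marks where per-peak pass-band phase offsets would enter the model.

```python
import numpy as np

# Hypothetical acquisition/model parameters (not from the paper).
TE = np.array([1.2e-3, 2.4e-3, 3.6e-3])   # echo times [s]
df = np.array([-420.0, -318.0, 94.0])     # fat peak offsets [Hz] at 1.5 T
alpha = np.array([0.62, 0.28, 0.10])      # relative peak amplitudes

def fat_water_lstsq(signal, bssfp_phase=None):
    """Solve s(TE) = W + F * sum_p alpha_p exp(2i pi df_p TE) for (W, F).

    bssfp_phase: optional per-peak phase offsets modelling which bSSFP
    pass band each fat peak falls into; the paper's point is that
    ignoring these corrupts the decomposition.
    """
    fat_basis = alpha * np.exp(2j * np.pi * np.outer(TE, df))
    if bssfp_phase is not None:
        fat_basis = fat_basis * np.exp(1j * bssfp_phase)
    A = np.column_stack([np.ones(len(TE)), fat_basis.sum(axis=1)])
    x, *_ = np.linalg.lstsq(A, signal, rcond=None)
    return x  # [water, fat] complex amplitudes

# Synthesize a signal from known fractions and recover them.
W_true, F_true = 0.7, 0.3
basis = (alpha * np.exp(2j * np.pi * np.outer(TE, df))).sum(axis=1)
print(fat_water_lstsq(W_true + F_true * basis))  # ~ [0.7, 0.3]
```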
NASA Technical Reports Server (NTRS)
Jammu, Vinay B.; Danai, Koroush; Lewicki, David G.
1996-01-01
A diagnostic method is introduced for helicopter gearboxes that uses knowledge of the gearbox structure and the characteristics of vibration 'features' to define the influences of faults on those features. The 'structural influences' in this method are defined based on the root mean square value of vibration obtained from a simplified lumped-mass model of the gearbox. The structural influences are then converted to fuzzy variables, to account for the approximate nature of the lumped-mass model, and used as the weights of a connectionist network. Diagnosis in this Structure-Based Connectionist Network (SBCN) is performed by propagating the abnormal vibration features through the weights of the SBCN to obtain fault possibility values for each component in the gearbox. Upon occurrence of misdiagnoses, the SBCN can also improve its diagnostic performance: a supervised training method is presented that adapts the weights of the SBCN to minimize the number of misdiagnoses. For experimental evaluation, vibration data from an OH-58A helicopter gearbox collected at NASA Lewis Research Center are used. Diagnostic results indicate that the SBCN is able to diagnose about 80% of the faults without training and improves its performance to nearly 100% after training.
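As an illustration of propagating feature abnormality through structural-influence weights, here is a minimal max-min fuzzy composition; the component and feature names, the weight values, and the choice of max-min composition are hypothetical, and the paper's exact rule may differ.

```python
import numpy as np

# Hypothetical influence weights: rows = gearbox components,
# cols = vibration features; values are fuzzy influence grades that,
# in the paper, come from a lumped-mass-model RMS analysis.
W = np.array([
    [0.9, 0.2, 0.1],   # planetary gear
    [0.3, 0.8, 0.4],   # spiral bevel pinion
    [0.1, 0.3, 0.7],   # mast bearing
])

def fault_possibility(features, W):
    """Max-min fuzzy composition: a component is suspect if some
    feature it strongly influences is strongly abnormal."""
    return np.max(np.minimum(W, features[None, :]), axis=1)

abnormal = np.array([0.1, 0.9, 0.2])   # normalized feature abnormality
print(fault_possibility(abnormal, W))  # highest value for the pinion
```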
NASA Astrophysics Data System (ADS)
Rykov, S. P.; Rykova, O. A.; Koval, V. S.; Makhno, D. E.; Fedotov, K. V.
2018-03-01
The paper analyzes vibrations of a dynamic system equivalent to the suspension system, taking into account the tyre's ability to smooth road irregularities. The research is based on the statistical dynamics of linear automatic control systems and on correlation, spectral, and numerical analysis methods. Introducing new data on the smoothing effect of the pneumatic tyre, which reflect changes in the contact area between the wheel and road under suspension vibrations, makes the system non-linear and requires numerical analysis methods. By taking the variable smoothing ability of the tyre into account when calculating suspension vibrations, one can bring calculated and experimental results closer together and improve on the usual assumption of a constant smoothing ability.
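A minimal quarter-car sketch of the smoothing effect, assuming illustrative parameters throughout: the road profile is averaged over a contact-patch length before exciting the tyre spring. Making the patch length load-dependent would introduce the non-linearity the paper studies.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical quarter-car parameters (illustrative only).
ms, mu_, ks, cs, kt = 400.0, 40.0, 2.0e4, 1.5e3, 2.0e5

def road(x):
    """Short-wave road irregularity [m]."""
    return 0.01 * np.sin(2 * np.pi * x / 0.3)

def smoothed_road(x, a=0.1):
    """Average the profile over the contact patch of half-length a:
    the tyre's smoothing effect. A load-dependent `a` would make the
    smoothing variable, as in the paper."""
    return road(np.linspace(x - a, x + a, 15)).mean()

def rhs(t, y, v=20.0):
    zs, vzs, zu, vzu = y                  # sprung/unsprung states
    q = smoothed_road(v * t)              # smoothed road input
    f_s = ks * (zu - zs) + cs * (vzu - vzs)
    f_t = kt * (q - zu)
    return [vzs, f_s / ms, vzu, (f_t - f_s) / mu_]

sol = solve_ivp(rhs, (0, 2), [0, 0, 0, 0], max_step=1e-3)
print("max sprung-mass displacement:", np.abs(sol.y[0]).max())
```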
Simplification of the Kalman filter for meteorological data assimilation
NASA Technical Reports Server (NTRS)
Dee, Dick P.
1991-01-01
The paper proposes a new statistical method of data assimilation that is based on a simplification of the Kalman filter equations. The forecast error covariance evolution is approximated simply by advecting the mass-error covariance field, deriving the remaining covariances geostrophically, and accounting for external model-error forcing only at the end of each forecast cycle. This greatly reduces the cost of computation of the forecast error covariance. In simulations with a linear, one-dimensional shallow-water model and data generated artificially, the performance of the simplified filter is compared with that of the Kalman filter and the optimal interpolation (OI) method. The simplified filter produces analyses that are nearly optimal, and represents a significant improvement over OI.
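A toy sketch of the covariance simplification, using a cyclic-shift advection operator in place of the shallow-water dynamics and omitting the geostrophic derivation of the remaining covariances: the simplified forecast advects the error covariance and adds model error only once per cycle, whereas the full filter accumulates it every step.

```python
import numpy as np

n = 50
M = np.roll(np.eye(n), 1, axis=0)   # one-step cyclic advection operator
Q = 0.01 * np.eye(n)                # model-error covariance
H = np.eye(n)[::5]                  # observe every 5th grid point
R = 0.04 * np.eye(H.shape[0])

def full_kf_forecast(P):
    """Exact Kalman covariance propagation, one step."""
    return M @ P @ M.T + Q

def simplified_forecast(P, steps=1):
    """Advect the error covariance; add model error once, at the end
    of the forecast cycle, as in the simplified filter."""
    for _ in range(steps):
        P = M @ P @ M.T
    return P + Q

def analysis(x, P, y):
    """Standard Kalman/OI analysis update."""
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
    return x + K @ (y - H @ x), (np.eye(n) - K @ H) @ P

P = 0.1 * np.eye(n)
Pf, Ps = P.copy(), simplified_forecast(P.copy(), steps=5)
for _ in range(5):
    Pf = full_kf_forecast(Pf)
print(np.trace(Pf), np.trace(Ps))  # simplified adds Q once, not 5 times
```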
Completely automated modal analysis procedure based on the combination of different OMA methods
NASA Astrophysics Data System (ADS)
Ripamonti, Francesco; Bussini, Alberto; Resta, Ferruccio
2018-03-01
In this work, a completely automated output-only modal analysis procedure is presented and its benefits are discussed. By merging different Operational Modal Analysis methods with a statistical approach, the identification process is made more robust, returning only the physical natural frequencies, damping ratios, and mode shapes of the system. The effect of temperature can be taken into account as well, yielding a better tool for automated Structural Health Monitoring. The algorithm has been developed and tested on a numerical model of a scaled three-story steel building housed in the laboratories of Politecnico di Milano.
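The abstract does not specify the merging rule; the following sketch shows one plausible statistical step, clustering pole estimates pooled from several OMA runs (different methods or model orders) and keeping only modes on which enough runs agree. The tolerances and the greedy scheme are assumptions, not the paper's algorithm.

```python
import numpy as np

def cluster_poles(freqs, damps, ftol=0.02, min_hits=3):
    """Greedy frequency clustering of pooled pole estimates. A mode is
    kept only if enough runs agree, filtering out spurious poles."""
    poles = sorted(zip(freqs, damps))
    modes, cluster = [], [poles[0]]

    def close(cluster):
        if len(cluster) >= min_hits:
            modes.append((np.mean([f for f, _ in cluster]),
                          np.mean([d for _, d in cluster])))

    for f, d in poles[1:]:
        f0 = np.mean([p[0] for p in cluster])
        if abs(f - f0) / f0 < ftol:
            cluster.append((f, d))
        else:
            close(cluster)
            cluster = [(f, d)]
    close(cluster)
    return modes

freqs = [1.00, 1.01, 0.99, 1.00, 2.50, 3.98, 4.00, 4.02, 4.01]
damps = [.010, .012, .011, .009, .050, .020, .021, .019, .020]
print(cluster_poles(freqs, damps))  # keeps ~1.0 and ~4.0 Hz, drops 2.5
```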
Predicting missing links in complex networks based on common neighbors and distance
Yang, Jinxuan; Zhang, Xiao-Dong
2016-01-01
Algorithms that use the common-neighbors metric to predict missing links in complex networks are very popular, but most of them cannot score links between nodes that share no common neighbors. Reconstructing networks with these methods is therefore not accurate enough in some cases, especially when node pairs have few common neighbors. In this paper we propose a new algorithm, based on common neighbors and distance, to improve the accuracy of link prediction. The proposed algorithm is remarkably effective at predicting missing links between nodes with no common neighbors and performs better than most existing methods on a variety of real-world networks without increasing complexity. PMID:27905526
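A hedged sketch of a score in the spirit of the paper, combining the common-neighbor count with an inverse shortest-path term so that pairs without common neighbors still receive a graded score; the exact combination used here (`cn + lam / d`) is hypothetical, not the paper's formula.

```python
import networkx as nx

def cn_distance_score(G, u, v, lam=1.0):
    """Common neighbours plus a distance term: pairs with no common
    neighbours are still ranked by how close they are in the graph."""
    cn = len(list(nx.common_neighbors(G, u, v)))
    try:
        d = nx.shortest_path_length(G, u, v)
    except nx.NetworkXNoPath:
        return 0.0
    return cn + lam / d

G = nx.karate_club_graph()
candidates = [(u, v) for u in G for v in G
              if u < v and not G.has_edge(u, v)]
top = sorted(candidates, key=lambda e: -cn_distance_score(G, *e))[:5]
print(top)  # five highest-scoring candidate links
```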
48 CFR 9904.412-64 - Transition method.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 48 Federal Acquisition Regulations System 7 2010-10-01 2010-10-01 false Transition method. 9904... ACCOUNTING STANDARDS COST ACCOUNTING STANDARDS 9904.412-64 Transition method. To be acceptable, any method of... previously provided for, shall not be redundantly provided for under revised methods. Conversely, costs that...
48 CFR 9904.412-64 - Transition method.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 48 Federal Acquisition Regulations System 7 2013-10-01 2012-10-01 true Transition method. 9904.412... ACCOUNTING STANDARDS COST ACCOUNTING STANDARDS 9904.412-64 Transition method. To be acceptable, any method of... previously provided for, shall not be redundantly provided for under revised methods. Conversely, costs that...
48 CFR 9904.412-64 - Transition method.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 48 Federal Acquisition Regulations System 7 2014-10-01 2014-10-01 false Transition method. 9904... ACCOUNTING STANDARDS COST ACCOUNTING STANDARDS 9904.412-64 Transition method. To be acceptable, any method of... previously provided for, shall not be redundantly provided for under revised methods. Conversely, costs that...
48 CFR 9904.412-64 - Transition method.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 48 Federal Acquisition Regulations System 7 2012-10-01 2012-10-01 false Transition method. 9904... ACCOUNTING STANDARDS COST ACCOUNTING STANDARDS 9904.412-64 Transition method. To be acceptable, any method of... previously provided for, shall not be redundantly provided for under revised methods. Conversely, costs that...
Modeling Occupancy of Hosts by Mistletoe Seeds after Accounting for Imperfect Detectability
Fadini, Rodrigo F.; Cintra, Renato
2015-01-01
The detection of an organism at a given site is widely used as a state variable in many metapopulation and epidemiological studies. However, failure to detect the species does not necessarily mean that it is absent. Assessing detectability is important for occupancy (presence-absence) surveys, and identifying the factors that reduce detectability may help improve survey precision and efficiency. A method was used to estimate the occupancy status of host trees colonized by mistletoe seeds of Psittacanthus plagiophyllus as a function of host covariates: host size and the presence of mistletoe infections on the same or on the nearest neighboring host (the cashew tree Anacardium occidentale). The technique also evaluated the effect of taking detectability into account when estimating host occupancy by mistletoe seeds. Individual host trees were surveyed for the presence of mistletoe seeds with the aid of two or three observers to estimate detectability and occupancy. Detectability was, on average, 17% higher in focal-host trees with infected neighbors, while it decreased by about 23 to 50% from the smallest to the largest hosts. The presence of mistletoe plants in the sample tree had a negligible effect on detectability. Failure to detect hosts as occupied decreased occupancy estimates by 2.5% on average, with a maximum of 10% for large, isolated hosts. The method presented in this study has potential for metapopulation studies of mistletoes, especially those focusing on the seed stage, and can also improve the accuracy of the occupancy estimates often used in metapopulation studies of tree-dwelling plants in general. PMID:25973754
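A minimal sketch of the detectability correction for a two-observer design: the likelihood mixes "occupied but missed" with "truly unoccupied" for all-zero detection histories. The covariates the study actually models (host size, neighbor infection) are omitted, and the parameter values below are simulated, not the study's estimates.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

# y[i, j] = 1 if observer j recorded mistletoe seeds on tree i.
def neg_log_lik(params, y):
    psi, p = expit(params)        # occupancy and per-visit detection
    n_obs = y.shape[1]
    det = y.sum(1)
    # Trees with >= 1 detection are certainly occupied; all-zero
    # histories mix "occupied but missed" with "truly unoccupied".
    ll_pos = np.log(psi) + det * np.log(p) + (n_obs - det) * np.log(1 - p)
    ll_zero = np.log(psi * (1 - p) ** n_obs + 1 - psi)
    return -np.where(det > 0, ll_pos, ll_zero).sum()

rng = np.random.default_rng(1)
z = rng.random(200) < 0.6                      # true occupancy states
y = (rng.random((200, 2)) < 0.7) & z[:, None]  # two observers, p = 0.7
res = minimize(neg_log_lik, [0.0, 0.0], args=(y,), method="Nelder-Mead")
print("psi, p =", expit(res.x))  # recovers roughly (0.6, 0.7)
```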
Carbon Budget and its Dynamics over Northern Eurasia Forest Ecosystems
NASA Astrophysics Data System (ADS)
Shvidenko, Anatoly; Schepaschenko, Dmitry; Kraxner, Florian; Maksyutov, Shamil
2016-04-01
The presentation contains an overview of recent findings and results on the assessment of carbon cycling in the forest ecosystems of Northern Eurasia. From a methodological point of view, there is a clear trend toward recognizing the need for a Full and Verified Carbon Account (FCA), i.e., a reliable assessment of uncertainties for all modules and all stages of the account. The FCA is treated as a fuzzy (underspecified) system that requires systematic integration of the major methods of studying carbon cycling: the landscape-ecosystem approach (LEA), process-based models, eddy covariance, and inverse modelling. The landscape-ecosystem approach (1) accumulates all relevant knowledge of landscapes and ecosystems, (2) provides a rigorous systems design for the account, (3) contains all relevant spatially distributed empirical and semi-empirical data and models, and (4) is implemented in the form of an Integrated Land Information System (ILIS). The ILIS includes a hybrid land cover, represented in a spatially and temporally explicit way, together with the corresponding attributive databases. The forest mask is derived from multi-sensor remote sensing data using geographically weighted regression and is validated within the GEO-wiki platform. By-pixel parametrization of forest cover is based on special optimization algorithms that use all available knowledge and information sources (forest inventory data and other surveys, in situ observations, official forest management statistics, etc.). Major carbon fluxes within the LEA (NPP, HR, disturbances, etc.) are estimated by fusing empirical data and aggregations with process-based elements through sets of regionally distributed models. Uncertainties within the LEA are assessed for each module and at each step of the account. The LEA results and their uncertainties are then harmonized and mutually constrained with independent outputs obtained by the other methods, using a Bayesian approach. This methodology has been applied to the carbon account of Russian forests for 2000-2012. The Net Ecosystem Carbon Budget (NECB) of Russian forests for this period was in the range of 0.5-0.7 Pg C yr-1, with a slight negative trend over the period due to the acceleration of disturbance regimes and the negative impacts of weather extremes (heat waves, etc.). Uncertainties of the FCA for individual years were estimated at about 25% (CI 0.9). Some models (e.g., the majority of DGVMs) do not describe certain permafrost processes satisfactorily, while the results of ensembles of inverse models are, on average, close to the empirical assessments. A major conclusion from this experience is that future improvements in knowledge of the carbon cycling of Northern Eurasian forests require the development of an integrated observing system as a unified information background, as well as systematic methodological improvements of all methods used to study carbon cycling.
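As a minimal stand-in for the Bayesian harmonization step, the sketch below combines independent flux estimates by inverse-variance weighting; the numbers are hypothetical and chosen only to echo the reported NECB range, and the real account constrains far richer, spatially distributed quantities.

```python
import numpy as np

def combine(estimates, sds):
    """Inverse-variance (precision-weighted) combination of independent
    flux estimates - a minimal stand-in for Bayesian harmonization of
    land-ecosystem, eddy-covariance, and inversion results."""
    w = 1.0 / np.square(sds)
    mean = np.sum(w * estimates) / w.sum()
    sd = np.sqrt(1.0 / w.sum())
    return mean, sd

# Hypothetical NECB estimates [Pg C / yr] from three methods.
print(combine(np.array([0.60, 0.55, 0.70]),
              np.array([0.15, 0.10, 0.20])))
```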
Martin Hilber, Adriane; Blake, Carolyn; Bohle, Leah F; Bandali, Sarah; Agbon, Esther; Hulton, Louise
2016-12-01
To describe the types of maternal and newborn health (MNH) program accountability mechanisms implemented and evaluated in recent years in Sub-Saharan Africa, how these have been implemented, their effectiveness, and future prospects for improving governance and MNH outcomes. A structured review selected 38 peer-reviewed papers published between 2006 and 2016 in Sub-Saharan Africa for inclusion in the analysis. Performance accountability in MNH through maternal and perinatal death surveillance was the most commonly used accountability mechanism. Political and democratic accountability through advocacy, human rights, and global tracking of progress on indicators achieved the greatest results when multiple stakeholders were involved. Financial accountability can be effective but depends on external support. Overall, this review shows that accountability is more effective when clear expectations are backed by social and political advocacy and multistakeholder engagement, and supported by incentives for positive action. There are few accountability mechanisms in MNH in Sub-Saharan Africa that link decision-makers and those affected by their decisions with both the power and the will to enforce answerability. Increasing accountability depends not only on how mechanisms are enforced but also on how providers and managers understand accountability. Copyright © 2016 International Federation of Gynecology and Obstetrics. Published by Elsevier Ireland Ltd. All rights reserved.
ERIC Educational Resources Information Center
Whitesell, Emilyn Ruble
2015-01-01
School accountability systems are a popular approach to improving education outcomes in the United States. These systems intend to "hold schools accountable" by assessing school performance on specific metrics, publishing accountability reports, and some combination of rewarding and sanctioning schools based on performance. Additionally,…