Sample records for model implementation reducing

  1. Design of multivariable feedback control systems via spectral assignment using reduced-order models and reduced-order observers

    NASA Technical Reports Server (NTRS)

    Mielke, R. R.; Tung, L. J.; Carraway, P. I., III

    1984-01-01

    The feasibility of using reduced order models and reduced order observers with eigenvalue/eigenvector assignment procedures is investigated. A review of spectral assignment synthesis procedures is presented. Then, a reduced order model which retains essential system characteristics is formulated. A constant state feedback matrix which assigns desired closed loop eigenvalues and approximates specified closed loop eigenvectors is calculated for the reduced order model. It is shown that the eigenvalue and eigenvector assignments made in the reduced order system are retained when the feedback matrix is implemented about the full order system. In addition, those modes and associated eigenvectors which are not included in the reduced order model remain unchanged in the closed loop full order system. The full state feedback design is then implemented by using a reduced order observer. It is shown that the eigenvalue and eigenvector assignments of the closed loop full order system remain unchanged when a reduced order observer is used. The design procedure is illustrated by an actual design problem.
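
    The spectral-assignment step described above can be illustrated with a small numerical sketch using scipy.signal.place_poles; the matrices and desired eigenvalues below are hypothetical choices for illustration, not the report's design problem.

```python
# Hypothetical sketch: constant state-feedback gain assigning closed-loop
# eigenvalues for a small reduced-order model (not the report's design problem).
import numpy as np
from scipy.signal import place_poles

# Reduced-order model x_dot = A x + B u (values chosen for illustration only)
A = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [-2.0, -3.0, -1.0]])
B = np.array([[0.0, 0.0],
              [1.0, 0.0],
              [0.0, 1.0]])

desired_poles = np.array([-2.0, -3.0, -4.0])   # desired closed-loop eigenvalues

placed = place_poles(A, B, desired_poles)      # multivariable eigenstructure assignment
K = placed.gain_matrix                         # constant state-feedback matrix

# Verify: closed-loop eigenvalues of A - B K match the requested spectrum
print(np.sort(np.linalg.eigvals(A - B @ K)))
```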

  2. Design of multivariable feedback control systems via spectral assignment using reduced-order models and reduced-order observers

    NASA Technical Reports Server (NTRS)

    Mielke, R. R.; Tung, L. J.; Carraway, P. I., III

    1985-01-01

    The feasibility of using reduced order models and reduced order observers with eigenvalue/eigenvector assignment procedures is investigated. A review of spectral assignment synthesis procedures is presented. Then, a reduced order model which retains essential system characteristics is formulated. A constant state feedback matrix which assigns desired closed loop eigenvalues and approximates specified closed loop eigenvectors is calculated for the reduced order model. It is shown that the eigenvalue and eigenvector assignments made in the reduced order system are retained when the feedback matrix is implemented about the full order system. In addition, those modes and associated eigenvectors which are not included in the reduced order model remain unchanged in the closed loop full order system. The full state feedback design is then implemented by using a reduced order observer. It is shown that the eigenvalue and eigenvector assignments of the closed loop full order system remain unchanged when a reduced order observer is used. The design procedure is illustrated by an actual design problem.

  3. Reducing cancer risk in rural communities through supermarket interventions.

    PubMed

    McCool, Barent N; Lyford, Conrad P; Hensarling, Natalie; Pence, Barbara; McCool, Audrey C; Thapa, Janani; Belasco, Eric; Carter, Tyra M

    2013-09-01

    Cancer risk is high, and prevention efforts are often minimal in rural communities. Feasible means of encouraging lifestyles that will reduce cancer risk for residents of rural communities are needed. This project developed and tested a model that could be feasibly adopted by rural communities to reduce cancer risk. This model focuses on incorporating multi-faceted cancer risk education in the local supermarket. As the supermarket functions both as the primary food source and an information source in small rural communities, the supermarket focus encourages the development of a community environment supportive of lifestyles that should reduce residents' risk for cancer. The actions taken to implement the model and the challenges that communities would have in implementing the model are identified.

  4. Stimulating household flood risk mitigation investments through insurance and subsidies: an Agent-Based Modelling approach

    NASA Astrophysics Data System (ADS)

    Haer, Toon; Botzen, Wouter; de Moel, Hans; Aerts, Jeroen

    2015-04-01

    In the period 1998-2009, floods triggered roughly 52 billion euro in insured economic losses, making floods the most costly natural hazard in Europe. Climate change and socio-economic trends are expected to further aggravate flood losses in many regions. Research shows that flood risk can be significantly reduced if households install protective measures, and that the implementation of such measures can be stimulated through flood insurance schemes and subsidies. However, the effectiveness of such incentives to stimulate implementation of loss-reducing measures greatly depends on the decision process of individuals and has hardly been studied. In our study, we developed an Agent-Based Model that integrates flood damage models, insurance mechanisms, subsidies, and household behaviour models to assess the effectiveness of different economic tools on stimulating households to invest in loss-reducing measures. Since the effectiveness depends on the decision making process of individuals, the study compares different household decision models ranging from standard economic models, to economic models for decision making under risk, to more complex decision models integrating economic models and risk perceptions, opinion dynamics, and the influence of flood experience. The results show the effectiveness of incentives to stimulate investment in loss-reducing measures for different household behavior types, while assuming climate change scenarios. They also show how complex decision models can better reproduce observed real-world behaviour compared to traditional economic models. Furthermore, since flood events are included in the simulations, the results provide an analysis of the dynamics in insured and uninsured losses for households, the costs of reducing risk by implementing loss-reducing measures, the capacity of the insurance market, and the cost of government subsidies under different scenarios. The model has been applied to the City of Rotterdam in The Netherlands.
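
    A stylised sketch of the household decision step in such an agent-based model is given below; the expected-utility rule, subsidy rate, flood probability, and damage figures are illustrative assumptions, not parameters of the Rotterdam case study.

```python
# Illustrative agent-based sketch of household mitigation decisions under a subsidy.
# All parameters (flood probability, damages, costs, subsidy) are assumptions for
# demonstration and do not come from the Rotterdam case study.
import random

class Household:
    def __init__(self, risk_perception):
        self.protected = False
        self.risk_perception = risk_perception  # subjective scaling of flood probability

    def decide(self, p_flood, damage, damage_reduction, cost, subsidy):
        # Simple expected-value rule: invest if perceived avoided damage exceeds net cost.
        perceived_p = min(1.0, p_flood * self.risk_perception)
        avoided = perceived_p * damage * damage_reduction
        if not self.protected and avoided > cost * (1.0 - subsidy):
            self.protected = True

def run(years=50, n=1000, p_flood=0.01, damage=50_000,
        damage_reduction=0.4, cost=300, subsidy=0.5, seed=1):
    random.seed(seed)
    agents = [Household(risk_perception=random.uniform(0.2, 3.0)) for _ in range(n)]
    for _ in range(years):
        for a in agents:
            a.decide(p_flood, damage, damage_reduction, cost, subsidy)
    return sum(a.protected for a in agents) / n

print(f"share of protected households: {run():.2f}")
```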

  5. A modeling framework for evaluating streambank stabilization practices for reach-scale sediment reduction

    USDA-ARS?s Scientific Manuscript database

    Streambank stabilization techniques are often implemented to reduce sediment loads from unstable streambanks. Process-based models can predict sediment yields with stabilization scenarios prior to implementation. However, a framework does not exist on how to effectively utilize these models to evalu...

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fadika, Zacharia; Dede, Elif; Govindaraju, Madhusudhan

    MapReduce is increasingly becoming a popular framework, and a potent programming model. The most popular open source implementation of MapReduce, Hadoop, is based on the Hadoop Distributed File System (HDFS). However, as HDFS is not POSIX compliant, it cannot be fully leveraged by applications running on a majority of existing HPC environments such as Teragrid and NERSC. These HPC environments typically support globally shared file systems such as NFS and GPFS. On such resourceful HPC infrastructures, the use of Hadoop not only creates compatibility issues, but also affects overall performance due to the added overhead of the HDFS. This paper not only presents a MapReduce implementation directly suitable for HPC environments, but also exposes the design choices for better performance gains in those settings. By leveraging inherent distributed file systems' functions, and abstracting them away from its MapReduce framework, MARIANE (MApReduce Implementation Adapted for HPC Environments) not only allows for the use of the model in an expanding number of HPC environments, but also allows for better performance in such settings. This paper shows the applicability and high performance of the MapReduce paradigm through MARIANE, an implementation designed for clustered and shared-disk file systems and as such not dedicated to a specific MapReduce solution. The paper identifies the components and trade-offs necessary for this model, and quantifies the performance gains exhibited by our approach in distributed environments over Apache Hadoop in a data intensive setting, on the Magellan testbed at the National Energy Research Scientific Computing Center (NERSC).
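
    To make the MapReduce programming model referred to above concrete, here is a minimal single-process word-count sketch of the map/shuffle/reduce pattern; it illustrates the paradigm only and is not the MARIANE or Hadoop implementation.

```python
# Minimal illustration of the MapReduce programming model: map, shuffle, reduce.
# This is a single-process sketch of the paradigm, not the MARIANE or Hadoop code.
from collections import defaultdict

def map_phase(document):
    # Emit (key, value) pairs: one (word, 1) per word.
    for word in document.split():
        yield word.lower(), 1

def shuffle(pairs):
    # Group intermediate values by key.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Aggregate the values for each key.
    return {key: sum(values) for key, values in groups.items()}

documents = ["the quick brown fox", "the lazy dog", "the fox"]
pairs = (pair for doc in documents for pair in map_phase(doc))
print(reduce_phase(shuffle(pairs)))   # {'the': 3, 'quick': 1, ...}
```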

  7. Systems, methods and apparatus for pattern matching in procedure development and verification

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G. (Inventor); Rouff, Christopher A. (Inventor); Rash, James L. (Inventor)

    2011-01-01

    Systems, methods and apparatus are provided through which, in some embodiments, a formal specification is pattern-matched from scenarios, the formal specification is analyzed, and flaws in the formal specification are corrected. The systems, methods and apparatus may include pattern-matching an equivalent formal model from an informal specification. Such a model can be analyzed for contradictions, conflicts, use of resources before the resources are available, competition for resources, and so forth. From such a formal model, an implementation can be automatically generated in a variety of notations. The approach can improve the resulting implementation, which, in some embodiments, is provably equivalent to the procedures described at the outset, which in turn can improve confidence that the system reflects the requirements, and in turn reduces system development time and reduces the amount of testing required of a new system. Moreover, in some embodiments, two or more implementations can be "reversed" to appropriate formal models, the models can be combined, and the resulting combination checked for conflicts. Then, the combined, error-free model can be used to generate a new (single) implementation that combines the functionality of the original separate implementations, and may be more likely to be correct.

  8. Physical lumping methods for developing linear reduced models for high speed propulsion systems

    NASA Technical Reports Server (NTRS)

    Immel, S. M.; Hartley, Tom T.; Deabreu-Garcia, J. Alex

    1991-01-01

    In gasdynamic systems, information travels in one direction for supersonic flow and in both directions for subsonic flow. A shock occurs at the transition from supersonic to subsonic flow. Thus, to simulate these systems, any simulation method implemented for the quasi-one-dimensional Euler equations must have the ability to capture the shock. In this paper, a technique combining both backward and central differencing is presented. The equations are subsequently linearized about an operating point and formulated into a linear state space model. After proper implementation of the boundary conditions, the model order is reduced from 123 to less than 10 using the Schur method of balancing. Simulations comparing frequency and step response of the reduced order model and the original system models are presented.
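
    The order-reduction step can be sketched with the balanced-reduction routine of the python-control package (which relies on slycot); the random stable system below is an assumption standing in for the 123-state linearised propulsion model, not the paper's model.

```python
# Sketch of balanced model-order reduction with the python-control package.
# The random stable system stands in for the paper's 123-state linearised model.
import numpy as np
import control

rng = np.random.default_rng(0)
n = 20                                   # full order (illustrative)
A = rng.standard_normal((n, n))
A = A - (np.abs(np.linalg.eigvals(A)).max() + 1.0) * np.eye(n)  # shift to make A stable
B = rng.standard_normal((n, 1))
C = rng.standard_normal((1, n))
sys_full = control.ss(A, B, C, 0)

sys_red = control.balred(sys_full, orders=6)   # balanced truncation to order 6

# Compare step responses of the full and reduced models
t = np.linspace(0, 10, 500)
_, y_full = control.step_response(sys_full, t)
_, y_red = control.step_response(sys_red, t)
print("max step-response error:", np.max(np.abs(y_full - y_red)))
```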

  9. Establishment of a Re-Entry Model to Reduce Recidivism Among Court Ward Students.

    ERIC Educational Resources Information Center

    Kammuller, Kenneth C.

    The development and implementation of a Re-Entry Model designed to facilitate and monitor the adjustment of high school students returned to public school from a county commitment facility is described. Transition and follow-up procedures implemented through the model with a liaison teacher at the commitment facility school designated as the…

  10. Modelling mitigation options to reduce diffuse nitrogen water pollution from agriculture.

    PubMed

    Bouraoui, Fayçal; Grizzetti, Bruna

    2014-01-15

    Agriculture is responsible for large scale water quality degradation and is estimated to contribute around 55% of the nitrogen entering the European Seas. The key policy instrument for protecting inland, transitional and coastal water resources is the Water Framework Directive (WFD). Reducing nutrient losses from agriculture is crucial to the successful implementation of the WFD. There are several mitigation measures that can be implemented to reduce nitrogen losses from agricultural areas to surface and ground waters. For the selection of appropriate measures, models are useful for quantifying the expected impacts and the associated costs. In this article we review some of the models used in Europe to assess the effectiveness of nitrogen mitigation measures, ranging from fertilizer management to the construction of riparian areas and wetlands. We highlight how the complexity of models is correlated with the type of scenarios that can be tested, with conceptual models mostly used to evaluate the impact of reduced fertilizer application, and physically-based models used to evaluate the timing and location of mitigation options and the response times. We underline the importance of considering the lag time between the implementation of measures and effects on water quality. Models can be effective tools for targeting mitigation measures (identifying critical areas and timing), for evaluating their cost effectiveness, for taking into consideration pollution swapping and considering potential trade-offs in contrasting environmental objectives. Models are also useful for involving stakeholders during the development of catchment mitigation plans, increasing their acceptability. © 2013.

  11. Phased implementation of spaced clinic visits for stable HIV-positive patients in Rwanda to support Treat All.

    PubMed

    Nsanzimana, Sabin; Remera, Eric; Ribakare, Muhayimpundu; Burns, Tracy; Dludlu, Sibongile; Mills, Edward J; Condo, Jeanine; Bucher, Heiner C; Ford, Nathan

    2017-07-21

    In 2016, Rwanda implemented "Treat All," requiring the national HIV programme to increase antiretroviral (ART) treatment coverage to all people living with HIV. Approximately half of the 164,262 patients on ART have been on treatment for more than five years, and long-term retention of patients in care is an increasing concern. To address these challenges, the Ministry of Health has introduced a differentiated service delivery approach to reduce the frequency of clinical visits and medication dispensing for eligible patients. This article draws on key policy documents and the views of technical experts involved in policy development to describe the process of implementation of differentiated service delivery in Rwanda. Implementation of differentiated service delivery followed a phased approach to ensure that all steps are clearly defined and agreed by all partners. Key steps included: definition of scope, including defining which patients were eligible for transition to the new model; definition of the key model components; preparation for patient enrolment; considerations for special patient groups; engagement of implementing partners; securing political and financial support; forecasting drug supply; revision, dissemination and implementation of ART guidelines; and monitoring and evaluation. Based on the outcomes of the evaluation of the new service delivery model, the Ministry of Health will review and strategically reduce costs to the national HIV program and to the patient by exploring and implementing adjustments to the service delivery model.

  12. A framework for the case-specific assessment of Green Infrastructure in mitigating urban flood hazards

    NASA Astrophysics Data System (ADS)

    Schubert, Jochen E.; Burns, Matthew J.; Fletcher, Tim D.; Sanders, Brett F.

    2017-10-01

    This research outlines a framework for the case-specific assessment of Green Infrastructure (GI) performance in mitigating flood hazard in small urban catchments. The urban hydrologic modeling tool (MUSIC) is coupled with a fine resolution 2D hydrodynamic model (BreZo) to test to what extent retrofitting an urban watershed with GI, rainwater tanks and infiltration trenches in particular, can propagate flood management benefits downstream and support intuitive flood hazard maps useful for communicating and planning with communities. The hydrologic and hydraulic models are calibrated based on current catchment conditions, then modified to represent alternative GI scenarios including a complete lack of GI versus a full implementation of GI. Flow in the hydrologic/hydraulic models is forced using a range of synthetic rainfall events with annual exceedance probabilities (AEPs) between 1-63% and durations from 10 min to 24 h. Flood hazard benefits mapped by the framework include maximum flood depths and extents, flow intensity (m2/s), flood duration, and critical storm duration leading to maximum flood conditions. Application of the system to the Little Stringybark Creek (LSC) catchment shows that across the range of AEPs tested and for storm durations equal or less than 3 h, presently implemented GI reduces downstream flooded area on average by 29%, while a full implementation of GI would reduce downstream flooded area on average by 91%. A full implementation of GI could also lower maximum flow intensities by 83% on average, reducing the drowning hazard posed by urban streams and improving the potential for access by emergency responders. For storm durations longer than 3 h, a full implementation of GI lacks the capacity to retain the resulting rainfall depths and only reduces flooded area by 8% and flow intensity by 5.5%.

  13. Modelling the ability of source control measures to reduce inundation risk in a community-scale urban drainage system

    NASA Astrophysics Data System (ADS)

    Mei, Chao; Liu, Jiahong; Wang, Hao; Shao, Weiwei; Xia, Lin; Xiang, Chenyao; Zhou, Jinjun

    2018-06-01

    Urban inundation is a serious challenge that increasingly confronts the residents of many cities, as well as policymakers, in the context of rapid urbanization and climate change worldwide. In recent years, source control measures (SCMs) such as green roofs, permeable pavements, rain gardens, and vegetative swales have been implemented to address flood inundation in urban settings, and have proven to be cost-effective and sustainable. In order to investigate the ability of SCMs to reduce inundation in a community-scale urban drainage system, a dynamic rainfall-runoff model of the drainage system was developed based on SWMM. SCM implementation scenarios were modelled under six design rainstorm events with return periods ranging from 2 to 100 years, and inundation risks of the drainage system were evaluated before and after the proposed implementation of SCMs with a risk-evaluation method based on SWMM and the analytic hierarchy process (AHP). Results show that SCM implementation significantly reduced the hydrological indexes related to inundation risk: the reduction rates of average flow, peak flow, and total flooded volume of the drainage system ranged over 28.1-72.1, 19.0-69.2, and 33.9-56.0 %, respectively, under the six rainfall events with return periods ranging from 2 to 100 years. Correspondingly, the inundation risks of the drainage system were significantly reduced after SCM implementation, with the risk values falling below 0.2 when the rainfall return period was less than 10 years. The simulation results confirm the effectiveness of SCMs in mitigating inundation and quantify their potential to reduce inundation risks in the urban drainage system, providing scientific references for implementing SCMs for inundation control in the study area.
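
    The AHP step of such a risk-evaluation method can be illustrated by deriving criterion weights from a pairwise-comparison matrix via its principal eigenvector; the comparison values below are assumed for illustration and are not those of the study.

```python
# Sketch of the analytic hierarchy process (AHP) weighting step used in
# risk evaluation: weights are the normalised principal eigenvector of a
# pairwise-comparison matrix. The matrix entries are illustrative assumptions.
import numpy as np

# Pairwise comparisons of three hypothetical risk criteria
# (e.g. flooded volume, peak flow, flood duration), Saaty 1-9 scale.
P = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(P)
k = np.argmax(eigvals.real)                     # principal eigenvalue
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                        # criterion weights

# Consistency ratio (CR < 0.1 is the usual acceptance threshold)
n = P.shape[0]
CI = (eigvals.real[k] - n) / (n - 1)
RI = {3: 0.58, 4: 0.90, 5: 1.12}[n]             # random-index table (Saaty)
print("weights:", np.round(weights, 3), "CR:", round(CI / RI, 3))
```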

  14. A Modelling Approach to Estimate the Impact of Sodium Reduction in Soups on Cardiovascular Health in the Netherlands

    PubMed Central

    Bruins, Maaike J.; Dötsch-Klerk, Mariska; Matthee, Joep; Kearney, Mary; van Elk, Kathelijn; Weber, Peter; Eggersdorfer, Manfred

    2015-01-01

    Hypertension is a major modifiable risk factor for cardiovascular disease and mortality, which could be lowered by reducing dietary sodium. The potential health impact of a product reformulation in the Netherlands was modelled, selecting packaged soups containing on average 25% less sodium as an example of an achievable product reformulation when implemented gradually. First, the blood pressure lowering resulting from sodium intake reduction was modelled. Second, the predicted blood pressure lowering was translated into potentially preventable incidence and mortality cases from stroke, acute myocardial infarction (AMI), angina pectoris, and heart failure (HF) after one year of salt reduction. Finally, the potentially preventable subsequent lifetime Disability-Adjusted Life Years (DALYs) were calculated. The sodium reduction in soups might potentially reduce the incidence and mortality of stroke by approximately 0.5%, AMI and angina by 0.3%, and HF by 0.2%. The related burden of disease could be reduced by approximately 800 lifetime DALYs. This modelling approach can be used to provide insight into the potential public health impact of sodium reduction in specific food products. The data demonstrate that an achievable food product reformulation to reduce sodium can potentially benefit public health, albeit modestly. When implemented across multiple product categories and countries, a significant health impact could be achieved. PMID:26393647

  15. What happened to the no-wait hospital? A case study of implementation of operational plans for reduced waits.

    PubMed

    Hansson, Johan; Tolf, Sara; Øvretveit, John; Carlsson, Jan; Brommels, Mats

    2012-01-01

    Both research and practice show that waiting lists are hard to reduce. Implementing complex interventions for reduced waits is an intricate and challenging process that requires special attention for surrounding factors helping and hindering the implementation. This article reports a case study of a hospital implementation of operational plans for reduced waits, with an emphasis on the process of change. A case study research design, theoretically informed by the Pettigrew and Whipp model of strategic change, was applied. Data were gathered from individual and focus group interviews with informants from different organizational levels at different times and from documents and plans. The findings revealed arrangements both helping and hindering the implementation work. Helping factors were the hospital's contemporary savings requirements and experiences from similar change initiatives. Those hindering the actions to plan and agree the changes were unclear support functions and unclear task prioritization. One contribution of this study is to demonstrate the advantages, disadvantages, and challenges of a contextualized case study for increased understanding of factors influencing organizational change implementation. One lesson for current policy is to regard context factors that are critical for successful implementation.

  16. Exploiting the chaotic behaviour of atmospheric models with reconfigurable architectures

    NASA Astrophysics Data System (ADS)

    Russell, Francis P.; Düben, Peter D.; Niu, Xinyu; Luk, Wayne; Palmer, T. N.

    2017-12-01

    Reconfigurable architectures are becoming mainstream: Amazon, Microsoft and IBM are supporting such architectures in their data centres. The computationally intensive nature of atmospheric modelling is an attractive target for hardware acceleration using reconfigurable computing. Performance of hardware designs can be improved through the use of reduced-precision arithmetic, but maintaining appropriate accuracy is essential. We explore reduced-precision optimisation for simulating chaotic systems, targeting atmospheric modelling, in which even minor changes in arithmetic behaviour will cause simulations to diverge quickly. The possibility of equally valid simulations having differing outcomes means that standard techniques for comparing numerical accuracy are inappropriate. We use the Hellinger distance to compare statistical behaviour between reduced-precision CPU implementations to guide reconfigurable designs of a chaotic system, then analyse accuracy, performance and power efficiency of the resulting implementations. Our results show that with only a limited loss in accuracy corresponding to less than 10% uncertainty in input parameters, the throughput and energy efficiency of a single-precision chaotic system implemented on a Xilinx Virtex-6 SX475T Field Programmable Gate Array (FPGA) can be more than doubled.
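
    The statistical comparison mentioned above can be sketched as a Hellinger distance between output histograms of two precision variants; the data below is synthetic and stands in for the chaotic-system outputs.

```python
# Sketch: comparing the statistical behaviour of two simulation variants with
# the Hellinger distance between their output histograms (synthetic data only).
import numpy as np

def hellinger(p, q):
    # p, q are discrete probability distributions over the same bins.
    return np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2))

rng = np.random.default_rng(42)
full_precision = rng.normal(0.0, 1.0, 100_000)          # stand-in for double-precision output
reduced_precision = rng.normal(0.02, 1.01, 100_000)     # stand-in for reduced-precision output

bins = np.linspace(-5, 5, 101)
p, _ = np.histogram(full_precision, bins=bins)
q, _ = np.histogram(reduced_precision, bins=bins)
p = p / p.sum()
q = q / q.sum()

print(f"Hellinger distance: {hellinger(p, q):.4f}")     # 0 = identical, 1 = disjoint
```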

  17. Design and implementation of a combined influenza immunization and tuberculosis screening campaign with simulation modelling.

    PubMed

    Heim, Joseph A; Huang, Hao; Zabinsky, Zelda B; Dickerson, Jane; Wellner, Monica; Astion, Michael; Cruz, Doris; Vincent, Jeanne; Jack, Rhona

    2015-08-01

    Design and implement a concurrent campaign of influenza immunization and tuberculosis (TB) screening for health care workers (HCWs) that can reduce the number of clinic visits for each HCW. A discrete-event simulation model was developed to support resource allocation decisions in the planning and operations phases. The campaign was compressed to 100 days in 2010 and further compressed to 75 days in 2012 and 2013. With more than 5000 HCW arrivals in 2011, 2012 and 2013, the 14-day goal for TB results was achieved for each year and reduced to about 4 days in 2012 and 2013. Implementing a concurrent campaign means fewer clinic visits, and compressing the campaign length allows earlier immunization. Simulation modelling can provide useful evaluations of different configurations. © 2015 John Wiley & Sons, Ltd.
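
    A minimal discrete-event sketch of a combined immunization/screening clinic, written with the SimPy package; the arrival rate, service time, and staffing below are assumptions for illustration, not the campaign's actual parameters.

```python
# Minimal discrete-event simulation sketch of a combined flu-shot / TB-screening
# visit, using SimPy. All rates and capacities are illustrative assumptions.
import random
import simpy

WAITS = []

def health_care_worker(env, nurses):
    arrival = env.now
    with nurses.request() as req:          # queue for a nurse station
        yield req
        WAITS.append(env.now - arrival)
        yield env.timeout(random.expovariate(1 / 6.0))   # combined visit, ~6 min

def arrivals(env, nurses, rate_per_min):
    while True:
        yield env.timeout(random.expovariate(rate_per_min))
        env.process(health_care_worker(env, nurses))

random.seed(0)
env = simpy.Environment()
nurses = simpy.Resource(env, capacity=3)
env.process(arrivals(env, nurses, rate_per_min=0.4))     # ~24 HCWs per hour
env.run(until=8 * 60)                                    # one 8-hour clinic day

print(f"served {len(WAITS)} HCWs, mean wait {sum(WAITS)/len(WAITS):.1f} min")
```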

  18. Implementation of a Diabetes Educator Care Model to Reduce Paediatric Admission for Diabetic Ketoacidosis.

    PubMed

    Deeb, Asma; Yousef, Hana; Abdelrahman, Layla; Tomy, Mary; Suliman, Shaker; Attia, Salima; Al Suwaidi, Hana

    2016-01-01

    Introduction. Diabetic Ketoacidosis (DKA) is a serious complication that can be life-threatening. Management of DKA needs admission in a specialized center and imposes major constraints on hospital resources. Aim. We plan to study the impact of adapting a diabetes-educator care model on reducing the frequency of hospital admission of children and adolescents presenting with DKA. Method. We have proposed a model of care led by diabetes educators for children and adolescents with diabetes. The team consisted of highly trained nurses. The model effectiveness is measured by comparing the rate of hospital admission for DKA over a 4-year period to the baseline year prior to implementing the model. Results. There were 158 admissions for DKA over a 5-year period. The number of patients followed up in the outpatient diabetes clinics increased from 37 to 331 patients between the start and the end of the study years. The admission rate showed a downward trend over the five-year period. The percentage of admissions for DKA was reduced from 21.0% to 1.8% (P < 0.001). Conclusion. The diabetes educator care model is an effective and sustainable measure to reduce hospital admission for DKA in children and adolescents.

  19. Reduction of Tunnel Dynamics at the National Transonic Facility (Invited)

    NASA Technical Reports Server (NTRS)

    Kilgore, W. A.; Balakrishna, S.; Butler, D. H.

    2001-01-01

    This paper describes the results of recent efforts to reduce the tunnel dynamics at the National Transonic Facility. The results presented describe the findings of an extensive data analysis, the proposed solutions to reduce dynamics and the results of implementing these solutions. These results show a 90% reduction in the dynamics around the model support structure and a small impact on reducing model dynamics. Also presented are several continuing efforts to further reduce dynamics.

  20. Implementation of internal model based control and individual pitch control to reduce fatigue loads and tower vibrations in wind turbines

    NASA Astrophysics Data System (ADS)

    Mohammadi, Ebrahim; Fadaeinedjad, Roohollah; Moschopoulos, Gerry

    2018-05-01

    Vibration control and fatigue load reduction are important issues in large-scale wind turbines. Identifying the vibration frequencies and tuning dampers and controllers at these frequencies are major concerns in many control methods. In this paper, an internal model control (IMC) method with an adaptive algorithm is implemented to first identify the vibration frequency of the wind turbine tower and then to cancel the vibration signal. Standard individual pitch control (IPC) is also implemented to compare the performance of the controllers in terms of fatigue load reduction. Finally, the performance of the system when both controllers are implemented together is evaluated. Simulation results demonstrate that using only IMC or IPC alone has advantages and can reduce fatigue loads on specific components. IMC can identify and suppress tower vibrations in both fore-aft and side-to-side directions, whereas IPC can reduce fatigue loads on blades, shaft and yaw bearings. When both IMC and IPC are implemented together, the advantages of both controllers can be used. The aforementioned analysis and comparisons have not been studied in the literature, and this study fills this gap. FAST, AeroDyn and Simulink are used to simulate the mechanical, aerodynamic and electrical aspects of the wind turbine.

  1. Can agent based models effectively reduce fisheries management implementation uncertainty?

    NASA Astrophysics Data System (ADS)

    Drexler, M.

    2016-02-01

    Uncertainty is an inherent feature of fisheries management. Implementation uncertainty remains a challenge to quantify, often due to unintended responses of users to management interventions. This problem will continue to plague both single species and ecosystem based fisheries management advice unless the mechanisms driving these behaviors are properly understood. Equilibrium models, where each actor in the system is treated as uniform and predictable, are not well suited to forecast the unintended behaviors of individual fishers. Alternatively, agent based models (ABMs) can simulate the behaviors of each individual actor driven by differing incentives and constraints. This study evaluated the feasibility of using ABMs to capture macro scale behaviors of the US West Coast Groundfish fleet. Agent behavior was specified at the vessel level. Agents made daily fishing decisions using knowledge of their own cost structure, catch history, and the histories of catch and quota markets. By adding only a relatively small number of incentives, the model was able to reproduce highly realistic macro patterns of expected outcomes in response to management policies (catch restrictions, MPAs, ITQs) while preserving vessel heterogeneity. These simulations indicate that agent based modeling approaches hold much promise for simulating fisher behaviors and reducing implementation uncertainty. Additional processes affecting behavior, informed by surveys, are continually being added to the fisher behavior model. Further coupling of the fisher behavior model to a spatial ecosystem model will provide a fully integrated social, ecological, and economic model capable of performing management strategy evaluations to properly consider implementation uncertainty in fisheries management.
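
    A toy sketch of a vessel-level decision rule of the kind described above: each agent fishes on a given day only if expected revenue exceeds its costs plus the quota price. All numbers are invented for illustration and are not from the US West Coast groundfish model.

```python
# Toy sketch of an agent-based fishing-fleet decision rule (illustrative numbers,
# not the US West Coast groundfish model).
import random

class Vessel:
    def __init__(self, daily_cost, catch_rate):
        self.daily_cost = daily_cost      # fuel, crew, gear per day
        self.catch_rate = catch_rate      # expected tonnes per day

    def fishes_today(self, fish_price, quota_price):
        expected_revenue = self.catch_rate * fish_price
        quota_cost = self.catch_rate * quota_price   # ITQ: quota must cover the catch
        return expected_revenue > self.daily_cost + quota_cost

random.seed(3)
fleet = [Vessel(daily_cost=random.uniform(800, 2000),
                catch_rate=random.uniform(0.5, 3.0)) for _ in range(500)]

for quota_price in (0, 200, 400, 600):     # hypothetical per-tonne ITQ price scenarios
    active = sum(v.fishes_today(fish_price=1000, quota_price=quota_price) for v in fleet)
    print(f"quota price {quota_price:>3}: {active} of {len(fleet)} vessels fish")
```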

  2. Hierarchical matrices implemented into the boundary integral approaches for gravity field modelling

    NASA Astrophysics Data System (ADS)

    Čunderlík, Róbert; Vipiana, Francesca

    2017-04-01

    Boundary integral approaches applied for gravity field modelling have recently been developed to solve the geodetic boundary value problems numerically, or to process satellite observations, e.g. from the GOCE satellite mission. In order to obtain numerical solutions of "cm-level" accuracy, such approaches require a very refined level of discretization or resolution. This leads to enormous memory requirements that need to be reduced. An implementation of Hierarchical Matrices (H-matrices) can significantly reduce the numerical complexity of these approaches. The main idea of the H-matrices is to approximate the entire system matrix by splitting it into a family of submatrices. Large submatrices are stored in a factorized representation, while small submatrices are stored in standard representation. This allows memory requirements to be reduced significantly while improving efficiency. The poster presents our preliminary results of implementations of the H-matrices into the existing boundary integral approaches based on the boundary element method or the method of fundamental solutions.
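
    The factorised-submatrix idea can be illustrated with a truncated SVD of one admissible off-diagonal block; this is a generic sketch of the low-rank compression behind H-matrices, with an assumed kernel and geometry, not the authors' implementation.

```python
# Sketch of the low-rank compression idea behind hierarchical matrices:
# an off-diagonal block of a smooth kernel matrix is stored as U @ V
# instead of densely. Kernel and geometry are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, 400)          # source points
y = rng.uniform(3.0, 4.0, 400)          # well-separated target points

# Smooth kernel block (e.g. 1/|x - y|), admissible because the clusters are far apart
block = 1.0 / np.abs(x[:, None] - y[None, :])

U, s, Vt = np.linalg.svd(block, full_matrices=False)
k = int(np.sum(s / s[0] > 1e-8))        # numerical rank at relative tolerance 1e-8
U_k = U[:, :k] * s[:k]                  # factorised representation
V_k = Vt[:k, :]

compression = (U_k.size + V_k.size) / block.size
error = np.linalg.norm(block - U_k @ V_k) / np.linalg.norm(block)
print(f"rank {k}, storage ratio {compression:.3f}, relative error {error:.1e}")
```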

  3. Applying Reduced Generator Models in the Coarse Solver of Parareal in Time Parallel Power System Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duan, Nan; Dimitrovski, Aleksandar D; Simunovic, Srdjan

    2016-01-01

    The development of high-performance computing techniques and platforms has provided many opportunities for real-time or even faster-than-real-time implementation of power system simulations. One approach uses the Parareal in time framework. The Parareal algorithm has shown promising theoretical simulation speedups by temporally decomposing a simulation run into a coarse simulation on the entire simulation interval and fine simulations on sequential sub-intervals linked through the coarse simulation. However, it has been found that the time cost of the coarse solver needs to be reduced to fully exploit the potential of the Parareal algorithm. This paper studies a Parareal implementation using reduced generator models for the coarse solver and reports the testing results on the IEEE 39-bus system and a 327-generator 2383-bus Polish system model.
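
    A compact sketch of the Parareal correction iteration on a scalar test ODE; the coarse and fine propagators here are simple Euler steps chosen for illustration, not the reduced-generator power-system solvers of the paper.

```python
# Sketch of the Parareal algorithm on the scalar test problem y' = -y,
# with a coarse Euler propagator and a finer sub-stepped propagator.
# Illustrative only; not the reduced-generator power-system implementation.
import numpy as np

def coarse(y, dt):                       # one explicit Euler step
    return y + dt * (-y)

def fine(y, dt, substeps=100):           # many small Euler steps (stand-in for the fine solver)
    h = dt / substeps
    for _ in range(substeps):
        y = y + h * (-y)
    return y

T, N, y0 = 5.0, 20, 1.0
dt = T / N
U = np.empty(N + 1)

# Initial coarse sweep
U[0] = y0
for n in range(N):
    U[n + 1] = coarse(U[n], dt)

# Parareal iterations: U_{n+1} <- G(U_n^new) + F(U_n^old) - G(U_n^old)
for k in range(5):
    G_old = np.array([coarse(U[n], dt) for n in range(N)])
    F_old = np.array([fine(U[n], dt) for n in range(N)])
    for n in range(N):
        U[n + 1] = coarse(U[n], dt) + F_old[n] - G_old[n]

exact = y0 * np.exp(-dt * np.arange(N + 1))
print("max error after 5 Parareal iterations:", np.max(np.abs(U - exact)))
```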

  4. An Approach for Dynamic Optimization of Prevention Program Implementation in Stochastic Environments

    NASA Astrophysics Data System (ADS)

    Kang, Yuncheol; Prabhu, Vittal

    The science of preventing youth problems has significantly advanced in developing evidence-based prevention program (EBP) by using randomized clinical trials. Effective EBP can reduce delinquency, aggression, violence, bullying and substance abuse among youth. Unfortunately the outcomes of EBP implemented in natural settings usually tend to be lower than in clinical trials, which has motivated the need to study EBP implementations. In this paper we propose to model EBP implementations in natural settings as stochastic dynamic processes. Specifically, we propose Markov Decision Process (MDP) for modeling and dynamic optimization of such EBP implementations. We illustrate these concepts using simple numerical examples and discuss potential challenges in using such approaches in practice.
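
    A tiny value-iteration sketch in the spirit of the MDP formulation proposed above; the states, actions, transition probabilities and rewards are invented for illustration and are not taken from the paper.

```python
# Tiny Markov Decision Process sketch in the spirit of the abstract: states are
# implementation-quality levels, actions are support interventions. All numbers
# are invented for illustration.
import numpy as np

states = ["low fidelity", "medium fidelity", "high fidelity"]
actions = ["no support", "coaching"]

# P[a][s, s'] : transition probabilities; R[a][s] : expected one-step reward
P = {
    "no support": np.array([[0.8, 0.2, 0.0],
                            [0.3, 0.6, 0.1],
                            [0.1, 0.3, 0.6]]),
    "coaching":   np.array([[0.4, 0.5, 0.1],
                            [0.1, 0.5, 0.4],
                            [0.0, 0.2, 0.8]]),
}
R = {"no support": np.array([0.0, 1.0, 2.0]),
     "coaching":   np.array([-0.5, 0.5, 1.5])}   # coaching carries a cost

gamma, V = 0.95, np.zeros(len(states))
for _ in range(500):                              # value iteration
    V = np.max([R[a] + gamma * P[a] @ V for a in actions], axis=0)

policy = [actions[int(np.argmax([R[a][s] + gamma * P[a][s] @ V for a in actions]))]
          for s in range(len(states))]
print(dict(zip(states, policy)))
```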

  5. Multiplexed Predictive Control of a Large Commercial Turbofan Engine

    NASA Technical Reports Server (NTRS)

    Richter, Hanz; Singaraju, Anil; Litt, Jonathan S.

    2008-01-01

    Model predictive control is a strategy well-suited to handle the highly complex, nonlinear, uncertain, and constrained dynamics involved in aircraft engine control problems. However, it has thus far been infeasible to implement model predictive control in engine control applications, because of the combination of model complexity and the time allotted for the control update calculation. In this paper, a multiplexed implementation is proposed that dramatically reduces the computational burden of the quadratic programming optimization that must be solved online as part of the model-predictive-control algorithm. Actuator updates are calculated sequentially and cyclically in a multiplexed implementation, as opposed to the simultaneous optimization taking place in conventional model predictive control. Theoretical aspects are discussed based on a nominal model, and actual computational savings are demonstrated using a realistic commercial engine model.
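
    The multiplexing idea (re-optimising one actuator per control update rather than all simultaneously) can be sketched as a cyclic, single-variable minimisation of a quadratic cost; the cost matrices, bounds, and dimensions below are illustrative assumptions, not the engine model.

```python
# Sketch of the multiplexed idea: at each control update only one actuator's value
# is re-optimised (a scalar, bound-constrained quadratic problem), cycling through
# actuators, instead of solving the full QP each step. Cost data is illustrative.
import numpy as np

rng = np.random.default_rng(1)
m = 3                                    # number of actuators
M = rng.standard_normal((m, m))
H = M @ M.T + m * np.eye(m)              # positive-definite quadratic cost 0.5 u'Hu + f'u
f = rng.standard_normal(m)
lo, hi = -1.0, 1.0                       # actuator limits

u = np.zeros(m)
for step in range(30):                   # control updates
    i = step % m                         # multiplexing: one actuator per update
    # Scalar minimiser of the quadratic cost in u[i] with the other actuators held fixed
    u_i = -(f[i] + H[i] @ u - H[i, i] * u[i]) / H[i, i]
    u[i] = np.clip(u_i, lo, hi)

# Unconstrained simultaneous optimum (clipped), for a rough comparison
u_full = np.clip(np.linalg.solve(H, -f), lo, hi)
print("multiplexed:", np.round(u, 3))
print("simultaneous (rough reference):", np.round(u_full, 3))
```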

  6. Optimizing Automatic Deployment Using Non-functional Requirement Annotations

    NASA Astrophysics Data System (ADS)

    Kugele, Stefan; Haberl, Wolfgang; Tautschnig, Michael; Wechs, Martin

    Model-driven development has become common practice in design of safety-critical real-time systems. High-level modeling constructs help to reduce the overall system complexity apparent to developers. This abstraction caters for fewer implementation errors in the resulting systems. In order to retain correctness of the model down to the software executed on a concrete platform, human faults during implementation must be avoided. This calls for an automatic, unattended deployment process including allocation, scheduling, and platform configuration.

  7. NOTE: Implementation of angular response function modeling in SPECT simulations with GATE

    NASA Astrophysics Data System (ADS)

    Descourt, P.; Carlier, T.; Du, Y.; Song, X.; Buvat, I.; Frey, E. C.; Bardies, M.; Tsui, B. M. W.; Visvikis, D.

    2010-05-01

    Among Monte Carlo simulation codes in medical imaging, the GATE simulation platform is widely used today given its flexibility and accuracy, despite long run times, which in SPECT simulations are mostly spent in tracking photons through the collimators. In this work, a tabulated model of the collimator/detector response was implemented within the GATE framework to significantly reduce the simulation times in SPECT. This implementation uses the angular response function (ARF) model. The performance of the implemented ARF approach has been compared to standard SPECT GATE simulations in terms of the ARF tables' accuracy, overall SPECT system performance and run times. Considering the simulation of the Siemens Symbia T SPECT system using high-energy collimators, differences of less than 1% were measured between the ARF-based and the standard GATE-based simulations, while considering the same noise level in the projections, acceleration factors of up to 180 were obtained when simulating a planar 364 keV source seen with the same SPECT system. The ARF-based and the standard GATE simulation results also agreed very well when considering a four-head SPECT simulation of a realistic Jaszczak phantom filled with iodine-131, with a resulting acceleration factor of 100. In conclusion, the implementation of an ARF-based model of collimator/detector response for SPECT simulations within GATE significantly reduces the simulation run times without compromising accuracy.

  8. Preliminary effects of parent-implemented behavioural interventions for stereotypy.

    PubMed

    Lanovaz, Marc J; Rapp, John T; Maciw, Isabella; Dorion, Catherine; Prégent-Pelletier, Émilie

    2016-06-01

    The purpose of our study was to replicate and extend previous research on using multicomponent behavioural interventions designed to reduce engagement in stereotypy by examining their effects when implemented by parents over several months. We used an alternating treatment design to examine the effects of the parent-implemented interventions on engagement in stereotypy and appropriate behaviour in three children with autism and other developmental disabilities. The parent-implemented multicomponent treatments reduced vocal stereotypy in all three participants and increased engagement in appropriate behaviour in two participants. These effects persisted up to 24 weeks following the parent training sessions. Altogether, our preliminary results support (a) the involvement of parents as behaviour change agents to reduce engagement in stereotypy and (b) the scheduling of regular, but infrequent (i.e. weekly to monthly), follow-up meetings to monitor the effects of behavioural interventions in outpatient and home-based service delivery models.

  9. Reduced Order Model Basis Vector Generation: Generates Basis Vectors for ROMs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arrighi, Bill

    2016-03-03

    libROM is a library that implements order reduction via singular value decomposition (SVD) of sampled state vectors. It implements 2 parallel, incremental SVD algorithms and one serial, non-incremental algorithm. It also provides a mechanism for adaptive sampling of basis vectors.
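
    A generic sketch of basis-vector generation by SVD of a snapshot matrix, which is the operation libROM provides in parallel and incremental form; the snapshot data below is synthetic and the dense SVD is only an illustration of the idea.

```python
# Generic sketch of reduced-order-model basis generation: take the SVD of a
# matrix of sampled state vectors (snapshots) and keep the leading left singular
# vectors. Synthetic data; libROM does this incrementally and in parallel.
import numpy as np

rng = np.random.default_rng(0)
n_dof, n_snapshots = 2000, 50
modes = rng.standard_normal((n_dof, 5))                 # hidden low-dimensional structure
coeffs = rng.standard_normal((5, n_snapshots))
snapshots = modes @ coeffs + 1e-3 * rng.standard_normal((n_dof, n_snapshots))

U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
energy = np.cumsum(s**2) / np.sum(s**2)
k = int(np.searchsorted(energy, 0.9999)) + 1            # keep 99.99% of the energy
basis = U[:, :k]                                        # reduced-order basis vectors

print(f"retained {k} basis vectors out of {n_snapshots} snapshots")
```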

  10. Lean business model and implementation of a geriatric fracture center.

    PubMed

    Kates, Stephen L

    2014-05-01

    Geriatric hip fracture is a common event associated with high costs of care and often with suboptimal outcomes for the patients. Ideally, a new care model to manage geriatric hip fractures would address both quality and safety of patient care as well as the need for reduced costs of care. The geriatric fracture center model of care is one such model reported to improve both outcomes and quality of care. It is a lean business model applied to medicine. This article describes basic lean business concepts applied to geriatric fracture care and information needed to successfully implement a geriatric fracture center. It is written to assist physicians and surgeons in their efforts to implement an improved care model for their patients. Copyright © 2014 Elsevier Inc. All rights reserved.

  11. Design and implementation of a random neural network routing engine.

    PubMed

    Kocak, T; Seeber, J; Terzioglu, H

    2003-01-01

    The random neural network (RNN) is an analytically tractable spiked neural network model that has been implemented in software for a wide range of applications for over a decade. This paper presents the hardware implementation of the RNN model. Recently, cognitive packet networks (CPN) have been proposed as an alternative packet network architecture in which there is no routing table; instead, RNN based reinforcement learning is used to route packets. In particular, we describe implementation details for the RNN based routing engine of a CPN network processor chip: the smart packet processor (SPP). The SPP is a dual port device that stores, modifies, and interprets the defining characteristics of multiple RNN models. In addition to hardware design improvements over the software implementation such as the dual access memory, output calculation step, and reduced output calculation module, this paper introduces a major modification to the reinforcement learning algorithm used in the original CPN specification such that the number of weight terms is reduced from 2n² to 2n. This not only yields significant memory savings, but it also simplifies the calculations for the steady state probabilities (neuron outputs in RNN). Simulations have been conducted to confirm the proper functionality for the isolated SPP design as well as for multiple SPPs in a networked environment.
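
    The steady-state neuron excitation probabilities of the random neural network satisfy a nonlinear fixed point; the sketch below iterates that fixed point for a small random network as an illustration of the model, not of the SPP hardware design.

```python
# Fixed-point iteration for the steady-state excitation probabilities q_i of a
# Gelenbe random neural network:  q_i = lambda_i^+ / (r_i + lambda_i^-).
# Small random network for illustration; not the SPP routing-engine design.
import numpy as np

rng = np.random.default_rng(0)
n = 5
W_plus = rng.uniform(0.0, 1.0, (n, n))   # excitatory weights w+_{ij}
W_minus = rng.uniform(0.0, 1.0, (n, n))  # inhibitory weights w-_{ij}
np.fill_diagonal(W_plus, 0.0)
np.fill_diagonal(W_minus, 0.0)
Lambda = rng.uniform(0.5, 1.5, n)        # external excitatory arrival rates
lam = rng.uniform(0.1, 0.5, n)           # external inhibitory arrival rates
r = (W_plus + W_minus).sum(axis=1)       # neuron firing rates

q = np.full(n, 0.5)
for _ in range(200):                     # simple fixed-point iteration
    lam_plus = Lambda + q @ W_plus       # total excitatory arrival rate at each neuron
    lam_minus = lam + q @ W_minus        # total inhibitory arrival rate
    q = np.minimum(lam_plus / (r + lam_minus), 1.0)

print("steady-state excitation probabilities:", np.round(q, 3))
```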

  12. Model Reduction for Control System Design

    NASA Technical Reports Server (NTRS)

    Enns, D. F.

    1985-01-01

    An approach and a technique for effectively obtaining reduced order mathematical models of a given large order model for the purposes of synthesis, analysis and implementation of control systems is developed. This approach involves the use of an error criterion which is the H-infinity norm of a frequency weighted error between the full and reduced order models. The weightings are chosen to take into account the purpose for which the reduced order model is intended. A previously unknown error bound in the H-infinity norm for reduced order models obtained from internally balanced realizations was obtained. This motivated further development of the balancing technique to include the frequency dependent weightings. This resulted in the frequency weighted balanced realization and a new model reduction technique. Two approaches to designing reduced order controllers were developed. The first involves reducing the order of a high order controller with an appropriate weighting. The second involves linear quadratic Gaussian synthesis based on a reduced order model obtained with an appropriate weighting.
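
    For context, the H-infinity error bound for models obtained by balanced truncation is commonly stated as follows, where the σ_i are the Hankel singular values of the full order model G and G_r is the order-r truncation; this is the standard unweighted form, quoted here as background rather than the report's frequency-weighted result.

```latex
% Standard (unweighted) H-infinity bound for balanced truncation; sigma_i are the
% Hankel singular values of the full-order model G, and G_r is the order-r truncation.
\left\| G - G_r \right\|_{\infty} \;\le\; 2 \sum_{i=r+1}^{n} \sigma_i
```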

  13. Incorporating Resource Protection Constraints in an Analysis of Landscape Fuel-Treatment Effectiveness in the Northern Sierra Nevada, CA, USA.

    PubMed

    Dow, Christopher B; Collins, Brandon M; Stephens, Scott L

    2016-03-01

    Finding novel ways to plan and implement landscape-level forest treatments that protect sensitive wildlife and other key ecosystem components, while also reducing the risk of large-scale, high-severity fires, can prove to be difficult. We examined alternative approaches to landscape-scale fuel-treatment design for the same landscape. These approaches included two different treatment scenarios generated from an optimization algorithm that reduces modeled fire spread across the landscape, one with resource-protection constraints and one without. We also included a treatment scenario that was the actual fuel-treatment network implemented, as well as a no-treatment scenario. For all four scenarios, we modeled hazardous fire potential based on conditional burn probabilities, and projected fire emissions. Results demonstrate that in all three active treatment scenarios, hazardous fire potential, fire area, and emissions were reduced by approximately 50 % relative to the untreated condition. Results also indicate that incorporating constraints is more effective at reducing modeled fire outputs, possibly due to the greater aggregation of treatments, creating greater continuity of fuel-treatment blocks across the landscape. The implementation of fuel-treatment networks using different planning techniques that incorporate real-world constraints can reduce the risk of large problematic fires, allow for landscape-level heterogeneity that can provide necessary ecosystem services, create mixed forest stand structures on a landscape, and promote resilience in the uncertain future of climate change.

  14. A genetic algorithm-based job scheduling model for big data analytics.

    PubMed

    Lu, Qinghua; Li, Shanshan; Zhang, Weishan; Zhang, Lei

    Big data analytics (BDA) applications are a new category of software applications that process large amounts of data using scalable parallel processing infrastructure to obtain hidden value. Hadoop is the most mature open-source big data analytics framework, which implements the MapReduce programming model to process big data with MapReduce jobs. Big data analytics jobs are often continuous and not mutually separated. The existing work mainly focuses on executing jobs in sequence, which is often inefficient and consumes high energy. In this paper, we propose a genetic algorithm-based job scheduling model for big data analytics applications to improve the efficiency of big data analytics. To implement the job scheduling model, we leverage an estimation module to predict the performance of clusters when executing analytics jobs. We have evaluated the proposed job scheduling model in terms of feasibility and accuracy.
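
    A compact genetic-algorithm sketch for ordering jobs: permutation chromosomes, a completion-time fitness computed from estimated runtimes, tournament selection, order crossover and swap mutation. The runtime estimates and GA settings are illustrative assumptions, not the paper's scheduling model.

```python
# Compact genetic-algorithm sketch for job ordering: permutation chromosomes,
# fitness = total completion time from (assumed) estimated runtimes, tournament
# selection, order crossover and swap mutation. Illustrative only.
import numpy as np

rng = np.random.default_rng(0)
runtimes = rng.uniform(1.0, 20.0, 12)          # estimated runtime of each analytics job

def total_completion_time(order):
    # Jobs run back-to-back; shorter jobs first minimises this objective.
    return float(np.cumsum(runtimes[order]).sum())

def order_crossover(p1, p2):
    n = len(p1)
    a, b = sorted(rng.choice(n, size=2, replace=False))
    segment = list(p1[a:b + 1])                # keep a slice of parent 1
    rest = [g for g in p2 if g not in segment] # fill the rest in parent-2 order
    return np.array(rest[:a] + segment + rest[a:])

def mutate(perm, p=0.2):
    perm = perm.copy()
    if rng.random() < p:
        i, j = rng.choice(len(perm), size=2, replace=False)
        perm[i], perm[j] = perm[j], perm[i]
    return perm

pop = [rng.permutation(len(runtimes)) for _ in range(40)]
for generation in range(200):
    fitness = np.array([total_completion_time(p) for p in pop])
    new_pop = [pop[int(np.argmin(fitness))]]             # elitism
    while len(new_pop) < len(pop):
        i, j = rng.integers(0, len(pop), 2)              # tournament of size 2
        parent1 = pop[i] if fitness[i] < fitness[j] else pop[j]
        i, j = rng.integers(0, len(pop), 2)
        parent2 = pop[i] if fitness[i] < fitness[j] else pop[j]
        new_pop.append(mutate(order_crossover(parent1, parent2)))
    pop = new_pop

best = min(pop, key=total_completion_time)
print("GA order cost:", round(total_completion_time(best), 1),
      "| shortest-first optimum:", round(total_completion_time(np.argsort(runtimes)), 1))
```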

  15. A diagnostic interface for the ICOsahedral Non-hydrostatic (ICON) modelling framework based on the Modular Earth Submodel System (MESSy v2.50)

    NASA Astrophysics Data System (ADS)

    Kern, Bastian; Jöckel, Patrick

    2016-10-01

    Numerical climate and weather models have advanced to finer scales, accompanied by large amounts of output data. The model systems hit the input and output (I/O) bottleneck of modern high-performance computing (HPC) systems. We aim to apply diagnostic methods online during the model simulation instead of applying them as a post-processing step to written output data, to reduce the amount of I/O. To include diagnostic tools into the model system, we implemented a standardised, easy-to-use interface based on the Modular Earth Submodel System (MESSy) into the ICOsahedral Non-hydrostatic (ICON) modelling framework. The integration of the diagnostic interface into the model system is briefly described. Furthermore, we present a prototype implementation of an advanced online diagnostic tool for the aggregation of model data onto a user-defined regular coarse grid. This diagnostic tool will be used to reduce the amount of model output in future simulations. Performance tests of the interface and of two different diagnostic tools show that the interface itself introduces no overhead in the form of additional runtime to the model system. The diagnostic tools, however, have a significant impact on the model system's runtime. This overhead strongly depends on the characteristics and implementation of the diagnostic tool. A diagnostic tool with high inter-process communication introduces large overhead, whereas the additional runtime of a diagnostic tool without inter-process communication is low. We briefly describe our efforts to reduce the additional runtime from the diagnostic tools, and present a brief analysis of memory consumption. Future work will focus on optimisation of the memory footprint and the I/O operations of the diagnostic interface.

  16. The role of public policies in reducing smoking prevalence: results from the Michigan SimSmoke tobacco policy simulation model.

    PubMed

    Levy, David T; Huang, An-Tsun; Havumaki, Joshua S; Meza, Rafael

    2016-05-01

    Michigan has implemented several of the tobacco control policies recommended by the World Health Organization MPOWER goals. We consider the effect of those policies and additional policies consistent with MPOWER goals on smoking prevalence and smoking-attributable deaths (SADs). The SimSmoke tobacco control policy simulation model is used to examine the effect of past policies and a set of additional policies to meet the MPOWER goals. The model is adapted to Michigan using state population, smoking, and policy data starting in 1993. SADs are estimated using standard attribution methods. Upon validating the model, SimSmoke is used to distinguish the effect of policies implemented since 1993 against a counterfactual with policies kept at their 1993 levels. The model is then used to project the effect of implementing stronger policies beginning in 2014. SimSmoke predicts smoking prevalence accurately between 1993 and 2010. Since 1993, a relative reduction in smoking rates of 22 % by 2013 and of 30 % by 2054 can be attributed to tobacco control policies. Of the 22 % reduction, 44 % is due to taxes, 28 % to smoke-free air laws, 26 % to cessation treatment policies, and 2 % to youth access. Moreover, 234,000 SADs are projected to be averted by 2054. With additional policies consistent with MPOWER goals, the model projects that, by 2054, smoking prevalence can be further reduced by 17 % with 80,000 deaths averted relative to the absence of those policies. Michigan SimSmoke shows that tobacco control policies, including cigarette taxes, smoke-free air laws, and cessation treatment policies, have substantially reduced smoking and SADs. Higher taxes, strong mass media campaigns, and cessation treatment policies would further reduce smoking prevalence and SADs.
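
    The standard attribution methods referred to above are typically based on a smoking-attributable fraction of the following form, shown here in its simplest single-exposure version as background (p is smoking prevalence and RR the relative mortality risk of smokers versus never smokers); the SimSmoke model applies such fractions by age and sex.

```latex
% Smoking-attributable fraction (simplest single-exposure form), given as background;
% p is smoking prevalence, RR the relative mortality risk of smokers vs never smokers.
\mathrm{SAF} = \frac{p\,(RR - 1)}{p\,(RR - 1) + 1}, \qquad
\text{attributable deaths} = \mathrm{SAF} \times \text{total deaths}
```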

  17. The Role of Public Policies in Reducing Smoking Prevalence: Results from the Michigan SimSmoke Tobacco Policy Simulation Model

    PubMed Central

    Levy, David T.; Huang, An-Tsun; Havumaki, Joshua S.; Meza, Rafael

    2016-01-01

    Introduction Michigan has implemented several of the tobacco control policies recommended by the World Health Organization MPOWER goals. We consider the effect of those policies and additional policies consistent with MPOWER goals on smoking prevalence and smoking-attributable deaths (SADs). Methods The SimSmoke tobacco control policy simulation model is used to examine the effect of past policies and a set of additional policies to meet the MPOWER goals. The model is adapted to Michigan using state population, smoking and policy data starting in 1993. SADs are estimated using standard attribution methods. Upon validating the model, SimSmoke is used to distinguish the effect of policies implemented since 1993 against a counterfactual with policies kept at their 1993 levels. The model is then used to project the effect of implementing stronger policies beginning in 2014. Results SimSmoke predicts smoking prevalence accurately between 1993 and 2010. Since 1993, a relative reduction in smoking rates of 22% by 2013 and of 30% by 2054 can be attributed to tobacco control policies. Of the 22% reduction, 44% is due to taxes, 28% to smoke-free air laws, 26% to cessation treatment policies, and 2% to youth access. Moreover, 234,000 smoking-attributable deaths are projected to be averted by 2054. With additional policies consistent with MPOWER goals, the model projects that, by 2054, smoking prevalence can be further reduced by 17% with 80,000 deaths averted relative to the absence of those policies. Conclusions Michigan SimSmoke shows that tobacco control policies, including cigarette taxes, smoke-free air laws and cessation treatment policies, have substantially reduced smoking and smoking-attributable deaths. Higher taxes, strong mass media campaigns and cessation treatment policies would further reduce smoking prevalence and smoking-attributable deaths. PMID:26983616

  18. The role of tobacco control policies in reducing smoking and deaths in a middle income nation: results from the Thailand SimSmoke simulation model.

    PubMed

    Levy, D T; Benjakul, S; Ross, H; Ritthiphakdee, B

    2008-02-01

    With the male smoking prevalence near 60% in 1991, Thailand was one of the first Asian nations to implement strict tobacco control policies. However, the success of their efforts has not been well documented. The role of tobacco control policies is examined using the "SimSmoke" tobacco control model. We first validated the model against survey data on smoking prevalence. We then distinguished the effect of policies implemented between 1991 and 2006 from long-term trends in smoking rates. We also estimated smoking attributable deaths and lives saved as a result of the policies. The model validates well against survey data. The model shows that by the year 2006, policies implemented between 1991 and 2006 had already decreased smoking prevalence by 25% compared to what it would have been in the absence of the policies. Tax increases on cigarettes and advertising bans had the largest impact, followed by media anti-smoking campaigns, clean air laws and health warnings. The model estimates that the policies saved 31,867 lives by 2006 and will have saved 319,456 lives by 2026. The results document the success of Thailand in reducing smoking prevalence and reducing the number of lives lost to smoking, thereby showing the potential of tobacco control policies specifically in a middle-income country. Additional improvements can be realised through higher taxes, stronger clean air policies, comprehensive cessation treatment policies, and targeted media campaigns.

  19. PyGirl: Generating Whole-System VMs from High-Level Prototypes Using PyPy

    NASA Astrophysics Data System (ADS)

    Bruni, Camillo; Verwaest, Toon

    Virtual machines (VMs) emulating hardware devices are generally implemented in low-level languages for performance reasons. This results in unmaintainable systems that are difficult to understand. In this paper we report on our experience using the PyPy toolchain to improve the portability and reduce the complexity of whole-system VM implementations. As a case study we implement a VM prototype for a Nintendo Game Boy, called PyGirl, in which the high-level model is separated from low-level VM implementation issues. We shed light on the process of refactoring from a low-level VM implementation in Java to a high-level model in RPython. We show that our whole-system VM written with PyPy is significantly less complex than standard implementations, without substantial loss in performance.

  20. A Wind Tunnel Model to Explore Unsteady Circulation Control for General Aviation Applications

    NASA Technical Reports Server (NTRS)

    Cagle, Christopher M.; Jones, Gregory S.

    2002-01-01

    Circulation Control airfoils have been demonstrated to provide substantial improvements in lift over conventional airfoils. The General Aviation Circulation Control model is an attempt to address some of the concerns of this technique. The primary focus is to substantially reduce the amount of air mass flow by implementing unsteady flow. This paper describes a wind tunnel model that implements unsteady circulation control by pulsing internal pneumatic valves and details some preliminary results from the first test entry.

  1. On DSS Implementation in the Dynamic Model of the Digital Oil field

    NASA Astrophysics Data System (ADS)

    Korovin, Iakov S.; Khisamutdinov, Maksim V.; Kalyaev, Anatoly I.

    2018-02-01

    Decision support systems (DSS), especially those based on artificial intelligence (AI) techniques, are widely applied in many domains nowadays. In this paper we describe an approach to implementing a DSS into the Digital Oil Field (DOF) dynamic model structure in order to reduce the influence of the human factor, considering the automation of all production processes to be the key element of the DOF model. As the basic tool for data handling, we propose the hybrid application of artificial neural networks and evolutionary algorithms.

  2. Please Reduce Cycle Time

    DTIC Science & Technology

    2014-12-01

    observed an ERP system implementation that encountered this exact model. The modified COTS software worked and passed the acceptance tests but never... software-intensive program. We decided to create a very detailed master schedule with multiple supporting subschedules that linked and Implementing ...processes in place as part of the COTS implementation. For hardware, COTS can also present some risks. Many programs use COTS computers and servers

  3. Modeling the leadership attributes of top management in green innovation implementation

    NASA Astrophysics Data System (ADS)

    Ishak, Noormaizatul Akmar; Ramli, Mohammad Fadzli

    2015-05-01

    The implementation of green innovation in companies is of interest to governments all over the world. It has been a main focus of the Copenhagen Protocol and the Kyoto Protocol, which require governments to preserve nature through green initiatives. This paper proposes a mathematical model of the leadership attributes of top management in ensuring that green innovation is implemented in their companies' strategies to reduce operational cost. Looking at green innovation implementation in Government-Linked Companies (GLCs), we identify leadership attributes that are tied to the leadership style of the companies' top managers. Through this model we show that a green leadership style consistently contributes more to cost saving and is therefore a more efficient leadership attribute, especially for GLCs.

  4. Digital telephony analysis model and issues

    NASA Astrophysics Data System (ADS)

    Keuthan, Lynn M.

    1995-09-01

    Experts in the fields of digital telephony and communications security have stated the need for an analytical tool for evaluating complex issues. Some important policy issues discussed by experts recently include implementing digital wire-taps, implementation of the 'Clipper Chip', required registration of encryption/decryption keys, and export control of cryptographic equipment. Associated with the implementation of these policies are direct costs resulting from implementation, indirect cost benefits from implementation, and indirect costs resulting from the risks of implementation or factors reducing cost benefits. Presented herein is a model for analyzing digital telephony policies and systems and their associated direct costs and indirect benefit and risk factors. In order to present the structure of the model, issues of national importance and business-related issues are discussed. The various factors impacting the implementation of the associated communications systems and communications security are summarized, and various implementation tradeoffs are compared based on economic benefits/impact. The importance of the issues addressed herein, as well as other digital telephony issues, has greatly increased with the enormous increases in communication system connectivity due to the advance of the National Information Infrastructure.

  5. The dynamical analysis of modified two-compartment neuron model and FPGA implementation

    NASA Astrophysics Data System (ADS)

    Lin, Qianjin; Wang, Jiang; Yang, Shuangming; Yi, Guosheng; Deng, Bin; Wei, Xile; Yu, Haitao

    2017-10-01

    The complexity of neural models is increasing with the investigation of larger biological neural networks, a greater variety of ionic channels, and more detailed morphologies, and the implementation of biological neural networks is a task with huge computational complexity and power consumption. This paper presents an efficient digital design using piecewise linearization on a field-programmable gate array (FPGA) to succinctly implement a reduced two-compartment model that retains the essential features of more complicated models. The design proposes an approximate neuron model composed of a set of piecewise linear equations, and it can reproduce the different dynamical behaviors that characterize a single neuron model. The consistency of the hardware implementation is verified in terms of dynamical behaviors and bifurcation analysis, and the simulation results, including varied ion channel characteristics, coincide with the biological neuron model with high accuracy. Hardware synthesis on FPGA demonstrates that the proposed model has reliable performance and lower hardware resource usage compared with the original two-compartment model. These investigations are conducive to the scalability of biological neural networks in reconfigurable large-scale neuromorphic systems.
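
    Piecewise linearization of the model's nonlinear terms is what makes such a design cheap in FPGA logic. The short sketch below (Python/NumPy, purely illustrative and not the authors' equations) shows the basic idea: replace a smooth gating curve with a small table of breakpoints and linear interpolation, then check the approximation error.

      # Illustrative sketch only: piecewise-linear (PWL) approximation of a
      # nonlinear term, the basic trick used to make neuron models cheap on
      # an FPGA (lookups and multiply-adds instead of exponentials).
      import numpy as np

      def pwl_approx(f, x_breaks):
          """Return a function that evaluates f by linear interpolation
          between precomputed breakpoints (what a hardware LUT would hold)."""
          y_breaks = f(x_breaks)
          def approx(x):
              return np.interp(x, x_breaks, y_breaks)
          return approx

      # Example nonlinearity: a sigmoidal steady-state activation curve.
      m_inf = lambda v: 1.0 / (1.0 + np.exp(-(v + 40.0) / 10.0))

      breaks = np.linspace(-100.0, 50.0, 16)     # 16 breakpoints is FPGA-friendly
      m_pwl = pwl_approx(m_inf, breaks)

      v = np.linspace(-100.0, 50.0, 1000)
      max_err = np.max(np.abs(m_pwl(v) - m_inf(v)))
      print(f"max PWL error with 16 breakpoints: {max_err:.4f}")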

  6. Modeling and Analysis of the Hurricane Imaging Radiometer (HIRAD)

    NASA Technical Reports Server (NTRS)

    Mauro, Stephanie

    2013-01-01

    The Hurricane Imaging Radiometer (HIRad) is a payload carried by an unmanned aerial vehicle (UAV) at altitudes up to 60,000 ft with the purpose of measuring ocean surface wind speeds and near ocean surface rain rates in hurricanes. The payload includes several components that must maintain steady temperatures throughout the flight. Minimizing the temperature drift of these components allows for accurate data collection and conclusions to be drawn concerning the behavior of hurricanes. HIRad has flown on several different UAVs over the past two years during the fall hurricane season. Based on the data from the 2011 flight, a Thermal Desktop model was created to simulate the payload and reproduce the temperatures. Using this model, recommendations were made to reduce the temperature drift through the use of heaters controlled by resistance temperature detector (RTD) sensors. The suggestions made were implemented for the 2012 hurricane season and further data was collected. The implementation of the heaters reduced the temperature drift for a portion of the flight, but after a period of time, the temperatures rose. With this new flight data, the thermal model was updated and correlated. Detailed analysis was conducted to determine a more effective way to reduce the temperature drift. The final recommendations made were to adjust the set temperatures of the heaters for 2013 flights and implement hardware changes for flights beyond 2013.

  7. Thermal Modeling and Analysis of the Hurricane Imaging Radiometer (HIRad)

    NASA Technical Reports Server (NTRS)

    Mauro, Stephanie

    2013-01-01

    The Hurricane Imaging Radiometer (HIRad) is a payload carried by an unmanned aerial vehicle (UAV) at altitudes up to 60,000 ft with the purpose of measuring ocean surface wind speeds and near ocean surface rain rates in hurricanes. The payload includes several components that must maintain steady temperatures throughout the flight. Minimizing the temperature drift of these components allows for accurate data collection and conclusions to be drawn concerning the behavior of hurricanes. HIRad has flown on several different UAVs over the past two years during the fall hurricane season. Based on the data from the 2011 flight, a Thermal Desktop model was created to simulate the payload and reproduce the temperatures. Using this model, recommendations were made to reduce the temperature drift through the use of heaters controlled by resistance temperature detector (RTD) sensors. The suggestions made were implemented for the 2012 hurricane season and further data was collected. The implementation of the heaters reduced the temperature drift for a portion of the flight, but after a period of time, the temperatures rose. With this new flight data, the thermal model was updated and correlated. Detailed analysis was conducted to determine a more effective way to reduce the temperature drift. The final recommendations made were to adjust the set temperatures of the heaters for 2013 flights and implement hardware changes for flights beyond 2013.

  8. A Case Report: Cornerstone Health Care Reduced the Total Cost of Care Through Population Segmentation and Care Model Redesign.

    PubMed

    Green, Dale E; Hamory, Bruce H; Terrell, Grace E; O'Connell, Jasmine

    2017-08-01

    Over the course of a single year, Cornerstone Health Care, a multispecialty group practice in North Carolina, redesigned the underlying care models for 5 of its highest-risk populations: late-stage congestive heart failure, oncology, Medicare-Medicaid dual eligibles, those with 5 or more chronic conditions, and the most complex patients with multiple late-stage chronic conditions. At the 1-year mark, the results of the program were analyzed. Overall costs for the patients studied were reduced by 12.7% compared to the year before enrollment. All fully implemented programs delivered between 10% and 16% cost savings. The key area for savings was hospitalization, which was reduced by 30% across all programs. The greatest area of cost increase was "other," a category that consisted in large part of hospice services. Full implementation was key; 2 primary care sites that reverted to more traditional models failed to show the same pattern of savings.

  9. Fast Learning for Immersive Engagement in Energy Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bush, Brian W; Bugbee, Bruce; Gruchalla, Kenny M

    The fast computation which is critical for immersive engagement with and learning from energy simulations would be furthered by developing a general method for creating rapidly computed simplified versions of NREL's computation-intensive energy simulations. Created using machine learning techniques, these 'reduced form' simulations can provide statistically sound estimates of the results of the full simulations at a fraction of the computational cost with response times - typically less than one minute of wall-clock time - suitable for real-time human-in-the-loop design and analysis. Additionally, uncertainty quantification techniques can document the accuracy of the approximate models and their domain of validity. Approximation methods are applicable to a wide range of computational models, including supply-chain models, electric power grid simulations, and building models. These reduced-form representations cannot replace or re-implement existing simulations, but instead supplement them by enabling rapid scenario design and quality assurance for large sets of simulations. We present an overview of the framework and methods we have implemented for developing these reduced-form representations.
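
    The reduced-form idea is generic: sample the expensive simulation, fit a fast statistical model to the samples, and report held-out error as a statement of accuracy and domain of validity. The sketch below (Python with scikit-learn; the simulator is a stand-in function, not an NREL model) illustrates the pattern.

      # Generic sketch of a "reduced-form" surrogate: fit a fast statistical
      # model to a sample of runs from an expensive simulation, then use the
      # surrogate for interactive what-if analysis.  The simulator below is
      # a placeholder, not an NREL model.
      import numpy as np
      from sklearn.ensemble import RandomForestRegressor
      from sklearn.model_selection import train_test_split

      def expensive_simulation(x):
          """Stand-in for a long-running energy model (inputs -> one output)."""
          return np.sin(x[:, 0]) + 0.5 * x[:, 1] ** 2 + 0.1 * x[:, 0] * x[:, 1]

      rng = np.random.default_rng(0)
      X = rng.uniform(-3, 3, size=(2000, 2))      # sampled scenario inputs
      y = expensive_simulation(X)                 # full-model results

      X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
      surrogate = RandomForestRegressor(n_estimators=200, random_state=0)
      surrogate.fit(X_train, y_train)

      # Held-out error gives a simple statement of the surrogate's accuracy.
      err = np.abs(surrogate.predict(X_test) - y_test)
      print(f"median abs error: {np.median(err):.3f}, 95th pct: {np.percentile(err, 95):.3f}")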

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    March-Leuba, S.; Jansen, J.F.; Kress, R.L.

    A new program package, Symbolic Manipulator Laboratory (SML), for the automatic generation of both kinematic and static manipulator models in symbolic form is presented. Critical design parameters may be identified and optimized using symbolic models, as shown in the sample application presented for the Future Armor Rearm System (FARS) arm. The computer-aided development of the symbolic models yields equations with reduced numerical complexity. Particular attention has been paid to the simplification of closed-form solutions and to user-friendly operation. The main emphasis of this research is the development of a methodology, implemented in a computer program, capable of generating symbolic kinematic and static force models of manipulators. The fact that the models are obtained in trigonometrically reduced form is among the most significant results of this work and the most difficult to implement. Mathematica, a commercial program that allows symbolic manipulation, is used to implement the program package. SML is written such that the user can change any of the subroutines or create new ones easily. To assist the user, an on-line help facility has been written to make SML a user-friendly package. Some sample applications are presented. The design and optimization of the 5-degrees-of-freedom (DOF) FARS manipulator using SML is discussed. Finally, the kinematic and static models of two different 7-DOF manipulators are calculated symbolically.
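
    SML itself is built on Mathematica, but the core step it automates, generating a symbolic kinematic model and returning it in trigonometrically reduced form, can be illustrated with a short SymPy sketch for a two-link planar arm (illustrative only; the symbols and helper function are invented here, not taken from SML).

      # Sketch in SymPy (not the Mathematica-based SML package): generate a
      # symbolic forward-kinematics model of a 2-link planar arm and apply
      # trigonometric simplification, the kind of reduction SML automates.
      import sympy as sp

      q1, q2, l1, l2 = sp.symbols('q1 q2 l1 l2', real=True)

      def rot_trans(theta, length):
          # Planar homogeneous transform: rotate by theta, then translate
          # along the rotated x-axis by the link length.
          return sp.Matrix([[sp.cos(theta), -sp.sin(theta), length * sp.cos(theta)],
                            [sp.sin(theta),  sp.cos(theta), length * sp.sin(theta)],
                            [0, 0, 1]])

      T = rot_trans(q1, l1) * rot_trans(q2, l2)                      # end-effector pose
      T_reduced = T.applyfunc(lambda e: sp.trigsimp(sp.expand(e)))   # trigonometric reduction

      print(T_reduced[0, 2])   # x position: l1*cos(q1) + l2*cos(q1 + q2)
      print(T_reduced[1, 2])   # y position: l1*sin(q1) + l2*sin(q1 + q2)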

  11. Implementation of a new prenatal care model to reduce office visits and increase connectivity and continuity of care: protocol for a mixed-methods study.

    PubMed

    Ridgeway, Jennifer L; LeBlanc, Annie; Branda, Megan; Harms, Roger W; Morris, Megan A; Nesbitt, Kate; Gostout, Bobbie S; Barkey, Lenae M; Sobolewski, Susan M; Brodrick, Ellen; Inselman, Jonathan; Baron, Anne; Sivly, Angela; Baker, Misty; Finnie, Dawn; Chaudhry, Rajeev; Famuyide, Abimbola O

    2015-12-02

    Most low-risk pregnant women receive the standard model of prenatal care with frequent office visits. Research suggests that a reduced schedule of visits among low-risk women could be implemented without increasing adverse maternal or fetal outcomes, but patient satisfaction with these models varies. We aim to determine the effectiveness and feasibility of a new prenatal care model (OB Nest) that enhances a reduced visit model by adding virtual connections that improve continuity of care and patient-directed access to care. This mixed-methods study uses a hybrid effectiveness-implementation design in a single center randomized controlled trial (RCT). Embedding process evaluation in an experimental design like an RCT allows researchers to answer both "Did it work?" and "How or why did it work (or not work)?" when studying complex interventions, as well as providing knowledge for translation into practice after the study. The RE-AIM framework was used to ensure attention to evaluating program components in terms of sustainable adoption and implementation. Low-risk patients recruited from the Obstetrics Division at Mayo Clinic (Rochester, MN) will be randomized to OB Nest or usual care. OB Nest patients will be assigned to a dedicated nursing team, scheduled for 8 pre-planned office visits with a physician or midwife and 6 telephone or online nurse visits (compared to 12 pre-planned physician or midwife office visits in the usual care group), and provided fetal heart rate and blood pressure home monitoring equipment and information on joining an online care community. Quantitative methods will include patient surveys and medical record abstraction. The primary quantitative outcome is patient-reported satisfaction. Other outcomes include fidelity to items on the American Congress of Obstetricians and Gynecologists standards of care list, health care utilization (e.g. numbers of antenatal office visits), and maternal and fetal outcomes (e.g. gestational age at delivery), as well as validated patient-reported measures of pregnancy-related stress and perceived quality of care. Quantitative analysis will be performed according to the intention to treat principle. Qualitative methods will include interviews and focus groups with providers, staff, and patients, and will explore satisfaction, intervention adoption, and implementation feasibility. We will use methods of qualitative thematic analysis at three stages. Mixed methods analysis will involve the use of qualitative data to lend insight to quantitative findings. This study will make important contributions to the literature on reduced visit models by evaluating a novel prenatal care model with components to increase patient connectedness (even with fewer pre-scheduled office visits), as demonstrated on a range of patient-important outcomes. The use of a hybrid effectiveness-implementation approach, as well as attention to patient and provider perspectives on program components and implementation, may uncover important information that can inform long-term feasibility and potentially speed future translation. Trial registration identifier: NCT02082275 Submitted: March 6, 2014.

  12. The role of non-CO2 mitigation within the dairy sector in pursuing climate goals

    NASA Astrophysics Data System (ADS)

    Rolph, K.; Forest, C. E.

    2017-12-01

    Mitigation of non-CO2 climate forcing agents must complement the mitigation of carbon dioxide (CO2) to achieve long-term temperature and climate policy goals. By using multi-gas mitigation strategies, society can limit the rate of temperature change on decadal timescales and reduce the cost of implementing policies that only consider CO2 mitigation. The largest share of global non-CO2 greenhouse gas emissions is attributed to agriculture, with activities related to dairy production contributing the most in this sector. Approximately 4% of global anthropogenic greenhouse gas emissions is released from the dairy sub-sector, primarily through enteric fermentation, feed production, and manure management. Dairy farmers can significantly reduce their emissions by implementing better management practices. This study assesses the potential mitigation of projected climate change if greenhouse gases associated with the dairy sector were reduced. To compare the performance of several mitigation measures under future climate change, we employ a fully coupled earth system model of intermediate complexity, the MIT Integrated Global System Model (IGSM). The model includes an interactive carbon-cycle capable of addressing important feedbacks between the climate and terrestrial biosphere. Mitigation scenarios are developed using estimated emission reductions of implemented management practices studied by the USDA-funded Sustainable Dairy Project (Dairy-CAP). We examine pathways to reach the US dairy industry's voluntary goal of reducing dairy emissions 25% by 2020. We illustrate the importance of ongoing mitigation efforts in the agricultural industry to reduce non-CO2 greenhouse gas emissions towards established climate goals.

  13. Automated model integration at source code level: An approach for implementing models into the NASA Land Information System

    NASA Astrophysics Data System (ADS)

    Wang, S.; Peters-Lidard, C. D.; Mocko, D. M.; Kumar, S.; Nearing, G. S.; Arsenault, K. R.; Geiger, J. V.

    2014-12-01

    Model integration bridges the data flow between modeling frameworks and models. However, models usually do not fit directly into a particular modeling environment unless they were designed for it. An example is implementing different types of models into the NASA Land Information System (LIS), a software framework for land-surface modeling and data assimilation. Model implementation requires scientific knowledge and software expertise, and it may take a developer months to learn LIS and the model's software structure. Debugging and testing of the implementation are also time-consuming when neither LIS nor the model is fully understood. This time is costly for research and operational projects. To address this issue, an approach has been developed to automate model integration into LIS. A general model interface was designed to retrieve the forcing inputs, parameters, and state variables needed by the model and to provide its state variables and outputs back to LIS. Every model can be wrapped to comply with the interface, usually with a FORTRAN 90 subroutine. Development requires only knowledge of the model and basic programming skills. With such wrappers, the logic is the same for implementing all models, and code templates defined for the general model interface can be re-used with any specific model. Therefore, the model implementation can be done automatically. An automated model implementation toolkit was developed with Microsoft Excel and its built-in VBA language. It accepts model specifications in three worksheets and contains FORTRAN 90 code templates in VBA programs. According to the model specification, the toolkit generates data structures and procedures within FORTRAN modules and subroutines, which transfer data between LIS and the model wrapper. Model implementation is standardized, and about 80-90% of the development load is reduced. In this presentation, the automated model implementation approach is described along with the LIS programming interfaces, the general model interface, and five case studies, including a regression model, Noah-MP, FASST, SAC-HTET/SNOW-17, and FLake. These models vary in complexity and software structure. We also describe how these complexities were overcome using this approach and present results of model benchmarks within LIS.
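
    The general model interface amounts to a fixed contract between the framework driver and every wrapped model. The sketch below expresses that contract in Python for brevity (the real LIS wrappers are FORTRAN 90 subroutines generated from code templates); the toy bucket model and all names are illustrative, not part of LIS.

      # Conceptual sketch of the "general model interface" idea, written in
      # Python for brevity; names are illustrative only.

      class ModelWrapper:
          """Uniform contract every wrapped model must satisfy: take
          forcings/parameters/state in, return updated state and outputs."""
          def run_step(self, forcings, parameters, state):
              raise NotImplementedError

      class SimpleBucketModel(ModelWrapper):
          """Toy water-balance model standing in for Noah-MP, FLake, etc."""
          def run_step(self, forcings, parameters, state):
              storage = state["soil_water"] + forcings["precip"]
              runoff = max(0.0, storage - parameters["capacity"])
              storage = min(storage, parameters["capacity"]) - forcings["evap"]
              new_state = {"soil_water": max(storage, 0.0)}
              outputs = {"runoff": runoff}
              return new_state, outputs

      def driver(model, forcing_series, parameters, state):
          """Stand-in for the framework loop: the driver only ever talks to
          the wrapper interface, never to model internals."""
          for forcings in forcing_series:
              state, outputs = model.run_step(forcings, parameters, state)
              print(state, outputs)

      driver(SimpleBucketModel(),
             forcing_series=[{"precip": 5.0, "evap": 1.0},
                             {"precip": 0.0, "evap": 2.0}],
             parameters={"capacity": 10.0},
             state={"soil_water": 8.0})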

  14. Test Driven Development: Lessons from a Simple Scientific Model

    NASA Astrophysics Data System (ADS)

    Clune, T. L.; Kuo, K.

    2010-12-01

    In the commercial software industry, unit testing frameworks have emerged as a disruptive technology that has permanently altered the process by which software is developed. Unit testing frameworks significantly reduce traditional barriers, both practical and psychological, to creating and executing tests that verify software implementations. A new development paradigm, known as test driven development (TDD), has emerged from unit testing practices, in which low-level tests (i.e. unit tests) are created by developers prior to implementing new pieces of code. Although somewhat counter-intuitive, this approach actually improves developer productivity. In addition to reducing the average time for detecting software defects (bugs), the requirement to provide procedure interfaces that enable testing frequently leads to superior design decisions. Although TDD is widely accepted in many software domains, its applicability to scientific modeling still warrants reasonable skepticism. While the technique is clearly relevant for infrastructure layers of scientific models such as the Earth System Modeling Framework (ESMF), numerical and scientific components pose a number of challenges to TDD that are not often encountered in commercial software. Nonetheless, our experience leads us to believe that the technique has great potential not only for developer productivity, but also as a tool for understanding and documenting the basic scientific assumptions upon which our models are implemented. We will provide a brief introduction to test driven development and then discuss our experience in using TDD to implement a relatively simple numerical model that simulates the growth of snowflakes. Many of the lessons learned are directly applicable to larger scientific models.
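
    In TDD the unit tests are written before the code they exercise and effectively specify the interface. A minimal sketch of what that looks like for a toy growth routine is given below (pytest-style; the function, parameters and tolerances are hypothetical and not the snowflake model discussed in the talk).

      # Illustrative TDD sketch (not the authors' code): the tests for a tiny
      # snowflake-growth routine are written first and drive the interface.
      # Run with pytest.

      def test_mass_grows_monotonically_in_supersaturated_air():
          masses = grow_snowflake(initial_mass=1e-9, supersaturation=0.05,
                                  dt=1.0, steps=100)
          assert len(masses) == 101
          assert all(m2 >= m1 for m1, m2 in zip(masses, masses[1:]))

      def test_no_growth_at_zero_supersaturation():
          masses = grow_snowflake(initial_mass=1e-9, supersaturation=0.0,
                                  dt=1.0, steps=10)
          assert abs(masses[-1] - masses[0]) < 1e-15

      # Minimal implementation written *after* the tests, just enough to pass.
      def grow_snowflake(initial_mass, supersaturation, dt, steps, rate=1e-3):
          masses = [initial_mass]
          for _ in range(steps):
              masses.append(masses[-1] * (1.0 + rate * supersaturation * dt))
          return masses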

  15. Implementation of aerosol-cloud interactions in the regional atmosphere-aerosol model COSMO-MUSCAT(5.0) and evaluation using satellite data

    NASA Astrophysics Data System (ADS)

    Dipu, Sudhakar; Quaas, Johannes; Wolke, Ralf; Stoll, Jens; Mühlbauer, Andreas; Sourdeval, Odran; Salzmann, Marc; Heinold, Bernd; Tegen, Ina

    2017-06-01

    The regional atmospheric model Consortium for Small-scale Modeling (COSMO) coupled to the Multi-Scale Chemistry Aerosol Transport model (MUSCAT) is extended in this work to represent aerosol-cloud interactions. Previously, only one-way interactions (scavenging of aerosol and in-cloud chemistry) and aerosol-radiation interactions were included in this model. The new version allows for a microphysical aerosol effect on clouds. For this, we use the optional two-moment cloud microphysical scheme in COSMO and the online-computed aerosol information for cloud condensation nuclei concentrations (Cccn), replacing the constant Cccn profile. In the radiation scheme, we have implemented a droplet-size-dependent cloud optical depth, now allowing for aerosol-cloud-radiation interactions. To evaluate the model against satellite data, the Cloud Feedback Model Intercomparison Project Observation Simulator Package (COSP) has been implemented. A case study has been carried out to understand the effects of the modifications; the modified modeling system is applied over the European domain with a horizontal resolution of 0.25° × 0.25°. To reduce the complexity of aerosol-cloud interactions, only warm-phase clouds are considered. We find that the online-coupled aerosol introduces significant changes in some cloud microphysical properties: the cloud effective radius increases by 9.5 %, and the cloud droplet number concentration is reduced by 21.5 %.

  16. Cost-effectiveness analysis of implementing an antimicrobial stewardship program in critical care units.

    PubMed

    Ruiz-Ramos, Jesus; Frasquet, Juan; Romá, Eva; Poveda-Andres, Jose Luis; Salavert-Leti, Miguel; Castellanos, Alvaro; Ramirez, Paula

    2017-06-01

    To evaluate the cost-effectiveness of implementing an antimicrobial stewardship (AS) program focused on critical care units, based on assumptions for the Spanish setting. A decision model comparing the costs and outcomes of sepsis, community-acquired pneumonia, and nosocomial infections (including catheter-related bacteremia, urinary tract infection, and ventilator-associated pneumonia) in critical care units with or without an AS program was designed. Model variables and costs, along with their distributions, were obtained from the literature. The study was performed from the Spanish National Health System (NHS) perspective, including only direct costs. The incremental cost-effectiveness ratio (ICER) was analysed with respect to the program's ability to reduce multidrug-resistant bacteria. Uncertainty in the ICERs was evaluated with probabilistic sensitivity analyses. In the short term, implementing an AS program reduces the consumption of antimicrobials with a net benefit of €71,738. In the long term, maintaining the program involves an additional cost to the system of €107,569. The cost per avoided resistance was €7,342, and the cost per life-year gained (LYG) was €9,788. Results from the probabilistic sensitivity analysis showed a more than 90% likelihood that an AS program would be cost-effective at a threshold of €8,000 per LYG. Limitations include the wide variability of economic results reported for this type of AS program and the limited information on their impact on patient outcomes and on the resistance avoided. Implementing an AS program focusing on critical care patients is a cost-effective tool in the long term. Implementation costs are amortized by reducing antimicrobial consumption to prevent infection by multidrug-resistant pathogens.
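
    The underlying arithmetic of an ICER and of a probabilistic sensitivity analysis is simple and worth making explicit. The sketch below uses invented placeholder numbers, not the study's inputs, to show how a cost-per-LYG ratio and the probability of cost-effectiveness at a threshold are computed.

      # Hedged sketch of the core arithmetic behind an ICER and a probabilistic
      # sensitivity analysis; all numbers below are invented placeholders, not
      # values from this study.
      import numpy as np

      rng = np.random.default_rng(1)
      n_draws = 10_000

      # Draw uncertain inputs (costs in euros, effects in life-years gained).
      cost_with_AS    = rng.normal(1_100_000, 80_000, n_draws)
      cost_without_AS = rng.normal(1_050_000, 80_000, n_draws)
      lyg_with_AS     = rng.normal(120.0, 10.0, n_draws)
      lyg_without_AS  = rng.normal(112.0, 10.0, n_draws)

      delta_cost   = cost_with_AS - cost_without_AS
      delta_effect = lyg_with_AS - lyg_without_AS
      icer = delta_cost.mean() / delta_effect.mean()
      print(f"ICER: {icer:,.0f} euros per life-year gained")

      # Probability the programme is cost-effective at a given threshold:
      threshold = 8_000  # euros per LYG
      net_benefit = threshold * delta_effect - delta_cost
      print(f"P(cost-effective at {threshold}/LYG): {(net_benefit > 0).mean():.2f}")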

  17. Artificial neural network modeling using clinical and knowledge independent variables predicts salt intake reduction behavior

    PubMed Central

    Isma’eel, Hussain A.; Sakr, George E.; Almedawar, Mohamad M.; Fathallah, Jihan; Garabedian, Torkom; Eddine, Savo Bou Zein

    2015-01-01

    Background: High dietary salt intake is directly linked to hypertension and cardiovascular diseases (CVDs). Predicting behaviors regarding salt intake habits is vital to guide interventions and increase their effectiveness. We aim to compare the accuracy of an artificial neural network (ANN) based tool that predicts behavior from key knowledge questions along with clinical data in a high cardiovascular risk cohort, relative to the least squares model (LSM) method. Methods: We collected knowledge, attitude and behavior data on 115 patients. A behavior score was calculated to classify patients' behavior towards reducing salt intake. The accuracy comparison between ANN and regression analysis was calculated using the bootstrap technique with 200 iterations. Results: Starting from a 69-item questionnaire, a reduced model was developed that included the eight knowledge items found to give the highest accuracy of 62% (CI 58-67%). The best prediction accuracy in the full and reduced models was attained by ANN, at 66% and 62%, respectively, compared with the full and reduced LSM at 40% and 34%, respectively. The average relative increase in accuracy of ANN over LSM in the full and reduced models was 82% and 102%, respectively. Conclusions: Using ANN modeling, we can predict salt reduction behaviors with 66% accuracy. The statistical model has been implemented in an online calculator and can be used in clinics to estimate a patient's behavior. This will help future research to further prove the clinical utility of this tool to guide therapeutic salt reduction interventions in high cardiovascular risk individuals. PMID:26090333
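
    The comparison design, bootstrapping the accuracy of an ANN against a linear model over repeated resamples, can be sketched as follows (Python with scikit-learn on synthetic data; LogisticRegression stands in for the least squares model, and none of the study's questionnaire data are reproduced).

      # Sketch of the comparison idea only: bootstrap the accuracy of a small
      # neural network versus a linear model on synthetic data.
      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.linear_model import LogisticRegression
      from sklearn.neural_network import MLPClassifier

      X, y = make_classification(n_samples=115, n_features=8, n_informative=5,
                                 random_state=0)

      def bootstrap_accuracy(make_model, n_iter=200):
          rng = np.random.default_rng(0)
          scores = []
          for _ in range(n_iter):
              idx = rng.integers(0, len(y), len(y))          # resample with replacement
              oob = np.setdiff1d(np.arange(len(y)), idx)     # out-of-bag test set
              model = make_model().fit(X[idx], y[idx])
              scores.append(model.score(X[oob], y[oob]))
          return np.mean(scores), np.percentile(scores, [2.5, 97.5])

      print("ANN:", bootstrap_accuracy(lambda: MLPClassifier(hidden_layer_sizes=(8,),
                                                             max_iter=2000, random_state=0)))
      print("linear (LSM stand-in):", bootstrap_accuracy(lambda: LogisticRegression(max_iter=1000)))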

  18. Preliminary results in implementing a model of the world economy on the CYBER 205: A case of large sparse nonsymmetric linear equations

    NASA Technical Reports Server (NTRS)

    Szyld, D. B.

    1984-01-01

    A brief description of the Model of the World Economy implemented at the Institute for Economic Analysis is presented, together with our experience in converting the software to vector code. For each time period, the model is reduced to a linear system of over 2000 variables. The matrix of coefficients has a bordered block diagonal structure, and we show how some of the matrix operations can be carried out on all diagonal blocks at once.
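
    The bordered block diagonal structure is what makes such a system amenable to vector hardware: the diagonal blocks can be eliminated independently and only a small border (Schur complement) system couples them. A generic sketch of that elimination, with toy sizes and random data rather than the world-economy matrix, is given below.

      # Generic sketch of exploiting a bordered block-diagonal structure:
      # solve the independent diagonal blocks first (the part that vectorizes),
      # then solve the small border (Schur complement) system.
      import numpy as np
      from scipy.linalg import block_diag

      rng = np.random.default_rng(0)
      n_blocks, block_size, border = 5, 4, 2

      A_blocks = [rng.normal(size=(block_size, block_size)) + 4 * np.eye(block_size)
                  for _ in range(n_blocks)]
      B_cols = [rng.normal(size=(block_size, border)) for _ in range(n_blocks)]  # right border
      C_rows = [rng.normal(size=(border, block_size)) for _ in range(n_blocks)]  # bottom border
      D = rng.normal(size=(border, border)) + 4 * np.eye(border)
      b_blocks = [rng.normal(size=block_size) for _ in range(n_blocks)]
      b_border = rng.normal(size=border)

      # Block elimination: x_i = A_i^{-1} (b_i - B_i z), then solve for z.
      A_inv_b = [np.linalg.solve(A, b) for A, b in zip(A_blocks, b_blocks)]
      A_inv_B = [np.linalg.solve(A, B) for A, B in zip(A_blocks, B_cols)]
      schur = D - sum(C @ AiB for C, AiB in zip(C_rows, A_inv_B))
      rhs   = b_border - sum(C @ Aib for C, Aib in zip(C_rows, A_inv_b))
      z = np.linalg.solve(schur, rhs)
      x_blocks = [Aib - AiB @ z for Aib, AiB in zip(A_inv_b, A_inv_B)]

      # Verify against a dense solve of the assembled system.
      A_full = np.block([[block_diag(*A_blocks), np.vstack(B_cols)],
                         [np.hstack(C_rows),     D]])
      b_full = np.concatenate(b_blocks + [b_border])
      print(np.allclose(np.concatenate(x_blocks + [z]), np.linalg.solve(A_full, b_full)))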

  19. Implementation of New Process Models for Tailored Polymer Composite Structures into Processing Software Packages

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nguyen, Ba Nghiep; Jin, Xiaoshi; Wang, Jin

    2010-02-23

    This report describes the work conducted under the Cooperative Research and Development Agreement (CRADA) (Nr. 260) between the Pacific Northwest National Laboratory (PNNL) and Autodesk, Inc. to develop and implement process models for injection-molded long-fiber thermoplastics (LFTs) in processing software packages. The structure of this report is organized as follows. After the Introduction (Section 1), Section 2 summarizes the current fiber orientation models developed for injection-molded short-fiber thermoplastics (SFTs). Section 3 provides an assessment of these models to determine their capabilities and limitations, and the developments needed for injection-molded LFTs. Section 4 then focuses on the development of a new fiber orientation model for LFTs. This model is termed the anisotropic rotary diffusion - reduced strain closure (ARD-RSC) model, as it explores the concept of anisotropic rotary diffusion to capture fiber-fiber interaction in long-fiber suspensions and uses the reduced strain closure method of Wang et al. to slow down the orientation kinetics in concentrated suspensions. In contrast to fiber orientation modeling, before this project no standard model had been developed to predict the fiber length distribution in molded fiber composites. Section 5 is therefore devoted to the development of a fiber length attrition model in the mold. Sections 6 and 7 address the implementation of the models in AMI, and the conclusions drawn from this work are presented in Section 8.

  20. Cost-effectiveness of a quality improvement programme to reduce central line-associated bloodstream infections in intensive care units in the USA

    PubMed Central

    Herzer, Kurt R; Niessen, Louis; Constenla, Dagna O; Ward, William J; Pronovost, Peter J

    2014-01-01

    Objective To assess the cost-effectiveness of a multifaceted quality improvement programme focused on reducing central line-associated bloodstream infections in intensive care units. Design Cost-effectiveness analysis using a decision tree model to compare programme to non-programme intensive care units. Setting USA. Population Adult patients in the intensive care unit. Costs Economic costs of the programme and of central line-associated bloodstream infections were estimated from the perspective of the hospital and presented in 2013 US dollars. Main outcome measures Central line-associated bloodstream infections prevented, deaths averted due to central line-associated bloodstream infections prevented, and incremental cost-effectiveness ratios. Probabilistic sensitivity analysis was performed. Results Compared with current practice, the programme is strongly dominant and reduces bloodstream infections and deaths at no additional cost. The probabilistic sensitivity analysis showed that there was an almost 80% probability that the programme reduces bloodstream infections and the infections’ economic costs to hospitals. The opportunity cost of a bloodstream infection to a hospital was the most important model parameter in these analyses. Conclusions This multifaceted quality improvement programme, as it is currently implemented by hospitals on an increasingly large scale in the USA, likely reduces the economic costs of central line-associated bloodstream infections for US hospitals. Awareness among hospitals about the programme's benefits should enhance implementation. The programme's implementation has the potential to substantially reduce morbidity, mortality and economic costs associated with central line-associated bloodstream infections. PMID:25256190

  1. Using Agent Base Models to Optimize Large Scale Network for Large System Inventories

    NASA Technical Reports Server (NTRS)

    Shameldin, Ramez Ahmed; Bowling, Shannon R.

    2010-01-01

    The aim of this paper is to use Agent-Based Models (ABM) to optimize large-scale network handling capabilities for large system inventories and to implement strategies for the purpose of reducing capital expenses. The models used in this paper rely on computational algorithms or procedural implementations developed in Matlab to simulate agent-based models, combining a principal programming language with mathematical theory; the models run on clusters, which provide the high-performance computing needed to execute the program in parallel. In both cases, a model is defined as a compilation of a set of structures and processes assumed to underlie the behavior of a network system.
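
    As a pattern, an agent-based model is just simple per-agent rules iterated over many periods, with system-level behaviour read off afterwards. The toy sketch below (Python; the paper's models were built in Matlab and run on clusters) shows that pattern for a set of inventory depots with a simple reorder rule; all names and numbers are invented.

      # Toy agent-based sketch, only meant to show the ABM pattern: simple
      # agent rules, repeated interaction, emergent system-level behaviour.
      import random

      class Depot:
          def __init__(self, name, stock, reorder_point, order_size):
              self.name, self.stock = name, stock
              self.reorder_point, self.order_size = reorder_point, order_size
              self.orders_placed = 0

          def step(self):
              demand = random.randint(0, 4)                 # local demand each period
              self.stock = max(0, self.stock - demand)
              if self.stock <= self.reorder_point:          # simple reorder rule
                  self.stock += self.order_size
                  self.orders_placed += 1

      random.seed(0)
      depots = [Depot(f"depot-{i}", stock=20, reorder_point=5, order_size=15)
                for i in range(10)]
      for _ in range(200):                                  # simulate 200 periods
          for depot in depots:
              depot.step()

      total_orders = sum(d.orders_placed for d in depots)
      print(f"total replenishment orders across the network: {total_orders}")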

  2. Adapting and Implementing a Community Program to Improve Retention in Care among Patients with HIV in Southern Haiti: "Group of 6".

    PubMed

    Naslund, John A; Dionne-Odom, Jodie; Junior Destiné, Cléonas; Jogerst, Kristen M; Renold Sénécharles, Redouin; Jean Louis, Michelande; Desir, Jasmin; Néptune Ledan, Yvette; Beauséjour, Jude Ronald; Charles, Roland; Werbel, Alice; Talbot, Elizabeth A; Joseph, Patrice; Pape, Jean William; Wright, Peter F

    2014-01-01

    Objective. In Mozambique, a patient-led Community ART Group model developed by Médecins Sans Frontières improved retention in care and adherence to antiretroviral therapy (ART) among persons with HIV. We describe the adaptation and implementation of this model within the HIV clinic located in the largest public hospital in Haiti's Southern Department. Methods. Our adapted model was named Group of 6. Hospital staff enabled stable patients with HIV receiving ART to form community groups with 4-6 members to facilitate monthly ART distribution, track progress and adherence, and provide support. Implementation outcomes included recruitment success, participant retention, group completion of monthly monitoring forms, and satisfaction surveys. Results. Over one year, 80 patients from nine communities enrolled into 15 groups. Six participants left to receive HIV care elsewhere, two moved away, and one died of a non-HIV condition. Group members successfully completed monthly ART distribution and returned 85.6% of the monthly monitoring forms. Members reported that Group of 6 made their HIV management easier and hospital staff reported that it reduced their workload. Conclusions. We report successful adaptation and implementation of a validated community HIV-care model in Southern Haiti. Group of 6 can reduce barriers to ART adherence, and will be integrated as a routine care option.

  3. Reducing eating disorder risk factors: A pilot effectiveness trial of a train-the-trainer approach to dissemination and implementation.

    PubMed

    Greif, Rebecca; Becker, Carolyn Black; Hildebrandt, Tom

    2015-12-01

    Impediments limit dissemination and implementation of evidence-based interventions (EBIs), including lack of sufficient training. One strategy to increase implementation of EBIs is the train-the-trainer (TTT) model. The Body Project is a peer-led body image program that reduces eating disorder (ED) risk factors. This study examined the effectiveness of a TTT model at reducing risk factors in Body Project participants. Specifically, this study examined whether a master trainer could train a novice trainer to train undergraduate peer leaders to administer the Body Project such that individuals who received the Body Project (i.e., participants) would evidence comparable outcomes to previous trials. We hypothesized that participants would evidence reductions in ED risk factors, with effect sizes similar to previous trials. Utilizing a TTT model, a master trainer trained a novice trainer to train undergraduate peer leaders to administer the Body Project to undergraduate women. Undergraduate women aged 18 years or older who received the Body Project intervention participated in the trial and completed measures at baseline, post-treatment, and five-month follow-up. Primary outcomes included body dissatisfaction, thin ideal internalization, negative affect, and ED pathology. Participants demonstrated significant reductions in thin ideal internalization, ED pathology and body dissatisfaction at post-treatment and 5-month follow-up. At 5 months, using three different strategies for managing missing data, effect sizes were larger or comparable to earlier trials for 3 out of 4 variables. Results support a TTT model for Body Project implementation and the importance of utilizing sensitivity analyses for longitudinal datasets with missing data. © 2015 Wiley Periodicals, Inc.

  4. To what extent can green infrastructure mitigate downstream flooding in a peri-urban catchment?

    NASA Astrophysics Data System (ADS)

    Schubert, J. E.; Burns, M.; Sanders, B. F.; Fletcher, T.

    2016-12-01

    In this research, we couple an urban hydrologic model (MUSIC, eWater, AUS) with a fine-resolution 2D hydrodynamic model (BreZo, UC Irvine, USA) to test to what extent retrofitting an urban watershed with stormwater control measures (SCMs) can propagate flood management benefits downstream. Our study site is the peri-urban Little Stringybark Creek (LSC) catchment in eastern Melbourne, AUS, with an area of 4.5 km2 and a connected impervious area of 9%. Urban development is mainly limited to the upper 2 km2 of the catchment. Since 2009 the LSC catchment has been the subject of a large-scale experiment aiming to restore more natural flow by implementing over 300 SCMs, such as rain tanks and infiltration trenches, resulting in runoff from 50% of connected impervious areas now being intercepted by some form of SCM. For our study we calibrated the hydrologic and hydraulic models based on current catchment conditions, then developed models representing alternative SCM scenarios, including a complete lack of SCMs versus a full implementation of SCMs. Flow in the hydrologic/hydraulic models is forced using a range of synthetic rainfall events with annual exceedance probabilities (AEPs) between 63-1% and durations between 10 min and 24 hr. Metrics of SCM efficacy in changing the flood regime include flood depths and extents, flow intensity (m2/s), flood duration, and the critical storm duration leading to maximum flood conditions. Results indicate that across the range of AEPs tested and for storm durations of 3 hours or less, current SCM conditions reduce downstream flooded area on average by 29%, while a full implementation of SCMs would reduce downstream flooded area on average by 91%. A full implementation of SCMs could also lower maximum flow intensities by 83% on average, reducing damage potential to structures in the flow path and increasing the ability of vehicles to evacuate flooded streets. We also found that for storm durations longer than 3 hours, the SCMs' capacity to retain rainfall runoff volumes is much reduced, with a full implementation of SCMs only reducing flooded area by 8% and flow intensity by 5.5%. Therefore, additional measures are required to mitigate downstream flood hazard from long-duration events.

  5. Application of analytical redundancy management to Shuttle crafts. [computerized simulation of microelectronic implementation

    NASA Technical Reports Server (NTRS)

    Montgomery, R. C.; Tabak, D.

    1979-01-01

    The study involves the bank of filters approach to analytical redundancy management since this is amenable to microelectronic implementation. Attention is given to a study of the UD factorized filter to determine if it gives more accurate estimates than the standard Kalman filter when data processing word size is reduced. It is reported that, as the word size is reduced, the effect of modeling error dominates the filter performance of the two filters. However, the UD filter is shown to maintain a slight advantage in tracking performance. It is concluded that because of the UD filter's stability in the serial processing mode, it remains the leading candidate for microelectronic implementation.
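
    The word-size experiment can be illustrated by running the same filter at two floating-point precisions and comparing the estimates. The sketch below does this for a scalar Kalman filter only (the UD-factorized filter itself is not reproduced here); the process and measurement noise levels are arbitrary.

      # Illustration only of the word-size experiment: run the same scalar
      # Kalman filter in float64 and float32 and compare the estimates.
      import numpy as np

      def kalman_1d(zs, q, r, dtype):
          x = dtype(0.0)          # state estimate
          p = dtype(1.0)          # estimate variance
          q, r = dtype(q), dtype(r)
          estimates = []
          for z in zs:
              p = p + q                         # predict (random-walk model)
              k = p / (p + r)                   # Kalman gain
              x = x + k * (dtype(z) - x)        # measurement update
              p = (dtype(1.0) - k) * p
              estimates.append(float(x))
          return np.array(estimates)

      rng = np.random.default_rng(0)
      truth = np.cumsum(rng.normal(0, 0.05, 500))           # slowly drifting signal
      zs = truth + rng.normal(0, 0.5, 500)                  # noisy measurements

      est64 = kalman_1d(zs, q=0.0025, r=0.25, dtype=np.float64)
      est32 = kalman_1d(zs, q=0.0025, r=0.25, dtype=np.float32)
      print("max divergence between word sizes:", np.max(np.abs(est64 - est32)))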

  6. Recent National Transonic Facility Test Process Improvements (Invited)

    NASA Technical Reports Server (NTRS)

    Kilgore, W. A.; Balakrishna, S.; Bobbitt, C. W., Jr.; Adcock, J. B.

    2001-01-01

    This paper describes the results of two recent process improvements; drag feed-forward Mach number control and simultaneous force/moment and pressure testing, at the National Transonic Facility. These improvements have reduced the duration and cost of testing. The drag feed-forward Mach number control reduces the Mach number settling time by using measured model drag in the Mach number control algorithm. Simultaneous force/moment and pressure testing allows simultaneous collection of force/moment and pressure data without sacrificing data quality thereby reducing the overall testing time. Both improvements can be implemented at any wind tunnel. Additionally the NTF is working to develop and implement continuous pitch as a testing option as an additional method to reduce costs and maintain data quality.

  7. Recent National Transonic Facility Test Process Improvements (Invited)

    NASA Technical Reports Server (NTRS)

    Kilgore, W. A.; Balakrishna, S.; Bobbitt, C. W., Jr.; Adcock, J. B.

    2001-01-01

    This paper describes the results of two recent process improvements; drag feed-forward Mach number control and simultaneous force/moment and pressure testing, at the National Transonic Facility. These improvements have reduced the duration and cost of testing. The drag feedforward Mach number control reduces the Mach number settling time by using measured model drag in the Mach number control algorithm. Simultaneous force/moment and pressure testing allows simultaneous collection of force/moment and pressure data without sacrificing data quality thereby reducing the overall testing time. Both improvements can be implemented at any wind tunnel. Additionally the NTF is working to develop and implement continuous pitch as a testing option as an additional method to reduce costs and maintain data quality.

  8. The US Air Force suicide prevention program: implications for public health policy.

    PubMed

    Knox, Kerry L; Pflanz, Steven; Talcott, Gerald W; Campise, Rick L; Lavigne, Jill E; Bajorska, Alina; Tu, Xin; Caine, Eric D

    2010-12-01

    We evaluated the effectiveness of the US Air Force Suicide Prevention Program (AFSPP) in reducing suicide, and we measured the extent to which air force installations implemented the program. We determined the AFSPP's impact on suicide rates in the air force by applying an intervention regression model to data from 1981 through 2008, providing 16 years of data before the program's 1997 launch and 11 years of data after launch. Also, we measured implementation of program components at 2 points in time: during a 2004 increase in suicide rates, and 2 years afterward. Suicide rates in the air force were significantly lower after the AFSPP was launched than before, except during 2004. We also determined that the program was being implemented less rigorously in 2004. The AFSPP effectively prevented suicides in the US Air Force. The long-term effectiveness of this program depends upon extensive implementation and effective monitoring of implementation. Suicides can be reduced through a multilayered, overlapping approach that encompasses key prevention domains and tracks implementation of program activities.
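
    The intervention regression amounts to an interrupted time-series fit with a secular trend and a post-launch effect term. A hedged sketch on simulated yearly rates (Python with statsmodels; the AFSPP data themselves are not reproduced) is shown below.

      # Hedged sketch of an intervention (interrupted time-series) regression
      # on simulated yearly rates.
      import numpy as np
      import statsmodels.api as sm

      years = np.arange(1981, 2009)
      post = (years >= 1997).astype(float)               # program launched in 1997
      rng = np.random.default_rng(0)
      rate = 14 - 0.05 * (years - 1981) - 3.0 * post + rng.normal(0, 0.8, len(years))

      X = sm.add_constant(np.column_stack([years - 1981, post]))
      fit = sm.OLS(rate, X).fit()
      print(fit.params)        # [baseline level, secular trend, estimated program effect]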

  9. A grounded theory model for reducing stigma in health professionals in Canada.

    PubMed

    Knaak, S; Patten, S

    2016-08-01

    The Mental Health Commission of Canada was formed as a national catalyst for improving the mental health system. One of its initiatives is Opening Minds (OM), whose mandate is to reduce mental health-related stigma. This article reports findings from a qualitative study on antistigma interventions for healthcare providers, which includes a process model articulating key stages and strategies for implementing successful antistigma programmes. The study employed a grounded theory methodology. Data collection involved in-depth interviews with programme stakeholders, direct observation of programmes, a review of programme documents, and qualitative feedback from programme participants. Analysis proceeded via the constant comparison method. A model was generated to visually present key findings. Twenty-three in-depth interviews were conducted representing 18 different programmes. Eight programmes were observed directly, 48 programme documents were reviewed, and data from 1812 programme participants were reviewed. The analysis led to a four-stage process model for implementing successful antistigma programmes targeting healthcare providers, informed by the basic social process 'targeting the roots of healthcare provider stigma'. The process model developed through this research may function as a tool to help guide the development and implementation of antistigma programmes in healthcare contexts. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  10. Analysis of the implementation of a personalized care model in diabetes mellitus as an example of chronic disease with information and communication technology support.

    PubMed

    López-Martínez, N; Segú, J L; Vázquez-Castro, J; Brosa, M; Bohigas, L; Comellas, M J; Kalfhaus, L

    2017-04-01

    Diabetes mellitus affects 13.8% of the adult population in Spain, representing some 8.2% of total Spanish health spending, which may be reduced by optimizing treatment and disease monitoring. Areas covered: This perspective article aims to evaluate the possible clinical and economic outcomes of implementing a theoretical personalized care model in diabetes supported by information and communications technology in Spain vs. conventional care. Moreover, we assessed the value of emminens® eConecta, a solution designed to support the operational implementation of this model, which enables the connection and participation of patients and health professionals, facilitates patient education, decision-making, access to information, and data analysis. We carried out a review of the available evidence, consultations with experts and a clinical and cost estimation. Expert commentary: The experts consulted considered that the proposed model is consistent with Spanish strategies on chronicity, supports the management of chronicity/diabetes, and may improve the most important aspects of disease management. In the literature, this type of care models improved or provided equal disease control compared with conventional care, potentiated self-management strategies and reduced the high use of resources. Cost estimation showed a reduction of -12% in total direct costs and around -34% in the costs of outpatient visits.

  11. Dedication increases productivity: an analysis of the implementation of a dedicated medical team in the emergency department.

    PubMed

    Ramos, Pedro; Paiva, José Artur

    2017-12-01

    In several European countries, emergency departments (EDs) now employ a dedicated team of full-time emergency medicine (EM) physicians, with distinct leadership and bedside emergency training, similar to other hospital departments. In Portugal, however, there are still two very different models for staffing EDs: a classic model, where EDs are mostly staffed with young, inexperienced physicians from different medical departments who take turns in the ED in 12-h shifts, and a dedicated model, recently implemented in some hospitals, where the ED is staffed by a team of doctors with specific medical competencies in emergency medicine who work full-time in the ED. Our study assesses the effect of an intervention in a large academic hospital ED in Portugal in 2002, and it is the first to test the hypothesis that implementing a dedicated team of doctors with EM expertise increases productivity and reduces costs in the ED while maintaining the quality of care provided to patients. A pre-post design was used to compare the change in the organisational model of delivering care in our medical ED. All emergency medical admissions were tracked in 2002 (classic model with 12-h shifts in the ED) and 2005/2006 (dedicated team with full-time EM physicians), and productivity, costs of medical human resources, and quality-of-care measures were compared. We found that medical productivity (number of patients treated per hour of medical work) increased dramatically after the creation of the dedicated team (Kruskal-Wallis χ2 = 31.135; N = 36; p < 0.001), and costs of ED medical work were reduced both in regular hours and overtime. Moreover, hospitalisation rates decreased and the length of stay in the ED increased significantly after the creation of the dedicated team. Implementing a dedicated team of doctors increased medical productivity and reduced costs in our ED. Our findings have straightforward implications for Portuguese policymakers aiming to reduce hospital costs while coping with increased ED demand.

  12. Carbon Management In the Post-Cap-and-Trade Carbon Economy

    NASA Astrophysics Data System (ADS)

    DeGroff, F. A.

    2013-12-01

    This abstract outlines an economic model that integrates carbon externalities seamlessly into the national and international economies. The model incorporates a broad carbon metric used to value all carbon in the biosphere, as well as all transnational commerce. The model minimizes the cost associated with carbon management, and allows for the variation in carbon avidity between jurisdictions. When implemented over time, the model reduces the deadweight loss while minimizing social cost, thus maximizing the marginal social benefit commonly associated with Pigouvian taxes. Once implemented, the model provides a comprehensive economic construct for governments, industry and consumers to efficiently weigh the cost of carbon, and effectively participate in helping to reduce their direct and indirect use of carbon, while allowing individual jurisdictions to decide their own carbon value, without the need for explicit, express agreement of all countries. The model uses no credits, requires no caps, and matches climate changing behavior to costs. The steps to implement the model for a particular jurisdiction are: 1) Define the Carbon Metric to value changes in Carbon Quality. 2) Apply the Carbon Metric to assess the Carbon Toll a) for all changes in Carbon Quality and b) for imports and exports. This economic model has 3 clear advantages. 1) The carbon pricing and cost scheme use existing and generally accepted accounting methodologies to ensure the veracity and verifiability of carbon management efforts with minimal effort and expense using standard auditing protocols. Implementing this economic model will not require any special training, tools, or systems for any entity to achieve their minimum carbon target goals within their jurisdictional framework. 2) Given the spectrum of carbon affinities worldwide, the model recognizes and provides for flexible carbon pricing regimes, but does not penalize domestic carbon-consuming producers subject to imports from exporters in lower carbon-pricing jurisdictions. Thus, the economic model avoids a key shortcoming of cap-and-trade carbon pricing, and eliminates any incentive to inefficiently shift carbon consumption to jurisdictions with lower carbon tolls. 3) The economic model is a comprehensive, efficient and effective strategy that allows for the implementation of a carbon-pricing structure without the complete, explicit agreement of carbon consumers worldwide.

  13. Quasilinear diffusion coefficients in a finite Larmor radius expansion for ion cyclotron heated plasmas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Jungpyo; Wright, John; Bertelli, Nicola

    In this study, a reduced model of quasilinear velocity diffusion by a small Larmor radius approximation is derived to couple Maxwell's equations and the Fokker-Planck equation self-consistently for waves in the ion cyclotron range of frequencies in a tokamak. The reduced model preserves the important properties of the full model by Kennel-Engelmann diffusion, such as the diffusion directions, the wave polarizations, and the H-theorem. The kinetic energy change (Wdot) is used to derive the reduced-model diffusion coefficients for the fundamental damping (n = 1) and the second harmonic damping (n = 2) to the lowest order of the finite Larmor radius expansion. The quasilinear diffusion coefficients are implemented in a coupled code (TORIC-CQL3D) with the equivalent reduced model of the dielectric tensor. We also present simulations of the ITER minority heating scenario, in which the reduced model is verified to be within allowable errors of the full model results.

  14. Quasilinear diffusion coefficients in a finite Larmor radius expansion for ion cyclotron heated plasmas

    DOE PAGES

    Lee, Jungpyo; Wright, John; Bertelli, Nicola; ...

    2017-04-24

    In this study, a reduced model of quasilinear velocity diffusion by a small Larmor radius approximation is derived to couple Maxwell's equations and the Fokker-Planck equation self-consistently for waves in the ion cyclotron range of frequencies in a tokamak. The reduced model preserves the important properties of the full model by Kennel-Engelmann diffusion, such as the diffusion directions, the wave polarizations, and the H-theorem. The kinetic energy change (Wdot) is used to derive the reduced-model diffusion coefficients for the fundamental damping (n = 1) and the second harmonic damping (n = 2) to the lowest order of the finite Larmor radius expansion. The quasilinear diffusion coefficients are implemented in a coupled code (TORIC-CQL3D) with the equivalent reduced model of the dielectric tensor. We also present simulations of the ITER minority heating scenario, in which the reduced model is verified to be within allowable errors of the full model results.

  15. Model of Dynamic Integration of Lean Shop Floor Management Within the Organizational Management System

    NASA Astrophysics Data System (ADS)

    Iuga, Virginia; Kifor, Claudiu

    2014-12-01

    The key to achieving sustainable development lies in customer satisfaction through improved quality, reduced cost, reduced delivery lead times and proper communication. The objective of the lean manufacturing system (LMS) is to identify and eliminate the processes and resources which do not add value to a product. The following paper aims to present a proposal for the further development of integrated management systems in organizations through the implementation of lean shop floor management. In the first part of the paper, a dynamic model of the implementation steps is presented. Furthermore, the paper underlines the importance of implementing a lean culture in parallel with each step of integrating the lean methods and tools. The paper also describes the Toyota philosophy, tools, and the supporting lean culture necessary for implementing an efficient lean system in productive organizations.

  16. Model reduction in a subset of the original states

    NASA Technical Reports Server (NTRS)

    Yae, K. H.; Inman, D. J.

    1992-01-01

    A model reduction method is investigated to provide a smaller structural dynamic model for subsequent structural control design. The structural dynamic model is assumed to be derived from finite element analysis. It is first converted into state-space form and is then further reduced by the internal balancing method. Through a co-ordinate transformation derived from the states that are deleted during reduction, the reduced model is finally expressed in terms of a subset of the original states. The states in the final reduced model therefore represent the degrees of freedom of nodes selected by the designer. The procedure provides a more practical implementation of model reduction for applications in which specific nodes, such as sensor and/or actuator attachment points, are to be retained in the reduced model, and it ensures that the reduced model has the same input and output conditions as the original physical model. The procedure is applied to two simple examples and comparisons are made between the full- and reduced-order models. The method can be applied to a linear, continuous and time-invariant model of structural dynamics with nonproportional viscous damping.
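
    The internal balancing step can be sketched directly from the system Gramians. The code below (Python/SciPy, on a random stable system standing in for a finite-element model) performs standard balanced truncation; the paper's additional transformation back to a subset of the original physical states is not reproduced here.

      # Sketch of internal balancing / balanced truncation with SciPy.
      import numpy as np
      from scipy.linalg import solve_continuous_lyapunov, cholesky, svd

      def balanced_truncation(A, B, C, r):
          # Gramians: A Wc + Wc A' = -B B',  A' Wo + Wo A = -C' C.
          Wc = solve_continuous_lyapunov(A, -B @ B.T)
          Wo = solve_continuous_lyapunov(A.T, -C.T @ C)
          Lc = cholesky(Wc, lower=True)
          U, s2, _ = svd(Lc.T @ Wo @ Lc)           # s2 = Hankel singular values squared
          hsv = np.sqrt(s2)
          T = Lc @ U @ np.diag(hsv ** -0.5)        # balancing transformation
          Tinv = np.diag(hsv ** 0.5) @ U.T @ np.linalg.inv(Lc)
          Ab, Bb, Cb = Tinv @ A @ T, Tinv @ B, C @ T
          return Ab[:r, :r], Bb[:r, :], Cb[:, :r], hsv

      # Toy stable system standing in for a finite-element structural model.
      rng = np.random.default_rng(0)
      n = 8
      A = rng.normal(size=(n, n))
      A = A - (np.abs(np.linalg.eigvals(A)).max() + 1) * np.eye(n)   # shift to stability
      B = rng.normal(size=(n, 1))
      C = rng.normal(size=(1, n))

      Ar, Br, Cr, hsv = balanced_truncation(A, B, C, r=3)
      print("Hankel singular values:", np.round(hsv, 4))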

  17. The Potential Importance of Conservation, Restoration and Altered Management Practices for Water Quality in the Wabash River Watershed

    NASA Astrophysics Data System (ADS)

    Yang, G.; Best, E. P.; Goodwin, S.

    2013-12-01

Non-point source (NPS) pollution is one of the leading causes of water quality impairment within the United States. Conservation, restoration and altered management (CRAM) practices may effectively reduce NPS pollutants to receiving water bodies and enhance local and regional ecosystem services. Barriers to the implementation of CRAM include uncertainties related to the extent to which nutrients are removed by CRAM at various spatial and temporal scales, longevity, optimal placement of CRAM within the landscape, and implementation/operation/maintenance costs. We conducted a study aimed at the identification of optimal placement of CRAM in watersheds that reduces N loading to an environmentally sustainable level, at an acceptable, known cost. For this study, we used a recently developed screening-level modeling approach, WQM-TMDL-N, running in the ArcGIS environment, to estimate nitrogen loading under current land use conditions (NLCD 2006). This model was equipped with a new option to explore the performance of placing various CRAM types and areas to reduce nitrogen loading to a State-accepted Total Maximum Daily Load (TMDL) standard, with related annual average TN concentration, and a multi-objective algorithm optimizing load and cost. CRAM practices explored for implementation in rural areas included buffer strips, nutrient management practices, and wetland restoration. We initially applied this modeling approach to the Tippecanoe River (TR) watershed (8-digit HUC), a headwater of the Wabash River (WR) watershed, where CRAM implementation in rural and urban areas is being planned and implemented at various spatial scales. Consequences of future land use are explored using a 2050 land use/land cover map forecasted by the Land Transformation Model. The WR watershed, IN, drains two-thirds of the state's 92 counties and supports predominantly agricultural land use. Because the WR accounts for over 40% of the nutrient loads of the Ohio River and significantly contributes to the anoxic zone in the Gulf of Mexico (GOM), reductions in TN loading of the WR are expected to directly benefit downstream ecosystem services, including fisheries in the GOM. This modeling approach can be used in support of sustainable integrated watershed management planning.

  18. Large-scale seismic waveform quality metric calculation using Hadoop

    NASA Astrophysics Data System (ADS)

    Magana-Zook, S.; Gaylord, J. M.; Knapp, D. R.; Dodge, D. A.; Ruppert, S. D.

    2016-09-01

    In this work we investigated the suitability of Hadoop MapReduce and Apache Spark for large-scale computation of seismic waveform quality metrics by comparing their performance with that of a traditional distributed implementation. The Incorporated Research Institutions for Seismology (IRIS) Data Management Center (DMC) provided 43 terabytes of broadband waveform data of which 5.1 TB of data were processed with the traditional architecture, and the full 43 TB were processed using MapReduce and Spark. Maximum performance of 0.56 terabytes per hour was achieved using all 5 nodes of the traditional implementation. We noted that I/O dominated processing, and that I/O performance was deteriorating with the addition of the 5th node. Data collected from this experiment provided the baseline against which the Hadoop results were compared. Next, we processed the full 43 TB dataset using both MapReduce and Apache Spark on our 18-node Hadoop cluster. These experiments were conducted multiple times with various subsets of the data so that we could build models to predict performance as a function of dataset size. We found that both MapReduce and Spark significantly outperformed the traditional reference implementation. At a dataset size of 5.1 terabytes, both Spark and MapReduce were about 15 times faster than the reference implementation. Furthermore, our performance models predict that for a dataset of 350 terabytes, Spark running on a 100-node cluster would be about 265 times faster than the reference implementation. We do not expect that the reference implementation deployed on a 100-node cluster would perform significantly better than on the 5-node cluster because the I/O performance cannot be made to scale. Finally, we note that although Big Data technologies clearly provide a way to process seismic waveform datasets in a high-performance and scalable manner, the technology is still rapidly changing, requires a high degree of investment in personnel, and will likely require significant changes in other parts of our infrastructure. Nevertheless, we anticipate that as the technology matures and third-party tool vendors make it easier to manage and operate clusters, Hadoop (or a successor) will play a large role in our seismic data processing.
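    To make the MapReduce/Spark framing concrete, here is a minimal PySpark sketch that computes a simple per-waveform quality metric in parallel. The file paths, the read_waveform() helper, and the metrics themselves are hypothetical placeholders; the authors' actual metric suite and data formats are not reproduced.

```python
# Minimal PySpark sketch: compute a per-waveform quality metric (RMS amplitude
# and a crude zero-sample count) in parallel. Paths, read_waveform(), and the
# metrics are hypothetical stand-ins, not the authors' pipeline.
import numpy as np
from pyspark import SparkContext

def read_waveform(path):
    # Placeholder: in practice this would parse the actual waveform format.
    return np.load(path)

def quality_metrics(path):
    samples = read_waveform(path)
    rms = float(np.sqrt(np.mean(samples ** 2)))
    n_zero = int(np.sum(samples == 0))  # crude proxy for missing data
    return path, {"rms": rms, "zero_samples": n_zero}

if __name__ == "__main__":
    sc = SparkContext(appName="waveform-quality")
    paths = ["wf_000.npy", "wf_001.npy"]  # hypothetical inputs
    results = sc.parallelize(paths).map(quality_metrics).collect()
    for path, metrics in results:
        print(path, metrics)
    sc.stop()
```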

  19. Hardware architecture for projective model calculation and false match refining using random sample consensus algorithm

    NASA Astrophysics Data System (ADS)

    Azimi, Ehsan; Behrad, Alireza; Ghaznavi-Ghoushchi, Mohammad Bagher; Shanbehzadeh, Jamshid

    2016-11-01

    The projective model is an important mapping function for the calculation of global transformation between two images. However, its hardware implementation is challenging because of a large number of coefficients with different required precisions for fixed point representation. A VLSI hardware architecture is proposed for the calculation of a global projective model between input and reference images and refining false matches using random sample consensus (RANSAC) algorithm. To make the hardware implementation feasible, it is proved that the calculation of the projective model can be divided into four submodels comprising two translations, an affine model and a simpler projective mapping. This approach makes the hardware implementation feasible and considerably reduces the required number of bits for fixed point representation of model coefficients and intermediate variables. The proposed hardware architecture for the calculation of a global projective model using the RANSAC algorithm was implemented using Verilog hardware description language and the functionality of the design was validated through several experiments. The proposed architecture was synthesized by using an application-specific integrated circuit digital design flow utilizing 180-nm CMOS technology as well as a Virtex-6 field programmable gate array. Experimental results confirm the efficiency of the proposed hardware architecture in comparison with software implementation.
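    The record describes a fixed-point hardware design; as a software analogue only, the following sketch estimates a projective (homography) model with RANSAC in NumPy. The DLT-based fitting, threshold, and iteration count are generic choices, not the paper's submodel decomposition or its bit-width analysis.

```python
import numpy as np

def fit_homography(src, dst):
    """Direct linear transform (DLT) from 4 or more point correspondences."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, Vt = np.linalg.svd(np.asarray(rows))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

def ransac_homography(src, dst, n_iter=500, thresh=3.0, rng=None):
    if rng is None:
        rng = np.random.default_rng()
    best_inliers, best_H = np.array([], dtype=int), None
    for _ in range(n_iter):
        idx = rng.choice(len(src), 4, replace=False)
        H = fit_homography(src[idx], dst[idx])
        proj = np.hstack([src, np.ones((len(src), 1))]) @ H.T
        proj = proj[:, :2] / proj[:, 2:3]
        err = np.linalg.norm(proj - dst, axis=1)
        inliers = np.flatnonzero(err < thresh)
        if len(inliers) > max(len(best_inliers), 3):
            best_inliers = inliers
            best_H = fit_homography(src[inliers], dst[inliers])  # refit on inliers
    return best_H, best_inliers

# Synthetic usage: warp random points with a known H and corrupt a few matches
rng = np.random.default_rng(0)
src = rng.uniform(0, 100, (60, 2))
H_true = np.array([[1.0, 0.02, 5.0], [0.01, 0.98, -3.0], [1e-4, 2e-4, 1.0]])
dst_h = np.hstack([src, np.ones((60, 1))]) @ H_true.T
dst = dst_h[:, :2] / dst_h[:, 2:3]
dst[:10] += rng.uniform(-40, 40, (10, 2))  # outliers to be rejected
H_est, inliers = ransac_homography(src, dst, rng=rng)
```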

  20. Looking Forward: The Promise of Widespread Implementation of Parent Training Programs

    PubMed Central

    Forgatch, Marion S.; Patterson, Gerald R.; Gewirtz, Abigail H.

    2013-01-01

    Over the past quarter century a body of parent training programs has been developed and validated as effective in reducing child behavior problems, but few of these have made their way into routine practice. This article describes the long and winding road of implementation as applied to children's mental health. Adopting Rogers' (1995) diffusion framework and Fixsen and colleagues' implementation framework (Fixsen, Naoom, Blase, Friedman, & Wallace, 2005), we review more than a decade of research on the implementation of Parent Management Training – Oregon Model (PMTO®). Data from US and international PMTO implementations are used to illustrate the payoffs and the challenges of making empirically supported interventions routine practice in the community. Technological advances that break down barriers to communication across distances, the availability of efficacious programs suitable for implementation, and the urgent need for high quality mental health care provide strong rationales for prioritizing attention to implementation. Over the next quarter of a century, the challenge is to reduce the prevalence of children's psychopathology by creating science-based delivery systems to reach families in need, everywhere. PMID:24443650

  1. A symbolic/subsymbolic interface protocol for cognitive modeling

    PubMed Central

    Simen, Patrick; Polk, Thad

    2009-01-01

    Researchers studying complex cognition have grown increasingly interested in mapping symbolic cognitive architectures onto subsymbolic brain models. Such a mapping seems essential for understanding cognition under all but the most extreme viewpoints (namely, that cognition consists exclusively of digitally implemented rules; or instead, involves no rules whatsoever). Making this mapping reduces to specifying an interface between symbolic and subsymbolic descriptions of brain activity. To that end, we propose parameterization techniques for building cognitive models as programmable, structured, recurrent neural networks. Feedback strength in these models determines whether their components implement classically subsymbolic neural network functions (e.g., pattern recognition), or instead, logical rules and digital memory. These techniques support the implementation of limited production systems. Though inherently sequential and symbolic, these neural production systems can exploit principles of parallel, analog processing from decision-making models in psychology and neuroscience to explain the effects of brain damage on problem solving behavior. PMID:20711520

  2. Archetype Model-Driven Development Framework for EHR Web System.

    PubMed

    Kobayashi, Shinji; Kimura, Eizen; Ishihara, Ken

    2013-12-01

This article describes the Web application framework for Electronic Health Records (EHRs) we have developed to reduce construction costs for EHR systems. The openEHR project has developed a clinical model driven architecture for future-proof interoperable EHR systems. This project provides the specifications to standardize clinical domain model implementations, upon which the ISO/CEN 13606 standards are based. The reference implementation has been formally described in Eiffel. Moreover, C# and Java implementations have been developed as references. While scripting languages have become more popular in recent years because of their higher efficiency and faster development, they had not been involved in the openEHR implementations. From 2007, we have used the Ruby language and Ruby on Rails (RoR) as an agile development platform to implement EHR systems in conformity with the openEHR specifications. We implemented almost all of the specifications, the Archetype Definition Language parser, and an RoR scaffold generator from archetypes. Although some problems have emerged, most of them have been resolved. We have provided an agile EHR Web framework that can build up Web systems from archetype models using RoR. The feasibility of the archetype model to provide semantic interoperability of EHRs has been demonstrated, and we have verified that it is suitable for the construction of EHR systems.

  3. Using simulation modeling to improve patient flow at an outpatient orthopedic clinic.

    PubMed

    Rohleder, Thomas R; Lewkonia, Peter; Bischak, Diane P; Duffy, Paul; Hendijani, Rosa

    2011-06-01

We report on the use of discrete event simulation modeling to support process improvements at an orthopedic outpatient clinic. The clinic was effective in treating patients, but waiting time and congestion in the clinic created patient dissatisfaction and staff morale issues. The modeling helped to identify improvement alternatives, including optimized staffing levels, better patient scheduling, and an emphasis on staff arriving promptly. Quantitative results from the modeling provided motivation to implement the improvements. Statistical analysis of data taken before and after the implementation indicates that waiting time measures were significantly improved and overall patient time in the clinic was reduced.

  4. Symplectic multi-particle tracking on GPUs

    NASA Astrophysics Data System (ADS)

    Liu, Zhicong; Qiang, Ji

    2018-05-01

    A symplectic multi-particle tracking model is implemented on the Graphic Processing Units (GPUs) using the Compute Unified Device Architecture (CUDA) language. The symplectic tracking model can preserve phase space structure and reduce non-physical effects in long term simulation, which is important for beam property evaluation in particle accelerators. Though this model is computationally expensive, it is very suitable for parallelization and can be accelerated significantly by using GPUs. In this paper, we optimized the implementation of the symplectic tracking model on both single GPU and multiple GPUs. Using a single GPU processor, the code achieves a factor of 2-10 speedup for a range of problem sizes compared with the time on a single state-of-the-art Central Processing Unit (CPU) node with similar power consumption and semiconductor technology. It also shows good scalability on a multi-GPU cluster at Oak Ridge Leadership Computing Facility. In an application to beam dynamics simulation, the GPU implementation helps save more than a factor of two total computing time in comparison to the CPU implementation.
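    As a toy illustration of why symplectic maps are used for long-term tracking, the sketch below advances a bundle of particles through a linear focusing channel with a second-order drift-kick-drift integrator. It is a one-degree-of-freedom stand-in in plain NumPy, not the authors' GPU/CUDA implementation; the same map-per-particle structure is what makes the method straightforward to parallelize.

```python
import numpy as np

def symplectic_step(x, px, dt, k):
    """One second-order drift-kick-drift step for H = px^2/2 + k*x^2/2.

    Splitting the Hamiltonian into a drift (kinetic) and a kick (potential)
    map and composing the exact sub-maps keeps the update symplectic, which
    preserves phase-space structure over very long runs.
    """
    x = x + 0.5 * dt * px   # half drift
    px = px - dt * k * x    # kick
    x = x + 0.5 * dt * px   # half drift
    return x, px

rng = np.random.default_rng(1)
x = rng.standard_normal(100_000)    # particle positions
px = rng.standard_normal(100_000)   # particle momenta
for _ in range(1000):
    x, px = symplectic_step(x, px, dt=0.01, k=1.0)
```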

  5. Integration of EEG lead placement templates into traditional technologist-based staffing models reduces costs in continuous video-EEG monitoring service.

    PubMed

    Kolls, Brad J; Lai, Amy H; Srinivas, Anang A; Reid, Robert R

    2014-06-01

    The purpose of this study was to determine the relative cost reductions within different staffing models for continuous video-electroencephalography (cvEEG) service by introducing a template system for 10/20 lead application. We compared six staffing models using decision tree modeling based on historical service line utilization data from the cvEEG service at our center. Templates were integrated into technologist-based service lines in six different ways. The six models studied were templates for all studies, templates for intensive care unit (ICU) studies, templates for on-call studies, templates for studies of ≤ 24-hour duration, technologists for on-call studies, and technologists for all studies. Cost was linearly related to the study volume for all models with the "templates for all" model incurring the lowest cost. The "technologists for all" model carried the greatest cost. Direct cost comparison shows that any introduction of templates results in cost savings, with the templates being used for patients located in the ICU being the second most cost efficient and the most practical of the combined models to implement. Cost difference between the highest and lowest cost models under the base case produced an annual estimated savings of $267,574. Implementation of the ICU template model at our institution under base case conditions would result in a $205,230 savings over our current "technologist for all" model. Any implementation of templates into a technologist-based cvEEG service line results in cost savings, with the most significant annual savings coming from using the templates for all studies, but the most practical implementation approach with the second highest cost reduction being the template used in the ICU. The lowered costs determined in this work suggest that a template-based cvEEG service could be supported at smaller centers with significantly reduced costs and could allow for broader use of cvEEG patient monitoring.
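    The staffing comparison reduces to an expected-cost calculation over study categories. The sketch below illustrates that bookkeeping with entirely hypothetical volumes and per-study costs; it is not the study's decision-tree model or its dollar figures.

```python
# Toy cost comparison across staffing models for lead placement; the annual
# volumes and per-study costs below are hypothetical placeholders.
ANNUAL_STUDIES = {"icu": 400, "on_call": 250, "routine": 600}
COST_PER_STUDY = {"template": 90.0, "technologist": 260.0}

def annual_cost(assignment):
    """assignment maps each study category to 'template' or 'technologist'."""
    return sum(ANNUAL_STUDIES[cat] * COST_PER_STUDY[staff]
               for cat, staff in assignment.items())

models = {
    "templates for all": dict.fromkeys(ANNUAL_STUDIES, "template"),
    "templates for ICU": {"icu": "template", "on_call": "technologist", "routine": "technologist"},
    "technologists for all": dict.fromkeys(ANNUAL_STUDIES, "technologist"),
}
for name, assignment in models.items():
    print(f"{name}: ${annual_cost(assignment):,.0f}/year")
```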

  6. Computer-Aided Process Model For Carbon/Phenolic Materials

    NASA Technical Reports Server (NTRS)

    Letson, Mischell A.; Bunker, Robert C.

    1996-01-01

    Computer program implements thermochemical model of processing of carbon-fiber/phenolic-matrix composite materials into molded parts of various sizes and shapes. Directed toward improving fabrication of rocket-engine-nozzle parts, also used to optimize fabrication of other structural components, and material-property parameters changed to apply to other materials. Reduces costs by reducing amount of laboratory trial and error needed to optimize curing processes and to predict properties of cured parts.

  7. Improvements to Fidelity, Generation and Implementation of Physics-Based Lithium-Ion Reduced-Order Models

    NASA Astrophysics Data System (ADS)

    Rodriguez Marco, Albert

Battery management systems (BMS) require computationally simple but highly accurate models of the battery cells they are monitoring and controlling. Historically, empirical equivalent-circuit models have been used, but increasingly researchers are focusing their attention on physics-based models due to their greater predictive capabilities. These models are of high intrinsic computational complexity and so must undergo some kind of order-reduction process to make their use by a BMS feasible: we favor methods based on a transfer-function approach to battery cell dynamics. In prior works, transfer functions have been found from full-order PDE models via two simplifying assumptions: (1) a linearization assumption, which is a fundamental necessity in order to make transfer functions, and (2) an assumption made out of expedience that decouples the electrolyte-potential and electrolyte-concentration PDEs in order to render an approach to solve for the transfer functions from the PDEs. This dissertation improves the fidelity of physics-based models by eliminating the need for the second assumption and by linearizing nonlinear dynamics around different constant currents. Electrochemical transfer functions are infinite-order and cannot be expressed as a ratio of polynomials in the Laplace variable s. Thus, for practical use, these systems need to be approximated using reduced-order models that capture the most significant dynamics. This dissertation improves the generation of physics-based reduced-order models by introducing different realization algorithms, which produce a low-order model from the infinite-order electrochemical transfer functions. Physics-based reduced-order models are linear and describe cell dynamics if operated near the setpoint at which they have been generated. Hence, multiple physics-based reduced-order models need to be generated at different setpoints (i.e., state-of-charge, temperature and C-rate) in order to extend the cell operating range. This dissertation improves the implementation of physics-based reduced-order models by introducing different blending approaches that combine the pre-computed models generated (offline) at different setpoints in order to produce good electrochemical estimates (online) along the cell state-of-charge, temperature and C-rate range.

  8. Workplace Lactation Programs in Small WIC Service Sites: A Potential Model.

    PubMed

    Angeletti, Michelle A; Llossas, Jose R

    2018-03-01

The Special Supplemental Nutrition Program for Women, Infants, and Children (WIC) has an opportunity to protect, promote, and support breastfeeding by implementing and modeling workplace lactation programs in small WIC agencies that may have barriers regarding the lack of both human and financial resources. The goal of this article was to describe effective strategies for agency administrators in small WIC service sites so that they can reduce barriers, successfully implement workplace lactation policies and programs, and model successful strategies for other small employers.

  9. Verification of BOUT++ by the method of manufactured solutions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dudson, B. D., E-mail: benjamin.dudson@york.ac.uk; Hill, P.; Madsen, J.

    2016-06-15

BOUT++ is a software package designed for solving plasma fluid models. It has been used to simulate a wide range of plasma phenomena ranging from linear stability analysis to 3D plasma turbulence and is capable of simulating a wide range of drift-reduced plasma fluid and gyro-fluid models. A verification exercise has been performed as part of a EUROfusion Enabling Research project, to rigorously test the correctness of the algorithms implemented in BOUT++, by testing order-of-accuracy convergence rates using the Method of Manufactured Solutions (MMS). We present tests of individual components including time-integration and advection schemes, non-orthogonal toroidal field-aligned coordinate systems and the shifted metric procedure which is used to handle highly sheared grids. The flux coordinate independent approach to differencing along magnetic field-lines has been implemented in BOUT++ and is here verified using the MMS in a sheared slab configuration. Finally, we show tests of three complete models: 2-field Hasegawa-Wakatani in 2D slab, 3-field reduced magnetohydrodynamics (MHD) in 3D field-aligned toroidal coordinates, and 5-field reduced MHD in slab geometry.
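    The core of an MMS verification is checking the observed order of accuracy against the scheme's formal order. The stand-alone sketch below applies the idea to a 1D advection equation with a manufactured solution and a first-order upwind scheme; it is illustrative only and not part of BOUT++.

```python
import numpy as np

def mms_error(n, c=1.0, t_end=0.5):
    """First-order upwind solve of u_t + c u_x = S on a periodic domain,
    with S chosen so that u_m(x, t) = exp(-t) sin(x) is the exact solution."""
    x = np.linspace(0.0, 2 * np.pi, n, endpoint=False)
    dx = x[1] - x[0]
    dt = 0.4 * dx / c
    u_m = lambda t: np.exp(-t) * np.sin(x)                       # manufactured solution
    source = lambda t: np.exp(-t) * (c * np.cos(x) - np.sin(x))  # u_t + c u_x of u_m
    u = u_m(0.0)
    t = 0.0
    while t < t_end - 1e-12:
        step = min(dt, t_end - t)
        u = u - c * step / dx * (u - np.roll(u, 1)) + step * source(t)  # upwind + Euler
        t += step
    return np.max(np.abs(u - u_m(t_end)))

errs = [mms_error(n) for n in (64, 128, 256, 512)]
orders = [float(np.log2(e0 / e1)) for e0, e1 in zip(errs, errs[1:])]
print("observed orders of accuracy:", orders)  # should approach 1 for this scheme
```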

  10. Municipal bylaw to reduce cosmetic/non-essential pesticide use on household lawns - a policy implementation evaluation

    PubMed Central

    2011-01-01

    Background Pesticide use on urban lawns and gardens contributes to environmental contamination and human exposure. Municipal policies to restrict use and educate households on viable alternatives deserve study. We describe the development and implementation of a cosmetic/non-essential pesticide bylaw by a municipal health department in Toronto, Ontario, Canada and assess changes in resident practices associated with bylaw implementation. Methods Implementation indicators built on a logic model and were elaborated through key informant interviews. Bylaw impacts on awareness and practice changes were documented through telephone surveys administered seasonally pre, during and post implementation (2003-2008). Multivariable logistic regression models assessed associations of demographic variables and gardening season with respondent awareness and practices. Results Implementation indicators documented multiple municipal health department activities and public involvement in complaints from commencement of the educational phase. During the enforcement phases only 40 warning letters and 7 convictions were needed. The number of lawn care companies increased. Among survey respondents, awareness of the bylaw and the Natural Lawn campaign reached 69% and 76% respectively by 2008. Substantial decreases in the proportion of households applying pesticides (25 to 11%) or hiring lawn care companies for application (15 to 5%) occurred. Parallel absolute increases in use of natural lawn care methods occurred among households themselves (21%) and companies they contracted (7%). Conclusions Bylaws or ordinances implemented through education and enforcement are a viable policy option for reducing urban cosmetic pesticide use. PMID:21867501

  11. A tailored implementation strategy to reduce the duration of intravenous antibiotic treatment in community-acquired pneumonia: a controlled before-and-after study.

    PubMed

    Engel, M F; Bruns, A H W; Hulscher, M E J L; Gaillard, C A J M; Sankatsing, S U C; Teding van Berkhout, F; Emmelot-Vonk, M H; Kuck, E M; Steeghs, M H M; den Breeijen, J H; Stellato, R K; Hoepelman, A I M; Oosterheert, J J

    2014-11-01

We previously showed that 40% of clinically stable patients hospitalised for community-acquired pneumonia (CAP) are not switched to oral therapy in a timely fashion because of physicians' barriers. We aimed to decrease this proportion by implementing a novel protocol. In a multi-centre controlled before-and-after study, we evaluated the effect of an implementation strategy tailored to previously identified barriers to an early switch. In three Dutch hospitals, a protocol dictating a timely switch strategy was implemented using educational sessions, pocket reminders and active involvement of nursing staff. Primary outcomes were the proportion of patients switched on time and the duration of intravenous antibiotic therapy. Length of hospital stay (LOS), patient outcome, education effects 6 months after implementation and implementation costs were secondary outcomes. Statistical analysis was performed using mixed-effects models. Prior to implementation, 146 patients were included and, after implementation, 213 patients were included. The case mix was comparable. The implementation did not change the proportion of patients switched on time (66%). The median duration of intravenous antibiotic administration decreased from 4 days [interquartile range (IQR) 2-5] to 3 days (IQR 2-4), a decrease of 21% [95% confidence interval (CI) 11%; 30%] in the multi-variable analysis. LOS and patient outcome were comparable before and after implementation. Forty-three percent (56/129) of physicians attended the educational sessions. After 6 months, 24% (10/42) of the interviewed attendees remembered the protocol's main message. Cumulative implementation costs were 5,798 (20 per reduced intravenous treatment day). An implementation strategy tailored to previously identified barriers reduced the duration of intravenous antibiotic administration in hospitalised CAP patients by 1 day, at minimal cost.

  12. Cost-effectiveness of a quality improvement programme to reduce central line-associated bloodstream infections in intensive care units in the USA.

    PubMed

    Herzer, Kurt R; Niessen, Louis; Constenla, Dagna O; Ward, William J; Pronovost, Peter J

    2014-09-25

To assess the cost-effectiveness of a multifaceted quality improvement programme focused on reducing central line-associated bloodstream infections in intensive care units. Cost-effectiveness analysis using a decision tree model to compare programme to non-programme intensive care units. USA. Adult patients in the intensive care unit. Economic costs of the programme and of central line-associated bloodstream infections were estimated from the perspective of the hospital and presented in 2013 US dollars. Central line-associated bloodstream infections prevented, deaths averted due to central line-associated bloodstream infections prevented, and incremental cost-effectiveness ratios. Probabilistic sensitivity analysis was performed. Compared with current practice, the programme is strongly dominant and reduces bloodstream infections and deaths at no additional cost. The probabilistic sensitivity analysis showed that there was an almost 80% probability that the programme reduces bloodstream infections and the infections' economic costs to hospitals. The opportunity cost of a bloodstream infection to a hospital was the most important model parameter in these analyses. This multifaceted quality improvement programme, as it is currently implemented by hospitals on an increasingly large scale in the USA, likely reduces the economic costs of central line-associated bloodstream infections for US hospitals. Awareness among hospitals about the programme's benefits should enhance implementation. The programme's implementation has the potential to substantially reduce morbidity, mortality and economic costs associated with central line-associated bloodstream infections.

  13. Development of an Aeroelastic Analysis Including a Viscous Flow Model

    NASA Technical Reports Server (NTRS)

    Keith, Theo G., Jr.; Bakhle, Milind A.

    2001-01-01

Under this grant, Version 4 of the three-dimensional Navier-Stokes aeroelastic code (TURBO-AE) has been developed and verified. The TURBO-AE Version 4 aeroelastic code allows flutter calculations for a fan, compressor, or turbine blade row. This code models a vibrating three-dimensional bladed disk configuration and the associated unsteady flow (including shocks and viscous effects) to calculate the aeroelastic instability using a work-per-cycle approach. Phase-lagged (time-shift) periodic boundary conditions are used to model the phase lag between adjacent vibrating blades. The direct-store approach is used for this purpose to reduce the computational domain to a single interblade passage. A disk storage option, implemented using direct access files, is available to reduce the large memory requirements of the direct-store approach. Other researchers have implemented 3D inlet/exit boundary conditions based on eigen-analysis. Appendix A: Aeroelastic calculations based on three-dimensional Euler analysis. Appendix B: Unsteady aerodynamic modeling of blade vibration using the turbo-V3.1 code.

  14. Prioritizing watersheds for conservation actions in the southeastern coastal plain ecoregion.

    PubMed

    Jang, Taeil; Vellidis, George; Kurkalova, Lyubov A; Boll, Jan; Hyman, Jeffrey B

    2015-03-01

The aim of this study was to apply and evaluate a recently developed prioritization model which uses the synoptic approach to geographically prioritize watersheds in which Best Management Practices (BMPs) can be implemented to reduce water quality problems resulting from erosion and sedimentation. The model uses a benefit-cost framework to rank candidate watersheds within an ecoregion or river basin so that BMP implementation within the highest ranked watersheds will result in the most water quality improvement per conservation dollar invested. The model was developed to prioritize BMP implementation efforts in ecoregions containing watersheds associated with the USDA-NRCS Conservation Effects Assessment Project (CEAP). We applied the model to HUC-8 watersheds within the southeastern Coastal Plain ecoregion (USA), not only because it is an important agricultural area but also because it contains a well-studied medium-sized CEAP watershed which is thought to be representative of the ecoregion. The results showed that the three HUC-8 watersheds with the highest rankings (most water quality improvement expected per conservation dollar invested) were located in southern Alabama, northern Florida, and eastern Virginia. Within these watersheds, measures of community attitudes toward conservation practices were highly ranked, and these indicators seemed to push the watersheds to the top of the rankings above other similar watersheds. The results, visualized as maps, can be used to screen and reduce the number of watersheds that need further assessment by managers and decision-makers within the study area. We anticipate that this model will allow agencies like USDA-NRCS to geographically prioritize BMP implementation efforts.
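    The ranking step amounts to scoring each candidate watershed by expected benefit per conservation dollar and sorting. The sketch below shows that bookkeeping with hypothetical HUC-8 identifiers, load reductions, and costs; it omits the synoptic indicators (e.g., community attitudes) that the actual model combines.

```python
# Toy benefit-cost ranking of candidate watersheds. All identifiers and
# numbers are hypothetical placeholders, not values from the study.
watersheds = [
    {"huc8": "03130004", "load_reduction_kg": 120_000, "cost_usd": 1_500_000},
    {"huc8": "03110202", "load_reduction_kg": 95_000,  "cost_usd": 800_000},
    {"huc8": "02080208", "load_reduction_kg": 60_000,  "cost_usd": 400_000},
]

def benefit_cost_ratio(ws):
    """Expected pollutant load reduction per dollar of conservation spending."""
    return ws["load_reduction_kg"] / ws["cost_usd"]

ranked = sorted(watersheds, key=benefit_cost_ratio, reverse=True)
for rank, ws in enumerate(ranked, start=1):
    print(rank, ws["huc8"], f"{benefit_cost_ratio(ws):.3f} kg/$")
```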

  15. A Practical Approach to Implementing Real-Time Semantics

    NASA Technical Reports Server (NTRS)

    Luettgen, Gerald; Bhat, Girish; Cleaveland, Rance

    1999-01-01

This paper investigates implementations of process algebras which are suitable for modeling concurrent real-time systems. It suggests an approach for efficiently implementing real-time semantics using dynamic priorities. For this purpose a process algebra with dynamic priority is defined, whose semantics corresponds one-to-one to traditional real-time semantics. The advantage of the dynamic-priority approach is that it drastically reduces the state-space sizes of the systems in question while preserving all properties of their functional and real-time behavior. The utility of the technique is demonstrated by a case study which deals with the formal modeling and verification of the SCSI-2 bus protocol. The case study is carried out in the Concurrency Workbench of North Carolina, an automated verification tool in which the process algebra with dynamic priority is implemented. It turns out that the state space of the bus-protocol model is about an order of magnitude smaller than the one resulting from real-time semantics. The accuracy of the model is proved by applying model checking to verify several mandatory properties of the bus protocol.

  16. A Reduced Duty Hours Model for Senior Internal Medicine Residents: A Qualitative Analysis of Residents' Experiences and Perceptions.

    PubMed

    Mathew, Rebecca; Gundy, Serena; Ulic, Diana; Haider, Shariq; Wasi, Parveen

    2016-09-01

    To assess senior internal medicine residents' experience of the implementation of a reduced duty hours model with night float, the transition from the prior 26-hour call system, and the new model's effects on resident quality of life and perceived patient safety in the emergency department and clinical teaching unit at McMaster University. Qualitative data were collected during May 2013-July 2014, through resident focus groups held prior to implementation of a reduced duty hours model and 10 to 12 months postimplementation. Data analysis was guided by a constructivist grounded theory based in a relativist paradigm. Transcripts were coded; codes were collapsed into themes. Thematic analysis revealed five themes. Residents described reduced fatigue in the early morning, counterbalanced with worsened long-term fatigue on night float blocks; anticipation of negative impacts of the loss of distributed on-call experience and on-call shift volume; an urgency to sleep postcall in anticipation of consecutive night float shifts accompanied by conflicting role demands to stay postcall for care continuity; increased handover frequency accompanied by inaccurate/incomplete communication of patients' issues; and improvement in the senior resident experience on the clinical teaching unit, with increased ownership over patient care and improved relationships with junior housestaff. A reduced duty hours model with night float has potential to improve residents' perceived fatigue on call and care continuity on the clinical teaching unit. This must be weighed against increased handover frequency and loss of the postcall day, which may negatively affect patient care and resident quality of life.

  17. Using Probabilistic Terrorism Risk Modeling for Regulatory Benefit-Cost Analysis. Application to the Western Hemisphere Travel Initiative Implemented in the Land Environment

    DTIC Science & Technology

    2007-05-01

Dixon and Stern, 2004), and gun violence prevention programs (Tita et al., 2003). As DHS considers promulgating regulations and implementing new...communication 2/21/07. Tita, G., K. J. Riley, G. Ridgeway, C. A. Grammich, A. Abrahamse, and P. W. Greenwood (2003), Reducing Gun Violence: Results

  18. New Jersey Grant Program To Reduce Student Disruption in Schools: Award Recipients.

    ERIC Educational Resources Information Center

    New Jersey State Dept. of Education, Trenton. Div. of General Academic Education.

    New Jersey's $1 million Grant Program to Reduce Student Disruption in Schools is intended to provide resources to individual school districts or groups of cooperating districts for developing and implementing programs for chronically disruptive students, and thereby to identify models to make available to other districts throughout the state. Out…

  19. Reduction in Mortality Following Pediatric Rapid Response Team Implementation.

    PubMed

    Kolovos, Nikoleta S; Gill, Jeff; Michelson, Peter H; Doctor, Allan; Hartman, Mary E

    2018-05-01

To evaluate the effectiveness of a physician-led rapid response team program on morbidity and mortality following unplanned admission to the PICU. Before-after study. Single-center quaternary-referral PICU. All unplanned PICU admissions from the ward from 2005 to 2011. The dataset was divided into pre- and post-rapid response team groups for comparison. A Cox proportional hazards model was used to identify the patient characteristics associated with mortality following unplanned PICU admission. Following rapid response team implementation, Pediatric Risk of Mortality, version 3, illness severity was reduced (28.7%), PICU length of stay was shorter (19.0%), and mortality declined (22%). The relative risk of death following unplanned admission to the PICU after rapid response team implementation was 0.685. For children requiring unplanned admission to the PICU, rapid response team implementation is associated with reduced mortality, admission severity of illness, and length of stay. Rapid response team implementation led to more proximal capture and aggressive intervention in the trajectory of a decompensating pediatric ward patient.

  20. Development of the Symbolic Manipulator Laboratory modeling package for the kinematic design and optimization of the Future Armor Rearm System robot. Ammunition Logistics Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    March-Leuba, S.; Jansen, J.F.; Kress, R.L.

    1992-08-01

A new program package, Symbolic Manipulator Laboratory (SML), for the automatic generation of both kinematic and static manipulator models in symbolic form is presented. Critical design parameters may be identified and optimized using symbolic models, as shown in the sample application presented for the Future Armor Rearm System (FARS) arm. The computer-aided development of the symbolic models yields equations with reduced numerical complexity. Important consideration has been given to closed-form solution simplification and to user-friendly operation. The main emphasis of this research is the development of a methodology, implemented in a computer program capable of generating symbolic kinematic and static force models of manipulators. The fact that the models are obtained in trigonometrically reduced form is among the most significant results of this work and the most difficult to implement. Mathematica, a commercial program that allows symbolic manipulation, is used to implement the program package. SML is written such that the user can change any of the subroutines or create new ones easily. To assist the user, on-line help has been written to make SML a user-friendly package. Some sample applications are presented. The design and optimization of the 5-degrees-of-freedom (DOF) FARS manipulator using SML is discussed. Finally, the kinematic and static models of two different 7-DOF manipulators are calculated symbolically.
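    As a small analogue of symbolic model generation (in SymPy rather than Mathematica), the sketch below derives the forward kinematics and Jacobian of a planar 2-DOF arm and applies trigonometric simplification. It illustrates the kind of trigonometrically reduced output SML targets, not SML itself.

```python
import sympy as sp

# Symbolic forward kinematics of a planar 2-DOF arm (illustrative analogue only)
q1, q2, l1, l2 = sp.symbols("q1 q2 l1 l2", real=True)

x = l1 * sp.cos(q1) + l2 * sp.cos(q1 + q2)   # end-effector x position
y = l1 * sp.sin(q1) + l2 * sp.sin(q1 + q2)   # end-effector y position

# Jacobian of the end-effector position with respect to the joint angles,
# followed by trigonometric simplification of the resulting expressions
J = sp.Matrix([x, y]).jacobian([q1, q2])
J_simplified = sp.trigsimp(J)
print(J_simplified)
```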

  1. From 'solution shop' model to 'focused factory' in hospital surgery: increasing care value and predictability.

    PubMed

    Cook, David; Thompson, Jeffrey E; Habermann, Elizabeth B; Visscher, Sue L; Dearani, Joseph A; Roger, Veronique L; Borah, Bijan J

    2014-05-01

    The full-service US hospital has been described organizationally as a "solution shop," in which medical problems are assumed to be unstructured and to require expert physicians to determine each course of care. If universally applied, this model contributes to unwarranted variation in care, which leads to lower quality and higher costs. We purposely disrupted the adult cardiac surgical practice that we led at Mayo Clinic, in Rochester, Minnesota, by creating a "focused factory" model (characterized by a uniform approach to delivering a limited set of high-quality products) within the practice's solution shop. Key elements of implementing the new model were mapping the care process, segmenting the patient population, using information technology to communicate clearly defined expectations, and empowering nonphysician providers at the bedside. Using a set of criteria, we determined that the focused-factory model was appropriate for 67 percent of cardiac surgical patients. We found that implementation of the model reduced resource use, length-of-stay, and cost. Variation was markedly reduced, and outcomes were improved. Assigning patients to different care models increases care value and the predictability of care process, outcomes, and costs while preserving (in a lesser clinical footprint) the strengths of the solution shop. We conclude that creating a focused-factory model within a solution shop, by applying industrial engineering principles and health information technology tools and changing the model of work, is very effective in both improving quality and reducing costs.

  2. Modeling and simulation of ocean wave propagation using lattice Boltzmann method

    NASA Astrophysics Data System (ADS)

    Nuraiman, Dian

    2017-10-01

In this paper, we present the modeling and simulation of ocean wave propagation from the deep sea to the shoreline. Simulating such a large domain incurs a high computational cost. We propose to couple a 1D shallow water equations (SWE) model with a 2D incompressible Navier-Stokes equations (NSE) model in order to reduce the computational cost. The coupled model is solved using the lattice Boltzmann method (LBM) with the lattice Bhatnagar-Gross-Krook (BGK) scheme. Additionally, a special method is implemented to treat the complex behavior of the free surface close to the shoreline. The results show that the coupled model reduces the computational cost significantly compared to the full NSE model.
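    To illustrate the collide-and-stream structure of the BGK lattice Boltzmann method referenced above, the sketch below solves a 1D diffusion problem on a D1Q3 lattice. It is a minimal stand-in; the paper's coupled shallow-water/Navier-Stokes model and its free-surface treatment are not reproduced.

```python
import numpy as np

def lbm_diffusion(nx=200, nt=2000, tau=0.8):
    """D1Q3 lattice Boltzmann with the BGK collision operator for 1D diffusion.

    A minimal collide-and-stream illustration only; the effective diffusivity
    is (tau - 0.5) / 3 in lattice units.
    """
    w = np.array([2 / 3, 1 / 6, 1 / 6])   # weights for lattice velocities {0, +1, -1}
    rho = np.exp(-((np.arange(nx) - nx / 2) ** 2) / 50.0)  # initial density pulse
    f = w[:, None] * rho                   # start from the equilibrium distribution
    for _ in range(nt):
        rho = f.sum(axis=0)                # macroscopic density
        feq = w[:, None] * rho             # equilibrium distribution
        f += (feq - f) / tau               # BGK collision step
        f[1] = np.roll(f[1], 1)            # stream the +1 population to the right
        f[2] = np.roll(f[2], -1)           # stream the -1 population to the left
    return rho

density = lbm_diffusion()
print(density.max(), density.sum())        # the pulse spreads; total mass is conserved
```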

  3. Reduced complexity structural modeling for automated airframe synthesis

    NASA Technical Reports Server (NTRS)

    Hajela, Prabhat

    1987-01-01

    A procedure is developed for the optimum sizing of wing structures based on representing the built-up finite element assembly of the structure by equivalent beam models. The reduced-order beam models are computationally less demanding in an optimum design environment which dictates repetitive analysis of several trial designs. The design procedure is implemented in a computer program requiring geometry and loading information to create the wing finite element model and its equivalent beam model, and providing a rapid estimate of the optimum weight obtained from a fully stressed design approach applied to the beam. The synthesis procedure is demonstrated for representative conventional-cantilever and joined wing configurations.
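    The fully stressed design step mentioned above is a simple resizing iteration: scale each member so its stress approaches the allowable. The sketch below shows that loop with a placeholder axial-stress model and hypothetical numbers; the equivalent-beam analysis that supplies the stresses in the report is not reproduced.

```python
import numpy as np

# Fully stressed design (FSD) resizing sketch. Member forces, the allowable
# stress, and the minimum gauge below are hypothetical; a real analysis would
# recompute forces from the (reduced-order) structural model at each iteration.
forces = np.array([12e3, -8e3, 5e3])   # member axial forces, N
areas = np.full(3, 5e-4)               # initial cross-sectional areas, m^2
sigma_allow = 150e6                    # allowable stress, Pa
min_area = 1e-5                        # minimum gauge constraint, m^2

for _ in range(20):
    stresses = np.abs(forces) / areas
    # Scale each area by its stress ratio so the member tends toward sigma_allow
    areas = np.maximum(areas * stresses / sigma_allow, min_area)

print("final areas [m^2]:", areas)
print("final stress ratios:", np.abs(forces) / areas / sigma_allow)
```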

  4. The US Air Force Suicide Prevention Program: Implications for Public Health Policy

    PubMed Central

    Pflanz, Steven; Talcott, Gerald W.; Campise, Rick L.; Lavigne, Jill E.; Bajorska, Alina; Tu, Xin; Caine, Eric D.

    2010-01-01

    Objectives. We evaluated the effectiveness of the US Air Force Suicide Prevention Program (AFSPP) in reducing suicide, and we measured the extent to which air force installations implemented the program. Methods. We determined the AFSPP's impact on suicide rates in the air force by applying an intervention regression model to data from 1981 through 2008, providing 16 years of data before the program's 1997 launch and 11 years of data after launch. Also, we measured implementation of program components at 2 points in time: during a 2004 increase in suicide rates, and 2 years afterward. Results. Suicide rates in the air force were significantly lower after the AFSPP was launched than before, except during 2004. We also determined that the program was being implemented less rigorously in 2004. Conclusions. The AFSPP effectively prevented suicides in the US Air Force. The long-term effectiveness of this program depends upon extensive implementation and effective monitoring of implementation. Suicides can be reduced through a multilayered, overlapping approach that encompasses key prevention domains and tracks implementation of program activities. PMID:20466973
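    An intervention regression of the kind described can be sketched as an interrupted time-series fit with a level-shift indicator switching on at the program launch year. The example below uses simulated annual rates and statsmodels; it is not the AFSPP data or the authors' exact model specification.

```python
import numpy as np
import statsmodels.api as sm

# Interrupted time-series sketch on simulated annual rates (not the AFSPP data)
rng = np.random.default_rng(42)
years = np.arange(1981, 2009)
post = (years >= 1997).astype(float)                         # program launch indicator
rate = 14.0 - 3.0 * post + rng.normal(0, 1.0, years.size)    # simulated outcome rates

# Regressors: intercept, linear time trend, and the post-launch level shift
X = sm.add_constant(np.column_stack([years - years.min(), post]))
model = sm.OLS(rate, X).fit()
print(model.summary())   # the coefficient on the post indicator estimates the level change
```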

  5. Evaluation of reduced point charge models of proteins through Molecular Dynamics simulations: application to the Vps27 UIM-1-Ubiquitin complex.

    PubMed

    Leherte, Laurence; Vercauteren, Daniel P

    2014-02-01

Reduced point charge models of amino acids are designed (i) from local extrema positions in charge density distribution functions built from the Poisson equation applied to smoothed molecular electrostatic potential (MEP) functions, and (ii) from local maxima positions in promolecular electron density distribution functions. Corresponding charge values are fitted versus all-atom Amber99 MEPs. To easily generate reduced point charge models for protein structures, libraries of amino acid templates are built. The program GROMACS is used to generate stable Molecular Dynamics trajectories of an Ubiquitin-ligand complex (PDB: 1Q0W) under various implementation schemes, solvation, and temperature conditions. Point charges that are not located on atoms are considered as virtual sites with a null mass and radius. The results illustrate how the intra- and inter-molecular H-bond interactions are affected by the degree of reduction of the point charge models and give directions for their implementation; particular attention to the atoms selected to locate the virtual sites and to the Coulomb-14 interactions is needed. Results obtained at various temperatures suggest that the use of reduced point charge models allows probing of local potential hyper-surface minima that are similar to the all-atom ones but are characterized by lower energy barriers. It makes it possible to generate various conformations of the protein complex more rapidly than the all-atom point charge representation.

  6. SLS Navigation Model-Based Design Approach

    NASA Technical Reports Server (NTRS)

    Oliver, T. Emerson; Anzalone, Evan; Geohagan, Kevin; Bernard, Bill; Park, Thomas

    2018-01-01

    The SLS Program chose to implement a Model-based Design and Model-based Requirements approach for managing component design information and system requirements. This approach differs from previous large-scale design efforts at Marshall Space Flight Center where design documentation alone conveyed information required for vehicle design and analysis and where extensive requirements sets were used to scope and constrain the design. The SLS Navigation Team has been responsible for the Program-controlled Design Math Models (DMMs) which describe and represent the performance of the Inertial Navigation System (INS) and the Rate Gyro Assemblies (RGAs) used by Guidance, Navigation, and Controls (GN&C). The SLS Navigation Team is also responsible for the navigation algorithms. The navigation algorithms are delivered for implementation on the flight hardware as a DMM. For the SLS Block 1-B design, the additional GPS Receiver hardware is managed as a DMM at the vehicle design level. This paper provides a discussion of the processes and methods used to engineer, design, and coordinate engineering trades and performance assessments using SLS practices as applied to the GN&C system, with a particular focus on the Navigation components. These include composing system requirements, requirements verification, model development, model verification and validation, and modeling and analysis approaches. The Model-based Design and Requirements approach does not reduce the effort associated with the design process versus previous processes used at Marshall Space Flight Center. Instead, the approach takes advantage of overlap between the requirements development and management process, and the design and analysis process by efficiently combining the control (i.e. the requirement) and the design mechanisms. The design mechanism is the representation of the component behavior and performance in design and analysis tools. The focus in the early design process shifts from the development and management of design requirements to the development of usable models, model requirements, and model verification and validation efforts. The models themselves are represented in C/C++ code and accompanying data files. Under the idealized process, potential ambiguity in specification is reduced because the model must be implementable versus a requirement which is not necessarily subject to this constraint. Further, the models are shown to emulate the hardware during validation. For models developed by the Navigation Team, a common interface/standalone environment was developed. The common environment allows for easy implementation in design and analysis tools. Mechanisms such as unit test cases ensure implementation as the developer intended. The model verification and validation process provides a very high level of component design insight. The origin and implementation of the SLS variant of Model-based Design is described from the perspective of the SLS Navigation Team. The format of the models and the requirements are described. The Model-based Design approach has many benefits but is not without potential complications. Key lessons learned associated with the implementation of the Model Based Design approach and process from infancy to verification and certification are discussed

  7. The Effect of a Professional Development Classroom Management Model on At-Risk Elementary Students' Misbehaviors

    ERIC Educational Resources Information Center

    Reglin, Gary; Akpo-Sanni, Joretta; Losike-Sedimo, Nonofo

    2012-01-01

    The problem in the study was that at-risk elementary school students had too many classroom disruptive behaviors. The purpose was to investigate the effect a Professional Development Classroom Management Model would have on reducing these students' misbehaviors. The study implemented a classroom management model to improve the classroom management…

  8. SLMRACE: a noise-free RACE implementation with reduced computational time

    NASA Astrophysics Data System (ADS)

    Chauvin, Juliet; Provenzi, Edoardo

    2017-05-01

    We present a faster and noise-free implementation of the RACE algorithm. RACE has mixed characteristics between the famous Retinex model of Land and McCann and the automatic color equalization (ACE) color-correction algorithm. The original random spray-based RACE implementation suffers from two main problems: its computational time and the presence of noise. Here, we will show that it is possible to adapt two techniques recently proposed by Banić et al. to the RACE framework in order to drastically decrease the computational time and noise generation. The implementation will be called smart-light-memory-RACE (SLMRACE).

  9. Implementing autonomous clinical nurse specialist prescriptive authority: a competency-based transition model.

    PubMed

    Klein, Tracy Ann

    2012-01-01

    The purpose of this study was to identify and implement a competency-based regulatory model that transitions clinical nurse specialists (CNSs) to autonomous prescriptive authority pursuant to change in state law. Prescriptive authority for CNSs may be optional or restricted under current state law. Implementation of the APRN Consensus Model includes full prescriptive authority for all advanced practice registered nurses. Clinical nurse specialists face barriers to establishing their prescribing authority when laws or practice change. Identification of transition models will assist CNSs who need to add prescriptive authority to their scope of practice. Identification and implementation of a competency-based transition model for expansion of CNS prescriptive authority. By January 1, 2012, 9 CNSs in the state exemplar have completed a practicum and been granted full prescriptive authority including scheduled drug prescribing. No complaints or board actions resulted from the transition to autonomous prescribing. Transition to prescribing may be facilitated through competency-based outcomes including practicum hours as appropriate to the individual CNS nursing specialty. Outcomes from this model can be used to develop and further validate educational and credentialing policies to reduce barriers for CNSs requiring prescriptive authority in other states.

  10. Archetype Model-Driven Development Framework for EHR Web System

    PubMed Central

    Kimura, Eizen; Ishihara, Ken

    2013-01-01

Objectives This article describes the Web application framework for Electronic Health Records (EHRs) we have developed to reduce construction costs for EHR systems. Methods The openEHR project has developed a clinical model driven architecture for future-proof interoperable EHR systems. This project provides the specifications to standardize clinical domain model implementations, upon which the ISO/CEN 13606 standards are based. The reference implementation has been formally described in Eiffel. Moreover, C# and Java implementations have been developed as references. While scripting languages have become more popular in recent years because of their higher efficiency and faster development, they had not been involved in the openEHR implementations. From 2007, we have used the Ruby language and Ruby on Rails (RoR) as an agile development platform to implement EHR systems in conformity with the openEHR specifications. Results We implemented almost all of the specifications, the Archetype Definition Language parser, and an RoR scaffold generator from archetypes. Although some problems have emerged, most of them have been resolved. Conclusions We have provided an agile EHR Web framework that can build up Web systems from archetype models using RoR. The feasibility of the archetype model to provide semantic interoperability of EHRs has been demonstrated, and we have verified that it is suitable for the construction of EHR systems. PMID:24523991

  11. Using a Systematic Conceptual Model for a Process Evaluation of a Middle School Obesity Risk-Reduction Nutrition Curriculum Intervention: Choice, Control & Change

    PubMed Central

    Lee, Heewon; Contento, Isobel R.; Koch, Pamela

    2012-01-01

    Objective To use and review a conceptual model of process evaluation and to examine the implementation of a nutrition education curriculum, Choice, Control & Change, designed to promote dietary and physical activity behaviors that reduce obesity risk. Design A process evaluation study based on a systematic conceptual model. Setting Five middle schools in New York City. Participants 562 students in 20 classes and their science teachers (n=8). Main Outcome Measures Based on the model, teacher professional development, teacher implementation, and student reception were evaluated. Also measured were teacher characteristics, teachers’ curriculum evaluation, and satisfaction with teaching the curriculum. Analysis Descriptive statistics and Spearman’s Rho Correlation for quantitative analysis and content analysis for qualitative data were used. Results Mean score of the teacher professional development evaluation was 4.75 on a 5-point scale. Average teacher implementation rate was 73%, and student reception rate was 69%. Ongoing teacher support was highly valued by teachers. Teachers’ satisfaction with teaching the curriculum was highly correlated with students’ satisfaction (p <.05). Teachers’ perception of amount of student work was negatively correlated with implementation and with student satisfaction (p<.05). Conclusions and implications Use of a systematic conceptual model and comprehensive process measures improves understanding of the implementation process and helps educators to better implement interventions as designed. PMID:23321021

  12. Using a systematic conceptual model for a process evaluation of a middle school obesity risk-reduction nutrition curriculum intervention: choice, control & change.

    PubMed

    Lee, Heewon; Contento, Isobel R; Koch, Pamela

    2013-03-01

To use and review a conceptual model of process evaluation and to examine the implementation of a nutrition education curriculum, Choice, Control & Change, designed to promote dietary and physical activity behaviors that reduce obesity risk. A process evaluation study based on a systematic conceptual model. Five middle schools in New York City. Five hundred sixty-two students in 20 classes and their science teachers (n = 8). Based on the model, teacher professional development, teacher implementation, and student reception were evaluated. Also measured were teacher characteristics, teachers' curriculum evaluation, and satisfaction with teaching the curriculum. Descriptive statistics and Spearman ρ correlation for quantitative analysis and content analysis for qualitative data were used. Mean score of the teacher professional development evaluation was 4.75 on a 5-point scale. Average teacher implementation rate was 73%, and the student reception rate was 69%. Ongoing teacher support was highly valued by teachers. Teacher satisfaction with teaching the curriculum was highly correlated with student satisfaction (P < .05). Teacher perception of amount of student work was negatively correlated with implementation and with student satisfaction (P < .05). Use of a systematic conceptual model and comprehensive process measures improves understanding of the implementation process and helps educators to better implement interventions as designed.

  13. Peace Works: A Program Evaluation Model

    ERIC Educational Resources Information Center

    LeBlanc, Patrice; Lacey, Candace

    2002-01-01

    The purpose of this paper is to describe the model used to evaluate the two-year Allegany Grant between the Peace Education Foundation and the Miami-Dade County Public Schools that focused on reducing school violence through implementing the "Peace Works" program. A mixed method design using simultaneous methodologies was employed. For…

  14. Estimating changes in public health following implementation of hazard analysis and critical control point in the United States broiler slaughter industry.

    PubMed

    Williams, Michael S; Ebel, Eric D

    2012-01-01

    A common approach to reducing microbial contamination has been the implementation of a Hazard Analysis and Critical Control Point (HACCP) program to prevent or reduce contamination during production. One example is the Pathogen Reduction HACCP program implemented by the U.S. Department of Agriculture's Food Safety and Inspection Service (FSIS). This program consisted of a staged implementation between 1996 and 2000 to reduce microbial contamination on meat and poultry products. Of the commodities regulated by FSIS, one of the largest observed reductions was for Salmonella contamination on broiler chicken carcasses. Nevertheless, how this reduction might have influenced the total number of salmonellosis cases in the United States has not been assessed. This study incorporates information from public health surveillance and surveys of the poultry slaughter industry into a model that estimates the number of broiler-related salmonellosis cases through time. The model estimates that, following the 56% reduction in the proportion of contaminated broiler carcasses observed between 1995 and 2000, approximately 190,000 fewer annual salmonellosis cases (attributed to broilers) occurred in 2000 compared with 1995. The uncertainty bounds for this estimate range from approximately 37,000 to 500,000 illnesses. Estimated illnesses prevented, due to the more modest reduction in contamination of 13% between 2000 and 2007, were not statistically significant. An analysis relating the necessary magnitude of change in contamination required for detection via human surveillance also is provided.

  15. Advanced Fluid Reduced Order Models for Compressible Flow.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tezaur, Irina Kalashnikova; Fike, Jeffrey A.; Carlberg, Kevin Thomas

    This report summarizes fiscal year (FY) 2017 progress towards developing and implementing within the SPARC in-house finite volume flow solver advanced fluid reduced order models (ROMs) for compressible captive-carriage flow problems of interest to Sandia National Laboratories for the design and qualification of nuclear weapons components. The proposed projection-based model order reduction (MOR) approach, known as the Proper Orthogonal Decomposition (POD)/Least-Squares Petrov-Galerkin (LSPG) method, can substantially reduce the CPU-time requirement for these simulations, thereby enabling advanced analyses such as uncertainty quantification and design optimization. Following a description of the project objectives and FY17 targets, we overview briefly the POD/LSPG approach to model reduction implemented within SPARC. We then study the viability of these ROMs for long-time predictive simulations in the context of a two-dimensional viscous laminar cavity problem, and describe some FY17 enhancements to the proposed model reduction methodology that led to ROMs with improved predictive capabilities. Also described in this report are some FY17 efforts pursued in parallel to the primary objective of determining whether the ROMs in SPARC are viable for the targeted application. These include the implementation and verification of some higher-order finite volume discretization methods within SPARC (towards using the code to study the viability of ROMs on three-dimensional cavity problems) and a novel structure-preserving constrained POD/LSPG formulation that can improve the accuracy of projection-based reduced order models. We conclude the report by summarizing the key takeaways from our FY17 findings, and providing some perspectives for future work.
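    As orientation for the approach named above, the sketch below shows the core POD/LSPG idea on a toy linear system: build a POD basis from full-order snapshots via the SVD, then solve for the reduced coordinates by minimizing the full-order residual in a least-squares sense. This is an assumption-laden illustration, not the SPARC implementation; the operator `A`, the snapshot set, and all sizes are placeholders.

    ```python
    # Minimal POD/LSPG sketch on a toy linear system (not the SPARC solver).
    import numpy as np
    from scipy.optimize import least_squares

    rng = np.random.default_rng(0)
    n, m, k = 200, 30, 5           # full-order dofs, snapshots, POD modes (toy sizes)

    # Stand-ins for a full-order operator, forcing, and snapshot matrix
    A = np.diag(np.linspace(1.0, 2.0, n)) + 0.01 * rng.standard_normal((n, n))
    b = rng.standard_normal(n)
    snapshots = np.linalg.solve(A, rng.standard_normal((n, m)))

    # POD: leading left singular vectors of the snapshot matrix form the basis
    V = np.linalg.svd(snapshots, full_matrices=False)[0][:, :k]

    def residual(q):
        """Full-order residual evaluated at the reconstructed state V @ q."""
        return A @ (V @ q) - b

    # LSPG: minimize || r(V q) ||_2 over the reduced coordinates q
    sol = least_squares(residual, x0=np.zeros(k))
    x_rom = V @ sol.x
    print("ROM residual norm:", np.linalg.norm(residual(sol.x)))
    ```

    For an unsteady nonlinear solver, the same minimization would be applied to the discrete residual at each time step; this toy steady case only illustrates the projection structure.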

  16. Monte Carlo simulation of photon migration in a cloud computing environment with MapReduce

    PubMed Central

    Pratx, Guillem; Xing, Lei

    2011-01-01

    Monte Carlo simulation is considered the most reliable method for modeling photon migration in heterogeneous media. However, its widespread use is hindered by the high computational cost. The purpose of this work is to report on our implementation of a simple MapReduce method for performing fault-tolerant Monte Carlo computations in a massively-parallel cloud computing environment. We ported the MC321 Monte Carlo package to Hadoop, an open-source MapReduce framework. In this implementation, Map tasks compute photon histories in parallel while a Reduce task scores photon absorption. The distributed implementation was evaluated on a commercial compute cloud. The simulation time was found to be linearly dependent on the number of photons and inversely proportional to the number of nodes. For a cluster size of 240 nodes, the simulation of 100 billion photon histories took 22 min, a 1258 × speed-up compared to the single-threaded Monte Carlo program. The overall computational throughput was 85,178 photon histories per node per second, with a latency of 100 s. The distributed simulation produced the same output as the original implementation and was resilient to hardware failure: the correctness of the simulation was unaffected by the shutdown of 50% of the nodes. PMID:22191916
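    The Map/Reduce split described above can be illustrated with a small, self-contained toy: each "map" task simulates a batch of photon histories and returns the absorbed weight, and the "reduce" step sums the partial tallies. This is a sketch of the pattern only, not the MC321/Hadoop code; the optical coefficients and the termination rule are assumptions chosen to keep the example short.

    ```python
    # Toy Map/Reduce decomposition of a weight-based photon absorption tally.
    import random
    from functools import reduce

    MU_A, MU_S = 0.1, 10.0                      # assumed absorption/scattering coefficients (1/cm)
    ALBEDO = MU_S / (MU_A + MU_S)

    def map_task(seed, n_photons=10_000):
        """Simulate n_photons histories; return total absorbed weight (one Map output)."""
        rng = random.Random(seed)
        absorbed = 0.0
        for _ in range(n_photons):
            weight = 1.0
            while weight > 1e-4:                # roulette-style cutoff (simplified)
                absorbed += weight * (1.0 - ALBEDO)
                weight *= ALBEDO
                if rng.random() < 0.1:          # crude termination to keep the toy short
                    break
        return absorbed

    def reduce_task(a, b):                      # Reduce: score absorption by summation
        return a + b

    partials = [map_task(seed) for seed in range(8)]   # 8 independent "mappers"
    print("total absorbed weight:", reduce(reduce_task, partials))
    ```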

  17. Modeling the effect of shroud contact and friction dampers on the mistuned response of turbopumps

    NASA Technical Reports Server (NTRS)

    Griffin, Jerry H.; Yang, M.-T.

    1994-01-01

    The contract has been revised. Under the revised scope of work a reduced order model has been developed that can be used to predict the steady-state response of mistuned bladed disks. The approach has been implemented in a computer code, LMCC. It is concluded that: the reduced order model displays structural fidelity comparable to that of a finite element model of an entire bladed disk system with significantly improved computational efficiency; and, when the disk is stiff, both the finite element model and LMCC predict significantly more amplitude variation than was predicted by earlier models. This second result may have important practical ramifications, especially in the case of integrally bladed disks.

  18. Two Models to Conduct Nonphysician-led Exercise Stress Testing in Low to Intermediate Risk Patients.

    PubMed

    Scott, Adam; Whitman, Mark; McDonald, Alice; Webster, Meghan; Jenkins, Carly

    2017-03-01

    Exercise stress testing (EST) is a noninvasive procedure that aids the diagnosis and prognosis of a range of cardiac pathologies. Reduced access is recognized as a limiting factor in enabling early access to treatment or safe and appropriate discharge. Increased accessibility can be achieved by utilizing nonphysician health practitioners to supervise tests. To implement nonphysician-led EST in clinical environments, there is a need for the development and administration of feasible and effective models. Via inpatient and outpatient referral, this article aims to present 2 standardized models of care for patients requiring EST for diagnostic and prognostic evaluation of numerous pathologies. An inpatient and outpatient model was implemented at the Royal Brisbane and Women's Hospital and Logan Hospital in Queensland, Australia between July 2013 and December 2015. Tests were performed by 2 cardiac scientists employed by each hospital. All tests were immediately reported by a cardiology advanced trainee registrar or consultant cardiologist. A total of 2095 tests were performed via the 2 models. Overall, 73 had a positive result (3.5%), 120 equivocal (5.7%), 129 inconclusive/submaximal (6.2%), and 1773 negative (85.2%). After further testing, 38 of the patients with positive and equivocal results were diagnosed with flow-limiting coronary artery disease. The remaining patients were resolved as negative through further diagnostic testing or lost to follow up. After implementation of the 2 models, patient flow was improved for earlier discharge, reduced waiting times, or timely identification of possible cardiac pathologies, thereby optimizing patient care.

  19. Process evaluation of a regional public health model to reduce chronic disease through policy and systems changes, Washington State, 2010-2014.

    PubMed

    Walkinshaw, Lina P; Mason, Caitlin; Allen, Claire L; Vu, Thuy; Nandi, Paj; Santiago, Patti Migliore; Hannon, Peggy A

    2015-03-19

    Although the regionalization of public health systems has been well documented in the case of emergency preparedness, there is little literature on the application of regional approaches to other aspects of public health. From 2011 through 2014 the Washington State Department of Health implemented a Community Transformation Grant to support community-level policy and systems changes to decrease chronic disease risk factors and increase access to clinical preventive services. The Department of Health implemented the grant through a regional model, grouping 32 of the state's 35 local health jurisdictions into 5 regions. Our process evaluation identifies the challenges and facilitators to Community Transformation Grant planning and implementation. We conducted 34 key informant interviews with people directly involved in the implementation of the Community Transformation Grant. We interviewed state and local partners, including representatives from each region, the Department of Health, external consultants, and regional partners. We collected data from October 2013 through July 2014. Challenges for planning, building, and implementing a regional model for chronic disease prevention included stakeholder buy-in, regional geography, and communication; facilitators included shared regional history and infrastructure, strong leadership, collaborative relationships, shared vision and goals, sufficient funding, and direct technical assistance and training. Lessons learned in Washington State provide a foundation for other states interested in using a regional approach to reduce chronic disease risk. Policy and systems changes require adequate time, funding, and staffing. States and funders should work closely with local leaders to address these challenges and facilitators.

  20. Acceleration of discrete stochastic biochemical simulation using GPGPU.

    PubMed

    Sumiyoshi, Kei; Hirata, Kazuki; Hiroi, Noriko; Funahashi, Akira

    2015-01-01

    For systems made up of a small number of molecules, such as a biochemical network in a single cell, a simulation requires a stochastic approach, instead of a deterministic approach. The stochastic simulation algorithm (SSA) simulates the stochastic behavior of a spatially homogeneous system. Since stochastic approaches produce different results each time they are used, multiple runs are required in order to obtain statistical results; this results in a large computational cost. We have implemented a parallel method for using SSA to simulate a stochastic model; the method uses a graphics processing unit (GPU), which enables multiple realizations at the same time, and thus reduces the computational time and cost. During the simulation, for the purpose of analysis, each time course is recorded at each time step. A straightforward implementation of this method on a GPU is about 16 times faster than a sequential simulation on a CPU with hybrid parallelization; each of the multiple simulations is run simultaneously, and the computational tasks within each simulation are parallelized. We also implemented an improvement to the memory access and reduced the memory footprint, in order to optimize the computations on the GPU. We also implemented an asynchronous data transfer scheme to accelerate the time course recording function. To analyze the acceleration of our implementation on various sizes of model, we performed SSA simulations on different model sizes and compared these computation times to those for sequential simulations with a CPU. When used with the improved time course recording function, our method was shown to accelerate the SSA simulation by a factor of up to 130.
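    For readers unfamiliar with the algorithm being parallelized, the sketch below is a single realization of Gillespie's direct-method SSA for a toy birth-death system; the GPU approach above runs many such independent realizations concurrently, one per thread, while recording the time course. The rate constants are illustrative, not taken from the paper.

    ```python
    # One SSA (Gillespie direct method) realization of a toy birth-death process.
    import math, random

    def ssa_birth_death(k_birth=10.0, k_death=0.1, x0=0, t_end=50.0, seed=1):
        rng = random.Random(seed)
        t, x, trajectory = 0.0, x0, [(0.0, x0)]
        while t < t_end:
            a1, a2 = k_birth, k_death * x        # reaction propensities
            a0 = a1 + a2
            if a0 == 0.0:
                break
            t += -math.log(rng.random()) / a0    # exponentially distributed waiting time
            x += 1 if rng.random() * a0 < a1 else -1   # choose birth or death
            trajectory.append((t, x))
        return trajectory

    traj = ssa_birth_death()
    print("final (time, count):", traj[-1])
    ```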

  1. Acceleration of discrete stochastic biochemical simulation using GPGPU

    PubMed Central

    Sumiyoshi, Kei; Hirata, Kazuki; Hiroi, Noriko; Funahashi, Akira

    2015-01-01

    For systems made up of a small number of molecules, such as a biochemical network in a single cell, a simulation requires a stochastic approach, instead of a deterministic approach. The stochastic simulation algorithm (SSA) simulates the stochastic behavior of a spatially homogeneous system. Since stochastic approaches produce different results each time they are used, multiple runs are required in order to obtain statistical results; this results in a large computational cost. We have implemented a parallel method for using SSA to simulate a stochastic model; the method uses a graphics processing unit (GPU), which enables multiple realizations at the same time, and thus reduces the computational time and cost. During the simulation, for the purpose of analysis, each time course is recorded at each time step. A straightforward implementation of this method on a GPU is about 16 times faster than a sequential simulation on a CPU with hybrid parallelization; each of the multiple simulations is run simultaneously, and the computational tasks within each simulation are parallelized. We also implemented an improvement to the memory access and reduced the memory footprint, in order to optimize the computations on the GPU. We also implemented an asynchronous data transfer scheme to accelerate the time course recording function. To analyze the acceleration of our implementation on various sizes of model, we performed SSA simulations on different model sizes and compared these computation times to those for sequential simulations with a CPU. When used with the improved time course recording function, our method was shown to accelerate the SSA simulation by a factor of up to 130. PMID:25762936

  2. Simulation of Ultra-Small MOSFETs Using a 2-D Quantum-Corrected Drift-Diffusion Model

    NASA Technical Reports Server (NTRS)

    Biegel, Bryan A.; Rafferty, Conor S.; Yu, Zhiping; Dutton, Robert W.; Ancona, Mario G.; Saini, Subhash (Technical Monitor)

    1998-01-01

    We describe an electronic transport model and an implementation approach that respond to the challenges of device modeling for gigascale integration. We use the density-gradient (DG) transport model, which adds tunneling and quantum smoothing of carrier density profiles to the drift-diffusion model. We present the current implementation of the DG model in PROPHET, a partial differential equation solver developed by Lucent Technologies. This implementation approach permits rapid development and enhancement of models, as well as run-time modifications and model switching. We show that even in typical bulk transport devices such as P-N diodes and BJTs, DG quantum effects can significantly modify the I-V characteristics. Quantum effects are shown to be even more significant in small, surface transport devices, such as sub-0.1 micron MOSFETs. In thin-oxide MOS capacitors, we find that quantum effects may reduce gate capacitance by 25% or more. The inclusion of quantum effects in simulations dramatically improves the match between C-V simulations and measurements. Significant quantum corrections also occur in the I-V characteristics of short-channel MOSFETs due to the gate capacitance correction.

  3. Implementation evaluation of a culturally competent eye injury prevention program for citrus workers in a Florida migrant community.

    PubMed

    Luque, John S; Monaghan, Paul; Contreras, Ricardo B; August, Euna; Baldwin, Julie A; Bryant, Carol A; McDermott, Robert J

    2007-01-01

    The Partnership for Citrus Worker Health (PCWH) is a coalition that connects academic institutions, public health agencies, industry and community-based organizations for implementation of an eye safety pilot project with citrus workers using the Camp Health Aide (CHA) model. This project was an implementation evaluation of an eye safety curriculum using modeling and peer-to-peer education among Mexican migrant citrus workers in a southwest Florida community to increase positive perceptions toward the use of safety eyewear and reduce occupational eye injuries. CHAs have been employed and trained in eye safety and health during harvesting seasons since 2004. Field observations, focus group interviews, and written questionnaires assessed program implementation and initial outcomes. There was an increase in positive perceptions toward use of safety eyewear between 2004 and 2005. Evaluation of training suggested ways to improve the curriculum. The modest literacy level of the CHAs necessitated some redesign of the curriculum and its implementation (e.g., introduction of and more reliance on use of training posters). PCWH benefited by extensive documentation of the training and supervision, a pilot project that demonstrated the potential effectiveness of CHAs, and having a well-defined target population of citrus workers (n = 427). Future research can rigorously test the effectiveness of CHAs in reducing eye injuries among citrus workers.

  4. Improving atomic force microscopy imaging by a direct inverse asymmetric PI hysteresis model.

    PubMed

    Wang, Dong; Yu, Peng; Wang, Feifei; Chan, Ho-Yin; Zhou, Lei; Dong, Zaili; Liu, Lianqing; Li, Wen Jung

    2015-02-03

    A modified Prandtl-Ishlinskii (PI) model, referred to as a direct inverse asymmetric PI (DIAPI) model in this paper, was implemented to reduce the displacement error between a predicted model and the actual trajectory of a piezoelectric actuator which is commonly found in AFM systems. Due to the nonlinearity of the piezoelectric actuator, the standard symmetric PI model cannot precisely describe the asymmetric motion of the actuator. In order to improve the accuracy of AFM scans, two series of slope parameters were introduced in the PI model to describe both the voltage-increase-loop (trace) and voltage-decrease-loop (retrace). A feedforward controller based on the DIAPI model was implemented to compensate hysteresis. Performance of the DIAPI model and the feedforward controller were validated by scanning micro-lenses and standard silicon grating using a custom-built AFM.
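    As context for this model class, the sketch below implements a classical Prandtl-Ishlinskii operator as a weighted superposition of play (backlash) operators; the DIAPI variant described above additionally introduces separate slope parameters for the trace and retrace branches and inverts the operator for feedforward compensation. The thresholds and weights here are illustrative, not identified from the AFM actuator.

    ```python
    # Classical Prandtl-Ishlinskii operator built from play operators (illustrative parameters).
    import numpy as np

    def pi_model(u, thresholds, weights):
        """Superpose weighted play operators along the input history u."""
        y_play = np.zeros(len(thresholds))          # internal play-operator states
        output = np.zeros_like(u)
        for k, uk in enumerate(u):
            # play operator update: y = max(u - r, min(u + r, y_prev))
            y_play = np.maximum(uk - thresholds, np.minimum(uk + thresholds, y_play))
            output[k] = weights @ y_play
        return output

    u = 5.0 * np.sin(np.linspace(0, 4 * np.pi, 400))   # drive voltage (stand-in)
    r = np.linspace(0.0, 4.0, 10)                      # play thresholds (hypothetical)
    w = np.full(10, 0.2)                               # operator weights (hypothetical)
    y = pi_model(u, r, w)                              # hysteretic displacement prediction
    ```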

  5. Update and extension of the Brazil SimSmoke model to estimate the health impact of cigarette smoking by pregnant women in Brazil.

    PubMed

    Szklo, André Salem; Yuan, Zhe; Levy, David

    2017-12-18

    A previous application of the Brazil SimSmoke tobacco control policy simulation model was used to show the effect of policies implemented between 1989 and 2010 on smoking-attributable deaths (SADs). In this study, we updated and further validated the Brazil SimSmoke model to incorporate policies implemented since 2011 (e.g., a new tax structure with the purpose of increasing revenues/real prices). In addition, we extended the model to estimate smoking-attributable maternal and child health outcomes (MCHOs), such as placenta praevia, placental abruption, preterm birth, low birth weight, and sudden infant death syndrome, to show the role of tobacco control in achieving the Millennium Development Goals. Using data on population, births, smoking, policies, and prevalence of MCHOs, the model is used to assess the effect on both premature deaths and MCHOs of tobacco control policies implemented in Brazil in the last 25 years relative to a counterfactual of policies kept at 1989 levels. Smoking prevalence in Brazil has fallen by an additional 17% for males (16%-19%) and 19% for females (14%-24%) between 2011 and 2015. As a result of the policies implemented since 1989, 7.5 million (6.4-8.5) deaths among adults aged 18 years or older are projected to be averted by 2050. Current policies are also estimated to reduce a cumulative total of 0.9 million (0.4-2.4) adverse MCHOs by 2050. Our findings show the benefits of tobacco control in reducing both SADs and smoking-attributable MCHOs at population level. These benefits may be used to better inform policy makers in low and middle income countries about allocating resources towards tobacco control policies in this important area.

  6. Implementation of a reduced order Kalman filter to assimilate ocean color data into a coupled physical-biochemical model of the North Aegean Sea.

    NASA Astrophysics Data System (ADS)

    Kalaroni, Sofia; Tsiaras, Kostas; Economou-Amilli, Athena; Petihakis, George; Politikos, Dimitrios; Triantafyllou, George

    2013-04-01

    Within the framework of the European project OPEC (Operational Ecology), a data assimilation system was implemented to describe chlorophyll-a concentrations of the North Aegean, as well as the impact on the European anchovy (Engraulis encrasicolus) biomass distribution provided by a bioenergetics model, related to the density of three low trophic level functional groups of zooplankton (heterotrophic flagellates, microzooplankton and mesozooplankton). The three-dimensional hydrodynamic-biogeochemical model comprises two on-line coupled sub-models: the Princeton Ocean Model (POM) and the European Regional Seas Ecosystem Model (ERSEM). The assimilation scheme is based on the Singular Evolutive Extended Kalman (SEEK) filter and its variant that uses a fixed correction base (SFEK). For the initialization, the SEEK filter uses a reduced order error covariance matrix provided by the dominant Empirical Orthogonal Functions (EOF) of the model. The assimilation experiments were performed for the year 2003 using SeaWiFS chlorophyll-a data, with the physical model using atmospheric forcing obtained from the regional climate model HIRHAM5. The assimilation system is validated by assessing the relevance of the system in fitting the data, the impact of the assimilation on non-observed biochemical parameters and the overall quality of the forecasts.
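    A schematic of the reduced-rank analysis step in the spirit of the SEEK filter is sketched below: the forecast error covariance is represented in a low-dimensional EOF basis, P ≈ L U Lᵀ, and the Kalman update is computed in that reduced space. All sizes, the observation operator, and the error covariances are toy placeholders, not the POM/ERSEM configuration.

    ```python
    # Reduced-rank (EOF-based) analysis step, schematic only.
    import numpy as np

    rng = np.random.default_rng(0)
    n, r, p = 500, 10, 40                      # state size, EOF modes, observations (toy)

    snapshots = rng.standard_normal((n, 60))   # stand-in for model state snapshots
    anom = snapshots - snapshots.mean(axis=1, keepdims=True)
    L = np.linalg.svd(anom, full_matrices=False)[0][:, :r]   # leading EOFs
    U = np.eye(r)                              # reduced error covariance (initial guess)

    H = np.zeros((p, n))                       # observation operator: sample p state entries
    H[np.arange(p), rng.choice(n, p, replace=False)] = 1.0
    R = 0.1 * np.eye(p)                        # observation error covariance

    x_f = rng.standard_normal(n)               # forecast state
    y = H @ x_f + 0.3 * rng.standard_normal(p) # synthetic chlorophyll-like observations

    # Analysis: x_a = x_f + L U L^T H^T (H L U L^T H^T + R)^{-1} (y - H x_f)
    HL = H @ L
    S = HL @ U @ HL.T + R
    gain = L @ U @ HL.T @ np.linalg.inv(S)
    x_a = x_f + gain @ (y - H @ x_f)
    ```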

  7. Analyzing implementation dynamics using theory-driven evaluation principles: lessons learnt from a South African centralized chronic dispensing model.

    PubMed

    Magadzire, Bvudzai Priscilla; Marchal, Bruno; Mathys, Tania; Laing, Richard O; Ward, Kim

    2017-12-04

    Centralized dispensing of essential medicines is one of South Africa's strategies to address the shortage of pharmacists, reduce patients' waiting times and reduce over-crowding at public sector healthcare facilities. This article reports findings of an evaluation of the Chronic Dispensing Unit (CDU) in one province. The objectives of this process evaluation were to: (1) compare what was planned versus the actual implementation and (2) establish the causal elements and contextual factors influencing implementation. This qualitative study employed key informant interviews with the intervention's implementers (clinicians, managers and the service provider) [N = 40], and a review of policy and program documents. Data were thematically analyzed by identifying the main influences shaping the implementation process. Theory-driven evaluation principles were applied as a theoretical framework to explain implementation dynamics. The overall participants' response about the CDU was positive and the majority of informants concurred that the establishment of the CDU to dispense large volumes of medicines is a beneficial strategy to address healthcare barriers because mechanical functions are automated and distribution of medicines much quicker. However, implementation was influenced by the context and discrepancies between planned activities and actual implementation were noted. Procurement inefficiencies at central level caused medicine stock-outs and affected CDU activities. At the frontline, actors were aware of the CDU's implementation guidelines regarding patient selection, prescription validity and management of non-collected medicines but these were adapted to accommodate practical realities and to meet performance targets attached to the intervention. Implementation success was a result of a combination of 'hardware' (e.g. training, policies, implementation support and appropriate infrastructure) and 'software' (e.g. ownership, cooperation between healthcare practitioners and trust) factors. This study shows that health system interventions have unpredictable paths of implementation. Discrepancies between planned and actual implementation reinforce findings in existing literature suggesting that while tools and defined operating procedures are necessary for any intervention, their successful application depends crucially on the context and environment in which implementation occurs. We anticipate that this evaluation will stimulate wider thinking about the implementation of similar models in low- and middle-income countries.

  8. A Standard-Based Model for Adaptive E-Learning Platform for Mauritian Academic Institutions

    ERIC Educational Resources Information Center

    Kanaksabee, P.; Odit, M. P.; Ramdoyal, A.

    2011-01-01

    The key aim of this paper is to introduce a standard-based model for an adaptive e-learning platform for Mauritian academic institutions and to investigate the conditions and tools required to implement this model. The main strengths of the system are that it allows collaborative learning and communication among users, and considerably reduces paperwork.…

  9. Using a Systematic Conceptual Model for a Process Evaluation of a Middle School Obesity Risk-Reduction Nutrition Curriculum Intervention: "Choice, Control & Change"

    ERIC Educational Resources Information Center

    Lee, Heewon; Contento, Isobel R.; Koch, Pamela

    2013-01-01

    Objective: To use and review a conceptual model of process evaluation and to examine the implementation of a nutrition education curriculum, "Choice, Control & Change", designed to promote dietary and physical activity behaviors that reduce obesity risk. Design: A process evaluation study based on a systematic conceptual model. Setting: Five…

  10. Large-scale seismic waveform quality metric calculation using Hadoop

    DOE PAGES

    Magana-Zook, Steven; Gaylord, Jessie M.; Knapp, Douglas R.; ...

    2016-05-27

    Here in this work we investigated the suitability of Hadoop MapReduce and Apache Spark for large-scale computation of seismic waveform quality metrics by comparing their performance with that of a traditional distributed implementation. The Incorporated Research Institutions for Seismology (IRIS) Data Management Center (DMC) provided 43 terabytes of broadband waveform data of which 5.1 TB of data were processed with the traditional architecture, and the full 43 TB were processed using MapReduce and Spark. Maximum performance of ~0.56 terabytes per hour was achieved using all 5 nodes of the traditional implementation. We noted that I/O dominated processing, and that I/O performance was deteriorating with the addition of the 5th node. Data collected from this experiment provided the baseline against which the Hadoop results were compared. Next, we processed the full 43 TB dataset using both MapReduce and Apache Spark on our 18-node Hadoop cluster. We conducted these experiments multiple times with various subsets of the data so that we could build models to predict performance as a function of dataset size. We found that both MapReduce and Spark significantly outperformed the traditional reference implementation. At a dataset size of 5.1 terabytes, both Spark and MapReduce were about 15 times faster than the reference implementation. Furthermore, our performance models predict that for a dataset of 350 terabytes, Spark running on a 100-node cluster would be about 265 times faster than the reference implementation. We do not expect that the reference implementation deployed on a 100-node cluster would perform significantly better than on the 5-node cluster because the I/O performance cannot be made to scale. Finally, we note that although Big Data technologies clearly provide a way to process seismic waveform datasets in a high-performance and scalable manner, the technology is still rapidly changing, requires a high degree of investment in personnel, and will likely require significant changes in other parts of our infrastructure. Nevertheless, we anticipate that as the technology matures and third-party tool vendors make it easier to manage and operate clusters, Hadoop (or a successor) will play a large role in our seismic data processing.
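    As a schematic of the Spark side of such a workflow (an assumption-based sketch, not the authors' code), the example below distributes in-memory waveform segments, computes a simple per-segment quality metric, and collects the results; a production pipeline would instead read miniSEED from distributed storage and compute a richer metric set.

    ```python
    # Schematic Spark job computing toy per-segment waveform quality metrics.
    import numpy as np
    from pyspark import SparkContext

    sc = SparkContext(appName="waveform-quality-metrics")

    def make_segment(i):
        """Fabricate one day of 1 Hz samples with a few gaps (stand-in for real waveforms)."""
        rng = np.random.default_rng(i)
        data = rng.standard_normal(86_400)
        data[rng.choice(86_400, 10, replace=False)] = np.nan   # injected gaps
        return (f"STA{i:03d}.BHZ", data)

    segments = sc.parallelize([make_segment(i) for i in range(100)])

    def quality_metrics(record):
        sta, data = record
        gaps = int(np.isnan(data).sum())
        rms = float(np.sqrt(np.nanmean(data ** 2)))
        return (sta, {"rms": rms, "gap_samples": gaps})

    results = segments.map(quality_metrics).collect()
    sc.stop()
    ```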

  11. Large-scale seismic waveform quality metric calculation using Hadoop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Magana-Zook, Steven; Gaylord, Jessie M.; Knapp, Douglas R.

    Here in this work we investigated the suitability of Hadoop MapReduce and Apache Spark for large-scale computation of seismic waveform quality metrics by comparing their performance with that of a traditional distributed implementation. The Incorporated Research Institutions for Seismology (IRIS) Data Management Center (DMC) provided 43 terabytes of broadband waveform data of which 5.1 TB of data were processed with the traditional architecture, and the full 43 TB were processed using MapReduce and Spark. Maximum performance of ~0.56 terabytes per hour was achieved using all 5 nodes of the traditional implementation. We noted that I/O dominated processing, and that I/O performance was deteriorating with the addition of the 5th node. Data collected from this experiment provided the baseline against which the Hadoop results were compared. Next, we processed the full 43 TB dataset using both MapReduce and Apache Spark on our 18-node Hadoop cluster. We conducted these experiments multiple times with various subsets of the data so that we could build models to predict performance as a function of dataset size. We found that both MapReduce and Spark significantly outperformed the traditional reference implementation. At a dataset size of 5.1 terabytes, both Spark and MapReduce were about 15 times faster than the reference implementation. Furthermore, our performance models predict that for a dataset of 350 terabytes, Spark running on a 100-node cluster would be about 265 times faster than the reference implementation. We do not expect that the reference implementation deployed on a 100-node cluster would perform significantly better than on the 5-node cluster because the I/O performance cannot be made to scale. Finally, we note that although Big Data technologies clearly provide a way to process seismic waveform datasets in a high-performance and scalable manner, the technology is still rapidly changing, requires a high degree of investment in personnel, and will likely require significant changes in other parts of our infrastructure. Nevertheless, we anticipate that as the technology matures and third-party tool vendors make it easier to manage and operate clusters, Hadoop (or a successor) will play a large role in our seismic data processing.

  12. Avoidable cost of alcohol abuse in Canada.

    PubMed

    Rehm, Jürgen; Patra, Jayadeep; Gnam, William H; Sarnocinska-Hart, Anna; Popova, Svetlana

    2011-01-01

    To estimate avoidable burden and avoidable costs of alcohol abuse in Canada for the year 2002. A policy effectiveness approach was used. The impact of six effective and cost-effective alcohol policy interventions aimed at reducing alcohol consumption was modeled. In addition, the effect of privatized alcohol sales that would increase alcohol consumption and alcohol-attributable costs was also modeled. The effects of these interventions were compared with the baseline (aggregate) costs obtained from the second Canadian Study of Social Costs Attributable to Substance Abuse. It was estimated that implementing the six cost-effective policies could save from about 900 million to two billion Canadian dollars per year in Canada. The greatest savings due to the implementation of these interventions would be achieved in the lowering of productivity losses, followed by health care and criminality. Substantial increases in burden and cost would occur if Canadian provinces were to privatize alcohol sales. The implementation of proven effective population-based interventions would reduce alcohol-attributable burden and its costs in Canada to a considerable degree. Copyright © 2010 S. Karger AG, Basel.

  13. Damping in Space Constructions

    NASA Astrophysics Data System (ADS)

    de Vreugd, Jan; de Lange, Dorus; Winters, Jasper; Human, Jet; Kamphues, Fred; Tabak, Erik

    2014-06-01

    Monolithic structures are often used in optomechanical designs for space applications to achieve high dimensional stability and to prevent possible backlash and friction phenomena. The capacity of monolithic structures to dissipate mechanical energy is, however, limited due to the high Q-factor, which might result in high stresses during dynamic launch loads like random vibration, sine sweeps and shock. To reduce the Q-factor in space applications, the effect of constrained layer damping (CLD) is investigated in this work. To predict the damping increase, the CLD effect is implemented locally at the supporting struts in an existing FE model of an optical instrument. Numerical simulations show that the effect of local damping treatment in this instrument could reduce the vibrational stresses by 30-50%. Validation experiments on a simple structure showed good agreement between measured and predicted damping properties. This paper presents material characterization, material modeling, numerical implementation of damping models in finite element code, numerical results on space hardware and the results of validation experiments.
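    The reported 30-50% stress reduction is consistent with how random-vibration response scales with the Q-factor; for a lightly damped single mode under a flat input acceleration PSD, Miles' approximation (our illustration, not an equation from the paper) gives:

    ```latex
    % Miles' approximation for the RMS response of a lightly damped single-DOF mode,
    % and the resulting scaling of RMS stress with Q: halving Q cuts stress by roughly 30%.
    \ddot{x}_{\mathrm{RMS}} \approx \sqrt{\tfrac{\pi}{2}\, f_n\, Q\, W(f_n)},
    \qquad
    \frac{\sigma_{\mathrm{RMS},2}}{\sigma_{\mathrm{RMS},1}} \approx \sqrt{\frac{Q_2}{Q_1}} .
    ```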

  14. Implementation of marine halogen chemistry into the Community Multiscale Air Quality (CMAQ) model

    NASA Astrophysics Data System (ADS)

    Gantt, B.; Sarwar, G.

    2017-12-01

    In two recent studies (Sarwar et al., 2015 and Gantt et al., 2017), the impact of marine halogen (bromine and iodine) chemistry on air quality has been evaluated using the Community Multiscale Air Quality (CMAQ) model. We found that marine halogen chemistry not only has the expected effect of reducing marine boundary layer ozone concentrations, but also reduces ozone in the free troposphere and inland from the coast. In Sarwar et al. (2015), the impact of the halogen chemistry without and with photochemical reactions of higher iodine oxides over the Northern Hemisphere was examined using the coarse horizontal grids of a hemispheric domain. Halogen chemistry without and with the photochemical reactions of higher iodine oxides reduces ozone over seawater by 15% and 48%, respectively. Using the results of the chemistry without the photochemical reactions of higher iodine oxides, we developed a simple first-order ozone loss rate and implemented it into the public version of CMAQ v5.2. In Gantt et al. (2017), the impact of the simple first-order loss rate as well as the full halogen chemistry without photochemical reactions of higher iodine oxides over the continental United States was examined using finer horizontal grids of the regional domain and boundary conditions from the hemispheric domain with and without marine halogen chemistry. The boundary conditions obtained with the halogen chemistry as well as the simple halogen chemistry reduce ozone along the coast where CMAQ typically overpredicts the concentrations. Development of halogen chemistry in CMAQ has continued with the implementation of several heterogeneous reactions of bromine and iodine species, revised reactions of higher iodine oxides, and a refined marine halogen emissions inventory. Our latest version of halogen chemistry with photochemical reactions of higher iodine oxides reduces ozone by 23% over seawater. This presentation will discuss the previous and ongoing implementation of revised halogen chemistry in CMAQ and its impacts on air quality.
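    The "simple first-order ozone loss rate" mentioned above has the familiar exponential form (written here for orientation; the rate constant k is whatever effective value was fitted to the full halogen-chemistry runs):

    ```latex
    % First-order loss of ozone over seawater with an effective rate constant k
    \frac{d[\mathrm{O_3}]}{dt} = -k\,[\mathrm{O_3}]
    \quad\Longrightarrow\quad
    [\mathrm{O_3}](t) = [\mathrm{O_3}]_0\, e^{-kt} .
    ```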

  15. Aspect-Oriented Model-Driven Software Product Line Engineering

    NASA Astrophysics Data System (ADS)

    Groher, Iris; Voelter, Markus

    Software product line engineering aims to reduce development time, effort, cost, and complexity by taking advantage of the commonality within a portfolio of similar products. The effectiveness of a software product line approach directly depends on how well feature variability within the portfolio is implemented and managed throughout the development lifecycle, from early analysis through maintenance and evolution. This article presents an approach that facilitates variability implementation, management, and tracing by integrating model-driven and aspect-oriented software development. Features are separated in models and composed using aspect-oriented composition techniques at the model level. Model transformations support the transition from problem to solution space models. Aspect-oriented techniques enable the explicit expression and modularization of variability at the model, template, and code levels. The presented concepts are illustrated with a case study of a home automation system.

  16. Continuous piecewise-linear, reduced-order electrochemical model for lithium-ion batteries in real-time applications

    NASA Astrophysics Data System (ADS)

    Farag, Mohammed; Fleckenstein, Matthias; Habibi, Saeid

    2017-02-01

    Model-order reduction and minimization of the CPU run-time while maintaining the model accuracy are critical requirements for real-time implementation of lithium-ion electrochemical battery models. In this paper, an isothermal, continuous, piecewise-linear, electrode-average model is developed by using an optimal knot placement technique. The proposed model reduces the univariate nonlinear function of the electrode's open circuit potential dependence on the state of charge to continuous piecewise regions. The parameterization experiments were chosen to provide a trade-off between extensive experimental characterization techniques and purely identifying all parameters using optimization techniques. The model is then parameterized in each continuous, piecewise-linear, region. Applying the proposed technique cuts down the CPU run-time by around 20%, compared to the reduced-order, electrode-average model. Finally, the model validation against real-time driving profiles (FTP-72, WLTP) demonstrates the ability of the model to predict the cell voltage accurately with less than 2% error.
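    The continuous piecewise-linear idea can be illustrated with a short sketch: replace a nonlinear open-circuit-voltage curve U(SOC) with linear segments between a small set of knots, so the real-time model only evaluates a cheap interpolation. The OCV function and knot locations below are hypothetical stand-ins, not the identified cell parameters or the paper's optimal knot placement.

    ```python
    # Piecewise-linear surrogate of a nonlinear OCV(SOC) curve (illustrative only).
    import numpy as np

    def ocv_nonlinear(soc):
        """Stand-in for a measured/fitted nonlinear OCV curve (not from the paper)."""
        return 3.0 + 0.7 * soc + 0.15 * np.tanh(10 * (soc - 0.1)) - 0.1 * np.exp(-20 * soc)

    knots = np.array([0.0, 0.05, 0.1, 0.2, 0.4, 0.6, 0.8, 1.0])   # hypothetical knot placement
    ocv_at_knots = ocv_nonlinear(knots)

    def ocv_piecewise_linear(soc):
        """Continuous piecewise-linear surrogate evaluated by linear interpolation."""
        return np.interp(soc, knots, ocv_at_knots)

    soc_grid = np.linspace(0.0, 1.0, 1001)
    max_err = np.max(np.abs(ocv_nonlinear(soc_grid) - ocv_piecewise_linear(soc_grid)))
    print(f"max surrogate error: {max_err:.4f} V")
    ```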

  17. Alternative Models for Large-Group Introductory Earth Science Courses: Dual-Structured Model

    ERIC Educational Resources Information Center

    Carpenter, John R.; And Others

    1978-01-01

    An introductory college course in which both the instructional staff and students have input into the content has been successfully implemented into a spectrum of instructor-centered to student-centered introductory earth science courses. Grading by point accumulation method reduced the grade threat and induced student responsibility for learning.…

  18. A Sexual Assault Primary Prevention Model with Diverse Urban Youth

    ERIC Educational Resources Information Center

    Smothers, Melissa Kraemer; Smothers, D. Brian

    2011-01-01

    In this study, a nonprofit community mental health clinic developed a socioecological model of sexual abuse prevention that was implemented in a public school. The goal of the program was to promote and create community change within individuals and the school community by reducing tolerance of sexual violence and sexual harassment. Participants…

  19. Proposed Modification of a School-Wide Bully Prevention Program to Support All Children

    ERIC Educational Resources Information Center

    Ostrander, Jason; Melville, Alysse; Bryan, Janelle K.; Letendre, Joan

    2018-01-01

    Bullying prevention programs in the United States are being implemented in schools from kindergarten through high school to reduce rates of bullying behaviors. The bully prevention in positive behavior support (PBIS) model is an evidence-based, whole school intervention program. The PBIS model trains teachers, school staff, and administrators to…

  20. Implementing nationally determined contributions: building energy policies in India’s mitigation strategy

    NASA Astrophysics Data System (ADS)

    Yu, Sha; Evans, Meredydd; Kyle, Page; Vu, Linh; Tan, Qing; Gupta, Ashu; Patel, Pralit

    2018-03-01

    The Nationally Determined Contributions are allowing countries to examine options for reducing emissions through a range of domestic policies. India, like many developing countries, has committed to reducing emissions through specific policies, including building energy codes. Here we assess the potential of these sectoral policies to help in achieving mitigation targets. Collectively, it is critically important to see the potential impact of such policies across developing countries in meeting national and global emission goals. Buildings accounted for around one third of global final energy use in 2010, and building energy consumption is expected to increase as income grows in developing countries. Using the Global Change Assessment Model, this study finds that implementing a range of energy efficiency policies robustly can reduce total Indian building energy use by 22% and lower total Indian carbon dioxide emissions by 9% in 2050 compared to the business-as-usual scenario. Among various policies, energy codes for new buildings can result in the most significant savings. For all building energy policies, well-coordinated, consistent implementation is critical, which requires coordination across different departments and agencies, improving capacity of stakeholders, and developing appropriate institutions to facilitate policy implementation.

  1. A new role for the ACNP: the rapid response team leader.

    PubMed

    Morse, Kate J; Warshawsky, Deborah; Moore, Jacqueline M; Pecora, Denise C

    2006-01-01

    The implementation of a rapid response team (RRT) or medical emergency team is 1 of the 6 initiatives of the Institute for Healthcare Improvement's 100,000 Lives Campaign, with the goal of reducing the number of cardiopulmonary arrests outside the intensive care unit and inpatient mortality rates. The RRT concept was pioneered in Australia and is now being implemented in many hospitals across the United States. This article reviews the current literature and describes the implementation of an RRT in a community hospital. The first-quarter data after implementation are described. The unique role of the acute care nurse practitioner in this hospital's model is described.

  2. Reducing hospital readmission through team-based primary care: A 7-week pilot study integrating behavioral health and pharmacy.

    PubMed

    DeCaporale-Ryan, Lauren N; Ahmed-Sarwar, Nabila; Upham, Robbyn; Mahler, Karen; Lashway, Katie

    2017-06-01

    A team-based service delivery model was applied to provide patients with biopsychosocial care following hospital discharge to reduce hospital readmission. Most previous interventions focused on transitions of care occurred in the inpatient setting with attention to predischarge strategies. These interventions have not considered psychosocial stressors, and few have explored management in primary care settings. A 7-week team-based service delivery model was implemented in a family medicine practice emphasizing a biopsychosocial approach. A physician, psychologist, pharmacist, care managers, and interdisciplinary trainees worked with 17 patients following hospital discharge. This comprehensive evaluation assessed patients' mood, cognitive abilities, and self-management of health behaviors. Modifications were made to improve ease of access to outpatient care and to improve patient understanding of the therapeutic plan. This pilot study was conducted to determine the utility of the model. Of 17 patients, 15 individuals avoided readmission at 30- and 90-day intervals. Other substantial benefits were noted, including reduced polypharmacy, engagement in specialty care, and reduction of environmental stressors to improve access to care. The clinic in which this was implemented is currently making efforts to maintain this model of care based on observed success. Although this work only represents a small sample, results are encouraging. This model can be replicated in other primary care settings with specialty clinicians on site. Specifically, approaches that promote a team-based delivery in a primary care setting may support improved patient outcomes and reduced overall systems' costs. Recommendations for research in a clinical setting are also offered. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  3. Does the chronic care model meet the emerging needs of people living with multimorbidity? A systematic review and thematic synthesis

    PubMed Central

    Abu Dabrh, Abd Moain; Gionfriddo, Michael R.; Erwin, Patricia; Montori, Victor M.

    2018-01-01

    Background The Chronic Care Model (CCM) emerged in the 1990s as an approach to re-organize primary care and implement critical elements that enable it to proactively attend to patients with chronic conditions. The chronic care landscape has evolved further, as most patients now present with multiple chronic conditions and increasing psychosocial complexity. These patients face accumulating and overwhelming complexity resulting from the sum of uncoordinated responses to each of their problems. Minimally Disruptive Medicine (MDM) was proposed to respond to this challenge, aiming at improving outcomes that matter to patients with the smallest burden of treatment. We sought to critically appraise the extent to which MDM constructs (e.g., reducing patient work, improving patients’ capacity) have been adopted within CCM implementations. Methods We conducted a systematic review and qualitative thematic synthesis of reports of CCM implementations published from 2011–2016. Results CCM implementations were mostly aligned with the healthcare system’s goals, condition-specific, and targeted disease-specific outcomes or healthcare utilization. No CCM implementation addressed patient work. Few reduced treatment workload without adding additional tasks. Implementations supported patient capacity by offering information, but rarely offered practical resources (e.g., financial assistance, transportation), helped patients reframe their biography with chronic illness, or assisted them in engaging with a supportive social network. Few implementations aimed at improving functional status or quality of life, and only one-third of studies were targeted for patients of low socioeconomic status. Conclusion MDM provides a lens to operationalize how to care for patients with multiple chronic conditions, but its constructs remain mostly absent from how implementations of the CCM are currently reported. Improvements to the primary care of patients with multimorbidity may benefit from the application of MDM, and the current CCM implementations that do apply MDM constructs should be considered exemplars for future implementation work. PMID:29420543

  4. Assessing the impacts of future climate conditions on the effectiveness of winter cover crops in reducing nitrate loads into the Chesapeake Bay Watersheds using SWAT model

    USDA-ARS?s Scientific Manuscript database

    Winter cover crops (WCCs) have been widely implemented in the Coastal Plain of the Chesapeake Bay watershed (CBW) due to their high effectiveness at reducing nitrate loads. However, future climate conditions (FCCs) are expected to exacerbate water quality degradation in the CBW by increasing nitrat...

  5. Animals and the 3Rs in toxicology research and testing: The way forward.

    PubMed

    Stokes, W S

    2015-12-01

    Despite efforts to eliminate the use of animals in testing and the availability of many accepted alternative methods, animals are still widely used for toxicological research and testing. While research using in vitro and computational models has dramatically increased in recent years, such efforts have not yet measurably impacted animal use for regulatory testing and are not likely to do so for many years or even decades. Until regulatory authorities have accepted test methods that can totally replace animals and these are fully implemented, large numbers of animals will continue to be used and many will continue to experience significant pain and distress. In order to positively impact the welfare of these animals, accepted alternatives must be implemented, and efforts must be directed at eliminating pain and distress and reducing animal numbers. Animal pain and distress can be reduced by earlier predictive humane endpoints, pain-relieving medications, and supportive clinical care, while sequential testing and routine use of integrated testing and decision strategies can reduce animal numbers. Applying advances in science and technology to the development of scientifically sound alternative testing models and strategies can improve animal welfare and further reduce and replace animal use. © The Author(s) 2015.

  6. Applying Collaborative Learning and Quality Improvement to Public Health: Lessons from the Collaborative Improvement and Innovation Network (CoIIN) to Reduce Infant Mortality.

    PubMed

    Ghandour, Reem M; Flaherty, Katherine; Hirai, Ashley; Lee, Vanessa; Walker, Deborah Klein; Lu, Michael C

    2017-06-01

    Infant mortality remains a significant public health problem in the U.S. The Collaborative Improvement & Innovation Network (CoIIN) model is an innovative approach, using the science of quality improvement and collaborative learning, which was applied across 13 Southern states in Public Health Regions IV and VI to reduce infant mortality and improve birth outcomes. We provide an in-depth discussion of the history, development, implementation, and adaptation of the model based on the experience of the original CoIIN organizers and participants. In addition to the political genesis and functional components of the initiative, 8 key lessons related to staffing, planning, and implementing future CoIINs are described in detail. This paper reports the findings from a process evaluation of the model. Data on the states' progress toward reducing infant mortality and improving birth outcomes were collected through a survey in the final months of a 24-month implementation period, as well as through ongoing team communications. The peer-to-peer exchange and platform for collaborative learning, as well as the sharing of data across the states, were major strengths and form the foundation for future CoIIN efforts. A lasting legacy of the initiative is the unique application and sharing of provisional "real time" data to inform "real time" decision-making. The CoIIN model of collaborative learning, QI, and innovation offers a promising approach to strengthening partnerships within and across states, bolstering data systems to inform and track progress more rapidly, and ultimately accelerating improvement toward healthier communities, States, and the Nation as a whole.

  7. Development of a student engagement approach to alcohol prevention: the Pragmatics Project.

    PubMed

    Buettner, Cynthia K; Andrews, David W; Glassman, Michael

    2009-01-01

    Significant involvement of students in the development and implementation of college alcohol prevention strategies is largely untested, despite recommendations by the National Institute of Alcohol Abuse and Alcoholism and others. The purpose of the Pragmatics Project was to test a student engagement model for developing and implementing alcohol intervention strategies. The Pragmatics Project involved 89 undergraduate students on a large Midwestern university campus in the design and implementation of projects focused on reducing harm associated with high-risk drinking and off-campus parties. The engagement model used an innovative course piloted in the Human Development and Family Science department. The course successfully involved both students and the community in addressing local alcohol issues. The course design described would fit well into a Master of Public Health, Community Psychology, Health Psychology, or interdisciplinary curricula as well as the service learning model, and it is applicable in addressing other health risk behaviors.

  8. Case Study: Advances in Modelling Exposure

    EPA Science Inventory

    Concerned about the health impacts of air pollution. many environmental agencies around the world are implementing regulations to reduce emissions from various sectors, thus maintaining ambient air quality at acceptable levels. The United States Environmental Protection Agency (E...

  9. Improving Atomic Force Microscopy Imaging by a Direct Inverse Asymmetric PI Hysteresis Model

    PubMed Central

    Wang, Dong; Yu, Peng; Wang, Feifei; Chan, Ho-Yin; Zhou, Lei; Dong, Zaili; Liu, Lianqing; Li, Wen Jung

    2015-01-01

    A modified Prandtl–Ishlinskii (PI) model, referred to as a direct inverse asymmetric PI (DIAPI) model in this paper, was implemented to reduce the displacement error between a predicted model and the actual trajectory of a piezoelectric actuator which is commonly found in AFM systems. Due to the nonlinearity of the piezoelectric actuator, the standard symmetric PI model cannot precisely describe the asymmetric motion of the actuator. In order to improve the accuracy of AFM scans, two series of slope parameters were introduced in the PI model to describe both the voltage-increase-loop (trace) and voltage-decrease-loop (retrace). A feedforward controller based on the DIAPI model was implemented to compensate hysteresis. Performance of the DIAPI model and the feedforward controller were validated by scanning micro-lenses and standard silicon grating using a custom-built AFM. PMID:25654719

  10. Efficient Nonlinear Atomization Model for Thin 3D Free Liquid Films

    NASA Astrophysics Data System (ADS)

    Mehring, Carsten

    2007-03-01

    Reviewed is a nonlinear reduced-dimension thin-film model developed by the author and aimed at the prediction of spray formation from thin films such as those found in gas-turbine engines (e.g., prefilming air-blast atomizers), heavy-fuel-oil burners (e.g., rotary-cup atomizers) and in the paint industry (e.g., flat-fan atomizers). Various implementations of the model focusing on different model aspects, i.e., effect of film geometry, surface tension, liquid viscosity, coupling with surrounding gas-phase flow, influence of long-range intermolecular forces during film rupture, are reviewed together with a validation of the nonlinear wave propagation characteristics predicted by the model for inviscid planar films using a two-dimensional vortex method.

  11. Development and use of role model stories in a community level HIV risk reduction intervention.

    PubMed Central

    Corby, N H; Enguídanos, S M; Kay, L S

    1996-01-01

    A theory-based HIV prevention intervention was implemented as part of a five-city AIDS Community Demonstration Project for the development and testing of a community-level intervention to reduce AIDS risk among historically underserved groups. This intervention employed written material containing stories of risk-reducing experiences of members of the priority populations, in this case, injecting drug users, their female sex partners, and female sex workers. These materials were distributed to members of these populations by their peers, volunteers from the population who were trained to deliver social reinforcement for interest in personal risk reduction and the materials. The participation of the priority populations in the development and implementation of the intervention was designed to increase the credibility of the intervention and the acceptance of the message. The techniques involved in developing role-model stories are described in this paper. PMID:8862158

  12. Optimal harvesting for a predator-prey agent-based model using difference equations.

    PubMed

    Oremland, Matthew; Laubenbacher, Reinhard

    2015-03-01

    In this paper, a method known as Pareto optimization is applied in the solution of a multi-objective optimization problem. The system in question is an agent-based model (ABM) wherein global dynamics emerge from local interactions. A system of discrete mathematical equations is formulated in order to capture the dynamics of the ABM; while the original model is built up analytically from the rules of the model, the paper shows how minor changes to the ABM rule set can have a substantial effect on model dynamics. To address this issue, we introduce parameters into the equation model that track such changes. The equation model is amenable to mathematical theory—we show how stability analysis can be performed and validated using ABM data. We then reduce the equation model to a simpler version and implement changes to allow controls from the ABM to be tested using the equations. Cohen's weighted κ is proposed as a measure of similarity between the equation model and the ABM, particularly with respect to the optimization problem. The reduced equation model is used to solve a multi-objective optimization problem via a technique known as Pareto optimization, a heuristic evolutionary algorithm. Results show that the equation model is a good fit for ABM data; Pareto optimization provides a suite of solutions to the multi-objective optimization problem that can be implemented directly in the ABM.
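    The Pareto step of such a multi-objective optimization reduces to a non-dominated filter over candidate policies, sketched below on random stand-in objective values (the actual objectives in the paper come from the harvesting problem and the reduced equation model):

    ```python
    # Non-dominated (Pareto) filtering over candidate policies, both objectives maximized.
    import numpy as np

    def pareto_front(points):
        """Return a boolean mask marking the non-dominated rows of `points`."""
        n = len(points)
        keep = np.ones(n, dtype=bool)
        for i in range(n):
            # i is dominated if some other point is >= on all objectives and > on at least one
            dominated_by = np.all(points >= points[i], axis=1) & np.any(points > points[i], axis=1)
            keep[i] = not dominated_by.any()
        return keep

    rng = np.random.default_rng(0)
    objectives = rng.random((200, 2))          # hypothetical (objective 1, objective 2) pairs
    front = objectives[pareto_front(objectives)]
    print(f"{len(front)} non-dominated policies out of {len(objectives)}")
    ```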

  13. Efficient digital implementation of a conductance-based globus pallidus neuron and the dynamics analysis

    NASA Astrophysics Data System (ADS)

    Yang, Shuangming; Wei, Xile; Deng, Bin; Liu, Chen; Li, Huiyan; Wang, Jiang

    2018-03-01

    Balancing the biological plausibility of dynamical activities against computational efficiency is one of the challenging problems in computational neuroscience and neural system engineering. This paper proposes a set of efficient methods for the hardware realization of conductance-based neuron models with the relevant dynamics, aimed at reproducing biological behaviors through low-cost implementation on a digital programmable platform; the methods apply to a wide range of conductance-based neuron models. Modified globus pallidus (GP) neuron models suited to efficient hardware implementation are presented to reproduce reliable pallidal dynamics, which decode basal ganglia information and regulate voluntary activities related to movement disorders. Implementation results on a field-programmable gate array (FPGA) demonstrate that the proposed techniques and models reduce resource cost significantly while reproducing the biological dynamics accurately. In addition, biological behaviors under weak network coupling are explored on the proposed platform, and a theoretical analysis is made of the biological characteristics of the structured pallidal oscillator and network. The implementation techniques provide an essential step towards large-scale neural networks for exploring dynamical mechanisms in real time. Furthermore, the proposed methodology makes the FPGA-based system a powerful platform for investigating neurodegenerative diseases and for real-time control of bio-inspired neuro-robotics.
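
    As a rough illustration of the kind of state update such a digital implementation pipelines, the sketch below integrates a standard conductance-based model (Morris-Lecar) with forward Euler; the model and parameter set are illustrative stand-ins, not the modified globus pallidus model of the paper.

    ```python
    import math

    # Morris-Lecar parameters (a common illustrative set; not the paper's GP model).
    C, g_L, g_Ca, g_K = 20.0, 2.0, 4.0, 8.0          # uF/cm^2, mS/cm^2
    V_L, V_Ca, V_K = -60.0, 120.0, -84.0             # mV
    V1, V2, V3, V4, phi = -1.2, 18.0, 12.0, 17.4, 0.067
    I_ext, dt = 80.0, 0.05                           # uA/cm^2, ms

    def m_inf(V): return 0.5 * (1.0 + math.tanh((V - V1) / V2))
    def w_inf(V): return 0.5 * (1.0 + math.tanh((V - V3) / V4))
    def tau_w(V): return 1.0 / math.cosh((V - V3) / (2.0 * V4))

    V, w = -60.0, 0.0
    for step in range(int(200.0 / dt)):              # 200 ms of simulated time
        I_ion = (g_L * (V - V_L)
                 + g_Ca * m_inf(V) * (V - V_Ca)
                 + g_K * w * (V - V_K))
        dV = (I_ext - I_ion) / C
        dw = phi * (w_inf(V) - w) / tau_w(V)
        V, w = V + dt * dV, w + dt * dw              # forward-Euler state update
        if step % 400 == 0:
            print(f"t={step*dt:6.1f} ms  V={V:7.2f} mV  w={w:.3f}")
    ```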

  14. Using Modified-ISS Model to Evaluate Medication Administration Safety During Bar Code Medication Administration Implementation in Taiwan Regional Teaching Hospital.

    PubMed

    Ma, Pei-Luen; Jheng, Yan-Wun; Jheng, Bi-Wei; Hou, I-Ching

    2017-01-01

    Bar code medication administration (BCMA) could reduce medical errors and promote patient safety. This research uses a modified information systems success model (M-ISS model) to evaluate nurses' acceptance of BCMA. The results showed moderate correlations between medication administration safety (MAS) and system quality, information quality, service quality, user satisfaction, and limited satisfaction.

  15. Reducing 30-Day Readmission Rates in a High-Risk Population Using a Lay-Health Worker Model in Appalachia Kentucky

    ERIC Educational Resources Information Center

    Cardarelli, Roberto; Horsley, Mary; Ray, Lisa; Maggard, Nancy; Schilling, Jennifer; Weatherford, Sarah; Feltner, Fran; Gilliam, Kayla

    2018-01-01

    This exploratory study aimed to assess the effectiveness of a lay-health worker (LHW) model in addressing the social needs and readmissions of high-risk patients admitted to a rural community hospital. A quasi-experimental study design assessed implementation of a LHW model for assisting high-risk patients with their post-discharge social needs.…

  16. Reduction of maternal mortality due to preeclampsia in Colombia-an interrupted time-series analysis

    PubMed Central

    Herrera-Medina, Rodolfo; Herrera-Escobar, Juan Pablo; Nieto-Díaz, Aníbal

    2014-01-01

    Introduction: Preeclampsia is the most important cause of maternal mortality in developing countries. A comprehensive prenatal care program including bio-psychosocial components was developed and introduced at a national level in Colombia. We report on the trends in maternal mortality rates and their related causes before and after implementation of this program. Methods: General and specific maternal mortality rates were monitored for nine years (1998-2006). An interrupted time-series analysis was performed with monthly data on cases of maternal mortality, comparing trends and changes in national mortality rates and the impact of these changes attributable to the introduction of a bio-psychosocial model (BPSM). Multivariate analyses were performed to evaluate correlations between the interventions. Results: In the five years after its introduction (2002-2006), the general maternal mortality rate was significantly reduced, by 23% (OR=0.77, 95% CI 0.71-0.82). The implementation of the BPSM also reduced the incidence of preeclampsia by 22% (OR=0.78, 95% CI 0.67-0.88), as well as labor complications due to hemorrhage by 25% (OR=0.75, 95% CI 0.59-0.90), the latter associated with the implementation of the red code. The other causes of maternal mortality did not show significant changes. Biomedical, nutritional, and psychosocial assessments, and other individual interventions in prenatal care, were not individually correlated with maternal mortality (p=0.112); however, taken together as a model we observed a significant association (p=0.042). Conclusions: General maternal mortality was reduced after the implementation of a comprehensive national prenatal care program. Evaluation of this program in other populations is important. PMID:24970956

  17. Multiobjective optimization of low impact development stormwater controls

    NASA Astrophysics Data System (ADS)

    Eckart, Kyle; McPhee, Zach; Bolisetti, Tirupati

    2018-07-01

    Green infrastructure such as Low Impact Development (LID) controls is being employed to manage urban stormwater and restore predevelopment hydrological conditions, besides improving stormwater runoff water quality. Since runoff generation and infiltration processes are nonlinear, there is a need to identify the optimal combination of LID controls. A coupled optimization-simulation model was developed by linking the U.S. EPA Stormwater Management Model (SWMM) to the Borg Multiobjective Evolutionary Algorithm (Borg MOEA). The coupled model is capable of performing multiobjective optimization, using SWMM simulations as a tool to evaluate potential solutions to the optimization problem. The optimization-simulation tool was used to evaluate low impact development (LID) stormwater controls. A SWMM model was developed, calibrated, and validated for a sewershed in Windsor, Ontario, and LID stormwater controls were tested for three different return periods. LID implementation strategies were optimized using the optimization-simulation model for five different implementation scenarios for each of the three storm events, with the objectives of minimizing peak flow in the storm sewers, reducing total runoff, and minimizing cost. For the sewershed in Windsor, Ontario, the peak runoff and total runoff volume were found to be reduced by 13% and 29%, respectively.
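
    The structure of such a coupled simulation-optimization loop can be sketched as follows. The `run_swmm` function is a hypothetical placeholder for a call into the SWMM engine, and a plain random search with a non-dominated archive stands in for the Borg MOEA; the unit costs and LID types are invented for illustration.

    ```python
    import random

    # Hypothetical stand-in for a SWMM run; the real study calls the EPA SWMM engine.
    def run_swmm(lid_areas):
        """Return (peak_flow_m3s, total_runoff_m3) for a candidate LID layout."""
        treated = sum(lid_areas.values())
        return 10.0 / (1.0 + 0.01 * treated), 5000.0 / (1.0 + 0.005 * treated)

    UNIT_COST = {"bioretention": 120.0, "permeable_pavement": 90.0, "rain_barrel": 40.0}

    def evaluate(lid_areas):
        peak, runoff = run_swmm(lid_areas)
        cost = sum(UNIT_COST[k] * a for k, a in lid_areas.items())
        return peak, runoff, cost                    # all three objectives are minimized

    def dominates(a, b):
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

    archive = []                                     # non-dominated (Pareto) archive
    for _ in range(2000):                            # random search stands in for Borg MOEA
        cand = {k: random.uniform(0.0, 100.0) for k in UNIT_COST}
        obj = evaluate(cand)
        if not any(dominates(kept_obj, obj) for _, kept_obj in archive):
            archive = [(c, o) for c, o in archive if not dominates(obj, o)]
            archive.append((cand, obj))

    print(f"{len(archive)} non-dominated LID layouts found")
    ```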

  18. Costs and cost-effectiveness of full implementation of a biennial faecal occult blood test screening program for bowel cancer in Australia.

    PubMed

    Pignone, Michael P; Flitcroft, Kathy L; Howard, Kirsten; Trevena, Lyndal J; Salkeld, Glenn P; St John, D James B

    2011-02-21

    To examine the costs and cost-effectiveness of full implementation of biennial bowel cancer screening for Australian residents aged 50-74 years. Identification of existing economic models from 1993 to 2010 through searches of PubMed and economic analysis databases, and by seeking expert advice; and additional modelling to determine the costs and cost-effectiveness of full implementation of biennial faecal occult blood test screening for the five million adults in Australia aged 50-74 years. Estimated number of deaths from bowel cancer prevented, costs, and cost-effectiveness (cost per life-year gained [LYG]) of biennial bowel cancer screening. We identified six relevant economic analyses, all of which found colorectal cancer (CRC) screening to be very cost-effective, with costs per LYG under $55,000 per year in 2010 Australian dollars. Based on our additional modelling, we conservatively estimate that full implementation of biennial screening for people aged 50-74 years would have gross costs of $150 million, reduce CRC mortality by 15%-25%, prevent 300-500 deaths from bowel cancer, and save 3600-6000 life-years annually, for an undiscounted cost per LYG of $25,000-$41,667, compared with no screening, and not taking cost savings as a result of treatment into consideration. The additional expenditure required, after accounting for reductions in CRC incidence, savings in CRC treatment costs, and existing ad-hoc colonoscopy use, is likely to be less than $50 million annually. Full implementation of biennial faecal occult blood test screening in Australia can reduce bowel cancer mortality, and is an efficient use of health resources that would require modest additional government investment.
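
    The reported cost-effectiveness range is consistent with dividing the gross program cost by the estimated annual life-years gained; a quick check under the abstract's undiscounted assumptions:

    ```python
    gross_cost = 150_000_000      # AUD per year, gross program cost (from the abstract)
    life_years = (3600, 6000)     # annual life-years gained, low and high estimates

    for lyg in life_years:
        print(f"{lyg} LYG -> cost per LYG = ${gross_cost / lyg:,.0f}")
    # 6000 LYG -> $25,000 per LYG; 3600 LYG -> $41,667 per LYG (undiscounted, no treatment savings)
    ```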

  19. Inferential control -- Part 1: Crude unit advanced controls pass accuracy and repeatability tests

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    San, Y.P.; Landells, K.C.; Mackay, D.C.

    1994-11-28

    An inferential model is one that provides a quality for which an analyzer is not available. This type of model uses readily available physical measurements -- such as temperatures, pressures, and flow rates -- to infer a quality such as kerosine flash point. The No. 2 crude distillation unit (CDU-2) at Singapore Refining Co. Pte. Ltd.'s Pulau Merlimau refinery has a nominal 130,000 b/d capacity. It produces naphtha, kerosine, diesel, and residue products from a wide range of crude blends. Over the past 12 months, extensive advanced control applications have been implemented on the unit. This first of two articles describes the control system and its implementation. The second will outline the project's achievements, including reduced quality giveaway and increased profits. The paper describes the background of the company and unit, the process, project implementation, the Infer model, model tuning, closed-loop control, feed rate maximization, and economic monitoring.
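
    A minimal sketch of the idea behind an inferential model, assuming a simple least-squares fit of lab flash points to routinely measured unit variables; the variable names and numbers are invented for illustration and do not represent the refinery's Infer model.

    ```python
    import numpy as np

    # Toy history of unit measurements (columns: draw temperature C, column top
    # pressure kPa, stripping steam ratio) and lab kerosine flash points (C).
    X = np.array([[182.0, 140.0, 0.90],
                  [185.0, 138.0, 1.00],
                  [178.0, 142.0, 0.80],
                  [190.0, 136.0, 1.10],
                  [186.0, 139.0, 0.95]])
    y = np.array([41.0, 43.5, 38.5, 47.0, 44.0])

    A = np.column_stack([X, np.ones(len(X))])        # add an intercept term
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)     # fit the linear inferential model

    def infer_flash_point(draw_temp, top_pressure, steam_ratio):
        return float(np.array([draw_temp, top_pressure, steam_ratio, 1.0]) @ coef)

    print(f"Inferred flash point: {infer_flash_point(184.0, 139.0, 0.95):.1f} C")
    ```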

  20. Non-linear mixed effects modeling - from methodology and software development to driving implementation in drug development science.

    PubMed

    Pillai, Goonaseelan Colin; Mentré, France; Steimer, Jean-Louis

    2005-04-01

    Few scientific contributions have made significant impact unless there was a champion who had the vision to see the potential for its use in seemingly disparate areas, and who then drove active implementation. In this paper, we present a historical summary of the development of non-linear mixed effects (NLME) modeling up to the more recent extensions of this statistical methodology. The paper places strong emphasis on the pivotal role played by Lewis B. Sheiner (1940-2004), who used this statistical methodology to elucidate solutions to real problems identified in clinical practice and in medical research, and on how he drove implementation of the proposed solutions. A succinct overview of the evolution of the NLME modeling methodology is presented as well as ideas on how its expansion helped to provide guidance for a more scientific view of (model-based) drug development that reduces empiricism in favor of critical quantitative thinking and decision making.

  1. The DaveMLTranslator: An Interface for DAVE-ML Aerodynamic Models

    NASA Technical Reports Server (NTRS)

    Hill, Melissa A.; Jackson, E. Bruce

    2007-01-01

    It can take weeks or months to incorporate a new aerodynamic model into a vehicle simulation and validate the performance of the model. The Dynamic Aerospace Vehicle Exchange Markup Language (DAVE-ML) has been proposed as a means to reduce the time required to accomplish this task by defining a standard format for typical components of a flight dynamic model. The purpose of this paper is to describe an object-oriented C++ implementation of a class that interfaces a vehicle subsystem model specified in DAVE-ML and a vehicle simulation. Using the DaveMLTranslator class, aerodynamic or other subsystem models can be automatically imported and verified at run-time, significantly reducing the elapsed time between receipt of a DAVE-ML model and its integration into a simulation environment. The translator performs variable initializations, data table lookups, and mathematical calculations for the aerodynamic build-up, and executes any embedded static check-cases for verification. The implementation is efficient, enabling real-time execution. Simple interface code for the model inputs and outputs is the only requirement to integrate the DaveMLTranslator as a vehicle aerodynamic model. The translator makes use of existing table-lookup utilities from the Langley Standard Real-Time Simulation in C++ (LaSRS++). The design and operation of the translator class is described and comparisons with existing, conventional, C++ aerodynamic models of the same vehicle are given.

  2. The Health Equity and Effectiveness of Policy Options to Reduce Dietary Salt Intake in England: Policy Forecast.

    PubMed

    Gillespie, Duncan O S; Allen, Kirk; Guzman-Castillo, Maria; Bandosz, Piotr; Moreira, Patricia; McGill, Rory; Anwar, Elspeth; Lloyd-Williams, Ffion; Bromley, Helen; Diggle, Peter J; Capewell, Simon; O'Flaherty, Martin

    2015-01-01

    Public health action to reduce dietary salt intake has driven substantial reductions in coronary heart disease (CHD) over the past decade, but avoidable socio-economic differentials remain. We therefore forecast how further intervention to reduce dietary salt intake might affect the overall level and inequality of CHD mortality. We considered English adults, with socio-economic circumstances (SEC) stratified by quintiles of the Index of Multiple Deprivation. We used IMPACTSEC, a validated CHD policy model, to link policy implementation to salt intake, systolic blood pressure and CHD mortality. We forecast the effects of mandatory and voluntary product reformulation, nutrition labelling and social marketing (e.g., health promotion, education). To inform our forecasts, we elicited experts' predictions on further policy implementation up to 2020. We then modelled the effects on CHD mortality up to 2025 and simultaneously assessed the socio-economic differentials of effect. Mandatory reformulation might prevent or postpone 4,500 (2,900-6,100) CHD deaths in total, with the effect greater by 500 (300-700) deaths or 85% in the most deprived than in the most affluent. Further voluntary reformulation was predicted to be less effective and inequality-reducing, preventing or postponing 1,500 (200-5,000) CHD deaths in total, with the effect greater by 100 (-100-600) deaths or 49% in the most deprived than in the most affluent. Further social marketing and improvements to labelling might each prevent or postpone 400-500 CHD deaths, but minimally affect inequality. Mandatory engagement with industry to limit salt in processed-foods appears a promising and inequality-reducing option. For other policy options, our expert-driven forecast warns that future policy implementation might reach more deprived individuals less well, limiting inequality reduction. We therefore encourage planners to prioritise equity.

  3. Acceleration for 2D time-domain elastic full waveform inversion using a single GPU card

    NASA Astrophysics Data System (ADS)

    Jiang, Jinpeng; Zhu, Peimin

    2018-05-01

    Full waveform inversion (FWI) is a challenging procedure due to the high computational cost related to the modeling, especially for the elastic case. The graphics processing unit (GPU) has become a popular device for high-performance computing (HPC). To reduce the long computation time, we design and implement a GPU-based 2D elastic FWI (EFWI) in the time domain using a single GPU card. We parallelize the forward modeling and gradient calculations using the CUDA programming language. To overcome the limitation of relatively small global memory on the GPU, a boundary saving strategy is exploited to reconstruct the forward wavefield. Moreover, the L-BFGS optimization method used in the inversion improves the convergence of the misfit function. A multiscale inversion strategy is performed in the workflow to obtain accurate inversion results. In our tests, the GPU-based implementations using a single GPU device achieve >15 times speedup in forward modeling, and about 12 times speedup in gradient calculation, compared with eight-core CPU implementations optimized by OpenMP. The results from the GPU implementations are verified to have sufficient accuracy by comparing them with the results obtained from the CPU implementations.
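
    The inversion loop itself can be sketched independently of the CUDA kernels. Below, a toy forward operator and its gradient are wrapped in SciPy's L-BFGS-B driver; the forward model is only a stand-in for the elastic wave propagator, so this illustrates the optimization workflow, not the paper's implementation.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    n = 500                                        # number of model parameters (e.g., velocities)
    m_true = 2000.0 + 300.0 * np.sin(np.linspace(0, 3 * np.pi, n))
    d_obs = np.cumsum(1.0 / m_true)                # stand-in "observed data" for the toy operator

    def forward_model(m):
        # Placeholder for the CUDA elastic wave propagator used in the paper.
        return np.cumsum(1.0 / m)

    def misfit_and_gradient(m):
        r = forward_model(m) - d_obs                       # data residual
        J = np.tril(np.ones((n, n))) * (-1.0 / m**2)       # Jacobian of this toy operator
        return 0.5 * float(r @ r), J.T @ r                 # misfit value and its gradient

    m0 = np.full(n, 2100.0)                                # smooth starting model
    result = minimize(misfit_and_gradient, m0, jac=True, method="L-BFGS-B",
                      options={"maxiter": 100})
    print("final misfit:", result.fun)
    ```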

  4. Impact of mussel bioengineering on fine-grained sediment dynamics in a coastal lagoon: A numerical modelling investigation

    NASA Astrophysics Data System (ADS)

    Forsberg, Pernille L.; Lumborg, Ulrik; Bundgaard, Klavs; Ernstsen, Verner B.

    2017-12-01

    Rødsand lagoon in southeast Denmark is a non-tidal coastal lagoon. It is home to a wide range of marine flora and fauna and part of the Natura 2000 network. An increase in turbidity through elevated levels of suspended sediment concentration (SSC) within the lagoon may affect the ecosystem health due to reduced light penetration. Increasing SSC levels within Rødsand lagoon could be caused by increasing storm intensity or by a sediment spill from dredging activities west of the lagoon in relation to the planned construction of the Fehmarnbelt fixed link between Denmark and Germany. The aim of the study was to investigate the impact of a mussel reef on sediment import and SSC in a semi-enclosed lagoon through the development of a bioengineering modelling application that makes it possible to include the filtrating effect of mussels in a numerical model of the lagoonal system. The numerical implementation of an exterior mussel reef generated a reduction in the SSC in the vicinity of the reef, through the adjacent inlet and in the western part of the lagoon. The mussel reef reduced the sediment import to Rødsand lagoon by 13-22% and reduced the SSC within Rødsand lagoon by 5-9% depending on the filtration rate and the reef length. The results suggest that the implementation of a mussel reef has the potential to relieve the pressure of increasing turbidity levels within a semi-enclosed lagoonal system. However, further assessment and development of the bioengineering application and resulting ecosystem impacts are necessary prior to actual implementation.

  5. Simulation of finite-strain inelastic phenomena governed by creep and plasticity

    NASA Astrophysics Data System (ADS)

    Li, Zhen; Bloomfield, Max O.; Oberai, Assad A.

    2017-11-01

    Inelastic mechanical behavior plays an important role in many applications in science and engineering. Phenomenologically, this behavior is often modeled as plasticity or creep. Plasticity is used to represent the rate-independent component of inelastic deformation and creep is used to represent the rate-dependent component. In several applications, especially those at elevated temperatures and stresses, these processes occur simultaneously. In order to model these processes, we develop a rate-objective, finite-deformation constitutive model for plasticity and creep. The plastic component of this model is based on rate-independent J_2 plasticity, and the creep component is based on a thermally activated Norton model. We describe the implementation of this model within a finite element formulation, and present a radial return mapping algorithm for it. This approach reduces the additional complexity of modeling plasticity and creep, over thermoelasticity, to just solving one nonlinear scalar equation at each quadrature point. We implement this algorithm within a multiphysics finite element code and evaluate the consistent tangent through automatic differentiation. We verify and validate the implementation, apply it to modeling the evolution of stresses in the flip chip manufacturing process, and test its parallel strong-scaling performance.
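
    The "one nonlinear scalar equation per quadrature point" structure can be seen in a minimal small-strain J2 radial-return sketch with Voce hardening, where a Newton iteration is applied to the consistency condition; the finite-deformation, creep-coupled model of the paper is more involved, and the material constants below are illustrative.

    ```python
    import numpy as np

    # Illustrative material constants: shear modulus and Voce isotropic hardening (MPa).
    G, sigma0, sigma_sat, e0 = 80.0e3, 250.0, 400.0, 0.05

    def yield_stress(ep):
        return sigma0 + (sigma_sat - sigma0) * (1.0 - np.exp(-ep / e0))

    def yield_stress_slope(ep):
        return (sigma_sat - sigma0) / e0 * np.exp(-ep / e0)

    def radial_return(s_trial, ep_old, tol=1e-10, max_iter=50):
        """J2 radial return at one quadrature point: Newton on one scalar equation."""
        q_trial = np.sqrt(1.5 * np.tensordot(s_trial, s_trial))   # von Mises trial stress
        if q_trial <= yield_stress(ep_old):
            return s_trial, ep_old                                 # elastic step, no return mapping
        dgamma = 0.0
        for _ in range(max_iter):
            r = q_trial - 3.0 * G * dgamma - yield_stress(ep_old + dgamma)   # consistency residual
            if abs(r) < tol:
                break
            dgamma -= r / (-3.0 * G - yield_stress_slope(ep_old + dgamma))   # Newton update
        s_new = s_trial * (1.0 - 3.0 * G * dgamma / q_trial)       # scale back onto the yield surface
        return s_new, ep_old + dgamma

    # Example: a trial deviatoric stress well outside the initial yield surface.
    s_trial = np.diag([300.0, -150.0, -150.0])
    s_new, ep_new = radial_return(s_trial, ep_old=0.0)
    print("updated von Mises stress (MPa):", round(float(np.sqrt(1.5 * np.tensordot(s_new, s_new))), 2))
    print("plastic strain increment:", round(float(ep_new), 6))
    ```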

  6. A systematic review of the collaborative clinical education model to inform speech-language pathology practice.

    PubMed

    Briffa, Charmaine; Porter, Judi

    2013-12-01

    A shortage of clinical education placements for allied health students internationally has led to the need to explore innovative models of clinical education. The collaborative model where one clinical educator supervises two or more students completing a clinical placement concurrently is one model enabling expansion of student placements. The aims of this review were to investigate advantages and disadvantages of the collaborative model and to explore its implementation across allied health. A systematic search of the literature was conducted using three electronic databases (CINAHL, Medline, and Embase). Two independent reviewers evaluated studies for methodological quality. Seventeen studies met inclusion/exclusion criteria. Advantages and disadvantages identified were consistent across disciplines. A key advantage of the model was the opportunity afforded for peer learning, whilst a frequently reported disadvantage was reduced time for individual supervision of students. The methodological quality of many included studies was poor, impacting on interpretation of the evidence base. Insufficient data were provided on how the model was implemented across studies. There is a need for high quality research to guide implementation of this model across a wider range of allied health disciplines and to determine educational outcomes using reliable and validated measures.

  7. Aligning with physicians to regionalize services.

    PubMed

    Fink, John

    2014-11-01

    When effectively designed and implemented, regionalization allows a health system to coordinate care, eliminate redundancies, reduce costs, optimize resource utilization, and improve outcomes. The preferred model to manage service lines regionally will depend on each facility's capabilities and the willingness of physicians to accept changes in clinical delivery. Health systems can overcome physicians' objections to regionalization by implementing a hospital-physician alignment structure that gives a measure of shared control in the management of the organization.

  8. Health care informatics.

    PubMed

    Siau, Keng

    2003-03-01

    The health care industry is currently experiencing a fundamental change. Health care organizations are reorganizing their processes to reduce costs, be more competitive, and provide better and more personalized customer care. This new business strategy requires health care organizations to implement new technologies, such as Internet applications, enterprise systems, and mobile technologies in order to achieve their desired business changes. This article offers a conceptual model for implementing new information systems, integrating internal data, and linking suppliers and patients.

  9. An Ounce of Prevention, a Pound of Uncertainty: The Cost-Effectiveness of School-Based Drug Prevention Programs.

    ERIC Educational Resources Information Center

    Caulkins, Jonathan P.; Rydell, C. Peter; Everingham, Susan S.; Chiesa, James; Bushway, Shawn

    This book describes an analysis of the cost-effectiveness of model school-based drug prevention programs at reducing cocaine consumption. It compares prevention's cost-effectiveness with that of several enforcement programs and with that of treating heavy cocaine users. It also assesses the cost of nationwide implementation of model prevention…

  10. Schematic baryon models, their tight binding description and their microwave realization

    NASA Astrophysics Data System (ADS)

    Sadurní, E.; Franco-Villafañe, J. A.; Kuhl, U.; Mortessagne, F.; Seligman, T. H.

    2013-12-01

    A schematic model for baryon excitations is presented in terms of a symmetric Dirac gyroscope, a relativistic model solvable in closed form, that reduces to a rotor in the non-relativistic limit. The model is then mapped on a nearest neighbour tight binding model. In its simplest one-dimensional form this model yields a finite equidistant spectrum. This is experimentally implemented as a chain of dielectric resonators under conditions where their coupling is evanescent and a good agreement with the prediction is achieved.
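
    One well-known coupling choice that produces an exactly equidistant finite spectrum in a nearest-neighbour chain is the spin-Jx ladder, sketched below; it illustrates the kind of tight-binding mapping described, though it is not necessarily the authors' exact construction.

    ```python
    import numpy as np

    N = 11                                   # number of sites in the chain (2j+1 with j=5)
    # Nearest-neighbour couplings t_k = 0.5*sqrt(k*(N-k)) reproduce the spin-j Jx matrix,
    # whose eigenvalues -j, -j+1, ..., j are exactly equidistant.
    t = [0.5 * np.sqrt(k * (N - k)) for k in range(1, N)]
    H = np.diag(t, 1) + np.diag(t, -1)       # tight-binding Hamiltonian, zero on-site energies

    eigenvalues = np.linalg.eigvalsh(H)
    print(np.round(eigenvalues, 10))          # -5, -4, ..., 4, 5
    print("spacings:", np.round(np.diff(eigenvalues), 10))
    ```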

  11. Using financial incentives to improve value in orthopaedics.

    PubMed

    Lansky, David; Nwachukwu, Benedict U; Bozic, Kevin J

    2012-04-01

    A variety of reforms to traditional approaches to provider payment and benefit design are being implemented in the United States. There is increasing interest in applying these financial incentives to orthopaedics, although it is unclear whether and to what extent they have been implemented and whether they increase quality or reduce costs. We reviewed and discussed physician- and patient-oriented financial incentives being implemented in orthopaedics, key challenges, and prerequisites to payment reform and value-driven payment policy in orthopaedics. We searched the MEDLINE database using as search terms various provider payment and consumer incentive models. We retrieved a total of 169 articles; none of these studies met the inclusion criteria. For incentive models known to the authors to be in use in orthopaedics but for which no peer-reviewed literature was found, we searched Google for further information. Provider financial incentives reviewed include payments for reporting, performance, and patient safety and episode payment. Patient incentives include tiered networks, value-based benefit design, reference pricing, and value-based purchasing. Reform of financial incentives for orthopaedic surgery is challenged by (1) lack of a payment/incentive model that has demonstrated reductions in cost trends and (2) the complex interrelation of current pay schemes in today's fragmented environment. Prerequisites to reform include (1) a reliable and complete data infrastructure; (2) new business structures to support cost sharing; and (3) a retooling of patient expectations. There is insufficient literature reporting the effects of various financial incentive models under implementation in orthopaedics to know whether they increase quality or reduce costs. National concerns about cost will continue to drive experimentation, and all anticipated innovations will require improved collaboration and data collection and reporting.

  12. Evaluation of NHTSA distracted driving demonstration projects in Connecticut and New York.

    DOT National Transportation Integrated Search

    2012-08-01

    The communities of Hartford, Connecticut, and Syracuse, New York, implemented year-long campaigns to test whether NHTSA's high-visibility enforcement (HVE) model could be applied to reduce two specific forms of distracted driving: driving while ...

  13. Implementing two nurse practitioner models of service at an Australian male prison: A quality assurance study.

    PubMed

    Wong, Ides; Wright, Eryn; Santomauro, Damian; How, Raquel; Leary, Christopher; Harris, Meredith

    2018-01-01

    To examine the quality and safety of nurse practitioner services of two newly implemented nurse practitioner models of care at a correctional facility. Nurse practitioners could help to meet the physical and mental health needs of Australia's growing prison population; however, the nurse practitioner role has not previously been evaluated in this context. A quality assurance study conducted in an Australian prison where a primary health nurse practitioner and a mental health nurse practitioner were incorporated into an existing primary healthcare service. The study was guided by Donabedian's structure, processes and outcomes framework. Routinely collected information included surveys of staff attitudes to the implementation of the nurse practitioner models (n = 21 staff), consultation records describing clinical processes and time use (n = 289 consultations), and a patient satisfaction survey (n = 29 patients). Data were analysed descriptively and compared to external benchmarks where available. Over the two-month period, the nurse practitioners provided 289 consultations to 208 prisoners. The presenting problems treated indicated that most referrals were appropriate. A significant proportion of consultations involved medication review and management. Both nurse practitioners spent more than half of their time on individual patient-related care. Overall, multidisciplinary team staff agreed that the nurse practitioner services were necessary, safe, met patient need and reduced treatment delays. Findings suggest that the implementation of nurse practitioners into Australian correctional facilities is acceptable and feasible and has the potential to improve prisoners' access to health services. Structural factors (e.g., room availability and limited access to prisoners) may have reduced the efficiency of the nurse practitioners' clinical processes and service implementation. Results suggest that nurse practitioner models can be successfully integrated into a prison setting and could provide a nursing career pathway. © 2017 John Wiley & Sons Ltd.

  14. Estimating the Cost of Providing Foundational Public Health Services.

    PubMed

    Mamaril, Cezar Brian C; Mays, Glen P; Branham, Douglas Keith; Bekemeier, Betty; Marlowe, Justin; Timsina, Lava

    2017-12-28

    To estimate the cost of resources required to implement a set of Foundational Public Health Services (FPHS) as recommended by the Institute of Medicine. A stochastic simulation model was used to generate probability distributions of input and output costs across 11 FPHS domains. We used an implementation attainment scale to estimate costs of fully implementing FPHS. We use data collected from a diverse cohort of 19 public health agencies located in three states that implemented the FPHS cost estimation methodology in their agencies during 2014-2015. The average agency incurred costs of $48 per capita implementing FPHS at their current attainment levels with a coefficient of variation (CV) of 16 percent. Achieving full FPHS implementation would require $82 per capita (CV=19 percent), indicating an estimated resource gap of $34 per capita. Substantial variation in costs exists across communities in resources currently devoted to implementing FPHS, with even larger variation in resources needed for full attainment. Reducing geographic inequities in FPHS may require novel financing mechanisms and delivery models that allow health agencies to have robust roles within the health system and realize a minimum package of public health services for the nation. © Health Research and Educational Trust.

  15. A particle swarm model for estimating reliability and scheduling system maintenance

    NASA Astrophysics Data System (ADS)

    Puzis, Rami; Shirtz, Dov; Elovici, Yuval

    2016-05-01

    Modifying data and information system components may introduce new errors and deteriorate the reliability of the system. Reliability can be efficiently regained with reliability centred maintenance, which requires reliability estimation for maintenance scheduling. A variant of the particle swarm model is used to estimate the reliability of systems implemented according to the model view controller paradigm. Simulations based on data collected from an online system of a large financial institute are used to compare three component-level maintenance policies. Results show that appropriately scheduled component-level maintenance greatly reduces the cost of upholding an acceptable level of reliability by reducing the need for system-wide maintenance.

  16. Trust recovery model of Ad Hoc network based on identity authentication scheme

    NASA Astrophysics Data System (ADS)

    Liu, Jie; Huan, Shuiyuan

    2017-05-01

    Trust models are widely used to address security issues in mobile Ad Hoc networks. To counter the reduction in network availability caused by malicious and selfish nodes in trust-model-based mobile Ad Hoc network routing, an identity-authentication-based mechanism for mobile Ad Hoc networks is proposed. It uses identity authentication to identify malicious nodes and recovers the trust of selfish nodes, with the aim of reducing network congestion and improving network quality. The simulation results show that implementing the mechanism can effectively improve network availability and security.

  17. Model and controller reduction of large-scale structures based on projection methods

    NASA Astrophysics Data System (ADS)

    Gildin, Eduardo

    The design of low-order controllers for high-order plants is a challenging problem theoretically as well as from a computational point of view. Frequently, robust controller design techniques result in high-order controllers. It is then interesting to achieve reduced-order models and controllers while maintaining robustness properties. Controllers designed for large structures based on models obtained by finite element techniques yield large state-space dimensions. In this case, problems related to storage, accuracy and computational speed may arise. Thus, model reduction methods capable of addressing controller reduction problems are of primary importance to allow the practical applicability of advanced controller design methods for high-order systems. A challenging large-scale control problem that has emerged recently is the protection of civil structures, such as high-rise buildings and long-span bridges, from dynamic loadings such as earthquakes, high wind, heavy traffic, and deliberate attacks. Even though significant effort has been spent on the application of control theory to the design of civil structures in order to increase their safety and reliability, several challenging issues remain open problems for real-time implementation. This dissertation addresses the development of methodologies for controller reduction for real-time implementation in seismic protection of civil structures using projection methods. Three classes of schemes are analyzed for model and controller reduction: modal truncation, singular value decomposition methods and Krylov-based methods. A family of benchmark problems for structural control is used as a framework for a comparative study of model and controller reduction techniques. It is shown that classical model and controller reduction techniques, such as balanced truncation, modal truncation and moment matching by Krylov techniques, yield reduced-order controllers that do not guarantee stability of the closed-loop system, that is, the reduced-order controller implemented with the full-order plant. A controller reduction approach is proposed that guarantees closed-loop stability. It is based on the concept of dissipativity (or positivity) of linear dynamical systems. Utilizing passivity-preserving model reduction together with dissipative-LQG controllers, effective low-order optimal controllers are obtained. Results are shown through simulations.
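
    For reference, a square-root balanced truncation of a stable LTI system can be written compactly with SciPy; this sketches one of the classical projection methods discussed (not the dissipativity-preserving approach proposed in the dissertation), and the example system is randomly generated.

    ```python
    import numpy as np
    from scipy.linalg import solve_continuous_lyapunov, cholesky, svd

    def balanced_truncation(A, B, C, r):
        """Square-root balanced truncation of a stable LTI system (A, B, C) to order r."""
        P = solve_continuous_lyapunov(A, -B @ B.T)        # controllability Gramian
        Q = solve_continuous_lyapunov(A.T, -C.T @ C)      # observability Gramian
        Lp = cholesky(P, lower=True)
        Lq = cholesky(Q, lower=True)
        U, s, Vt = svd(Lq.T @ Lp)                         # s holds the Hankel singular values
        S = np.diag(1.0 / np.sqrt(s[:r]))
        T = Lp @ Vt[:r].T @ S                             # right projection matrix
        Ti = S @ U[:, :r].T @ Lq.T                        # left projection matrix
        return Ti @ A @ T, Ti @ B, C @ T, s

    # Example: reduce a randomly generated stable 10-state system to order 4.
    rng = np.random.default_rng(1)
    A = rng.standard_normal((10, 10))
    A -= (np.abs(np.linalg.eigvals(A)).max() + 1.0) * np.eye(10)   # shift spectrum into the left half-plane
    B, C = rng.standard_normal((10, 2)), rng.standard_normal((2, 10))
    Ar, Br, Cr, hsv = balanced_truncation(A, B, C, r=4)
    print("Hankel singular values:", np.round(hsv, 4))
    print("reduced system is stable:", bool(np.all(np.linalg.eigvals(Ar).real < 0)))
    ```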

  18. Hardware implementation of CMAC neural network with reduced storage requirement.

    PubMed

    Ker, J S; Kuo, Y H; Wen, R C; Liu, B D

    1997-01-01

    The cerebellar model articulation controller (CMAC) neural network has the advantages of fast convergence speed and low computation complexity. However, it suffers from a low storage space utilization rate on weight memory. In this paper, we propose a direct weight address mapping approach, which can reduce the required weight memory size with a utilization rate near 100%. Based on such an address mapping approach, we developed a pipeline architecture to efficiently perform the addressing operations. The proposed direct weight address mapping approach also speeds up the computation for the generation of weight addresses. Besides, a CMAC hardware prototype used for color calibration has been implemented to confirm the proposed approach and architecture.

  19. The Bern Simple Climate Model (BernSCM) v1.0: an extensible and fully documented open-source re-implementation of the Bern reduced-form model for global carbon cycle-climate simulations

    NASA Astrophysics Data System (ADS)

    Strassmann, Kuno M.; Joos, Fortunat

    2018-05-01

    The Bern Simple Climate Model (BernSCM) is a free open-source re-implementation of a reduced-form carbon cycle-climate model which has been used widely in previous scientific work and IPCC assessments. BernSCM represents the carbon cycle and climate system with a small set of equations for the heat and carbon budget, the parametrization of major nonlinearities, and the substitution of complex component systems with impulse response functions (IRFs). The IRF approach allows cost-efficient yet accurate substitution of detailed parent models of climate system components with near-linear behavior. Illustrative simulations of scenarios from previous multimodel studies show that BernSCM is broadly representative of the range of the climate-carbon cycle response simulated by more complex and detailed models. Model code (in Fortran) was written from scratch with transparency and extensibility in mind, and is provided open source. BernSCM makes scientifically sound carbon cycle-climate modeling available for many applications. Supporting up to decadal time steps with high accuracy, it is suitable for studies with high computational load and for coupling with integrated assessment models (IAMs), for example. Further applications include climate risk assessment in a business, public, or educational context and the estimation of CO2 and climate benefits of emission mitigation options.
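
    The IRF substitution can be illustrated by convolving an emission scenario with a sum-of-exponentials pulse response for atmospheric CO2; the coefficients and scenario below are illustrative assumptions, not BernSCM's calibrated parameters.

    ```python
    import numpy as np

    # Illustrative impulse-response coefficients (fractions and time scales in years).
    A = [0.28, 0.28, 0.26, 0.18]
    TAU = [np.inf, 250.0, 40.0, 6.0]

    def irf(t):
        """Fraction of an emitted CO2 pulse still airborne after t years."""
        return sum(a * (1.0 if np.isinf(tau) else np.exp(-t / tau)) for a, tau in zip(A, TAU))

    dt = 1.0                                     # annual time step
    emissions = np.full(200, 10.0)               # GtC per year, constant illustrative scenario
    years = np.arange(len(emissions)) * dt

    # Convolve emissions with the IRF to get the perturbation of the atmospheric burden.
    burden = np.array([sum(emissions[j] * irf(t - years[j]) * dt
                           for j in range(i + 1)) for i, t in enumerate(years)])
    print(f"airborne perturbation after {years[-1]:.0f} years: {burden[-1]:.0f} GtC")
    ```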

  20. The application of chemical leasing business models in Mexico.

    PubMed

    Schwager, Petra; Moser, Frank

    2006-03-01

    To better address the requirements of the changing multilateral order, the United Nations Industrial Development Organization (UNIDO) Cleaner Production Programme, in 2004, developed the new Sustainable Industrial Resource Management (SIRM) approach. This approach is in accordance with the principles decided at the United Nations Conference on Environment and Development (UNCED) in Rio de Janeiro, Brazil in 1992. Unlike the traditional approaches to environmental management, the SIRM concept captures the idea of achieving sustainable industrial development through the implementation of circular material and energy flows in the entire production chain and reduction of the amount of material and energy used with greater efficiency solutions. The SIRM approach seeks to develop new models to encourage a shift from selling products to supplying services, modifying, in this manner, the supplier/user relationship and resulting in a win-win situation for the economy and the environment. Chemical Leasing represents such a new service-oriented business model and is currently being promoted by UNIDO's Cleaner Production Programme. MAIN FEATURES. One of the potential approaches to address the problems related to ineffective use and over-consumption of chemicals is the development and implementation of Chemical Leasing business models. These provide concrete solutions to the effective management of chemicals and on the ways negative releases to the environment can be reduced. The Chemical Leasing approach is a strategy that addresses the obligations of the changing international chemicals policy by focusing on a more service-oriented strategy. Mexico is one of the countries that were selected for the implementation of UNIDO's demonstration project to promote Chemical Leasing models in the country. The target sector of this project is the chemical industry, which is expected to shift their traditional business concept towards a more service and value-added approach. This is being achieved through the development of company specific business models that implement the above-indicated Chemical Leasing concept with the support from the Mexican National Cleaner Production Centre (NCPC). The implementation of Chemical Leasing in Mexico has proven to be an efficient instrument in enhancing sustainable chemical management and significantly reducing emissions in Mexico. Several companies from the chemical industrial sector implement or agreed to implement chemical leasing business models. Based on the positive findings of the project, several Mexican companies started to negotiate contents of possible Chemical Leasing contracts with suitable business partners. The project further aimed at disseminating information on Chemical Leasing. It successfully attracted globally operating companies in the chemicals sector to explore possibilities to implement Chemical Leasing business models in Mexico. At the international level, the results of the UNIDO project were presented on 20th September 2005 during a side event of the Strategic Approach to International Chemicals Management (SAICM) Preparation Conference in Vienna. To facilitate the promotion and application of Chemical Leasing project at international level, UNIDO is currently developing a number of tools to standardize Chemical Leasing projects. These include, among others, Chemical leasing contract models; Chemical Leasing data base to find partners for chemical leasing; and guidelines to implement Chemical Leasing projects and work programmes.

  1. A computationally fast, reduced model for simulating landslide dynamics and tsunamis generated by landslides in natural terrains

    NASA Astrophysics Data System (ADS)

    Mohammed, F.

    2016-12-01

    Landslide hazards such as fast-moving debris flows, slow-moving landslides, and other mass flows cause numerous fatalities, injuries, and damage. Landslide occurrences in fjords, bays, and lakes can additionally generate tsunamis with locally extremely high wave heights and runups. Two-dimensional depth-averaged models can successfully simulate the entire lifecycle of the three-dimensional landslide dynamics and tsunami propagation efficiently and accurately with the appropriate assumptions. Landslide rheology is defined using viscous fluids, visco-plastic fluids, and granular material to account for the possible landslide source materials. Saturated and unsaturated rheologies are further included to simulate debris flow, debris avalanches, mudflows, and rockslides respectively. The models are obtained by reducing the fully three-dimensional Navier-Stokes equations with the internal rheological definition of the landslide material, the water body, and appropriate scaling assumptions to obtain the depth-averaged two-dimensional models. The landslide and tsunami models are coupled to include the interaction between the landslide and the water body for tsunami generation. The reduced models are solved numerically with a fast semi-implicit finite-volume, shock-capturing based algorithm. The well-balanced, positivity preserving algorithm accurately accounts for wet-dry interface transition for the landslide runout, landslide-water body interface, and the tsunami wave flooding on land. The models are implemented as a General-Purpose computing on Graphics Processing Unit-based (GPGPU) suite of models, either coupled or run independently within the suite. The GPGPU implementation provides up to 1000 times speedup over a CPU-based serial computation. This enables simulations of multiple scenarios of hazard realizations that provides a basis for a probabilistic hazard assessment. The models have been successfully validated against experiments, past studies, and field data for landslides and tsunamis.

  2. Master-slave control with trajectory planning and Bouc-Wen model for tracking control of piezo-driven stage

    NASA Astrophysics Data System (ADS)

    Lu, Xiaojun; Liu, Changli; Chen, Lei

    2018-04-01

    In this paper, a redundant Piezo-driven stage having 3RRR compliant mechanism is introduced, we propose the master-slave control with trajectory planning (MSCTP) strategy and Bouc-Wen model to improve its micro-motion tracking performance. The advantage of the proposed controller lies in that its implementation only requires a simple control strategy without the complexity of modeling to avoid the master PEA's tracking error. The dynamic model of slave PEA system with Bouc-Wen hysteresis is established and identified via particle swarm optimization (PSO) approach. The Piezo-driven stage with operating period T=1s and 2s is implemented to track a prescribed circle. The simulation results show that MSCTP with Bouc-Wen model reduces the trajectory tracking errors to the range of the accuracy of our available measurement.
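
    For orientation, the classical Bouc-Wen hysteresis state equation can be integrated with forward Euler in a few lines; the parameters and the simple output map below are illustrative, not the PSO-identified values of the paper.

    ```python
    import numpy as np

    # Classical Bouc-Wen hysteresis: h_dot = A*x_dot - beta*|x_dot|*|h|**(n-1)*h - gamma*x_dot*|h|**n
    A_bw, beta, gamma, n = 1.0, 0.5, 0.3, 1.0    # illustrative parameters
    k_coef, d_coef = 1.0, 0.7                    # output = k*x - d*h (simple PEA-like map)

    dt = 1e-4
    t = np.arange(0.0, 2.0, dt)
    x = 5.0 * np.sin(2.0 * np.pi * 1.0 * t)      # commanded displacement (um), 1 Hz
    x_dot = np.gradient(x, dt)

    h = np.zeros_like(t)
    for i in range(len(t) - 1):                  # forward-Euler integration of the hysteresis state
        h_dot = (A_bw * x_dot[i]
                 - beta * abs(x_dot[i]) * abs(h[i])**(n - 1) * h[i]
                 - gamma * x_dot[i] * abs(h[i])**n)
        h[i + 1] = h[i] + dt * h_dot

    y = k_coef * x - d_coef * h                  # hysteretic output trajectory
    print(f"max hysteresis state {h.max():.3f}, output range {y.min():.2f} .. {y.max():.2f} um")
    ```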

  3. Sector and Sphere: the design and implementation of a high-performance data cloud

    PubMed Central

    Gu, Yunhong; Grossman, Robert L.

    2009-01-01

    Cloud computing has demonstrated that processing very large datasets over commodity clusters can be done simply, given the right programming model and infrastructure. In this paper, we describe the design and implementation of the Sector storage cloud and the Sphere compute cloud. By contrast with the existing storage and compute clouds, Sector can manage data not only within a data centre, but also across geographically distributed data centres. Similarly, the Sphere compute cloud supports user-defined functions (UDFs) over data both within and across data centres. As a special case, MapReduce-style programming can be implemented in Sphere by using a Map UDF followed by a Reduce UDF. We describe some experimental studies comparing Sector/Sphere and Hadoop using the Terasort benchmark. In these studies, Sector is approximately twice as fast as Hadoop. Sector/Sphere is open source. PMID:19451100
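
    The Map-UDF/Reduce-UDF pattern the abstract refers to can be mimicked in plain Python on local data; this is only an illustration of the programming model, not of Sector/Sphere's distributed execution.

    ```python
    from collections import defaultdict

    def map_udf(segment):
        """Map UDF: turn one data segment (a line of text) into (key, value) pairs."""
        return [(word.lower(), 1) for word in segment.split()]

    def reduce_udf(key, values):
        """Reduce UDF: combine all values that share a key."""
        return key, sum(values)

    segments = ["Sector stores data across data centres",
                "Sphere runs user defined functions over the data"]

    # Shuffle step: group mapped pairs by key, then apply the Reduce UDF per key.
    grouped = defaultdict(list)
    for segment in segments:
        for key, value in map_udf(segment):
            grouped[key].append(value)

    counts = dict(reduce_udf(k, v) for k, v in grouped.items())
    print(counts["data"])   # -> 3
    ```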

  4. Fast and robust control of nanopositioning systems: Performance limits enabled by field programmable analog arrays.

    PubMed

    Baranwal, Mayank; Gorugantu, Ram S; Salapaka, Srinivasa M

    2015-08-01

    This paper aims at control design and its implementation for robust high-bandwidth precision (nanoscale) positioning systems. Even though modern model-based control theoretic designs for robust broadband high-resolution positioning have enabled orders of magnitude improvement in performance over existing model independent designs, their scope is severely limited by the inefficacies of digital implementation of the control designs. High-order control laws that result from model-based designs typically have to be approximated with reduced-order systems to facilitate digital implementation. Digital systems, even those that have very high sampling frequencies, provide low effective control bandwidth when implementing high-order systems. In this context, field programmable analog arrays (FPAAs) provide a good alternative to the use of digital-logic based processors since they enable very high implementation speeds, moreover with cheaper resources. The superior flexibility of digital systems in terms of the implementable mathematical and logical functions does not give significant edge over FPAAs when implementing linear dynamic control laws. In this paper, we pose the control design objectives for positioning systems in different configurations as optimal control problems and demonstrate significant improvements in performance when the resulting control laws are applied using FPAAs as opposed to their digital counterparts. An improvement of over 200% in positioning bandwidth is achieved over an earlier digital signal processor (DSP) based implementation for the same system and same control design, even when for the DSP-based system, the sampling frequency is about 100 times the desired positioning bandwidth.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simpson, L.; Britt, J.; Birkmire, R.

    ITN Energy Systems, Inc., and Global Solar Energy, Inc., assisted by NREL's PV Manufacturing R&D program, have continued to advance CIGS production technology by developing trajectory-oriented predictive/control models, fault-tolerance control, control platform development, in-situ sensors, and process improvements. Modeling activities included developing physics-based and empirical models for CIGS and sputter-deposition processing, implementing model-based control, and applying predictive models to the construction of new evaporation sources and for control. Model-based control is enabled by implementing reduced or empirical models into a control platform. Reliability improvement activities include implementing preventive maintenance schedules; detecting failed sensors/equipment and reconfiguring to continue processing; and systematic development of fault prevention and reconfiguration strategies for the full range of CIGS PV production deposition processes. In-situ sensor development activities have resulted in improved control and indicated the potential for enhanced process status monitoring and control of the deposition processes. Substantial process improvements have been made, including significant improvement in CIGS uniformity, thickness control, efficiency, yield, and throughput. In large measure, these gains have been driven by process optimization, which in turn has been enabled by control and reliability improvements due to this PV Manufacturing R&D program.

  6. Parallel processing optimization strategy based on MapReduce model in cloud storage environment

    NASA Astrophysics Data System (ADS)

    Cui, Jianming; Liu, Jiayi; Li, Qiuyan

    2017-05-01

    Currently, many cloud storage workflows package a large number of documents only after all packets have been received. In this procedure, packing at the local transmitter and unpacking at the server consume a lot of time, and the transmission efficiency is low as well. A new parallel processing algorithm is proposed to optimize the transmission mode. Following the MapReduce working model, MPI technology is used to execute the Mapper and Reducer mechanisms in parallel. Simulation experiments on the Hadoop cloud computing platform show that this algorithm not only accelerates the file transfer rate, but also shortens the waiting time of the Reducer mechanism. It breaks through traditional sequential transmission constraints and reduces storage coupling to improve transmission efficiency.
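
    A minimal sketch of MPI-parallel Mapper and Reducer stages, assuming mpi4py is available (run with mpiexec); the word-count workload and data are invented for illustration and do not reflect the paper's implementation.

    ```python
    # Run with, e.g.: mpiexec -n 4 python this_script.py
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()

    if rank == 0:
        lines = ["packet one of the file", "packet two of the file",
                 "more packets arrive", "and the transfer keeps going"] * size
        chunks = [lines[i::size] for i in range(size)]   # split the input among ranks
    else:
        chunks = None

    local_lines = comm.scatter(chunks, root=0)           # distribute work (Mapper input)

    local_counts = {}                                    # Mapper: count words in the local chunk
    for line in local_lines:
        for word in line.split():
            local_counts[word] = local_counts.get(word, 0) + 1

    all_counts = comm.gather(local_counts, root=0)       # collect partial results at the Reducer

    if rank == 0:                                        # Reducer: merge the partial dictionaries
        totals = {}
        for part in all_counts:
            for word, count in part.items():
                totals[word] = totals.get(word, 0) + count
        print(totals["the"])
    ```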

  7. Assessing the effects of noise abatement measures on health risks: A case study in Istanbul

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ongel, Aybike, E-mail: aybike.ongel@eng.bahcesehir.edu.tr; Sezgin, Fatih, E-mail: fatih.sezgin@ibb.gov.tr

    In recent decades, noise pollution caused by industrialization and increased motorization has become a major concern around the world because of its adverse effects on human well-being. Therefore, transportation agencies have been implementing noise abatement measures in order to reduce road traffic noise. However, limited attention is given to noise in environmental assessment of road transportation systems. This paper presents a framework for a health impact assessment model for road transportation noise emissions. The model allows noise impacts to be addressed with the health effects of air pollutant and greenhouse gas emissions from road transportation. The health damages assessed in the model include annoyance, sleep disturbance, and cardiovascular disease in terms of acute myocardial infarction. The model was applied in a case study in Istanbul in order to evaluate the change in health risks from the implementation of noise abatement strategies. The noise abatement strategies evaluated include altering pavement surfaces in order to absorb noise and introducing speed limits. It was shown that significant improvements in health risks can be achieved using open graded pavement surfaces and introducing speed limits on highways. - Highlights: • Transportation noise has a significant effect on health. • Noise should be included in the environmental assessment of transportation systems. • Traffic noise abatement measures include noise reducing pavements and speed limits. • Noise abatement measures help reduce the health risks of transportation noise. • Speed limit reduction on uncongested roads is an effective way to reduce health risks.

  8. Aeroelastic simulation of higher harmonic control

    NASA Technical Reports Server (NTRS)

    Robinson, Lawson H.; Friedmann, Peretz P.

    1994-01-01

    This report describes the development of an aeroelastic analysis of a helicopter rotor and its application to the simulation of helicopter vibration reduction through higher harmonic control (HHC). An improved finite-state, time-domain model of unsteady aerodynamics is developed to capture high frequency aerodynamic effects. An improved trim procedure is implemented which accounts for flap, lead-lag, and torsional deformations of the blade. The effect of unsteady aerodynamics is studied and it is found that its impact on blade aeroelastic stability and low frequency response is small, but it has a significant influence on rotor hub vibrations. Several different HHC algorithms are implemented on a hingeless rotor and their effectiveness in reducing hub vibratory shears is compared. All the controllers are found to be quite effective, but very differing HHC inputs are required depending on the aerodynamic model used. Effects of HHC on rotor stability and power requirements are found to be quite small. Simulations of roughly equivalent articulated and hingeless rotors are carried out, and it is found that hingeless rotors can require considerably larger HHC inputs to reduce vibratory shears. This implies that the practical implementation of HHC on hingeless rotors might be considerably more difficult than on articulated rotors.

  9. Modular Bundle Adjustment for Photogrammetric Computations

    NASA Astrophysics Data System (ADS)

    Börlin, N.; Murtiyoso, A.; Grussenmeyer, P.; Menna, F.; Nocerino, E.

    2018-05-01

    In this paper we investigate how the residuals in bundle adjustment can be split into a composition of simple functions. According to the chain rule, the Jacobian (linearisation) of the residual can be formed as a product of the Jacobians of the individual steps. When implemented, this enables a modularisation of the computation of the bundle adjustment residuals and Jacobians where each component has limited responsibility. This enables simple replacement of components to e.g. implement different projection or rotation models by exchanging a module. The technique has previously been used to implement bundle adjustment in the open-source package DBAT (Börlin and Grussenmeyer, 2013) based on the Photogrammetric and Computer Vision interpretations of the Brown (1971) lens distortion model. In this paper, we applied the technique to investigate how affine distortions can be used to model the projection of a tilt-shift lens. Two extended distortion models were implemented to test the hypothesis that the ordering of the affine and lens distortion steps can be changed to reduce the size of the residuals of a tilt-shift lens calibration. Results on synthetic data confirm that the ordering of the affine and lens distortion steps matters and is detectable by DBAT. However, when applied to a real camera calibration data set of a tilt-shift lens, no difference between the extended models was seen. This suggests that the tested hypothesis is false and that other effects need to be modelled to better explain the projection. The relatively low implementation effort that was needed to generate the models suggests that the technique can be used to investigate other novel projection models in photogrammetry, including modelling changes in the 3D geometry to better understand the tilt-shift lens.
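
    The chain-rule composition can be illustrated with a toy residual split into camera-transform, projection, and radial-distortion modules, each returning its value and its own Jacobian; the total Jacobian (here with respect to the 3D point only) is the product of the per-step Jacobians. The modules below are simplified stand-ins, not DBAT's.

    ```python
    import numpy as np

    R = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])   # camera rotation
    t = np.array([0.1, -0.2, 5.0])                                        # camera translation
    k1, obs = 1e-3, np.array([0.02, -0.41])                               # distortion, observation

    def world_to_cam(X):                       # step 1: rigid transform, Jacobian is R
        return R @ X + t, R
    def project(p):                            # step 2: pinhole projection
        x, y, z = p
        return np.array([x / z, y / z]), np.array([[1/z, 0, -x/z**2], [0, 1/z, -y/z**2]])
    def distort(u):                            # step 3: one-term radial distortion
        r2 = u @ u
        return u * (1 + k1 * r2), (1 + k1 * r2) * np.eye(2) + 2 * k1 * np.outer(u, u)

    def residual_and_jacobian(X):
        p, J1 = world_to_cam(X)
        u, J2 = project(p)
        d, J3 = distort(u)
        return d - obs, J3 @ J2 @ J1           # chain rule: J = J3 * J2 * J1

    X0 = np.array([1.0, 2.0, 3.0])
    r, J = residual_and_jacobian(X0)
    # Check the analytic Jacobian against finite differences.
    J_fd = np.column_stack([(residual_and_jacobian(X0 + h)[0] - r) / 1e-6
                            for h in 1e-6 * np.eye(3)])
    print("max Jacobian error:", np.abs(J - J_fd).max())
    ```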

  10. Implementing school nursing strategies to reduce LGBTQ adolescent suicide: a randomized cluster trial study protocol.

    PubMed

    Willging, Cathleen E; Green, Amy E; Ramos, Mary M

    2016-10-22

    Reducing youth suicide in the United States (U.S.) is a national public health priority, and lesbian, gay, bisexual, transgender, and queer or questioning (LGBTQ) youth are at elevated risk. The Centers for Disease Control and Prevention (CDC) endorses six evidence-based (EB) strategies that center on meeting the needs of LGBTQ youth in schools; however, fewer than 6 % of U.S. schools implement all of them. The proposed intervention model, "RLAS" (Implementing School Nursing Strategies to Reduce LGBTQ Adolescent Suicide), builds on the Exploration, Preparation, Implementation, and Sustainment (EPIS) conceptual framework and the Dynamic Adaptation Process (DAP) to implement EB strategies in U.S. high schools. The DAP accounts for the multilevel context of school settings and uses Implementation Resource Teams (IRTs) to facilitate appropriate expertise, advise on acceptable adaptations, and provide data feedback to make schools implementation ready and prepared to sustain changes. Mixed methods will be used to examine individual, school, and community factors influencing both implementation process and youth outcomes. A cluster randomized controlled trial will assess whether LGBTQ students and their peers in RLAS intervention schools (n = 20) report reductions in suicidality, depression, substance use, bullying, and truancy related to safety concerns compared to those in usual care schools (n = 20). Implementation progress and fidelity for each EB strategy in RLAS intervention schools will be examined using a modified version of the Stages of Implementation Completion checklist. During the implementation and sustainment phases, annual focus groups will be conducted with the 20 IRTs to document their experiences identifying and advancing adaptation supports to facilitate use of EB strategies and their perceptions of the DAP. The DAP represents a data-informed, collaborative, multiple stakeholder approach to progress from exploration to sustainment and obtain fidelity during the implementation of EB strategies in school settings. This study is designed to address the real-world implications of enabling the use of EB strategies by school nurses with the goal of decreasing suicide and youth risk behaviors among LGBTQ youth. Through its participatory processes to refine and sustain EB strategies in high schools, the RLAS represents a novel contribution to implementation science. ClinicalTrials.gov, NCT02875535.

  11. Implications of differentiated care for successful ART scale-up in a concentrated HIV epidemic in Yangon, Myanmar.

    PubMed

    Mesic, Anita; Fontaine, Julie; Aye, Theingy; Greig, Jane; Thwe, Thin Thin; Moretó-Planas, Laura; Kliesckova, Jarmila; Khin, Khin; Zarkua, Nana; Gonzalez, Lucia; Guillergan, Erwin Lloyd; O'Brien, Daniel P

    2017-07-21

    The National AIDS Programme in Myanmar has made significant progress in scaling up antiretroviral treatment (ART) services and recognizes the importance of differentiated care for people living with HIV. Indeed, long centred around the hospital and reliant on physicians, the country's HIV response is undergoing a process of successful decentralization, with HIV care increasingly being integrated into other health services as part of a systematic effort to expand access to HIV treatment. This study describes the implementation of differentiated care in Médecins Sans Frontières (MSF)-supported programmes and reports its outcomes. A descriptive cohort analysis of adult patients on antiretroviral treatment was performed. We assessed the stability of patients as of 31 December 2014 and introduced an intervention of reduced frequency of physician consultations for stable patients, together with fast-track ART refills. We measured the number of physician visits saved as a result of this intervention. The main outcomes (remaining in care, death, loss to follow-up and treatment failure) were assessed on 31 December 2015 and reported as rates for the different stable groups. On 31 December 2014, our programme counted 16,272 adult patients enrolled in HIV care, of whom 80.34% were stable. The model allowed for an increase in the average number of patients one medical team could care for - from 745 patients in 2011 to 1,627 in 2014 - and, thus, a reduction in the number of teams needed. An assessment of stable patients enrolled on ART one year after the implementation of the new model revealed excellent outcomes, aggregated for stable patients as 98.7% remaining in care, 0.4% dead, 0.8% lost to follow-up, 0.8% with clinical treatment failure and 5.8% with immunological treatment failure. Implementation of a differentiated model reduced the number of visits between stable clients and physicians, reduced the medical resources required for treatment and enabled integrated treatment of the main co-morbidities. We hope that these findings will encourage other stakeholders to implement innovative models of HIV care in Myanmar, further expediting the scale-up of ART services, the decentralization of treatment and the integration of care for the main HIV co-morbidities in this context.

  12. Reducing social inequalities in health: work-related strategies.

    PubMed

    Siegrist, Johannes

    2002-01-01

    Despite reduced health risks in terms of physical and chemical hazards, current trends in occupational life continue to contribute to ill health and disease among economically active people. Stress at work plays a crucial role in this respect, as evidenced by recent scientific progress. This paper discusses two leading theoretical models of work-related stress, the demand-control model and the model of effort-reward imbalance, and it summarizes available evidence on adverse health effects. As work stress in terms of these models is more prevalent among lower socioeconomic status groups, these conditions contribute to the explanation of socially graded risks of morbidity and mortality in midlife. Implications of this new knowledge for the design and implementation of worksite health-promotion measures are elaborated. In conclusion, it is argued that workplace strategies deserve high priority on any agenda that aims at reducing social inequalities in health.

  13. Modeling Firing Range Best Management Practices with TREECS (trademark)

    DTIC Science & Technology

    2013-06-01

    is reduced. Phytoremediation can also be considered a source treatment BMP when plants uptake and transform the MC into other chemical forms that are...the amendment, and the local soil conditions. Phytoremediation includes phytoextraction, phytostabilization, and phytotransformation...model can then be run as usual while taking into account the phytoremediation of the MC. Phytotransformation is being implemented as an option in

  14. Development and Implementation of an Optimization Model for Hydropower and Total Dissolved Gas in the Mid-Columbia River System

    DOE PAGES

    Witt, Adam; Magee, Timothy; Stewart, Kevin; ...

    2017-08-10

    Managing energy, water, and environmental priorities and constraints within a cascade hydropower system is a challenging multiobjective optimization effort that requires advanced modeling and forecasting tools. Within the mid-Columbia River system, there is currently a lack of specific solutions for predicting how coordinated operational decisions can mitigate the impacts of total dissolved gas (TDG) supersaturation while satisfying multiple additional policy and hydropower generation objectives. In this study, a reduced-order TDG uptake equation is developed that predicts tailrace TDG at seven hydropower facilities on the mid-Columbia River. The equation is incorporated into a general multiobjective river, reservoir, and hydropower optimization tool as a prioritized operating goal within a broader set of system-level objectives and constraints. A test case is presented to assess the response of TDG and hydropower generation when TDG supersaturation is optimized to remain under state water-quality standards. Satisfaction of TDG as an operating goal is highly dependent on whether constraints that limit TDG uptake are implemented at a higher priority than generation requests. According to the model, an opportunity exists to reduce TDG supersaturation and meet hydropower generation requirements by shifting spillway flows to different time periods. In conclusion, a coordinated effort between all project owners is required to implement systemwide optimized solutions that satisfy the operating policies of all stakeholders.

  15. Development and Implementation of an Optimization Model for Hydropower and Total Dissolved Gas in the Mid-Columbia River System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Witt, Adam; Magee, Timothy; Stewart, Kevin

    Managing energy, water, and environmental priorities and constraints within a cascade hydropower system is a challenging multiobjective optimization effort that requires advanced modeling and forecasting tools. Within the mid-Columbia River system, there is currently a lack of specific solutions for predicting how coordinated operational decisions can mitigate the impacts of total dissolved gas (TDG) supersaturation while satisfying multiple additional policy and hydropower generation objectives. In this study, a reduced-order TDG uptake equation is developed that predicts tailrace TDG at seven hydropower facilities on the mid-Columbia River. The equation is incorporated into a general multiobjective river, reservoir, and hydropower optimization tool as a prioritized operating goal within a broader set of system-level objectives and constraints. A test case is presented to assess the response of TDG and hydropower generation when TDG supersaturation is optimized to remain under state water-quality standards. Satisfaction of TDG as an operating goal is highly dependent on whether constraints that limit TDG uptake are implemented at a higher priority than generation requests. According to the model, an opportunity exists to reduce TDG supersaturation and meet hydropower generation requirements by shifting spillway flows to different time periods. In conclusion, a coordinated effort between all project owners is required to implement systemwide optimized solutions that satisfy the operating policies of all stakeholders.

  16. Tuberculosis Control in South African Gold Mines: Mathematical Modeling of a Trial of Community-Wide Isoniazid Preventive Therapy

    PubMed Central

    Vynnycky, Emilia; Sumner, Tom; Fielding, Katherine L.; Lewis, James J.; Cox, Andrew P.; Hayes, Richard J.; Corbett, Elizabeth L.; Churchyard, Gavin J.; Grant, Alison D.; White, Richard G.

    2015-01-01

    A recent major cluster randomized trial of screening, active disease treatment, and mass isoniazid preventive therapy for 9 months during 2006–2011 among South African gold miners showed reduced individual-level tuberculosis incidence but no detectable population-level impact. We fitted a dynamic mathematical model to trial data and explored 1) factors contributing to the lack of population-level impact, 2) the best-achievable impact if all implementation characteristics were increased to the highest level achieved during the trial (“optimized intervention”), and 3) how tuberculosis might be better controlled with additional interventions (improving diagnostics, reducing treatment delay, providing isoniazid preventive therapy continuously to human immunodeficiency virus–positive people, or scaling up antiretroviral treatment coverage) individually and in combination. We found the following: 1) The model suggests that a small proportion of latent infections among human immunodeficiency virus–positive people were cured, which could have been a key factor explaining the lack of detectable population-level impact. 2) The optimized implementation increased impact by only 10%. 3) Implementing additional interventions individually and in combination led to up to 30% and 75% reductions, respectively, in tuberculosis incidence after 10 years. Tuberculosis control requires a combination prevention approach, including health systems strengthening to minimize treatment delay, improving diagnostics, increased antiretroviral treatment coverage, and effective preventive treatment regimens. PMID:25792607

  17. Implementing chronic care for COPD: planned visits, care coordination, and patient empowerment for improved outcomes.

    PubMed

    Fromer, Len

    2011-01-01

    Current primary care patterns for chronic obstructive pulmonary disease (COPD) focus on reactive care for acute exacerbations, often neglecting ongoing COPD management to the detriment of patient experience and outcomes. Proactive diagnosis and ongoing multifactorial COPD management, comprising smoking cessation, influenza and pneumonia vaccinations, pulmonary rehabilitation, and symptomatic and maintenance pharmacotherapy according to severity, can significantly improve a patient's health-related quality of life, reduce exacerbations and their consequences, and alleviate the functional, utilization, and financial burden of COPD. Redesign of primary care according to principles of the chronic care model, which is implemented in the patient-centered medical home, can shift COPD management from acute rescue to proactive maintenance. The chronic care model and patient-centered medical home combine delivery system redesign, clinical information systems, decision support, and self-management support within a practice, linked with health care organization and community resources beyond the practice. COPD care programs implementing two or more chronic care model components effectively reduce emergency room and inpatient utilization. This review guides primary care practices in improving COPD care workflows, highlighting the contributions of multidisciplinary collaborative team care, care coordination, and patient engagement. Each primary care practice can devise a COPD care workflow addressing risk awareness, spirometric diagnosis, guideline-based treatment and rehabilitation, and self-management support, to improve patient outcomes in COPD.

  18. Projected Impact of Salt Restriction on Prevention of Cardiovascular Disease in China: A Modeling Study

    PubMed Central

    Liu, Jing; Coxson, Pamela G.; Penko, Joanne; Goldman, Lee; Bibbins-Domingo, Kirsten; Zhao, Dong

    2016-01-01

    Objectives To estimate the effects of achieving China's national goals for dietary salt (NaCl) reduction or implementing culturally-tailored dietary salt restriction strategies on cardiovascular disease (CVD) prevention. Methods The CVD Policy Model was used to project blood pressure lowering and subsequent downstream prevented CVD that could be achieved by population-wide salt restriction in China. Outcomes were annual CVD events prevented, relative reductions in rates of CVD incidence and mortality, quality-adjusted life-years (QALYs) gained, and CVD treatment costs saved. Results Reducing mean dietary salt intake to 9.0 g/day gradually over 10 years could prevent approximately 197 000 incident annual CVD events [95% uncertainty interval (UI): 173 000–219 000], reduce annual CVD mortality by approximately 2.5% (2.2–2.8%), gain 303 000 annual QALYs (278 000–329 000), and save approximately 1.4 billion international dollars (Int$) in annual CVD costs (Int$; 1.2–1.6 billion). Reducing mean salt intake to 6.0 g/day could approximately double these benefits. Implementing cooking salt-restriction spoons could prevent approximately 183 000 incident CVD cases (153 000–215 000) and avoid Int$1.4 billion in CVD treatment costs annually (1.2–1.7 billion). Implementing a cooking salt substitute strategy could lead to approximately three times the health benefits of the salt-restriction spoon program. More than three-quarters of benefits from any dietary salt reduction strategy would be realized in hypertensive adults. Conclusion China could derive substantial health gains from implementation of population-wide dietary salt reduction policies. Most health benefits from any dietary salt reduction program would be realized in adults with hypertension. PMID:26840409

  19. Using overbooking to manage no-shows in an Italian healthcare center.

    PubMed

    Parente, Chiara Anna; Salvatore, Domenico; Gallo, Giampiero Maria; Cipollini, Fabrizio

    2018-03-15

    In almost all healthcare systems, no-shows (scheduled appointments missed without any notice from patients) have a negative impact on waiting lists, costs and resource utilization, impairing the quality and quantity of care that can be provided, as well as the revenues from the corresponding activity. Overbooking is a tool healthcare providers can use to reduce the impact of no-shows. We develop an overbooking algorithm, and we assess its effectiveness using two methods: an analysis of the data coming from a practical implementation in a healthcare center, and a simulation experiment to check the robustness and the potential of the strategy under different conditions. The data for the study, which include personal and administrative information on patients together with their scheduled and attended examinations, were taken from the electronic database of a large outpatient center. Attention was focused on the Magnetic Resonance (MR) ward because it uses expensive equipment, its services require long execution times, and the center had actually used it to implement an overbooking strategy aimed at reducing the impact of no-shows. We propose a statistical model for patients' show/no-show behavior and we evaluate the ensuing overbooking procedure implemented in the MR ward. Finally, a simulation study investigates the effects of the overbooking strategy under different scenarios. The first contribution is a list of variables identifying the factors that perform best in predicting no-shows. We classified the variables into three groups: "Patient's intrinsic factors", "Exogenous factors" and "Factors associated with the examination". The second contribution is a predictive model of no-shows, estimated on context-specific data using the variables just discussed; such a model is a fundamental ingredient of the overbooking strategy we propose to reduce the negative effects of no-shows. The third contribution is the assessment of that strategy by means of a simulation study under different scenarios in terms of number of resources and no-show rates. The same overbooking strategy was also implemented in practice (allowing it to be treated as a quasi-experiment) to reduce the negative impact caused by non-attendance in the MR ward. Both the quasi-experiment and the simulation study demonstrated that the strategy improved the center's productivity and reduced the idle time of resources, although it slightly increased patients' waiting time and the staff's overtime. This is evidence that overbooking can improve the management of healthcare centers without adversely affecting their costs or the quality of care offered. We show that a well-designed overbooking procedure can improve the management of medical centers, in terms of a significant increase in revenue, while keeping patients' waiting time and staff overtime under control. This was demonstrated by the results of a quasi-experiment (practical implementation of the strategy in the MR ward) and a simulation study (under different scenarios). These positive results benefited from a predictive model of no-shows carefully designed around the medical center's data.
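
    As a hedged illustration of how such a predictive model can drive an overbooking rule (this is not the authors' model; the features, synthetic data and capacity rule are assumptions), one could fit a logistic regression for show/no-show behaviour and overbook a session while the expected number of attendees stays below the number of available slots:

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(0)
      n = 500
      X = np.column_stack([
          rng.integers(18, 90, n),     # patient age
          rng.integers(0, 2, n),       # had a previous no-show (0/1)
          rng.integers(1, 60, n),      # days between booking and examination
      ])
      # Synthetic labels: prior no-shows and long lead times raise no-show risk.
      logit = -2.0 + 1.2 * X[:, 1] + 0.03 * X[:, 2]
      y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)   # 1 = no-show

      model = LogisticRegression(max_iter=1000).fit(X, y)

      def can_overbook(booked_patients, n_slots):
          """Allow another booking while expected attendance is below capacity."""
          p_show = 1.0 - model.predict_proba(booked_patients)[:, 1]
          return p_show.sum() < n_slots

      # Example: a 10-slot MR session currently holding 11 bookings.
      session = X[:11]
      print(can_overbook(session, n_slots=10))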

  20. Implementation of Multi-Agent Object Attention System Based on Biologically Inspired Attractor Selection

    NASA Astrophysics Data System (ADS)

    Hashimoto, Ryoji; Matsumura, Tomoya; Nozato, Yoshihiro; Watanabe, Kenji; Onoye, Takao

    A multi-agent object attention system is proposed, based on a biologically inspired attractor selection model. Object attention is facilitated by using a video sequence and a depth map obtained through the compound-eye image sensor TOMBO. Robustness of the multi-agent system to environmental changes is enhanced by utilizing the biological model of adaptive response by attractor selection. To implement the proposed system, an efficient VLSI architecture is employed that reduces the enormous computational costs and memory accesses required for depth map processing and the multi-agent attractor selection process. According to the FPGA implementation result of the proposed object attention system, which occupies 7,063 slices, 640×512-pixel input images can be processed in real time with three agents at a rate of 9 fps at 48 MHz operation.
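
    The attractor selection principle referred to above can be sketched in a few lines: when a measure of "activity" (how well the current attractor serves the environment) is high, deterministic dynamics dominate and the state stays near its attractor; when activity drops after an environmental change, noise dominates and the state can wander to a better attractor. The following is a simplified toy version under assumed dynamics, not the paper's VLSI or FPGA design.

      import numpy as np

      rng = np.random.default_rng(1)
      attractors = np.array([[1.0, 0.0], [0.0, 1.0]])   # two candidate attention targets
      x = np.array([0.5, 0.5])
      dt, noise_level = 0.05, 0.3

      def activity(state, good_target):
          """Assumed fitness: high when the state is near the currently 'good' target."""
          return np.exp(-np.linalg.norm(state - attractors[good_target]))

      good = 0
      for step in range(400):
          if step == 200:
              good = 1                      # environment change: the other target becomes good
          a = activity(x, good)
          nearest = attractors[np.argmin(np.linalg.norm(attractors - x, axis=1))]
          drift = a * (nearest - x)         # deterministic pull, scaled by activity
          x = x + dt * drift + np.sqrt(dt) * noise_level * (1.0 - a) * rng.normal(size=2)
      # After the change, low activity lets noise push x toward the newly good attractor.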

  1. Evaluation of the multifunctional worker role: a stakeholder analysis.

    PubMed

    Jones, K R; Redman, R W; VandenBosch, T M; Holdwick, C; Wolgin, F

    1999-01-01

    Health care organizations are rethinking how care is delivered because of incentives generated by managed care and a competitive marketplace. An evaluation of a work redesign project that involved the creation of redesigned unlicensed caregiver roles is described. The effect of model implementation on patients, multiple categories of caregivers, and physicians was measured using several different approaches to data collection. In this evaluation, caregivers perceived the institutional culture to be both market-driven and hierarchical. The work redesign, along with significant changes in unit configuration and leadership over the same period, significantly reduced job security and satisfaction with supervision. Quality indicators suggested short-term declines in quality during model implementation with higher levels of quality after implementation issues were resolved. Objective measurement of the outcomes of work redesign initiatives is imperative to assure appropriate adjustments and responses to caregiver concerns.

  2. A Model Independent S/W Framework for Search-Based Software Testing

    PubMed Central

    Baik, Jongmoon

    2014-01-01

    In the Model-Based Testing (MBT) area, Search-Based Software Testing (SBST) has been employed to generate test cases from the model of a system under test. However, many types of models have been used in MBT. If the type of model changes from one to another, all functions of a search technique must be reimplemented, because the model types are different even when the same search technique is applied. Implementing the same algorithm over and over again requires too much time and effort. We propose a model-independent software framework for SBST, which can reduce this redundant work. The framework provides a reusable common software platform to reduce time and effort. The software framework not only presents design patterns to find test cases for a target model but also reduces development time by using common functions provided in the framework. We show the effectiveness and efficiency of the proposed framework with two case studies. The framework improves productivity by about 50% when the type of model is changed. PMID:25302314
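
    The core idea, writing the search technique once against an abstract model interface so that only a thin adapter changes when the model type changes, can be sketched as follows. The interface methods and the toy adapter are assumptions for illustration; they are not the framework's actual API.

      import random
      from abc import ABC, abstractmethod

      class ModelAdapter(ABC):
          """Model-specific operations that any search technique can rely on."""
          @abstractmethod
          def random_candidate(self): ...
          @abstractmethod
          def neighbours(self, candidate): ...
          @abstractmethod
          def fitness(self, candidate): ...   # lower is better, e.g. a branch distance

      def hill_climb(model, iterations=2000):
          """Generic search technique, reusable for any ModelAdapter implementation."""
          best = model.random_candidate()
          best_fit = model.fitness(best)
          for _ in range(iterations):
              candidates = model.neighbours(best)
              fits = [model.fitness(c) for c in candidates]
              i = min(range(len(fits)), key=fits.__getitem__)
              if fits[i] >= best_fit:
                  break                       # local optimum reached
              best, best_fit = candidates[i], fits[i]
          return best, best_fit

      class ToyGuardAdapter(ModelAdapter):
          """Hypothetical adapter: find an input that satisfies the guard x == 42."""
          def random_candidate(self):
              return random.randint(-1000, 1000)
          def neighbours(self, c):
              return [c - 1, c + 1]
          def fitness(self, c):
              return abs(c - 42)

      best, fit = hill_climb(ToyGuardAdapter())

    Swapping, for instance, a state-machine model for a feature model would then mean writing a new adapter rather than re-implementing the search algorithm.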

  3. Evaluation of AHRQ's on-time pressure ulcer prevention program: a facilitator-assisted clinical decision support intervention for nursing homes.

    PubMed

    Olsho, Lauren E W; Spector, William D; Williams, Christianna S; Rhodes, William; Fink, Rebecca V; Limcangco, Rhona; Hurd, Donna

    2014-03-01

    Pressure ulcers present serious health and economic consequences for nursing home residents. The Agency for Healthcare Research & Quality, in partnership with the New York State Department of Health, implemented the pressure ulcer module of On-Time Quality Improvement for Long Term Care (On-Time), a clinical decision support intervention to reduce pressure ulcer incidence rates. To evaluate the effectiveness of the On-Time program in reducing the rate of in-house-acquired pressure ulcers among nursing home residents. We employed an interrupted time-series design to identify impacts of 4 core On-Time program components on resident pressure ulcer incidence in 12 New York State nursing homes implementing the intervention (n=3463 residents). The sample was purposively selected to include nursing homes with high baseline prevalence and incidence of pressure ulcers and high motivation to reduce pressure ulcers. Differential timing and sequencing of 4 core On-Time components across intervention nursing homes and units enabled estimation of separate impacts for each component. Inclusion of a nonequivalent comparison group of 13 nursing homes not implementing On-Time (n=2698 residents) accounts for potential mean-reversion bias. Impacts were estimated via a random-effects Poisson model including resident-level and facility-level covariates. We find a large and statistically significant reduction in pressure ulcer incidence associated with the joint implementation of 4 core On-Time components (incidence rate ratio=0.409; P=0.035). Impacts vary with implementation of specific component combinations. On-Time implementation is associated with sizable reductions in pressure ulcer incidence.

  4. Bayesian Modeling of the Assimilative Capacity Component of Stream Nutrient Export

    EPA Science Inventory

    Implementing stream restoration techniques and best management practices to reduce nonpoint source nutrients implies enhancement of the assimilative capacity for the stream system. In this paper, a Bayesian method for evaluating this component of a TMDL load capacity is developed...

  5. Optimal implementation of green infrastructure practices to reduce adverse impacts of urban areas on hydrology and water quality

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Collingsworth, P.; Pijanowski, B. C.; Engel, B.

    2016-12-01

    Nutrient loading from the Maumee River watershed is a significant reason for the harmful algal bloom (HAB) problem in Lake Erie. Although studies have explored strategies to reduce nutrient loading from agricultural areas in the Maumee River watershed, the nutrient loading in urban areas also needs to be reduced. Green infrastructure practices are popular approaches for stormwater management and useful for improving hydrology and water quality. In this study, the Long-Term Hydrologic Impact Assessment-Low Impact Development 2.1 (L-THIA-LID 2.1) model was used to determine how different strategies for implementing green infrastructure practices can be optimized to reduce impacts on hydrology and water quality in an urban watershed in the upper Maumee River system. Community inputs, such as the types of green infrastructure practices of greatest interest and environmental concerns for the community, were also considered during the study. Based on community input, the following environmental concerns were considered: runoff volume, Total Suspended Solids (TSS), Total Phosphorous (TP), Total Kjeldahl Nitrogen (TKN), and Nitrate+Nitrite (NOx); green infrastructure practices of interest included rain barrel, cistern, green roof, permeable patio, porous pavement, grassed swale, bioretention system, grass strip, wetland channel, detention basin, retention pond, and wetland basin. Spatial optimization of green infrastructure practice implementation was conducted to maximize environmental benefits while minimizing the cost of implementation. The green infrastructure practice optimization results can be used by the community to solve hydrology and water quality problems.

  6. Outcomes of acutely ill older hospitalized patients following implementation of tailored models of care: a repeated measures (pre- and post-intervention) design.

    PubMed

    Chang, Esther; Hancock, Karen; Hickman, Louise; Glasson, Janet; Davidson, Patricia

    2007-09-01

    There is a lack of research investigating models of nursing care for older hospitalised patients that address the nursing needs of this group. The objective of this study is to evaluate the efficacy of models of care for acutely ill older patients tailored to two contexts: an aged care specific ward and a medical ward. This is a repeated measures design. Efficacy of the models was evaluated in terms of: patients' and nurses' satisfaction with care provided; increased activities of daily living; reduced unplanned hospital readmissions; and medication knowledge. An aged care specific ward and a medical ward in two Sydney teaching hospitals. There were two groups of patients aged 65 years or older who were admitted to hospital for an acute illness: those admitted prior to model implementation (n=232) and those admitted during model implementation (n=116). Patients with moderate or severe dementia were excluded. The two groups of nurses were the pre-model group (n=90) who were working on the medical and aged care wards for the study prior to model implementation, and the post-model group (n=22), who were the nurses working on the wards during model implementation. Action research was used to develop the models of care in two wards: one for an aged care specific ward and another for a general medical ward where older patients were admitted. The models developed were based on empirical data gathered in an earlier phase of this study. The models were successful in both wards in terms of increasing satisfaction levels in patients and nurses (p<0.001), increasing functional independence as measured by activities of daily living (p<0.01), and increasing medication knowledge (p<0.001). Findings indicate that models of care developed by nurses using an evidence-based action research strategy can enhance both satisfaction and health outcomes in older patients.

  7. Public health economic evaluation of different European Union-level policy options aimed at reducing population dietary trans fat intake.

    PubMed

    Martin-Saborido, Carlos; Mouratidou, Theodora; Livaniou, Anastasia; Caldeira, Sandra; Wollgast, Jan

    2016-11-01

    The adverse relation between dietary trans fatty acid (TFA) intake and coronary artery disease risk is well established. Many countries in the European Union (EU) and worldwide have implemented different policies to reduce the TFA intake of their populations. The aim of this study was to assess the added value of EU-level action by estimating the cost-effectiveness of 3 possible EU-level policy measures to reduce population dietary TFA intake. This was calculated against a reference situation of not implementing any EU-level policy (i.e., by assuming only national or self-regulatory measures). We developed a mathematical model to compare different policy options at the EU level: 1) to do nothing beyond the current state (reference situation), 2) to impose mandatory TFA labeling of prepackaged foods, 3) to seek voluntary agreements toward further reducing industrially produced TFA (iTFA) content in foods, and 4) to impose a legislative limit for iTFA content in foods. The model indicated that to impose an EU-level legal limit or to make voluntary agreements may, over the course of a lifetime (85 y), avoid the loss of 3.73 and 2.19 million disability-adjusted life-years (DALYs), respectively, and save >51 and 23 billion euros when compared with the reference situation. Implementing mandatory TFA labeling can also avoid the loss of 0.98 million DALYs, but this option incurs more costs than it saves compared with the reference option. The model indicates that there is added value of an EU-level action, either via a legal limit or through voluntary agreements, with the legal limit option producing the highest additional health benefits. Introducing mandatory TFA labeling for the EU common market may provide some additional health benefits; however, this would likely not be a cost-effective strategy.

  8. Public health economic evaluation of different European Union–level policy options aimed at reducing population dietary trans fat intake

    PubMed Central

    Mouratidou, Theodora; Livaniou, Anastasia

    2016-01-01

    Background: The adverse relation between dietary trans fatty acid (TFA) intake and coronary artery disease risk is well established. Many countries in the European Union (EU) and worldwide have implemented different policies to reduce the TFA intake of their populations. Objective: The aim of this study was to assess the added value of EU-level action by estimating the cost-effectiveness of 3 possible EU-level policy measures to reduce population dietary TFA intake. This was calculated against a reference situation of not implementing any EU-level policy (i.e., by assuming only national or self-regulatory measures). Design: We developed a mathematical model to compare different policy options at the EU level: 1) to do nothing beyond the current state (reference situation), 2) to impose mandatory TFA labeling of prepackaged foods, 3) to seek voluntary agreements toward further reducing industrially produced TFA (iTFA) content in foods, and 4) to impose a legislative limit for iTFA content in foods. Results: The model indicated that to impose an EU-level legal limit or to make voluntary agreements may, over the course of a lifetime (85 y), avoid the loss of 3.73 and 2.19 million disability-adjusted life-years (DALYs), respectively, and save >51 and 23 billion euros when compared with the reference situation. Implementing mandatory TFA labeling can also avoid the loss of 0.98 million DALYs, but this option incurs more costs than it saves compared with the reference option. Conclusions: The model indicates that there is added value of an EU-level action, either via a legal limit or through voluntary agreements, with the legal limit option producing the highest additional health benefits. Introducing mandatory TFA labeling for the EU common market may provide some additional health benefits; however, this would likely not be a cost-effective strategy. PMID:27680991

  9. Health care worker perspectives of their motivation to reduce health care-associated infections.

    PubMed

    McClung, Laura; Obasi, Chidi; Knobloch, Mary Jo; Safdar, Nasia

    2017-10-01

    Health care-associated infections (HAIs) are largely preventable, but are associated with considerable health care burden. Given the significant cost of HAIs, many health care institutions have implemented bundled interventions to reduce HAIs. These complex behavioral interventions require considerable effort; however, individual behaviors and motivations crucial to successful and sustained implementation have not been adequately assessed. We evaluated health care worker motivations to reduce HAIs. This was a phenomenologic qualitative study of health care workers in different roles within a university hospital, recruited via a snowball strategy. Using constructs from the Consolidated Framework for Implementation Research model, face-to-face semi-structured interviews were used to explore perceptions of health care worker motivation to follow protocols on HAI prevention. Across all types of health care workers interviewed, patient safety and improvement in clinical outcomes were the major motivators to reducing HAIs. Other important motivators included collaborative environment that valued individual input, transparency and feedback at both organizational and individual levels, leadership involvement, and refresher trainings and workshops. We did not find policy, regulatory considerations, or financial penalties to be important motivators. Health care workers perceived patient safety and clinical outcomes as the primary motivators to reduce HAI. Leadership engagement and data-driven interventions with frequent performance feedback were also identified as important facilitators of HAI prevention. Published by Elsevier Inc.

  10. Construction of reduced order models for the non-linear Navier-Stokes equations using the proper orthogonal decomposition (POD)/Galerkin method.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fike, Jeffrey A.

    2013-08-01

    The construction of stable reduced order models using Galerkin projection for the Euler or Navier-Stokes equations requires a suitable choice for the inner product. The standard L2 inner product is expected to produce unstable ROMs. For the non-linear Navier-Stokes equations this means the use of an energy inner product. In this report, Galerkin projection for the non-linear Navier-Stokes equations using the L2 inner product is implemented as a first step toward constructing stable ROMs for this set of physics.
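
    As a generic sketch of the POD/Galerkin idea (not the report's Navier-Stokes implementation), the POD basis can be taken as the leading left singular vectors of a snapshot matrix, and a full-order operator is then projected onto that basis. The snapshot data and operator below are random placeholders, and the L2 inner product is used, which is precisely the choice the report examines for stability.

      import numpy as np

      rng = np.random.default_rng(0)
      n, m, r = 200, 40, 5                  # full order, number of snapshots, ROM order
      A = -np.eye(n) + 0.01 * rng.normal(size=(n, n))   # stand-in full-order operator
      snapshots = rng.normal(size=(n, m))               # stand-in solution snapshots

      U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
      Phi = U[:, :r]                        # POD basis: leading left singular vectors

      A_r = Phi.T @ A @ Phi                 # Galerkin projection with the L2 inner product

      def rom_rhs(a):
          """Reduced dynamics da/dt = A_r a; nonlinear terms would be projected likewise."""
          return A_r @ a

      # The full state is approximated as Phi @ a.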

  11. Alternative Outpatient Chemotherapy Scheduling Method to Improve Patient Service Quality and Nurse Satisfaction.

    PubMed

    Huang, Yu-Li; Bryce, Alan H; Culbertson, Tracy; Connor, Sarah L; Looker, Sherry A; Altman, Kristin M; Collins, James G; Stellner, Winston; McWilliams, Robert R; Moreno-Aspitia, Alvaro; Ailawadhi, Sikander; Mesa, Ruben A

    2018-02-01

    Optimal scheduling and calendar management in an outpatient chemotherapy unit is a complex process that is driven by a need to focus on safety while accommodating a high degree of variability. Primary constraints are infusion times, staffing resources, chair availability, and unit hours. We undertook a process to analyze our existing management models across multiple practice settings in our health care system, then developed a model to optimize safety and efficiency. The model was tested in one of the community chemotherapy units. We assessed staffing violations as measured by nurse-to-patient ratios throughout the workday and at key points during treatment. Staffing violations were tracked before and after the implementation of the new model. The new model reduced staffing violations by nearly 50% and required fewer chairs to treat the same number of patients for the selected clinic day. Actual implementation results indicated that the new model leveled the distribution of patients across the workday with an 18% reduction in maximum chair utilization and a 27% reduction in staffing violations. Subsequently, a positive impact on peak pharmacy workload reduced delays by as much as 35 minutes. Nursing staff satisfaction with the new model was positive. We conclude that the proposed optimization approach with regard to nursing resource assignment and workload balance throughout a day effectively improves patient service quality and staff satisfaction.

  12. Modelling of biogas extraction at an Italian landfill accepting mechanically and biologically treated municipal solid waste.

    PubMed

    Calabrò, Paolo S; Orsi, Sirio; Gentili, Emiliano; Carlo, Meoni

    2011-12-01

    This paper presents the results of modelling biogas extraction at a full-scale Italian landfill with the USEPA LandGEM model and the Andreottola-Cossu approach. The landfill chosen for this research (the 'Il Fossetto' plant, Monsummano Terme, Italy) had accepted mixed municipal raw waste for about 15 years. In 2003 mechanical biological treatment (MBT) was implemented, and from the end of 2006 the concentrated leachate from the internal membrane leachate treatment plant was recirculated into the landfill. The USEPA LandGEM model and the Andreottola-Cossu approach were chosen because they require only input data routinely acquired during landfill management (waste amount and composition) and allow a simplified calibration; they are therefore potentially useful for practical purposes such as landfill gas management. The results given by the models are compared with measured data and analysed to verify the impact of MBT on biogas production; moreover, the possible effects of the recirculation of the concentrated leachate are discussed. The results clearly show that both models can adequately fit the measured data even after MBT implementation. Model performance was significantly reduced for the period after the start of concentrated-leachate recirculation, when the probable inhibition of methane production, due to competition between methanogens and sulfate-reducing bacteria, significantly influenced biogas production and composition.
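
    For orientation, the LandGEM approach is a first-order decay model in which each year's waste contributes methane that decays exponentially with age; a simplified annual-resolution sketch is given below. The rate constant k, the generation potential L0 and the waste quantities are placeholders, not the calibrated values for the 'Il Fossetto' site.

      import numpy as np

      k, L0 = 0.05, 100.0                       # decay rate (1/yr), CH4 potential (m3/Mg)
      waste = {2000 + i: 50_000.0 for i in range(15)}   # Mg of waste accepted per year

      def methane_generation(year):
          """Methane generated (m3/yr) in `year` from all waste placed in earlier years."""
          q = 0.0
          for placed_year, mass in waste.items():
              age = year - placed_year
              if age >= 0:
                  q += k * L0 * mass * np.exp(-k * age)
          return q

      production_curve = [methane_generation(y) for y in range(2000, 2031)]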

  13. Acoustic Biometric System Based on Preprocessing Techniques and Linear Support Vector Machines

    PubMed Central

    del Val, Lara; Izquierdo-Fuente, Alberto; Villacorta, Juan J.; Raboso, Mariano

    2015-01-01

    Drawing on the results of an acoustic biometric system based on an MSE classifier, a new biometric system has been implemented. This new system preprocesses acoustic images, extracts several parameters and finally classifies them using a Support Vector Machine (SVM). The preprocessing techniques used are spatial filtering, segmentation (based on a Gaussian Mixture Model (GMM) to separate the person from the background), masking (to reduce the dimensions of the images) and binarization (to reduce the size of each image). An analysis of the classification error and a study of the sensitivity of the error versus the computational burden of each implemented algorithm are presented. This allows the selection of the most relevant algorithms, according to the benefits required by the system. A significant improvement of the biometric system has been achieved by reducing the classification error, the computational burden and the storage requirements. PMID:26091392
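
    A schematic version of such a pipeline (GMM-based foreground/background segmentation, binarisation and downsampling to reduce image size, then a linear SVM) is sketched below on synthetic data; the image sizes, the synthetic "acoustic images" and the assumption that the brighter GMM component is the person are illustrative choices, not the authors' configuration.

      import numpy as np
      from sklearn.mixture import GaussianMixture
      from sklearn.svm import LinearSVC

      rng = np.random.default_rng(0)

      def segment_and_reduce(image, out_shape=(16, 16)):
          """GMM foreground/background split, binarisation and block-average downsampling."""
          gmm = GaussianMixture(n_components=2, random_state=0).fit(image.reshape(-1, 1))
          labels = gmm.predict(image.reshape(-1, 1)).reshape(image.shape)
          fg = int(np.argmax(gmm.means_.ravel()))      # brighter component taken as foreground
          mask = (labels == fg).astype(float)          # binarisation
          h, w = image.shape
          bh, bw = h // out_shape[0], w // out_shape[1]
          blocks = mask[:bh * out_shape[0], :bw * out_shape[1]]
          reduced = blocks.reshape(out_shape[0], bh, out_shape[1], bw).mean(axis=(1, 3))
          return reduced.ravel()

      def make_image(has_person):
          """Synthetic stand-in for an acoustic image, with or without a bright blob."""
          img = rng.normal(0.0, 1.0, size=(64, 64))
          if has_person:
              img[20:40, 20:40] += 4.0
          return img

      y = np.array([0, 1] * 40)
      X = np.array([segment_and_reduce(make_image(label)) for label in y])
      clf = LinearSVC().fit(X, y)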

  14. Acoustic Biometric System Based on Preprocessing Techniques and Linear Support Vector Machines.

    PubMed

    del Val, Lara; Izquierdo-Fuente, Alberto; Villacorta, Juan J; Raboso, Mariano

    2015-06-17

    Drawing on the results of an acoustic biometric system based on an MSE classifier, a new biometric system has been implemented. This new system preprocesses acoustic images, extracts several parameters and finally classifies them using a Support Vector Machine (SVM). The preprocessing techniques used are spatial filtering, segmentation (based on a Gaussian Mixture Model (GMM) to separate the person from the background), masking (to reduce the dimensions of the images) and binarization (to reduce the size of each image). An analysis of the classification error and a study of the sensitivity of the error versus the computational burden of each implemented algorithm are presented. This allows the selection of the most relevant algorithms, according to the benefits required by the system. A significant improvement of the biometric system has been achieved by reducing the classification error, the computational burden and the storage requirements.

  15. Simulating Forest Carbon Dynamics in Response to Large-scale Fuel Reduction Treatments Under Projected Climate-fire Interactions in the Sierra Nevada Mountains, USA

    NASA Astrophysics Data System (ADS)

    Liang, S.; Hurteau, M. D.

    2016-12-01

    The interaction of a warmer, drier climate and increasing large wildfires, coupled with increasing fire severity resulting from fire exclusion, is anticipated to undermine forest carbon (C) stock stability and C sink strength in Sierra Nevada forests. Treatments to reduce biomass and restore forest structure, including thinning and prescribed burning, have proven effective at reducing fire severity and lessening C loss when treated stands are burned by wildfire. However, the current pace and scale of treatment implementation are limited, especially given recent increases in area burned by wildfire. In this study, we used a forest landscape model (LANDIS-II) to evaluate how the implementation timing of large-scale fuel reduction treatments influences forest C stocks and fluxes in Sierra Nevada forests under projected climate and larger wildfires. We ran 90-year simulations using climate and wildfire projections from three general circulation models driven by the A2 emission scenario. We simulated two different treatment implementation scenarios: a 'distributed' scenario (treatments implemented throughout the simulation) and an 'accelerated' scenario (treatments implemented during the first half century). We found that across the study area, accelerated implementation had 0.6-10.4 Mg ha-1 higher late-century aboveground biomass (AGB) and 1.0-2.2 g C m-2 yr-1 higher mean C sink strength than the distributed scenario, depending on the specific climate-wildfire projection. Cumulative wildfire emissions over the simulation period were 0.7-3.9 Mg C ha-1 higher for distributed implementation relative to accelerated implementation. However, simulations with both implementation practices had considerably higher AGB and C sink strength, as well as lower wildfire emissions, than simulations without fuel reduction treatments. The results demonstrate the potential for implementing large-scale fuel reduction treatments to enhance forest C stock stability and C sink strength under projected climate-wildfire interactions. Given that climate and wildfire are projected to become more stressful after mid-century, accelerating management action would yield greater C benefits.

  16. Insights from the design and implementation of a single-entry model of referral for total joint replacement surgery: Critical success factors and unanticipated consequences.

    PubMed

    Damani, Zaheed; MacKean, Gail; Bohm, Eric; Noseworthy, Tom; Wang, Jenney Meng Han; DeMone, Brie; Wright, Brock; Marshall, Deborah A

    2018-02-01

    Single-entry models (SEMs) in healthcare allow patients to see the next-available provider and have been shown to improve waiting times, access and patient flow for preference-sensitive, scheduled services. The Winnipeg Central Intake Service (WCIS) for hip and knee replacement surgery was implemented to improve access in the Winnipeg Regional Health Authority. This paper describes the system's design and implementation, and its successes, challenges, and unanticipated consequences. On two occasions, during and following implementation, we interviewed all members of the WCIS project team, including processing engineers, waiting list coordinators, administrators and policy-makers regarding their experiences. We used semi-structured telephone interviews to collect data and qualitative thematic analysis to analyze and interpret the findings. Respondents indicated that the overarching objectives of the WCIS were being met. Benefits included streamlined processes, greater patient access, and improved measurement and monitoring of outcomes. Challenges included low awareness, change readiness, and initial participation among stakeholders. Unanticipated consequences included workload increases, confusion around stakeholder expectations and under-reporting of data by surgeons' offices. Critical success factors for implementation included a requirement for clear communication, robust data collection, physician leadership and patience by all, especially implementation teams. Although the system was successfully implemented, key lessons and critical success factors related to change management were learned, which, if considered and applied, can reduce unanticipated consequences, improve uptake and benefit new models of care. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. On the study of control effectiveness and computational efficiency of reduced Saint-Venant model in model predictive control of open channel flow

    NASA Astrophysics Data System (ADS)

    Xu, M.; van Overloop, P. J.; van de Giesen, N. C.

    2011-02-01

    Model predictive control (MPC) of open channel flow is becoming an important tool in water management. The complexity of the prediction model has a large influence on the MPC application in terms of control effectiveness and computational efficiency. The Saint-Venant equations (the SV model in this paper) are accurate but computationally costly, while the Integrator Delay (ID) model is simple but restricted in the flow changes it allows. In this paper, a reduced Saint-Venant (RSV) model is developed by applying a model reduction technique, Proper Orthogonal Decomposition (POD), to the SV equations. The RSV model keeps the main flow dynamics and functions over a large flow range, but is easier to implement in MPC. In the test case of a modeled canal reach, the numbers of states and disturbances in the RSV model are about 45 and 16 times smaller, respectively, than in the SV model. The computational time of MPC with the RSV model is significantly reduced, while the controller remains effective. Thus, the RSV model is a promising means to balance control effectiveness and computational efficiency.
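
    The way a reduced linear model lightens the MPC optimisation can be sketched with an unconstrained quadratic cost, for which the optimal input sequence has a closed form; the (A, B, C) matrices below are small placeholders standing in for a reduced model such as the RSV model, not the canal reach itself.

      import numpy as np

      A = np.array([[0.95, 0.10], [0.0, 0.90]])   # reduced state transition
      B = np.array([[0.0], [0.05]])               # effect of the control (gate flow change)
      C = np.array([[1.0, 0.0]])                  # measured water-level deviation
      N, lam = 20, 0.1                            # prediction horizon, control weight

      # Prediction matrices: y = F x0 + G u, with u the stacked input sequence.
      F = np.vstack([C @ np.linalg.matrix_power(A, k + 1) for k in range(N)])
      G = np.zeros((N, N))
      for i in range(N):
          for j in range(i + 1):
              G[i, j] = (C @ np.linalg.matrix_power(A, i - j) @ B).item()

      def mpc_step(x0, y_ref=0.0):
          """Return the first input of the optimal sequence (receding horizon)."""
          e = y_ref - F @ x0
          u = np.linalg.solve(G.T @ G + lam * np.eye(N), G.T @ e)
          return float(u[0])

      u0 = mpc_step(np.array([0.3, -0.1]))

    With fewer states and disturbances, the F and G matrices and the resulting optimisation shrink accordingly, which is where the reported computational savings come from.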

  18. Development of Unsteady Aerodynamic and Aeroelastic Reduced-Order Models Using the FUN3D Code

    NASA Technical Reports Server (NTRS)

    Silva, Walter A.; Vatsa, Veer N.; Biedron, Robert T.

    2009-01-01

    Recent significant improvements to the development of CFD-based unsteady aerodynamic reduced-order models (ROMs) are implemented into the FUN3D unstructured flow solver. These improvements include the simultaneous excitation of the structural modes of the CFD-based unsteady aerodynamic system via a single CFD solution, minimization of the error between the full CFD and the ROM unsteady aerodynamic solution, and computation of a root locus plot of the aeroelastic ROM. Results are presented for a viscous version of the two-dimensional Benchmark Active Controls Technology (BACT) model and an inviscid version of the AGARD 445.6 aeroelastic wing using the FUN3D code.

  19. [Implementation results of emission standards of air pollutants for thermal power plants: a numerical simulation].

    PubMed

    Wang, Zhan-Shan; Pan, Li-Bo

    2014-03-01

    An emission inventory of air pollutants from thermal power plants for the year 2010 was compiled. Based on this inventory, the air quality of prediction scenarios implementing both the 2003-version emission standard and the new emission standard was simulated using Models-3/CMAQ. The concentrations of NO2, SO2, and PM2.5 and the deposition of nitrogen and sulfur in 2015 and 2020 were predicted to investigate the regional air quality improvement achievable under the new emission standard. The results showed that the new emission standard could effectively improve air quality in China. Compared with the implementation results of the 2003-version emission standard, by 2015 and 2020 the area with NO2 concentrations exceeding the standard would be reduced by 53.9% and 55.2%, the area with SO2 concentrations exceeding the standard would be reduced by 40.0%, the area with nitrogen deposition higher than 1.0 t x km(-2) would be reduced by 75.4% and 77.9%, and the area with sulfur deposition higher than 1.6 t x km(-2) would be reduced by 37.1% and 34.3%, respectively.

  20. Modeling the cost-effectiveness of insect rearing on artificial diets: A test with a tephritid fly used in the sterile insect technique.

    PubMed

    Pascacio-Villafán, Carlos; Birke, Andrea; Williams, Trevor; Aluja, Martín

    2017-01-01

    We modeled the cost-effectiveness of rearing Anastrepha ludens, a major fruit fly pest currently mass reared for sterilization and release in pest control programs implementing the sterile insect technique (SIT). An optimization model was generated by combining response surface models of artificial diet cost savings with models of A. ludens pupation, pupal weight, larval development time and adult emergence as a function of mixtures of yeast, a costly ingredient, with corn flour and corncob fractions in the diet. Our model revealed several yeast-reduced mixtures that could be used to prepare diets that were considerably cheaper than a standard diet used for mass rearing. Models predicted a similar production of insects (pupation and adult emergence), with statistically similar pupal weights and larval development times between yeast-reduced diets and the standard mass rearing diet formulation. Annual savings from using the modified diets could be up to 5.9% of the annual cost of yeast, corn flour and corncob fractions used in the standard diet, representing a potential saving of US $27.45 per ton of diet (US $47,496 in the case of the mean annual production of 1,730.29 tons of artificial diet in the Moscafrut mass rearing facility at Metapa, Chiapas, Mexico). Implementation of the yeast-reduced diet on an experimental scale at mass rearing facilities is still required to confirm the suitability of new mixtures of artificial diet for rearing A. ludens for use in SIT. This should include the examination of critical quality control parameters of flies such as adult flight ability, starvation resistance and male sexual competitiveness across various generations. The method used here could be useful for improving the cost-effectiveness of invertebrate or vertebrate mass rearing diets worldwide.

  1. Modeling the cost-effectiveness of insect rearing on artificial diets: A test with a tephritid fly used in the sterile insect technique

    PubMed Central

    Birke, Andrea; Williams, Trevor; Aluja, Martín

    2017-01-01

    We modeled the cost-effectiveness of rearing Anastrepha ludens, a major fruit fly pest currently mass reared for sterilization and release in pest control programs implementing the sterile insect technique (SIT). An optimization model was generated by combining response surface models of artificial diet cost savings with models of A. ludens pupation, pupal weight, larval development time and adult emergence as a function of mixtures of yeast, a costly ingredient, with corn flour and corncob fractions in the diet. Our model revealed several yeast-reduced mixtures that could be used to prepare diets that were considerably cheaper than a standard diet used for mass rearing. Models predicted a similar production of insects (pupation and adult emergence), with statistically similar pupal weights and larval development times between yeast-reduced diets and the standard mass rearing diet formulation. Annual savings from using the modified diets could be up to 5.9% of the annual cost of yeast, corn flour and corncob fractions used in the standard diet, representing a potential saving of US $27.45 per ton of diet (US $47,496 in the case of the mean annual production of 1,730.29 tons of artificial diet in the Moscafrut mass rearing facility at Metapa, Chiapas, Mexico). Implementation of the yeast-reduced diet on an experimental scale at mass rearing facilities is still required to confirm the suitability of new mixtures of artificial diet for rearing A. ludens for use in SIT. This should include the examination of critical quality control parameters of flies such as adult flight ability, starvation resistance and male sexual competitiveness across various generations. The method used here could be useful for improving the cost-effectiveness of invertebrate or vertebrate mass rearing diets worldwide. PMID:28257496

  2. A model of interaction between anticorruption authority and corruption groups

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neverova, Elena G.; Malafeyef, Oleg A.

    The paper provides a model of interaction between an anticorruption unit and corruption groups. The main policy functions of the anticorruption unit involve reducing corrupt practices in some entities through an optimal approach to resource allocation and effective anticorruption policy. We develop a model based on a Markov decision process and use Howard's policy-improvement algorithm to solve for an optimal decision strategy. We examine the assumption that corruption groups retaliate against the anticorruption authority to protect themselves. This model was implemented as a stochastic game.
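
    For readers unfamiliar with Howard's algorithm, the sketch below shows generic policy iteration for a small discounted Markov decision process; the states, transition probabilities and rewards are random placeholders, not the paper's corruption game.

      import numpy as np

      n_states, n_actions, gamma = 3, 2, 0.95
      rng = np.random.default_rng(0)
      P = rng.random((n_actions, n_states, n_states))   # P[a, s, s'] transition probabilities
      P /= P.sum(axis=2, keepdims=True)
      R = rng.random((n_states, n_actions))             # R[s, a] expected immediate reward

      def policy_iteration(P, R, gamma):
          n_a, n_s, _ = P.shape
          policy = np.zeros(n_s, dtype=int)
          while True:
              # Policy evaluation: solve (I - gamma * P_pi) V = R_pi exactly.
              P_pi = P[policy, np.arange(n_s), :]
              R_pi = R[np.arange(n_s), policy]
              V = np.linalg.solve(np.eye(n_s) - gamma * P_pi, R_pi)
              # Policy improvement: act greedily with respect to the current values.
              Q = R.T + gamma * P @ V                   # shape (n_actions, n_states)
              new_policy = Q.argmax(axis=0)
              if np.array_equal(new_policy, policy):
                  return policy, V
              policy = new_policy

      policy, values = policy_iteration(P, R, gamma)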

  3. Implementation of a numerical holding furnace model in foundry and construction of a reduced model

    NASA Astrophysics Data System (ADS)

    Loussouarn, Thomas; Maillet, Denis; Remy, Benjamin; Dan, Diane

    2016-09-01

    Vacuum holding induction furnaces are used in the manufacturing of turbine blades by the lost-wax foundry process. The control of solidification parameters is a key factor in manufacturing these parts in accordance with geometrical and structural expectations. The definition of a reduced heat transfer model, with experimental identification through an estimation of its parameters, is required here. In a further stage this model will be used to characterize heat exchanges via internal sensors through inverse techniques, in order to optimize the furnace command and its design. Here, an axisymmetric furnace and its load have been numerically modelled using FlexPDE, a finite element code. A detailed model allows the calculation of the internal induction heat source as well as transient radiative transfer inside the furnace. A reduced lumped-body model has been defined to represent the numerical furnace. The model reduction and the estimation of the parameters of the lumped body have been carried out using a Levenberg-Marquardt least-squares minimization algorithm in Matlab, using two synthetic temperature signals, with a further validation test.
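
    The identification step can be illustrated with a hedged sketch: a two-node lumped thermal model is simulated and its resistances and capacitances are fitted to two temperature signals by Levenberg-Marquardt least squares (here scipy's least_squares with method="lm" rather than Matlab). All parameter values and the heat input are illustrative, not the furnace's.

      import numpy as np
      from scipy.optimize import least_squares

      dt, n_steps, T_amb, q = 1.0, 600, 20.0, 500.0    # time step [s], ambient [C], heat [W]

      def simulate(params):
          """Explicit-Euler simulation of a two-node lumped body; returns both signals."""
          R1, R2, C1, C2 = params
          T1 = T2 = T_amb
          out = np.empty((n_steps, 2))
          for k in range(n_steps):
              dT1 = (q - (T1 - T2) / R1) / C1
              dT2 = ((T1 - T2) / R1 - (T2 - T_amb) / R2) / C2
              T1, T2 = T1 + dt * dT1, T2 + dt * dT2
              out[k] = T1, T2
          return out

      true_params = np.array([0.05, 0.10, 5_000.0, 20_000.0])
      measured = simulate(true_params) + np.random.default_rng(0).normal(0.0, 0.1, (n_steps, 2))

      def residuals(theta):
          # Optimise in log space so resistances and capacitances stay positive.
          return (simulate(np.exp(theta)) - measured).ravel()

      x0 = np.log([0.02, 0.20, 2_000.0, 10_000.0])
      fit = least_squares(residuals, x0=x0, method="lm")
      estimated = np.exp(fit.x)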

  4. Effect of different implementations of the same ice history in GIA modeling

    NASA Astrophysics Data System (ADS)

    Barletta, V. R.; Bordoni, A.

    2013-11-01

    This study shows the effect of changing the way ice histories are implemented in Glacial Isostatic Adjustment (GIA) codes that solve the sea-level equation. Ice history models are constantly being improved and are provided in different formats. The overall algorithmic design of the sea-level equation solver often forces the ice model to be implemented in a representation that differs from the one originally provided. We show that using different representations of the same ice model produces important differences and artificial contributions in the sea-level estimates, at both global and regional scales. This study is not a speculative exercise. The ICE-5G model adopted in this work is widely used in present-day sea-level analysis, but discrepancies between the results obtained by different groups for the same ice models still exist, and it was the effort to set a common reference for the sea-level community that inspired this work. Understanding this issue is important in order to reduce the artefacts introduced by an unsuitable ice-model representation. This is especially important when developing new GIA models, since neglecting this problem can easily lead to misalignment of the ice and sea-level histories, particularly close to deglaciation areas such as Antarctica.

  5. Reducing student stereotypy by improving teachers' implementation of discrete-trial teaching.

    PubMed

    Dib, Nancy; Sturmey, Peter

    2007-01-01

    Discrete-trial teaching is an instructional method commonly used to teach social and academic skills to children with an autism spectrum disorder. The purpose of the current study was to evaluate the indirect effects of discrete-trial teaching on 3 students' stereotypy. Instructions, feedback, modeling, and rehearsal were used to improve 3 teaching aides' implementation of discrete-trial teaching in a private school for children with autism. Improvements in accurate teaching were accompanied by systematic decreases in students' levels of stereotypy.

  6. A Proposal of TLS Implementation for Cross Certification Model

    NASA Astrophysics Data System (ADS)

    Kaji, Tadashi; Fujishiro, Takahiro; Tezuka, Satoru

    Today, TLS is widely used for achieving a secure communication system, and TLS uses a PKI for server and/or client authentication. However, this PKI environment, which is called the “multiple trust anchors environment,” causes the problem that the verifier has to maintain a huge number of CA certificates in the ubiquitous network, because the increase in terminals connected to the network brings an increase in CAs. Most terminals in the ubiquitous network, however, will not have enough memory to hold such a huge number of CA certificates. Therefore, another PKI environment, the “cross certification environment,” is better suited to the ubiquitous network. But because current TLS is designed for the multiple trust anchors model, TLS cannot work efficiently on the cross-certification model. This paper proposes a TLS implementation method to support the cross certification model efficiently. Our proposal reduces the size of the messages exchanged between the TLS client and the TLS server during the handshake process. Therefore, our proposal is suitable for implementing TLS in terminals that do not have enough computing power and memory in the ubiquitous network.

  7. Implementation and performance of parallel Prolog interpreter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wei, S.; Kale, L.V.; Balkrishna, R.

    1988-01-01

    In this paper, the authors discuss the implementation of a parallel Prolog interpreter on different parallel machines. The implementation is based on the REDUCE-OR process model, which exploits both AND and OR parallelism in logic programs. It is machine independent as it runs on top of the chare kernel, a machine-independent parallel programming system. The authors also give the performance of the interpreter running a diverse set of benchmark programs on parallel machines, including shared memory systems (an Alliant FX/8, a Sequent, and a MultiMax) and a non-shared memory system (the Intel iPSC/32 hypercube), in addition to its performance on a multiprocessor simulation system.

  8. Finite element dynamic analysis of soft tissues using state-space model.

    PubMed

    Iorga, Lucian N; Shan, Baoxiang; Pelegri, Assimina A

    2009-04-01

    A finite element (FE) model is employed to investigate the dynamic response of soft tissues under external excitations, particularly corresponding to the case of harmonic motion imaging. A solid 3D mixed 'u-p' element, S8P0, is implemented to capture the near-incompressibility inherent in soft tissues. Two important aspects of the structural modelling of these tissues are studied: the influence of viscous damping on the dynamic response and, following the FE modelling, a state-space formulation developed to evaluate the efficiency of several order-reduction methods. It is illustrated that the order of the mathematical model can be significantly reduced while preserving the accuracy of the observed system dynamics. Thus, the reduced-order state-space representation of soft tissues for general dynamic analysis significantly reduces the computational cost and provides a unitary framework for the 'forward' simulation and 'inverse' estimation of soft tissues. Moreover, the results suggest that damping in soft tissue is significant, effectively cancelling the contribution of all but the first few vibration modes.
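
    As a sketch of the order-reduction idea, the example below builds a generic mass-spring-damper chain in state-space form, reduces it by modal truncation, and compares frequency responses of the full and reduced models. The chain, the retained order, and the probe frequencies are illustrative assumptions, not the soft-tissue FE model or the specific reduction methods evaluated in the paper.

    ```python
    # Minimal sketch of model-order reduction by modal truncation of a linear
    # state-space system x' = A x + B u, y = C x.  The mass-spring-damper chain
    # used to build A, B, C is an illustrative stand-in for an FE model.
    import numpy as np

    def build_chain(n, m=1.0, k=100.0, c=0.5):
        """Second-order chain converted to first-order state space (2n states)."""
        K = 2 * k * np.eye(n) - k * (np.eye(n, k=1) + np.eye(n, k=-1))
        Cd = 2 * c * np.eye(n) - c * (np.eye(n, k=1) + np.eye(n, k=-1))
        A = np.block([[np.zeros((n, n)), np.eye(n)],
                      [-K / m, -Cd / m]])
        B = np.zeros((2 * n, 1)); B[-1, 0] = 1.0 / m   # force on last mass
        C = np.zeros((1, 2 * n)); C[0, n - 1] = 1.0    # observe last displacement
        return A, B, C

    def modal_truncation(A, B, C, r):
        """Keep the r eigenmodes with the slowest decay (largest real part).
        Conjugate pairs share real parts, so they are kept together; the
        reduced matrices may be complex but the input-output map stays real."""
        w, V = np.linalg.eig(A)
        idx = np.argsort(w.real)[::-1][:r]
        Vr = V[:, idx]
        Wr = np.linalg.pinv(Vr)            # left projector
        return Wr @ A @ Vr, Wr @ B, C @ Vr

    A, B, C = build_chain(n=50)            # 100 states
    Ar, Br, Cr = modal_truncation(A, B, C, r=10)

    # Compare frequency responses of full and reduced models at a few frequencies.
    for f in (0.5, 1.0, 2.0):
        s = 2j * np.pi * f
        H_full = C @ np.linalg.solve(s * np.eye(A.shape[0]) - A, B)
        H_red = Cr @ np.linalg.solve(s * np.eye(Ar.shape[0]) - Ar, Br)
        print(f, abs(H_full[0, 0]), abs(H_red[0, 0]))
    ```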

  9. Dynamics of a Flywheel Energy Storage System Supporting a Wind Turbine Generator in a Microgrid

    NASA Astrophysics Data System (ADS)

    Nair S, Gayathri; Senroy, Nilanjan

    2016-02-01

    Integration of an induction machine based flywheel energy storage system with a wind energy conversion system is implemented in this paper. The nonlinear and linearized models of the flywheel are studied and compared, and a reduced-order model is simulated to analyze the influence of the flywheel inertia and control on the system response during a wind power change. A quantification of the relation between the inertia of the flywheel and the controller gain is obtained, which allows the system to be treated as a reduced-order model that is more controllable in nature. A microgrid setup comprising the flywheel energy storage system, a two-mass model of a DFIG based wind turbine generator, and a reduced-order model of a diesel generator is utilized to analyse the microgrid dynamics accurately in the event of frequency variations arising from wind power changes. The response of the microgrid with and without the flywheel is studied.

  10. Effect of Blowing on Boundary Layer of Scarf Inlet

    NASA Technical Reports Server (NTRS)

    Gerhold, Carl H.; Clark, Lorenzo R.

    2004-01-01

    When aircraft operate in stationary or low speed conditions, airflow into the engine accelerates around the inlet lip and pockets of turbulence that cause noise and vibration can be ingested. This problem has been encountered with engines equipped with the scarf inlet, both in full scale and in model tests, where the noise produced during the static test makes it difficult to assess the noise reduction performance of the scarf inlet. NASA Langley researchers have implemented boundary layer control in an attempt to reduce the influence of the flow nonuniformity in a 12-in. diameter model of a high bypass fan engine mounted in an anechoic chamber. Static pressures and boundary layer profiles were measured in the inlet and far field acoustic measurements were made to assess the effectiveness of the blowing treatment. The blowing system was found to lack the authority to overcome the inlet distortions. Methods to improve the implementation of boundary layer control to reduce inlet distortion are discussed.

  11. A learning curve-based method to implement multifunctional work teams in the Brazilian footwear sector.

    PubMed

    Guimarães, L B de M; Anzanello, M J; Renner, J S

    2012-05-01

    This paper presents a method for implementing multifunctional work teams in a footwear company that followed the Taylor/Ford system for decades. The suggested framework first applies Learning Curve (LC) modeling to assess whether rotation between tasks of different complexities affects workers' learning rate and performance. Next, the Macroergonomic Work Analysis (MA) method (Guimarães, 1999, 2009) introduces multifunctional principles in work teams towards workers' training and resource improvement. When applied to a pilot line of 100 workers, the intervention reduced work-related accidents by 80% and absenteeism by 45.65%, and eliminated work-related musculoskeletal disorders (WMSD), medical consultations, and turnover. Further, the output rate of the multifunctional team increased by an average of 3% compared to the production rate of the regular lines following the Taylor/Ford system (with the same shoe model being manufactured), while the rework and spoilage rates were reduced by 85% and 69%, respectively. Copyright © 2011 Elsevier Ltd and The Ergonomics Society. All rights reserved.
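
    As an illustration of the LC modeling step, the sketch below fits a power-law (Wright-type) learning curve to synthetic per-unit processing times; the curve form, parameter values, and data are assumptions for the example, not the LC model or data used in the study.

    ```python
    # Minimal sketch: fitting a power-law (Wright) learning curve T_n = T1 * n**(-b)
    # to per-unit assembly times, as one plausible LC model for assessing how task
    # rotation affects learning rate.  Data and parameter values are illustrative.
    import numpy as np
    from scipy.optimize import curve_fit

    def wright_curve(n, T1, b):
        """Time to produce the n-th unit under a power-law learning curve."""
        return T1 * n ** (-b)

    # Synthetic observations: unit index vs. measured processing time (minutes).
    units = np.arange(1, 61)
    rng = np.random.default_rng(1)
    times = wright_curve(units, T1=12.0, b=0.25) * rng.normal(1.0, 0.05, units.size)

    (T1_hat, b_hat), _ = curve_fit(wright_curve, units, times, p0=[10.0, 0.2])
    learning_rate = 2 ** (-b_hat)   # fraction of time retained per doubling of output
    print(f"T1 = {T1_hat:.2f} min, b = {b_hat:.3f}, "
          f"learning rate = {learning_rate:.1%} per doubling of output")
    ```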

  12. Modeling Microgravity Induced Fluid Redistribution Autoregulatory and Hydrostatic Enhancements

    NASA Technical Reports Server (NTRS)

    Myers, J. G.; Werner, C.; Nelson, E. S.; Feola, A.; Raykin, J.; Samuels, B.; Ethier, C. R.

    2017-01-01

    Space flight induces a marked cephalad (headward) redistribution of blood and interstitial fluid, potentially resulting in a loss of venous tone and a reduction in heart muscle efficiency upon introduction into the microgravity environment. Using various types of computational models, we are investigating how this fluid redistribution may induce intracranial pressure changes relevant to reported reductions in astronaut visual acuity, part of the Visual Impairment and Intracranial Pressure (VIIP) syndrome. Methods: We utilize a lumped parameter cardiovascular system (CVS) model, augmented by compartments comprising the cerebral spinal fluid (CSF) space, as the primary tool to describe how microgravity, and the associated lack of a hydrostatic gradient, impacts fluid redistribution. Models of ocular fluid pressures and biomechanics then accept the output of the above model as boundary condition input to allow more detailed, local analysis (see IWS Abstract by Ethier et al.). Recently, we enhanced the capabilities of our previously reported CVS model through the implementation of robust autoregulatory mechanisms and a more fundamental approach to the implementation of hydrostatic mechanisms. Modifying the approach of Blanco et al., we implemented auto-regulation in a quasi-static manner, as an averaged effect across the span of one heartbeat. This approach reduced the higher frequency perturbations from the regulatory mechanism and was intended to allow longer simulation times (days) than models that implement within-beat regulatory mechanisms (minutes). A more fundamental treatment of hydrostatics was implemented through a quasi-1D approach, in which compartment descriptions include compartment length, orientation, and relative position, allowing for the modeling of body orientation, relative body positioning and, in the future, alternative gravity environments. At this time the inclusion of hydrostatic mechanisms supplies additional capabilities to train and validate the CVS model with terrestrial data. Results and Conclusions: With the implementation of auto-regulation and hydrostatic modeling capabilities, the model performs as expected in maintaining the CA (Central Artery) compartment pressure when simulating orientations ranging from supine to standing. The model appears to generally overpredict heart rate and thus cardiac output, possibly indicating sensitivity to the nominal heart rate, which is used as an initial set point of the regulation mechanisms. Despite this sensitivity, the model performs consistently over many hours of simulation time, indicating the success of our quasi-static implementation approach.
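
    To make the quasi-static autoregulation and hydrostatic ideas concrete, the sketch below simulates a toy two-compartment lumped-parameter circulation with a tilt-dependent hydrostatic column and a beat-averaged adjustment of peripheral resistance toward a pressure set point. The compartments, parameter values, and regulation rule are illustrative assumptions, not the authors' CVS model.

    ```python
    # Minimal sketch of a two-compartment lumped-parameter circulation with a
    # hydrostatic pressure term and a quasi-static (beat-averaged) autoregulation
    # of peripheral resistance.  All compartments, values and the set-point rule
    # are illustrative assumptions.
    import numpy as np

    RHO_G = 1060.0 * 9.81          # blood density * gravity [Pa/m]
    DT, T_END = 0.01, 120.0        # beat-averaged time step [s], duration [s]

    # Compliances [m^3/Pa], resistances [Pa*s/m^3], compartment heights [m]
    C_art, C_ven = 1.5e-9, 2.0e-8
    R_per0, R_ven = 1.2e8, 5.0e6
    h_art, h_ven = 0.3, 0.0        # vertical offsets when fully upright

    P_set = 12000.0                # target arterial pressure [Pa]
    Q_heart = 9.0e-5               # beat-averaged cardiac output [m^3/s]

    def simulate(tilt):
        """Return final arterial pressure for tilt in [0 (supine), 1 (standing)]."""
        P_art, P_ven, R_per = P_set, 500.0, R_per0
        for _ in np.arange(0.0, T_END, DT):
            # Hydrostatic column between compartments scales with body tilt.
            dP_hydro = RHO_G * tilt * (h_art - h_ven)
            Q_per = (P_art + dP_hydro - P_ven) / R_per     # arterial -> venous flow
            Q_ret = (P_ven - 300.0) / R_ven                # venous return
            P_art += DT * (Q_heart - Q_per) / C_art
            P_ven += DT * (Q_per - Q_ret) / C_ven
            # Quasi-static autoregulation: nudge resistance toward the set point.
            R_per += DT * 0.02 * R_per0 * (P_set - P_art) / P_set
        return P_art

    for tilt in (0.0, 0.5, 1.0):
        print(f"tilt {tilt:.1f}: arterial pressure ~ {simulate(tilt):.0f} Pa")
    ```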

  13. Reducing overall health care costs for a city municipality: a real life community based learning model.

    PubMed

    Hodges, Linda C; Harper, Tricia Satkowski; Hall-Barrow, Julie; Tatom, Iris D

    2004-06-01

    City municipalities implementing health and wellness programs patterned after North Little Rock, Arkansas, can significantly reduce the cost of health care for employees, as well as reduce costs associated with workers' compensation claims and lost time caused by injury. In addition to primary care services, effective programs include health risk assessments through pre-placement physicals, employee physicals, drug screening, employee health and wellness promotion programs, and immunization and registry. In implementing the program, a team from the University of Arkansas for Medical Sciences College of Nursing worked with city officials to establish a steering committee, safety initiatives through first responders, systems for monitoring immunizations, criteria for pre-placement physicals, and an employee health and wellness program. While the benefits for the city are well documented, the contract also created opportunities for education, research, and services in a real life community based learning laboratory for students in the College of Nursing. In addition, it provided opportunities for faculty to participate in faculty practice and meet the College's service missions. The College's model program holds promise for use by other major health care centers across the region and nation.

  14. Simulating fire and forest dynamics for a coordinated landscape fuel treatment project in the Sierra Nevada

    Treesearch

    Brandon M. Collins; Scott L. Stephens; Gary B. Roller; John Battles

    2011-01-01

    We evaluate an actual landscape fuel treatment project that was designed by local U. S. Forest Service managers in the northern Sierra Nevada. We model the effects of this project in reducing landscape-level fire behavior at multiple time steps, up to nearly 30 yr beyond treatment implementation. Additionally, we modeled planned treatments under multiple diameter-...

  15. Modeling Wind Wave Evolution from Deep to Shallow Water

    DTIC Science & Technology

    2011-09-30

    validation and calibration of new model developments. WORK COMPLETED: Development of a Lumped Quadruplet Approximation (LQA). To make evaluation of the... interactions based on the WRT method. This Lumped Quadruplet Approximation (LQA) clusters (lumps) contributions to the integrations over the... total transfer rate. A procedure has been developed to test the implementation (of LQA and other reduced versions of the WRT) where 1) the non

  16. Progress Toward Affordable High Fidelity Combustion Simulations Using Filtered Density Functions for Hypersonic Flows in Complex Geometries

    NASA Technical Reports Server (NTRS)

    Drozda, Tomasz G.; Quinlan, Jesse R.; Pisciuneri, Patrick H.; Yilmaz, S. Levent

    2012-01-01

    Significant progress has been made in the development of subgrid scale (SGS) closures based on a filtered density function (FDF) for large eddy simulations (LES) of turbulent reacting flows. The FDF is the counterpart of the probability density function (PDF) method, which has proven effective in Reynolds averaged simulations (RAS). However, while systematic progress is being made advancing the FDF models for relatively simple flows and lab-scale flames, the application of these methods in complex geometries and high-speed, wall-bounded flows with shocks remains a challenge. The key difficulties are the significant computational cost associated with solving the FDF transport equation and numerically stiff finite rate chemistry. For LES/FDF methods to make a more significant impact in practical applications, a pragmatic approach must be taken that significantly reduces the computational cost while maintaining high modeling fidelity. An example of one such ongoing effort is at the NASA Langley Research Center, where the first generation FDF models, namely the scalar filtered mass density function (SFMDF), are being implemented into VULCAN, a production-quality RAS and LES solver widely used for design of high-speed propulsion flowpaths. This effort leverages internal and external collaborations to reduce the overall computational cost of high fidelity simulations in VULCAN by: implementing high order methods that allow reduction in the total number of computational cells without loss in accuracy; implementing the first generation of high fidelity scalar PDF/FDF models applicable to high-speed compressible flows; coupling RAS/PDF and LES/FDF into a hybrid framework to efficiently and accurately model the effects of combustion in the vicinity of the walls; developing efficient Lagrangian particle tracking algorithms to support robust solutions of the FDF equations for high speed flows; and utilizing finite rate chemistry parametrization, such as flamelet models, to reduce the number of transported reactive species and remove numerical stiffness. This paper briefly introduces the SFMDF model (highlighting key benefits and challenges), and discusses particle tracking for flows with shocks, the hybrid coupled RAS/PDF and LES/FDF model, the flamelet generated manifolds (FGM) model, and the Irregularly Portioned Lagrangian Monte Carlo Finite Difference (IPLMCFD) methodology for scalable simulation of high-speed reacting compressible flows.

  17. Reducing the computational footprint for real-time BCPNN learning

    PubMed Central

    Vogginger, Bernhard; Schüffny, René; Lansner, Anders; Cederström, Love; Partzsch, Johannes; Höppner, Sebastian

    2015-01-01

    The implementation of synaptic plasticity in neural simulation or neuromorphic hardware is usually very resource-intensive, often requiring a compromise between efficiency and flexibility. A versatile, but computationally expensive plasticity mechanism is provided by the Bayesian Confidence Propagation Neural Network (BCPNN) paradigm. Building upon Bayesian statistics, and having clear links to biological plasticity processes, the BCPNN learning rule has been applied in many fields, ranging from data classification, associative memory, reward-based learning, and probabilistic inference to cortical attractor memory networks. In the spike-based version of this learning rule, the presynaptic, postsynaptic, and coincident activity is traced in three low-pass-filtering stages, requiring a total of eight state variables, whose dynamics are typically simulated with the fixed step size Euler method. We derive analytic solutions allowing an efficient event-driven implementation of this learning rule. Further speedup is achieved first by rewriting the model, which reduces the number of basic arithmetic operations per update to one half, and second by using look-up tables for the frequently calculated exponential decay. Ultimately, in a typical use case, the simulation using our approach is more than one order of magnitude faster than with the fixed step size Euler method. Aiming for a small memory footprint per BCPNN synapse, we also evaluate the use of fixed-point numbers for the state variables, and assess the number of bits required to achieve the same or better accuracy than with the conventional explicit Euler method. All of this will allow a real-time simulation of a reduced cortex model based on BCPNN in high performance computing. More importantly, with the analytic solution at hand and due to the reduced memory bandwidth, the learning rule can be efficiently implemented in dedicated or existing digital neuromorphic hardware. PMID:25657618
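
    The core of the event-driven speedup can be sketched for a single low-pass-filtered spike trace: between events the trace decays analytically, so the state only needs updating at spike times. The example below compares this with fixed-step Euler integration for one trace; the time constant and spike times are illustrative, and the full eight-state BCPNN rule is not reproduced.

    ```python
    # Minimal sketch of the event-driven idea: a low-pass filtered spike trace
    # z(t) with time constant tau obeys dz/dt = -z/tau + s(t), so between spikes
    # it decays analytically and only needs updating at spike times.
    import math

    TAU = 0.05            # trace time constant [s], illustrative
    SPIKES = [0.010, 0.032, 0.033, 0.120, 0.121, 0.300]   # spike times [s]

    def trace_event_driven(spikes, t_end):
        """Exact exponential decay applied only between events."""
        z, t_last = 0.0, 0.0
        for t in spikes:
            z *= math.exp(-(t - t_last) / TAU)   # analytic decay since last event
            z += 1.0                             # spike increments the trace
            t_last = t
        return z * math.exp(-(t_end - t_last) / TAU)

    def trace_euler(spikes, t_end, dt=1e-4):
        """Reference fixed-step explicit Euler integration."""
        z, spike_steps = 0.0, set(round(t / dt) for t in spikes)
        for step in range(int(t_end / dt)):
            z += dt * (-z / TAU)
            if step in spike_steps:
                z += 1.0
        return z

    print("event-driven:", trace_event_driven(SPIKES, 0.4))
    print("euler       :", trace_euler(SPIKES, 0.4))
    ```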

  18. Early rigorous control interventions can largely reduce dengue outbreak magnitude: experience from Chaozhou, China.

    PubMed

    Liu, Tao; Zhu, Guanghu; He, Jianfeng; Song, Tie; Zhang, Meng; Lin, Hualiang; Xiao, Jianpeng; Zeng, Weilin; Li, Xing; Li, Zhihao; Xie, Runsheng; Zhong, Haojie; Wu, Xiaocheng; Hu, Wenbiao; Zhang, Yonghui; Ma, Wenjun

    2017-08-02

    Dengue fever is a severe public health challenge in south China. A dengue outbreak was reported in Chaozhou city, China in 2015. Intensified interventions were implemented by the government to control the epidemic. However, the degree to which the intensified control measures reduced the size of the epidemic, and when such measures should be initiated to reduce the risk of large dengue outbreaks developing, remained unknown. We selected Xiangqiao district as the study setting because the majority of the indigenous cases (90.6%) in Chaozhou city were from this district. The numbers of daily indigenous dengue cases in 2015 were collected through the national infectious diseases and vectors surveillance system, and daily Breteau Index (BI) data were reported by the local public health department. We used a compartmental dynamic SEIR (Susceptible, Exposed, Infected and Removed) model to assess the effectiveness of control interventions and to evaluate the effect of intervention timing on the dengue epidemic. A total of 1250 indigenous dengue cases were reported from Xiangqiao district. The results of SEIR modeling using BI as an indicator of actual control interventions showed a total of 1255 dengue cases, which is close to the reported number (n = 1250). The size and duration of the outbreak were highly sensitive to the intensity and timing of interventions: the earlier and more rigorous the control interventions, the more effective they were. Even when the interventions were initiated several weeks after the onset of the dengue outbreak, they still greatly reduced the prevalence and duration of the outbreak. This study suggests that early implementation of rigorous dengue interventions can effectively reduce the epidemic size and shorten the epidemic duration.
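
    A minimal sketch of the modeling idea is given below: a basic SEIR model in which an intervention scales down the transmission rate from a chosen start day, so the effect of intervention timing on final outbreak size can be compared. The population size, rates, and intervention strength are illustrative assumptions, not the fitted Chaozhou/Xiangqiao model or its BI-driven transmission term.

    ```python
    # Minimal sketch of a vector-free SEIR model with an intervention that scales
    # down the transmission rate from a chosen start day.  Parameter values and
    # the intervention form are illustrative.
    import numpy as np
    from scipy.integrate import solve_ivp

    N = 500_000                           # assumed district population
    BETA0, SIGMA, GAMMA = 0.6, 1/6, 1/7   # transmission, incubation, recovery [1/day]

    def seir(t, y, t_start, strength):
        S, E, I, R = y
        beta = BETA0 * (1.0 - strength) if t >= t_start else BETA0
        new_inf = beta * S * I / N
        return [-new_inf, new_inf - SIGMA * E, SIGMA * E - GAMMA * I, GAMMA * I]

    def outbreak_size(t_start, strength=0.7, days=200):
        y0 = [N - 10, 0, 10, 0]
        sol = solve_ivp(seir, (0, days), y0, args=(t_start, strength))
        return N - sol.y[0, -1]           # cumulative infections = depletion of S

    for t_start in (20, 40, 60, 200):     # 200 ~ effectively no intervention
        print(f"intervention at day {t_start:3d}: "
              f"total infections ~ {outbreak_size(t_start):,.0f}")
    ```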

  19. Reducing the computational footprint for real-time BCPNN learning.

    PubMed

    Vogginger, Bernhard; Schüffny, René; Lansner, Anders; Cederström, Love; Partzsch, Johannes; Höppner, Sebastian

    2015-01-01

    The implementation of synaptic plasticity in neural simulation or neuromorphic hardware is usually very resource-intensive, often requiring a compromise between efficiency and flexibility. A versatile, but computationally expensive plasticity mechanism is provided by the Bayesian Confidence Propagation Neural Network (BCPNN) paradigm. Building upon Bayesian statistics, and having clear links to biological plasticity processes, the BCPNN learning rule has been applied in many fields, ranging from data classification, associative memory, reward-based learning, and probabilistic inference to cortical attractor memory networks. In the spike-based version of this learning rule, the presynaptic, postsynaptic, and coincident activity is traced in three low-pass-filtering stages, requiring a total of eight state variables, whose dynamics are typically simulated with the fixed step size Euler method. We derive analytic solutions allowing an efficient event-driven implementation of this learning rule. Further speedup is achieved first by rewriting the model, which reduces the number of basic arithmetic operations per update to one half, and second by using look-up tables for the frequently calculated exponential decay. Ultimately, in a typical use case, the simulation using our approach is more than one order of magnitude faster than with the fixed step size Euler method. Aiming for a small memory footprint per BCPNN synapse, we also evaluate the use of fixed-point numbers for the state variables, and assess the number of bits required to achieve the same or better accuracy than with the conventional explicit Euler method. All of this will allow a real-time simulation of a reduced cortex model based on BCPNN in high performance computing. More importantly, with the analytic solution at hand and due to the reduced memory bandwidth, the learning rule can be efficiently implemented in dedicated or existing digital neuromorphic hardware.

  20. Using Runtime Analysis to Guide Model Checking of Java Programs

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Norvig, Peter (Technical Monitor)

    2001-01-01

    This paper describes how two runtime analysis algorithms, an existing data race detection algorithm and a new deadlock detection algorithm, have been implemented to analyze Java programs. Runtime analysis is based on the idea of executing the program once and observing the generated run to extract various kinds of information. This information can then be used to predict whether other, different runs may violate some properties of interest, in addition, of course, to demonstrating whether the generated run itself violates such properties. These runtime analyses can be performed stand-alone to generate a set of warnings. It is furthermore demonstrated how these warnings can be used to guide a model checker, thereby reducing the search space. The described techniques have been implemented in the home-grown Java model checker called PathFinder.
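
    As a sketch of the general class of deadlock-prediction analyses, the example below builds a lock-order graph from a single observed trace (recording which locks a thread already holds when it acquires another) and reports cycles as potential deadlocks. This illustrates the technique family only; it is not the specific algorithm or the data race analysis described in the paper.

    ```python
    # Minimal sketch of lock-graph based deadlock prediction from one observed run:
    # record, for each acquisition, the locks the thread already holds, then report
    # cycles in the resulting lock-order graph as potential deadlocks.
    from collections import defaultdict

    # An observed trace of (thread, event, lock) tuples, e.g. from instrumentation.
    trace = [
        ("T1", "acquire", "A"), ("T1", "acquire", "B"),
        ("T1", "release", "B"), ("T1", "release", "A"),
        ("T2", "acquire", "B"), ("T2", "acquire", "A"),
        ("T2", "release", "A"), ("T2", "release", "B"),
    ]

    def lock_order_graph(trace):
        held = defaultdict(set)              # locks currently held per thread
        edges = defaultdict(set)             # edge l1 -> l2: l2 taken while holding l1
        for thread, event, lock in trace:
            if event == "acquire":
                for h in held[thread]:
                    edges[h].add(lock)
                held[thread].add(lock)
            else:
                held[thread].discard(lock)
        return edges

    def has_cycle(edges):
        """DFS-based cycle detection over the lock-order graph."""
        WHITE, GREY, BLACK = 0, 1, 2
        color = defaultdict(int)
        def visit(u):
            color[u] = GREY
            for v in edges[u]:
                if color[v] == GREY or (color[v] == WHITE and visit(v)):
                    return True
            color[u] = BLACK
            return False
        return any(color[u] == WHITE and visit(u) for u in list(edges))

    print("potential deadlock:", has_cycle(lock_order_graph(trace)))
    ```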

  1. Integrating empowerment evaluation and quality improvement to achieve healthcare improvement outcomes

    PubMed Central

    Wandersman, Abraham; Alia, Kassandra Ann; Cook, Brittany; Ramaswamy, Rohit

    2015-01-01

    While the body of evidence-based healthcare interventions grows, the ability of health systems to deliver these interventions effectively and efficiently lags behind. Quality improvement approaches, such as the model for improvement, have demonstrated some success in healthcare but their impact has been lessened by implementation challenges. To help address these challenges, we describe the empowerment evaluation approach that has been developed by programme evaluators and a method for its application (Getting To Outcomes (GTO)). We then describe how GTO can be used to implement healthcare interventions. An illustrative healthcare quality improvement example that compares the model for improvement and the GTO method for reducing hospital admissions through improved diabetes care is described. We conclude with suggestions for integrating GTO and the model for improvement. PMID:26178332

  2. Assessment of the Efficiency of Hospitals Before and After the Implementation of Health Sector Evolution Plan in Iran Based on Pabon Lasso Model

    PubMed Central

    MORADI, Ghobad; PIROOZI, Bakhtiar; SAFARI, Hossein; ESMAIL NASAB, Nader; MOHAMADI BOLBANABAD, Amjad; YARI, Arezoo

    2017-01-01

    Background: The Pabon Lasso model was applied to assess the relative performance of hospitals affiliated to Kurdistan University of Medical Sciences (KUMS) before and after the implementation of the Health Sector Evolution Plan (HSEP) in Iran. Methods: This cross-sectional study was carried out in 11 public hospitals affiliated to KUMS in 2015. For the twelve months before and after the implementation of the first phase of HSEP, a checklist was used to collect data from computerized databases within the hospitals’ admission and discharge units. The Pabon Lasso model includes three indices: bed turnover, bed occupancy ratio, and average length of stay. Results: Analysis of hospital performance showed an increase in the mean bed occupancy and turnover ratios, which changed from 65.40% and 86.22 times/year during the 12 months before HSEP to 69.97% and 90.98 times/year during the 12 months after, respectively. According to the Pabon Lasso model, before the implementation of HSEP, 27.27% and 36.36% of the hospitals were entirely efficient and inefficient, respectively, whilst after the implementation of HSEP these figures changed to 18.18% and 27.27%, respectively. Conclusion: The bed occupancy and turnover ratio indicators increased by 4% in the studied hospitals after the implementation of HSEP. The number of hospitals in the efficient zone decreased because the Pabon Lasso model measures efficiency in relative terms. Since more than 50% of the hospitals in the studied province have not yet reached their optimal bed occupancy ratio (more than 70%), a suitable short-term strategy for improving efficiency is to stop the further expansion of hospitals as well as of the number of hospital beds. PMID:28435825

  3. Low-dimensional spike rate models derived from networks of adaptive integrate-and-fire neurons: Comparison and implementation.

    PubMed

    Augustin, Moritz; Ladenbauer, Josef; Baumann, Fabian; Obermayer, Klaus

    2017-06-01

    The spiking activity of single neurons can be well described by a nonlinear integrate-and-fire model that includes somatic adaptation. When exposed to fluctuating inputs, sparsely coupled populations of these model neurons exhibit stochastic collective dynamics that can be effectively characterized using the Fokker-Planck equation. This approach, however, leads to a model with an infinite-dimensional state space and non-standard boundary conditions. Here we derive from that description four simple models for the spike rate dynamics in terms of low-dimensional ordinary differential equations, using two different reduction techniques: one uses the spectral decomposition of the Fokker-Planck operator, the other is based on a cascade of two linear filters and a nonlinearity, which are determined from the Fokker-Planck equation and semi-analytically approximated. We evaluate the reduced models for a wide range of biologically plausible input statistics and find that both approximation approaches lead to spike rate models that accurately reproduce the spiking behavior of the underlying adaptive integrate-and-fire population. In particular, the cascade-based models are overall the most accurate and robust, especially in the sensitive region of rapidly changing input. For the mean-driven regime, however, when input fluctuations are not too strong and fast, the best performing model is based on the spectral decomposition. The low-dimensional models also reproduce well the stable oscillatory spike rate dynamics that are generated either by recurrent synaptic excitation and neuronal adaptation or through delayed inhibitory synaptic feedback. The computational demands of the reduced models are very low, but the implementation complexity differs between the different model variants. Therefore we have made available, as open source software, implementations that allow the low-dimensional spike rate models, as well as the Fokker-Planck partial differential equation, to be numerically integrated in efficient ways for arbitrary model parametrizations. The derived spike rate descriptions retain a direct link to the properties of single neurons, allow for convenient mathematical analyses of network states, and are well suited for application in neural mass/mean-field based brain network models.
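
    A minimal sketch of what integrating such a low-dimensional description looks like is given below: a two-variable rate model with slow adaptation driven by a step in the mean input. The transfer function and all parameters are illustrative stand-ins, not the spectral or cascade-based models derived in the paper (whose open-source implementations contain the actual equations).

    ```python
    # Minimal sketch of a low-dimensional spike-rate model with adaptation:
    # r' = (phi(mu - w) - r)/tau_r,  w' = (b*r - w)/tau_w, driven by a step input.
    import numpy as np
    from scipy.integrate import solve_ivp

    TAU_R, TAU_W, B = 0.01, 0.2, 0.015   # rate/adaptation time constants, coupling

    def phi(x):
        """Illustrative threshold-linear f-I curve [spikes/s per unit input]."""
        return 40.0 * np.maximum(x, 0.0)

    def mu(t):
        """Step increase of the mean input at t = 0.5 s."""
        return 0.5 if t < 0.5 else 1.5

    def rhs(t, y):
        r, w = y
        dr = (phi(mu(t) - w) - r) / TAU_R
        dw = (B * r - w) / TAU_W
        return [dr, dw]

    sol = solve_ivp(rhs, (0.0, 2.0), [0.0, 0.0], max_step=1e-3)
    print("rate before step   :", sol.y[0][np.searchsorted(sol.t, 0.45)])
    print("peak rate after step:", sol.y[0].max())
    print("adapted rate at 2 s :", sol.y[0][-1])
    ```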

  4. Low-dimensional spike rate models derived from networks of adaptive integrate-and-fire neurons: Comparison and implementation

    PubMed Central

    Baumann, Fabian; Obermayer, Klaus

    2017-01-01

    The spiking activity of single neurons can be well described by a nonlinear integrate-and-fire model that includes somatic adaptation. When exposed to fluctuating inputs, sparsely coupled populations of these model neurons exhibit stochastic collective dynamics that can be effectively characterized using the Fokker-Planck equation. This approach, however, leads to a model with an infinite-dimensional state space and non-standard boundary conditions. Here we derive from that description four simple models for the spike rate dynamics in terms of low-dimensional ordinary differential equations, using two different reduction techniques: one uses the spectral decomposition of the Fokker-Planck operator, the other is based on a cascade of two linear filters and a nonlinearity, which are determined from the Fokker-Planck equation and semi-analytically approximated. We evaluate the reduced models for a wide range of biologically plausible input statistics and find that both approximation approaches lead to spike rate models that accurately reproduce the spiking behavior of the underlying adaptive integrate-and-fire population. In particular, the cascade-based models are overall the most accurate and robust, especially in the sensitive region of rapidly changing input. For the mean-driven regime, however, when input fluctuations are not too strong and fast, the best performing model is based on the spectral decomposition. The low-dimensional models also reproduce well the stable oscillatory spike rate dynamics that are generated either by recurrent synaptic excitation and neuronal adaptation or through delayed inhibitory synaptic feedback. The computational demands of the reduced models are very low, but the implementation complexity differs between the different model variants. Therefore we have made available, as open source software, implementations that allow the low-dimensional spike rate models, as well as the Fokker-Planck partial differential equation, to be numerically integrated in efficient ways for arbitrary model parametrizations. The derived spike rate descriptions retain a direct link to the properties of single neurons, allow for convenient mathematical analyses of network states, and are well suited for application in neural mass/mean-field based brain network models. PMID:28644841

  5. Network Reduction Algorithm for Developing Distribution Feeders for Real-Time Simulators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nagarajan, Adarsh; Nelson, Austin A; Prabakar, Kumaraguru

    As advanced grid-support functions (AGF) become more widely used in grid-connected photovoltaic (PV) inverters, utilities are increasingly interested in their impacts when implemented in the field. These effects can be understood by modeling feeders in real-time simulators and testing PV inverters using power hardware-in-the-loop (PHIL) techniques. This paper presents a novel feeder model reduction algorithm using a ruin & reconstruct methodology that enables large feeders to be solved and operated on real-time computing platforms. Two Hawaiian Electric feeder models in Synergi Electric's load flow software were converted to reduced-order models in OpenDSS, and subsequently implemented in the OPAL-RT real-time digital testing platform. Smart PV inverters were added to the real-time model with AGF responses modeled after characterizing commercially available hardware inverters. Finally, hardware inverters were tested in conjunction with the real-time model using PHIL techniques so that the effects of AGFs on the feeders could be analyzed.

  6. Network Reduction Algorithm for Developing Distribution Feeders for Real-Time Simulators: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nagarajan, Adarsh; Nelson, Austin; Prabakar, Kumaraguru

    As advanced grid-support functions (AGF) become more widely used in grid-connected photovoltaic (PV) inverters, utilities are increasingly interested in their impacts when implemented in the field. These effects can be understood by modeling feeders in real-time systems and testing PV inverters using power hardware-in-the-loop (PHIL) techniques. This paper presents a novel feeder model reduction algorithm using a Monte Carlo method that enables large feeders to be solved and operated on real-time computing platforms. Two Hawaiian Electric feeder models in Synergi Electric's load flow software were converted to reduced-order models in OpenDSS, and subsequently implemented in the OPAL-RT real-time digital testing platform. Smart PV inverters were added to the real-time model with AGF responses modeled after characterizing commercially available hardware inverters. Finally, hardware inverters were tested in conjunction with the real-time model using PHIL techniques so that the effects of AGFs on the chosen feeders could be analyzed.

  7. A residency clinic chronic condition management quality improvement project.

    PubMed

    Halverson, Larry W; Sontheimer, Dan; Duvall, Sharon

    2007-02-01

    Quality improvement in chronic disease management is a major agenda for improving health and reducing health care costs. A six-component chronic disease management model can help guide this effort. Several characteristics of the "new model" of family medicine described by the Future of Family Medicine (FFM) Project Leadership Committee are promulgated to foster practice changes that improve quality. Our objective was to implement and assess a quality improvement project guided by the components of a chronic disease management model and FFM new model characteristics. Diabetes was selected as a model chronic disease focus. Multiple practice changes were implemented. A mature electronic medical record facilitated data collection and measurement of quality improvement progress. Data from the diabetes registry demonstrates that our efforts have been effective. Significant improvement occurred in five out of six quality indicators. Multidisciplinary teamwork in a model residency practice guided by chronic disease management principles and the FFM new model characteristics can produce significant management improvements in one important chronic disease.

  8. Metabolic modeling of dynamic 13C NMR isotopomer data in the brain in vivo: Fast screening of metabolic models using automated generation of differential equations

    PubMed Central

    Tiret, Brice; Shestov, Alexander A.; Valette, Julien; Henry, Pierre-Gilles

    2017-01-01

    Most current brain metabolic models are not capable of taking into account the dynamic isotopomer information available from fine structure multiplets in 13C spectra, due to the difficulty of implementing such models. Here we present a new approach that allows automatic implementation of multi-compartment metabolic models capable of fitting any number of 13C isotopomer curves in the brain. The new automated approach also makes it possible to quickly modify and test new models to best describe the experimental data. We demonstrate the power of the new approach by testing the effect of adding separate pyruvate pools in astrocytes and neurons, and adding a vesicular neuronal glutamate pool. Including both changes reduced the global fit residual by half and pointed to dilution of label prior to entry into the astrocytic TCA cycle as the main source of glutamine dilution. The glutamate-glutamine cycle rate was particularly sensitive to changes in the model. PMID:26553273
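
    The automated-generation idea can be illustrated with a short sketch that assembles metabolite mass-balance differential equations from a declarative flux list instead of hand-coding them. The two-pool glutamate/glutamine network, pool sizes, and flux values are illustrative assumptions, and isotopomer fine structure is not tracked here.

    ```python
    # Minimal sketch of automatically generating metabolite mass-balance ODEs for
    # fractional enrichments E from a declarative list of fluxes, in the spirit of
    # auto-generating model equations rather than hand-coding them.
    import numpy as np
    from scipy.integrate import solve_ivp

    # Each reaction: (substrate pool or None for a labeled inflow, product pool
    # or None for an outflow, flux value [umol/g/min]).  Values are illustrative.
    reactions = [
        (None,  "glu", 0.60),   # labeled inflow into glutamate
        ("glu", "gln", 0.30),   # glutamate -> glutamine (cycle flux)
        ("gln", "glu", 0.30),   # glutamine -> glutamate (return flux)
        ("glu", None,  0.60),   # label outflow
    ]
    pool_sizes = {"glu": 9.0, "gln": 4.0}       # total concentrations [umol/g]
    pools = sorted(pool_sizes)
    index = {p: i for i, p in enumerate(pools)}

    def make_rhs(reactions):
        """Build dE/dt for the fractional enrichments E from the reaction list."""
        def rhs(t, E):
            dE = np.zeros_like(E)
            for src, dst, v in reactions:
                label_flux = v * (1.0 if src is None else E[index[src]])
                if src is not None:
                    dE[index[src]] -= label_flux / pool_sizes[src]
                if dst is not None:
                    dE[index[dst]] += label_flux / pool_sizes[dst]
            return dE
        return rhs

    sol = solve_ivp(make_rhs(reactions), (0.0, 60.0), np.zeros(len(pools)),
                    t_eval=np.linspace(0.0, 60.0, 7))
    for p in pools:
        print(p, np.round(sol.y[index[p]], 3))
    ```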

  9. A hybrid expectation maximisation and MCMC sampling algorithm to implement Bayesian mixture model based genomic prediction and QTL mapping.

    PubMed

    Wang, Tingting; Chen, Yi-Ping Phoebe; Bowman, Phil J; Goddard, Michael E; Hayes, Ben J

    2016-09-21

    Bayesian mixture models in which the effects of SNPs are assumed to come from normal distributions with different variances are attractive for simultaneous genomic prediction and QTL mapping. These models are usually implemented with Markov chain Monte Carlo (MCMC) sampling, which requires long compute times with large genomic data sets. Here, we present an efficient approach (termed HyB_BR), which is a hybrid of an Expectation-Maximisation algorithm followed by a limited number of MCMC iterations, without the requirement for burn-in. To test prediction accuracy from HyB_BR, dairy cattle and human disease trait data were used. In the dairy cattle data, there were four quantitative traits (milk volume, protein kg, fat% in milk and fertility) measured in 16,214 cattle from two breeds genotyped for 632,002 SNPs. Validation of genomic predictions was in a subset of cattle either from the reference set or in animals from a third breed that was not in the reference set. In all cases, HyB_BR gave almost identical accuracies to Bayesian mixture models implemented with full MCMC; however, computational time was reduced to as little as 1/17 of that required by full MCMC. The SNPs with high posterior probability of a non-zero effect were also very similar between full MCMC and HyB_BR, with several known genes affecting milk production in this category, as well as some novel genes. HyB_BR was also applied to seven human diseases with 4890 individuals genotyped for around 300 K SNPs in a case/control design, from the Wellcome Trust Case Control Consortium (WTCCC). In this data set, the results demonstrated again that HyB_BR performed as well as Bayesian mixture models with full MCMC for genomic prediction and genetic architecture inference, while reducing the computational time from 45 h with full MCMC to 3 h with HyB_BR. The results for quantitative traits in cattle and disease in humans demonstrate that HyB_BR can perform as well as Bayesian mixture models implemented with full MCMC in terms of prediction accuracy, but up to 17 times faster than the full MCMC implementations. The HyB_BR algorithm makes simultaneous genomic prediction, QTL mapping and inference of genetic architecture feasible in large genomic data sets.

  10. GPGPU-based explicit finite element computations for applications in biomechanics: the performance of material models, element technologies, and hardware generations.

    PubMed

    Strbac, V; Pierce, D M; Vander Sloten, J; Famaey, N

    2017-12-01

    Finite element (FE) simulations are increasingly valuable in assessing and improving the performance of biomedical devices and procedures. Due to high computational demands such simulations may become difficult or even infeasible, especially when considering nearly incompressible and anisotropic material models prevalent in analyses of soft tissues. Implementations of GPGPU-based explicit FEs predominantly cover isotropic materials, e.g. the neo-Hookean model. To elucidate the computational expense of anisotropic materials, we implement the Gasser-Ogden-Holzapfel dispersed, fiber-reinforced model and compare solution times against the neo-Hookean model. Implementations of GPGPU-based explicit FEs conventionally rely on single-point (under) integration. To elucidate the expense of full and selective-reduced integration (more reliable), we implement both and compare the corresponding solution times against those generated using underintegration. To better understand the advancement of hardware, we compare results generated using representative Nvidia GPGPUs from three recent generations: Fermi (C2075), Kepler (K20c), and Maxwell (GTX980). We explore scaling by solving the same boundary value problem (an extension-inflation test on a segment of human aorta) with progressively larger FE meshes. Our results demonstrate substantial improvements in simulation speeds relative to two benchmark FE codes (up to 300× while maintaining accuracy), and thus open many avenues to novel applications in biomechanics and medicine.

  11. Workplace prevention and promotion strategies.

    PubMed

    Vézina, Michel; Bourbonnais, Renée; Brisson, Chantal; Trudel, Louis

    2004-01-01

    Psychosocial factors refer to all organizational factors and interpersonal relationships in the workplace that may affect the health of the workers. Currently, two psychosocial risk models are universally recognized for producing solid scientific knowledge regarding the vital link between social or psychological phenomena at work and the development of several diseases, such as cardiovascular diseases or depression. The first is the "job demand-control-support" model, which was defined by Karasek and to which the concept of social support has been added; the second is the "effort/reward imbalance" model defined by Siegrist. The public health perspective calls for theoretical models based on certain psychosocial attributes of the work environment for which there is empirical evidence of their pathogenic potential for exposed workers. Not only do these models reduce the complexity of the psychosocial reality of work to components that are significant in terms of health risks, but they also facilitate the development and implementation of workplace interventions. Psychosocial risk intervention strategies currently implemented by companies are predominantly individual-oriented and aim chiefly at reducing the effects of stressful work situations by improving individual ability to adapt to the situation and manage stress. Like personal protection equipment for exposure to physical or chemical risks, these secondary prevention measures are commendable but insufficient, because they aim to reduce only the symptoms and not the cause of problems. Any intervention program for these risks should necessarily include a primary prevention component with a view to eliminating, or at least reducing, the psychosocial pathogenic agents in the workplace. Several authors have suggested that well-structured organizational approaches are most effective and should generate more important, longer-lasting effects than individual approaches. However, the evidence should be strengthened by more systematic studies to assess the models, their implementation and the outcomes for employers and employees alike. The research agenda on mental health and the workplace should have the following goals: to foster the development and evaluation of well-adapted models of interventions designed to reduce adverse psychosocial factors and their mental health effects; to give a better understanding of the prevalence of work organization risk factors in Canada, how they may be changing and how they affect mental health in the long term; to acquire an understanding of the effects on mental health of prominent trends in organizational practices, such as restructuring, lean production and flexible staffing (all of which result in precarious employment), that may pose special risks for women, immigrants or aging workers in Canada; and to collect data on the considerable direct and indirect costs to business, workers and society of work-related stress in Canada.

  12. Analysis and Prediction of Weather Impacted Ground Stop Operations

    NASA Technical Reports Server (NTRS)

    Wang, Yao Xun

    2014-01-01

    When air traffic demand is expected to exceed the available airport capacity for a short period of time, Ground Stop (GS) operations are implemented by Federal Aviation Administration (FAA) Traffic Flow Management (TFM). A GS requires departing aircraft meeting specific criteria to remain on the ground, to achieve reduced demand at the constrained destination airport until the end of the GS. This paper provides a high-level overview of the statistical distributions as well as the causal factors for GSs at major airports in the United States. The characteristics of GSs, the impact of weather on GSs, GS variations with delays, and the interaction between GSs and Ground Delay Programs (GDPs) at Newark Liberty International Airport (EWR) are investigated. Machine learning methods are used to generate classification models that map historical airport weather forecasts, scheduled traffic, and other airport conditions to implemented GS/GDP operations, and the models are evaluated using cross-validation. This modeling approach produced promising results, yielding an 85% overall classification accuracy in distinguishing implemented GS days from normal days without GS and GDP operations, and a 71% accuracy in differentiating GS and GDP implemented days from GDP-only days.
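
    The classification-with-cross-validation step can be sketched as below: simple daily features stand in for weather forecasts and scheduled demand, a random forest predicts a GS / no-GS label, and k-fold cross-validation scores the model. The features, the synthetic data, and the model choice are illustrative assumptions, not the FAA/EWR data set or the classifiers used in the paper.

    ```python
    # Minimal sketch: map daily features (forecast and scheduled-demand proxies)
    # to a GS / no-GS label and score the model with 5-fold cross-validation.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(7)
    n_days = 600

    # Illustrative daily features: [forecast wind, forecast ceiling, visibility,
    # scheduled departures].  The label generation rule is purely synthetic.
    X = np.column_stack([
        rng.gamma(2.0, 6.0, n_days),          # wind [kt]
        rng.uniform(200, 5000, n_days),       # ceiling [ft]
        rng.uniform(0.5, 10.0, n_days),       # visibility [sm]
        rng.normal(1200, 100, n_days),        # scheduled departures
    ])
    risk = 0.04 * X[:, 0] - 0.0004 * X[:, 1] - 0.1 * X[:, 2] + 0.002 * X[:, 3]
    y = (risk + rng.normal(0, 0.5, n_days) > np.median(risk)).astype(int)

    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    scores = cross_val_score(clf, X, y, cv=5)   # 5-fold cross-validation accuracy
    print("fold accuracies:", np.round(scores, 3), "mean:", scores.mean().round(3))
    ```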

  13. The n-by-T Target Discharge Strategy for Inpatient Units.

    PubMed

    Parikh, Pratik J; Ballester, Nicholas; Ramsey, Kylie; Kong, Nan; Pook, Nancy

    2017-07-01

    Ineffective inpatient discharge planning often causes discharge delays and upstream boarding. While an optimal discharge strategy that works across all units at a hospital is likely difficult to identify and implement, a strategy that provides a reasonable target to the discharge team appears feasible. We used observational and retrospective data from an inpatient trauma unit at a Level 2 trauma center in the Midwest US. Our proposed novel n-by-T strategy (discharge n patients by the Tth hour) was evaluated using a validated simulation model. Outcome measures were of two types: time-based (mean discharge completion and upstream boarding times) and capacity-based (increase in annual inpatient and upstream bed hours). Data from the pilot implementation of a 2-by-12 strategy at the unit were obtained and analyzed. The model suggested that the 1-by-T and 2-by-T strategies could advance the mean completion times by over 1.38 and 2.72 h, respectively (for 10 AM ≤ T ≤ noon, occupancy rate = 85%); the corresponding mean boarding time reductions were nearly 11% and 15%. These strategies could increase the availability of annual inpatient and upstream bed hours by at least 2,469 and 500, respectively. At a 100% occupancy rate, the hospital-favored 2-by-12 strategy reduced the mean boarding time by 26.1%. A pilot implementation of the 2-by-12 strategy at the unit corroborated the model findings: a 1.98-h advancement in completion times (P<0.0001) and a 14.5% reduction in boarding times (P = 0.027). Target discharge strategies, such as n-by-T, can help substantially reduce discharge lateness and upstream boarding, especially during high unit occupancy. To sustain implementation, the necessary commitment from the unit staff and physicians is vital, and may require some training.

  14. A Computing based Simulation Model for Missile Guidance in Planar Domain

    NASA Astrophysics Data System (ADS)

    Chauhan, Deepak Singh; Sharma, Rajiv

    2017-10-01

    This paper presents the design, development and implementation of a computing-based simulation model for interceptor missile guidance, for countering an anti-ship missile through a navigation law. It investigates the possibility of deriving, testing and implementing an efficient variation of the PN and RPN laws. A new guidance law [the true combined proportional navigation (TCPN) guidance law] that combines the strengths of both PN and RPN and has superior capturability in a specified zone of interest is presented in this paper. The presented proportional navigation (PN) guidance law is modeled in a two-dimensional planar engagement model and its performance is studied with respect to a varying navigation ratio (N) that depends on the heading error (HE) and the missile lead angle. The advantage of a varying navigation ratio is that, if N' > 2, Vc > 0 and Vm > 0, the sign of the navigation ratio is determined by cos(ɛ + HE): for cos(ɛ + HE) ≥ 0 and N > 0 the formulation reduces to that of PN, and for cos(ɛ + HE) < 0 and N < 0 the formulation reduces to that of RPN. Hence, depending upon the value of cos(ɛ + HE), the presented guidance strategy switches between the PN navigation ratio and the RPN navigation ratio. The theoretical framework of the TCPN guidance law is implemented in a two-dimensional parameter setting. An important feature of TCPN is the HE, and the aim is to achieve lower values of the heading error in simulation. The results presented in this paper show the efficiency of the simulation model and also establish that TCPN can be an accurate guidance strategy with its own range of application and suitability.
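
    As a baseline for the guidance laws discussed above, the sketch below simulates classical planar proportional navigation, where the commanded lateral acceleration is N' times the closing velocity times the line-of-sight rate. The engagement geometry, speeds, and navigation constant are illustrative, and the TCPN switching logic is not implemented.

    ```python
    # Minimal sketch of classical planar proportional navigation (PN): the missile
    # lateral acceleration is a_m = N' * Vc * lambda_dot, applied perpendicular to
    # the missile velocity.  Geometry and constants are illustrative only.
    import numpy as np

    DT, N_PRIME = 0.01, 4.0
    m_pos = np.array([0.0, 0.0])          # missile position [m]
    m_speed, m_head = 600.0, np.radians(30.0)
    t_pos = np.array([8000.0, 3000.0])    # target position [m]
    t_vel = np.array([-250.0, 0.0])       # target velocity [m/s]

    prev_los = np.arctan2(t_pos[1] - m_pos[1], t_pos[0] - m_pos[0])
    for step in range(6000):
        m_vel = m_speed * np.array([np.cos(m_head), np.sin(m_head)])
        rel = t_pos - m_pos
        r = np.linalg.norm(rel)
        if r < 20.0:                       # per-step closure ~8 m, so this is reached
            print(f"intercept (range < 20 m) at t = {step * DT:.2f} s")
            break
        los = np.arctan2(rel[1], rel[0])
        los_rate = (los - prev_los) / DT               # line-of-sight rate
        closing_v = -np.dot(rel, m_vel - t_vel) / r    # closing velocity Vc
        a_cmd = N_PRIME * closing_v * los_rate         # PN lateral acceleration
        m_head += (a_cmd / m_speed) * DT               # turn rate = a / V
        m_pos = m_pos + m_vel * DT
        t_pos = t_pos + t_vel * DT
        prev_los = los
    else:
        print("no intercept within simulation time")
    ```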

  15. In silico regenerative medicine: how computational tools allow regulatory and financial challenges to be addressed in a volatile market

    PubMed Central

    Geris, L.; Guyot, Y.; Schrooten, J.; Papantoniou, I.

    2016-01-01

    The cell therapy market is a highly volatile one, due to the use of disruptive technologies, the current economic situation and the small size of the market. In such a market, companies as well as academic research institutes are in need of tools to advance their understanding and, at the same time, reduce their R&D costs, increase product quality and productivity, and reduce the time to market. An additional difficulty is the regulatory path that needs to be followed, which is challenging in the case of cell-based therapeutic products and should rely on the implementation of quality by design (QbD) principles. In silico modelling is a tool that allows the above-mentioned challenges to be addressed in the field of regenerative medicine. This review discusses such in silico models and focuses more specifically on the bioprocess. Three (clusters of) examples related to this subject are discussed. The first example comes from the pharmaceutical engineering field where QbD principles and their implementation through the use of in silico models are both a regulatory and economic necessity. The second example is related to the production of red blood cells. The described in silico model is mainly used to investigate the manufacturing process of the cell-therapeutic product, and pays special attention to the economic viability of the process. Finally, we describe the set-up of a model capturing essential events in the development of a tissue-engineered combination product in the context of bone tissue engineering. For each of the examples, a short introduction to some economic aspects is given, followed by a description of the in silico tool or tools that have been developed to allow the implementation of QbD principles and optimal design. PMID:27051516

  16. In silico regenerative medicine: how computational tools allow regulatory and financial challenges to be addressed in a volatile market.

    PubMed

    Geris, L; Guyot, Y; Schrooten, J; Papantoniou, I

    2016-04-06

    The cell therapy market is a highly volatile one, due to the use of disruptive technologies, the current economic situation and the small size of the market. In such a market, companies as well as academic research institutes are in need of tools to advance their understanding and, at the same time, reduce their R&D costs, increase product quality and productivity, and reduce the time to market. An additional difficulty is the regulatory path that needs to be followed, which is challenging in the case of cell-based therapeutic products and should rely on the implementation of quality by design (QbD) principles. In silico modelling is a tool that allows the above-mentioned challenges to be addressed in the field of regenerative medicine. This review discusses such in silico models and focuses more specifically on the bioprocess. Three (clusters of) examples related to this subject are discussed. The first example comes from the pharmaceutical engineering field where QbD principles and their implementation through the use of in silico models are both a regulatory and economic necessity. The second example is related to the production of red blood cells. The described in silico model is mainly used to investigate the manufacturing process of the cell-therapeutic product, and pays special attention to the economic viability of the process. Finally, we describe the set-up of a model capturing essential events in the development of a tissue-engineered combination product in the context of bone tissue engineering. For each of the examples, a short introduction to some economic aspects is given, followed by a description of the in silico tool or tools that have been developed to allow the implementation of QbD principles and optimal design.

  17. Energy-absorbing car seat designs for reducing whiplash.

    PubMed

    Himmetoglu, S; Acar, M; Bouazza-Marouf, K; Taylor, A J

    2008-12-01

    This study presents an investigation of anti-whiplash features that can be implemented in a car seat to reduce whiplash injuries in the case of a rear impact. The main emphasis is on achieving a seat design with good energy absorption properties. A biofidelic 50th-percentile male multi-body human model for rear impact is developed to evaluate the performance of car seat design concepts. The model is validated using the responses of 7 volunteers from the Japanese Automobile Research Institute (JARI) sled tests, which were performed at an impact speed of 8 kph with a rigid seat and without head restraint and seatbelt. A generic multi-body car seat model is also developed to implement various seatback and recliner properties, anti-whiplash devices, and head restraints. Using the same driving posture and the rigid seat of the JARI sled tests as the basic configuration, several anti-whiplash seats are designed to allow different types of motion for the seatback and seat-pan. The anti-whiplash car seat design concepts successfully limit neck internal motion until head-to-head-restraint contact occurs, and they exhibit low NICmax values (7 m²/s² on average). They are also effective in reducing neck compression forces and T1 forward accelerations. In principle, these car seat design concepts employ controlled recliner rotation and seat-pan displacement to limit the formation of the S-shape. This is accomplished by using anti-whiplash devices that absorb the crash energy in such a way that optimum protection is provided at different severities. The results indicate that the energy-absorbing car seat design concepts all demonstrate good whiplash-reducing performance at the IIWPG standard pulse. Especially in higher severity rear impacts, two of the car seat design concepts reduce the ramping of the occupant considerably.

  18. Sparsity enabled cluster reduced-order models for control

    NASA Astrophysics Data System (ADS)

    Kaiser, Eurika; Morzyński, Marek; Daviller, Guillaume; Kutz, J. Nathan; Brunton, Bingni W.; Brunton, Steven L.

    2018-01-01

    Characterizing and controlling nonlinear, multi-scale phenomena are central goals in science and engineering. Cluster-based reduced-order modeling (CROM) was introduced to exploit the underlying low-dimensional dynamics of complex systems. CROM builds a data-driven discretization of the Perron-Frobenius operator, resulting in a probabilistic model for ensembles of trajectories. A key advantage of CROM is that it embeds nonlinear dynamics in a linear framework, which enables the application of standard linear techniques to the nonlinear system. CROM is typically computed on high-dimensional data; however, access to and computations on this full-state data limit the online implementation of CROM for prediction and control. Here, we address this key challenge by identifying a small subset of critical measurements to learn an efficient CROM, referred to as sparsity-enabled CROM. In particular, we leverage compressive measurements to faithfully embed the cluster geometry and preserve the probabilistic dynamics. Further, we show how to identify fewer optimized sensor locations tailored to a specific problem that outperform random measurements. Both of these sparsity-enabled sensing strategies significantly reduce the burden of data acquisition and processing for low-latency in-time estimation and control. We illustrate this unsupervised learning approach on three different high-dimensional nonlinear dynamical systems from fluids with increasing complexity, with one application in flow control. Sparsity-enabled CROM is a critical facilitator for real-time implementation on high-dimensional systems where full-state information may be inaccessible.
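
    The core CROM construction can be sketched in a few lines: cluster snapshots of a trajectory with k-means and estimate the cluster-to-cluster transition probability matrix, i.e. a coarse Markov (Perron-Frobenius) model. The Lorenz system below stands in for high-dimensional flow data, and the sparsity-enabled sensor selection step is not shown.

    ```python
    # Minimal sketch of cluster-based reduced-order modeling (CROM): k-means
    # clustering of snapshots followed by a cluster transition probability matrix.
    import numpy as np
    from scipy.integrate import solve_ivp
    from sklearn.cluster import KMeans

    def lorenz(t, y, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
        x, yv, z = y
        return [sigma * (yv - x), x * (rho - z) - yv, x * yv - beta * z]

    # Generate a long trajectory and sample it uniformly in time.
    t_eval = np.arange(0.0, 200.0, 0.05)
    sol = solve_ivp(lorenz, (0.0, 200.0), [1.0, 1.0, 1.0], t_eval=t_eval, rtol=1e-8)
    snapshots = sol.y.T                               # shape (n_snapshots, n_state)

    # Step 1: cluster the snapshots into k centroids.
    k = 10
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(snapshots)

    # Step 2: count transitions between consecutive snapshots' clusters and
    # row-normalize to obtain the transition probability matrix P.
    P = np.zeros((k, k))
    for a, b in zip(labels[:-1], labels[1:]):
        P[a, b] += 1.0
    P /= P.sum(axis=1, keepdims=True)

    print("transition matrix row sums:", np.round(P.sum(axis=1), 3))
    print("most likely successor of each cluster:", P.argmax(axis=1))
    ```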

  19. Implementation Intentions Reduce Implicit Stereotype Activation and Application.

    PubMed

    Rees, Heather Rose; Rivers, Andrew Michael; Sherman, Jeffrey W

    2018-05-01

    Research has found that implementation intentions, if-then action plans (e.g., "if I see a Black face, I will think safe"), reduce stereotyping on implicit measures. However, it is unknown by what process(es) implementation intentions reduce implicit stereotyping. The present research examines the effects of implementation intentions on stereotype activation (e.g., extent to which stereotypic information is accessible) and stereotype application (e.g., extent to which accessible stereotypes are applied in judgment). In addition, we assessed the efficiency of implementation intentions by manipulating cognitive resources (e.g., digit-span, restricted response window) while participants made judgments on an implicit stereotyping measure. Across four studies, implementation intentions reduced implicit stereotyping. This decrease in stereotyping was associated with reductions in both stereotype activation and application. In addition, these effects of implementation intentions were highly efficient and associated with reduced stereotyping even for groups for which people may have little practice inhibiting stereotypes (e.g., gender).

  20. Implementation and on-sky results of an optimal wavefront controller for the MMT NGS adaptive optics system

    NASA Astrophysics Data System (ADS)

    Powell, Keith B.; Vaitheeswaran, Vidhya

    2010-07-01

    The MMT observatory has recently implemented and tested an optimal wavefront controller for the NGS adaptive optics system. Open-loop atmospheric data collected at the telescope is used as the input to a MATLAB-based analytical model. The model uses nonlinear constrained minimization to determine controller gains and optimize the system performance. The real-time controller performing the adaptive optics closed-loop operation is implemented on a dedicated high-performance PC-based quad-core server. The controller algorithm is written in C and uses the GNU scientific library for linear algebra. Tests at the MMT confirmed the optimal controller significantly reduced the residual RMS wavefront compared with the previous controller. Significant reductions in image FWHM and increased peak intensities were obtained in J, H and K-bands. The optimal PID controller is now operating as the baseline wavefront controller for the MMT NGS-AO system.
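
    A minimal sketch of the gain-optimization step described above, assuming the telescope's open-loop modal data and a user-supplied loop model are available; the function names, cost function, and bounds are hypothetical, and the authors' MATLAB implementation is not reproduced here.

```python
# Hedged sketch of controller-gain optimization by constrained minimization:
# choose PID-like gains that minimize a residual-wavefront cost predicted by a
# user-supplied model of the loop driven by open-loop atmospheric data.
import numpy as np
from scipy.optimize import minimize

def residual_rms(gains, open_loop_modes, predict_residual):
    """predict_residual is a hypothetical model: (gains, data) -> residual series."""
    return float(np.sqrt(np.mean(predict_residual(gains, open_loop_modes) ** 2)))

def optimize_gains(open_loop_modes, predict_residual, g0=(0.3, 0.05, 0.0)):
    bounds = [(0.0, 1.0), (0.0, 0.5), (0.0, 0.2)]   # keep gains in a stable range
    res = minimize(residual_rms, x0=np.array(g0),
                   args=(open_loop_modes, predict_residual),
                   bounds=bounds, method="L-BFGS-B")
    return res.x
```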

  1. Employee Retention: A Challenge of the Nineties.

    ERIC Educational Resources Information Center

    Zeiss, Tony

    1990-01-01

    Considers ways in which community colleges can help employers implement programs to improve the work environment and retain trained workers. Presents a model for employee retention that has worked effectively in Pueblo, Colorado. Describes Pueblo Community College's cooperative program with the Wats Marketing Group to help reduce employee…

  2. Community Intervention Model to Reduce Inappropriate Antibiotic Use

    ERIC Educational Resources Information Center

    Alder, Stephen; Wuthrich, Amy; Haddadin, Bassam; Donnelly, Sharon; Hannah, Elizabeth Lyon; Stoddard, Greg; Benuzillo, Jose; Bateman, Kim; Samore, Matthew

    2010-01-01

    Background: The Inter-Mountain Project on Antibiotic Resistance and Therapy (IMPART) is an intervention that addresses emerging antimicrobial resistance and the reduction of unnecessary antimicrobial use. Purpose: This study assesses the design and implementation of the community intervention component of IMPART. Methods: The study was conducted…

  3. Effects of conservation practices on phosphorus loss reduction from an Indiana agricultural watershed

    USDA-ARS?s Scientific Manuscript database

    Phosphorus losses from agricultural lands have caused serious eutrophication problems, particularly in Lake Erie. However, techniques that can effectively reduce total and soluble phosphorus losses from croplands and drainage channels can be difficult to implement and gauge. This modeling study was ...

  4. Child Support Collection: A Stick-and-Carrot Approach.

    ERIC Educational Resources Information Center

    Cullen, Francis T.; And Others

    1980-01-01

    New York State's attempt to reduce welfare expenditures by collecting the child support payments of defaulting parents reinforces federal incentives containing penalties for localities operating ineffective collection programs. The state's program may serve as a model for the more effective implementation of legislation in other jurisdictions.…

  5. Optimizing Blocking and Nonblocking Reduction Operations for Multicore Systems: Hierarchical Design and Implementation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gorentla Venkata, Manjunath; Shamis, Pavel; Graham, Richard L

    2013-01-01

    Many scientific simulations, using the Message Passing Interface (MPI) programming model, are sensitive to the performance and scalability of reduction collective operations such as MPI Allreduce and MPI Reduce. These operations are the most widely used abstractions to perform mathematical operations over all processes that are part of the simulation. In this work, we propose a hierarchical design to implement the reduction operations on multicore systems. This design aims to improve the efficiency of reductions by 1) tailoring the algorithms and customizing the implementations for various communication mechanisms in the system, 2) providing the ability to configure the depth of the hierarchy to match the system architecture, and 3) providing the ability to progress each level of the hierarchy independently. Using this design, we implement MPI Allreduce and MPI Reduce operations (and their nonblocking variants MPI Iallreduce and MPI Ireduce) for all message sizes, and evaluate on multiple architectures including InfiniBand and Cray XT5. We leverage and enhance our existing infrastructure, Cheetah, which is a framework for implementing hierarchical collective operations, to implement these reductions. The experimental results show that the Cheetah reduction operations outperform production-grade MPI implementations such as Open MPI default, Cray MPI, and MVAPICH2, demonstrating their efficiency, flexibility and portability. On InfiniBand systems, with a microbenchmark, a 512-process Cheetah nonblocking Allreduce and Reduce achieve speedups of 23x and 10x, respectively, compared to the default Open MPI reductions. The blocking variants of the reduction operations also show similar performance benefits. A 512-process nonblocking Cheetah Allreduce achieves a speedup of 3x, compared to the default MVAPICH2 Allreduce implementation. On a Cray XT5 system, a 6144-process Cheetah Allreduce outperforms the Cray MPI by 145%. The evaluation with an application kernel, a Conjugate Gradient solver, shows that the Cheetah reductions speed up total time to solution by 195%, demonstrating the potential benefits for scientific simulations.
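
    A rough illustration of the hierarchical idea, sketched with mpi4py rather than the Cheetah framework itself: the reduction is split into an on-node phase, a phase across one leader rank per node, and an on-node broadcast. The function name and the fixed two-level structure are illustrative assumptions, not the paper's implementation.

```python
# Sketch of a two-level hierarchical allreduce (illustrative, not the Cheetah code):
# reduce within each shared-memory node first, then allreduce across node leaders,
# and finally broadcast the result back inside each node.
from mpi4py import MPI

def hierarchical_allreduce(value, comm=MPI.COMM_WORLD):
    node = comm.Split_type(MPI.COMM_TYPE_SHARED)          # intra-node communicator
    leaders = comm.Split(color=0 if node.rank == 0 else MPI.UNDEFINED,
                         key=comm.rank)                    # one leader per node
    local = node.reduce(value, op=MPI.SUM, root=0)         # step 1: on-node reduce
    if node.rank == 0:
        local = leaders.allreduce(local, op=MPI.SUM)       # step 2: across leaders
    return node.bcast(local, root=0)                       # step 3: share on-node
```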

  6. Towards Behavioral Reflexion Models

    NASA Technical Reports Server (NTRS)

    Ackermann, Christopher; Lindvall, Mikael; Cleaveland, Rance

    2009-01-01

    Software architecture has become essential in the struggle to manage today's increasingly large and complex systems. Software architecture views are created to capture important system characteristics on an abstract and, thus, comprehensible level. As the system is implemented and later maintained, it often deviates from the original design specification. Such deviations can have implications for the quality of the system, such as reliability, security, and maintainability. Software architecture compliance checking approaches, such as the reflexion model technique, have been proposed to address this issue by comparing the implementation to a model of the system's architecture design. However, architecture compliance checking approaches focus solely on structural characteristics and ignore behavioral conformance. This is especially an issue in Systems-of-Systems. Systems-of-Systems (SoS) are decompositions of large systems into smaller systems for the sake of flexibility. Deviations of the implementation from its behavioral design often reduce the reliability of the entire SoS. An approach is needed that supports reasoning about behavioral conformance at the architecture level. In order to address this issue, we have developed an approach for comparing the implementation of a SoS to an architecture model of its behavioral design. The approach follows the idea of reflexion models and adapts it to support the compliance checking of behaviors. In this paper, we focus on sequencing properties as they play an important role in many SoS. Sequencing deviations potentially have a severe impact on SoS correctness and qualities. The desired behavioral specification is defined in UML sequence diagram notation and behaviors are extracted from the SoS implementation. The behaviors are then mapped to the model of the desired behavior and the two are compared. Finally, a reflexion model is constructed that shows the deviations between behavioral design and implementation. This paper discusses the approach and shows how it can be applied to investigate reliability issues in SoS.

  7. Reduced Order Podolsky Model

    NASA Astrophysics Data System (ADS)

    Thibes, Ronaldo

    2017-02-01

    We perform the canonical and path integral quantizations of a lower-derivative-order model describing Podolsky's generalized electrodynamics. The physical content of the model shows an auxiliary massive vector field coupled to the usual electromagnetic field. The equivalence with Podolsky's original model is studied at classical and quantum levels. Concerning the dynamical time evolution, we obtain a theory with two first-class and two second-class constraints in phase space. We calculate explicitly the corresponding Dirac brackets involving both vector fields. We use the Senjanovic procedure to implement the second-class constraints and the Batalin-Fradkin-Vilkovisky path integral quantization scheme to deal with the symmetries generated by the first-class constraints. The physical interpretation of the results turns out to be simpler due to the reduced derivative order of the equations of motion, Dirac brackets and effective action.
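
    For reference, Podolsky's original higher-derivative Lagrangian is usually quoted in the form below (sign and metric conventions vary between references; a is the Podolsky parameter). The reduced-order model referred to above trades the higher-derivative term for an auxiliary massive vector field.

```latex
% Podolsky's generalized electrodynamics, standard higher-derivative form
% (conventions vary; a is the Podolsky parameter):
\mathcal{L}_{P} \;=\; -\tfrac{1}{4}\, F_{\mu\nu} F^{\mu\nu}
                \;+\; \tfrac{a^{2}}{2}\, \partial_{\mu} F^{\mu\nu}\, \partial^{\lambda} F_{\lambda\nu}
```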

  8. A computational future for preventing HIV in minority communities: how advanced technology can improve implementation of effective programs.

    PubMed

    Brown, C Hendricks; Mohr, David C; Gallo, Carlos G; Mader, Christopher; Palinkas, Lawrence; Wingood, Gina; Prado, Guillermo; Kellam, Sheppard G; Pantin, Hilda; Poduska, Jeanne; Gibbons, Robert; McManus, John; Ogihara, Mitsunori; Valente, Thomas; Wulczyn, Fred; Czaja, Sara; Sutcliffe, Geoff; Villamar, Juan; Jacobs, Christopher

    2013-06-01

    African Americans and Hispanics in the United States have much higher rates of HIV than non-minorities. There is now strong evidence that a range of behavioral interventions are efficacious in reducing sexual risk behavior in these populations. Although a handful of these programs are just beginning to be disseminated widely, we still have not implemented effective programs to a level that would reduce the population incidence of HIV for minorities. We proposed that innovative approaches involving computational technologies be explored for their use in both developing new interventions and in supporting wide-scale implementation of effective behavioral interventions. Mobile technologies have a place in both of these activities. First, mobile technologies can be used to sense context and adapt to the unique preferences and needs of individuals at times when intervention to reduce risk would be most impactful. Second, mobile technologies can be used to improve the delivery of interventions by facilitators and their agencies. Systems science methods including social network analysis, agent-based models, computational linguistics, intelligent data analysis, and systems and software engineering all have strategic roles that can bring about advances in HIV prevention in minority communities. Using an existing mobile technology for depression and 3 effective HIV prevention programs, we illustrated how 8 areas in the intervention/implementation process can use innovative computational approaches to advance intervention adoption, fidelity, and sustainability.

  9. Automated drug dispensing systems in the intensive care unit: a financial analysis.

    PubMed

    Chapuis, Claire; Bedouch, Pierrick; Detavernier, Maxime; Durand, Michel; Francony, Gilles; Lavagne, Pierre; Foroni, Luc; Albaladejo, Pierre; Allenet, Benoit; Payen, Jean-Francois

    2015-09-09

    To evaluate the economic impact of automated drug-dispensing systems (ADS) in surgical intensive care units (ICUs). A financial analysis was conducted in three adult ICUs of one university hospital, where ADS were implemented, one in each unit, to replace the traditional floor stock system. Costs were estimated before and after implementation of the ADS on the basis of floor stock inventories, expired drugs, and time spent by nurses and pharmacy technicians on medication-related work activities. A financial analysis was conducted that included operating cash flows, investment cash flows, global cash flow and net present value. After ADS implementation, nurses spent less time on medication-related activities, with an average of 14.7 hours saved per day across 33 beds. Pharmacy technicians spent more time on floor-stock activities, with an average of 3.5 additional hours per day across the three ICUs. The cost of drug storage was reduced by €44,298 and the cost of expired drugs was reduced by €14,772 per year across the three ICUs. Five years after the initial investment, the global cash flow was €148,229 and the net present value of the project was positive by €510,404. The financial modeling of the ADS implementation in three ICUs showed a high return on investment for the hospital. Medication-related costs and nursing time dedicated to medications are reduced with ADS.
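
    The net-present-value calculation underlying such an analysis can be sketched as below; the investment and cash-flow figures in the example are placeholders, not the values reported in the study.

```python
# Generic net-present-value sketch of the kind of analysis described
# (the cash-flow figures below are placeholders, not the study's data).
def npv(initial_investment, annual_cash_flows, discount_rate):
    """NPV = -I0 + sum_t CF_t / (1 + r)^t, with t starting at year 1."""
    return -initial_investment + sum(
        cf / (1.0 + discount_rate) ** t
        for t, cf in enumerate(annual_cash_flows, start=1)
    )

if __name__ == "__main__":
    # Hypothetical example: one ADS investment followed by five years of savings.
    print(round(npv(100_000, [30_000] * 5, 0.04), 2))
```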

  10. Tuberculosis in a South African prison – a transmission modelling analysis

    PubMed Central

    Johnstone-Robertson, Simon; Lawn, Stephen D; Welte, Alex; Bekker, Linda-Gail; Wood, Robin

    2015-01-01

    Background Prisons are recognised internationally as institutions with very high tuberculosis (TB) burdens where transmission is predominantly determined by contact between infectious and susceptible prisoners. A recent South African court case described the conditions under which prisoners awaiting trial were kept. With the use of these data, a mathematical model was developed to explore the interactions between incarceration conditions and TB control measures. Methods Cell dimensions, cell occupancy, lock-up time, TB incidence and treatment delays were derived from court evidence and judicial reports. Using the Wells-Riley equation and probability analyses of contact between prisoners, we estimated the current TB transmission probability within prison cells, and estimated transmission probabilities for improved levels of case finding in combination with implementation of national and international minimum standards for incarceration. Results Levels of overcrowding (230%) in communal cells and poor TB case finding result in an annual TB transmission risk of 90%. Implementing current national or international cell occupancy recommendations would reduce TB transmission probabilities by 30% and 50%, respectively. Improved passive case finding, a modest ventilation increase or decreased lock-up time would have minimal impact on transmission if introduced individually. However, active case finding together with implementation of minimum national and international standards of incarceration could reduce transmission by 50% and 94%, respectively. Conclusions Current conditions of detention for awaiting-trial prisoners are highly conducive to the spread of drug-sensitive and drug-resistant TB. Combinations of simple, well-established scientific control measures should be implemented urgently. PMID:22272961
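
    The Wells-Riley relation used in this kind of analysis is commonly written as follows (symbols as usually defined; the prison-specific parameter values are not reproduced here).

```latex
% Wells-Riley airborne-infection model as commonly written: the probability that
% a susceptible occupant is infected over an exposure time t.
P_{\mathrm{infection}} = 1 - \exp\!\left(-\frac{I\, q\, p\, t}{Q}\right)
% I: infectious occupants, q: quanta generation rate, p: breathing rate,
% t: exposure (e.g. lock-up) time, Q: cell ventilation rate.
```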

  11. The effect of business improvement districts on the incidence of violent crimes

    PubMed Central

    Golinelli, Daniela; Stokes, Robert J; Bluthenthal, Ricky

    2010-01-01

    Objective To examine whether business improvement districts (BID) contributed to greater than expected declines in the incidence of violent crimes in affected neighbourhoods. Method A Bayesian hierarchical model was used to assess the changes in the incidence of violent crimes between 1994 and 2005 and the implementation of 30 BID in Los Angeles neighbourhoods. Results The implementation of BID was associated with a 12% reduction in the incidence of robbery (95% posterior probability interval −2 to 24) and an 8% reduction in the total incidence of violent crimes (95% posterior probability interval −5 to 21). The strength of the effect of BID on robbery crimes varied by location. Conclusion These findings indicate that the implementation of BID can reduce the incidence of violent crimes likely to result in injury to individuals. The findings also indicate that the establishment of a BID by itself is not a panacea, and highlight the importance of targeting BID efforts to crime prevention interventions that reduce violence exposure associated with criminal behaviours. PMID:20587814

  12. Reducing young driver road trauma: guidance and optimism for the future

    PubMed Central

    Senserrick, T M

    2006-01-01

    This paper highlights lessons from each of the Expert Panel papers in the present supplement that provide guidance for future research and initiatives. Although some shortfalls still remain in our understanding, it is argued that much has been learned and we are ready for more translation, implementation, and evaluation of multilevel interventions to help reduce young driver road trauma. Non‐use of restraints, speeding, driving at night and with passengers, and fatigue are highlighted as key risk factors to address. “Best practice” intervention is proposed as implementing and strengthening graduated driver licensing systems and complementary candidate programs and research, such as hazard perception training programs. A schematic cognitive‐perceptual model to explain the crash sequence process is explored. There is optimism that meaningful impacts can be made, especially coupled with the advances in vehicle technologies, but caution is necessary in the absence of targeted “real world” evaluations and broader implementation and diffusion strategies. PMID:16788114

  13. Integrating watershed hydrology and economics to establish a local market for water quality improvement: A field experiment.

    PubMed

    Uchida, Emi; Swallow, Stephen K; Gold, Arthur; Opaluch, James; Kafle, Achyut; Merrill, Nathaniel; Michaud, Clayton; Gill, Carrie Anne

    2018-04-01

    Innovative market mechanisms are being increasingly recognized as effective decision-making institutions to incorporate the value of ecosystem services into the economy. We present a field experiment that integrates an economic auction and a biophysical water flux model to develop a local market process consisting of both the supply and demand sides. On the supply side, we operate an auction with small-scale livestock owners who bid for contracts to implement site-specific manure management practices that reduce phosphorus loadings to a major reservoir. On the demand side, we implement a real money, multi-unit public good auction for these contracts with residents who potentially benefit from reduced water quality risks. The experiments allow us to construct supply and demand curves to find an equilibrium price for water quality improvement. The field experiments provide a proof-of-concept for practical implementation of a local market for environmental improvements, even for the challenging context of nonpoint pollution.

  14. Technical and Economic Assessment of the Implementation of Measures for Reducing Energy Losses in Distribution Systems

    NASA Astrophysics Data System (ADS)

    Aguila, Alexander; Wilson, Jorge

    2017-07-01

    This paper develops a methodology to assess a group of measures of electrical improvements in distribution systems, based on complementary technical and economic criteria. In order to solve the problem of energy losses in distribution systems, a technical and economic analysis was performed based on a mathematical model to establish a direct relationship between the energy saved by way of minimized losses and the costs of implementing the proposed measures. This paper analyses the feasibility of reducing energy losses in distribution systems by replacing existing network conductors with larger cross-section conductors and by raising the distribution voltage to higher levels. This methodology provides a highly efficient mathematical tool for analysing the feasibility of implementing improvement projects based on their costs, and can serve distribution companies as a starting point for analysing this type of project in distribution systems.
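
    The physical relationship that both measures exploit can be illustrated with a short calculation; the feeder length, loading, resistivity and cross-sections below are arbitrary placeholders, not values from the paper.

```python
# Illustrative loss comparison behind the two measures analysed (values hypothetical):
# resistive losses scale as I^2 * R, with R = rho * L / A and I = S / (sqrt(3) * V)
# for a balanced three-phase feeder, so a larger cross-section and a higher
# distribution voltage both reduce losses.
from math import sqrt

def feeder_losses_kw(apparent_power_kva, voltage_kv, length_km, area_mm2,
                     rho_ohm_mm2_per_m=0.0283):          # aluminium, approximate
    current_a = apparent_power_kva * 1e3 / (sqrt(3) * voltage_kv * 1e3)
    resistance_ohm = rho_ohm_mm2_per_m * length_km * 1e3 / area_mm2
    return 3 * current_a ** 2 * resistance_ohm / 1e3

print(feeder_losses_kw(500, 13.8, 5, 50))   # existing conductor and voltage
print(feeder_losses_kw(500, 13.8, 5, 120))  # larger cross-section
print(feeder_losses_kw(500, 22.0, 5, 50))   # higher distribution voltage
```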

  15. Simulation of subwavelength metallic gratings using a new implementation of the recursive convolution finite-difference time-domain algorithm.

    PubMed

    Banerjee, Saswatee; Hoshino, Tetsuya; Cole, James B

    2008-08-01

    We introduce a new implementation of the finite-difference time-domain (FDTD) algorithm with recursive convolution (RC) for first-order Drude metals. We implemented RC for both Maxwell's equations for light polarized in the plane of incidence (TM mode) and the wave equation for light polarized normal to the plane of incidence (TE mode). We computed the Drude parameters at each wavelength using the measured value of the dielectric constant as a function of the spatial and temporal discretization to ensure both the accuracy of the material model and algorithm stability. For the TE mode, where Maxwell's equations reduce to the wave equation (even in a region of nonuniform permittivity) we introduced a wave equation formulation of RC-FDTD. This greatly reduces the computational cost. We used our methods to compute the diffraction characteristics of metallic gratings in the visible wavelength band and compared our results with frequency-domain calculations.
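
    The first-order Drude permittivity assumed in this family of RC-FDTD schemes is usually written as below (the sign of the imaginary part depends on the assumed time convention); the plasma frequency and collision rate are refit at each wavelength so that the model matches the measured dielectric constant.

```latex
% First-order Drude permittivity typically assumed in RC-FDTD treatments of metals:
\varepsilon(\omega) = \varepsilon_{\infty} - \frac{\omega_p^{2}}{\omega^{2} + i\,\gamma\,\omega}
% \omega_p: plasma frequency, \gamma: collision (damping) rate,
% \varepsilon_{\infty}: high-frequency permittivity.
```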

  16. The effect of business improvement districts on the incidence of violent crimes.

    PubMed

    MacDonald, John; Golinelli, Daniela; Stokes, Robert J; Bluthenthal, Ricky

    2010-10-01

    To examine whether business improvement districts (BID) contributed to greater than expected declines in the incidence of violent crimes in affected neighbourhoods. A Bayesian hierarchical model was used to assess the changes in the incidence of violent crimes between 1994 and 2005 and the implementation of 30 BID in Los Angeles neighbourhoods. The implementation of BID was associated with a 12% reduction in the incidence of robbery (95% posterior probability interval -2 to 24) and an 8% reduction in the total incidence of violent crimes (95% posterior probability interval -5 to 21). The strength of the effect of BID on robbery crimes varied by location. These findings indicate that the implementation of BID can reduce the incidence of violent crimes likely to result in injury to individuals. The findings also indicate that the establishment of a BID by itself is not a panacea, and highlight the importance of targeting BID efforts to crime prevention interventions that reduce violence exposure associated with criminal behaviours.

  17. Proper Orthogonal Decomposition in Optimal Control of Fluids

    NASA Technical Reports Server (NTRS)

    Ravindran, S. S.

    1999-01-01

    In this article, we present a reduced order modeling approach suitable for active control of fluid dynamical systems based on proper orthogonal decomposition (POD). The rationale behind the reduced order modeling is that numerical simulation of Navier-Stokes equations is still too costly for the purpose of optimization and control of unsteady flows. We examine the possibility of obtaining reduced order models that reduce the computational complexity associated with the Navier-Stokes equations while capturing the essential dynamics by using the POD. The POD allows extraction of an optimal set of basis functions, perhaps few in number, from a computational or experimental database through an eigenvalue analysis. The solution is then obtained as a linear combination of this optimal set of basis functions by means of Galerkin projection. This makes it attractive for optimal control and estimation of systems governed by partial differential equations. We here use it in active control of fluid flows governed by the Navier-Stokes equations. We show that the resulting reduced order model can be very efficient for the computations of optimization and control problems in unsteady flows. Finally, implementation issues and numerical experiments are presented for simulations and optimal control of fluid flow through channels.
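
    A minimal POD sketch consistent with the description above, assuming a snapshot matrix is available from simulation or experiment; the Galerkin projection of the Navier-Stokes equations onto the retained modes is not shown.

```python
# Minimal POD sketch: extract basis functions from a snapshot matrix by SVD and
# project the data onto the first r modes.
import numpy as np

def pod_basis(snapshots, r):
    """snapshots: (n_state, n_time) matrix whose columns are flow snapshots."""
    mean = snapshots.mean(axis=1, keepdims=True)
    U, s, _ = np.linalg.svd(snapshots - mean, full_matrices=False)
    energy = np.cumsum(s**2) / np.sum(s**2)      # 'energy' captured by r modes
    return U[:, :r], mean, energy[r - 1]

def pod_coefficients(snapshots, basis, mean):
    return basis.T @ (snapshots - mean)          # time coefficients a_k(t)
```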

  18. Improving health systems performance in low- and middle-income countries: a system dynamics model of the pay-for-performance initiative in Afghanistan.

    PubMed

    Alonge, O; Lin, S; Igusa, T; Peters, D H

    2017-12-01

    System dynamics methods were used to explore effective implementation pathways for improving health systems performance through pay-for-performance (P4P) schemes. A causal loop diagram was developed to delineate primary causal relationships for service delivery within primary health facilities. A quantitative stock-and-flow model was developed next. The stock-and-flow model was then used to simulate the impact of various P4P implementation scenarios on quality and volume of services. Data from the Afghanistan national facility survey in 2012 was used to calibrate the model. The models show that P4P bonuses could increase health workers' motivation leading to higher levels of quality and volume of services. Gaming could reduce or even reverse this desired effect, leading to levels of quality and volume of services that are below baseline levels. Implementation issues, such as delays in the disbursement of P4P bonuses and low levels of P4P bonuses, also reduce the desired effect of P4P on quality and volume, but they do not cause the outputs to fall below baseline levels. Optimal effect of P4P on quality and volume of services is obtained when P4P bonuses are distributed per the health workers' contributions to the services that triggered the payments. Other distribution algorithms such as equal allocation or allocations proportionate to salaries resulted in quality and volume levels that were substantially lower, sometimes below baseline. The system dynamics models served to inform, with quantitative results, the theory of change underlying P4P intervention. Specific implementation strategies, such as prompt disbursement of adequate levels of performance bonus distributed per health workers' contribution to service, increase the likelihood of P4P success. Poorly designed P4P schemes, such as those without an optimal algorithm for distributing performance bonuses and adequate safeguards for gaming, can have a negative overall impact on health service delivery systems. © The Author 2017. Published by Oxford University Press in association with The London School of Hygiene and Tropical Medicine.
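
    The three bonus-distribution rules compared in the simulations can be sketched as follows; the data structure for health workers is a hypothetical simplification of the model's inputs, and contribution-based allocation is the rule the study found most effective.

```python
# Sketch of the bonus-distribution algorithms compared in the P4P model.
def distribute_bonus(total_bonus, workers, rule="contribution"):
    """workers: list of dicts with 'contribution' and 'salary' keys (hypothetical)."""
    if rule == "equal":
        share = [1.0] * len(workers)
    elif rule == "salary":
        share = [w["salary"] for w in workers]
    else:  # per contribution to the services that triggered the payments
        share = [w["contribution"] for w in workers]
    total = sum(share) or 1.0
    return [total_bonus * s / total for s in share]
```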

  19. The Stochastic Parcel Model: A deterministic parameterization of stochastically entraining convection

    DOE PAGES

    Romps, David M.

    2016-03-01

    Convective entrainment is a process that is poorly represented in existing convective parameterizations. By many estimates, convective entrainment is the leading source of error in global climate models. As a potential remedy, an Eulerian implementation of the Stochastic Parcel Model (SPM) is presented here as a convective parameterization that treats entrainment in a physically realistic and computationally efficient way. Drawing on evidence that convecting clouds comprise air parcels subject to Poisson-process entrainment events, the SPM calculates the deterministic limit of an infinite number of such parcels. For computational efficiency, the SPM groups parcels at each height by their purity, which is a measure of their total entrainment up to that height. This reduces the calculation of convective fluxes to a sequence of matrix multiplications. The SPM is implemented in a single-column model and compared with a large-eddy simulation of deep convection.

  20. Implicit solvation model for density-functional study of nanocrystal surfaces and reaction pathways

    NASA Astrophysics Data System (ADS)

    Mathew, Kiran; Sundararaman, Ravishankar; Letchworth-Weaver, Kendra; Arias, T. A.; Hennig, Richard G.

    2014-02-01

    Solid-liquid interfaces are at the heart of many modern-day technologies and provide a challenge to many materials simulation methods. A realistic first-principles computational study of such systems entails the inclusion of solvent effects. In this work, we implement an implicit solvation model that has a firm theoretical foundation into the widely used density-functional code Vienna Ab initio Simulation Package (VASP). The implicit solvation model follows the framework of joint density functional theory. We describe the framework, our algorithm and implementation, and benchmarks for small molecular systems. We apply the solvation model to study the surface energies of different facets of semiconducting and metallic nanocrystals and the SN2 reaction pathway. We find that solvation reduces the surface energies of the nanocrystals, especially for the semiconducting ones, and increases the energy barrier of the SN2 reaction.

  1. The Health Equity and Effectiveness of Policy Options to Reduce Dietary Salt Intake in England: Policy Forecast

    PubMed Central

    Gillespie, Duncan O. S.; Allen, Kirk; Guzman-Castillo, Maria; Bandosz, Piotr; Moreira, Patricia; McGill, Rory; Anwar, Elspeth; Lloyd-Williams, Ffion; Bromley, Helen; Diggle, Peter J.; Capewell, Simon; O’Flaherty, Martin

    2015-01-01

    Background Public health action to reduce dietary salt intake has driven substantial reductions in coronary heart disease (CHD) over the past decade, but avoidable socio-economic differentials remain. We therefore forecast how further intervention to reduce dietary salt intake might affect the overall level and inequality of CHD mortality. Methods We considered English adults, with socio-economic circumstances (SEC) stratified by quintiles of the Index of Multiple Deprivation. We used IMPACTSEC, a validated CHD policy model, to link policy implementation to salt intake, systolic blood pressure and CHD mortality. We forecast the effects of mandatory and voluntary product reformulation, nutrition labelling and social marketing (e.g., health promotion, education). To inform our forecasts, we elicited experts’ predictions on further policy implementation up to 2020. We then modelled the effects on CHD mortality up to 2025 and simultaneously assessed the socio-economic differentials of effect. Results Mandatory reformulation might prevent or postpone 4,500 (2,900–6,100) CHD deaths in total, with the effect greater by 500 (300–700) deaths or 85% in the most deprived than in the most affluent. Further voluntary reformulation was predicted to be less effective and inequality-reducing, preventing or postponing 1,500 (200–5,000) CHD deaths in total, with the effect greater by 100 (−100–600) deaths or 49% in the most deprived than in the most affluent. Further social marketing and improvements to labelling might each prevent or postpone 400–500 CHD deaths, but minimally affect inequality. Conclusions Mandatory engagement with industry to limit salt in processed-foods appears a promising and inequality-reducing option. For other policy options, our expert-driven forecast warns that future policy implementation might reach more deprived individuals less well, limiting inequality reduction. We therefore encourage planners to prioritise equity. PMID:26131981

  2. A microphysical parameterization of aqSOA and sulfate formation in clouds

    NASA Astrophysics Data System (ADS)

    McVay, Renee; Ervens, Barbara

    2017-07-01

    Sulfate and secondary organic aerosol (cloud aqSOA) can be chemically formed in cloud water. Model implementation of these processes represents a computational burden due to the large number of microphysical and chemical parameters. Chemical mechanisms have been condensed by reducing the number of chemical parameters. Here an alternative is presented to reduce the number of microphysical parameters (number of cloud droplet size classes). In-cloud mass formation is surface and volume dependent due to surface-limited oxidant uptake and/or size-dependent pH. Box and parcel model simulations show that using the effective cloud droplet diameter (proportional to total volume-to-surface ratio) reproduces sulfate and aqSOA formation rates within ≤30% as compared to full droplet distributions; other single diameters lead to much greater deviations. This single-class approach reduces computing time significantly and can be included in models when total liquid water content and effective diameter are available.
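
    The single-diameter surrogate described above can be computed as sketched below; the size classes and number concentrations in the example are placeholders.

```python
# Sketch of the single-diameter surrogate: the effective diameter is proportional
# to the total droplet volume-to-surface ratio of the distribution.
import numpy as np

def effective_diameter(diameters_um, number_conc):
    """d_eff = sum(n_i d_i^3) / sum(n_i d_i^2) over the droplet size classes."""
    d = np.asarray(diameters_um, dtype=float)
    n = np.asarray(number_conc, dtype=float)
    return np.sum(n * d**3) / np.sum(n * d**2)

# A bulk in-cloud chemistry scheme would then use this one diameter in place of
# the full set of size classes, e.g. effective_diameter([8, 14, 22], [120, 80, 15]).
```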

  3. Noise Estimation in Electroencephalogram Signal by Using Volterra Series Coefficients

    PubMed Central

    Hassani, Malihe; Karami, Mohammad Reza

    2015-01-01

    The Volterra model is widely used for nonlinearity identification in practical applications. In this paper, we employed the Volterra model to find the nonlinear relation between the electroencephalogram (EEG) signal and the noise, which is a novel approach to estimating noise in the EEG signal. We show that by employing this method we can considerably improve the signal-to-noise ratio, by a factor of at least 1.54. An important issue in implementing the Volterra model is its computational complexity, especially when the degree of nonlinearity is increased. Hence, in many applications it is essential to reduce the complexity of the computation. In this paper, we use the properties of the EEG signal and propose a good approximation of the delayed input signal by its adjacent samples in order to reduce the computation involved in finding the Volterra series coefficients. The computational complexity is reduced by a ratio of at least 1/3 when the filter memory is 3. PMID:26284176
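
    For reference, the discrete Volterra series truncated at second order, the form typically fitted in applications like this one, reads:

```latex
% Discrete Volterra series truncated at second order with memory M:
y(n) = h_0 + \sum_{k_1=0}^{M-1} h_1(k_1)\, x(n-k_1)
           + \sum_{k_1=0}^{M-1} \sum_{k_2=0}^{M-1} h_2(k_1,k_2)\, x(n-k_1)\, x(n-k_2)
```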

  4. Template-Directed Instrumentation Reduces Cost and Improves Efficiency for Total Knee Arthroplasty: An Economic Decision Analysis and Pilot Study.

    PubMed

    McLawhorn, Alexander S; Carroll, Kaitlin M; Blevins, Jason L; DeNegre, Scott T; Mayman, David J; Jerabek, Seth A

    2015-10-01

    Template-directed instrumentation (TDI) for total knee arthroplasty (TKA) may streamline operating room (OR) workflow and reduce costs by preselecting implants and minimizing instrument tray burden. A decision model simulated the economics of TDI. Sensitivity analyses determined thresholds for model variables to ensure TDI success. A clinical pilot was reviewed. The accuracy of preoperative templates was validated, and 20 consecutive primary TKAs were performed using TDI. The model determined that preoperative component size estimation should be accurate to ±1 implant size for 50% of TKAs to implement TDI. The pilot showed that preoperative template accuracy exceeded 97%. There were statistically significant improvements in OR turnover time and in-room time for TDI compared to an historical cohort of TKAs. TDI reduces costs and improves OR efficiency. Copyright © 2015 Elsevier Inc. All rights reserved.

  5. A Hyperbolic Ontology Visualization Tool for Model Application Programming Interface Documentation

    NASA Technical Reports Server (NTRS)

    Hyman, Cody

    2011-01-01

    Spacecraft modeling, a critically important portion in validating planned spacecraft activities, is currently carried out using a time-consuming method of mission-to-mission model implementations and integration. A current project in early development, Integrated Spacecraft Analysis (ISCA), aims to remedy this hindrance by providing reusable architectures and reducing time spent integrating models with planning and sequencing tools. The principal objective of this internship was to develop a user interface for an experimental ontology-based structure visualization of navigation and attitude control system modeling software. To satisfy this, a number of tree and graph visualization tools were researched and a Java-based hyperbolic graph viewer was selected for experimental adaptation. Early results show promise in the ability to organize and display large amounts of spacecraft model documentation efficiently and effectively through a web browser. This viewer serves as a conceptual implementation for future development, but trials with both ISCA developers and end users should be performed to truly evaluate the effectiveness of continued development of such visualizations.

  6. Engagement matters: lessons from assessing classroom implementation of steps to respect: a bullying prevention program over a one-year period.

    PubMed

    Low, Sabina; Van Ryzin, Mark J; Brown, Eric C; Smith, Brian H; Haggerty, Kevin P

    2014-04-01

    Steps to Respect: A Bullying Prevention Program (STR) relies on a social-ecological model of prevention to increase school staff awareness and responsiveness, foster socially responsible beliefs among students, and teach social-emotional skills to students to reduce bullying behavior. As part of a school-randomized controlled trial of STR, we examined predictors and outcomes associated with classroom curriculum implementation in intervention schools. Data on classroom implementation (adherence and engagement) were collected from a sample of teachers using a weekly on-line Teacher Implementation Checklist system. Pre-post data related to school bullying-related outcomes were collected from 1,424 students and archival school demographic data were obtained from the National Center for Education Statistics. Results of multilevel analyses indicated that higher levels of program engagement were influenced by school-level percentage of students receiving free/reduced lunch, as well as classroom-level climate indicators. Results also suggest that higher levels of program engagement were related to lower levels of school bullying problems, enhanced school climate and attitudes less supportive of bullying. Predictors and outcomes related to program fidelity (i.e., adherence) were largely nonsignificant. Results suggest that student engagement is a key element of program impact, though implementation is influenced by both school-level demographics and classroom contexts.

  7. A reduced basis approach for implementing thermodynamic phase-equilibria information in geophysical and geodynamic studies

    NASA Astrophysics Data System (ADS)

    Afonso, J. C.; Zlotnik, S.; Diez, P.

    2015-12-01

    We present a flexible, general and efficient approach for implementing thermodynamic phase equilibria information (in the form of sets of physical parameters) into geophysical and geodynamic studies. The approach is based on multi-dimensional decomposition methods, which transform the original multi-dimensional discrete information into a dimensional-separated representation. This representation has the property that the number of coefficients to be stored grows only linearly with the number of dimensions (in contrast to a full multi-dimensional grid, whose storage requirements grow exponentially with the number of dimensions). Thus, the amount of information to be stored in memory during a numerical simulation or geophysical inversion is drastically reduced. Accordingly, the amount and resolution of the thermodynamic information that can be used in a simulation or inversion increases substantially. In addition, the method is independent of the actual software used to obtain the primary thermodynamic information, and therefore it can be used in conjunction with any thermodynamic modeling program and/or database. Also, the errors associated with the decomposition procedure are readily controlled by the user, depending on her/his actual needs (e.g. preliminary runs vs full resolution runs). We illustrate the benefits, generality and applicability of our approach with several examples of practical interest for both geodynamic modeling and geophysical inversion/modeling. Our results demonstrate that the proposed method is a competitive and attractive candidate for implementing thermodynamic constraints into a broad range of geophysical and geodynamic studies.
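
    The storage argument can be illustrated for the two-dimensional case with a truncated SVD; higher-dimensional tables use analogous tensor decompositions, and the rank r below is a user-controlled accuracy parameter rather than anything prescribed by the paper.

```python
# Illustration of a dimensional-separated (low-rank) representation for a 2-D
# thermodynamic table rho(T, P). Storage drops from nT*nP grid values to
# r*(nT + nP) coefficients.
import numpy as np

def separate_table(table, r):
    """table: (nT, nP) grid of a property; returns rank-r factors."""
    U, s, Vt = np.linalg.svd(table, full_matrices=False)
    return U[:, :r] * s[:r], Vt[:r, :]           # factors a_r(T), b_r(P)

def evaluate(factors, i_T, i_P):
    A, B = factors
    return float(A[i_T, :] @ B[:, i_P])          # sum_r a_r(T_i) * b_r(P_j)
```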

  8. Diagnostic schemes for reducing epidemic size of African viral hemorrhagic fever outbreaks.

    PubMed

    Okeke, Iruka N; Manning, Robert S; Pfeiffer, Thomas

    2014-09-12

    Viral hemorrhagic fever (VHF) outbreaks, with high mortality rates, have often been amplified in African health institutions due to person-to-person transmission via infected body fluids.  By collating and analyzing epidemiological data from documented outbreaks, we observed that diagnostic delay contributes to epidemic size for Ebola and Marburg hemorrhagic fever outbreaks. We used a susceptible-exposed-infectious-removed (SEIR) model and data from the 1995 outbreak in Kikwit, Democratic Republic of Congo, to simulate Ebola hemorrhagic fever epidemics. Our model allows us to describe the dynamics for hospital staff separately from that for the general population, and to implement health worker-specific interventions. The model illustrates that implementing World Health Organization/US Centers for Disease Control and Prevention guidelines of isolating patients who do not respond to antimalarial and antibacterial chemotherapy reduces total outbreak size, from a median of 236, by 90% or more. Routinely employing diagnostic testing in post-mortems of patients that died of refractory fevers reduces the median outbreak size by a further 60%. Even greater reductions in outbreak size were seen when all febrile patients were tested for endemic infections or when febrile health-care workers were tested.  The effect of testing strategies was not impaired by the 1-3 day delay that would occur if testing were performed by a reference laboratory. In addition to improving the quality of care for common causes of febrile infections, increased and strategic use of laboratory diagnostics for fever could reduce the chance of hospital amplification of VHFs in resource-limited African health systems.
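
    A generic SEIR skeleton of the model class used is sketched below; the rates and population are placeholders rather than the fitted Kikwit parameters, and the separate hospital-staff compartments and intervention switches of the authors' model are omitted.

```python
# Generic SEIR sketch (placeholder parameters, not the fitted outbreak values).
import numpy as np
from scipy.integrate import solve_ivp

def seir(t, y, beta, sigma, gamma):
    S, E, I, R = y
    N = S + E + I + R
    return [-beta * S * I / N,                 # new exposures
            beta * S * I / N - sigma * E,      # incubation
            sigma * E - gamma * I,             # onset of infectiousness
            gamma * I]                         # removal (recovery or death)

sol = solve_ivp(seir, (0, 180), [5_000, 0, 1, 0],
                args=(0.3, 1 / 6.0, 1 / 7.0), dense_output=True)
print(sol.y[2].max())   # peak number infectious under these placeholder rates
```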

  9. The what, when, and why of implementation frameworks for evidence-based practices in child welfare and child mental health service systems.

    PubMed

    Hanson, Rochelle F; Self-Brown, Shannon; Rostad, Whitney L; Jackson, Matthew C

    2016-03-01

    It is widely recognized that children in the child welfare system are particularly vulnerable to the adverse health and mental effects associated with exposure to abuse and neglect, making it imperative to have broad-based availability of evidence-based practices (EBPs) that can prevent child maltreatment and reduce the negative mental health outcomes for youth who are victims. A variety of EBPs exist for reducing child maltreatment risk and addressing the associated negative mental health outcomes, but the reach of these practices is limited. An emerging literature documents factors that can enhance or inhibit the success of EBP implementation in community service agencies, including how the selection of a theory-driven conceptual framework, or model, might facilitate implementation planning by providing guidance for best practices during implementation phases. However, limited research is available to guide decision makers in the selection of implementation frameworks that can boost implementation success for EBPs that focus on preventing child welfare recidivism and serving the mental health needs of maltreated youth. The aims of this conceptual paper are to (1) provide an overview of existing implementation frameworks, beginning with a discussion of definitional issues and the selection criteria for frameworks included in the review; and (2) offer recommendations for practice and policy as applicable for professionals and systems serving victims of child maltreatment and their families. Copyright © 2015 Elsevier Ltd. All rights reserved.

  10. Cost-minimization model of a multidisciplinary antibiotic stewardship team based on a successful implementation on a urology ward of an academic hospital.

    PubMed

    Dik, Jan-Willem H; Hendrix, Ron; Friedrich, Alex W; Luttjeboer, Jos; Panday, Prashant Nannan; Wilting, Kasper R; Lo-Ten-Foe, Jerome R; Postma, Maarten J; Sinha, Bhanu

    2015-01-01

    In order to stimulate appropriate antimicrobial use and thereby lower the chances of resistance development, an Antibiotic Stewardship Team (A-Team) has been implemented at the University Medical Center Groningen, the Netherlands. The focus of the A-Team was a pro-active day 2 case-audit, which was financially evaluated here to calculate the return on investment from a hospital perspective. Effects were evaluated by comparing audited patients with a historic cohort with the same diagnosis-related groups. Based upon this evaluation a cost-minimization model was created that can be used to predict the financial effects of a day 2 case-audit. Sensitivity analyses were performed to deal with uncertainties. Finally, the model was used to financially evaluate the A-Team. One whole year, including 114 patients, was evaluated. Implementation costs were calculated to be €17,732, which represents the total costs spent to implement this A-Team. For this specific patient group admitted to a urology ward and consulted on day 2 by the A-Team, the model estimated total savings of €60,306 after one year for this single department, leading to a return on investment of 5.9. The implemented multi-disciplinary A-Team performing a day 2 case-audit in the hospital had a positive return on investment caused by a reduced length of stay due to a more appropriate antibiotic therapy. Based on the extensive data analysis, a model of this intervention could be constructed. This model could be used by other institutions, using their own data to estimate the effects of a day 2 case-audit in their hospital.

  11. Rapid Deployment of Optimal Control for Building HVAC Systems using Innovative Software Tools and a Hybrid Heuristic/Model Based Control Approach

    DTIC Science & Technology

    2017-03-21

    Energy and Water Projects, March 21, 2017. ...included reduced system energy use and cost as well as improved performance driven by autonomous commissioning and optimized system control. In the end...improve system performance and reduce energy use and cost. However, implementing these solutions into the extremely heterogeneous and often

  12. Addressing the Barriers to Agile Development in the Department of Defense: Program Structure, Requirements, and Contracting

    DTIC Science & Technology

    2015-04-30

    approach directly contrasts with the traditional DoD acquisition model designed for a single big-bang waterfall approach (Broadus, 2013). Currently...progress, reduce technical and programmatic risk, and respond to feedback and changes more quickly than traditional waterfall methods (Modigliani...requirements, and contracting. The DoD can address these barriers by utilizing a proactively tailored Agile acquisition model, implementing an IT Box

  13. An overview of modelling approaches and potential solution towards an endgame of tobacco

    NASA Astrophysics Data System (ADS)

    Halim, Tisya Farida Abdul; Sapiri, Hasimah; Abidin, Norhaslinda Zainal

    2015-12-01

    The number of premature deaths due to tobacco use has increased worldwide. Despite the control policies implemented to reduce premature mortality, the rate of smoking prevalence is still high. Moreover, tobacco issues have become increasingly difficult since many aspects need to be considered simultaneously. Thus, the purpose of this paper is to present an overview of existing modelling studies on the tobacco control system. The background section describes the tobacco issues and their current trends. The existing models are categorised according to their modelling approaches, either individual or integrated. Next, a framework of modelling approaches based on the integration of multi-criteria decision making, system dynamics and nonlinear programming is proposed, which is expected to help reduce smoking prevalence. This framework provides a guideline for modelling the interaction between smoking behaviour and its impacts, tobacco control policies and the effectiveness of each strategy in healthcare.

  14. Visco-elastic controlled-source full waveform inversion without surface waves

    NASA Astrophysics Data System (ADS)

    Paschke, Marco; Krause, Martin; Bleibinhaus, Florian

    2016-04-01

    We developed a frequency-domain visco-elastic full waveform inversion for onshore seismic experiments with topography. The forward modeling is based on a finite-difference time-domain algorithm by Robertsson that uses the image-method to ensure a stress-free condition at the surface. The time-domain data is Fourier-transformed at every point in the model space during the forward modeling for a given set of frequencies. The motivation for this approach is the reduced amount of memory when computing kernels, and the straightforward implementation of the multiscale approach. For the inversion, we calculate the Frechet derivative matrix explicitly, and we implement a Levenberg-Marquardt scheme that allows for computing the resolution matrix. To reduce the size of the Frechet derivative matrix, and to stabilize the inversion, an adapted inverse mesh is used. The node spacing is controlled by the velocity distribution and the chosen frequencies. To focus the inversion on body waves (P, P-coda, and S) we mute the surface waves from the data. Consistent spatiotemporal weighting factors are applied to the wavefields during the Fourier transform to obtain the corresponding kernels. We test our code with a synthetic study using the Marmousi model with arbitrary topography. This study also demonstrates the importance of topography and muting surface waves in controlled-source full waveform inversion.
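
    The Levenberg-Marquardt step referred to above is commonly written as follows, with J the explicit Frechet derivative matrix, delta d the data residual and lambda the damping parameter; the specific damping and weighting choices of this implementation are not reproduced here.

```latex
% Levenberg-Marquardt model update with explicit Frechet derivative matrix J:
\Delta m = \left(J^{\mathsf{T}} J + \lambda\, I\right)^{-1} J^{\mathsf{T}}\, \delta d
% The same operators give access to a resolution matrix, e.g.
% R = \left(J^{\mathsf{T}} J + \lambda I\right)^{-1} J^{\mathsf{T}} J .
```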

  15. Impact of the implementation of rest days in live bird markets on the dynamics of H5N1 highly pathogenic avian influenza.

    PubMed

    Fournié, G; Guitian, F J; Mangtani, P; Ghani, A C

    2011-08-07

    Live bird markets (LBMs) act as a network 'hub' and potential reservoir of infection for domestic poultry. They may therefore be responsible for sustaining H5N1 highly pathogenic avian influenza (HPAI) virus circulation within the poultry sector, and are thus a suitable target for implementing control strategies. We developed a stochastic transmission model to understand how market functioning impacts on the transmission dynamics. We then investigated the potential for rest days (periods during which markets are emptied and disinfected) to modulate the dynamics of H5N1 HPAI within the poultry sector using a stochastic meta-population model. Our results suggest that under plausible parameter scenarios, HPAI H5N1 could be sustained silently within LBMs, with the time spent by poultry in markets and the frequency of introduction of new susceptible birds being the dominant factors determining sustained silent spread. Compared with interventions applied in farms (i.e. stamping out, vaccination), our model shows that frequent rest days are an effective means to reduce HPAI transmission. Furthermore, our model predicts that full market closure would be only slightly more effective than rest days in reducing transmission. Strategies applied within markets could thus help to control transmission of the disease.

  16. Impact of the implementation of rest days in live bird markets on the dynamics of H5N1 highly pathogenic avian influenza

    PubMed Central

    Fournié, G.; Guitian, F. J.; Mangtani, P.; Ghani, A. C.

    2011-01-01

    Live bird markets (LBMs) act as a network ‘hub’ and potential reservoir of infection for domestic poultry. They may therefore be responsible for sustaining H5N1 highly pathogenic avian influenza (HPAI) virus circulation within the poultry sector, and are thus a suitable target for implementing control strategies. We developed a stochastic transmission model to understand how market functioning impacts on the transmission dynamics. We then investigated the potential for rest days (periods during which markets are emptied and disinfected) to modulate the dynamics of H5N1 HPAI within the poultry sector using a stochastic meta-population model. Our results suggest that under plausible parameter scenarios, HPAI H5N1 could be sustained silently within LBMs, with the time spent by poultry in markets and the frequency of introduction of new susceptible birds being the dominant factors determining sustained silent spread. Compared with interventions applied in farms (i.e. stamping out, vaccination), our model shows that frequent rest days are an effective means to reduce HPAI transmission. Furthermore, our model predicts that full market closure would be only slightly more effective than rest days in reducing transmission. Strategies applied within markets could thus help to control transmission of the disease. PMID:21131332

  17. The development and implementation of the Chronic Care Management Programme in Counties Manukau.

    PubMed

    Wellingham, John; Tracey, Jocelyn; Rea, Harold; Gribben, Barry

    2003-02-21

    To develop an effective and efficient process for the seamless delivery of care for targeted patients with specific chronic diseases. To reduce inexplicable variation and maximise use of available resources by implementing evidence-based care processes. To develop a programme that is acceptable and applicable to the Counties Manukau region. A model for the management of people with chronic diseases was developed. Model components and potential interventions were piloted. For each disease project, a return on investment was calculated and external evaluation was undertaken. The initial model was subsequently modified and individual disease projects aligned to it. The final Chronic Care Management model, agreed in September 2001, described a single common process. Key components were the targeting of high risk patients, organisation of cost effective interventions into a system of care, and an integrated care server acting as a data warehouse with a rules engine, providing flags and reminders. Return on investment analysis suggested potential savings for each disease component from $277 to $980 per person per annum. For selected chronic diseases, introduction of an integrated chronic care management programme, based on internationally accepted best practice processes and interventions can make significant savings, reducing morbidity and improving the efficiency of health delivery in the Counties Manukau region.

  18. REVEAL: An Extensible Reduced Order Model Builder for Simulation and Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agarwal, Khushbu; Sharma, Poorva; Ma, Jinliang

    2013-04-30

    Many science domains need to build computationally efficient and accurate representations of high fidelity, computationally expensive simulations. These computationally efficient versions are known as reduced-order models. This paper presents the design and implementation of a novel reduced-order model (ROM) builder, the REVEAL toolset. This toolset generates ROMs based on science- and engineering-domain specific simulations executed on high performance computing (HPC) platforms. The toolset encompasses a range of sampling and regression methods that can be used to generate a ROM, automatically quantifies the ROM accuracy, and provides support for an iterative approach to improve ROM accuracy. REVEAL is designed to be extensible in order to utilize the core functionality with any simulator that has published input and output formats. It also defines programmatic interfaces to include new sampling and regression techniques so that users can ‘mix and match’ mathematical techniques to best suit the characteristics of their model. In this paper, we describe the architecture of REVEAL and demonstrate its usage with a computational fluid dynamics model used in carbon capture.

  19. Automating Risk Analysis of Software Design Models

    PubMed Central

    Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P.

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance. PMID:25136688

  20. Automating risk analysis of software design models.

    PubMed

    Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.
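
    AutSEC's internal data structures are not reproduced here; the toy sketch below only illustrates the general idea of walking a data-flow diagram and matching each element against a small, hypothetical identification "tree" that maps element properties to candidate threats and suggested mitigations.

      # Hypothetical identification rules: element kind -> condition -> (threat, mitigation)
      RULES = {
          "dataflow": [
              (lambda e: not e.get("encrypted", False),
               ("eavesdropping on unencrypted channel", "use TLS on this flow")),
          ],
          "datastore": [
              (lambda e: e.get("stores_credentials", False),
               ("credential theft from data store", "hash and salt stored credentials")),
          ],
      }

      def analyze(dfd):
          """Return (element, threat, mitigation) triples for a toy data-flow diagram."""
          findings = []
          for element in dfd:
              for condition, (threat, mitigation) in RULES.get(element["kind"], []):
                  if condition(element):
                      findings.append((element["name"], threat, mitigation))
          return findings

      toy_dfd = [
          {"kind": "dataflow", "name": "client->server", "encrypted": False},
          {"kind": "datastore", "name": "user_db", "stores_credentials": True},
      ]
      for name, threat, fix in analyze(toy_dfd):
          print(f"{name}: {threat} -> {fix}")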

  1. Modeling an alkaline electrolysis cell through reduced-order and loss-estimate approaches

    NASA Astrophysics Data System (ADS)

    Milewski, Jaroslaw; Guandalini, Giulio; Campanari, Stefano

    2014-12-01

    The paper presents two approaches to the mathematical modeling of an alkaline electrolyzer cell. The presented models were compared and validated against available experimental results taken from a laboratory test and against literature data. The first modeling approach is based on the analysis of estimated losses due to the different phenomena occurring inside the electrolytic cell, and requires careful calibration of several specific parameters (e.g., those related to the electrochemical behavior of the electrodes), some of which can be hard to define. An alternative approach is based on a reduced-order equivalent circuit, resulting in only two fitting parameters (electrode specific resistance and parasitic losses) and a calculation of the internal electric resistance of the electrolyte. Both models yield satisfactory results, with an average error below 3% with respect to the considered experimental data, and show the capability to describe with sufficient accuracy the different operating conditions of the electrolyzer; the reduced-order model may be preferred, thanks to its simplicity, for implementation within plant simulation tools dealing with complex systems, such as electrolyzers coupled with storage facilities and intermittent renewable energy sources.
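
    As a rough illustration of the reduced-order idea (a polarization curve built from a reversible voltage, an ohmic term, and two fitted parameters), not the authors' exact formulation: the synthetic data, values, and variable names below are all invented.

      import numpy as np

      # synthetic polarization data: current density j [A/cm^2] vs cell voltage [V]
      j = np.linspace(0.05, 0.6, 12)
      U_meas = 1.48 + 0.45 * j + 0.02 * np.random.default_rng(0).normal(size=j.size)

      U_rev = 1.48            # assumed reversible voltage [V]
      r_electrolyte = 0.15    # assumed, computed separately from geometry/conductivity

      # reduced-order model: U = U_rev + (r_electrolyte + r_electrodes) * j + U_parasitic
      # two fitting parameters: r_electrodes and U_parasitic
      A = np.column_stack([j, np.ones_like(j)])
      (r_electrodes, U_parasitic), *_ = np.linalg.lstsq(
          A, U_meas - U_rev - r_electrolyte * j, rcond=None)

      U_fit = U_rev + (r_electrolyte + r_electrodes) * j + U_parasitic
      print("fitted electrode resistance :", r_electrodes)
      print("fitted parasitic loss term  :", U_parasitic)
      print("mean relative error [%]     :", 100 * np.mean(np.abs(U_fit - U_meas) / U_meas))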

  2. Computationally efficient method for Fourier transform of highly chirped pulses for laser and parametric amplifier modeling.

    PubMed

    Andrianov, Alexey; Szabo, Aron; Sergeev, Alexander; Kim, Arkady; Chvykov, Vladimir; Kalashnikov, Mikhail

    2016-11-14

    We developed an improved approach to calculate the Fourier transform of signals with arbitrarily large quadratic phase, which can be efficiently implemented in numerical simulations utilizing the fast Fourier transform. The proposed algorithm significantly reduces the computational cost of the Fourier transform of a highly chirped and stretched pulse by splitting it into two separate transforms of almost transform-limited pulses, thereby reducing the required grid size roughly by the pulse stretching factor. The application of our improved Fourier transform algorithm in the split-step method for numerical modeling of CPA and OPCPA shows excellent agreement with standard algorithms.
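
    The published algorithm is not reproduced here; as a hedged illustration of the underlying idea, completing the square in a pure quadratic (chirp) phase beta*t^2 shows that the dominant chirp contributes an analytic spectral phase factor, so only a far less chirped envelope remains to be transformed numerically:

      F(\omega) = \int A(t)\, e^{i\beta t^{2}}\, e^{-i\omega t}\, \mathrm{d}t
                = e^{-i\omega^{2}/(4\beta)} \int A(t)\, e^{i\beta\left(t - \omega/(2\beta)\right)^{2}}\, \mathrm{d}t .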

  3. Use of Six Sigma strategies to pull the line on central line-associated bloodstream infections in a neurotrauma intensive care unit.

    PubMed

    Loftus, Kelli; Tilley, Terry; Hoffman, Jason; Bradburn, Eric; Harvey, Ellen

    2015-01-01

    The creation of a consistent culture of safety and quality in an intensive care unit is challenging. We applied the Six Sigma Define-Measure-Analyze-Improve-Control (DMAIC) model for quality improvement (QI) to develop a long-term solution to improve outcomes in a high-risk neurotrauma intensive care unit. We sought to reduce central line utilization as a cornerstone in preventing central line-associated bloodstream infections (CLABSIs). This study describes the successful application of the DMAIC model in the creation and implementation of evidence-based quality improvement designed to reduce CLABSIs to below national benchmarks.

  4. Model Order Reduction for the fast solution of 3D Stokes problems and its application in geophysical inversion

    NASA Astrophysics Data System (ADS)

    Ortega Gelabert, Olga; Zlotnik, Sergio; Afonso, Juan Carlos; Díez, Pedro

    2017-04-01

    The determination of the present-day physical state of the thermal and compositional structure of the Earth's lithosphere and sub-lithospheric mantle is one of the main goals in modern lithospheric research. All this data is essential to build Earth's evolution models and to reproduce many geophysical observables (e.g. elevation, gravity anomalies, travel time data, heat flow, etc) together with understanding the relationship between them. Determining the lithospheric state involves the solution of high-resolution inverse problems and, consequently, the solution of many direct models is required. The main objective of this work is to contribute to the existing inversion techniques in terms of improving the estimation of the elevation (topography) by including a dynamic component arising from sub-lithospheric mantle flow. In order to do so, we implement an efficient Reduced Order Method (ROM) built upon classic Finite Elements. ROM allows a significant reduction of the computational cost of solving a family of problems, for example all the direct models that are required in the solution of the inverse problem. The strategy of the method consists of creating a (reduced) basis of solutions, so that when a new problem has to be solved, its solution is sought within the basis instead of attempting to solve the problem itself. In order to check the Reduced Basis approach, we implemented the method in a 3D domain reproducing a portion of Earth that covers up to 400 km depth. Within the domain the Stokes equation is solved with realistic viscosities and densities. The different realizations (the family of problems) are created by varying viscosities and densities in a similar way as would happen in an inversion problem. The Reduced Basis method is shown to be an extremely efficient solver for the Stokes equation in this context.
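
    A minimal sketch of the reduced-basis idea for a parametrized linear system A(mu)x = b, where A(mu) stands in for the discretized Stokes operator and mu for the varying viscosities/densities: collect snapshots offline, build an orthonormal basis, then solve each new realization online by Galerkin projection. The sizes and the parameter dependence below are illustrative assumptions.

      import numpy as np

      rng = np.random.default_rng(0)
      n = 400                                   # full-order size (illustrative)
      A0 = np.diag(np.linspace(1.0, 10.0, n))   # stand-in for the discretized operator
      A1 = rng.standard_normal((n, n)) * 0.002  # small symmetric parametric perturbation
      b = np.ones(n)

      def A(mu):                                # mu plays the role of a viscosity-like parameter
          return A0 + mu * (A1 + A1.T)

      # offline stage: snapshots over a coarse parameter sample, orthonormal basis via SVD
      snapshots = np.column_stack([np.linalg.solve(A(mu), b) for mu in np.linspace(0.1, 2.0, 10)])
      U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
      V = U[:, s / s[0] > 1e-8]                 # reduced basis

      # online stage: each new realization is solved in the reduced space
      def solve_reduced(mu):
          Ar = V.T @ A(mu) @ V                  # Galerkin projection
          return V @ np.linalg.solve(Ar, V.T @ b)

      mu_new = 1.37
      err = np.linalg.norm(solve_reduced(mu_new) - np.linalg.solve(A(mu_new), b)) / np.linalg.norm(b)
      print("relative error of reduced solution:", err)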

  5. Farm-system modeling to evaluate environmental losses, profitability, and best management practice cost-effectiveness

    USDA-ARS?s Scientific Manuscript database

    To meet Chesapeake Bay Total Maximum Daily Load requirements for agricultural pollution, conservation districts and farmers are tasked with implementing best management practices (BMPs) that reduce farm losses of nutrients and sediment. The importance of the agricultural industry to the regional eco...

  6. A School Nurse-Delivered Intervention for Overweight and Obese Adolescents

    ERIC Educational Resources Information Center

    Pbert, Lori; Druker, Susan; Gapinski, Mary A.; Gellar, Lauren; Magner, Robert; Reed, George; Schneider, Kristin; Osganian, Stavroula

    2013-01-01

    Background: Models are needed for implementing weight management interventions for adolescents through readily accessible venues. This study evaluated the feasibility and efficacy of a school nurse-delivered intervention in improving diet and activity and reducing body mass index (BMI) among overweight and obese adolescents. Methods: Six high…

  7. Reducing Teacher Stress by Implementing Collaborative Problem Solving in a School Setting

    ERIC Educational Resources Information Center

    Schaubman, Averi; Stetson, Erica; Plog, Amy

    2011-01-01

    Student behavior affects teacher stress levels and the student-teacher relationship. In this pilot study, teachers were trained in Collaborative Problem Solving (CPS), a cognitive-behavioral model that explains challenging behavior as the result of underlying deficits in the areas of flexibility/adaptability, frustration tolerance, and problem…

  8. Simulating runoff from small grazed pasture watersheds located at North Appalachian Experimental Watershed in Ohio

    USDA-ARS?s Scientific Manuscript database

    Runoff from grazing pasture lands can impact water quality in receiving streams if not well managed. Management consists of conservation practices to reduce runoff and pollutants transport. Simulation models have been effectively used to design and implement these conservation practices. The Agricul...

  9. Center for Technology for Advanced Scientific Component Software (TASCS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Govindaraju, Madhusudhan

    Advanced Scientific Computing Research, Computer Science, FY 2010 Report. Center for Technology for Advanced Scientific Component Software: Distributed CCA. State University of New York, Binghamton, NY, 13902.

    Summary: The overall objective of Binghamton's involvement is to work on enhancements of the CCA environment, motivated by the applications and research initiatives discussed in the proposal. This year we are working on re-focusing our design and development efforts to develop proof-of-concept implementations that have the potential to significantly impact scientific components. We worked on developing parallel implementations for non-hydrostatic code and worked on a model coupling interface for biogeochemical computations coded in MATLAB. We also worked on the design and implementation of modules that will be required for the emerging MapReduce model to be effective for scientific applications. Finally, we focused on optimizing the processing of scientific datasets on multi-core processors.

    Research Details: We worked on the following research projects, which we are applying to CCA-based scientific applications.

    1. Non-Hydrostatic Hydrodynamics: Non-hydrostatic hydrodynamics are significantly more accurate at modeling internal waves that may be important in lake ecosystems. Non-hydrostatic codes, however, are significantly more computationally expensive, often prohibitively so. We have worked with Chin Wu at the University of Wisconsin to parallelize non-hydrostatic code. We have obtained a maximum speed-up of about 26 times. Although this is significant progress, we hope to improve the performance further, such that it becomes a practical alternative to hydrostatic codes.

    2. Model-coupling for water-based ecosystems: Answering pressing questions about water resources requires that physical models (hydrodynamics) be coupled with biological and chemical models. Most hydrodynamics codes are written in Fortran, however, while most ecologists work in MATLAB. This disconnect creates a great barrier. To address this, we are working on a model coupling interface that will allow biogeochemical computations written in MATLAB to couple with Fortran codes. This will greatly improve the productivity of ecosystem scientists.

    3. Low Overhead and Elastic MapReduce Implementation Optimized for Memory- and CPU-Intensive Applications: Since its inception, MapReduce has frequently been associated with Hadoop and large-scale datasets. Its deployment at Amazon in the cloud, and its applications at Yahoo! for large-scale distributed document indexing and database building, among other tasks, have thrust MapReduce to the forefront of the data processing application domain. The applicability of the paradigm, however, extends far beyond its use with data-intensive applications and disk-based systems, and can also be brought to bear in processing small but CPU-intensive distributed applications. MapReduce, however, carries its own burdens. Through experiments using Hadoop in the context of diverse applications, we uncovered latencies and delay conditions potentially inhibiting the expected performance of a parallel execution in CPU-intensive applications. Furthermore, as it currently stands, MapReduce is favored for data-centric applications, and as such tends to be solely applied to disk-based applications. The paradigm falls short in bringing its novelty to diskless systems dedicated to in-memory applications, and to compute-intensive programs processing much smaller data but requiring intensive computations. In this project, we focused both on the performance of processing large-scale hierarchical data in distributed scientific applications and on the processing of smaller but demanding input sizes primarily used in diskless, memory-resident I/O systems. We designed LEMO-MR [1], a low-overhead, elastic MapReduce implementation, configurable for in-memory applications and with on-demand fault tolerance, optimized for both on-disk and in-memory applications. We conducted experiments to identify not only the necessary components of this model, but also trade-offs and factors to be considered. We have initial results to show the efficacy of our implementation in terms of the potential speedup that can be achieved for representative data sets used by cloud applications. We have quantified the performance gains exhibited by our MapReduce implementation over Apache Hadoop in a compute-intensive environment.

    4. Cache Performance Optimization for Processing XML and HDF-based Application Data on Multi-core Processors: It is important to design and develop scientific middleware libraries to harness the opportunities presented by emerging multi-core processors. Implementations of scientific middleware and applications that do not adapt to the programming paradigm when executing on emerging processors can severely impact the overall performance. In this project, we focused on the utilization of the L2 cache, which is a critical shared resource on chip multiprocessors (CMP). The access pattern of the shared L2 cache, which is dependent on how the application schedules and assigns processing work to each thread, can either enhance or hurt the ability to hide memory latency on a multi-core processor. Therefore, while processing scientific datasets such as HDF5, it is essential to conduct fine-grained analysis of cache utilization to inform scheduling decisions in multi-threaded programming. Using the TAU toolkit for performance feedback from dual- and quad-core machines, we conducted performance analysis and made recommendations on how processing threads can be scheduled on multi-core nodes to enhance the performance of a class of scientific applications that requires processing of HDF5 data. In particular, we quantified the gains associated with the adaptations we have made to the Cache-Affinity and Balanced-Set scheduling algorithms to improve L2 cache performance, and hence the overall application execution time [2].

    References: 1. Zacharia Fadika, Madhusudhan Govindaraju, "MapReduce Implementation for Memory-Based and Processing Intensive Applications", accepted in 2nd IEEE International Conference on Cloud Computing Technology and Science, Indianapolis, USA, Nov 30 - Dec 3, 2010. 2. Rajdeep Bhowmik, Madhusudhan Govindaraju, "Cache Performance Optimization for Processing XML-based Application Data on Multi-core Processors", in proceedings of the 10th IEEE/ACM International Symposium on Cluster, Cloud and Grid Computing, May 17-20, 2010, Melbourne, Victoria, Australia.

    Contact Information: Madhusudhan Govindaraju, Binghamton University, State University of New York (SUNY), mgovinda@cs.binghamton.edu, Phone: 607-777-4904

  10. Interdisciplinary modeling and analysis to reduce loss of life from tsunamis

    NASA Astrophysics Data System (ADS)

    Wood, N. J.

    2016-12-01

    Recent disasters have demonstrated the significant loss of life and community impacts that can occur from tsunamis. Minimizing future losses requires an integrated understanding of the range of potential tsunami threats, how individuals are specifically vulnerable to these threats, what is currently in place to improve their chances of survival, and what risk-reduction efforts could be implemented. This presentation will provide a holistic perspective of USGS research enabled by recent advances in geospatial modeling to assess and communicate population vulnerability to tsunamis and the range of possible interventions to reduce it. Integrated research includes efforts to characterize the magnitude and demography of at-risk individuals in tsunami-hazard zones, their evacuation potential based on landscape conditions, nature-based mitigation to improve evacuation potential, evacuation pathways and population demand at assembly areas, siting considerations for vertical-evacuation refuges, community implications of multiple evacuation zones, car-based evacuation modeling for distant tsunamis, and projected changes in population exposure to tsunamis over time. Collectively, this interdisciplinary research supports emergency managers in their efforts to implement targeted risk-reduction efforts based on local conditions and needs, instead of generic regional strategies that only focus on hazard attributes.

  11. From innovation to implementation: the long and winding road.

    PubMed

    Galavotti, Christine; Kuhlmann, Anne K Sebert; Kraft, Joan Marie; Harford, Nicola; Petraglia, Joseph

    2008-06-01

    Building on theory and past research, in early 2000 scientists in the Division of Reproductive Health developed a prevention innovation for CDC's Global AIDS Program for use in countries severely affected by the HIV/AIDS epidemic. This innovative program model is called MARCH: Modeling and Reinforcement to Combat HIV/AIDS (Galavotti et al. Am J Public Health 91:1602-1607, 2001). MARCH promotes behavioral changes that reduce the risk of HIV infection and creates normative environments that sustain these changes through two key program components: entertainment-education using mass media, particularly long-running radio serial dramas, and reinforcement activities at the community level. Using the framework developed by Wandersman et al. (Am J Commun Psychol, 41(3-4), 2008), we describe the key elements of the MARCH prevention innovation and outline how we support its adaptation and implementation. We focus on the following questions: How do we get from an innovative model to effective program implementation in the field? How do we support implementation with fidelity when adaptation is required? And, once implemented, can we demonstrate fidelity of the adaptation to the original program model? Because our program model requires local adaptation for every instance of implementation, we suggest a potential enhancement to the Interactive Systems Framework-support for adaptation of the innovation-as part of the Prevention Support System. In this paper we describe how we supported adaptation of the radio serial drama component for unique contexts in several African countries. We focus attention on the tools and trainings we developed to build innovation specific capacity for implementation, including capacities for adaptation. We then present results of a qualitative analysis of scripts from the MARCH radio serial drama in Zimbabwe to assess the adapted program's fidelity to the original design of the innovation. Finally, we discuss lessons learned and explore implications for the field.

  12. Computation of Alfvèn eigenmode stability and saturation through a reduced fast ion transport model in the TRANSP tokamak transport code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Podestà, M.; Gorelenkova, M.; Gorelenkov, N. N.

    Alfvénic instabilities (AEs) are well known as a potential cause of enhanced fast ion transport in fusion devices. Given a specific plasma scenario, quantitative predictions of (i) expected unstable AE spectrum and (ii) resulting fast ion transport are required to prevent or mitigate the AE-induced degradation in fusion performance. Reduced models are becoming an attractive tool to analyze existing scenarios as well as for scenario prediction in time-dependent simulations. In this work, a neutral beam heated NSTX discharge is used as a reference to illustrate the potential of a reduced fast ion transport model, known as the kick model, that has been recently implemented for interpretive and predictive analysis within the framework of the time-dependent tokamak transport code TRANSP. Predictive capabilities for AE stability and saturation amplitude are first assessed, based on given thermal plasma profiles only. Predictions are then compared to experimental results, and the interpretive capabilities of the model further discussed. Overall, the reduced model captures the main properties of the instabilities and associated effects on the fast ion population. Finally, additional information from the actual experiment enables further tuning of the model's parameters to achieve a close match with measurements.

  13. Computation of Alfvèn eigenmode stability and saturation through a reduced fast ion transport model in the TRANSP tokamak transport code

    DOE PAGES

    Podestà, M.; Gorelenkova, M.; Gorelenkov, N. N.; ...

    2017-07-20

    Alfvénic instabilities (AEs) are well known as a potential cause of enhanced fast ion transport in fusion devices. Given a specific plasma scenario, quantitative predictions of (i) expected unstable AE spectrum and (ii) resulting fast ion transport are required to prevent or mitigate the AE-induced degradation in fusion performance. Reduced models are becoming an attractive tool to analyze existing scenarios as well as for scenario prediction in time-dependent simulations. In this work, a neutral beam heated NSTX discharge is used as a reference to illustrate the potential of a reduced fast ion transport model, known as the kick model, that has been recently implemented for interpretive and predictive analysis within the framework of the time-dependent tokamak transport code TRANSP. Predictive capabilities for AE stability and saturation amplitude are first assessed, based on given thermal plasma profiles only. Predictions are then compared to experimental results, and the interpretive capabilities of the model further discussed. Overall, the reduced model captures the main properties of the instabilities and associated effects on the fast ion population. Finally, additional information from the actual experiment enables further tuning of the model's parameters to achieve a close match with measurements.

  14. Strategies for implementing Health-Promoting Schools in a province in China.

    PubMed

    Aldinger, Carmen; Zhang, Xin-Wei; Liu, Li-Qun; Guo, Jun-Xiang; Yu Sen Hai; Jones, Jack

    2008-01-01

    After successful pilot projects in 10 schools (four schools with tobacco control and six schools with nutrition interventions, plus 10 control schools), Health and Education officials in Zhejiang Province, China, decided to scale up Health-Promoting Schools (HPS) systematically over the entire province, starting with an initial cohort of 51 additional schools, reaching from primary to vocational schools. Interviews with school personnel during the first phase of scaling up illuminated the key pre-implementation, implementation, and monitoring and evaluation activities. Pre-implementation activities included choosing an entry point, setting up a special HPS committee, and establishing a work plan. Implementation activities included conducting mobilization meetings, prioritizing health, popularizing the HPS concept, ensuring community cooperation and participation, acting as role models, offering training, and using new teaching and learning methods. Monitoring and evaluation activities included process, baseline, and final evaluations, as well as changing evaluation standards towards a more holistic evaluation that schools go through to become Health-Promoting Schools. Schools also reported that they faced - and overcame - a number of challenges, including understanding and integrating the HPS concept and a lack of professional development and support. Results revealed that schools transitioned from a passive model of education to interactive pedagogy, put priority on health and viewed it as a co-responsibility, reshaped assessment to a more holistic approach, and called for more training and technical support. Participants mentioned that they gained knowledge and skills and developed a deeper understanding about health. Health impact was also demonstrated, for instance in reduced injuries and reduced smoking, and educational impact was demonstrated, for instance in improved relationships of children to parents and teachers, improved social qualities, and improved teacher satisfaction.

  15. Software Framework for Advanced Power Plant Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    John Widmann; Sorin Munteanu; Aseem Jain

    2010-08-01

    This report summarizes the work accomplished during the Phase II development effort of the Advanced Process Engineering Co-Simulator (APECS). The objective of the project is to develop the tools to efficiently combine high-fidelity computational fluid dynamics (CFD) models with process modeling software. During the course of the project, a robust integration controller was developed that can be used in any CAPE-OPEN compliant process modeling environment. The controller mediates the exchange of information between the process modeling software and the CFD software. Several approaches to reducing the time disparity between CFD simulations and process modeling have been investigated and implemented. These include enabling the CFD models to be run on a remote cluster and enabling multiple CFD models to be run simultaneously. Furthermore, computationally fast reduced-order models (ROMs) have been developed that can be 'trained' using the results from CFD simulations and then used directly within flowsheets. Unit operation models (both CFD and ROMs) can be uploaded to a model database and shared between multiple users.

  16. Can Cultural Competency Reduce Racial And Ethnic Health Disparities? A Review And Conceptual Model

    PubMed Central

    Brach, Cindy; Fraser, Irene

    2016-01-01

    This article develops a conceptual model of cultural competency’s potential to reduce racial and ethnic health disparities, using the cultural competency and disparities literature to lay the foundation for the model and inform assessments of its validity. The authors identify nine major cultural competency techniques: interpreter services, recruitment and retention policies, training, coordinating with traditional healers, use of community health workers, culturally competent health promotion, including family/community members, immersion into another culture, and administrative and organizational accommodations. The conceptual model shows how these techniques could theoretically improve the ability of health systems and their clinicians to deliver appropriate services to diverse populations, thereby improving outcomes and reducing disparities. The authors conclude that while there is substantial research evidence to suggest that cultural competency should in fact work, health systems have little evidence about which cultural competency techniques are effective and less evidence on when and how to implement them properly. PMID:11092163

  17. Extreme load alleviation using industrial implementation of active trailing edge flaps in a full design load basis

    NASA Astrophysics Data System (ADS)

    Barlas, Thanasis; Pettas, Vasilis; Gertz, Drew; Madsen, Helge A.

    2016-09-01

    The application of active trailing edge flaps in an industry-oriented implementation is evaluated in terms of the capability of alleviating design extreme loads. A flap system with basic control functionality is implemented and tested in a realistic full Design Load Basis (DLB) for the DTU 10MW Reference Wind Turbine (RWT) model and for an upscaled rotor version in DTU's aeroelastic code HAWC2. The flap system implementation shows considerable potential in reducing extreme loads in components of interest, including the blades, main bearing and tower top, with no influence on fatigue loads and power performance. In addition, an individual flap controller for fatigue load reduction in above-rated power conditions is also implemented and integrated in the general controller architecture. The system is shown to be a technology enabler for rotor upscaling, by combining extreme and fatigue load reduction.

  18. A Statewide Partnership for Implementing Inquiry Science

    NASA Astrophysics Data System (ADS)

    Lytle, Charles

    The North Carolina Infrastructure for Science Education (NC-ISE) is a statewide partnership for implementing standards-based inquiry science using exemplary curriculum materials in the public schools of North Carolina. North Carolina is the 11th most populous state in the USA with 8,000,000 residents, 117 school districts and a geographic area of 48,718 square miles. NC-ISE partners include the state education agency, local school systems, three branches of the University of North Carolina, the state mathematics and science education network, businesses, and business groups. The partnership, based upon the Science for All Children model developed by the National Science Resources Center, was initiated in 1997 for improvement in teaching and learning of science and mathematics. This research-based model has been successfully implemented in several American states during the past decade. Where effectively implemented, the model has led to significant improvements in student interest and student learning. It has also helped reduce the achievement gap between minority and non-minority students and among students from different economic levels. A key element of the program is an annual Leadership Institute that helps teams of administrators and teachers develop a five-year strategic plan for their local systems. Currently 33 of the 117 local school systems have joined the NC-ISE Program and are in various stages of implementation of inquiry science in grades K-8.

  19. A PDE-based methodology for modeling, parameter estimation and feedback control in structural and structural acoustic systems

    NASA Technical Reports Server (NTRS)

    Banks, H. T.; Brown, D. E.; Metcalf, Vern L.; Silcox, R. J.; Smith, Ralph C.; Wang, Yun

    1994-01-01

    A problem of continued interest concerns the control of vibrations in a flexible structure and the related problem of reducing structure-borne noise in structural acoustic systems. In both cases, piezoceramic patches bonded to the structures have been successfully used as control actuators. Through the application of a controlling voltage, the patches can be used to reduce structural vibrations which in turn lead to methods for reducing structure-borne noise. A PDE-based methodology for modeling, estimating physical parameters, and implementing a feedback control scheme for problems of this type is discussed. While the illustrating example is a circular plate, the methodology is sufficiently general so as to be applicable in a variety of structural and structural acoustic systems.

  20. Formal specification and design techniques for wireless sensor and actuator networks.

    PubMed

    Martínez, Diego; González, Apolinar; Blanes, Francisco; Aquino, Raúl; Simo, José; Crespo, Alfons

    2011-01-01

    A current trend in the development and implementation of industrial applications is to use wireless networks to interconnect the system nodes, mainly to increase application flexibility, reliability and portability, as well as to reduce the implementation cost. However, the nondeterministic and concurrent behavior of distributed systems makes their analysis and design complex, often resulting in less than satisfactory performance in simulation and test bed scenarios, which is caused by using imprecise models to analyze, validate and design these systems. Moreover, there are some simulation platforms that do not support these models. This paper presents a design and validation method for Wireless Sensor and Actuator Networks (WSAN) which is supported by a minimal set of wireless components represented in Colored Petri Nets (CPN). In summary, the model presented allows users to verify the design properties and structural behavior of the system.

  1. Interventions to reduce racial and ethnic disparities in health care.

    PubMed

    Chin, Marshall H; Walters, Amy E; Cook, Scott C; Huang, Elbert S

    2007-10-01

    In 2005, the Robert Wood Johnson Foundation created Finding Answers: Disparities Research for Change, a program to identify, evaluate, and disseminate interventions to reduce racial and ethnic disparities in the care and outcomes of patients with cardiovascular disease, depression, and diabetes. In this introductory paper, we present a conceptual model for interventions that aim to reduce disparities. With this model as a framework, we summarize the key findings from the six other papers in this supplement on cardiovascular disease, diabetes, depression, breast cancer, interventions using cultural leverage, and pay-for-performance and public reporting of performance measures. Based on these findings, we present global conclusions regarding the current state of health disparities interventions and make recommendations for future interventions to reduce disparities. Multifactorial, culturally tailored interventions that target different causes of disparities hold the most promise, but much more research is needed to investigate potential solutions and their implementation.

  2. National assessment of geologic carbon dioxide storage resources: methodology implementation

    USGS Publications Warehouse

    Blondes, Madalyn S.; Brennan, Sean T.; Merrill, Matthew D.; Buursink, Marc L.; Warwick, Peter D.; Cahan, Steven M.; Corum, Margo D.; Cook, Troy A.; Craddock, William H.; DeVera, Christina A.; Drake II, Ronald M.; Drew, Lawrence J.; Freeman, P.A.; Lohr, Celeste D.; Olea, Ricardo A.; Roberts-Ashby, Tina L.; Slucher, Ernie R.; Varela, Brian A.

    2013-01-01

    In response to the 2007 Energy Independence and Security Act, the U.S. Geological Survey (USGS) conducted a national assessment of potential geologic storage resources for carbon dioxide (CO2). Storage of CO2 in subsurface saline formations is one important method to reduce greenhouse gas emissions and curb global climate change. This report provides updates and implementation details of the assessment methodology of Brennan and others (2010, http://pubs.usgs.gov/of/2010/1127/) and describes the probabilistic model used to calculate potential storage resources in subsurface saline formations.
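
    The USGS probabilistic methodology is defined in the cited reports; the snippet below is only a generic Monte Carlo illustration of how a storage-resource estimate can be propagated from uncertain inputs. All distributions, parameter values, and the single-formation setup are invented.

      import numpy as np

      rng = np.random.default_rng(42)
      n = 100_000  # Monte Carlo draws

      # illustrative input distributions for one hypothetical saline formation
      area_m2     = rng.uniform(0.8e9, 1.2e9, n)    # formation area [m^2]
      thickness_m = rng.triangular(20, 50, 90, n)   # net porous thickness [m]
      porosity    = rng.uniform(0.10, 0.25, n)      # fraction
      efficiency  = rng.uniform(0.01, 0.06, n)      # storage efficiency factor
      rho_co2     = 700.0                           # assumed CO2 density [kg/m^3]

      # mass of storable CO2 in gigatonnes (1 Gt = 1e12 kg)
      mass_gt = area_m2 * thickness_m * porosity * efficiency * rho_co2 / 1e12

      p10, p50, p90 = np.percentile(mass_gt, [10, 50, 90])
      print(f"storage resource (Gt CO2): P10={p10:.2f}  P50={p50:.2f}  P90={p90:.2f}")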

  3. LASSIE: simulating large-scale models of biochemical systems on GPUs.

    PubMed

    Tangherloni, Andrea; Nobile, Marco S; Besozzi, Daniela; Mauri, Giancarlo; Cazzaniga, Paolo

    2017-05-10

    Mathematical modeling and in silico analysis are widely acknowledged as complementary tools to biological laboratory methods, to achieve a thorough understanding of emergent behaviors of cellular processes in both physiological and perturbed conditions. However, the simulation of large-scale models, consisting of hundreds or thousands of reactions and molecular species, can rapidly overtake the capabilities of Central Processing Units (CPUs). The purpose of this work is to exploit alternative high-performance computing solutions, such as Graphics Processing Units (GPUs), to allow the investigation of these models at reduced computational costs. LASSIE is a "black-box" GPU-accelerated deterministic simulator, specifically designed for large-scale models and not requiring any expertise in mathematical modeling, simulation algorithms or GPU programming. Given a reaction-based model of a cellular process, LASSIE automatically generates the corresponding system of Ordinary Differential Equations (ODEs), assuming mass-action kinetics. The numerical solution of the ODEs is obtained by automatically switching between the Runge-Kutta-Fehlberg method in the absence of stiffness and the Backward Differentiation Formulae of first order in the presence of stiffness. The computational performance of LASSIE is assessed using a set of randomly generated synthetic reaction-based models of increasing size, ranging from 64 to 8192 reactions and species, and compared to a CPU implementation of the LSODA numerical integration algorithm. LASSIE adopts a novel fine-grained parallelization strategy to distribute on the GPU cores all the calculations required to solve the system of ODEs. By virtue of this implementation, LASSIE achieves up to 92× speed-up with respect to LSODA, thereby reducing the running time from approximately 1 month down to 8 h to simulate models consisting of, for instance, four thousand reactions and species. Notably, thanks to its smaller memory footprint, LASSIE is able to perform fast simulations of even larger models, for which the tested CPU implementation of LSODA failed to reach termination. LASSIE is therefore expected to make an important breakthrough in Systems Biology applications, enabling faster and more in-depth computational analyses of large-scale models of complex biological systems.
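
    LASSIE itself targets GPUs; the sketch below only shows, on the CPU, the kind of mass-action ODE system that such a simulator generates from a reaction list, integrated here with SciPy's LSODA (which, like the approach described, switches between non-stiff and stiff methods). The reactions, rate constants, and initial concentrations are toy values.

      import numpy as np
      from scipy.integrate import solve_ivp

      # toy reaction-based model with mass-action kinetics:
      # R1: A + B -> C (k=2.0)   R2: C -> A + B (k=0.5)   R3: C -> (degradation, k=0.1)
      species = ["A", "B", "C"]
      reactions = [
          ({"A": 1, "B": 1}, {"C": 1}, 2.0),   # (reactants, products, rate constant)
          ({"C": 1}, {"A": 1, "B": 1}, 0.5),
          ({"C": 1}, {}, 0.1),
      ]
      idx = {s: i for i, s in enumerate(species)}

      def rhs(t, x):
          """Build dX/dt from the reaction list, assuming mass-action kinetics."""
          dx = np.zeros_like(x)
          for reactants, products, k in reactions:
              rate = k * np.prod([x[idx[s]] ** n for s, n in reactants.items()])
              for s, n in reactants.items():
                  dx[idx[s]] -= n * rate
              for s, n in products.items():
                  dx[idx[s]] += n * rate
          return dx

      x0 = np.array([1.0, 0.8, 0.0])           # initial concentrations
      sol = solve_ivp(rhs, (0.0, 50.0), x0, method="LSODA", rtol=1e-6)
      print({s: sol.y[idx[s], -1] for s in species})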

  4. Long-term energy and climate implications of carbon capture and storage deployment strategies in the US coal-fired electricity fleet.

    PubMed

    Sathre, Roger; Masanet, Eric

    2012-09-04

    To understand the long-term energy and climate implications of different implementation strategies for carbon capture and storage (CCS) in the US coal-fired electricity fleet, we integrate three analytical elements: scenario projection of energy supply systems, temporally explicit life cycle modeling, and time-dependent calculation of radiative forcing. Assuming continued large-scale use of coal for electricity generation, we find that aggressive implementation of CCS could reduce cumulative greenhouse gas emissions (CO(2), CH(4), and N(2)O) from the US coal-fired power fleet through 2100 by 37-58%. Cumulative radiative forcing through 2100 would be reduced by only 24-46%, due to the front-loaded time profile of the emissions and the long atmospheric residence time of CO(2). The efficiency of energy conversion and carbon capture technologies strongly affects the amount of primary energy used but has little effect on greenhouse gas emissions or radiative forcing. Delaying implementation of CCS deployment significantly increases long-term radiative forcing. This study highlights the time-dynamic nature of potential climate benefits and energy costs of different CCS deployment pathways and identifies opportunities and constraints of successful CCS implementation.

  5. Altering school climate through school-wide Positive Behavioral Interventions and Supports: findings from a group-randomized effectiveness trial.

    PubMed

    Bradshaw, Catherine P; Koth, Christine W; Thornton, Leslie A; Leaf, Philip J

    2009-06-01

    Positive Behavioral Interventions and Supports (PBIS) is a universal, school-wide prevention strategy that is currently implemented in over 7,500 schools to reduce disruptive behavior problems. The present study examines the impact of PBIS on staff reports of school organizational health using data from a group-randomized controlled effectiveness trial of PBIS conducted in 37 elementary schools. Longitudinal multilevel analyses on data from 2,596 staff revealed a significant effect of PBIS on the schools' overall organizational health, resource influence, staff affiliation, and academic emphasis over the 5-year trial; the effects on collegial leadership and institutional integrity were significant when implementation fidelity was included in the model. Trained schools that adopted PBIS the fastest tended to have higher levels of organizational health at baseline, but the later-implementing schools tended to experience the greatest improvements in organizational health after implementing PBIS. This study indicated that changes in school organizational health are important consequences of the PBIS whole-school prevention model, and may in turn be a potential contextual mediator of the effect of PBIS on student performance.

  6. Single-entry models (SEMs) for scheduled services: Towards a roadmap for the implementation of recommended practices.

    PubMed

    Lopatina, Elena; Damani, Zaheed; Bohm, Eric; Noseworthy, Tom W; Conner-Spady, Barbara; MacKean, Gail; Simpson, Chris S; Marshall, Deborah A

    2017-09-01

    Long waiting times for elective services continue to be a challenging issue. Single-entry models (SEMs) are used to increase access to and flow through the healthcare system. This paper provides a roadmap for healthcare decision-makers, managers, physicians, and researchers to guide implementation and management of successful and sustainable SEMs. The roadmap was informed by an inductive qualitative synthesis of the findings from a deliberative process (a symposium on SEMs, with clinicians, researchers, senior policy-makers, healthcare managers, and patient representatives) and focus groups with the symposium participants. SEMs are a promising strategy to improve the management of referrals and represent one approach to reduce waiting times. The SEMs roadmap outlines current knowledge about SEMs and critical success factors for SEMs' implementation and management. This SEM roadmap is intended to help clinicians, decision-makers, managers, and researchers interested in developing new or strengthening existing SEMs. We consider this roadmap to be a living document that will continue to evolve as we learn more about implementing and managing sustainable SEMs. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. DAMAGE MODELING OF INJECTION-MOLDED SHORT- AND LONG-FIBER THERMOPLASTICS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nguyen, Ba Nghiep; Kunc, Vlastimil; Bapanapalli, Satish K.

    2009-10-30

    This article applies the recent anisotropic rotary diffusion – reduced strain closure (ARD-RSC) model for predicting fiber orientation and a new damage model for injection-molded long-fiber thermoplastics (LFTs) to analyze progressive damage leading to total failure of injection-molded long-glass-fiber/polypropylene (PP) specimens. The ARD-RSC model was implemented in a research version of the Autodesk Moldflow Plastics Insight (MPI) processing code, and it has been used to simulate injection-molding of a long-glass-fiber/PP plaque. The damage model combines micromechanical modeling with a continuum damage mechanics description to predict the nonlinear behavior due to plasticity coupled with damage in LFTs. This model has been implemented in the ABAQUS finite element code via user-subroutines and has been used in the damage analyses of tensile specimens removed from the injection-molded long-glass-fiber/PP plaques. Experimental characterization and mechanical testing were performed to provide input data to support and validate both process modeling and damage analyses. The predictions are in agreement with the experimental results.

  8. The role of public policies in reducing smoking: the Minnesota SimSmoke tobacco policy model.

    PubMed

    Levy, David T; Boyle, Raymond G; Abrams, David B

    2012-11-01

    Following the landmark lawsuit and settlement with the tobacco industry, Minnesota pursued the implementation of stricter tobacco control policies, including tax increases, mass media campaigns, smokefree air laws, and cessation treatment policies. Modeling is used to examine policy effects on smoking prevalence and smoking-attributable deaths. To estimate the effect of tobacco control policies in Minnesota on smoking prevalence and smoking-attributable deaths using the SimSmoke simulation model. Minnesota data starting in 1993 are applied to SimSmoke, a simulation model used to examine the effect of tobacco control policies over time on smoking initiation and cessation. Upon validating the model against smoking prevalence, SimSmoke is used to distinguish the effect of policies implemented since 1993 on smoking prevalence. Using standard attribution methods, SimSmoke also estimates deaths averted as a result of the policies. SimSmoke predicts smoking prevalence accurately between 1993 and 2011. Since 1993, a relative reduction in smoking rates of 29% by 2011 and of 41% by 2041 can be attributed to tobacco control policies, mainly tax increases, smokefree air laws, media campaigns, and cessation treatment programs. Moreover, 48,000 smoking-attributable deaths will be averted by 2041. Minnesota SimSmoke demonstrates that tobacco control policies, especially taxes, have substantially reduced smoking prevalence and smoking-attributable deaths. Taxes, smokefree air laws, mass media, cessation treatment policies, and youth-access enforcement contributed to the decline in prevalence and deaths averted, with the strongest component being taxes. With stronger policies, for example, increasing cigarette taxes to $4.00 per pack, Minnesota's smoking rate could be reduced by another 13%, and 7200 deaths could be averted by 2041. Copyright © 2012 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.
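
    SimSmoke's structure (age cohorts, policy modules, attribution of deaths) is far richer than can be shown here; the toy projection below only illustrates the basic mechanism of applying assumed policy effect sizes to initiation and cessation rates and tracking prevalence over time. Every number is invented.

      def project_prevalence(years=30, prev0=0.22, init=0.015, cess=0.035,
                             tax_effect=0.10, smokefree_effect=0.05):
          """Toy smoking-prevalence projection with illustrative policy effects.

          tax_effect / smokefree_effect - assumed relative policy effect sizes,
          applied by lowering initiation and raising cessation.
          """
          policy = (1 - tax_effect) * (1 - smokefree_effect)
          init_adj = init * policy           # policies reduce initiation
          cess_adj = cess / policy           # and increase cessation
          prev = prev0
          for _ in range(years):
              prev = prev + init_adj * (1 - prev) - cess_adj * prev
          return prev

      baseline = project_prevalence(tax_effect=0.0, smokefree_effect=0.0)
      with_policies = project_prevalence()
      print(f"relative reduction after 30 years: {(baseline - with_policies) / baseline:.1%}")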

  9. Offering integrated medical equipment management in an application service provider model.

    PubMed

    Cruz, Antonio Miguel; Barr, Cameron; Denis, Ernesto Rodríguez

    2007-01-01

    With the advancement of medical technology and thus the complexity of the equipment under their care, clinical engineering departments (CEDs) must continue to make use of computerized tools in the management of departmental activities. Authors of this paper designed, installed, and implemented an application service provider (ASP) model at the laboratory level to offer value added management tools in an online format to CEDs. The project, designed to investigate how to help meet demands across multiple healthcare organizations and provide a means of access for organizations that otherwise might not be able to take advantage of the benefits of those tools, has been well received. Ten hospitals have requested the service, and five of those are ready to proceed with the implementation of the ASP. With the proposed centralized system architecture, the model has shown promise in reducing network infrastructure labor and equipment costs, benchmarking of equipment performance indicators, and developing avenues for proper and timely problem reporting. The following is a detailed description of the design process from conception to implementation of the five main software modules and supporting system architecture.

  10. Spindle speed variation technique in turning operations: Modeling and real implementation

    NASA Astrophysics Data System (ADS)

    Urbikain, G.; Olvera, D.; de Lacalle, L. N. López; Elías-Zúñiga, A.

    2016-11-01

    Chatter is still one of the most challenging problems in machining vibrations. Researchers have focused their efforts to prevent, avoid or reduce chatter vibrations by introducing more accurate predictive physical methods. Among them, the techniques based on varying the rotational speed of the spindle (SSV, Spindle Speed Variation) have gained great relevance. However, several problems need to be addressed due to technical and practical reasons. On one hand, they can generate harmful overheating of the spindle, especially at high speeds. On the other hand, the machine may be unable to perform the interpolation properly. Moreover, it is not trivial to select the most appropriate tuning parameters. This paper conducts a study of the real implementation of the SSV technique in turning systems. First, a stability model based on perturbation theory was developed for simulation purposes. Secondly, the procedure to realistically implement the technique in a conventional turning center was developed and tested. The balance between the improved stability margins and acceptable behavior of the spindle is ensured by energy consumption measurements. The mathematical model shows good agreement with experimental cutting tests.
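
    As an illustration of the kind of sinusoidal spindle-speed-variation command commonly used in such studies (not necessarily the exact waveform or tuning of this paper), with made-up nominal speed, amplitude ratio (RVA), and frequency ratio (RVF):

      import numpy as np

      n0 = 1200.0     # nominal spindle speed [rpm] (illustrative)
      rva = 0.15      # ratio of variation amplitude to nominal speed (tuning parameter)
      rvf = 0.30      # ratio of variation frequency to nominal rotation frequency

      t = np.linspace(0.0, 2.0, 2000)                             # time [s]
      f_var = rvf * n0 / 60.0                                     # variation frequency [Hz]
      n_cmd = n0 * (1.0 + rva * np.sin(2.0 * np.pi * f_var * t))  # commanded speed [rpm]

      print("commanded speed range [rpm]:", n_cmd.min(), "-", n_cmd.max())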

  11. Implementing large-scale food fortification in Ghana: lessons learned.

    PubMed

    Nyumuah, Richard Odum; Hoang, Thuy-Co Caroline; Amoaful, Esi Foriwa; Agble, Rosanna; Meyer, Marc; Wirth, James P; Locatelli-Rossi, Lorenzo; Panagides, Dora

    2012-12-01

    Food fortification began in Ghana in 1996 when legislation was passed to enforce the iodization of salt. This paper describes the development of the Ghanaian fortification program and identifies lessons learned in implementing fortification initiatives (universal salt iodization, fortification of vegetable oil and wheat flour) from 1996 to date. This paper identifies achievements, challenges, and lessons learned in implementing large scale food fortification in Ghana. Primary data was collected through interviews with key members of the National Food Fortification Alliance (NFFA), implementation staff of the Food Fortification Project, and staff of GAIN. Secondary data was collected through desk review of documentation from the project offices of the National Food Fortification Project and the National Secretariat for the Implementation of the National Salt Iodization in Ghana. Reduction of the prevalence of goiter has been observed, and coverage of households with adequately iodized salt increased between 1996 and 2006. Two models were designed to increase production of adequately iodized salt: one to procure and distribute potassium iodate (KIO3) locally, and the second, the salt bank cooperative (SBC) model, specifically designed for small-scale artisanal salt farmers. This resulted in the establishment of a centralized potassium iodate procurement and distribution system, tailored to local needs and ensuring competitive and stable prices. The SBC model allowed for nearly 157 MT of adequately iodized salt to be produced in 2011 in a region where adequately iodized salt was initially not available. For vegetable oil fortification, implementing quantitative analysis methods for accurate control of added fortificant proved challenging but was overcome with the use of a rapid test device, confirming that 95% of vegetable oil is adequately fortified in Ghana. However, appropriate compliance with national standards on wheat flour continues to pose challenges due to adverse sensory effects, which have led producers to reduce the dosage of premix in wheat flour. Challenges to access to premix experienced by small producers can be overcome with a central procurement model in which the distributor leverages the overall volume by tendering for a consolidated order. The SBC model has the potential to be expanded and to considerably increase the coverage of the population consuming iodized salt in Ghana. Successful implementation of the cost-effective iCheck CHROMA rapid test device should be replicated in other countries where quality control of fortified vegetable oil is a challenge, and extended to additional food vehicles, such as wheat flour and salt. Only a reduced impact on iron deficiency in Ghana can be expected, given the low level of fortificant added to the wheat flour. An integrated approach, with complementary programs including additional iron-fortified food vehicles, should be explored to maximize health impact.

  12. NoSQL Based 3D City Model Management System

    NASA Astrophysics Data System (ADS)

    Mao, B.; Harrie, L.; Cao, J.; Wu, Z.; Shen, J.

    2014-04-01

    To manage increasingly complicated 3D city models, a framework based on NoSQL database is proposed in this paper. The framework supports import and export of 3D city model according to international standards such as CityGML, KML/COLLADA and X3D. We also suggest and implement 3D model analysis and visualization in the framework. For city model analysis, 3D geometry data and semantic information (such as name, height, area, price and so on) are stored and processed separately. We use a Map-Reduce method to deal with the 3D geometry data since it is more complex, while the semantic analysis is mainly based on database query operation. For visualization, a multiple 3D city representation structure CityTree is implemented within the framework to support dynamic LODs based on user viewpoint. Also, the proposed framework is easily extensible and supports geoindexes to speed up the querying. Our experimental results show that the proposed 3D city management system can efficiently fulfil the analysis and visualization requirements.
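
    The paper's NoSQL implementation is not reproduced here; the toy in-process pass below only illustrates the map-reduce split it describes, with a geometry-derived quantity computed per record in the map phase and a semantic aggregation in the reduce phase. The record schema is an assumption.

      from collections import defaultdict

      buildings = [  # toy records mixing geometry-derived and semantic attributes
          {"district": "A", "footprint_m2": 250.0, "floors": 4},
          {"district": "A", "footprint_m2": 120.0, "floors": 2},
          {"district": "B", "footprint_m2": 400.0, "floors": 10},
      ]

      def map_phase(b):
          # geometry-side work: derive gross floor area per building record
          return b["district"], b["footprint_m2"] * b["floors"]

      def reduce_phase(pairs):
          # semantic-side work: aggregate gross floor area per district
          totals = defaultdict(float)
          for district, gfa in pairs:
              totals[district] += gfa
          return dict(totals)

      print(reduce_phase(map(map_phase, buildings)))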

  13. Operational oil spill trajectory modelling using HF radar currents: A northwest European continental shelf case study.

    PubMed

    Abascal, Ana J; Sanchez, Jorge; Chiri, Helios; Ferrer, María I; Cárdenas, Mar; Gallego, Alejandro; Castanedo, Sonia; Medina, Raúl; Alonso-Martirena, Andrés; Berx, Barbara; Turrell, William R; Hughes, Sarah L

    2017-06-15

    This paper presents a novel operational oil spill modelling system based on HF radar currents, implemented in a northwest European shelf sea. The system integrates Open Modal Analysis (OMA), Short Term Prediction algorithms (STPS) and an oil spill model to simulate oil spill trajectories. A set of 18 buoys was used to assess the accuracy of the system for trajectory forecast and to evaluate the benefits of HF radar data compared to the use of currents from a hydrodynamic model (HDM). The results showed that simulated trajectories using OMA currents were more accurate than those obtained using a HDM. After 48h the mean error was reduced by 40%. The forecast skill of the STPS method was valid up to 6h ahead. The analysis performed shows the benefits of HF radar data for operational oil spill modelling, which could be easily implemented in other regions with HF radar coverage. Crown Copyright © 2017. Published by Elsevier Ltd. All rights reserved.
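
    A minimal Lagrangian advection sketch on a gridded current field (a stand-in for OMA-interpolated HF radar currents), using a second-order Runge-Kutta step; the velocity field, grid, and time step below are invented and the lookup is deliberately crude.

      import numpy as np

      # synthetic gridded surface currents u(x, y), v(x, y) on a 1 km grid
      x = np.linspace(0, 50_000, 51)
      y = np.linspace(0, 50_000, 51)
      X, Y = np.meshgrid(x, y, indexing="ij")
      U = 0.3 * np.sin(2 * np.pi * Y / 50_000)          # eastward current [m/s]
      V = 0.2 * np.cos(2 * np.pi * X / 50_000)          # northward current [m/s]

      def sample(field, px, py):
          """Crude nearest-index lookup (a real tracker would interpolate the currents)."""
          i = np.clip(np.searchsorted(x, px), 0, len(x) - 1)
          j = np.clip(np.searchsorted(y, py), 0, len(y) - 1)
          return field[i, j]

      def advect(px, py, dt=600.0, steps=288):          # 600 s steps over 48 h
          for _ in range(steps):
              u1, v1 = sample(U, px, py), sample(V, px, py)          # RK2 (midpoint) step
              u2 = sample(U, px + 0.5 * dt * u1, py + 0.5 * dt * v1)
              v2 = sample(V, px + 0.5 * dt * u1, py + 0.5 * dt * v1)
              px, py = px + dt * u2, py + dt * v2
          return px, py

      print("particle end position [m]:", advect(10_000.0, 25_000.0))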

  14. Postneoliberal Public Health Care Reforms: Neoliberalism, Social Medicine, and Persistent Health Inequalities in Latin America.

    PubMed

    Hartmann, Christopher

    2016-12-01

    Several Latin American countries are implementing a suite of so-called "postneoliberal" social and political economic policies to counter neoliberal models that emerged in the 1980s. This article considers the influence of postneoliberalism on public health discourses, policies, institutions, and practices in Bolivia, Ecuador, and Venezuela. Social medicine and neoliberal public health models are antecedents of postneoliberal public health care models. Postneoliberal public health governance models neither fully incorporate social medicine nor completely reject neoliberal models. Postneoliberal reforms may provide an alternative means of reducing health inequalities and improving population health.

  15. Computation of Alfvén eigenmode stability and saturation through a reduced fast ion transport model in the TRANSP tokamak transport code

    NASA Astrophysics Data System (ADS)

    Podestà, M.; Gorelenkova, M.; Gorelenkov, N. N.; White, R. B.

    2017-09-01

    Alfvénic instabilities (AEs) are well known as a potential cause of enhanced fast ion transport in fusion devices. Given a specific plasma scenario, quantitative predictions of (i) expected unstable AE spectrum and (ii) resulting fast ion transport are required to prevent or mitigate the AE-induced degradation in fusion performance. Reduced models are becoming an attractive tool to analyze existing scenarios as well as for scenario prediction in time-dependent simulations. In this work, a neutral beam heated NSTX discharge is used as reference to illustrate the potential of a reduced fast ion transport model, known as kick model, that has been recently implemented for interpretive and predictive analysis within the framework of the time-dependent tokamak transport code TRANSP. Predictive capabilities for AE stability and saturation amplitude are first assessed, based on given thermal plasma profiles only. Predictions are then compared to experimental results, and the interpretive capabilities of the model further discussed. Overall, the reduced model captures the main properties of the instabilities and associated effects on the fast ion population. Additional information from the actual experiment enables further tuning of the model’s parameters to achieve a close match with measurements.
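
    As a loose illustration of the kick-model idea (AE-induced transport represented as stochastic changes of fast-ion phase-space variables whose size scales with mode amplitude), the sketch below applies random energy and canonical-momentum kicks to a set of marker particles. The marker initialization, kick widths and amplitude are invented placeholders; the actual TRANSP kick model uses probability matrices computed from the mode structure.

    ```python
    import random

    # Hypothetical fast-ion markers: (energy_keV, canonical_toroidal_momentum).
    random.seed(0)
    markers = [(random.uniform(20.0, 80.0), random.uniform(-1.0, 1.0)) for _ in range(1000)]

    mode_amplitude = 1e-3          # assumed saturated AE amplitude (arbitrary units)
    sigma_E, sigma_P = 5.0, 0.05   # assumed kick widths at unit amplitude

    def apply_kicks(markers, amp, steps=100):
        """Apply random energy/momentum kicks, scaled by mode amplitude, each step."""
        out = list(markers)
        for _ in range(steps):
            out = [(e + random.gauss(0.0, sigma_E * amp),
                    p + random.gauss(0.0, sigma_P * amp)) for e, p in out]
        return out

    kicked = apply_kicks(markers, mode_amplitude)
    print(sum(e for e, _ in kicked) / len(kicked))   # mean energy after transport
    ```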

  16. An energy-efficient rate adaptive media access protocol (RA-MAC) for long-lived sensor networks.

    PubMed

    Hu, Wen; Chen, Quanjun; Corke, Peter; O'Rourke, Damien

    2010-01-01

    We introduce an energy-efficient Rate Adaptive Media Access Control (RA-MAC) algorithm for long-lived Wireless Sensor Networks (WSNs). Previous research shows that the dynamic and lossy nature of wireless communications is one of the major challenges to reliable data delivery in WSNs. RA-MAC achieves high link reliability in such situations by dynamically trading off data rate for channel gain. The extra gain that can be achieved reduces the packet loss rate, which contributes to reduced energy expenditure through a reduced number of retransmissions. We achieve this at the expense of raw bit rate, which generally far exceeds the application's link requirement. To minimize communication energy consumption, RA-MAC selects the optimal data rate based on the estimated link quality at each data rate and an analytical model of the energy consumption. Our model shows how the selected data rate depends on different channel conditions in order to minimize energy consumption. We have implemented RA-MAC in TinyOS for an off-the-shelf sensor platform (the TinyNode) on top of a state-of-the-art WSN Media Access Control Protocol, SCP-MAC, and evaluated its performance by comparing our implementation with the original SCP-MAC using both simulation and experiment.
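
    The rate-selection idea can be sketched as choosing the rate that minimizes expected energy per delivered packet, with the expected retransmission count derived from the per-rate packet error rate. The rates, error rates, packet size and transmit power below are assumptions for illustration, not RA-MAC's actual model or measurements.

    ```python
    # Hypothetical per-rate link statistics: (data_rate_kbps, packet_error_rate).
    # Lower rates trade throughput for extra channel gain and hence lower PER.
    rates = [(250, 0.30), (125, 0.12), (62.5, 0.04)]

    PACKET_BITS = 1024
    TX_POWER_MW = 50.0          # assumed radio transmit power

    def energy_per_delivered_packet(rate_kbps, per):
        """Expected energy (mJ) per successfully delivered packet, accounting for
        the expected number of (re)transmissions 1/(1 - per)."""
        airtime_s = PACKET_BITS / (rate_kbps * 1000.0)
        expected_tx = 1.0 / (1.0 - per)
        return TX_POWER_MW * airtime_s * expected_tx

    best = min(rates, key=lambda rp: energy_per_delivered_packet(*rp))
    print("selected rate:", best[0], "kbps")
    ```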

  17. Numerical simulations with a FSI-calibrated actuator disk model of wind turbines operating in stratified ABLs

    NASA Astrophysics Data System (ADS)

    Gohari, S. M. Iman; Sarkar, Sutanu; Korobenko, Artem; Bazilevs, Yuri

    2017-11-01

    Numerical simulations of wind turbines operating under different regimes of stability are performed using LES. A reduced model, based on the generalized actuator disk model (ADM), is implemented to represent the wind turbines within the ABL. Data from the fluid-solid interaction (FSI) simulations of wind turbines have been used to calibrate and validate the reduced model. The computational cost of this method to include wind turbines is affordable and incurs an overhead as low as 1.45%. Using this reduced model, we study the coupling of unsteady turbulent flow with the wind turbine under different ABL conditions: (i) a neutral ABL with zero heat-flux and an inversion layer at 350 m, in which the incoming wind has the maximum mean shear between the upper-tip and lower-tip heights; (ii) a shallow ABL with a surface cooling rate of -1 K/hr wherein the low-level jet occurs at the wind turbine hub height. We will discuss how the differences in the unsteady flow between the two ABL regimes impact the wind turbine performance.
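
    For orientation, a standard (non-rotational) actuator disk forcing distributes the rotor thrust as a body force over the grid cells covered by the disk. The sketch below uses the common disk-velocity formulation with assumed values for density, diameter, thrust coefficient and induction factor, and it omits the FSI-based calibration and additional physics of the generalized ADM used in this work.

    ```python
    import math

    # Assumed turbine and flow parameters (not from the paper).
    rho = 1.2            # air density, kg/m^3
    D = 100.0            # rotor diameter, m
    C_T = 0.75           # thrust coefficient (assumed)
    a = 0.25             # axial induction factor (assumed)
    U_disk = 8.0         # disk-averaged streamwise velocity from the LES, m/s

    A = math.pi * (D / 2.0) ** 2
    C_T_prime = C_T / (1.0 - a) ** 2   # disk-velocity form of the thrust coefficient

    # Total thrust based on the local (disk) velocity, then distributed uniformly
    # as a body force over the N grid cells covered by the disk.
    thrust = 0.5 * rho * C_T_prime * A * U_disk ** 2
    N_cells = 400
    force_per_cell = thrust / N_cells
    print(round(thrust), round(force_per_cell, 1))
    ```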

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simpson, L.

    ITN Energy Systems, Inc., and Global Solar Energy, Inc., with the assistance of NREL's PV Manufacturing R&D program, have continued the advancement of CIGS production technology through the development of trajectory-oriented predictive/control models, fault-tolerance control, control-platform development, in-situ sensors, and process improvements. Modeling activities to date include the development of physics-based and empirical models for CIGS and sputter-deposition processing, implementation of model-based control, and application of predictive models to the construction of new evaporation sources and for control. Model-based control is enabled through implementation of reduced or empirical models into a control platform. Reliability improvement activities include implementation of preventive maintenance schedules; detection of failed sensors/equipment and reconfiguration to continue processing; and systematic development of fault prevention and reconfiguration strategies for the full range of CIGS PV production deposition processes. In-situ sensor development activities have resulted in improved control and indicated the potential for enhanced process status monitoring and control of the deposition processes. Substantial process improvements have been made, including significant improvement in CIGS uniformity, thickness control, efficiency, yield, and throughput. In large measure, these gains have been driven by process optimization, which, in turn, has been enabled by control and reliability improvements due to this PV Manufacturing R&D program. This has resulted in substantial improvements of flexible CIGS PV module performance and efficiency.

  19. Population health outcome models in suicide prevention policy.

    PubMed

    Lynch, Frances L

    2014-09-01

    Suicide is a leading cause of death in the U.S. and results in immense suffering and significant cost. Effective suicide prevention interventions could reduce this burden, but policy makers need estimates of health outcomes achieved by alternative interventions to focus implementation efforts. This study illustrates the utility of health outcome models in achieving goals defined by the National Action Alliance for Suicide Prevention's Research Prioritization Task Force. The approach is illustrated specifically with psychotherapeutic interventions to prevent suicide reattempt in emergency department settings. A health outcome model using decision analysis with secondary data was applied to estimate suicide attempts and deaths averted from evidence-based interventions. Under optimal conditions, the model estimated that over 1 year, implementing evidence-based psychotherapeutic interventions in emergency departments could decrease the number of suicide attempts by 18,737, and if offered over 5 years, it could avert 109,306 attempts. Over 1 year, the model estimated 2,498 fewer deaths from suicide, and over 5 years, about 13,928 fewer suicide deaths. Health outcome models could aid in suicide prevention policy by helping focus implementation efforts. Further research developing more sophisticated models of the impact of suicide prevention interventions that include a more complex understanding of suicidal behavior, longer time frames, and inclusion of additional outcomes that capture the full benefits and costs of interventions would be helpful next steps. Copyright © 2014 American Journal of Preventive Medicine. All rights reserved.
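
    To make the decision-analytic logic concrete, the toy sketch below multiplies an at-risk cohort by a baseline reattempt rate and an assumed relative risk reduction; every number is a placeholder and none is taken from the study's model, which is considerably more detailed.

    ```python
    # Toy decision-analysis sketch with entirely hypothetical inputs.
    ed_visits_after_attempt = 500_000    # annual ED visits following a suicide attempt (assumed)
    reattempt_rate = 0.15                # baseline 1-year reattempt probability (assumed)
    relative_risk_reduction = 0.30       # effect of psychotherapeutic intervention (assumed)
    case_fatality = 0.05                 # share of reattempts that are fatal (assumed)

    attempts_averted = ed_visits_after_attempt * reattempt_rate * relative_risk_reduction
    deaths_averted = attempts_averted * case_fatality
    print(round(attempts_averted), round(deaths_averted))
    ```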

  20. Learning to REDUCE: A Reduced Electricity Consumption Prediction Ensemble

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aman, Saima; Chelmis, Charalampos; Prasanna, Viktor

    Utilities use Demand Response (DR) to balance supply and demand in the electric grid by involving customers in efforts to reduce electricity consumption during peak periods. To implement and adapt DR under dynamically changing conditions of the grid, reliable prediction of reduced consumption is critical. However, despite the wealth of research on electricity consumption prediction and DR being long in practice, the problem of reduced consumption prediction remains largely un-addressed. In this paper, we identify unique computational challenges associated with the prediction of reduced consumption and contrast this to that of normal consumption and DR baseline prediction. We propose a novel ensemble model that leverages different sequences of daily electricity consumption on DR event days as well as contextual attributes for reduced consumption prediction. We demonstrate the success of our model on a large, real-world, high resolution dataset from a university microgrid comprising over 950 DR events across a diverse set of 32 buildings. Our model achieves an average error of 13.5%, an 8.8% improvement over the baseline. Our work is particularly relevant for buildings where electricity consumption is not tied to strict schedules. Our results and insights should prove useful to the researchers and practitioners working in the sustainable energy domain.
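
    The general shape of such an ensemble (several simple baseline predictors combined, then compared with the metered event-day load to obtain the reduction) is sketched below with invented consumption numbers and a plain unweighted average; the paper's actual base predictors, contextual weighting and data are not reproduced here.

    ```python
    # Hypothetical hourly consumption (kWh) for a building on recent days,
    # and the metered load during a DR event.
    recent_days = [
        [52, 55, 60, 63],   # same hours, day before
        [50, 54, 59, 62],   # two days before
        [51, 56, 61, 64],   # same weekday last week
    ]
    observed_event_day = [48, 47, 46, 45]

    def averaging_predictor(days):
        return [sum(h) / len(h) for h in zip(*days)]

    def last_day_predictor(days):
        return days[0]

    base_predictions = [averaging_predictor(recent_days), last_day_predictor(recent_days)]

    # Ensemble: simple unweighted average of the base predictors; contextual
    # weighting (weather, schedule) could replace the uniform weights.
    ensemble = [sum(p) / len(p) for p in zip(*base_predictions)]

    # Reduced consumption = predicted baseline minus observed event-day load.
    reduction = [b - o for b, o in zip(ensemble, observed_event_day)]
    print(reduction)
    ```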

  1. Combining Natural Attenuation Capacity and use of Targeted Technological Mitigation Measures for Reducing Diffuse Nutrient Emissions to Surface Waters: The Danish Way

    NASA Astrophysics Data System (ADS)

    Kronvang, B.; Højberg, A. L.; Hoffmann, C. C.; Windolf, J.; Blicher-Mathiesen, G.

    2015-12-01

    Excess nitrogen (N) and phosphorus (P) emissions to surface waters are a high-priority environmental problem worldwide for protection of water resources in times of population growth and climate change. As clean water is a scarce resource, reducing nutrient emissions is an ongoing issue for many countries and regions. Since the mid-1980s, a wide range of national regulatory general measures have been implemented to reduce land-based N and P loadings of the Danish aquatic environment. These measures have addressed both point source emissions and emissions from diffuse sources, especially from agricultural production. Following nearly four decades of combating nutrient pollution, surface waters such as lakes and estuaries are responding only slowly to the 50% reduction in N and 56% reduction in P. The implementation of the EU Water Framework Directive in Danish surface waters therefore still calls for further reductions of N and P loadings. A new era of targeted measures was consequently the outcome of a Commission on Nature and Agriculture established by the Danish Government in 2013. Its White Book points to the need for increased growth and a better environment through more targeted and efficient regulation using advanced technological mitigation methods that are implemented intelligently according to the local natural attenuation capacity for nutrients in the landscape. As a follow-up, a national consensus model for N was established by chaining existing leaching, 3D groundwater and surface water models, enabling calculation of N dynamics and attenuation capacity at a scale of 15 km2. Moreover, several research projects have been conducted to investigate the effect of a suite of targeted mitigation measures such as restored natural wetlands, constructed wetlands, controlled drainage, buffer strips and constructed buffer strips. The results of these studies will be shared in this presentation.

  2. The Oregon Model of Behavior Family Therapy: From Intervention Design to Promoting Large-Scale System Change

    PubMed Central

    Dishion, Thomas; Forgatch, Marion; Chamberlain, Patricia; Pelham, William E.

    2017-01-01

    This paper reviews the evolution of the Oregon model of family behavior therapy over the past four decades. Inspired by basic research on family interaction and innovation in behavior change theory, a set of intervention strategies was developed that proved effective for reducing multiple forms of problem behavior in children (e.g., Patterson, Chamberlain, & Reid, 1982). Over the ensuing decades, the behavior family therapy principles were applied and adapted to promote children’s adjustment to address family formation and adaptation (Family Check-Up model), family disruption and maladaptation (Parent Management Training–Oregon model), and family attenuation and dissolution (Treatment Foster Care–Oregon model). We provide a brief overview of each intervention model and summarize randomized trials of intervention effectiveness. We review evidence on the viability of effective implementation, as well as barriers and solutions to adopting these evidence-based practices. We conclude by proposing an integrated family support system for the three models applied to the goal of reducing the prevalence of severe problem behavior, addiction, and mental problems for children and families, as well as reducing the need for costly and largely ineffective residential placements. PMID:27993335

  3. Reducing Change Management Complexity: Aligning Change Recipient Sensemaking to Change Agent Sensegiving

    ERIC Educational Resources Information Center

    Kumar, Payal; Singhal, Manish

    2012-01-01

    Implementation of change in an organisation through culture can elicit a wide array of reactions from organisational members, spanning from acceptance to resistance. Drawing on Hatch's cultural dynamics model and on Wenger's social theory of learning, this paper dwells on an underdeveloped area in the extant literature, namely understanding change…

  4. Implementation and Acceptability of an Adapted Classroom Check-Up Coaching Model to Promote Culturally Responsive Classroom Management

    ERIC Educational Resources Information Center

    Pas, Elise T.; Larson, Kristine E.; Reinke, Wendy M.; Herman, Keith C.; Bradshaw, Catherine P.

    2016-01-01

    Literature suggests that improving teacher use of culturally responsive classroom management strategies may reduce the disproportionate number of racial and ethnic minority students who receive exclusionary discipline actions and are identified as needing special education, particularly for emotional and behavioral disorders. Coaching teachers is…

  5. Improving Pit Vehicle Ecology Safety

    NASA Astrophysics Data System (ADS)

    Koptev, V. Yu; Kopteva, A. V.

    2018-05-01

    The article discusses ways to improve the ecological safety of pit transport: reducing harmful substance concentrations in exhaust gases, implementing an ecological certificate for the dumping truck that takes into account its actual operating conditions, choosing the best model, and comparing the ecological characteristics of pit lifters at deep pits.

  6. Reducing Nutrients and Nutrient Impacts Priority Issue Team - St. Louis Bay Project: Implementing Nutrients PIT Action Step 1.1

    NASA Technical Reports Server (NTRS)

    Mason, Ted

    2011-01-01

    The NASA Applied Science & Technology Project Office at Stennis Space Center (SSC) used satellites, in-situ measurements and computational modeling to study relationships between water quality in St. Louis Bay, Mississippi and the watershed characteristics of the Jourdan and Wolf rivers from 2000-2010.

  7. Department of Defense Food service Program Needs Contracting and Management Improvements.

    DTIC Science & Technology

    1981-10-20

    in the ration, changes in consumer preferences, and advances in food technology, we believe composition changes could occur which would reduce the...accurately predict consumer preferences. The computer model which the DoD has developed to implement the proposed changes to Title 10 U.S.C. is based upon more

  8. Predicting First-Grade Reading Performance from Kindergarten Response to Tier 1 Instruction

    ERIC Educational Resources Information Center

    Al Otaiba, Stephanie; Folsom, Jessica S.; Schatschneider, Christopher; Wanzek, Jeanne; Greulich, Luana; Meadows, Jane; Li, Zhi; Connor, Carol M.

    2011-01-01

    Many schools are implementing multitier response-to-intervention (RTI) models to reduce reading difficulties. This study was part of our larger ongoing longitudinal RTI investigation within the Florida Learning Disabilities Center grant and was conducted in 7 ethnically and socioeconomically diverse schools. We observed reading instruction in 20…

  9. 75 FR 66055 - Notice of Data Availability Supporting Federal Implementation Plans To Reduce Interstate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-27

    ... at the Air and Radiation Docket and Information Center, EPA/DC, EPA East Building, Room 3334, 1301... docket for the Proposed Transport Rule (Docket ID No. EPA-HQ-OAR-2009- 0491) additional information... emissions inventories for specific source categories and related new information and models that have become...

  10. A Model for Usability Evaluation for the Development and Implementation of Consumer eHealth Interventions.

    PubMed

    Parry, David; Carter, Philip; Koziol-McLain, Jane; Feather, Jacqueline

    2015-01-01

    Consumer eHealth products are often used by people in their own homes or other settings without dedicated clinical supervision, and often with minimal training and limited support--much as eCommerce and eGovernment applications are currently deployed. Internet-based self-care systems have been advocated for over a decade as a way to reduce costs and allow more convenient care, with the expectation that they will reduce health costs by increasing self-care and avoiding hospitalization. However, the history of consumer eHealth interventions is mixed, with many unsuccessful implementations. Many consumer eHealth products will form part of a broader complex intervention, with many possible benefits and effects on both individuals and society. This poster describes a model of consumer eHealth assessment based on multiple methods of usability evaluation at different stages in the design and fielding of eHealth systems. We argue that different methods of usability evaluation are able to give valuable insights into the likely effects of an intervention in a way that is congruent with software development processes.

  11. The regrets of procrastination in climate policy

    NASA Astrophysics Data System (ADS)

    Keller, Klaus; Robinson, Alexander; Bradford, David F.; Oppenheimer, Michael

    2007-04-01

    Anthropogenic carbon dioxide (CO2) emissions are projected to impose economic costs due to the associated climate change impacts. Climate change impacts can be reduced by abating CO2 emissions. What would be an economically optimal investment in abating CO2 emissions? Economic models typically suggest that reducing CO2 emissions by roughly ten to twenty per cent relative to business-as-usual would be an economically optimal strategy. The currently implemented CO2 abatement of a few per cent falls short of this benchmark. Hence, the global community may be procrastinating in implementing an economically optimal strategy. Here we use a simple economic model to estimate the regrets of this procrastination—the economic costs due to the suboptimal strategy choice. The regrets of procrastination can range from billions to trillions of US dollars. The regrets increase with increasing procrastination period and with decreasing limits on global mean temperature increase. Extended procrastination may close the window of opportunity to avoid crossing temperature limits interpreted by some as 'dangerous anthropogenic interference with the climate system' in the sense of Article 2 of the United Nations Framework Convention on Climate Change.
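
    As a back-of-the-envelope illustration of the regret concept (the extra discounted cost incurred by the delayed strategy relative to the optimal one), the toy sketch below compares two invented cost streams; the numbers and discount rate are placeholders and bear no relation to the paper's economic model.

    ```python
    # Toy illustration of "regret" as the extra discounted cost of a delayed
    # abatement strategy; all values are hypothetical.
    def npv(costs, rate=0.03):
        return sum(c / (1.0 + rate) ** t for t, c in enumerate(costs))

    optimal_costs = [10.0] * 50                 # steady abatement + damages (arbitrary units/yr)
    delayed_costs = [2.0] * 10 + [14.0] * 40    # cheap now, costlier catch-up later

    regret = npv(delayed_costs) - npv(optimal_costs)
    print(round(regret, 1))
    ```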

  12. Active Control of Mixing and Combustion, from Mechanisms to Implementation

    NASA Astrophysics Data System (ADS)

    Ghoniem, Ahmed F.

    2001-11-01

    Implementation of active control in complex processes, of the type encountered in high Reynolds number mixing and combustion, is predicated upon the identification of the underlying mechanisms and the construction of reduced order models that capture their essential characteristics. The mechanisms of interest must be shown to be amenable to external actuations, allowing optimal control strategies to exploit the delicate interactions that lead to the desired outcome. Reduced order models are utilized in defining the form and requisite attributes of actuation, its relationship to the monitoring system and the relevant control algorithms embedded in a feedforward or a feedback loop. The talk will review recent work on active control of mixing in combustion devices in which strong shear zones concur with mixing, combustion stabilization and flame anchoring. The underlying mechanisms, e.g., stability of shear flows, formation/evolution of large vortical structures in separating and swirling flows, their mutual interactions with acoustic fields, flame fronts and chemical kinetics, etc., are discussed in light of their key roles in mixing, burning enhancement/suppression, and combustion instability. Subtle attributes of combustion mechanisms are used to suggest the requisite control strategies.

  13. SU-E-T-121: Analyzing the Broadening Effect On the Bragg Peak Due to Heterogeneous Geometries and Implementing User-Routines in the Monte-Carlo Code FLUKA in Order to Reduce Computation Time

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baumann, K; Weber, U; Simeonov, Y

    2015-06-15

    Purpose: The aim of this study was to analyze the modulating, broadening effect on the Bragg Peak due to heterogeneous geometries like multi-wire chambers in the beam path of a particle therapy beam line. The effect was described by a mathematical model which was implemented in the Monte-Carlo code FLUKA via user-routines, in order to reduce the computation time for the simulations. Methods: The depth dose curve of 80 MeV/u C12-ions in a water phantom was calculated using the Monte-Carlo code FLUKA (reference curve). The modulating effect on this dose distribution behind eleven mesh-like foils (periodicity ∼80 microns) occurring in a typical set of multi-wire and dose chambers was mathematically described by optimizing a normal distribution so that the reference curve convolved with this distribution equals the modulated dose curve. This distribution describes a displacement in water and was converted into a probability distribution of the thickness of the eleven foils using the water equivalent thickness of the foil’s material. From this, the thickness distribution of a single foil was determined inversely. In FLUKA the heterogeneous foils were replaced by homogeneous foils and a user-routine was programmed that varies the thickness of the homogeneous foils for each simulated particle using this distribution. Results: Using the mathematical model and user-routine in FLUKA the broadening effect could be reproduced exactly when replacing the heterogeneous foils by homogeneous ones. The computation time was reduced by 90 percent. Conclusion: In this study the broadening effect on the Bragg Peak due to heterogeneous structures was analyzed, described by a mathematical model and implemented in FLUKA via user-routines. Applying these routines, the computation time was reduced by 90 percent. The developed tool can be used for any heterogeneous structure in the dimensions of microns to millimeters, in principle even for organic materials like lung tissue.
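
    The convolution step can be illustrated in a few lines: a schematic reference depth-dose curve is convolved with a Gaussian displacement distribution, and foil thicknesses are then sampled per particle from a corresponding thickness distribution. The curve shape, sigma and thickness parameters below are assumptions for illustration only, not the fitted values from this study, and the sketch is independent of FLUKA.

    ```python
    import numpy as np

    # Assumed depth-dose curve with a schematic Bragg peak near 50 mm.
    depth = np.linspace(0.0, 60.0, 601)                 # mm, 0.1 mm steps
    reference = np.exp(-((depth - 50.0) ** 2) / (2 * 0.8 ** 2)) + 0.2 * (depth < 50.0)

    # Fitted displacement distribution (water-equivalent), here assumed Gaussian.
    sigma_mm = 1.5
    kernel_x = np.arange(-5.0, 5.0 + 0.1, 0.1)
    kernel = np.exp(-kernel_x ** 2 / (2 * sigma_mm ** 2))
    kernel /= kernel.sum()

    # Broadened curve = reference curve convolved with the displacement distribution.
    modulated = np.convolve(reference, kernel, mode="same")

    # In the user-routine approach, each simulated particle instead samples a foil
    # thickness from the corresponding thickness distribution:
    rng = np.random.default_rng(0)
    thickness_samples = rng.normal(loc=0.1, scale=0.02, size=10)  # mm, assumed
    print(modulated.max(), thickness_samples.round(3))
    ```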

  14. A prison mental health in-reach model informed by assertive community treatment principles: evaluation of its impact on planning during the pre-release period, community mental health service engagement and reoffending.

    PubMed

    McKenna, Brian; Skipworth, Jeremy; Tapsell, Rees; Madell, Dominic; Pillai, Krishna; Simpson, Alexander; Cavney, James; Rouse, Paul

    2015-12-01

    It is well recognised that prisoners with serious mental illness (SMI) are at high risk of poor outcomes on return to the community. Early engagement with mental health services and other community agencies could provide the substrate for reducing risk. To evaluate the impact of implementing an assertive community treatment informed prison in-reach model of care (PMOC) on post-release engagement with community mental health services and on reoffending rates. One hundred and eighty prisoners with SMI released from four prisons in the year before implementation of the PMOC were compared with 170 such prisoners released the year after its implementation. The assertive prison model of care was associated with more pre-release contacts with community mental health services and contacts with some social care agencies in some prisons. There were significantly more post-release community mental health service engagements after implementation of this model (Z = -2.388, p = 0.02). There was a trend towards reduction in reoffending rates after release from some of the prisons (Z = 1.82, p = 0.07). Assertive community treatment applied to prisoners with mental health problems was superior to 'treatment as usual', but more work is needed to ensure that agencies will engage prisoners in pre-release care. The fact that the model showed some benefits in the absence of any increase in resources suggests that it may be the model per se that is effective. Copyright © 2014 John Wiley & Sons, Ltd.

  15. Development of a standardized, citywide process for managing smart-pump drug libraries.

    PubMed

    Walroth, Todd A; Smallwood, Shannon; Arthur, Karen; Vance, Betsy; Washington, Alana; Staublin, Therese; Haslar, Tammy; Reddan, Jennifer G; Fuller, James

    2018-06-15

    Development and implementation of an interprofessional consensus-driven process for review and optimization of smart-pump drug libraries and dosing limits are described. The Indianapolis Coalition for Patient Safety (ICPS), which represents 6 Indianapolis-area health systems, identified an opportunity to reduce clinically insignificant alerts that smart infusion pumps present to end users. Through a consensus-driven process, ICPS aimed to identify best practices to implement at individual hospitals in order to establish specific action items for smart-pump drug library optimization. A work group of pharmacists, nurses, and industrial engineers met to evaluate variability within and lack of scrutiny of smart-pump drug libraries. The work group used Lean Six Sigma methodologies to generate a list of key needs and barriers to be addressed in process standardization. The group reviewed targets for smart-pump drug library optimization, including dosing limits, types of alerts reviewed, policies, and safety best practices. The work group also analyzed existing processes at each site to develop a final consensus statement outlining a model process for reviewing alerts and managing smart-pump data. Analysis of the total number of alerts per device across ICPS-affiliated health systems over a 4-year period indicated a 50% decrease (from 7.2 to 3.6 alerts per device per month) after implementation of the model by ICPS member organizations. Through implementation of a standardized, consensus-driven process for smart-pump drug library optimization, ICPS member health systems reduced clinically insignificant smart-pump alerts. Copyright © 2018 by the American Society of Health-System Pharmacists, Inc. All rights reserved.

  16. Experience-based quality control of clinical intensity-modulated radiotherapy planning.

    PubMed

    Moore, Kevin L; Brame, R Scott; Low, Daniel A; Mutic, Sasa

    2011-10-01

    To incorporate a quality control tool, according to previous planning experience and patient-specific anatomic information, into the intensity-modulated radiotherapy (IMRT) plan generation process and to determine whether the tool improved treatment plan quality. A retrospective study of 42 IMRT plans demonstrated a correlation between the fraction of organs at risk (OARs) overlapping the planning target volume and the mean dose. This yielded a model, predicted dose = prescription dose × (0.2 + 0.8 × [1 − exp(−3 × overlap volume / OAR volume)]), that predicted the achievable mean doses according to the overlap volume, the OAR volume, and the prescription dose. The model was incorporated into the planning process by way of a user-executable script that reported the predicted dose for any OAR. The script was introduced to clinicians engaged in IMRT planning and deployed thereafter. The script's effect was evaluated by tracking δ = (mean dose − predicted dose)/predicted dose, the fraction by which the mean dose exceeded the model. All OARs under investigation (rectum and bladder in prostate cancer; parotid glands, esophagus, and larynx in head-and-neck cancer) exhibited both smaller δ and reduced variability after script implementation. These effects were substantial for the parotid glands, for which the previous δ = 0.28 ± 0.24 was reduced to δ = 0.13 ± 0.10. The clinical relevance was most evident in the subset of cases in which the parotid glands were potentially salvageable (predicted dose <30 Gy). Before script implementation, an average of 30.1 Gy was delivered to the salvageable cases, with an average predicted dose of 20.3 Gy. After implementation, an average of 18.7 Gy was delivered to salvageable cases, with an average predicted dose of 17.2 Gy. In the prostate cases, the rectum model excess was reduced from δ = 0.28 ± 0.20 to δ = 0.07 ± 0.15. On surveying dosimetrists at the end of the study, most reported that the script both improved their IMRT planning (8 of 10) and increased their efficiency (6 of 10). This tool proved successful in increasing normal tissue sparing and reducing interclinician variability, providing effective quality control of the IMRT plan development process. Copyright © 2011 Elsevier Inc. All rights reserved.
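
    A hedged numerical reading of the quoted mean-dose model and the tracked quantity δ is given below; the prescription dose, overlap and OAR volumes, and the achieved mean dose are invented example values.

    ```python
    import math

    def predicted_mean_dose(prescription_gy, overlap_volume_cc, oar_volume_cc):
        """Mean-dose model quoted in the abstract:
        predicted = prescription * (0.2 + 0.8 * (1 - exp(-3 * overlap / OAR volume)))."""
        frac = overlap_volume_cc / oar_volume_cc
        return prescription_gy * (0.2 + 0.8 * (1.0 - math.exp(-3.0 * frac)))

    # Hypothetical head-and-neck case: parotid gland partially overlapping the PTV.
    pred = predicted_mean_dose(prescription_gy=70.0, overlap_volume_cc=4.0, oar_volume_cc=30.0)
    achieved = 28.0   # mean dose in the clinical plan (assumed)

    delta = (achieved - pred) / pred   # fraction by which the plan exceeds the model
    print(round(pred, 1), round(delta, 2))
    ```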

  17. Impact of Contextual Factors on Interventions to Reduce Acute Care Transfers II Implementation and Hospital Readmission Rates.

    PubMed

    Rask, Kimberly J; Hodge, Jennifer; Kluge, Linda

    2017-11-01

    Identify contextual and implementation factors impacting the effectiveness of an organizational-level intervention to reduce preventable hospital readmissions from affiliated skilled nursing facilities (SNFs). Observational study of the implementation of Interventions to Reduce Acute Care Transfers tools in 3 different cohorts. SNFs. SNFs belonging to 1 of 2 corporate entities and a group of independent SNFs that volunteered to participate in a Quality Improvement Organization (QIO) training program. Two groups of SNFs received INTERACT II training and technical assistance from corporate staff, and 1 group of SNFs received training from QIO staff. Thirty-day acute care hospital readmissions from Medicare fee-for-service claims, contextual factors using the Model for Understanding Success in Quality framework. All 3 cohorts were able to deliver the INTERACT training program to their constituent facilities through regional events as well as onsite technical assistance, but the impact on readmission rates varied. Facilities supported by the QIO and corporation A were able to achieve statistically significant reductions in 30-day readmission rates. A review of contextual factors found that although all cohorts were challenged by staff turnover and workload, corporation B facilities struggled with a less mature quality improvement (QI) culture and infrastructure. Both corporations demonstrated a strong corporate commitment to implementing INTERACT II, but differences in training strategies, QI culture, capacity, and competing pressures may have impacted the effectiveness of the training. Proactively addressing these factors may help long-term care organizations interested in reducing acute care readmission rates increase the likelihood of QI success. Copyright © 2017 AMDA – The Society for Post-Acute and Long-Term Care Medicine. All rights reserved.

  18. A framework for multi-criteria assessment of model enhancements

    NASA Astrophysics Data System (ADS)

    Francke, Till; Foerster, Saskia; Brosinsky, Arlena; Delgado, José; Güntner, Andreas; López-Tarazón, José A.; Bronstert, Axel

    2016-04-01

    Modellers are often faced with unsatisfactory model performance for a specific setup of a hydrological model. In these cases, the modeller may try to improve the setup by addressing selected causes for the model errors (i.e. data errors, structural errors). This leads to adding certain "model enhancements" (MEs), e.g. climate data based on more monitoring stations, improved calibration data, modifications in process formulations. However, deciding on which MEs to implement remains a matter of expert knowledge, guided by some sensitivity analysis at best. When multiple MEs have been implemented, a resulting improvement in model performance is not easily attributed, especially when considering different aspects of this improvement (e.g. better performance dynamics vs. reduced bias). In this study we present an approach for comparing the effect of multiple MEs in the face of multiple improvement aspects. A stepwise selection approach and structured plots help in addressing the multidimensionality of the problem. The approach is applied to a case study, which employs the meso-scale hydrosedimentological model WASA-SED for a sub-humid catchment. The results suggest that the effect of the MEs is quite diverse, with some MEs (e.g. augmented rainfall data) causing improvements for almost all aspects, while the effect of other MEs is restricted to a few aspects or even deteriorates some. These specific results may not be generalizable. However, we suggest that based on studies like this, identifying the most promising MEs to implement may be facilitated.
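
    One way to picture a stepwise comparison of MEs across several improvement aspects is a greedy selection on aggregated scores, as in the hypothetical sketch below; the ME names, aspect scores and equal weighting are invented and are not the study's actual method or results.

    ```python
    # Hypothetical scores: for each model enhancement (ME), the change it produces
    # in several improvement aspects (positive = better). Values are illustrative.
    me_effects = {
        "augmented rainfall data": {"NSE": +0.10, "bias": +0.05, "peak timing": +0.04},
        "improved calibration data": {"NSE": +0.06, "bias": +0.08, "peak timing": -0.01},
        "process formulation change": {"NSE": +0.02, "bias": -0.03, "peak timing": +0.07},
    }

    WEIGHTS = {"NSE": 1.0, "bias": 1.0, "peak timing": 1.0}   # assumed equal weighting

    def aggregate(effects):
        return sum(WEIGHTS[k] * v for k, v in effects.items())

    # Greedy stepwise selection: repeatedly add the ME with the largest aggregate gain.
    remaining, selected = dict(me_effects), []
    while remaining:
        best = max(remaining, key=lambda m: aggregate(remaining[m]))
        if aggregate(remaining[best]) <= 0:
            break
        selected.append(best)
        del remaining[best]

    print(selected)
    ```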

  19. An extended Kalman filter approach to non-stationary Bayesian estimation of reduced-order vocal fold model parameters.

    PubMed

    Hadwin, Paul J; Peterson, Sean D

    2017-04-01

    The Bayesian framework for parameter inference provides a basis from which subject-specific reduced-order vocal fold models can be generated. Previously, it has been shown that a particle filter technique is capable of producing estimates and associated credibility intervals of time-varying reduced-order vocal fold model parameters. However, the particle filter approach is difficult to implement and has a high computational cost, which can be barriers to clinical adoption. This work presents an alternative estimation strategy based upon Kalman filtering aimed at reducing the computational cost of subject-specific model development. The robustness of this approach to Gaussian and non-Gaussian noise is discussed. The extended Kalman filter (EKF) approach is found to perform very well in comparison with the particle filter technique at dramatically lower computational cost. Based upon the test cases explored, the EKF is comparable in terms of accuracy to the particle filter technique when more than 6000 particles are employed; if fewer particles are employed, the EKF actually performs better. For comparable levels of accuracy, the solution time is reduced by 2 orders of magnitude when employing the EKF. By virtue of the approximations used in the EKF, however, the credibility intervals tend to be slightly underpredicted.
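
    For reference, a generic EKF predict/update step is sketched below with a trivial scalar random-walk demo; the model, measurement functions and noise covariances are placeholders, not the reduced-order vocal fold model of the paper.

    ```python
    import numpy as np

    def ekf_step(x, P, z, f, F_jac, h, H_jac, Q, R):
        """One predict/update cycle of an extended Kalman filter.
        x, P: state estimate and covariance; z: new measurement;
        f/h: nonlinear model and measurement functions; F_jac/H_jac: their Jacobians."""
        # Predict
        x_pred = f(x)
        F = F_jac(x)
        P_pred = F @ P @ F.T + Q
        # Update
        H = H_jac(x_pred)
        S = H @ P_pred @ H.T + R
        K = P_pred @ H.T @ np.linalg.inv(S)
        x_new = x_pred + K @ (z - h(x_pred))
        P_new = (np.eye(len(x)) - K @ H) @ P_pred
        return x_new, P_new

    # Tiny scalar demo: random-walk parameter observed directly.
    f = lambda x: x
    F_jac = lambda x: np.eye(1)
    h = lambda x: x
    H_jac = lambda x: np.eye(1)
    x, P = np.array([0.5]), np.eye(1)
    x, P = ekf_step(x, P, np.array([0.62]), f, F_jac, h, H_jac,
                    0.01 * np.eye(1), 0.05 * np.eye(1))
    print(x, P)
    ```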

  20. Impact of Soil and Water Conservation Interventions on Watershed Runoff Response in a Tropical Humid Highland of Ethiopia.

    PubMed

    Sultan, Dagnenet; Tsunekawa, Atsushi; Haregeweyn, Nigussie; Adgo, Enyew; Tsubo, Mitsuru; Meshesha, Derege Tsegaye; Masunaga, Tsugiyuki; Aklog, Dagnachew; Fenta, Ayele Almaw; Ebabu, Kindiye

    2018-05-01

    Various soil and water conservation (SWC) measures have been widely implemented to reduce surface runoff in degraded and drought-prone watersheds. But little quantitative study has been done on to what extent such measures can reduce watershed-scale runoff, particularly in the typical humid tropical highlands of Ethiopia. The overall goal of this study is to analyze the impact of SWC interventions on the runoff response by integrating field measurement with a hydrological curve number (CN) model, which provides a quantitative basis for further analysis. Firstly, a paired-watershed approach was employed to quantify the relative difference in runoff response for the Kasiry (treated) and Akusty (untreated) watersheds. Secondly, a calibrated curve number hydrological model was applied to investigate the effect of various SWC management scenarios for the Kasiry watershed alone. The paired-watershed approach showed a distinct runoff response between the two watersheds; however, the effect of SWC measures was not clearly discerned, being masked by other factors. On the other hand, the model predicts that, under the current SWC coverage at Kasiry, the seasonal runoff yield is being reduced by 5.2%. However, runoff yields from the Kasiry watershed could be decreased by as much as 34% if soil bunds were installed on cultivated land and trenches were installed on grazing and plantation lands. In contrast, implementation of SWC measures on bush land and natural forest would have little effect on reducing runoff. The results on the magnitude of runoff reduction under optimal combinations of SWC measures and land use will support decision-makers in selection and promotion of valid management practices that are suited to particular biophysical niches in the tropical humid highlands of Ethiopia.
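
    For context, the standard SCS curve-number relation (in metric form) and the way SWC measures can be represented as a lowered curve number are sketched below; the storm depth and CN values are assumptions for illustration, and the study's calibrated model may differ in detail.

    ```python
    def scs_cn_runoff_mm(rainfall_mm, cn):
        """Standard SCS curve-number runoff (metric form)."""
        s = 25400.0 / cn - 254.0          # potential maximum retention, mm
        ia = 0.2 * s                      # initial abstraction
        if rainfall_mm <= ia:
            return 0.0
        return (rainfall_mm - ia) ** 2 / (rainfall_mm - ia + s)

    # Illustrative effect of SWC measures represented as a lower curve number.
    event_rain = 40.0                                 # mm, assumed storm depth
    q_before = scs_cn_runoff_mm(event_rain, cn=85)    # untreated land (assumed CN)
    q_after = scs_cn_runoff_mm(event_rain, cn=78)     # with soil bunds/trenches (assumed CN)
    print(round(q_before, 1), round(q_after, 1),
          f"{100 * (1 - q_after / q_before):.0f}% less runoff")
    ```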

  1. Model, analysis, and evaluation of the effects of analog VLSI arithmetic on linear subspace-based image recognition.

    PubMed

    Carvajal, Gonzalo; Figueroa, Miguel

    2014-07-01

    Typical image recognition systems operate in two stages: feature extraction to reduce the dimensionality of the input space, and classification based on the extracted features. Analog Very Large Scale Integration (VLSI) is an attractive technology to achieve compact and low-power implementations of these computationally intensive tasks for portable embedded devices. However, device mismatch limits the resolution of the circuits fabricated with this technology. Traditional layout techniques to reduce the mismatch aim to increase the resolution at the transistor level, without considering the intended application. Relating mismatch parameters to specific effects in the application level would allow designers to apply focalized mismatch compensation techniques according to predefined performance/cost tradeoffs. This paper models, analyzes, and evaluates the effects of mismatched analog arithmetic in both feature extraction and classification circuits. For the feature extraction, we propose analog adaptive linear combiners with on-chip learning for both Least Mean Square (LMS) and Generalized Hebbian Algorithm (GHA). Using mathematical abstractions of analog circuits, we identify mismatch parameters that are naturally compensated during the learning process, and propose cost-effective guidelines to reduce the effect of the rest. For the classification, we derive analog models for the circuits necessary to implement the Nearest Neighbor (NN) approach and Radial Basis Function (RBF) networks, and use them to emulate analog classifiers with standard databases of faces and handwritten digits. Formal analysis and experiments show how we can exploit adaptive structures and properties of the input space to compensate the effects of device mismatch at the application level, thus reducing the design overhead of traditional layout techniques. Results are also directly extensible to multiple application domains using linear subspace methods. Copyright © 2014 Elsevier Ltd. All rights reserved.
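
    The adaptive linear combiner at the heart of the LMS discussion can be written in a few lines; the software sketch below (with an invented target weight vector and learning rate) only illustrates the update rule w += mu * e * x and how learning can absorb certain systematic errors, and says nothing about the analog circuit details.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # LMS adaptive linear combiner: learns weights w so that w.x approximates d.
    true_w = np.array([0.8, -0.3, 0.5])   # hypothetical target mapping
    w = np.zeros(3)
    mu = 0.05                             # learning rate

    for _ in range(2000):
        x = rng.normal(size=3)            # input feature vector
        d = true_w @ x                    # desired response
        e = d - w @ x                     # output error
        w += mu * e * x                   # LMS weight update

    print(np.round(w, 3))
    ```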

  2. Implementation of standardized follow-up care significantly reduces peritonitis in children on chronic peritoneal dialysis.

    PubMed

    Neu, Alicia M; Richardson, Troy; Lawlor, John; Stuart, Jayne; Newland, Jason; McAfee, Nancy; Warady, Bradley A

    2016-06-01

    The Standardizing Care to improve Outcomes in Pediatric End stage renal disease (SCOPE) Collaborative aims to reduce peritonitis rates in pediatric chronic peritoneal dialysis patients by increasing implementation of standardized care practices. To assess this, monthly care bundle compliance and annualized monthly peritonitis rates were evaluated from 24 SCOPE centers that were participating at collaborative launch and that provided peritonitis rates for the 13 months prior to launch. Changes in bundle compliance were assessed using either a logistic regression model or a generalized linear mixed model. Changes in average annualized peritonitis rates over time were illustrated using the latter model. In the first 36 months of the collaborative, 644 patients with 7977 follow-up encounters were included. The likelihood of compliance with follow-up care practices increased significantly (odds ratio 1.15, 95% confidence interval 1.10, 1.19). Mean monthly peritonitis rates significantly decreased from 0.63 episodes per patient year (95% confidence interval 0.43, 0.92) prelaunch to 0.42 (95% confidence interval 0.31, 0.57) at 36 months postlaunch. A sensitivity analysis confirmed that as mean follow-up compliance increased, peritonitis rates decreased, reaching statistical significance at 80% at which point the prelaunch rate was 42% higher than the rate in the months following achievement of 80% compliance. In its first 3 years, the SCOPE Collaborative has increased the implementation of standardized follow-up care and demonstrated a significant reduction in average monthly peritonitis rates. Copyright © 2016 International Society of Nephrology. Published by Elsevier Inc. All rights reserved.

  3. Formal Specification and Design Techniques for Wireless Sensor and Actuator Networks

    PubMed Central

    Martínez, Diego; González, Apolinar; Blanes, Francisco; Aquino, Raúl; Simo, José; Crespo, Alfons

    2011-01-01

    A current trend in the development and implementation of industrial applications is to use wireless networks to connect the system nodes, mainly to increase application flexibility, reliability and portability, as well as to reduce the implementation cost. However, the nondeterministic and concurrent behavior of distributed systems makes their analysis and design complex, often resulting in less than satisfactory performance in simulation and test bed scenarios, which is caused by using imprecise models to analyze, validate and design these systems. Moreover, there are some simulation platforms that do not support these models. This paper presents a design and validation method for Wireless Sensor and Actuator Networks (WSAN) which is supported by a minimal set of wireless components represented in Colored Petri Nets (CPN). In summary, the model presented allows users to verify the design properties and structural behavior of the system. PMID:22344203
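
    As background for the CPN-based method, the sketch below gives the plain place/transition Petri-net firing rule on a tiny hypothetical send/receive net; a Colored Petri Net adds typed tokens and arc expressions on top of this, and the wireless components modeled in the paper are not reproduced here.

    ```python
    # Minimal Petri-net semantics: places hold tokens, transitions consume and produce them.
    transitions = {
        "send":    {"in": {"ready": 1},     "out": {"in_flight": 1}},
        "receive": {"in": {"in_flight": 1}, "out": {"delivered": 1}},
    }
    marking = {"ready": 1, "in_flight": 0, "delivered": 0}

    def enabled(marking, t):
        return all(marking.get(p, 0) >= n for p, n in t["in"].items())

    def fire(marking, t):
        m = dict(marking)
        for p, n in t["in"].items():
            m[p] -= n
        for p, n in t["out"].items():
            m[p] = m.get(p, 0) + n
        return m

    for name in ("send", "receive"):
        if enabled(marking, transitions[name]):
            marking = fire(marking, transitions[name])
    print(marking)   # {'ready': 0, 'in_flight': 0, 'delivered': 1}
    ```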

  4. Data Assimilation by delay-coordinate nudging

    NASA Astrophysics Data System (ADS)

    Pazo, Diego; Lopez, Juan Manuel; Carrassi, Alberto

    2016-04-01

    A new nudging method for data assimilation, delay-coordinate nudging, is presented. Delay-coordinate nudging makes explicit use of present and past observations in the formulation of the forcing driving the model evolution at each time-step. Numerical experiments with a low order chaotic system show that the new method systematically outperforms standard nudging in different model and observational scenarios, also when using an un-optimized formulation of the delay-nudging coefficients. A connection between the optimal delay and the dominant Lyapunov exponent of the dynamics is found based on heuristic arguments and is confirmed by the numerical results, providing a guideline for the practical implementation of the algorithm. Delay-coordinate nudging preserves the ease of implementation, the intuitive functioning and the reduced computational cost of standard nudging, making it a potential alternative especially in the field of seasonal-to-decadal predictions with large Earth system models that limit the use of more sophisticated data assimilation procedures.
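
    A rough scalar sketch of the idea is given below: the nudging forcing combines the present innovation with a term built from a past observation and the correspondingly delayed model state. The toy map, gains, noise level and delay are assumptions chosen only so the script runs; they do not reproduce the paper's experiments.

    ```python
    import math
    import random

    def model(x):
        """Toy bounded chaotic map standing in for the forecast model (assumed)."""
        return 0.99 * math.sin(math.pi * x)

    # Standard nudging relaxes the model toward the present observation; the
    # delay-coordinate variant adds a term built from a past observation and the
    # correspondingly delayed model state.
    def delay_nudge_step(x, x_delayed, obs_now, obs_delayed, g_now=0.3, g_delay=0.1):
        return model(x) + g_now * (obs_now - x) + g_delay * (obs_delayed - x_delayed)

    random.seed(1)
    truth, x = 0.41, 0.60
    obs_hist, x_hist = [], []
    delay = 3                                   # assumed delay (in steps)
    for k in range(40):
        truth = model(truth)
        obs_hist.append(truth + random.gauss(0.0, 0.01))
        x_hist.append(x)
        if k >= delay:
            x = delay_nudge_step(x, x_hist[k - delay], obs_hist[k], obs_hist[k - delay])
        else:
            x = model(x) + 0.3 * (obs_hist[k] - x)   # plain nudging during spin-up
    print(round(abs(x - truth), 4))
    ```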

  5. Ontological approach for safe and effective polypharmacy prescription

    PubMed Central

    Grando, Adela; Farrish, Susan; Boyd, Cynthia; Boxwala, Aziz

    2012-01-01

    The intake of multiple medications in patients with various medical conditions challenges the delivery of medical care. Initial empirical studies and pilot implementations seem to indicate that generic safe and effective multi-drug prescription principles could be defined and reused to reduce adverse drug events and to support compliance with medical guidelines and drug formularies. Given that ontologies are known to provide well-principled, sharable, setting-independent and machine-interpretable declarative specification frameworks for modeling and reasoning on biomedical problems, we explore here their use in the context of multi-drug prescription. We propose an ontology for modeling drug-related knowledge and a repository of safe and effective generic prescription principles. To test the usability and the level of granularity of the developed ontology-based specification models and heuristic we implemented a tool that computes the complexity of multi-drug treatments, and a decision aid to check the safeness and effectiveness of prescribed multi-drug treatments. PMID:23304299

  6. Integrating empowerment evaluation and quality improvement to achieve healthcare improvement outcomes.

    PubMed

    Wandersman, Abraham; Alia, Kassandra Ann; Cook, Brittany; Ramaswamy, Rohit

    2015-10-01

    While the body of evidence-based healthcare interventions grows, the ability of health systems to deliver these interventions effectively and efficiently lags behind. Quality improvement approaches, such as the model for improvement, have demonstrated some success in healthcare but their impact has been lessened by implementation challenges. To help address these challenges, we describe the empowerment evaluation approach that has been developed by programme evaluators and a method for its application (Getting To Outcomes (GTO)). We then describe how GTO can be used to implement healthcare interventions. An illustrative healthcare quality improvement example that compares the model for improvement and the GTO method for reducing hospital admissions through improved diabetes care is described. We conclude with suggestions for integrating GTO and the model for improvement. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  7. A neural network controller for automated composite manufacturing

    NASA Technical Reports Server (NTRS)

    Lichtenwalner, Peter F.

    1994-01-01

    At McDonnell Douglas Aerospace (MDA), an artificial neural network based control system has been developed and implemented to control laser heating for the fiber placement composite manufacturing process. This neurocontroller learns an approximate inverse model of the process on-line to provide performance that improves with experience and exceeds that of conventional feedback control techniques. When untrained, the control system behaves as a proportional plus integral (PI) controller. However, after learning from experience, the neural network feedforward control module provides control signals that greatly improve temperature tracking performance. Faster convergence to new temperature set points and reduced temperature deviation due to changing feed rate have been demonstrated on the machine. A Cerebellar Model Articulation Controller (CMAC) network is used for inverse modeling because of its rapid learning performance. This control system is implemented in an IBM compatible 386 PC with an A/D board interface to the machine.
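
    A minimal software sketch of this kind of architecture (a PI loop whose output also trains a feedforward term approximating the inverse model, so the feedback has less work to do with experience) is shown below; a coarse lookup table stands in for the CMAC network, and all gains, bin counts and example numbers are assumptions, not the MDA implementation.

    ```python
    class PIPlusLearnedFeedforward:
        def __init__(self, kp=0.8, ki=0.2, lr=0.1, bins=20):
            self.kp, self.ki, self.lr = kp, ki, lr
            self.integral = 0.0
            self.bins = bins
            self.ff_table = [0.0] * bins          # coarse inverse-model approximation

        def _bin(self, setpoint):
            # Map a setpoint in an assumed 0-100 range onto a table bin.
            return min(self.bins - 1, max(0, int(setpoint * self.bins / 100.0)))

        def command(self, setpoint, measurement):
            error = setpoint - measurement
            self.integral += error
            feedback = self.kp * error + self.ki * self.integral
            feedforward = self.ff_table[self._bin(setpoint)]
            # Train the feedforward term with the feedback signal, shifting
            # credit toward the learned inverse model over time.
            self.ff_table[self._bin(setpoint)] += self.lr * feedback
            return feedback + feedforward

    ctrl = PIPlusLearnedFeedforward()
    print(ctrl.command(setpoint=60.0, measurement=55.0))   # mostly feedback at first
    print(ctrl.command(setpoint=60.0, measurement=58.0))   # feedforward share grows
    ```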

  8. Real-time solution of linear computational problems using databases of parametric reduced-order models with arbitrary underlying meshes

    NASA Astrophysics Data System (ADS)

    Amsallem, David; Tezaur, Radek; Farhat, Charbel

    2016-12-01

    A comprehensive approach for real-time computations using a database of parametric, linear, projection-based reduced-order models (ROMs) based on arbitrary underlying meshes is proposed. In the offline phase of this approach, the parameter space is sampled and linear ROMs defined by linear reduced operators are pre-computed at the sampled parameter points and stored. Then, these operators and associated ROMs are transformed into counterparts that satisfy a certain notion of consistency. In the online phase of this approach, a linear ROM is constructed in real-time at a queried but unsampled parameter point by interpolating the pre-computed linear reduced operators on matrix manifolds and therefore computing an interpolated linear ROM. The proposed overall model reduction framework is illustrated with two applications: a parametric inverse acoustic scattering problem associated with a mockup submarine, and a parametric flutter prediction problem associated with a wing-tank system. The second application is implemented on a mobile device, illustrating the capability of the proposed computational framework to operate in real-time.
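
    To give a feel for the offline/online split, the sketch below interpolates two pre-computed reduced operators at an unsampled parameter point using plain entry-wise linear interpolation; the matrices and parameters are invented, and the paper's approach additionally enforces consistency and interpolates on matrix manifolds rather than entry-wise.

    ```python
    import numpy as np

    # Offline phase (assumed result): reduced operators at two sampled parameter points.
    params = [0.5, 1.0]
    A_samples = [np.array([[-1.0, 0.2], [0.0, -2.0]]),
                 np.array([[-1.4, 0.3], [0.1, -2.5]])]

    def interpolate_rom(mu, params, ops):
        """Online phase: build a reduced operator at an unsampled parameter mu."""
        (m0, m1), (A0, A1) = params, ops
        w = (mu - m0) / (m1 - m0)
        return (1.0 - w) * A0 + w * A1

    A_query = interpolate_rom(0.75, params, A_samples)
    # The interpolated ROM can now be evaluated in real time, e.g. x_dot = A_query @ x.
    print(A_query)
    ```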

  9. A digitally implemented communications experiment utilizing the communications technology satellite, Hermes

    NASA Technical Reports Server (NTRS)

    Jackson, H. D.; Fiala, J.

    1980-01-01

    Developments which will reduce the costs associated with the distribution of satellite services are considered with emphasis on digital communication link implementation. A digitally implemented communications experiment (DICE) which demonstrates the flexibility and efficiency of digital transmission of television video and audio, telephone voice, and high-bit-rate data is described. The utilization of the DICE system in a full duplex teleconferencing mode is addressed. Demonstration teleconferencing results obtained during the conduct of two sessions of the 7th AIAA Communication Satellite Systems Conference are discussed. Finally, the results of link characterization tests conducted to determine (1) relationships between the Hermes channel 1 EIRP and DICE model performance and (2) channel spacing criteria for acceptable multichannel operation, are presented.

  10. Implementation of a transfusion algorithm to reduce blood product utilization in pediatric cardiac surgery.

    PubMed

    Whitney, Gina; Daves, Suanne; Hughes, Alex; Watkins, Scott; Woods, Marcella; Kreger, Michael; Marincola, Paula; Chocron, Isaac; Donahue, Brian

    2013-07-01

    The goal of this project is to measure the impact of standardization of transfusion practice on blood product utilization and postoperative bleeding in pediatric cardiac surgery patients. Transfusion is common following cardiopulmonary bypass (CPB) in children and is associated with increased mortality, infection, and duration of mechanical ventilation. Transfusion in pediatric cardiac surgery is often based on clinical judgment rather than objective data. Although objective transfusion algorithms have demonstrated efficacy for reducing transfusion in adult cardiac surgery, such algorithms have not been applied in the pediatric setting. This quality improvement effort was designed to reduce blood product utilization in pediatric cardiac surgery using a blood product transfusion algorithm. We implemented an evidence-based transfusion protocol in January 2011 and monitored the impact of this algorithm on blood product utilization, chest tube output during the first 12 h of intensive care unit (ICU) admission, and predischarge mortality. When compared with the 12 months preceding implementation, blood utilization per case in the operating room (OR) during the 11 months following implementation decreased by 66% for red cells (P = 0.001) and 86% for cryoprecipitate (P < 0.001). Blood utilization during the first 12 h of ICU did not increase during this time and actually decreased 56% for plasma (P = 0.006) and 41% for red cells (P = 0.031), indicating that the decrease in OR transfusion did not shift the transfusion burden to the ICU. Postoperative bleeding, as measured by chest tube output in the first 12 ICU hours, did not increase following implementation of the algorithm. Monthly surgical volume did not change significantly following implementation of the algorithm (P = 0.477). In a logistic regression model for predischarge mortality among the nontransplant patients, after accounting for surgical severity and duration of CPB, use of the transfusion algorithm was associated with a 0.247 relative risk of mortality (P = 0.013). These results indicate that introduction of an objective transfusion algorithm in pediatric cardiac surgery significantly reduces perioperative blood product utilization and mortality, without increasing postoperative chest tube losses. © 2013 John Wiley & Sons Ltd.

  11. Effect of a checklist on advanced trauma life support workflow deviations during trauma resuscitations without pre-arrival notification.

    PubMed

    Kelleher, Deirdre C; Jagadeesh Chandra Bose, R P; Waterhouse, Lauren J; Carter, Elizabeth A; Burd, Randall S

    2014-03-01

    Trauma resuscitations without pre-arrival notification are often initially chaotic, which can potentially compromise patient care. We hypothesized that trauma resuscitations without pre-arrival notification are performed with more variable adherence to ATLS protocol and that implementation of a checklist would improve performance. We analyzed event logs of trauma resuscitations from two 4-month periods before (n = 222) and after (n = 215) checklist implementation. Using process mining techniques, individual resuscitations were compared with an ideal workflow model of 6 ATLS primary survey tasks performed by the bedside evaluator and given model fitness scores (range 0 to 1). Mean fitness scores and frequency of conformance (fitness = 1) were compared (using Student's t-test or chi-square test, as appropriate) for activations with and without notification both before and after checklist implementation. Multivariable linear regression, controlling for patient and resuscitation characteristics, was also performed to assess the association between pre-arrival notification and model fitness before and after checklist implementation. Fifty-five (12.6%) resuscitations lacked pre-arrival notification (23 pre-implementation and 32 post-implementation; p = 0.15). Before checklist implementation, resuscitations without notification had lower fitness (0.80 vs 0.90; p < 0.001) and conformance (26.1% vs 50.8%; p = 0.03) than those with notification. After checklist implementation, the fitness (0.80 vs 0.91; p = 0.007) and conformance (26.1% vs 59.4%; p = 0.01) improved for resuscitations without notification, but still remained lower than activations with notification. In multivariable analysis, activations without notification had lower fitness both before (b = -0.11, p < 0.001) and after checklist implementation (b = -0.04, p = 0.02). Trauma resuscitations without pre-arrival notification are associated with a decreased adherence to key components of the ATLS primary survey protocol. The addition of a checklist improves protocol adherence and reduces the effect of notification on task performance. Copyright © 2014 American College of Surgeons. Published by Elsevier Inc. All rights reserved.

  12. Development of BEM for ceramic composites

    NASA Technical Reports Server (NTRS)

    Henry, D. P.; Banerjee, P. K.; Dargush, G. F.

    1990-01-01

    Details on the progress made during the first three years of a five-year program towards the development of a boundary element code are presented. This code was designed for the micromechanical studies of advanced ceramic composites. Additional effort was made in generalizing the implementation to allow the program to be applicable to real problems in the aerospace industry. The ceramic composite formulations developed were implemented in the three-dimensional boundary element computer code BEST3D. BEST3D was adopted as the base for the ceramic composite program, so that many of the enhanced features of this general purpose boundary element code could be utilized. Some of these facilities include sophisticated numerical integration, the capability of local definition of boundary conditions, and the use of quadratic shape functions for modeling geometry and field variables on the boundary. The multi-region implementation permits a body to be modeled in substructural parts, thus dramatically reducing the cost of the analysis. Furthermore, it allows a body consisting of regions of different ceramic matrices and inserts to be studied.

  13. Oregon's Coordinated Care Organizations Increased Timely Prenatal Care Initiation And Decreased Disparities.

    PubMed

    Muoto, Ifeoma; Luck, Jeff; Yoon, Jangho; Bernell, Stephanie; Snowden, Jonathan M

    2016-09-01

    Policies at the state and federal levels affect access to health services, including prenatal care. In 2012 the State of Oregon implemented a major reform of its Medicaid program. The new model, called a coordinated care organization (CCO), is designed to improve the coordination of care for Medicaid beneficiaries. This reform effort provides an ideal opportunity to evaluate the impact of broad financing and delivery reforms on prenatal care use. Using birth certificate data from Oregon and Washington State, we evaluated the effect of CCO implementation on the probability of early prenatal care initiation, prenatal care adequacy, and disparities in prenatal care use by type of insurance. Following CCO implementation, we found significant increases in early prenatal care initiation and a reduction in disparities across insurance types but no difference in overall prenatal care adequacy. Oregon's reforms could serve as a model for other Medicaid and commercial health plans seeking to improve prenatal care quality and reduce disparities. Project HOPE—The People-to-People Health Foundation, Inc.

  14. Implementation of an Online Chemistry Model to a Large Eddy Simulation Model (PALM-4U)

    NASA Astrophysics Data System (ADS)

    Mauder, M.; Khan, B.; Forkel, R.; Banzhaf, S.; Russo, E. E.; Sühring, M.; Kanani-Sühring, F.; Raasch, S.; Ketelsen, K.

    2017-12-01

    Large Eddy Simulation (LES) models resolve the relevant scales of turbulent motion, so that these models can capture the inherent unsteadiness of atmospheric turbulence. However, LES models have so far rarely been applied to urban air quality studies, in particular to the chemical transformation of pollutants. In this context, BMBF (Bundesministerium für Bildung und Forschung) funded a joint project, MOSAIK (Modellbasierte Stadtplanung und Anwendung im Klimawandel / Model-based city planning and application in climate change), with the main goal of developing a new, highly efficient urban climate model (UCM) that also includes atmospheric chemical processes. The state-of-the-art LES model PALM (Maronga et al., 2015, Geosci. Model Dev., 8, doi:10.5194/gmd-8-2515-2015) has been used as the core model for the new UCM, named PALM-4U. For the gas-phase chemistry, a fully coupled 'online' chemistry model has been implemented into PALM. The latest version of the Kinetic PreProcessor (KPP), Version 2.3, has been utilized for the numerical integration of the chemical species. Due to the high computational demands of the LES model, compromises in the description of chemical processes are required. Therefore, a reduced chemistry mechanism, which includes only major pollutants, namely O3, NO, NO2, CO, a highly simplified VOC chemistry and a small number of products, has been implemented. This work shows preliminary results of the advection and chemical transformation of atmospheric pollutants. Non-cyclic boundaries have been used for inflow and outflow in the east-west direction, while periodic boundary conditions have been applied at the south-north lateral boundaries. For practical applications, our approach is to go beyond the simulation of single street canyons to the chemical transformation, advection and deposition of air pollutants in the larger urban canopy. Tests of chemistry schemes and initial studies of chemistry-turbulence interaction, transport and transformations are presented.
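
    As a rough illustration of what a reduced mechanism of the kind described above boils down to, the sketch below integrates only the NO-NO2-O3 photostationary triad with SciPy; the rate constant, photolysis frequency and initial concentrations are illustrative assumptions, not values taken from PALM-4U or KPP:

    ```python
    # Minimal reduced-chemistry sketch: NO2 photolysis vs. NO + O3 titration.
    import numpy as np
    from scipy.integrate import solve_ivp

    j_no2 = 8.0e-3        # NO2 photolysis frequency [1/s] (assumed midday value)
    k_no_o3 = 1.8e-14     # NO + O3 rate constant [cm3 molecule-1 s-1] (approximate)

    def rhs(t, c):
        no, no2, o3 = c                   # number densities [molecules cm-3]
        p = j_no2 * no2                   # NO2 + hv -> NO + O3 (O + O2 step lumped in)
        l = k_no_o3 * no * o3             # NO + O3 -> NO2 + O2
        return [p - l, l - p, p - l]

    c0 = [2.5e11, 5.0e11, 7.5e11]         # roughly 10, 20, 30 ppb near the surface
    sol = solve_ivp(rhs, (0.0, 600.0), c0, method="LSODA", rtol=1e-8)

    # The Leighton ratio j[NO2] / (k [NO][O3]) should relax toward 1.
    print(j_no2 * sol.y[1, -1] / (k_no_o3 * sol.y[0, -1] * sol.y[2, -1]))
    ```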

  15. Development of extended WRF variational data assimilation system (WRFDA) for WRF non-hydrostatic mesoscale model

    NASA Astrophysics Data System (ADS)

    Pattanayak, Sujata; Mohanty, U. C.

    2018-06-01

    This paper presents the development of the extended weather research and forecasting data assimilation (WRFDA) system in the framework of the non-hydrostatic mesoscale model core of the weather research and forecasting system (WRF-NMM), an imperative aspect of numerical modeling studies. Although the WRFDA originally provides improved initial conditions only for the advanced research WRF, we have successfully developed a unified WRFDA utility that can be used by the WRF-NMM core as well. After critical evaluation, a code was developed to merge the WRFDA framework with WRF-NMM output. In this paper, we provide selected implementation details and initial results from a single-observation test and from background error statistics such as eigenvalues, eigenvectors and length scales, which demonstrate the successful development of the extended WRFDA code for the WRF-NMM model. Furthermore, the extended WRFDA system is applied to the forecast of three severe cyclonic storms formed over the Bay of Bengal: Nargis (27 April-3 May 2008), Aila (23-26 May 2009) and Jal (4-8 November 2010). Model results are compared and contrasted with the analysis fields and later with high-resolution model forecasts. The mean initial position error is reduced by 33% with WRFDA as compared to the GFS analysis. The vector displacement errors in the track forecasts are reduced by 33, 31, 30 and 20% for the 24, 48, 72 and 96 hr forecasts, respectively, in the data assimilation experiments as compared to the control run. The model diagnostics indicate successful implementation of WRFDA within the WRF-NMM system.
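
    The vector displacement error quoted above is simply the great-circle distance between forecast and observed storm centres at each lead time. A small sketch follows, with made-up positions standing in for the Nargis/Aila/Jal tracks:

    ```python
    # Track (vector displacement) error via the haversine formula.
    import math

    def great_circle_km(lat1, lon1, lat2, lon2, r_earth=6371.0):
        """Haversine distance between two (lat, lon) points in km."""
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp = math.radians(lat2 - lat1)
        dl = math.radians(lon2 - lon1)
        a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
        return 2 * r_earth * math.asin(math.sqrt(a))

    # Forecast vs. best-track positions at 24/48/72 h lead times (hypothetical values).
    forecast = [(16.2, 87.5), (17.8, 88.9), (19.5, 90.2)]
    observed = [(16.0, 87.0), (17.2, 88.1), (18.6, 89.0)]
    errors = [great_circle_km(f[0], f[1], o[0], o[1]) for f, o in zip(forecast, observed)]
    print([round(e, 1) for e in errors])   # displacement error in km per lead time
    ```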

  16. Biomass production in the Lower Mississippi River Basin: Mitigating associated nutrient and sediment discharge to the Gulf of Mexico.

    PubMed

    Ha, Miae; Zhang, Zhonglong; Wu, May

    2018-04-24

    A watershed model was developed using the Soil and Water Assessment Tool (SWAT) that simulates nitrogen, phosphorus, and sediment loadings in the Lower Mississippi River Basin (LMRB). The LMRB SWAT model was calibrated and validated using 21 years of observed flow, sediment, and water-quality data. The baseline model results indicate that agricultural lands within the LMRB are the dominant sources of nitrogen and phosphorus discharging into the Gulf of Mexico. The model was further used to evaluate the impact of biomass production, in the presence of riparian buffers in the LMRB, on suspended-sediment and nutrient loading discharge from the Mississippi River into the Gulf of Mexico. The interplay among land use, riparian buffers, crop type, land slope, water quality, and hydrology was analyzed at various scales. Implementing a riparian buffer in the dominant agricultural region within the LMRB could reduce suspended sediment, nitrogen, and phosphorus loadings at the regional scale by up to 65%, 38%, and 39%, respectively. Implementation of this land management practice can reduce the suspended-sediment content and improve the water quality of the discharge from the LMRB into the Gulf of Mexico and support the potential production of bioenergy and bio-products within the Mississippi River Basin. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. Caring Wisely: A Program to Support Frontline Clinicians and Staff in Improving Healthcare Delivery and Reducing Costs.

    PubMed

    Gonzales, Ralph; Moriates, Christopher; Lau, Catherine; Valencia, Victoria; Imershein, Sarah; Rajkomar, Alvin; Prasad, Priya; Boscardin, Christy; Grady, Deborah; Johnston, S

    2017-08-01

    We describe a program called "Caring Wisely"®, developed by the University of California, San Francisco (UCSF) Center for Healthcare Value, to increase the value of services provided at UCSF Health. The overarching goal of the Caring Wisely® program is to catalyze and advance delivery system redesign and innovations that reduce costs, enhance healthcare quality, and improve health outcomes. The program is designed to engage frontline clinicians and staff, aided by experienced implementation scientists, to develop and implement interventions specifically designed to address overuse, underuse, or misuse of services. Financial savings of the program are intended to cover the program costs. The theoretical underpinnings for the design of the Caring Wisely® program emphasize the importance of stakeholder engagement, behavior change theory, market (target audience) segmentation, and process measurement and feedback. The Caring Wisely® program provides an institutional model for using crowdsourcing to identify "hot spot" areas of low-value care, inefficiency and waste, and for implementing robust interventions to address these areas. © 2017 Society of Hospital Medicine.

  18. An Employee-Centered Care Model Responds to the Triple Aim: Improving Employee Health.

    PubMed

    Fox, Kelly; McCorkle, Ruth

    2018-01-01

    Health care expenditures, patient satisfaction, and timely access to care will remain problematic if dramatic changes in health care delivery models are not developed and implemented. To combat this challenge, a Triple Aim approach is essential; innovation in payment and health care delivery models is required. Using the Donabedian framework of structure, process, and outcome, this article describes a nurse-led employee-centered care model designed to improve consumers' health care experiences, improve employee health, and increase access to care while reducing health care costs for employees, age 18 and older, in a corporate environment.

  19. Active vibration control with model correction on a flexible laboratory grid structure

    NASA Technical Reports Server (NTRS)

    Schamel, George C., II; Haftka, Raphael T.

    1991-01-01

    This paper presents experimental and computational comparisons of three active damping control laws applied to a complex laboratory structure. Two reduced structural models were used, with one model being corrected on the basis of measured mode shapes and frequencies. Three control laws were investigated: a time-invariant linear quadratic regulator with state estimation and two direct rate feedback control laws. Experimental results for all designs were obtained with digital implementation. It was found that model correction improved the agreement between analytical and experimental results. The best agreement was obtained with the simplest direct rate feedback control.
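
    For the first of the three control laws, here is a minimal sketch of how a time-invariant LQR gain might be computed for a reduced two-mode structural model; the frequencies, damping, actuator influence and weights are placeholders, not the laboratory grid-structure data:

    ```python
    # LQR gain for a small reduced-order structural model (illustrative numbers).
    import numpy as np
    from scipy.linalg import solve_continuous_are

    # State x = [q1, q2, q1dot, q2dot]: two lightly damped modes (assumed frequencies).
    w = np.array([2.0 * np.pi * 1.5, 2.0 * np.pi * 4.0])   # rad/s
    zeta = 0.005
    A = np.block([[np.zeros((2, 2)), np.eye(2)],
                  [-np.diag(w**2), -np.diag(2.0 * zeta * w)]])
    B = np.array([[0.0], [0.0], [1.0], [0.5]])             # single actuator (assumed)

    Q = np.diag([10.0, 10.0, 1.0, 1.0])                    # penalize modal displacements
    R = np.array([[0.1]])

    P = solve_continuous_are(A, B, Q, R)                   # solve the Riccati equation
    K = np.linalg.solve(R, B.T @ P)                        # feedback law u = -K x
    print("closed-loop eigenvalues:", np.linalg.eigvals(A - B @ K))
    ```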

  20. [The implementation of the week surgery in an orthopedic and urology ward and assessment of its impact].

    PubMed

    Mulloni, Giovanna; Petrucco, Stefania; De Marc, Raffaella; Nazzi, Cheti; Petri, Roberto; Guarrera, Giovanni Maria

    2015-01-01

    The implementation of the week surgery in an orthopedic and urology ward and the assessment of its impact. The week surgery (WS) is one of the models organized according to the intensity of care, and it allows the appropriateness of hospital admissions to be improved. The aim was to describe the implementation and the impact of the WS on costs and levels of care. The WS was gradually implemented in an orthopedic and urology ward. The planning of the surgeries was modified, the wards to which patients would be transferred during the weekend were identified, the nurses were supported by expert nurses to learn new skills, and clinical pathways were implemented. The periods January-June 2012 and 2013 were compared using a set of indicators identified according to the health technology assessment method. The nurses were able to take vacations according to schedule; the costs of outsourced services were reduced (-4.953 Euros), as were those of consumables. The nursing care could be guaranteed employing fewer (-5) full-time nurses; the global clinical performance of the ward did not vary. Unfortunately, several urology patients could not be discharged during the weekends. Good planning of the surgeries according to the patients' length of stay, together with interventions to increase the staff skill mix and the clinical pathways, allowed an effective and efficient implementation of the WS model without jeopardizing patients' safety.

  1. Performance analysis of distributed symmetric sparse matrix vector multiplication algorithm for multi-core architectures

    DOE PAGES

    Oryspayev, Dossay; Aktulga, Hasan Metin; Sosonkina, Masha; ...

    2015-07-14

    Sparse matrix vector multiply (SpMVM) is an important kernel that frequently arises in high performance computing applications. Due to its low arithmetic intensity, several approaches have been proposed in the literature to improve its scalability and efficiency in large scale computations. In this paper, our target systems are high end multi-core architectures and we use a message passing interface + open multiprocessing (MPI+OpenMP) hybrid programming model for parallelism. We analyze the performance of a recently proposed implementation of the distributed symmetric SpMVM, originally developed for large sparse symmetric matrices arising in ab initio nuclear structure calculations. We also study important features of this implementation and compare with previously reported implementations that do not exploit the underlying symmetry. Our SpMVM implementations leverage the hybrid paradigm to efficiently overlap expensive communications with computations. Our main comparison criterion is the "CPU core hours" metric, which is the main measure of resource usage on supercomputers. We analyze the effects of a topology-aware mapping heuristic using a simplified network load model. Furthermore, we have tested the different SpMVM implementations on two large clusters with 3D Torus and Dragonfly topologies. Our results show that the distributed SpMVM implementation that exploits matrix symmetry and hides communication yields the best value for the "CPU core hours" metric and significantly reduces data movement overheads.
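
    The core idea of the symmetry-exploiting kernel is that each stored upper-triangular entry contributes twice. The serial NumPy sketch below illustrates that kernel only; the paper's implementation distributes it with MPI+OpenMP, and the random matrix here stands in for a nuclear-structure Hamiltonian:

    ```python
    # Symmetric SpMVM storing only the upper triangle (including the diagonal).
    import numpy as np
    import scipy.sparse as sp

    def sym_spmv_upper(A_upper_csr, x):
        """y = A x where only the upper triangle of symmetric A is stored."""
        y = np.zeros_like(x)
        indptr, indices, data = A_upper_csr.indptr, A_upper_csr.indices, A_upper_csr.data
        for i in range(A_upper_csr.shape[0]):
            for k in range(indptr[i], indptr[i + 1]):
                j, a = indices[k], data[k]
                y[i] += a * x[j]
                if j != i:              # mirror the off-diagonal entry
                    y[j] += a * x[i]
        return y

    A = sp.random(200, 200, density=0.02, random_state=0)
    A = ((A + A.T) * 0.5).tocsr()       # make it symmetric
    x = np.random.default_rng(0).standard_normal(200)

    y_ref = A @ x                        # full-matrix reference
    y_sym = sym_spmv_upper(sp.triu(A).tocsr(), x)
    print(np.allclose(y_ref, y_sym))     # True
    ```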

  2. Estimating the variance for heterogeneity in arm-based network meta-analysis.

    PubMed

    Piepho, Hans-Peter; Madden, Laurence V; Roger, James; Payne, Roger; Williams, Emlyn R

    2018-04-19

    Network meta-analysis can be implemented by using arm-based or contrast-based models. Here we focus on arm-based models and fit them using generalized linear mixed model procedures. Full maximum likelihood (ML) estimation leads to biased trial-by-treatment interaction variance estimates for heterogeneity. Thus, our objective is to investigate alternative approaches to variance estimation that reduce bias compared with full ML. Specifically, we use penalized quasi-likelihood/pseudo-likelihood and hierarchical (h) likelihood approaches. In addition, we consider a novel model modification that yields estimators akin to the residual maximum likelihood estimator for linear mixed models. The proposed methods are compared by simulation, and 2 real datasets are used for illustration. Simulations show that penalized quasi-likelihood/pseudo-likelihood and h-likelihood reduce bias and yield satisfactory coverage rates. Sum-to-zero restriction and baseline contrasts for random trial-by-treatment interaction effects, as well as a residual ML-like adjustment, also reduce bias compared with an unconstrained model when ML is used, but coverage rates are not quite as good. Penalized quasi-likelihood/pseudo-likelihood and h-likelihood are therefore recommended. Copyright © 2018 John Wiley & Sons, Ltd.

  3. Agent based models for testing city evacuation strategies under a flood event as strategy to reduce flood risk

    NASA Astrophysics Data System (ADS)

    Medina, Neiler; Sanchez, Arlex; Nokolic, Igor; Vojinovic, Zoran

    2016-04-01

    This research explores the use of Agent Based Models (ABM) and their potential to test large-scale evacuation strategies in coastal cities at risk from flooding due to extreme hydro-meteorological events, with the final purpose of disaster risk reduction by decreasing humans' exposure to the hazard. The first part of the paper covers the theory used to build the models, such as complex adaptive systems (CAS) and the principles and uses of ABM in this field. This first section also outlines the pros and cons of using ABM to test city evacuation strategies at medium and large scales. The second part of the paper focuses on the central theory used to build the ABM, specifically the psychological and behavioral model; the framework used in this research, the PECS reference model, is covered in this section. The last part of this section covers the main attributes or characteristics of human beings used to describe the agents. The third part of the paper presents the methodology used to build and implement the ABM using Repast-Symphony as an open source agent-based modelling and simulation platform. Preliminary results for a first implementation in a region of the island of Sint-Maarten, a Dutch Caribbean island, are presented and discussed in the fourth section of the paper. The results obtained so far are promising for further development of the model and its implementation and testing in a full-scale city.
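
    A very small sketch of the agent-based idea follows, assuming a hypothetical one-dimensional street, a behavioural reaction delay and a fixed walking speed; the real model is built on Repast-Symphony with a PECS-style behavioural layer rather than this toy loop:

    ```python
    # Toy evacuation ABM: agents delay, then walk toward a shelter.
    import random

    class Resident:
        def __init__(self, position, shelter, reaction_delay):
            self.position = position        # metres along an evacuation route
            self.shelter = shelter
            self.delay = reaction_delay     # time steps before the agent reacts
            self.speed = 1.2                # m per step (assumed walking speed)
            self.safe = False

        def step(self):
            if self.safe:
                return
            if self.delay > 0:              # still deciding (stand-in for a PECS drive)
                self.delay -= 1
                return
            direction = 1 if self.shelter > self.position else -1
            self.position += direction * self.speed
            if abs(self.position - self.shelter) < self.speed:
                self.safe = True

    random.seed(1)
    agents = [Resident(random.uniform(0, 500), shelter=500.0,
                       reaction_delay=random.randint(0, 30)) for _ in range(100)]
    for t in range(300):                    # evacuation time window (steps)
        for a in agents:
            a.step()
    # Agents reaching the shelter within the window is the exposure-reduction metric.
    print(sum(a.safe for a in agents), "of", len(agents), "agents reached the shelter")
    ```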

  4. A reduced theoretical model for estimating condensation effects in combustion-heated hypersonic tunnel

    NASA Astrophysics Data System (ADS)

    Lin, L.; Luo, X.; Qin, F.; Yang, J.

    2018-03-01

    As one of the combustion products of hydrocarbon fuels in a combustion-heated wind tunnel, water vapor may condense during the rapid expansion process, which will lead to a complex two-phase flow inside the wind tunnel and even change the design flow conditions at the nozzle exit. The coupling of the phase transition and the compressible flow makes the estimation of the condensation effects in such wind tunnels very difficult and time-consuming. In this work, a reduced theoretical model is developed to approximately compute the nozzle-exit conditions of a flow including real-gas and homogeneous condensation effects. Specifically, the conservation equations of the axisymmetric flow are first approximated in the quasi-one-dimensional way. Then, the complex process is split into two steps, i.e., a real-gas nozzle flow but excluding condensation, resulting in supersaturated nozzle-exit conditions, and a discontinuous jump at the end of the nozzle from the supersaturated state to a saturated state. Compared with two-dimensional numerical simulations implemented with a detailed condensation model, the reduced model predicts the flow parameters with good accuracy except for some deviations caused by the two-dimensional effect. Therefore, this reduced theoretical model can provide a fast, simple but also accurate estimation of the condensation effect in combustion-heated hypersonic tunnels.
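
    One building block of such a quasi-one-dimensional treatment is inverting the isentropic area-Mach relation for the supersonic branch; the sketch below does only that, with an assumed specific-heat ratio and area ratio, and without the real-gas and condensation terms the paper adds:

    ```python
    # Invert A/A* = f(M) for the supersonic root of the isentropic quasi-1D relation.
    from scipy.optimize import brentq

    def area_ratio(M, gamma=1.33):
        """A/A* for isentropic quasi-1D flow (gamma chosen for hot combustion gas)."""
        return (1.0 / M) * ((2.0 / (gamma + 1.0)) *
                            (1.0 + 0.5 * (gamma - 1.0) * M**2)) ** ((gamma + 1.0) / (2.0 * (gamma - 1.0)))

    target = 50.0                       # nozzle exit-to-throat area ratio (assumed)
    M_exit = brentq(lambda M: area_ratio(M) - target, 1.001, 20.0)
    print(f"supersonic exit Mach number ~ {M_exit:.2f}")
    ```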

  5. Melanoma screening: Informing public health policy with quantitative modelling.

    PubMed

    Gilmore, Stephen

    2017-01-01

    Australia and New Zealand share the highest incidence rates of melanoma worldwide. Despite the substantial increase in public and physician awareness of melanoma in Australia over the last 30 years, a result of the introduction of publicly funded mass media campaigns that began in the early 1980s, mortality has steadily increased during this period. This increased mortality has led investigators to question the relative merits of primary versus secondary prevention; that is, sensible sun exposure practices versus early detection. Increased melanoma vigilance on the part of the public and among physicians has resulted in large increases in public health expenditure, primarily from screening costs and increased rates of office surgery. Has this attempt at secondary prevention been effective? Unfortunately, epidemiologic studies addressing the causal relationship between the level of secondary prevention and mortality are prohibitively difficult to implement; it is currently unknown whether increased melanoma surveillance reduces mortality, and if so, whether such an approach is cost-effective. Here I address the issue of secondary prevention of melanoma with respect to incidence and mortality (and cost per life saved) by developing a Markov model of melanoma epidemiology based on Australian incidence and mortality data. The advantages of developing a methodology that can determine constraint-based surveillance outcomes are twofold: first, it can address the issue of effectiveness; and second, it can quantify the trade-off between cost and utilisation of medical resources on one hand, and reduced morbidity and lives saved on the other. With respect to melanoma, implementing the model facilitates the quantitative determination of the relative effectiveness and trade-offs associated with different levels of secondary and tertiary prevention, both retrospectively and prospectively. For example, I show that the surveillance enhancement that began in 1982 has resulted in greater diagnostic incidence and reduced mortality, but the reduced mortality carried a significant cost per life saved. I implement the model out to 2028 and demonstrate that the enhanced secondary prevention that began in 1982 becomes increasingly cost-effective over the period 2013-2028. On the other hand, I show that reductions in mortality achieved by significantly enhancing secondary prevention beyond 2013 levels are comparable with those achieved by only modest improvements in late-stage disease survival. Given the ballooning costs of increased melanoma surveillance, I suggest the process of public health policy decision-making, particularly with respect to the public funding of melanoma screening and discretionary mole removal, would be better served by incorporating the results of quantitative modelling.
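
    Here is a minimal sketch of a Markov cohort model of this general type; the four states and the annual transition probabilities are invented for illustration and are not the calibrated Australian parameters:

    ```python
    # Toy Markov cohort model: healthy / early-stage / late-stage / dead.
    import numpy as np

    P = np.array([
        [0.9990, 0.0007, 0.0001, 0.0002],   # healthy
        [0.0000, 0.9500, 0.0300, 0.0200],   # early-stage melanoma (surveilled/treated)
        [0.0000, 0.0000, 0.8000, 0.2000],   # late-stage melanoma
        [0.0000, 0.0000, 0.0000, 1.0000],   # dead (absorbing)
    ])

    cohort = np.array([1.0, 0.0, 0.0, 0.0])  # everyone starts healthy
    for year in range(30):
        cohort = cohort @ P                  # one annual cycle

    print("state occupancy after 30 years:", np.round(cohort, 4))
    # A screening scenario would be compared by shifting probability mass from the
    # late-stage column toward the early-stage column and re-running the loop.
    ```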

  6. ICARUSS, the Integrated Care for the Reduction of Secondary Stroke trial: rationale and design of a randomized controlled trial of a multimodal intervention to prevent recurrent stroke in patients with a recent cerebrovascular event, ACTRN = 12611000264987.

    PubMed

    Joubert, J; Davis, S M; Hankey, G J; Levi, C; Olver, J; Gonzales, G; Donnan, G A

    2015-07-01

    The majority of strokes, both ischaemic and haemorrhagic, are attributable to a relatively small number of risk factors which are readily manageable in the primary care setting. Implementation of best-practice recommendations for risk factor management is calculated to reduce stroke recurrence by around 80%. However, risk factor management in stroke survivors has generally been poor at the primary care level. A model of care that supports long-term effective risk factor management is needed. To determine whether the model of Integrated Care for the Reduction of Recurrent Stroke (ICARUSS) will, through promotion of implementation of best-practice recommendations for risk factor management, reduce the combined incidence of stroke, myocardial infarction and vascular death in patients with recent stroke or transient ischaemic attack (TIA) of the brain or eye. A prospective, Australian, multicentre, randomized controlled trial. Academic stroke units in Melbourne, Perth and the John Hunter Hospital, New South Wales. 1000 stroke survivors recruited from March 2007 with a recent (<3 months) stroke (ischaemic or haemorrhagic) or a TIA (brain or eye). Randomization and data collection are performed by means of a central computer-generated telephone system (IVRS). Exposure to the ICARUSS model of integrated care or usual care. The composite of stroke, MI or death from any vascular cause, whichever occurs first. Risk factor management in the community, depression, quality of life, disability and dementia. With 1000 patients followed up for a median of one year, with a recurrence rate of 7-10% per year in patients exposed to usual care, the study will have at least 80% power to detect a significant reduction in primary end-points. The ICARUSS study aims to recruit and follow up patients between 2007 and 2013 and demonstrate the effectiveness of exposure to the ICARUSS model in stroke survivors to reduce recurrent stroke or vascular events and promote the implementation of best practice risk factor management at the primary care level. © 2015 World Stroke Organization.

  7. Road trauma among young Australians: Implementing policy to reduce road deaths and serious injury.

    PubMed

    Walker, Clara; Thompson, Jason; Stevenson, Mark

    2017-05-19

    The objective of this study was to estimate the likely reduction in road trauma associated with the implementation of effective interventions to reduce road trauma among young Australians. A desktop evaluation was conducted to model the likely reduction in road trauma (deaths and serious injuries resulting in hospitalization) among young people aged 17-24 years residing in Queensland, New South Wales, and Victoria. Potential interventions were identified using a rapid literature review and assigned a score based on evidence of effectiveness and implementation feasibility with the 3 highest scoring interventions included in the modeling. Likely reduction in road trauma was estimated by applying the average risk reduction effect sizes for each intervention to baseline risk (passenger or driver death or serious injury per 100,000 population) of road trauma for young Australians. Point estimates were calculated for the potential number of deaths and serious injuries averted in each state and per 100,000 population, with a one-way sensitivity analysis conducted using uncertainty ranges identified. Peer passenger and night driving restrictions as well as improved vehicle safety measures had the greatest potential to reduce road trauma. Peer passenger restrictions could avert 14 (range: 5-24) and 24 (range: 8-41) hospitalizations per year in Queensland and New South Wales, respectively, and night driving restrictions could avert 17 (range: 7-26), 28 (range: 12-45), and 13 (range: 6-21) hospitalizations annually in Queensland, New South Wales, and Victoria. These interventions reduced fatalities by less than 1 death annually in each state. Improved vehicle safety measures could avert 0-3, 0-4, and 0-3 deaths and 3-91, 4-156, and 2-75 hospitalizations in Queensland, New South Wales, and Victoria. Key elements of graduated licensing (peer passenger and night driving restrictions) along with vehicle safety interventions offer modest but practically significant reductions in road trauma for young Australians. State governments need to revise current legislation to ensure that these reductions in road trauma can be realized.

  8. Monte Carlo verification of radiotherapy treatments with CloudMC.

    PubMed

    Miras, Hector; Jiménez, Rubén; Perales, Álvaro; Terrón, José Antonio; Bertolet, Alejandro; Ortiz, Antonio; Macías, José

    2018-06-27

    A new implementation has been made on CloudMC, a cloud-based platform presented in a previous work, in order to provide services for radiotherapy treatment verification by means of Monte Carlo in a fast, easy and economical way. A description of the architecture of the application and the new developments implemented is presented together with the results of the tests carried out to validate its performance. CloudMC has been developed over Microsoft Azure cloud. It is based on a map/reduce implementation for Monte Carlo calculations distribution over a dynamic cluster of virtual machines in order to reduce calculation time. CloudMC has been updated with new methods to read and process the information related to radiotherapy treatment verification: CT image set, treatment plan, structures and dose distribution files in DICOM format. Some tests have been designed in order to determine, for the different tasks, the most suitable type of virtual machines from those available in Azure. Finally, the performance of Monte Carlo verification in CloudMC is studied through three real cases that involve different treatment techniques, linac models and Monte Carlo codes. Considering computational and economic factors, D1_v2 and G1 virtual machines were selected as the default type for the Worker Roles and the Reducer Role respectively. Calculation times up to 33 min and costs of 16 € were achieved for the verification cases presented when a statistical uncertainty below 2% (2σ) was required. The costs were reduced to 3-6 € when uncertainty requirements are relaxed to 4%. Advantages like high computational power, scalability, easy access and pay-per-usage model, make Monte Carlo cloud-based solutions, like the one presented in this work, an important step forward to solve the long-lived problem of truly introducing the Monte Carlo algorithms in the daily routine of the radiotherapy planning process.
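
    The map/reduce pattern described above amounts to mapping independent histories to workers and reducing their partial tallies into a mean dose and a statistical uncertainty. Below is a toy sketch with a stand-in scoring function; no real transport code or Azure API is involved:

    ```python
    # Toy map/reduce Monte Carlo tally: map histories to workers, reduce the sums.
    import numpy as np

    def worker(n_histories, seed):
        """Map step: pretend each history deposits a random dose in one voxel."""
        rng = np.random.default_rng(seed)
        deposits = rng.exponential(scale=1.0, size=n_histories)
        return deposits.sum(), (deposits**2).sum(), n_histories

    def reduce_tallies(partials):
        """Reduce step: combine sums into mean dose and relative 2-sigma error."""
        s1 = sum(p[0] for p in partials)
        s2 = sum(p[1] for p in partials)
        n = sum(p[2] for p in partials)
        mean = s1 / n
        var_of_mean = (s2 / n - mean**2) / n
        return mean, 2.0 * np.sqrt(var_of_mean) / mean

    partials = [worker(200_000, seed) for seed in range(20)]   # 20 "virtual machines"
    mean_dose, rel_err = reduce_tallies(partials)
    print(f"mean dose {mean_dose:.4f}, 2-sigma relative uncertainty {rel_err:.4%}")
    ```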

  9. Understanding climate policy data needs

    NASA Astrophysics Data System (ADS)

    Brown, Molly E.; Macauley, Molly

    2012-08-01

    NASA Carbon Monitoring System: Characterizing Flux Uncertainty; Washington, D. C., 11 January 2012 Climate policy in the United States is currently guided by public-private partnerships and actions at the local and state levels that focus on energy efficiency, renewable energy, agricultural practices, and implementation of technologies to reduce greenhouse gases. How will policy makers know if these strategies are working, particularly at the scales at which they are being implemented? The NASA Carbon Monitoring System (CMS) will provide information on carbon dioxide (CO2) fluxes derived from observations of Earth's land, ocean, and atmosphere used in state-of-the-art models describing their interactions. This new modeling system could be used to assess the impact of specific policy interventions on reductions of atmospheric CO2 concentrations, enabling an iterative, results-oriented policy process.

  10. Reduced Order Model Implementation in the Risk-Informed Safety Margin Characterization Toolkit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mandelli, Diego; Smith, Curtis L.; Alfonsi, Andrea

    2015-09-01

    The RISMC project aims to develop new advanced simulation-based tools to perform Probabilistic Risk Analysis (PRA) for the existing fleet of U.S. nuclear power plants (NPPs). These tools numerically model not only the thermo-hydraulic behavior of the reactor primary and secondary systems but also the temporal evolution of external events and components/system ageing. Thus, this is not only a multi-physics problem but also a multi-scale problem (both spatial, µm-mm-m, and temporal, ms-s-minutes-years). As part of the RISMC PRA approach, a large number of computationally expensive simulation runs are required. An important aspect is that even though computational power is regularly growing, the overall computational cost of a RISMC analysis may not be viable for certain cases. A solution that is being evaluated is the use of reduced order modeling techniques. During FY2015, we investigated and applied reduced order modeling techniques to decrease the RISMC analysis computational cost by decreasing the number of simulation runs to perform and employing surrogate models instead of the actual simulation codes. This report focuses on the use of reduced order modeling techniques that can be applied to any RISMC analysis to generate, analyze and visualize data. In particular, we focus on surrogate models that approximate the simulation results but in a much faster time (µs instead of hours/days). We apply reduced order and surrogate modeling techniques to several RISMC types of analyses using RAVEN and RELAP-7 and show the advantages that can be gained.
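
    The surrogate idea can be reduced to: run the expensive code at a few sample points, fit a cheap approximation, then interrogate the surrogate many times. Below is a hedged sketch with a stand-in for a RELAP-7-type response and a simple polynomial surrogate; the actual toolkit supports richer surrogates through RAVEN:

    ```python
    # Cheap polynomial surrogate of an "expensive" simulation response.
    import numpy as np

    def expensive_code(power_fraction):
        """Stand-in thermal-hydraulic response (peak clad temperature, K)."""
        return 600.0 + 450.0 * power_fraction**2 + 30.0 * np.sin(6.0 * power_fraction)

    train_x = np.linspace(0.0, 1.2, 12)                   # a handful of expensive runs
    train_y = np.array([expensive_code(x) for x in train_x])

    surrogate = np.poly1d(np.polyfit(train_x, train_y, deg=4))   # cheap approximation

    # Query the surrogate many times (microseconds each) instead of the code.
    query_x = np.random.default_rng(0).uniform(0.0, 1.2, 100_000)
    limit_exceeded = np.mean(surrogate(query_x) > 1000.0)
    print(f"estimated probability of exceeding the 1000 K limit: {limit_exceeded:.3f}")
    ```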

  11. Disciplined rubidium oscillator with GPS selective availability

    NASA Technical Reports Server (NTRS)

    Dewey, Wayne P.

    1993-01-01

    A U.S. Department of Defense decision for continuous implementation of GPS Selective Availability (S/A) has made it necessary to modify Rubidium oscillator disciplining methods. One such method for reducing the effects of S/A on the oscillator disciplining process was developed which achieves results approaching pre-S/A GPS performance. The Satellite Hopping algorithm used to minimize the effects of S/A on the oscillator disciplining process is described, and the results of using this process are compared to those obtained prior to the implementation of S/A. Test results are from a TrueTime Rubidium-based Model GPS-DC timing receiver.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shi, X.P.

    Empirical studies on the effectiveness of workplace safety regulations are inconclusive. This study hypothesizes that the asynchronous effects of safety regulations occur because regulations need time to become effective. Safety regulations will work initially by reducing the most serious accidents, and later by improving overall safety performance. The hypothesis is tested by studying a provincial level aggregate panel dataset for China's coal industry using two different models with different sets of dependent variables: a fixed-effects model on mortality rate, which is defined as fatalities per 1,000 employees; and a negative binomial model on the annual number (frequency) of disastrous accidents. Safety regulations can reduce the frequency of disastrous accidents, but have not reduced the mortality rate, which represents overall safety performance. Policy recommendations are made, including shifting production from small to large mines through industrial consolidation, improving the safety performance of large mines, addressing consequences of decentralization, and facilitating the implementation of regulations through continued institutional actions and supporting legislation.
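
    As a sketch of what the second of the two models might look like in code, the example below fits a negative binomial regression of annual disastrous-accident counts on a post-regulation indicator; the province-year data are synthetic stand-ins for the actual panel, and the variable names are assumptions:

    ```python
    # Negative binomial count regression on synthetic panel-style data.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(42)
    n = 300
    post_regulation = rng.integers(0, 2, n)          # 0/1 regulation indicator
    log_output = rng.normal(3.0, 0.5, n)             # exposure-type control (assumed)
    mu = np.exp(0.5 + 0.8 * log_output - 0.4 * post_regulation)
    counts = rng.poisson(mu * rng.gamma(shape=2.0, scale=0.5, size=n))  # overdispersed

    X = sm.add_constant(np.column_stack([log_output, post_regulation]))
    fit = sm.GLM(counts, X, family=sm.families.NegativeBinomial()).fit()
    print(fit.summary().tables[1])   # the post-regulation coefficient should sit near -0.4
    ```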

  13. Creating a Methodology for Coordinating High-resolution Air Quality Improvement Map and Greenhouse Gas Mitigation Strategies in Pittsburgh City

    NASA Astrophysics Data System (ADS)

    Shi, J.; Donahue, N. M.; Klima, K.; Blackhurst, M.

    2016-12-01

    In order to trade off the global impacts of greenhouse gases against the highly local impacts of conventional air pollution, researchers require a method to compare global and regional impacts. Unfortunately, we are not aware of a method that allows these to be compared "apples-to-apples". In this research we propose a three-step model to compare possible city-wide actions to reduce greenhouse gases and conventional air pollutants. We focus on Pittsburgh, PA, a city with consistently poor air quality that is interested in reducing both greenhouse gases and conventional air pollutants. First, we use the 2013 Pittsburgh Greenhouse Gas Inventory to update the Blackhurst et al. model and estimate the greenhouse gas abatement potentials and implementation costs of proposed greenhouse gas reduction efforts. Second, we use field tests for PM2.5, NOx, SOx, organic carbon (OC) and elemental carbon (EC) to inform a land-use regression model for local air pollution at a 100m x 100m spatial resolution, which, combined with a social cost of air pollution model (EASIUR), allows us to calculate economic social damages. Third, we combine these two models into a three-dimensional greenhouse gas cost abatement curve to understand the implementation costs and social benefits, in terms of air quality improvement and greenhouse gas abatement, of each potential intervention. We anticipate that such results could provide policy makers with insights for green city development.

  14. Multi-Dimensional Quantum Tunneling and Transport Using the Density-Gradient Model

    NASA Technical Reports Server (NTRS)

    Biegel, Bryan A.; Yu, Zhi-Ping; Ancona, Mario; Rafferty, Conor; Saini, Subhash (Technical Monitor)

    1999-01-01

    We show that quantum effects are likely to significantly degrade the performance of MOSFETs (metal oxide semiconductor field effect transistor) as these devices are scaled below 100 nm channel length and 2 nm oxide thickness over the next decade. A general and computationally efficient electronic device model including quantum effects would allow us to monitor and mitigate these effects. Full quantum models are too expensive in multi-dimensions. Using a general but efficient PDE solver called PROPHET, we implemented the density-gradient (DG) quantum correction to the industry-dominant classical drift-diffusion (DD) model. The DG model efficiently includes quantum carrier profile smoothing and tunneling in multi-dimensions and for any electronic device structure. We show that the DG model reduces DD model error from as much as 50% down to a few percent in comparison to thin oxide MOS capacitance measurements. We also show the first DG simulations of gate oxide tunneling and transverse current flow in ultra-scaled MOSFETs. The advantages of rapid model implementation using the PDE solver approach will be demonstrated, as well as the applicability of the DG model to any electronic device structure.
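
    For reference, in one common formulation of density-gradient theory (an assumption added here, following Ancona's published form rather than anything stated in this abstract), the electron quasi-Fermi potential gains a gradient correction,

    $$\varphi_n = \phi - \frac{kT}{q}\,\ln\frac{n}{n_i} + 2\,b_n\,\frac{\nabla^2\sqrt{n}}{\sqrt{n}}, \qquad b_n = \frac{\hbar^2}{12\,q\,m_n^*},$$

    while the current keeps its drift-diffusion form $J_n = -q\,\mu_n\,n\,\nabla\varphi_n$, so the classical DD model is recovered in the limit $b_n \to 0$ and the carrier-profile smoothing and tunneling behavior enter through the extra Laplacian term.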

  15. Toward optimal implementation of cancer prevention and control programs in public health: a study protocol on mis-implementation.

    PubMed

    Padek, Margaret; Allen, Peg; Erwin, Paul C; Franco, Melissa; Hammond, Ross A; Heuberger, Benjamin; Kasman, Matt; Luke, Doug A; Mazzucca, Stephanie; Moreland-Russell, Sarah; Brownson, Ross C

    2018-03-23

    Much of the cancer burden in the USA is preventable, through application of existing knowledge. State-level funders and public health practitioners are in ideal positions to affect programs and policies related to cancer control. Mis-implementation refers to ending effective programs and policies prematurely or continuing ineffective ones. Greater attention to mis-implementation should lead to use of effective interventions and more efficient expenditure of resources, which in the long term, will lead to more positive cancer outcomes. This is a three-phase study that takes a comprehensive approach, leading to the elucidation of tactics for addressing mis-implementation. Phase 1: We assess the extent to which mis-implementation is occurring among state cancer control programs in public health. This initial phase will involve a survey of 800 practitioners representing all states. The programs represented will span the full continuum of cancer control, from primary prevention to survivorship. Phase 2: Using data from phase 1 to identify organizations in which mis-implementation is particularly high or low, the team will conduct eight comparative case studies to get a richer understanding of mis-implementation and to understand contextual differences. These case studies will highlight lessons learned about mis-implementation and identify hypothesized drivers. Phase 3: Agent-based modeling will be used to identify dynamic interactions between individual capacity, organizational capacity, use of evidence, funding, and external factors driving mis-implementation. The team will then translate and disseminate findings from phases 1 to 3 to practitioners and practice-related stakeholders to support the reduction of mis-implementation. This study is innovative and significant because it will (1) be the first to refine and further develop reliable and valid measures of mis-implementation of public health programs; (2) bring together a strong, transdisciplinary team with significant expertise in practice-based research; (3) use agent-based modeling to address cancer control implementation; and (4) use a participatory, evidence-based, stakeholder-driven approach that will identify key leverage points for addressing mis-implementation among state public health programs. This research is expected to provide replicable computational simulation models that can identify leverage points and public health system dynamics to reduce mis-implementation in cancer control and may be of interest to other health areas.

  16. A comprehensive Network Security Risk Model for process control networks.

    PubMed

    Henry, Matthew H; Haimes, Yacov Y

    2009-02-01

    The risk of cyber attacks on process control networks (PCN) is receiving significant attention due to the potentially catastrophic extent to which PCN failures can damage the infrastructures and commodity flows that they support. Risk management addresses the coupled problems of (1) reducing the likelihood that cyber attacks would succeed in disrupting PCN operation and (2) reducing the severity of consequences in the event of PCN failure or manipulation. The Network Security Risk Model (NSRM) developed in this article provides a means of evaluating the efficacy of candidate risk management policies by modeling the baseline risk and assessing expectations of risk after the implementation of candidate measures. Where existing risk models fall short of providing adequate insight into the efficacy of candidate risk management policies due to shortcomings in their structure or formulation, the NSRM provides model structure and an associated modeling methodology that captures the relevant dynamics of cyber attacks on PCN for risk analysis. This article develops the NSRM in detail in the context of an illustrative example.

  17. The impact of electronic medical record systems on outpatient workflows: a longitudinal evaluation of its workflow effects.

    PubMed

    Vishwanath, Arun; Singh, Sandeep Rajan; Winkelstein, Peter

    2010-11-01

    The promise of the electronic medical record (EMR) lies in its ability to reduce the costs of health care delivery and improve the overall quality of care, a promise that is realized through major changes in workflows within the health care organization. Yet little systematic information exists about the workflow effects of EMRs. Moreover, some of the research to date points to reduced satisfaction among physicians after implementation of the EMR and increased time, i.e., negative workflow effects. A better understanding of the impact of the EMR on workflows is, hence, vital to understanding what the technology really does offer that is new and unique. (i) To empirically develop a physician-centric conceptual model of the workflow effects of EMRs; (ii) To use the model to understand the antecedents to the physicians' workflow expectation from the new EMR; (iii) To track physicians' satisfaction over time, 3 months and 20 months after implementation of the EMR; (iv) To explore the impact of technology learning curves on physicians' reported satisfaction levels. The current research uses the mixed-method technique of concept mapping to empirically develop the conceptual model of an EMR's workflow effects. The model is then used within a controlled study to track physician expectations from a new EMR system as well as their assessments of the EMR's performance 3 months and 20 months after implementation. The research tracks the actual implementation of a new EMR within the outpatient clinics of a large northeastern research hospital. The pre-implementation survey netted 20 physician responses; the post-implementation Time 1 survey netted 22 responses, and the Time 2 survey netted 26 physician responses. The implementation of the actual EMR served as the intervention. Since the study was conducted within the same setting and tracked a homogeneous group of respondents, the overall study design ensured against extraneous influences on the results. Outcome measures were derived empirically from the conceptual model. They included 85 items that measured physician perceptions of the EMR's workflow effect on the following eight issues: (1) administration, (2) efficiency in patient processing, (3) basic clinical processes, (4) documentation of patient encounter, (5) economic challenges and reimbursement, (6) technical issues, (7) patient safety and care, and (8) communication and confidentiality. The items were used to track expectations prior to implementation and they served as retrospective measures of satisfaction with the EMR in post-implementation Time 1 and Time 2. The findings suggest that physicians conceptualize EMRs as an incremental extension of older computerized provider order entries (CPOEs) rather than as a new innovation. The EMR's major functional advantages are seen to be very similar to, if not the same as, those of CPOEs. Technology learning curves play a statistically significant though minor role in shaping physician perceptions. The physicians' expectations from the EMR are based on their prior beliefs rather than on a rational evaluation of the EMR's fit, functionality, or performance. Their decision regarding the usefulness of the EMR is made very early, within the first few months of use of the EMR. These early perceptions then remain stable and become the lens through which subsequent experience with the EMR is interpreted.
The findings suggest a need for communication-based interventions aimed at explaining the value, fit, and usefulness of EMRs to physicians early in the pre- and immediate post-EMR implementation stages. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  18. Modeling College Graduation GPA Considering Equity in Admissions: Evidence from the University of Puerto Rico

    ERIC Educational Resources Information Center

    Matos-Díaz, Horacio; García, Dwight

    2014-01-01

    Over concerns about private school students' advantages in standardized tests, beginning in 1995-96 the University of Puerto Rico (UPR) implemented a new admissions formula that reduced the weight they previously had in the General Admissions Index (GAI), on which its admissions decisions are based. This study seeks to determine the possible…

  19. The Effects of Positive Behavior Interventions and Support on Changing the Behavior of Red Zone Students

    ERIC Educational Resources Information Center

    Robinson, Fredrick

    2012-01-01

    In order to improve culture, safety, and climate, numerous schools nationwide are implementing Positive Behavior Interventions and Support (PBIS). The purpose of this study was to examine the effectiveness of the Positive Behavior Interventions and Support (PBIS) model for reducing high-risk behaviors of students identified as red zone. The…

  20. 77 FR 73575 - Approval and Promulgation of Air Quality Implementation Plans; West Virginia; Redesignation of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-11

    ...) and Cross State Air Pollution Rule (CSAPR or the Transport Rule) On May 12, 2005, EPA published CAIR...) for the purpose of reducing SO 2 and NO X emissions. The monitoring data used to demonstrate the Area... Source Review (NSR) permit programs; Provisions for air pollution modeling; and Provisions for public and...

  1. And never the twain shall meet? Integrating revenue cycle and supply chain functions.

    PubMed

    Matjucha, Karen A; Chung, Bianca

    2008-09-01

    Four initial steps to implementing a profit and loss management model are: Identify the supplies clinicians are using. Empower stakeholders to remove items that are not commonly used. Reduce factors driving wasted product. Review the chargemaster to ensure that supplies used in selected procedures are represented. Strategically set prices that optimize maximum allowable reimbursement.

  2. Understanding User Resistance to Information Technology: Toward a Comprehensive Model in Health Information Technology

    ERIC Educational Resources Information Center

    Ngafeeson, Madison N.

    2013-01-01

    The successful implementation of health information systems is expected to increase legibility, reduce medical errors, boost the quality of healthcare and shrink costs. Yet, evidence points to the fact that healthcare professionals resist the full use of these systems. Physicians and nurses have been reported to resist the system. Even though…

  3. Sensitivity of disease management decision aids to temperature input errors associated with out-of-canopy and reduced time-resolution measurements

    USDA-ARS?s Scientific Manuscript database

    Plant disease management decision aids typically require inputs of weather elements such as air temperature. Whereas many disease models are created based on weather elements at the crop canopy, and with relatively fine time resolution, the decision aids commonly are implemented with hourly weather...

  4. Bridging Research and Practice: Challenges and Successes in Implementing Evidence-Based Preventive Intervention Strategies for Child Maltreatment

    ERIC Educational Resources Information Center

    Toth, Sheree L.; Manly, Jody Todd

    2011-01-01

    Child maltreatment has been associated with a wide range of negative developmental outcomes for children and families as well as significant economic consequences. While efficacious intervention strategies have been demonstrated to reduce symptoms of trauma and to improve behavioral and emotional functioning, these models have not been widely…

  5. The Role of Public Policies in Reducing Smoking

    PubMed Central

    Levy, David T.; Boyle, Raymond G.; Abrams, David B.

    2015-01-01

    Background: Following the landmark lawsuit and settlement with the tobacco industry, Minnesota pursued the implementation of stricter tobacco control policies, including tax increases, mass media campaigns, smokefree air laws, and cessation treatment policies. Modeling is used to examine policy effects on smoking prevalence and smoking-attributable deaths. Purpose: To estimate the effect of tobacco control policies in Minnesota on smoking prevalence and smoking-attributable deaths using the SimSmoke simulation model. Methods: Minnesota data starting in 1993 are applied to SimSmoke, a simulation model used to examine the effect of tobacco control policies over time on smoking initiation and cessation. Upon validating the model against smoking prevalence, SimSmoke is used to distinguish the effect of policies implemented since 1993 on smoking prevalence. Using standard attribution methods, SimSmoke also estimates deaths averted as a result of the policies. Results: SimSmoke predicts smoking prevalence accurately between 1993 and 2011. Since 1993, a relative reduction in smoking rates of 29% by 2011 and of 41% by 2041 can be attributed to tobacco control policies, mainly tax increases, smokefree air laws, media campaigns, and cessation treatment programs. Moreover, 48,000 smoking-attributable deaths will be averted by 2041. Conclusions: Minnesota SimSmoke demonstrates that tobacco control policies, especially taxes, have substantially reduced smoking prevalence and smoking-attributable deaths. Taxes, smokefree air laws, mass media, cessation treatment policies, and youth-access enforcement contributed to the decline in prevalence and deaths averted, with the strongest component being taxes. With stronger policies, for example, increasing cigarette taxes to $4.00 per pack, Minnesota’s smoking rate could be reduced by another 13%, and 7200 deaths could be averted by 2041. PMID:23079215
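
    A SimSmoke-style projection ultimately iterates a prevalence difference equation in which policies shift initiation and cessation rates. The sketch below only illustrates that mechanism; the rates and the 20% cessation boost are invented, not Minnesota's calibrated parameters:

    ```python
    # Toy prevalence projection: policies act through a cessation-rate multiplier.
    def project_prevalence(p0, years, initiation=0.006, cessation=0.045,
                           policy_cessation_boost=1.20):
        prev = p0
        trajectory = [prev]
        for _ in range(years):
            quits = prev * cessation * policy_cessation_boost
            starts = (1.0 - prev) * initiation
            prev = prev + starts - quits
            trajectory.append(prev)
        return trajectory

    baseline = project_prevalence(0.22, 20, policy_cessation_boost=1.0)
    with_policy = project_prevalence(0.22, 20)
    print(f"prevalence after 20 years: baseline {baseline[-1]:.3f}, "
          f"with policies {with_policy[-1]:.3f}")
    ```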

  6. Reaching Healthy People 2010 by 2013

    PubMed Central

    Levy, David T.; Mabry, Patricia L.; Graham, Amanda L.; Orleans, C. Tracy; Abrams, David B.

    2010-01-01

    Background: Healthy People 2010 (HP 2010) set as a goal to reduce adult smoking prevalence to 12% by 2010. Purpose: This paper uses simulation modeling to examine the effects of three tobacco control policies and cessation treatment policies—alone and in conjunction—on population smoking prevalence. Methods: Building on previous versions of the SimSmoke model, the effects of a defined set of policies on quit attempts, treatment use, and treatment effectiveness are estimated as potential levers to reduce smoking prevalence. The analysis considers the effects of (1) price increases through cigarette tax increases, (2) smokefree indoor air laws, (3) mass media/educational policies, and (4) evidence-based and promising cessation treatment policies. Results: Evidence-based cessation treatment policies have the strongest effect, boosting the population quit rate by 78.8% in relative terms. Treatment policies are followed by cigarette tax increases (65.9%), smokefree air laws (31.8%), and mass media/educational policies (18.2%). Relative to the status quo in 2020, the model projects that smoking prevalence is reduced by 14.3% by a nationwide tax increase of $2.00, by 7.2% by smokefree laws, by 4.7% by mass media/educational policies, and by 16.5% by cessation treatment policies alone. Implementing all of the above policies in tandem would increase the quit rate by 296% such that the HP 2010 smoking prevalence goal of 12% is reached by 2013. Conclusions: The impact of a combination of policies led to some surprisingly optimistic possible futures in lowering smoking prevalence to 12% within just several years. Simulation models can be a useful tool to evaluate complex scenarios where policies are implemented in tandem and for which there are limited data. PMID:20176310

  7. Necessity is the mother of invention: an innovative hospitalist-resident initiative for improving quality and reducing readmissions from skilled nursing facilities.

    PubMed

    Petigara, Sunny; Krishnamurthy, Mahesh; Livert, David

    2017-03-01

    Background: Hospital readmissions have been a major challenge to the US health system. Medicare data shows that approximately 25% of Medicare skilled nursing facility (SNF) residents are readmitted back to the hospital within 30 days. Some of the major reasons for high readmission rates include fragmented information exchange during transitions of care and limited access to physicians round-the-clock in SNFs. These represent safety, quality, and health outcome concerns. Aim: The goal of the project was to reduce hospital readmission rates from SNFs by improving transition of care and increasing physician availability in SNFs (five to seven days a week physical presence with 24/7 accessibility by phone). Methods: We proposed a model whereby a hospitalist-led team, including the resident on the geriatrics rotation, followed patients discharged from the hospital to one SNF. Readmission rates pre- and post-implementation were compared. Study results: The period between January 2014 and June 2014 served as the baseline and showed a readmission rate of 32.32% from the SNF back to the hospital. After we implemented the new hospitalist SNF model in June 2014, readmission rates decreased to 23.96% between July 2014 and December 2014. From January 2015 to June 2015, the overall readmission rate from the SNF reduced further to 16.06%. Statistical analysis revealed a post-intervention odds ratio of 0.403 (p < 0.001). Conclusion: The government is piloting several care models that incentivize value-based behavior. Our study strongly suggests that the hospitalist-resident continuity model of following patients to the SNFs can significantly decrease 30-day hospital readmission rates.

  8. Life Prediction of Large Lithium-Ion Battery Packs with Active and Passive Balancing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shi, Ying; Smith, Kandler A; Zane, Regan

    Lithium-ion battery packs make up a major part of large-scale stationary energy storage systems. One challenge in reducing battery pack cost is to reduce pack size without compromising pack service performance and lifespan. A prognostic life model can be a powerful tool for handling the state of health (SOH) estimate and enabling an active life balancing strategy to reduce cell imbalance and extend pack life. This work proposed a life model using both empirical and physics-based approaches. The life model described the compounding effect of different degradations on the entire cell with an empirical model. Its lower-level submodels then considered the complex physical links between testing statistics (state of charge level, C-rate level, duty cycles, etc.) and the degradation reaction rates associated with specific aging mechanisms. The hybrid approach made the life model generic, robust and stable regardless of battery chemistry and application usage. The model was validated with a custom pack with both passive and active balancing systems implemented, which created four different aging paths in the pack. The life model successfully captured the aging trajectories of all four paths. The life model prediction errors on capacity fade and resistance growth were within +/-3% and +/-5% of the experimental measurements.
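
    As a hedged sketch of the kind of top-level empirical equation such a life model might use, the example below sums a calendar-aging term (square root in time) and a cycling term (linear in throughput), each scaled by simple stress factors; the functional form and coefficients are generic placeholders, not the paper's fitted submodels:

    ```python
    # Generic empirical capacity-fade sketch: calendar term + cycling term.
    import numpy as np

    def capacity_fade(days, full_cycles, soc_avg, c_rate):
        """Fractional capacity loss; stress factors scale the two base terms."""
        calendar = 0.002 * (1.0 + 0.5 * soc_avg) * np.sqrt(days)
        cycling = 2.0e-5 * (1.0 + 0.3 * c_rate) * full_cycles
        return calendar + cycling

    days = np.arange(0, 365 * 3)
    gentle = capacity_fade(days, full_cycles=0.5 * days, soc_avg=0.5, c_rate=0.5)
    harsh = capacity_fade(days, full_cycles=2.0 * days, soc_avg=0.9, c_rate=2.0)
    print(f"after 3 years: {1 - gentle[-1]:.2%} vs {1 - harsh[-1]:.2%} capacity remaining")
    ```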

  9. Reducing surface water pollution through the assessment of the cost-effectiveness of BMPs at different spatial scales.

    PubMed

    Panagopoulos, Y; Makropoulos, C; Mimikou, M

    2011-10-01

    Two kinds of agricultural Best Management Practices (BMPs) were examined with respect to cost-effectiveness (CE) in reducing sediment, nitrates-nitrogen (NO(3)-N) and total phosphorus (TP) losses to surface waters of the Arachtos catchment in Western Greece. The establishment of filter strips at the edge of fields and a non-structural measure, namely fertilization reduction in alfalfa, combined with contour farming and zero-tillage in corn and reduction of animal numbers in pastureland, were evaluated. The Soil and Water Assessment Tool (SWAT) model was used as the non-point-source (NPS) estimator, while a simple economic component was developed estimating BMP implementation cost as the mean annual expenses needed to undertake and operate the practice for a 5-year period. After each BMP implementation, the ratio of their CE in reducing pollution was calculated for each Hydrologic Response Unit (HRU) separately, for each agricultural land use type entirely and for the whole catchment. The results at the HRU scale are presented comprehensively on a map, demonstrating the spatial differentiation of CE ratios across the catchment that enhances the identification of locations where each BMP is most advisable for implementation. Based on the analysis, a catchment management solution of affordable total cost would include the expensive measure of filter strips in corn and only in a small number of pastureland fields, in combination with the profitable measure of reducing fertilization to alfalfa fields. When examined for its impact on river loads at the outlet, the latter measure led to a 20 tn or 8% annual decrease of TP from the baseline with savings of 15€/kg of pollutant reduction. Filter strips in corn fields reduced annual sediments by 66 Ktn or 5%, NO(3)-N by 71 tn or 9.5% and TP by 27 tn or 10%, with an additional cost of 3.1 €/tn, 3.3 €/kg and 8.1 €/kg of each pollutant respectively. The study concludes that considerable reductions of several pollutant types at the same time can be achieved, even at low total cost, by combining targeted BMP implementation strategies only in small parts of the catchment, also enabling policy makers to take local socio-economic constraints into consideration. The methodology and the results presented aim to facilitate decision making for a cost-effective management of diffuse pollution by enabling modelers and researchers to make rapid and reliable BMP cost estimations and thus being able to calculate their CE at the local level in order to identify the most suitable areas for their implementation. Copyright © 2011 Elsevier Ltd. All rights reserved.
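
    The cost-effectiveness bookkeeping itself is simple: annualize the 5-year BMP cost and divide by the modelled load reduction. A small sketch follows, with hypothetical costs chosen only so the ratios land near the values reported above:

    ```python
    # Annualized cost per kg of total phosphorus (TP) removed, per BMP.
    bmps = {
        # name: (total 5-year cost in euro, annual TP reduction in kg)
        "filter_strips_corn": (1_100_000, 27_000),
        "reduced_fertilization_alfalfa": (-1_500_000, 20_000),  # negative cost = net savings
    }

    for name, (cost_5yr, tp_reduction_kg) in bmps.items():
        annual_cost = cost_5yr / 5.0
        ce = annual_cost / tp_reduction_kg          # euro per kg TP removed
        print(f"{name:32s} {ce:7.2f} EUR/kg TP")
    ```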

  10. Impact of implementation of NRHM program on NMR in Tamil Nadu (TN): a case study.

    PubMed

    Kumutha, J; Chitra, N; Vidyasagar, Dharmapuri

    2014-12-01

    The Government of India set up the National Rural Health Mission (NRHM) in 2005 in an effort to provide quality healthcare to underserved rural areas and to achieve the Millennium Development Goals (MDGs) by 2015. While the trends in child and maternal mortality show great progress by India since 1990, with a steady decline in the Maternal Mortality Ratio (MMR) and Infant Mortality Rate (IMR), a comparison of the predicted trend with the MDG targets shows that India would fall short by a few points. In contrast, Tamil Nadu has reached its MDGs and is ensuring sustained progress in reducing child and maternal mortality with effective implementation of the various schemes of NRHM. Tamil Nadu leads the way in ensuring universal health coverage, leveraging the expertise and funds of NRHM by providing round-the-clock services, introducing new and innovative programs to improve outcomes, and regularly monitoring functional operation and outcomes to ensure effective implementation. Adopting the features of the Tamil Nadu healthcare model that suit their particular state context and effectively implementing the initiatives of NRHM would help the other states considerably reduce child and maternal mortality and also ensure early achievement of the MDGs by the nation.

  11. Improving Efficiency While Improving Patient Care in a Student-Run Free Clinic.

    PubMed

    Lee, Jason S; Combs, Kristen; Pasarica, Magdalena

    2017-01-01

    Student-run free clinics (SRFCs) have the capacity to decrease health care inequity in underserved populations. These facilities can benefit from improved patient experience and outcomes. We implemented a series of quality improvement interventions with the objectives to decrease patient wait times and to increase the variety of services provided. A needs assessment was performed. Problems related to time management, communication between staff and providers, clinic resources, and methods for assessing clinic performance were identified as targets to reduce wait times and improve the variety of services provided. Seventeen interventions were designed and implemented over a 2-month period. The interventions resulted in improved efficiency for clinic operations and reduced patient wait times. The number of specialty providers, patient visits for specialty care, lifestyle education visits for disease prevention and treatment, free medications, and free laboratory investigations increased to achieve the goal of improving the availability and the variety of services provided. We demonstrated that it is feasible to implement successful quality improvement interventions in SRFCs to decrease patient wait times and to increase the variety of services provided. We believe that the changes we implemented can serve as a model for other SRFCs to improve their performance. © Copyright 2017 by the American Board of Family Medicine.

  12. Do Sector Wide Approaches for health aid delivery lead to 'donor-flight'? A comparison of 46 low-income countries.

    PubMed

    Sweeney, Rohan; Mortimer, Duncan; Johnston, David W

    2014-03-01

    Sector Wide Approaches (SWAp) emerged during the 1990s as a new policy mechanism for aid delivery. Eschewing many features of traditional project-based aid, SWAps give greater control of aid allocation to recipient countries. Some critics have questioned whether reducing a donor's level of influence over aid allocation might lead to a decrease in donor contributions. While some qualitative evaluations have described the level of fund pooling and donor participation in SWAps, no previous study has empirically examined this potential 'donor-flight' response to health SWAp implementation. This paper utilises a uniquely compiled dataset of 46 low-income countries over 1990-2009 and a variety of panel data regression models to estimate the impact of health SWAp implementation on levels of health aid. Results suggest that amongst 16 especially poor low-income countries, SWAp implementation is associated with significant decreases in health aid levels compared with non-implementers. This suggests donors are not indifferent to how their contributions are allocated by recipients, and that low-income countries considering a SWAp may need to weigh the benefits of greater control of aid allocations against the possibility of reduced aid income. Copyright © 2013 Elsevier Ltd. All rights reserved.

  13. Effects of school-wide positive behavioral interventions and supports and fidelity of implementation on problem behavior in high schools.

    PubMed

    Flannery, K B; Fenning, P; Kato, M McGrath; McIntosh, K

    2014-06-01

    High school is an important time in the educational career of students. It is also a time when adolescents face many behavioral, academic, and social-emotional challenges. Current statistics about the behavioral, academic, and social-emotional challenges faced by adolescents, and the impact on society through incarceration and dropout, have prompted high schools to direct their attention toward keeping students engaged and reducing high-risk behavioral challenges. The purpose of the study was to examine the effects of School-Wide Positive Behavioral Interventions and Supports (SW-PBIS) on the levels of individual student problem behaviors during a 3-year effectiveness trial without random assignment to condition. Participants were 36,653 students in 12 high schools. Eight schools implemented SW-PBIS, and four schools served as comparison schools. Results of a multilevel latent growth model showed statistically significant decreases in student office discipline referrals in SW-PBIS schools, with increases in comparison schools, when controlling for enrollment and percent of students receiving free or reduced price meals. In addition, as fidelity of implementation increased, office discipline referrals significantly decreased. Results are discussed in terms of effectiveness of a SW-PBIS approach in high schools and considerations to enhance fidelity of implementation. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  14. Evaluating watershed protection programs in New York City's Cannonsville Reservoir source watershed using SWAT-HS

    NASA Astrophysics Data System (ADS)

    Hoang, L.; Mukundan, R.; Moore, K. E.; Owens, E. M.; Steenhuis, T. S.

    2017-12-01

    New York City (NYC)'s reservoirs supply over one billion gallons of drinking water each day to over nine million consumers in NYC and upstate communities. The City has invested more than $1.5 billion in watershed protection programs to maintain a waiver from filtration for the Catskill and Delaware Systems. In the last 25 years, the NYC Department of Environmental Protection (NYCDEP) has implemented programs in cooperation with upstate communities that include nutrient management, crop rotations, improvement of barnyards and manure storage, implementing tertiary treatment for Phosphorus (P) in wastewater treatment plants, and replacing failed septic systems in an effort to reduce P loads to water supply reservoirs. There have been several modeling studies evaluating the effect of agricultural Best Management Practices (BMPs) on P control in the Cannonsville watershed in the Delaware System. Although these studies showed that BMPs would reduce dissolved P losses, they were limited to farm-scale or watershed-scale estimates of reduction factors without consideration of the dynamic nature of overland flow and P losses from variable source areas. Recently, we developed the process-based SWAT-Hillslope (SWAT-HS) model, a modified version of the Soil and Water Assessment Tool (SWAT) that can realistically predict variable source runoff processes. The objective of this study is to use the SWAT-HS model to evaluate watershed protection programs addressing both point and non-point sources of P. SWAT-HS predicts streamflow very well for the Cannonsville watershed with a daily Nash Sutcliffe Efficiency (NSE) of 0.85 at the watershed outlet and NSE values ranging from 0.56 - 0.82 at five other locations within the watershed. Based on good hydrological prediction, we applied the model to predict P loads using detailed P inputs that change over time due to the implementation of watershed protection programs. Results from P model predictions provide improved projections of P loads and form a basis for evaluating the cumulative and individual effects of watershed protection programs.
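
    The streamflow skill reported above is the Nash-Sutcliffe Efficiency (NSE). The sketch below is a generic computation of that statistic from paired observed and simulated series; the data shown are placeholders, not Cannonsville observations.

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2); 1.0 is a perfect fit."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Placeholder daily streamflow values (m^3/s), not Cannonsville data
observed = [12.0, 15.5, 30.2, 22.1, 18.7, 14.3]
simulated = [11.4, 16.0, 28.5, 23.0, 19.5, 13.8]
print(round(nash_sutcliffe(observed, simulated), 3))
```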

  15. Routine Microsecond Molecular Dynamics Simulations with AMBER on GPUs. 1. Generalized Born

    PubMed Central

    2012-01-01

    We present an implementation of generalized Born implicit solvent all-atom classical molecular dynamics (MD) within the AMBER program package that runs entirely on CUDA-enabled NVIDIA graphics processing units (GPUs). We discuss the algorithms that are used to exploit the processing power of the GPUs and show the performance that can be achieved in comparison to simulations on conventional CPU clusters. The implementation supports three different precision models in which the contributions to the forces are calculated in single precision floating point arithmetic but accumulated in double precision (SPDP), or everything is computed in single precision (SPSP) or double precision (DPDP). In addition to performance, we have focused on understanding the implications of the different precision models on the outcome of implicit solvent MD simulations. We show results for a range of tests including the accuracy of single point force evaluations and energy conservation as well as structural properties pertaining to protein dynamics. The numerical noise due to rounding errors within the SPSP precision model is sufficiently large to lead to an accumulation of errors which can result in unphysical trajectories for long time scale simulations. We recommend the use of the mixed-precision SPDP model since the numerical results obtained are comparable with those of the full double precision DPDP model and the reference double precision CPU implementation but at significantly reduced computational cost. Our implementation provides performance for GB simulations on a single desktop that is on par with, and in some cases exceeds, that of traditional supercomputers. PMID:22582031
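
    The SPDP idea described above, single-precision arithmetic for individual force contributions with double-precision accumulation, can be illustrated with a toy NumPy summation; this is a generic demonstration of the precision trade-off, not AMBER's GPU code, and a full DPDP run would also evaluate the force terms themselves in double precision.

```python
import numpy as np

# Toy "pairwise force contributions": computed in double precision (the DPDP reference),
# then rounded to single precision to mimic the SPSP/SPDP force kernels.
rng = np.random.default_rng(0)
terms64 = rng.standard_normal(1_000_000) * 1e-3 + 1e-4
terms32 = terms64.astype(np.float32)

dpdp = terms64.sum(dtype=np.float64)   # double-precision terms, double-precision accumulation
spdp = terms32.sum(dtype=np.float64)   # single-precision terms, double-precision accumulation
spsp = terms32.sum(dtype=np.float32)   # everything in single precision

# SPDP typically stays much closer to the DPDP reference than SPSP does.
print(f"DPDP={dpdp:.10f}  SPDP error={abs(spdp - dpdp):.2e}  SPSP error={abs(spsp - dpdp):.2e}")
```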

  16. A model to inform management actions as a response to chytridiomycosis-associated decline

    USGS Publications Warehouse

    Converse, Sarah J.; Bailey, Larissa L.; Mosher, Brittany A.; Funk, W. Chris; Gerber, Brian D.; Muths, Erin L.

    2017-01-01

    Decision-analytic models provide forecasts of how systems of interest will respond to management. These models can be parameterized using empirical data, but sometimes require information elicited from experts. When evaluating the effects of disease in species translocation programs, expert judgment is likely to play a role because complete empirical information will rarely be available. We illustrate development of a decision-analytic model built to inform decision-making regarding translocations and other management actions for the boreal toad (Anaxyrus boreas boreas), a species with declines linked to chytridiomycosis caused by Batrachochytrium dendrobatidis (Bd). Using the model, we explored the management implications of major uncertainties in this system, including whether there is a genetic basis for resistance to pathogenic infection by Bd, how translocation can best be implemented, and the effectiveness of efforts to reduce the spread of Bd. Our modeling exercise suggested that while selection for resistance to pathogenic infection by Bd could increase numbers of sites occupied by toads, and translocations could increase the rate of toad recovery, efforts to reduce the spread of Bd may have little effect. We emphasize the need to continue developing and parameterizing models necessary to assess management actions for combating chytridiomycosis-associated declines.

  17. [Level of implementation of the Program for Safety and Health at Work in Antioquia, Colombia].

    PubMed

    Vega-Monsalve, Ninfa Del Carmen

    2017-07-13

    This study describes the level of implementation of the Program for Safety and Health at Work in companies located in the Department of Antioquia, Colombia, and associated factors. A cross-sectional survey included 73 companies with more than 50 workers each and implementation of the program. A total of 65 interviews were held, in addition to 73 checklists and process reviews. The companies showed suboptimal compliance with the management model for workplace safety and health proposed by the International Labor Organization (ILO). The component with the best development was Organization (87%), and the worst was Policy (67%). Company executives contended that the causes of suboptimal implementation were the limited commitment by area directors and scarce budget resources. Risk management mostly aimed to comply with the legal requirements in order to avoid penalties, plus documenting cases. There was little implementation of effective checks and controls to reduce the sources of work accidents. The study concludes that workers' health management lacks effective strategies.

  18. Existential risks: exploring a robust risk reduction strategy.

    PubMed

    Jebari, Karim

    2015-06-01

    A small but growing number of studies have aimed to understand, assess and reduce existential risks, or risks that threaten the continued existence of mankind. However, most attention has been focused on known and tangible risks. This paper proposes a heuristic for reducing the risk of black swan extinction events. These events are, as the name suggests, stochastic and unforeseen when they happen. Decision theory based on a fixed model of possible outcomes cannot properly deal with this kind of event. Neither can probabilistic risk analysis. This paper will argue that the approach that is referred to as engineering safety could be applied to reducing the risk from black swan extinction events. It will also propose a conceptual sketch of how such a strategy may be implemented: isolated, self-sufficient, and continuously manned underground refuges. Some characteristics of such refuges are also described, in particular the psychosocial aspects. Furthermore, it is argued that this implementation of the engineering safety strategy, through safety barriers, would be effective and plausible and could reduce the risk of an extinction event in a wide range of possible (known and unknown) scenarios. Considering the staggering opportunity cost of an existential catastrophe, such strategies ought to be explored more vigorously.

  19. Interventions: Employees' Perceptions of What Reduces Stress

    PubMed Central

    Boyd, Carolyn M.; Provis, Chris

    2017-01-01

    Objective To build upon research evaluating stress interventions, this qualitative study tests the framework of the extended Job Demands-Resources model to investigate employees' perceptions of the stress-reduction measures implemented at 13 Australian universities. Methods In a cross-sectional survey design, tenured and contract staff indicated whether their overall level of stress had changed during the previous three-four years, and, if so, they described the major causes. A total of 462 staff reported that their level of stress had decreased; the study examines commentary from 115 academic and 304 nonacademic staff who provided details of what they perceived to be effective in reducing stress. Results Thematic analyses show that the key perceived causes were changes in job or work role, new heads of departments or supervisors, and the use of organizational strategies to reduce or manage stress. A higher percentage of academic staff reported reduced stress due to using protective coping strategies or their increased recognition and/or success, whereas a higher percentage of nonacademic staff reported reduced stress due to increases in staffing resources and/or systems. Conclusion These results identify the importance of implementing multilevel strategies to enhance employees' well-being. Nonacademic staff, in particular, specified a variety of organizational stress-reduction interventions. PMID:29318146

  20. The relationship between school-level characteristics and implementation fidelity of a coordinated school health childhood obesity prevention intervention.

    PubMed

    Lederer, Alyssa M; King, Mindy H; Sovinski, Danielle; Seo, Dong-Chul; Kim, Nayoung

    2015-01-01

    Curtailing childhood obesity is a public health imperative. Although multicomponent school-based programs reduce obesity among children, less is known about the implementation fidelity of these interventions. This study examines process evaluation findings for the Healthy, Energetic Ready, Outstanding, Enthusiastic, Schools (HEROES) Initiative, a tri-state school-based childhood obesity prevention intervention based on the coordinated school health (CSH) model. Site visits were conducted that included key stakeholder interviews, observation, and document review. Scores were given for 8 domains, and a total implementation score was calculated. Two-way analyses of variance were conducted to examine the relationship between 4 school-level characteristics (elementary vs. middle/high schools, public vs. private schools, district- vs. building-level implementation, and socioeconomic status) and each implementation area. Overall, schools had high fidelity scores, although some domains were implemented more successfully than others. Three school-level characteristics were associated with 1 or more domains, with elementary schools and schools implementing at the building level consistently having higher implementation scores than their counterparts. Process evaluation findings provide insight into successes and challenges schools implementing the CSH approach may encounter. Although preliminary, these findings on school-level characteristics establish a new area of research related to the implementation fidelity of school-based childhood obesity prevention programs. © 2014, American School Health Association.

  1. Management of groundwater in-situ bioremediation system using reactive transport modelling under parametric uncertainty: field scale application

    NASA Astrophysics Data System (ADS)

    Verardo, E.; Atteia, O.; Rouvreau, L.

    2015-12-01

    In-situ bioremediation is a commonly used remediation technology to clean up the subsurface of petroleum-contaminated sites. Forecasting remedial performance (in terms of flux and mass reduction) is a challenge due to uncertainties associated with source properties and with the contribution and efficiency of concentration-reducing mechanisms. In this study, predictive uncertainty analysis of bioremediation system efficiency is carried out with the null-space Monte Carlo (NSMC) method, which combines the calibration solution-space parameters with the ensemble of null-space parameters, creating sets of calibration-constrained parameters used as input to follow-on predictions of remedial efficiency. The first step in the NSMC methodology for uncertainty analysis is model calibration. The model calibration was conducted by matching simulated BTEX concentrations to a total of 48 observations from historical data before implementation of treatment. Two different bioremediation designs were then implemented in the calibrated model. The first consists of pumping/injection wells and the second of a permeable barrier coupled with infiltration across slotted piping. The NSMC method was used to calculate 1000 calibration-constrained parameter sets for the two different models. Several variants of the method were implemented to investigate their effect on the efficiency of the NSMC method. The first variant of the NSMC is based on a single calibrated model. In the second variant, models were calibrated from different initial parameter sets, and NSMC calibration-constrained parameter sets were sampled from these different calibrated models. We demonstrate that, in the context of a nonlinear model, the second variant avoids underestimating parameter uncertainty, which would otherwise lead to a poor quantification of predictive uncertainty. Application of the proposed approach to manage bioremediation of groundwater at a real site shows that it is effective in supporting management of in-situ bioremediation systems. Moreover, this study demonstrates that the NSMC method provides a computationally efficient and practical methodology for utilizing model predictive uncertainty methods in environmental management.
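
    The core NSMC step, perturbing the calibrated parameters only along directions the calibration data cannot constrain, can be sketched with a linearised sensitivity (Jacobian) matrix; this generic illustration is not the authors' implementation, and the matrix and parameter values are arbitrary.

```python
import numpy as np
from scipy.linalg import null_space

# Linearised sensitivities of 3 calibration observations to 5 model parameters (arbitrary values).
jacobian = np.array([
    [1.0, 0.5, 0.0, 0.2, 0.0],
    [0.0, 1.0, 0.3, 0.0, 0.1],
    [0.2, 0.0, 1.0, 0.0, 0.4],
])
theta_cal = np.array([2.0, 1.5, 0.8, 3.0, 0.5])   # calibrated parameter set

ns = null_space(jacobian)           # columns span directions with (to first order) no effect on fits
rng = np.random.default_rng(1)

# 1000 calibration-constrained parameter sets: calibrated values plus random null-space combinations.
samples = theta_cal + rng.standard_normal((1000, ns.shape[1])) @ ns.T

# Check: projecting the perturbations through the Jacobian gives ~zero change in simulated observations.
print(np.abs(jacobian @ (samples - theta_cal).T).max())
```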

  2. Methods of Transposition of Nurses between Wards

    NASA Astrophysics Data System (ADS)

    Miyazaki, Shigeji; Masuda, Masakazu

    In this paper, a computer-implemented method for automating the transposition of a hospital’s nursing staff is proposed. The model is applied to the real case example ‘O’ hospital, which performs a transposition of its nursing staff once a year. Results are compared with real data obtained from this hospital’s current manual transposition system. The proposed method not only significantly reduces the time taken to construct the transposition, thereby significantly reducing management labor costs, but also is demonstrated to increase nurses’ levels of satisfaction with the process.

  3. Modeling nitrate-nitrogen load reduction strategies for the des moines river, iowa using SWAT

    USGS Publications Warehouse

    Schilling, K.E.; Wolter, C.F.

    2009-01-01

    The Des Moines River, which drains a watershed of 16,175 km2 in portions of Iowa and Minnesota, is impaired for nitrate-nitrogen (nitrate) due to concentrations that exceed regulatory limits for public water supplies. The Soil and Water Assessment Tool (SWAT) model was used to model streamflow and nitrate loads and evaluate a suite of basin-wide changes and targeting configurations to potentially reduce nitrate loads in the river. The SWAT model comprised 173 subbasins and 2,516 hydrologic response units and included point and nonpoint nitrogen sources. The model was calibrated for an 11-year period, and three basin-wide and four targeting strategies were evaluated. Results indicated that nonpoint sources accounted for 95% of the total nitrate export. Reducing fertilizer applications from 170 to 50 kg/ha achieved a 38% reduction in nitrate loads, exceeding the 34% reduction required. In terms of targeting, the most efficient load reductions occurred when fertilizer applications were reduced in subbasins nearest the watershed outlet. The greatest load reduction per area of land treated was associated with targeting the 55 subbasins with the highest nitrate loads, where a 14% reduction in nitrate loads was achieved by reducing applications on 30% of the land area. SWAT model results provide much needed guidance on how to begin implementing load reduction strategies most efficiently in the Des Moines River watershed. © 2009 Springer Science+Business Media, LLC.
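
    A simple way to reason about the targeting results above is to rank subbasins by nitrate export and ask how much of the total load is covered by treating a given fraction of the land. The sketch below does this for made-up subbasin data, not SWAT output for the Des Moines River watershed.

```python
# Rank hypothetical subbasins by nitrate load and report cumulative load vs. treated area.
# Values are illustrative only.
subbasins = [  # (subbasin id, area in km^2, annual nitrate load in tonnes)
    (1, 120.0, 900.0), (2, 95.0, 340.0), (3, 210.0, 1500.0),
    (4, 60.0, 80.0),  (5, 150.0, 620.0), (6, 300.0, 450.0),
]

total_area = sum(a for _, a, _ in subbasins)
total_load = sum(l for _, _, l in subbasins)

cum_area = cum_load = 0.0
for sid, area, load in sorted(subbasins, key=lambda s: s[2], reverse=True):
    cum_area += area
    cum_load += load
    print(f"treat subbasin {sid}: {100*cum_area/total_area:5.1f}% of area "
          f"covers {100*cum_load/total_load:5.1f}% of nitrate load")
```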

  4. Impacts of potential CO2-reduction policies on air quality in the United States.

    PubMed

    Trail, Marcus A; Tsimpidi, Alexandra P; Liu, Peng; Tsigaridis, Kostas; Hu, Yongtao; Rudokas, Jason R; Miller, Paul J; Nenes, Athanasios; Russell, Armistead G

    2015-04-21

    Impacts of emissions changes from four potential U.S. CO2 emission reduction policies on 2050 air quality are analyzed using the community multiscale air quality model (CMAQ). Future meteorology was downscaled from the Goddard Institute for Space Studies (GISS) ModelE General Circulation Model (GCM) to the regional scale using the Weather Research Forecasting (WRF) model. We use emissions growth factors from the EPAUS9r MARKAL model to project emissions inventories for two climate tax scenarios, a combined transportation and energy scenario, a biomass energy scenario and a reference case. Implementation of a relatively aggressive carbon tax leads to improved PM2.5 air quality compared to the reference case as incentives increase for facilities to install flue-gas desulfurization (FGD) and carbon capture and sequestration (CCS) technologies. However, less capital is available to install NOX reduction technologies, resulting in an O3 increase. A policy aimed at reducing CO2 from the transportation sector and electricity production sectors leads to reduced emissions of mobile source NOX, thus reducing O3. Over most of the U.S., this scenario leads to reduced PM2.5 concentrations. However, increased primary PM2.5 emissions associated with fuel switching in the residential and industrial sectors leads to increased organic matter (OM) and PM2.5 in some cities.

  5. Implementation of a parallel protein structure alignment service on cloud.

    PubMed

    Hung, Che-Lun; Lin, Yaw-Ling

    2013-01-01

    Protein structure alignment has become an important strategy by which to identify evolutionary relationships between protein sequences. Several alignment tools are currently available for online comparison of protein structures. In this paper, we propose a parallel protein structure alignment service based on the Hadoop distribution framework. This service includes a protein structure alignment algorithm, a refinement algorithm, and a MapReduce programming model. The refinement algorithm refines the result of alignment. To process vast numbers of protein structures in parallel, the alignment and refinement algorithms are implemented using MapReduce. We analyzed and compared the structure alignments produced by different methods using a dataset randomly selected from the PDB database. The experimental results verify that the proposed algorithm refines the resulting alignments more accurately than existing algorithms. Meanwhile, the computational performance of the proposed service is proportional to the number of processors used in our cloud platform.
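
    To make the MapReduce structure concrete, here is a plain-Python sketch (no Hadoop) in which the map step scores one query-target structure pair and the reduce step keeps the best-scoring target per query; it mirrors only the programming model, and the placeholder score is not the authors' alignment or refinement algorithm.

```python
from collections import defaultdict

def map_phase(pair):
    """Emit (query_id, (target_id, score)) for one structure pair; the score is a stand-in."""
    query_id, target_id, query_len, target_len = pair
    score = 1.0 / (1.0 + abs(query_len - target_len))   # placeholder similarity, not TM-score/RMSD
    return query_id, (target_id, score)

def reduce_phase(query_id, scored_targets):
    """Keep the best-scoring target per query (a stand-in for alignment refinement)."""
    best_target, best_score = max(scored_targets, key=lambda ts: ts[1])
    return query_id, best_target, best_score

pairs = [("q1", "t1", 120, 118), ("q1", "t2", 120, 90), ("q2", "t1", 75, 118)]

grouped = defaultdict(list)                  # shuffle step: group map outputs by key
for key, value in map(map_phase, pairs):
    grouped[key].append(value)

for query_id, values in grouped.items():
    print(reduce_phase(query_id, values))
```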

  6. Life cycle implications of urban green infrastructure.

    PubMed

    Spatari, Sabrina; Yu, Ziwen; Montalto, Franco A

    2011-01-01

    Low Impact Development (LID) is part of a new paradigm in urban water management that aims to decentralize water storage and movement functions within urban watersheds. LID strategies can restore ecosystem functions and reduce runoff loadings to municipal water pollution control facilities (WPCF). This research examines the avoided energy and greenhouse gas (GHG) emissions of select LID strategies using life cycle assessment (LCA) and a stochastic urban watershed model. We estimate annual energy savings and avoided GHG emissions of 7.3 GJ and 0.4 metric tons, respectively, for a LID strategy implemented in a neighborhood in New York City. Annual savings are small compared to the energy and GHG intensity of the LID materials, resulting in slow environmental payback times. This preliminary analysis suggests that if implemented throughout an urban watershed, LID strategies may have important energy cost savings to WPCF, and can make progress towards reducing their carbon footprint. Copyright © 2011 Elsevier Ltd. All rights reserved.

  7. Implementation of a Parallel Protein Structure Alignment Service on Cloud

    PubMed Central

    Hung, Che-Lun; Lin, Yaw-Ling

    2013-01-01

    Protein structure alignment has become an important strategy by which to identify evolutionary relationships between protein sequences. Several alignment tools are currently available for online comparison of protein structures. In this paper, we propose a parallel protein structure alignment service based on the Hadoop distribution framework. This service includes a protein structure alignment algorithm, a refinement algorithm, and a MapReduce programming model. The refinement algorithm refines the result of alignment. To process vast numbers of protein structures in parallel, the alignment and refinement algorithms are implemented using MapReduce. We analyzed and compared the structure alignments produced by different methods using a dataset randomly selected from the PDB database. The experimental results verify that the proposed algorithm refines the resulting alignments more accurately than existing algorithms. Meanwhile, the computational performance of the proposed service is proportional to the number of processors used in our cloud platform. PMID:23671842

  8. Toward Implementing Patient Flow in a Cancer Treatment Center to Reduce Patient Waiting Time and Improve Efficiency.

    PubMed

    Suss, Samuel; Bhuiyan, Nadia; Demirli, Kudret; Batist, Gerald

    2017-06-01

    Outpatient cancer treatment centers can be considered as complex systems in which several types of medical professionals and administrative staff must coordinate their work to achieve the overall goals of providing quality patient care within budgetary constraints. In this article, we use analytical methods that have been successfully employed for other complex systems to show how a clinic can simultaneously reduce patient waiting times and non-value added staff work in a process that has a series of steps, more than one of which involves a scarce resource. The article describes the system model and the key elements in the operation that lead to staff rework and patient queuing. We propose solutions to the problems and provide a framework to evaluate clinic performance. At the time of this report, the proposals are in the process of implementation at a cancer treatment clinic in a major metropolitan hospital in Montreal, Canada.
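
    The waiting-time behaviour analysed in this kind of clinic can be explored with standard queueing formulas. The sketch below applies the M/M/c (Erlang-C) expected-wait formula to hypothetical clinic numbers; it is a generic illustration, not the clinic's actual model.

```python
import math

def mmc_expected_wait(arrival_rate, service_rate, servers):
    """Expected wait in queue (same time units as the rates) for an M/M/c queue."""
    rho = arrival_rate / (servers * service_rate)
    if rho >= 1.0:
        raise ValueError("system is unstable: utilisation must be below 1")
    a = arrival_rate / service_rate
    p0 = 1.0 / (sum(a**k / math.factorial(k) for k in range(servers))
                + a**servers / (math.factorial(servers) * (1.0 - rho)))
    erlang_c = (a**servers / (math.factorial(servers) * (1.0 - rho))) * p0
    return erlang_c / (servers * service_rate - arrival_rate)

# Hypothetical: 6 patients/hour arriving, each chair/nurse serves 1.5 patients/hour, 5 chairs.
print(f"expected wait ~ {60 * mmc_expected_wait(6.0, 1.5, 5):.1f} minutes")
```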

  9. Examining the prospects of implementing passive house standards in providing sustainable schools

    NASA Astrophysics Data System (ADS)

    Suhaili, Wan Farhani; Shahrill, Masitah

    2018-04-01

    This study examines the potential of implementing the passive house standards to reduce energy consumption in school buildings in Brunei. Furthermore, it investigates whether sustainable school buildings make business sense to the government. To do this, conventional and Passive House primary school buildings are compared in terms of their performance using the Passive House Planning Package as well as the Ecotect environmental analysis tool. The findings indicated that replacing the building fabrics with lower U-value alternatives brought a significant reduction in the cooling demand of 54%. The Ecotect models demonstrated that the heating and cooling loads were reduced by as much as 75% by reorienting the building to a south elevation and replacing the building fabrics with lower U-values. These findings were then evaluated with a cost-benefit analysis, which showed annual energy cost savings from air-conditioning use in a typical primary school, with a payback period of eight years.
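
    The cost-benefit step reported above reduces to a simple payback calculation: extra capital cost divided by annual energy cost savings. The sketch below shows the arithmetic with placeholder figures chosen to give an eight-year payback; they are not values from the study.

```python
def simple_payback_years(extra_capital_cost, annual_energy_savings_kwh, tariff_per_kwh):
    """Years needed for avoided air-conditioning cost to repay the extra construction cost."""
    annual_savings = annual_energy_savings_kwh * tariff_per_kwh
    return extra_capital_cost / annual_savings

# Placeholder figures for a primary school retrofit (not values from the Brunei study).
print(round(simple_payback_years(80_000.0, 100_000.0, 0.10), 1), "years")
```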

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hellwinckel, Chad; de la Torre Ugarte, Daniel; Perlack, Robert D

    An integrated socioeconomic-biogeophysical model is used to analyze the interactions of cap-and-trade legislation and the Renewable Fuels Standard. Five alternative policy scenarios were considered with the purpose of identifying policies that act in a synergistic manner to reduce carbon emissions, increase economic returns to agriculture, and adequately meet ethanol mandates. We conclude that climate and energy policies can best be implemented together by offering carbon offset payments for conservation tillage and for herbaceous grasses grown for biomass, and by constraining crop residue removal for ethanol feedstocks to a carbon-neutral level. When comparing this scenario to the Baseline scenario, the agricultural sector realizes an economic benefit of US$156 billion by 2030 and emissions are reduced by 135 Tg C-equivalent per year. Results also indicate that the geographic location of cellulosic feedstocks could shift significantly depending on the final policies implemented in cap-and-trade legislation. Placement of cellulosic ethanol facilities should consider these possible shifts when determining site location.

  11. Stakeholder engagement in quattro helix model for mobile phone reverse logistics in Indonesia: a conceptual framework

    NASA Astrophysics Data System (ADS)

    Maheswari, H.; Yudoko, G.; Adhiutama, A.

    2017-12-01

    E-waste from the mobile phone industry still dominates the waste stream. This is partly because there is no mutual commitment from all parties, i.e. businesses, government, and society, to reduce the use of mobile phones, which have the shortest product life cycle. Many studies examine firms' motivations and the government's role, while others discuss the actions of communities in supporting reverse logistics implementation. However, research on engagement mechanisms involving all parties is still rare. Therefore, it is important to explore an engagement model through this conceptual paper, which is expected to be useful in building a novel model. Through a literature review, this research establishes the Quattro helix model as an appropriate structure for building a robust team by exploring stakeholder theories; maps the engagement model, in the form of either collaboration or participation, considering stakeholders' roles and motivations, and identifies six types of engagement based on their interests; and determines a novel engagement model through the Quattro helix model for implementing reverse logistics in handling e-waste by describing the linkages and gaps among existing models.

  12. A unified structural/terminological interoperability framework based on LexEVS: application to TRANSFoRm.

    PubMed

    Ethier, Jean-François; Dameron, Olivier; Curcin, Vasa; McGilchrist, Mark M; Verheij, Robert A; Arvanitis, Theodoros N; Taweel, Adel; Delaney, Brendan C; Burgun, Anita

    2013-01-01

    Biomedical research increasingly relies on the integration of information from multiple heterogeneous data sources. Despite the fact that structural and terminological aspects of interoperability are interdependent and rely on a common set of requirements, current efforts typically address them in isolation. We propose a unified ontology-based knowledge framework to facilitate interoperability between heterogeneous sources, and investigate if using the LexEVS terminology server is a viable implementation method. We developed a framework based on an ontology, the general information model (GIM), to unify structural models and terminologies, together with relevant mapping sets. This allowed a uniform access to these resources within LexEVS to facilitate interoperability by various components and data sources from implementing architectures. Our unified framework has been tested in the context of the EU Framework Program 7 TRANSFoRm project, where it was used to achieve data integration in a retrospective diabetes cohort study. The GIM was successfully instantiated in TRANSFoRm as the clinical data integration model, and necessary mappings were created to support effective information retrieval for software tools in the project. We present a novel, unifying approach to address interoperability challenges in heterogeneous data sources, by representing structural and semantic models in one framework. Systems using this architecture can rely solely on the GIM that abstracts over both the structure and coding. Information models, terminologies and mappings are all stored in LexEVS and can be accessed in a uniform manner (implementing the HL7 CTS2 service functional model). The system is flexible and should reduce the effort needed from data sources personnel for implementing and managing the integration.

  13. Sequential data assimilation for a distributed hydrologic model considering different time scale of internal processes

    NASA Astrophysics Data System (ADS)

    Noh, S.; Tachikawa, Y.; Shiiba, M.; Kim, S.

    2011-12-01

    Applications of sequential data assimilation methods have been increasing in hydrology to reduce uncertainty in model prediction. In a distributed hydrologic model, there are many types of state variables, and each variable interacts with the others on different time scales. However, frameworks to deal with the delayed response that originates from the different time scales of hydrologic processes have not been thoroughly addressed in hydrologic data assimilation. In this study, we propose a lagged filtering scheme to account for the lagged response of internal states in a distributed hydrologic model using two filtering schemes: particle filtering (PF) and ensemble Kalman filtering (EnKF). The EnKF is a widely used sub-optimal filter that enables efficient computation with a limited number of ensemble members but is still based on a Gaussian approximation. PF can be an alternative in which the propagation of all uncertainties is carried out by a suitable selection of randomly generated particles without any assumptions about the nature of the distributions involved. In the case of PF, an advanced particle regularization scheme is also implemented to preserve the diversity of the particle system. In the case of EnKF, the ensemble square root filter (EnSRF) is implemented. Each filtering method is parallelized and implemented on a high-performance computing system. A distributed hydrologic model, the water and energy transfer processes (WEP) model, is applied to the Katsura River catchment, Japan, to demonstrate the applicability of the proposed approaches. Forecast results via PF and EnKF are compared and analyzed in terms of prediction accuracy and probabilistic adequacy. Discussions focus on the prospects and limitations of each data assimilation method.
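
    As a minimal illustration of the EnKF analysis step discussed above (not the WEP or EnSRF implementation used in the study), the sketch below updates an ensemble of state vectors with a perturbed-observation Kalman update for a linear observation operator.

```python
import numpy as np

def enkf_update(ensemble, obs, obs_operator, obs_error_std, rng):
    """Perturbed-observation EnKF analysis step.

    ensemble: (n_members, n_states) forecast ensemble
    obs: (n_obs,) observation vector
    obs_operator: (n_obs, n_states) linear observation operator H
    """
    n_members = ensemble.shape[0]
    hx = ensemble @ obs_operator.T                      # ensemble mapped to observation space
    x_anom = ensemble - ensemble.mean(axis=0)           # state anomalies
    hx_anom = hx - hx.mean(axis=0)                      # observation-space anomalies

    pht = x_anom.T @ hx_anom / (n_members - 1)          # cross-covariance P H^T
    hpht = hx_anom.T @ hx_anom / (n_members - 1)        # H P H^T
    r = np.eye(len(obs)) * obs_error_std**2
    gain = pht @ np.linalg.inv(hpht + r)                # Kalman gain

    perturbed_obs = obs + rng.normal(0.0, obs_error_std, size=(n_members, len(obs)))
    return ensemble + (perturbed_obs - hx) @ gain.T     # analysis ensemble

rng = np.random.default_rng(0)
forecast = rng.normal([10.0, 5.0], [2.0, 1.0], size=(50, 2))   # 50 members, 2 toy states
h = np.array([[1.0, 0.0]])                                      # observe only the first state
analysis = enkf_update(forecast, np.array([12.0]), h, obs_error_std=0.5, rng=rng)
print(forecast.mean(axis=0), "->", analysis.mean(axis=0))
```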

  14. A unified structural/terminological interoperability framework based on LexEVS: application to TRANSFoRm

    PubMed Central

    Ethier, Jean-François; Dameron, Olivier; Curcin, Vasa; McGilchrist, Mark M; Verheij, Robert A; Arvanitis, Theodoros N; Taweel, Adel; Delaney, Brendan C; Burgun, Anita

    2013-01-01

    Objective Biomedical research increasingly relies on the integration of information from multiple heterogeneous data sources. Despite the fact that structural and terminological aspects of interoperability are interdependent and rely on a common set of requirements, current efforts typically address them in isolation. We propose a unified ontology-based knowledge framework to facilitate interoperability between heterogeneous sources, and investigate if using the LexEVS terminology server is a viable implementation method. Materials and methods We developed a framework based on an ontology, the general information model (GIM), to unify structural models and terminologies, together with relevant mapping sets. This allowed a uniform access to these resources within LexEVS to facilitate interoperability by various components and data sources from implementing architectures. Results Our unified framework has been tested in the context of the EU Framework Program 7 TRANSFoRm project, where it was used to achieve data integration in a retrospective diabetes cohort study. The GIM was successfully instantiated in TRANSFoRm as the clinical data integration model, and necessary mappings were created to support effective information retrieval for software tools in the project. Conclusions We present a novel, unifying approach to address interoperability challenges in heterogeneous data sources, by representing structural and semantic models in one framework. Systems using this architecture can rely solely on the GIM that abstracts over both the structure and coding. Information models, terminologies and mappings are all stored in LexEVS and can be accessed in a uniform manner (implementing the HL7 CTS2 service functional model). The system is flexible and should reduce the effort needed from data sources personnel for implementing and managing the integration. PMID:23571850

  15. Developing, delivering and evaluating primary mental health care: the co-production of a new complex intervention.

    PubMed

    Reeve, Joanne; Cooper, Lucy; Harrington, Sean; Rosbottom, Peter; Watkins, Jane

    2016-09-06

    Health services face the challenges created by complex problems, and so need complex intervention solutions. However they also experience ongoing difficulties in translating findings from research in this area in to quality improvement changes on the ground. BounceBack was a service development innovation project which sought to examine this issue through the implementation and evaluation in a primary care setting of a novel complex intervention. The project was a collaboration between a local mental health charity, an academic unit, and GP practices. The aim was to translate the charity's model of care into practice-based evidence describing delivery and impact. Normalisation Process Theory (NPT) was used to support the implementation of the new model of primary mental health care into six GP practices. An integrated process evaluation evaluated the process and impact of care. Implementation quickly stalled as we identified problems with the described model of care when applied in a changing and variable primary care context. The team therefore switched to using the NPT framework to support the systematic identification and modification of the components of the complex intervention: including the core components that made it distinct (the consultation approach) and the variable components (organisational issues) that made it work in practice. The extra work significantly reduced the time available for outcome evaluation. However findings demonstrated moderately successful implementation of the model and a suggestion of hypothesised changes in outcomes. The BounceBack project demonstrates the development of a complex intervention from practice. It highlights the use of Normalisation Process Theory to support development, and not just implementation, of a complex intervention; and describes the use of the research process in the generation of practice-based evidence. Implications for future translational complex intervention research supporting practice change through scholarship are discussed.

  16. Satellite power system (SPS) financial/management scenarios

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1978-10-01

    The problems of financing and managing a large-scale, lengthy SPS program reduce to the key questions of ownership and control. Ownership (that is, the sources of capital) may be governmental, corporate, or individual; control may be exercised by a government agency, a government-sanctioned monopoly, or a competitive corporation. Since the R and D phase and the commercial implementation phase of an SPS program are qualitatively very different with respect to length of time before return-on-investment, we have considered two general categories of SPS organizations: (1) organizations capable of carrying out a complete SPS program, from R and D through commercialization; (2) organizations capable of carrying out commercial implementation only. Six organizational models for carrying out the complete SPS program have been examined in some detail: 1) existing government agencies (DOE, NASA, etc.); 2) a new government agency, patterned after TVA; 3) a taxpayer stock corporation, a new concept; 4) a trust fund supported by energy taxes, patterned after the financing of the Interstate Highway System; 5) a federal agency financed by bonds, patterned after the Federal National Mortgage Association; and 6) the staging company, a new concept, already in the early stages of implementation as a private venture. Four additional organizational forms have been considered for commercial implementation of SPS: 7) a government-chartered monopoly, patterned after the Communications Satellite Corporation; 8) the consortium model, already widely used for large-scale projects; 9) the corporate socialism model, patterned after such developments as the transcontinental railroad; and 10) the universal capitalism model, a concept partially implemented in the 1976 legislation creating Employee Stock Ownership Plans. A number of qualitative criteria for comparative assessment of these alternatives have been developed.

  17. A failed model-based attempt to implement an evidence-based nursing guideline for fall prevention.

    PubMed

    Semin-Goossens, Astrid; van der Helm, Jelle M J; Bossuyt, Patrick M M

    2003-01-01

    An evidence-based nursing guideline had been locally developed in 1993 to reduce fall incidence rates, achieving a 30% reduction. Its implementation had failed, though. Between 1999 and 2001 the guideline was updated. A multifaceted intervention was chosen based on a model for implementing change. The study was performed in 2 wards. All recommendations of Grol's 5-step implementation model were followed. The aim was a reduction of 30% in fall incidence within a year. Data on falls were extracted from nursing records and Incidence Report Forms (IRFs). In a pilot study an average of 9 falls per 1000 patients per day had been recorded in the department of internal medicine and 16 in the neurology ward. Given the desired reduction of 30%, the target averages were 6 and 11 falls respectively. During the intervention year the average incidences were 8 and 13 falls (95% CI: 6-11 and 10-15). There was a changeable pattern over time without any declining trend. The percentage of filled-in IRFs varied strongly, with an average of 52% in the department of internal medicine and 60% in the neurology department. There was no durable decrease in monthly falls despite the use of a model-based procedure for implementing change. Neither did we observe any improvement in filling in IRFs. It is questionable whether the nurses themselves experienced patient falls as troublesome enough, although investigating this is difficult. Although the most successful strategy still appears to be changing the attitudes of nurses in order to increase fall prevention, there is no clear strategy for how to achieve this successfully.

  18. Uncertainties on exclusive diffractive Higgs boson and jet production at the LHC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dechambre, A.; CEA/IRFU/Service de physique des particules, CEA/Saclay; Kepka, O.

    2011-03-01

    Two theoretical descriptions of exclusive diffractive jets and Higgs production at the LHC were implemented into the FPMC generator: the Khoze, Martin, Ryskin model and the Cudell, Hernandez, Ivanov, Dechambre exclusive model. We then study the uncertainties. We compare their predictions to the CDF measurement and discuss the possibility of constraining the exclusive Higgs production at the LHC with early measurements of exclusive jets. We show that the present theoretical uncertainties can be reduced with such data by a factor of 5.

  19. Post-hoc simulation study to adopt a computerized adaptive testing (CAT) for a Korean Medical License Examination.

    PubMed

    Seo, Dong Gi; Choi, Jeongwook

    2018-05-17

    Computerized adaptive testing (CAT) has been adopted in licensing examinations due to its test efficiency and accuracy. Much research on CAT has been published demonstrating the efficiency and accuracy of measurement. This simulation study investigated scoring methods and item selection methods to implement CAT in the Korean Medical License Examination (KMLE). The study used a post-hoc (real data) simulation design. The item bank used in this study was constructed from all items in the 2017 KMLE. All CAT algorithms for this study were implemented with the 'catR' package in the R program. In terms of accuracy, the Rasch and 2-parameter logistic (2PL) models performed better than the 3PL model. Modal a posteriori (MAP) or expected a posteriori (EAP) estimation provided more accurate estimates than MLE and WLE. Furthermore, maximum posterior weighted information (MPWI) and minimum expected posterior variance (MEPV) performed better than other item selection methods. In terms of efficiency, the Rasch model was recommended to reduce test length. A simulation study should be performed under varied test conditions before adopting a live CAT. Based on this simulation study, specific scoring and item selection methods should be predetermined before implementing a live CAT.
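
    For readers unfamiliar with the mechanics being simulated, the sketch below runs a toy fixed-length adaptive test under a 2PL model with maximum-information item selection and EAP scoring. It is written in plain Python/NumPy as a generic illustration and does not reproduce the 'catR'-based KMLE simulation; the item bank and test length are made up.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2PL item bank: discrimination a, difficulty b (not KMLE items).
n_items = 200
a = rng.uniform(0.5, 2.0, n_items)
b = rng.normal(0.0, 1.0, n_items)

def p_correct(theta, a, b):
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def item_information(theta, a, b):
    p = p_correct(theta, a, b)
    return a**2 * p * (1.0 - p)

def eap_estimate(administered, responses, grid=np.linspace(-4, 4, 81)):
    """Expected a posteriori ability estimate with a standard normal prior."""
    prior = np.exp(-0.5 * grid**2)
    like = np.ones_like(grid)
    for idx, resp in zip(administered, responses):
        p = p_correct(grid, a[idx], b[idx])
        like *= p if resp else (1.0 - p)
    post = prior * like
    return float(np.sum(grid * post) / np.sum(post))

true_theta = 0.7
theta_hat, administered, responses = 0.0, [], []

for _ in range(20):                                   # fixed-length 20-item toy CAT
    info = item_information(theta_hat, a, b)
    info[administered] = -np.inf                      # do not reuse items
    item = int(np.argmax(info))                       # maximum Fisher information selection
    resp = rng.random() < p_correct(true_theta, a[item], b[item])
    administered.append(item)
    responses.append(resp)
    theta_hat = eap_estimate(administered, responses)

print(f"true theta = {true_theta}, estimate = {theta_hat:.2f}")
```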

  20. Efficient implementation of a real-time estimation system for thalamocortical hidden Parkinsonian properties

    NASA Astrophysics Data System (ADS)

    Yang, Shuangming; Deng, Bin; Wang, Jiang; Li, Huiyan; Liu, Chen; Fietkiewicz, Chris; Loparo, Kenneth A.

    2017-01-01

    Real-time estimation of dynamical characteristics of thalamocortical cells, such as the dynamics of ion channels and membrane potentials, is useful and essential in the study of the thalamus in the Parkinsonian state. However, measuring the dynamical properties of ion channels is extremely challenging experimentally and even impossible in clinical applications. This paper presents and evaluates a real-time estimation system for thalamocortical hidden properties. For the sake of efficiency, we use a field programmable gate array (FPGA) for strictly hardware-based computation and algorithm optimization. In the proposed system, an FPGA-based unscented Kalman filter is implemented for a conductance-based TC neuron model. Since the complexity of the TC neuron model restrains its hardware implementation in a parallel structure, a cost-efficient model is proposed to reduce the resource cost while retaining the relevant ionic dynamics. Experimental results demonstrate the real-time capability to estimate thalamocortical hidden properties with high precision under both normal and Parkinsonian states. While the method is applied here to estimate the hidden properties of the thalamus and to explore the mechanism of the Parkinsonian state, it can also be useful for the dynamic clamp technique in electrophysiological experiments, neural control engineering, and brain-machine interface studies.

  1. Implementing a Complex Intervention to Support Personal Recovery: A Qualitative Study Nested within a Cluster Randomised Controlled Trial

    PubMed Central

    Leamy, Mary; Clarke, Eleanor; Le Boutillier, Clair; Bird, Victoria; Janosik, Monika; Sabas, Kai; Riley, Genevieve; Williams, Julie; Slade, Mike

    2014-01-01

    Objective To investigate staff and trainer perspectives on the barriers and facilitators to implementing a complex intervention to help staff support the recovery of service users with a primary diagnosis of psychosis in community mental health teams. Design Process evaluation nested within a cluster randomised controlled trial (RCT). Participants 28 interviews with mental health care staff, 3 interviews with trainers, 4 focus groups with intervention teams and 28 written trainer reports. Setting 14 community-based mental health teams in two UK sites (one urban, one semi-rural) who received the intervention. Results The factors influencing the implementation of the intervention can be organised under two over-arching themes: Organisational readiness for change and Training effectiveness. Organisational readiness for change comprised three sub-themes: NHS Trust readiness; Team readiness; and Practitioner readiness. Training effectiveness comprised three sub-themes: Engagement strategies; Delivery style and Modelling recovery principles. Conclusions Three findings can inform future implementation and evaluation of complex interventions. First, the underlying intervention model predicted that three areas would be important for changing practice: staff skill development; intention to implement; and actual implementation behaviour. This study highlighted the importance of targeting the transition from practitioners' intent to implement to actual implementation behaviour, using experiential learning and target setting. Second, practitioners make inferences about organisational commitment by observing the allocation of resources, Knowledge Performance Indicators and service evaluation outcome measures. These need to be aligned with recovery values, principles and practice. Finally, we recommend the use of organisational readiness tools as an inclusion criteria for selecting both organisations and teams in cluster RCTs. We believe this would maximise the likelihood of adequate implementation and hence reduce waste in research expenditure. Trial Registration Controlled-Trials.com ISRCTN02507940 PMID:24875748

  2. Implementing a complex intervention to support personal recovery: a qualitative study nested within a cluster randomised controlled trial.

    PubMed

    Leamy, Mary; Clarke, Eleanor; Le Boutillier, Clair; Bird, Victoria; Janosik, Monika; Sabas, Kai; Riley, Genevieve; Williams, Julie; Slade, Mike

    2014-01-01

    To investigate staff and trainer perspectives on the barriers and facilitators to implementing a complex intervention to help staff support the recovery of service users with a primary diagnosis of psychosis in community mental health teams. Process evaluation nested within a cluster randomised controlled trial (RCT). 28 interviews with mental health care staff, 3 interviews with trainers, 4 focus groups with intervention teams and 28 written trainer reports. 14 community-based mental health teams in two UK sites (one urban, one semi-rural) who received the intervention. The factors influencing the implementation of the intervention can be organised under two over-arching themes: Organisational readiness for change and Training effectiveness. Organisational readiness for change comprised three sub-themes: NHS Trust readiness; Team readiness; and Practitioner readiness. Training effectiveness comprised three sub-themes: Engagement strategies; Delivery style and Modelling recovery principles. Three findings can inform future implementation and evaluation of complex interventions. First, the underlying intervention model predicted that three areas would be important for changing practice: staff skill development; intention to implement; and actual implementation behaviour. This study highlighted the importance of targeting the transition from practitioners' intent to implement to actual implementation behaviour, using experiential learning and target setting. Second, practitioners make inferences about organisational commitment by observing the allocation of resources, Knowledge Performance Indicators and service evaluation outcome measures. These need to be aligned with recovery values, principles and practice. Finally, we recommend the use of organisational readiness tools as an inclusion criteria for selecting both organisations and teams in cluster RCTs. We believe this would maximise the likelihood of adequate implementation and hence reduce waste in research expenditure. Controlled-Trials.com ISRCTN02507940.

  3. Implementing a Parallel Image Edge Detection Algorithm Based on the Otsu-Canny Operator on the Hadoop Platform.

    PubMed

    Cao, Jianfang; Chen, Lichao; Wang, Min; Tian, Yun

    2018-01-01

    The Canny operator is widely used to detect edges in images. However, as the size of the image dataset increases, the edge detection performance of the Canny operator decreases and its runtime becomes excessive. To improve the runtime and edge detection performance of the Canny operator, in this paper, we propose a parallel design and implementation for an Otsu-optimized Canny operator using a MapReduce parallel programming model that runs on the Hadoop platform. The Otsu algorithm is used to optimize the Canny operator's dual threshold and improve the edge detection performance, while the MapReduce parallel programming model facilitates parallel processing for the Canny operator to solve the processing speed and communication cost problems that occur when the Canny edge detection algorithm is applied to big data. For the experiments, we constructed datasets of different scales from the Pascal VOC2012 image database. The proposed parallel Otsu-Canny edge detection algorithm performs better than other traditional edge detection algorithms. The parallel approach reduced the running time by approximately 67.2% on a Hadoop cluster architecture consisting of 5 nodes with a dataset of 60,000 images. Overall, our approach speeds up the system by approximately 3.4 times when processing large-scale datasets, which demonstrates the clear superiority of our method. The proposed algorithm in this study demonstrates both better edge detection performance and improved time performance.
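
    A single-machine version of the Otsu-optimised Canny idea (Otsu's automatically chosen threshold used to set Canny's hysteresis thresholds) can be written with OpenCV as below. The MapReduce/Hadoop layer described in the paper is omitted, and the 0.5x lower-threshold rule is a common heuristic rather than the authors' exact scheme.

```python
import cv2

def otsu_canny(gray_image, lower_ratio=0.5):
    """Edge map using Otsu's threshold as the Canny high threshold."""
    # cv2.threshold with THRESH_OTSU returns the automatically chosen threshold value.
    high_thresh, _ = cv2.threshold(gray_image, 0, 255,
                                   cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    low_thresh = lower_ratio * high_thresh
    return cv2.Canny(gray_image, low_thresh, high_thresh)

# Usage (the file path is a placeholder):
# img = cv2.imread("example.jpg", cv2.IMREAD_GRAYSCALE)
# edges = otsu_canny(img)
# cv2.imwrite("edges.png", edges)
```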

  4. An efficient and general approach for implementing thermodynamic phase equilibria information in geophysical and geodynamic studies

    NASA Astrophysics Data System (ADS)

    Afonso, Juan Carlos; Zlotnik, Sergio; Díez, Pedro

    2015-10-01

    We present a flexible, general, and efficient approach for implementing thermodynamic phase equilibria information (in the form of sets of physical parameters) into geophysical and geodynamic studies. The approach is based on Tensor Rank Decomposition methods, which transform the original multidimensional discrete information into a separated representation that contains significantly fewer terms, thus drastically reducing the amount of information to be stored in memory during a numerical simulation or geophysical inversion. Accordingly, the amount and resolution of the thermodynamic information that can be used in a simulation or inversion increases substantially. In addition, the method is independent of the actual software used to obtain the primary thermodynamic information, and therefore, it can be used in conjunction with any thermodynamic modeling program and/or database. Also, the errors associated with the decomposition procedure are readily controlled by the user, depending on her/his actual needs (e.g., preliminary runs versus full resolution runs). We illustrate the benefits, generality, and applicability of our approach with several examples of practical interest for both geodynamic modeling and geophysical inversion/modeling. Our results demonstrate that the proposed method is a competitive and attractive candidate for implementing thermodynamic constraints into a broad range of geophysical and geodynamic studies. MATLAB implementations of the method and examples are provided as supporting information and can be downloaded from the journal's website.
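
    To illustrate the basic idea of a separated representation, a minimal sketch follows: a dense two-dimensional property table is compressed into a sum of separable one-dimensional factors via truncated SVD. The grid, the stand-in property values, and the truncation tolerance are all made up; the paper's higher-dimensional tensor rank decompositions are not reproduced here.

    ```python
    # Illustrative sketch: compress a dense 2-D property table (e.g., density on a
    # pressure-temperature grid) into a low-rank separated representation.
    import numpy as np

    # Hypothetical dense table rho(P, T) sampled on a regular grid (stand-in data).
    P = np.linspace(1e5, 1e10, 400)
    T = np.linspace(300.0, 2000.0, 300)
    table = 3000.0 + 1e-8 * P[:, None] - 0.1 * T[None, :]

    # Truncated SVD gives a sum of separable terms: rho is approximated by
    # sum_k s_k * u_k(P) * v_k(T).
    U, s, Vt = np.linalg.svd(table, full_matrices=False)
    tol = 1e-6 * s[0]
    rank = int(np.sum(s > tol))
    U_r, s_r, Vt_r = U[:, :rank], s[:rank], Vt[:rank, :]

    # Evaluating the separated representation at a grid point needs only the
    # rank-many 1-D factors instead of the full table.
    i, j = 123, 45
    approx = (U_r[i] * s_r) @ Vt_r[:, j]
    print(rank, abs(approx - table[i, j]))
    ```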

  5. Implementation of a Nurse Driven Pathway to Reduce Incidence of Hospital Acquired Pressure Injuries in the Pediatric Intensive Care Setting.

    PubMed

    Rowe, Angela D; McCarty, Karen; Huett, Amy

    2018-03-13

    A large, freestanding pediatric hospital in the southern United States saw a 117% increase in reported hospital acquired pressure injuries (HAPI) between 2013 and 2015, with the intensive care units being the units of highest occurrence. A quality improvement project was designed and implemented to assist with pressure injury prevention. Literature review confirmed that pediatric HAPIs are a challenge and that usage of bundles and user-friendly guidelines/pathways can help eliminate barriers to prevention. This quality improvement project had three aims: first, to reduce HAPI incidence in the PICU by 10%; second, to increase consistent usage of pressure injury prevention strategies, as evidenced by a 10% increase in pressure injury bundle compliance; and third, to identify whether there are differences in the percentage of interventions implemented between two different groups of patients. Donabedian's model of Structure, Process, and Outcomes guided the development and implementation of this quality improvement project. Interventions focused on risk assessment subscale scores have the opportunity to mitigate specific risk factors and improve pressure injury prevention. Through implementation of the nurse driven pathway there was a 57% decrease in reported HAPIs in the PICU as well as a 66% increase in pressure ulcer prevention bundle compliance. Implementation of the nurse driven pressure injury prevention pathway was successful. There was a significant increase in bundle compliance for pressure ulcer prevention and a decrease in reported HAPIs. The pathway developed and implemented for this quality improvement project could be adapted to other populations and care settings to provide guidance across the continuum. Copyright © 2018 Elsevier Inc. All rights reserved.

  6. Solving Disparities Through Payment And Delivery System Reform: A Program To Achieve Health Equity.

    PubMed

    DeMeester, Rachel H; Xu, Lucy J; Nocon, Robert S; Cook, Scott C; Ducas, Andrea M; Chin, Marshall H

    2017-06-01

    Payment systems generally do not directly encourage or support the reduction of health disparities. In 2013 the Finding Answers: Solving Disparities through Payment and Delivery System Reform program of the Robert Wood Johnson Foundation sought to understand how alternative payment models might intentionally incorporate a disparities-reduction component to promote health equity. A qualitative analysis of forty proposals to the program revealed that applicants generally did not link payment reform tightly to disparities reduction. Most proposed general pay-for-performance, global payment, or shared savings plans, combined with multicomponent system interventions. None of the applicants proposed making any financial payments contingent on having successfully reduced disparities. Most applicants did not address how they would optimize providers' intrinsic and extrinsic motivation to reduce disparities. A better understanding of how payment and care delivery models might be designed and implemented to reduce health disparities is essential. Project HOPE—The People-to-People Health Foundation, Inc.

  7. Engaged for Change: A Community-Engaged Process for Developing Interventions to Reduce Health Disparities.

    PubMed

    Rhodes, Scott D; Mann-Jackson, Lilli; Alonzo, Jorge; Simán, Florence M; Vissman, Aaron T; Nall, Jennifer; Abraham, Claire; Aronson, Robert E; Tanner, Amanda E

    2017-12-01

    The science underlying the development of individual, community, system, and policy interventions designed to reduce health disparities has lagged behind other innovations. Few models, theoretical frameworks, or processes exist to guide intervention development. Our community-engaged research partnership has been developing, implementing, and evaluating efficacious interventions to reduce HIV disparities for over 15 years. Based on our intervention research experiences, we propose a novel 13-step process designed to demystify and guide intervention development. Our intervention development process includes steps such as establishing an intervention team to manage the details of intervention development; assessing community needs, priorities, and assets; generating intervention priorities; evaluating and incorporating theory; developing a conceptual or logic model; crafting activities; honing materials; administering a pilot, noting its process, and gathering feedback from all those involved; and editing the intervention based on what was learned. Here, we outline and describe each of these 13 steps.

  8. Research on PM2.5 emission reduction path of China ‘s electric power industry based on DEA model

    NASA Astrophysics Data System (ADS)

    Jin, Yanming; Yang, Fan; Liu, Jun

    2018-02-01

    Based on the theory of data envelopment analysis (DEA), this study constructs an environmental performance evaluation model for the power industry, analyzes the performance of clean energy development, electricity replacement, and coal-fired energy-saving and emission-reduction measures, and puts forward a technology path for reducing emissions in the future. The results show that (1) raising the proportion of coal used for power generation and speeding up electricity replacement are key to solving China's haze problem; (2) as the generation costs of photovoltaics and other new energy sources gradually fall and new energy becomes less constrained relative to thermal power, the economics of clean energy will surpass those of thermal energy-saving and emission-reduction measures by the end of the thirteenth Five-Year Plan; and (3) after 2025, the economic advantage of electricity replacement will become apparent.
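
    For readers unfamiliar with DEA, a minimal sketch of the standard input-oriented CCR envelopment model follows, solved with scipy's linear programming routine. The two inputs, one output, and three decision-making units are made-up placeholders, not the study's power-industry data, and the study's specific environmental-performance formulation is not reproduced.

    ```python
    # Minimal sketch of an input-oriented CCR (constant returns to scale) DEA model
    # in envelopment form; the data below are illustrative placeholders.
    import numpy as np
    from scipy.optimize import linprog

    # Rows = decision-making units (DMUs); columns = inputs / outputs.
    X = np.array([[3.0, 5.0], [2.0, 4.0], [4.0, 6.0]])   # e.g., coal use, capacity
    Y = np.array([[8.0], [7.0], [9.0]])                   # e.g., electricity output


    def ccr_efficiency(k):
        n, m = X.shape
        _, r = Y.shape
        # Decision variables: [theta, lambda_1 .. lambda_n]; minimize theta.
        c = np.r_[1.0, np.zeros(n)]
        # Inputs:  sum_j lambda_j * x_ij - theta * x_ik <= 0
        A_in = np.hstack([-X[k].reshape(m, 1), X.T])
        b_in = np.zeros(m)
        # Outputs: -sum_j lambda_j * y_rj <= -y_rk
        A_out = np.hstack([np.zeros((r, 1)), -Y.T])
        b_out = -Y[k]
        res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=np.r_[b_in, b_out],
                      bounds=[(0, None)] * (n + 1), method="highs")
        return res.fun  # efficiency score theta* in (0, 1]


    for k in range(len(X)):
        print(f"DMU {k}: efficiency = {ccr_efficiency(k):.3f}")
    ```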

  9. Economic analysis of best management practices to reduce watershed phosphorus losses.

    PubMed

    Rao, Nalini S; Easton, Zachary M; Lee, David R; Steenhuis, Tammo S

    2012-01-01

    In phosphorus-limited freshwater systems, small increases in phosphorus (P) concentrations can lead to eutrophication. To reduce P inputs to these systems, various environmental and agricultural agencies provide producers with incentives to implement best management practices (BMPs). In this study, we examine both the water quality and economic consequences of systematically protecting saturated, runoff-generating areas from active agriculture with selected BMPs. We also examine the joint water quality/economic impacts of these BMPs, specifically BMPs focusing on barnyards and buffer areas. Using the Variable Source Loading Function model (a modified Generalized Watershed Loading Function model) and net present value analysis (NPV), the results indicate that converting runoff-prone agricultural land to buffers and installing barnyard BMPs are both highly effective in decreasing dissolved P loss from a single-farm watershed, but are also costly for the producer. On average, including barnyard BMPs decreases the nutrient loading by about 5.5% compared with only implementing buffers. The annualized NPV for installing both buffers on only the wettest areas of the landscape and implementing barnyard BMPs becomes positive only if the BMPs' lifetime exceeds 15 yr. The spatial location of the BMPs in relation to runoff producing areas, the time frame over which the BMPs are implemented, and the marginal costs of increasing buffer size were found to be the most critical considerations for water quality and profitability. The framework presented here incorporates estimations of nutrient loading reductions in the economic analysis, and is applicable to farms facing BMP adoption decisions. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.
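
    The lifetime-dependence of profitability can be illustrated with a toy annualized-NPV calculation; all costs, benefits, and the discount rate below are invented for illustration and are not the study's VSLF-based figures.

    ```python
    # Toy annualized-NPV sketch for a BMP investment; figures are made up and only
    # illustrate how profitability can flip sign with the assumed practice lifetime.
    def annualized_npv(capital_cost, annual_net_benefit, lifetime_yr, rate=0.05):
        npv = -capital_cost + sum(
            annual_net_benefit / (1.0 + rate) ** t for t in range(1, lifetime_yr + 1)
        )
        # Spread the NPV over the lifetime with a capital recovery factor.
        crf = rate * (1 + rate) ** lifetime_yr / ((1 + rate) ** lifetime_yr - 1)
        return npv * crf


    for life in (10, 15, 20, 25):
        print(life, round(annualized_npv(50_000, 4_800, life), 1))
    ```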

  10. Implementation of Kalman filter algorithm on models reduced using singular perturbation approximation method and its application to measurement of water level

    NASA Astrophysics Data System (ADS)

    Rachmawati, Vimala; Khusnul Arif, Didik; Adzkiya, Dieky

    2018-03-01

    Physical systems often have a large order, so their mathematical models have many state variables, which increases computation time. In addition, not all variables are generally known, so estimation is needed to determine quantities of the system that cannot be measured directly. In this paper, we discuss model reduction and the estimation of state variables in a river system in order to measure the water level. Model reduction approximates a system with a lower-order model that introduces no significant error and whose dynamic behaviour is similar to that of the original system. The Singular Perturbation Approximation method is one such model reduction method, in which all state variables of the system are partitioned into fast and slow modes. The Kalman filter algorithm is then used to estimate the state variables of stochastic dynamic systems, where estimates are computed by predicting the state variables from the system dynamics and correcting them with measurement data. Kalman filters are used to estimate the state variables of both the original and the reduced system, and we compare the estimation results and the computation times of the two.
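
    A minimal sketch of the discrete-time Kalman filter predict/update recursion is given below; the system matrices are generic placeholders, not the river model or its reduced-order counterpart from the paper.

    ```python
    # Generic discrete-time Kalman filter predict/update sketch (NumPy); the
    # matrices here are placeholders, not the paper's river model.
    import numpy as np


    def kalman_step(x, P, z, A, H, Q, R):
        # Predict with the (possibly reduced-order) system dynamics.
        x_pred = A @ x
        P_pred = A @ P @ A.T + Q
        # Update with the water-level measurement z.
        S = H @ P_pred @ H.T + R
        K = P_pred @ H.T @ np.linalg.inv(S)
        x_new = x_pred + K @ (z - H @ x_pred)
        P_new = (np.eye(len(x)) - K @ H) @ P_pred
        return x_new, P_new


    # Example: 2-state placeholder model, scalar measurement.
    A = np.array([[0.95, 0.1], [0.0, 0.9]])
    H = np.array([[1.0, 0.0]])
    Q, R = 1e-4 * np.eye(2), np.array([[1e-2]])
    x, P = np.zeros(2), np.eye(2)
    for z in (0.10, 0.12, 0.15):
        x, P = kalman_step(x, P, np.array([z]), A, H, Q, R)
    print(x)
    ```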

  11. Evaluating strategies to reduce urban air pollution

    NASA Astrophysics Data System (ADS)

    Duque, L.; Relvas, H.; Silveira, C.; Ferreira, J.; Monteiro, A.; Gama, C.; Rafael, S.; Freitas, S.; Borrego, C.; Miranda, A. I.

    2016-02-01

    During the last years, specific air quality problems have been detected in the urban area of Porto (Portugal). Both PM10 and NO2 limit values have been surpassed in several air quality monitoring stations and, following the European legislation requirements, Air Quality Plans were designed and implemented to reduce those levels. In this sense, measures to decrease PM10 and NO2 emissions have been selected, these mainly related to the traffic sector, but also regarding the industrial and residential combustion sectors. The main objective of this study is to investigate the efficiency of these reduction measures with regard to the improvement of PM10 and NO2 concentration levels over the Porto urban region using a numerical modelling tool - The Air Pollution Model (TAPM). TAPM was applied over the study region, for a simulation domain of 80 × 80 km2 with a spatial resolution of 1 × 1 km2. The entire year of 2012 was simulated and set as the base year for the analysis of the impacts of the selected measures. Taking into account the main activity sectors, four main scenarios have been defined and simulated, with focus on: (1) hybrid cars; (2) a Low Emission Zone (LEZ); (3) fireplaces and (4) industry. The modelling results indicate that measures to reduce PM10 should be focused on residential combustion (fireplaces) and industrial activity and for NO2 the strategy should be based on the traffic sector. The implementation of all the defined scenarios will allow a total maximum reduction of 4.5% on the levels of both pollutants.

  12. Modeling survival of juvenile salmon during downriver migration in the Columbia River on a microcomputer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peloquin, R.A.; McKenzie, D.H.

    1994-10-01

    A compartmental model has been implemented on a microcomputer as an aid in the analysis of alternative solutions to a problem. The model, entitled Smolt Survival Simulator, simulates the survival of juvenile salmon during their downstream migration and passage of hydroelectric dams in the Columbia River. The model is designed to function in a workshop environment where resource managers and fisheries biologists can study alternative measures that may potentially increase juvenile anadromous fish survival during downriver migration. The potential application of the model has placed several requirements on the implementing software. It must be available for use in workshop settings. The software must be easy to use with minimal computer knowledge. Scenarios must be created and executed quickly and efficiently. Results must be immediately available. Software design emphasis was placed on the user interface because of these requirements. The discussion focuses on methods used in the development of the SSS software user interface. These methods should reduce user stress and allow thorough and easy parameter modification.

  13. Implementation and comparative analysis of the optimisations produced by evolutionary algorithms for the parameter extraction of PSP MOSFET model

    NASA Astrophysics Data System (ADS)

    Hadia, Sarman K.; Thakker, R. A.; Bhatt, Kirit R.

    2016-05-01

    The study proposes an application of evolutionary algorithms, specifically an artificial bee colony (ABC), a variant ABC and particle swarm optimisation (PSO), to extract the parameters of a metal oxide semiconductor field effect transistor (MOSFET) model. These algorithms are applied to the MOSFET parameter extraction problem using the PSP surface-potential model. MOSFET parameter extraction procedures involve reducing the error between measured and modelled data. This study shows that the ABC algorithm optimises the parameter values based on the intelligent behaviour of honey bee swarms. Some modifications have also been applied to the basic ABC algorithm. Particle swarm optimisation is a population-based stochastic optimisation method inspired by bird flocking. The performances of these algorithms are compared with respect to the quality of the solutions. The simulation results of this study show that the PSO algorithm performs better than the variant ABC and basic ABC algorithms for the parameter extraction of the MOSFET model; the implementation of the ABC algorithm is also shown to be simpler than that of the PSO algorithm.
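
    As a sketch of the general extraction loop, the following minimal PSO fits the parameters of a toy current-voltage model (a stand-in, not the PSP equations) by minimising the RMS error between "measured" and modelled data; all swarm settings and the parameter bounds are illustrative assumptions.

    ```python
    # Minimal particle swarm optimisation sketch for parameter extraction: fit a
    # toy I-V model to synthetic "measured" data by minimising the RMS error.
    import numpy as np

    rng = np.random.default_rng(0)
    v = np.linspace(0.0, 1.2, 50)
    true_p = np.array([1e-3, 0.4])                       # [gain, threshold]
    measured = true_p[0] * np.maximum(v - true_p[1], 0.0) ** 2


    def rms_error(p):
        model = p[0] * np.maximum(v - p[1], 0.0) ** 2
        return np.sqrt(np.mean((measured - model) ** 2))


    def pso(n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
        lo, hi = np.array([1e-5, 0.0]), np.array([1e-2, 1.0])
        x = rng.uniform(lo, hi, size=(n_particles, 2))
        vel = np.zeros_like(x)
        pbest, pbest_f = x.copy(), np.array([rms_error(p) for p in x])
        gbest = pbest[np.argmin(pbest_f)].copy()
        for _ in range(iters):
            r1, r2 = rng.random(x.shape), rng.random(x.shape)
            vel = w * vel + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
            x = np.clip(x + vel, lo, hi)
            f = np.array([rms_error(p) for p in x])
            improved = f < pbest_f
            pbest[improved], pbest_f[improved] = x[improved], f[improved]
            gbest = pbest[np.argmin(pbest_f)].copy()
        return gbest, pbest_f.min()


    print(pso())
    ```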

  14. ePCR: an R-package for survival and time-to-event prediction in advanced prostate cancer, applied to real-world patient cohorts.

    PubMed

    Laajala, Teemu D; Murtojärvi, Mika; Virkki, Arho; Aittokallio, Tero

    2018-06-15

    Prognostic models are widely used in clinical decision-making, such as risk stratification and tailoring treatment strategies, with the aim to improve patient outcomes while reducing overall healthcare costs. While prognostic models have been adopted into clinical use, benchmarking their performance has been difficult due to lack of open clinical datasets. The recent DREAM 9.5 Prostate Cancer Challenge carried out an extensive benchmarking of prognostic models for metastatic Castration-Resistant Prostate Cancer (mCRPC), based on multiple cohorts of open clinical trial data. We make available an open-source implementation of the top-performing model, ePCR, along with an extended toolbox for its further re-use and development, and demonstrate how to best apply the implemented model to real-world data cohorts of advanced prostate cancer patients. The open-source R-package ePCR and its reference documentation are available at the Central R Archive Network (CRAN): https://CRAN.R-project.org/package=ePCR. R-vignette provides step-by-step examples for the ePCR usage. Supplementary data are available at Bioinformatics online.

  15. Conservation Reserve Program effects on floodplain land cover management.

    PubMed

    Jobe, Addison; Kalra, Ajay; Ibendahl, Elise

    2018-05-15

    Growing populations and industrialized agriculture practices have eradicated much of the United States' wetlands along river floodplains. One program available for the restoration of floodplains is the Conservation Reserve Program (CRP). The current research explores the effects of CRP land change on flooding zones, utilizing Flood Modeller and HEC-RAS. Flood Modeller proved a viable tool for flood modeling within the United States when compared with HEC-RAS. The software is applied to the Nodaway River system, located in the western halves of Iowa and Missouri, to model the effects of introducing new forest areas within the region. During the conversion, flood stage first decreases in the early years before rising to greater heights. Flow velocities where CRP land is present are reduced over the long term. Velocity reduction occurs as Manning's roughness increases due to tree diameter and brush density. Flood zones become more widespread with the implementation of CRP. Future model implementations are recommended to examine the effects of smaller flood recurrence intervals. Copyright © 2018 Elsevier Ltd. All rights reserved.
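
    The velocity effect described above can be seen in a quick Manning's-equation calculation: as the roughness coefficient n rises (trees, brush), mean velocity falls. The hydraulic radius, slope, and n values below are illustrative only, not taken from the Nodaway River models.

    ```python
    # Quick Manning's-equation (SI form) illustration of velocity reduction with
    # increasing roughness; all values are illustrative.
    def manning_velocity(n, hydraulic_radius_m, slope):
        return (1.0 / n) * hydraulic_radius_m ** (2.0 / 3.0) * slope ** 0.5


    for n in (0.030, 0.060, 0.100):   # clean channel -> light brush -> dense forest
        print(n, round(manning_velocity(n, hydraulic_radius_m=2.0, slope=0.0005), 2))
    ```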

  16. Neural Generalized Predictive Control: A Newton-Raphson Implementation

    NASA Technical Reports Server (NTRS)

    Soloway, Donald; Haley, Pamela J.

    1997-01-01

    An efficient implementation of Generalized Predictive Control using a multi-layer feedforward neural network as the plant's nonlinear model is presented. By using Newton-Raphson as the optimization algorithm, the number of iterations needed for convergence is significantly reduced compared with other techniques. The main cost of the Newton-Raphson algorithm is in the calculation of the Hessian, but even with this overhead the low iteration count makes Newton-Raphson faster than other techniques and a viable algorithm for real-time control. This paper presents a detailed derivation of the Neural Generalized Predictive Control algorithm with Newton-Raphson as the minimization algorithm. Simulation results show convergence to a good solution within two iterations, and timing data show that real-time control is possible. Comments about the algorithm's implementation are also included.
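
    The sketch below illustrates the Newton-Raphson inner loop of such a predictive controller: a receding-horizon cost over the control moves is minimised with numerically estimated gradient and Hessian. The plant here is a toy scalar model standing in for the neural network, and the horizons, weights, and step sizes are assumptions for illustration only.

    ```python
    # Newton-Raphson minimisation of a generalized-predictive-control style cost
    # J(u) over the control moves u, using a toy one-step plant model.
    import numpy as np

    N, Nu, lam = 8, 3, 0.05          # prediction horizon, control horizon, move weight
    setpoint = 1.0


    def predict(u, y0=0.0):
        # Toy plant y[k+1] = 0.8*y[k] + 0.2*u[k]; hold the last control move.
        y, out = y0, []
        for k in range(N):
            y = 0.8 * y + 0.2 * u[min(k, Nu - 1)]
            out.append(y)
        return np.array(out)


    def cost(u):
        y = predict(u)
        du = np.diff(np.r_[0.0, u])
        return np.sum((setpoint - y) ** 2) + lam * np.sum(du ** 2)


    def newton_raphson(u, iters=3, eps=1e-4):
        u = u.astype(float)
        for _ in range(iters):
            g, H, E = np.zeros(Nu), np.zeros((Nu, Nu)), eps * np.eye(Nu)
            for i in range(Nu):                      # central-difference gradient
                g[i] = (cost(u + E[i]) - cost(u - E[i])) / (2 * eps)
                for j in range(Nu):                  # central-difference Hessian
                    H[i, j] = (cost(u + E[i] + E[j]) - cost(u + E[i] - E[j])
                               - cost(u - E[i] + E[j]) + cost(u - E[i] - E[j])) / (4 * eps ** 2)
            u = u - np.linalg.solve(H, g)            # Newton step
        return u


    print(newton_raphson(np.zeros(Nu)))
    ```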

  17. Performance Analysis of a Hardware Implemented Complex Signal Kurtosis Radio-Frequency Interference Detector

    NASA Technical Reports Server (NTRS)

    Schoenwald, Adam J.; Bradley, Damon C.; Mohammed, Priscilla N.; Piepmeier, Jeffrey R.; Wong, Mark

    2016-01-01

    In the field of microwave radiometry, Radio Frequency Interference (RFI) consistently degrades the value of scientific results. Through the use of digital receivers and signal processing, the effects of RFI on scientific measurements can be reduced depending on certain circumstances. As technology allows us to implement wider band digital receivers for radiometry, the problem of RFI mitigation changes. Our work focuses on finding a detector that outperforms real kurtosis in wide band scenarios. The algorithm implemented is a complex signal kurtosis detector which was modeled and simulated. The performance of both complex and real signal kurtosis is evaluated for continuous wave, pulsed continuous wave, and wide band quadrature phase shift keying (QPSK) modulations. The use of complex signal kurtosis increased the detectability of interference.
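
    A minimal NumPy sketch of the underlying statistics follows: for zero-mean Gaussian noise the real-signal kurtosis is about 3, and the complex-signal kurtosis (E[|z|^4]/E[|z|^2]^2 for circular Gaussian noise) is about 2, so a block is flagged when the statistic departs from its noise-only value. The continuous-wave interferer, block length, and flagging threshold are illustrative assumptions, not the hardware detector of the paper.

    ```python
    # Kurtosis-based RFI detection sketch on a block of complex baseband samples.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 4096
    noise = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)
    cw_rfi = np.exp(2j * np.pi * 0.123 * np.arange(n))      # unit-power CW interferer


    def real_kurtosis(x):
        x = x - x.mean()
        return np.mean(x ** 4) / np.mean(x ** 2) ** 2        # ~3 for Gaussian noise


    def complex_kurtosis(z):
        z = z - z.mean()
        p = np.abs(z) ** 2
        return np.mean(p ** 2) / np.mean(p) ** 2             # ~2 for circular Gaussian


    for label, block in (("noise only", noise), ("noise + CW RFI", noise + cw_rfi)):
        rk, ck = real_kurtosis(block.real), complex_kurtosis(block)
        flagged = abs(ck - 2.0) > 0.1                         # illustrative threshold
        print(f"{label}: real kurtosis {rk:.2f}, complex kurtosis {ck:.2f}, flagged={flagged}")
    ```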

  18. Modeling urban building energy use: A review of modeling approaches and procedures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Wenliang; Zhou, Yuyu; Cetin, Kristen

    With rapid urbanization and economic development, the world has been experiencing an unprecedented increase in energy consumption and greenhouse gas (GHG) emissions. While reducing energy consumption and GHG emissions is a common interest shared by major developed and developing countries, actions to enable these global reductions are generally implemented at the city scale. This is because baseline information from individual cities plays an important role in identifying economical options for improving building energy efficiency and reducing GHG emissions. Numerous approaches have been proposed for modeling urban building energy use in the past decades. This paper aims to provide an up-to-date review of the broad categories of energy models for urban buildings and describes the basic workflow of physics-based, bottom-up models and their applications in simulating urban-scale building energy use. Because there are significant differences across models with varied potential for application, strengths and weaknesses of the reviewed models are also presented. This is followed by a discussion of challenging issues associated with model preparation and calibration.

  19. Modeling urban building energy use: A review of modeling approaches and procedures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Wenliang; Zhou, Yuyu; Cetin, Kristen

    With rapid urbanization and economic development, the world has been experiencing an unprecedented increase in energy consumption and greenhouse gas (GHG) emissions. While reducing energy consumption and GHG emissions is a common interest shared by major developed and developing countries, actions to enable these global reductions are generally implemented at the city scale. This is because baseline information from individual cities plays an important role in identifying economical options for improving building energy efficiency and reducing GHG emissions. Numerous approaches have been proposed for modeling urban building energy use in the past decades. Our paper aims to provide an up-to-date review of the broad categories of energy models for urban buildings and describes the basic workflow of physics-based, bottom-up models and their applications in simulating urban-scale building energy use. Because there are significant differences across models with varied potential for application, strengths and weaknesses of the reviewed models are also presented. We then follow this with a discussion of challenging issues associated with model preparation and calibration.

  20. Modeling urban building energy use: A review of modeling approaches and procedures

    DOE PAGES

    Li, Wenliang; Zhou, Yuyu; Cetin, Kristen; ...

    2017-11-13

    With rapid urbanization and economic development, the world has been experiencing an unprecedented increase in energy consumption and greenhouse gas (GHG) emissions. While reducing energy consumption and GHG emissions is a common interest shared by major developed and developing countries, actions to enable these global reductions are generally implemented at the city scale. This is because baseline information from individual cities plays an important role in identifying economical options for improving building energy efficiency and reducing GHG emissions. Numerous approaches have been proposed for modeling urban building energy use in the past decades. Our paper aims to provide an up-to-date review of the broad categories of energy models for urban buildings and describes the basic workflow of physics-based, bottom-up models and their applications in simulating urban-scale building energy use. Because there are significant differences across models with varied potential for application, strengths and weaknesses of the reviewed models are also presented. We then follow this with a discussion of challenging issues associated with model preparation and calibration.

  1. A systematic RE-AIM review to assess sugar-sweetened beverage interventions for children and adolescents across the socio-ecological model

    PubMed Central

    Porter, Kathleen; Estabrooks, Paul; Zoellner, Jamie

    2016-01-01

    Background Sugar-sweetened beverage (SSB) consumption among children and adolescents is a determinant of childhood obesity. Many programs to reduce consumption across the socio-ecological model report significant positive results; however, the generalizability of the results, including whether reporting differences exist among socio-ecological strategy levels, is unknown. Objectives This systematic review aims to (1) examine the extent to which studies reported internal and external validity indicators defined by RE-AIM (reach, effectiveness, adoption, implementation, maintenance) and (2) assess reporting differences by socio-ecological level: intrapersonal/interpersonal (Level 1), environmental/policy (Level 2), multi-level (Combined Level). Methods A systematic literature review of six major databases (PubMed, Web of Science, CINAHL, CAB Abstracts, ERIC, and Agricola) was conducted to identify studies from 2004–2015 meeting inclusion criteria (targeting children aged 3–12, adolescents 13–17, and young adults 18 years; experimental/quasi-experimental; substantial SSB component). Interventions were categorized by socio-ecological level, and data were extracted using a validated RE-AIM protocol. A one-way ANOVA assessed differences between levels. Results Fifty-five eligible studies were accepted, including 21 Level 1, 18 Level 2, and 16 Combined Level studies. Thirty-six (65%) were conducted in the USA, 19 (35%) internationally, and 39 (71%) were implemented in schools. Across levels, reporting averages were low for all RE-AIM dimensions (reach=29%, efficacy/effectiveness=45%, adoption=26%, implementation=27%, maintenance=14%). Level 2 studies had significantly lower reporting on reach and effectiveness (10% and 26%, respectively) compared to Level 1 (44%, 57%) or Combined Level studies (31%, 52%) (p<0.001). Adoption, implementation, and maintenance reporting did not vary among levels. Conclusion Interventions to reduce SSB in children and adolescents across the socio-ecological spectrum do not provide the necessary information for dissemination and implementation in community nutrition settings. Future interventions should address both internal and external validity to maximize population impact. PMID:27262383

  2. Model-Based Engine Control Architecture with an Extended Kalman Filter

    NASA Technical Reports Server (NTRS)

    Csank, Jeffrey T.; Connolly, Joseph W.

    2016-01-01

    This paper discusses the design and implementation of an extended Kalman filter (EKF) for model-based engine control (MBEC). Previously proposed MBEC architectures feature an optimal tuner Kalman Filter (OTKF) to produce estimates of both unmeasured engine parameters and estimates for the health of the engine. The success of this approach relies on the accuracy of the linear model and the ability of the optimal tuner to update its tuner estimates based on only a few sensors. Advances in computer processing are making it possible to replace the piece-wise linear model, developed off-line, with an on-board nonlinear model running in real-time. This will reduce the estimation errors associated with the linearization process, and is typically referred to as an extended Kalman filter. The non-linear extended Kalman filter approach is applied to the Commercial Modular Aero-Propulsion System Simulation 40,000 (C-MAPSS40k) and compared to the previously proposed MBEC architecture. The results show that the EKF reduces the estimation error, especially during transient operation.
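
    A generic extended Kalman filter recursion is sketched below: the state is propagated through a nonlinear model, and its Jacobian is used only for the covariance and gain computations. The toy dynamics stand in for the C-MAPSS40k engine model, which is not reproduced here, and all noise covariances are placeholders.

    ```python
    # Generic EKF sketch: nonlinear prediction, Jacobian-based covariance update.
    import numpy as np


    def f(x):                      # toy nonlinear dynamics (placeholder)
        return np.array([x[0] + 0.1 * x[1], 0.95 * x[1] + 0.05 * np.sin(x[0])])


    def F_jac(x):                  # Jacobian of f
        return np.array([[1.0, 0.1], [0.05 * np.cos(x[0]), 0.95]])


    H = np.array([[1.0, 0.0]])     # measure the first state only
    Q, R = 1e-4 * np.eye(2), np.array([[1e-2]])


    def ekf_step(x, P, z):
        x_pred = f(x)                          # nonlinear prediction
        F = F_jac(x)
        P_pred = F @ P @ F.T + Q
        S = H @ P_pred @ H.T + R
        K = P_pred @ H.T @ np.linalg.inv(S)
        x_new = x_pred + K @ (z - H @ x_pred)
        P_new = (np.eye(2) - K @ H) @ P_pred
        return x_new, P_new


    x, P = np.zeros(2), np.eye(2)
    for z in (0.05, 0.11, 0.18):
        x, P = ekf_step(x, P, np.array([z]))
    print(x)
    ```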

  3. Model-Based Engine Control Architecture with an Extended Kalman Filter

    NASA Technical Reports Server (NTRS)

    Csank, Jeffrey T.; Connolly, Joseph W.

    2016-01-01

    This paper discusses the design and implementation of an extended Kalman filter (EKF) for model-based engine control (MBEC). Previously proposed MBEC architectures feature an optimal tuner Kalman Filter (OTKF) to produce estimates of both unmeasured engine parameters and estimates for the health of the engine. The success of this approach relies on the accuracy of the linear model and the ability of the optimal tuner to update its tuner estimates based on only a few sensors. Advances in computer processing are making it possible to replace the piece-wise linear model, developed off-line, with an on-board nonlinear model running in real-time. This will reduce the estimation errors associated with the linearization process, and is typically referred to as an extended Kalman filter. The nonlinear extended Kalman filter approach is applied to the Commercial Modular Aero-Propulsion System Simulation 40,000 (C-MAPSS40k) and compared to the previously proposed MBEC architecture. The results show that the EKF reduces the estimation error, especially during transient operation.

  4. The role of public policies in reducing smoking and deaths caused by smoking in Vietnam: results from the Vietnam tobacco policy simulation model.

    PubMed

    Levy, David T; Bales, Sarah; Lam, Nguyen T; Nikolayev, Leonid

    2006-04-01

    A simulation model is developed for Vietnam to project smoking prevalence and associated premature mortality. The model examines independently and as a package the effects of five types of tobacco control policies: tax increases, clean air laws, mass media campaigns, advertising bans, and youth access policies. Predictions suggest that the largest reductions in smoking rates will result from implementing a comprehensive tobacco control policy package. Significant inroads may be achieved through tax increases. A media campaign along with programs to publicize and enforce clean air laws, advertising bans and youth access laws would further reduce smoking rates. Tobacco control policies have the potential to make large dents in smoking rates, which in turn could lead to many lives saved. In the absence of these measures, deaths from smoking will increase. The model also helps to identify information gaps pertinent both to modeling and policy-making.

  5. Development of WRF-ROI system by incorporating eigen-decomposition

    NASA Astrophysics Data System (ADS)

    Kim, S.; Noh, N.; Song, H.; Lim, G.

    2011-12-01

    This study presents the development of the WRF-ROI system, an implementation of Retrospective Optimal Interpolation (ROI) in the Weather Research and Forecasting model (WRF). ROI is a data assimilation algorithm introduced by Song et al. (2009) and Song and Lim (2009). The formulation of ROI is similar to that of Optimal Interpolation (OI), but ROI iteratively assimilates an observation set at a post-analysis time into a prior analysis, potentially providing high-quality reanalysis data. The ROI method assimilates data at the post-analysis time using a perturbation method (Errico and Raeder, 1999) without an adjoint model. In a previous study, the ROI method was applied to the Lorenz 40-variable model (Lorenz, 1996) to validate the algorithm and investigate its capability. It is therefore necessary to apply the ROI method to a more realistic and complicated model framework such as WRF. In this research, the reduced-rank formulation of ROI is used instead of a reduced-resolution method. Computational costs can be reduced through the eigen-decomposition of the background error covariance in the reduced-rank method. When a single profile of observations is assimilated in the WRF-ROI system incorporating eigen-decomposition, the analysis error tends to be reduced compared with the background error. The difference between forecast errors with and without assimilation clearly increases over time, indicating an improvement of forecast error by assimilation.
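
    The reduced-rank idea can be sketched as follows: a background-error covariance matrix is approximated by its leading eigenpairs so that subsequent analysis steps work in a much smaller subspace. The covariance below is a synthetic correlation matrix, not WRF output, and the 99% variance cutoff is an illustrative choice.

    ```python
    # Reduced-rank approximation of a background-error covariance B by keeping
    # the leading eigenpairs (synthetic example).
    import numpy as np

    n = 200
    d = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
    B = np.exp(-(d / 10.0) ** 2)                 # smooth synthetic correlation matrix

    w, V = np.linalg.eigh(B)                     # eigenvalues in ascending order
    order = np.argsort(w)[::-1]
    w, V = w[order], V[:, order]

    k = int(np.searchsorted(np.cumsum(w) / w.sum(), 0.99)) + 1   # keep 99% of variance
    B_r = (V[:, :k] * w[:k]) @ V[:, :k].T

    print(k, np.linalg.norm(B - B_r) / np.linalg.norm(B))
    ```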

  6. The Instrument for Measuring the Implementation Situation of Traditional Chinese Medicine Guideline: Evaluation and Application

    PubMed Central

    Wang, Yangyang; Chen, Yaolong; Wang, Xiaoyun; Deng, Jingwen

    2017-01-01

    Clinical practice guidelines play an important role in reducing the variations in clinical practices and improving the quality of care. To assess the real effect, measuring its implementation situation is needed. The implementation situation can be reflected by testing the consistency between the actual clinical practice and the guideline. We constructed an instrument to measure the implementation situation of Traditional Chinese Medicine (TCM) guideline through consistency testing. The main objectives of our study were to validate the instrument and evaluate the implementation situation of menopause syndrome guideline of TCM, using the data from the consistency test of comparing the medical records with the guideline. A total of 621 cases were included for data analysis. Cronbach's Alpha coefficient is 0.73. The model fit of 7 items in four dimensions was good (SRMR = 0.04; GFI = 0.97; NFI = 0.97; TLI = 0.96; CFI = 0.98; AGFI = 0.90). This instrument is of good reliability and validity. It can help the guideline developers to measure the implementation situation, find the reasons affecting the implementation, and revise the guideline. The method of using consistency test to measure the implementation situation may provide a sample for evaluating the guideline implementation in other fields. PMID:29234379

  7. Adaptive Neuron Apoptosis for Accelerating Deep Learning on Large Scale Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Siegel, Charles M.; Daily, Jeffrey A.; Vishnu, Abhinav

    Machine Learning and Data Mining (MLDM) algorithms are becoming ubiquitous in model learning from the large volume of data generated using simulations, experiments and handheld devices. Deep Learning algorithms -- a class of MLDM algorithms -- are applied for automatic feature extraction, and learning non-linear models for unsupervised and supervised algorithms. Naturally, several libraries which support large scale Deep Learning -- such as TensorFlow and Caffe -- have become popular. In this paper, we present novel techniques to accelerate the convergence of Deep Learning algorithms by conducting low overhead removal of redundant neurons -- apoptosis of neurons -- which do not contribute to model learning, during the training phase itself. We provide in-depth theoretical underpinnings of our heuristics (bounding accuracy loss and handling apoptosis of several neuron types), and present the methods to conduct adaptive neuron apoptosis. We implement our proposed heuristics with the recently introduced TensorFlow and its recently proposed MPI extension. Our performance evaluation on two different clusters -- one with Intel Haswell multi-core systems, and the other with NVIDIA GPUs -- using InfiniBand indicates the efficacy of the proposed heuristics and implementations. Specifically, we are able to improve the training time for several datasets by 2-3x, while reducing the number of parameters by 30x (4-5x on average) on datasets such as ImageNet classification. For the Higgs Boson dataset, our implementation improves the accuracy (measured by Area Under Curve (AUC)) for classification from 0.88/1 to 0.94/1, while reducing the number of parameters by 3x in comparison to existing literature, while achieving a 2.44x speedup in comparison to the default (no apoptosis) algorithm.
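
    The general idea, stripped of the paper's adaptive heuristics and MPI/TensorFlow integration, can be sketched with a simple magnitude criterion: hidden neurons whose outgoing weights carry negligible magnitude are dropped, shrinking the layer. The layer sizes, the way redundancy is injected, and the pruning threshold below are all illustrative assumptions.

    ```python
    # Sketch of neuron removal by a simple importance criterion (illustrative only;
    # not the paper's adaptive apoptosis heuristics).
    import numpy as np

    rng = np.random.default_rng(0)
    W1 = rng.normal(size=(64, 256))           # input -> hidden weights
    W2 = rng.normal(size=(256, 10))           # hidden -> output weights
    W2 *= (rng.random((256, 1)) > 0.3)        # zero ~30% of neurons to mimic redundancy

    importance = np.linalg.norm(W2, axis=1)          # contribution of each hidden neuron
    keep = importance > 0.05 * importance.max()      # illustrative threshold

    W1_pruned, W2_pruned = W1[:, keep], W2[keep, :]
    print(f"hidden neurons: {W2.shape[0]} -> {int(keep.sum())}")
    ```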

  8. Reducing postponements of elective pediatric cardiac procedures: analysis and implementation of a discrete event simulation model.

    PubMed

    Day, Theodore Eugene; Sarawgi, Sandeep; Perri, Alexis; Nicolson, Susan C

    2015-04-01

    This study describes the use of discrete event simulation (DES) to model and analyze a large academic pediatric cardiac center. The objective was to identify a strategy, and to predict and test the effectiveness of that strategy, to minimize the number of elective cardiac procedures that are postponed because of a lack of available cardiac intensive care unit (CICU) capacity. A DES of the cardiac center at The Children's Hospital of Philadelphia was developed and validated using 1 year of deidentified administrative patient data. The model was then used to analyze strategies for reducing postponements of cases requiring CICU care through improved scheduling of multipurpose space. Each of five alternative scenarios was simulated for ten independent 1-year runs. Reductions in simulated elective procedure postponements were found when a multipurpose procedure room (the hybrid room) was used for operations on Wednesday and Thursday, compared with Friday (as was the real-world use). The reduction for Wednesday was statistically significant, with postponements dropping from 27.8 to 23.3 annually (95% confidence interval 18.8-27.8). Thus, we anticipate a relative reduction in postponements of 16.2%. Since the implementation, there have been two postponements from July 1 to November 21, 2014, compared with ten for the same time period in 2013. Simulation allows us to test planned changes in complex environments, including pediatric cardiac care. Reduction in postponements of cardiac procedures requiring CICU care is predicted through reshuffling schedules of existing multipurpose capacity, and these reductions appear to be achievable in the real world after implementation. Copyright © 2015 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.
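
    For readers unfamiliar with DES, a toy example built with the simpy package is sketched below: elective cases compete for a fixed number of CICU beds, and a case scheduled when no bed is free is counted as a postponement. The arrival rate, length of stay, and bed count are invented placeholders, not the hospital's validated model.

    ```python
    # Toy discrete event simulation of elective cases competing for CICU beds.
    import random
    import simpy

    BEDS, SIM_DAYS = 4, 365
    random.seed(42)
    postponed = 0


    def elective_cases(env, cicu):
        global postponed
        while True:
            yield env.timeout(random.expovariate(1.0))       # ~1 elective case per day
            if cicu.count >= cicu.capacity:                   # no bed free today
                postponed += 1
            else:
                env.process(stay(env, cicu))


    def stay(env, cicu):
        with cicu.request() as bed:
            yield bed
            yield env.timeout(random.expovariate(1.0 / 2.5))  # mean CICU stay 2.5 days


    env = simpy.Environment()
    cicu = simpy.Resource(env, capacity=BEDS)
    env.process(elective_cases(env, cicu))
    env.run(until=SIM_DAYS)
    print("postponed elective cases in one simulated year:", postponed)
    ```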

  9. Quantitative option analysis for implementation and management of landfills.

    PubMed

    Kerestecioğlu, Merih

    2016-09-01

    The selection of the most feasible strategy for implementation of landfills is a challenging step. Potential implementation options for landfills cover a wide range, from conventional construction contracts to concessions. Montenegro, seeking to improve the efficiency of public services while maintaining affordability, was considering privatisation as a way to reduce public spending on service provision. In this study, to determine the most feasible model for construction and operation of a regional landfill, a quantitative risk analysis was implemented with five steps: (i) development of a global risk matrix; (ii) assignment of qualitative probabilities of occurrence and magnitude of impacts; (iii) determination of the risks to be mitigated, monitored, controlled or ignored; (iv) reduction of the main risk elements; and (v) incorporation of quantitative estimates of probability of occurrence and expected impact for each risk element in the reduced risk matrix. The evaluated scenarios were: (i) construction and operation of the regional landfill by the public sector; (ii) construction and operation of the landfill by the private sector and transfer of ownership to the public sector after a pre-defined period; and (iii) operation of the landfill by the private sector, without ownership. The quantitative risk assessment concluded that introduction of a public private partnership is not the most feasible option, unlike the common belief in several public institutions in developing countries. A management contract for the first years of operation was advised, after which a long-term operating contract may follow. © The Author(s) 2016.

  10. The Brazil SimSmoke Policy Simulation Model: The Effect of Strong Tobacco Control Policies on Smoking Prevalence and Smoking-Attributable Deaths in a Middle Income Nation

    PubMed Central

    Levy, David; de Almeida, Liz Maria; Szklo, Andre

    2012-01-01

    Background Brazil has reduced its smoking rate by about 50% in the last 20 y. During that time period, strong tobacco control policies were implemented. This paper estimates the effect of these stricter policies on smoking prevalence and associated premature mortality, and the effect that additional policies may have. Methods and Findings The model was developed using the SimSmoke tobacco control policy model. Using policy, population, and smoking data for Brazil, the model assesses the effect on premature deaths of cigarette taxes, smoke-free air laws, mass media campaigns, marketing restrictions, packaging requirements, cessation treatment programs, and youth access restrictions. We estimate the effect of past policies relative to a counterfactual of policies kept to 1989 levels, and the effect of stricter future policies. Male and female smoking prevalence in Brazil have fallen by about half since 1989, which represents a 46% (lower and upper bounds: 28%–66%) relative reduction compared to the 2010 prevalence under the counterfactual scenario of policies held to 1989 levels. Almost half of that 46% reduction is explained by price increases, 14% by smoke-free air laws, 14% by marketing restrictions, 8% by health warnings, 6% by mass media campaigns, and 10% by cessation treatment programs. As a result of the past policies, a total of almost 420,000 (260,000–715,000) deaths had been averted by 2010, increasing to almost 7 million (4.5 million–10.3 million) deaths projected by 2050. Comparing future implementation of a set of stricter policies to a scenario with 2010 policies held constant, smoking prevalence by 2050 could be reduced by another 39% (29%–54%), and 1.3 million (0.9 million–2.0 million) out of 9 million future premature deaths could be averted. Conclusions Brazil provides one of the outstanding public health success stories in reducing deaths due to smoking, and serves as a model for other low and middle income nations. However, a set of stricter policies could further reduce smoking and save many additional lives. Please see later in the article for the Editors' Summary PMID:23139643

  11. The Brazil SimSmoke policy simulation model: the effect of strong tobacco control policies on smoking prevalence and smoking-attributable deaths in a middle income nation.

    PubMed

    Levy, David; de Almeida, Liz Maria; Szklo, Andre

    2012-01-01

    Brazil has reduced its smoking rate by about 50% in the last 20 y. During that time period, strong tobacco control policies were implemented. This paper estimates the effect of these stricter policies on smoking prevalence and associated premature mortality, and the effect that additional policies may have. The model was developed using the SimSmoke tobacco control policy model. Using policy, population, and smoking data for Brazil, the model assesses the effect on premature deaths of cigarette taxes, smoke-free air laws, mass media campaigns, marketing restrictions, packaging requirements, cessation treatment programs, and youth access restrictions. We estimate the effect of past policies relative to a counterfactual of policies kept to 1989 levels, and the effect of stricter future policies. Male and female smoking prevalence in Brazil have fallen by about half since 1989, which represents a 46% (lower and upper bounds: 28%-66%) relative reduction compared to the 2010 prevalence under the counterfactual scenario of policies held to 1989 levels. Almost half of that 46% reduction is explained by price increases, 14% by smoke-free air laws, 14% by marketing restrictions, 8% by health warnings, 6% by mass media campaigns, and 10% by cessation treatment programs. As a result of the past policies, a total of almost 420,000 (260,000-715,000) deaths had been averted by 2010, increasing to almost 7 million (4.5 million-10.3 million) deaths projected by 2050. Comparing future implementation of a set of stricter policies to a scenario with 2010 policies held constant, smoking prevalence by 2050 could be reduced by another 39% (29%-54%), and 1.3 million (0.9 million-2.0 million) out of 9 million future premature deaths could be averted. Brazil provides one of the outstanding public health success stories in reducing deaths due to smoking, and serves as a model for other low and middle income nations. However, a set of stricter policies could further reduce smoking and save many additional lives. Please see later in the article for the Editors' Summary.

  12. Confronting Equity Issues on Campus: Implementing the Equity Scorecard in Theory and Practice

    ERIC Educational Resources Information Center

    Bensimon, Estela Mara, Ed.; Malcom, Lindsey, Ed.

    2012-01-01

    How can it be that 50 years after the passage of the Civil Rights Act, our institutions of higher education have still not found ways of reducing the higher education gaps for racial and ethnic groups? That is the question that informs and animates the Equity Scorecard model of organizational change. It shifts institutions' focus from what…

  13. Assessing climate change impacts on winter cover crop nitrate uptake efficiency on the coastal plain of the Chesapeake Bay watershed using the SWAT model

    USDA-ARS?s Scientific Manuscript database

    Climate change is expected to exacerbate water quality degradation in the Chesapeake Bay watershed (CBW). Winter cover crops (WCCs) have been widely implemented in this region owing to their high effectiveness at reducing nitrate loads. However, little is known about climate change impacts on the ef...

  14. Modeling potential outcomes of fire and fuel management scenarios on the structure of forested habitats in northeast Oregon, USA.

    Treesearch

    B.C. Wales; L.H. Suring; M.A. Hemstrom

    2007-01-01

    Thinning and prescribed fire are being used extensively across the interior Western United States to reduce the risk of large, severe wildfires. However, the full ecological consequences of implementing these management practices on the landscape have not been completely evaluated. We projected future vegetation trends resulting from four management scenarios and...

  15. Implementation of Medicaid Managed Long-Term Services and Supports for Adults with Intellectual And/Or Developmental Disabilities in Kansas

    ERIC Educational Resources Information Center

    Williamson, Heather J.; Perkins, Elizabeth A.; Levin, Bruce L.; Baldwin, Julie A.; Lulinski, Amie; Armstrong, Mary I.; Massey, Oliver T.

    2017-01-01

    Many adults with intellectual and/or developmental disabilities (IDD) can access health and long-term services and supports (LTSS) through Medicaid. States are reforming their Medicaid LTSS programs from a fee-for-service model to a Medicaid managed LTSS (MLTSS) approach, anticipating improved quality of care and reduced costs, although there is…

  16. Implementing Trauma-Informed Treatment for Youth in a Residential Facility: First-Year Outcomes

    ERIC Educational Resources Information Center

    Greenwald, Ricky; Siradas, Lynn; Schmitt, Thomas A.; Reslan, Summar; Fierle, Julia; Sande, Brad

    2012-01-01

    Training in the Fairy Tale model of trauma-informed treatment was provided to clinical and direct care staff working with 53 youth in a residential treatment facility. Compared to the year prior to training, in the year of the training the average improvement in presenting problems was increased by 34%, time to discharge was reduced by 39%, and…

  17. A Culturally Appropriate School Wellness Initiative: Results of a 2-Year Pilot Intervention in 2 Jewish Schools

    ERIC Educational Resources Information Center

    Benjamins, Maureen R.; Whitman, Steven

    2010-01-01

    Background: Despite the growing number of school-based interventions designed to reduce childhood obesity or otherwise promote health, no models or materials were found for Jewish schools. The current study describes an effort within a Jewish school system in Chicago to create, implement, and evaluate a school-based intervention tailored to the…

  18. 75 FR 53613 - Notice of Data Availability Supporting Federal Implementation Plans To Reduce Interstate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-01

    ... files should avoid the use of special characters, avoid any form of encryption, and be free of any... modeling. This includes both units affected by the Proposed Transport Rule and other EGUs (e.g. fossil-fired units smaller than 25 MWe, non-fossil-fired units, and fossil-fired units 25 MWe or greater in...

  19. Optimising fuel treatments over time and space

    Treesearch

    Woodam Chung; Greg Jones; Kurt Krueger; Jody Bramel; Marco Contreras

    2013-01-01

    Fuel treatments have been widely used as a tool to reduce catastrophic wildland fire risks in many forests around the world. However, it is a challenging task for forest managers to prioritise where, when and how to implement fuel treatments across a large forest landscape. In this study, an optimisation model was developed for long-term fuel management decisions at a...

  20. Consistency.

    PubMed

    Levin, Roger

    2005-09-01

    Consistency is a reflection of having the right model, the right systems and the right implementation. As Vince Lombardi, the legendary coach of the Green Bay Packers, once said, "You don't do things right once in a while. You do them right all the time." To provide the ultimate level of patient care, reduce stress for the dentist and staff members and ensure high practice profitability, consistency is key.

  1. Implementation of a Model-Tracing-Based Learning Diagnosis System to Promote Elementary Students' Learning in Mathematics

    ERIC Educational Resources Information Center

    Chu, Yian-Shu; Yang, Haw-Ching; Tseng, Shian-Shyong; Yang, Che-Ching

    2014-01-01

    Of all teaching methods, one-to-one human tutoring is the most powerful method for promoting learning. To achieve this aim and reduce teaching load, researchers developed intelligent tutoring systems (ITSs) to employ one-to-one tutoring (Aleven, McLaren, & Sewall, 2009; Aleven, McLaren, Sewall, & Koedinger, 2009; Anderson, Corbett,…

  2. A fuel treatment reduces fire severity and increases suppression efficiency in a mixed conifer forest

    Treesearch

    Jason J. Moghaddas; Larry Craggs

    2007-01-01

    Fuel treatments are being implemented on public and private lands across the western United States. Although scientists and managers have an understanding of how fuel treatments can modify potential fire behaviour under modelled conditions, there is limited information on how treatments perform under real wildfire conditions in Sierran mixed conifer forests. The Bell...

  3. A Comparison of Multivariable Control Design Techniques for a Turbofan Engine Control

    NASA Technical Reports Server (NTRS)

    Garg, Sanjay; Watts, Stephen R.

    1995-01-01

    This paper compares two previously published design procedures for two different multivariable control design techniques for application to a linear engine model of a jet engine. The two multivariable control design techniques compared were the Linear Quadratic Gaussian with Loop Transfer Recovery (LQG/LTR) and the H-Infinity synthesis. The two control design techniques were used with specific previously published design procedures to synthesize controls which would provide equivalent closed loop frequency response for the primary control loops while assuring adequate loop decoupling. The resulting controllers were then reduced in order to minimize the programming and data storage requirements for a typical implementation. The reduced order linear controllers designed by each method were combined with the linear model of an advanced turbofan engine and the system performance was evaluated for the continuous linear system. Included in the performance analysis are the resulting frequency and transient responses as well as actuator usage and rate capability for each design method. The controls were also analyzed for robustness with respect to structured uncertainties in the unmodeled system dynamics. The two controls were then compared for performance capability and hardware implementation issues.
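
    The controller order-reduction step mentioned above can be illustrated with square-root balanced truncation; the sketch below applies it to a random stable state-space model using SciPy, since the paper's actual LQG/LTR and H-Infinity controllers are not reproduced here. The system dimensions and the reduced order are arbitrary choices.

    ```python
    # Square-root balanced truncation of a random stable state-space model (SciPy).
    import numpy as np
    from scipy.linalg import cholesky, solve_continuous_lyapunov, svd

    rng = np.random.default_rng(3)
    n, r = 8, 3                                   # full and reduced orders
    A = rng.normal(size=(n, n))
    A -= (np.max(np.linalg.eigvals(A).real) + 1.0) * np.eye(n)   # shift to make A stable
    B = rng.normal(size=(n, 2))
    C = rng.normal(size=(2, n))

    # Controllability/observability Gramians: A P + P A' = -B B',  A' Q + Q A = -C' C
    P = solve_continuous_lyapunov(A, -B @ B.T)
    Q = solve_continuous_lyapunov(A.T, -C.T @ C)
    P, Q = (P + P.T) / 2, (Q + Q.T) / 2           # symmetrize against round-off

    # Square-root balancing: Hankel singular values and the balancing transform T.
    Lc, Lo = cholesky(P, lower=True), cholesky(Q, lower=True)
    U, s, Vt = svd(Lo.T @ Lc)
    T = Lc @ Vt.T @ np.diag(s ** -0.5)
    Tinv = np.diag(s ** -0.5) @ U.T @ Lo.T

    # Keep the r states with the largest Hankel singular values.
    Ar, Br, Cr = (Tinv @ A @ T)[:r, :r], (Tinv @ B)[:r, :], (C @ T)[:, :r]
    print("Hankel singular values:", np.round(s, 3))
    print("reduced A shape:", Ar.shape)
    ```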

  4. Does it make sense to modify tropical cyclones? A decision-analytic assessment.

    PubMed

    Klima, Kelly; Morgan, M Granger; Grossmann, Iris; Emanuel, Kerry

    2011-05-15

    Recent dramatic increases in damages caused by tropical cyclones (TCs) and improved understanding of TC physics have led DHS to fund research on intentional hurricane modification. We present a decision analytic assessment of whether it is potentially cost-effective to attempt to lower the wind speed of TCs approaching South Florida by reducing sea surface temperatures with wind-wave pumps. Using historical data on hurricanes approaching South Florida, we develop prior probabilities of how storms might evolve. The effects of modification are estimated using a modern TC model. The FEMA HAZUS-MH MR3 damage model and census data on the value of property at risk are used to estimate expected economic losses. We compare wind damages after storm modification with damages after implementing hardening strategies protecting buildings. We find that if it were feasible and properly implemented, modification could reduce net losses from an intense storm more than hardening structures. However, hardening provides "fail safe" protection for average storms that might not be achieved if the only option were modification. The effect of natural variability is larger than that of either strategy. Damage from storm surge is modest in the scenario studied but might be abated by modification.

  5. Prevention of hospital-onset Clostridium difficile infection in the New York metropolitan region using a collaborative intervention model.

    PubMed

    Koll, Brian S; Ruiz, Rafael E; Calfee, David P; Jalon, Hillary S; Stricof, Rachel L; Adams, Audrey; Smith, Barbara A; Shin, Gina; Gase, Kathleen; Woods, Maria K; Sirtalan, Ismail

    2014-01-01

    The incidence, severity, and associated costs of Clostridium difficile (C. difficile) infection (CDI) have dramatically increased in hospitals over the past decade, indicating an urgent need for strategies to prevent transmission of C. difficile. This article describes a multifaceted collaborative approach to reduce hospital-onset CDI rates in 35 acute care hospitals in the New York metropolitan region. Hospitals participated in a comprehensive CDI reduction intervention and formed interdisciplinary teams to coordinate their efforts. Standardized clinical infection prevention and environmental cleaning protocols were implemented and monitored using checklists. Monthly data reports were provided to hospitals for facility-specific performance evaluation and comparison to aggregate data from all participants. Hospitals also participated in monthly teleconferences to review data and highlight successes, challenges, and strategies to reduce CDI. Incidence of hospital-onset CDI per 10,000 patient days was the primary outcome measure. Additionally, the incidence of nonhospital-associated, community-onset, hospital-associated, and recurrent CDIs were measured. The use of a collaborative model to implement a multifaceted infection prevention strategy was temporally associated with a significant reduction in hospital-onset CDI rates in participating New York metropolitan regional hospitals. © 2013 National Association for Healthcare Quality.

  6. Patient-centered health care using pharmacist-delivered medication therapy management in rural Mississippi.

    PubMed

    Ross, Leigh Ann; Bloodworth, Lauren S

    2012-01-01

    To describe and provide preliminary clinical and economic outcomes from a pharmacist-delivered patient-centered health care (PCHC) model implemented in the Mississippi Delta. Mississippi between July 2008 and June 2010. 13 community pharmacies in nine Mississippi Delta counties. This PCHC model implements a comprehensive medication therapy management (MTM) program with pharmacist training, individualized patient encounters and group education, provider outreach, integration of pharmacists into health information technology, and on-site support in community pharmacies in a medically underserved region with a large burden of chronic disease and health disparities. The program also expands on traditional MTM services through initiatives in health literacy/cultural competency and efforts to increase the provider network and improve access to care. Criteria-based clinical outcomes, quality indicator reports, cost avoidance. PCHC services have been implemented in 13 pharmacies in nine counties in this underserved region, and 78 pharmacists and 177 students have completed the American Pharmacists Association's MTM Certificate Training Program. Preliminary data from 468 patients showed 681 encounters in which 1,471 drug therapy problems were identified and resolved. Preliminary data for clinical indicators and economic outcome measures are trending in a positive direction. Preliminary data analyses suggest that pharmacist-provided PCHC is beneficial and has the potential to be replicated in similar rural communities that are plagued with chronic disease and traditional primary care provider shortages. This effort aligns with national priorities to reduce medication errors, improve health outcomes, and reduce health care costs in underserved communities.

  7. Advanced musculoskeletal physiotherapists in post arthroplasty review clinics: a state wide implementation program evaluation.

    PubMed

    Harding, Paula; Burge, Angela; Walter, Kerrie; Shaw, Bridget; Page, Carolyn; Phan, Uyen; Terrill, Desiree; Liew, Susan

    2018-03-01

    To evaluate outcomes following a state-wide implementation of post arthroplasty review (PAR) clinics for patients following total hip and knee arthroplasty, led by advanced musculoskeletal physiotherapists in collaboration with orthopaedic specialists. A prospective observational study analysed data collected by 10 implementation sites (five metropolitan and five regional/rural centres) between September 2014 and June 2015. The Victorian Innovation and Reform Impact Assessment Framework was used to assess efficiency, effectiveness (access to care, safety and quality, workforce capacity, utilisation of skill sets, patient and workforce satisfaction) and sustainability (stakeholder engagement, succession planning and availability of ongoing funding). A total of 2362 planned occasions of service (OOS) were provided for 2057 patients. Reduced patient wait times from referral to appointment were recorded and no adverse events occurred. The average cost saving across the 10 sites was AUD$38 per OOS (baseline $63, PAR clinic $35), representing a reduced pathway cost of 44%. The average annual predicted total value of increased orthopaedic specialist capacity was $11,950 per PAR clinic (range $6149 to $23,400). The Australian Orthopaedic Association review guidelines were met (8/10 sites, 80%) and patient-reported outcome measures were introduced as routine clinical care. Workforce and patient satisfaction were high. Eighteen physiotherapists were trained, creating a sustainable workforce. Eight sites secured ongoing funding. The PAR clinics delivered a safe, cost-efficient model of care that improved patient access and quality of care compared with traditional specialist-led workforce models. Crown Copyright © 2017. Published by Elsevier Ltd. All rights reserved.

  8. Multi-GPU Accelerated Admittance Method for High-Resolution Human Exposure Evaluation.

    PubMed

    Xiong, Zubiao; Feng, Shi; Kautz, Richard; Chandra, Sandeep; Altunyurt, Nevin; Chen, Ji

    2015-12-01

    A multi-graphics processing unit (GPU) accelerated admittance method solver is presented for solving the induced electric field in high-resolution anatomical models of the human body exposed to external low-frequency magnetic fields. In the solver, the anatomical model is discretized as a three-dimensional network of admittances. The conjugate orthogonal conjugate gradient (COCG) iterative algorithm is employed to take advantage of the symmetric property of the complex-valued linear system of equations. Compared against the widely used biconjugate gradient stabilized method, the COCG algorithm reduces the solution time by a factor of 3.5 and the storage requirement by about 40%. The iterative algorithm is then accelerated further by using multiple NVIDIA GPUs. The computations and data transfers between GPUs are overlapped in time using an asynchronous concurrent execution design. The communication overhead is well hidden, so the acceleration is nearly linear in the number of GPU cards. Numerical examples show that our GPU implementation running on four NVIDIA Tesla K20c cards can run up to 90 times faster than the CPU implementation running on eight CPU cores (two Intel Xeon E5-2603 processors). The implemented solver is able to solve large-dimensional problems efficiently: a whole adult body discretized at 1-mm resolution can be solved in just a few minutes. The high efficiency achieved makes it practical to investigate human exposure involving a large number of cases at a resolution that meets the requirements of international dosimetry guidelines.
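
    The linear-algebra kernel named above can be illustrated compactly. The following NumPy sketch shows one standard formulation of the COCG iteration for a complex symmetric system; it is a stand-in under stated assumptions, not the paper's solver: the admittance-network matrix, GPU offloading, and multi-GPU overlap are omitted, and the small dense test matrix is purely hypothetical.

```python
import numpy as np

def cocg(A, b, x0=None, tol=1e-8, max_iter=500):
    """COCG for a complex symmetric matrix A (A == A.T, not necessarily Hermitian).

    Identical in structure to ordinary CG, except that the unconjugated
    bilinear form r.T @ r replaces the Hermitian inner product.
    """
    x = np.zeros_like(b, dtype=complex) if x0 is None else x0.astype(complex)
    r = b - A @ x
    p = r.copy()
    rho = r @ r                      # unconjugated: np.dot does not conjugate
    b_norm = np.linalg.norm(b)
    for k in range(1, max_iter + 1):
        Ap = A @ p
        alpha = rho / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) <= tol * b_norm:
            return x, k
        rho_new = r @ r
        p = r + (rho_new / rho) * p
        rho = rho_new
    return x, max_iter

if __name__ == "__main__":
    # Hypothetical small complex symmetric test system (not an admittance network).
    rng = np.random.default_rng(0)
    n = 100
    M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    A = np.eye(n) + 0.003 * (M + M.T)        # complex symmetric, well conditioned
    b = rng.standard_normal(n) + 1j * rng.standard_normal(n)
    x, iters = cocg(A, b)
    print(iters, np.linalg.norm(A @ x - b))
```

    In the article it is the expensive step, the matrix-vector product over the admittance network, that is distributed across GPU cards with transfers overlapped against computation; the sketch above only shows the serial iteration.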

  9. Fluid-acoustic interactions and their impact on pathological voiced speech

    NASA Astrophysics Data System (ADS)

    Erath, Byron D.; Zanartu, Matias; Peterson, Sean D.; Plesniak, Michael W.

    2011-11-01

    Voiced speech is produced by vibration of the vocal fold structures. Vocal fold dynamics arise from aerodynamic pressure loadings, tissue properties, and acoustic modulation of the driving pressures. Recent speech science advancements have produced a physiologically realistic fluid flow solver (BLEAP) capable of prescribing asymmetric intraglottal flow attachment that can be easily assimilated into reduced order models of speech. The BLEAP flow solver is extended to incorporate acoustic loading and sound propagation in the vocal tract by implementing a wave reflection analog approach for sound propagation based on the governing BLEAP equations. This enhanced physiological description of the physics of voiced speech is implemented in a two-mass model of speech. The impact of fluid-acoustic interactions on vocal fold dynamics is elucidated for both normal and pathological speech through linear and nonlinear analysis techniques. Supported by NSF Grant CBET-1036280.

  10. Teaching foster grandparents to train severely handicapped persons.

    PubMed Central

    Fabry, P L; Reid, D H

    1978-01-01

    Five foster grandparents were taught training skills for use in their daily interactions with severely handicapped persons in an institution. Following baseline, specific teaching procedures consisting of teacher instructions, prompts, modelling, and praise were implemented. The grandparents' frequency of training three skill areas increased as the specific teaching was implemented in multiple-baseline format. The total amount of training continued as teacher instructions, prompts, and modelling were terminated and praise continued, although the grandparents spent their training time emphasizing only two of the three skill areas. Teacher presence was gradually reduced over an 11-week period, with no decrease in grandparents' frequency of training. Four of the foster grandchildren, all profoundly retarded and multiply handicapped, demonstrated progress throughout the study. Results were discussed in light of the available contributions of foster grandparents in institutional settings and maintenance of staff training. PMID:148446

  11. Effects of tobacco control policies on smoking prevalence and tobacco-attributable deaths in Mexico: the SimSmoke model.

    PubMed

    Reynales-Shigematsu, Luz Myriam; Fleischer, Nancy L; Thrasher, James F; Zhang, Yian; Meza, Rafael; Cummings, K Michael; Levy, David T

    2015-10-01

    To examine how policies adopted in Mexico in response to the Framework Convention on Tobacco Control affected smoking prevalence and smoking-attributable deaths. The SimSmoke simulation model of tobacco control policy is applied to Mexico. This discrete time, first-order Markov model uses data on population size, smoking rates and tobacco control policy for Mexico. It assesses, individually and jointly, the effects of seven types of policies: cigarette taxes, smoke-free air laws, mass media campaigns, advertising bans, warning labels, cessation treatment, and youth tobacco access policies. The Mexico SimSmoke model estimates that smoking rates have been reduced by about 30% as a result of policies implemented since 2002, and that the number of smoking-attributable deaths will have been reduced by about 826 000 by 2053. Increases in cigarette prices are responsible for over 60% of the reductions, but health warnings, smoke-free air laws, marketing restrictions and cessation treatments also play important roles. Mexico has shown steady progress towards reducing smoking prevalence in a short period of time, as have other Latin American countries, such as Brazil, Panama and Uruguay. Tobacco control policies play an important role in continued efforts to reduce tobacco use and associated deaths in Mexico.
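
    As a highly simplified illustration of the kind of discrete-time, first-order Markov projection described above, the sketch below evolves smoking prevalence under hypothetical initiation and cessation rates, with a policy package crudely modeled as a relative shift in both; none of these numbers are the calibrated SimSmoke inputs for Mexico.

```python
def project_prevalence(prev0, years, init_rate, cess_rate, policy_effect=0.0):
    """Discrete-time, first-order Markov projection of smoking prevalence.

    Each year a fraction of non-smokers initiates and a fraction of current
    smokers quits; a policy package is modeled as a relative decrease in
    initiation and a relative increase in cessation (both hypothetical).
    """
    init = init_rate * (1.0 - policy_effect)
    cess = cess_rate * (1.0 + policy_effect)
    series = [prev0]
    for _ in range(years):
        prev = series[-1]
        series.append(prev + init * (1.0 - prev) - cess * prev)
    return series

# Hypothetical illustration: status quo vs. a combined policy package.
baseline = project_prevalence(prev0=0.16, years=20, init_rate=0.012, cess_rate=0.04)
with_policy = project_prevalence(prev0=0.16, years=20, init_rate=0.012,
                                 cess_rate=0.04, policy_effect=0.3)
print(f"prevalence after 20 years: {baseline[-1]:.3f} (baseline) "
      f"vs {with_policy[-1]:.3f} (with policies)")
```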

  12. A Single-blinded, Randomized Clinical Trial of How to Implement an Evidence-based Treatment for Generalized Anxiety Disorder [IMPLEMENT]--Effects of Three Different Strategies of Implementation.

    PubMed

    Flückiger, Christoph; Forrer, Lena; Schnider, Barbara; Bättig, Isabelle; Bodenmann, Guy; Zinbarg, Richard E

    2016-01-01

    Despite long-standing calls to disseminate evidence-based treatments for generalized anxiety disorder (GAD), modest progress has been made in the study of how such treatments should be implemented. The primary objective of this study was to test three competing strategies for implementing a cognitive behavioral treatment (CBT) for out-patients with GAD (i.e., a comparison of one compensation model vs. two capitalization models). For our three-arm, single-blinded, randomized controlled trial (implementation of CBT for GAD [IMPLEMENT]), we recruited adults with GAD using advertisements in high-circulation newspapers to participate in a 14-session cognitive behavioral treatment (Mastery of your Anxiety and Worry, MAW-packet). We randomly assigned eligible patients using a full randomization procedure (1:1:1) to three different conditions of implementation: adherence priming (compensation model), which had a systematized focus on patients' individual GAD symptoms and how to compensate for these symptoms within the MAW-packet, and resource priming and supportive resource priming (capitalization models), which had systematized focuses on patients' strengths and abilities and how these strengths could be capitalized on within the same packet. In the intention-to-treat population, an outcome composite of primary and secondary symptom-related self-report questionnaires was analyzed using a hierarchical linear growth model from intake to the 6-month follow-up assessment. This trial is registered at ClinicalTrials.gov (identifier: NCT02039193) and is closed to new participants. Between June 2012 and November 2014, 411 participants were screened, and 57 eligible participants were recruited and randomly assigned to the three conditions. Forty-nine patients (86%) provided outcome data at post-assessment (14% dropout rate). All three conditions showed a highly significant reduction of symptoms over time; however, compared with the adherence priming condition, both resource priming conditions showed faster symptom reduction. Observer ratings of a sub-sample of recorded videos (n = 100) showed that therapists in the resource priming conditions conducted more strength-oriented interventions than in the adherence priming condition. No patients died or attempted suicide. To our knowledge, this is the first trial that focuses on capitalization and compensation models during the implementation of one prescriptive treatment packet for GAD. We have shown that GAD-related symptoms were reduced significantly faster in the resource priming conditions, although the limitations of our study include a well-educated population. If replicated, our results suggest that therapists who implement a mental health treatment for GAD might profit from a systematized focus on capitalization models. Funded by the Swiss National Science Foundation (SNSF-Nr. PZ00P1_136937/1), awarded to CF.
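
    As a rough illustration of the kind of hierarchical linear growth model named above (not the authors' analysis code), the sketch below fits a random-intercept, random-slope mixed model with a time-by-condition interaction to simulated repeated-measures data; the variable names, effect sizes, and sample layout are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)

# Simulated repeated-measures data: 60 hypothetical patients, 4 assessments,
# three implementation conditions with slightly different symptom slopes.
conditions = ["adherence", "resource", "supportive_resource"]
mean_slope = {"adherence": -1.0, "resource": -1.6, "supportive_resource": -1.5}
rows = []
for pid in range(60):
    cond = conditions[pid % 3]
    intercept = 20.0 + rng.normal(0.0, 3.0)
    slope = mean_slope[cond] + rng.normal(0.0, 0.3)
    for t in range(4):
        rows.append({"patient": pid, "condition": cond, "time": t,
                     "symptoms": intercept + slope * t + rng.normal(0.0, 2.0)})
df = pd.DataFrame(rows)

# Hierarchical linear growth model: fixed time-by-condition effects plus
# random intercepts and slopes for each patient.
model = smf.mixedlm("symptoms ~ time * condition", df,
                    groups=df["patient"], re_formula="~time")
print(model.fit().summary())
```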

  13. An Extended Kalman Filter to Assimilate Altimetric Data into a Non-Linear Model of the Tropical Pacific

    NASA Technical Reports Server (NTRS)

    Gourdeau, L.; Verron, J.; Murtugudde, R.; Busalacchi, A. J.

    1997-01-01

    A new implementation of the extended Kalman filter is developed for the purpose of assimilating altimetric observations into a primitive equation model of the tropical Pacific. Its distinctive feature is that the errors are defined in a reduced basis that evolves in time with the model dynamics. Validation by twin experiments is conducted, and the method is shown to be efficient under quasi-real conditions. Data from the first 2 years of the Topex/Poseidon mission are assimilated into the Gent & Cane [1989] model. Assimilation results are evaluated against independent in situ data, namely TAO mooring observations.
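
    To make the reduced-basis idea concrete, here is a minimal NumPy sketch of a single Kalman-type analysis step in which the forecast error covariance is carried in the low-rank form P = L @ U @ L.T, so the correction stays in the span of the basis L. The dimensions, observation operator, and error statistics are hypothetical stand-ins, not the primitive equation model or the altimetric data.

```python
import numpy as np

def reduced_basis_analysis(x_f, L, U, H, R, y):
    """One Kalman analysis step with forecast covariance P = L @ U @ L.T.

    L (n x r) is the reduced error basis and U (r x r) the error covariance
    in that basis, so the correction is confined to span(L).
    """
    HL = H @ L                                   # observation operator in the basis
    S = HL @ U @ HL.T + R                        # innovation covariance
    K = L @ U @ HL.T @ np.linalg.inv(S)          # gain, restricted to span(L)
    x_a = x_f + K @ (y - H @ x_f)                # analysed state
    U_a = np.linalg.inv(np.linalg.inv(U) + HL.T @ np.linalg.solve(R, HL))
    return x_a, U_a

# Hypothetical toy sizes: state of 500, error basis of rank 10, 40 observations.
rng = np.random.default_rng(2)
n, r, p = 500, 10, 40
L = np.linalg.qr(rng.standard_normal((n, r)))[0]     # orthonormal reduced basis
U = np.eye(r)
H = np.zeros((p, n))
H[np.arange(p), rng.choice(n, size=p, replace=False)] = 1.0   # point observations
obs_std = 0.1
R = obs_std ** 2 * np.eye(p)
x_truth = rng.standard_normal(n)
x_f = x_truth + L @ rng.standard_normal(r)           # forecast error lies in span(L)
y = H @ x_truth + rng.normal(0.0, obs_std, p)
x_a, _ = reduced_basis_analysis(x_f, L, U, H, R, y)
print("forecast error:", np.linalg.norm(x_f - x_truth),
      "analysis error:", np.linalg.norm(x_a - x_truth))
```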

  14. Rapid prototyping and AI programming environments applied to payload modeling

    NASA Technical Reports Server (NTRS)

    Carnahan, Richard S., Jr.; Mendler, Andrew P.

    1987-01-01

    This effort focused on using artificial intelligence (AI) programming environments and rapid prototyping to aid in both manned and unmanned space flight payload simulation and training. The significant problems addressed are the large amount of development time required to design and implement just one of these payload simulations and the relative inflexibility of the resulting model in accepting future modifications. The results of this effort suggest that both rapid prototyping and AI programming environments can significantly reduce development time and cost when applied to the domain of payload modeling for crew training. The techniques employed are applicable to a variety of domains where models or simulations are required.

  15. A random rule model of surface growth

    NASA Astrophysics Data System (ADS)

    Mello, Bernardo A.

    2015-02-01

    Stochastic models of surface growth are usually based on randomly choosing a substrate site at which to perform iterative steps, as in the etching model of Mello et al. (2001) [5]. In this paper I modify the etching model to perform a sequential, instead of random, substrate scan. The randomness is introduced not in the site selection but in the choice of the rule to be followed at each site. The change benefits the study of dynamic and asymptotic properties by reducing the finite-size effect and the short-time anomaly and by increasing the saturation time. It also has computational benefits: better use of the cache memory and the possibility of a parallel implementation.
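
    For readers unfamiliar with this family of models, the sketch below implements the conventional random-site, growth-formulation etching update in one dimension. It is a hedged illustration: the abstract does not specify the paper's rule set, so the sequential-scan, random-rule variant is only indicated in a comment, and the substrate size and run length are arbitrary.

```python
import numpy as np

def etching_step(h, rng):
    """One deposition event of the etching-type growth model (1D, periodic).

    A substrate site i is chosen at random; any neighbour lower than h[i] is
    raised to h[i], then h[i] itself grows by one unit.
    """
    L = h.size
    i = rng.integers(L)
    hi = h[i]
    for j in ((i - 1) % L, (i + 1) % L):
        if h[j] < hi:
            h[j] = hi
    h[i] = hi + 1

rng = np.random.default_rng(3)
L = 256
h = np.zeros(L, dtype=np.int64)
for _ in range(200 * L):              # roughly 200 deposited monolayers
    etching_step(h, rng)
# The paper's variant would instead sweep i = 0, 1, ..., L-1 in order and draw
# a random growth rule at each visited site (the rule set is not given here).
print("interface width:", h.std())
```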

  16. Faster and exact implementation of the continuous cellular automaton for anisotropic etching simulations

    NASA Astrophysics Data System (ADS)

    Ferrando, N.; Gosálvez, M. A.; Cerdá, J.; Gadea, R.; Sato, K.

    2011-02-01

    The current success of continuous cellular automata for the simulation of anisotropic wet chemical etching of silicon in microengineering applications is based on a relatively fast, approximate, constant time stepping (CTS) implementation, whose accuracy relative to the exact algorithm (a computationally slow, variable time stepping (VTS) implementation) has not previously been analyzed in detail. In this study we show that the CTS implementation can generate moderately wrong etch rates and overall etching fronts, thus justifying the presentation of a novel, exact reformulation of the VTS implementation based on a new state variable, referred to as the predicted removal time (PRT), and the use of a self-balanced binary search tree that enables storage of and efficient access to the PRT values in each time step in order to quickly remove the corresponding surface atom(s). The proposed PRT method reduces the simulation cost of the exact implementation from O(N^{5/3}) to O(N^{3/2} log N) without introducing any model simplifications. This enables more precise simulations (limited only by numerical precision errors) with affordable computational times that are similar to those of the less precise CTS implementation, and even faster for low-reactivity systems.
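
    To illustrate the data-structure idea only (none of the silicon etch chemistry), the sketch below keys hypothetical surface cells by a predicted removal time, repeatedly removes the earliest one, and advances the clock in variable steps. Python's heapq is used as a simple stand-in for the self-balanced binary search tree described in the abstract; the cell rates are made up.

```python
import heapq
import random

# Hypothetical surface cells with constant removal rates (arbitrary units); the
# predicted removal time (PRT) of a cell is when its material runs out at its rate.
random.seed(4)
cells = {cid: {"material": 1.0, "rate": random.uniform(0.5, 2.0)} for cid in range(10)}

now = 0.0
heap = [(now + c["material"] / c["rate"], cid) for cid, c in cells.items()]
heapq.heapify(heap)                      # earliest PRT always at the root

removal_order = []
while heap:
    prt, cid = heapq.heappop(heap)       # next cell predicted to be removed
    now = prt                            # variable time step: jump straight to it
    removal_order.append((round(now, 3), cid))
    # A full simulator would now expose the neighbours of `cid`, recompute their
    # PRTs from the local chemistry, and re-key them in the search tree.

print(removal_order[:5])
```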

  17. Implementation Science for the Environment.

    PubMed

    Hering, Janet G

    2018-05-15

    The establishment of the field of implementation science was motivated by the understanding that medical and health research alone is insufficient to generate better health outcomes. With strong support from funding agencies for medical research, implementation science promotes the application of a structured framework or model in the implementation of research-based results, specifically evidence-based practices (EBPs). Furthermore, explicit consideration is given to the context of EBP implementation (i.e., socio-economic, political, cultural, and institutional factors that could affect the implementation process). Finally, implementation is monitored in a robust and rigorous way. Today, the field of implementation science supports conferences and professional societies as well as one dedicated journal and numerous others with related content. The goal of these various activities is to reduce the estimated, average "bench to bedside" time lag of 17 years for uptake of EBPs from health research into routine practice. Despite similar time lags and impediments to uptake in the environmental domain, a parallel field of implementation science for the environment has not (yet) emerged. Although some parallels in needs and opportunities can easily be drawn between the health and environmental domains, a detailed mapping exercise is needed to understand which aspects of implementation science could be applied in the environmental domain either directly or in a modified form. This would allow an accelerated development of implementation science for the environment.

  18. Risk assessment and management of brucellosis in the southern greater Yellowstone area (II): Cost-benefit analysis of reducing elk brucellosis prevalence.

    PubMed

    Boroff, Kari; Kauffman, Mandy; Peck, Dannele; Maichak, Eric; Scurlock, Brandon; Schumaker, Brant

    2016-11-01

    Recent cases of bovine brucellosis (Brucella abortus) in cattle (Bos taurus) and domestic bison (Bison bison) of the southern Greater Yellowstone Area (SGYA) have been traced back to free-ranging elk (Cervus elaphus). Several management activities have been implemented to reduce brucellosis seroprevalence in elk, including test-and-slaughter, low-density feeding at elk winter feedgrounds, and elk vaccination. It is unclear which of these activities are most cost-effective at reducing the risk of elk transmitting brucellosis to cattle. In a companion paper, a stochastic risk model was used to translate a reduction in elk seroprevalence to a reduction in the risk of transmission to cattle. Here, we use those results to estimate the expected economic benefits and costs of reducing seroprevalence in elk using three different management activities: vaccination of elk with Brucella strain 19 (S19), low-density feeding of elk, and elk test-and-slaughter. Results indicate that the three elk management activities yield negative expected net benefits, ranging from -$2983 per year for low-density feeding to -$595,471 per year for test-and-slaughter. Society's risk preferences will determine whether strategies that generate small negative net benefit, such as low-density feeding, are worth implementing. However, activities with large negative net benefits, such as test-and-slaughter and S19 vaccination, are unlikely to be economically worthwhile. Given uncertainty about various model parameters, we identify some circumstances in which individual management activities might generate positive expected net benefit. Copyright © 2016 Elsevier B.V. All rights reserved.
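
    As a generic illustration of the expected net benefit calculation that underlies this kind of cost-benefit analysis, the sketch below draws outbreak costs from a simple stochastic model and subtracts the annual cost of a hypothetical management activity; all parameter values are made up and are not the SGYA estimates.

```python
import random

def expected_net_benefit(annual_cost, p_outbreak, risk_reduction,
                         cost_low, cost_high, n_draws=100_000, seed=5):
    """Monte Carlo expected net benefit of a risk-reduction activity.

    The benefit per draw is the avoided expected outbreak cost; every parameter
    here is a hypothetical placeholder.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_draws):
        outbreak_cost = rng.uniform(cost_low, cost_high)
        total += p_outbreak * risk_reduction * outbreak_cost - annual_cost
    return total / n_draws

# A cheap activity with a small risk reduction vs. an expensive, aggressive one.
print(expected_net_benefit(annual_cost=5_000, p_outbreak=0.02, risk_reduction=0.10,
                           cost_low=200_000, cost_high=600_000))
print(expected_net_benefit(annual_cost=400_000, p_outbreak=0.02, risk_reduction=0.60,
                           cost_low=200_000, cost_high=600_000))
```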

  19. Symplectic modeling of beam loading in electromagnetic cavities

    DOE PAGES

    Abell, Dan T.; Cook, Nathan M.; Webb, Stephen D.

    2017-05-22

    Simulating beam loading in radio frequency accelerating structures is critical for understanding higher-order mode effects on beam dynamics, such as the beam break-up instability in energy recovery linacs. Full-wave simulations of beam loading in radio frequency structures are computationally expensive, while reduced models can ignore essential physics and can be difficult to generalize. Here, we present a self-consistent algorithm derived from the least-action principle that can model an arbitrary number of cavity eigenmodes with a generic beam distribution. It has been implemented in our new Open Library for Investigating Vacuum Electronics (OLIVE).
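
    The sketch below is not the OLIVE algorithm; it only illustrates, under strong simplifying assumptions, how a single cavity eigenmode can be advanced with a symplectic (leapfrog) update and loaded by impulsive kicks from point bunches. The mode frequency, coupling constant, bunch charge, and spacing are all hypothetical.

```python
import math

# One cavity eigenmode treated as a harmonic oscillator, advanced with a
# symplectic kick-drift-kick (leapfrog) scheme and loaded by impulsive kicks
# from periodically arriving point bunches. All values are hypothetical.
w = 2.0 * math.pi * 1.3e9      # mode angular frequency [rad/s]
k = 1.0e6                      # beam-mode coupling constant (arbitrary units)
dt = 1.0e-11                   # time step [s]
bunch_spacing = 1.0e-9         # [s]
bunch_charge = 1.0e-10         # [C]

q, p = 0.0, 0.0                # mode coordinate and conjugate momentum
t, next_bunch = 0.0, 0.0
for _ in range(200_000):       # 2 microseconds of beam time
    p -= 0.5 * dt * w * w * q  # half kick
    q += dt * p                # drift
    p -= 0.5 * dt * w * w * q  # half kick
    t += dt
    if t >= next_bunch:        # impulsive beam-loading kick from a passing bunch
        p += k * bunch_charge
        next_bunch += bunch_spacing
print("mode amplitude after 2 us:", math.hypot(q, p / w))
```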

  20. A modular method for evaluating the performance of picture archiving and communication systems.

    PubMed

    Sanders, W H; Kant, L A; Kudrimoti, A

    1993-08-01

    Modeling can be used to predict the performance of picture archiving and communication system (PACS) configurations under various load conditions at an early design stage. This is important because choices made early in the design of a system can have a significant impact on the performance of the resulting implementation. Because PACS consist of many types of components, it is important to do such evaluations in a modular manner, so that alternative configurations and designs can be easily investigated. Stochastic activity networks (SANs) and reduced base model construction methods can aid in doing this. SANs are a model type particularly suited to the evaluation of systems in which several activities may be in progress concurrently, and each activity may affect the others through the results of its completion. Together with SANs, reduced base model construction methods provide a means to build highly modular models, in which models of particular components can be easily reused. In this article, we investigate the use of SANs and reduced base model construction techniques in evaluating PACS. Construction and solution of the models is done using UltraSAN, a graphic-oriented software tool for model specification, analysis, and simulation. The method is illustrated via the evaluation of a realistically sized PACS for a typical United States hospital of 300 to 400 beds, and the derivation of system response times and component utilizations.
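
    Stochastic activity networks and UltraSAN are not reproduced here; as a much simpler stand-in for the modular idea, the sketch below chains hypothetical PACS components as M/M/1 queues and reports per-component utilization and an approximate end-to-end response time for a given study arrival rate. The component names and service rates are made up.

```python
# Hypothetical PACS pipeline modeled as a chain of independent M/M/1 stages.
# Service rates are studies per hour each component can handle (made-up numbers).
components = {
    "acquisition": 40.0,
    "network": 120.0,
    "archive": 60.0,
    "display_workstation": 90.0,
}
arrival_rate = 30.0   # studies per hour offered to the whole pipeline

total_response_h = 0.0
for name, mu in components.items():
    rho = arrival_rate / mu                      # utilization of this component
    if rho >= 1.0:
        raise ValueError(f"{name} is saturated (utilization {rho:.2f})")
    time_in_system = 1.0 / (mu - arrival_rate)   # M/M/1 mean time in system [h]
    total_response_h += time_in_system
    print(f"{name:22s} utilization {rho:5.2f}  mean time {time_in_system * 60:6.2f} min")
print(f"approximate end-to-end response time: {total_response_h * 60:.1f} min")
```

    Swapping a component's service rate or adding a stage only changes the dictionary, which is the (much weakened) analogue of the reusable component models the article builds with SANs.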
