Sample records for Bornholm model set-up

  1. Bornholm disease in Upper Silesia

    PubMed Central

    Gibiński, Kornel; Makower, Henryk; Skurska, Zofia; Bara, Boleslaw; Sypułowa, Alicja

    1960-01-01

    Bornholm disease is generally attributed to infection with Coxsackie viruses of the B group, but in 1954-56 a number of sporadic cases occurred in Bytom, Upper Silesia, which were shown on virological examination to be caused by Coxsackie A4. In 1957, however, in the same area, an epidemic of Bornholm disease broke out for which Coxsackie B virus was clearly responsible. Re-examination of stocked material from the earlier sporadic cases to make sure that no B-type virus was present confirmed that these cases had been caused by A4. Clinically, the epidemic cases showed a preponderance of abdominal pains and comparatively infrequent chest pain, whereas the reverse was true of the sporadic cases; vomiting was also considerably less frequent in the sporadic than in the epidemic cases. PMID:13827939

  2. Early Cambrian wave-formed shoreline deposits: the Hardeberga Formation, Bornholm, Denmark

    NASA Astrophysics Data System (ADS)

    Clemmensen, Lars B.; Glad, Aslaug C.; Pedersen, Gunver K.

    2017-09-01

During the early Cambrian, the Danish island Bornholm was situated on the northern edge of the continent Baltica with palaeolatitudes of about 35°S. An early Cambrian (Terreneuvian) transgression inundated large areas of Baltica including Bornholm, creating shallow marine and coastline environments. During this period, wave-formed shoreline sediments (the Vik Member, Hardeberga Formation) were deposited on Bornholm and are presently exposed at Strøby quarry. The sediments consist of fine- and medium-grained quartz-cemented arenites in association with a few silt-rich mudstones. The presence of well-preserved subaqueous dunes and wave ripples indicates deposition in a wave-dominated upper shoreface (littoral zone) environment, and the presence of interference ripples indicates that the littoral zone environment experienced water level fluctuations due to tides and/or changing meteorological conditions. Discoidal structures (medusoids) are present in the quarry, but due to the relatively poor preservation of their fine-scale structures it is difficult to determine whether the discoids represent true medusae imprints or inorganic structures. The preservation of the shallow-water bedforms as well as the possible medusae imprints is related either to the formation of thin mud layers, formed during a period of calm water when winds blew offshore for an extended period, or to the growth of bacterial mats. The orientation of the wave-formed bedforms indicates a local palaeoshoreline trending NE-SW and facing a large ocean to the north.

  3. Setting up virgin stress conditions in discrete element models

    PubMed Central

    Rojek, J.; Karlis, G.F.; Malinowski, L.J.; Beer, G.

    2013-01-01

In the present work, a methodology for setting up virgin stress conditions in discrete element models is proposed. The developed algorithm is applicable to discrete or coupled discrete/continuum modeling of underground excavation employing the discrete element method (DEM). Since the DEM works with contact forces rather than stresses, there is a need to convert pre-excavation stresses into contact forces for the DEM model. Different possibilities of setting up virgin stress conditions in the DEM model are reviewed and critically assessed. Finally, a new method to obtain a discrete element model with contact forces equivalent to given macroscopic virgin stresses is proposed. The test examples presented show that good results may be obtained regardless of the shape of the DEM domain. PMID:27087731

  4. Setting up virgin stress conditions in discrete element models.

    PubMed

    Rojek, J; Karlis, G F; Malinowski, L J; Beer, G

    2013-03-01

In the present work, a methodology for setting up virgin stress conditions in discrete element models is proposed. The developed algorithm is applicable to discrete or coupled discrete/continuum modeling of underground excavation employing the discrete element method (DEM). Since the DEM works with contact forces rather than stresses, there is a need to convert pre-excavation stresses into contact forces for the DEM model. Different possibilities of setting up virgin stress conditions in the DEM model are reviewed and critically assessed. Finally, a new method to obtain a discrete element model with contact forces equivalent to given macroscopic virgin stresses is proposed. The test examples presented show that good results may be obtained regardless of the shape of the DEM domain.
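
The stress-to-force conversion the abstract describes rests, at bottom, on the Cauchy traction relation t = σ·n. The sketch below illustrates only that relation, not the authors' algorithm; the stress values, contact normal and contact area are invented for illustration.

```python
# Sketch: assigning a force to one DEM contact so that it is consistent
# with a target macroscopic (virgin) stress tensor. This illustrates the
# Cauchy relation t = sigma . n, with the traction integrated over a
# representative contact area. It is NOT the paper's calibration method.

def contact_force(sigma, normal, area):
    """Force on a contact with unit `normal` and representative `area`,
    from the 2x2 stress tensor `sigma` (compression negative)."""
    tx = sigma[0][0] * normal[0] + sigma[0][1] * normal[1]
    ty = sigma[1][0] * normal[0] + sigma[1][1] * normal[1]
    return (tx * area, ty * area)

# Hypothetical example: isotropic virgin stress of -1 MPa acting on a
# contact whose normal points along x, over a 1e-3 m^2 contact area.
sigma = [[-1.0e6, 0.0], [0.0, -1.0e6]]
f = contact_force(sigma, (1.0, 0.0), 1.0e-3)
```

Summing such forces over all contacts crossing a plane recovers the macroscopic traction on that plane, which is the consistency check the paper's test examples address.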

  5. [Individual indirect bonding technique (IIBT) using set-up model].

    PubMed

    Kyung, H M

    1989-01-01

There has been much progress in the edgewise appliance since E.H. Angle. One of the most important procedures with the edgewise appliance is correct bracket positioning. Neither the conventional edgewise appliance nor the straight wire and lingual appliances can be used effectively unless the bracket position is accurate. Improper bracket positioning may cause many problems during treatment, especially in the finishing stage. It may require either rebonding after removal of the malpositioned bracket, or a greater number of arch wires and more complex wire bending, making effective treatment much more difficult. This led me to develop the Individual Indirect Bonding Technique, which uses a multi-purpose set-up model to determine a correct and objective bracket position for each individual patient. This technique positions brackets more accurately than earlier indirect bonding techniques, because it determines the bracket position on a set-up model that has been produced to reproduce the occlusal relationship the clinician desires. It is especially effective for the straight wire appliance and the lingual appliance, for which correct bracket positioning is indispensable.

  6. UpSet: Visualization of Intersecting Sets

    PubMed Central

    Lex, Alexander; Gehlenborg, Nils; Strobelt, Hendrik; Vuillemot, Romain; Pfister, Hanspeter

    2016-01-01

    Understanding relationships between sets is an important analysis task that has received widespread attention in the visualization community. The major challenge in this context is the combinatorial explosion of the number of set intersections if the number of sets exceeds a trivial threshold. In this paper we introduce UpSet, a novel visualization technique for the quantitative analysis of sets, their intersections, and aggregates of intersections. UpSet is focused on creating task-driven aggregates, communicating the size and properties of aggregates and intersections, and a duality between the visualization of the elements in a dataset and their set membership. UpSet visualizes set intersections in a matrix layout and introduces aggregates based on groupings and queries. The matrix layout enables the effective representation of associated data, such as the number of elements in the aggregates and intersections, as well as additional summary statistics derived from subset or element attributes. Sorting according to various measures enables a task-driven analysis of relevant intersections and aggregates. The elements represented in the sets and their associated attributes are visualized in a separate view. Queries based on containment in specific intersections, aggregates or driven by attribute filters are propagated between both views. We also introduce several advanced visual encodings and interaction methods to overcome the problems of varying scales and to address scalability. UpSet is web-based and open source. We demonstrate its general utility in multiple use cases from various domains. PMID:26356912
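
The exclusive-intersection counts that UpSet's matrix rows encode can be sketched in a few lines of Python; the set names and elements below are invented for illustration.

```python
# Sketch of the counting step behind an UpSet-style matrix: for a small
# family of named sets, tally how many elements fall into each exclusive
# intersection (one row of UpSet's matrix layout per membership pattern).
from itertools import chain
from collections import Counter

def exclusive_intersections(named_sets):
    """Map each membership pattern (a sorted tuple of set names) to the
    number of elements having exactly that pattern."""
    counts = Counter()
    for element in set(chain.from_iterable(named_sets.values())):
        pattern = tuple(sorted(n for n, s in named_sets.items() if element in s))
        counts[pattern] += 1
    return dict(counts)

sets = {"A": {1, 2, 3}, "B": {2, 3, 4}, "C": {4, 5}}
rows = exclusive_intersections(sets)
# Sort rows by size, as UpSet does, to surface the largest intersections.
ordered = sorted(rows.items(), key=lambda kv: -kv[1])
```

The combinatorial explosion the abstract mentions is visible here: k sets admit up to 2^k - 1 such patterns, which is why UpSet's aggregation and sorting matter.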

  7. Precision assessment of model-based RSA for a total knee prosthesis in a biplanar set-up.

    PubMed

    Trozzi, C; Kaptein, B L; Garling, E H; Shelyakova, T; Russo, A; Bragonzoni, L; Martelli, S

    2008-10-01

Model-based Roentgen Stereophotogrammetric Analysis (RSA) was recently developed for the measurement of prosthesis micromotion. Its main advantage is that markers do not need to be attached to the implants, as traditional marker-based RSA requires. Model-based RSA has only been tested in uniplanar radiographic set-ups. A biplanar set-up should in theory facilitate the pose estimation algorithm, since the radiographic projections show more distinct shape features of the implants than uniplanar images do. We tested the precision of model-based RSA and compared it with that of the traditional marker-based method in a biplanar set-up. Micromotions of both tibial and femoral components were measured with both techniques from double examinations of patients participating in a clinical study. The results showed that in the biplanar set-up model-based RSA presents a homogeneous distribution of precision across all translation directions, but an inhomogeneous error for rotations: internal-external rotation in particular presented higher errors than rotations about the transverse and sagittal axes. Model-based RSA was less precise than the marker-based method, although the differences were not significant for the translations and rotations of the tibial component, with the exception of internal-external rotation. For both prosthesis components the precision of model-based RSA was below 0.2 mm for all translations, and below 0.3 degrees for rotations about the transverse and sagittal axes. These values are still acceptable for clinical studies aimed at evaluating total knee prosthesis micromotion. In a biplanar set-up, model-based RSA is a valid alternative to traditional marker-based RSA in cases where marking of the prosthesis is an enormous disadvantage.

  8. Setting up a hydrological model based on global data for the Ayeyarwady basin in Myanmar

    NASA Astrophysics Data System (ADS)

    ten Velden, Corine; Sloff, Kees; Nauta, Tjitte

    2017-04-01

The use of global datasets in local hydrological modelling can be of great value, since it opens up the possibility of including data for areas where local data are unavailable or only sparsely available. In hydrological modelling the existence of both static physical data, such as elevation and land use, and dynamic meteorological data, such as precipitation and temperature, is essential for setting up a model, but such data are often difficult to obtain at the local level. For the Ayeyarwady catchment in Myanmar a distributed hydrological model (Wflow: https://github.com/openstreams/wflow) was set up using only global datasets, as part of a water resources study. Myanmar is an emerging economy which has only recently become more receptive to foreign influences. It has a very limited hydrometeorological measurement network, with large spatial and temporal gaps, and data that are of uncertain quality and difficult to obtain. The hydrological model was thus set up based on resampled versions of the SRTM digital elevation model, the GlobCover land cover dataset and the HWSD soil dataset. Three global meteorological datasets were assessed and compared for use in the hydrological model: TRMM, WFDEI and MSWEP. The meteorological datasets were assessed based on their conformity with several precipitation station measurements, and the overall model performance was assessed by calculating the NSE and RVE against discharge measurements from several gauging stations. The model was run for the period 1979-2012 on a daily time step, and the results show an acceptable applicability of the global datasets used in the hydrological model. The WFDEI forcing dataset gave the best results, with an NSE of 0.55 at the outlet of the model and an RVE of 8.5%, calculated over the calibration period 2006-2012. As a general trend, the modelled discharge tends to be underestimated at the upstream stations and slightly overestimated at the downstream stations. The quality of the discharge measurements
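
The NSE and RVE scores quoted above have standard definitions, which a short sketch makes concrete. The discharge values are toy numbers, not data from the study.

```python
# Nash-Sutcliffe efficiency (NSE) and relative volume error (RVE) for a
# pair of observed/simulated discharge series. NSE = 1 means a perfect
# fit, NSE <= 0 means no better than predicting the observed mean;
# RVE is the total volume bias as a percentage.

def nse(obs, sim):
    mean_obs = sum(obs) / len(obs)
    num = sum((o - s) ** 2 for o, s in zip(obs, sim))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - num / den

def rve(obs, sim):
    return 100.0 * (sum(sim) - sum(obs)) / sum(obs)

# Toy daily discharges (m^3/s), invented for illustration.
obs = [100.0, 120.0, 90.0, 150.0]
sim = [95.0, 125.0, 100.0, 150.0]
skill = (nse(obs, sim), rve(obs, sim))
```

A positive RVE, as reported for WFDEI (8.5%), indicates the model delivers slightly more total volume than was observed.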

  9. Consequences of artificial deepwater ventilation in the Bornholm Basin for oxygen conditions, cod reproduction and benthic biomass - a model study

    NASA Astrophysics Data System (ADS)

    Stigebrandt, A.; Rosenberg, R.; Råman Vinnå, L.; Ödalen, M.

    2015-01-01

    We develop and use a circulation model to estimate hydrographical and ecological changes in the isolated basin water of the Bornholm Basin. By pumping well-oxygenated so-called winter water to the greatest depth, where it is forced to mix with the resident water, the rate of deepwater density reduction increases as well as the frequency of intrusions of new oxygen-rich deepwater. We show that pumping 1000 m3 s-1 should increase the rates of water exchange and oxygen supply by 2.5 and 3 times, respectively. The CRV (cod reproduction volume), the volume of water in the isolated basin meeting the requirements for successful cod reproduction (S > 11, O2 > 2 mL L-1), should every year be greater than 54 km3, which is an immense improvement, since it has been much less in certain years. Anoxic bottoms should no longer occur in the basin, and hypoxic events will become rare. This should permit extensive colonization of fauna on the earlier periodically anoxic bottoms. Increased biomass of benthic fauna should also mean increased food supply to economically valuable demersal fish like cod and flatfish. In addition, re-oxygenation of the sediments should lead to increased phosphorus retention by the sediments.

  10. Consequences of artificial deepwater ventilation in the Bornholm Basin for oxygen conditions, cod reproduction and benthic biomass - a model study

    NASA Astrophysics Data System (ADS)

    Stigebrandt, A.; Rosenberg, R.; Råman Vinnå, L.; Ödalen, M.

    2014-07-01

We develop and use a circulation model to estimate hydrographical and ecological changes in the isolated basin water of the Bornholm Basin. By pumping well-oxygenated so-called winter water, residing beneath the level of the summer thermocline, to the greatest depth of the basin, where it is forced to mix with the resident water, the rate of density reduction should increase and thereby the frequency of intrusions of new oxygen-rich deepwater. We show that pumping 1000 m3 s-1 should increase the rates of water exchange and oxygen supply by 2.5 and 3 times, respectively. The CRV (Cod Reproduction Volume), the volume of water in the isolated basin meeting the requirements for successful cod reproduction (S > 11, O2 > 2 mL L-1), should every year be greater than 54 km3, which is an immense improvement, since in certain years it is currently much less. Anoxic bottoms should no longer occur in the basin and hypoxic events will become rare. This should permit extensive colonization of fauna on the earlier periodically anoxic bottoms. Increased biomass of benthic fauna should also mean increased food supply to economically valuable demersal fish like cod and flatfish. In addition, the bioturbation activity and re-oxygenation of the sediments should lead to increased phosphorus retention by the sediments.
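
The CRV criterion (S > 11, O2 > 2 mL L-1) amounts to a filter over the water column. A minimal sketch, with layer volumes, salinities and oxygen values invented for illustration:

```python
# Sum the volume of water-column layers meeting both CRV thresholds.
# Layer data are hypothetical; a real computation would integrate the
# model's 3-D salinity and oxygen fields over the basin.

def cod_reproduction_volume(layers, s_min=11.0, o2_min=2.0):
    """layers: (volume_km3, salinity, oxygen_mL_per_L) tuples.
    Returns the total volume (km^3) meeting both thresholds."""
    return sum(v for v, s, o2 in layers if s > s_min and o2 > o2_min)

layers = [
    (40.0, 8.5, 6.0),   # brackish surface water: too fresh
    (30.0, 12.1, 3.5),  # meets both criteria
    (25.0, 14.0, 2.4),  # meets both criteria
    (20.0, 15.2, 0.8),  # saline but hypoxic: fails the oxygen test
]
crv = cod_reproduction_volume(layers)
```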

  11. [Comparison of four identical electronic noses and three measurement set-ups].

    PubMed

    Koczulla, R; Hattesohl, A; Biller, H; Hofbauer, J; Hohlfeld, J; Oeser, C; Wirtz, H; Jörres, R A

    2011-08-01

Volatile organic compounds (VOCs) can be used as biomarkers in exhaled air. VOC profiles can be detected by the nanosensor array of an electronic nose and analysed using bioinformatics. It is, however, not known whether different devices of the same model measure identically, or to what extent different set-ups and the humidity of the inhaled air influence the VOC profile. Three different measuring set-ups were designed, and three healthy control subjects were measured with each of them, using four devices of the same model (Cyranose 320™, Smiths Detection). The exhaled air was collected in a plastic bag. Either ambient air was used as reference (set-up Leipzig), or the reference air was humidified to 100% relative humidity (set-ups Marburg and Munich). In the set-up Marburg the subjects inhaled standardised medical air (Aer medicinalis Linde, AGA AB) from a compressed-air bottle through a demand valve; this air (after humidification) was also used as reference. In the set-up Leipzig the subjects inhaled VOC-filtered ambient air; in the set-up Munich, unfiltered room air. The data were evaluated using either the real-time data or the changes in resistance as calculated by the device. The results were clearly dependent on the set-up. Humidification of the reference air apparently reduced the variance between devices, but this result also depended on the evaluation method used. When comparing the three subjects, the set-ups Munich and Marburg mapped them in a similar way, whereas both the signals and the variance of the set-up Leipzig were larger. Measuring VOCs with an electronic nose has not yet been standardised, and the set-up significantly affects the results. As other researchers use yet other methods, it is currently not possible to draw generally accepted conclusions. More systematic tests are required to find the most sensitive and reliable but still feasible set-up, so that comparability is improved.

  12. Observations of near-bottom currents in Bornholm Basin, Slupsk Furrow and Gdansk Deep

    NASA Astrophysics Data System (ADS)

    Bulczak, A. I.; Rak, D.; Schmidt, B.; Beldowski, J.

    2016-06-01

Dense bottom currents are responsible for the transport of salty inflow waters from the North Sea, driving ventilation and renewal of Baltic deep waters. This study characterises dense currents in three deep locations of the Baltic Proper: the Bornholm Basin (BB), the Gdansk Basin (GB) and the Slupsk Furrow (SF). These locations are of fundamental importance for the transport and pollution associated with the chemical munitions deposited in BB and GB after the Second World War; the sub-basins are also situated along the pathway of dense inflowing water. Current velocities were measured over the majority of the water column during regular cruises of r/v Oceania and r/v Baltica in 2001-2012 (38 cruises) by a 307 kHz vessel-mounted (VM), down-looking ADCP. Additionally, high-resolution CTD and oxygen profiles were collected. Three moorings measured current velocity profiles in SF and GB over the summer of 2012. In addition, temperature, salinity, oxygen and turbidity were measured at about 1 m above the bottom in GB. The results showed that the mean current speed across the Baltic Proper was around 12 cm s-1, and that stronger flow was characteristic of the regions located above the sills, in the Bornholm and Slupsk Channels, reaching on average about 20 cm s-1. The results suggest that these regions are important for the inflow of saline waters into the eastern Baltic and are areas of intense vertical mixing. The VM ADCP observations indicate that the average near-bottom flow across the basin can reach 35±6 cm s-1. The mooring observations showed similar near-bottom flow velocities; however, they also showed that increased speed in the near-bottom layer occurred frequently in SF and GB during short periods lasting from a few to several days, or 10-20% of the time. The observations showed that the bottom mixed layer occupies at least 10% of the water column and that the turbulent mixing induced by near-bottom currents is likely to produce sediment resuspension and transport

  13. The Semi-opened Infrastructure Model (SopIM): A Frame to Set Up an Organizational Learning Process

    NASA Astrophysics Data System (ADS)

    Grundstein, Michel

    In this paper, we introduce the "Semi-opened Infrastructure Model (SopIM)" implemented to deploy Artificial Intelligence and Knowledge-based Systems within a large industrial company. This model illustrates what could be two of the operating elements of the Model for General Knowledge Management within the Enterprise (MGKME) that are essential to set up the organizational learning process that leads people to appropriate and use concepts, methods and tools of an innovative technology: the "Ad hoc Infrastructures" element, and the "Organizational Learning Processes" element.

  14. Multiprocessor speed-up, Amdahl's Law, and the Activity Set Model of parallel program behavior

    NASA Technical Reports Server (NTRS)

    Gelenbe, Erol

    1988-01-01

    An important issue in the effective use of parallel processing is the estimation of the speed-up one may expect as a function of the number of processors used. Amdahl's Law has traditionally provided a guideline to this issue, although it appears excessively pessimistic in the light of recent experimental results. In this note, Amdahl's Law is amended by giving a greater importance to the capacity of a program to make effective use of parallel processing, but also recognizing the fact that imbalance of the workload of each processor is bound to occur. An activity set model of parallel program behavior is then introduced along with the corresponding parallelism index of a program, leading to upper and lower bounds to the speed-up.
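
Classic Amdahl's Law is easy to state in code. The imbalance penalty below is a crude illustration of the kind of correction the note argues for, not Gelenbe's actual activity-set formulation.

```python
# Amdahl's Law: speed-up on n processors when a fraction p of the work
# is perfectly parallelizable, plus an illustrative variant in which
# the parallel part runs at the pace of the most loaded processor.

def amdahl_speedup(n, p):
    """1 / ((1 - p) + p / n): the classical upper bound."""
    return 1.0 / ((1.0 - p) + p / n)

def imbalanced_speedup(n, p, imbalance):
    """Hypothetical penalty: the busiest processor carries
    (1 + imbalance) times its fair share of the parallel work."""
    return 1.0 / ((1.0 - p) + p * (1.0 + imbalance) / n)
```

For example, with p = 0.9 on 16 processors the classical bound is 6.4×, and any workload imbalance pulls the achievable speed-up below that, in line with the note's argument.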

  15. Splendidly blended: a machine learning set up for CDU control

    NASA Astrophysics Data System (ADS)

    Utzny, Clemens

    2017-06-01

While the concepts of machine learning and artificial intelligence continue to grow in importance in internet-related applications, their use is still in its infancy when it comes to process control within the semiconductor industry. The branch of mask manufacturing in particular presents a challenge to machine learning, since the business process intrinsically induces pronounced product variability against a background of small plate numbers. In this paper we present the architectural set-up of a machine learning algorithm which successfully deals with the demands and pitfalls of mask manufacturing. A detailed motivation of this basic set-up is given, followed by an analysis of its statistical properties. The machine learning set-up for mask manufacturing involves two learning steps: an initial step identifies and classifies the basic global CD patterns of a process, and these results form the basis for the extraction of an optimized training set via balanced sampling. A second learning step uses this training set to obtain the local as well as global CD relationships induced by the manufacturing process. Using two production-motivated examples we show how this approach is flexible and powerful enough to deal with the exacting demands of mask manufacturing. In one example we show how dedicated covariates can be used in conjunction with increased spatial resolution of the CD map model in order to deal with pathological CD effects at the mask boundary. The other example shows how the model set-up enables strategies for dealing with tool-specific CD signature differences. In this case the balanced sampling enables a process control scheme which allows usage of the full tool park within the specified tight tolerance budget. Overall, this paper shows that the current rapid development of machine learning algorithms can be successfully exploited within the context of semiconductor manufacturing.
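
The balanced-sampling step can be sketched as downsampling every pattern class to the size of the rarest class, so the second learning step sees each pattern equally often. Class labels here are invented, and a real pipeline would sample at random rather than taking the first k items.

```python
# Minimal balanced-sampling sketch: group labelled samples by class and
# truncate every class to the rarest class's count.
from collections import defaultdict

def balanced_training_set(samples):
    """samples: list of (features, class_label) pairs."""
    by_class = defaultdict(list)
    for features, label in samples:
        by_class[label].append((features, label))
    k = min(len(group) for group in by_class.values())
    balanced = []
    for label in sorted(by_class):
        balanced.extend(by_class[label][:k])  # real code: random.sample
    return balanced

# Hypothetical CD-pattern classes with an 8:3 imbalance.
data = [(i, "radial") for i in range(8)] + [(i, "edge") for i in range(3)]
train = balanced_training_set(data)
```

Equalizing class counts this way is what prevents a frequent CD pattern from dominating the second learning step.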

  16. Setting up a model intercomparison project for the last deglaciation

    NASA Astrophysics Data System (ADS)

    Ivanovic, R. F.; Gregoire, L. J.; Valdes, P. J.; Roche, D. M.; Kageyama, M.

    2014-12-01

The last deglaciation (~21-9 ka) presents a series of opportunities to study the underlying mechanisms of abrupt climate changes and long-term trends in the Earth system. Most of the forcings are relatively well constrained, and geological archives record responses over a range of timescales. Despite this, large uncertainties remain over the feedback loops that culminated in the collapse of the great Northern Hemisphere ice sheets, and a consensus has yet to be reached on the chains of events that led to rapid surface warming and cooling during this period. Climate models are powerful tools for quantitatively assessing these outstanding issues through their ability to temporally resolve cause and effect, as well as to break down the contributions from different forcings. This is well demonstrated by pioneering work, for example by Liu et al. (2009), Roche et al. (2011), Gregoire et al. (2012) and Menviel et al. (2011). However, such work is not without challenges; model-geological data mismatches remain unsolved, and it is difficult to compare results from different models with unique experiment designs. Therefore, we have established a multidisciplinary Paleoclimate Model Intercomparison Project working group to coordinate transient climate model simulations and geological archive compilations of the last deglaciation. Here, we present the plans and progress of the working group in its first phase of activity: the investigation of Heinrich Stadial 1 and the lead into the Bølling warming event. We describe the set-up of the core deglacial experiment, explain our approach for dealing with uncertain climate forcings, and outline our solutions to challenges posed by this research. By defining a common experiment design, we have built a framework to include models of different speeds, complexities and resolutions, maximising the reward of this varied approach. One of the next challenges is to compile transient proxy records and develop a methodology for dealing with

  17. Development of a New Optical Measuring Set-Up

    NASA Astrophysics Data System (ADS)

    Miroshnichenko, I. P.; Parinov, I. A.

    2018-06-01

The paper describes a newly developed optical measuring set-up for the contactless recording and processing of measurements of small spatial (linear and angular) displacements of control surfaces, based on laser technologies and optical interference methods. The proposed set-up is designed to solve the measurement tasks that arise in the study of the physical and mechanical properties of new materials and in diagnosing the state of structural materials by active acoustic methods of nondestructive testing. The structure of the set-up and its constituent parts are described, and the features of its construction and functioning during measurements are discussed. New technical solutions for the implementation of the components of the set-up under consideration are obtained. The purpose and description of the original specialized software are presented; it is used for a priori analysis of measurement results, for a posteriori analysis while measurements are being performed, and for determining the influences of internal and external disturbances on the measurement results and correcting the results directly as they are obtained. The technical solutions used in the set-up are protected by patents of the Russian Federation for inventions, and the software is protected by certificates of state registration of computer programs. The proposed set-up is intended for use in instrumentation, mechanical engineering, shipbuilding, aviation, the energy sector, etc.

  18. Automatized set-up procedure for transcranial magnetic stimulation protocols.

    PubMed

    Harquel, S; Diard, J; Raffin, E; Passera, B; Dall'Igna, G; Marendaz, C; David, O; Chauvin, A

    2017-06-01

Transcranial Magnetic Stimulation (TMS) established itself as a powerful technique for probing and treating the human brain. Major technological evolutions, such as neuronavigation and robotized systems, have continuously increased the spatial reliability and reproducibility of TMS, by minimizing the influence of human and experimental factors. However, there is still a lack of an efficient set-up procedure, which prevents the automation of TMS protocols. For example, the set-up procedure for defining the stimulation intensity specific to each subject is classically done manually by experienced practitioners, by assessing the motor cortical excitability level over the motor hotspot (HS) of a targeted muscle. This is time-consuming and introduces experimental variability. Therefore, we developed a probabilistic Bayesian model (AutoHS) that automatically identifies the HS position. Using virtual and real experiments, we compared the efficacy of the manual and automated procedures. AutoHS appeared to be more reproducible, faster, and at least as reliable as classical manual procedures. By combining AutoHS with robotized TMS and automated motor threshold estimation methods, our approach constitutes the first fully automated set-up procedure for TMS protocols. The use of this procedure decreases inter-experimenter variability while facilitating the handling of TMS protocols used for research and clinical routine. Copyright © 2017 Elsevier Inc. All rights reserved.
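
As a rough illustration of grid-based Bayesian hotspot estimation in the spirit of AutoHS: the paper's actual model is not reproduced here, so the 1-D grid, Gaussian response model and noise level below are all assumptions made for the sketch.

```python
# Toy Bayesian hotspot search: candidate hotspot positions on a 1-D
# grid, a flat prior, and a Gaussian likelihood of observed motor-evoked
# amplitudes given a bell-shaped response centred on the hotspot.
import math

def posterior_argmax(observations, grid, width=1.0, noise=0.1):
    """observations: (stimulation_site, amplitude) pairs; returns the
    grid point maximising the (log-)posterior."""
    best, best_logp = None, -math.inf
    for h in grid:
        logp = 0.0  # flat prior: likelihood alone decides
        for site, amp in observations:
            predicted = math.exp(-((site - h) ** 2) / (2.0 * width ** 2))
            logp += -((amp - predicted) ** 2) / (2.0 * noise ** 2)
        if logp > best_logp:
            best, best_logp = h, logp
    return best

# Noise-free responses generated from a "true" hotspot at x = 3.
obs = [(s, math.exp(-((s - 3) ** 2) / 2.0)) for s in range(10)]
hotspot = posterior_argmax(obs, grid=range(10))
```

A sequential version of this idea, choosing the next stimulation site to maximally shrink the posterior, is what makes an automated procedure faster than an exhaustive manual search.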

  19. A Guide for Setting Up a Church-Sponsored Nursery School.

    ERIC Educational Resources Information Center

    Hostrawser, Sara

    This document provides a guide for setting up a church-sponsored nursery school. Chapter One outlines the verbal-cognitive model of preschool education which emphasizes interaction between teacher and child. Perceptual, motor, cognitive, social, emotional, and language objectives are indicated. Chapter Two covers aspects of school management such…

  20. INSERM sets up forum.

    PubMed

    Walgate, R

    Considerable public controversy is expected over the composition and role of a new ethics advisory committee set up in association with France's medical research council, INSERM. French president Mitterrand has tried to represent all possible conflicting groups on the committee, which includes 15 scientists nominated by research institutions, 15 persons knowledgeable about ethical issues nominated by various legislative and judicial bodies, and five representatives of the "principal philosophical families": Catholicism, Protestantism, Judaism, Islam, and Communism.

  1. UpSetR: an R package for the visualization of intersecting sets and their properties.

    PubMed

    Conway, Jake R; Lex, Alexander; Gehlenborg, Nils

    2017-09-15

Venn and Euler diagrams are a popular yet inadequate solution for quantitative visualization of set intersections. A scalable alternative to Venn and Euler diagrams for visualizing intersecting sets and their properties is needed. We developed UpSetR, an open source R package that employs a scalable matrix-based visualization to show intersections of sets, their size, and other properties. UpSetR is available at https://github.com/hms-dbmi/UpSetR/ and released under the MIT License. A Shiny app is available at https://gehlenborglab.shinyapps.io/upsetr/. Contact: nils@hms.harvard.edu. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.

  2. UpSetR: an R package for the visualization of intersecting sets and their properties

    PubMed Central

    Conway, Jake R.; Lex, Alexander; Gehlenborg, Nils

    2017-01-01

Motivation: Venn and Euler diagrams are a popular yet inadequate solution for quantitative visualization of set intersections. A scalable alternative to Venn and Euler diagrams for visualizing intersecting sets and their properties is needed. Results: We developed UpSetR, an open source R package that employs a scalable matrix-based visualization to show intersections of sets, their size, and other properties. Availability and implementation: UpSetR is available at https://github.com/hms-dbmi/UpSetR/ and released under the MIT License. A Shiny app is available at https://gehlenborglab.shinyapps.io/upsetr/. Contact: nils@hms.harvard.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:28645171

  3. ALTERNATIVES FOR REDUCING INSECTICIDES ON COTTON AND CORN: ECONOMIC AND ENVIRONMENTAL IMPACT - SUPPLEMENT 2: PROCEDURES USED IN SETTING UP THE AGRICULTURAL PRODUCTION MODEL

    EPA Science Inventory

    The procedures used in setting up the agricultural production model used in a study of alternatives for reducing insecticides on cotton and corn are described. The major analytical tool used is a spatial equilibrium model of U.S. agriculture. This is a linear programming model th...

  4. Voluntary authority set up.

    PubMed

    Clarke, M

    The British Medical Research Council (MRC), in association with the Royal College of Obstetricians and Gynaecologists, has responded to the Warnock report on human fertilization and embryology by setting up a voluntary authority to license such research. MRC also seeks to define the term "embryo" and to include as research "new and untried treatment." Possible lines of investigation include studies on infertility, genetic and congenital diseases, and contraceptive methods. However, if Enoch Powell's Parliamentary bill should become law, all research on human embryos would be illegal.

  5. Reconstruction of climate and environmental changes in the Bornholm Basin during the last 6000 years, based on foraminiferal assemblages

    NASA Astrophysics Data System (ADS)

    Binczewska, Anna; Polovodova Asteman, Irina; Moros, Matthias; Sławińska, Joanna

    2016-04-01

The Baltic Sea is the largest brackish sea in the world, connected to the Atlantic Ocean through the narrow and shallow Danish Straits. The hydrography of the Baltic Sea is strongly dependent on inflows from the North Sea, and its environmental conditions are influenced by meteorological and anthropogenic factors. To improve our understanding of the natural variability and the forcing factors driving changes in the Baltic ecosystem, detailed analyses of palaeoecological archives are needed. Here we present a high-resolution study of foraminiferal assemblages together with sediment geochemistry (LOI, TOC, TIC, CNS) from an 8-m-long gravity core (GC) and a 42-cm-long multi core (MUC) taken in the Bornholm Basin in 2013. Both cores were investigated in order to reconstruct bottom water mass variability during the mid- and late Holocene. The cores were dated by AMS 14C (mostly on Macoma balthica shells), 210Pb and 137Cs. The age model allowed us to place the variability of the foraminiferal assemblages in time and link it with Holocene climate extremes and the Major Baltic Inflows (MBIs). High absolute abundances (ind./g wet sed.) of foraminifera are found within a core interval corresponding to the Dark Ages and the Medieval Warm Period (~AD 400-1200). The Little Ice Age is represented by rare to absent foraminiferal shells, while significant changes in foraminiferal abundances occur in the lower part of the core (~2995-2050 BC). The dominant species found in both cores are Cribroelphidium excavatum, C. excavatum f. clavatum, C. albiumbilicatum and C. incertum, all adapted to an ecologically unstable environment with high fluctuations of salinity and oxygen. The arenaceous species Reophax dentaliniformis occurs abundantly at ~AD 1450-1600, where calcareous species were rare. The presence of agglutinated foraminifera and the prevailing small size of individuals in all studied material suggest bottom water undersaturation with respect to calcium carbonate. In the Baltic Sea, bottom waters

  6. Verification and implementation of set-up empirical models in pile design : research project capsule.

    DOT National Transportation Integrated Search

    2016-08-01

The primary objectives of this research include: performing static and dynamic load tests on newly instrumented test piles to better understand the set-up mechanism for individual soil layers, verifying or recalibrating previously developed empir...

  7. Setting up crowd science projects.

    PubMed

    Scheliga, Kaja; Friesike, Sascha; Puschmann, Cornelius; Fecher, Benedikt

    2016-11-29

    Crowd science is scientific research that is conducted with the participation of volunteers who are not professional scientists. Thanks to the Internet and online platforms, project initiators can draw on a potentially large number of volunteers. This crowd can be involved to support data-rich or labour-intensive projects that would otherwise be unfeasible. So far, research on crowd science has mainly focused on analysing individual crowd science projects. In our research, we focus on the perspective of project initiators and explore how crowd science projects are set up. Based on multiple case study research, we discuss the objectives of crowd science projects and the strategies of their initiators for accessing volunteers. We also categorise the tasks allocated to volunteers and reflect on the issue of quality assurance as well as feedback mechanisms. With this article, we contribute to a better understanding of how crowd science projects are set up and how volunteers can contribute to science. We suggest that our findings are of practical relevance for initiators of crowd science projects, for science communication as well as for informed science policy making. © The Author(s) 2016.

  8. How to set up a psychodermatology clinic.

    PubMed

    Aguilar-Duran, S; Ahmed, A; Taylor, R; Bewley, A

    2014-07-01

    Psychodermatology is a recognized subspecialty, but lack of awareness among dermatologists and limitation of resources make the management of these patients challenging. Clinicians are often unsure about the practicalities of setting up a psychodermatology service. There is confusion about which model is best suited to which service, and about the development of a psychodermatology multidisciplinary team. To identify the necessary steps in setting up a psychodermatology clinic. The study was based on the experience of a UK-based psychodermatology unit and the recently published standards by the UK Psychodermatology Working Party. The type of service provision will depend on the type of patients seen in the unit. The core team will be composed of a psychodermatologist and a psychologist. Access to a psychiatrist is essential if patients present with primary psychiatric conditions or primary cutaneous conditions with suicidal or other psychiatric risks. Adequate training of the healthcare staff is advised. The premises and time allocation should be adequate, and this translates into higher tariffs. Using business care tariffs for people with mental health conditions might be more appropriate, as the consultations are longer and involve more members of staff; however, the overall cost remains lower than if these patients were seen in a general dermatology service or in the community. Psychodermatology services are globally limited, and yet the demand for psychodermatology care is high. There is evidence that dedicated psychodermatology services are cost-effective. Healthcare professionals need to be aware of the steps necessary to establish and maintain psychodermatology services. © 2014 British Association of Dermatologists.

  9. Antagonistic effect of chosen lactic acid bacteria strains on Yersinia enterocolitica species in model set-ups, meat and fermented sausages.

    PubMed

    Gomółka-Pawlicka, M; Uradziński, J

    2003-01-01

The present study was aimed at determining the influence of 15 strains of lactic acid bacteria on the growth of 8 Yersinia enterocolitica strains in model set-ups, and in meat and ageing fermented sausages. The investigations were performed in three alternate stages which differed with respect to the products studied, the number of Lactobacillus sp. strains and, partly, the methodological approach. The ratio between the lactic acid bacteria and Yersinia enterocolitica strains studied was, depending on the variant of the experiment, 1:1, 1:2 or 2:1, respectively. The study also considered the water activity (aw) and pH of the products investigated. The results suggest that all the lactic acid bacteria strains used in the model set-ups had an antagonistic effect on all the Yersinia enterocolitica strains. In meat and fermented sausages, however, this ability was retained by only one of the strains investigated, Lactobacillus helveticus T 78. The temperature and time of incubation of the sausages, but not aw and pH, were found to have a distinct influence on the antagonistic interaction between the bacteria tested.

  10. Model Developments for Development of Improved Emissions Scenarios: Developing Purchasing-Power Parity Models, Analyzing Uncertainty, and Developing Data Sets for Gridded Integrated Assessment Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Zili; Nordhaus, William

    2009-03-19

Over the duration of this project, we completed the main tasks set out in the initial proposal. These tasks included: setting up the basic platform in the GAMS language for the new RICE 2007 model; testing various model structures for RICE 2007; incorporating the PPP data set into the new RICE model; and developing a gridded data set for IA modeling.

  11. Early College for All: Efforts to Scale up Early Colleges in Multiple Settings

    ERIC Educational Resources Information Center

    Edmunds, Julie A.

    2016-01-01

    Given the positive impacts of the small, stand-alone early college model and the desire to provide those benefits to more students, organizations have begun efforts to scale up the early college model in a variety of settings. These efforts have been supported by the federal government, particularly by the Investing in Innovation (i3) program.…

  12. Field instrumentation and testing to study set-up phenomenon of piles driven into Louisiana clayey soils : final report.

    DOT National Transportation Integrated Search

    2016-07-01

This research study aims to investigate the pile set-up phenomenon for clayey soils and develop empirical models to predict pile set-up resistance at a certain time after the end of driving (EOD). To fulfill the objective, a total number of twelve prestr...

  13. Setting up home-based palliative care in countries with limited resources: a model from Sarawak, Malaysia.

    PubMed

    Devi, B C R; Tang, T S; Corbex, M

    2008-12-01

The provision of palliative care (PC) and opioids is difficult to ensure in remote areas in low- and middle-income countries. We describe here the set-up of a home-care program in Sarawak (the Malaysian part of Borneo Island), where half the population lives in villages that are difficult to access. The establishment of this program, initiated in 1994 by the Department of Radiotherapy of Sarawak General Hospital, consisted of training, empowering nurses, simplifying referral, facilitating access to medication, and increasing awareness among the public and health professionals about PC. The program has been sustainable and cost-efficient, serving 936 patients in 2006. The total morphine usage in the program increased from <200 g in 1993 to >1400 g in 2006. The results show that pain medication can be provided even in remote areas with effective organization and empowerment of nurses, who were the most important determinants for the set-up of this program. Education of the family was also a key aspect. The authors believe that the experience gained in Sarawak may help other regions with low or middle resources in the set-up of their PC programs, especially for their remote rural populations.

  14. How to Set Up an Electronic Bulletin Board.

    ERIC Educational Resources Information Center

    Lukas, Terrence

    1981-01-01

    Describes a versatile, inexpensive information system using microcomputers and television sets which enables Indiana University Northwest to relay information for students to different sites simultaneously and to update information quickly and easily. Illustrates how to set up the hardware, discusses programing, and includes the actual program…

  15. Setting up home-based palliative care in countries with limited resources: a model from Sarawak, Malaysia

    PubMed Central

    Devi, B. C. R.; Corbex, M.

    2008-01-01

Background: The provision of palliative care (PC) and opioids is difficult to ensure in remote areas in low- and middle-income countries. We describe here the set-up of a home-care program in Sarawak (the Malaysian part of Borneo Island), where half the population lives in villages that are difficult to access. Methods: The establishment of this program, initiated in 1994 by the Department of Radiotherapy of Sarawak General Hospital, consisted of training, empowering nurses, simplifying referral, facilitating access to medication, and increasing awareness among the public and health professionals about PC. Results: The program has been sustainable and cost-efficient, serving 936 patients in 2006. The total morphine usage in the program increased from <200 g in 1993 to >1400 g in 2006. The results show that pain medication can be provided even in remote areas with effective organization and empowerment of nurses, who were the most important determinants for the set-up of this program. Education of the family was also a key aspect. Conclusion: The authors believe that the experience gained in Sarawak may help other regions with low or middle resources in the set-up of their PC programs, especially for their remote rural populations. PMID:18641007

  16. A new mechatronic set-up and technique for investigation of firearms

    NASA Astrophysics Data System (ADS)

    Lesenciuc, Ioan; Suciu, Cornel

    2016-12-01

Since ancient times, mankind has manifested interest in the development and improvement of weapons, either for military or hunting purposes. Today, alongside these legal practices, the number of crimes committed in violation of weapons and ammunition regulations has increased considerably. This is why the technology and methods employed in the area of judicial ballistics require constant research and continuous learning. The present paper advances a new experimental set-up and its corresponding methodology, meant to measure the force deployed by the firing pin. The set-up consists of a mechatronic structure, based on a piezoelectric force transducer, which allows the force produced by the firing pin to be measured in situ when it is deployed. The obtained information can further be used to establish a correspondence between this force and the imprint left on the firing cap. This correspondence opens the possibility of elaborating a model that would permit ballistic experts to correctly identify a smoothbore weapon.

  17. How To Set Up Your Own Small Business. Study Guide.

    ERIC Educational Resources Information Center

    American Inst. of Small Business, Minneapolis, MN.

    This study guide is intended for use with the separately available entrepreneurship education text "How To Set Up Your Own Business." The guide includes student exercises that have been designed to accompany chapters dealing with the following topics: determining whether or not to set up a small business, doing market research, forecasting sales,…

  18. Quantum games with a multi-slit electron diffraction set-up

    NASA Astrophysics Data System (ADS)

    Iqbal, A.

    2003-05-01

A set-up is proposed to play a quantum version of the famous bimatrix game of Prisoners' Dilemma. Multi-slit electron diffraction, with each player's pure strategy consisting of opening one of the two slits at his/her disposal, is the essential feature of the set-up. Instead of entanglement, the association of waves with travelling material objects is suggested as an alternative resource for playing quantum games.

  19. Overall view of test set-up in bldg 13 at JSC during docking set-up tests

    NASA Image and Video Library

    1974-08-04

    S74-27049 (4 Aug. 1974) --- Overall view of test set-up in Building 23 at the Johnson Space Center during testing of the docking mechanisms for the joint U.S.-USSR Apollo-Soyuz Test Project. The cinematic check was being made when this picture was taken. The test control room is on the right. The Soviet-developed docking system is atop the USA-NASA developed docking system. Both American and Soviet engineers can be seen taking part in the docking testing. The ASTP docking mission in Earth orbit is scheduled for July 1975.

  20. Cost to Set up Common Languages

    NASA Astrophysics Data System (ADS)

    Latora, Vito

    Complexity is a highly interdisciplinary science. Although there are drawbacks for researchers to work at the interface of different fields, such as the cost to set up common languages, and the risks associated with not being recognized by any of the well-established scientific communities, some of my recent work indicates that interdisciplinarity can be extremely rewarding. Drawing on large data sets on scientific production during several decades, we have shown that highly interdisciplinary scholars can outperform specialized ones, and that scientists can enhance their performance by seeking collaborators with expertise in various fields. My vision for complexity is based on the added value of its interdisciplinary nature. I list below three research directions that I am personally eager to explore, and that I think will be among the main challenges of complexity in the next 10 years...

  1. Set-up uncertainties: online correction with X-ray volume imaging.

    PubMed

    Kataria, Tejinder; Abhishek, Ashu; Chadha, Pranav; Nandigam, Janardhan

    2011-01-01

To determine interfractional three-dimensional set-up errors using X-ray volumetric imaging (XVI). Between December 2007 and August 2009, 125 patients underwent image-guided radiotherapy using online XVI. After matching of the reference and acquired volume view images, set-up errors in the three translational directions were recorded and corrected online before treatment each day. Mean displacements and population systematic (Σ) and random (σ) errors were calculated and analyzed using SPSS (v16) software. The optimum clinical target volume (CTV) to planning target volume (PTV) margin was calculated using Van Herk's (2.5Σ + 0.7σ) and Stroom's (2Σ + 0.7σ) formulae. Patients were grouped into 4 cohorts: brain, head and neck, thorax, and abdomen-pelvis. The mean vector displacements recorded were 0.18 cm, 0.15 cm, 0.36 cm, and 0.35 cm for brain, head and neck, thorax, and abdomen-pelvis, respectively. Analysis of individual mean set-up errors revealed good agreement with the proposed 0.3 cm isotropic margins for brain and 0.5 cm isotropic margins for head and neck. Similarly, the proposed 0.5 cm circumferential and 1 cm craniocaudal margins were in agreement with the thorax and abdomen-pelvis cases. The calculated mean displacements were well within the CTV-PTV margin estimates of Van Herk (90% population coverage to a minimum of 95% of the prescribed dose) and Stroom (99% target volume coverage by 95% of the prescribed dose). Employing these individualized margins in a particular cohort ensures target coverage comparable to that described in the literature, which is further improved if XVI-aided set-up error detection and correction is used before treatment.
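The two margin recipes quoted in this record are simple linear functions of the population systematic error Σ and random error σ; a quick sketch (the error values below are illustrative, not taken from the study):

```python
def van_herk_margin(sigma_sys, sigma_rand):
    # Van Herk recipe: 2.5*Sigma + 0.7*sigma (same units as the inputs, e.g. cm)
    return 2.5 * sigma_sys + 0.7 * sigma_rand

def stroom_margin(sigma_sys, sigma_rand):
    # Stroom recipe: 2*Sigma + 0.7*sigma
    return 2.0 * sigma_sys + 0.7 * sigma_rand

# Illustrative errors (cm) -- not values reported in the study.
Sigma, sigma = 0.12, 0.20
print(round(van_herk_margin(Sigma, sigma), 3))  # 0.44
print(round(stroom_margin(Sigma, sigma), 3))    # 0.38
```

Because the Van Herk recipe weights the systematic component more heavily, it always produces the larger (more conservative) CTV-PTV margin of the two for the same error estimates.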

  2. Trappers set up trap for lizard

    NASA Technical Reports Server (NTRS)

    2000-01-01

In hope of catching a large monitor lizard seen in the area, state-licensed animal trappers Dewey Kessler and James Dean (at left), with Gary Povitch (kneeling) of the U.S. Wildlife and Dan Turner (standing), set up a trap on KSC. The lizard has been spotted recently near S.R. 3, a route into the Center, by several area residents. Turner is a monitor expert. The lizard is not native to the area and is possibly a released pet. Dean is working with the cooperation of KSC and the Merritt Island National Wildlife Refuge.

  3. Stanford sets up $100m energy institute

    NASA Astrophysics Data System (ADS)

    Gwynne, Peter

    2009-02-01

A new institute looking at how to provide for our energy needs while protecting the planet has been set up at Stanford University in the US. Named after one of its founding donors, the Precourt Institute for Energy will incorporate two existing organizations on the Stanford campus and be supported by donations of $100m plus the $30m that the university already spends on energy research each year.

  4. Economic communication model set

    NASA Astrophysics Data System (ADS)

    Zvereva, Olga M.; Berg, Dmitry B.

    2017-06-01

This paper details findings from research targeted at investigating economic communications using agent-based models. An agent-based model set was engineered to simulate economic communications. Money, in the form of internal and external currencies, was introduced into the models to support exchanges in communications. Every model, while based on the general concept, has its own peculiarities in algorithm and input data set, since each was engineered to solve a specific problem. Several data sets of different origin were used in the experiments: the theoretical sets were estimated on the basis of the static Leontief equilibrium equation, and the real set was constructed on the basis of statistical data. During the simulation experiments, the communication process was observed in dynamics and system macroparameters were estimated. This research confirmed that the combination of an agent-based and a mathematical model can produce a synergetic effect.
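As a rough illustration of the kind of agent-based exchange model described here (the agent count, endowments, and trade rule below are invented assumptions, not the authors' specification):

```python
import random

class Agent:
    def __init__(self, money):
        self.money = money  # internal-currency balance

def simulate(n_agents=10, steps=1000, price=1, seed=42):
    """Toy exchange economy: at each step a random buyer pays a
    random seller one unit of internal currency, if solvent."""
    rng = random.Random(seed)
    agents = [Agent(100) for _ in range(n_agents)]
    for _ in range(steps):
        buyer, seller = rng.sample(agents, 2)
        if buyer.money >= price:       # exchange only if the buyer can pay
            buyer.money -= price
            seller.money += price
    return agents

agents = simulate()
# A macroparameter observable in dynamics: total internal currency,
# which pure exchange conserves (10 agents * 100 = 1000).
total = sum(a.money for a in agents)
print(total)
```

In such models one typically tracks macroparameters like the money distribution across agents over time; the conservation of the internal currency under pure exchange is a basic consistency check.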

  5. Overhauling and Regulating Schools Set Up by Migrants: The Reason for Overhaul

    ERIC Educational Resources Information Center

    Jianzhong, Ding

    2004-01-01

    The article presents information on overhauling and regulating schools set up by migrants in the Pudong New District of China. As the number of migrants has risen sharply in the Pudong New District in recent years, so has the number of migrant children. An overall investigation of the fifty-nine schools set up by migrants was conducted and the…

  6. Development of a grinding-specific performance test set-up.

    PubMed

    Olesen, C G; Larsen, B H; Andresen, E L; de Zee, M

    2015-01-01

    The aim of this study was to develop a performance test set-up for America's Cup grinders. The test set-up had to mimic the on-boat grinding activity and be capable of collecting data for analysis and evaluation of grinding performance. This study included a literature-based analysis of grinding demands and a test protocol developed to accommodate the necessary physiological loads. This study resulted in a test protocol consisting of 10 intervals of 20 revolutions each interspersed with active resting periods of 50 s. The 20 revolutions are a combination of both forward and backward grinding and an exponentially rising resistance. A custom-made grinding ergometer was developed with computer-controlled resistance and capable of collecting data during the test. The data collected can be used to find measures of grinding performance such as peak power, time to complete and the decline in repeated grinding performance.
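The performance measures named above (peak power and the decline over repeated intervals) reduce to simple arithmetic on the data logged by the ergometer; a minimal sketch with invented sample values:

```python
def peak_power(power_samples):
    """Highest instantaneous power (W) recorded during a test."""
    return max(power_samples)

def performance_decline(interval_times):
    """Relative slowdown from the first to the last interval."""
    return (interval_times[-1] - interval_times[0]) / interval_times[0]

# Invented example data: power samples (W) and per-interval
# completion times (s) for the 10-interval protocol.
powers = [310, 455, 430]
times = [12.1, 12.4, 12.9, 13.6, 14.2]

print(peak_power(powers))                     # 455
print(round(performance_decline(times), 3))   # 0.174
```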

  7. Investigations in quantum games using EPR-type set-ups

    NASA Astrophysics Data System (ADS)

    Iqbal, Azhar

    2006-04-01

Research in quantum games has flourished during recent years. However, opinion seems to remain divided about their true quantum character and content. For example, one argument says that quantum games are nothing but 'disguised' classical games and that to quantize a game is equivalent to replacing the original game by a different classical game. The present thesis contributes towards the ongoing debate about the quantum nature of quantum games by developing two approaches addressing the related issues. Both approaches take Einstein-Podolsky-Rosen (EPR)-type experiments as the underlying physical set-ups to play two-player quantum games. In the first approach, the players' strategies are unit vectors in their respective planes, with the knowledge of coordinate axes being shared between them. Players perform measurements in an EPR-type setting and their payoffs are defined as functions of the correlations, i.e. without reference to classical or quantum mechanics. Classical bimatrix games are reproduced if the input states are classical and perfectly anti-correlated, as for a classical correlation game. However, for a quantum correlation game, with an entangled singlet state as input, qualitatively different solutions are obtained. The second approach uses the result that, when the predictions of a Local Hidden Variable (LHV) model are made to violate the Bell inequalities, some probability measures assume negative values. With the requirement that classical games result when the predictions of a LHV model do not violate the Bell inequalities, our analysis looks at the impact which the emergence of negative probabilities has on the solutions of two-player games which are physically implemented using EPR-type experiments.

  8. INTERIOR PERSPECTIVE, LOOKING SOUTH SOUTHWEST WITH FIELD SET UP IN ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    INTERIOR PERSPECTIVE, LOOKING SOUTH SOUTHWEST WITH FIELD SET UP IN FOOTBALL CONFIGURATION. FIELD SEATING ROTATES TO ACCOMMODATE BASEBALL GAMES. - Houston Astrodome, 8400 Kirby Drive, Houston, Harris County, TX

  9. Research on mechanical and sensoric set-up for high strain rate testing of high performance fibers

    NASA Astrophysics Data System (ADS)

    Unger, R.; Schegner, P.; Nocke, A.; Cherif, C.

    2017-10-01

Within this research project, the tensile behavior of high performance fibers, such as carbon fibers, is investigated under high velocity loads. This contribution focuses on the clamp set-up of two testing machines. Based on a kinematic model, weight-optimized clamps are designed and evaluated. Analysis of the complex dynamic behavior of conventional high velocity testing machines has shown that the impact typically exhibits an elastic characteristic. This leads to barely predictable breaking speeds and fails at higher speeds, where the acceleration force exceeds the material specifications. Therefore, a plastic impact behavior has to be achieved, even at lower testing speeds. This type of impact behavior at lower speeds can be realized by means of minor test set-up adaptations.

  10. 9. VIEW SOUTH-SOUTHEAST STERN OF JFK, SCAFFOLDING SET UP FOR ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    9. VIEW SOUTH-SOUTHEAST STERN OF JFK, SCAFFOLDING SET UP FOR REMOUNTING OF PROPELLERS. - Naval Base Philadelphia-Philadelphia Naval Shipyard, Dry Dock No. 5, League Island, Philadelphia, Philadelphia County, PA

  11. Archaeological predictive model set.

    DOT National Transportation Integrated Search

    2015-03-01

This report is the documentation for Task 7 of the Statewide Archaeological Predictive Model Set. The goal of this project is to develop a set of statewide predictive models to assist the planning of transportation projects. PennDOT is developing t...

  12. Computer Center: Setting Up a Microcomputer Center--1 Person's Perspective.

    ERIC Educational Resources Information Center

    Duhrkopf, Richard, Ed.; Collins, Michael, A. J., Ed.

    1988-01-01

Considers eight components in setting up a microcomputer center for use with college classes. Discussions include hardware, software, physical facility, furniture, technical support, personnel, continuing financial expenditures, and security. (CW)

  13. Stereo particle image velocimetry set up for measurements in the wake of scaled wind turbines

    NASA Astrophysics Data System (ADS)

    Campanardi, Gabriele; Grassi, Donato; Zanotti, Alex; Nanos, Emmanouil M.; Campagnolo, Filippo; Croce, Alessandro; Bottasso, Carlo L.

    2017-08-01

Stereo particle image velocimetry measurements were carried out in the boundary layer test section of the Politecnico di Milano large wind tunnel to survey the wake of a scaled wind turbine model designed and developed by Technische Universität München. The stereo PIV instrumentation was set up to survey the three velocity components on cross-flow planes at different longitudinal locations. The area of investigation covered the entire extent of the wind turbine's wake, which was scanned using two separate traversing systems for the laser and the cameras. This instrumentation set-up made it possible to rapidly obtain high-quality results suitable for characterising the behaviour of the flow field in the wake of the scaled wind turbine. This would be very useful for evaluating the performance of wind farm control methodologies based on wake redirection and for validating CFD tools.

  14. A Magnetic Set-Up to Help Teach Newton's Laws

    ERIC Educational Resources Information Center

    Panijpan, Bhinyo; Sujarittham, Thanida; Arayathanitkul, Kwan; Tanamatayarat, Jintawat; Nopparatjamjomras, Suchai

    2009-01-01

    A set-up comprising a magnetic disc, a solenoid and a mechanical balance was used to teach first-year physics students Newton's third law with the help of a free body diagram. The image of a floating magnet immobilized by the solenoid's repulsive force should help dispel a common misconception of students as regards the first law: that stationary…

  15. 41 CFR 102-34.285 - Where can we obtain help in setting up a maintenance program?

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 41 Public Contracts and Property Management 3 2011-01-01 2011-01-01 false Where can we obtain help in setting up a maintenance program? 102-34.285 Section 102-34.285 Public Contracts and Property... obtain help in setting up a maintenance program? For help in setting up a maintenance program, contact...

  16. The learning unit "Orthodontic set-up" as a new-media module in teaching.

    PubMed

    Asselmeyer, T; Fischer, V; Matthies, H; Schwestka-Polly, R

    2004-07-01

The present study examines the extent to which computer-assisted learning units, available independently of place and time, are used in self-study as a supplement to classical classroom instruction of dental students, with the aim of obtaining indications of whether such teaching modules improve training in orthodontics. Attention was focussed on the implementation and evaluation of the "Orthodontic set-up" teaching module, which can be accessed via the university's Internet and intranet. The didactic arrangement offered classical university courses (four lectures on the subjects of occlusion, function, diagnostics, and therapy) in parallel with the electronically communicated teaching contents. In addition, intensive supervision during the production of the set-up was guaranteed. The use of this multimedia learning concept was in general assessed positively by the 63 students surveyed in the 2002/03 winter semester. The results revealed, on the one hand, the intensity of use and patterns of knowledge acquisition (use types); on the other hand, in terms of professional relevance, the contents were found to be well explained, didactically attractive, and understandably presented. However, a number of drawbacks were also mentioned (technical and time problems; qualification deficits). The experience gained in this project should encourage further investment in the development of alternative university didactic models.

  17. A new level set model for cell image segmentation

    NASA Astrophysics Data System (ADS)

    Ma, Jing-Feng; Hou, Kai; Bao, Shang-Lian; Chen, Chun

    2011-02-01

In this paper we first determine three phases of cell images: background, cytoplasm and nucleolus, according to the general physical characteristics of cell images, and then develop a variational model, based on these characteristics, to segment the nucleolus and cytoplasm from their relatively complicated backgrounds. In addition, information obtained by preprocessing the cell images with the Otsu algorithm is used to initialize the level set function in the model, which speeds up the segmentation and yields satisfactory results in cell image processing.
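The initialization step described above, seeding the level set function from an Otsu threshold, can be sketched as follows. This is a pure-Python toy (the 3×3 image is invented), not the authors' implementation:

```python
def otsu_threshold(pixels, levels=256):
    """Otsu's method: pick the gray level that maximizes the
    between-class variance of the two resulting pixel classes."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    sum_all = sum(i * h for i, h in enumerate(hist))
    best_t, best_var = 0, -1.0
    w0 = sum0 = 0
    for t in range(levels):
        w0 += hist[t]
        if w0 == 0 or w0 == total:
            continue
        sum0 += t * hist[t]
        m0 = sum0 / w0                       # mean of class 0 (<= t)
        m1 = (sum_all - sum0) / (total - w0)  # mean of class 1 (> t)
        var_between = w0 * (total - w0) * (m0 - m1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

def init_level_set(image, threshold):
    """Seed phi as +1 inside the thresholded region, -1 outside."""
    return [[1.0 if p > threshold else -1.0 for p in row] for row in image]

# Invented toy "cell image": dark background, bright object pixels.
image = [[10, 12, 200], [11, 199, 210], [9, 13, 205]]
t = otsu_threshold([p for row in image for p in row])
phi = init_level_set(image, t)
```

With an initialization this close to the final object boundary, the variational evolution has far less work to do, which is the speed-up the abstract refers to.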

  18. Setting Up the JBrowse Genome Browser

    PubMed Central

    Skinner, Mitchell E; Holmes, Ian H

    2010-01-01

    JBrowse is a web-based tool for visualizing genomic data. Unlike most other web-based genome browsers, JBrowse exploits the capabilities of the user's web browser to make scrolling and zooming fast and smooth. It supports the browsers used by almost all internet users, and is relatively simple to install. JBrowse can utilize multiple types of data in a variety of common genomic data formats, including genomic feature data in bioperl databases, GFF files, and BED files, and quantitative data in wiggle files. This unit describes how to obtain the JBrowse software, set it up on a Linux or Mac OS X computer running as a web server and incorporate genome annotation data from multiple sources into JBrowse. After completing the protocols described in this unit, the reader will have a web site that other users can visit to browse the genomic data. PMID:21154710

  19. Single ion hit detection set-up for the Zagreb ion microprobe

    NASA Astrophysics Data System (ADS)

    Smith, R. W.; Karlušić, M.; Jakšić, M.

    2012-04-01

    Irradiation of materials by heavy ions accelerated in MV tandem accelerators may lead to the production of latent ion tracks in many insulators and semiconductors. If irradiation is performed in a high resolution microprobe facility, ion tracks can be ordered with submicrometer positioning precision. However, full control of the ion track positioning can only be achieved by a reliable ion hit detection system that provides a trigger signal irrespective of the type and thickness of the material being irradiated. The most useful process that can be utilised for this purpose is the emission of secondary electrons from the sample surface that follows the ion impact. The status report of the set-up presented here is based on the use of a channel electron multiplier (CEM) detector mounted on an interchangeable sample holder that is inserted into the chamber in close geometry along with the sample to be irradiated. The set-up has been tested at the Zagreb ion microprobe for different ions and energies, as well as different geometrical arrangements. For heavy-ion energies below 1 MeV/amu, the results show that efficient (100%) control of ion impact can be achieved only for ions heavier than silicon. The successful use of the set-up is demonstrated by the production of ordered single ion tracks in a polycarbonate film and by monitoring fluence during ion microbeam patterning of Foturan glass.

  20. A user-friendly technical set-up for infrared photography of forensic findings.

    PubMed

    Rost, Thomas; Kalberer, Nicole; Scheurer, Eva

    2017-09-01

    Infrared photography is interesting for a use in forensic science and forensic medicine since it reveals findings that normally are almost invisible to the human eye. Originally, infrared photography has been made possible by the placement of an infrared light transmission filter screwed in front of the camera objective lens. However, this set-up is associated with many drawbacks such as the loss of the autofocus function, the need of an external infrared source, and long exposure times which make the use of a tripod necessary. These limitations prevented up to now the routine application of infrared photography in forensics. In this study the use of a professional modification inside the digital camera body was evaluated regarding camera handling and image quality. This permanent modification consisted of the replacement of the in-built infrared blocking filter by an infrared transmission filter of 700 nm and 830 nm, respectively. The application of this camera set-up for the photo-documentation of forensically relevant post-mortem findings was investigated in examples of trace evidence such as gunshot residues on the skin, in external findings, e.g. hematomas, as well as in an exemplary internal finding, i.e., Wischnewski spots in a putrefied stomach. The application of scattered light created by indirect flashlight yielded a more uniform illumination of the object, and the use of the 700 nm filter resulted in better pictures than the 830 nm filter. Compared to pictures taken under visible light, infrared photographs generally yielded better contrast. This allowed for discerning more details and revealed findings which were not visible otherwise, such as imprints on a fabric and tattoos in mummified skin. The permanent modification of a digital camera by building in a 700 nm infrared transmission filter resulted in a user-friendly and efficient set-up which qualified for the use in daily forensic routine. Main advantages were a clear picture in the viewfinder, an auto

  1. Prototype electron lens set-up for the Tevatron beam-beam compensation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crawford, C.; Saewert, G.; Santucci, J.

    1999-05-17

    A prototype "electron lens" for the Tevatron beam-beam compensation project is commissioned at Fermilab. We describe the set-up, report results of the first tests of the electron beam, and discuss future plans.

  2. A model for scale up of family health innovations in low-income and middle-income settings: a mixed methods study.

    PubMed

    Bradley, Elizabeth H; Curry, Leslie A; Taylor, Lauren A; Pallas, Sarah Wood; Talbert-Slagle, Kristina; Yuan, Christina; Fox, Ashley; Minhas, Dilpreet; Ciccone, Dana Karen; Berg, David; Pérez-Escamilla, Rafael

    2012-01-01

    Many family health innovations that have been shown to be both efficacious and cost-effective fail to scale up for widespread use, particularly in low-income and middle-income countries (LMICs). Although individual cases of successful scale-up, in which widespread take-up occurs, have been described, we lack an integrated and practical model of scale-up that may be applicable to a wide range of public health innovations in LMICs. Our aim was to develop an integrated and practical model of scale-up that synthesises experiences of family health programmes in LMICs. We conducted a mixed methods study that included in-depth interviews with 33 key informants and a systematic review of peer-reviewed and grey literature from 11 electronic databases and 20 global health agency web sites. We included key informants and studies that reported on the scale-up of several family health innovations: Depo-Provera as an example of a product innovation, exclusive breastfeeding as an example of a health behaviour innovation, community health workers (CHWs) as an example of an organisational innovation, and social marketing as an example of a business model innovation. Key informants were drawn from non-governmental, government and international organisations using snowball sampling. An article was excluded if it did not meet the study's definition of the innovation; did not address dissemination, diffusion, scale-up or sustainability of the innovation; did not address low-income or middle-income countries; was superficial in its discussion and/or did not provide empirical evidence about scale-up of the innovation; was not available online in full text; or was not available in English, French, Spanish or Portuguese. This resulted in a final sample of 41 peer-reviewed articles and 30 grey literature sources. We used the constant comparative method of qualitative data analysis to extract recurrent themes from the interviews, and we integrated these themes with findings from the

  3. Persisting effect of community approaches to resuscitation.

    PubMed

    Nielsen, Anne Møller; Isbye, Dan Lou; Lippert, Freddy Knudsen; Rasmussen, Lars Simon

    2014-11-01

    On the Danish island of Bornholm an intervention was carried out during 2008-2010 aiming at increasing out-of-hospital cardiac arrest (OHCA) survival. The intervention included mass media focus on resuscitation and widespread educational activities. The aim of this study was to compare the bystander BLS rate and survival after OHCA on Bornholm in a 3-year follow-up period after the intervention took place. Data on OHCA on Bornholm were collected from September 28th, 2010 to September 27th, 2013 and compared to data from the intervention period, September 28th, 2008 to September 27th, 2010. The bystander BLS rate for non-EMS witnessed OHCAs with presumed cardiac aetiology was significantly higher in the follow-up period (70% [95% CI 61-77] vs. 47% [95% CI 37-57], p=0.001). AEDs were deployed in 22 (18%) cases in the follow-up period and a shock was provided in 13 cases. There was no significant change in all-rhythm 30-day survival for non-EMS witnessed OHCAs with presumed cardiac aetiology (6.7% [95% CI 3-13] in the follow-up period; vs. 4.6% [95% CI 1-12], p=0.76). In a 3-year follow-up period after an intervention engaging laypersons in resuscitation through mass education in BLS combined with a media focus on resuscitation, we observed a persistent significant increase in the bystander BLS rate for all OHCAs with presumed cardiac aetiology. There was no significant difference in 30-day survival. Copyright © 2014 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
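    For illustration, confidence intervals and p-values of the kind reported above can be computed with standard formulas. The sketch below uses the Wilson score interval and a pooled two-proportion z-test; the abstract does not state which exact methods were used, so both choices are assumptions:

```python
import math

def wilson_ci(successes, n, z=1.96):
    """Wilson score 95% confidence interval for a proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

def two_proportion_z(s1, n1, s2, n2):
    """Two-sided z-test comparing two independent proportions
    using the pooled standard error."""
    p1, p2 = s1 / n1, s2 / n2
    p = (s1 + s2) / (n1 + n2)
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # two-sided p-value from the normal distribution
    pval = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, pval
```

    With illustrative group sizes of 100, wilson_ci(70, 100) gives an interval of roughly 60-78%, comparable in width to the 70% [95% CI 61-77] figure reported for the follow-up period (the study's denominators differ).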

  4. Does a simple syringe applicator enhance bone cement set up time in knee arthroplasty?

    PubMed Central

    Sodhi, Nipun; Dalton, Sarah E.; Khlopas, Anton; Sultan, Assem A.; Curtis, Gannon L.; Harb, Matthew A.; Naziri, Qais; Barrington, John W.; Mont, Michael A.

    2017-01-01

    Background The time required for polymethylmethacrylate (PMMA) cement curing or hardening can be modified by a number of variables, including the mixing technique and the temperature and pressure at which the process takes place. Therefore, the purpose of this study was to evaluate two different methods of PMMA application in terms of set up time. Specifically, we (I) compared the PMMA set up time of cement that remained in the mixing bowl to cement that was placed in a syringe and (II) extrapolated the associated annual cost difference at the national and individual surgeon levels. Methods The cement set up time was measured for a total of 146 consecutive patients who underwent either unicompartmental knee arthroplasty (n=136) or patellofemoral arthroplasty (n=10) between January 2016 and April 2017. One pack of PMMA powder and monomer was placed in a 300 mL plastic bowl and mixed with a tongue depressor. Then, 50 mL of the mixed PMMA was placed in a sterile 60 mL syringe with the tip cut to a 6-mm opening, and the syringe was used to apply the cement to the bone and the prosthesis surface. The remaining unused cement in the syringe (syringe group) and in the plastic bowl (bowl group) was removed and formed into two separate 2 cm diameter cubes that were allowed to cure at room temperature on a sterile set of osteotomes. The two cubes of cement were timed for complete PMMA curing. A two-tailed Student's t-test was used to compare the curing time for the two groups. Annual cost differences were calculated at the national and individual surgeon level. The total number of daily cases performed and the operative time savings using the syringe applicator were used to find daily and annual cost savings. Results The mean time for the cement to set up in the bowl group was 16.8±2.1 minutes, and the mean time for cement set up in the syringe group was 15.1±1.7 minutes. Compared to the bowl group cement set up time, the
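    The comparison of curing times is a standard two-sample t-test. A minimal sketch of the equal-variance (Student's) t statistic follows; the sample values in the test are synthetic, not the study's measurements:

```python
import math
from statistics import mean, stdev

def pooled_t(sample_a, sample_b):
    """Two-sample (equal-variance) t statistic and degrees of freedom,
    as used for a two-tailed Student's t-test."""
    na, nb = len(sample_a), len(sample_b)
    ma, mb = mean(sample_a), mean(sample_b)
    va, vb = stdev(sample_a) ** 2, stdev(sample_b) ** 2
    # pooled variance weights each group by its degrees of freedom
    sp2 = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)
    t = (ma - mb) / math.sqrt(sp2 * (1 / na + 1 / nb))
    return t, na + nb - 2
```

    The resulting statistic is compared against the t distribution with na + nb - 2 degrees of freedom to obtain the two-tailed p-value.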

  5. Cone beam CT-based set-up strategies with and without rotational correction for stereotactic body radiation therapy in the liver.

    PubMed

    Bertholet, Jenny; Worm, Esben; Høyer, Morten; Poulsen, Per

    2017-06-01

    Accurate patient positioning is crucial in stereotactic body radiation therapy (SBRT) due to the high dose per fraction. Cone-beam computed tomography (CBCT) is often used for patient positioning based on radio-opaque markers. We compared six CBCT-based set-up strategies with or without rotational correction. Twenty-nine patients with three implanted markers received liver SBRT in 3-6 fractions. The markers were delineated on the mid-ventilation phase of a 4D planning CT. One pretreatment CBCT was acquired per fraction. Set-up strategy 1 used only translational correction based on manual marker match between the CBCT and the planning CT. Set-up strategy 2 used automatic 6 degrees-of-freedom registration of the vertebrae closest to the target. The 3D marker trajectories were also extracted from the projections, and the mean position of each marker was calculated and used for set-up strategies 3-6. Translational correction only was used for strategy 3. Translational and rotational corrections were used for strategies 4-6, with the rotation being either vertebrae based (strategy 4), or marker based and constrained to ±3° (strategy 5) or unconstrained (strategy 6). The resulting set-up error was calculated as the 3D root-mean-square set-up error of the three markers. The set-up error of the spinal cord was calculated for all strategies. The bony anatomy set-up (2) had the largest set-up error (5.8 mm). The marker-based set-up with unconstrained rotations (6) had the smallest set-up error (0.8 mm) but the largest spinal cord set-up error (12.1 mm). The marker-based set-up with translational correction only (3) or with bony anatomy rotational correction (4) had equivalent set-up errors (1.3 mm), but rotational correction reduced the spinal cord set-up error from 4.1 mm to 3.5 mm. Marker-based set-up was substantially better than bony-anatomy set-up. Rotational correction may improve the set-up, but further investigations are required to determine the optimal correction
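    The evaluation metric, the 3D root-mean-square set-up error over the three markers, can be written compactly. This is a generic sketch; coordinates and units in the test are illustrative:

```python
import math

def rms_setup_error(planned, measured):
    """3D root-mean-square set-up error over marker positions.
    planned/measured: lists of (x, y, z) coordinates in mm."""
    # squared 3D displacement of each marker from its planned position
    sq = [sum((p - m) ** 2 for p, m in zip(pp, mm))
          for pp, mm in zip(planned, measured)]
    return math.sqrt(sum(sq) / len(sq))
```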

  6. The urban boundary-layer field campaign in marseille (ubl/clu-escompte): set-up and first results

    NASA Astrophysics Data System (ADS)

    Mestayer, P.G.; Durand, P.; Augustin, P.; Bastin, S.; Bonnefond, J.-M.; Benech, B.; Campistron, B.; Coppalle, A.; Delbarre, H.; Dousset, B.; Drobinski, P.; Druilhet, A.; Frejafon, E.; Grimmond, C.S.B.; Groleau, D.; Irvine, M.; Kergomard, C.; Kermadi, S.; Lagouarde, J.-P.; Lemonsu, A.; Lohou, F.; Long, N.; Masson, V.; Moppert, C.; Noilhan, J.; Offerle, B.; Oke, T.R.; Pigeon, G.; Puygrenier, V.; Roberts, S.; Rosant, J.-M.; Sanid, F.; Salmond, J.; Talbaut, M.; Voogt, J.

    The UBL/CLU (urban boundary layer/couche limite urbaine) observation and modelling campaign is a side-project of the regional photochemistry campaign ESCOMPTE. UBL/CLU focuses on the dynamics and thermodynamics of the urban boundary layer of Marseille, on the Mediterranean coast of France. The objective of UBL/CLU is to document the four-dimensional structure of the urban boundary layer and its relation to the heat and moisture exchanges between the urban canopy and the atmosphere during periods of low wind conditions, from June 4 to July 16, 2001. The project took advantage of the comprehensive observational set-up of the ESCOMPTE campaign over the Berre-Marseille area, especially the ground-based remote sensing, airborne measurements, and the intensive documentation of the regional meteorology. Additional instrumentation was installed as part of UBL/CLU. Analysis objectives focus on (i) validation of several energy balance computational schemes such as LUMPS, TEB and SM2-U, (ii) ground truth and urban canopy signatures suitable for the estimation of urban albedos and aerodynamic surface temperatures from satellite data, (iii) high resolution mapping of urban land cover, land-use and aerodynamic parameters used in UBL models, and (iv) testing the ability of high resolution atmospheric models to simulate the structure of the UBL during land and sea breezes, and the related transport and diffusion of pollutants over different districts of the city. This paper presents initial results from such analyses and details of the overall experimental set-up.

  7. Practical guidelines for setting up neurosurgery skills training cadaver laboratory in India.

    PubMed

    Suri, Ashish; Roy, Tara Sankar; Lalwani, Sanjeev; Deo, Rama Chandra; Tripathi, Manjul; Dhingra, Renu; Bhardwaj, Daya Nand; Sharma, Bhawani Shankar

    2014-01-01

    Though the necessity of cadaver dissection has been recognized by the medical fraternity, and described as early as 600 BC in India, no practical guidelines are available in the world literature for setting up a basic cadaver dissection laboratory for neurosurgery skills training. Hands-on dissection practice in microscopic and endoscopic procedures is essential in technologically demanding modern neurosurgery training, where ethical issues, cost constraints, medico-legal pitfalls, and resident duty time restrictions have resulted in fewer opportunities to learn. Collaboration between anatomy, forensic medicine, and neurosurgery is essential for developing a workflow of cadaver procurement, preservation, storage, dissection, and disposal, along with setting up guidelines for ethical and legal concerns.

  8. Nurse-led clinics: 10 essential steps to setting up a service.

    PubMed

    Hatchett, Richard

    This article outlines 10 key steps for practitioners to consider when setting up and running a nurse-led clinic. It lays emphasis on careful planning, professional development and the need to audit and evaluate the service to ensure the clinic is measurably effective.

  9. Setting up chronic disease programs: perspectives from Aboriginal Australia.

    PubMed

    Hoy, Wendy E; Kondalsamy-Chennakesavan, S; Smith, J; Sharma, S; Davey, R; Gokel, G

    2006-01-01

    To share some perspectives on setting up programs to improve the management of hypertension, renal disease, and diabetes in high-risk populations, derived from experience in remote Australian Aboriginal settings. Regular integrated checks for chronic diseases and their risk factors, together with appropriate treatment, are essential elements of regular adult health care. Programs should be run by local health workers, following algorithms for testing and treatment, with back-up from nurses. Constant evaluation is essential. COMPONENTS: These include testing, treatment, education for individuals and communities, skills and career development for staff, ongoing evaluation, program modification, and advocacy. Target groups, elements, and frequency of testing, as well as the reagents and treatment modalities, must be designed for local circumstances, which include disease burden and impact, competing priorities, and available resources. Pilot surveys or record reviews can define target groups and conditions. Opportunistic testing will suffice if people are seen with some regularity for other conditions; otherwise, systematic screening is needed, preferably embedded in primary care streams. The chief goal of treatment is to lower blood pressure and, if the patient is diabetic, to control hyperglycemia. Many people will need multiple drugs for many years. Challenges include lack of resources, competing demands of acute care, the burden of treatment when disease rates are high, problems with information systems, and, in our setting, health worker absenteeism. Businesses, altruistic organizations, and pharmaceutical and biotechnology companies might fund feasibility studies. Where governments or insurance companies already support health services, they must ultimately commit to chronic disease services over the long term. Effective advocacy requires the presentation of an integrated view of chronic disease and a single cross-disciplinary program for its containment. Arguments based on

  10. Duty to speak up in the health care setting a professionalism and ethics analysis.

    PubMed

    Topazian, Rachel J; Hook, C Christopher; Mueller, Paul S

    2013-11-01

    Staff and students working in health care settings are sometimes reluctant to speak up when they perceive patients to be at risk for harm. In this article, we describe four incidents that occurred at our institution (Mayo Clinic). In two of them, health care professionals failed to speak up, which resulted in harm; in the other two, they did speak up, which prevented harm and improved patient care. We analyzed each scenario using the Physician's Charter on Medical Professionalism and prima facie ethics principles to determine whether principles were violated or upheld. We conclude that anyone who works in a health care setting has a duty to speak up when a patient faces harm. We also provide guidance for health care institutions on promoting a culture in which speaking up is encouraged and integrated into routine practice.

  11. Investigation of rare nuclear decays with the DAMA set-ups

    NASA Astrophysics Data System (ADS)

    Bernabei, Rita; Cappella, Fabio

    2018-03-01

    The DAMA project has obtained many competitive or new results in the search for various rare nuclear processes. Most of them have been obtained with the help of many different high purity crystal scintillators, which have been measured in the low-background DAMA set-ups located in the Gran Sasso underground laboratory of INFN. In this paper, the main results will be summarized.

  12. Reliable measurement of E. coli single cell fluorescence distribution using a standard microscope set-up.

    PubMed

    Cortesi, Marilisa; Bandiera, Lucia; Pasini, Alice; Bevilacqua, Alessandro; Gherardi, Alessandro; Furini, Simone; Giordano, Emanuele

    2017-01-01

    Quantifying gene expression at the single cell level is fundamental for the complete characterization of synthetic gene circuits, due to the significant impact of noise and inter-cellular variability on the system's functionality. Commercial set-ups that allow the acquisition of fluorescent signals at the single cell level (flow cytometers or quantitative microscopes) are expensive apparatuses that are hardly affordable by small laboratories. A protocol is presented that makes a standard optical microscope able to acquire quantitative, single cell, fluorescent data from a bacterial population transformed with synthetic gene circuitry. Single cell fluorescence values, acquired with a microscope set-up and processed with custom-made software, are compared with results obtained with a flow cytometer in a bacterial population transformed with the same gene circuitry. The high correlation between data from the two experimental set-ups, with a correlation coefficient > 0.99 over the tested dynamic range, shows that a standard optical microscope, when coupled with appropriate software for image processing, can be used for quantitative single-cell fluorescence measurements. The calibration of the set-up, together with its validation, is described. The experimental protocol described in this paper makes quantitative measurement of single cell fluorescence accessible to laboratories equipped with standard optical microscope set-ups. Our method allows for an affordable measurement of intercellular variability; a better understanding of this phenomenon will improve our comprehension of cellular behaviors and the design of synthetic gene circuits. All the required software is freely available to the synthetic biology community (MUSIQ: Microscope flUorescence SIngle cell Quantification).
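    The agreement between microscope and flow-cytometer measurements is summarized by a correlation coefficient; a Pearson correlation is a plausible choice here, though the abstract does not specify which coefficient was computed. A minimal sketch:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two paired samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)
```

    Applied to paired per-cell fluorescence values from the two instruments, a value above 0.99 indicates that the cheaper set-up preserves the population distribution measured by the flow cytometer.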

  13. The Objective Borderline Method (OBM): A Probability-Based Model for Setting up an Objective Pass/Fail Cut-Off Score in Medical Programme Assessments

    ERIC Educational Resources Information Center

    Shulruf, Boaz; Turner, Rolf; Poole, Phillippa; Wilkinson, Tim

    2013-01-01

    The decision to pass or fail a medical student is a "high stakes" one. The aim of this study is to introduce and demonstrate the feasibility and practicality of a new objective standard-setting method for determining the pass/fail cut-off score from borderline grades. Three methods for setting up pass/fail cut-off scores were compared: the…

  14. New large solar photocatalytic plant: set-up and preliminary results.

    PubMed

    Malato, S; Blanco, J; Vidal, A; Fernández, P; Cáceres, J; Trincado, P; Oliveira, J C; Vincent, M

    2002-04-01

    A European industrial consortium called SOLARDETOX has been created as the result of an EC-DGXII BRITE-EURAM-III-financed project on solar photocatalytic detoxification of water. The project objective was to develop a simple, efficient and commercially competitive water-treatment technology, based on compound parabolic collector (CPC) solar collectors and TiO2 photocatalysis, to make easy design and installation possible. The design, set-up and preliminary results of the main project deliverable, the first European industrial solar detoxification treatment plant, are presented. This plant has been designed for the batch treatment of 2 m3 of water with a 100 m2 collector-aperture area and aqueous aerated suspensions of polycrystalline TiO2 irradiated by sunlight. Fully automatic control reduces operation and maintenance manpower. Plant behaviour has been compared (using dichloroacetic acid and cyanide at 50 mg l(-1) initial concentration as model compounds) with the small CPC pilot plants installed at the Plataforma Solar de Almería several years ago. The first results with high-content cyanide (1 g l(-1)) waste water are presented and the plant treatment capacity is calculated.

  15. Kuipers sets up the CSA-CP in the U.S. Laboratory

    NASA Image and Video Library

    2012-01-26

    ISS030-E-156455 (26 Jan. 2012) --- European Space Agency astronaut Andre Kuipers, Expedition 30 flight engineer, sets up the Compound Specific Analyzer - Combustion Products (CSA-CP) in the Destiny laboratory of the International Space Station. The purpose of the analyzer is to measure the concentrations of carbon monoxide, hydrogen cyanide, hydrogen chloride and oxygen.

  16. Setting up spaces for collaboration in industry between researchers from the natural and social sciences.

    PubMed

    Flipse, Steven M; van der Sanden, Maarten C A; Osseweijer, Patricia

    2014-03-01

    Policy makers call upon researchers from the natural and social sciences to collaborate for the responsible development and deployment of innovations. Collaborations are projected to enhance both the technical quality of innovations, and the extent to which relevant social and ethical considerations are integrated into their development. This could make these innovations more socially robust and responsible, particularly in new and emerging scientific and technological fields, such as synthetic biology and nanotechnology. Some researchers from both fields have embarked on collaborative research activities, using various Technology Assessment approaches and Socio-Technical Integration Research activities such as Midstream Modulation. Still, practical experience of collaborations in industry is limited, while much may be expected from industry in terms of socially responsible innovation development. Experience in and guidelines on how to set up and manage such collaborations are not easily available. Having carried out various collaborative research activities in industry ourselves, we aim to share in this paper our experiences in setting up and working in such collaborations. We highlight the possibilities and boundaries in setting up and managing collaborations, and discuss how we have experienced the emergence of 'collaborative spaces.' Hopefully our findings can facilitate and encourage others to set up collaborative research endeavours.

  17. Benchmarking of protein descriptor sets in proteochemometric modeling (part 2): modeling performance of 13 amino acid descriptor sets

    PubMed Central

    2013-01-01

    Background While a large body of work exists on comparing and benchmarking descriptors of molecular structures, a similar comparison of protein descriptor sets is lacking. Hence, in the current work a total of 13 amino acid descriptor sets have been benchmarked with respect to their ability to establish bioactivity models. The descriptor sets included in the study are Z-scales (3 variants), VHSE, T-scales, ST-scales, MS-WHIM, FASGAI, BLOSUM, and a novel protein descriptor set (termed ProtFP (4 variants)); in addition we created and benchmarked three pairs of descriptor combinations. Prediction performance was evaluated in seven structure-activity benchmarks which comprise Angiotensin Converting Enzyme (ACE) dipeptidic inhibitor data and three proteochemometric data sets, namely (1) GPCR ligands modeled against a GPCR panel, (2) enzyme inhibitors (NNRTIs) with associated bioactivities against a set of HIV enzyme mutants, and (3) enzyme inhibitors (PIs) with associated bioactivities on a large set of HIV enzyme mutants. Results The amino acid descriptor sets compared here show similar performance (<0.1 log units RMSE difference and <0.1 difference in MCC), while errors for individual proteins were in some cases found to be larger than those resulting from descriptor set differences (>0.3 log units RMSE difference and >0.7 difference in MCC). Combining different descriptor sets generally leads to better modeling performance than utilizing individual sets. The best performers were Z-scales (3) combined with ProtFP (Feature), or Z-scales (3) combined with an average Z-scale value for each target, while ProtFP (PCA8), ST-scales, and ProtFP (Feature) rank last. Conclusions While amino acid descriptor sets capture different aspects of amino acids, their ability to be used for bioactivity modeling is still, on average, surprisingly similar. Still, combining sets describing complementary information consistently leads to small but consistent improvement in
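    The two performance measures used in the benchmark, RMSE on predicted activities and the Matthews correlation coefficient (MCC), can be sketched as follows; this is a generic reimplementation, not the study's code:

```python
import math

def rmse(pred, obs):
    """Root-mean-square error between predicted and observed activities."""
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(pred))

def mcc(tp, tn, fp, fn):
    """Matthews correlation coefficient from a 2x2 confusion matrix;
    ranges from -1 (total disagreement) to +1 (perfect prediction)."""
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return ((tp * tn) - (fp * fn)) / denom if denom else 0.0
```

    RMSE captures the error on continuous activity predictions (in log units here), while MCC summarizes binary active/inactive classification, which is why the benchmark reports both.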

  18. The effect of systematic set-up deviations on the absorbed dose distribution for left-sided breast cancer treated with respiratory gating

    NASA Astrophysics Data System (ADS)

    Edvardsson, A.; Ceberg, S.

    2013-06-01

    The aim of this study was 1) to investigate interfraction set-up uncertainties for patients treated with respiratory gating for left-sided breast cancer, 2) to investigate the effect of the interfraction set-up deviations on the absorbed dose distribution for the target and organs at risk (OARs), and 3) to optimize the set-up correction strategy. By acquiring multiple set-up images, the systematic set-up deviation was evaluated. The effect of the systematic set-up deviation on the absorbed dose distribution was evaluated by 1) simulation in the treatment planning system and 2) measurements with a biplanar diode array. The set-up deviations could be decreased using a no-action-level correction strategy. Not using the clinically implemented adaptive maximum likelihood factor for the gating patients resulted in a better set-up. When the uncorrected set-up deviations were simulated, the average mean absorbed dose increased from 1.38 to 2.21 Gy for the heart, from 4.17 to 8.86 Gy for the left anterior descending coronary artery, and from 5.80 to 7.64 Gy for the left lung. Respiratory gating can induce systematic set-up deviations which would result in increased mean absorbed dose to the OARs if not corrected for; they should therefore be corrected for by an appropriate correction strategy.

  19. Setting up Targeted Research Interviews: A Primer for Students and New Interviewers

    ERIC Educational Resources Information Center

    Noy, Darren

    2009-01-01

    This article analyzes key strategic considerations for setting up targeted research interviews, including human subjects and Institutional Review Board requirements, approaching respondents, the medium of contact, using technology, cultural conceptions of time and commitment, using networks, wading through bureaucracies, and watching for warning…

  20. Convex set and linear mixing model

    NASA Technical Reports Server (NTRS)

    Xu, P.; Greeley, R.

    1993-01-01

    A major goal of optical remote sensing is to determine surface compositions of the earth and other planetary objects. For assessment of composition, single pixels in multi-spectral images usually record a mixture of the signals from various materials within the corresponding surface area. In this report, we introduce a closed and bounded convex set as a mathematical model for linear mixing. This model has a clear geometric implication because the closed and bounded convex set is a natural generalization of a triangle in n-space. The endmembers are extreme points of the convex set. Every point in the convex closure of the endmembers is a linear mixture of those endmembers, which is exactly how linear mixing is defined. With this model, some general criteria for selecting endmembers could be described. This model can lead to a better understanding of linear mixing models.
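    Under this model, a two-endmember mixture can be inverted in closed form: project the pixel spectrum onto the segment between the endmembers and clamp the fraction to [0, 1] so the estimate stays in the convex set. A minimal sketch (function and variable names are ours, not from the report):

```python
def unmix_two(pixel, e1, e2):
    """Fraction f of endmember e1 (and 1 - f of e2) that best explains
    `pixel` under the linear mixing model, clamped to the convex hull."""
    d = [a - b for a, b in zip(e1, e2)]          # direction e1 - e2
    num = sum((p - b) * di for p, b, di in zip(pixel, e2, d))
    den = sum(di * di for di in d)
    f = num / den                                 # least-squares projection
    return max(0.0, min(1.0, f))  # clamp: mixtures lie in the convex set
```

    With more than two endmembers, the same idea becomes a constrained least-squares problem with fractions non-negative and summing to one, i.e. a search within the convex closure of the endmembers.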

  1. Kaleri sets up Russian MBI-12 Payload in the SM

    NASA Image and Video Library

    2010-12-09

    ISS026-E-008718 (8 Dec. 2010) --- Russian cosmonaut Alexander Kaleri, Expedition 26 flight engineer, sets up the Russian MBI-12 payload for a Sonokard experiment session in the Zvezda Service Module of the International Space Station. Kaleri used a sports shirt from the Sonokard kit with a special device in the pocket for testing a new method for acquiring physiological data without using direct contact on the skin. Measurements are recorded on a data card for return to Earth.

  2. American & Soviet engineers examine ASTP docking set-up following tests

    NASA Image and Video Library

    1974-07-10

    S74-25394 (10 July 1974) --- A group of American and Soviet engineers of the Apollo-Soyuz Test Project working group three examines an ASTP docking set-up following a docking mechanism fitness test conducted in Building 13 at the Johnson Space Center. Working Group No. 3 is concerned with ASTP docking problems and techniques. The joint U.S.-USSR ASTP docking mission in Earth orbit is scheduled for the summer of 1975. The Apollo docking mechanism is atop the Soyuz docking mechanism.

  3. Spatio-temporal dynamics of cod nursery areas in the Baltic Sea

    NASA Astrophysics Data System (ADS)

    Hinrichsen, H.-H.; von Dewitz, B.; Lehmann, A.; Bergström, U.; Hüssy, K.

    2017-06-01

    In this study the drift of eastern Baltic cod larvae and juveniles spawned within the historical eastern Baltic cod spawning grounds was investigated by detailed drift model simulations for the years 1971-2010, to examine the spatio-temporal dynamics of environmental suitability in the nursery areas of juvenile cod settlement. The results of the long-term model scenario runs, in which juvenile cod were treated as passively drifting particles, provided strong indications of long-term variations in settlement and potentially in the reproduction success of the historically important eastern Baltic cod nursery grounds. Only low proportions of juveniles hatched in the Arkona Basin and in the Gotland Basin were able to settle in their respective spawning grounds. Ocean currents were either unfavorable for the juveniles to reach suitable habitats or transported the juveniles to nursery grounds of neighboring subdivisions. Juveniles which hatched in the Bornholm Basin were most widely dispersed and showed the highest settlement probability, while the second highest settlement probability and horizontal dispersal were observed for juveniles originating from the Gdansk Deep. In a long-term perspective, wind-driven transport of larvae/juveniles positively affected settlement success predominantly in the Bornholm Basin and in the Bay of Gdansk. The Bornholm Basin has the potential to contribute on average 54% and the Bay of Gdansk 11% to the production of juveniles in the Baltic Sea. Furthermore, transport of juveniles surviving to the age of settlement with origin in the Bornholm Basin contributed on average 13% and 11% to the total settlement in the Arkona Basin and in the Gdansk Deep, respectively. The time-series of the simulated occupied juvenile cod habitat in the Bornholm Basin and in the Gdansk Deep showed a declining trend similar to that of the Fulton's K condition factor of demersal 1-group cod, which may confirm the importance of oxygen-dependent habitat

  4. CHALLENGES IN SETTING UP QUALITY CONTROL IN DIAGNOSTIC RADIOLOGY FACILITIES IN NIGERIA.

    PubMed

    Inyang, S O; Egbe, N O; Ekpo, E

    2015-01-01

    The Nigerian Nuclear Regulatory Authority (NNRA) was established to regulate and control the use of radioactive and radiation-emitting sources in Nigeria. Quality control (QC) of diagnostic radiology equipment forms part of the fundamental requirements for the authorization of diagnostic radiology facilities in the country. Some QC tests (output, exposure linearity and reproducibility) were performed on the x-ray machines in the facilities that took part in the study. A questionnaire was developed to evaluate the frequencies at which QC tests were conducted in the facilities and the challenges in setting up QC. Results show great variation in the values of the QC parameters measured. Inadequate cooperation by facility management, lack of QC equipment and insufficient staff were the major challenges in setting up QC in the facilities under study. The responses on the frequencies at which QC tests should be conducted did not correspond to the recommended standards, indicating that personnel were not familiar with QC implementation and may require further training on QC.

  5. The experimental set-up of the RIB in-flight facility EXOTIC

    NASA Astrophysics Data System (ADS)

    Pierroutsakou, D.; Boiano, A.; Boiano, C.; Di Meo, P.; La Commara, M.; Manea, C.; Mazzocco, M.; Nicoletto, M.; Parascandolo, C.; Signorini, C.; Soramel, F.; Strano, E.; Toniolo, N.; Torresi, D.; Tortone, G.; Anastasio, A.; Bettini, M.; Cassese, C.; Castellani, L.; Corti, D.; Costa, L.; De Fazio, B.; Galet, G.; Glodariu, T.; Grebosz, J.; Guglielmetti, A.; Molini, P.; Pontoriere, G.; Rocco, R.; Romoli, M.; Roscilli, L.; Sandoli, M.; Stroe, L.; Tessaro, M.; Zatti, P. G.

    2016-10-01

    We describe the experimental set-up of the Radioactive Ion Beam (RIB) in-flight facility EXOTIC consisting of: (a) two position-sensitive Parallel Plate Avalanche Counters (PPACs), dedicated to the event-by-event tracking of the produced RIBs and to time-of-flight measurements, and (b) the new high-granularity compact telescope array EXPADES (EXotic PArticle DEtection System), designed for nuclear physics and nuclear astrophysics experiments employing low-energy light RIBs. EXPADES consists of eight ΔE -Eres telescopes arranged in a cylindrical configuration around the target. Each telescope is made up of two Double Sided Silicon Strip Detectors (DSSSDs) with a thickness of 40/60 μm and 300 μm for the ΔE and Eres layer, respectively. Additionally, eight ionization chambers were constructed to be used as an alternative ΔE stage or, in conjunction with the entire DSSSD array, to build up more complex triple telescopes. New low-noise multi-channel charge-sensitive preamplifiers and spectroscopy amplifiers, associated with constant fraction discriminators, peak-and-hold and Time to Amplitude Converter circuits were developed for the electronic readout of the ΔE stage. Application Specific Integrated Circuit-based electronics was employed for the treatment of the Eres signals. An 8-channel, 12-bit multi-sampling 50 MHz Analog to Digital Converter, a Trigger Supervisor Board for handling the trigger signals of the whole experimental set-up and an ad hoc data acquisition system were also developed. The performance of the PPACs, EXPADES and the associated electronics was evaluated offline with standard α calibration sources and in-beam by measuring the scattering process for the systems 17O+58Ni and 17O+208Pb at incident energies around their respective Coulomb barriers and, subsequently, during the first experimental runs with the RIBs of the EXOTIC facility.
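    The ΔE-Eres arrangement separates particle species because energy loss in the thin stage scales roughly as mZ²/E for non-relativistic ions (Bethe scaling). A toy sketch of this identification principle, with invented constants and arbitrary units, not the EXPADES electronics:

```python
def pid_parameter(delta_e, e_res):
    """Crude ΔE-E identification parameter: with dE/dx ∝ mZ²/E, the product
    ΔE·(ΔE + Eres) is roughly constant for a given species at any energy."""
    return delta_e * (delta_e + e_res)

def delta_e_thin(m, z, e_total, c=0.05):
    """Thin-detector energy loss: dE ∝ mZ²/E; c lumps detector thickness and
    material constants (invented value, arbitrary units)."""
    return c * m * z * z / e_total

pids = {}
for name, m, z in [("proton", 1, 1), ("alpha", 4, 2)]:
    e_tot = 10.0
    de = delta_e_thin(m, z, e_tot)
    pids[name] = pid_parameter(de, e_tot - de)  # ≈ c·mZ², species-dependent
# alphas (mZ² = 16) sit far above protons (mZ² = 1) in the ΔE-E plane
```

    Plotting ΔE against Eres event by event produces the familiar hyperbolic bands, one per species, which is how telescope data are gated in practice.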

  6. Set up for Success: An Examination of the Ronald E. McNair Postbaccalaureate Achievement Program's Mentoring Component

    ERIC Educational Resources Information Center

    Wyre, Dwuena Cene

    2011-01-01

    Often, individuals are set up to fail. However, effective mentoring can set individuals up to succeed. This nonexperimental cross-sectional, predictive study examines the Ronald E. McNair Postbaccalaureate Achievement Program's mentoring component. Specific focus is placed on faculty mentor competency and its impact on McNair student intent to…

  7. A Novel Computer-Based Set-Up to Study Movement Coordination in Human Ensembles

    PubMed Central

    Alderisio, Francesco; Lombardi, Maria; Fiore, Gianfranco; di Bernardo, Mario

    2017-01-01

    Existing experimental works on movement coordination in human ensembles mostly investigate situations where each subject is connected to all the others through direct visual and auditory coupling, so that unavoidable social interaction affects their coordination level. Here, we present a novel computer-based set-up to study movement coordination in human groups so as to minimize the influence of social interaction among participants and implement different visual pairings between them. In so doing, players can only take into consideration the motion of a designated subset of the others. This allows the evaluation of the exclusive effects on coordination of the structure of interconnections among the players in the group and their own dynamics. In addition, our set-up enables the deployment of virtual computer players to investigate dyadic interaction between a human and a virtual agent, as well as group synchronization in mixed teams of human and virtual agents. We show how this novel set-up can be employed to study coordination both in dyads and in groups over different structures of interconnections, in the presence as well as in the absence of virtual agents acting as followers or leaders. Finally, in order to illustrate the capabilities of the architecture, we describe some preliminary results. The platform is available to any researcher who wishes to unfold the mechanisms underlying group synchronization in human ensembles and shed light on its socio-psychological aspects. PMID:28649217

  8. 19 CFR 10.605 - Goods classifiable as goods put up in sets.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ...-Central America-United States Free Trade Agreement Rules of Origin § 10.605 Goods classifiable as goods put up in sets. 19 Customs Duties 1, 2010-04-01. Section 10.605, Customs Duties, U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY...

  9. The application of a low-cost 3D depth camera for patient set-up and respiratory motion management in radiotherapy

    NASA Astrophysics Data System (ADS)

    Tahavori, Fatemeh

    Respiratory motion induces uncertainty in External Beam Radiotherapy (EBRT), which can result in sub-optimal dose delivery to the target tissue and unwanted dose to normal tissue. The conventional approach to managing patient respiratory motion for EBRT in abdominal-thoracic cancer is through internal radiological imaging methods (e.g. megavoltage imaging or cone-beam computed tomography) or via surrogate estimates of tumour position using external markers placed on the patient's chest. The latter method tracks the markers with video-based techniques and relies on an assumed correlation, or mathematical model, between the external surrogate signal and the internal target position. The markers' trajectories can be used in both respiratory gating techniques and real-time tracking methods. Internal radiological imaging methods bring with them limited temporal resolution and an additional radiation burden, issues which external marker-based methods avoid. Moreover, by including multiple external markers and placing them closer to the internal target organs, the efficiency of correlation algorithms can be increased. However, the quality of such external monitoring methods is underpinned by the performance of the associated correlation model. Therefore, several new approaches to correlation modelling have been developed as part of this thesis and compared using publicly-available datasets. Highly competitive results have been obtained when compared against state-of-the-art methods. Marker-based methods also have the disadvantages of requiring manual set-up time for marker placement and patient positioning, and potential issues with reproducibility of marker placement. This motivates the investigation of non-contact marker-free methods for use in EBRT, which is the main topic of this thesis. The Microsoft Kinect is used as an example of a low-cost consumer-grade 3D depth camera for capturing and analysing external

  10. A New Set-Up for Total Reaction Cross Section Measurement

    NASA Astrophysics Data System (ADS)

    Sobolev, Yu. G.; Ivanov, M. P.; Kugler, A.; Penionzhkevich, Yu. E.

    2013-06-01

    The experimental method and set-up based on the 4π n-γ technique for direct and model-independent measurement of the total reaction cross section σR are presented. The excitation function σR(E) for the 6He+197Au reaction in the Coulomb barrier energy region has been measured. The measured data are compared with the summed cross section obtained by adding the measured cross sections of the main reaction channels: 1n-transfer and the 197Au(6He, xn)203-xTl evaporation channels with x = 2÷7.

  11. Building up STEM education professional learning community in school setting: Case of Khon Kaen Wittayayon School

    NASA Astrophysics Data System (ADS)

    Thana, Aduldej; Siripun, Kulpatsorn; Yuenyong, Chokchai

    2018-01-01

    STEM education is a new issue of teaching and learning in school settings. Building up a STEM education professional learning community may provide suggestions for further collaborative STEM education work from the ground up. This paper aimed to clarify the building up of a STEM education learning community in the Khon Kaen Wittayayon (KKW) School setting. Participants included Khon Kaen University researchers and Khon Kaen Wittayayon School administrators and teachers. The methodology followed an interpretative paradigm. The tools of interpretation included participant observation, interviews and document analysis. Data were analyzed into categories of conditions for building up a STEM education professional learning community. The findings revealed that the actions of developing STEM learning activities and research showed some issues of the KKW STEM community of inquiry and improvement. The paper discusses what and how the community learns about sharing a vision of STEM education, the supportive physical and social conditions of KKW, shared STEM activities, and good outcomes arising from some key STEM teachers' ambition. The paper may have implications for supporting STEM education in Thai school settings.

  12. Simulating the X-Ray Image Contrast to Set-Up Techniques with Desired Flaw Detectability

    NASA Technical Reports Server (NTRS)

    Koshti, Ajay M.

    2015-01-01

    The paper provides simulation data extending previous work by the author on a model for estimating the detectability of crack-like flaws in radiography. The methodology is being developed to help in the implementation of NASA Special x-ray radiography qualification, but is generically applicable to radiography. The paper describes a method for characterizing X-ray detector resolution for crack detection. The applicability of ASTM E 2737 resolution requirements to the model is also discussed. The paper describes a model for simulating the detector resolution. A computer calculator application, discussed here, also performs predicted contrast and signal-to-noise ratio calculations. Results of various simulation runs calculating the x-ray flaw size parameter and image contrast for varying input parameters such as crack depth, crack width, part thickness, x-ray angle, part-to-detector distance, part-to-source distance, source size, and detector sensitivity and resolution are given as 3D surfaces. These results demonstrate the effect of the input parameters on the flaw size parameter and the simulated image contrast of the crack. These simulations demonstrate the utility of the flaw size parameter model in setting up x-ray techniques that provide the desired flaw detectability in radiography. The method is applicable to film radiography, computed radiography, and digital radiography.
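    The subject-contrast part of such a model can be illustrated with a deliberately simplified attenuation calculation. The numbers and the exponential-attenuation-only treatment are illustrative assumptions, not the author's calculator, which also models scatter, geometry and detector response.

```python
import math

def crack_contrast(mu_per_mm, part_thickness_mm, crack_depth_mm):
    """Idealized subject contrast of a void-like crack aligned with the beam:
    the crack removes crack_depth_mm of attenuating material, so the intensity
    ratio behind the crack vs. sound material is exp(mu*d). Sketch only:
    scatter, geometric unsharpness and detector response are ignored."""
    i_sound = math.exp(-mu_per_mm * part_thickness_mm)
    i_crack = math.exp(-mu_per_mm * (part_thickness_mm - crack_depth_mm))
    return (i_crack - i_sound) / i_sound  # = exp(mu*d) - 1, ~ mu*d for small d

# e.g. aluminium-like mu ≈ 0.1/mm, 10 mm part, 0.2 mm deep crack (invented values)
c = crack_contrast(0.1, 10.0, 0.2)        # about a 2% intensity difference
```

    Even this crude form shows why crack depth along the beam, not crack width, dominates subject contrast.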

  13. PromISS 4 hardware set up in the MSG during Expedition 12

    NASA Image and Video Library

    2006-01-18

    ISS012-E-16184 (18 Jan. 2006) --- Astronaut William S. (Bill) McArthur, Jr., Expedition 12 commander and NASA space station science officer, sets up the Protein Crystal Growth Monitoring by Digital Holographic Microscope (PromISS) experiment hardware inside the Microgravity Science Glovebox (MSG) facility in the Destiny laboratory on the International Space Station.

  14. Development of Data Acquisition Set-up for Steady-state Experiments

    NASA Astrophysics Data System (ADS)

    Srivastava, Amit K.; Gupta, Arnab D.; Sunil, S.; Khan, Ziauddin

    2017-04-01

    For short-duration experiments, digitized data are generally transferred for processing and storage after the experiment, whereas in a steady-state experiment the data are acquired, processed, displayed and stored continuously in a pipelined manner. This requires special acquisition techniques for storage, with on-the-go viewing of the data to display current trends in the various physical parameters. A small data acquisition set-up was developed for continuously acquiring signals from various physical parameters at different sampling rates during long-duration experiments. This includes the hardware set-up for signal digitization, a Field Programmable Gate Array (FPGA) based timing system for clock synchronization and event/trigger distribution, and time slicing of data streams into chunks, which enables viewing of data during acquisition and channel profile display through down-sampling. In order to store a data stream of indefinite or long duration, the stream is divided into slices/chunks of user-defined duration. Data chunks avoid the problem of server data being inaccessible until the channel data file is closed at the end of the long-duration experiment. A graphical user interface has been developed in the LabVIEW application development environment for configuring the data acquisition hardware and storing data chunks on the local machine as well as on a remote data server through Python for further data access. The data plotting and analysis utilities have been developed with Python software, which provides tools for further data processing. This paper describes the development and implementation of data acquisition for a steady-state experiment.
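    The time-slicing idea can be sketched in a few lines. This is a generic illustration with invented names, not the LabVIEW/FPGA implementation described above: the stream is cut into fixed-duration chunks so that completed chunks are readable while acquisition continues.

```python
import io
import struct

def write_in_chunks(samples, chunk_seconds, rate_hz, open_chunk):
    """Split a continuous sample stream into fixed-duration chunks so that
    completed chunks can be read while acquisition continues (sketch; the
    paper's set-up does this with FPGA timing and LabVIEW/Python tooling)."""
    samples_per_chunk = int(chunk_seconds * rate_hz)
    buf, chunk_no = [], 0
    for s in samples:
        buf.append(float(s))
        if len(buf) == samples_per_chunk:
            open_chunk(chunk_no).write(struct.pack(f"{len(buf)}d", *buf))
            buf, chunk_no = [], chunk_no + 1
    return chunk_no                       # number of completed chunks

# Demo: 1-second chunks of a 10 Hz stream, kept in memory instead of on disk
chunks = {}
def open_chunk(n):
    chunks[n] = io.BytesIO()
    return chunks[n]

done = write_in_chunks(range(25), chunk_seconds=1, rate_hz=10,
                       open_chunk=open_chunk)
# two full chunks are written; the trailing 5 samples remain buffered
```

    In a real set-up `open_chunk` would create a new file per slice, which is what keeps earlier data accessible before the experiment ends.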

  15. Calibration of a Background Oriented Schlieren (BOS) Set-up

    NASA Astrophysics Data System (ADS)

    Porta, David; Echeverría, Carlos; Cardoso, Hiroki; Aguayo, Alejandro; Stern, Catalina

    2014-11-01

    We use two materials with different known indices of refraction to calibrate a Background Oriented Schlieren (BOS) experimental set-up, and to validate the Lorentz-Lorenz equation. BOS is used in our experiments to determine local changes of density in the shock pattern of an axisymmetric supersonic air jet. It is important to validate, in particular, the Gladstone-Dale approximation (index of refraction close to one) under our experimental conditions and to determine the uncertainty of our density measurements. In some cases the index of refraction of the material is well known, but in others the density is measured and related to the displacement field. We acknowledge support from UNAM through DGAPA PAPIIT IN117712 and the Graduate Program in Mechanical Engineering.

  16. Predicting inpatient clinical order patterns with probabilistic topic models vs conventional order sets.

    PubMed

    Chen, Jonathan H; Goldstein, Mary K; Asch, Steven M; Mackey, Lester; Altman, Russ B

    2017-05-01

    Build probabilistic topic model representations of hospital admissions processes and compare the ability of such models to predict clinical order patterns as compared to preconstructed order sets. The authors evaluated the first 24 hours of structured electronic health record data for > 10 K inpatients. Drawing an analogy between structured items (e.g., clinical orders) and words in a text document, the authors performed latent Dirichlet allocation probabilistic topic modeling. These topic models use initial clinical information to predict clinical orders for a separate validation set of > 4 K patients. The authors evaluated these topic model-based predictions vs existing human-authored order sets by area under the receiver operating characteristic curve, precision, and recall for subsequent clinical orders. Existing order sets predict clinical orders used within 24 hours with area under the receiver operating characteristic curve 0.81, precision 16%, and recall 35%. This can be improved to 0.90, 24%, and 47% (P < 10−20) by using probabilistic topic models to summarize clinical data into up to 32 topics. Many of these latent topics yield natural clinical interpretations (e.g., "critical care," "pneumonia," "neurologic evaluation"). Existing order sets tend to provide nonspecific, process-oriented aid, with usability limitations impairing more precise, patient-focused support. Algorithmic summarization has the potential to breach this usability barrier by automatically inferring patient context, but with potential tradeoffs in interpretability. Probabilistic topic modeling provides an automated approach to detect thematic trends in patient care and generate decision support content. A potential use case finds related clinical orders for decision support. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association.
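    The decision-support use case at the end (finding related clinical orders) can be illustrated with a much simpler count-based sketch. The study itself uses latent Dirichlet allocation, not this co-occurrence model, and the order names below are invented.

```python
from collections import Counter
from itertools import combinations

# Hypothetical admissions, each represented as a set of clinical orders
admissions = [
    {"cbc", "chest_xray", "blood_culture", "ceftriaxone"},
    {"cbc", "chest_xray", "ceftriaxone"},
    {"cbc", "head_ct", "neuro_checks"},
    {"cbc", "blood_culture", "ceftriaxone"},
]

pair_counts = Counter()
order_counts = Counter()
for adm in admissions:
    order_counts.update(adm)
    pair_counts.update(frozenset(p) for p in combinations(sorted(adm), 2))

def related_orders(order, k=3):
    """Rank co-ordered items by P(other | order): a crude stand-in for the
    topic-model similarity the paper actually uses."""
    scores = {}
    for pair, c in pair_counts.items():
        if order in pair:
            (other,) = pair - {order}
            scores[other] = c / order_counts[order]
    return sorted(scores, key=scores.get, reverse=True)[:k]

suggestions = related_orders("ceftriaxone")
# blood cultures and chest x-rays co-occur with this antibiotic in the toy data
```

    A topic model improves on this by grouping orders into latent themes ("pneumonia", "critical care"), so related orders are suggested even when the exact pair was never seen together.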

  17. Predicting inpatient clinical order patterns with probabilistic topic models vs conventional order sets

    PubMed Central

    Goldstein, Mary K; Asch, Steven M; Mackey, Lester; Altman, Russ B

    2017-01-01

    Objective: Build probabilistic topic model representations of hospital admissions processes and compare the ability of such models to predict clinical order patterns as compared to preconstructed order sets. Materials and Methods: The authors evaluated the first 24 hours of structured electronic health record data for > 10 K inpatients. Drawing an analogy between structured items (e.g., clinical orders) to words in a text document, the authors performed latent Dirichlet allocation probabilistic topic modeling. These topic models use initial clinical information to predict clinical orders for a separate validation set of > 4 K patients. The authors evaluated these topic model-based predictions vs existing human-authored order sets by area under the receiver operating characteristic curve, precision, and recall for subsequent clinical orders. Results: Existing order sets predict clinical orders used within 24 hours with area under the receiver operating characteristic curve 0.81, precision 16%, and recall 35%. This can be improved to 0.90, 24%, and 47% (P < 10−20) by using probabilistic topic models to summarize clinical data into up to 32 topics. Many of these latent topics yield natural clinical interpretations (e.g., “critical care,” “pneumonia,” “neurologic evaluation”). Discussion: Existing order sets tend to provide nonspecific, process-oriented aid, with usability limitations impairing more precise, patient-focused support. Algorithmic summarization has the potential to breach this usability barrier by automatically inferring patient context, but with potential tradeoffs in interpretability. Conclusion: Probabilistic topic modeling provides an automated approach to detect thematic trends in patient care and generate decision support content. A potential use case finds related clinical orders for decision support. PMID:27655861

  18. Mental models of audit and feedback in primary care settings.

    PubMed

    Hysong, Sylvia J; Smitham, Kristen; SoRelle, Richard; Amspoker, Amber; Hughes, Ashley M; Haidet, Paul

    2018-05-30

    Audit and feedback has been shown to be instrumental in improving quality of care, particularly in outpatient settings. The mental model individuals and organizations hold regarding audit and feedback can moderate its effectiveness, yet this has received limited study in the quality improvement literature. In this study we sought to uncover patterns in mental models of current feedback practices within high- and low-performing healthcare facilities. We purposively sampled 16 geographically dispersed VA hospitals based on high and low performance on a set of chronic and preventive care measures. We interviewed up to 4 personnel from each location (n = 48) to determine the facility's receptivity to audit and feedback practices. Interview transcripts were analyzed via content and framework analysis to identify emergent themes. We found high variability in the mental models of audit and feedback, which we organized into positive and negative themes. We were unable to associate mental models of audit and feedback with clinical performance due to high variance in facility performance over time. Positive mental models exhibited perceived utility of audit and feedback practices in improving performance, whereas negative mental models did not. Results speak to the variability of mental models of feedback, highlighting how facilities perceive current audit and feedback practices. Findings are consistent with prior research in that variability in feedback mental models is associated with lower performance. Future research should seek to empirically link the mental models revealed in this paper to high and low levels of clinical performance.

  19. A technique for treating local breast cancer using a single set-up point and asymmetric collimation.

    PubMed

    Rosenow, U F; Valentine, E S; Davis, L W

    1990-07-01

    Using both pairs of asymmetric jaws of a linear accelerator, local-regional breast cancer may be treated from a single set-up point. This point is placed at the abutment of the supraclavicular fields with the medial and lateral tangential fields. Positioning the jaws to create a half-beam superiorly permits treatment of the supraclavicular field. Positioning both jaws asymmetrically at midline to define a single beam in the inferoanterior quadrant permits treatment of the breast from medial and lateral tangents. The highest possible matching accuracy between the supraclavicular and tangential fields is inherently provided by this technique. For treatment of all fields at 100 cm source-to-axis distance (SAD), the lateral placement and depth of the set-up point may be determined by simulation and simple trigonometry. We elaborate on the clinical procedure. For the technologists, treatment of all fields from a single set-up point is simple and efficient. Since the tissue at the superior border of the tangential fields is generally firmer than in mid-breast, greater accuracy in day-to-day set-up is permitted. This technique eliminates the need for table angles even when tangential fields only are planned. Because of half-beam collimation the limit to the tangential field length is 20 cm. Means will be suggested to overcome this limitation in the few cases where it occurs. Another modification is suggested for linear accelerators with only one independent pair of jaws.

  20. Kondratyev sets up Sonokard Experiment in the SM during Expedition 26

    NASA Image and Video Library

    2011-01-03

    ISS026-E-014250 (3 Jan. 2011) --- Russian cosmonaut Dmitry Kondratyev, Expedition 26 flight engineer, sets up the Russian MBI-12 payload for a Sonokard experiment session in the Zvezda Service Module of the International Space Station. Kondratyev used a sports shirt from the Sonokard kit with a special device in the pocket for testing a new method for acquiring physiological data without using direct contact on the skin. Measurements are recorded on a data card for return to Earth.

  1. An integrative top-down and bottom-up qualitative model construction framework for exploration of biochemical systems.

    PubMed

    Wu, Zujian; Pang, Wei; Coghill, George M

    Computational modelling of biochemical systems based on top-down and bottom-up approaches has been well studied over the last decade. In this research, after illustrating how to generate atomic components from a set of given reactants and two user pre-defined component patterns, we propose an integrative top-down and bottom-up modelling approach for stepwise qualitative exploration of interactions among reactants in biochemical systems. An evolution strategy is applied in the top-down modelling approach to compose models, and simulated annealing is employed in the bottom-up modelling approach to explore potential interactions based on models constructed from the top-down modelling process. Both the top-down and bottom-up approaches support stepwise modular addition or subtraction for the model evolution. Experimental results indicate that our modelling approach can qualitatively learn the relationships among biochemical reactants. In addition, hidden reactants of the target biochemical system can be obtained by generating complex reactants in corresponding composed models. Moreover, qualitatively learned models with inferred reactants and alternative topologies can be used for further wet-lab experimental investigation by interested biologists, which may result in a better understanding of the system.
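    The bottom-up exploration step can be sketched with a generic simulated annealing loop. The "interaction set" and energy function below are invented stand-ins for illustration, not the authors' framework: a candidate move adds or removes one modular component, and worse models are accepted with probability exp(-dE/T).

```python
import math
import random

def simulated_annealing(initial, neighbour, energy, t0=1.0, cooling=0.97,
                        steps=300, seed=0):
    """Generic simulated annealing of the kind used for bottom-up exploration:
    propose a modular addition/subtraction (neighbour) and accept a worse
    model with probability exp(-dE/T), tracking the best model seen."""
    rng = random.Random(seed)
    state, e = initial, energy(initial)
    best, best_e = state, e
    t = t0
    for _ in range(steps):
        cand = neighbour(state, rng)
        de = energy(cand) - e
        if de < 0 or rng.random() < math.exp(-de / t):
            state, e = cand, e + de
        if e < best_e:
            best, best_e = state, e
        t *= cooling                     # geometric cooling schedule
    return best, best_e

# Toy stand-in for "exploring interactions": recover a hidden set of
# interactions among 6 candidates (numbers are invented for illustration)
target = frozenset({1, 3, 4})
def energy(s):
    return len(s ^ target)               # mismatch with the hidden topology
def neighbour(s, rng):
    return s ^ {rng.randrange(6)}        # add or remove one candidate

best, best_e = simulated_annealing(frozenset(), neighbour, energy)
```

    The early high-temperature phase tolerates worse models, which is what lets the search escape locally plausible but globally wrong topologies.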

  2. Abundance Patterns in S-type AGB Stars: Setting Constraints on Nucleosynthesis and Stellar Evolution Models

    NASA Astrophysics Data System (ADS)

    Neyskens, P.; van Eck, S.; Plez, B.; Goriely, S.; Siess, L.; Jorissen, A.

    2011-09-01

    During evolution on the AGB, stars of type S are the first to experience s-process nucleosynthesis and the third dredge-up, and therefore to exhibit s-process signatures in their atmospheres. Their high mass-loss rates (10−7 to 10−6 M⊙/year) make them major contributors to the AGB nucleosynthesis yields at solar metallicity. Precise abundance determinations in S stars are of the utmost importance for constraining e.g. the third dredge-up luminosity and efficiency (which has been only crudely parameterized in current nucleosynthetic models so far). Here, dedicated S-star model atmospheres are used to determine precise abundances of key s-process elements, and to set constraints on nucleosynthesis and stellar evolution models. Special interest is paid to technetium, an element with no stable isotopes. Its detection is considered the best signature that the star effectively populates the thermally-pulsing AGB phase of evolution. The derived Tc/Zr abundances are compared, as a function of the derived [Zr/Fe] overabundances, with AGB stellar model predictions. The [Zr/Fe] overabundances are in good agreement with model predictions, while the Tc/Zr abundances are slightly overpredicted. This discrepancy can help to set better constraints on nucleosynthesis and stellar evolution models of AGB stars.

  3. Setting up a sexual health one stop shop for young people at college.

    PubMed

    Sands, Lindsay

    This article describes the development of a nurse led drop-in sexual health service in a further education college. It looks at the issues for practitioners to consider when setting up a nurse led clinic in this environment and the importance of working in partnership with education and youth services.

  4. Bottom-up priority setting revised. A second evaluation of an institutional intervention in a Swedish health care organisation.

    PubMed

    Waldau, Susanne

    2015-09-01

    Transparent priority setting in health care based on specific ethical principles has been requested by the Swedish Parliament since 1997, but implementation has been limited. In this case, transparent priority setting was performed for a second time round and engaged an entire health care organisation. The objectives were to refine a bottom-up priority setting process, to reach a political decision on service limits so that reallocation towards higher-prioritised services became possible, and to raise systems knowledge. An action research approach was chosen. The national model for priority setting was used, with the addition of the dimensions cost, volume, gender distribution and feasibility. The intervention included a three-step process with specific procedures for each step, which were created, revised and evaluated regarding factual and functional aspects. Evaluation methods included analyses of documents, recordings and surveys. Vertical and horizontal priority setting occurred and resources were reallocated. Participants' attitudes remained positive, however less so than in the first priority setting round. Identifying low-priority services was perceived as difficult, causing resentment and strategic behaviour. The horizontal stage served to raise the quality of the knowledge base, level out differences in the ranking of services and raise systems knowledge. Existing health care management systems do not meet the institutional requirements for transparent priority setting. Introducing transparent priority setting constitutes a complex institutional reform, which needs to be driven by management/administration. Strong managerial commitment is required. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  5. EPFL Lysimeter Measurement Campaign Summer 2010: Set-Up and First Results

    NASA Astrophysics Data System (ADS)

    Ciocca, F.; Parlange, M.; Lunati, I.; van de Giesen, N.; Huwald, H.

    2010-09-01

    The goal of this experiment is to evaluate the main contributions to heat and moisture fluxes in two different kinds of bare soil, one artificial and one natural. The main aim is to settle the still open question of the effective role played by water vapor in the diffusion of heat and moisture: vapor transport is theoretically several orders of magnitude less efficient than liquid water transport, yet it is still held to be the main cause of the unexpectedly high heat fluxes measured in many previous experiments. A refutation or confirmation of the existence of the much-discussed enhancement factor, or of a meaningful contribution from air advection, is also expected. To this end, the six weighable lysimeters installed at EPF Lausanne have been fitted with a very accurate weighing system. Three of them were filled identically with a natural silty sand soil from the site of Conthey (Sion, CH), sieved and placed in the tanks in homogeneous layers, without attempting to preserve the original structure. The remaining three were filled with an artificial porous mix with textural properties as close as possible to those of the natural soil. A comparison between the natural soil containing organic matter and the artificial sterile medium will thus be possible. A dense series of FDR probes and tensiometers has been installed in the upper part of each lysimeter, and a new technique for measuring volumetric water content using heated optical fiber has been installed in two of them (one natural and one artificial). Incoming (shared) and outgoing (for each lysimeter) short- and longwave radiation have been measured for energy balance considerations. A comparison of the results with a simple numerical model will also be carried out.

  6. Re-using biological devices: a model-aided analysis of interconnected transcriptional cascades designed from the bottom-up.

    PubMed

    Pasotti, Lorenzo; Bellato, Massimo; Casanova, Michela; Zucca, Susanna; Cusella De Angelis, Maria Gabriella; Magni, Paolo

    2017-01-01

    The study of simplified, ad-hoc constructed model systems can help to elucidate whether quantitatively characterized biological parts can be effectively re-used in composite circuits to yield predictable functions. Synthetic systems designed from the bottom-up can enable the building of complex interconnected devices via a rational approach, supported by mathematical modelling. However, this process is affected by several, usually unmodelled, sources of unpredictability, such as cell burden. Here, we analyzed a set of synthetic transcriptional cascades in Escherichia coli. We aimed to test the predictive power of a simple Hill function activation/repression model (no-burden model, NBM) and of a recently proposed model, including Hill functions and the modulation of protein expression by cell load (burden model, BM). To test the bottom-up approach, the circuit collection was divided into training and test sets, used to learn individual component functions and to test the predicted output of interconnected circuits, respectively. Among the constructed configurations, two test set circuits showed unexpected logic behaviour. Both NBM and BM were able to predict the quantitative output of interconnected devices with expected behaviour, but only the BM was also able to predict the output of one circuit with unexpected behaviour. Moreover, considering training and test set data together, the BM captures circuit outputs with higher accuracy than the NBM, which is unable to capture the experimental output exhibited by some of the circuits even qualitatively. Finally, resource usage parameters, estimated via the BM, guided the successful construction of new corrected variants of the two circuits showing unexpected behaviour. Superior descriptive and predictive capabilities were achieved by modelling resource limitation, but further efforts are needed to improve the accuracy of models for biological engineering.
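
    The contrast between the two model classes can be sketched with a toy Hill activation term (the NBM) and the same term scaled by a resource-load factor (the BM). All function names, the form of the load term and the parameter values below are illustrative assumptions, not taken from the paper.

```python
def hill_activation(x, k, n, alpha):
    # Hill activation: steady-state output rate driven by input x.
    # alpha: maximum rate, k: half-activation constant, n: Hill coefficient.
    return alpha * x**n / (k**n + x**n)

def burden_modulated(x, k, n, alpha, total_demand, capacity=1.0):
    # Burden-model sketch: the same Hill term, scaled down as the circuit's
    # total resource demand approaches the host cell's capacity (hypothetical
    # load term; the paper's actual burden formulation may differ).
    scale = capacity / (capacity + total_demand)
    return hill_activation(x, k, n, alpha) * scale
```

    With total_demand = 1 and capacity = 1, the burden-modulated output is half the unloaded one, mimicking how shared cellular resources depress every unit's expression.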

  7. Totomatix: a novel automatic set-up to control diurnal, diel and long-term plant nitrate nutrition

    PubMed Central

    Adamowicz, Stéphane; Le Bot, Jacques; Huanosto Magaña, Ruth; Fabre, José

    2012-01-01

    Background Stand-alone nutritional set-ups are useful tools to grow plants at defined nutrient availabilities and to measure nutrient uptake rates continuously, in particular for nitrate. Their use is essential when the measurements are meant to cover long time periods. These complex systems have, however, important drawbacks, including poor long-term reliability and low precision at high nitrate concentration. This explains why information on the diel dynamics of nitrate uptake rate is scarce and concerns mainly young plants grown at low nitrate concentration. Scope The novel system detailed in this paper has been developed to allow versatile use in growth rooms, greenhouses or open fields at nitrate concentrations ranging from a few micromoles to several millimoles per litre. The system controls, at set frequencies, the solution nitrate concentration, pH and volume. Nitrate concentration is measured by spectral deconvolution of UV spectra. The main advantages of the set-up are its low maintenance (weekly basis), its ability to diagnose interference or erroneous analyses, and the high precision of its nitrate concentration measurements (0.025 % at 3 mM). The paper details the precision of diurnal nitrate uptake rate measurements, which is sensitive to solution volume at low nitrate concentration, whereas at high concentration it is mostly sensitive to the precision of volume estimates. Conclusions This novel set-up allows us to measure and characterize the dynamics of plant nitrate nutrition at high temporal resolution (minutes to hours) over long-term experiments (up to 1 year). It is reliable and also offers a novel method to regulate up to seven N treatments by adjusting the daily uptake of test plants relative to controls, in variable environments such as open fields and glasshouses. PMID:21985796

  8. CUTSETS - MINIMAL CUT SET CALCULATION FOR DIGRAPH AND FAULT TREE RELIABILITY MODELS

    NASA Technical Reports Server (NTRS)

    Iverson, D. L.

    1994-01-01

    Fault tree and digraph models are frequently used for system failure analysis. Both types of models represent a failure-space view of the system using AND and OR nodes in a directed graph structure. Fault trees must have a tree structure and do not allow cycles or loops in the graph; digraphs allow any pattern of interconnection, including loops. A common operation performed on digraph and fault tree models is the calculation of minimal cut sets. A cut set is a set of basic failures that could cause a given target failure event to occur. A minimal cut set for a target event node in a fault tree or digraph is any cut set for the node with the property that if any one of the failures in the set is removed, the occurrence of the remaining failures in the set will not cause the target failure event. CUTSETS will identify all the minimal cut sets for a given node. The CUTSETS package contains programs that solve for minimal cut sets of fault trees and digraphs using object-oriented programming techniques. These cut set codes can be used to solve graph models for reliability analysis and to identify potential single-point failures in a modeled system. The fault tree minimal cut set code reads in a fault tree model input file with each node listed in a text format. In the input file the user specifies a top node of the fault tree and a maximum cut set size to be calculated. CUTSETS will find minimal sets of basic events which would cause the failure at the output of a given fault tree gate. The program can find all the minimal cut sets of a node, or minimal cut sets up to a specified size. The algorithm performs a recursive top-down parse of the fault tree, starting at the specified top node, and combines the cut sets of each child node into sets of basic event failures that would cause the failure event at the output of that gate. Minimal cut set solutions can be found for all nodes in the fault tree or just for the top node. The digraph cut set code uses the same
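
    The recursive combination of child cut sets described above can be sketched in a few lines of Python. The tuple encoding of gates and the function names are illustrative assumptions; they do not reproduce the CUTSETS text input format or its object-oriented implementation.

```python
def cut_sets(node, tree):
    # Leaves (basic events) are any names not present as gates in the tree.
    if node not in tree:
        return [frozenset([node])]
    gate, children = tree[node]
    child_sets = [cut_sets(c, tree) for c in children]
    if gate == 'OR':
        # Any child's cut set causes the OR gate's output failure.
        return [s for cs in child_sets for s in cs]
    # AND gate: cross-combine one cut set from each child.
    combined = [frozenset()]
    for cs in child_sets:
        combined = [a | b for a in combined for b in cs]
    return combined

def minimal(sets):
    # Keep only cut sets that contain no other cut set as a proper subset.
    return [s for s in sets if not any(t < s for t in sets)]
```

    For example, with TOP = OR(a, AND(a, b)), the cut set {a, b} is discarded because {a} alone already causes the top event.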

  9. A practical model for the train-set utilization: The case of Beijing-Tianjin passenger dedicated line in China

    PubMed Central

    Li, Xiaomeng; Yang, Zhuo

    2017-01-01

    As a sustainable transportation mode, high-speed railway (HSR) has become an efficient way to meet huge travel demand. However, due to the high acquisition and maintenance costs, it is impossible to build enough infrastructure and purchase enough train-sets, so great efforts are required to improve the transport capability of HSR. The utilization efficiency of train-sets (the carrying tools of HSR) is one of the most important factors in the transport capacity of HSR. In order to enhance the utilization efficiency of train-sets, this paper proposes a train-set circulation optimization model that minimizes the total connection time. An innovative two-stage approach comprising segment generation and segment combination was designed to solve this model. To verify the feasibility of the proposed approach, an experiment was carried out on the Beijing-Tianjin passenger dedicated line, fulfilling a train diagram of 174 trips. The model results showed that, compared with the traditional Ant Colony Algorithm (ACA), the utilization efficiency of train-sets can be increased from 43.4% (ACA) to 46.9% (two-stage), and one train-set can be saved while fulfilling the same transportation tasks. The proposed approach is faster and more stable than traditional ones; with it, HSR staff can draw up the train-set circulation plan more quickly, and the utilization efficiency of the HSR system is also improved. PMID:28489933

  10. Set-free Markov state model building

    NASA Astrophysics Data System (ADS)

    Weber, Marcus; Fackeldey, Konstantin; Schütte, Christof

    2017-03-01

    Molecular dynamics (MD) simulations face challenging problems, since the time scales of interest are often much longer than what is possible to simulate; and even when sufficiently long simulations are possible, the complex nature of the resulting simulation data makes interpretation difficult. Markov State Models (MSMs) help to overcome these problems by making experimentally relevant time scales accessible via coarse-grained representations that also allow for convenient interpretation. However, standard set-based MSMs exhibit some caveats limiting their approximation quality and statistical significance. One of the main caveats results from the fact that typical MD trajectories repeatedly re-cross the boundary between the sets used to build the MSM, which causes statistical bias in estimating the transition probabilities between these sets. In this article, we present a set-free approach to MSM building utilizing smooth overlapping ansatz functions instead of sets, together with an adaptive refinement approach. This kind of meshless discretization helps to overcome the recrossing problem and yields an adaptive refinement procedure that allows us to improve the quality of the model while exploring state space and inserting new ansatz functions into the MSM.
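
    The set-free idea can be illustrated with a minimal sketch: overlapping Gaussian ansatz functions assign each sample a soft membership vector, and transition counts are accumulated between membership vectors at a given lag instead of between crisp set labels. The function names, the Gaussian form and the 1-D setting are assumptions for illustration; the paper's actual discretization and adaptive refinement are more elaborate.

```python
import numpy as np

def memberships(x, centers, width):
    # Soft assignment of 1-D samples to overlapping Gaussian ansatz functions,
    # normalized so each sample's memberships sum to one (a partition of unity).
    w = np.exp(-(x[:, None] - centers[None, :])**2 / (2.0 * width**2))
    return w / w.sum(axis=1, keepdims=True)

def soft_transition_matrix(traj, centers, width, lag=1):
    # MSM transition matrix from soft memberships: a sample near a boundary
    # contributes fractionally to several states, damping recrossing bias.
    m = memberships(traj, centers, width)
    counts = m[:-lag].T @ m[lag:]
    return counts / counts.sum(axis=1, keepdims=True)
```

    Because memberships vary smoothly across state boundaries, a trajectory oscillating near a boundary no longer produces spurious whole transition counts.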

  11. Load test set-up for the Airmass Sunburst Ultra-Light Aircraft

    NASA Technical Reports Server (NTRS)

    Krug, Daniel W.; Smith, Howard W.

    1993-01-01

    The purpose of this project was to set up, instrument, and test a Sunburst Ultra-Light aircraft. The plan was to suspend the aircraft from the test stand, level it in the stand, test the strain gauges and wire them to the test equipment, and finally destroy the aircraft to obtain the failing loads. All tasks were completed except the destruction of the aircraft. This notebook shows the group's progress as these tasks were completed, and the following section explains the photographs in the notebook.

  12. On the assimilation set-up of ASCAT soil moisture data for improving streamflow catchment simulation

    NASA Astrophysics Data System (ADS)

    Loizu, Javier; Massari, Christian; Álvarez-Mozos, Jesús; Tarpanelli, Angelica; Brocca, Luca; Casalí, Javier

    2018-01-01

    Assimilation of remotely sensed surface soil moisture (SSM) data into hydrological catchment models has been identified as a means to improve streamflow simulations, but reported results vary markedly depending on the particular model, catchment and assimilation procedure used. In this study, the influence of key aspects, such as the type of model, the re-scaling technique and the SSM observation error considered, was evaluated. For this aim, Advanced SCATterometer (ASCAT) SSM observations were assimilated through the ensemble Kalman filter into two hydrological models of different complexity (namely MISDc and TOPLATS) run on two Mediterranean catchments of similar size (750 km²). Three different re-scaling techniques were evaluated (linear re-scaling, variance matching and cumulative distribution function matching), and SSM observation error values ranging from 0.01% to 20% were considered. Four different efficiency measures were used for evaluating the results. Increases in Nash-Sutcliffe efficiency (0.03-0.15) and efficiency indices (10-45%) were obtained, especially when linear re-scaling and observation errors within 4-6% were used. This study found that there is potential to improve streamflow prediction through data assimilation of remotely sensed SSM in catchments of different characteristics and with hydrological models of different conceptualization schemes, but a careful evaluation of the observation error and of the re-scaling technique set-up is required.
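
    Two of the three re-scaling techniques can be sketched as follows: variance matching maps the satellite series onto the model climatology's mean and standard deviation, and CDF matching is approximated here by a rank-based quantile mapping. Both snippets are generic sketches, not the exact operators used in the study.

```python
import numpy as np

def variance_matching(obs, model):
    # Rescale satellite SSM so its mean and standard deviation match the
    # model climatology before assimilation.
    return (obs - obs.mean()) * (model.std() / obs.std()) + model.mean()

def cdf_matching(obs, model):
    # Rank-based sketch of CDF matching: map each observation's empirical
    # quantile onto the corresponding quantile of the model series.
    ranks = np.searchsorted(np.sort(obs), obs, side='right') / len(obs)
    return np.quantile(model, np.clip(ranks, 0.0, 1.0))
```

    Re-scaling of this kind removes systematic bias between observations and model climatology so that the Kalman update corrects only the dynamical error.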

  13. Diagnostic and Prognostic Models for Generator Step-Up Transformers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vivek Agarwal; Nancy J. Lybeck; Binh T. Pham

    In 2014, the online monitoring (OLM) of active components project under the Light Water Reactor Sustainability program at Idaho National Laboratory (INL) focused on diagnostic and prognostic capabilities for generator step-up transformers (GSUs). INL worked with subject matter experts from the Electric Power Research Institute (EPRI) to augment and revise the GSU fault signatures previously implemented in EPRI's Fleet-Wide Prognostic and Health Management (FW-PHM) Suite software. Two prognostic models were identified and implemented for GSUs in the FW-PHM Suite software, and INL and EPRI demonstrated the use of prognostic capabilities for GSUs. The complete set of fault signatures developed for GSUs in the Asset Fault Signature Database of the FW-PHM Suite is presented in this report. Two prognostic models are described for paper insulation: the Chendong model for degree of polymerization, and an IEEE model that uses a loading profile to calculate life consumption based on hot-spot winding temperatures. Both are life consumption models, which are examples of type II prognostic models. Use of the models in the FW-PHM Suite was successfully demonstrated at the August 2014 Utility Working Group Meeting, Idaho Falls, Idaho, to representatives from different utilities, EPRI, and the Halden Research Project.

  14. Optical digital to analog conversion performance analysis for indoor set-up conditions

    NASA Astrophysics Data System (ADS)

    Dobesch, Aleš; Alves, Luis Nero; Wilfert, Otakar; Ribeiro, Carlos Gaspar

    2017-10-01

    In visible light communication (VLC), the optical digital-to-analog conversion (ODAC) approach was proposed as a driving technique able to overcome the light-emitting diode's (LED) non-linear characteristic. The concept is analogous to an electrical digital-to-analog converter (EDAC): digital bits are binary weighted to represent an analog signal. The method relies on elementary on-off modulations, which are insensitive to the LED's non-linear characteristic and allow simultaneous lighting and communication. In the ODAC concept, the reconstruction error does not depend simply on the converter bit depth, as in the case of an EDAC; it also depends on the communication system set-up and on the geometrical relation between emitter and receiver. The paper describes simulation results presenting the ODAC's error performance taking into account the optical channel, the LED's half-power angle (HPA) and the receiver field of view (FOV). The set-up under consideration examines indoor conditions for a square room of 4 m side length and 3 m height, operating with one dominant wavelength (blue) and having walls with a reflection coefficient of 0.8. The achieved results reveal that the reconstruction error increases at higher data rates as a result of interference due to multipath propagation.
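
    The binary weighting at the heart of the ODAC concept can be shown with a toy reconstruction: each on/off LED channel contributes a power-of-two multiple of a unit optical power, so N channels resolve 2**N levels. The function and the unit-power parameter are hypothetical, used only to illustrate the weighting.

```python
def odac_output(bits, unit_power=1.0):
    # Ideal ODAC reconstruction: bit i (least significant first), when on,
    # contributes 2**i units of optical power; the receiver sums the channels.
    return unit_power * sum(b * 2**i for i, b in enumerate(bits))
```

    For instance, the 3-bit word [1, 0, 1] reconstructs level 5 of 8. The channel and geometry effects discussed in the paper enter as per-channel gains that perturb this ideal weighting.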

  15. Set Up of an Automatic Water Quality Sampling System in Irrigation Agriculture

    PubMed Central

    Heinz, Emanuel; Kraft, Philipp; Buchen, Caroline; Frede, Hans-Georg; Aquino, Eugenio; Breuer, Lutz

    2014-01-01

    We have developed a high-resolution automatic sampling system for continuous in situ measurements of stable water isotopic composition and nitrogen solutes along with hydrological information. The system facilitates concurrent monitoring of a large number of water and nutrient fluxes (ground, surface, irrigation and rain water) in irrigated agriculture. For this purpose we couple an automatic sampling system with a Wavelength-Scanned Cavity Ring Down Spectrometry System (WS-CRDS) for stable water isotope analysis (δ2H and δ18O), a reagentless hyperspectral UV photometer (ProPS) for monitoring nitrate content and various water level sensors for hydrometric information. The automatic sampling system consists of different sampling stations equipped with pumps, a switch cabinet for valve and pump control and a computer operating the system. The complete system is operated via internet-based control software, allowing supervision from nearly anywhere. The system is currently set up at the International Rice Research Institute (Los Baños, The Philippines) in a diversified rice growing system to continuously monitor water and nutrient fluxes. Here we present the system's technical set-up and provide initial proof-of-concept with results for the isotopic composition of different water sources and nitrate values from the 2012 dry season. PMID:24366178

  16. Inverse and Forward Modeling of The 2014 Iquique Earthquake with Run-up Data

    NASA Astrophysics Data System (ADS)

    Fuentes, M.

    2015-12-01

    The April 1, 2014 Mw 8.2 Iquique earthquake excited a moderate tsunami which triggered the national tsunami alert. The earthquake was located in the well-known seismic gap in northern Chile, which had a high seismic potential (~ Mw 9.0) following the two main large historic events of 1868 and 1877. Nonetheless, studies of the seismic source based on seismic data inversions suggest that the event exhibited a main slip patch located around 19.8° S at 40 km depth, with a seismic moment equivalent to Mw = 8.2. Thus, a large seismic deficit remains in the gap, capable of releasing an event of Mw = 8.8-8.9. To understand the importance of the tsunami threat in this zone, a seismic source modeling of the Iquique earthquake is performed. A new approach based on stochastic k² seismic sources is presented. A set of such sources is generated and, for each one, a full numerical tsunami model is run to obtain the run-up heights along the coastline. The results are compared with the available field run-up measurements and with the tide gauges that registered the signal. The comparison is not uniform; it penalizes discrepancies more heavily close to the peak run-up location. This criterion allows identification of the seismic source, from the set of scenarios, that best explains the observations from a statistical point of view. On the other hand, an L2-norm minimization is used to invert the seismic source by comparing the peak nearshore tsunami amplitude (PNTA) with the run-up observations. This method searches a space of solutions for the best seismic configuration by retrieving the Green's function coefficients that explain the field measurements. The results obtained confirm that a concentrated down-dip slip patch adequately models the run-up data.
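
    The L2-norm inversion step can be sketched as an ordinary least-squares problem: stack the unit-source Green's functions as columns of a matrix and solve for the slip coefficients that best reproduce the run-up observations. The function name and matrix layout are illustrative assumptions, not the study's actual implementation.

```python
import numpy as np

def invert_source(G, runup_obs):
    # L2-norm inversion: find slip coefficients m minimizing ||G m - d||_2,
    # where column j of G holds the PNTA response of unit source j at the
    # observation points, and d is the vector of run-up observations.
    m, *_ = np.linalg.lstsq(G, runup_obs, rcond=None)
    return m
```

    With more observation points than sources the system is overdetermined, and the least-squares solution is the statistically natural estimate under uniform observation error.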

  17. Comparisons of thermospheric density data sets and models

    NASA Astrophysics Data System (ADS)

    Doornbos, Eelco; van Helleputte, Tom; Emmert, John; Drob, Douglas; Bowman, Bruce R.; Pilinski, Marcin

    During the past decade, continuous long-term data sets of thermospheric density have become available to researchers. These data sets have been derived from accelerometer measurements made by the CHAMP and GRACE satellites and from Space Surveillance Network (SSN) tracking data and related Two-Line Element (TLE) sets. These data have already resulted in a large number of publications on physical interpretation and improvement of empirical density modelling. This study compares four different density data sets and two empirical density models for the period 2002-2009. These data sources are the CHAMP (1) and GRACE (2) accelerometer measurements, the long-term database of densities derived from TLE data (3), the High Accuracy Satellite Drag Model (4) run by Air Force Space Command, calibrated using SSN data, and the NRLMSISE-00 (5) and Jacchia-Bowman 2008 (6) empirical models. In describing these data sets and models, specific attention is given to differences in the geometrical and aerodynamic satellite modelling applied in the conversion from drag to density measurements, which are the main sources of density biases. The differences in temporal and spatial resolution of the density data sources are also described and taken into account. With these aspects in mind, statistics of density comparisons have been computed, both as a function of solar and geomagnetic activity levels, and as a function of latitude and local solar time. These statistics give a detailed view of the relative accuracy of the different data sets and of the biases between them. The differences are analysed with the aim of providing rough error bars on the data and models and pinpointing issues which could receive attention in future iterations of data processing algorithms and in future model development.

  18. How patients understand physicians' solicitations of additional concerns: implications for up-front agenda setting in primary care.

    PubMed

    Robinson, Jeffrey D; Heritage, John

    2016-01-01

    In the more than 1 billion primary-care visits each year in the United States, the majority of patients bring more than one distinct concern, yet many leave with "unmet" concerns (i.e., ones not addressed during visits). Unmet concerns have potentially negative consequences for patients' health, and may pose utilization-based financial burdens to health care systems if patients return to deal with such concerns. One solution to the problem of unmet concerns is the communication skill known as up-front agenda setting, where physicians (after soliciting patients' chief concerns) continue to solicit patients' concerns to "exhaustion" with questions such as "Are there some other issues you'd like to address?" Although this skill is trainable and efficacious, it is not yet a panacea. This article uses conversation analysis to demonstrate that patients understand up-front agenda-setting questions in ways that hamper their effectiveness. Specifically, we demonstrate that up-front agenda-setting questions are understood as making relevant "new problems" (i.e., concerns that are either totally new or "new since last visit," and in need of diagnosis), and consequently bias answers away from "non-new problems" (i.e., issues related to previously diagnosed concerns, including much of chronic care). Suggestions are made for why this might be so, and for improving up-front agenda setting. Data are 144 videotapes of community-based, acute, primary-care, outpatient visits collected in the United States between adult patients and 20 family-practice physicians.

  19. Setting up recovery clinics and promoting service user involvement.

    PubMed

    John, Thomas

    2017-06-22

    Service user involvement in mental health has gained considerable momentum. Evidence from the literature suggests that it remains largely theoretical rather than being put into practice. The current nature of acute inpatient mental health units creates various challenges for nurses putting this concept into practice. Recovery clinics were introduced to bridge this gap and to promote service user involvement within the current care delivery model at Kent and Medway NHS and Social Care Partnership Trust. This has shaped new ways of working for nurses, with a person-centred approach as its philosophy. Service users and nurses were involved in implementing a needs-led, bottom-up initiative using Kotter's change model. Initial results suggest that it has met its objectives, as evidenced by increased meaningful interactions and involvement in care by service users and carers. The clinics have gained wide recognition and have highlighted the need for further research into care delivery models that promote service user involvement in these units.

  20. Sensor set-up for wireless measurement of automotive rim and wheel parameters in laboratory conditions

    NASA Astrophysics Data System (ADS)

    Borecki, M.; Prus, P.; Korwin-Pawlowski, M. L.; Rychlik, A.; Kozubel, W.

    2017-08-01

    Modern rims and wheels are tested at the design and production stages. Tests can be performed in laboratory conditions and during driving. In the laboratory, complex and costly equipment is used, for example wheel balancers and impact testers. Modern wheel balancers are equipped with electronic and electro-mechanical units that enable touch-less measurement of dimensions, including precision measurement of radial and lateral wheel run-out, automatic positioning and application of the counterweights, and vehicle wheel set monitoring - tread wear, drift angles and run-out unbalance. These tests are performed by on-wheel-axis measurements with laser distance meters. The impact tester enables dropping of weights from a defined height onto a wheel; the test criteria are loss of tire pressure and generation of cracks in the wheel without direct impact of the falling weights. In the present paper, a set-up composed of three accelerometers, a temperature sensor and a pressure sensor is examined as the basis of a wheel tester. The sensor set-up configuration, on-line diagnostics and signal transmission are discussed.

  1. Exploration of warm-up period in conceptual hydrological modelling

    NASA Astrophysics Data System (ADS)

    Kim, Kue Bum; Kwon, Hyun-Han; Han, Dawei

    2018-01-01

    One of the important issues in hydrological modelling is to specify the initial conditions of the catchment since it has a major impact on the response of the model. Although this issue should be a high priority among modelers, it has remained unaddressed by the community. The typical suggested warm-up period for the hydrological models has ranged from one to several years, which may lead to an underuse of data. The model warm-up is an adjustment process for the model to reach an 'optimal' state, where internal stores (e.g., soil moisture) move from the estimated initial condition to an 'optimal' state. This study explores the warm-up period of two conceptual hydrological models, HYMOD and IHACRES, in a southwestern England catchment. A series of hydrologic simulations were performed for different initial soil moisture conditions and different rainfall amounts to evaluate the sensitivity of the warm-up period. Evaluation of the results indicates that both initial wetness and rainfall amount affect the time required for model warm up, although it depends on the structure of the hydrological model. Approximately one and a half months are required for the model to warm up in HYMOD for our study catchment and climatic conditions. In addition, it requires less time to warm up under wetter initial conditions (i.e., saturated initial conditions). On the other hand, approximately six months is required for warm-up in IHACRES, and the wet or dry initial conditions have little effect on the warm-up period. Instead, the initial values that are close to the optimal value result in less warm-up time. These findings have implications for hydrologic model development, specifically in determining soil moisture initial conditions and warm-up periods to make full use of the available data, which is very important for catchments with short hydrological records.
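
    The dependence of warm-up length on initial conditions can be illustrated with a toy linear reservoir (not HYMOD or IHACRES): two runs started from dry and wet initial storages are compared, and the warm-up period is taken as the time until they agree within a tolerance. All names and parameter values are illustrative assumptions.

```python
def simulate(storage0, rain, k=0.1):
    # Toy linear reservoir: storage gains rainfall, then loses fraction k.
    s, out = storage0, []
    for p in rain:
        s = (s + p) * (1.0 - k)
        out.append(s)
    return out

def warmup_length(rain, s_dry=0.0, s_wet=100.0, tol=0.01):
    # Steps until runs initialized dry and wet agree within tol: the point
    # where the memory of the initial condition has effectively vanished.
    dry, wet = simulate(s_dry, rain), simulate(s_wet, rain)
    for t, (x, y) in enumerate(zip(dry, wet)):
        if abs(x - y) < tol:
            return t
    return len(rain)
```

    In this linear toy the two trajectories differ by (s_wet - s_dry)(1-k)^t regardless of rainfall, so the warm-up length depends only on the recession constant; in real conceptual models the decay is state- and forcing-dependent, which is what the paper investigates.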

  2. Rank the Voltage across Light Bulbs … Then Set up the Live Experiment

    ERIC Educational Resources Information Center

    Jacobs, Greg C.

    2018-01-01

    The Tasks Inspired by Physics Education Research (TIPERS) workbooks pose questions in styles quite different from the end-of-chapter problems that those of us of a certain age were assigned back in the days before Netscape. My own spin on TIPERS is not just to do them on paper, but to have students set up the situations in the laboratory to…

  3. Kuipers sets up the EHS/TEPC Spectrometer and Detector Assembly in the SM

    NASA Image and Video Library

    2012-03-12

    ISS030-E-177101 (12 March 2012) --- European Space Agency astronaut Andre Kuipers, Expedition 30 flight engineer, sets up the Environmental Health System / Tissue Equivalent Proportional Counter (EHS/TEPC) spectrometer and detector assembly on panel 327 in the Zvezda Service Module of the International Space Station. The TEPC detector assembly is the primary radiation measurement tool on the space station.

  4. Application of activation methods on the Dubna experimental transmutation set-ups.

    PubMed

    Stoulos, S; Fragopoulou, M; Adloff, J C; Debeauvais, M; Brandt, R; Westmeier, W; Krivopustov, M; Sosnin, A; Papastefanou, C; Zamani, M; Manolopoulou, M

    2003-02-01

    High spallation neutron fluxes were produced by irradiating massive heavy targets with proton beams in the GeV range. The experiments were performed at the Dubna High Energy Laboratory using the Nuclotron accelerator. Two different experimental set-ups were used to produce neutron spectra convenient for the transmutation of radioactive waste by (n,x) reactions. Neutron spectra can be reproduced from activation measurements by means of theoretical analysis. Thermal-epithermal and fast-super-fast neutron fluxes were estimated using the 197Au and 238U (n,gamma) and (n,2n) reactions, respectively. Depleted uranium transmutation rates were also studied in both experiments.

  5. Suggested set-up and layout of instruments and equipment for advanced operative laparoscopy.

    PubMed

    Winer, W K; Lyons, T L

    1995-02-01

    Crucial elements that ensure the organization and smoothness of a laparoscopic procedure are clear communication among well-trained endoscopy team members, properly maintained equipment, and a sensible layout of the instruments. The team consists of the surgeon, surgical assistant, circulator, scrub nurse, laser nurse, and anesthesiologist. To promote continuity and interaction and to ensure a systematic, pleasant pace for laparoscopic procedures, the team should establish a specific routine, as well as set-up and layout of tables, equipment, and instruments. Key ingredients for advanced operative laparoscopy to be performed with optimum efficiency and effectiveness are the best organization and placement of the equipment, instrumentation, and team in a particular setting in the operating room.

  6. Superheated liquid carbon dioxide jets: setting up and phenomena

    NASA Astrophysics Data System (ADS)

    Engelmeier, Lena; Pollak, Stefan; Peters, Franz; Weidner, Eckhard

    2018-01-01

We present an experimental investigation of liquid, superheated carbon dioxide jets. Our main goal is to identify the set-up requirements for generating coherent jets, because such jets hold promise for applications in the cleaning and cutting industries. The study leads us through a number of phenomena, which are described, categorized and explained. The experiments are based on compressed (350 MPa) and cooled carbon dioxide, which expands through a cylindrical nozzle into the atmosphere. The nozzle provokes hydraulic flip by a sharp-edged inlet leading to separation and constriction. Upstream temperature and pressure are varied, and the jet's structure and phase state are monitored by a high-speed camera. We observe coherent, liquid jets far from equilibrium, which would demand the solid or gaseous state; these jets are therefore superheated. Carbon dioxide jets, like water jets, are subject below certain nozzle diameters to fluid-dynamic instabilities resulting in breakup. Above certain diameters, flashing jet breakup appears, which is associated with nucleation.

  7. SU-F-P-18: Development of the Technical Training System for Patient Set-Up Considering Rotational Correction in the Virtual Environment Using Three-Dimensional Computer Graphic Engine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Imura, K; Fujibuchi, T; Hirata, H

Purpose: Patient set-up skills in the radiotherapy treatment room have a great influence on the treatment effect of image-guided radiotherapy. In this study, we developed a training system for improving practical set-up skills, including rotational correction, in a virtual environment away from the pressure of the actual treatment room, using a three-dimensional computer graphics (3DCG) engine. Methods: The treatment room for external beam radiotherapy was reproduced in the virtual environment using a 3DCG engine (Unity). The viewpoints for performing patient set-up in the virtual treatment room were arranged on both sides of the virtual operable treatment couch to simulate actual performance by two clinical staff members. The position errors relative to the mechanical isocenter, considering alignment between the skin marker and laser on the virtual patient model, were displayed as numerical values expressed in SI units and as directional arrow marks. The rotational errors, calculated with a point on the virtual body axis as the center of each rotation axis in the virtual environment, were corrected by adjusting the rotational position of a body phantom wearing a belt with a gyroscope placed on the table in real space. These rotational errors were evaluated with vector outer (cross) product operations and trigonometric functions in the script for the patient set-up technique. Results: The viewpoints in the virtual environment allowed individual users to visually recognize the position discrepancy relative to the mechanical isocenter until the positional errors of several millimeters were eliminated. The rotational errors between the two points calculated with the center point could be efficiently corrected, displaying the minimum correction mathematically by utilizing the script. Conclusion: By utilizing the script to correct the rotational errors as well as accurate positional recognition for the patient set-up technique, the training system developed for improving patient set-up skills enabled individual
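The abstract does not reproduce its script; as an illustration only, the rotational error between a measured axis and a reference axis can be obtained from exactly the cross ("outer") and dot products it mentions. A minimal sketch (function name and vector convention are my own assumptions):

```python
import math

def rotation_between(u, v):
    """Angle in degrees between two nonzero 3D vectors, from the cross
    (outer) and dot products; illustrative of the kind of rotational-error
    evaluation the training script performs, not the authors' code."""
    cross = (u[1] * v[2] - u[2] * v[1],
             u[2] * v[0] - u[0] * v[2],
             u[0] * v[1] - u[1] * v[0])
    sin_t = math.sqrt(sum(c * c for c in cross))  # |u x v| = |u||v| sin(theta)
    cos_t = sum(a * b for a, b in zip(u, v))      # u . v   = |u||v| cos(theta)
    return math.degrees(math.atan2(sin_t, cos_t))
```

Using atan2 of the cross-product magnitude and the dot product keeps the result numerically stable for nearly parallel axes, where acos of the dot product alone loses precision.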

  8. Installing and Setting Up the Git Software Tool on OS X | High-Performance Computing | NREL

    Science.gov Websites

    Learn how to install the Git software tool on OS X for use with the Peregrine system. The easiest option is the binary installer for OS X; you can download the latest version of git from http://git-scm.com
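Once git is installed, a typical first-time set-up might look like the following shell sketch (the name and email values are placeholders, not part of the NREL instructions):

```shell
# Is git already present? (OS X includes one with the Xcode command-line tools.)
git --version

# One-time configuration after installing the binary package from git-scm.com
# ("Your Name" / "you@example.com" are placeholders):
git config --global user.name "Your Name"
git config --global user.email "you@example.com"

# Confirm the resulting global configuration
git config --global --list
```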

  9. Quantum efficiency test set up performances for NIR detector characterization at ESTEC

    NASA Astrophysics Data System (ADS)

    Crouzet, P.-E.; Duvet, L.; De Wit, F.; Beaufort, T.; Blommaert, S.; Butler, B.; Van Duinkerken, G.; ter Haar, J.; Heijnen, J.; van der Luijt, K.; Smit, H.; Viale, T.

    2014-07-01

The Payload Technology Validation Section (Future Mission Preparation Office) at ESTEC is in charge of specific mission-oriented validation activities for science and robotic exploration missions, aiming at reducing development risks in the implementation phase. These activities take place during the early mission phases or during the implementation itself. In this framework, a test set-up to characterize the quantum efficiency of near-infrared detectors has been developed. The first detector to be tested will be a HAWAII-2RG detector with a 2.5 μm cut-off; it will be used as a commissioning device in preparation for the tests of prototype European detectors developed under ESA funding. The capability to compare detectors from different manufacturers on the same set-up will be a unique asset for the future mission preparation office. This publication presents the performance of the quantum efficiency test bench in preparation for measurements on the HAWAII-2RG detector. A SOFRADIR Saturn detector has been used as a preliminary test vehicle for the bench. A test set-up with a lamp, chopper, monochromator, pinhole and off-axis mirrors creates a spot of 1 mm diameter between 700 nm and 2.5 μm. The shape of the beam has been measured to match the rms voltage read by the Merlin lock-in amplifier to the amplitude of the incoming signal. The reference detectors have been inter-calibrated with an uncertainty of up to 3%. For the measurement with the HAWAII-2RG detector, the existing cryostat [1] has been modified to accommodate cold black baffling, a cold filter wheel and a sapphire window. A statistical uncertainty of ±2.6% on the quantum efficiency measurement of the detector under test is expected.
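As background to the measurement above: quantum efficiency compares the electron rate delivered by the detector with the photon rate arriving on it. A minimal sketch of that conversion (the function and its assumption that the reference-detector calibration already yields optical power are illustrative, not the bench's actual pipeline):

```python
import math

H = 6.62607015e-34   # Planck constant [J s]
C = 2.99792458e8     # speed of light [m/s]
Q = 1.602176634e-19  # elementary charge [C]

def quantum_efficiency(photocurrent_a, optical_power_w, wavelength_m):
    """QE = (electrons per second) / (photons per second).
    Assumes the incident optical power is already known, e.g. from an
    inter-calibrated reference detector; illustrative sketch only."""
    electrons_per_s = photocurrent_a / Q
    photons_per_s = optical_power_w * wavelength_m / (H * C)
    return electrons_per_s / photons_per_s
```

At a photon energy of 1 eV (wavelength ≈ 1.24 μm), a responsivity of 1 A/W corresponds to a quantum efficiency of 1, which makes for a quick sanity check of the formula.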

  10. Model-based setting of inspiratory pressure and respiratory rate in pressure-controlled ventilation.

    PubMed

    Schranz, C; Becher, T; Schädler, D; Weiler, N; Möller, K

    2014-03-01

Mechanical ventilation carries the risk of ventilator-induced lung injury (VILI). To minimize the risk of VILI, ventilator settings should be adapted to the individual patient's properties. Mathematical models of respiratory mechanics are able to capture the individual physiological condition and can be used to derive personalized ventilator settings. This paper presents model-based calculations of inspiration pressure (pI) and inspiration and expiration times (tI, tE) in pressure-controlled ventilation (PCV), and a retrospective evaluation of the results in a group of mechanically ventilated patients. Incorporating the identified first-order model of respiratory mechanics into the basic equation of alveolar ventilation yielded a nonlinear relation between ventilation parameters during PCV. Given this patient-specific relation, optimized settings in terms of minimal pI and adequate tE can be obtained. We then retrospectively analyzed data from 16 ICU patients with mixed pathologies, whose ventilation had previously been optimized by ICU physicians with the goal of minimizing inspiration pressure, and compared the algorithm's 'optimized' settings to the settings that had been chosen by the physicians. The presented algorithm visualizes the patient-specific relations between inspiration pressure and inspiration time. The algorithm's calculated results correlate highly with the physicians' ventilation settings, with r = 0.975 for the inspiration pressure and r = 0.902 for the inspiration time. The nonlinear patient-specific relations of the ventilation parameters become transparent and support the determination of individualized ventilator settings according to therapeutic goals. Thus, the algorithm is feasible for a variety of ventilated ICU patients and has the potential to improve lung-protective ventilation by minimizing inspiratory pressures and by helping to avoid the build-up of clinically significant intrinsic positive end-expiratory pressure.
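For a first-order (R, C) model in PCV, the delivered tidal volume follows VT = C·(pI − PEEP)·(1 − e^(−tI/RC)), which can be inverted for the inspiratory pressure. The sketch below is a simplified illustration of this kind of relation, not the authors' algorithm; units, variable names and the PEEP default are assumptions:

```python
import math

def required_pi(vt_l, ti_s, r_cmh2o_s_per_l, c_l_per_cmh2o, peep_cmh2o=5.0):
    """Inspiratory pressure [cmH2O] that delivers tidal volume vt_l [L]
    within inspiration time ti_s [s], for a first-order respiratory model
    with resistance R [cmH2O s/L] and compliance C [L/cmH2O].
    Illustrative sketch of the model-based relation, not the paper's code."""
    tau = r_cmh2o_s_per_l * c_l_per_cmh2o          # time constant [s]
    fraction_filled = 1.0 - math.exp(-ti_s / tau)   # fraction of equilibrium volume reached
    return peep_cmh2o + vt_l / (c_l_per_cmh2o * fraction_filled)
```

The nonlinearity the abstract mentions is visible here: lengthening tI lets the exponential term approach 1, so the same tidal volume can be delivered with a lower pI, at the cost of a shorter tE.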

  11. Iterated greedy algorithms to minimize the total family flow time for job-shop scheduling with job families and sequence-dependent set-ups

    NASA Astrophysics Data System (ADS)

    Kim, Ji-Su; Park, Jung-Hyeon; Lee, Dong-Ho

    2017-10-01

This study addresses a variant of job-shop scheduling in which jobs are grouped into job families but are processed individually. The problem can be found in various industrial systems, especially in the reprocessing shops of remanufacturing systems. If the reprocessing shop is of job-shop type and has component-matching requirements, it can be regarded as a job shop with job families, since the components of a product constitute a job family. In particular, sequence-dependent set-ups, in which the set-up time depends on the job just completed and the next job to be processed, are also considered. The objective is to minimize the total family flow time, where the flow time of a job family is the maximum among the completion times of the jobs within that family. A mixed-integer programming model is developed, and two iterated greedy algorithms with different local search methods are proposed. Computational experiments were conducted on modified benchmark instances and the results are reported.
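The paper's algorithms are not reproduced here, but the iterated greedy skeleton they build on (destruction, greedy reconstruction, acceptance) can be sketched for a deliberately simplified stand-in: a single machine with sequence-dependent set-ups, minimizing total job flow time rather than the paper's job-shop family objective. All names are my own:

```python
import random

def total_flow_time(seq, proc, setup):
    """Total flow time of a job sequence on one machine; setup[i][j] is the
    sequence-dependent set-up before job j when job i precedes it (0 = start)."""
    t, total, prev = 0.0, 0.0, 0
    for j in seq:
        t += setup[prev][j] + proc[j]
        total += t
        prev = j
    return total

def iterated_greedy(jobs, proc, setup, d=2, iters=200, seed=1):
    """Iterated greedy: remove d random jobs (destruction), re-insert each
    at its best position (greedy construction), keep improvements.
    Single-machine simplification for illustration only."""
    rng = random.Random(seed)
    best = list(jobs)
    best_cost = total_flow_time(best, proc, setup)
    cur = list(best)
    for _ in range(iters):
        removed = rng.sample(cur, d)
        partial = [j for j in cur if j not in removed]
        for j in removed:  # greedy best-position re-insertion
            cands = [partial[:i] + [j] + partial[i:] for i in range(len(partial) + 1)]
            partial = min(cands, key=lambda s: total_flow_time(s, proc, setup))
        cost = total_flow_time(partial, proc, setup)
        if cost < best_cost:
            best, best_cost = partial, cost
        cur = list(best)
    return best, best_cost
```

With zero set-ups this reduces to the classic case where shortest-processing-time order is optimal, which gives a convenient correctness check.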

  12. Standard fire behavior fuel models: a comprehensive set for use with Rothermel's surface fire spread model

    Treesearch

    Joe H. Scott; Robert E. Burgan

    2005-01-01

    This report describes a new set of standard fire behavior fuel models for use with Rothermel's surface fire spread model and the relationship of the new set to the original set of 13 fire behavior fuel models. To assist with transition to using the new fuel models, a fuel model selection guide, fuel model crosswalk, and set of fuel model photos are provided.

  13. Setting up an Online Panel Representative of the General Population: The German Internet Panel

    ERIC Educational Resources Information Center

    Blom, Annelies G.; Gathmann, Christina; Krieger, Ulrich

    2015-01-01

    This article looks into the processes and outcomes of setting up and maintaining a probability-based longitudinal online survey, which is recruited face-to-face and representative of both the online and the offline population aged 16-75 in Germany. This German Internet Panel studies political and economic attitudes and reform preferences through…

  14. GeneTopics - interpretation of gene sets via literature-driven topic models

    PubMed Central

    2013-01-01

    Background Annotation of a set of genes is often accomplished through comparison to a library of labelled gene sets such as biological processes or canonical pathways. However, this approach might fail if the employed libraries are not up to date with the latest research, don't capture relevant biological themes or are curated at a different level of granularity than is required to appropriately analyze the input gene set. At the same time, the vast biomedical literature offers an unstructured repository of the latest research findings that can be tapped to provide thematic sub-groupings for any input gene set. Methods Our proposed method relies on a gene-specific text corpus and extracts commonalities between documents in an unsupervised manner using a topic model approach. We automatically determine the number of topics summarizing the corpus and calculate a gene relevancy score for each topic allowing us to eliminate non-specific topics. As a result we obtain a set of literature topics in which each topic is associated with a subset of the input genes providing directly interpretable keywords and corresponding documents for literature research. Results We validate our method based on labelled gene sets from the KEGG metabolic pathway collection and the genetic association database (GAD) and show that the approach is able to detect topics consistent with the labelled annotation. Furthermore, we discuss the results on three different types of experimentally derived gene sets, (1) differentially expressed genes from a cardiac hypertrophy experiment in mice, (2) altered transcript abundance in human pancreatic beta cells, and (3) genes implicated by GWA studies to be associated with metabolite levels in a healthy population. In all three cases, we are able to replicate findings from the original papers in a quick and semi-automated manner. 
Conclusions Our approach provides a novel way of automatically generating meaningful annotations for gene sets that are directly

  15. Study of Parameters Affecting the Level of Ultrasound Exposure with In Vitro Set-Ups

    NASA Astrophysics Data System (ADS)

    Leskinen, Jarkko J.; Hynynen, Kullervo

    2010-03-01

Ultrasound (US) exposures are widely used with in vitro cell systems, e.g. in stem cell and tissue engineering research. However, without knowledge of the factors affecting the level of US exposure, the outcome of the biological result may vary from test to test or even be misinterpreted. Therefore, some of the factors affecting in vitro US exposures were studied. The level of US exposure was characterized in standard commercial cell culturing plates. The temperature distributions were measured inside the wells using an infrared camera and fine-wire thermocouples, and the pressure and intensity distributions using a laser vibrometer and a schlieren system. The measurements were made at an operating frequency of around 1 MHz with varying temporal parameters and powers (up to 2 W of acoustic power). Heat accumulation between the wells varied by up to 40-50% depending on the location of the well on the plate. This well-to-well variation could be linked to the activity of a reporter plasmid in osteoblastic cells. Similar temperature variations within the wells were also measured. A small sub-wavelength change in the exposure distance or, respectively, in the liquid volume inside the well was found to alter the acoustic field in both magnitude and shape due to standing waves. The gathered data reveal the complexity of the acoustic field in a typical in vitro set-up and give new information about the environment of in vitro cells during US exposures. These data may be especially useful when US set-ups are designed or characterized.
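The standing-wave sensitivity described above can be illustrated with a one-dimensional superposition model: an incident wave interferes with a partial reflection from the liquid surface or well bottom, so the local amplitude depends on position within a fraction of a wavelength. This is an idealization; the reflection ratio and sound speed below are assumed values:

```python
import math

def standing_wave_amplitude(z_m, freq_hz=1.0e6, c_m_s=1484.0, r=0.5):
    """Relative pressure amplitude |e^{ikz} + r e^{-ikz}| at distance z from
    a reflector, for a unit incident wave and reflection amplitude ratio r.
    Idealized 1-D sketch of why sub-wavelength changes in liquid depth
    shift the exposure level in a culture well."""
    k = 2.0 * math.pi * freq_hz / c_m_s  # wavenumber (speed of sound ~1484 m/s in water)
    return math.sqrt(1.0 + r * r + 2.0 * r * math.cos(2.0 * k * z_m))
```

At 1 MHz in water the wavelength is about 1.5 mm, so moving an antinode to a node takes only ~0.37 mm of depth change, consistent with the sub-wavelength sensitivity reported above.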

  16. Community Exchange Systems. What They Are. How They Work. How to Set One Up.

    ERIC Educational Resources Information Center

    Page, Leslie

    This booklet explains the concept of a community exchange system (CES), or barter system, for the exchange of goods and services and describes how to set one up. The booklet is concerned only with nonprofit, voluntary organizations. The booklet is organized in four sections. The first section introduces the community exchange systems idea and…

  17. How To Set Up Your Own Small Business. Volumes I-II and Overhead Transparencies.

    ERIC Educational Resources Information Center

    Fallek, Max

    This two-volume textbook and collection of overhead transparency masters is intended for use in a course in setting up a small business. The following topics are covered in the first volume: getting off to a good start, doing market research, forecasting sales, financing a small business, understanding the different legal needs of different types…

  18. A managed clinical network for cardiac services: set-up, operation and impact on patient care.

    PubMed

    Stc Hamilton, Karen E; Sullivan, Frank M; Donnan, Peter T; Taylor, Rex; Ikenwilo, Divine; Scott, Anthony; Baker, Chris; Wyke, Sally

    2005-01-01

To investigate the set-up and operation of a Managed Clinical Network for cardiac services and assess its impact on patient care. This single case study used process evaluation with observational before-and-after comparison of indicators of quality of care and costs. The study was conducted in Dumfries and Galloway, Scotland, and used a three-level framework. Process evaluation of the network set-up and operation was conducted through a documentary review of minutes, guidelines and protocols, and transcripts of fourteen semi-structured interviews with health service personnel, including senior managers, general practitioners, nurses, cardiologists and members of the public. Outcome evaluation of the impact of the network was conducted through interrupted time series analysis of clinical data on 202 patients aged less than 76 years admitted to hospital with a confirmed myocardial infarction one year before and one year after the establishment of the network. The main outcome measures were differences between indicators of quality of care targeted by network protocols. Economic evaluation assessed the transaction costs of the set-up and operation of the network and the resource costs of the clinical care of the 202 myocardial infarction patients, from the time of hospital admission to 6 months post discharge, through interrupted time series analysis; the outcome measure was the difference in National Health Service resource use. Despite early difficulties, the network was successful in bringing together clinicians, patients and managers to redesign services, exhibiting most features of good network management. The role of the energetic lead clinician was crucial, but the network took time to develop and 'bed down'. Its primary modus operandi was the development of a myocardial infarction pathway and associated protocols. Of sixteen clinical care indicators, two improved significantly following the launch of the network and nine showed improvements that were not statistically significant.
There was no difference

  19. Swell-generated Set-up and Infragravity Wave Propagation Over a Fringing Coral Reef: Implications for Wave-driven Inundation of Atoll Islands

    NASA Astrophysics Data System (ADS)

    Cheriton, O. M.; Storlazzi, C. D.; Rosenberger, K. J.; Quataert, E.; van Dongeren, A.

    2014-12-01

The Republic of the Marshall Islands comprises 1,156 islands on 29 low-lying atolls with a mean elevation of 2 m that are susceptible to sea-level rise and often subjected to overwash during large wave events. A 6-month deployment of wave and tide gauges across two shore-normal sections of north-facing coral reef on Roi-Namur Island on Kwajalein Atoll was conducted during 2013-2014 to quantify wave dynamics and wave-driven water levels on the fringing coral reef. Wave heights and periods on the reef flat were strongly correlated with the water levels. On the fore reef, the majority of wave energy was concentrated in the incident band (5-25 s); due to breaking at the reef crest, however, the wave energy over the reef flat was dominated by infragravity-band (25-250 s) motions. Two large wave events, with heights of 6-8 m at periods of 15 s over the fore reef, were observed. During these events, infragravity-band wave heights exceeded the incident-band wave heights and approximately 1.0 m of set-up was established over the innermost reef flat. This set-up enabled the propagation of large waves across the reef flat, reaching maximum heights of nearly 2 m on the innermost reef flat adjacent to the toe of the beach. XBeach models of the instrument transects were able to replicate the incident waves, infragravity waves, and wave-driven set-up across the reef when the hydrodynamic roughness of the reef was correctly parameterized. These events led to more than 3 m of wave-driven run-up and inundation of the island, which drove substantial morphological change to the beach face.
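The incident vs. infragravity separation used above amounts to computing a band-limited significant wave height, Hs = 4·sqrt(m0), where m0 is the variance-spectrum integral restricted to the band's period range. A simplified sketch (simple bin sums; function and variable names are my own):

```python
import math

def band_wave_height(freqs_hz, spec_m2_per_hz, t_min_s, t_max_s):
    """Significant wave height Hs = 4*sqrt(m0) from the portion of a
    variance spectrum whose periods fall within [t_min_s, t_max_s],
    e.g. incident (5-25 s) or infragravity (25-250 s) bands.
    Simple rectangular-bin integration for illustration."""
    m0 = 0.0
    for i in range(len(freqs_hz) - 1):
        df = freqs_hz[i + 1] - freqs_hz[i]
        period_s = 1.0 / freqs_hz[i]
        if t_min_s <= period_s <= t_max_s:
            m0 += spec_m2_per_hz[i] * df  # zeroth spectral moment over the band
    return 4.0 * math.sqrt(m0)
```

Calling this twice on the same measured spectrum, once per band, reproduces the kind of comparison reported above (infragravity Hs exceeding incident Hs over the reef flat during the large events).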

  20. University Start-ups: A Better Business Model

    NASA Astrophysics Data System (ADS)

    Dehn, J.; Webley, P. W.

    2015-12-01

Many universities look to start-up companies as a way to attract faculty and to support research and students as traditional federal sources become harder to come by. University-affiliated start-up companies can apply for a broader suite of grants, as well as market their services to a broad customer base. University administrators often see this as a potential panacea, but national statistics show this is not the case. Rarely do universities profit significantly from their start-ups. With a success rate of around 20%, most start-ups end up costing the university money as well as faculty time. For faculty who want to continue in academia, a start-up is often unattractive because it commonly leads out of academia: running a successful business while maintaining a strong teaching and research load is almost impossible. Most business models and business professionals work outside of academia, and the models taught in business schools do not merge well with a university environment. To mitigate this, a new business model is proposed in which university start-ups are aligned with the academic and research missions of the university. A university start-up must work within the university and directly support research and students, and the work done maintaining the business must be recognized as part of the faculty member's university obligations. This requires a complex conflict-of-interest management plan, and the companies must be non-profit in order not to jeopardize the university's status. This approach may not work well for all universities, but it would be ideal for many as a way to conserve resources and ensure a harmonious relationship with their start-ups and faculty.

  1. Setting up of teeth in the neutral zone and its effect on speech

    PubMed Central

    Al-Magaleh, Wafa’a Radwan; Swelem, Amal Ali; Shohdi, Sahar Saad; Mawsouf, Nadia Mohamed

    2011-01-01

    Rational goals for denture construction are basically directed at the restoration of esthetics and masticatory function and the healthy preservation of the remaining natural tissues. Little concern has been given to the perfection and optimization of the phonetic quality of denture users. However, insertion of prosthodontic restorations may lead to speech defects. Most such defects are mild but, nevertheless, can be a source of concern to the patient. For the dental practitioner, there are few guidelines for designing a prosthetic restoration with maximum phonetic success. One of these guidelines involves the setting up of teeth within the neutral zone. The aim of this study was to evaluate, subjectively and objectively, the effect on speech of setting up teeth in the neutral zone. Three groups were examined: group I (control) included 10 completely dentulous subjects, group II included 10 completely edentulous patients with conventional dentures, and group III included the same 10 edentulous patients with neutral zone dentures. Subjective assessment included patient satisfaction. Objective assessment included duration taken for recitation of Al-Fateha and acoustic analysis. Subjectively, patients were more satisfied with their neutral zone dentures. Objectively, speech produced with the neutral zone dentures was closer to normal than speech with conventional dentures. PMID:23960527

  2. Learning Setting-Generalized Activity Models for Smart Spaces

    PubMed Central

    Cook, Diane J.

    2011-01-01

    The data mining and pervasive computing technologies found in smart homes offer unprecedented opportunities for providing context-aware services, including health monitoring and assistance to individuals experiencing difficulties living independently at home. In order to provide these services, smart environment algorithms need to recognize and track activities that people normally perform as part of their daily routines. However, activity recognition has typically involved gathering and labeling large amounts of data in each setting to learn a model for activities in that setting. We hypothesize that generalized models can be learned for common activities that span multiple environment settings and resident types. We describe our approach to learning these models and demonstrate the approach using eleven CASAS datasets collected in seven environments. PMID:21461133

  3. Analyzing ROC curves using the effective set-size model

    NASA Astrophysics Data System (ADS)

    Samuelson, Frank W.; Abbey, Craig K.; He, Xin

    2018-03-01

    The Effective Set-Size model has been used to describe uncertainty in various signal detection experiments. The model regards images as if they were an effective number (M*) of searchable locations, where the observer treats each location as a location-known-exactly detection task with signals having average detectability d'. The model assumes a rational observer behaves as if he searches an effective number of independent locations and follows signal detection theory at each location. Thus the location-known-exactly detectability (d') and the effective number of independent locations M* fully characterize search performance. In this model the image rating in a single-response task is assumed to be the maximum response that the observer would assign to these many locations. The model has been used by a number of other researchers, and is well corroborated. We examine this model as a way of differentiating imaging tasks that radiologists perform. Tasks involving more searching or location uncertainty may have higher estimated M* values. In this work we applied the Effective Set-Size model to a number of medical imaging data sets. The data sets include radiologists reading screening and diagnostic mammography with and without computer-aided diagnosis (CAD), and breast tomosynthesis. We developed an algorithm to fit the model parameters using two-sample maximum-likelihood ordinal regression, similar to the classic bi-normal model. The resulting model ROC curves are rational and fit the observed data well. We find that the distributions of M* and d' differ significantly among these data sets, and differ between pairs of imaging systems within studies. For example, on average tomosynthesis increased readers' d' values, while CAD reduced the M* parameters. We demonstrate that the model parameters M* and d' are correlated. We conclude that the Effective Set-Size model may be a useful way of differentiating location uncertainty from the diagnostic uncertainty in medical
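Under the model as summarized above, the image rating is the maximum of M* location responses, one of which carries the signal on abnormal images; this yields closed-form ROC coordinates. A sketch of those coordinates (not the authors' maximum-likelihood fitting code):

```python
import math

def phi(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def ess_roc_point(c, m_star, d_prime):
    """(FPF, TPF) at rating threshold c under the Effective Set-Size model:
    a noise-only image is the max of M* N(0,1) responses, and a signal
    image replaces one of them with N(d', 1). Fractional M* is allowed,
    as in the model's fitted parameters. Illustrative sketch only."""
    fpf = 1.0 - phi(c) ** m_star
    tpf = 1.0 - phi(c) ** (m_star - 1.0) * phi(c - d_prime)
    return fpf, tpf
```

Setting M* = 1 recovers the ordinary location-known-exactly binormal ROC, while increasing M* pushes both coordinates upward at a fixed threshold, capturing the extra false positives that search uncertainty introduces.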

  4. How To Set Up Your Own Small Business. Service Company Case Study. Manufacturing Firm Case Study. Retail Store Case Study.

    ERIC Educational Resources Information Center

    Fallek, Max

    This collection of case studies is intended for use in a course in setting up a small business. The first, a case study of the process of setting up a service company, covers analyzing the pros and cons of starting one's own business, assessing the competition and local market, and selecting a site for and financing the business. The principal…

  5. Set up and Operation of Video Cassette Recorders or "...How Do I Work This Thing???"

    ERIC Educational Resources Information Center

    Alaska State Dept. of Education, Juneau.

    Designed to assist Alaskans in making optimum use of the LearnAlaska TV transmitter network, this booklet provides instructions for the operation and maintenance of videocassette recorders (VCRs). After a brief introduction, which lists state film library addresses for ordering an accompanying videocassette entitled "Set Up & Operation…

  6. Test set up description and performances for HAWAII-2RG detector characterization at ESTEC

    NASA Astrophysics Data System (ADS)

    Crouzet, P.-E.; ter Haar, J.; de Wit, F.; Beaufort, T.; Butler, B.; Smit, H.; van der Luijt, C.; Martin, D.

    2012-07-01

In the framework of the European Space Agency's Cosmic Vision program, the Euclid mission has the objective of mapping the geometry of the Dark Universe. Galaxies and clusters of galaxies will be observed in the visible and near-infrared wavelengths by an imaging and a spectroscopic channel. For the Near Infrared Spectrometer instrument (NISP), state-of-the-art HAWAII-2RG detectors will be used, associated with the SIDECAR ASIC readout electronics, which will perform the image frame acquisitions. To characterize and validate the performance of these detectors, a test bench has been designed, tested and validated. This publication describes the pre-tests performed to build the set-up dedicated to dark current measurements and to tests requiring reasonably uniform light levels (such as conversion gain measurements). Successful cryogenic and vacuum tests on commercial LEDs and photodiodes are shown. An optimized feedthrough in stainless steel with a V-groove to pot the flex cable connecting the SIDECAR ASIC to the room-temperature board (JADE2) has been designed and tested. The test set-up for quantum efficiency measurements, consisting of a lamp, a monochromator, an integrating sphere and a set of cold filters, and which is currently under construction, will ensure uniform illumination across the detector with variations lower than 2%. A dedicated spot projector for intra-pixel measurements has been designed and built to reach a spot diameter of 5 μm at 920 nm with 2 nm of bandwidth [1].

  7. Simulation as a set-up for technical proficiency: can a virtual warm-up improve live fibre-optic intubation?

    PubMed

    Samuelson, S T; Burnett, G; Sim, A J; Hofer, I; Weinberg, A D; Goldberg, A; Chang, T S; DeMaria, S

    2016-03-01

Fibre-optic intubation (FOI) is an advanced technical skill, which anaesthesia residents must frequently perform under pressure. In surgical subspecialties, a virtual 'warm-up' has been used to prime a practitioner's skill set immediately before performance of challenging procedures. This study examined whether a virtual warm-up improved the performance of elective live patient FOI by anaesthesia residents. Clinical anaesthesia yr 1 and 2 (CA1 and CA2) residents were recruited to perform elective asleep oral FOI. Residents either underwent a 5 min, guided warm-up (using a bronchoscopy simulator) immediately before live FOI on patients with predicted normal airways or performed live FOI on similar patients without the warm-up. Subjects were timed performing FOI (from scope passing teeth to viewing the carina) and were graded on a 45-point skill scale by attending anaesthetists. After a washout period, all subjects were resampled as members of the opposite cohort. Multivariate analysis was performed to control for variations in previous FOI experience of the residents. Thirty-three anaesthesia residents were recruited, of whom 22 were CA1 and 11 were CA2. Virtual warm-up conferred a 37% reduction in time for CA1s (mean 35.8 (SD 3.2) s vs. 57 (SD 3.2) s, P<0.0002) and a 26% decrease for CA2s (mean 23 (SD 1.7) s vs. 31 (SD 1.7) s, P=0.0118). Global skill score increased with warm-up by 4.8 points for CA1s (mean 32.8 (SD 1.2) vs. 37.6 (SD 1.2), P=0.0079) and 5.1 points for CA2s (37.7 (SD 1.1) vs. 42.8 (SD 1.1), P=0.0125). Crossover period and sequence did not show a statistically significant association with performance. Virtual warm-up significantly improved performance by residents of FOI in live patients with normal airway anatomy, as measured both by speed and by a scaled evaluation of skills.

  8. Evaluation of RSA set-up from a clinical biplane fluoroscopy system for 3D joint kinematic analysis.

    PubMed

    Bonanzinga, Tommaso; Signorelli, Cecilia; Bontempi, Marco; Russo, Alessandro; Zaffagnini, Stefano; Marcacci, Maurilio; Bragonzoni, Laura

    2016-01-01

Dynamic roentgen stereophotogrammetric analysis (RSA), a technique currently based only on customized radiographic equipment, has been shown to be a very accurate method for detecting three-dimensional (3D) joint motion. The aim of the present work was to evaluate the applicability of an innovative RSA set-up for in vivo knee kinematic analysis, using a biplane fluoroscopic image system. To this end, the authors describe the set-up as well as a possible protocol for clinical knee joint evaluation, and the accuracy of the kinematic measurements is assessed. The authors evaluated the accuracy of 3D kinematic analysis of the knee in a new RSA set-up, based on a commercial biplane fluoroscopy system integrated into the clinical environment. The study was organized in three main phases: an in vitro test under static conditions, an in vitro test under dynamic conditions reproducing a flexion-extension range of motion (ROM), and an in vivo analysis of the flexion-extension ROM. For each test, the following were calculated as an indication of the tracking accuracy: mean, minimum and maximum values and standard deviation of the rigid-body fitting error. In terms of rigid-body fitting, in vivo test errors were found to be 0.10±0.05 mm. Phantom tests in static and kinematic conditions showed precision levels, for translations and rotations, of below 0.1 mm/0.2° and below 0.5 mm/0.3° respectively, for all directions. The results of this study suggest that kinematic RSA can be successfully performed using a standard clinical biplane fluoroscopy system for the acquisition of slow movements of the lower limb. A kinematic RSA set-up using a clinical biplane fluoroscopy system is potentially applicable and provides a useful method for obtaining a better characterization of joint biomechanics.

  9. Testing normative and self-appraisal feedback in an online slot-machine pop-up in a real-world setting

    PubMed Central

    Auer, Michael M.; Griffiths, Mark D.

    2015-01-01

    Over the last few years, there have been an increasing number of gaming operators that have incorporated on-screen pop-up messages while gamblers play on slot machines and/or online as one of a range of tools to help encourage responsible gambling. Coupled with this, there has also been an increase in empirical research into whether such pop-up messages are effective, particularly in laboratory settings. However, very few studies have been conducted on the utility of pop-up messages in real-world gambling settings. The present study investigated the effects of normative and self-appraisal feedback in a slot machine pop-up message compared to a simple (non-enhanced) pop-up message. The study was conducted in a real-world gambling environment by comparing the behavioral tracking data of two representative random samples of 800,000 gambling sessions (i.e., 1.6 million sessions in total) across two conditions (i.e., simple pop-up message versus an enhanced pop-up message). The results indicated that the additional normative and self-appraisal content doubled the number of gamblers who stopped playing after they received the enhanced pop-up message (1.39%) compared to the simple pop-up message (0.67%). The data suggest that pop-up messages influence only a small number of gamblers to cease long playing sessions and that enhanced messages are slightly more effective in helping gamblers to stop playing in-session. PMID:25852630
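    As a rough check on the reported effect, the counts below are reconstructed from the published rates (1.39% vs. 0.67% of 800,000 sessions per condition), so they are approximate; a two-proportion z-test then shows how decisively a difference of this size separates from chance at this sample size:

```python
import math

# Counts reconstructed from the reported stop rates (approximate).
n1 = n2 = 800_000
x1 = round(0.0139 * n1)   # enhanced pop-up: sessions ended after the message
x2 = round(0.0067 * n2)   # simple pop-up: sessions ended after the message

p1, p2 = x1 / n1, x2 / n2
p_pool = (x1 + x2) / (n1 + n2)

# Standard two-proportion z-statistic with a pooled variance estimate.
se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
z = (p1 - p2) / se

print(f"stop rates {p1:.2%} vs {p2:.2%}, ratio {p1 / p2:.2f}, z = {z:.1f}")
```

    Even though the absolute rates are tiny, the very large session counts make the roughly twofold difference statistically unambiguous.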

  10. Performability modeling with continuous accomplishment sets

    NASA Technical Reports Server (NTRS)

    Meyer, J. F.

    1979-01-01

    A general modeling framework that permits the definition, formulation, and evaluation of performability is described. It is shown that performability relates directly to system effectiveness, and is a proper generalization of both performance and reliability. A hierarchical modeling scheme is used to formulate the capability function used to evaluate performability. The case in which performance variables take values in a continuous accomplishment set is treated explicitly.

  11. Ebola 2014: Setting up a port health screening programme at an international train station.

    PubMed

    Cleary, Vivien; Wynne-Evans, Edward; Freed, James; Fleet, Katie; Thorn, Simone; Turbitt, Deborah

    2017-12-01

    An outbreak of Ebola virus disease (EVD) began in Guinea in December 2013 and was declared a Public Health Emergency of International Concern by the World Health Organization in August 2014. In October, the UK government tasked Public Health England (PHE) to set up EVD screening at key ports. The key aim of port-of-entry screening was to identify passengers coming from areas with high risk of EVD, and give them advice to raise their awareness of symptoms and what actions to take. Direct flights from Sierra Leone, Guinea or Liberia had all been cancelled, so intelligence on passenger numbers and routes was used to identify the most commonly used routes from the affected countries into the UK. One of these was St Pancras International train station. Screening had never previously been implemented at a UK train station so had to be set up from scratch. Key to the success of this was excellent multi-agency working between PHE, the UK Border Force, Eurostar, Network Rail and the Cabinet Office. This paper gives an overview of the activation of EVD screening at St Pancras International and the subsequent decommissioning.

  12. Combined group and individual model for postbariatric surgery follow-up care.

    PubMed

    Lorentz, Paul A; Swain, James M; Gall, Margaret M; Collazo-Clavell, Maria L

    2012-01-01

    The prevalence of bariatric surgery in the United States has increased significantly during the past decade, increasing the number of patients requiring postbariatric surgery follow-up care. Our objective was to develop and implement an efficient, financially viable, postbariatric surgery practice model that would be acceptable to patients. The setting was the Mayo Clinic (Rochester, MN). By monitoring the attendance rates and using patient surveys, we tested patient acceptance of a new, shared medical appointment practice model in the care of postbariatric surgery patients. Efficiency was assessed by comparing differences in time per patient and total provider time required between the former and new care models. Individual-only patient/provider visits were replaced by combined group and individual visits (CGV). Our CGV model was well-attended and accepted. The patient attendance rate was >90% at all postoperative follow-up points. Furthermore, 83%, 85.2%, and 75.7% of the 3-, 6-, and 12-month postbariatric surgery patients, respectively, responded that they would not prefer to have only individual visits with their healthcare providers. The CGV model also resulted in greater time efficiency and cost reduction. On average, 5 patients were seen within 4.9 provider hours compared with 10.4 provider hours with the individual-only patient/provider visit model. Furthermore, the average billable charge for the CGV model's group medical nutrition therapy was 50-64% less than the equivalent individual medical nutrition therapy used in the individual-only patient/provider visit model. Shared medical appointments have a valuable role in the care of the postbariatric surgery population, offering a time- and cost-effective model for healthcare provision that is well-accepted by patients. Copyright © 2012 American Society for Metabolic and Bariatric Surgery. Published by Elsevier Inc. All rights reserved.

  13. Modelling uncertainty with generalized credal sets: application to conjunction and decision

    NASA Astrophysics Data System (ADS)

    Bronevich, Andrey G.; Rozenberg, Igor N.

    2018-01-01

    To model conflict, non-specificity and contradiction in information, upper and lower generalized credal sets are introduced. Any upper generalized credal set is a convex subset of plausibility measures interpreted as lower probabilities whose bodies of evidence consist of singletons and a certain event. Analogously, contradiction is modelled in the theory of evidence by a belief function that is greater than zero at the empty set. Based on generalized credal sets, we extend the conjunctive rule to contradictory sources of information, introduce constructions like the natural extension in the theory of imprecise probabilities, and show that the model of generalized credal sets coincides with the model of imprecise probabilities if the profile of a generalized credal set consists of probability measures. We also show how the introduced model can be applied to decision problems.

  14. Spatial occupancy models for large data sets

    USGS Publications Warehouse

    Johnson, Devin S.; Conn, Paul B.; Hooten, Mevin B.; Ray, Justina C.; Pond, Bruce A.

    2013-01-01

    Since its development, occupancy modeling has become a popular and useful tool for ecologists wishing to learn about the dynamics of species occurrence over time and space. Such models require presence–absence data to be collected at spatially indexed survey units. However, only recently have researchers recognized the need to correct for spatially induced overdispersion by explicitly accounting for spatial autocorrelation in occupancy probability. Previous efforts to incorporate such autocorrelation have largely focused on logit-normal formulations for occupancy, with spatial autocorrelation induced by a random effect within a hierarchical modeling framework. Although useful, computational time generally limits such an approach to relatively small data sets, and there are often problems with algorithm instability, yielding unsatisfactory results. Further, recent research has revealed a hidden form of multicollinearity in such applications, which may lead to parameter bias if not explicitly addressed. Combining several techniques, we present a unifying hierarchical spatial occupancy model specification that is particularly effective over large spatial extents. This approach employs a probit mixture framework for occupancy and can easily accommodate a reduced-dimensional spatial process to resolve issues with multicollinearity and spatial confounding while improving algorithm convergence. Using open-source software, we demonstrate this new model specification using a case study involving occupancy of caribou (Rangifer tarandus) over a set of 1080 survey units spanning a large contiguous region (108 000 km2) in northern Ontario, Canada. Overall, the combination of a more efficient specification and open-source software allows for a facile and stable implementation of spatial occupancy models for large data sets.
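    The detection problem that motivates occupancy modeling can be shown with a toy simulation (this is not the paper's probit spatial model; the occupancy and detection probabilities are invented, and only the 1080 survey units echo the case study): occupied sites that never yield a detection pull the naive occupancy estimate below the true value.

```python
import random

random.seed(1)

psi, p = 0.6, 0.4            # true occupancy and per-visit detection probability
n_sites, n_visits = 1080, 4  # 1080 survey units, as in the caribou case study

# Simulate true occupancy, then imperfect detection over repeat visits.
occupied = [random.random() < psi for _ in range(n_sites)]
detected = [occ and any(random.random() < p for _ in range(n_visits))
            for occ in occupied]

# The naive estimate (share of sites with at least one detection) is biased
# low by the factor 1 - (1 - p)**n_visits, which occupancy models correct for.
naive = sum(detected) / n_sites
print(f"true psi = {psi}, naive estimate = {naive:.3f}")
```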

  15. MO-F-CAMPUS-T-05: Correct Or Not to Correct for Rotational Patient Set-Up Errors in Stereotactic Radiosurgery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Briscoe, M; Ploquin, N; Voroney, JP

    2015-06-15

    Purpose: To quantify the effect of patient rotation in stereotactic radiation therapy and establish a threshold where rotational patient set-up errors have a significant impact on target coverage. Methods: To simulate rotational patient set-up errors, a Matlab code was created to rotate the patient dose distribution around the treatment isocentre, located centrally in the lesion, while keeping the structure contours in the original locations on the CT and MRI. Rotations of 1°, 3°, and 5° for each of the pitch, roll, and yaw, as well as simultaneous rotations of 1°, 3°, and 5° around all three axes, were applied to two types of brain lesions: brain metastasis and acoustic neuroma. In order to analyze multiple tumour shapes, these plans included small spherical (metastasis), elliptical (acoustic neuroma), and large irregular (metastasis) tumour structures. Dose-volume histograms and planning target volumes were compared between the planned patient positions and those with simulated rotational set-up errors. The RTOG conformity index for patient rotation was also investigated. Results: Examining the tumour volumes that received 80% of the prescription dose in the planned and rotated patient positions showed decreases in prescription dose coverage of up to 2.3%. Conformity indices for treatments with simulated rotational errors showed decreases of up to 3% compared to the original plan. For irregular lesions, degradation of 1% of the target coverage can be seen for rotations as low as 3°. Conclusions: These data show that for elliptical or spherical targets, rotational patient set-up errors of less than 3° around any or all axes do not have a significant impact on the dose delivered to the target volume or the conformity index of the plan. However, the same rotational errors would have an impact on plans for irregular tumours.
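    The simulated rotations amount to composing elementary rotation matrices about the isocentre. The sketch below applies a simultaneous 3° pitch/roll/yaw rotation to a point 30 mm from the isocentre (an invented, illustrative offset, not a value from the study) to show the order of magnitude of the resulting displacement:

```python
import math

def rot_x(a):  # pitch, angle in radians
    c, s = math.cos(a), math.sin(a)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def rot_y(a):  # roll
    c, s = math.cos(a), math.sin(a)
    return [[c, 0, s], [0, 1, 0], [-s, 0, c]]

def rot_z(a):  # yaw
    c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def apply(R, v):
    return [sum(R[i][k] * v[k] for k in range(3)) for i in range(3)]

# Simultaneous 3-degree rotations about all three axes, about the isocentre.
a = math.radians(3.0)
R = matmul(rot_z(a), matmul(rot_y(a), rot_x(a)))

# A voxel 30 mm from the isocentre (illustrative peripheral target point).
v = [30.0, 0.0, 0.0]
shift = math.dist(apply(R, v), v)
print(f"displacement of a point 30 mm off-isocentre: {shift:.2f} mm")
```

    For small angles the displacement grows roughly linearly with both the rotation angle and the distance from the isocentre, which is why large irregular targets are the most sensitive to rotational set-up errors.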

  16. A Python tool to set up relative free energy calculations in GROMACS

    PubMed Central

    Klimovich, Pavel V.; Mobley, David L.

    2015-01-01

    Free energy calculations based on molecular dynamics (MD) simulations have seen tremendous growth in the last decade. However, it is still difficult and tedious to set them up in an automated manner, as the majority of present-day MD simulation packages lack that functionality. Relative free energy calculations are a particular challenge for several reasons, including the problem of finding a common substructure and mapping the transformation to be applied. Here we present a tool, alchemical-setup.py, that automatically generates all the input files needed to perform relative solvation and binding free energy calculations with the MD package GROMACS. When combined with the Lead Optimization Mapper (LOMAP) [14], recently developed in our group, alchemical-setup.py allows fully automated setup of relative free energy calculations in GROMACS. Taking a graph of the planned calculations and a mapping, both computed by LOMAP, our tool generates the topology and coordinate files needed to perform relative free energy calculations for a given set of molecules, and provides a set of simulation input parameters. The tool was validated by performing relative hydration free energy calculations for a handful of molecules from the SAMPL4 challenge [16]. Good agreement with previously published results and the straightforward way in which free energy calculations can be conducted make alchemical-setup.py a promising tool for automated setup of relative solvation and binding free energy calculations. PMID:26487189

  17. Development of a new in-air micro-PIXE set-up with in-vacuum charge measurements in Atomki

    NASA Astrophysics Data System (ADS)

    Török, Zs.; Huszánk, R.; Csedreki, L.; Dani, J.; Szoboszlai, Z.; Kertész, Zs.

    2015-11-01

    A new external microbeam set-up has recently been installed as an extension of the existing microprobe system at the Laboratory of Ion Beam Applications of Atomki, Debrecen, Hungary. The external beam set-up, based on the Oxford Microbeams (OM) system, is equipped with two X-ray detectors for PIXE analysis, a digital microscope, two alignment lasers and a precision XYZ stage for easy and reproducible positioning of the sample. Exit windows of different thicknesses and materials can be used according to the actual demands; currently, a silicon nitride (Si3N4) film 200 nm thick is employed in our laboratory. The first application was demonstrated in the field of archaeometry, on Bronze Age hoards from Hungary.

  18. Setting up a new CZO in the Ganga basin: instrumentation, stakeholder engagement and preliminary observations

    NASA Astrophysics Data System (ADS)

    Gupta, S.; Tripathi, S.; Sinha, R.; Karumanchi, S. H.; Paul, D.; Tripathi, S. N.; Sen, I. S.; Dash, S. K.

    2017-12-01

    The Ganga plains represent the abode of more than 400 million people and a region of severe anthropogenic disturbance to natural processes. Changing agricultural practices, inefficient use of water, contamination of groundwater systems, and decreasing soil fertility are some of the issues that have affected the long-term resilience of hydrological processes. The quantification of these processes demands a network of hydro-meteorological instrumentation, low-cost sensors, continuous engagement of stakeholders and real-time data transmission at a fine interval. We have therefore set up a Critical Zone Observatory (CZO) in a small watershed (35 km2) that forms an intensively managed rural landscape consisting of 92% agricultural land in the Pandu River Basin (a small tributary of the Ganga River). Apart from setting up a hydro-meteorological observatory, the major science questions we want to address relate to the development of a water balance model, understanding soil-water interaction and estimating nutrient fluxes in the watershed. This observatory currently has various types of sensors that are divided into three categories: (a) spatially sparse but temporally fine data, (b) spatially dense but temporally coarse data and (c) spatially dense and temporally fine data. The first category represents high-cost sensors, namely automatic weather stations that are deployed at two locations and provide data at 15-minute intervals. The second category includes portable soil moisture, discharge and groundwater level measurements at weekly or biweekly intervals. The third category comprises low-cost sensors, including automatic surface and groundwater level sensors installed on open wells to monitor the continuous fluctuation of water level every 15 minutes. In addition to involving the local communities in data collection (e.g. manual rainfall measurement, water and soil sampling), this CZO also aims to provide relevant information to them for improving their sustainability.

  19. Parameterization of Model Validating Sets for Uncertainty Bound Optimizations. Revised

    NASA Technical Reports Server (NTRS)

    Lim, K. B.; Giesy, D. P.

    2000-01-01

    Given measurement data, a nominal model and a linear fractional transformation uncertainty structure with an allowance on unknown but bounded exogenous disturbances, easily computable tests for the existence of a model validating uncertainty set are given. Under mild conditions, these tests are necessary and sufficient for the case of complex, nonrepeated, block-diagonal structure. For the more general case which includes repeated and/or real scalar uncertainties, the tests are only necessary but become sufficient if a collinearity condition is also satisfied. With the satisfaction of these tests, it is shown that a parameterization of all model validating sets of plant models is possible. The new parameterization is used as a basis for a systematic way to construct or perform uncertainty tradeoff with model validating uncertainty sets which have specific linear fractional transformation structure for use in robust control design and analysis. An illustrative example which includes a comparison of candidate model validating sets is given.

  20. Evaluating Gene Set Enrichment Analysis Via a Hybrid Data Model

    PubMed Central

    Hua, Jianping; Bittner, Michael L.; Dougherty, Edward R.

    2014-01-01

    Gene set enrichment analysis (GSA) methods have been widely adopted by biological labs to analyze data and generate hypotheses for validation. Most of the existing comparison studies focus on whether the existing GSA methods can produce accurate P-values; however, practitioners are often more concerned with the correct gene-set ranking generated by the methods. The ranking performance is closely related to two critical goals associated with GSA methods: the ability to reveal biological themes and the assurance of reproducibility, especially for small-sample studies. We have conducted a comprehensive simulation study focusing on the ranking performance of seven representative GSA methods. We overcome the limited availability of real data sets by creating hybrid data models from existing large data sets. To build the data model, we pick a master gene from the data set to form the ground truth and artificially generate the phenotype labels. Multiple hybrid data models can be constructed from one data set, and multiple data sets of smaller sizes can be generated by resampling the original data set. This approach enables us to generate a large batch of data sets to check the ranking performance of GSA methods. Our simulation study reveals that for the proposed data model, the Q2 type GSA methods have in general better performance than other GSA methods, and the global test has the most robust results. The properties of a data set play a critical role in the performance. For data sets with highly connected genes, all GSA methods suffer significantly in performance. PMID:24558298
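    The hybrid-data-model construction (pick a master gene, derive artificial phenotype labels from it, resample smaller data sets) can be sketched as follows; the sizes are arbitrary and the expression values are random placeholders rather than a real data set:

```python
import random

random.seed(0)

# Illustrative dimensions: 200 genes measured in 60 samples.
n_genes, n_samples = 200, 60
data = [[random.gauss(0, 1) for _ in range(n_samples)] for _ in range(n_genes)]

master = data[0]                      # chosen master gene (ground truth)
median = sorted(master)[n_samples // 2]
labels = [1 if x > median else 0 for x in master]  # artificial phenotype labels

def resample(size):
    """Draw a smaller data set (sample columns plus labels), with replacement."""
    idx = [random.randrange(n_samples) for _ in range(size)]
    return [[row[i] for i in idx] for row in data], [labels[i] for i in idx]

small_data, small_labels = resample(20)
print(len(small_data), len(small_data[0]), len(small_labels))
```

    Repeating `resample` yields the large batch of small data sets on which the ranking performance of each GSA method can be compared against the known master gene.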

  1. Setting up of a cerebral visual impairment clinic for children: Challenges and future developments.

    PubMed

    Philip, Swetha Sara

    2017-01-01

    The aim of this study is to describe the setting up of a cerebral visual impairment (CVI) clinic in a tertiary care hospital in South India and to describe the spectrum of cases seen. The CVI clinic, set up in February 2011, receives interdisciplinary input from a core team involving a pediatrician, neurologist, psychiatrist, occupational therapist, pediatric ophthalmologist, and an optometrist. All children, <18 years of age, with cerebral palsy (CP), learning disability, autism, neurodegenerative diseases, and brain trauma are referred to the clinic for functional vision assessment and opinion for further management. One thousand four hundred and seventy-eight patients were seen in the CVI clinic from February 2011 to September 2015. Eighty-five percent of the patients were from different parts of India. In the clinic, 61% had CP, 28% had seizure disorders, autism was seen in 9.5%, and learning disability, neurodegenerative conditions, and brain injury together constituted 1.5%. Most of the children (45%) had moderate CP. Forty percent of CVI was due to birth asphyxia, but about 20% had no known cause for CVI. Seventy percent of the patients who came back for follow-up were carrying out the habilitation strategies suggested. Average attendance of over 300 new patients a year suggests a definite need for CVI clinics in the country. These children need specialized care to handle their complex needs. Although difficult to coordinate, an interdisciplinary team including the support groups and voluntary organizations is needed to facilitate the successful implementation of such specialized service.

  2. Setting up of a cerebral visual impairment clinic for children: Challenges and future developments

    PubMed Central

    Philip, Swetha Sara

    2017-01-01

    Aim: The aim of this study is to describe the setting up of a cerebral visual impairment (CVI) clinic in a tertiary care hospital in South India and to describe the spectrum of cases seen. Materials and Methods: The CVI clinic, set up in February 2011, receives interdisciplinary input from a core team involving a pediatrician, neurologist, psychiatrist, occupational therapist, pediatric ophthalmologist, and an optometrist. All children, <18 years of age, with cerebral palsy (CP), learning disability, autism, neurodegenerative diseases, and brain trauma are referred to the clinic for functional vision assessment and opinion for further management. Results: One thousand four hundred and seventy-eight patients were seen in the CVI clinic from February 2011 to September 2015. Eighty-five percent of the patients were from different parts of India. In the clinic, 61% had CP, 28% had seizure disorders, autism was seen in 9.5%, and learning disability, neurodegenerative conditions, and brain injury together constituted 1.5%. Most of the children (45%) had moderate CP. Forty percent of CVI was due to birth asphyxia, but about 20% had no known cause for CVI. Seventy percent of the patients who came back for follow-up were carrying out the habilitation strategies suggested. Conclusions: Average attendance of over 300 new patients a year suggests a definite need for CVI clinics in the country. These children need specialized care to handle their complex needs. Although difficult to coordinate, an interdisciplinary team including the support groups and voluntary organizations is needed to facilitate the successful implementation of such specialized service. PMID:28300737

  3. Introduction to Fuzzy Set Theory

    NASA Technical Reports Server (NTRS)

    Kosko, Bart

    1990-01-01

    An introduction to fuzzy set theory is described. Topics covered include: neural networks and fuzzy systems; the dynamical systems approach to machine intelligence; intelligent behavior as adaptive model-free estimation; fuzziness versus probability; fuzzy sets; the entropy-subsethood theorem; adaptive fuzzy systems for backing up a truck-and-trailer; product-space clustering with differential competitive learning; and adaptive fuzzy system for target tracking.

  4. The Use of Instructional and Motivational Self-Talk in Setting up a Physical Education Lesson

    ERIC Educational Resources Information Center

    Zourbanos, Nikos

    2013-01-01

    The main purpose of this article is to provide guidelines to physical educators for setting up a self-talk program during their lesson. The article briefly presents definitions of self-talk and research findings in sport and physical education to highlight the important benefits of positive self-talk in enhancing task performance. It also provides…

  5. Setting up a clinic to assess children and young people for female genital mutilation.

    PubMed

    Hodes, Deborah; Creighton, Sarah M

    2017-02-01

    It is now mandatory for health and social care professionals and teachers to report to the police all under-18s where female genital mutilation (FGM) has been disclosed by the child or where physical signs of FGM are seen. Such referrals are likely to result in a request for medical examination. New multiagency statutory guidance sets out instructions for physical examination but provides no details of how services should be set up. This review gives practical guidance learnt from the first year of the UK's only dedicated children's FGM service. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  6. Considerations for setting up an order entry system for nuclear medicine tests.

    PubMed

    Hara, Narihiro; Onoguchi, Masahisa; Nishida, Toshihiko; Honda, Minoru; Houjou, Osamu; Yuhi, Masaru; Takayama, Teruhiko; Ueda, Jun

    2007-12-01

    Integrating the Healthcare Enterprise-Japan (IHE-J) was established in Japan in 2001 and has been working to standardize health information and make it accessible on the basis of the fundamental Integrating the Healthcare Enterprise (IHE) specifications. However, because specialized operations are used in nuclear medicine tests, online sharing of patient information and test order information from the order entry system, as described by the scheduled workflow (SWF), is difficult, making information inconsistent throughout the facility and uniform management of patient information impossible. Therefore, we examined the basic design (subsystem design) for order entry systems, which are an important aspect of information management for nuclear medicine tests and need to be consistent with the system used throughout the rest of the facility. Many items are required of the subsystem when setting up an order entry system for nuclear medicine tests. Among these, the most important are exclusion settings, which handle differences in the conditions for using radiopharmaceuticals and contrast agents, and appointment frame settings, which handle differences in imaging methods and test items. To establish uniform management of patient information for nuclear medicine tests throughout the facility, it is necessary to develop an order entry system with exclusion settings and appointment frames as standard features. Thereby, integration of health information with the Radiology Information System (RIS) or Picture Archiving and Communication System (PACS) based on Digital Imaging and Communications in Medicine (DICOM) standards and real-time health care assistance can be attained, achieving the IHE agenda of improving health care service and efficiently sharing information.

  7. Reference set design for relational modeling of fuzzy systems

    NASA Astrophysics Data System (ADS)

    Lapohos, Tibor; Buchal, Ralph O.

    1994-10-01

    One of the keys to the successful relational modeling of fuzzy systems is the proper design of fuzzy reference sets, a topic discussed throughout the literature. In the framework of modeling a stochastic system, we analyze the problem numerically. First, we briefly describe the relational model and present the performance of the modeling in the most trivial case: the reference sets are triangle shaped. Next, we present a known fuzzy reference set generator algorithm (FRSGA), which is based on the fuzzy c-means (Fc-M) clustering algorithm. In the second section of this chapter we improve the previous FRSGA by adding a constraint to the Fc-M algorithm (modified Fc-M, or MFc-M): two cluster centers are forced to coincide with the domain limits. This is needed to obtain properly shaped extreme linguistic reference values. We apply this algorithm to uniformly discretized domains of the variables involved. The fuzziness of the reference sets produced by both Fc-M and MFc-M is determined by a parameter, which in our experiments is modified iteratively. Each time, a new model is created and its performance analyzed. For certain parameter values, both algorithms have shortcomings. To eliminate the drawbacks of these two approaches, we develop a completely new reference set generator algorithm, which we call Polyline. This algorithm and its performance are described in the last section. In all three cases, the modeling is performed for a variety of operators used in the inference engine and two defuzzification methods. Our results therefore depend neither on the system model order nor on the experimental setup.
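    The triangle-shaped baseline, including the constraint that the two extreme reference values coincide with the domain limits, can be sketched as follows (a minimal illustration, not the authors' Fc-M-based generator):

```python
def triangular_reference_sets(domain_min, domain_max, n_sets):
    """Centres of n_sets triangular reference sets, evenly spaced, with the
    two extreme centres pinned to the domain limits (the MFc-M constraint)."""
    step = (domain_max - domain_min) / (n_sets - 1)
    return [domain_min + i * step for i in range(n_sets)]

def memberships(x, centres):
    """Triangular membership of x in each reference set; adjacent sets
    overlap so that the memberships sum to 1 everywhere on the domain."""
    mu = []
    for i, c in enumerate(centres):
        left = centres[i - 1] if i > 0 else None
        right = centres[i + 1] if i + 1 < len(centres) else None
        if x == c:
            mu.append(1.0)
        elif left is not None and left < x < c:
            mu.append((x - left) / (c - left))    # rising edge
        elif right is not None and c < x < right:
            mu.append((right - x) / (right - c))  # falling edge
        else:
            mu.append(0.0)
    return mu

centres = triangular_reference_sets(0.0, 10.0, 5)
print(centres, memberships(3.0, centres))
```

    Varying the number of sets or the overlap is the kind of knob the generator algorithms in the abstract tune iteratively before rebuilding and re-evaluating the relational model.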

  8. Two-Dimensional Modeling of Heat and Moisture Dynamics in Swedish Roads: Model Set up and Parameter Sensitivity

    NASA Astrophysics Data System (ADS)

    Rasul, H.; Wu, M.; Olofsson, B.

    2017-12-01

    Modelling moisture and heat changes in road layers is very important for understanding road hydrology and for better construction and maintenance of roads in a sustainable manner. In cold regions, the freezing/thawing process in the partially saturated road material makes the modeling task more complicated than a simple model of flow through porous media without phase change in the pores. This study presents a 2-D model simulation of a highway section that considers freezing/thawing and vapor changes. Partial differential equations (PDEs) are used to formulate the model. Parameters are optimized from modelling results based on measured data from a test station on the E18 highway near Stockholm. The impact of considering phase change in the modelling is assessed by comparing the modeled soil moisture with TDR-measured data. The results show that the model can be used to predict water and ice content in different layers of the road and in different seasons. Parameter sensitivities are analyzed by implementing a calibration strategy. In addition, the phase change consideration is evaluated by comparing the PDE model with another model that does not consider freezing/thawing in roads. The PDE model shows high potential for understanding the moisture dynamics in the road system.
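    A greatly simplified 1-D analogue of such a model, plain heat conduction through the road profile with no moisture coupling and no freeze/thaw phase change, can be sketched with explicit finite differences; all material parameters and boundary temperatures below are illustrative:

```python
n, dz, dt = 50, 0.02, 60.0   # 50 nodes over 1 m of depth, 60 s time steps
alpha = 1.0e-6               # thermal diffusivity (m^2/s), illustrative value
r = alpha * dt / dz**2       # explicit scheme is stable only for r < 0.5
assert r < 0.5

T = [5.0] * n                     # initial profile: 5 deg C everywhere
for _ in range(24 * 60):          # march forward one day in 60 s steps
    T[0], T[-1] = -5.0, 5.0       # cold road surface, warm subgrade
    T = [T[0]] + [T[i] + r * (T[i + 1] - 2 * T[i] + T[i - 1])
                  for i in range(1, n - 1)] + [T[-1]]

print(f"mid-depth temperature after one day: {T[n // 2]:.2f} deg C")
```

    The full road model couples such a heat equation to moisture transport and a latent-heat term for freezing/thawing; that coupling is exactly what the study's PDE comparison evaluates.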

  9. Improving a Lecture-Size Molecular Model Set by Repurposing Used Whiteboard Markers

    ERIC Educational Resources Information Center

    Dragojlovic, Veljko

    2015-01-01

    Preparation of an inexpensive model set from whiteboard markers and either an HGS molecular model set or atoms made of wood is described. The model set is relatively easy to prepare and is sufficiently large to be suitable as an instructor set for use in lectures.

  10. The Thick Level-Set model for dynamic fragmentation

    DOE PAGES

    Stershic, Andrew J.; Dolbow, John E.; Moës, Nicolas

    2017-01-04

    The Thick Level-Set (TLS) model is implemented to simulate brittle media undergoing dynamic fragmentation. This non-local model is discretized by the finite element method with damage represented as a continuous field over the domain. A level-set function defines the extent and severity of damage, and a length scale is introduced to limit the damage gradient. Numerical studies in one dimension demonstrate that the proposed method reproduces the rate-dependent energy dissipation and fragment length observations from analytical, numerical, and experimental approaches. Additional studies emphasize the importance of appropriate bulk constitutive models and sufficient spatial resolution of the length scale.

  11. Astronaut Andrew M. Allen, mission commander, sets up systems for a television downlink on the

    NASA Technical Reports Server (NTRS)

    1996-01-01

    STS-75 ONBOARD VIEW --- Astronaut Andrew M. Allen, mission commander, sets up systems for a television downlink on the flight deck of the Space Shuttle Columbia. Allen was joined by four other astronauts and an international payload specialist for more than 16 days of research aboard Columbia. The photograph was taken with a 70mm handheld camera.

  12. Specific heat measurement set-up for quench condensed thin superconducting films.

    PubMed

    Poran, Shachaf; Molina-Ruiz, Manel; Gérardin, Anne; Frydman, Aviad; Bourgeois, Olivier

    2014-05-01

    We present a set-up designed for the measurement of specific heat of very thin or ultra-thin quench condensed superconducting films. In an ultra-high vacuum chamber, materials of interest can be thermally evaporated directly on a silicon membrane regulated in temperature from 1.4 K to 10 K. On this membrane, a heater and a thermometer are lithographically fabricated, allowing the measurement of heat capacity of the quench condensed layers. This apparatus permits the simultaneous thermal and electrical characterization of successively deposited layers in situ without exposing the deposited materials to room temperature or atmospheric conditions, both being irreversibly harmful to the samples. This system can be used to study specific heat signatures of phase transitions through the superconductor to insulator transition of quench condensed films.

  13. Rank the voltage across light bulbs … then set up the live experiment

    NASA Astrophysics Data System (ADS)

    Jacobs, Greg C.

    2018-02-01

    The Tasks Inspired by Physics Education Research (TIPERS) workbooks pose questions in styles quite different from the end-of-chapter problems that those of us of a certain age were assigned back in the days before Netscape. My own spin on TIPERS is not just to do them on paper, but to have students set up the situations in the laboratory to verify, or contradict, their paper solutions. The circuits unit is particularly conducive to creating quick-and-dirty lab setups that demonstrate the result of conceptually framed problems.

  14. An experimental methodology for a fuzzy set preference model

    NASA Technical Reports Server (NTRS)

    Turksen, I. B.; Willson, Ian A.

    1992-01-01

    A flexible fuzzy set preference model first requires approximate methodologies for implementation. Fuzzy sets must be defined for each individual consumer using computer software, requiring a minimum of time and expertise on the part of the consumer. The amount of information needed in defining sets must also be established. The model itself must adapt fully to the subject's choice of attributes (vague or precise), attribute levels, and importance weights. The resulting individual-level model should be fully adapted to each consumer. The methodologies needed to develop this model will be equally useful in a new generation of intelligent systems which interact with ordinary consumers, controlling electronic devices through fuzzy expert systems or making recommendations based on a variety of inputs. The power of personal computers and their acceptance by consumers have yet to be fully utilized to create interactive knowledge systems that fully adapt their function to the user. Understanding individual consumer preferences is critical to the design of new products and the estimation of demand (market share) for existing products, which in turn is an input to management systems concerned with production and distribution. The question of what to make, for whom to make it, and how much to make requires an understanding of the customer's preferences and the trade-offs that exist between alternatives. Conjoint analysis is a widely used methodology which decomposes an overall preference for an object into a combination of preferences for its constituent parts (attributes such as taste and price), which are combined using an appropriate combination function. Preferences are often expressed using linguistic terms which cannot be represented in conjoint models. Current models are also not implemented at an individual level, making it difficult to reach meaningful conclusions about the cause of an individual's behavior from an aggregate model.
The combination of complex aggregate
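
    The additive combination function at the core of conjoint analysis, as described above, can be sketched in a few lines; the attribute names and part-worth values below are invented for illustration.

```python
def additive_conjoint_utility(profile, part_worths):
    """Additive combination function from conjoint analysis: overall
    preference is the sum of part-worth utilities of the attribute
    levels making up a product profile."""
    return sum(part_worths[attr][level] for attr, level in profile.items())

# Hypothetical part-worths estimated for one consumer.
part_worths = {
    "taste": {"mild": 0.2, "strong": 0.8},
    "price": {"low": 0.9, "high": 0.1},
}
utility = additive_conjoint_utility({"taste": "strong", "price": "low"}, part_worths)
```

    A fuzzy extension would replace the crisp levels ("low", "high") with membership degrees over linguistic terms, which is precisely what crisp conjoint models cannot represent.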

  15. Simulating run-up on steep slopes with operational Boussinesq models; capabilities, spurious effects and instabilities

    NASA Astrophysics Data System (ADS)

    Løvholt, F.; Lynett, P.; Pedersen, G.

    2013-06-01

    Tsunamis induced by rock slides plunging into fjords constitute a severe threat to local coastal communities. The rock slide impact may give rise to highly non-linear waves in the near field, and because the wave lengths are relatively short, frequency dispersion comes into play. Fjord systems are rugged with steep slopes, and modeling non-linear dispersive waves in this environment with simultaneous run-up is demanding. We have run an operational Boussinesq-type TVD (total variation diminishing) model using different run-up formulations. Two different tests are considered: inundation on steep slopes and propagation in a trapezoidal channel. In addition, a set of Lagrangian models serves as reference models. Demanding test cases were applied, with solitary wave amplitudes ranging from 0.1 to 0.5 and slopes ranging from 10 to 50°. Different run-up formulations yielded clearly different accuracy and stability, and only some provided accuracy similar to that of the reference models. The test cases revealed that the model was prone to instabilities for large non-linearity and fine resolution. Some of the instabilities were linked to false breaking during the first positive inundation, which was not observed for the reference models. None of the models were able to handle the bore forming during drawdown, however. The instabilities are linked to short-crested undulations on the grid scale and appear at fine resolution during inundation. As a consequence, convergence was not always obtained. There is reason to believe that the instability may be a general problem for Boussinesq models in fjords.

  16. A Logical Difficulty of the Parameter Setting Model.

    ERIC Educational Resources Information Center

    Sasaki, Yoshinori

    1990-01-01

    Seeks to prove that the parameter setting model (PSM) of Chomsky's Universal Grammar theory contains an internal contradiction when it is seriously taken to model the internal state of language learners. (six references) (JL)

  17. Fully Characterizing Axially Symmetric Szekeres Models with Three Data Sets

    NASA Astrophysics Data System (ADS)

    Célérier, Marie-Nöelle; Mishra, Priti; Singh, Tejinder P.

    2015-01-01

    Inhomogeneous exact solutions of General Relativity with zero cosmological constant have been used in the literature to challenge the ΛCDM model. From one-patch Lemaître-Tolman-Bondi (LTB) models to axially symmetric quasi-spherical Szekeres (QSS) Swiss-cheese models, some of them are able to reproduce the cosmological data to good accuracy. It has been shown in the literature that a zero-Λ LTB model with a central observer can be fully determined by two data sets. We demonstrate that an axially symmetric zero-Λ QSS model with an observer located at the origin can be fully reconstructed from three data sets: number counts, luminosity distance, and redshift drift. This is a first step towards a future demonstration involving five data sets and the most general Szekeres model.

  18. Effectiveness of Goal-Setting Telephone Follow-Up on Health Behaviors of Patients with Ischemic Stroke: A Randomized Controlled Trial.

    PubMed

    Wan, Li-Hong; Zhang, Xiao-Pei; Mo, Miao-Miao; Xiong, Xiao-Ni; Ou, Cui-Ling; You, Li-Ming; Chen, Shao-Xian; Zhang, Min

    2016-09-01

    Adopting healthy behaviors is critical for secondary stroke prevention, but many patients fail to follow national guidelines regarding diet, exercise, and abstinence from risk factors. Compliance often decreases with time after hospital discharge, yet few studies have examined programs promoting long-term adherence to health behaviors. Goal setting and telephone follow-up have been proven to be effective in other areas of medicine, so this study evaluated the effectiveness of a guideline-based, goal-setting telephone follow-up program for patients with ischemic stroke. This was a multicenter, assessor-blinded, parallel-group, randomized controlled trial. Ninety-one stroke patients were randomized to either a control group or an intervention group. Intervention consisted of predischarge education and 3 goal-setting follow-up sessions conducted by phone. Data were collected at baseline and during the third and sixth months after hospital discharge. Six months after discharge, patients in the intervention group exhibited significantly higher medication adherence than patients in the control group. There were no statistically significant differences in physical activity, nutrition, low-salt diet adherence, blood pressure monitoring, smoking abstinence, unhealthy use of alcohol, and modified Rankin Scale (mRS) scores between the 2 groups. Goal-setting telephone follow-up intervention for ischemic stroke patients is feasible and leads to improved medication adherence. However, the lack of group differences in other health behavior subcategories and in the mRS score indicates a need for more effective intervention strategies to help patients reach guideline-recommended targets. Copyright © 2016 National Stroke Association. Published by Elsevier Inc. All rights reserved.

  19. Three Collaborative Models for Scaling Up Evidence-Based Practices

    PubMed Central

    Roberts, Rosemarie; Jones, Helen; Marsenich, Lynne; Sosna, Todd; Price, Joseph M.

    2015-01-01

    The current paper describes three models of research-practice collaboration to scale-up evidence-based practices (EBP): (1) the Rolling Cohort model in England, (2) the Cascading Dissemination model in San Diego County, and (3) the Community Development Team model in 53 California and Ohio counties. Multidimensional Treatment Foster Care (MTFC) and KEEP are the focal evidence-based practices that are designed to improve outcomes for children and families in the child welfare, juvenile justice, and mental health systems. The three scale-up models each originated from collaboration between community partners and researchers with the shared goal of wide-spread implementation and sustainability of MTFC/KEEP. The three models were implemented in a variety of contexts; Rolling Cohort was implemented nationally, Cascading Dissemination was implemented within one county, and Community Development Team was targeted at the state level. The current paper presents an overview of the development of each model, the policy frameworks in which they are embedded, system challenges encountered during scale-up, and lessons learned. Common elements of successful scale-up efforts, barriers to success, factors relating to enduring practice relationships, and future research directions are discussed. PMID:21484449

  20. Setting up psychiatric services: cross-cultural issues in planning and delivery.

    PubMed

    Bhugra, D

    1997-01-01

    There is convincing evidence that in the UK, various ethnic minorities are over-represented in psychiatric hospitals, prisons and special hospitals. Various explanations have been put forward for this picture. The expanding emphasis on community care means that models for looking after ethnic communities should be innovative and work from a bottom-up approach. Psychiatric emphasis on clinical diagnoses means that services are developed according to the diagnoses rather than needs.

  1. The Objective Borderline method (OBM): a probability-based model for setting up an objective pass/fail cut-off score in medical programme assessments.

    PubMed

    Shulruf, Boaz; Turner, Rolf; Poole, Phillippa; Wilkinson, Tim

    2013-05-01

    The decision to pass or fail a medical student is a 'high stakes' one. The aim of this study is to introduce and demonstrate the feasibility and practicality of a new objective standard-setting method for determining the pass/fail cut-off score from borderline grades. Three methods for setting up pass/fail cut-off scores were compared: the Regression Method, the Borderline Group Method, and the new Objective Borderline Method (OBM). Using Year 5 students' OSCE results from one medical school, we established the pass/fail cut-off scores by the abovementioned three methods. The comparison indicated that the pass/fail cut-off scores generated by the OBM were similar to those generated by the more established methods (0.840 ≤ r ≤ 0.998; p < .0001). Based on theoretical and empirical analysis, we suggest that the OBM has advantages over existing methods in that it combines objectivity, realism, a robust empirical basis and, no less importantly, is simple to use.
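
    For context, the Borderline Group Method named above can be sketched in a few lines: the cut-off is taken as the mean score of examinees whom examiners rated "borderline". This is a sketch of one of the established comparison methods only; the OBM itself is probability-based and is not reproduced here, and the function name and data are illustrative.

```python
from statistics import mean

def borderline_group_cutoff(scores, grades):
    """Borderline Group Method: the pass/fail cut-off is the mean
    checklist score of examinees rated 'borderline' by examiners."""
    return mean(s for s, g in zip(scores, grades) if g == "borderline")

# Hypothetical OSCE station data.
scores = [50, 55, 60, 70, 80]
grades = ["fail", "borderline", "borderline", "pass", "pass"]
cutoff = borderline_group_cutoff(scores, grades)
```

    One known weakness of this method, which motivates alternatives such as the OBM, is its sensitivity to small borderline groups.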

  2. A Guide To Setting Up a Creative Art Experiences Program for Older Adults with Developmental Disabilities.

    ERIC Educational Resources Information Center

    Harlan, Jane E.

    This guide is intended to help agencies serving older adults with mental retardation and other developmental disabilities in setting up a relatively inexpensive creative art program. The first section presents a rationale for creative art experiences for this population and then provides specific information on program development, including…

  3. Setting Up a Veterinary Medicine Skills Lab in Germany

    PubMed Central

    Dilly, Marc; Tipold, Andrea; Schaper, Elisabeth; Ehlers, Jan P.

    2014-01-01

    The amendments introduced to the current Veterinary Licensing Ordinance (TAppV) by the Veterinary Licensing Regulation (TAppO) have brought a high degree of skills orientation to fill the gap between academic study and preparing for a wide range of professional skills. In order to improve the veterinary skills of students while conveying fundamental methods in a structured and reproducible way, the University of Veterinary Medicine Hannover, Foundation, has set up the first central veterinary skills lab in Germany. Practical training is provided by means of a three-tier delivery approach. This involves around 40 simulators on an area of approx. 800 m² under the guidance of 6-8 staff members, along with supplementary resources such as posters, text instructions and YouTube videos. Since it opened in March 2013, there have been 769 visits to the skills lab and 30,734 hits on YouTube. Initial results show that the skills lab helps to maintain student motivation by teaching them practical skills at an early stage of the basic study-based acquisition of knowledge, whilst reinforcing skills acquisition per se in competence-based teaching. It enables veterinary students to prepare for their first examinations and treatments of live patients in a manner compliant with animal welfare. PMID:24872855

  4. Setting up your own business. Facing the future as an entrepreneur.

    PubMed

    Brent, N J

    1990-01-01

    Other areas of setting up and running a business also are important to explore, especially if the business plans to use employees. You will become an employer, and you must be familiar with rules and regulations that include areas such as the employee's right to a safe workplace, worker's compensation laws, unemployment compensation laws and tax liabilities, antidiscrimination laws, and wage and tax laws. If independent contractors are going to be used, you must recognize that well-developed contracts are a necessity. If you are going to market a new product, consult with an attorney whose practice concentrates in trademark and patent law before the product is shared with others. Being well informed about the proposed business venture, not only before its establishment but as it develops and grows, can help you be in the best position to have a successful business.

  5. Setting up and running an advanced light microscopy and imaging facility.

    PubMed

    Sánchez, Carlos; Muñoz, Ma Ángeles; Villalba, Maite; Labrador, Verónica; Díez-Guerra, F Javier

    2011-07-01

    During the last twenty years, interest in light microscopy and imaging techniques has grown in various fields, such as molecular and cellular biology, developmental biology, and neurobiology. In addition, the number of scientific articles and journals using these techniques is rapidly increasing. Nowadays, most research institutions require sophisticated microscopy systems to cover their investigation demands. In general, such instruments are too expensive and complex to be purchased and managed by a single laboratory or research group, so they have to be shared with other groups and supervised by specialized personnel. This is the reason why microscopy and imaging facilities are becoming so important at research institutions nowadays. In this unit, we have gathered and presented a number of issues and considerations from our own experience that we hope will be helpful when planning or setting up a new facility.

  6. Application of the Monte Carlo method for building up models for octanol-water partition coefficient of platinum complexes

    NASA Astrophysics Data System (ADS)

    Toropov, Andrey A.; Toropova, Alla P.

    2018-06-01

    A predictive model of logP for Pt(II) and Pt(IV) complexes, built with the Monte Carlo method using the CORAL software, has been validated with six different splits into training and validation sets. The predictive potential of the models for the six splits was improved using the so-called index of ideality of correlation. The suggested models make it possible to extract molecular features that cause an increase or, vice versa, a decrease in logP.

  7. 3 N.J. Community Colleges Set Up New Programs as Part of State's Plans for Welfare Reform.

    ERIC Educational Resources Information Center

    Jaschik, Scott

    1987-01-01

    Bergen Community College, Middlesex County College, and Union County College are setting up (1) counseling programs to help welfare recipients determine their job interests and skills, (2) job-training courses, and (3) day-care centers for participants' children. (MLW)

  8. The effect of different foot and hand set-up positions on backstroke start performance.

    PubMed

    de Jesus, Karla; de Jesus, Kelly; Abraldes, J Arturo; Mourão, Luis; Borgonovo-Santos, Márcio; Medeiros, Alexandre I A; Gonçalves, Pedro; Chainok, Phornpot; Fernandes, Ricardo J; Vaz, Mário A P; Vilas-Boas, João Paulo

    2016-11-01

    Foot and hand set-up position effects were analysed on backstroke start performance. Ten swimmers randomly completed 27 starts grouped in trials (n = 3) of each variation, changing foot (totally immersed, partially and totally emerged) and hand (lowest, highest horizontal and vertical) positioning. Fifteen cameras recorded kinematics, and four force plates collected hand and foot kinetics. Standardised mean differences and 95% confidence intervals were used. Variations with feet immersed showed a lower vertical centre of mass (CM) set-up position (0.16 m) and lower vertical impulse exerted at the hands and horizontal and vertical impulse exerted at the feet (0.28, 0.41, 0.16 N/BW.s, respectively) than feet emerged with hands horizontally and vertically positioned. Most variations with feet partially emerged exhibited higher and lower vertical impulse exerted at the hands than feet immersed and emerged (e.g. vertical handgrip, 0.13, 0.15 N/BW.s, respectively). The variation with feet emerged and hands on the lowest horizontal handgrip depicted shorter horizontal (0.23, 0.26 m) and vertical CM positioning at flight (0.16, 0.15 m) than the highest horizontal and vertical handgrip, respectively. Start variations did not affect the 15-m time. Variations with feet partially or totally emerged showed advantages, but focusing on the entry and underwater biomechanics is relevant for a shorter start time.

  9. Using the level set method in slab detachment modeling

    NASA Astrophysics Data System (ADS)

    Hillebrand, B.; Geenen, T.; Spakman, W.; van den Berg, A. P.

    2012-04-01

    Slab detachment plays an important role in the dynamics of several regions in the world, such as the Mediterranean-Carpathian region and the Anatolia-Aegean region. It is therefore important to gain better insights into the various aspects of this process by further modeling of this phenomenon. In this study we model slab detachment using a visco-plastic composite rheology consisting of diffusion, dislocation and Peierls creep. In order to gain more control over this visco-plastic composite rheology, as well as some deterministic advantages, the models presented in this study make use of the level set method (Osher and Sethian, J. Comp. Phys., 1988). The level set method is a computational method to track interfaces. It works by creating a signed distance function which is zero at the interface of interest and which is then advected by the flow field. This not only allows one to track the interface but also to determine on which side of the interface a certain point is located, since the level set function is defined in the entire domain and not just on the interface. The level set method is used in a wide variety of scientific fields, including geophysics. In this study we use the level set method to keep track of the interface between the slab and the mantle. This allows us to determine more precisely the moment and depth of slab detachment. It also allows us to clearly distinguish the mantle from the slab and therefore have more control over their different rheologies. We focus on the role of Peierls creep in the slab detachment process and on the use of the level set method in modeling this process.
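
    The core idea described above (a signed distance function, zero at the interface, advected by the flow field) can be sketched in one dimension. This is a generic illustration with a first-order upwind scheme and invented parameters, not the study's 3-D geodynamic implementation.

```python
import numpy as np

def advect_level_set(phi, u, dx, dt, steps):
    """Advect a 1-D level-set function with constant velocity u using a
    first-order upwind scheme. The interface is wherever phi == 0; the
    sign of phi tells which side of the interface a point lies on."""
    phi = phi.copy()
    for _ in range(steps):
        if u > 0:
            dphi = (phi - np.roll(phi, 1)) / dx   # backward difference
            dphi[0] = dphi[1]                     # crude inflow boundary
        else:
            dphi = (np.roll(phi, -1) - phi) / dx  # forward difference
            dphi[-1] = dphi[-2]
        phi -= dt * u * dphi
    return phi

x = np.linspace(0.0, 1.0, 201)
dx = x[1] - x[0]
phi = x - 0.3            # signed distance: interface starts at x = 0.3
u = 0.1                  # constant advection velocity
dt = 0.5 * dx / abs(u)   # CFL-stable time step
phi = advect_level_set(phi, u, dx, dt, steps=int(round(2.0 / dt)))
interface = x[np.argmin(np.abs(phi))]   # interface position after t = 2.0
```

    After advecting for t = 2.0 the zero crossing has moved by u·t = 0.2; points with negative phi remain on one side of the interface, positive on the other, which is the property the slab/mantle distinction relies on.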

  10. 41 CFR 102-34.285 - Where can we obtain help in setting up a maintenance program?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    Title 41, Public Contracts and Property Management; Part 102-34, Motor Vehicle Management; Scheduled Maintenance of Motor Vehicles; § 102-34.285: Where can we obtain help in setting up a maintenance program?

  11. A Python tool to set up relative free energy calculations in GROMACS.

    PubMed

    Klimovich, Pavel V; Mobley, David L

    2015-11-01

    Free energy calculations based on molecular dynamics (MD) simulations have seen tremendous growth in the last decade. However, it is still difficult and tedious to set them up in an automated manner, as the majority of the present-day MD simulation packages lack that functionality. Relative free energy calculations are a particular challenge for several reasons, including the problem of finding a common substructure and mapping the transformation to be applied. Here we present a tool, alchemical-setup.py, that automatically generates all the input files needed to perform relative solvation and binding free energy calculations with the MD package GROMACS. When combined with Lead Optimization Mapper (LOMAP; Liu et al. in J Comput Aided Mol Des 27(9):755-770, 2013), recently developed in our group, alchemical-setup.py allows fully automated setup of relative free energy calculations in GROMACS. Taking a graph of the planned calculations and a mapping, both computed by LOMAP, our tool generates the topology and coordinate files needed to perform relative free energy calculations for a given set of molecules, and provides a set of simulation input parameters. The tool was validated by performing relative hydration free energy calculations for a handful of molecules from the SAMPL4 challenge (Mobley et al. in J Comput Aided Mol Des 28(4):135-150, 2014). Good agreement with previously published results and the straightforward way in which free energy calculations can be conducted make alchemical-setup.py a promising tool for automated setup of relative solvation and binding free energy calculations.

  12. A managed clinical network for cardiac services: set-up, operation and impact on patient care

    PubMed Central

    Hamilton, Karen E. StC.; Sullivan, Frank M.; Donnan, Peter T.; Taylor, Rex; Ikenwilo, Divine; Scott, Anthony; Baker, Chris; Wyke, Sally

    2005-01-01

    Abstract Purpose To investigate the set-up and operation of a Managed Clinical Network for cardiac services and assess its impact on patient care. Methods This single case study used process evaluation with observational before-and-after comparison of indicators of quality of care and costs. The study was conducted in Dumfries and Galloway, Scotland, and used a three-level framework. Process evaluation of the network set-up and operation was carried out through a documentary review of minutes, guidelines and protocols, and through transcripts of fourteen semi-structured interviews with health service personnel including senior managers, general practitioners, nurses, cardiologists and members of the public. Outcome evaluation of the impact of the network was carried out through interrupted time series analysis of clinical data on 202 patients aged less than 76 years admitted to hospital with a confirmed myocardial infarction one year before and one year after the establishment of the network. The main outcome measures were differences in the indicators of quality of care targeted by network protocols. Economic evaluation covered the transaction costs of the set-up and operation of the network and the resource costs of the clinical care of the 202 myocardial infarction patients from the time of hospital admission to 6 months post discharge, through interrupted time series analysis. The outcome measure was the difference in National Health Service resource use. Results Despite early difficulties, the network was successful in bringing together clinicians, patients and managers to redesign services, exhibiting most features of good network management. The role of the energetic lead clinician was crucial, but the network took time to develop and 'bed down'. Its primary modus operandi was the development of a myocardial infarction pathway and associated protocols. Of sixteen clinical care indicators, two improved significantly following the launch of the network and nine showed improvements, which were not

  13. Does rational selection of training and test sets improve the outcome of QSAR modeling?

    PubMed

    Martin, Todd M; Harten, Paul; Young, Douglas M; Muratov, Eugene N; Golbraikh, Alexander; Zhu, Hao; Tropsha, Alexander

    2012-10-22

    Prior to using a quantitative structure-activity relationship (QSAR) model for external predictions, its predictive power should be established and validated. In the absence of a true external data set, the best way to validate the predictive ability of a model is to perform its statistical external validation. In statistical external validation, the overall data set is divided into training and test sets. Commonly, this splitting is performed using random division. Rational splitting methods can divide data sets into training and test sets in an intelligent fashion. The purpose of this study was to determine whether rational division methods lead to more predictive models compared to random division. A special data splitting procedure was used to facilitate the comparison between random and rational division methods. For each toxicity end point, the overall data set was divided into a modeling set (80% of the overall set) and an external evaluation set (20% of the overall set) using random division. The modeling set was then subdivided into a training set (80% of the modeling set) and a test set (20% of the modeling set) using rational division methods and by using random division. The Kennard-Stone, minimal test set dissimilarity, and sphere exclusion algorithms were used as the rational division methods. The hierarchical clustering, random forest, and k-nearest neighbor (kNN) methods were used to develop QSAR models based on the training sets. For kNN QSAR, multiple training and test sets were generated, and multiple QSAR models were built. The results of this study indicate that models based on rational division methods generate better statistical results for the test sets than models based on random division, but the predictive power of both types of models is comparable.
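
    The Kennard-Stone algorithm named above can be sketched as follows: seed the training set with the two most distant samples, then repeatedly add the sample whose distance to its nearest already-selected neighbour is largest, so the training set covers descriptor space evenly. This is a minimal illustration on toy data; production QSAR pipelines typically scale descriptors before computing distances.

```python
import numpy as np

def kennard_stone_split(X, n_train):
    """Kennard-Stone rational split: returns (train indices, test indices).
    Greedy max-min selection over pairwise Euclidean distances."""
    X = np.asarray(X, dtype=float)
    dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    # Seed with the two mutually most distant samples.
    i, j = np.unravel_index(np.argmax(dist), dist.shape)
    selected = [int(i), int(j)]
    remaining = [k for k in range(len(X)) if k not in selected]
    while len(selected) < n_train:
        # Add the sample farthest from its nearest selected neighbour.
        min_d = dist[np.ix_(remaining, selected)].min(axis=1)
        pick = remaining[int(np.argmax(min_d))]
        selected.append(pick)
        remaining.remove(pick)
    return selected, remaining

# Toy 1-D descriptor set: five molecules on a line.
train_idx, test_idx = kennard_stone_split([[0.0], [0.1], [0.5], [0.9], [1.0]], n_train=3)
```

    On this toy set the algorithm picks the two extremes and the midpoint for training, leaving the near-duplicates of the extremes as test samples, which is exactly the even-coverage behaviour the study evaluates.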

  14. Soybean canopy reflectance modeling data sets

    NASA Technical Reports Server (NTRS)

    Ranson, K. J.; Biehl, L. L.; Daughtry, C. S. T.

    1984-01-01

    Numerous mathematical models of the interaction of radiation with vegetation canopies have been developed over the last two decades. However, data with which to exercise and validate these models are scarce. During three days in the summer of 1980, experiments were conducted with the objective of gaining insight into the effects of solar illumination and view angles on soybean canopy reflectance. In concert with these experiments, extensive measurements of the soybean canopies were obtained. This document is a compilation of the bidirectional reflectance factors, agronomic characteristics, canopy geometry, and leaf, stem, and pod optical properties of the soybean canopies. These data sets should be suitable for use with most vegetation canopy reflectance models.

  15. Worms Eat My Garbage. How To Set Up and Maintain a Worm Composting System. First Edition.

    ERIC Educational Resources Information Center

    Appelhof, Mary

    This book is a resource for parents and teachers who want to teach about recycling and composting by setting up and maintaining a worm composting system. It is designed to be a detailed yet simple manual of vermicomposting. The manual covers the basics of vermicomposting and answers such questions as where to store a composting container, what…

  16. Inconsistent Strategies to Spin up Models in CMIP5: Implications for Ocean Biogeochemical Model Performance Assessment

    NASA Technical Reports Server (NTRS)

    Seferian, Roland; Gehlen, Marion; Bopp, Laurent; Resplandy, Laure; Orr, James C.; Marti, Olivier; Dunne, John P.; Christian, James R.; Doney, Scott C.; Ilyina, Tatiana

    2015-01-01

    During the fifth phase of the Coupled Model Intercomparison Project (CMIP5) substantial efforts were made to systematically assess the skill of Earth system models. One goal was to check how realistically representative marine biogeochemical tracer distributions could be reproduced by models. In routine assessments, model historical hindcasts were compared with available modern biogeochemical observations. However, these assessments considered neither how close modeled biogeochemical reservoirs were to equilibrium nor the sensitivity of model performance to initial conditions or to the spin-up protocols. Here, we explore how the large diversity in spin-up protocols used for marine biogeochemistry in CMIP5 Earth system models (ESMs) contributes to model-to-model differences in the simulated fields. We take advantage of a 500-year spin-up simulation of IPSL-CM5A-LR to quantify the influence of the spin-up protocol on model ability to reproduce relevant data fields. Amplification of biases in selected biogeochemical fields (O2, NO3, Alk-DIC) is assessed as a function of spin-up duration. We demonstrate that a relationship between spin-up duration and assessment metrics emerges from our model results and holds when confronted with a larger ensemble of CMIP5 models. This shows that drift has implications for performance assessment in addition to possibly aliasing estimates of climate change impact. Our study suggests that differences in spin-up protocols could explain a substantial part of model disparities, constituting a source of model-to-model uncertainty. This requires more attention in future model intercomparison exercises in order to provide quantitatively more correct ESM results on marine biogeochemistry and carbon cycle feedbacks.

  17. Numerical models for continental break-up: Implications for the South Atlantic

    NASA Astrophysics Data System (ADS)

    Beniest, A.; Koptev, A.; Burov, E.

    2017-03-01

    We propose a mechanism that explains in one unified framework the presence of continental break-up features, such as failed rift arms and high-velocity, high-density bodies, that occur along the South Atlantic rifted continental margins. We used 2D and 3D numerical models to investigate the impact of the thermo-rheological structure of the continental lithosphere and the initial plume position on continental rifting and break-up processes. 2D experiments show that break-up can be 1) "central", mantle plume-induced and located directly above the centre of the mantle anomaly, 2) "shifted", mantle plume-induced and shifted 50 to 200 km from the initial plume location, or 3) "distant", self-induced due to convection and/or slab subduction/delamination and offset 300 to 800 km from the original plume location. With a 3D, perfectly symmetrical and laterally homogeneous setup, the location of continental break-up can be shifted hundreds of kilometres from the initial position of the mantle anomaly. We demonstrate that in the case of shifted or distant continental break-up with respect to the original plume location, multiple features can be explained. The deep-seated source can remain below the continent at one or both sides of the newly formed ocean. This mantle material, glued underneath the margins at lower crustal levels, resembles the geometry and location of the high-velocity/high-density bodies observed along the South Atlantic conjugate margins. Impingement of vertically upwelled plume material on the base of the lithosphere results in pre-break-up topography variations located just above the initial anomaly impingement. These can be interpreted as the aborted rift features that are also observed along the rifted margins. When extension continues after continental break-up, high strain rates can relocalize; this relocation has so far been attributed to rift jumps.
Most importantly, this study shows that there is not one, single rift mode for plume-induced crustal break-up.

  18. A new region-edge based level set model with applications to image segmentation

    NASA Astrophysics Data System (ADS)

    Zhi, Xuhao; Shen, Hong-Bin

    2018-04-01

    The level set model has advantages in handling complex shapes and topological changes, and is widely used in image processing tasks. Image-segmentation-oriented level set models can be grouped into region-based models and edge-based models, each with merits and drawbacks. Region-based level set models rely on fitting the color intensity of separated regions, but are not sensitive to edge information. Edge-based level set models evolve by fitting local gradient information, but are easily affected by noise. We propose a region-edge based level set model, which incorporates saliency information into the energy function and fuses color intensity with local gradient information. The evolution of the proposed model is implemented by a hierarchical two-stage protocol, and the experimental results show flexible initialization, robust evolution and precise segmentation.
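
    The region term of such models can be illustrated with a minimal sketch. The code below implements a pure region-based (Chan-Vese style) level-set update on a toy image; it is not the authors' region-edge model (the saliency and gradient terms are omitted), and the image, initialization and step size are invented for illustration.

```python
import numpy as np

def chan_vese_region_step(img, phi, dt=1.0):
    """One explicit step of a pure region (Chan-Vese style) level-set update.

    The contour is the zero level set of phi; c1/c2 are the mean
    intensities inside (phi > 0) and outside the current contour.
    """
    inside = phi > 0
    c1 = img[inside].mean() if inside.any() else 0.0
    c2 = img[~inside].mean() if (~inside).any() else 0.0
    # Region force: raise phi where the pixel matches c1 better than c2.
    force = (img - c2) ** 2 - (img - c1) ** 2
    return phi + dt * force

# Toy image: bright 10x10 square on a dark background.
img = np.zeros((20, 20))
img[5:15, 5:15] = 1.0

# Initialize phi as a small disc inside the bright region.
yy, xx = np.mgrid[0:20, 0:20]
phi = 3.0 - np.hypot(yy - 10, xx - 10)

for _ in range(15):
    phi = chan_vese_region_step(img, phi)

segmentation = phi > 0  # the contour has grown to cover the bright square
```

    A full implementation would add the curvature (length) term and, in the authors' case, the saliency and edge terms; this sketch only shows the region-fitting mechanism.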

  19. Use of fuzzy sets in modeling of GIS objects

    NASA Astrophysics Data System (ADS)

    Mironova, Yu N.

    2018-05-01

    The paper discusses modeling and methods of data visualization in geographic information systems. Information processing in geoinformatics is based on the use of models, so geoinformation modeling is a key link in the chain of geodata processing. Solving problems with geographic information systems often requires representing approximate or insufficiently reliable information about map features in the GIS database. Heterogeneous data of different origin and accuracy carry some degree of uncertainty. In addition, not all information is precise: already during the initial measurements, poorly defined terms and attributes (e.g., "soil, well-drained") are used. Methods are therefore needed for working with uncertain requirements, classes and boundaries. The author proposes using fuzzy sets for spatial information. In terms of its characteristic (membership) function, a fuzzy set is a natural generalization of an ordinary set: the binary nature of this function is dropped, and it may take any value in the interval [0, 1].
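
    As a toy illustration of the idea, the sketch below grades a hypothetical "well-drained soil" attribute with a trapezoidal membership function; the drainage-rate scale and breakpoints are invented for the example, not taken from the paper.

```python
def trapezoid(x, a, b, c, d):
    """Trapezoidal membership function: 0 below a, rising to 1 on [b, c], falling to 0 at d."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (d - x) / (d - c)

# Hypothetical attribute: degree to which a soil counts as "well-drained",
# graded from a measured drainage rate in mm/h (illustrative breakpoints).
def well_drained(rate_mm_h):
    return trapezoid(rate_mm_h, 5, 15, 60, 100)

# Classical set operations generalize via min/max on membership grades.
def fuzzy_and(mu1, mu2): return min(mu1, mu2)
def fuzzy_or(mu1, mu2): return max(mu1, mu2)
```

    A rate of 10 mm/h gets the partial grade 0.5 rather than a hard in/out decision, which is exactly the generalization of the characteristic function described above.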

  20. Learning Data Set Influence on Identification Accuracy of Gas Turbine Neural Network Model

    NASA Astrophysics Data System (ADS)

    Kuznetsov, A. V.; Makaryants, G. M.

    2018-01-01

    Many studies identify gas turbine engines with dynamic neural network models; the identification process should minimize the error between the model and the real object. Questions about how the training data set is constructed are usually overlooked. This article presents a study of the influence of the data set type on the accuracy of a gas turbine neural network model. The identification object is a thermodynamic model of a micro gas turbine engine, with fuel consumption as the input signal and engine rotor rotation frequency as the output signal. Four types of input signal were used to create the training and testing data sets of the dynamic neural network models: step, fast, slow and mixed. Four dynamic neural networks were created, one for each type of training data set, and each neural network was tested against all four types of test data set. The resulting 16 transition processes from the four neural networks and four test data sets were compared with the corresponding solutions of the thermodynamic model, and the errors of all neural networks were compared for each test data set. The comparison shows the error value ranges for each test data set. These ranges are small, so the influence of data set type on identification accuracy is low.
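
    The four excitation types can be mocked up as simple signal generators. The sketch below is illustrative only: the rates, ranges and sampling are invented, and no claim is made that these match the signals used in the study.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(0.0, 10.0, 0.01)  # 10 s at 100 Hz (illustrative)

def step_signal(t, t0=2.0, lo=0.2, hi=0.8):
    """Step excitation: jump from lo to hi at time t0 (normalized fuel flow)."""
    return np.where(t < t0, lo, hi)

def ramp_signal(t, rate):
    """'Fast' or 'slow' excitation: a bounded ramp at the given rate."""
    return np.clip(0.2 + rate * t, 0.2, 0.8)

signals = {
    "step": step_signal(t),
    "fast": ramp_signal(t, rate=0.3),
    "slow": ramp_signal(t, rate=0.03),
}
# 'Mixed' concatenates segments of the other types in random order.
order = rng.permutation(["step", "fast", "slow"])
signals["mixed"] = np.concatenate([signals[k][:len(t) // 3] for k in order])
```

    Each signal would be fed to the thermodynamic model to produce rotor-frequency responses, giving one input/output training set per excitation type.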

  1. Wind-Induced Air-Flow Patterns in an Urban Setting: Observations and Numerical Modeling

    NASA Astrophysics Data System (ADS)

    Sattar, Ahmed M. A.; Elhakeem, Mohamed; Gerges, Bishoy N.; Gharabaghi, Bahram; Gultepe, Ismail

    2018-04-01

    City planning can have a significant effect on wind flow velocity patterns and thus on natural ventilation. Buildings with different heights are roughness elements that can affect the near- and far-field wind flow velocity. This paper aims at investigating the impact of an increase in building height on the nearby velocity fields. A prototype urban setting of buildings with two different heights (25 and 62.5 cm) is built and placed in a wind tunnel. Wind flow velocity around the buildings is mapped at different heights. Wind tunnel measurements are used to validate a 3D numerical Reynolds-averaged Navier-Stokes model. The validated model is further used to calculate the wind flow velocity patterns for cases with different building heights. It was found that increasing the height of some buildings in an urban setting can lead to the formation of large horseshoe vortices and eddies around building corners. A separation area is formed at the leeward side of the building, and the recirculation of air behind the building leads to the formation of slow rotation vortices. The opposite effect is observed in the wake (cavity) region of the buildings, where both the cavity length and width are significantly reduced, resulting in a pronounced increase in the wind flow velocity. A significant increase in the wind flow velocity in the wake region of tall buildings, with a value of up to 30%, is observed. The spatially averaged velocities around short buildings also increased by 25% compared with those around buildings with different heights. The increase in the height of some buildings is found to have a positive effect on the wind ventilation at the pedestrian level.

  2. A fuzzy set preference model for market share analysis

    NASA Technical Reports Server (NTRS)

    Turksen, I. B.; Willson, Ian A.

    1992-01-01

    Consumer preference models are widely used in new product design, marketing management, pricing, and market segmentation. The success of new products depends on accurate market share prediction and design decisions based on consumer preferences. The vague linguistic nature of consumer preferences and product attributes, combined with the substantial differences between individuals, creates a formidable challenge to marketing models. The most widely used methodology is conjoint analysis. Conjoint models, as currently implemented, represent linguistic preferences as ratio or interval-scaled numbers, use only numeric product attributes, and require aggregation of individuals for estimation purposes. It is not surprising that these models are costly to implement, are inflexible, and have a predictive validity that is not substantially better than chance. This affects the accuracy of market share estimates. A fuzzy set preference model can easily represent linguistic variables either in consumer preferences or product attributes with minimal measurement requirements (ordinal scales), while still estimating overall preferences suitable for market share prediction. This approach results in flexible individual-level conjoint models which can provide more accurate market share estimates from a smaller number of more meaningful consumer ratings. Fuzzy sets can be incorporated within existing preference model structures, such as a linear combination, using the techniques developed for conjoint analysis and market share estimation. The purpose of this article is to develop and fully test a fuzzy set preference model which can represent linguistic variables in individual-level models implemented in parallel with existing conjoint models. The potential improvements in market share prediction and predictive validity can substantially improve management decisions about what to make (product design), for whom to make it (market segmentation), and how much to make (market share).

  3. Evolving Non-Dominated Parameter Sets for Computational Models from Multiple Experiments

    NASA Astrophysics Data System (ADS)

    Lane, Peter C. R.; Gobet, Fernand

    2013-03-01

    Creating robust, reproducible and optimal computational models is a key challenge for theorists in many sciences. Psychology and cognitive science face particular challenges as large amounts of data are collected and many models are not amenable to analytical techniques for calculating parameter sets. Particular problems are to locate the full range of acceptable model parameters for a given dataset, and to confirm the consistency of model parameters across different datasets. Resolving these problems will provide a better understanding of the behaviour of computational models, and so support the development of general and robust models. In this article, we address these problems using evolutionary algorithms to develop parameters for computational models against multiple sets of experimental data; in particular, we propose the `speciated non-dominated sorting genetic algorithm' for evolving models in several theories. We discuss the problem of developing a model of categorisation using twenty-nine sets of data and models drawn from four different theories. We find that the evolutionary algorithms generate high quality models, adapted to provide a good fit to all available data.
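
    Non-dominated sorting, the core of NSGA-style algorithms such as the one proposed here, can be sketched in a few lines. This is a minimal illustration of the generic procedure, not the authors' speciated variant.

```python
def dominates(a, b):
    """a dominates b (minimization): no worse in every objective, better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated_sort(points):
    """Partition objective vectors (e.g. fit errors on each data set) into Pareto fronts."""
    remaining = list(points)
    fronts = []
    while remaining:
        front = [p for p in remaining
                 if not any(dominates(q, p) for q in remaining if q is not p)]
        fronts.append(front)
        remaining = [p for p in remaining if p not in front]
    return fronts

# Toy objective vectors for four candidate parameter sets.
fronts = non_dominated_sort([(1, 2), (2, 1), (2, 2), (3, 3)])
# fronts[0] is the Pareto front: [(1, 2), (2, 1)]
```

    In the article's setting, each objective would be a model's misfit to one experimental data set, and the first front collects the parameter sets for which no other candidate fits every data set at least as well.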

  4. The Roles of Feature-Specific Task Set and Bottom-Up Salience in Attentional Capture: An ERP Study

    ERIC Educational Resources Information Center

    Eimer, Martin; Kiss, Monika; Press, Clare; Sauter, Disa

    2009-01-01

    We investigated the roles of top-down task set and bottom-up stimulus salience for feature-specific attentional capture. Spatially nonpredictive cues preceded search arrays that included a color-defined target. For target-color singleton cues, behavioral spatial cueing effects were accompanied by cue-induced N2pc components, indicative of…

  5. Setting up a Low-Cost Lab Management System for a Multi-Purpose Computing Laboratory Using Virtualisation Technology

    ERIC Educational Resources Information Center

    Mok, Heng Ngee; Lee, Yeow Leong; Tan, Wee Kiat

    2012-01-01

    This paper describes how a generic computer laboratory equipped with 52 workstations is set up for teaching IT-related courses and other general purpose usage. The authors have successfully constructed a lab management system based on decentralised, client-side software virtualisation technology using Linux and free software tools from VMware that…

  6. Guide for SDEC Set up

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bibby, R; Guthrie, E

    2009-01-30

    The instrument has four collection vials that must be filled with ethylene glycol before operation. Each of the four vials should be labeled 1 through 4 and the empty weights recorded. Fill each vial with 80 mL of ethylene glycol and record the weight again. In order for the instrument to operate properly, the collection vials should always have less than 160 mL of total liquid in them. After completing a sample run, remove the collection vials, use a transfer pipette to remove any liquid that might still be on the air paddler, wipe off any condensation from the exterior of the collection vial and record the weight. From the instrument, record the ending volume and the time of operation. The solution mixed in the scintillation vial will be 2 mL of a 95% to 50% ethylene glycol to water mixture. To determine the efficiency of counting at all of these concentrations, a series of vials should be set up consisting of 18 mL of Ultima Gold LLT cocktail mixed with standard, regular deionized water and ethylene glycol. The efficiency curve should be counted in the 'Low Level' count mode with the Luminescence Correction ON and the Color Quench Correction ON. Once the tSIE values are determined, chart the cpm against the tSIE numbers and find the best fit for the data. The resulting equation is to be used for converting tSIE values from the collection vials to efficiency. To determine the background cpm value of the ethylene glycol, count a 2 mL sample of ethylene glycol with 18 mL of Ultima Gold for 100 minutes. To determine the total activity of the sample, take two 2 mL aliquots of sample from the first vial and place them in separate scintillation vials. Record the weight of each aliquot. Determine the percentage of the total sample each aliquot represents by dividing the aliquot weight by the total solution weight from the vial. 
    Also, determine the percentage of ethylene glycol in the sample by dividing the initial solution weight by the final solution weight and multiplying
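
    The cpm-versus-tSIE step can be sketched as follows. A quadratic fit is assumed here (the guide only says "find the best fit"), and the quench-standard numbers are purely illustrative, not values from the procedure.

```python
import numpy as np

# Hypothetical quench-standard measurements: tSIE quench index vs
# counting efficiency (standard cpm / known dpm). Illustrative values only.
tsie = np.array([300.0, 400.0, 500.0, 600.0, 700.0])
efficiency = np.array([0.18, 0.22, 0.25, 0.27, 0.28])

# Fit the efficiency curve; a quadratic is a common choice for quench curves.
coeffs = np.polyfit(tsie, efficiency, deg=2)
curve = np.poly1d(coeffs)

def activity_dpm(sample_cpm, sample_tsie, background_cpm=0.0):
    """Convert a collection-vial count rate to activity using the fitted curve."""
    return (sample_cpm - background_cpm) / curve(sample_tsie)
```

    The background cpm determined from the 100-minute ethylene glycol count would be passed as `background_cpm` before dividing by the efficiency.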

  7. Setting up an atmospheric-hydrologic model for seasonal forecasts of water flow into dams in a mountainous semi-arid environment (Cyprus)

    NASA Astrophysics Data System (ADS)

    Camera, Corrado; Bruggeman, Adriana; Zittis, Georgios; Hadjinicolaou, Panos

    2017-04-01

    Due to limited rainfall concentrated in the winter months and long dry summers, storage and management of water resources is of paramount importance in Cyprus. For water storage purposes, the Cyprus Water Development Department is responsible for the operation of 56 large dams (total volume of 310 Mm3) and 51 smaller reservoirs (total volume of 17 Mm3) over the island. Climate change is also expected to heavily affect Cyprus water resources, with a 1.5%-12% decrease in mean annual rainfall (Camera et al., 2016) projected for the period 2020-2050, relative to 1980-2010. This will make reliable seasonal water inflow forecasts even more important for water managers. The overall aim of this study is to set up the widely used Weather Research and Forecasting (WRF) model with its hydrologic extension (WRF-Hydro) for seasonal forecasts of water inflow into dams located in the Troodos Mountains of Cyprus. The specific objectives of this study are: i) the calibration and evaluation of WRF-Hydro for the simulation of stream flows in the Troodos Mountains for past rainfall seasons; ii) a sensitivity analysis of the model parameters; iii) a comparison of the application of the atmospheric-hydrologic modelling chain versus the use of climate observations as forcing. The hydrologic model is run in its off-line version with daily forcing over a 1-km grid, while the overland and channel routing is performed on a 100-m grid with a time step of 6 seconds. Model outputs are exported on a daily basis. First, WRF-Hydro is calibrated and validated over two 1-year periods (October-September), using a 1-km gridded observational precipitation dataset (Camera et al., 2014) as input. For the calibration and validation periods, years with annual rainfall close to the long-term average and with the presence of extreme rainfall and flow events were selected. A sensitivity analysis is performed for the following parameters: partitioning of rainfall into runoff and infiltration (REFKDT), the

  8. Setting conservation management thresholds using a novel participatory modeling approach.

    PubMed

    Addison, P F E; de Bie, K; Rumpff, L

    2015-10-01

    We devised a participatory modeling approach for setting management thresholds that show when management intervention is required to address undesirable ecosystem changes. This approach was designed to be used when management thresholds: must be set for environmental indicators in the face of multiple competing objectives; need to incorporate scientific understanding and value judgments; and will be set by participants with limited modeling experience. We applied our approach to a case study where management thresholds were set for a mat-forming brown alga, Hormosira banksii, in a protected area management context. Participants, including management staff and scientists, were involved in a workshop to test the approach, and set management thresholds to address the threat of trampling by visitors to an intertidal rocky reef. The approach involved trading off the environmental objective, to maintain the condition of intertidal reef communities, with social and economic objectives to ensure management intervention was cost-effective. Ecological scenarios, developed using scenario planning, were a key feature that provided the foundation for where to set management thresholds. The scenarios developed represented declines in percent cover of H. banksii that may occur under increased threatening processes. Participants defined 4 discrete management alternatives to address the threat of trampling and estimated the effect of these alternatives on the objectives under each ecological scenario. A weighted additive model was used to aggregate participants' consequence estimates. Model outputs (decision scores) clearly expressed uncertainty, which can be considered by decision makers and used to inform where to set management thresholds. This approach encourages a proactive form of conservation, where management thresholds and associated actions are defined a priori for ecological indicators, rather than reacting to unexpected ecosystem changes in the future. © 2015 The
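
    A weighted additive aggregation of the kind described can be sketched as below. The objectives, weights and alternative scores are invented for illustration, not taken from the workshop.

```python
def weighted_additive_score(consequences, weights):
    """Aggregate normalized consequence estimates into a single decision score.

    consequences: {objective: value in [0, 1], 1 = best outcome}
    weights:      {objective: importance weight}; normalized to sum to 1 here.
    """
    total = sum(weights.values())
    return sum(weights[k] / total * consequences[k] for k in consequences)

# Hypothetical management alternatives for a trampling threat, scored
# against an environmental and a cost objective (illustrative numbers).
weights = {"algal_condition": 0.7, "cost": 0.3}
alternatives = {
    "do_nothing":   {"algal_condition": 0.2, "cost": 1.0},
    "close_access": {"algal_condition": 0.9, "cost": 0.3},
}
scores = {name: weighted_additive_score(c, weights)
          for name, c in alternatives.items()}
```

    Repeating the scoring under each ecological scenario (e.g. successive declines in H. banksii cover) shows the scenario at which a costlier intervention first outscores doing nothing, which is where a management threshold would be set.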

  9. A Validated Set of MIDAS V5 Task Network Model Scenarios to Evaluate Nextgen Closely Spaced Parallel Operations Concepts

    NASA Technical Reports Server (NTRS)

    Gore, Brian Francis; Hooey, Becky Lee; Haan, Nancy; Socash, Connie; Mahlstedt, Eric; Foyle, David C.

    2013-01-01

    The Closely Spaced Parallel Operations (CSPO) scenario is a complex, human performance model scenario that tested alternate operator roles and responsibilities in a series of off-nominal operations on approach and landing (see Gore, Hooey, Mahlstedt, Foyle, 2013). The model links together the procedures, equipment, crewstation, and external environment to produce predictions of operator performance in response to Next Generation system designs, like those expected in the National Airspace's NextGen concepts. The task analysis contained in the present report comes from the task analysis window in the MIDAS software. These tasks link definitions and states for equipment components, environmental features, and operational contexts. The current task analysis culminated in 3300 tasks that included over 1000 Subject Matter Expert (SME)-vetted, re-usable procedural sets for three critical phases of flight: the Descent, Approach, and Land procedural sets (see Gore et al., 2011 for a description of the development of the tasks included in the model; Gore, Hooey, Mahlstedt, Foyle, 2013 for a description of the model and its results; Hooey, Gore, Mahlstedt, Foyle, 2013 for a description of the guidelines that were generated from the model's results; Gore, Hooey, Foyle, 2012 for a description of the model's implementation and its settings). The rollout, after-landing checks, taxi-to-gate and arrive-at-gate networks illustrated in Figure 1 were not used in the approach and divert scenarios exercised. The other networks in Figure 1 set up appropriate context settings for the flight deck. The current report presents the model's task decomposition from the top (highest) level down to finer-grained levels. The first task completed by the model is to set all of the initial settings for the scenario runs included in the model (network 75 in Figure 1). This initialization process also resets the CAD graphic files contained within MIDAS, as well as the embedded

  10. Effect of cluster set warm-up configurations on sprint performance in collegiate male soccer players.

    PubMed

    Nickerson, Brett S; Mangine, Gerald T; Williams, Tyler D; Martinez, Ismael A

    2018-06-01

    The purpose of this study was to determine if back squat cluster sets (CS) with varying inter-repetition rest periods would potentiate greater sprint performance compared with a traditional set parallel back squat in collegiate soccer players. Twelve collegiate male soccer players (age, 21.0 ± 2.0 years; height, 180.0 ± 9.0 cm; body mass, 79.0 ± 9.5 kg) performed a 20-m sprint prior to a potentiation complex and at 1, 4, 7, and 10 min postexercise on 3 separate, randomized occasions. On each occasion, the potentiation complex consisted of 1 set of 3 repetitions at 85% 1-repetition maximum (1RM) for the traditional parallel back squat. However, on 1 occasion the 3-repetition set was performed in a traditional manner (i.e., continuously), whereas on the other 2 occasions, 30 s (CS30) and 60 s (CS60) of rest were allotted between each repetition. Repeated-measures ANOVA revealed greater (p = 0.022) mean barbell velocity on CS60 compared with the traditional set. However, faster (p < 0.040) 20-m sprint times were observed for CS30 (3.15 ± 0.16 s) compared with traditional (3.20 ± 0.17 s) only at 10 min postexercise. No other differences were observed. These data suggest that a single cluster set of 3 repetitions with 30-s inter-repetition rest periods at 85% 1RM acutely improves 20-m sprinting performance. Strength and conditioning professionals and their athletes might consider its inclusion during the specific warm-up to acutely improve athletic performance during the onset (≤10 min) of training or competition.

  11. Diverse Data Sets Can Yield Reliable Information through Mechanistic Modeling: Salicylic Acid Clearance.

    PubMed

    Raymond, G M; Bassingthwaighte, J B

    This is a practical example of a powerful research strategy: putting together data from studies covering a diversity of conditions can yield a scientifically sound grasp of the phenomenon when the individual observations failed to provide definitive understanding. The rationale is that defining a realistic, quantitative, explanatory hypothesis for the whole set of studies brings about a "consilience" of the often competing hypotheses considered for individual data sets. An internally consistent conjecture linking multiple data sets simultaneously provides stronger evidence on the characteristics of a system than does analysis of individual data sets limited to narrow ranges of conditions. Our example examines three very different data sets on the clearance of salicylic acid from humans: a high-concentration set from aspirin overdoses; a set with medium concentrations from a research study on the influences of the route of administration and of sex on the clearance kinetics; and a set on low-dose aspirin for cardiovascular health. Three models were tested: (1) a first-order reaction, (2) a Michaelis-Menten (M-M) approach, and (3) an enzyme kinetic model with forward and backward reactions. The reaction rates found from model 1 were distinctly different for the three data sets, having no commonality. The M-M model 2 fitted each of the three data sets but gave a reliable estimate of the Michaelis constant only for the medium-level data (Km = 24 ± 5.4 mg/L); analyzing the three data sets together with model 2 gave Km = 18 ± 2.6 mg/L. (Estimating parameters using larger numbers of data points in an optimization increases the degrees of freedom, constraining the range of the estimates.) Using the enzyme kinetic model (3) increased the number of free parameters but nevertheless improved the goodness of fit to the combined data sets, giving tighter constraints and a lower estimated Km = 14.6 ± 2.9 mg/L, demonstrating that fitting diverse data sets with a single model
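
    The Michaelis-Menten model at the heart of this comparison is easy to sketch. Km below is the combined-fit estimate quoted in the abstract, while Vmax is invented for illustration; the example shows why low-concentration data alone look first-order and therefore constrain Km poorly.

```python
def mm_rate(c, vmax, km):
    """Michaelis-Menten elimination rate at concentration c."""
    return vmax * c / (km + c)

# Km from the abstract's combined fit (~14.6 mg/L); Vmax is a made-up
# value for the example, not an estimate from the study.
VMAX, KM = 300.0, 14.6

low = mm_rate(0.5, VMAX, KM)      # c << Km: nearly first-order, v ≈ (Vmax/Km) * c
high = mm_rate(2000.0, VMAX, KM)  # c >> Km: nearly zero-order, v ≈ Vmax
```

    In the low-dose regime only the ratio Vmax/Km is identifiable, which is why each data set fitted alone gives a different "rate constant", while the overdose data pin down the saturation level and the combined fit constrains Km.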

  12. Concept and set-up of an IR-gas sensor construction kit

    NASA Astrophysics Data System (ADS)

    Sieber, Ingo; Perner, Gernot; Gengenbach, Ulrich

    2015-10-01

    The paper presents an approach to a cost-efficient, modularly built non-dispersive infrared gas sensor (NDIR) based on a construction kit. The modularity of the approach offers several advantages. First of all, it allows the performance of the gas sensor to be adapted to individual specifications by choosing suitable modular components. The sensitivity of the sensor, for example, can be altered by selecting a source that emits a favorable wavelength spectrum with respect to the absorption spectrum of the gas to be measured, or by tuning the measuring distance (the ray path inside the medium to be measured). Furthermore, the developed approach is very well suited to teaching. Together with students, a construction kit based on an optical free-space system was developed and partly implemented, to be further used as a teaching and training aid for bachelor and master students at our institute. The components of the construction kit are interchangeable and freely fixable on a base plate. The components are classified into five groups: sources, reflectors, detectors, gas feed, and analysis cell. The source, the detector, and the positions of the components are fundamental for experimenting with and testing different configurations and beam paths. The reflectors are implemented with an aluminum-coated adhesive foil mounted onto a support structure fabricated by additive manufacturing. This approach allows deriving the reflecting surface geometry from the optical design tool and generating the 3D-printing files by applying related design rules. The rapid fabrication process and the adjustment of the modules on the base plate allow rapid, almost LEGO®-like, experimental assessment of design ideas. The subject of this paper is the modeling, design, and optimization of the reflective optical components, as well as of the optical subsystem. The realization of a sample set-up used as a teaching aid and the optical measurement of the beam path in comparison to the simulation results are

  13. A Model Evaluation Data Set for the Tropical ARM Sites

    DOE Data Explorer

    Jakob, Christian

    2008-01-15

    This data set has been derived from various ARM and external data sources with the main aim of providing modelers easy access to quality controlled data for model evaluation. The data set contains highly aggregated (in time) data from a number of sources at the tropical ARM sites at Manus and Nauru. It spans the years of 1999 and 2000. The data set contains information on downward surface radiation; surface meteorology, including precipitation; atmospheric water vapor and cloud liquid water content; hydrometeor cover as a function of height; and cloud cover, cloud optical thickness and cloud top pressure information provided by the International Satellite Cloud Climatology Project (ISCCP).

  14. Setting Up Decision-Making Tools toward a Quality-Oriented Participatory Maize Breeding Program

    PubMed Central

    Alves, Mara L.; Brites, Cláudia; Paulo, Manuel; Carbas, Bruna; Belo, Maria; Mendes-Moreira, Pedro M. R.; Brites, Carla; Bronze, Maria do Rosário; Gunjača, Jerko; Šatović, Zlatko; Vaz Patto, Maria C.

    2017-01-01

    Previous studies have reported promising differences in the quality of kernels from farmers' maize populations collected in a Portuguese region known to produce maize-based bread. However, several limitations have been identified in the previous characterizations of those populations, such as the limited set of quality traits assessed and the lack of an accurate agronomic performance evaluation. The objectives of this study were to perform a more detailed quality characterization of Portuguese farmers' maize populations; to estimate their agronomic performance in a broader range of environments; and to integrate quality, agronomic, and molecular data in the setting up of decision-making tools for the establishment of a quality-oriented participatory maize breeding program. Sixteen farmers' maize populations, together with 10 other maize populations chosen for comparison purposes, were multiplied in a common-garden experiment for quality evaluation. Flour obtained from each population was used to study kernel composition (protein, fat, fiber), the flour's pasting behavior, and bioactive compound levels (carotenoids, tocopherols, phenolic compounds). These maize populations were evaluated for grain yield and ear weight in nine locations across Portugal; the populations' adaptability and stability were evaluated using additive main effects and multiplicative interaction (AMMI) model analysis. The phenotypic characterization of each population was complemented with a molecular characterization, in which 30 individuals per population were genotyped with 20 microsatellites. Almost all farmers' populations were clustered into the same quality group, characterized by high levels of protein and fiber and low levels of carotenoids, volatile aldehydes, α- and δ-tocopherols, and breakdown viscosity. Within this quality group, variability in particular quality traits (color and some bioactive compounds) could still be found. Regarding the agronomic performance, farmers' maize populations
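
    The AMMI idea, additive main effects plus a singular-value decomposition of the interaction residual, can be sketched on an invented genotype-by-environment table (all numbers below are illustrative, not data from the study):

```python
import numpy as np

def ammi_decompose(y):
    """Split a genotype x environment mean table into additive main effects
    plus an SVD of the interaction residual (the core of AMMI analysis)."""
    grand = y.mean()
    g_eff = y.mean(axis=1) - grand          # genotype main effects
    e_eff = y.mean(axis=0) - grand          # environment main effects
    interaction = y - grand - g_eff[:, None] - e_eff[None, :]
    u, s, vt = np.linalg.svd(interaction, full_matrices=False)
    return grand, g_eff, e_eff, u, s, vt

# Tiny invented 3-genotype x 4-environment yield table (t/ha).
y = np.array([[4.0, 5.0, 6.0, 5.5],
              [3.5, 4.8, 5.1, 5.0],
              [4.2, 5.6, 6.4, 6.1]])
grand, g_eff, e_eff, u, s, vt = ammi_decompose(y)

# The first multiplicative term (IPCA1) captures most of the interaction;
# genotypes with small IPCA1 scores are the more stable ones.
ipca1 = s[0] * np.outer(u[:, 0], vt[0, :])
```

    Stability is then judged from the IPCA scores (u columns scaled by s), while adaptability to specific environments shows up as matching signs of genotype and environment scores.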

  15. Set-up and validation of a Delft-FEWS based coastal hazard forecasting system

    NASA Astrophysics Data System (ADS)

    Valchev, Nikolay; Eftimova, Petya; Andreeva, Nataliya

    2017-04-01

    European coasts are increasingly threatened by hazards related to low-probability and high-impact hydro-meteorological events. Uncertainties in hazard prediction and in the capability to cope with their impact lie both in future storm patterns and in increasing coastal development. Therefore, adaptation to future conditions requires a re-evaluation of coastal disaster risk reduction (DRR) strategies and the introduction of a more efficient mix of prevention, mitigation and preparedness measures. The latter presumes that the development of tools that can manage the complex process of merging data and models and generate products on the current and expected hydro- and morphodynamic states of the coasts, such as a system forecasting flooding and erosion hazards at vulnerable coastal locations (hotspots), is of vital importance. The output of such a system can be of utmost value for coastal stakeholders and the entire coastal community. In response to these challenges, Delft-FEWS provides a state-of-the-art framework for implementing such a system, with vast capabilities to trigger the early warning process. In addition, this framework is highly customizable to the specific requirements of any individual coastal hotspot. Since its release, many Delft-FEWS based forecasting systems for inland flooding have been developed; however, only a limited number of coastal applications have been implemented. In this paper, the set-up of a Delft-FEWS based forecasting system for Varna Bay (Bulgaria), a coastal hotspot that includes a sandy beach and port infrastructure, is presented. It is implemented in the frame of the RISC-KIT project (Resilience-Increasing Strategies for Coasts - toolKIT). The system output generated in hindcast mode is validated with available observations of surge levels and wave and morphodynamic parameters for a sequence of three short-duration and relatively weak storm events that occurred during February 4-12, 2015. Generally, the models' performance is considered as very good and

  16. Should I use that model? Assessing the transferability of ecological models to new settings

    EPA Science Inventory

    Analysts and scientists frequently apply existing models that estimate ecological endpoints or simulate ecological processes to settings where the models have not been used previously, and where data to parameterize and validate the model may be sparse. Prior to transferring an ...

  17. Analysis of the optimal laminated target made up of discrete set of materials

    NASA Technical Reports Server (NTRS)

    Aptukov, Valery N.; Belousov, Valentin L.

    1991-01-01

    A new class of problems was analyzed to estimate an optimal structure of laminated targets fabricated from the specified set of homogeneous materials. An approximate description of the perforation process is based on the model of radial hole extension. The problem is solved by using the needle-type variation technique. The desired optimization conditions and quantitative/qualitative estimations of optimal targets were obtained and are discussed using specific examples.

  18. Modelling fatigue and the use of fatigue models in work settings.

    PubMed

    Dawson, Drew; Ian Noy, Y; Härmä, Mikko; Akerstedt, Torbjorn; Belenky, Gregory

    2011-03-01

    In recent years, theoretical models of the sleep and circadian system developed in laboratory settings have been adapted to predict fatigue and, by inference, performance. This is typically done using the timing of prior sleep and waking or working hours as the primary input and the time course of the predicted variables as the primary output. The aim of these models is to provide employers, unions and regulators with quantitative information on the likely average level of fatigue, or risk, associated with a given pattern of work and sleep with the goal of better managing the risk of fatigue-related errors and accidents/incidents. The first part of this review summarises the variables known to influence workplace fatigue and draws attention to the considerable variability attributable to individual and task variables not included in current models. The second part reviews the current fatigue models described in the scientific and technical literature and classifies them according to whether they predict fatigue directly by using the timing of prior sleep and wake (one-step models) or indirectly by using work schedules to infer an average sleep-wake pattern that is then used to predict fatigue (two-step models). The third part of the review looks at the current use of fatigue models in field settings by organizations and regulators. Given their limitations it is suggested that the current generation of models may be appropriate for use as one element in a fatigue risk management system. The final section of the review looks at the future of these models and recommends a standardised approach for their use as an element of the 'defenses-in-depth' approach to fatigue risk management. Copyright © 2010 Elsevier Ltd. All rights reserved.

  19. Clinical Profile of Children and Adolescents Attending the Behavioural Paediatrics Unit OPD in a Tertiary Care Set up

    ERIC Educational Resources Information Center

    Jayaprakash, R.

    2012-01-01

    Background: There are limited studies on the clinical profile of children attending child guidance clinic under Paediatric background. Aims: To study clinical profile of Children & adolescents attending the Behavioural Paediatrics Unit (BPU) OPD under department of Paediatrics in a tertiary care set up. Methods: Monthly average turnover in the…

  20. Modeling Resources Allocation in Attacker-Defender Games with "Warm Up" CSF.

    PubMed

    Guan, Peiqiu; Zhuang, Jun

    2016-04-01

    Like many other engineering investments, the attacker's and defender's investments may have limited impact without initial capital to "warm up" the systems. This article studies such "warm up" effects on both the attack and defense equilibrium strategies in a sequential-move game model by developing a class of novel and more realistic contest success functions. We first solve a single-target attacker-defender game analytically and provide numerical solutions to a multiple-target case. We compare the results of the models with and without consideration of the investment "warm up" effects, and find that the defender would suffer higher expected damage, and either underestimate the attacker effort or waste defense investment if the defender falsely believes that no investment "warm up" effects exist. We illustrate the model results with real data, and compare the results of the models with and without consideration of the correlation between the "warm up" threshold and the investment effectiveness. Interestingly, we find that the defender is suggested to give up defending all the targets when the attack or the defense "warm up" thresholds are sufficiently high. This article provides new insights and suggestions on policy implications for homeland security resource allocation. © 2015 Society for Risk Analysis.
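
    The "warm up" effect described above can be sketched with a ratio-form contest success function in which investment below a threshold contributes nothing. The function below, its exponent `m` and its threshold handling are illustrative assumptions; the abstract does not give the paper's exact parameterization.

```python
def effective_effort(investment, warmup_threshold):
    """Effort becomes productive only beyond the 'warm up' threshold (assumed form)."""
    return max(0.0, investment - warmup_threshold)

def contest_success_probability(attack, defense, a_warmup, d_warmup, m=1.0):
    """Ratio-form contest success function with warm-up thresholds.

    Returns the probability that the attack succeeds. The warm-up
    modification and the exponent m are illustrative, not the paper's
    exact contest success function.
    """
    a = effective_effort(attack, a_warmup) ** m
    d = effective_effort(defense, d_warmup) ** m
    if a + d == 0.0:
        return 0.0  # neither side's investment has 'warmed up'
    return a / (a + d)
```

    With zero thresholds the classic ratio form is recovered; investment sitting entirely below a high threshold is wasted, which is what makes "giving up" defense of some targets rational when the warm-up thresholds are sufficiently high.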

  1. Model fitting for small skin permeability data sets: hyperparameter optimisation in Gaussian Process Regression.

    PubMed

    Ashrafi, Parivash; Sun, Yi; Davey, Neil; Adams, Roderick G; Wilkinson, Simon C; Moss, Gary Patrick

    2018-03-01

    The aim of this study was to investigate how to improve predictions from Gaussian Process models by optimising the model hyperparameters. Optimisation methods, including Grid Search, Conjugate Gradient, Random Search, Evolutionary Algorithm and Hyper-prior, were evaluated and applied to previously published data. Data sets were also altered in a structured manner to reduce their size while retaining the range, or 'chemical space', of the key descriptors, in order to assess the effect of the data range on model quality. The Hyper-prior Smoothbox kernel resulted in the best models for the majority of data sets, and these exhibited significantly better performance than benchmark quantitative structure-permeability relationship (QSPR) models. When the data sets were systematically reduced in size, models built with the different optimisation methods generally retained their statistical quality, whereas benchmark QSPR models performed poorly. The design of the data set, and possibly also the approach to validation of the model, is critical in the development of improved models. The size of the data set, if carefully controlled, was not generally a significant factor for these models, and models of excellent statistical quality could be produced from substantially smaller data sets. © 2018 Royal Pharmaceutical Society.
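
    As a concrete illustration of the kind of hyperparameter search being compared above, a minimal Grid Search that maximizes the Gaussian Process log marginal likelihood might look as follows. The RBF kernel stands in for the paper's Smoothbox kernel (which is not part of standard libraries), and all names here are our own sketch, not the authors' code.

```python
import numpy as np

def rbf_kernel(X1, X2, length_scale, variance):
    """Squared-exponential (RBF) covariance between two sets of points."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / length_scale**2)

def log_marginal_likelihood(X, y, length_scale, variance, noise):
    """Standard GP log marginal likelihood via a Cholesky factorization."""
    K = rbf_kernel(X, X, length_scale, variance) + noise * np.eye(len(X))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return (-0.5 * y @ alpha
            - np.log(np.diag(L)).sum()          # -0.5 * log|K|
            - 0.5 * len(X) * np.log(2 * np.pi))

def grid_search(X, y, length_scales, variances, noise=1e-3):
    """Exhaustive search over a hyperparameter grid; returns the best triple."""
    best = None
    for ls in length_scales:
        for var in variances:
            lml = log_marginal_likelihood(X, y, ls, var, noise)
            if best is None or lml > best[0]:
                best = (lml, ls, var)
    return best
```

    `grid_search` returns the best (log marginal likelihood, length scale, variance) triple; Random Search or an Evolutionary Algorithm would simply replace the two nested loops with a different proposal strategy.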

  2. Density functional theory calculations of the lowest energy quintet and triplet states of model hemes: role of functional, basis set, and zero-point energy corrections.

    PubMed

    Khvostichenko, Daria; Choi, Andrew; Boulatov, Roman

    2008-04-24

    We investigated the effect of several computational variables, including the choice of the basis set, application of symmetry constraints, and zero-point energy (ZPE) corrections, on the structural parameters and predicted ground electronic state of model 5-coordinate hemes (iron(II) porphines axially coordinated by a single imidazole or 2-methylimidazole). We studied the performance of B3LYP and B3PW91 with eight Pople-style basis sets (up to 6-311+G*) and of the B97-1, OLYP, and TPSS functionals with the 6-31G and 6-31G* basis sets. Only the hybrid functionals B3LYP, B3PW91, and B97-1 reproduced the quintet ground state of the model hemes. With a given functional, the choice of the basis set caused up to 2.7 kcal/mol variation of the quintet-triplet electronic energy gap (ΔEel), in several cases resulting in an inversion of the sign of ΔEel. Single-point energy calculations with triple-zeta basis sets of the Pople (up to 6-311++G(2d,2p)), Ahlrichs (TZVP and TZVPP), and Dunning (cc-pVTZ) families showed the same trend. The zero-point energy of the quintet state was approximately 1 kcal/mol lower than that of the triplet, and accounting for ZPE corrections was crucial for establishing the ground state if the electronic energy of the triplet state was approximately 1 kcal/mol less than that of the quintet. Within a given model chemistry, the effects of symmetry constraints and of a "tense" structure of the iron porphine fragment coordinated to 2-methylimidazole on ΔEel were limited to 0.3 kcal/mol. For both model hemes the best agreement with crystallographic structural data was achieved with the small 6-31G and 6-31G* basis sets. The deviation of the computed frequency of the Fe-Im stretching mode from the experimental value decreased in the order: nonaugmented basis sets, basis sets with polarization functions, and basis sets with polarization and diffuse functions. Contraction of Pople-style basis sets (double-zeta or triple-zeta) affected the results
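
    The role of the ZPE correction in assigning the ground state reduces to simple arithmetic: if the triplet's electronic energy lies about 1 kcal/mol below the quintet's, but the quintet's ZPE is about 1 kcal/mol lower, the corrected gap changes sign. A sketch, with a sign convention of our choosing (positive gap means the quintet is the ground state):

```python
def quintet_triplet_gap(e_el_quintet, e_el_triplet, zpe_quintet, zpe_triplet):
    """ZPE-corrected quintet-triplet gap in kcal/mol.

    Positive when the quintet lies lower after correction. The sign
    convention is an illustrative choice, not the paper's notation.
    """
    return (e_el_triplet + zpe_triplet) - (e_el_quintet + zpe_quintet)
```

    For example, a triplet 0.5 kcal/mol below the quintet electronically becomes the excited state once a 1 kcal/mol ZPE advantage of the quintet is included, which is exactly the regime where the abstract says the ZPE correction is decisive.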

  3. Preliminary investigation of flow dynamics during the start-up of a bulb turbine model

    NASA Astrophysics Data System (ADS)

    Coulaud, M.; Fraser, R.; Lemay, J.; Duquesne, P.; Aeschlimann, V.; Deschênes, C.

    2016-11-01

    Nowadays, the electricity network undergoes more perturbations due to market demand. Additionally, increasing production from alternative resources such as wind or solar induces important variations on the grid. Hydraulic power plants are used to respond quickly to these variations and stabilize the network. Hydraulic turbines therefore face more frequent start-up and stop sequences, which might significantly shorten their lifetime. In this context, an experimental analysis of start-up sequences has been conducted on the bulb turbine model of the BulbT project at the Hydraulic Machines Laboratory (LAMH) of Laval University. Maintaining a constant head, the guide vanes are opened from 0° to 30°. Three guide vane opening speeds were chosen, from 5°/s to 20°/s, with several repetitions for each. During these sequences, synchronous time-resolved measurements were performed. Pressure signals were recorded at the runner inlet and outlet and along the draft tube, and 25 pressure and strain measurements were obtained on the runner blades. Time-resolved particle image velocimetry was used to evaluate the flow rate during start-up for some repetitions, and torque fluctuations at the shaft were also monitored. This paper presents the experimental set-up and the start-up conditions chosen to simulate a prototype start-up. The transient flow-rate methodology is explained and validation measurements are detailed. Preliminary results of global performances and runner pressure measurements are presented.

  4. The Setting-up of Multi-Site School Collaboratives: The Benefits of This Organizational Reform in Terms of Networking Opportunities and Their Effects

    ERIC Educational Resources Information Center

    Mifsud, Denise

    2015-01-01

    This article, which is set within the Maltese education scenario of unfolding decentralization through the setting-up of multi-site school collaboratives (legally termed "colleges") via a policy mandate, explores a particular aspect of this reform--that of "networking". This is examined in terms of the potential for…

  5. Setting development goals using stochastic dynamical system models

    PubMed Central

    Nicolis, Stamatios C.; Bali Swain, Ranjula; Sumpter, David J. T.

    2017-01-01

    The Millennium Development Goals (MDG) programme was an ambitious attempt to encourage a globalised solution to important but often-overlooked development problems. The programme led to wide-ranging development but it has also been criticised for unrealistic and arbitrary targets. In this paper, we show how country-specific development targets can be set using stochastic, dynamical system models built from historical data. In particular, we show that the MDG target of two-thirds reduction of child mortality from 1990 levels was infeasible for most countries, especially in sub-Saharan Africa. At the same time, the MDG targets were not ambitious enough for fast-developing countries such as Brazil and China. We suggest that model-based setting of country-specific targets is essential for the success of global development programmes such as the Sustainable Development Goals (SDG). This approach should provide clear, quantifiable targets for policymakers. PMID:28241057

  6. Setting development goals using stochastic dynamical system models.

    PubMed

    Ranganathan, Shyam; Nicolis, Stamatios C; Bali Swain, Ranjula; Sumpter, David J T

    2017-01-01

    The Millennium Development Goals (MDG) programme was an ambitious attempt to encourage a globalised solution to important but often-overlooked development problems. The programme led to wide-ranging development but it has also been criticised for unrealistic and arbitrary targets. In this paper, we show how country-specific development targets can be set using stochastic, dynamical system models built from historical data. In particular, we show that the MDG target of two-thirds reduction of child mortality from 1990 levels was infeasible for most countries, especially in sub-Saharan Africa. At the same time, the MDG targets were not ambitious enough for fast-developing countries such as Brazil and China. We suggest that model-based setting of country-specific targets is essential for the success of global development programmes such as the Sustainable Development Goals (SDG). This approach should provide clear, quantifiable targets for policymakers.
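
    The target-setting logic described in the two records above can be sketched as a drift-plus-noise model fitted to historical data and projected forward by Monte Carlo. The linear drift, the 20th-percentile "feasible target" rule and all function names below are illustrative assumptions, not the paper's actual specification.

```python
import numpy as np

def fit_drift(series):
    """Estimate a linear drift dx = a + b*x from a historical series
    (a simple proxy for the paper's stochastic dynamical-system fit)."""
    x, dx = series[:-1], np.diff(series)
    b, a = np.polyfit(x, dx, 1)          # slope first, intercept second
    sigma = np.std(dx - (a + b * x))     # residual noise level
    return a, b, sigma

def feasible_target(series, horizon, n_sims=2000, percentile=20, seed=0):
    """Monte-Carlo projection of the fitted model; returns an 'ambitious
    but feasible' level, here the 20th percentile of simulated outcomes
    (an illustrative choice of ambition)."""
    a, b, sigma = fit_drift(series)
    rng = np.random.default_rng(seed)
    x = np.full(n_sims, series[-1], dtype=float)
    for _ in range(horizon):
        x += a + b * x + rng.normal(0.0, sigma, n_sims)
    return np.percentile(x, percentile)
```

    Applied to a historical (e.g. log child-mortality) series, the projected distribution immediately shows whether a fixed two-thirds-reduction target falls inside or outside the country's feasible range.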

  7. Numerical modelling of the buoyant marine microplastics in the South-Eastern Baltic Sea

    NASA Astrophysics Data System (ADS)

    Bagaev, Andrei; Mizyuk, Artem; Chubarenko, Irina; Khatmullilna, Liliya

    2017-04-01

    Microplastics are a burning issue in marine pollution science. Their sources, pathways of propagation and final fate pose many questions to modern oceanographers, and a numerical model is an optimal tool for reconstructing microplastics pathways and fate. Within the MARBLE project (lamp.ocean.ru), a model of Lagrangian particle transport was developed. It was tested coupled with oceanographic transport fields from the operational oceanography product of the Copernicus Marine Environment Monitoring Service. Our model deals with two major types of microplastics: microfibres and buoyant spheroidal particles. We are currently working to increase the grid resolution by means of the NEMO regional configuration for the south-eastern Baltic Sea. Several expeditions were organised to three regions of the Baltic Sea (the Gotland, Bornholm and Gdansk basins). Water samples from the surface and different water layers were collected, processed and analysed by our team. A set of laboratory experiments was specifically designed to establish the settling velocity of particles of various shapes and densities. This analysis provided us with the understanding necessary for the model to reproduce the large-scale dynamics of microfibres. In the simulation, particles spread from the shore to the deep sea, slowly sinking to the bottom, while decreasing in quantity due to conditional sedimentation. Our model is expected to map out the microplastics life cycle and to account for its distribution patterns under the impact of wind and currents. For this purpose, we have already included a parameterization of the wind drag force applied to a particle. Initial results of numerical experiments indicate the importance of a proper implicit parameterization of particle dynamics at the vertical solid boundary. Our suggested solutions to that problem will be presented at EGU-2017.
The MARBLE project is supported by Russian Science
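
    The settling-velocity measurements described above feed directly into the Lagrangian update applied to each particle. A minimal sketch follows (Stokes drag for a small sphere plus one explicit advection-settling step); the MARBLE model's actual scheme, grids and boundary handling are more elaborate, and all names here are our own.

```python
import numpy as np

def stokes_settling_velocity(diameter, rho_particle, rho_water=1005.0,
                             mu=1.4e-3, g=9.81):
    """Stokes terminal velocity (m/s) for a small sphere in seawater.

    Only valid at low Reynolds number; real microplastic fibres deviate
    from it, which is why settling velocities are measured in the lab."""
    return g * diameter**2 * (rho_particle - rho_water) / (18.0 * mu)

def advect_particles(pos, velocity_field, w_settle, dt):
    """One explicit Lagrangian step: advection by the model currents plus
    vertical settling (illustrative first-order scheme).

    pos: (n, 3) array of positions, z positive upward."""
    pos = pos + velocity_field(pos) * dt
    pos[:, 2] -= w_settle * dt                  # settling is downward
    pos[:, 2] = np.maximum(pos[:, 2], -100.0)   # crude bottom at 100 m depth
    return pos
```

    A fibre violates the sphere assumption behind the Stokes formula, which is exactly why laboratory-derived settling velocities are substituted for the analytic value in the simulations.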

  8. Assessment and quantification of patient set-up errors in nasopharyngeal cancer patients and their biological and dosimetric impact in terms of generalized equivalent uniform dose (gEUD), tumour control probability (TCP) and normal tissue complication probability (NTCP).

    PubMed

    Boughalia, A; Marcie, S; Fellah, M; Chami, S; Mekki, F

    2015-06-01

    The aim of this study is to assess and quantify patients' set-up errors using an electronic portal imaging device and to evaluate their dosimetric and biological impact in terms of generalized equivalent uniform dose (gEUD) on predictive models, such as the tumour control probability (TCP) and the normal tissue complication probability (NTCP). Twenty patients treated for nasopharyngeal cancer were enrolled in the radiotherapy-oncology department of HCA. Systematic and random errors were quantified. The dosimetric and biological impact of these set-up errors on the target volume and organ at risk (OAR) coverage was assessed using calculation of dose-volume histograms, gEUD, TCP and NTCP. For this purpose, an in-house software was developed and used. The standard deviations (1 SD) of the systematic and random set-up errors were calculated for the lateral and subclavicular fields and gave the following results: Σ = 0.63 ± 0.42 mm and σ = 3.75 ± 0.79 mm, respectively. Thus a planning organ at risk volume (PRV) margin of 3 mm was defined around the OARs, and a 5-mm margin was used around the clinical target volume. The gEUD, TCP and NTCP calculations obtained with and without set-up errors showed increased values for the tumour, where ΔgEUD (tumour) = 1.94% Gy (p = 0.00721) and ΔTCP = 2.03%. The toxicity of the OARs was quantified using gEUD and NTCP. The values of ΔgEUD (OARs) vary from 0.78% to 5.95% in the case of the brainstem and the optic chiasm, respectively. The corresponding ΔNTCP varies from 0.15% to 0.53%, respectively. The quantification of set-up errors has a dosimetric and biological impact on the tumour and on the OARs. The developed in-house software using the concept of the gEUD, TCP and NTCP biological models has been successfully used in this study. It can also be used to optimize the treatment plans established for our patients. The gEUD, TCP and NTCP may be more suitable tools to assess the treatment plans
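
    The abstract does not state which margin recipe was applied, but the reported numbers are consistent with the widely used van Herk CTV-to-PTV recipe and the McKenzie PRV recipe. A sketch under that assumption:

```python
def ptv_margin(sigma_systematic, sigma_random):
    """van Herk CTV-to-PTV margin recipe: 2.5*Sigma + 0.7*sigma (mm)."""
    return 2.5 * sigma_systematic + 0.7 * sigma_random

def prv_margin(sigma_systematic, sigma_random):
    """McKenzie margin recipe for organs at risk: 1.3*Sigma + 0.5*sigma (mm)."""
    return 1.3 * sigma_systematic + 0.5 * sigma_random
```

    With the reported Σ = 0.63 mm and σ = 3.75 mm these give about 4.2 mm (consistent with the 5-mm CTV margin after rounding up) and about 2.7 mm (close to the 3-mm PRV margin), though the paper may have used a different recipe.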

  9. [Testing the efficacy of disinfectants during drinking water treatment. A new experimental set-up at the German EPA (Umweltbundesamt - UBA)].

    PubMed

    Grützmacher, G; Bartel, H; Althoff, H W; Clemen, S

    2007-03-01

    A set-up for experiments in flow-through mode was constructed in order to test the efficacy of substances used for disinfecting water during drinking water treatment. A flow-through mode - in contrast to experiments under stationary conditions (so-called batch experiments) - was chosen because this experimental design allows experiments to be carried out under constant conditions for an extended time (up to one week), and because efficacy testing is possible repeatedly, simultaneously and under exactly the same conditions for short (about 0.5 min) and longer (about 47 min) contact times. With this experimental design the effect of biofilms along the inner pipe surfaces can be included in the observations. The construction of the experimental set-up is based on experience with laboratory flow-through systems that were installed by the UBA's drinking water department (formerly the Institute for Water, Soil and Air Hygiene (WaBoLu)) for testing disinfection with chlorine. In a first step, a test pipe simulating a waterworks situation was installed. Water of different qualities can be mixed in large volumes beforehand, so that the experimental procedure can run with constant water quality for at least one week. The kinetics of the disinfection reaction can be observed by extracting samples from eight sampling ports situated along the test pipe. In order to assign exact residence times to each sampling port, tracer experiments were performed prior to testing disinfectant efficacy. This paper gives the technical details of the experimental set-up and presents the results of the tracer experiments as an introduction to its potential.

  10. Cardiac rehabilitation delivery model for low-resource settings

    PubMed Central

    Grace, Sherry L; Turk-Adawi, Karam I; Contractor, Aashish; Atrey, Alison; Campbell, Norm; Derman, Wayne; Melo Ghisi, Gabriela L; Oldridge, Neil; Sarkar, Bidyut K; Yeo, Tee Joo; Lopez-Jimenez, Francisco; Mendis, Shanthi; Oh, Paul; Hu, Dayi; Sarrafzadegan, Nizal

    2016-01-01

    Objective Cardiovascular disease is a global epidemic, which is largely preventable. Cardiac rehabilitation (CR) is demonstrated to be cost-effective and efficacious in high-income countries. CR could represent an important approach to mitigate the epidemic of cardiovascular disease in lower-resource settings. The purpose of this consensus statement was to review low-cost approaches to delivering the core components of CR, to propose a testable model of CR which could feasibly be delivered in middle-income countries. Methods A literature review regarding delivery of each core CR component, namely: (1) lifestyle risk factor management (ie, physical activity, diet, tobacco and mental health), (2) medical risk factor management (eg, lipid control, blood pressure control), (3) education for self-management and (4) return to work, in low-resource settings was undertaken. Recommendations were developed based on identified articles, using a modified GRADE approach where evidence in a low-resource setting was available, or consensus where evidence was not. Results Available data on cost of CR delivery in low-resource settings suggests it is not feasible to deliver CR in low-resource settings as is delivered in high-resource ones. Strategies which can be implemented to deliver all of the core CR components in low-resource settings were summarised in practice recommendations, and approaches to patient assessment proffered. It is suggested that CR be adapted by delivery by non-physician healthcare workers, in non-clinical settings. Conclusions Advocacy to achieve political commitment for broad delivery of adapted CR services in low-resource settings is needed. PMID:27181874

  11. California Dental Hygiene Educators' Perceptions of an Application of the ADHA Advanced Dental Hygiene Practitioner (ADHP) Model in Medical Settings.

    PubMed

    Smith, Lauren; Walsh, Margaret

    2015-12-01

    To assess California dental hygiene educators' perceptions of an application of the American Dental Hygienists' Association's (ADHA) advanced dental hygiene practitioner model (ADHP) in medical settings where the advanced dental hygiene practitioner collaborates in medical settings with other health professionals to meet clients' oral health needs. In 2014, 30 directors of California dental hygiene programs were contacted to participate in and distribute an online survey to their faculty. In order to capture non-respondents, 2 follow-up e-mails were sent. Descriptive analysis and cross-tabulations were analyzed using the online survey software program, Qualtrics™. The educator response rate was 18% (70/387). Nearly 90% of respondents supported the proposed application of the ADHA ADHP model and believed it would increase access to care and reduce oral health disparities. They also agreed with most of the proposed services, target populations and workplace settings. Slightly over half believed a master's degree was the appropriate educational level needed. Among California dental hygiene educators responding to this survey, there was strong support for the proposed application of the ADHA model in medical settings. More research is needed among a larger sample of dental hygiene educators and clinicians, as well as among other health professionals such as physicians, nurses and dentists. Copyright © 2015 The American Dental Hygienists’ Association.

  12. Building up a QSAR model for toxicity toward Tetrahymena pyriformis by the Monte Carlo method: A case of benzene derivatives.

    PubMed

    Toropova, Alla P; Schultz, Terry W; Toropov, Andrey A

    2016-03-01

    Data on toxicity toward Tetrahymena pyriformis are an indicator of the applicability of a substance in ecological and pharmaceutical contexts. Quantitative structure-activity relationships (QSARs) between the molecular structure of benzene derivatives and toxicity toward T. pyriformis (expressed as the negative logarithm of the population growth inhibition dose, mmol/L) are established. The available data were randomly distributed three times into visible training and calibration sets and invisible validation sets. The statistical characteristics for the validation sets are the following: r²=0.8179 and s=0.338 (first distribution); r²=0.8682 and s=0.341 (second distribution); r²=0.8435 and s=0.323 (third distribution). These models are built up using only information on the molecular structure: no data on physicochemical parameters, 3D features of the molecular structure or quantum-mechanical descriptors are involved in the modelling process. Copyright © 2016 Elsevier B.V. All rights reserved.
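
    The validation statistics quoted above, r² and s, are the standard determination coefficient and standard error of estimate. For reference, a minimal implementation (our own definitions; the paper may use a slightly different degrees-of-freedom convention for s):

```python
import numpy as np

def r_squared(y_obs, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    ss_res = np.sum((y_obs - y_pred) ** 2)
    ss_tot = np.sum((y_obs - np.mean(y_obs)) ** 2)
    return 1.0 - ss_res / ss_tot

def standard_error(y_obs, y_pred, n_params=0):
    """Standard error of estimate; n_params adjusts the degrees of freedom."""
    resid = y_obs - y_pred
    return np.sqrt(np.sum(resid ** 2) / (len(y_obs) - n_params - 1))
```

    Reporting both on a held-out (invisible) validation set, as the paper does three times over random splits, guards against the training-set optimism that plagues small QSAR data sets.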

  13. The impact of urban open space and 'lift-up' building design on building intake fraction and daily pollutant exposure in idealized urban models.

    PubMed

    Sha, Chenyuan; Wang, Xuemei; Lin, Yuanyuan; Fan, Yifan; Chen, Xi; Hang, Jian

    2018-08-15

    Sustainable urban design is an effective way to improve urban ventilation and reduce urban residents' exposure to vehicular pollutants. This paper investigated the impacts of urban open space and 'lift-up' building design on vehicular CO (carbon monoxide) exposure in typical three-dimensional (3D) urban canopy layer (UCL) models under neutral atmospheric conditions. The building intake fraction (IF) represents the fraction of total vehicular pollutant emissions inhaled by residents while they stay at home. The building daily CO exposure (Et) is the extent of human contact with CO over one day indoors at home. Computational fluid dynamics (CFD) simulations integrating these two concepts were performed to solve the turbulent flow and assess vehicular CO exposure for urban residents. The CFD technique with the standard k-ε model was successfully validated against wind-tunnel data. The initial numerical UCL model consists of 5-row by 5-column (5×5) cubic buildings (building height H = street width W = 30 m) with four approaching wind directions (θ=0°, 15°, 30°, 45°). In Group I, one of the 25 building models is removed to obtain urban open-space settings. In Group II, the first floor (Lift-up1), second floor (Lift-up2), or third floor (Lift-up3) of all buildings is elevated to create wind pathways through the buildings. Compared to the initial case, urban open space can slightly or significantly reduce pollutant exposure for urban residents. At θ=30° and 45°, open-space settings are more effective at reducing pollutant exposure than at θ=0° and 15°. Pollutant dilution near or surrounding the open space and in its adjacent downstream regions is usually enhanced. Lift-up1 and Lift-up2 experience much greater pollutant exposure reduction in all wind directions than Lift-up3 and open space. Although further investigations are still required to provide practical guidelines, this study is one of the first attempts at reducing urban pollutant exposure by
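
    The intake-fraction concept above reduces to inhaled mass divided by emitted mass. A simplified single-zone sketch follows; the variable names and units are our assumptions, and the paper derives the indoor concentrations from CFD rather than from a box model.

```python
def intake_fraction(indoor_concentration, breathing_rate, occupants,
                    exposure_hours, emission_rate):
    """Intake fraction: mass inhaled by building occupants per unit mass
    emitted by traffic (dimensionless). Simplified single-zone estimate.

    indoor_concentration: kg/m^3; breathing_rate: m^3/h per person;
    emission_rate: kg/h of total vehicular emissions.
    """
    inhaled = indoor_concentration * breathing_rate * occupants * exposure_hours
    emitted = emission_rate * exposure_hours
    return inhaled / emitted
```

    Because IF normalizes by total emissions, it lets the paper compare open-space and lift-up layouts fairly even when the traffic source strength is held fixed.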

  14. Modeling of Protection in Dynamic Simulation Using Generic Relay Models and Settings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Samaan, Nader A.; Dagle, Jeffery E.; Makarov, Yuri V.

    This paper shows how generic protection relay models available in planning tools can be augmented with settings that are based on NERC standards or best engineering practice. Selected generic relay models in Siemens PSS®E have been used in dynamic simulations in the proposed approach. Undervoltage, overvoltage, underfrequency, and overfrequency relays have been modeled for each generating unit. Distance-relay protection was modeled for transmission system protection. Two types of load-shedding schemes were modeled: underfrequency (frequency-responsive non-firm load shedding) and underfrequency and undervoltage firm load shedding. Several case studies are given to show the impact of protection devices on dynamic simulations. This is useful for simulating cascading outages.
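
    As an example of the kind of generic relay logic being augmented with settings, an underfrequency relay reduces to a pickup threshold plus a definite-time delay applied to the simulated frequency trace. The pickup and delay values below are illustrative placeholders, not settings from the paper or from any standard.

```python
def underfrequency_trip_time(freq_trace, dt, pickup_hz=59.3, delay_s=0.1):
    """Scan a simulated frequency trace (Hz, sampled every dt seconds) and
    return the time at which a definite-time underfrequency relay trips,
    or None if it never does. Settings are illustrative assumptions."""
    below = 0.0
    for i, f in enumerate(freq_trace):
        if f < pickup_hz:
            below += dt
            if below >= delay_s:
                return i * dt  # trip once frequency stays low past the delay
        else:
            below = 0.0        # timer resets when frequency recovers
    return None
```

    In a dynamic simulation, such a check runs against each generating unit's frequency, and a trip feeds back into the network solution, which is how cascading behaviour emerges.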

  15. Search and rescue helicopter-assisted transfer of ST-elevation myocardial infarction patients from an island in the Baltic Sea: results from over 100 rescue missions.

    PubMed

    Schoos, Mikkel Malby; Kelbæk, Henning; Pedersen, Frants; Kjærgaard, Benedict; Trautner, Sven; Holmvang, Lene; Jørgensen, Erik; Helqvist, Steffen; Saunamäki, Kari; Engstrøm, Thomas; Clemmensen, Peter

    2014-11-01

    Since 2005, ST-elevation myocardial infarction (STEMI) patients from the island of Bornholm in the Baltic Sea have been transferred for primary percutaneous coronary intervention (pPCI) by an airborne service. We describe the results of pPCI as part of the Danish national reperfusion strategy offered to a remote island population. In this observational study, patients from Bornholm (n=101) were compared with patients from the mainland (Zealand) (n=2495), who were grouped according to treatment-delay intervals (<120, 121-180, >180 min). The primary endpoint was all-cause 30-day mortality. Individual-level outcome data from the Central Population Registry were linked to our in-hospital PCI database. Treatment delay was longer in patients from Bornholm (349 min (IQR 267-446)) than from Zealand (211 min (IQR 150-315)) (p<0.001). In patients from Zealand, 30-day mortality did not increase with time intervals (p=0.176), whereas long-term (∼3 years) mortality did (p=0.007). Thirty-day mortality was similar for Bornholm and the overall Zealand group (5.9% vs 6.2%, p=0.955). Early presenters (<180 min) from Zealand (37%) had similar 30-day mortality (5.3% vs 5.9%, p=0.789) but numerically reduced long-term mortality compared with Bornholm (12.8% vs 15.8%, p=0.387). Age, female gender, diabetes, Killip class >2 and preprocedural thrombolysis in myocardial infarction (TIMI) flow 0/1 independently predicted 30-day mortality, whereas treatment delay did not; postprocedural TIMI flow 3 predicted improved survival. In this small population of STEMI patients from a remote island, airborne transfer appears feasible and safe, and 30-day mortality after pPCI is comparable with that of the mainland population despite an inherent reperfusion delay exceeding guidelines. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  16. Timely Follow-Up of Abnormal Diagnostic Imaging Test Results in an Outpatient Setting: Are Electronic Medical Records Achieving Their Potential?

    PubMed Central

    Singh, Hardeep; Thomas, Eric J.; Mani, Shrinidi; Sittig, Dean; Arora, Harvinder; Espadas, Donna; Khan, Myrna M.; Petersen, Laura A.

    2010-01-01

    Background Given the fragmentation of outpatient care, timely follow-up of abnormal diagnostic test results remains a challenge. We hypothesized that an EMR that facilitates the transmission and availability of critical imaging results through either automated notification (alerting) or direct access to the primary report would eliminate this problem. Methods We studied critical imaging alert notifications in the outpatient setting of a tertiary care VA facility from November 2007 to June 2008. Tracking software determined whether the alert was acknowledged (i.e., the provider opened the message for viewing) within two weeks of transmission; acknowledged alerts were considered read. We reviewed medical records and contacted providers to determine timely follow-up actions (e.g., ordering a follow-up test or consultation) within 4 weeks of transmission. Multivariable logistic regression models accounting for clustering effects by provider analyzed predictors of two outcomes: lack of acknowledgment and lack of timely follow-up. Results Of 123,638 studies (including X-rays, CT scans, ultrasounds, MRI and mammography), 1196 (0.97%) images generated alerts; 217 (18.1%) of these were unacknowledged. Alerts had a higher risk of being unacknowledged when ordering providers were trainees (OR, 5.58; 95% CI, 2.86-10.89) and when dual communication (more than one provider alerted), as opposed to single, was used (OR, 2.02; 95% CI, 1.22-3.36). Timely follow-up was lacking in 92 (7.7% of all alerts) and was similar for acknowledged and unacknowledged alerts (7.3% vs 9.7%; p=0.2). The risk of lacking timely follow-up was higher with dual communication (OR, 1.99; 95% CI, 1.06-3.48) but lower when additional verbal communication was used by the radiologist (OR, 0.12; 95% CI, 0.04-0.38). Nearly all abnormal results lacking timely follow-up at 4 weeks were eventually found to have measurable clinical impact in terms of further diagnostic testing or treatment.
Conclusions Critical imaging results may not

  17. A fusion of top-down and bottom-up modeling techniques to constrain regional scale carbon budgets

    NASA Astrophysics Data System (ADS)

    Goeckede, M.; Turner, D. P.; Michalak, A. M.; Vickers, D.; Law, B. E.

    2009-12-01

    Bayesian inversion is then applied to assign scaling factors that align the surface fluxes with the CO2 time series. Our project demonstrates how bottom-up and top-down techniques can be reconciled to arrive at a more robust and balanced spatial carbon budget. We will show how to evaluate existing flux products through regionally representative atmospheric observations, i.e. how well the underlying model assumptions represent processes on the regional scale. Adapting process model parameterization sets for e.g. sub-regions, disturbance regimes, or land cover classes, in order to optimize the agreement between surface fluxes and atmospheric observations, can lead to improved understanding of the underlying flux mechanisms and reduce uncertainties in the regional carbon budgets.
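    The Bayesian-inversion step described above can be sketched with a linear Gaussian toy problem: given a prior flux estimate and synthetic atmospheric observations linked by a hypothetical transport matrix, the posterior per-region scaling factors have a closed form. All matrices and numbers below are illustrative assumptions, not the project's actual set-up.

    ```python
    import numpy as np

    # Synthetic "bottom-up" surface flux estimate and a transport operator H that
    # maps fluxes to atmospheric CO2 observations (all values hypothetical).
    rng = np.random.default_rng(1)
    f_prior = np.array([2.0, 1.5, 3.0])        # prior fluxes for 3 sub-regions
    H = rng.random((50, 3))                    # footprint / transport matrix
    s_true = np.array([1.3, 0.8, 1.1])         # true per-region scaling factors
    y = H @ (s_true * f_prior) + rng.normal(0, 0.05, 50)   # noisy observations

    # Gaussian prior s ~ N(1, sigma_p^2 I), observation error ~ N(0, sigma_o^2 I).
    sigma_p, sigma_o = 0.5, 0.05
    A = H * f_prior                            # sensitivity of y to s
    P_inv = np.eye(3) / sigma_p**2 + A.T @ A / sigma_o**2
    s_post = np.linalg.solve(P_inv, np.ones(3) / sigma_p**2 + A.T @ y / sigma_o**2)
    print(np.round(s_post, 2))                 # posterior mean, close to s_true
    ```

    Real inversions replace the toy matrix with an atmospheric transport model and use spatially correlated error covariances, but the posterior-mean algebra is the same.
    
    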

  18. Density-based cluster algorithms for the identification of core sets

    NASA Astrophysics Data System (ADS)

    Lemke, Oliver; Keller, Bettina G.

    2016-10-01

    The core-set approach is a discretization method for Markov state models of complex molecular dynamics. Core sets are disjoint metastable regions in the conformational space, which need to be known prior to the construction of the core-set model. We propose to use density-based cluster algorithms to identify the cores. We compare three different density-based cluster algorithms: the CNN, the DBSCAN, and the Jarvis-Patrick algorithm. While the core-set models based on the CNN and DBSCAN clustering are well-converged, constructing core-set models based on the Jarvis-Patrick clustering cannot be recommended. In a well-converged core-set model, the number of core sets is up to an order of magnitude smaller than the number of states in a conventional Markov state model with comparable approximation error. Moreover, using density-based clustering, one can extend the core-set method to systems which are not strongly metastable. This is important for the practical application of the core-set method because most biologically interesting systems are only marginally metastable. The key point is to perform a hierarchical density-based clustering while monitoring the structure of the metric matrix which appears in the core-set method. We test this approach on a molecular-dynamics simulation of a highly flexible 14-residue peptide. The resulting core-set models have a high spatial resolution and can distinguish between conformationally similar yet chemically different structures, such as register-shifted hairpin structures.
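    A minimal sketch of the core-identification idea: keep only points whose neighbourhood density exceeds a threshold, then group the retained points into connected components (the cores). This is a generic density-based scheme in the spirit of CNN/DBSCAN, not the authors' implementation; the radius, neighbour count and data are hypothetical.

    ```python
    import numpy as np
    from collections import deque

    def density_cores(points, r=0.3, min_neighbors=5):
        """Identify dense 'core' regions: keep points with enough neighbours
        within radius r, then group kept points into connected components."""
        d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
        dense = (d < r).sum(axis=1) - 1 >= min_neighbors   # exclude self
        labels = np.full(len(points), -1)                  # -1 = not in any core
        current = 0
        for i in np.flatnonzero(dense):
            if labels[i] != -1:
                continue
            queue = deque([i]); labels[i] = current
            while queue:                                   # BFS over dense points
                j = queue.popleft()
                for k in np.flatnonzero(dense & (d[j] < r) & (labels == -1)):
                    labels[k] = current; queue.append(k)
            current += 1
        return labels

    rng = np.random.default_rng(2)
    # Two metastable basins plus sparse "transition region" points.
    pts = np.vstack([rng.normal(0, 0.1, (100, 2)),
                     rng.normal(2, 0.1, (100, 2)),
                     rng.uniform(0, 2, (20, 2))])
    labels = density_cores(pts)
    print(labels.max() + 1)   # number of core sets found
    ```

    Points left at label -1 fall outside every core, mirroring how the core-set method treats the transition regions between metastable states.
    
    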

  19. Energy and direction distribution of neutrons in workplace fields: implication of the results from the EVIDOS project for the set-up of simulated workplace fields.

    PubMed

    Luszik-Bhadra, M; Lacoste, V; Reginatto, M; Zimbal, A

    2007-01-01

    Workplace neutron spectra from nuclear facilities obtained within the European project EVIDOS are compared with those of the simulated workplace fields CANEL and SIGMA and fields set up with radionuclide sources at the PTB. Contributions of neutrons to ambient dose equivalent and personal dose equivalent are given in three energy intervals (for thermal, intermediate and fast neutrons) together with the corresponding direction distribution, characterised by three different types of distributions (isotropic, weakly directed and directed). The comparison shows that none of the simulated workplace fields investigated here can model all the characteristics of the fields observed at power reactors.

  20. Trophic cascades of bottom-up and top-down forcing on nutrients and plankton in the Kattegat, evaluated by modelling

    NASA Astrophysics Data System (ADS)

    Petersen, Marcell Elo; Maar, Marie; Larsen, Janus; Møller, Eva Friis; Hansen, Per Juel

    2017-05-01

    The aim of the study was to investigate the relative importance of bottom-up and top-down forcing on trophic cascades in the pelagic food web and the implications for water quality indicators (summer phytoplankton biomass and winter nutrients) in relation to management. The 3D ecological model ERGOM was validated and applied in a local set-up of the Kattegat, Denmark, using the off-line Flexsem framework. The model scenarios were conducted by changing the forcing by ± 20% of nutrient inputs (bottom-up) and mesozooplankton mortality (top-down), and both types of forcing combined. The model results showed that cascading effects operated differently depending on the forcing type. In the single-forcing bottom-up scenarios, the cascades were in the same direction as the forcing. For scenarios involving top-down forcing, there was a skipped-level transmission in the trophic responses that was either attenuated or amplified at different trophic levels. On a seasonal scale, bottom-up forcing showed the strongest response during winter-spring for DIN and Chl a concentrations, whereas top-down forcing had the highest cascade strength during summer for Chl a concentrations and microzooplankton biomass. On an annual basis, the system was more bottom-up than top-down controlled. Microzooplankton was found to play an important role in the pelagic food web as a mediator of nutrient and energy fluxes. This study demonstrated that the best scenario for improved water quality was a combined reduction in nutrient input and mesozooplankton mortality, underlining the need for integrated management of marine areas exploited by human activities.

  1. Stress testing hydrologic models using bottom-up climate change assessment

    NASA Astrophysics Data System (ADS)

    Stephens, C.; Johnson, F.; Marshall, L. A.

    2017-12-01

    Bottom-up climate change assessment is a promising approach for understanding the vulnerability of a system to potential future changes. The technique has been utilised successfully in risk-based assessments of future flood severity and infrastructure vulnerability. We find that it is also an ideal tool for assessing hydrologic model performance in a changing climate. In this study, we applied bottom-up climate change assessment to compare the performance of two different hydrologic models (an event-based and a continuous model) under increasingly severe climate change scenarios. This allowed us to diagnose likely sources of future prediction error in the two models. The climate change scenarios were based on projections for southern Australia, which indicate drier average conditions with increased extreme rainfall intensities. We found that the key weakness in using the event-based model to simulate drier future scenarios was the model's inability to dynamically account for changing antecedent conditions. This led to increased variability in model performance relative to the continuous model, which automatically accounts for the wetness of a catchment through dynamic simulation of water storages. When considering more intense future rainfall events, representation of antecedent conditions became less important than assumptions around (non)linearity in catchment response. The linear continuous model we applied may underestimate flood risk in a future climate with greater extreme rainfall intensity. In contrast with the recommendations of previous studies, this indicates that continuous simulation is not necessarily the key to robust flood modelling under climate change. By applying bottom-up climate change assessment, we were able to understand systematic changes in relative model performance under changing conditions and deduce likely sources of prediction error in the two models.
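    The antecedent-condition argument can be illustrated with two toy rainfall-runoff models: a continuous bucket model whose storage carries memory of prior wet or dry spells, and an event model with a fixed initial loss that has no such memory. Both models and all parameter values are hypothetical sketches, not the models used in the study.

    ```python
    import numpy as np

    def continuous_runoff(rain, capacity=100.0, s0=50.0):
        """Minimal continuous bucket model: storage fills with rain, spills as
        runoff above capacity, and loses a fixed ET amount each step."""
        s, runoff = s0, []
        for r in rain:
            s = max(s + r - 2.0, 0.0)             # 2.0 = assumed daily ET
            runoff.append(max(s - capacity, 0.0))
            s = min(s, capacity)
        return np.array(runoff)

    def event_runoff(event_rain, loss=30.0):
        """Event model: fixed initial loss, no memory of antecedent conditions."""
        return max(event_rain - loss, 0.0)

    storm = 60.0
    wet_spell = [10.0] * 30 + [storm]             # storm after a wet month
    dry_spell = [1.0] * 30 + [storm]              # same storm after a dry month
    print(continuous_runoff(wet_spell)[-1], continuous_runoff(dry_spell)[-1])
    print(event_runoff(storm), event_runoff(storm))   # identical either way
    ```

    The continuous model produces very different storm runoff after wet versus dry antecedent months, while the event model is blind to the difference, which is exactly the weakness the study identifies under drier future climates.
    
    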

  2. Multiple Versus Single Set Validation of Multivariate Models to Avoid Mistakes.

    PubMed

    Harrington, Peter de Boves

    2018-01-02

    Validation of multivariate models is of current importance for a wide range of chemical applications. Although important, it is often neglected. The common practice is to use a single external validation set for evaluation. This approach is deficient and may mislead investigators with results that are specific to the single validation set of data. In addition, no statistics are available regarding the precision of a derived figure of merit (FOM). A statistical approach using bootstrapped Latin partitions is advocated. This validation method makes efficient use of the data because each object is used once for validation. Although the method was reviewed a decade earlier, primarily for the optimization of chemometric models, this review presents the reasons it should be used for generalized statistical validation. Average FOMs with confidence intervals are reported, and powerful matched-sample statistics may be applied for comparing models and methods. Examples demonstrate the problems with single validation sets.
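    A sketch of the advocated validation scheme, under the simplifying assumption of plain random (unstratified) partitions: each repetition splits the data into L disjoint partitions so every object is validated exactly once, and repeating with fresh partitions yields a mean figure of merit with a confidence interval. The toy nearest-mean classifier and all data are illustrative; true Latin partitions additionally preserve class proportions in each partition.

    ```python
    import numpy as np

    def nearest_mean_fit(Xtr, ytr, Xte):
        """Toy classifier: assign each test object to the nearest class mean."""
        classes = np.unique(ytr)
        means = np.array([Xtr[ytr == c].mean(axis=0) for c in classes])
        d = np.linalg.norm(Xte[:, None, :] - means[None, :, :], axis=-1)
        return classes[d.argmin(axis=1)]

    def latin_partition_validation(X, y, L=4, B=25, seed=0):
        """Each repetition splits the data into L disjoint partitions so every
        object is validated exactly once; B repetitions with fresh random
        partitions give a distribution of the figure of merit (accuracy)."""
        rng = np.random.default_rng(seed)
        accs = []
        for _ in range(B):
            order = rng.permutation(len(X))
            correct = 0
            for part in np.array_split(order, L):
                train = np.setdiff1d(np.arange(len(X)), part)
                correct += (nearest_mean_fit(X[train], y[train], X[part]) == y[part]).sum()
            accs.append(correct / len(X))
        accs = np.array(accs)
        return accs.mean(), 1.96 * accs.std(ddof=1) / np.sqrt(B)

    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(0, 1, (60, 5)), rng.normal(1.5, 1, (60, 5))])
    y = np.repeat([0, 1], 60)
    mean_acc, ci = latin_partition_validation(X, y)
    print(f"accuracy = {mean_acc:.2f} +/- {ci:.2f}")
    ```

    Unlike a single external validation set, this reports the FOM with an uncertainty, so two models can be compared with matched-sample statistics on the per-repetition accuracies.
    
    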

  3. Numerical Modelling of Three-Fluid Flow Using The Level-set Method

    NASA Astrophysics Data System (ADS)

    Li, Hongying; Lou, Jing; Shang, Zhi

    2014-11-01

    This work presents a numerical model for simulation of three-fluid flow involving two different moving interfaces. These interfaces are captured using the level-set method via two different level-set functions. A combined formulation with only one set of conservation equations for the whole physical domain, consisting of the three different immiscible fluids, is employed. Numerical solution is performed on a fixed mesh using the finite volume method. Surface tension effect is incorporated using the Continuum Surface Force model. Validation of the present model is made against available results for stratified flow and rising bubble in a container with a free surface. Applications of the present model are demonstrated by a variety of three-fluid flow systems including (1) three-fluid stratified flow, (2) two-fluid stratified flow carrying the third fluid in the form of drops and (3) simultaneous rising and settling of two drops in a stationary third fluid. The work is supported by a Thematic and Strategic Research from A*STAR, Singapore (Ref. #: 1021640075).
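    How two level-set functions partition a domain into three immiscible fluids can be sketched statically: the sign of the first function marks fluid 1, and the sign of the second distinguishes fluids 2 and 3 elsewhere. The geometry and property values below are hypothetical, and the dynamics (advection of the level sets, surface tension) are omitted.

    ```python
    import numpy as np

    # Two signed-distance-like level-set functions on a grid: phi1 separates
    # fluid 1 from the rest, phi2 separates fluids 2 and 3 within the rest.
    n = 200
    x = np.linspace(0, 1, n)
    X, Y = np.meshgrid(x, x)
    phi1 = Y - 0.3                               # fluid 1 below y = 0.3 (stratified layer)
    phi2 = np.hypot(X - 0.5, Y - 0.65) - 0.1     # a drop of fluid 2 inside fluid 3

    density = np.where(phi1 < 0, 1000.0,         # fluid 1
               np.where(phi2 < 0, 800.0, 1.2))   # fluid 2 drop vs fluid 3
    frac = [(density == rho).mean() for rho in (1000.0, 800.0, 1.2)]
    print(np.round(frac, 3))   # area fractions of the three fluids
    ```

    In the paper's combined formulation, such sign tests become smoothed Heaviside functions of phi1 and phi2 that blend the fluid properties in one set of conservation equations.
    
    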

  4. Route constraints model based on polychromatic sets

    NASA Astrophysics Data System (ADS)

    Yin, Xianjun; Cai, Chao; Wang, Houjun; Li, Dongwu

    2018-03-01

    With the development of unmanned aerial vehicle (UAV) technology, the fields of its application are constantly expanding. The mission planning of a UAV is especially important, and the planning result directly influences whether the UAV can accomplish its task. In order to make the results of mission planning for unmanned aerial vehicles more realistic, it is necessary to consider not only the physical properties of the aircraft, but also the constraints among the various equipment on the UAV. However, the constraints among the equipment of a UAV are complex, and the equipment has strong diversity and variability, which makes these constraints difficult to describe. In order to solve this problem, this paper draws on polychromatic sets theory, which is used in the advanced manufacturing field to describe complex systems, and presents a mission constraint model of the UAV based on polychromatic sets.

  5. Emergency residential care settings: A model for service assessment and design.

    PubMed

    Graça, João; Calheiros, Maria Manuela; Patrício, Joana Nunes; Magalhães, Eunice Vieira

    2018-02-01

    There have been calls for uncovering the "black box" of residential care services, with a particular need for research focusing on emergency care settings for children and youth in danger. In fact, the strikingly scant empirical attention that these settings have received so far contrasts with the role that they often play as gateway into the child welfare system. To answer these calls, this work presents and tests a framework for assessing a service model in residential emergency care. It comprises seven studies which address a set of different focal areas (e.g., service logic model; care experiences), informants (e.g., case records; staff; children/youth), and service components (e.g., case assessment/evaluation; intervention; placement/referral). Drawing on this process-consultation approach, the work proposes a set of key challenges for emergency residential care in terms of service improvement and development, and calls for further research targeting more care units and different types of residential care services. These findings offer a contribution to inform evidence-based practice and policy in service models of residential care. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. SETTING UP OF A HOMECARE SYSTEM FOR HIGH COST NEBULISERS IN A PAEDIATRIC CYSTIC FIBROSIS CENTRE.

    PubMed

    Chorro-Mari, Veronica; Christiansen, Nanna

    2016-09-01

    Due to national changes to the commissioning process of high cost nebulisers (HCN) for Cystic Fibrosis (CF) patients, CF centres have to repatriate the prescribing of HCN to the tertiary care centres.1 The following nebulisers will no longer be prescribed by primary care: Cayston® (Aztreonam); Colomycin®, Promixin®, Colobreathe® (Colistimethate); Pulmozyme® (Dornase alfa); Tobi®, Tobi Podhaler®, Bramitob® (Tobramycin). This abstract explains how the Royal London Hospital (RLH) Paediatric Pharmacy smoothly recruited over 100 paediatric CF patients within a period of 4 months and set up a homecare system to avoid patients and families having to travel large distances to obtain their medication. A number of homecare companies were evaluated initially. Parameters looked at were reports of customer satisfaction, delivery cost, turn-around time once the prescription was received and availability of a same day delivery service. In order to capture existing patients we met with CF Specialist Nurses to establish the total number of patients on HCN, what nebulised treatment they were on and their respective doses. We prioritised patients that had known problems with GP prescribing and anybody newly starting on HCN. To communicate the change to parents, a letter was sent to all parents explaining the changeover to homecare delivery and tertiary prescribing. In addition, a section in the parent bulletin was dedicated to the topic as well. Following this we contacted parents via phone and in clinic to request consent and explain the process. Up to 10 patients were contacted weekly (average of 7); the consent form and registration form were then faxed to the homecare company for patient registration. In parallel to this, prescriptions were requested for the patients that had been set up in the previous week, ensuring that prescribing was spread out over time to avoid having peak times for repeat prescriptions. In addition to the letter to parents, GP surgeries were also

  7. Cardiac rehabilitation delivery model for low-resource settings.

    PubMed

    Grace, Sherry L; Turk-Adawi, Karam I; Contractor, Aashish; Atrey, Alison; Campbell, Norm; Derman, Wayne; Melo Ghisi, Gabriela L; Oldridge, Neil; Sarkar, Bidyut K; Yeo, Tee Joo; Lopez-Jimenez, Francisco; Mendis, Shanthi; Oh, Paul; Hu, Dayi; Sarrafzadegan, Nizal

    2016-09-15

    Cardiovascular disease is a global epidemic, which is largely preventable. Cardiac rehabilitation (CR) is demonstrated to be cost-effective and efficacious in high-income countries. CR could represent an important approach to mitigate the epidemic of cardiovascular disease in lower-resource settings. The purpose of this consensus statement was to review low-cost approaches to delivering the core components of CR and to propose a testable model of CR which could feasibly be delivered in middle-income countries. A literature review was undertaken regarding delivery in low-resource settings of each core CR component, namely: (1) lifestyle risk factor management (ie, physical activity, diet, tobacco and mental health), (2) medical risk factor management (eg, lipid control, blood pressure control), (3) education for self-management and (4) return to work. Recommendations were developed based on identified articles, using a modified GRADE approach where evidence in a low-resource setting was available, or consensus where evidence was not. Available data on the cost of CR delivery in low-resource settings suggest it is not feasible to deliver CR in low-resource settings as it is delivered in high-resource ones. Strategies which can be implemented to deliver all of the core CR components in low-resource settings were summarised in practice recommendations, and approaches to patient assessment proffered. It is suggested that CR be adapted for delivery by non-physician healthcare workers, in non-clinical settings. Advocacy to achieve political commitment for broad delivery of adapted CR services in low-resource settings is needed. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  8. Neonatal resuscitation in low-resource settings: What, who, and how to overcome challenges to scale up?

    PubMed Central

    Wall, Stephen N.; Lee, Anne CC; Niermeyer, Susan; English, Mike; Keenan, William J.; Carlo, Wally; Bhutta, Zulfiqar A.; Bang, Abhay; Narayanan, Indira; Ariawan, Iwan; Lawn, Joy E.

    2009-01-01

    Background Each year approximately 10 million babies do not breathe immediately at birth, of which about 6 million require basic neonatal resuscitation. The major burden is in low-income settings, where health system capacity to provide neonatal resuscitation is inadequate. Objective To systematically review the evidence for neonatal resuscitation content, training and competency, equipment and supplies, cost, and key program considerations, specifically for resource-constrained settings. Results Evidence from several observational studies shows that facility-based basic neonatal resuscitation may avert 30% of intrapartum-related neonatal deaths. Very few babies require advanced resuscitation (endotracheal intubation and drugs) and these newborns may not survive without ongoing ventilation; hence, advanced neonatal resuscitation is not a priority in settings without neonatal intensive care. Of the 60 million nonfacility births, most do not have access to resuscitation. Several trials have shown that a range of community health workers can perform neonatal resuscitation with an estimated effect of a 20% reduction in intrapartum-related neonatal deaths, based on expert opinion. Case studies illustrate key considerations for scale up. Conclusion Basic resuscitation would substantially reduce intrapartum-related neonatal deaths. Where births occur in facilities, it is a priority to ensure that all birth attendants are competent in resuscitation. Strategies to address the gap for home births are urgently required. More data are required to determine the impact of neonatal resuscitation, particularly on long-term outcomes in low-income settings. PMID:19815203

  9. Level-set techniques for facies identification in reservoir modeling

    NASA Astrophysics Data System (ADS)

    Iglesias, Marco A.; McLaughlin, Dennis

    2011-03-01

    In this paper we investigate the application of level-set techniques for facies identification in reservoir models. The identification of facies is a geometrical inverse ill-posed problem that we formulate in terms of shape optimization. The goal is to find a region (a geologic facies) that minimizes the misfit between predicted and measured data from an oil-water reservoir. In order to address the shape optimization problem, we present a novel application of the level-set iterative framework developed by Burger (2002 Interfaces Free Bound. 5 301-29; 2004 Inverse Problems 20 259-82) for inverse obstacle problems. The optimization is constrained by the reservoir model, a nonlinear large-scale system of PDEs that describes the reservoir dynamics. We reformulate this reservoir model in a weak (integral) form whose shape derivative can be formally computed from standard results of shape calculus. At each iteration of the scheme, the current estimate of the shape derivative is utilized to define a velocity in the level-set equation. The proper selection of this velocity ensures that the new shape decreases the cost functional. We present results of facies identification where the velocity is computed with the gradient-based (GB) approach of Burger (2002) and the Levenberg-Marquardt (LM) technique of Burger (2004). While an adjoint formulation allows the straightforward application of the GB approach, the LM technique requires the computation of the large-scale Karush-Kuhn-Tucker system that arises at each iteration of the scheme. We efficiently solve this system by means of the representer method. We present some synthetic experiments to show and compare the capabilities and limitations of the proposed implementations of level-set techniques for the identification of geologic facies.

  10. Photon up-conversion increases biomass yield in Chlorella vulgaris.

    PubMed

    Menon, Kavya R; Jose, Steffi; Suraishkumar, Gadi K

    2014-12-01

    Photon up-conversion, a process whereby lower energy radiations are converted to higher energy levels via the use of appropriate phosphor systems, was employed as a novel strategy for improving microalgal growth and lipid productivity. Photon up-conversion enables the utilization of regions of the solar spectrum, beyond the typical photosynthetically active radiation, that are usually wasted or are damaging to the algae. The effects of up-conversion of red light by two distinct sets of up-conversion phosphors were studied in the model microalga Chlorella vulgaris. Up-conversion by set 1 phosphors led to a 2.85-fold increase in biomass concentration and a 3.2-fold increase in specific growth rate of the microalgae. Up-conversion by set 2 phosphors resulted in a 30% increase in biomass and a 12% increase in specific intracellular neutral lipid, while the specific growth rates were comparable to that of the control. Furthermore, up-conversion resulted in higher levels of specific intracellular reactive oxygen species in C. vulgaris. Up-conversion of red light (654 nm) was shown to improve biomass yields in C. vulgaris. In principle, up-conversion can be used to increase the utilization range of the electromagnetic spectrum for improved cultivation of photosynthetic systems such as plants, algae, and microalgae. Copyright © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. Predicting factors for malaria re-introduction: an applied model in an elimination setting to prevent malaria outbreaks.

    PubMed

    Ranjbar, Mansour; Shoghli, Alireza; Kolifarhood, Goodarz; Tabatabaei, Seyed Mehdi; Amlashi, Morteza; Mohammadi, Mahdi

    2016-03-02

    Malaria re-introduction is a challenge in elimination settings. To prevent re-introduction, receptivity, vulnerability, and health system capacity of foci should be monitored using appropriate tools. This study aimed to design an applicable model to monitor predicting factors of re-introduction of malaria in highly prone areas. This exploratory, descriptive study was conducted in a pre-elimination setting with a high risk of malaria transmission re-introduction. By using the nominal group technique and a literature review, a list of predicting indicators for malaria re-introduction and outbreak was defined. Accordingly, a checklist was developed and completed in the field for foci affected by re-introduction and for cleared-up foci as a control group, for a period of 12 weeks before re-introduction and for the same period in the previous year. Using field data and the analytic hierarchy process (AHP), each variable and its sub-categories were weighted, and by calculating geometric means for each sub-category, the scores of corresponding cells of interaction matrices and the lower and upper thresholds of different risk strata, including low and mild risk of re-introduction and moderate and high risk of malaria outbreaks, were determined. The developed predictive model was calibrated through resampling with different sets of explanatory variables using R software. Sensitivity and specificity of the model were calculated based on new samples. Twenty explanatory predictive variables of malaria re-introduction were identified and a predictive model was developed. Unpermitted immigrants from endemic neighbouring countries were determined as a pivotal factor (AHP score: 0.181). Moreover, quality of population movement (0.114), following malaria transmission season (0.088), average daily minimum temperature in the previous 8 weeks (0.062), an outdoor resting shelter for vectors (0.045), and rainfall (0.042) were determined.
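    The AHP weighting step can be sketched as follows: factor weights are taken as the normalised row geometric means of a pairwise-comparison matrix. The 3x3 matrix below compares three of the reported factor types with purely illustrative judgements, not the study's elicited values.

    ```python
    import numpy as np

    def ahp_weights(M):
        """AHP weights from a pairwise-comparison matrix via the row geometric
        mean, normalised to sum to 1 (M[i, j] = importance of i relative to j)."""
        g = np.prod(M, axis=1) ** (1.0 / M.shape[0])
        return g / g.sum()

    # Hypothetical comparison of three re-introduction risk factors:
    # immigration vs temperature vs rainfall (judgement values illustrative only).
    M = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])
    w = ahp_weights(M)
    print(np.round(w, 3))   # immigration receives the largest weight
    ```

    A full AHP application would also check the consistency ratio of the comparison matrix before accepting the weights; that step is omitted here.
    
    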
Positive and negative predictive values of the model were 81.8 and

  12. Regionalisation of statistical model outputs creating gridded data sets for Germany

    NASA Astrophysics Data System (ADS)

    Höpp, Simona Andrea; Rauthe, Monika; Deutschländer, Thomas

    2016-04-01

    The goal of the German research program ReKliEs-De (regional climate projection ensembles for Germany, http://.reklies.hlug.de) is to distribute robust information about the range and the extremes of future climate for Germany and its neighbouring river catchment areas. This joint research project is supported by the German Federal Ministry of Education and Research (BMBF) and was initiated by the German Federal States. The project results are meant to support the development of adaptation strategies to mitigate the impacts of future climate change. The aim of our part of the project is to adapt and transfer the regionalisation methods of the gridded hydrological data set (HYRAS) from daily station data to the station-based statistical regional climate model output of WETTREG (a regionalisation method based on weather patterns). The WETTREG model output covers the period of 1951 to 2100 with a daily temporal resolution. From this, we generate a gridded data set of the WETTREG output for precipitation, air temperature and relative humidity with a spatial resolution of 12.5 km x 12.5 km, which is common for regional climate models. This regionalisation thus allows statistical model outputs to be compared with those of dynamical climate models. The HYRAS data set was developed by the German Meteorological Service within the German research program KLIWAS (www.kliwas.de) and consists of daily gridded data for Germany and its neighbouring river catchment areas. It has a spatial resolution of 5 km x 5 km for the entire domain for the hydro-meteorological elements precipitation, air temperature and relative humidity, and covers the period of 1951 to 2006. After conservative remapping, the HYRAS data set is also suitable for the validation of climate models.
The presentation will consist of two parts to present the actual state of the adaptation of the HYRAS regionalisation methods to the statistical regional climate model WETTREG: First, an overview of the HYRAS data set and the regionalisation

  13. Assessment and quantification of patient set-up errors in nasopharyngeal cancer patients and their biological and dosimetric impact in terms of generalized equivalent uniform dose (gEUD), tumour control probability (TCP) and normal tissue complication probability (NTCP)

    PubMed Central

    Marcie, S; Fellah, M; Chami, S; Mekki, F

    2015-01-01

    Objective: The aim of this study is to assess and quantify patients' set-up errors using an electronic portal imaging device and to evaluate their dosimetric and biological impact in terms of generalized equivalent uniform dose (gEUD) on predictive models, such as the tumour control probability (TCP) and the normal tissue complication probability (NTCP). Methods: 20 patients treated for nasopharyngeal cancer were enrolled in the radiotherapy–oncology department of HCA. Systematic and random errors were quantified. The dosimetric and biological impact of these set-up errors on the target volume and the organ at risk (OARs) coverage were assessed using calculation of dose–volume histogram, gEUD, TCP and NTCP. For this purpose, in-house software was developed and used. Results: The standard deviations (1 SD) of the systematic set-up and random set-up errors were calculated for the lateral and subclavicular fields and gave the following results: Σ = 0.63 ± (0.42) mm and σ = 3.75 ± (0.79) mm, respectively. Thus a planning organ at risk volume (PRV) margin of 3 mm was defined around the OARs, and a 5-mm margin used around the clinical target volume. The gEUD, TCP and NTCP calculations obtained with and without set-up errors have shown increased values for tumour, where ΔgEUD (tumour) = 1.94% Gy (p = 0.00721) and ΔTCP = 2.03%. The toxicity of OARs was quantified using gEUD and NTCP. The values of ΔgEUD (OARs) vary from 0.78% to 5.95% in the case of the brainstem and the optic chiasm, respectively. The corresponding ΔNTCP varies from 0.15% to 0.53%, respectively. Conclusion: The quantification of set-up errors has a dosimetric and biological impact on the tumour and on the OARs. The developed in-house software using the concept of gEUD, TCP and NTCP biological models has been successfully used in this study. It can be used also to optimize the treatment plan established for our patients. Advances in knowledge: The g
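    Estimating the systematic (Σ) and random (σ) error components from daily set-up shifts can be sketched as below: Σ is the SD of per-patient mean shifts and σ the root-mean-square of per-patient SDs. For illustration the sketch also evaluates the widely used van Herk CTV-to-PTV margin recipe 2.5Σ + 0.7σ, which is not the fixed 3 mm / 5 mm margins adopted in this study; all shift data are simulated.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    # Simulated daily set-up shifts (mm) for 20 patients x 30 fractions:
    # each patient has a systematic offset plus a day-to-day random error.
    sys_true, rand_true = 1.0, 2.0
    offsets = rng.normal(0, sys_true, (20, 1))
    shifts = offsets + rng.normal(0, rand_true, (20, 30))

    patient_means = shifts.mean(axis=1)
    Sigma = patient_means.std(ddof=1)                          # systematic error (1 SD)
    sigma = np.sqrt((shifts.std(axis=1, ddof=1) ** 2).mean())  # random error (RMS of SDs)
    margin = 2.5 * Sigma + 0.7 * sigma                         # van Herk margin recipe
    print(round(Sigma, 2), round(sigma, 2), round(margin, 2))
    ```

    Note that the per-patient means also absorb some random error (variance rand_true^2 / n_fractions), so Σ estimated this way slightly overstates the purely systematic component.
    
    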

  14. Spin-Up and Tuning of the Global Carbon Cycle Model Inside the GISS ModelE2 GCM

    NASA Technical Reports Server (NTRS)

    Aleinov, Igor; Kiang, Nancy Y.; Romanou, Anastasia

    2015-01-01

    The planetary carbon cycle involves multiple phenomena, acting at a variety of temporal and spatial scales. The typical times range from minutes for leaf stomata physiology to centuries for passive soil carbon pools and deep ocean layers. Finding a satisfactory equilibrium state therefore becomes a challenging and computationally expensive task. Here we present the spin-up processes for different configurations of the GISS Carbon Cycle model, from the model forced with MODIS-observed Leaf Area Index (LAI) and a prescribed ocean, to prognostic LAI, to the model fully coupled to the dynamic ocean and ocean biology. We investigate the time it takes the model to reach equilibrium and discuss ways to speed up this process. The NASA Goddard Institute for Space Studies General Circulation Model (GISS ModelE2) is currently equipped with all major algorithms necessary for simulation of the Global Carbon Cycle. The terrestrial part is represented by the Ent Terrestrial Biosphere Model (Ent TBM), which includes leaf biophysics, prognostic phenology and a soil biogeochemistry module (based on the Carnegie-Ames-Stanford model). The ocean part is based on the NASA Ocean Biogeochemistry Model (NOBM). The transport of atmospheric CO2 is performed by the atmospheric part of ModelE2, which employs a quadratic upstream algorithm for this purpose.

  15. A new small-angle X-ray scattering set-up on the crystallography beamline I711 at MAX-lab.

    PubMed

    Knaapila, M; Svensson, C; Barauskas, J; Zackrisson, M; Nielsen, S S; Toft, K N; Vestergaard, B; Arleth, L; Olsson, U; Pedersen, J S; Cerenius, Y

    2009-07-01

    A small-angle X-ray scattering (SAXS) set-up has recently been developed at beamline I711 at the MAX II storage ring in Lund (Sweden). An overview of the required modifications is presented here together with a number of application examples. The accessible q range in a SAXS experiment is 0.009-0.3 Å⁻¹ for the standard set-up but depends on the sample-to-detector distance, detector offset, beamstop size and wavelength. The SAXS camera has been designed to have a low background and has three collinear slit sets for collimating the incident beam. The standard beam size is about 0.37 mm x 0.37 mm (full width at half-maximum) at the sample position, with a flux of 4 × 10¹⁰ photons s⁻¹ and λ = 1.1 Å. The vacuum is of the order of 0.05 mbar in the unbroken beam path from the first slits until the exit window in front of the detector. A large sample chamber with a number of lead-throughs allows different sample environments to be mounted. This station is used for measurements on weakly scattering proteins in solutions and also for colloids, polymers and other nanoscale structures. A special application supported by the beamline is the effort to establish a micro-fluidic sample environment for structural analysis of samples that are only available in limited quantities. Overall, this work demonstrates how a cost-effective SAXS station can be constructed on a multipurpose beamline.
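    The dependence of the accessible q range on the camera geometry follows from the standard relation q = 4π sin(θ)/λ with 2θ = arctan(r/SDD), where r runs from the beamstop radius to the usable detector radius. The specific numbers below are illustrative assumptions, not the actual I711 parameters.

    ```python
    import numpy as np

    def q_of(radius_mm, sdd_mm, wavelength_A):
        """Scattering vector q = 4*pi*sin(theta)/lambda for a ring at the given
        radius on the detector (2*theta = arctan(radius / sample-detector distance))."""
        theta = 0.5 * np.arctan(radius_mm / sdd_mm)
        return 4 * np.pi * np.sin(theta) / wavelength_A

    # Hypothetical geometry: 1.5 m camera length, lambda = 1.1 A, 4 mm beamstop
    # radius, 95 mm usable detector radius (values illustrative only).
    q_min = q_of(4.0, 1500.0, 1.1)
    q_max = q_of(95.0, 1500.0, 1.1)
    print(round(q_min, 4), round(q_max, 3))   # accessible q range in A^-1
    ```

    Moving the detector closer or offsetting it extends q_max, while a smaller beamstop or longer camera lowers q_min, which is why the quoted 0.009-0.3 Å⁻¹ range is geometry-dependent.
    
    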

  16. Multiple shock reverberation compression of dense Ne up to the warm dense regime: Evaluating the theoretical models

    NASA Astrophysics Data System (ADS)

    Tang, J.; Gu, Y. J.; Chen, Q. F.; Li, Z. G.; Zheng, J.; Li, C. J.; Li, J. T.

    2018-04-01

    Multiple shock reverberation compression experiments are designed and performed to determine the equation of state of neon ranging from the initial dense gas up to the warm dense regime, where the pressure is from about 40 MPa to 120 GPa and the temperature is from about 297 K up to above 20 000 K. The wide-region experimental data are used to evaluate the available theoretical models. It is found that, for neon below 1.1 g/cm³, within the framework of density functional theory molecular dynamics, a van der Waals correction is meaningful. Under high pressure and temperature, results from the self-consistent fluid variational theory model are sensitive to the potential parameters and could give successful predictions in the whole experimental regime if a set of proper parameters is employed. The new observations on neon under megabar (1 Mbar = 10¹¹ Pa) pressure and eV temperature (1 eV ≈ 10⁴ K) enrich the understanding of the properties of warm dense matter and have potential applications in revealing the formation and evolution of gaseous giants or mega-Earths.

  17. Predictive information speeds up visual awareness in an individuation task by modulating threshold setting, not processing efficiency.

    PubMed

    De Loof, Esther; Van Opstal, Filip; Verguts, Tom

    2016-04-01

    Theories on visual awareness claim that predicted stimuli reach awareness faster than unpredicted ones. In the current study, we disentangle whether prior information about the upcoming stimulus affects visual awareness of stimulus location (i.e., individuation) by modulating processing efficiency or threshold setting. Analogous research on stimulus identification revealed that prior information modulates threshold setting. However, as identification and individuation are two functionally and neurally distinct processes, the mechanisms underlying identification cannot simply be extrapolated directly to individuation. The goal of this study was therefore to investigate how individuation is influenced by prior information about the upcoming stimulus. To do so, a drift diffusion model was fitted to estimate the processing efficiency and threshold setting for predicted versus unpredicted stimuli in a cued individuation paradigm. Participants were asked to locate a picture, following a cue that was congruent, incongruent or neutral with respect to the picture's identity. Pictures were individuated faster in the congruent and neutral condition compared to the incongruent condition. In the diffusion model analysis, the processing efficiency was not significantly different across conditions. However, the threshold setting was significantly higher following an incongruent cue compared to both congruent and neutral cues. Our results indicate that predictive information about the upcoming stimulus influences visual awareness by shifting the threshold for individuation rather than by enhancing processing efficiency. Copyright © 2016 Elsevier Ltd. All rights reserved.
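The threshold-versus-efficiency distinction can be made concrete with a toy drift-diffusion simulation (an illustration of the model class, not the authors' fitting procedure): holding the drift rate fixed while raising the decision threshold lengthens decision times, mimicking the effect of an incongruent cue.

```python
import random

# Toy drift-diffusion simulation (illustrative, not the authors' fitted
# model). Evidence accumulates at rate `drift` with Gaussian noise until
# it crosses +threshold or -threshold; raising the threshold slows
# decisions without changing processing efficiency (the drift rate).

def mean_decision_time(drift, threshold, noise=1.0, dt=0.01,
                       n_trials=500, seed=0):
    rng = random.Random(seed)
    total_time = 0.0
    for _ in range(n_trials):
        evidence = 0.0
        while abs(evidence) < threshold:
            evidence += drift * dt + noise * (dt ** 0.5) * rng.gauss(0.0, 1.0)
            total_time += dt
    return total_time / n_trials

rt_low = mean_decision_time(drift=1.0, threshold=1.0)   # e.g. congruent cue
rt_high = mean_decision_time(drift=1.0, threshold=2.0)  # e.g. incongruent cue
```

In a fit to real data the logic runs in reverse: the model estimates which parameter (drift or threshold) must differ between conditions to reproduce the observed response-time distributions.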

  18. Setting up of the Indian HIPEC Registry: A Registry for Indian Patients with Peritoneal Surface Malignancies.

    PubMed

    Bhatt, Aditi; Mehta, Sanket; Ramakrishnan As; Pande, Pankaj; Rajan, Firoz; Rangole, Ashvin; Saklani, Avanish; Sethna, Kayomarz; Singh, Shivendra; Zaveri, Shabber; Gopinath, K S

    2017-12-01

    There are various registries for patients with peritoneal metastases (PM) that aid pooling of data and generate evidence that dictates current clinical practice. This manuscript describes the setting up of the Indian HIPEC registry, which was established with a similar goal by a group of Indian surgeons. This is a registry for patients with PM treated with cytoreductive surgery (CRS) and hyperthermic intraperitoneal chemotherapy (HIPEC) in India. It also acts as a database for storing treatment-related information. Patients with PM from colorectal, ovarian, gastric and appendiceal tumors, and other rare peritoneal tumors/metastases from rare tumors, are enrolled in the registry. A coordinator updates the disease status of patients on a yearly basis. A private organization maintains the database. A non-disclosure agreement is signed between the company and each surgeon contributing to the registry to maintain confidentiality. For enrolling patients, securing institutional permission depends on the requirement of each institute; patient consent is mandatory. Data entry can be prospective or retrospective. To propose and conduct a study, the approval of a scientific committee linked to the registry is required. The Indian HIPEC registry is a practical database for Indian surgeons. There is no regulatory body that mandates collection and publication of scientific data in India. The onus is on each surgeon to capture valuable information pertaining to these common and rare diseases that could contribute to the existing scientific knowledge and guide the treatment of these patients in the future. The next challenge will be to enter data into the registry.

  19. Setting up a Free School: Successful Proposers' Experiences

    ERIC Educational Resources Information Center

    Miller, Paul; Craven, Barrie; Tooley, James

    2014-01-01

    The 2010 Academies Act was significant in introducing Free Schools to the English education system. Opening up funding to new, non-profit entrants on the basis of demand, the policy has aroused support and controversy on political, philosophical and practical educational grounds with implications for social justice in terms of equity and freedom.…

  20. Modelling wave-induced sea ice break-up in the marginal ice zone

    NASA Astrophysics Data System (ADS)

    Montiel, F.; Squire, V. A.

    2017-10-01

    A model of ice floe break-up under ocean wave forcing in the marginal ice zone (MIZ) is proposed to investigate how floe size distribution (FSD) evolves under repeated wave break-up events. A three-dimensional linear model of ocean wave scattering by a finite array of compliant circular ice floes is coupled to a flexural failure model, which breaks a floe into two floes provided the two-dimensional stress field satisfies a break-up criterion. A closed-feedback loop algorithm is devised, which (i) solves the wave-scattering problem for a given FSD under time-harmonic plane wave forcing, (ii) computes the stress field in all the floes, (iii) fractures the floes satisfying the break-up criterion, and (iv) generates an updated FSD, initializing the geometry for the next iteration of the loop. The FSD after 50 break-up events is unimodal and near normal, or bimodal, suggesting waves alone do not govern the power law observed in some field studies. Multiple scattering is found to enhance break-up for long waves and thin ice, but to reduce break-up for short waves and thick ice. A break-up front marches forward in the latter regime, as wave-induced fracture weakens the ice cover, allowing waves to travel deeper into the MIZ.
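The closed-feedback loop (i)-(iv) can be sketched in a few lines if the wave-scattering and stress computations are replaced by a toy break-up criterion (any floe wider than a threshold splits in two at a random point). Only the structure of the iteration mirrors the paper; the physics does not.

```python
import random

# Schematic of the closed-feedback break-up loop (steps i-iv), with the
# wave-scattering and stress computations replaced by a toy criterion:
# any floe wider than `break_threshold` fractures into two pieces at a
# random point. Sizes are in arbitrary units.

def repeated_break_up(fsd, break_threshold, n_events=50, seed=1):
    rng = random.Random(seed)
    for _ in range(n_events):                  # one wave event per pass
        updated = []
        for diameter in fsd:
            if diameter > break_threshold:     # stand-in for (ii)-(iii)
                frac = rng.uniform(0.3, 0.7)
                updated += [diameter * frac, diameter * (1.0 - frac)]
            else:
                updated.append(diameter)
        fsd = updated                          # (iv): updated FSD
    return fsd

# Ten large floes fragment over repeated events; total "ice area" is
# conserved while the floe count grows.
final_fsd = repeated_break_up([200.0] * 10, break_threshold=50.0)
```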

  1. Setting-up tension in the style of Marantaceae.

    PubMed

    Pischtschan, E; Classen-Bockhoff, R

    2008-07-01

    The Marantaceae stand out from other plant families through their unique style movement which is combined with a highly derived form of secondary pollen presentation. Although known for a long time, the mechanism underlying the movement is not yet understood. In this paper, we report an investigation into the biomechanical principles of this movement. For the first time we experimentally confirm that, in Maranta noctiflora, longitudinal growth of the maturing style within the 'straitjacket' of the hooded staminode involves both arresting of the style before tripping and building up of potential for the movement. The longer the style grows in relation to the enclosing hooded staminode, the more does its capacity for curling increase. We distinguish between the basic tension that a growing style builds up normally, even when the hooded staminode is removed beforehand, and the induced tension which comes about only under the pressure of a too short hooded staminode and which enables the movement. The results of our investigations are discussed in relation to previous interpretations, ranging from biomechanical to electrophysiological mechanisms.

  2. Make-up wells drilling cost in financial model for a geothermal project

    NASA Astrophysics Data System (ADS)

    Oktaviani Purwaningsih, Fitri; Husnie, Ruly; Afuar, Waldy; Abdurrahman, Gugun

    2017-12-01

    After commissioning of a power plant, a geothermal reservoir will encounter pressure decline, which will affect well productivity. Therefore, further drilling is carried out to enhance steam production. Make-up wells are production wells drilled inside an already confirmed reservoir to maintain steam production at a certain level. According to Sanyal (2004), geothermal power cost consists of three components: capital cost, O&M cost and make-up drilling cost. The make-up drilling cost component is a major part of the power cost and strongly influences the overall economic value of the project. The objective of this paper is to analyse the make-up well drilling cost component in the financial model of a geothermal power project. The research calculates make-up well requirements and drilling costs as a function of time, and examines how they influence the financial model and affect the power cost. The outcome of this research is the best scenario for determining a make-up well strategy in relation to the project financial model.
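Following the three-component cost decomposition attributed to Sanyal (2004), a deliberately simplified (undiscounted) sketch of how make-up drilling enters a power-cost calculation; every figure below is hypothetical, not project data.

```python
# Illustrative decomposition of geothermal power cost into the three
# components named by Sanyal (2004): capital, O&M and make-up drilling.
# All figures are hypothetical and the levelization is deliberately
# simplified (no discounting).

def power_cost_per_kwh(capital_cost, om_cost_per_year, makeup_wells_per_year,
                       cost_per_well, lifetime_years, annual_kwh):
    makeup_cost = makeup_wells_per_year * cost_per_well * lifetime_years
    total_cost = capital_cost + om_cost_per_year * lifetime_years + makeup_cost
    return total_cost / (annual_kwh * lifetime_years)

cost = power_cost_per_kwh(
    capital_cost=200e6,         # USD, hypothetical plant
    om_cost_per_year=5e6,
    makeup_wells_per_year=1.5,  # drilled to offset reservoir pressure decline
    cost_per_well=6e6,
    lifetime_years=30,
    annual_kwh=440e6,           # roughly a 55 MW plant at 8000 h/yr
)
```

In this toy case make-up drilling contributes 270 of the 620 million total, i.e. over 40% of the power cost, which illustrates why the paper treats it as a major lever on project economics.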

  3. [Setting up of a day group service system for severely disabled children, the "Koala Club"].

    PubMed

    Ozawa, Hiroshi; Kubota, Masaya; Tanaka, Yoshiko; Atsumi, So; Arimoto, Kiyoshi; Kimiya, Satoshi

    2006-01-01

    This is a report of the setting up of a day group service system for severely disabled children, the "Koala Club". The "Koala Club" was started in 1993 and has been running outside of the hospital since 1997. A support group for the "Koala Club" was established in 1999. Currently 13 children attend the "Koala Club". The staff consists of one coordinator, four nurses and eight care workers. Medical care is provided by the nurses. The "Koala Club" opens two days a week and is supervised by a doctor and a case worker. There is an important role for physicians in the regional care of disabled children.

  4. A set-up for a biased electrode experiment in ADITYA Tokamak

    NASA Astrophysics Data System (ADS)

    Dhyani, Pravesh; Ghosh, Joydeep; Sathyanarayana, K.; Praveenlal, V. E.; Gautam, Pramila; Shah, Minsha; Tanna, R. L.; Kumar, Pintu; Chavda, C.; Patel, N. C.; Panchal, V.; Gupta, C. N.; Jadeja, K. A.; Bhatt, S. B.; Kumar, S.; Raju, D.; Atrey, P. K.; Joisa, S.; Chattopadhyay, P. K.; Saxena, Y. C.

    2014-10-01

    An experimental set-up to investigate the effect of a biased electrode introduced in the edge region on ADITYA tokamak discharges is presented. A specially designed double-bellow mechanical assembly is fabricated for controlling the electrode location as well as its exposed length inside the plasma. The cylindrical molybdenum electrode is powered by a capacitor-bank-based pulsed power supply (PPS) using a semiconductor controlled rectifier (SCR) as a switch with forced commutation. A Langmuir probe array for radial profile measurements of plasma potential and density is fabricated and installed. Standard results of improvement of global confinement have been obtained using a biased electrode. In addition, we show for the first time that the same biasing system can be used to avoid disruptions through stabilisation of magnetohydrodynamic (MHD) modes. Real-time disruption control experiments have also been carried out by triggering the bias voltage on the electrode automatically when the Mirnov probe signal exceeds a preset threshold value, using a uniquely designed electronic comparator circuit. Most of the results related to the improved confinement and disruption mitigation are obtained with the electrode tip kept ~3 cm inside the last closed flux surface (LCFS) and an exposed length of ~20 mm in typical discharges of the ADITYA tokamak.

  5. Experimental Errors in QSAR Modeling Sets: What We Can Do and What We Cannot Do.

    PubMed

    Zhao, Linlin; Wang, Wenyi; Sedykh, Alexander; Zhu, Hao

    2017-06-30

    Numerous chemical data sets have become available for quantitative structure-activity relationship (QSAR) modeling studies. However, the quality of different data sources may be different based on the nature of experimental protocols. Therefore, potential experimental errors in the modeling sets may lead to the development of poor QSAR models and further affect the predictions of new compounds. In this study, we explored the relationship between the ratio of questionable data in the modeling sets, which was obtained by simulating experimental errors, and the QSAR modeling performance. To this end, we used eight data sets (four continuous endpoints and four categorical endpoints) that have been extensively curated both in-house and by our collaborators to create over 1800 various QSAR models. Each data set was duplicated to create several new modeling sets with different ratios of simulated experimental errors (i.e., randomizing the activities of part of the compounds) in the modeling process. A fivefold cross-validation process was used to evaluate the modeling performance, which deteriorates when the ratio of experimental errors increases. All of the resulting models were also used to predict external sets of new compounds, which were excluded at the beginning of the modeling process. The modeling results showed that the compounds with relatively large prediction errors in cross-validation processes are likely to be those with simulated experimental errors. However, after removing a certain number of compounds with large prediction errors in the cross-validation process, the external predictions of new compounds did not show improvement. Our conclusion is that the QSAR predictions, especially consensus predictions, can identify compounds with potential experimental errors. But removing those compounds by the cross-validation procedure is not a reasonable means to improve model predictivity due to overfitting.
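The error-injection step described above (duplicating a modeling set and randomizing the activities of a fraction of compounds) might be sketched as follows; the data and the 20% error ratio are illustrative, not from the study.

```python
import random

# Sketch of the error-simulation step: duplicate a modeling set and
# scramble the activities of a chosen fraction of compounds to mimic
# experimental errors. The data and 20% ratio are illustrative.

def inject_errors(activities, error_ratio, seed=42):
    noisy = list(activities)
    rng = random.Random(seed)
    n_errors = int(round(error_ratio * len(noisy)))
    picked = rng.sample(range(len(noisy)), n_errors)
    values = [noisy[i] for i in picked]
    rng.shuffle(values)                 # randomize activities among picks
    for i, v in zip(picked, values):
        noisy[i] = v
    return noisy, set(picked)           # the "questionable" compounds

clean = [float(i) for i in range(100)]
noisy, questionable = inject_errors(clean, error_ratio=0.2)
```

Because the indices of the scrambled compounds are kept, one can later check whether cross-validation flags exactly those compounds as large-error outliers, which is the comparison the study performs.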

  6. Experimental Errors in QSAR Modeling Sets: What We Can Do and What We Cannot Do

    PubMed Central

    2017-01-01

    Numerous chemical data sets have become available for quantitative structure–activity relationship (QSAR) modeling studies. However, the quality of different data sources may be different based on the nature of experimental protocols. Therefore, potential experimental errors in the modeling sets may lead to the development of poor QSAR models and further affect the predictions of new compounds. In this study, we explored the relationship between the ratio of questionable data in the modeling sets, which was obtained by simulating experimental errors, and the QSAR modeling performance. To this end, we used eight data sets (four continuous endpoints and four categorical endpoints) that have been extensively curated both in-house and by our collaborators to create over 1800 various QSAR models. Each data set was duplicated to create several new modeling sets with different ratios of simulated experimental errors (i.e., randomizing the activities of part of the compounds) in the modeling process. A fivefold cross-validation process was used to evaluate the modeling performance, which deteriorates when the ratio of experimental errors increases. All of the resulting models were also used to predict external sets of new compounds, which were excluded at the beginning of the modeling process. The modeling results showed that the compounds with relatively large prediction errors in cross-validation processes are likely to be those with simulated experimental errors. However, after removing a certain number of compounds with large prediction errors in the cross-validation process, the external predictions of new compounds did not show improvement. Our conclusion is that the QSAR predictions, especially consensus predictions, can identify compounds with potential experimental errors. But removing those compounds by the cross-validation procedure is not a reasonable means to improve model predictivity due to overfitting. PMID:28691113

  7. Pharmacists providing care in the outpatient setting through telemedicine models: a narrative review.

    PubMed

    Littauer, Sydney L; Dixon, Dave L; Mishra, Vimal K; Sisson, Evan M; Salgado, Teresa M

    2017-01-01

    Telemedicine refers to the delivery of clinical services using technology that allows two-way, real time, interactive communication between the patient and the clinician at a distant site. Commonly, telemedicine is used to improve access to general and specialty care for patients in rural areas. This review aims to provide an overview of existing telemedicine models involving the delivery of care by pharmacists via telemedicine (including telemonitoring and video, but excluding follow-up telephone calls) and to highlight the main areas of chronic-disease management where these models have been applied. Studies within the areas of hypertension, diabetes, asthma, anticoagulation and depression were identified, but only two randomized controlled trials with adequate sample size demonstrating the positive impact of telemonitoring combined with pharmacist care in hypertension were identified. The evidence for the impact of pharmacist-based telemedicine models is sparse and weak, with the studies conducted presenting serious threats to internal and external validity. Therefore, no definitive conclusions about the impact of pharmacist-led telemedicine models can be made at this time. In the United States, the increasing shortage of primary care providers and specialists represents an opportunity for pharmacists to assume a more prominent role managing patients with chronic disease in the ambulatory care setting. However, lack of reimbursement may pose a barrier to the provision of care by pharmacists using telemedicine.

  8. Pharmacists providing care in the outpatient setting through telemedicine models: a narrative review

    PubMed Central

    Littauer, Sydney L.

    2017-01-01

    Telemedicine refers to the delivery of clinical services using technology that allows two-way, real time, interactive communication between the patient and the clinician at a distant site. Commonly, telemedicine is used to improve access to general and specialty care for patients in rural areas. This review aims to provide an overview of existing telemedicine models involving the delivery of care by pharmacists via telemedicine (including telemonitoring and video, but excluding follow-up telephone calls) and to highlight the main areas of chronic-disease management where these models have been applied. Studies within the areas of hypertension, diabetes, asthma, anticoagulation and depression were identified, but only two randomized controlled trials with adequate sample size demonstrating the positive impact of telemonitoring combined with pharmacist care in hypertension were identified. The evidence for the impact of pharmacist-based telemedicine models is sparse and weak, with the studies conducted presenting serious threats to internal and external validity. Therefore, no definitive conclusions about the impact of pharmacist-led telemedicine models can be made at this time. In the United States, the increasing shortage of primary care providers and specialists represents an opportunity for pharmacists to assume a more prominent role managing patients with chronic disease in the ambulatory care setting. However, lack of reimbursement may pose a barrier to the provision of care by pharmacists using telemedicine. PMID:29317927

  9. Using Set Model for Learning Addition of Integers

    ERIC Educational Resources Information Center

    Lestari, Umi Puji; Putri, Ratu Ilma Indra; Hartono, Yusuf

    2015-01-01

    This study aims to investigate how the set model can help students' understanding of addition of integers in fourth grade. The study was carried out with 23 students and a teacher of class IVC at SD Iba Palembang in January 2015. This study is a design research that also promotes PMRI as the underlying design context and activity. Results showed that the…

  10. Validation of tsunami inundation model TUNA-RP using OAR-PMEL-135 benchmark problem set

    NASA Astrophysics Data System (ADS)

    Koh, H. L.; Teh, S. Y.; Tan, W. K.; Kh'ng, X. Y.

    2017-05-01

    A standard set of benchmark problems, known as OAR-PMEL-135, is developed by the US National Tsunami Hazard Mitigation Program for tsunami inundation model validation. Any tsunami inundation model must be tested for its accuracy and capability using this standard set of benchmark problems before it can be gainfully used for inundation simulation. The authors have previously developed an in-house tsunami inundation model known as TUNA-RP. This inundation model solves the two-dimensional nonlinear shallow water equations coupled with a wet-dry moving boundary algorithm. This paper presents the validation of TUNA-RP against the solutions provided in the OAR-PMEL-135 benchmark problem set. This benchmark validation testing shows that TUNA-RP can indeed perform inundation simulation with accuracy consistent with that in the tested benchmark problem set.
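To give a flavor of the kind of solver such benchmark sets exercise, here is a toy one-dimensional linearized shallow-water scheme on a staggered grid (TUNA-RP itself solves the two-dimensional nonlinear equations with a wet-dry moving boundary, which this sketch omits). A useful sanity check even in the toy case is exact mass conservation with closed walls.

```python
import math

# Toy 1D *linearized* shallow-water solver on a staggered grid. TUNA-RP
# solves the 2D nonlinear equations with a wet-dry moving boundary; this
# sketch omits both. Walls are reflective (u = 0 at both ends), so total
# mass should be conserved to rounding error.

def step(eta, u, depth, dx, dt, g=9.81):
    n = len(eta)
    for i in range(1, n):                  # momentum: du/dt = -g d(eta)/dx
        u[i] -= g * dt * (eta[i] - eta[i - 1]) / dx
    for i in range(n):                     # continuity: d(eta)/dt = -H du/dx
        eta[i] -= depth * dt * (u[i + 1] - u[i]) / dx

n, dx, depth = 100, 100.0, 50.0            # 10 km domain, 50 m deep
dt = 0.5 * dx / math.sqrt(9.81 * depth)    # CFL-limited time step
eta = [math.exp(-(((i - n // 2) * dx) / 500.0) ** 2) for i in range(n)]
u = [0.0] * (n + 1)                        # velocities at cell faces
mass0 = sum(eta)
for _ in range(200):
    step(eta, u, depth, dx, dt)            # Gaussian hump splits and reflects
```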

  11. Consistency of QSAR models: Correct split of training and test sets, ranking of models and performance parameters.

    PubMed

    Rácz, A; Bajusz, D; Héberger, K

    2015-01-01

    Recent implementations of QSAR modelling software provide the user with numerous models and a wealth of information. In this work, we provide some guidance on how one should interpret the results of QSAR modelling, compare and assess the resulting models, and select the best and most consistent ones. Two QSAR datasets are applied as case studies for the comparison of model performance parameters and model selection methods. We demonstrate the capabilities of sum of ranking differences (SRD) in model selection and ranking, and identify the best performance indicators and models. While the exchange of the original training and (external) test sets does not affect the ranking of performance parameters, it provides improved models in certain cases (despite the lower number of molecules in the training set). Performance parameters for external validation are substantially separated from the other merits in SRD analyses, highlighting their value in data fusion.
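Sum of ranking differences (SRD) itself is simple to state: rank the cases under each model and under a reference (often the row average), then sum the absolute rank differences; zero means perfect rank agreement with the reference. A minimal sketch, not the authors' implementation, with ties ignored:

```python
# Minimal sum of ranking differences (SRD), ignoring ties: rank the cases
# under a model and under a reference, then sum the absolute rank
# differences. Zero means perfect rank agreement with the reference.

def ranks(values):
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0] * len(values)
    for position, i in enumerate(order, start=1):
        r[i] = position
    return r

def srd(model_values, reference_values):
    return sum(abs(a - b)
               for a, b in zip(ranks(model_values), ranks(reference_values)))

reference = [0.1, 0.4, 0.2, 0.9]
srd_same = srd([1.0, 4.0, 2.0, 9.0], reference)      # identical ordering
srd_reversed = srd([9.0, 2.0, 4.0, 1.0], reference)  # fully reversed ordering
```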

  12. "The Perfect Set Up" A Study of Intensive Service

    ERIC Educational Resources Information Center

    Brummit, Houston; Schieren, Anne G.

    1970-01-01

    Extensive three-year study of a program involving a resident school psychiatrist in an extremely mobile setting. Medical help can reduce behavior problems but not academic retardation. Recommendations are offered to increase effectiveness of guidance programs for disadvantaged, culturally different pupils. (CJ)

  13. Bottom-up modeling of damage in heterogeneous quasi-brittle solids

    NASA Astrophysics Data System (ADS)

    Rinaldi, Antonio

    2013-03-01

    The theoretical modeling of multisite cracking in quasi-brittle materials is a complex damage problem, hard to model with traditional methods of fracture mechanics due to its multiscale nature and to strain localization induced by microcrack interaction. Macroscale "effective" elastic models can be conveniently applied if a suitable Helmholtz free energy function is identified for a given material scenario. Del Piero and Truskinovsky (Continuum Mech Thermodyn 21:141-171, 2009), among other authors, investigated macroscale continuum solutions capable of matching, in a top-down view, the phenomenology of the damage process for quasi-brittle materials regardless of the microstructure. On the contrary, this paper features a physically based solution method that starts from the direct consideration of the microscale properties and, in a bottom-up view, recovers a continuum elastic description. This procedure is illustrated for a simple one-dimensional problem of this type, a bar stretched by an axial displacement, where the bar is modeled as a 2D random lattice of decohesive spring elements of finite strength. The (microscale) data from simulations are used to identify the "exact" (macro-) damage parameter and to build up the (macro-) Helmholtz function for the equivalent elastic model, bridging the macroscale approach by Del Piero and Truskinovsky. The elastic approach, coupled with microstructural knowledge, becomes a more powerful tool to reproduce a broad class of macroscopic material responses by changing the convexity-concavity of the Helmholtz energy. The analysis points out that mean-field statistics are appropriate prior to damage localization but max-field statistics are better suited in the softening regime up to failure, where microstrain fluctuation needs to be incorporated in the continuum model. This observation is of consequence to revise mean-field damage models from literature and to calibrate Nth gradient continuum models.
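The lattice-to-continuum identification can be caricatured with a one-dimensional stand-in for the 2D random spring lattice: N parallel springs with random finite strengths loaded together, where the broken fraction plays the role of the macroscale damage parameter D and the bar carries a load of (1 - D)·k·ε. This is purely illustrative, not the paper's lattice model.

```python
import random

# One-dimensional stand-in for the 2D random spring lattice: N parallel
# springs of unit stiffness with random finite strengths, stretched
# together. The broken fraction is the macroscale damage parameter D and
# the bundle carries stress (1 - D) * k * strain. Purely illustrative.

def stress_strain_curve(strengths, stiffness=1.0, steps=50, max_strain=2.0):
    curve = []
    for s in range(steps + 1):
        strain = max_strain * s / steps
        intact = sum(1 for f in strengths if stiffness * strain <= f)
        damage = 1.0 - intact / len(strengths)
        stress = (1.0 - damage) * stiffness * strain
        curve.append((strain, stress, damage))
    return curve

rng = random.Random(7)
curve = stress_strain_curve([rng.uniform(0.5, 1.5) for _ in range(1000)])
```

Damage grows monotonically with strain; past the peak stress the bundle softens, which is the regime where the abstract argues max-field statistics outperform mean-field ones.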

  14. Machine Tool Technology. Automatic Screw Machine Troubleshooting & Set-Up Training Outlines [and] Basic Operator's Skills Set List.

    ERIC Educational Resources Information Center

    Anoka-Hennepin Technical Coll., Minneapolis, MN.

    This set of two training outlines and one basic skills set list is designed for a machine tool technology program developed during a project to retrain defense industry workers at risk of job loss or dislocation because of conversion of the defense industry. The first troubleshooting training outline lists the categories of problems that develop…

  15. Using the Lives Saved Tool (LiST) to Model mHealth Impact on Neonatal Survival in Resource-Limited Settings

    PubMed Central

    Jo, Youngji; Labrique, Alain B.; Lefevre, Amnesty E.; Mehl, Garrett; Pfaff, Teresa; Walker, Neff; Friberg, Ingrid K.

    2014-01-01

    While the importance of mHealth scale-up has been broadly emphasized in the mHealth community, it is necessary to guide scale-up efforts and investment in ways that help achieve the mortality reduction targets set by global calls to action such as the Millennium Development Goals, not merely to expand programs. We used the Lives Saved Tool (LiST), an evidence-based modeling software package, to identify priority areas for maternal and neonatal health services by formulating six individual and combined intervention scenarios for two countries, Bangladesh and Uganda. Our findings show that skilled birth attendance and increased facility delivery as targets for mHealth strategies are likely to provide the biggest mortality impact relative to the other intervention scenarios. Although further validation of this model is desirable, tools such as LiST can help us leverage the benefit of mHealth by articulating the most appropriate delivery points in the continuum of care to save lives. PMID:25014008

  16. The Revolving Fund Pharmacy Model: backing up the Ministry of Health supply chain in western Kenya.

    PubMed

    Manji, Imran; Manyara, Simon M; Jakait, Beatrice; Ogallo, William; Hagedorn, Isabel C; Lukas, Stephanie; Kosgei, Eunice J; Pastakia, Sonak D

    2016-10-01

    A pressing challenge in low and middle-income countries (LMIC) is inadequate access to essential medicines, especially for chronic diseases. The Revolving Fund Pharmacy (RFP) model is an initiative to provide high-quality medications consistently to patients, using revenues generated from the sale of medications to sustainably resupply medications. This article describes the utilization of RFPs developed by the Academic Model Providing Access to Healthcare (AMPATH) with the aim of stimulating the implementation of similar models elsewhere to ensure sustainable access to quality and affordable medications in similar LMIC settings. The service evaluation of three pilot RFPs started between April 2011 and January 2012 in select government facilities is described. The evaluation assessed cross-sectional availability of essential medicines before and after implementation of the RFPs, number of patient encounters and the impact of community awareness activities. Availability of essential medicines in the three pilot RFPs increased from 40%, 36% and <10% to 90%, 94% and 91% respectively. After the first year of operation, the pilot RFPs had a total of 33 714 patient encounters. As of February 2014, almost 3 years after starting up the first RFP, the RFPs had a total of 115 991 patient encounters. In the Eldoret RFP, community awareness activities led to a 51% increase in sales. With proper oversight and stakeholder involvement, this model is a potential solution to improve availability of essential medicines in LMICs. These pilots exemplify the feasibility of implementing and scaling up this model in other locations. © 2016 Royal Pharmaceutical Society.

  17. Modelling wave-induced sea ice break-up in the marginal ice zone

    PubMed Central

    Squire, V. A.

    2017-01-01

    A model of ice floe break-up under ocean wave forcing in the marginal ice zone (MIZ) is proposed to investigate how floe size distribution (FSD) evolves under repeated wave break-up events. A three-dimensional linear model of ocean wave scattering by a finite array of compliant circular ice floes is coupled to a flexural failure model, which breaks a floe into two floes provided the two-dimensional stress field satisfies a break-up criterion. A closed-feedback loop algorithm is devised, which (i) solves the wave-scattering problem for a given FSD under time-harmonic plane wave forcing, (ii) computes the stress field in all the floes, (iii) fractures the floes satisfying the break-up criterion, and (iv) generates an updated FSD, initializing the geometry for the next iteration of the loop. The FSD after 50 break-up events is unimodal and near normal, or bimodal, suggesting waves alone do not govern the power law observed in some field studies. Multiple scattering is found to enhance break-up for long waves and thin ice, but to reduce break-up for short waves and thick ice. A break-up front marches forward in the latter regime, as wave-induced fracture weakens the ice cover, allowing waves to travel deeper into the MIZ. PMID:29118659

  18. Modelling wave-induced sea ice break-up in the marginal ice zone.

    PubMed

    Montiel, F; Squire, V A

    2017-10-01

    A model of ice floe break-up under ocean wave forcing in the marginal ice zone (MIZ) is proposed to investigate how floe size distribution (FSD) evolves under repeated wave break-up events. A three-dimensional linear model of ocean wave scattering by a finite array of compliant circular ice floes is coupled to a flexural failure model, which breaks a floe into two floes provided the two-dimensional stress field satisfies a break-up criterion. A closed-feedback loop algorithm is devised, which (i) solves the wave-scattering problem for a given FSD under time-harmonic plane wave forcing, (ii) computes the stress field in all the floes, (iii) fractures the floes satisfying the break-up criterion, and (iv) generates an updated FSD, initializing the geometry for the next iteration of the loop. The FSD after 50 break-up events is unimodal and near normal, or bimodal, suggesting waves alone do not govern the power law observed in some field studies. Multiple scattering is found to enhance break-up for long waves and thin ice, but to reduce break-up for short waves and thick ice. A break-up front marches forward in the latter regime, as wave-induced fracture weakens the ice cover, allowing waves to travel deeper into the MIZ.

  19. Using Set Covering with Item Sampling to Analyze the Infeasibility of Linear Programming Test Assembly Models

    ERIC Educational Resources Information Center

    Huitzing, Hiddo A.

    2004-01-01

    This article shows how set covering with item sampling (SCIS) methods can be used in the analysis and preanalysis of linear programming models for test assembly (LPTA). LPTA models can construct tests, fulfilling a set of constraints set by the test assembler. Sometimes, no solution to the LPTA model exists. The model is then said to be…

  20. SU-C-204-06: Surface Imaging for the Set-Up of Proton Post-Mastectomy Chestwall Irradiation: Gated Images Vs Non Gated Images

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Batin, E; Depauw, N; MacDonald, S

    Purpose: Historically, the set-up for proton post-mastectomy chestwall irradiation at our institution started with positioning the patient using tattoos and lasers. One or more rounds of orthogonal X-rays at gantry 0° and a beamline X-ray at the treatment gantry angle were then taken to finalize the set-up position. As chestwall targets are shallow and superficial, surface imaging is a promising tool for set-up and needs to be investigated. Methods: The orthogonal imaging was entirely replaced by AlignRT™ (ART) images. The beamline X-ray image is kept as a confirmation, based primarily on three opaque markers placed on the skin surface instead of bony anatomy. In the first phase of the process, ART gated images were used to set up the patient, and the same specific point of the breathing curve was used every day. The moves (translations and rotations) computed for each point of the breathing curve during the first five fractions were analyzed for ten patients. During a second phase of the study, ART gated images were replaced by ART non-gated images combined with real-time monitoring. In both cases, ART images were acquired just before treatment to assess the patient position compared to the non-gated CT. Results: The average difference between the maximum move and the minimum move depending on the chosen breathing curve point was less than 1.7 mm for all translations and less than 0.7° for all rotations. The average position discrepancies over the course of treatment, obtained by ART non-gated images combined with real-time monitoring taken before treatment relative to the planning CT, were smaller than those obtained using ART gated images. The X-ray validation images show similar results with both ART imaging processes. Conclusion: The use of ART non-gated images combined with real-time imaging allows positioning post-mastectomy chestwall patients to within 3 mm / 1°.

  1. Rational selection of training and test sets for the development of validated QSAR models

    NASA Astrophysics Data System (ADS)

    Golbraikh, Alexander; Shen, Min; Xiao, Zhiyan; Xiao, Yun-De; Lee, Kuo-Hsiung; Tropsha, Alexander

    2003-02-01

    Quantitative Structure-Activity Relationship (QSAR) models are used increasingly to screen chemical databases and/or virtual chemical libraries for potentially bioactive molecules. These developments emphasize the importance of rigorous model validation to ensure that the models have acceptable predictive power. Using the k nearest neighbors (kNN) variable selection QSAR method for the analysis of several datasets, we have demonstrated recently that the widely accepted leave-one-out (LOO) cross-validated R2 (q2) is an inadequate characteristic to assess the predictive ability of the models [Golbraikh, A., Tropsha, A. Beware of q2! J. Mol. Graphics Mod. 20, 269-276, (2002)]. Herein, we provide additional evidence that there exists no correlation between the values of q2 for the training set and the accuracy of prediction (R2) for the test set and argue that this observation is a general property of any QSAR model developed with LOO cross-validation. We suggest that external validation using rationally selected training and test sets provides a means to establish a reliable QSAR model. We propose several approaches to the division of experimental datasets into training and test sets and apply them in QSAR studies of 48 functionalized amino acid anticonvulsants and a series of 157 epipodophyllotoxin derivatives with antitumor activity. We formulate a set of general criteria for the evaluation of predictive power of QSAR models.
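    The LOO cross-validated q2 statistic criticized above, q2 = 1 - PRESS/SS, can be sketched for a toy one-descriptor least-squares model. This is an illustration under stated assumptions (synthetic linear data, a simple polyfit model), not the kNN variable-selection method of the paper:

    ```python
    import numpy as np

    def loo_q2(x, y):
        """Leave-one-out cross-validated q2 = 1 - PRESS / SS for a
        simple one-descriptor least-squares model (illustration only)."""
        press = 0.0
        for i in range(len(x)):
            mask = np.arange(len(x)) != i           # leave sample i out
            slope, intercept = np.polyfit(x[mask], y[mask], 1)
            press += (y[i] - (slope * x[i] + intercept)) ** 2
        ss = np.sum((y - y.mean()) ** 2)            # total sum of squares
        return 1.0 - press / ss

    # hypothetical activities: a clean linear trend plus mild noise
    rng = np.random.default_rng(0)
    x = np.linspace(0.0, 1.0, 12)
    y = 2.0 * x + 1.0 + 0.05 * rng.standard_normal(12)
    print(round(loo_q2(x, y), 3))
    ```

    A high q2 here says nothing about performance on an external test set, which is precisely the paper's point: external validation on a rationally selected test set is still required.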

  2. The effectiveness of flipped classroom learning model in secondary physics classroom setting

    NASA Astrophysics Data System (ADS)

    Prasetyo, B. D.; Suprapto, N.; Pudyastomo, R. N.

    2018-03-01

    The research aimed to describe the effectiveness of the flipped classroom learning model in a secondary physics classroom setting during the Fall semester of 2017. The research object was the Secondary 3 Physics group of Singapore School Kelapa Gading. The research was initiated by giving a pre-test, followed by treatment with the flipped classroom learning model. By the end of the learning process, the pupils were given a post-test and a questionnaire to gauge their response to the flipped classroom learning model. Based on the data analysis, 89% of pupils passed the minimum criteria of standardization. The improvement in the students' marks was analysed with the normalized n-gain formula, obtaining a normalized n-gain score of 0.4, which falls in the medium category. The questionnaire distributed to the students showed that 93% of students became more motivated to study physics and 89% of students were very happy to carry out hands-on activities based on the flipped classroom learning model. These three aspects lead to the conclusion that the flipped classroom learning model is effective when applied in a secondary physics classroom setting.
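    The normalized n-gain used above is Hake's gain, <g> = (post - pre) / (max - pre), with the conventional bands low (< 0.3), medium (0.3-0.7) and high (>= 0.7). A minimal sketch; the pre/post scores below are hypothetical values chosen to reproduce the reported score of 0.4:

    ```python
    def normalized_gain(pre, post, max_score=100.0):
        """Hake's normalized gain <g> = (post - pre) / (max - pre)."""
        return (post - pre) / (max_score - pre)

    def gain_category(g):
        # conventional bands: g < 0.3 low, 0.3 <= g < 0.7 medium, g >= 0.7 high
        return "high" if g >= 0.7 else "medium" if g >= 0.3 else "low"

    # hypothetical class-average scores giving the reported g = 0.4
    g = normalized_gain(pre=50.0, post=70.0)
    print(g, gain_category(g))   # 0.4 medium
    ```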

  3. Assessing the Effectiveness of Ramp-Up During Sonar Operations Using Exposure Models.

    PubMed

    von Benda-Beckmann, Alexander M; Wensveen, Paul J; Kvadsheim, Petter H; Lam, Frans-Peter A; Miller, Patrick J O; Tyack, Peter L; Ainslie, Michael A

    2016-01-01

    Ramp-up procedures are used to mitigate the impact of sound on marine mammals. Sound exposure models combined with observations of marine mammals responding to sound can be used to assess the effectiveness of ramp-up procedures. We found that ramp-up procedures before full-level sonar operations can reduce the risk of hearing threshold shifts in marine mammals, but their effectiveness depends strongly on the responsiveness of the animals. In this paper, we investigated the effect of sonar parameters (source level, pulse-repetition time, ship speed) on sound exposure using a simple analytical model and highlight the mechanisms that limit the effectiveness of ramp-up procedures.
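    The underlying exposure logic can be sketched with a toy cumulative sound exposure level (SEL) calculation. This is not the paper's analytical model: the spherical-spreading loss, the source levels and the ranges below are all illustrative assumptions. It shows why ramp-up only helps a responsive animal, i.e. one that opens the range before full level is reached:

    ```python
    import math

    def cumulative_sel(levels_db, ranges_m):
        """Cumulative sound exposure level (dB) over a pulse series,
        assuming simple spherical spreading loss of 20*log10(r)."""
        energy = sum(10 ** ((sl - 20 * math.log10(r)) / 10)
                     for sl, r in zip(levels_db, ranges_m))
        return 10 * math.log10(energy)

    ramp = [180, 190, 200, 210]          # hypothetical ramp-up source levels (dB)
    fleeing = [1000, 1500, 2000, 2500]   # responsive animal opens the range (m)
    static = [1000, 1000, 1000, 1000]    # unresponsive animal stays put (m)

    print(round(cumulative_sel(ramp, fleeing), 1),
          round(cumulative_sel(ramp, static), 1))
    ```

    With these toy numbers the fleeing animal accumulates several dB less exposure than the static one, mirroring the abstract's conclusion that effectiveness depends strongly on animal responsiveness.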

  4. 7 CFR Exhibit J to Subpart A of... - Manufactured Home Sites, Rental Projects and Subdivisions: Development, Installation and Set-Up

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... established frost line without exceeding the safe bearing capacity of the supporting soil. Set-Up. The work... architectural practices and shall provide for all utilities in a manner which allows adequate, economic, safe... residential environment which is an asset to the community in which it is located. 4. Lot Size. The size of...

  5. Effectiveness of reactive case detection for malaria elimination in three archetypical transmission settings: a modelling study.

    PubMed

    Gerardin, Jaline; Bever, Caitlin A; Bridenbecker, Daniel; Hamainza, Busiku; Silumbe, Kafula; Miller, John M; Eisele, Thomas P; Eckhoff, Philip A; Wenger, Edward A

    2017-06-12

    reactive case detection or mass drug campaigns. Reactive case detection is recommended only for settings where transmission has recently been reduced rather than all low-transmission settings. This is demonstrated in a modelling framework with strong out-of-sample accuracy across a range of transmission settings while including methodologies for understanding the most resource-effective allocations of health workers. This approach generalizes to providing a platform for planning rational scale-up of health systems based on locally-optimized impact according to simplified stratification.

  6. Reconstruction of gene regulatory modules from RNA silencing of IFN-α modulators: experimental set-up and inference method.

    PubMed

    Grassi, Angela; Di Camillo, Barbara; Ciccarese, Francesco; Agnusdei, Valentina; Zanovello, Paola; Amadori, Alberto; Finesso, Lorenzo; Indraccolo, Stefano; Toffolo, Gianna Maria

    2016-03-12

    Inference of gene regulation from expression data may help to unravel regulatory mechanisms involved in complex diseases or in the action of specific drugs. A challenging task for many researchers working in the field of systems biology is to build up an experiment with a limited budget and produce a dataset suitable for reconstructing putative regulatory modules worthy of biological validation. Here, we focus on small-scale gene expression screens and introduce a novel experimental set-up and a customized method of analysis to make inference on regulatory modules starting from genetic perturbation data, e.g. knockdown and overexpression data. To illustrate the utility of our strategy, it was applied to produce and analyze a dataset of quantitative real-time RT-PCR data, in which the interferon-α (IFN-α) transcriptional response in endothelial cells is investigated by RNA silencing of two candidate IFN-α modulators, STAT1 and IFIH1. A putative regulatory module was reconstructed by our method, revealing an intriguing feed-forward loop in which STAT1 regulates IFIH1 and both negatively regulate IFNAR1. The regulation of IFNAR1 by STAT1 was experimentally validated at the protein level. A detailed description of the experimental set-up and of the analysis procedure is reported, with the intent of inspiring other scientists who want to carry out similar experiments to reconstruct gene regulatory modules starting from perturbations of possible regulators. Application of our approach to the study of IFN-α transcriptional response modulators in endothelial cells has led to many interesting novel findings and new biological hypotheses worthy of validation.

  7. Influence of the set-up on the recording of diffractive optical elements into photopolymers

    NASA Astrophysics Data System (ADS)

    Gallego, S.; Fernández, R.; Márquez, A.; Neipp, C.; Beléndez, A.; Pascual, I.

    2014-05-01

    Photopolymers are often used as a basis for holographic memories and displays. Recently, the capacity of photopolymers to record diffractive optical elements (DOEs) has been demonstrated. To fabricate diffractive optical elements we use a hybrid set-up composed of three parts: an LCD, an optical system and the recording material. The DOE pattern is introduced by a liquid crystal display (LCD) working in the amplitude-only mode, which acts as a master to project the DOE optically onto the recording material. The main advantage of this display is that it permits us to modify the DOE automatically; we use the electronics of the video projector to send the voltage to the pixels of the LCD. The LCD is used in the amplitude-mostly modulation regime by proper orientation of the external polarizers (P); the pattern is then imaged onto the material with an increased spatial frequency (a demagnifying factor of 2) by the optical system. The use of the LCD allows us to change the DOE recorded in the photopolymer without moving any mechanical part of the set-up. A diaphragm is placed in the focal plane of the relay lens so as to eliminate the diffraction orders produced by the pixelation of the LCD. It can be expected that the final pattern imaged onto the recording material will be low-pass filtered due to the finite aperture of the imaging system and especially due to the filtering produced by the diaphragm. In this work we analyze the effect of the visibility achieved with the LCD and of the high-frequency cut-off due to the diaphragm on the final DOE recorded in the photopolymer. To simulate the recording we have used the fitted parameter values obtained for PVA/AA-based photopolymers and the three-dimensional models presented in previous works.

  8. A full set of langatate high-temperature acoustic wave constants: elastic, piezoelectric, dielectric constants up to 900°C.

    PubMed

    Davulis, Peter M; da Cunha, Mauricio Pereira

    2013-04-01

    A full set of langatate (LGT) elastic, dielectric, and piezoelectric constants with their respective temperature coefficients up to 900°C is presented, and the relevance of the dielectric and piezoelectric constants and temperature coefficients are discussed with respect to predicted and measured high-temperature SAW propagation properties. The set of constants allows for high-temperature acoustic wave (AW) propagation studies and device design. The dielectric constants and polarization and conductive losses were extracted by impedance spectroscopy of parallel-plate capacitors. The measured dielectric constants at high temperatures were combined with previously measured LGT expansion coefficients and used to determine the elastic and piezoelectric constants using resonant ultrasound spectroscopy (RUS) measurements at temperatures up to 900°C. The extracted LGT piezoelectric constants and temperature coefficients show that e11 and e14 change by up to 62% and 77%, respectively, for the entire 25°C to 900°C range when compared with room-temperature values. The LGT high-temperature constants and temperature coefficients were verified by comparing measured and predicted phase velocities (vp) and temperature coefficients of delay (TCD) of SAW delay lines fabricated along 6 orientations in the LGT plane (90°, 23°, Ψ) up to 900°C. For the 6 tested orientations, the predicted SAW vp agree within 0.2% of the measured vp on average and the calculated TCD is within 9.6 ppm/°C of the measured value on average over the temperature range of 25°C to 900°C. By including the temperature dependence of both dielectric and piezoelectric constants, the average discrepancies between predicted and measured SAW properties were reduced, on average: 77% for vp, 13% for TCD, and 63% for the turn-over temperatures analyzed.

  9. ProtSqueeze: simple and effective automated tool for setting up membrane protein simulations.

    PubMed

    Yesylevskyy, Semen O

    2007-01-01

    The major challenge in setting up membrane protein simulations is embedding the protein into the pre-equilibrated lipid bilayer. Several techniques were proposed to achieve optimal packing of the lipid molecules around the protein. However, all of them possess serious disadvantages, which limit their applicability and discourage the users of simulation packages from using them. In the present work, we analyzed existing approaches and proposed a new procedure of protein insertion into the lipid bilayer, which is implemented in the ProtSqueeze software. The advantages of ProtSqueeze are as follows: (1) the insertion algorithm is simple, understandable, and controllable; (2) the software can work with virtually any simulation package on virtually any platform; (3) no modification of the source code of the simulation package is needed; (4) the procedure of insertion is as automated as possible; (5) ProtSqueeze is distributed for free under a general public license. In this work, we present the architecture and the algorithm of ProtSqueeze and demonstrate its usage in case studies.

  10. [Care practices for neonates while setting up a neonatal unit in a university hospital].

    PubMed

    Pedron, Cecília Drebes; Bonilha, Ana Lúcia de Lourenzi

    2008-12-01

    The hospitalization process of neonates makes them vulnerable to several care practices. The aim of this study was to get to know the care practices adopted by health professionals while setting up a neonatal unit at the Hospital de Clínicas of Porto Alegre, Rio Grande do Sul, Brazil. This is a qualitative study based on the New History Theory. The study collected data from October 2006 to January 2007. Fifteen health professionals responsible for the project and/or its implementation from 1972 to 1984 provided information. The thematic data analysis highlighted the concern among health professionals of making good use of technological advances, as well as unifying scientifically-based conducts. Besides, they tried to establish routines enabling neonate's parents to stay at the bedside during the whole hospitalization period. Finally, it was inferred that the main objective of these practices was to increase the survival of neonates.

  11. Surface- and Contour-Preserving Origamic Architecture Paper Pop-Ups.

    PubMed

    Le, Sang N; Leow, Su-Jun; Le-Nguyen, Tuong-Vu; Ruiz, Conrado; Low, Kok-Lim

    2013-08-02

    Origamic architecture (OA) is a form of papercraft that involves cutting and folding a single sheet of paper to produce a 3D pop-up, and is commonly used to depict architectural structures. Because of the strict geometric and physical constraints, OA design requires considerable skill and effort. In this paper, we present a method to automatically generate an OA design that closely depicts an input 3D model. Our algorithm is guided by a novel set of geometric conditions to guarantee the foldability and stability of the generated pop-ups. The generality of the conditions allows our algorithm to generate valid pop-up structures that are previously not accounted for by other algorithms. Our method takes a novel image-domain approach to convert the input model to an OA design. It performs surface segmentation of the input model in the image domain, and carefully represents each surface with a set of parallel patches. Patches are then modified to make the entire structure foldable and stable. Visual and quantitative comparisons of results have shown our algorithm to be significantly better than the existing methods in the preservation of contours, surfaces and volume. The designs have also been shown to more closely resemble those created by real artists.

  12. Surface and contour-preserving origamic architecture paper pop-ups.

    PubMed

    Le, Sang N; Leow, Su-Jun; Le-Nguyen, Tuong-Vu; Ruiz, Conrado; Low, Kok-Lim

    2014-02-01

    Origamic architecture (OA) is a form of papercraft that involves cutting and folding a single sheet of paper to produce a 3D pop-up, and is commonly used to depict architectural structures. Because of the strict geometric and physical constraints, OA design requires considerable skill and effort. In this paper, we present a method to automatically generate an OA design that closely depicts an input 3D model. Our algorithm is guided by a novel set of geometric conditions to guarantee the foldability and stability of the generated pop-ups. The generality of the conditions allows our algorithm to generate valid pop-up structures that are previously not accounted for by other algorithms. Our method takes a novel image-domain approach to convert the input model to an OA design. It performs surface segmentation of the input model in the image domain, and carefully represents each surface with a set of parallel patches. Patches are then modified to make the entire structure foldable and stable. Visual and quantitative comparisons of results have shown our algorithm to be significantly better than the existing methods in the preservation of contours, surfaces, and volume. The designs have also been shown to more closely resemble those created by real artists.

  13. What Time is Your Sunset? Accounting for Refraction in Sunrise/set Prediction Models

    NASA Astrophysics Data System (ADS)

    Wilson, Teresa; Bartlett, Jennifer Lynn; Chizek Frouard, Malynda; Hilton, James; Phlips, Alan; Edgar, Roman

    2018-01-01

    Algorithms that predict sunrise and sunset times currently have an uncertainty of one to four minutes at mid-latitudes (0° - 55° N/S) due to limitations in the atmospheric models they incorporate. At higher latitudes, slight changes in refraction can cause significant discrepancies, including difficulties determining whether the Sun appears to rise or set. While different components of refraction are known, how they affect predictions of sunrise/set has not yet been quantified. A better understanding of the contributions from temperature profile, pressure, humidity, and aerosols could significantly improve the standard prediction. We present a sunrise/set calculator that interchanges the refraction component by varying the refraction model. We then compared these predictions with data sets of observed rise/set times taken from Mount Wilson Observatory in California, the University of Alberta in Edmonton, Alberta, and onboard the SS James Franco in the Atlantic. A thorough investigation of the problem requires a more substantial data set of observed rise/set times and corresponding meteorological data from around the world. We have developed a mobile application, Sunrise & Sunset Observer, so that anyone can capture this astronomical and meteorological data using their smartphone video recorder as part of a citizen science project. The Android app for this project is available in the Google Play store. Videos can also be submitted through the project website (riseset.phy.mtu.edu). Data analysis will lead to more complete models that will provide higher accuracy rise/set predictions to benefit astronomers, navigators, and outdoorsmen everywhere.
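    The "standard prediction" these calculators refine can be sketched with the usual almanac convention: the Sun is taken to rise/set when its apparent altitude is -50' (34' of standard refraction plus the 16' solar semidiameter). The refraction models compared in the work above effectively replace that fixed 34' term. A minimal sketch, not the authors' calculator:

    ```python
    import math

    def sunset_hour_angle(lat_deg, decl_deg, h0_deg=-0.8333):
        """Hour angle (degrees) of sunset for apparent altitude h0.
        h0 = -50' combines standard 34' refraction with the Sun's
        16' semidiameter (the common almanac convention)."""
        lat, decl, h0 = map(math.radians, (lat_deg, decl_deg, h0_deg))
        cos_h = (math.sin(h0) - math.sin(lat) * math.sin(decl)) / (
            math.cos(lat) * math.cos(decl))
        if cos_h > 1:
            return 0.0      # Sun never rises (polar night)
        if cos_h < -1:
            return 180.0    # Sun never sets (polar day)
        return math.degrees(math.acos(cos_h))

    # equinox (decl = 0) at 45 N: refraction stretches the day past 12 h
    day_length_hours = 2 * sunset_hour_angle(45.0, 0.0) / 15.0
    print(round(day_length_hours, 2))
    ```

    Near the poles cos_h sits close to ±1, so a small change in the refraction term can flip the rise/no-rise decision entirely, which is the high-latitude discrepancy described above.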

  14. Stereo Electro-optical Tracking System (SETS)

    NASA Astrophysics Data System (ADS)

    Koenig, E. W.

    1984-09-01

    The SETS is a remote, non-contacting, high-accuracy tracking system for the measurement of deflection of models in the National Transonic Facility at Langley Research Center. The system consists of four electronically scanned image dissector trackers which locate the positions of light-emitting diodes embedded in the wing or body of aircraft models. Target location data are recorded on magnetic tape for later 3-D processing. Up to 63 targets per model may be tracked, at typical rates of 1280 targets per second and to a precision of 0.02 mm at the target, under the cold (-193°C) environment of the NTF tunnel.

  15. Development of a reliable experimental set-up for Dover sole larvae Solea solea L. and exploring the possibility of implementing this housing system in a gnotobiotic model.

    PubMed

    De Swaef, Evelien; Demeestere, Kristof; Boon, Nico; Van den Broeck, Wim; Haesebrouck, Freddy; Decostere, Annemie

    2017-12-01

    Due to the increasing importance of the aquaculture sector, diversification in the number of cultured species imposes itself. Dover sole Solea solea L. is put forward as an important new aquaculture candidate due to its high market value and high flesh quality. However, as for many other fish species, sole production is hampered by, amongst other factors, high susceptibility to disease and high larval mortality, creating the need for more research in this area. In this respect, a housing system for Dover sole larvae was first established by keeping the animals individually in 24-well plates for 26 days, with good survival rates and initiation of metamorphosis. This ensures a standardised and reliable experimental set-up in which the possible death of one larva has no effect on the other larvae, rendering experiments adopting such a system more reproducible. In addition to proving valuable in many other applications, this multi-well system constitutes a firm basis for the gnotobiotic rearing of larvae, which hitherto is non-existent for Dover sole. Secondly, a large number of disinfection protocols were tested, making use of widely employed disinfectants such as hydrogen peroxide, glutaraldehyde and/or ozone, whether or not combined with a mixture of antimicrobial agents for 24 h. Although none of the tested protocols was sufficient to reproducibly generate a gnotobiotic model, the combination of glutaraldehyde and hydrogen peroxide resulted in hatchable, bacteria-free eggs in some cases. Copyright © 2017 Elsevier Ltd. All rights reserved.

  16. Mathematical modeling and analysis of heat pipe start-up from the frozen state

    NASA Technical Reports Server (NTRS)

    Jang, Jong Hoon; Faghri, Amir; Chang, Won Soon; Mahefkey, Edward T.

    1989-01-01

    The start-up process of a frozen heat pipe is described and a complete mathematical model for the start-up of the frozen heat pipe is developed based on the existing experimental data, which is simplified and solved numerically. The two-dimensional transient model for the wall and wick is coupled with the one-dimensional transient model for the vapor flow when vaporization and condensation occur at the interface. A parametric study is performed to examine the effect of the boundary specification at the surface of the outer wall on the successful start-up from the frozen state. For successful start-up, the boundary specification at the outer wall surface must melt the working substance in the condenser before dry-out takes place in the evaporator.

  17. Mathematical modeling and analysis of heat pipe start-up from the frozen state

    NASA Technical Reports Server (NTRS)

    Jang, J. H.; Faghri, A.; Chang, W. S.; Mahefkey, E. T.

    1990-01-01

    The start-up process of a frozen heat pipe is described and a complete mathematical model for the start-up of the frozen heat pipe is developed based on the existing experimental data, which is simplified and solved numerically. The two-dimensional transient model for the wall and wick is coupled with the one-dimensional transient model for the vapor flow when vaporization and condensation occur at the interface. A parametric study is performed to examine the effect of the boundary specification at the surface of the outer wall on the successful start-up from the frozen state. For successful start-up, the boundary specification at the outer wall surface must melt the working substance in the condenser before dry-out takes place in the evaporator.

  18. The Interaction Between Dynamics and Chemistry of Ozone in the Set-up Phase of the Northern Hemisphere Polar Vortex

    NASA Technical Reports Server (NTRS)

    Kawa, S. R.; Bevilacqua, R.; Margitan, J. J.; Douglass, A. R.; Schoeberl, M. R.; Hoppel, K.; Sen, B.; Bhartia, P. K. (Technical Monitor)

    2001-01-01

    The morphology and evolution of the stratospheric ozone (O3) distribution at high latitudes in the Northern Hemisphere (NH) are examined for the late summer and fall seasons of 1999. This time period sets the O3 initial condition for the SOLVE/THESEO field mission performed during winter 1999-2000. In situ and satellite data are used along with a three-dimensional model of chemistry and transport (CTM) to determine the key processes that control the distribution of O3 in the lower-to-middle stratosphere. O3 in the vortex at the beginning of the winter season is found to be nearly constant from 500 to above 800 K with a value of 3 ppmv ± approx. 10%. Values outside the vortex are up to a factor of 2 higher and increase significantly with potential temperature. The seasonal time series of data from POAM shows that the relatively low O3 mixing ratios which characterize the vortex in late fall are already present at high latitudes at the end of summer, before the vortex circulation sets up. Analysis of the CTM output shows that the minimum O3 and the increase in variance in late summer are the result of: 1) stirring of polar concentric O3 gradients by nascent wave-driven transport, and 2) an acceleration of net photochemical loss with decreasing solar illumination. The segregation of low O3 mixing ratios into the vortex as the circulation strengthens through the fall suggests a possible feedback role between O3 chemistry and the vortex formation dynamics. Trajectory calculations from O3 sample points early in the fall, however, show only a weak correlation between initial O3 mixing ratio and potential vorticity later in the season, consistent with order-of-magnitude calculations for the relative importance of O3 in the fall radiative balance at high latitudes. The possible connection between O3 chemistry and the dynamics of vortex formation does suggest that these feedbacks and sensitivities need to be better understood in order to make confident predictions of the recovery

  19. Goal setting with mothers in child development services.

    PubMed

    Forsingdal, S; St John, W; Miller, V; Harvey, A; Wearne, P

    2014-07-01

    The aim of this grounded theory study was to explore mothers' perspectives of the processes of collaborative goal setting in multidisciplinary child development services involving follow-up home therapy. Semi-structured interviews were conducted in South East Queensland, Australia with 14 mothers of children aged 3-6 years who were accessing multidisciplinary child development services. Interviews were focussed around the process of goal setting. A grounded theory of Maternal Roles in Goal Setting (The M-RIGS Model) was developed from analysis of data. Mothers assumed Dependent, Active Participator and Collaborator roles when engaging with the therapist in goal-setting processes. These roles were characterized by the mother's level of dependence on the therapist and insight into their child's needs and therapy processes. Goal Factors, Parent Factors and Therapist Factors influenced and added complexity to the goal-setting process. The M-RIGS Model highlights that mothers take on a range of roles in the goal-setting process. Although family-centred practice encourages negotiation and collaborative goal setting, parents may not always be ready to take on highly collaborative roles. Better understanding of parent roles, goal-setting processes and influencing factors will inform better engagement with families accessing multidisciplinary child development services. © 2013 John Wiley & Sons Ltd.

  20. Estimating the CCSD basis-set limit energy from small basis sets: basis-set extrapolations vs additivity schemes

    NASA Astrophysics Data System (ADS)

    Spackman, Peter R.; Karton, Amir

    2015-05-01

    Coupled cluster calculations with all single and double excitations (CCSD) converge exceedingly slowly with the size of the one-particle basis set. We assess the performance of a number of approaches for obtaining CCSD correlation energies close to the complete basis-set limit in conjunction with relatively small DZ and TZ basis sets. These include global and system-dependent extrapolations based on the A + B/Lα two-point extrapolation formula, and the well-known additivity approach that uses an MP2-based basis-set-correction term. We show that the basis-set convergence rate can change dramatically between different systems (e.g. it is slower for molecules with polar bonds and/or second-row elements). The system-dependent basis-set extrapolation scheme, in which unique basis-set extrapolation exponents for each system are obtained from lower-cost MP2 calculations, significantly accelerates the basis-set convergence relative to the global extrapolations. Nevertheless, we find that the simple MP2-based basis-set additivity scheme outperforms the extrapolation approaches. For example, the following root-mean-squared deviations are obtained for the 140 basis-set limit CCSD atomization energies in the W4-11 database: 9.1 (global extrapolation), 3.7 (system-dependent extrapolation), and 2.4 (additivity scheme) kJ mol-1. The CCSD energy in these approximations is obtained from basis sets of up to TZ quality, and the latter two approaches require additional MP2 calculations with basis sets of up to QZ quality. We also assess the performance of the basis-set extrapolations and additivity schemes for a set of 20 basis-set limit CCSD atomization energies of larger molecules including amino acids, DNA/RNA bases, aromatic compounds, and platonic hydrocarbon cages. We obtain the following RMSDs for the above methods: 10.2 (global extrapolation), 5.7 (system-dependent extrapolation), and 2.9 (additivity scheme) kJ mol-1.
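    The two-point formula above, E(L) = E_CBS + B/L^α, has a closed-form solution for E_CBS given energies at two consecutive cardinal numbers L and L+1 (e.g. DZ with L = 2 and TZ with L = 3). A minimal sketch; the energies and the exponent α = 3 below are hypothetical placeholder values, not results from the paper:

    ```python
    def cbs_two_point(e_l, e_lp1, l, alpha):
        """Two-point extrapolation of E(L) = E_CBS + B / L**alpha,
        solved for E_CBS from energies at cardinal numbers L and L+1."""
        la, lb = l ** alpha, (l + 1) ** alpha
        return (e_lp1 * lb - e_l * la) / (lb - la)

    # hypothetical CCSD correlation energies (hartree) with DZ/TZ sets
    e_dz, e_tz = -0.2500, -0.2800
    e_cbs = cbs_two_point(e_dz, e_tz, l=2, alpha=3.0)
    print(round(e_cbs, 4))
    ```

    The system-dependent scheme described above amounts to fitting α per system from cheaper MP2 energies instead of using one global exponent.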

  1. A rapid estimation of tsunami run-up based on finite fault models

    NASA Astrophysics Data System (ADS)

    Campos, J.; Fuentes, M. A.; Hayes, G. P.; Barrientos, S. E.; Riquelme, S.

    2014-12-01

    Many efforts have been made to estimate the maximum run-up height of tsunamis associated with large earthquakes. This is a difficult task because of the time it takes to construct a tsunami model using real-time data from the source. It is possible to construct a database of potential seismic sources and their corresponding tsunamis a priori. However, such models are generally based on uniform slip distributions and thus oversimplify our knowledge of the earthquake source. Instead, we can use finite fault models of earthquakes to give a more accurate prediction of the tsunami run-up. Here we show how to accurately predict tsunami run-up from any seismic source model using an analytic solution found by Fuentes et al. (2013) that was calculated especially for zones with a very well defined strike, i.e., Chile, Japan, Alaska, etc. The main idea of this work is to produce a tool for emergency response, trading off accuracy for quickness. Our solutions for three large earthquakes are promising. Here we compute models of the run-up for the 2010 Mw 8.8 Maule Earthquake, the 2011 Mw 9.0 Tohoku Earthquake, and the recent 2014 Mw 8.2 Iquique Earthquake. Our maximum run-up predictions are consistent with measurements made inland after each event, with a peak of 15 to 20 m for Maule, 40 m for Tohoku, and 2.1 m for the Iquique earthquake. Considering recent advances made in the analysis of real-time GPS data and the ability to rapidly resolve the finiteness of a large earthquake close to existing GPS networks, it will be possible in the near future to perform these calculations within the first five minutes after the occurrence of any such event. Such calculations will thus provide more accurate run-up information than is otherwise available from existing uniform-slip seismic source databases.

  2. Using the Many-Facet Rasch Model to Evaluate Standard-Setting Judgments: Setting Performance Standards for Advanced Placement® Examinations

    ERIC Educational Resources Information Center

    Kaliski, Pamela; Wind, Stefanie A.; Engelhard, George, Jr.; Morgan, Deanna; Plake, Barbara; Reshetar, Rosemary

    2012-01-01

    The Many-Facet Rasch (MFR) Model is traditionally used to evaluate the quality of ratings on constructed response assessments; however, it can also be used to evaluate the quality of judgments from panel-based standard setting procedures. The current study illustrates the use of the MFR Model by examining the quality of ratings obtained from a…

  3. Comparison of different statistical methods for estimation of extreme sea levels with wave set-up contribution

    NASA Astrophysics Data System (ADS)

    Kergadallan, Xavier; Bernardara, Pietro; Benoit, Michel; Andreewsky, Marc; Weiss, Jérôme

    2013-04-01

    Estimating the probability of occurrence of extreme sea levels is a central issue for the protection of the coast. Return periods of sea level with wave set-up contribution are estimated here for one site: Cherbourg, France, in the English Channel. The methodology follows two steps: the first is the computation of the joint probability of simultaneous wave height and still sea level; the second is the interpretation of these joint probabilities to assess the sea level for a given return period. Two different approaches were evaluated to compute the joint probability of simultaneous wave height and still sea level: the first uses multivariate extreme-value distributions of logistic type, in which all components of the variables become large simultaneously; the second is a conditional approach for multivariate extreme values, in which only one component of the variables has to be large. Two different methods were applied to estimate the sea level with wave set-up contribution for a given return period: Monte-Carlo simulation, in which the estimation is more accurate but needs more calculation time, and classical ocean-engineering design contours of inverse-FORM type, in which the method is simpler and allows more complex estimation of the wave set-up part (wave propagation to the coast, for example). We compare results from the two approaches combined with the two methods. To be able to use both the Monte-Carlo simulation and design contours methods, the wave set-up is estimated with a simple empirical formula. We show the advantages of the conditional approach over the multivariate extreme-value approach when extreme sea levels occur when either the surge or the wave height is large. We discuss the validity of the ocean-engineering design contours method, which is an alternative when the computation of sea levels is too complex to use the Monte-Carlo simulation method.
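
    The Monte-Carlo branch of the comparison can be caricatured in a few lines. Everything below is an illustrative stand-in: the joint samples are synthetic (a real study would draw from the fitted logistic-type or conditional extreme-value model), and the setup ≈ 0.2·Hs rule is a placeholder for whatever empirical formula was actually used:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical joint samples of still sea level (m) and significant wave
# height (m); the correlated normal draw is purely illustrative.
n = 100_000
z = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.6], [0.6, 1.0]], size=n)
still_level = 4.0 + 0.3 * z[:, 0]       # tide + surge proxy
hs = np.exp(0.5 + 0.4 * z[:, 1])        # significant wave height proxy

setup = 0.2 * hs                        # placeholder empirical set-up formula
total = still_level + setup

# Empirical 100-year level, assuming ~700 independent sea states per year
events_per_year = 700
return_period_years = 100
p = 1.0 - 1.0 / (return_period_years * events_per_year)
level_100yr = float(np.quantile(total, p))
```

    The trade-off the abstract describes is visible here: accuracy comes from the sample size n, which is exactly what drives the calculation time.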

  4. Selection of appropriate training and validation set chemicals for modelling dermal permeability by U-optimal design.

    PubMed

    Xu, G; Hughes-Oliver, J M; Brooks, J D; Yeatts, J L; Baynes, R E

    2013-01-01

    Quantitative structure-activity relationship (QSAR) models are being used increasingly in skin permeation studies. The main idea of QSAR modelling is to quantify the relationship between biological activities and chemical properties, and thus to predict the activity of chemical solutes. As a key step, the selection of a representative and structurally diverse training set is critical to the prediction power of a QSAR model. Early QSAR models selected training sets in a subjective way, and the solutes in the training set were relatively homogeneous. More recently, statistical methods such as D-optimal design or space-filling design have been applied, but such methods are not always ideal. This paper describes a comprehensive procedure to select training sets from a large candidate set of 4534 solutes. A newly proposed 'Baynes' rule', which is a modification of Lipinski's 'rule of five', was used to screen out solutes that were not qualified for the study. U-optimality was used as the selection criterion. A principal component analysis showed that the selected training set was representative of the chemical space. Gas chromatograph amenability was verified. A model built using the training set was shown to have greater predictive power than a model built using a previous dataset [1].

  5. Basic priority rating model 2.0: current applications for priority setting in health promotion practice.

    PubMed

    Neiger, Brad L; Thackeray, Rosemary; Fagen, Michael C

    2011-03-01

    Priority setting is an important component of systematic planning in health promotion and also factors into the development of a comprehensive evaluation plan. The basic priority rating (BPR) model was introduced more than 50 years ago and includes criteria that should be considered in any priority setting approach (i.e., use of predetermined criteria, standardized comparisons, and a rubric that controls bias). Although the BPR model has provided basic direction in priority setting, it does not represent the broad array of data currently available to decision makers. Elements in the model also give more weight to the impact of communicable diseases compared with chronic diseases. For these reasons, several modifications are recommended to improve the BPR model and to better assist health promotion practitioners in the priority setting process. The authors also suggest a new name, BPR 2.0, to represent this revised model.
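
    For reference, the original BPR is often stated in the Hanlon form (A + 2B) × C / 3. The sketch below uses that conventional formulation and the conventional 0-10 component scales; these are assumptions from the general literature, not details given in this paper:

```python
def basic_priority_rating(size, seriousness, effectiveness):
    """Classic BPR score in one common statement of the Hanlon model:
    (A + 2B) * C / 3, each component rated on a 0-10 scale, so scores
    span 0-100.  Seriousness is double-weighted, which is exactly the
    kind of weighting (favouring acute/communicable problems) that the
    proposed BPR 2.0 revisits.
    """
    for v in (size, seriousness, effectiveness):
        if not 0 <= v <= 10:
            raise ValueError("component ratings must lie in [0, 10]")
    return (size + 2 * seriousness) * effectiveness / 3
```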

  6. Validation of the regional climate model MAR over the CORDEX Africa domain and comparison with other regional models using unpublished data set

    NASA Astrophysics Data System (ADS)

    Prignon, Maxime; Agosta, Cécile; Kittel, Christoph; Fettweis, Xavier; Erpicum, Michel

    2016-04-01

    In the framework of the CORDEX project, we have applied the regional model MAR over the Africa domain at a resolution of 50 km. The ERA-Interim and NCEP-NCAR reanalyses have been used as 6-hourly forcing at the MAR boundaries over 1950-2015. While MAR had already been validated over West Africa, this is the first time that MAR simulations have been carried out at the scale of the whole continent. Unpublished daily measurements, covering the Sahel and areas further south with a large set of variables, are used to validate MAR, other CORDEX-Africa RCMs, and both reanalyses. Comparisons with the CRU and ECA&D databases are also performed. The unpublished daily data set covers the period 1884-2006 and comes from 1460 stations. The measured variables are wind, evapotranspiration, relative humidity, insolation, rain, surface pressure, temperature, vapour pressure and visibility. It covers 23 countries: Algeria, Benin, Burkina Faso, the Canary Islands, Cape Verde, Central Africa, Chad, Congo, Ivory Coast, Gabon, Gambia, Ghana, Guinea, Guinea-Bissau, Mali, Mauritania, Morocco, Niger, Nigeria, Senegal, Sudan and Togo.

  7. Joint Clustering and Component Analysis of Correspondenceless Point Sets: Application to Cardiac Statistical Modeling.

    PubMed

    Gooya, Ali; Lekadir, Karim; Alba, Xenia; Swift, Andrew J; Wild, Jim M; Frangi, Alejandro F

    2015-01-01

    Construction of Statistical Shape Models (SSMs) from arbitrary point sets is a challenging problem due to significant shape variation and lack of explicit point correspondence across the training data set. In medical imaging, point sets can generally represent different shape classes that span healthy and pathological exemplars. In such cases, the constructed SSM may not generalize well, largely because the probability density function (pdf) of the point sets deviates from the underlying assumption of Gaussian statistics. To this end, we propose a generative model for unsupervised learning of the pdf of point sets as a mixture of distinctive classes. A Variational Bayesian (VB) method is proposed for making joint inferences on the labels of point sets and the principal modes of variation in each cluster. The method provides a flexible framework to handle point sets with no explicit point-to-point correspondences. We also show that by maximizing the marginalized likelihood of the model, the optimal number of clusters of point sets can be determined. We illustrate this work in the context of understanding the anatomical phenotype of the left and right ventricles of the heart. To this end, we use a database containing hearts of healthy subjects, patients with Pulmonary Hypertension (PH), and patients with Hypertrophic Cardiomyopathy (HCM). We demonstrate that our method can outperform traditional PCA in both generalization and specificity measures.

  8. A Generalized 2D-Dynamical Mean-Field Ising Model with a Rich Set of Bifurcations (Inspired and Applied to Financial Crises)

    NASA Astrophysics Data System (ADS)

    Smug, Damian; Sornette, Didier; Ashwin, Peter

    We analyze an extended version of the dynamical mean-field Ising model. Instead of the classical physical representation of spins and an external magnetic field, the model describes traders' opinion dynamics. The external field is endogenized to represent a smoothed moving average of the past state variable. This model captures in a simple set-up the interplay between instantaneous social imitation and past trends in social coordination. We show the existence of a rich set of bifurcations as a function of the two parameters quantifying the relative importance of instantaneous versus past social opinions on the formation of the next value of the state variable. Moreover, we present a thorough analysis of the chaotic behavior exhibited in certain parameter regimes. Finally, we examine several transitions through bifurcation curves and study how they can be understood as specific market scenarios. We find that the amplitude of the corrections needed to recover from a crisis and to push the system back to “normal” is often significantly larger than the strength of the causes that led to the crisis itself.
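
    The interplay of instantaneous imitation and a smoothed past trend can be illustrated with a two-variable map. The functional forms below are assumed for illustration and are not taken verbatim from the paper:

```python
import numpy as np

def simulate(kappa, alpha, n_steps=2000, s0=0.1, h0=0.0):
    """Iterate a two-parameter opinion map in the spirit of the abstract:

        s[t+1] = tanh(kappa * s[t] + h[t])          # instantaneous imitation
        h[t+1] = alpha * h[t] + (1 - alpha) * s[t]  # field = smoothed past trend

    kappa weighs instantaneous social opinion and alpha the memory of
    past opinion -- the two parameters whose plane would be scanned for
    bifurcations.  Returns the trajectory of the state variable s.
    """
    s, h = s0, h0
    out = np.empty(n_steps)
    for t in range(n_steps):
        s_next = np.tanh(kappa * s + h)   # update s from the old field
        h = alpha * h + (1 - alpha) * s   # then smooth the old state into h
        s = s_next
        out[t] = s
    return out
```

    Sweeping (kappa, alpha) over a grid and recording the long-run behaviour of `simulate` is the simplest way to reproduce qualitative bifurcation diagrams for a map of this type.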

  9. Eutrophication status of the North Sea, Skagerrak, Kattegat and the Baltic Sea in present and future climates: A model study

    NASA Astrophysics Data System (ADS)

    Skogen, Morten D.; Eilola, Kari; Hansen, Jørgen L. S.; Meier, H. E. Markus; Molchanov, Mikhail S.; Ryabchenko, Vladimir A.

    2014-04-01

    A method to combine observations and an ensemble of ecological models has been used to assess eutrophication. Using downscaled forcing from two GCMs under the A1B emission scenario, an assessment of the eutrophication status was made for a control (1970-2000) and a future climate (2070-2100) period. By using validation results from a hindcast to compute individual weights between the models, an assessment of eutrophication is done using a set of threshold values. The final classification distinguishes between three categories: problem area, potential problem area, and non-problem area, in accordance with current management practice as suggested by the Oslo and Paris Commissions (OSPAR) and the Helsinki Commission (HELCOM). For the control run the assessment indicates that the Kattegat, the Danish Straits, the Gulf of Finland, the Gotland Basin as well as main parts of the Arkona Basin, the Bornholm Basin, and the Baltic proper may be classified as problem areas. The main part of the North Sea and also the Skagerrak are non-problem areas, while the main parts of the Gulf of Bothnia, the Gulf of Riga and the entire southeastern continental coast of the North Sea may be classified as potential problem areas. In the future climate scenarios most of the previous potential problem areas in the Baltic Sea have become problem areas, except for the Bothnian Bay, where the situation remains fairly unchanged. In the North Sea there seem to be no obvious changes in eutrophication status in the projected future climate.

  10. A set-up for simultaneous measurement of second harmonic generation and streaming potential and some test applications.

    PubMed

    Lützenkirchen, Johannes; Scharnweber, Tim; Ho, Tuan; Striolo, Alberto; Sulpizi, Marialore; Abdelmonem, Ahmed

    2018-06-15

    We present a measurement cell that allows simultaneous measurement of second harmonic generation (SHG) and streaming potential (SP) at mineral-water interfaces with flat specimens that are suitable for non-linear optical (NLO) studies. The set-up directly yields SHG data for the interface of interest and can also be used to obtain information concerning the influence of flow on NLO signals from that interface. The streaming potential is at present measured against a reference substrate (PTFE). The properties of this inert reference can be independently determined for the same conditions. With the new cell, the SHG signal and the SP have, for the first time, been measured simultaneously on the same flat surface. This can in turn be used to unambiguously relate the two observations for identical solution composition. The SHG test of the cell with a fluorite sample confirmed previously observed differences in the NLO signal under flow vs. no-flow conditions in sum frequency generation (SFG) investigations. As a second test surface, an inert ("hydrophobic") OTS-covered sapphire-c electrolyte interface was studied to verify the zeta-potential measurements with the new cell. For this system we obtained combined zeta-potential/SHG data in the vicinity of the point of zero charge, which were found to be proportional to each other as expected. Furthermore, on the accessible time scales of the SHG measurements, no effects of flow, flow velocity, or stopped flow on the interfacial water structure occurred. This insensitivity to flow for the inert surface was corroborated by concomitant molecular dynamics simulations. Finally, the set-up was used for simultaneous measurements of the two properties as a function of pH in automated titrations with an oxidic surface. Different polarization combinations obtained in two separate titrations yielded clearly different SHG data, while under identical conditions zeta-potentials were exactly reproduced.
The polarization combination

  11. Towards Precise Metadata-set for Discovering 3D Geospatial Models in Geo-portals

    NASA Astrophysics Data System (ADS)

    Zamyadi, A.; Pouliot, J.; Bédard, Y.

    2013-09-01

    Accessing 3D geospatial models, eventually at no cost and for unrestricted use, is certainly an important issue as they become popular among participatory communities, consultants, and officials. Various geo-portals, mainly established for 2D resources, have tried to provide access to existing 3D resources such as digital elevation models, LIDAR, or classic topographic data. Describing the content of data, metadata is a key component of data discovery in geo-portals. An inventory of seven online geo-portals and commercial catalogues shows that the metadata referring to 3D information varies greatly from one geo-portal to another, as well as between similar 3D resources in the same geo-portal. The inventory considered 971 data resources associated with elevation. 51% of them came from three geo-portals run at Canadian federal and municipal levels whose metadata did not consider 3D models under any definition. Among the remaining 49%, which do refer to 3D models, different definitions of terms and metadata were found, resulting in confusion and misinterpretation. The overall assessment of these geo-portals clearly shows that the provided metadata do not integrate specific and common information about 3D geospatial models. Accordingly, the main objective of this research is to improve 3D geospatial model discovery in geo-portals by adding a specific metadata-set. Based on knowledge and current practices in 3D modeling and in 3D data acquisition and management, a set of metadata is proposed to increase suitability for 3D geospatial models. This metadata-set enables the definition of genuine classes, fields, and code-lists for a 3D metadata profile. The main structure of the proposal contains 21 metadata classes, grouped into three packages: General and Complementary, covering contextual and structural information, and Availability, covering the transition from storage to delivery format.
The proposed metadata set is compared with Canadian Geospatial

  12. An algorithm for deriving core magnetic field models from the Swarm data set

    NASA Astrophysics Data System (ADS)

    Rother, Martin; Lesur, Vincent; Schachtschneider, Reyko

    2013-11-01

    In view of an optimal exploitation of the Swarm data set, we have prepared and tested software dedicated to the determination of accurate core magnetic field models and of the Euler angles between the magnetic sensors and the satellite reference frame. The dedicated core field model estimation is derived directly from the GFZ Reference Internal Magnetic Model (GRIMM) inversion and modeling family. The data selection techniques and the model parameterizations are similar to those used for the derivation of the second (Lesur et al., 2010) and third versions of GRIMM, although the usage of observatory data is not planned in the framework of the application to Swarm. The regularization technique applied during the inversion process smooths the magnetic field model in time. The algorithm to estimate the Euler angles is also derived from the CHAMP studies. The inversion scheme includes Euler angle determination with a quaternion representation for describing the rotations. It has been built to handle possible weak time variations of these angles. The modeling approach and software were initially validated on a simple, noise-free, synthetic data set and on CHAMP vector magnetic field measurements. We present results of test runs applied to the synthetic Swarm test data set.
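
    Quaternions compose rotations without the gimbal-lock and angle-wrapping issues of Euler angles, which is presumably why the inversion carries the sensor-to-spacecraft rotation that way and converts to Euler angles only for reporting. A generic rotation-by-quaternion helper (the standard formula, not the authors' code):

```python
import numpy as np

def quat_rotate(q, v):
    """Rotate 3-vector v by the unit quaternion q = (w, x, y, z), using
    the standard identity v' = v + 2*w*(u x v) + 2*u x (u x v),
    where u = (x, y, z) is the quaternion's vector part.
    """
    w = q[0]
    u = np.asarray(q[1:], dtype=float)
    v = np.asarray(v, dtype=float)
    t = 2.0 * np.cross(u, v)
    return v + w * t + np.cross(u, t)
```

    For example, q = (cos 45°, 0, 0, sin 45°) encodes a 90° rotation about the z-axis and maps the x unit vector onto the y unit vector.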

  13. A population-based model for priority setting across the care continuum and across modalities

    PubMed Central

    Segal, Leonie; Mortimer, Duncan

    2006-01-01

    Background The Health-sector Wide (HsW) priority setting model is designed to shift the focus of priority setting away from 'program budgets' – which are typically defined by modality or disease-stage – and towards well-defined target populations with a particular disease/health problem. Methods The key features of the HsW model are i) a disease/health problem framework, ii) a sequential approach to covering the entire health sector, iii) comprehensiveness of scope in identifying intervention options and iv) the use of objective evidence. The HsW model redefines the unit of analysis over which priorities are set to include all mutually exclusive and complementary interventions for the prevention and treatment of each disease/health problem under consideration. The HsW model is therefore incompatible with the fragmented approach to priority setting across multiple program budgets that currently characterises allocation in many health systems. The HsW model employs standard cost-utility analyses and decision rules with the aim of maximising QALYs contingent upon the global budget constraint for the set of diseases/health problems under consideration. It is recognised that the objective function may include non-health arguments that would imply a departure from simple QALY maximisation and that political constraints frequently limit degrees of freedom. In addressing these broader considerations, the HsW model can be modified to maximise value-weighted QALYs contingent upon the global budget constraint and any political constraints bearing upon allocation decisions. Results The HsW model has been applied in several contexts, most recently to osteoarthritis, which has demonstrated both its practical application and its capacity to derive clear evidence-based policy recommendations. Conclusion Comparisons with other approaches to priority setting, such as Programme Budgeting and Marginal Analysis (PBMA) and modality-based cost-effectiveness comparisons, as typified by
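
    The HsW decision rule, choosing at most one of the mutually exclusive interventions per disease/health problem so as to maximise QALYs under a global budget, can be sketched as a toy exhaustive search. The data structure and brute-force strategy below are illustrative only; a real model would use standard cost-utility decision rules or an integer program:

```python
from itertools import product

def allocate(options_by_problem, budget):
    """Pick at most one intervention per disease/health problem so as to
    maximise total QALYs within a global budget.

    options_by_problem maps each problem to a list of (cost, qalys)
    pairs for its mutually exclusive intervention options -- the HsW
    unit of analysis.  A (0, 0) "do nothing" option is added per problem.
    """
    menus = [[(0.0, 0.0)] + list(opts)
             for opts in options_by_problem.values()]
    best_qalys, best_choice = 0.0, None
    for choice in product(*menus):
        cost = sum(c for c, _ in choice)
        qalys = sum(q for _, q in choice)
        if cost <= budget and qalys > best_qalys:
            best_qalys, best_choice = qalys, choice
    return best_qalys, best_choice
```

    Value-weighted QALYs, as discussed in the abstract, would simply replace the raw `qalys` term in the objective with a weighted sum.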

  14. "Helix Nebula - the Science Cloud", a European Science-Driven Cross-Domain Initiative Implemented via an Active PPP Set-Up

    NASA Astrophysics Data System (ADS)

    Lengert, W.; Mondon, E.; Bégin, M. E.; Ferrer, M.; Vallois, F.; DelaMar, J.

    2015-12-01

    Helix Nebula, a European science cross-domain initiative building on an active PPP, aims to implement the concept of an open science commons[1] using a hybrid cloud model[2] as the proposed implementation solution. This approach allows complementary data-intensive Earth Science disciplines (e.g. instrumentation[3] and modeling) to be leveraged and merged without introducing significant changes in the contributors' operational set-up. Considering the seamless integration with life science (e.g. EMBL), scientific exploitation of meteorological, climate, and Earth Observation data and models opens enormous potential for new big-data science. The work of Helix Nebula has shown that it is feasible to interoperate publicly funded infrastructures, such as EGI [5] and GEANT [6], with commercial cloud services. Such hybrid systems are in the interest of the existing users of publicly funded infrastructures and of funding agencies because they provide "freedom and choice" over the type of computing resources to be consumed and the manner in which they can be obtained. But to offer such freedom and choice across a spectrum of suppliers, issues such as intellectual property, legal responsibility, and service quality agreements need to be addressed. Finding solutions to these issues is one of the goals of the Helix Nebula initiative. [1] http://www.egi.eu/news-and-media/publications/OpenScienceCommons_v3.pdf [2] http://www.helix-nebula.eu/events/towards-the-european-open-science-cloud [3] e.g. https://sentinel.esa.int/web/sentinel/sentinel-data-access [5] http://www.egi.eu/ [6] http://www.geant.net/

  15. Take the Reins on Model Quality with ModelCHECK and Gatekeeper

    NASA Technical Reports Server (NTRS)

    Jones, Corey

    2012-01-01

    Model quality and consistency has been an issue for us due to the diverse experience level and imaginative modeling techniques of our users. Fortunately, setting up ModelCHECK and Gatekeeper to enforce our best practices has helped greatly, but it wasn't easy. There were many challenges associated with setting up ModelCHECK and Gatekeeper including: limited documentation, restrictions within ModelCHECK, and resistance from end users. However, we consider ours a success story. In this presentation we will describe how we overcame these obstacles and present some of the details of how we configured them to work for us.

  16. Validation of the SimSET simulation package for modeling the Siemens Biograph mCT PET scanner

    NASA Astrophysics Data System (ADS)

    Poon, Jonathan K.; Dahlbom, Magnus L.; Casey, Michael E.; Qi, Jinyi; Cherry, Simon R.; Badawi, Ramsey D.

    2015-02-01

    Monte Carlo simulation provides a valuable tool in performance assessment and optimization of system design parameters for PET scanners. SimSET is a popular Monte Carlo simulation toolkit that features fast simulation time, as well as variance reduction tools to further enhance computational efficiency. However, SimSET has lacked the ability to simulate block detectors until its most recent release. Our goal is to validate new features of SimSET by developing a simulation model of the Siemens Biograph mCT PET scanner and comparing the results to a simulation model developed in the GATE simulation suite and to experimental results. We used the NEMA NU-2 2007 scatter fraction, count rates, and spatial resolution protocols to validate the SimSET simulation model and its new features. The SimSET model overestimated the experimental results of the count rate tests by 11-23% and the spatial resolution test by 13-28%, which is comparable to previous validation studies of other PET scanners in the literature. The difference between the SimSET and GATE simulation was approximately 4-8% for the count rate test and approximately 3-11% for the spatial resolution test. In terms of computational time, SimSET performed simulations approximately 11 times faster than GATE simulations. The new block detector model in SimSET offers a fast and reasonably accurate simulation toolkit for PET imaging applications.

  17. A model providing long-term data sets of energetic electron precipitation during geomagnetic storms

    NASA Astrophysics Data System (ADS)

    van de Kamp, M.; Seppälä, A.; Clilverd, M. A.; Rodger, C. J.; Verronen, P. T.; Whittaker, I. C.

    2016-10-01

    The influence of solar variability on the polar atmosphere and climate due to energetic electron precipitation (EEP) has remained an open question largely due to lack of a long-term EEP forcing data set that could be used in chemistry-climate models. Motivated by this, we have developed a model for 30-1000 keV radiation belt driven EEP. The model is based on precipitation data from low Earth orbiting POES satellites in the period 2002-2012 and empirically described plasmasphere structure, which are both scaled to a geomagnetic index. This geomagnetic index is the only input of the model and can be either Dst or Ap. Because of this, the model can be used to calculate the energy-flux spectrum of precipitating electrons from 1957 (Dst) or 1932 (Ap) onward, with a time resolution of 1 day. Results from the model compare well with EEP observations over the period of 2002-2012. Using the model avoids the challenges found in measured data sets concerning proton contamination. As demonstrated, the model results can be used to produce the first ever >80 year long atmospheric ionization rate data set for radiation belt EEP. The impact of precipitation in this energy range is mainly seen at altitudes 70-110 km. The ionization rate data set, which is available for the scientific community, will enable simulations of EEP impacts on the atmosphere and climate with realistic EEP variability. Due to limitations in this first version of the model, the results most likely represent an underestimation of the total EEP effect.

  18. Instruction manual model 600F, data transmission test set

    NASA Technical Reports Server (NTRS)

    1972-01-01

    Information necessary for the operation and maintenance of the Model 600F Data Transmission Test Set is presented. The manual describes the physical and functional characteristics and provides pertinent installation data; instructions for operating the equipment; general and detailed principles of operation; preventive and corrective maintenance procedures; and block, logic, and component layout diagrams of the equipment and its major component assemblies.

  19. Refining Sunrise/set Prediction Models by Accounting for the Effects of Refraction

    NASA Astrophysics Data System (ADS)

    Wilson, Teresa; Bartlett, Jennifer L.

    2016-01-01

    Current atmospheric models used to predict the times of sunrise and sunset have an error of one to four minutes at mid-latitudes (0° - 55° N/S). At higher latitudes, slight changes in refraction may cause significant discrepancies, including determining even whether the Sun appears to rise or set. While the different components of refraction are known, how they affect predictions of sunrise/set has not yet been quantified. A better understanding of the contributions from the temperature profile, pressure, humidity, and aerosols could significantly improve the standard prediction. Because sunrise/set times and meteorological data from multiple locations will be necessary for a thorough investigation of the problem, we will collect these data using smartphones as part of a citizen science project. This analysis will lead to more complete models that will provide more accurate times for navigators and outdoorsmen alike.
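
    The "standard prediction" the authors aim to refine conventionally takes the altitude of the Sun's centre at rise/set as h0 = -(refraction + solar semidiameter) = -(34' + 16') = -0.833°. A sketch of the geometric hour-angle computation under that convention; varying the refraction argument shows the sensitivity in question:

```python
import math

def sunrise_hour_angle(lat_deg, decl_deg, refraction_arcmin=34.0):
    """Hour angle (degrees) of sunrise/sunset from the standard formula

        cos H = (sin h0 - sin(lat) * sin(decl)) / (cos(lat) * cos(decl)),

    with h0 = -(refraction + 16' solar semidiameter).  The default 34'
    of refraction gives the conventional h0 = -50' = -0.833 deg.
    Returns None when the Sun never rises or never sets (polar night/day).
    """
    h0 = math.radians(-(refraction_arcmin + 16.0) / 60.0)
    lat, decl = math.radians(lat_deg), math.radians(decl_deg)
    cos_h = (math.sin(h0) - math.sin(lat) * math.sin(decl)) / (
        math.cos(lat) * math.cos(decl))
    if not -1.0 <= cos_h <= 1.0:
        return None
    return math.degrees(math.acos(cos_h))   # 15 degrees of hour angle = 1 hour
```

    At the equator at an equinox the hour angle is about 90.8°, i.e. a day slightly longer than 12 hours purely because of refraction plus semidiameter.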

  20. Modeling radiative transfer with the doubling and adding approach in a climate GCM setting

    NASA Astrophysics Data System (ADS)

    Lacis, A. A.

    2017-12-01

    The nonlinear dependence of multiply scattered radiation on particle size, optical depth, and solar zenith angle makes accurate treatment of multiple scattering in the climate GCM setting problematic, due primarily to computational cost. The accurate multiple-scattering methods that are available are far too computationally expensive for climate GCM applications, while two-stream-type radiative transfer approximations may be fast enough but at the cost of reduced accuracy. We describe here a parameterization of the doubling/adding method that is used in the GISS climate GCM: an adaptation of the doubling/adding formalism configured to operate with a look-up table utilizing a single Gauss quadrature point with an extra-angle formulation. It is designed to closely reproduce the accuracy of full-angle doubling and adding for the multiple scattering effects of clouds and aerosols in a realistic atmosphere as a function of particle size, optical depth, and solar zenith angle. With an additional inverse look-up table, this single-Gauss-point doubling/adding approach can be adapted to model fractional cloud cover for any GCM grid-box in the independent pixel approximation as a function of the fractional clouds' particle sizes, optical depths, and solar zenith angle.
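
    The adding step itself reduces, in a scalar two-stream caricature, to summing the geometric series of inter-layer reflections; doubling then builds a thick layer from a thin one. The functions below are that caricature, not the GISS parameterization:

```python
def add_layers(r1, t1, r2, t2):
    """Combine the reflectance/transmittance of two scattering layers
    (layer 1 on top of layer 2).  The factor 1 / (1 - r1*r2) sums the
    geometric series of all orders of inter-layer multiple reflection.
    """
    denom = 1.0 - r1 * r2
    r = r1 + t1 * r2 * t1 / denom
    t = t1 * t2 / denom
    return r, t

def doubling(r_thin, t_thin, n):
    """Doubling: build a homogeneous layer of 2**n thin sub-layers by
    repeatedly adding a layer to itself, starting from a thin layer
    whose single-scattering properties are assumed known."""
    r, t = r_thin, t_thin
    for _ in range(n):
        r, t = add_layers(r, t, r, t)
    return r, t
```

    A useful property of this formulation is that a conservative (non-absorbing) thin layer, r + t = 1, stays conservative after any number of doublings, which makes a convenient correctness check.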

  1. Robust group-wise rigid registration of point sets using t-mixture model

    NASA Astrophysics Data System (ADS)

    Ravikumar, Nishant; Gooya, Ali; Frangi, Alejandro F.; Taylor, Zeike A.

    2016-03-01

    We propose a probabilistic framework for robust, group-wise rigid alignment of point-sets using a mixture of Student's t-distributions, aimed especially at cases where the point sets are of varying lengths, are corrupted by an unknown degree of outliers, or contain missing data. Medical images (in particular magnetic resonance (MR) images), their segmentations, and consequently the point-sets generated from these are highly susceptible to corruption by outliers. This poses a problem for robust correspondence estimation and accurate alignment of shapes, necessary for training statistical shape models (SSMs). To address these issues, this study proposes to use a t-mixture model (TMM) to approximate the underlying joint probability density of a group of similar shapes and align them to a common reference frame. The heavy-tailed nature of t-distributions provides a more robust registration framework than state-of-the-art algorithms. A significant reduction in alignment errors is achieved in the presence of outliers using the proposed TMM-based group-wise rigid registration method, in comparison to its Gaussian mixture model (GMM) counterparts. The proposed TMM framework is compared with a group-wise variant of the well-known Coherent Point Drift (CPD) algorithm and two other group-wise methods using GMMs, on both synthetic and real data sets. Rigid alignment errors for groups of shapes are quantified using the Hausdorff distance (HD) and quadratic surface distance (QSD) metrics.
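
    The robustness mechanism of the t-distribution shows up in its EM weights (ν + 1)/(ν + z²), which shrink the influence of outlying points. A one-dimensional location estimate as a minimal illustration of that mechanism (not the paper's group-wise registration):

```python
import numpy as np

def t_location(x, nu=3.0, n_iter=50):
    """EM estimate of the location of a 1-D Student's t distribution
    with fixed degrees of freedom nu and estimated scale.

    The E-step weight (nu + 1) / (nu + z**2) is small for points far
    from the current centre, so gross outliers barely move the
    location update -- the same heavy-tailed mechanism that makes the
    t-mixture registration robust where a Gaussian mixture is not.
    """
    mu, sigma2 = float(np.median(x)), float(np.var(x))
    for _ in range(n_iter):
        z2 = (x - mu) ** 2 / sigma2
        w = (nu + 1.0) / (nu + z2)
        mu = float(np.sum(w * x) / np.sum(w))
        sigma2 = float(np.sum(w * (x - mu) ** 2) / len(x))
    return mu
```

    On data with a few gross outliers, the estimate stays near the bulk of the points while the plain sample mean is dragged toward the outliers.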

  2. [Testicular cancer: a model to optimize the radiological follow-up].

    PubMed

    Stebler, V; Pauchard, B; Schmidt, S; Valerio, M; De Bari, B; Berthold, D

    2015-05-20

    Despite being rare cancers, testicular seminoma and non-seminoma play an important role in oncology: they represent a model of how to optimize radiological follow-up, aiming at the lowest possible radiation exposure and secondary cancer risk. Males diagnosed with testicular cancer frequently undergo prolonged follow-up with CT scans, with potentially toxic side effects, in particular secondary cancers. To reduce the risks linked to ionizing radiation, precise follow-up protocols have been developed, and the number of recommended CT scans has been significantly reduced over the last 10 years. CT scanners have also evolved technically, and new acquisition protocols have the potential to reduce the radiation exposure further.

  3. Developing a Suitable Model for Water Uptake for Biodegradable Polymers Using Small Training Sets.

    PubMed

    Valenzuela, Loreto M; Knight, Doyle D; Kohn, Joachim

    2016-01-01

    Prediction of the dynamic properties of water uptake across polymer libraries can accelerate polymer selection for a specific application. We first built semiempirical models using artificial neural networks and all water uptake data as individual inputs. These models give very good correlations (R² > 0.78 for the test set) but very low accuracy on cross-validation sets (less than 19% of experimental points within experimental error). Instead, using consolidated parameters such as equilibrium water uptake, a good model is obtained (R² = 0.78 for the test set), with accurate predictions for 50% of tested polymers. The semiempirical model was applied to the 56-polymer library of L-tyrosine-derived polyarylates, identifying groups of polymers that are likely to satisfy design criteria for water uptake. This research demonstrates that a surrogate modeling effort can reduce the number of polymers that must be synthesized and characterized to identify an appropriate polymer that meets certain performance criteria.

  4. Trust in direct leaders and top leaders: A trickle-up model.

    PubMed

    Fulmer, C Ashley; Ostroff, Cheri

    2017-04-01

    Low levels of employee trust in top leaders pose challenges to organizations with respect to retention, performance, and profits. This research examines how trust in top leaders can be fostered through the relationships individuals have with their direct leaders. We propose a trickle-up model whereby trust in direct leaders exerts an upward influence on trust in top leaders. Drawing on the group value model, we predict that direct leaders' procedural justice serves as the key mechanism in facilitating the trickle-up process. Further, this process should be particularly strong for employees high on vertical collectivism, and the trickled-up trust in top leaders should exert a stronger impact on employees' overall performance in the organization than trust in direct leaders. Multiphase and multisource data from 336 individuals support these hypotheses. The findings advance our understanding of trust and leadership by highlighting that trust in leaders at different levels does not form independently and that trust in leaders trickles up across hierarchical levels.

  5. Maximizing Social Model Principles in Residential Recovery Settings

    PubMed Central

    Polcin, Douglas; Mericle, Amy; Howell, Jason; Sheridan, Dave; Christensen, Jeff

    2014-01-01

    Peer support is integral to a variety of approaches to alcohol and drug problems. However, there is limited information about the best ways to facilitate it. The “social model” approach developed in California offers useful suggestions for facilitating peer support in residential recovery settings. Key principles include using 12-step or other mutual-help group strategies to create and facilitate a recovery environment, involving program participants in decision making and facility governance, using personal recovery experience as a way to help others, and emphasizing recovery as an interaction between the individual and their environment. Although limited in number, studies have shown favorable outcomes for social model programs. Knowledge about social model recovery and how to use it to facilitate peer support in residential recovery homes varies among providers. This article presents specific, practical suggestions for enhancing social model principles in ways that facilitate peer support in a range of recovery residences. PMID:25364996

  6. Assessing effects of variation in global climate data sets on spatial predictions from climate envelope models

    USGS Publications Warehouse

    Romañach, Stephanie; Watling, James I.; Fletcher, Robert J.; Speroterra, Carolina; Bucklin, David N.; Brandt, Laura A.; Pearlstine, Leonard G.; Escribano, Yesenia; Mazzotti, Frank J.

    2014-01-01

    Climate change poses new challenges for natural resource managers. Predictive modeling of species–environment relationships using climate envelope models can enhance our understanding of climate change effects on biodiversity, assist in assessment of invasion risk by exotic organisms, and inform life-history understanding of individual species. While increasing interest has focused on the role of uncertainty in future conditions on model predictions, models also may be sensitive to the initial conditions on which they are trained. Although climate envelope models are usually trained using data on contemporary climate, we lack systematic comparisons of model performance and predictions across alternative climate data sets available for model training. Here, we seek to fill that gap by comparing variability in predictions between two contemporary climate data sets to variability in spatial predictions among three alternative projections of future climate. Overall, correlations between monthly temperature and precipitation variables were very high for both contemporary and future data. Model performance varied across algorithms, but not between two alternative contemporary climate data sets. Spatial predictions varied more among alternative general-circulation models describing future climate conditions than between contemporary climate data sets. However, we did find that climate envelope models with low Cohen's kappa scores made more discrepant spatial predictions between climate data sets for the contemporary period than did models with high Cohen's kappa scores. We suggest conservation planners evaluate multiple performance metrics and be aware of the importance of differences in initial conditions for spatial predictions from climate envelope models.
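Cohen's kappa, the agreement score used above to stratify model performance, corrects observed accuracy for chance agreement: κ = (p_o − p_e)/(1 − p_e). A minimal sketch for a presence/absence confusion matrix (the counts are illustrative, not from the study):

```python
def cohens_kappa(confusion):
    """Cohen's kappa from a square confusion matrix
    (rows: observed class, columns: predicted class)."""
    n = sum(sum(row) for row in confusion)
    # observed agreement: fraction on the diagonal
    p_o = sum(confusion[i][i] for i in range(len(confusion))) / n
    # expected chance agreement from row and column marginals
    p_e = sum(
        sum(confusion[i]) * sum(row[i] for row in confusion)
        for i in range(len(confusion))
    ) / n**2
    return (p_o - p_e) / (1 - p_e)

# Hypothetical presence/absence predictions for 50 grid cells.
print(cohens_kappa([[20, 5], [10, 15]]))  # ≈ 0.4
```

Kappa of 0 means no better than chance and 1 means perfect agreement, which is why low-kappa envelope models making chance-level predictions were also the ones most sensitive to the choice of climate data set.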

  7. Porting marine ecosystem model spin-up using transport matrices to GPUs

    NASA Astrophysics Data System (ADS)

    Siewertsen, E.; Piwonski, J.; Slawig, T.

    2013-01-01

    We have ported an implementation of the spin-up for marine ecosystem models based on transport matrices to graphics processing units (GPUs). The original implementation was designed for distributed-memory architectures and uses the Portable, Extensible Toolkit for Scientific Computation (PETSc) library that is based on the Message Passing Interface (MPI) standard. The spin-up computes a steady seasonal cycle of ecosystem tracers with climatological ocean circulation data as forcing. Since the transport is linear with respect to the tracers, the resulting operator is represented by matrices. Each iteration of the spin-up involves two matrix-vector multiplications and the evaluation of the used biogeochemical model. The original code was written in C and Fortran. On the GPU, we use the Compute Unified Device Architecture (CUDA) standard, a customized version of PETSc and a commercial CUDA Fortran compiler. We describe the extensions to PETSc and the modifications of the original C and Fortran codes that had to be done. Here we make use of freely available libraries for the GPU. We analyze the computational effort of the main parts of the spin-up for two exemplar ecosystem models and compare the overall computational time to that necessary on different CPUs. The results show that a consumer GPU can compete with a significant number of cluster CPUs without further code optimization.
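The spin-up itself is a fixed-point iteration: each step applies the transport operator to the tracer vector plus a biogeochemical source term, and iteration continues until a steady state is reached. A toy sketch with a small dense stand-in for the sparse transport matrices (schematic only; the actual code uses PETSc and large climatological matrices):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "transport matrix": a contractive mixing operator (row sums 0.9,
# hence spectral radius < 1), standing in for the sparse transport matrices.
n = 5
A = rng.random((n, n))
A = 0.9 * A / A.sum(axis=1, keepdims=True)

q = np.full(n, 0.1)              # constant source term (biogeochemistry stand-in)

y = np.zeros(n)
for _ in range(500):             # spin-up: iterate until the state stops changing
    y_new = A @ (y + q)          # transport applied to tracers plus sources
    if np.linalg.norm(y_new - y) < 1e-12:
        break
    y = y_new

# The steady state satisfies y = A (y + q), i.e. (I - A) y = A q.
print(np.allclose(y, np.linalg.solve(np.eye(n) - A, A @ q)))
```

The matrix-vector product inside the loop is exactly the operation that dominates the runtime and is what the GPU port accelerates.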

  8. Developing a virtual reality application for training nuclear power plant operators: setting up a database containing dose rates in the refuelling plant.

    PubMed

    Ródenas, J; Zarza, I; Burgos, M C; Felipe, A; Sánchez-Mayoral, M L

    2004-01-01

    Operators in Nuclear Power Plants can receive high doses during refuelling operations. A training programme for simulating refuelling operations will be useful in reducing the doses received by workers as well as minimising operation time. With this goal in mind, a virtual reality application is developed within the framework of the CIPRES project. The application requires doses, both instantaneous and accumulated, to be displayed at all times during operator training. Therefore, it is necessary to set up a database containing dose rates at every point in the refuelling plant. This database is based on radiological protection surveillance data measured in the plant during refuelling operations. Some interpolation routines have been used to estimate doses through the refuelling plant. Different assumptions have been adopted in order to perform the interpolation and obtain consistent data. In this paper, the procedures developed to set up the dose database for the virtual reality application are presented and analysed.
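The interpolation step can be sketched as bilinear interpolation on a regular grid of measured dose rates. The grid and dose values below are hypothetical, and the actual CIPRES routines and interpolation assumptions are not specified in the abstract:

```python
import bisect

def bilinear(x, y, xs, ys, dose):
    """Bilinearly interpolate a dose rate at (x, y) from values measured
    on a regular grid: dose[i][j] is the measurement at (xs[i], ys[j])."""
    # locate the grid cell containing (x, y), clamping to the grid edges
    i = min(max(bisect.bisect_left(xs, x) - 1, 0), len(xs) - 2)
    j = min(max(bisect.bisect_left(ys, y) - 1, 0), len(ys) - 2)
    tx = (x - xs[i]) / (xs[i + 1] - xs[i])
    ty = (y - ys[j]) / (ys[j + 1] - ys[j])
    return ((1 - tx) * (1 - ty) * dose[i][j] + tx * (1 - ty) * dose[i + 1][j]
            + (1 - tx) * ty * dose[i][j + 1] + tx * ty * dose[i + 1][j + 1])

xs, ys = [0.0, 1.0, 2.0], [0.0, 1.0]
dose = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]  # hypothetical dose rates (e.g. µSv/h)
print(bilinear(0.5, 0.5, xs, ys, dose))       # 2.5: mean of the four corner values
```

Bilinear interpolation reproduces any linear dose field exactly, which makes it easy to sanity-check against survey measurements before feeding estimates to the virtual reality display.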

  9. Estimating the CCSD basis-set limit energy from small basis sets: basis-set extrapolations vs additivity schemes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spackman, Peter R.; Karton, Amir, E-mail: amir.karton@uwa.edu.au

    Coupled cluster calculations with all single and double excitations (CCSD) converge exceedingly slowly with the size of the one-particle basis set. We assess the performance of a number of approaches for obtaining CCSD correlation energies close to the complete basis-set limit in conjunction with relatively small DZ and TZ basis sets. These include global and system-dependent extrapolations based on the A + B/L^α two-point extrapolation formula, and the well-known additivity approach that uses an MP2-based basis-set-correction term. We show that the basis-set convergence rate can change dramatically between different systems (e.g., it is slower for molecules with polar bonds and/or second-row elements). The system-dependent basis-set extrapolation scheme, in which unique basis-set extrapolation exponents for each system are obtained from lower-cost MP2 calculations, significantly accelerates the basis-set convergence relative to the global extrapolations. Nevertheless, we find that the simple MP2-based basis-set additivity scheme outperforms the extrapolation approaches. For example, the following root-mean-squared deviations are obtained for the 140 basis-set limit CCSD atomization energies in the W4-11 database: 9.1 (global extrapolation), 3.7 (system-dependent extrapolation), and 2.4 (additivity scheme) kJ mol⁻¹. The CCSD energy in these approximations is obtained from basis sets of up to TZ quality, and the latter two approaches require additional MP2 calculations with basis sets of up to QZ quality. We also assess the performance of the basis-set extrapolations and additivity schemes for a set of 20 basis-set limit CCSD atomization energies of larger molecules including amino acids, DNA/RNA bases, aromatic compounds, and platonic hydrocarbon cages. We obtain the following RMSDs for the above methods: 10.2 (global extrapolation), 5.7 (system-dependent extrapolation), and 2.9 (additivity scheme) kJ mol⁻¹.
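The A + B/L^α model can be inverted in closed form: from energies at consecutive cardinal numbers L−1 and L, the basis-set-limit estimate is E_∞ = (L^α E_L − (L−1)^α E_{L−1}) / (L^α − (L−1)^α). A small sketch with synthetic energies (the numbers are illustrative, not taken from the study):

```python
def extrapolate_cbs(e_lo, e_hi, l_hi, alpha):
    """Two-point basis-set extrapolation: solve E(L) = E_inf + B / L**alpha
    for E_inf, given energies at cardinal numbers l_hi - 1 and l_hi."""
    lo, hi = (l_hi - 1) ** alpha, l_hi ** alpha
    return (hi * e_hi - lo * e_lo) / (hi - lo)

# Synthetic check: construct energies that follow the model exactly,
# then verify the formula recovers the assumed limit.
e_inf, b, alpha = -76.35, 0.8, 3.0   # hypothetical CBS energy (hartree) and B
e_dz = e_inf + b / 2 ** alpha        # "DZ" energy (L = 2)
e_tz = e_inf + b / 3 ** alpha        # "TZ" energy (L = 3)
print(extrapolate_cbs(e_dz, e_tz, l_hi=3, alpha=alpha))  # recovers ~ -76.35
```

In the global scheme α is a fixed constant; in the system-dependent scheme described above, an effective α would first be fitted for each system from cheaper MP2 energies.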

  10. Scaling Up of Breastfeeding Promotion Programs in Low- and Middle-Income Countries: the “Breastfeeding Gear” Model

    PubMed Central

    Pérez-Escamilla, Rafael; Curry, Leslie; Minhas, Dilpreet; Taylor, Lauren; Bradley, Elizabeth

    2012-01-01

    Breastfeeding (BF) promotion is one of the most cost-effective interventions to advance mother–child health. Evidence-based frameworks and models to promote the effective scale up and sustainability of BF programs are still lacking. A systematic review of peer-reviewed and gray literature reports was conducted to identify key barriers and facilitators for scale up of BF programs in low- and middle-income countries. The review identified BF programs located in 28 countries in Africa, Latin America and the Caribbean, and Asia. Study designs included case studies, qualitative studies, and observational quantitative studies. Only 1 randomized, controlled trial was identified. A total of 22 enabling factors and 15 barriers were mapped into a scale-up framework termed “AIDED” that was used to build the parsimonious breastfeeding gear model (BFGM). Analogous to a well-oiled engine, the BFGM indicates the need for several key “gears” to be working in synchrony and coordination. Evidence-based advocacy is needed to generate the necessary political will to enact legislation and policies to protect, promote, and support BF at the hospital and community levels. This political-policy axis in turn drives the resources needed to support workforce development, program delivery, and promotion. Research and evaluation are needed to sustain the decentralized program coordination “gear” required for goal setting and system feedback. The BFGM helps explain the different levels of performance in national BF outcomes in Mexico and Brazil. Empirical research is recommended to further test the usefulness of the AIDED framework and BFGM for global scaling up of BF programs. PMID:23153733

  11. Impact of patient-specific factors, irradiated left ventricular volume, and treatment set-up errors on the development of myocardial perfusion defects after radiation therapy for left-sided breast cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Evans, Elizabeth S.; Prosnitz, Robert G.; Yu Xiaoli

    2006-11-15

    Purpose: The aim of this study was to assess the impact of patient-specific factors, left ventricle (LV) volume, and treatment set-up errors on the rate of perfusion defects 6 to 60 months post-radiation therapy (RT) in patients receiving tangential RT for left-sided breast cancer. Methods and Materials: Between 1998 and 2005, a total of 153 patients were enrolled onto an institutional review board-approved prospective study and had pre- and serial post-RT (6-60 months) cardiac perfusion scans to assess for perfusion defects. Of the patients, 108 had normal pre-RT perfusion scans and available follow-up data. The impact of patient-specific factors on the rate of perfusion defects was assessed at various time points using univariate and multivariate analysis. The impact of set-up errors on the rate of perfusion defects was also analyzed using a one-tailed Fisher's Exact test. Results: Consistent with our prior results, the volume of LV in the RT field was the most significant predictor of perfusion defects on both univariate (p = 0.0005 to 0.0058) and multivariate analysis (p = 0.0026 to 0.0029). Body mass index (BMI) was the only significant patient-specific factor on both univariate (p = 0.0005 to 0.022) and multivariate analysis (p = 0.0091 to 0.05). In patients with very small volumes of LV in the planned RT fields, the rate of perfusion defects was significantly higher when the fields were set up 'too deep' (83% vs. 30%, p = 0.059). The frequency of deep set-up errors was significantly higher among patients with BMI ≥25 kg/m² compared with patients of normal weight (47% vs. 28%, p = 0.068). Conclusions: BMI ≥25 kg/m² may be a significant risk factor for cardiac toxicity after RT for left-sided breast cancer, possibly because of more frequent deep set-up errors resulting in the inclusion of additional heart in the RT fields. Further study is necessary to better understand the impact of patient-specific factors and set-up errors on the

  12. Setting Dead at Zero: Applying Scale Properties to the QALY Model.

    PubMed

    Roudijk, Bram; Donders, A Rogier T; Stalmeier, Peep F M

    2018-04-01

    Scaling severe states can be a difficult task. First, the method of measurement affects whether a health state is considered better or worse than dead. Second, in discrete choice experiments, different models to anchor health states on 0 (dead) and 1 (perfect health) produce varying amounts of health states worse than dead. Within the context of the quality-adjusted life-year (QALY) model, this article provides insight into the value assigned to dead and its consequences for decision making. Our research questions are 1) what are the arguments set forth to assign dead the number 0 on the health-utility scale? And 2) what are the effects of the position of dead on the health-utility scale on decision making? A literature review was conducted to explore the arguments set forth to assign dead a value of 0 in the QALY model. In addition, scale properties and transformations were considered. The review uncovered several practical and theoretical considerations for setting dead at 0. In the QALY model, indifference between 2 health episodes is not preserved under changes of the origin of the duration scale. Ratio scale properties are needed for the duration scale to preserve indifferences. In combination with preferences and zero conditions for duration and health, it follows that dead should have a value of 0. The health-utility and duration scales have ratio scale properties, and dead should be assigned the number 0. Furthermore, the position of dead should be carefully established, because it determines how life-saving and life-improving values are weighed in cost-utility analysis.
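The scale argument can be made concrete with a toy calculation: under the QALY model Q = u × t, rescaling the utility scale (a ratio-scale transformation that keeps dead at 0) preserves the preference ordering of health episodes, whereas shifting its origin (moving dead away from 0) can reverse it. All utilities and durations below are hypothetical:

```python
def qaly(u, t):
    return u * t  # quality-adjusted life-years: utility times duration

# Episode A: short but in good health; episode B: long but in poor health.
a = (0.9, 2.0)     # (utility, years) -> 1.8 QALYs
b = (0.15, 10.0)   # -> 1.5 QALYs

print(qaly(*a) > qaly(*b))                          # True: A preferred

# Ratio transform (utilities rescaled, dead stays at 0): ordering preserved.
print(qaly(a[0] * 2, a[1]) > qaly(b[0] * 2, b[1]))  # True

# Origin shift (dead moved to -0.4 on the new scale): ordering reverses,
# because the shift rewards long durations regardless of quality.
print(qaly(a[0] + 0.4, a[1]) > qaly(b[0] + 0.4, b[1]))  # False
```

This is the practical force of the ratio-scale requirement discussed above: unless dead is anchored at 0, cost-utility rankings depend on an arbitrary choice of origin.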

  13. Enhancing Classroom Management Using the Classroom Check-up Consultation Model with In-Vivo Coaching and Goal Setting Components

    ERIC Educational Resources Information Center

    Kleinert, Whitney L.; Silva, Meghan R.; Codding, Robin S.; Feinberg, Adam B.; St. James, Paula S.

    2017-01-01

    Classroom management is essential to promote learning in schools, and as such it is imperative that teachers receive adequate support to maximize their competence implementing effective classroom management strategies. One way to improve teachers' classroom managerial competence is through consultation. The Classroom Check-Up (CCU) is a structured…

  14. Anomalous Lower Crustal and Surface Features as a Result of Plume-induced Continental Break-up: Inferences from Numerical Models

    NASA Astrophysics Data System (ADS)

    Beniest, A.; Koptev, A.; Leroy, S. D.

    2016-12-01

    Anomalous features along the South American and African rifted margins, at depth and at the surface, have been recognised with gravity and magnetic modelling. They include high-velocity/high-density bodies at lower crustal level and topography variations that are usually interpreted as aborted rifts. We present fully coupled lithosphere-scale numerical models that permit us to explain both features in a relatively simple framework of an interaction between a rheologically stratified continental lithosphere and an active mantle plume. We used 2D and 3D numerical models to investigate the impact of the thermo-rheological structure of the continental lithosphere and the initial plume position on continental rifting and break-up processes. Based on the results of our 2D experiments, three main types of continental break-up are revealed: A) mantle plume-induced break-up directly located above the centre of the mantle anomaly; B) mantle plume-induced break-up displaced 50 to 250 km from the initial plume location; and C) self-induced break-up due to convection and/or slab subduction/delamination, considerably shifted (300 to 800 km) from the initial plume position. With our 3D, laterally homogeneous initial setup, we show that a complex system, with the axis of continental break-up shifted hundreds of kilometres from the original plume location, can arise spontaneously from simple and perfectly symmetric preliminary settings. Our modelling demonstrates that fragments of a laterally migrating plume head become glued to the base of the lithosphere and remain on both sides of the newly formed oceanic basin after continental break-up. Underplated plume material soldered into the lower parts of the lithosphere can be interpreted as the high-velocity/high-density magmatic bodies at lower crustal levels. In the very early stages of rifting, the first impingement of the vertically upwelling mantle plume on the lithospheric base leads to surface topographic variations. Given the shifted position of the final

  15. Development of a new model to engage patients and clinicians in setting research priorities.

    PubMed

    Pollock, Alex; St George, Bridget; Fenton, Mark; Crowe, Sally; Firkins, Lester

    2014-01-01

    Equitable involvement of patients and clinicians in setting research and funding priorities is ethically desirable and can improve the quality, relevance and implementation of research. Survey methods used in previous priority setting projects to gather treatment uncertainties may not be sufficient to facilitate responses from patients and their lay carers for some health care topics. We aimed to develop a new model to engage patients and clinicians in setting research priorities relating to life after stroke, and to explore the use of this model within a James Lind Alliance (JLA) priority setting project. We developed a model to facilitate involvement through targeted engagement and assisted involvement (FREE TEA model). We implemented both standard surveys and the FREE TEA model to gather research priorities (treatment uncertainties) from people affected by stroke living in Scotland. We explored and compared the number of treatment uncertainties elicited from different groups by the two approaches. We gathered 516 treatment uncertainties from stroke survivors, carers and health professionals. We achieved approximately equal numbers of contributions; 281 (54%) from stroke survivors/carers; 235 (46%) from health professionals. For stroke survivors and carers, 98 (35%) treatment uncertainties were elicited from the standard survey and 183 (65%) at FREE TEA face-to-face visits. This contrasted with the health professionals, for whom 198 (84%) were elicited from the standard survey and only 37 (16%) from FREE TEA visits. The FREE TEA model has implications for future priority setting projects and user-involvement relating to populations of people with complex health needs. Our results imply that reliance on standard surveys may result in poor and unrepresentative involvement of patients, thereby favouring the views of health professionals.

  16. A Coupled THMC model of FEBEX mock-up test

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zheng, Liange; Samper, Javier

    2008-09-15

    FEBEX (Full-scale Engineered Barrier EXperiment) is a demonstration and research project for the engineered barrier system (EBS) of a radioactive waste repository in granite. It includes two full-scale heating and hydration tests: the in situ test performed at Grimsel (Switzerland) and a mock-up test operating at the CIEMAT facilities in Madrid (Spain). The mock-up test provides valuable insight into the thermal, hydrodynamic, mechanical and chemical (THMC) behavior of the EBS because its hydration is controlled better than that of the in situ test, in which the buffer is saturated with water from the surrounding granitic rock. Here we present a coupled THMC model of the mock-up test which accounts for thermal and chemical osmosis and bentonite swelling with a state-surface approach. The THMC model reproduces measured temperature and cumulative water inflow data. It also fits relative humidity data at the outer part of the buffer, but underestimates relative humidities near the heater. Dilution due to hydration and evaporation near the heater are the main processes controlling the concentration of conservative species, while surface complexation, mineral dissolution/precipitation and cation exchange significantly affect reactive species as well. Results of sensitivity analyses to chemical processes show that pH is mostly controlled by surface complexation, while dissolved cation concentrations are controlled by cation exchange reactions.

  17. Digital dental photography. Part 6: camera settings.

    PubMed

    Ahmad, I

    2009-07-25

    Once the appropriate camera and equipment have been purchased, the next considerations involve setting up and calibrating the equipment. This article provides details regarding depth of field, exposure, colour spaces and white balance calibration, concluding with a synopsis of camera settings for a standard dental set-up.

  18. Northern Russian chironomid-based modern summer temperature data set and inference models

    NASA Astrophysics Data System (ADS)

    Nazarova, Larisa; Self, Angela E.; Brooks, Stephen J.; van Hardenbroek, Maarten; Herzschuh, Ulrike; Diekmann, Bernhard

    2015-11-01

    West and East Siberian data sets and 55 new sites were merged based on their high taxonomic similarity and the strong relationship between mean July air temperature and the distribution of chironomid taxa in both data sets compared with other environmental parameters. Multivariate statistical analysis of chironomid and environmental data from the combined data set, consisting of 268 lakes located in northern Russia, suggests that mean July air temperature explains the greatest amount of variance in chironomid distribution compared with other measured variables (latitude, longitude, altitude, water depth, lake surface area, pH, conductivity, mean January air temperature, mean July air temperature, and continentality). We established two robust inference models to reconstruct mean summer air temperatures from subfossil chironomids, based on ecological and geographical approaches. The North Russian 2-component WA-PLS model (RMSEP_jack = 1.35 °C, r²_jack = 0.87) can be recommended for application in palaeoclimatic studies in northern Russia. Given the distinctive chironomid fauna and climatic regimes of Kamchatka, the Far East 2-component WA-PLS model (RMSEP_jack = 1.3 °C, r²_jack = 0.81) has potentially better applicability in Kamchatka.

  19. Ranked set sampling: cost and optimal set size.

    PubMed

    Nahhas, Ramzi W; Wolfe, Douglas A; Chen, Haiying

    2002-12-01

    McIntyre (1952, Australian Journal of Agricultural Research 3, 385-390) introduced ranked set sampling (RSS) as a method for improving estimation of a population mean in settings where sampling and ranking of units from the population are inexpensive when compared with actual measurement of the units. Two of the major factors in the usefulness of RSS are the set size and the relative costs of the various operations of sampling, ranking, and measurement. In this article, we consider ranking error models and cost models that enable us to assess the effect of different cost structures on the optimal set size for RSS. For reasonable cost structures, we find that the optimal RSS set sizes are generally larger than had been anticipated previously. These results will provide a useful tool for determining whether RSS is likely to lead to an improvement over simple random sampling in a given setting and, if so, what RSS set size is best to use in this case.
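The efficiency gain that motivates RSS can be checked with a small simulation under perfect ranking and a standard normal population (a toy sketch, not the article's cost and ranking-error models): in each cycle, the i-th of k sets contributes only its i-th smallest unit to the measured sample.

```python
import random
import statistics

random.seed(42)

def rss_mean(k, cycles):
    """Ranked set sampling estimate of the mean under perfect ranking:
    set i contributes the i-th order statistic of its k drawn units."""
    measurements = []
    for _ in range(cycles):
        for i in range(k):
            measurements.append(sorted(random.gauss(0, 1) for _ in range(k))[i])
    return statistics.mean(measurements)

def srs_mean(n):
    """Simple random sample mean with the same number of measured units."""
    return statistics.mean(random.gauss(0, 1) for _ in range(n))

k, cycles, reps = 3, 10, 2000
var_rss = statistics.variance(rss_mean(k, cycles) for _ in range(reps))
var_srs = statistics.variance(srs_mean(k * cycles) for _ in range(reps))
print(var_rss < var_srs)  # True: RSS beats SRS at equal measurement cost
```

The gain holds only while ranking is cheap and reasonably accurate, which is exactly the trade-off the cost models in the article quantify when choosing the optimal set size.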

  20. Setting up a parathyroid multidisciplinary team meeting: one year's experience, outcomes and learning points.

    PubMed

    Hancox, S H; Sinnott, J D; Kirkland, P; Lipscomb, D; Owens, E; Howlett, D C

    2018-03-01

    A parathyroid multidisciplinary team meeting was set up at East Sussex Healthcare Trust, from November 2014 to November 2015, in order to improve and streamline services for patients with parathyroid pathology. Data were collected on all new referrals for hyperparathyroidism, and on the outcomes for each patient discussed at the meeting, including the number of operations and management outcomes. A survey was sent out to the members of the multidisciplinary team meeting to determine their perception of its effectiveness. Seventy-nine new referrals were discussed throughout the year; 43 per cent were recommended for surgery, 41 per cent had a trial of conservative or medical management before re-discussion, and 16 per cent required further imaging. Ninety-two per cent of patients underwent an ultrasound, single-photon emission computed tomography/computed tomography or nuclear medicine (sestamibi) scan prior to the meeting. All ultrasound scans were performed by a consultant radiologist. The multidisciplinary team meeting has been successful, with perceived benefits for patients, improved imaging evaluation and efficiency of referral pathways, leading to more appropriate patient management.

  1. Assuring high quality treatment delivery in clinical trials - Results from the Trans-Tasman Radiation Oncology Group (TROG) study 03.04 "RADAR" set-up accuracy study.

    PubMed

    Haworth, Annette; Kearvell, Rachel; Greer, Peter B; Hooton, Ben; Denham, James W; Lamb, David; Duchesne, Gillian; Murray, Judy; Joseph, David

    2009-03-01

    A multi-centre clinical trial for prostate cancer patients provided an opportunity to introduce conformal radiotherapy with dose escalation. To verify adequate treatment accuracy prior to patient recruitment, centres submitted details of a set-up accuracy study (SUAS). We report the results of the SUAS, the variation in clinical practice and the strategies used to help centres improve treatment accuracy. The SUAS required each of the 24 participating centres to collect data on at least 10 pelvic patients imaged on a minimum of 20 occasions. Software was provided for data collection and analysis. Support to centres was provided through educational lectures, the trial quality assurance team and an information booklet. Only two centres had recently carried out a SUAS prior to the trial opening. Systematic errors were generally smaller than those previously reported in the literature. The questionnaire identified many differences in patient set-up protocols. As a result of participating in this QA activity, more than 65% of centres improved their treatment delivery accuracy. Conducting a pre-trial SUAS has led to improvement in treatment delivery accuracy in many centres. Treatment techniques and set-up accuracy varied greatly, demonstrating a need to maintain ongoing awareness of such studies in future trials and when introducing dose escalation or new technologies.

  2. Comparing the costs of three prostate cancer follow-up strategies: a cost minimisation analysis.

    PubMed

    Pearce, Alison M; Ryan, Fay; Drummond, Frances J; Thomas, Audrey Alforque; Timmons, Aileen; Sharp, Linda

    2016-02-01

    Prostate cancer follow-up is traditionally provided by clinicians in a hospital setting. Growing numbers of prostate cancer survivors mean that this model of care may not be economically sustainable, and a number of alternative approaches have been suggested. The aim of this study was to develop an economic model to compare the costs of three alternative strategies for prostate cancer follow-up in Ireland: the European Association of Urology (EAU) guidelines, the National Institute for Health and Care Excellence (NICE) guidelines, and current practice. A cost minimisation analysis was performed using a Markov model with three arms (EAU guidelines, NICE guidelines and current practice) comparing follow-up for men with prostate cancer treated with curative intent. The model took a health care payer's perspective over a 10-year time horizon. Current practice was the least cost-efficient arm of the model, the NICE guidelines were the most cost-efficient (74% of current practice costs) and the EAU guidelines intermediate (92% of current practice costs). For the 2562 new cases of prostate cancer diagnosed in 2009, the Irish health care system could have saved €760,000 over a 10-year period if the NICE guidelines had been adopted. This is the first study investigating the costs of prostate cancer follow-up in the Irish setting. While economic models are designed as a simplification of complex real-world situations, these results suggest potential for significant savings within the Irish health care system associated with implementation of alternative models of prostate cancer follow-up care.
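The structure of such a Markov cost-minimisation model can be sketched as a cohort simulation over annual cycles. All states, transition probabilities, discount rate, and annual costs below are hypothetical placeholders, not the values used in the study:

```python
import numpy as np

# States: 0 = remission, 1 = recurrence, 2 = dead (absorbing).
P = np.array([[0.93, 0.05, 0.02],    # hypothetical annual transition probabilities
              [0.00, 0.85, 0.15],
              [0.00, 0.00, 1.00]])

# Hypothetical annual follow-up cost per alive state (EUR) for two strategies.
costs = {"current practice": np.array([900.0, 2500.0, 0.0]),
         "guideline-based":  np.array([650.0, 2500.0, 0.0])}

def expected_cost(P, c, horizon=10, discount=0.04):
    """Discounted expected cost per patient over the time horizon,
    for a cohort starting in remission."""
    state = np.array([1.0, 0.0, 0.0])   # state-occupancy probabilities
    total = 0.0
    for year in range(horizon):
        total += state @ c / (1 + discount) ** year  # cost accrued this cycle
        state = state @ P                            # advance the cohort one cycle
    return total

for name, c in costs.items():
    print(name, round(expected_cost(P, c)))
```

Because both arms share the same transition structure and only the follow-up costs differ, the comparison reduces to a cost minimisation, mirroring the design described above.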

  3. Modelling oral up-take of hydrophobic and super-hydrophobic chemicals in fish.

    PubMed

    Larisch, Wolfgang; Goss, Kai-Uwe

    2018-01-24

    We have extended a recently published toxicokinetic model for fish (TK-fish) towards the oral up-take of contaminants. Validation with hydrophobic chemicals revealed that diffusive transport through aqueous boundary layers in the gastro-intestinal tract and in the blood is the limiting process. This process can only be modelled correctly if facilitated transport by albumin or bile micelles through these boundary layers is accounted for. In a case study we have investigated the up-take of a super-hydrophobic chemical, Dechlorane Plus. Our results suggest that there is no indication of a hydrophobicity or size cut-off in the bioconcentration of this chemical. Based on an extremely high, but mechanistically sound, facilitation factor, we obtained model results in good agreement with experimental values from the literature. The results also indicate that established experimental procedures for BCF determination cannot cover the very slow up-take and clearance kinetics that are to be expected for such a chemical.

  4. Dynamic sensitivity analysis of long running landslide models through basis set expansion and meta-modelling

    NASA Astrophysics Data System (ADS)

    Rohmer, Jeremy

    2016-04-01

    Predicting the temporal evolution of landslides is typically supported by numerical modelling. Dynamic sensitivity analysis aims at assessing the influence of the landslide properties on the time-dependent predictions (e.g., time series of landslide displacements). Yet two major difficulties arise: 1. Global sensitivity analysis requires running the landslide model a high number of times (> 1000), which may become impracticable when the landslide model has a high computational cost (> several hours); 2. Landslide model outputs are not scalar, but functions of time, i.e. they are n-dimensional vectors with n usually ranging from 100 to 1000. In this article, I explore the use of a basis set expansion, such as principal component analysis, to reduce the output dimensionality to a few components, each of them being interpreted as a dominant mode of variation in the overall structure of the temporal evolution. The computationally intensive calculation of the Sobol' indices for each of these components is then achieved through meta-modelling, i.e. by replacing the landslide model by a "costless-to-evaluate" approximation (e.g., a projection pursuit regression model). The methodology combining "basis set expansion - meta-model - Sobol' indices" is then applied to the La Frasse landslide to investigate the dynamic sensitivity of the surface horizontal displacements to the slip surface properties during pore pressure changes. I show how to extract information on the sensitivity of each main mode of temporal behaviour using a limited number (a few tens) of long running simulations. In particular, I identify the parameters that trigger the occurrence of a turning point marking a shift between a regime of low values of landslide displacements and one of high values.
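    The "basis set expansion - meta-model" part of the pipeline can be sketched end-to-end with a toy stand-in for the expensive simulator. The two-parameter model, the ensemble size and the linear surrogate below are all illustrative assumptions (the paper uses projection pursuit regression, with Sobol' indices computed on top of this same structure).

```python
import numpy as np

# Toy stand-in for a long-running simulator producing a displacement series.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 200)                  # n = 200 output time steps

def slow_model(a, b):
    # placeholder for an expensive landslide run with parameters (a, b)
    return a * t + b * np.sin(2 * np.pi * t)

# Small training ensemble (a few tens of runs, as in the abstract).
params = rng.uniform(0.5, 1.5, size=(30, 2))
Y = np.array([slow_model(a, b) for a, b in params])      # shape (30, 200)

# Basis set expansion: PCA via SVD of the centred outputs.
mean = Y.mean(axis=0)
U, s, Vt = np.linalg.svd(Y - mean, full_matrices=False)
k = 2
scores = U[:, :k] * s[:k]                       # each run summarised by k scores
explained = (s[:k] ** 2).sum() / (s ** 2).sum()

# Cheap meta-model: linear map params -> PCA scores via least squares.
X = np.column_stack([np.ones(len(params)), params])
coef, *_ = np.linalg.lstsq(X, scores, rcond=None)

# Surrogate prediction for a new parameter set, mapped back to a time series.
new = np.array([1.0, 1.2, 0.8])                 # [intercept, a, b]
y_hat = mean + (new @ coef) @ Vt[:k]
```

    Sensitivity indices would then be estimated on the cheap map from parameters to scores rather than on the simulator itself.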

  5. Fire-setting performed in adolescence or early adulthood predicts schizophrenia: a register-based follow-up study of pre-trial offenders.

    PubMed

    Thomson, Annika; Tiihonen, Jari; Miettunen, Jouko; Virkkunen, Matti; Lindberg, Nina

    2017-02-01

    Aggressive and disruptive behaviours often precede the onset of serious mental illnesses. Fire-setting is a type of crime that is associated with psychotic disorders. The aim of this prospective follow-up study was to investigate if fire-setting performed in adolescence or early adulthood was associated with future diagnoses of schizophrenia or schizoaffective disorder. The consecutive sample consisted of 111 Finnish 15- to 25-year-old males with fire-setting crimes, decreed to a pre-trial forensic psychiatric examination in 1973-1998, and showing no past or current psychosis at the time of examination. For each firesetter, four age-, gender-, and place of birth-matched controls were randomly selected from the Central Population Register. The subjects were followed until the death of the individual, until they moved abroad, or until the end of 2012. Fourteen firesetters (12.6%) and five controls (1.1%) were diagnosed with either schizophrenia or schizoaffective disorder later in life, corresponding to a hazard ratio of 12.5. The delay between the fire-setting offense and the future diagnosis was on average nearly 10 years. Young male offenders undergoing a forensic psychiatric examination because of fire-setting crimes had a significant propensity for schizophrenia and schizoaffective disorder. Accurate assessments should be made both during imprisonment and later in life to detect possible psychotic signs in these individuals.

  6. In search of the economic sustainability of Hadron therapy: the real cost of setting up and operating a Hadron facility.

    PubMed

    Vanderstraeten, Barbara; Verstraete, Jan; De Croock, Roger; De Neve, Wilfried; Lievens, Yolande

    2014-05-01

    To determine the treatment cost and required reimbursement for a new hadron therapy facility, considering different technical solutions and financing methods. The 3 technical solutions analyzed are a carbon only (COC), proton only (POC), and combined (CC) center, each operating 2 treatment rooms and assumed to function at full capacity. A business model defines the required reimbursement and analyzes the financial implications of setting up a facility over time; activity-based costing (ABC) calculates the treatment costs per type of patient for a center in a steady state of operation. Both models compare a private, full-cost approach with public sponsoring, only taking into account operational costs. Yearly operational costs range from €10.0M (M = million) for a publicly sponsored POC to €24.8M for a CC with private financing. Disregarding inflation, the average treatment cost calculated with ABC (COC: €29,450; POC: €46,342; CC: €46,443 for private financing; respectively €16,059, €28,296, and €23,956 for public sponsoring) is slightly lower than the required reimbursement based on the business model (between €51,200 in a privately funded POC and €18,400 in a COC with public sponsoring). Reimbursement for privately financed centers is very sensitive to a delay in commissioning and to the interest rate. Higher throughput and hypofractionation have a positive impact on the treatment costs. Both calculation methods are valid and complementary. The financially most attractive option of a publicly sponsored COC should be balanced against the clinical necessities and the sociopolitical context. Copyright © 2014 Elsevier Inc. All rights reserved.

  7. Supervised exercise therapy: it does work, but how to set up a program?

    PubMed

    Hageman, David; van den Houten, Marijn M; Spruijt, Steffie; Gommans, Lindy N; Scheltinga, Marc R; Teijink, Joep A

    2017-04-01

    Intermittent claudication (IC) is a manifestation of peripheral arterial disease. IC has a high prevalence in the older population, is closely associated with other expressions of atherosclerotic disease and often co-exists in multimorbid patients. Treatment of IC should address reduction of cardiovascular risk and improvement of functional capacity and health-related quality of life (QoL). As recommended by contemporary international guidelines, the first-line treatment includes supervised exercise therapy (SET). In several randomized controlled trials and systematic reviews, SET is compared with usual care, placebo, walking advice and endovascular revascularization. The evidence supporting the efficacy of SET programs to alleviate claudication symptoms is robust. SET improves walking distance and health-related QoL and appears to be the most cost-effective treatment for IC. Nevertheless, only a few of all newly diagnosed IC patients worldwide receive this safe, efficient and structured treatment. Worldwide implementation of structured SET programs is seriously impeded by outdated arguments favoring an invasive intervention, absence of a network of specialized physical therapists providing standardized SET and lack of awareness and/or knowledge of the importance of SET by referring physicians. In addition, misguided financial incentives and lack of reimbursement hamper actual use of SET programs. In the Netherlands, a national integrated care network (ClaudicatioNet) was launched in 2011 to combat treatment shortcomings and stimulate cohesion and collaboration between stakeholders. This care intervention has resulted in optimized quality of care for all patients with IC.

  8. Microstructural modeling of thermal conductivity of high burn-up mixed oxide fuel

    NASA Astrophysics Data System (ADS)

    Teague, Melissa; Tonks, Michael; Novascone, Stephen; Hayes, Steven

    2014-01-01

    Predicting the thermal conductivity of oxide fuels as a function of burn-up and temperature is fundamental to the efficient and safe operation of nuclear reactors. However, modeling the thermal conductivity of fuel is greatly complicated by the radially inhomogeneous nature of irradiated fuel in both composition and microstructure. In this work, radially and temperature-dependent models for effective thermal conductivity were developed utilizing optical micrographs of high burn-up mixed oxide fuel. The micrographs were employed to create finite element meshes with the OOF2 software. The meshes were then used to calculate the effective thermal conductivity of the microstructures using the BISON [1] fuel performance code. The new thermal conductivity models were used to calculate thermal profiles at end of life for the fuel pellets. These results were compared to thermal conductivity models from the literature, and comparison between the new finite element-based thermal conductivity model and the Duriez-Lucuta model was favorable.
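    The workflow above homogenizes real micrographs with finite elements; a much cheaper sanity check, often run alongside such calculations, brackets the effective conductivity with the series/parallel (Wiener) mixing bounds and estimates it with the Maxwell-Eucken relation for dilute inclusions. The conductivities and porosity below are illustrative placeholders, not measured MOX data.

```python
# Illustrative two-phase estimates of effective thermal conductivity.
# All values below are placeholders, not measured MOX fuel data.
k_m, k_p, phi = 3.0, 0.025, 0.10   # matrix and pore conductivity (W/m/K), porosity

# Wiener bounds: parallel (arithmetic) and series (harmonic) mixing
k_parallel = (1 - phi) * k_m + phi * k_p
k_series = 1.0 / ((1 - phi) / k_m + phi / k_p)

# Maxwell-Eucken relation for dilute spherical inclusions
k_me = k_m * (k_p + 2 * k_m + 2 * phi * (k_p - k_m)) / (
       k_p + 2 * k_m - phi * (k_p - k_m))
```

    Any mesh-based effective conductivity should land between the two Wiener bounds; values outside them signal a meshing or boundary-condition error.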

  9. Microstructural Modeling of Thermal Conductivity of High Burn-up Mixed Oxide Fuel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Melissa Teague; Michael Tonks; Stephen Novascone

    2014-01-01

    Predicting the thermal conductivity of oxide fuels as a function of burn-up and temperature is fundamental to the efficient and safe operation of nuclear reactors. However, modeling the thermal conductivity of fuel is greatly complicated by the radially inhomogeneous nature of irradiated fuel in both composition and microstructure. In this work, radially and temperature-dependent models for effective thermal conductivity were developed utilizing optical micrographs of high burn-up mixed oxide fuel. The micrographs were employed to create finite element meshes with the OOF2 software. The meshes were then used to calculate the effective thermal conductivity of the microstructures using the BISON fuel performance code. The new thermal conductivity models were used to calculate thermal profiles at end of life for the fuel pellets. These results were compared to thermal conductivity models from the literature, and comparison between the new finite element-based thermal conductivity model and the Duriez–Lucuta model was favorable.

  10. Regional primitive equation modeling and analysis of the polymode data set

    NASA Astrophysics Data System (ADS)

    Spall, Michael A.

    A regional, hybrid coordinate, primitive equation (PE) model is applied to a 60-day period of the POLYMODE data set. The initialization techniques and open boundary conditions introduced by Spall and Robinson are shown to produce stable, realistic, and reasonably accurate hindcasts for the 2-month data set. Comparisons with quasi-geostrophic (QG) modeling studies indicate that the PE model reproduced the jet formation that dominates the region more accurately than did the QG model. When the PE model used boundary conditions that were partially adjusted by the QG model, the resulting fields were very similar to the QG fields, indicating a rapid degradation of small-scale features near the boundaries in the QG calculation. A local term-by-term primitive equation energy and vorticity analysis package is also introduced. The full vorticity, horizontal divergence, kinetic energy, and available gravitational energy equations are solved diagnostically from the output of the regional PE model. Through the analysis of a time series of horizontal maps, the dominant processes in the flow are illustrated. The individual terms are also integrated over the region of jet formation to highlight the net balances as a function of time. The formation of the deep thermocline jet is shown to be due to horizontal advection through the boundary, baroclinic conversion in the deep thermocline and vertical pressure work, which exports the deep energy to the upper thermocline levels. It is concluded here that the PE model reproduces the observed jet formation better than the QG model because of the increased horizontal advection and stronger vertical pressure work. Although the PE model is shown to be superior to the QG model in this application, it is believed that both PE and QG models can play an important role in the regional study of mid-ocean mesoscale eddies.

  11. Capacitated set-covering model considering the distance objective and dependency of alternative facilities

    NASA Astrophysics Data System (ADS)

    Wayan Suletra, I.; Priyandari, Yusuf; Jauhari, Wakhid A.

    2018-03-01

    We propose a new facility location model for a problem that belongs to the class of set-covering problems, using an integer programming formulation. Our model contains a single objective function, but it represents two goals. The first is to minimize the number of facilities, and the second is to minimize the total distance of customers to facilities. The first goal is mandatory, and the second is an improvement goal that is very useful when alternative optimal solutions for the first goal exist. We place a big number as a weight on the first goal to force the solution algorithm to give it first priority. Besides capacity constraints, our model accommodates either-or constraints representing facility dependency. The either-or constraints prevent the solution algorithm from selecting two or more facilities from the same set of mutually exclusive alternatives. A real location selection problem, locating a set of wastewater treatment facilities (IPAL) in Surakarta city, Indonesia, illustrates the implementation of our model. A numerical example is given using the data of that real problem.
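    The model's logic can be sketched with brute-force enumeration in place of an integer programming solver: the big-number weight makes the facility-count goal dominate, the either-or set bars mutually exclusive alternatives, and capacities are checked by a simple nearest-feasible assignment. All positions, capacities and the exclusion pair are made-up one-dimensional data, not the Surakarta IPAL inputs.

```python
from itertools import combinations

# Illustrative 1-D instance; none of these numbers come from the paper.
FAC = {"A": 0.0, "B": 5.0, "C": 10.0}     # facility positions
CAP = {"A": 2, "B": 3, "C": 2}            # max customers each can serve
EXCLUSIVE = [{"A", "B"}]                  # either-or: never open both
CUSTOMERS = [1.0, 4.0, 6.0, 9.0]
RADIUS = 5.0                              # coverage distance
BIG = 1e6                                 # weight on the facility-count goal

def assign(open_set):
    """Greedy nearest-feasible assignment; returns total distance or None."""
    cap = {f: CAP[f] for f in open_set}
    total = 0.0
    for c in CUSTOMERS:
        options = sorted((abs(c - FAC[f]), f) for f in open_set
                         if abs(c - FAC[f]) <= RADIUS and cap[f] > 0)
        if not options:
            return None                   # customer uncovered: infeasible
        d, f = options[0]
        cap[f] -= 1
        total += d
    return total

best = None
for r in range(1, len(FAC) + 1):
    for subset in combinations(FAC, r):
        s = set(subset)
        if any(g <= s for g in EXCLUSIVE):
            continue                      # violates an either-or constraint
        dist = assign(s)
        if dist is None:
            continue
        obj = BIG * len(s) + dist         # single weighted objective
        if best is None or obj < best[0]:
            best = (obj, s, dist)

best_obj, best_open, best_dist = best
```

    On this instance no single facility is feasible, so the big-number term fixes the count at two, and the distance term then picks the cheaper of the feasible pairs.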

  12. Adjusting for Confounding in Early Postlaunch Settings: Going Beyond Logistic Regression Models.

    PubMed

    Schmidt, Amand F; Klungel, Olaf H; Groenwold, Rolf H H

    2016-01-01

    Postlaunch data on medical treatments can be analyzed to explore adverse events or relative effectiveness in real-life settings. These analyses are often complicated by the number of potential confounders and the possibility of model misspecification. We conducted a simulation study to compare the performance of logistic regression, propensity score, disease risk score, and stabilized inverse probability weighting methods to adjust for confounding. Model misspecification was induced in the independent derivation dataset. We evaluated performance using relative bias and confidence interval coverage of the true effect, among other metrics. At low events per coefficient (1.0 and 0.5), the logistic regression estimates had a large relative bias (greater than -100%). Bias of the disease risk score estimates was at most 13.48% and 18.83%. For the propensity score model, this was 8.74% and >100%, respectively. At events per coefficient of 1.0 and 0.5, inverse probability weighting frequently failed or reduced to a crude regression, resulting in biases of -8.49% and 24.55%. Coverage of logistic regression estimates became less than the nominal level at events per coefficient ≤5. For the disease risk score, inverse probability weighting, and propensity score, coverage became less than nominal at events per coefficient ≤2.5, ≤1.0, and ≤1.0, respectively. Bias of misspecified disease risk score models was 16.55%. In settings with low events/exposed subjects per coefficient, disease risk score methods can be useful alternatives to logistic regression models, especially when propensity score models cannot be used. Despite better performance of disease risk score methods than logistic regression and propensity score models in small events per coefficient settings, bias and coverage still deviated from nominal.
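    As a minimal illustration of one of the compared methods, the sketch below applies stabilized inverse probability weighting with the (here known) true propensity score to a simulated confounded dataset. The data-generating numbers are arbitrary, and the true treatment effect is zero by construction, so the weighted risk difference should land near zero while the crude one does not.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50_000

# Confounder Z raises both treatment probability and outcome risk;
# the true treatment effect is zero by construction.
Z = rng.random(n) < 0.5
pT = np.where(Z, 0.8, 0.2)               # true propensity score
T = rng.random(n) < pT
pY = 0.1 + 0.2 * Z                       # outcome risk, independent of T
Y = rng.random(n) < pY

# Crude (confounded) risk difference.
crude_rd = Y[T].mean() - Y[~T].mean()

# Stabilized inverse probability weights: P(T=t) / P(T=t | Z).
pt_marg = T.mean()
sw = np.where(T, pt_marg / pT, (1 - pt_marg) / (1 - pT))
ipw_rd = (np.average(Y[T], weights=sw[T])
          - np.average(Y[~T], weights=sw[~T]))
```

    In practice the propensity score would be estimated (e.g., by logistic regression), which is exactly where the low events-per-coefficient problems studied in the paper arise.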

  13. Top-down and bottom-up modeling in system pharmacology to understand clinical efficacy: An example with NRTIs of HIV-1.

    PubMed

    Duwal, Sulav; von Kleist, Max

    2016-10-30

    A major aim of Systems Pharmacology is to understand clinically relevant mechanisms of action (MOA) of drugs and to use this knowledge in order to optimize therapy. To enable this mission it is necessary to obtain knowledge on how in vitro testable insights translate into clinical efficacy. Mathematical modeling and data integration are essential components to achieve this goal. Two modeling philosophies are prevalent, each of which in isolation is not sufficient to achieve the above described: In a 'top-down' approach, a minimal pharmacokinetic-pharmacodynamic (PK-PD) model is derived from- and fitted to available clinical data. This model may lack interpretability in terms of mechanisms and may only be predictive for scenarios already covered by the data used to derive it. A 'bottom-up' approach builds on mechanistic insights derived from in vitro/ex vivo experiments, which can be conducted under controlled conditions, but may not be fully representative for the in vivo/clinical situation. In this work, we employ both approaches side-by-side to predict the clinical potency (IC50 values) of the nucleoside reverse transcriptase inhibitors (NRTIs) lamivudine, emtricitabine and tenofovir. In the 'top-down' approach, this requires to establish the dynamic link between the intracellularly active NRTI-triphosphates (which exert the effect) and plasma prodrug PK and to subsequently link this composite PK model to viral kinetics. The 'bottom-up' approach assesses inhibition of reverse transcriptase-mediated viral DNA polymerization by the intracellular, active NRTI-triphosphates, which has to be brought into the context of target cell infection. By using entirely disparate sets of data to derive and parameterize the respective models, our approach serves as a means to assess the clinical relevance of the 'bottom-up' approach. We obtain very good qualitative and quantitative agreement between 'top-down' vs. 'bottom-up' predicted IC50 values, arguing for the validity of

  14. A simple recipe for setting up the flux equations of cyclic and linear reaction schemes of ion transport with a high number of states: The arrow scheme.

    PubMed

    Hansen, Ulf-Peter; Rauh, Oliver; Schroeder, Indra

    2016-01-01

    The calculation of flux equations or current-voltage relationships in reaction kinetic models with a high number of states can be very cumbersome. Here, a recipe based on an arrow scheme is presented, which yields straightforward access to the minimum form of the flux equations and the occupation probability of the involved states in cyclic and linear reaction schemes. This is extremely simple for cyclic schemes without branches. If branches are involved, the effort of setting up the equations is somewhat higher. However, also here a straightforward recipe making use of so-called reserve factors is provided for implementing the branches into the cyclic scheme, thus also enabling a simple treatment of such cases.
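    As a concrete (if scheme-agnostic) cross-check for such flux calculations, the steady state of a small cyclic scheme can also be obtained by solving the master equation numerically; the arrow-scheme recipe yields the same occupation probabilities in closed form. The three-state cycle and its rates below are illustrative.

```python
import numpy as np

# Illustrative rates k[(i, j)]: transition rate from state i to state j
# in a three-state cycle 0 -> 1 -> 2 -> 0 (with back reactions).
k = {(0, 1): 10.0, (1, 0): 2.0,
     (1, 2): 5.0,  (2, 1): 1.0,
     (2, 0): 3.0,  (0, 2): 0.5}

n = 3
A = np.zeros((n, n))
for (i, j), rate in k.items():
    A[j, i] += rate      # gain of state j from state i
    A[i, i] -= rate      # loss of state i

# Steady state: A p = 0 with sum(p) = 1; replace one balance
# equation by the normalisation condition.
M = A.copy()
M[-1, :] = 1.0
b = np.zeros(n)
b[-1] = 1.0
p = np.linalg.solve(M, b)

# At steady state the net flux is identical across every edge of the cycle.
J01 = p[0] * k[(0, 1)] - p[1] * k[(1, 0)]
J12 = p[1] * k[(1, 2)] - p[2] * k[(2, 1)]
J20 = p[2] * k[(2, 0)] - p[0] * k[(0, 2)]
```

    The equality of the three edge fluxes is the property the arrow scheme exploits to write the cycle flux in minimum form.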

  15. A Photometric Observing Program at the VATT: Setting Up a Calibration Field

    NASA Astrophysics Data System (ADS)

    Davis Philip, A. G.; Boyle, R. P.; Janusz, R.

    2009-05-01

    Philip and Boyle have been making Strömgren and then Strömvil photometric observations of open and globular clusters at the Vatican Advanced Technology Telescope located on Mt. Graham in Arizona. Our aim is to obtain CCD photometric indices good to 0.01 magnitude. Indices of this quality can later be analyzed to yield estimates of temperature, luminosity and metallicity. But we have found that the CCD chip does not yield photometry of this quality without further corrections. Our most observed cluster is the open cluster M 67, which is also very well observed in the literature. We took the best published values and created a set of "standard" stars for our field. Taking our CCD results we could calculate deltas, as a function of position on the chip, which we then applied to all the CCD frames that we obtained. With this procedure we were able to obtain a precision of 0.01 magnitudes in all the fields that we observed. When we started we used the "A" two-inch square Strömgren four-color set from KPNO. Later the Vatican Observatory bought a set of 3.48-inch square Strömgren filters; the Observatory also had a set of circular Vilnius filters and an X filter. Together these eight filters made up our Strömvil set.

  16. Lessons learned in setting up and running the European copy of HST archive

    NASA Astrophysics Data System (ADS)

    Pirenne, Benoit; Benvenuti, P.; Albrecht, Rudolf; Rasmussen, B. F.

    1993-11-01

    The Hubble Space Telescope (HST) endeavour proved once more that factors such as high costs, extremely long preparation time, inherent risk of total failure, limited lifetime and high over-subscription rates make each scientific space mission almost always a unique event. These arguments point immediately to the need to store all the data produced by a spacecraft in a short time so that the scientific community can re-use them in the long term. This calls for the organization of science archives. Together with the Space Telescope Science Institute, the European Coordinating Facility developed an archive system for the HST data. This paper is about the experience gained in setting up and running the European HST Science Data Archive system. Organization, cost versus scientific return and acceptance by the scientists are among the aspects that will be covered. In particular, we will insist on the 'four-pillar' structure principle that all archive centers should have: a user interface, a catalogue accurately describing the content of the archive, the human scientific expertise and, of course, the data. Long-term prospects and problems due to technology changes will be evaluated and solutions proposed. The adaptability of the described system to other scientific space missions or ground-based observatories will be discussed.

  17. Space-ecology set covering problem for modeling Daiyun Mountain Reserve, China

    NASA Astrophysics Data System (ADS)

    Lin, Chih-Wei; Liu, Jinfu; Huang, Jiahang; Zhang, Huiguang; Lan, Siren; Hong, Wei; Li, Wenzhou

    2018-02-01

    Site selection is an important issue in designing nature reserves and has been studied over the years. However, a well-balanced relationship between preservation of biodiversity and site selection is still challenging. Unlike existing methods, we consider three critical components, spatial continuity, spatial compactness and ecological information, to address the reserve design problem. In this paper, we propose a new mathematical set-covering model called the Space-ecology Set Covering Problem (SeSCP) for designing a reserve network. First, we generate the ecological information by forest resource investigation. Then, we split the landscape into elementary cells and calculate the ecological score of each cell. Next, we associate the ecological information with the spatial properties to select a set of cells to form a nature reserve, improving the ability to protect biodiversity. Two spatial constraints, continuity and compactness, are given in SeSCP. Continuity ensures that any selected site is connected with adjacent sites, and compactness minimizes the perimeter of the selected sites. In computational experiments, we take Daiyun Mountain as a study area to demonstrate the feasibility and effectiveness of the proposed model.

  18. Evidence from catch-up growth and hoarding behavior of rats that exposure to hypobaric air lowers the body-mass set point.

    PubMed

    Bozzini, Carlos E; Lezón, Christian E; Norese, María F; Conti, María I; Martínez, María P; Olivera, María I; Alippi, Rosa M

    2005-01-01

    The depression of body growth rate and the reduction of body mass for chronological age and gender in growing experimental animals exposed to hypobaric air (simulated high altitude = SHA) have been associated with hypophagia because of reduced appetite. Catch-up growth during protein recovery after a short period of protein restriction only occurs if food intake becomes super-normal, which should not be possible under hypoxic conditions if the set-point for appetite is adjusted by the level of SHA. The present investigation was designed to test the hypothesis that growth retardation during exposure to SHA is due to an alteration of the neural mechanism for setting body mass size rather than a primary alteration of the central set-point for appetite. One group of female rats aged 35 d was exposed to SHA (5460 m) in a SHA chamber for 27 d (HX rats). The other group was maintained under local barometric pressure conditions (NX rats). One half of both NX and HX rats were fed a protein-free diet for the initial 9 d of the experimental period. From this time on, they were fed a diet containing 20% protein, as were the remaining rats of both groups during the entire experimental period. The growth rates of both mass and length of the body were significantly depressed in well-nourished rats exposed to SHA during the entire observation period when compared to normoxic ones. At the end of this period, body mass and body length were 24% and 21% less in HX than in NX rats. Growth rates were negatively affected by protein restriction in both NX and HX rats. During protein recovery, they reached supernormal values in response to supernormal levels of energy intake that allowed a complete catch-up of both body mass and length. The finding that energy intake during the period of protein rehabilitation in HX rats previously stunted by protein restriction was markedly higher than in HX control ones at equal levels of hypoxia demonstrates that the degree of hypoxia does not determine directly the

  19. Beam patterns in an optical parametric oscillator set-up employing walk-off compensating beta barium borate crystals

    NASA Astrophysics Data System (ADS)

    Kaucikas, M.; Warren, M.; Michailovas, A.; Antanavicius, R.; van Thor, J. J.

    2013-02-01

    This paper describes the investigation of an optical parametric oscillator (OPO) set-up based on two beta barium borate (BBO) crystals, where the interplay between the crystal orientations, cut angles and air dispersion substantially influenced the OPO performance, and especially the angular spectrum of the output beam. Theory suggests that if two BBO crystals are used in this type of design, they should be of different cuts. This paper aims to provide an experimental manifestation of this fact. Furthermore, it has been shown that air dispersion produces similar effects and should be taken into account. An x-ray crystallographic indexing of the crystals was performed as an independent test of the above conclusions.

  20. Do you think you have what it takes to set up a long-term video monitoring unit?

    PubMed

    Smith, Sheila L

    2006-03-01

    The single most important factor when setting up a long-term video monitoring unit is research. Research all vendors by traveling to other sites and calling other facilities. Considerations with equipment include the server, acquisition units, review units, cameras, software, and monitors as well as other factors including Health Insurance Portability and Accountability Act (HIPAA) compliance. Research customer support including both field and telephone support. Involve your Clinical Engineering Department in your investigations. Be sure to obtain warranty information. Researching placement of the equipment is essential. Communication with numerous groups is vital. Administration, engineers, clinical engineering, physicians, infection control, environmental services, house supervisors, security, and all involved parties should be involved in the planning.

  1. Recalibrating disease parameters for increasing realism in modeling epidemics in closed settings.

    PubMed

    Bioglio, Livio; Génois, Mathieu; Vestergaard, Christian L; Poletto, Chiara; Barrat, Alain; Colizza, Vittoria

    2016-11-14

    The homogeneous mixing assumption is widely adopted in epidemic modelling for its parsimony and represents the building block of more complex approaches, including very detailed agent-based models. The latter assume homogeneous mixing within schools, workplaces and households, mostly for the lack of detailed information on human contact behaviour within these settings. The recent data availability on high-resolution face-to-face interactions makes it now possible to assess the goodness of this simplified scheme in reproducing relevant aspects of the infection dynamics. We consider empirical contact networks gathered in different contexts, as well as synthetic data obtained through realistic models of contacts in structured populations. We perform stochastic spreading simulations on these contact networks and in populations of the same size under a homogeneous mixing hypothesis. We adjust the epidemiological parameters of the latter in order to fit the prevalence curve of the contact epidemic model. We quantify the agreement by comparing epidemic peak times, peak values, and epidemic sizes. Good approximations of the peak times and peak values are obtained with the homogeneous mixing approach, with a median relative difference smaller than 20 % in all cases investigated. Accuracy in reproducing the peak time depends on the setting under study, while for the peak value it is independent of the setting. Recalibration is found to be linear in the epidemic parameters used in the contact data simulations, showing changes across empirical settings but robustness across groups and population sizes. An adequate rescaling of the epidemiological parameters can yield a good agreement between the epidemic curves obtained with a real contact network and a homogeneous mixing approach in a population of the same size. The use of such recalibrated homogeneous mixing approximations would enhance the accuracy and realism of agent-based simulations and limit the intrinsic biases of
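    The recalibration procedure can be sketched as follows: a reference prevalence curve stands in for the contact-network simulation (here a sub-linear incidence term is used as a crude, purely illustrative surrogate for contact heterogeneity), and the transmission rate of a homogeneous-mixing SIR model is adjusted by grid search to fit that curve, after which peak time and peak value can be compared. All parameter values are assumptions, not fitted to the paper's data.

```python
import numpy as np

# Shared SIR machinery (Euler integration); exponent 1.0 gives the standard
# homogeneous-mixing model, exponent < 1 is an illustrative heterogeneity proxy.
N, I0, gamma = 1000.0, 10.0, 0.2
dt, steps = 0.1, 1500

def run(beta, exponent):
    S, I = N - I0, I0
    curve = np.empty(steps)
    for t in range(steps):
        new_inf = beta * S * (I ** exponent) / N * dt
        new_rec = gamma * I * dt
        S -= new_inf
        I += new_inf - new_rec
        curve[t] = I
    return curve

# Reference prevalence curve: stand-in for the contact-network epidemic.
I_ref = run(0.5, 0.9)

# Recalibrate the homogeneous-mixing model by grid search on beta.
grid = np.linspace(0.2, 0.8, 61)
losses = [np.sum((run(b, 1.0) - I_ref) ** 2) for b in grid]
best_beta = grid[int(np.argmin(losses))]
I_fit = run(best_beta, 1.0)

# Agreement metrics used in the abstract: peak value and peak time.
peak_err = abs(I_fit.max() - I_ref.max()) / I_ref.max()
time_err = abs(int(I_fit.argmax()) - int(I_ref.argmax())) * dt
```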

  2. Sets, Planets, and Comets

    ERIC Educational Resources Information Center

    Baker, Mark; Beltran, Jane; Buell, Jason; Conrey, Brian; Davis, Tom; Donaldson, Brianna; Detorre-Ozeki, Jeanne; Dibble, Leila; Freeman, Tom; Hammie, Robert; Montgomery, Julie; Pickford, Avery; Wong, Justine

    2013-01-01

    Sets in the game "Set" are lines in a certain four-dimensional space. Here we introduce planes into the game, leading to interesting mathematical questions, some of which we solve, and to a wonderful variation on the game "Set," in which every tableau of nine cards must contain at least one configuration for a player to pick up.

  3. "Ready, Set, FLOW!"

    ERIC Educational Resources Information Center

    Stroud, Wesley

    2018-01-01

    All educators want their classrooms to be inviting areas that support investigations. However, a common mistake is to fill learning spaces with items or objects that are set up by the teacher or are simply "for show." This type of setting, although it may create a comfortable space for students, fails to stimulate investigations and…

  4. TB preventive therapy for people living with HIV: key considerations for scale-up in resource-limited settings.

    PubMed

    Pathmanathan, I; Ahmedov, S; Pevzner, E; Anyalechi, G; Modi, S; Kirking, H; Cavanaugh, J S

    2018-06-01

    Tuberculosis (TB) is the leading cause of death for persons living with the human immunodeficiency virus (PLHIV). TB preventive therapy (TPT) works synergistically with, and independently of, antiretroviral therapy to reduce TB morbidity, mortality and incidence among PLHIV. However, although TPT is a crucial and cost-effective component of HIV care for adults and children and has been recommended as an international standard of care for over a decade, it remains highly underutilized. If we are to end the global TB epidemic, we must address the significant reservoir of tuberculous infection, especially in those, such as PLHIV, who are most likely to progress to TB disease. To do so, we must confront the pervasive perception that barriers to TPT scale-up are insurmountable in resource-limited settings. Here we review available evidence to address several commonly stated obstacles to TPT scale-up, including the need for the tuberculin skin test, limited diagnostic capacity to reliably exclude TB disease, concerns about creating drug resistance, suboptimal patient adherence to therapy, inability to monitor for and prevent adverse events, a 'one size fits all' option for TPT regimen and duration, and uncertainty about TPT use in children, adolescents, and pregnant women. We also discuss TPT delivery in the era of differentiated care for PLHIV, how best to tackle advanced planning for drug procurement and supply chain management, and how to create an enabling environment for TPT scale-up success.

  5. Facilitation of Goal-Setting and Follow-Up in an Internet Intervention for Health and Wellness

    NASA Astrophysics Data System (ADS)

    Kaipainen, Kirsikka; Mattila, Elina; Kinnunen, Marja-Liisa; Korhonen, Ilkka

    Chronic work-related stress and insufficient recovery from workload can gradually lead to problems with mental and physical health. Resources in healthcare are limited especially for preventive treatment, but low-cost support can be provided by Internet-based behavior change interventions. This paper describes the design of an Internet intervention which supports working-age people in managing and preventing stress-related health and wellness problems. The intervention is designed for early prevention and aims to motivate individuals to take responsibility for their own well-being. It allows them to choose the approach to take to address personally significant issues, while guiding them through the process. The first iteration of the intervention was evaluated with three user groups and subsequently improved based on the user experiences to be more persuasive, motivating and better suited for independent use. Goal setting and follow-up were especially enhanced, tunneled structure improved, and the threshold of use lowered.

  6. Methods of mathematical modeling using polynomials of algebra of sets

    NASA Astrophysics Data System (ADS)

    Kazanskiy, Alexandr; Kochetkov, Ivan

    2018-03-01

    The article deals with the construction of discrete mathematical models for solving applied problems arising from the operation of building structures. Security issues in modern high-rise buildings are extremely serious and relevant, and there is no doubt that interest in them will only increase. The territory of the building is divided into zones that must be kept under observation. Zones can overlap and can have different priorities. Such situations can be described using formulas in the algebra of sets. The formulas can be programmed, which makes it possible to work with them in computer models.
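
    As an illustration of the idea (the zones, cells and priorities below are invented, not from the article), overlapping observation zones can be modelled directly as sets of grid cells and manipulated with the standard set operations:

```python
# Illustrative sketch: building zones as sets of grid cells.
zone_a = {(x, y) for x in range(0, 5) for y in range(0, 5)}   # e.g. lobby
zone_b = {(x, y) for x in range(3, 8) for y in range(3, 8)}   # e.g. stairwell

overlap = zone_a & zone_b        # cells watched by both zones
covered = zone_a | zone_b        # union: every cell under observation
only_a = zone_a - zone_b         # cells belonging to zone A alone

def effective_priority(cell, zones):
    """Highest priority (lowest number) among the zones containing the cell."""
    ranks = [rank for zone, rank in zones if cell in zone]
    return min(ranks) if ranks else None
```

    Because the zone formulas are ordinary set expressions, they can be evaluated and tested programmatically, which is exactly the point the abstract makes.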

  7. Provably secure Rabin-p cryptosystem in hybrid setting

    NASA Astrophysics Data System (ADS)

    Asbullah, Muhammad Asyraf; Ariffin, Muhammad Rezal Kamel

    2016-06-01

    In this work, we design an efficient and provably secure hybrid cryptosystem built from a combination of the Rabin-p cryptosystem with an appropriate symmetric encryption scheme. We set up a hybrid structure which is proven secure in the sense of indistinguishability against chosen-ciphertext attack. We presume that the integer factorization problem is hard and that the hash function is modeled as a random function.
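
    A hedged sketch of the hybrid (KEM/DEM) structure the abstract describes: the code below uses textbook Rabin rather than the Rabin-p variant, toy primes, and a hash-derived keystream purely for illustration. It is nowhere near a secure implementation.

```python
import hashlib
import secrets

# Toy parameters: both primes ≡ 3 (mod 4) so square roots are easy to compute.
# Far too small for real security.
p, q = 10007, 10039
n = p * q

def kem_encapsulate():
    """Encapsulate a random session key: c = x^2 mod n, key = H(x)."""
    x = secrets.randbelow(n - 2) + 2
    c = pow(x, 2, n)
    key = hashlib.sha256(str(x).encode()).digest()
    return c, key

def kem_decapsulate(c):
    """Return the candidate keys from the four square roots of c mod n.
    A real scheme disambiguates the root with redundancy; we keep all four."""
    rp = pow(c, (p + 1) // 4, p)      # sqrt mod p (valid since p ≡ 3 mod 4)
    rq = pow(c, (q + 1) // 4, q)      # sqrt mod q
    roots = set()
    for sp in (rp, p - rp):
        for sq in (rq, q - rq):
            x = (sp * q * pow(q, -1, p) + sq * p * pow(p, -1, q)) % n  # CRT
            roots.add(x)
    return [hashlib.sha256(str(x).encode()).digest() for x in sorted(roots)]

def xor_stream(key, data):
    """Minimal DEM: XOR with a hash-derived keystream (illustration only)."""
    stream = hashlib.sha256(key + b"stream").digest()
    return bytes(b ^ stream[i % 32] for i, b in enumerate(data))
```

    The hybrid pattern is the point: the asymmetric part transports only a short random value, and everything else is symmetric encryption under the hashed key.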

  8. Validating the energy transport modeling of the DIII-D and EAST ramp up experiments using TSC

    NASA Astrophysics Data System (ADS)

    Liu, Li; Guo, Yong; Chan, Vincent; Mao, Shifeng; Wang, Yifeng; Pan, Chengkang; Luo, Zhengping; Zhao, Hailin; Ye, Minyou

    2017-06-01

    The confidence in ramp-up scenario design for the China Fusion Engineering Test Reactor (CFETR) can be significantly enhanced by using validated transport models to predict the current and temperature profiles. In the tokamak simulation code (TSC), two semi-empirical energy transport models (the Coppi-Tang (CT) and BGB models) and three theory-based models (the GLF23, MMM95 and CDBM models) are investigated on CFETR-relevant ramp-up discharges, comprising three DIII-D ITER-like ramp-up discharges and one EAST ohmic discharge. For the DIII-D discharges, all the transport models yield the dynamic internal inductance ℓ_i within ±0.15, except at some time points where the experimental fluctuation is very strong. All the models agree with the experimental poloidal beta β_p, except that the CT model strongly overestimates β_p in the first half of the ramp-up phase. When the CT, CDBM and GLF23 models are applied to estimate the internal flux, they show maximum deviations of more than 10% because of inaccuracies in the temperature profile predictions, while the BGB model performs best on the internal flux. Although all the models fall short in reproducing the dynamic ℓ_i evolution for the EAST tokamak, the result of the BGB model is the closest to the experimental ℓ_i. Based on these comparisons, we conclude that the BGB model is the most consistent among these models for simulating the CFETR ohmic ramp-up. The CT model, with improvements for better simulation of the temperature profiles in the first half of the ramp-up phase, will also be attractive. For the MMM95, GLF23 and CDBM models, better prediction of the edge temperature will improve the confidence for CFETR L-mode simulation. Conclusive validation of any transport model will require extensive future investigation covering a larger variety of discharges.

  9. The complete set of Cassini's UVIS occultation observations of Enceladus plume: model fits

    NASA Astrophysics Data System (ADS)

    Portyankina, G.; Esposito, L. W.; Hansen, C. J.

    2017-12-01

    Since its discovery in 2005, the plume of Enceladus has been observed by most of the instruments onboard the Cassini spacecraft. The Ultraviolet Imaging Spectrograph (UVIS) observed the Enceladus plume, and the collimated jets embedded in it, in occultation geometry on six different occasions. We have constructed a 3D direct simulation Monte Carlo (DSMC) model for the Enceladus jets and apply it to the analysis of the full set of UVIS occultation observations conducted during Cassini's mission from 2005 to 2017. The Monte Carlo model tracks test particles from their source at the surface into space. The initial positions of all test particles for a single jet are fixed to one of the 100 jet sources identified by Porco et al. (2014). The initial three-dimensional velocity of each particle contains two components: a velocity Vz which is perpendicular to the surface, and a thermal velocity which is isotropic in the upward hemisphere. The direction and speed of the thermal velocity of each particle are chosen randomly, but the ensemble moves isotropically at a speed which satisfies a Boltzmann distribution for a given temperature Tth. A range of reasonable Vz is then determined by requiring that modeled jet widths match the observed ones. Each model run results in a set of coordinates and velocities for a given set of test particles. These are converted to test particle number densities and then integrated along the line of sight (LoS) for each time step of the occultation observation. The geometry of the observation is calculated using SPICE. The overarching result of a simulation run is a test particle number density along the LoS for each time point during the occultation observation, for each of the jets separately. To fit the model to the data, we integrate all jets that are crossed by the LoS at each point during an observation. The relative strength of the jets must be determined to fit the observed UVIS curves. The results of the fits are sets of active jets for each occultation. Each UVIS occultation
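
    The velocity sampling the abstract describes (a fixed perpendicular component Vz plus a thermal component drawn isotropically in the upward hemisphere, with Maxwell-Boltzmann speed) might be sketched as follows. The function shape, constants and units are assumptions for illustration, not the authors' code.

```python
import math
import random

def sample_velocity(v_z, T_th, m, rng=None):
    """One test-particle velocity: fixed vertical component v_z plus a thermal
    component isotropic in the upward hemisphere, with speed drawn from a
    Maxwell-Boltzmann distribution at temperature T_th for mass m (SI units)."""
    rng = rng or random.Random(0)
    k_B = 1.380649e-23
    sigma = math.sqrt(k_B * T_th / m)
    # Maxwell-Boltzmann speed: norm of three Gaussian velocity components.
    speed = math.sqrt(sum(rng.gauss(0.0, sigma) ** 2 for _ in range(3)))
    # Isotropic direction on the upward hemisphere: cos(theta) uniform in [0, 1].
    cos_t = rng.random()
    sin_t = math.sqrt(1.0 - cos_t ** 2)
    phi = rng.uniform(0.0, 2.0 * math.pi)
    return (speed * sin_t * math.cos(phi),
            speed * sin_t * math.sin(phi),
            v_z + speed * cos_t)
```

    Restricting cos θ to [0, 1] keeps the thermal component in the upward hemisphere, so the vertical velocity never falls below v_z at launch.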

  10. Mathematical Modeling of the Transmission Dynamics of Clostridium difficile Infection and Colonization in Healthcare Settings: A Systematic Review

    PubMed Central

    Gingras, Guillaume; Guertin, Marie-Hélène; Laprise, Jean-François; Drolet, Mélanie; Brisson, Marc

    2016-01-01

    Background We conducted a systematic review of mathematical models of the transmission dynamics of Clostridium difficile infection (CDI) in healthcare settings, to provide an overview of existing models and their assessment of different CDI control strategies. Methods We searched MEDLINE, EMBASE and Web of Science up to February 3, 2016 for transmission-dynamic models of Clostridium difficile in healthcare settings. The models were compared based on their representation of the natural history of Clostridium difficile, which could include health states (S-E-A-I-R-D: Susceptible-Exposed-Asymptomatic-Infectious-Resistant-Deceased) and the possible inclusion of healthcare workers and visitors (vectors of transmission). Effectiveness of interventions was compared using the relative reduction (compared to no intervention or current practice) in outcomes such as incidence of colonization, CDI, CDI recurrence, CDI mortality, and length of stay. Results Nine studies describing six different models met the inclusion criteria. Over time, the models have generally increased in complexity in terms of natural history and transmission dynamics and in the number and complexity of the interventions or bundles of interventions examined. The models were categorized into four groups with respect to their natural history representation: S-A-I-R, S-E-A-I, S-A-I, and S-E-A-I-R-D. Seven studies examined the impact of CDI control strategies. Interventions aimed at controlling transmission, lowering CDI vulnerability and reducing the risk of recurrence/mortality were predicted to reduce CDI incidence by 3–49%, 5–43% and 5–29%, respectively. Bundles of interventions were predicted to reduce CDI incidence by 14–84%. Conclusions Although CDI is a major public health problem, there are very few published transmission-dynamic models of Clostridium difficile. Published models vary substantially in the interventions examined, the outcome measures used and the representation of the natural history of Clostridium
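
    For readers unfamiliar with the natural-history groupings, a minimal deterministic S-A-I ward model (one of the four groupings the review identifies) can be written in a few lines. The rates below are illustrative, not fitted values from any reviewed model.

```python
def run_sai(days, beta=0.01, alpha=0.2, rec=0.1, n=100, i0=1, dt=0.1):
    """Forward-Euler integration of a closed S-A-I ward model:
    S -> A (colonization), A -> I (progression to CDI), I -> S (clearance)."""
    S, A, I = float(n - i0), 0.0, float(i0)
    for _ in range(int(days / dt)):
        force = beta * (A + I)       # both carrier classes transmit
        new_col = force * S          # S -> A
        new_dis = alpha * A          # A -> I
        new_rec = rec * I            # I -> S
        S += dt * (new_rec - new_col)
        A += dt * (new_col - new_dis)
        I += dt * (new_dis - new_rec)
    return S, A, I
```

    The closed loop conserves the ward population, which is a useful sanity check on any compartment structure before interventions are layered on top.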

  11. Addressing HIV in the School Setting: Application of a School Change Model

    ERIC Educational Resources Information Center

    Walsh, Audra St. John; Chenneville, Tiffany

    2013-01-01

    This paper describes best practices for responding to youth with human immunodeficiency virus (HIV) in the school setting through the application of a school change model designed by the World Health Organization. This model applies a whole school approach and includes four levels that span the continuum from universal prevention to direct…

  12. Assessment of cataract surgical outcomes in settings where follow-up is poor: PRECOG, a multicentre observational study.

    PubMed

    Congdon, Nathan; Yan, Xixi; Lansingh, Van; Sisay, Alemayehu; Müller, Andreas; Chan, Ving; Jin, Ling; Meltzer, Mirjam E; Karumanchi, Sasipriya M; Guan, Chunhong; Vuong, Quy; Rivera, Nelson; McCleod-Omawale, Joan; He, Mingguang

    2013-07-01

    of less than 50% (rs=0·71, p=0·002). When we divided hospitals into top 25%, middle 50%, and bottom 25% by visual outcome, classification based on final follow-up assessment for all patients was the same as that based on early postoperative assessment for 27 (68%) of 40 centres, and the same as that based on data from patients who returned without additional prompting in 31 (84%) of 37 centres. Use of glasses to optimise vision at the time of the early and late examinations did not further improve the correlations. Early vision assessment for all patients, and follow-up assessment only for patients who return to the clinic without prompting, are valid measures of operative quality in settings where follow-up is poor. Funding: ORBIS International, Fred Hollows Foundation, Helen Keller International, International Association for the Prevention of Blindness Latin American Office, Aravind Eye Care System. Copyright © 2013 Congdon et al. Open Access article distributed under the terms of CC BY.

  13. Instalacion necesaria para montar una pequena central electrica (plant requirements to set up and operate a small-community electric system)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1965-04-30

    The manual serves as a guide to the important factors to consider in establishing a small-scale community electric system. Financial requirements include labor costs, machinery, equipment, utilities and administrative costs, raw materials (for diesel fuel to run the generators). Tables on cost estimates are given, with a blank column for actual cost statements; the summary provides questions that will help the planner decide what is necessary for setting up the plant and whether the requirements can be met.

  14. Determination of the Parameter Sets for the Best Performance of IPS-driven ENLIL Model

    NASA Astrophysics Data System (ADS)

    Yun, Jongyeon; Choi, Kyu-Cheol; Yi, Jonghyuk; Kim, Jaehun; Odstrcil, Dusan

    2016-12-01

    The interplanetary scintillation-driven (IPS-driven) ENLIL model was jointly developed by the University of California, San Diego (UCSD) and the National Aeronautics and Space Administration/Goddard Space Flight Center (NASA/GSFC). The model has been in operation at the Korean Space Weather Center (KSWC) since 2014. The IPS-driven ENLIL model has a variety of ambient solar wind parameters, and the results of the model depend on the combination of these parameters. We have conducted research to determine the best combination of parameters to improve the performance of the IPS-driven ENLIL model. The model results for 1,440 combinations of input parameters are compared with Advanced Composition Explorer (ACE) observation data. In this way, the top 10 parameter sets showing the best performance were determined. Finally, the characteristics of these parameter sets were analyzed, and the application of the results to the IPS-driven ENLIL model was discussed.
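
    The parameter-set search the abstract describes amounts to an exhaustive sweep over combinations, ranked by an error metric against observations. A generic sketch, with `run_model`, the grids and the toy observations standing in for the ENLIL runs and the ACE data:

```python
import itertools
import math

def rmse(model, obs):
    """Root-mean-square error between a model series and observations."""
    return math.sqrt(sum((m - o) ** 2 for m, o in zip(model, obs)) / len(obs))

def rank_parameter_sets(run_model, grids, obs, top=10):
    """Exhaustive sweep over all parameter combinations, ranked by RMSE.
    `grids` maps parameter name -> list of candidate values."""
    names = sorted(grids)
    scored = []
    for values in itertools.product(*(grids[k] for k in names)):
        params = dict(zip(names, values))
        scored.append((rmse(run_model(**params), obs), params))
    scored.sort(key=lambda t: t[0])
    return scored[:top]
```

    With a toy linear model the sweep recovers the generating parameters exactly; with a real simulation, each `run_model` call is of course far more expensive, which is why the combination count (1,440 here) matters.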

  15. Impact of malaria interventions on child mortality in endemic African settings: comparison and alignment between LiST and Spectrum-Malaria model.

    PubMed

    Korenromp, Eline; Hamilton, Matthew; Sanders, Rachel; Mahiané, Guy; Briët, Olivier J T; Smith, Thomas; Winfrey, William; Walker, Neff; Stover, John

    2017-11-07

    In malaria-endemic countries, malaria prevention and treatment are critical for child health. In the context of intervention scale-up and rapid changes in endemicity, projections of intervention impact and optimized program scale-up strategies need to take into account the consequent dynamics of transmission and immunity. The new Spectrum-Malaria program planning tool was used to project health impacts of Insecticide-Treated mosquito Nets (ITNs) and effective management of uncomplicated malaria cases (CMU), among other interventions, on malaria infection prevalence, case incidence and mortality in children 0-4 years, 5-14 years of age and adults. Spectrum-Malaria uses statistical models fitted to simulations of the dynamic effects of increasing intervention coverage on these burdens as a function of baseline malaria endemicity, seasonality in transmission and malaria intervention coverage levels (estimated for years 2000 to 2015 by the World Health Organization and Malaria Atlas Project). Spectrum-Malaria projections of proportional reductions in under-five malaria mortality were compared with those of the Lives Saved Tool (LiST) for the Democratic Republic of the Congo and Zambia, for given (standardized) scenarios of ITN and/or CMU scale-up over 2016-2030. Proportional mortality reductions over the first two years following scale-up of ITNs from near-zero baselines to moderately higher coverages align well between LiST and Spectrum-Malaria -as expected since both models were fitted to cluster-randomized ITN trials in moderate-to-high-endemic settings with 2-year durations. For further scale-up from moderately high ITN coverage to near-universal coverage (as currently relevant for strategic planning for many countries), Spectrum-Malaria predicts smaller additional ITN impacts than LiST, reflecting progressive saturation. For CMU, especially in the longer term (over 2022-2030) and for lower-endemic settings (like Zambia), Spectrum-Malaria projects larger

  16. Parallel Execution of Functional Mock-up Units in Buildings Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ozmen, Ozgur; Nutaro, James J.; New, Joshua Ryan

    2016-06-30

    A Functional Mock-up Interface (FMI) defines a standardized interface to be used in computer simulations to develop complex cyber-physical systems. FMI implementation by a software modeling tool enables the creation of a simulation model that can be interconnected, or the creation of a software library called a Functional Mock-up Unit (FMU). This report describes an FMU wrapper implementation that imports FMUs into a C++ environment and uses an Euler solver that executes FMUs in parallel using Open Multi-Processing (OpenMP). The purpose of this report is to elucidate the runtime performance of the solver when a multi-component system is imported as a single FMU (for the whole system) or as multiple FMUs (for different groups of components as sub-systems). This performance comparison is conducted using two test cases: (1) a simple, multi-tank problem; and (2) a more realistic use case based on the Modelica Buildings Library. In both test cases, the performance gains are promising when each FMU consists of a large number of states and state events that are wrapped in a single FMU. Load balancing is demonstrated to be a critical factor in speeding up parallel execution of multiple FMUs.

  17. An up-to-date quality-controlled surface mass balance data set for the 90°-180°E Antarctica sector and 1950-2005 period

    NASA Astrophysics Data System (ADS)

    Magand, O.; Genthon, C.; Fily, M.; Krinner, G.; Picard, G.; Frezzotti, M.; Ekaykin, A. A.

    2007-06-01

    On the basis of thousands of surface mass balance (SMB) field measurements over the entire Antarctic ice sheet, it is currently estimated that more than 2 Gt of ice accumulate each year at the surface of Antarctica. However, these estimates suffer from large uncertainties. Various problems affect Antarctic SMB measurements, in particular limited or unwarranted spatial and temporal representativeness, measurement inaccuracy, and lack of quality control. We define quality criteria on the basis of (1) an up-to-date review and quality rating of the various SMB measurement methods and (2) essential information (location, dates of measurements, time period covered by the SMB values, and primary data sources) related to each SMB data point. We apply these criteria to available SMB values from Queen Mary to Victoria lands (90°-180°E Antarctic sector) from the early 1950s to the present. This results in a new set of observed SMB values for the 1950-2005 time period with a strong reduction in density and coverage but also expectedly reduced inaccuracies and uncertainties compared to other compilations. The quality-controlled SMB data set also contains new results from recent field campaigns (International Trans-Antarctic Scientific Expedition (ITASE), Russian Antarctic Expedition (RAE), and Australian National Antarctic Research Expeditions (ANARE) projects) which comply with the defined quality criteria. A comparative evaluation of climate model results against the quality-controlled updated SMB data set and other widely used ones illustrates that such Antarctic SMB studies are significantly affected by the quality of the field SMB values used as reference.
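
    The quality-control step can be pictured as a filter over records that carry a method rating and the essential metadata the authors list. The field names and threshold below are hypothetical, chosen only to mirror the two criteria in the abstract:

```python
# Metadata every record must carry (hypothetical field names): location,
# dates of measurement, period covered, and primary data source.
REQUIRED = ("lat", "lon", "dates", "period", "source")

def quality_controlled(records, min_rating=2):
    """Keep records whose measurement-method rating meets the bar and whose
    essential metadata fields are all present."""
    return [r for r in records
            if r.get("method_rating", 0) >= min_rating
            and all(k in r for k in REQUIRED)]
```

    Such a filter shrinks density and coverage, exactly the trade-off the abstract reports, in exchange for lower inaccuracies.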

  18. An interactive environment for the analysis of large Earth observation and model data sets

    NASA Technical Reports Server (NTRS)

    Bowman, Kenneth P.; Walsh, John E.; Wilhelmson, Robert B.

    1993-01-01

    We propose to develop an interactive environment for the analysis of large Earth science observation and model data sets. We will use a standard scientific data storage format and a large capacity (greater than 20 GB) optical disk system for data management; develop libraries for coordinate transformation and regridding of data sets; modify the NCSA X Image and X DataSlice software for typical Earth observation data sets by including map transformations and missing data handling; develop analysis tools for common mathematical and statistical operations; integrate the components described above into a system for the analysis and comparison of observations and model results; and distribute software and documentation to the scientific community.

  19. An interactive environment for the analysis of large Earth observation and model data sets

    NASA Technical Reports Server (NTRS)

    Bowman, Kenneth P.; Walsh, John E.; Wilhelmson, Robert B.

    1992-01-01

    We propose to develop an interactive environment for the analysis of large Earth science observation and model data sets. We will use a standard scientific data storage format and a large capacity (greater than 20 GB) optical disk system for data management; develop libraries for coordinate transformation and regridding of data sets; modify the NCSA X Image and X Data Slice software for typical Earth observation data sets by including map transformations and missing data handling; develop analysis tools for common mathematical and statistical operations; integrate the components described above into a system for the analysis and comparison of observations and model results; and distribute software and documentation to the scientific community.
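
    The regridding library these proposals mention would, at its core, resample a field from one grid to another. A minimal bilinear-interpolation sketch (not the actual library API, whose details the proposal does not give) looks like this:

```python
import bisect

def bilinear(field, lats, lons, lat, lon):
    """Sample a 2-D field (list of rows, one row per latitude) at (lat, lon)
    by bilinear interpolation; both axes must be ascending."""
    i = min(max(bisect.bisect_right(lats, lat) - 1, 0), len(lats) - 2)
    j = min(max(bisect.bisect_right(lons, lon) - 1, 0), len(lons) - 2)
    t = (lat - lats[i]) / (lats[i + 1] - lats[i])
    u = (lon - lons[j]) / (lons[j + 1] - lons[j])
    return ((1 - t) * (1 - u) * field[i][j] + (1 - t) * u * field[i][j + 1]
            + t * (1 - u) * field[i + 1][j] + t * u * field[i + 1][j + 1])

def regrid(field, src_lats, src_lons, dst_lats, dst_lons):
    """Resample a field from a source grid onto a destination grid."""
    return [[bilinear(field, src_lats, src_lons, la, lo) for lo in dst_lons]
            for la in dst_lats]
```

    A production version would additionally handle map projections and missing-data masks, the two concerns the proposal calls out explicitly.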

  20. Diagnostic Profiles: A Standard Setting Method for Use with a Cognitive Diagnostic Model

    ERIC Educational Resources Information Center

    Skaggs, Gary; Hein, Serge F.; Wilkins, Jesse L. M.

    2016-01-01

    This article introduces the Diagnostic Profiles (DP) standard setting method for setting a performance standard on a test developed from a cognitive diagnostic model (CDM), the outcome of which is a profile of mastered and not-mastered skills or attributes rather than a single test score. In the DP method, the key judgment task for panelists is a…

  1. Does mode of follow-up influence contraceptive use after medical abortion in a low-resource setting? Secondary outcome analysis of a non-inferiority randomized controlled trial.

    PubMed

    Paul, Mandira; Iyengar, Sharad D; Essén, Birgitta; Gemzell-Danielsson, Kristina; Iyengar, Kirti; Bring, Johan; Klingberg-Allvin, Marie

    2016-10-17

    Post-abortion contraceptive use in India is low and the use of modern methods of contraception is rare, especially in rural areas. This study primarily compares contraceptive use among women whose abortion outcome was assessed in-clinic with that among women who assessed their abortion outcome at home, in a low-resource, primary health care setting. Moreover, it investigates how background characteristics and abortion service provision influence contraceptive use post-abortion. A randomized controlled, non-inferiority trial (RCT) compared clinic follow-up with home assessment of abortion outcome at 2 weeks post-abortion. Additionally, contraceptive use at 3 months post-abortion was investigated through a cross-sectional follow-up interview with a largely urban sub-sample of women from the RCT. Women seeking abortion with a gestational age of up to 9 weeks and who agreed to a 2-week follow-up were included (n = 731). Women with known contraindications to medical abortion, Hb < 85 mg/l, or aged below 18 were excluded. Data were collected between April 2013 and August 2014 in six primary health-care clinics in Rajasthan. A computerised random number generator created the randomisation sequence (1:1) in blocks of six. Contraceptive use was measured at 2 weeks among women successfully followed up (n = 623) and at 3 months in the sub-set of women who were included if they were recruited at one of the urban study sites, owned a phone and agreed to a 3-month follow-up (n = 114). There were no differences in contraceptive use or continuation between the study groups at 3 months (76 % clinic follow-up, 77 % home assessment); however, women in the clinic follow-up group were most likely to adopt a contraceptive method at 2 weeks (62 ± 12 %), while women in the home-assessment group were most likely to adopt a method after their next menstruation (60 ± 13 %). Fifty-two per cent of women who initiated a method at 2 weeks chose the 3-month injection or the

  2. A Step towards a Sharable Community Knowledge Base for WRF Settings -Developing a WRF Setting Methodology based on a case study in a Torrential Rainfall Event

    NASA Astrophysics Data System (ADS)

    CHU, Q.; Xu, Z.; Zhuo, L.; Han, D.

    2016-12-01

    Increased requirements for interaction between different disciplines, together with ready access to a numerical weather forecasting system that is portable and extensible, have contributed to the growth in the number of downstream WRF model users over recent years. For these users, a knowledge base organized by representative events would be very helpful, because determining the model settings is among the most important steps in WRF, yet the process is generally time-consuming, even on a high-performance computing platform. We therefore propose a sharable lookup table of proper WRF domain settings, and the procedures for deriving them, based on a representative torrential rainfall event in Beijing, China. It has been found that the drift of WRF simulations away from the input lateral boundary conditions can be significantly reduced by adjusting the domain settings. Among all the impact factors, the placement of the nested domain affects not only the moving speed and angle of the storm centre but also the location and amount of the heavy-rain belt, which can only be detected with adjusted spatial resolutions. Spin-up time is also considered in the model settings and is demonstrated to have the most pronounced influence on the accuracy of the simulations, as reflected in the large diversity of spatial distributions of precipitation: the amount of heavy rain varies from -30% to 58% across the experiments. After following all the procedures, further variations of the domain settings have minimal effect on the modelling, and the results show the best correlation (larger than 0.65) with fusion observations. The model settings found suitable for predicting similar convective torrential rainfall events in the Beijing area are: a domain covering the greater Beijing area, a 1:5:5 downscaling ratio, 57 vertical levels with a model top of 50 hPa, and a 60 h spin-up time. We hope that the procedure for building the community WRF knowledge

  3. Combining in situ characterization methods in one set-up: looking with more eyes into the intricate chemistry of the synthesis and working of heterogeneous catalysts.

    PubMed

    Bentrup, Ursula

    2010-12-01

    Several in situ techniques are known which allow investigations of catalysts and catalytic reactions under real reaction conditions using different spectroscopic and X-ray methods. In recent years, specific set-ups have been established which combine two or more in situ methods in order to get a more detailed understanding of catalytic systems. This tutorial review will give a summary of currently available set-ups equipped with multiple techniques for in situ catalyst characterization, catalyst preparation, and reaction monitoring. Besides experimental and technical aspects of method coupling including X-ray techniques, spectroscopic methods (Raman, UV-vis, FTIR), and magnetic resonance spectroscopies (NMR, EPR), essential results will be presented to demonstrate the added value of multitechnique in situ approaches. A special section is focussed on selected examples of use which show new developments and application fields.

  4. A Dual Hesitant Fuzzy Multigranulation Rough Set over Two-Universe Model for Medical Diagnoses

    PubMed Central

    Zhang, Chao; Li, Deyu; Yan, Yan

    2015-01-01

    In medical science, disease diagnosis is one of the most difficult tasks for medical experts, who are confronted with challenges in dealing with a great deal of uncertain medical information. Moreover, different medical experts might express their own thoughts about the medical knowledge base, which may differ slightly from those of other experts. Thus, to solve the problems of uncertain data analysis and group decision making in disease diagnosis, we propose a new rough set model, called the dual hesitant fuzzy multigranulation rough set over two universes, by combining the dual hesitant fuzzy set and multigranulation rough set theories. In the framework of our study, both the definition and some basic properties of the proposed model are presented. Finally, we give a general approach which is applied to a decision-making problem in disease diagnosis, and the effectiveness of the approach is demonstrated by a numerical example. PMID:26858772

  5. Controls on Water Storage, Mixing and Release in a Nested Catchment Set-up with Clean and Mixed Physiographic Characteristics

    NASA Astrophysics Data System (ADS)

    Pfister, L.; McDonnell, J.; Hissler, C.; Martínez-Carreras, N.; Klaus, J.

    2015-12-01

    With catchment water storage being only rarely determined, storage dynamics remain largely unknown to date. However, storage bears considerable potential for catchment inter-comparison exercises, as well as it is likely to have an important role in regulating catchment functions. Catchment comparisons across a wide range of environments and scales will help to increase our understanding of relationships between storage dynamics and catchment processes. With respect to the potential of catchment storage for bringing new momentum to catchment classification and catchment processes understanding we currently investigate spatial and temporal variability of dynamic storage in a nested catchment set-up (16 catchments) of the Alzette River basin (Luxembourg, Europe), covering a wide range of geological settings, catchment areas, contrasted landuse, and hydro-meteorological and tracer series. We define catchment storage as the total amount of water stored in a control volume, delimited by the catchment's topographical boundaries and depth of saturated and unsaturated zones. Complementary storage assessments (via input-output dynamics of natural tracers, geographical sounding, groundwater level measurements, soil moisture measurements, hydrometry) are carried out for comparison purposes. In our nested catchment set-up we have (1) assessed dependencies between geology, catchment permeability and winter runoff coefficients, (2) calculated water balance derived catchment storage and mixing potential and quantified how dynamic storage differs between catchments and scales, and (3) examined how stream baseflow dD (as a proxy for baseflow transit time) and integrated flow measures (like the flow duration curve) relate to bedrock geology. Catchments with higher bedrock permeability exhibited larger storage capacities and eventually lower average winter runoff coefficients. 
Over a time-span of 11 years, all catchments reproduced the same winter runoff coefficients year after year.
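    The winter runoff coefficient used in point (1) above is simply total runoff divided by total precipitation over the winter period; a minimal sketch (the numbers are hypothetical, not from the Alzette data set):

```python
import numpy as np

def winter_runoff_coefficient(precip_mm, runoff_mm):
    """Winter runoff coefficient: total runoff divided by total
    precipitation over the winter period (dimensionless)."""
    p = np.asarray(precip_mm, float)
    q = np.asarray(runoff_mm, float)
    return q.sum() / p.sum()

# Hypothetical daily winter totals (mm) for one catchment
precip = [5.0, 0.0, 12.0, 3.0, 8.0]
runoff = [2.0, 1.0, 6.0, 2.0, 4.0]
print(round(winter_runoff_coefficient(precip, runoff), 3))  # → 0.536
```

    A coefficient near 1 indicates little storage or evaporative loss; the abstract's finding is that permeable-bedrock catchments sit at the low end of this range.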

  6. A simple recipe for setting up the flux equations of cyclic and linear reaction schemes of ion transport with a high number of states: The arrow scheme

    PubMed Central

    Hansen, Ulf-Peter; Rauh, Oliver; Schroeder, Indra

    2016-01-01

    The calculation of flux equations or current-voltage relationships in reaction kinetic models with a high number of states can be very cumbersome. Here, a recipe based on an arrow scheme is presented which yields straightforward access to the minimum form of the flux equations and the occupation probabilities of the involved states in cyclic and linear reaction schemes. This is extremely simple for cyclic schemes without branches. If branches are involved, the effort of setting up the equations is somewhat higher. However, here too a straightforward recipe, making use of so-called reserve factors, is provided for incorporating the branches into the cyclic scheme, thus enabling a simple treatment of such cases as well. PMID:26646356
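    The arrow scheme delivers the flux equations by hand; as a numerical cross-check, the same stationary occupation probabilities and cycle flux can be obtained by solving the master equation directly. A sketch for an unbranched cyclic scheme (the rate values and the name `stationary_flux` are introduced here for illustration, not taken from the paper):

```python
import numpy as np

def stationary_flux(kf, kb):
    """Stationary occupation probabilities and net cycle flux for an
    unbranched cyclic N-state scheme, found by solving the master
    equation dP/dt = Q^T P = 0 with sum(P) = 1 (a generic linear-algebra
    route; the paper's arrow scheme reaches the same result by hand)."""
    n = len(kf)
    Q = np.zeros((n, n))
    for i in range(n):
        j = (i + 1) % n
        Q[i, j] += kf[i]   # rate for transition i -> i+1
        Q[j, i] += kb[i]   # rate for transition i+1 -> i
    Q -= np.diag(Q.sum(axis=1))          # diagonal: minus total exit rate
    A = np.vstack([Q.T, np.ones(n)])     # stationarity + normalization
    b = np.zeros(n + 1); b[-1] = 1.0
    P, *_ = np.linalg.lstsq(A, b, rcond=None)
    J = P[0] * kf[0] - P[1] * kb[0]      # net flux through one transition
    return P, J

P, J = stationary_flux(kf=[3.0, 2.0, 1.0], kb=[1.0, 1.0, 1.0])
```

    At steady state the net flux is the same through every transition of the cycle, which is a convenient sanity check on any hand-derived flux equation.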

  7. Applicability domains for classification problems: benchmarking of distance to models for AMES mutagenicity set

    EPA Science Inventory

    For QSAR and QSPR modeling of biological and physicochemical properties, estimating the accuracy of predictions is a critical problem. The “distance to model” (DM) can be defined as a metric that defines the similarity between the training set molecules and the test set compound ...

  8. Set-theoretic estimation of hybrid system configurations.

    PubMed

    Benazera, Emmanuel; Travé-Massuyès, Louise

    2009-10-01

    Hybrid systems serve as a powerful modeling paradigm for representing complex continuous controlled systems that exhibit discrete switches in their dynamics. The system and the models of the system are nondeterministic due to operation in uncertain environments. Bayesian belief-update approaches to stochastic hybrid system state estimation face a blow-up in the number of state estimates. Therefore, most popular techniques try to maintain an approximation of the true belief state by either sampling or maintaining a limited number of trajectories. These limitations can be avoided by using bounded intervals to represent the state uncertainty. This alternative leads to splitting the continuous state space into a finite set of possibly overlapping geometrical regions that, together with the system modes, form configurations of the hybrid system. As a consequence, the true system state can be captured by a finite number of hybrid configurations. A set of dedicated algorithms that can efficiently compute these configurations is detailed. Results are presented on two systems from the hybrid systems literature.
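    To make the notion of configurations concrete, a minimal sketch in which each configuration pairs a discrete mode with an axis-aligned box of the continuous state space, and estimation keeps the configurations whose region intersects the current bounded estimate (mode names and numbers are invented; the paper's dedicated algorithms are more involved):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Box:
    """Axis-aligned interval box: per-dimension lower/upper bounds."""
    lo: tuple
    hi: tuple
    def intersects(self, other):
        return all(a <= d and c <= b
                   for a, b, c, d in zip(self.lo, self.hi, other.lo, other.hi))

# A configuration pairs a discrete mode with a bounded region of the
# continuous state space (illustrative names, not the paper's).
configs = {("nominal", Box((0.0, 0.0), (1.0, 1.0))),
           ("degraded", Box((0.8, 0.5), (2.0, 1.5)))}
estimate = Box((0.9, 0.6), (1.1, 0.9))   # current set-valued state estimate
consistent = {m for m, b in configs if b.intersects(estimate)}
print(sorted(consistent))  # → ['degraded', 'nominal']
```

    Because the regions may overlap, more than one configuration can be consistent with a single bounded estimate, exactly as in the abstract's finite-configuration view.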

  9. Polysomnography versus limited respiratory monitoring and nurse-led titration to optimise non-invasive ventilation set-up: a pilot randomised clinical trial.

    PubMed

    Patout, Maxime; Arbane, Gill; Cuvelier, Antoine; Muir, Jean Francois; Hart, Nicholas; Murphy, Patrick Brian

    2018-03-30

    Polysomnography (PSG) is recommended for non-invasive ventilation (NIV) set-up in patients with chronic respiratory failure. In this pilot randomised clinical trial, we compared the physiological effectiveness of NIV set-up guided by PSG to limited respiratory monitoring (LRM) and nurse-led titration in patients with COPD-obstructive sleep apnoea (OSA) overlap. The principal outcome of interest was change in daytime arterial partial pressure of carbon dioxide (PaCO2) at 3 months. Fourteen patients with daytime PaCO2 >6 kPa and body mass index >30 kg/m2 were recruited. At 3 months, PaCO2 was reduced by -0.88 kPa (95% CI -1.52 to -0.24 kPa) in the LRM group and by -0.36 kPa (95% CI -0.96 to 0.24 kPa) in the PSG group. These pilot data provide support to undertake a clinical trial investigating the clinical effectiveness of attended limited respiratory monitoring and PSG to establish NIV in patients with COPD-OSA overlap. Trial registration number: NCT02444806. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  10. Facilitating responsible gambling: the relative effectiveness of education-based animation and monetary limit setting pop-up messages among electronic gaming machine players.

    PubMed

    Wohl, Michael J A; Gainsbury, Sally; Stewart, Melissa J; Sztainert, Travis

    2013-12-01

    Although most gamblers set a monetary limit on their play, many exceed this limit--an antecedent of problematic gambling. Responsible gambling tools may assist players to gamble within their means. Historically, however, the impact of such tools has been assessed in isolation. In the current research, two responsible gambling tools that target adherence to a monetary limit were assessed among 72 electronic gaming machine (EGM) players. Participants watched an educational animation explaining how EGMs work (or a neutral video) and then played an EGM in a virtual reality environment. All participants were asked to set a monetary limit on their play, but only half were reminded when that limit was reached. Results showed that both the animation and pop-up limit reminder helped gamblers stay within their preset monetary limit; however, an interaction qualified these main effects. Among participants who did not experience the pop-up reminder, those who watched the animation stayed within their preset monetary limits more than those who did not watch the animation. For those who were reminded of their limit, however, there was no difference in limit adherence between those who watched the animation and those who did not watch the animation. From a responsible gambling perspective, the current study suggests that there is no additive effect of exposure to both responsible gambling tools. Therefore, for minimal disruption in play, a pop-up message reminding gamblers of their preset monetary limit might be preferred over the lengthier educational animation.

  11. Scaling Up Graph-Based Semisupervised Learning via Prototype Vector Machines

    PubMed Central

    Zhang, Kai; Lan, Liang; Kwok, James T.; Vucetic, Slobodan; Parvin, Bahram

    2014-01-01

    When the amount of labeled data is limited, semisupervised learning can improve the learner's performance by also using the often easily available unlabeled data. In particular, a popular approach requires the learned function to be smooth on the underlying data manifold. By approximating this manifold as a weighted graph, such graph-based techniques can often achieve state-of-the-art performance. However, their high time and space complexities make them less attractive on large data sets. In this paper, we propose to scale up graph-based semisupervised learning using a set of sparse prototypes derived from the data. These prototypes serve as a small set of data representatives, which can be used to approximate the graph-based regularizer and to control model complexity. Consequently, both training and testing become much more efficient. Moreover, when the Gaussian kernel is used to define the graph affinity, a simple and principled method to select the prototypes can be obtained. Experiments on a number of real-world data sets demonstrate encouraging performance and scaling properties of the proposed approach. It also compares favorably with models learned via ℓ1-regularization at the same level of model sparsity. These results demonstrate the efficacy of the proposed approach in producing highly parsimonious and accurate models for semisupervised learning. PMID:25720002
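    The graph-based regularizer that the prototypes approximate can be illustrated with the classic harmonic solution on a small graph, where each unlabeled node takes the value forced on it by the graph Laplacian and the labeled nodes (a standard construction used here for illustration, not the paper's prototype algorithm itself):

```python
import numpy as np

def harmonic_labels(W, y, labeled):
    """Graph-based semisupervised labels: solve the harmonic solution
    f_u = -L_uu^{-1} L_ul y_l for the unlabeled nodes, where L is the
    graph Laplacian (the regularizer that prototype methods approximate
    on large graphs)."""
    L = np.diag(W.sum(axis=1)) - W
    u = [i for i in range(len(y)) if i not in labeled]
    l = list(labeled)
    f = np.array(y, float)
    f[u] = -np.linalg.solve(L[np.ix_(u, u)], L[np.ix_(u, l)] @ f[l])
    return f

# Chain graph 0-1-2-3 with labels fixed at the two ends
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], float)
f = harmonic_labels(W, y=[0.0, 0.0, 0.0, 1.0], labeled=[0, 3])
print(np.round(f, 2))  # ≈ [0, 1/3, 2/3, 1]
```

    On a chain the harmonic solution interpolates linearly between the two labels, which is the "smooth on the manifold" behaviour the abstract describes; the cost of the exact solve is what motivates the prototype approximation.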

  12. Testing an Instructional Model in a University Educational Setting from the Student's Perspective

    ERIC Educational Resources Information Center

    Betoret, Fernando Domenech

    2006-01-01

    We tested a theoretical model that hypothesized relationships between several variables from input, process and product in an educational setting, from the university student's perspective, using structural equation modeling. In order to carry out the analysis, we measured in sequential order the input (referring to students' personal…

  13. Role of vault cytology in follow-up of hysterectomized women: results and inferences from a low resource setting.

    PubMed

    Gupta, Sanjay; Sodhani, Pushpa; Singh, Veena; Sehgal, Ashok

    2013-09-01

    The study was undertaken to assess the utility of cervico-vaginal/vault cytology in the follow-up of women treated for cervical cancer and benign gynecological conditions. Records of 3,523 cervico-vaginal smears from 2,658 women who underwent hysterectomy and/or radiotherapy or chemotherapy over a 10-year period were retrieved. Data were collected on type of treatment received, indication for hysterectomy, age of patient, presenting symptoms, stage of tumor, interval since treatment, cytology and biopsy results. The results of cytology versus other parameters were analyzed separately for women treated for cervical cancer and those hysterectomized for benign indications. Malignant cells were detected in 141/1949 (7.2%) follow-up smears from treated cervical cancer cases (140 recurrences and 1 VAIN). Around 92% of recurrences of cervical cancer were detected within 2 years of follow-up, and 75% of these women were symptomatic. Cytology first alerted the clinicians to a recurrence in a quarter of cases. On the other hand, VAIN was detected in 5/1079 (0.46%) vault smears from 997 women hysterectomized for benign gynecologic disease. All these women were asymptomatic, and the majority (80%) were detected in follow-up smears performed between 3 and 10 years. Vault cytology is an accurate tool to detect local recurrences/VAIN in women treated for cervical cancer or benign gynecological conditions. It may even first alert the clinicians to a possibility of recurrence. However, due to the extremely low prevalence of VAIN/vaginal cancer, it seems unwarranted in women hysterectomized for benign indications, especially in resource-constrained settings. Copyright © 2012 Wiley Periodicals, Inc., a Wiley company.

  14. Process-based interpretation of conceptual hydrological model performance using a multinational catchment set

    NASA Astrophysics Data System (ADS)

    Poncelet, Carine; Merz, Ralf; Merz, Bruno; Parajka, Juraj; Oudin, Ludovic; Andréassian, Vazken; Perrin, Charles

    2017-08-01

    Most previous assessments of hydrologic model performance are fragmented: they are based on small numbers of catchments, different methods or time periods, and do not link the results to landscape or climate characteristics. This study uses large-sample hydrology to identify major catchment controls on daily runoff simulations. It is based on a conceptual lumped hydrological model (GR6J), a collection of 29 catchment characteristics, a multinational set of 1103 catchments located in Austria, France, and Germany, and four runoff model efficiency criteria. Two analyses are conducted to assess how features and criteria are linked: (i) a one-dimensional analysis based on the Kruskal-Wallis test and (ii) a multidimensional analysis based on regression trees, investigating the interplay between features. The catchment features most affecting model performance are the flashiness of precipitation and streamflow (computed as the ratio of absolute day-to-day fluctuations to the total amount in a year), the seasonality of evaporation, the catchment area, and the catchment aridity. Nonflashy, nonseasonal, large, and nonarid catchments show the best performance for all the tested criteria. We argue that this higher performance is due to fewer nonlinear responses (higher correlation between precipitation and streamflow) and lower input and output variability in such catchments. Finally, we show that, compared to national sets, multinational sets increase the transferability of results because they explore a wider range of hydroclimatic conditions.
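    The flashiness measure described in parentheses above can be written down directly; a small sketch with made-up series (the same form as the Richards-Baker flashiness index):

```python
import numpy as np

def flashiness(daily):
    """Flashiness as described in the abstract: sum of absolute
    day-to-day fluctuations divided by the total amount over the
    period (dimensionless)."""
    x = np.asarray(daily, float)
    return np.abs(np.diff(x)).sum() / x.sum()

flashy = [0, 10, 0, 10, 0, 10]   # alternating series: large fluctuations
smooth = [5, 5, 5, 5, 5, 5]      # constant series: no fluctuations
print(flashiness(flashy), flashiness(smooth))
```

    Both series carry the same total (30 units), yet the alternating one scores far higher, which is exactly the property that separates flashy from nonflashy catchments in the study.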

  15. Towards deep inclusion for equity-oriented health research priority-setting: A working model.

    PubMed

    Pratt, Bridget; Merritt, Maria; Hyder, Adnan A

    2016-02-01

    Growing consensus that health research funders should align their investments with national research priorities presupposes that such national priorities exist and are just. Arguably, justice requires national health research priority-setting to promote health equity. Such a position is consistent with recommendations made by the World Health Organization and at global ministerial summits that health research should serve to reduce health inequalities between and within countries. Thus far, no specific requirements for equity-oriented research priority-setting have been described to guide policymakers. As a step towards the explication and defence of such requirements, we propose that deep inclusion is a key procedural component of equity-oriented research priority-setting. We offer a model of deep inclusion that was developed by applying concepts from work on deliberative democracy and development ethics. This model consists of three dimensions--breadth, qualitative equality, and high-quality non-elite participation. Deep inclusion is captured not only by who is invited to join a decision-making process but also by how they are involved and by when non-elite stakeholders are involved. To clarify and illustrate the proposed dimensions, we use the sustained example of health systems research. We conclude by reviewing practical challenges to achieving deep inclusion. Despite the existence of barriers to implementation, our model can help policymakers and other stakeholders design more inclusive national health research priority-setting processes and assess these processes' depth of inclusion. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. Carbon Ion Radiotherapy at the Gunma University Heavy Ion Medical Center: New Facility Set-up.

    PubMed

    Ohno, Tatsuya; Kanai, Tatsuaki; Yamada, Satoru; Yusa, Ken; Tashiro, Mutsumi; Shimada, Hirofumi; Torikai, Kota; Yoshida, Yukari; Kitada, Yoko; Katoh, Hiroyuki; Ishii, Takayoshi; Nakano, Takashi

    2011-10-26

    Carbon ion radiotherapy (C-ion RT) offers superior dose conformity in the treatment of deep-seated tumors compared with conventional X-ray therapy. In addition, carbon ion beams have a higher relative biological effectiveness compared with protons or X-ray beams. C-ion RT for the first patient at Gunma University Heavy Ion Medical Center (GHMC) was initiated in March of 2010. The major specifications of the facility were determined based on the experience of clinical treatments at the National Institute of Radiological Sciences (NIRS), with the size and cost being reduced to one-third of those at NIRS. The currently indicated sites of cancer treatment at GHMC are lung, prostate, head and neck, liver, rectum, bone and soft tissue. Between March 2010 and July 2011, a total of 177 patients were treated at GHMC, although the design specification for this period, set in consideration of optimal machine performance, was 100 patients. In the present article, we introduce the facility set-up of GHMC, including the facility design, treatment planning systems, and clinical preparations.

  17. Carbon Ion Radiotherapy at the Gunma University Heavy Ion Medical Center: New Facility Set-up

    PubMed Central

    Ohno, Tatsuya; Kanai, Tatsuaki; Yamada, Satoru; Yusa, Ken; Tashiro, Mutsumi; Shimada, Hirofumi; Torikai, Kota; Yoshida, Yukari; Kitada, Yoko; Katoh, Hiroyuki; Ishii, Takayoshi; Nakano, Takashi

    2011-01-01

    Carbon ion radiotherapy (C-ion RT) offers superior dose conformity in the treatment of deep-seated tumors compared with conventional X-ray therapy. In addition, carbon ion beams have a higher relative biological effectiveness compared with protons or X-ray beams. C-ion RT for the first patient at Gunma University Heavy Ion Medical Center (GHMC) was initiated in March of 2010. The major specifications of the facility were determined based on the experience of clinical treatments at the National Institute of Radiological Sciences (NIRS), with the size and cost being reduced to one-third of those at NIRS. The currently indicated sites of cancer treatment at GHMC are lung, prostate, head and neck, liver, rectum, bone and soft tissue. Between March 2010 and July 2011, a total of 177 patients were treated at GHMC, although the design specification for this period, set in consideration of optimal machine performance, was 100 patients. In the present article, we introduce the facility set-up of GHMC, including the facility design, treatment planning systems, and clinical preparations. PMID:24213124

  18. Scientific Playworlds: a Model of Teaching Science in Play-Based Settings

    NASA Astrophysics Data System (ADS)

    Fleer, Marilyn

    2017-09-01

    Eminent scientists, like Einstein, worked with theoretical contradiction, thought experiments, mental models and visualisation—all characteristics of children's play. Supporting children's play is a strength of early childhood teachers. Promising research shows a link between imagination in science and imagination in play. A case study of 3 preschool teachers and 26 children (3.6-5.9 years; mean age of 4.6 years) over 6 weeks was undertaken, generating 59.6 h of digital observations and 788 photographs of play practices. The research sought to understand (1) how imaginative play promotes scientific learning and (2) examined how teachers engaged children in scientific play. Although play pedagogy is a strength of early childhood teachers, it was found that transforming imaginary situations into scientific narratives requires different pedagogical characteristics. The study found that the building of collective scientific narratives alongside of discourses of wondering were key determinants of science learning in play-based settings. Specifically, the pedagogical principles of using a cultural device that mirrors the science experiences, creating imaginary scientific situations, collectively building scientific problem situations, and imagining the relations between observable contexts and non-observable concepts, changed everyday practices into a scientific narrative and engagement. It is argued that these unique pedagogical characteristics promote scientific narratives in play-based settings. An approach, named as Scientific Playworlds, is presented as a possible model for teaching science in play-based settings.

  19. A Model Job Rotation Plan: A 10-Year Follow-up.

    ERIC Educational Resources Information Center

    Robinson, Daniel C.; Delbridge-Parker, Linda

    1991-01-01

    Describes a model job rotation plan in a college student affairs division in which a staff member (intern) rotates among departments as a staff development opportunity. A 10-year follow-up evaluation underscored the success of the program. Concludes that job rotation is not just a learning experience but also a sharing experience. (Author/ABL)

  20. Model-based gene set analysis for Bioconductor.

    PubMed

    Bauer, Sebastian; Robinson, Peter N; Gagneur, Julien

    2011-07-01

    Gene Ontology and other forms of gene-category analysis play a major role in the evaluation of high-throughput experiments in molecular biology. Single-category enrichment analysis procedures such as Fisher's exact test tend to flag large numbers of redundant categories as significant, which can complicate interpretation. We have recently developed an approach called model-based gene set analysis (MGSA) that substantially reduces the number of redundant categories returned by gene-category analysis. In this work, we present the Bioconductor package mgsa, which makes the MGSA algorithm available to users of the R language. Our package provides a simple and flexible application programming interface for applying the approach. The mgsa package has been made available as part of Bioconductor 2.8. It is released under the conditions of the Artistic License 2.0. Contact: peter.robinson@charite.de; julien.gagneur@embl.de.
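    For contrast with MGSA, the single-category test mentioned above can be sketched from first principles; this is the one-sided hypergeometric (Fisher) over-representation p-value, with hypothetical counts and a function name introduced here (this is not the mgsa package API):

```python
from math import comb

def fisher_enrichment_p(n_universe, n_category, n_study, n_overlap):
    """One-sided Fisher/hypergeometric p-value for over-representation:
    probability of drawing >= n_overlap study genes inside the category
    by chance. This is the single-category test whose redundant hits
    MGSA is designed to reduce."""
    total = comb(n_universe, n_study)
    hi = min(n_category, n_study)
    return sum(comb(n_category, k) * comb(n_universe - n_category, n_study - k)
               for k in range(n_overlap, hi + 1)) / total

# Hypothetical counts: 1000-gene universe, a 50-gene category,
# a 20-gene study set, 5 of which fall in the category
p = fisher_enrichment_p(1000, 50, 20, 5)
print(p < 0.05)  # → True
```

    Because overlapping categories are tested one at a time, many related categories can clear this threshold together, which is the redundancy problem MGSA addresses with a single joint model.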

  1. Setting up a cohort study in speech and language therapy: lessons from The UK Cleft Collective Speech and Language (CC-SL) study.

    PubMed

    Wren, Yvonne; Humphries, Kerry; Stock, Nicola Marie; Rumsey, Nichola; Lewis, Sarah; Davies, Amy; Bennett, Rhiannon; Sandy, Jonathan

    2018-05-01

    Efforts to increase the evidence base in speech and language therapy are often limited by methodological factors that have restricted the strength of the evidence to the lower levels of the evidence hierarchy. Where higher graded studies, such as randomized controlled trials, have been carried out, it has sometimes been difficult to obtain sufficient power to detect a potential effect of intervention owing to small sample sizes or heterogeneity in the participants. With certain clinical groups such as cleft lip and palate, systematic reviews of intervention studies have shown that there is no robust evidence to support the efficacy of any one intervention protocol over another. To describe the setting up of an observational clinical cohort study and to present this as an alternative design for answering research questions relating to prevalence, risk factors and outcomes from intervention. The Cleft Collective Speech and Language (CC-SL) study is a national cohort study of children born with cleft palate. Working in partnership with regional clinical cleft centres, a sample size of over 600 children and 600 parents is being recruited and followed up from birth to age 5 years. Variables being collected include demographic, psychological, surgical, hearing, and speech and language data. The process of setting up the study has led to the creation of a unique, large-scale data set which is available for researchers to access now and in future. As well as exploring predictive factors, the data can be used to explore the impact of interventions in relation to individual differences. Findings from these investigations can be used to provide information on sample criteria and definitions of intervention and dosage which can be used in future trials. The observational cohort study is a useful alternative design to explore questions around prevalence, risk factors and intervention for clinical groups where robust research data are not yet available. Findings from such a

  2. Using the cloud to speed-up calibration of watershed-scale hydrologic models (Invited)

    NASA Astrophysics Data System (ADS)

    Goodall, J. L.; Ercan, M. B.; Castronova, A. M.; Humphrey, M.; Beekwilder, N.; Steele, J.; Kim, I.

    2013-12-01

    This research focuses on using the cloud to address computational challenges associated with hydrologic modeling. One example is calibration of a watershed-scale hydrologic model, which can take days of execution time on typical computers. While parallel algorithms for model calibration exist and some researchers have used multi-core computers or clusters to run them, these solutions do not fully address the challenge because (i) calibration can still be too time-consuming even on multicore personal computers and (ii) few in the community have the time and expertise needed to manage a compute cluster. Given this, another option we are exploring through this work is the use of the cloud to speed up calibration of watershed-scale hydrologic models. The cloud used in this capacity provides a means of renting a specific number and type of machines for only the time needed to perform a calibration run. It allows one to balance the duration of the calibration against the financial cost so that, budget permitting, the calibration can be performed more quickly by renting more machines. Focusing specifically on the SWAT hydrologic model and a parallel version of the DDS calibration algorithm, we show significant speed-up across a range of watershed sizes using up to 256 cores to perform a model calibration. The tool provides a simple web-based user interface and the ability to monitor the job submission process during calibration. Finally, this talk concludes with initial work to leverage the cloud for other tasks associated with hydrologic modeling, including preparing inputs for constructing place-based hydrologic models.
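    The DDS algorithm referenced above (dynamically dimensioned search) is greedy and simple to sketch in serial form; each objective evaluation below stands in for a full SWAT run, and all parameter values and names are illustrative (the paper uses a parallel variant):

```python
import math
import random

def dds(objective, lo, hi, n_iter=200, r=0.2, seed=1):
    """Serial dynamically dimensioned search: perturb a randomly chosen,
    gradually shrinking subset of parameters and keep any improvement.
    `objective` is minimized; a real set-up would call the SWAT model here."""
    rng = random.Random(seed)
    d = len(lo)
    best = [rng.uniform(a, b) for a, b in zip(lo, hi)]
    fbest = objective(best)
    for i in range(1, n_iter + 1):
        p_sel = 1.0 - math.log(i) / math.log(n_iter)   # fewer dims later on
        dims = [j for j in range(d) if rng.random() < p_sel] or [rng.randrange(d)]
        cand = best[:]
        for j in dims:
            cand[j] += r * (hi[j] - lo[j]) * rng.gauss(0, 1)
            cand[j] = min(max(cand[j], lo[j]), hi[j])   # clamp to bounds
        fc = objective(cand)
        if fc < fbest:                                   # greedy acceptance
            best, fbest = cand, fc
    return best, fbest

# Toy objective standing in for a model-error metric
best, f = dds(lambda x: sum(v * v for v in x), lo=[-5, -5, -5], hi=[5, 5, 5])
```

    Because each candidate evaluation is an independent model run, the candidate generation step parallelizes naturally, which is what makes the algorithm a good fit for renting many cloud machines at once.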

  3. Detection of actinides and rare earths in natural matrices with the AGLAE new, high sensitivity detection set-up

    NASA Astrophysics Data System (ADS)

    Zucchiatti, Alessandro; Alonso, Ursula; Lemasson, Quentin; Missana, Tiziana; Moignard, Brice; Pacheco, Claire; Pichon, Laurent; Camarena de la Mora, Sandra

    2014-08-01

    A series of granite samples (Grimsel and Äspö) enriched by sorption with natU (10-3 M, 10-4 M, 10-5 M in solution) and La (10-3 M, 10-4 M in solution) has been scanned by PIXE over a surface of 1920 × 1920 mm2, together with non-enriched Grimsel and Äspö granites and a glass standard. An assessment of minimum detection limits (MDLs) for several elements has been performed with the use of standard materials. Thanks to mapping and the high sensitivity of the new AGLAE detection system, U levels around 30 ppm can be detected from the whole PIXE spectrum (one low-energy detector and four summed filtered detectors), while U-rich grains, inhomogeneously distributed over the surface, can be clearly identified through the multi-elemental maps and analyzed separately. Even the nominally enriched samples have La levels below the MDL, probably because precipitation of the element (and not adsorption) mostly took place, and the precipitates were eliminated by the surface cleaning carried out before the PIXE analyses. The new set-up comprises: a multi-detector system with a PIXE detection solid angle much wider than in any other similar set-up (by a factor of 2-5); higher event selectivity, given by the possibility of filtering up to 4 PIXE detectors individually; a double RBS detector; the new Ion Beam Induced Luminescence (IBIL) spectrometry; and gamma spectrometry. It offers full mapping capability in air, assisted by powerful event-by-event reconstruction software. These features allow lower minimum detection limits, which are highly beneficial to the analysis of cultural heritage objects, generally meaning a reduction of irradiation time. Paintings can then be studied without any damage to pigments that have colour-change tendencies, which was a major drawback of the previous system. Alternatively, the new features could allow an increase in the information collected in equal time, particularly considering the detectors' fast response and therefore the potential for high beam currents when sample damage can be
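    A common way to compute the minimum detection limits mentioned above is the 3·sqrt(background) criterion; a sketch with invented count numbers (the abstract does not state which MDL definition was used, so this is an assumption):

```python
import math

def mdl_ppm(n_bg_counts, counts_per_ppm):
    """Minimum detection limit via the common 3*sqrt(background)
    criterion: the concentration whose peak area equals three standard
    deviations of the background counts under the peak."""
    return 3.0 * math.sqrt(n_bg_counts) / counts_per_ppm

# Invented numbers: 900 background counts under the U line,
# 3 net counts per ppm of U
print(round(mdl_ppm(n_bg_counts=900, counts_per_ppm=3.0), 1))  # → 30.0
```

    Widening the solid angle and summing several filtered detectors raises `counts_per_ppm` faster than the background, which is how the new set-up lowers the MDL (or, equivalently, shortens the irradiation time needed to reach a given limit).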

  4. The Impacts of Different Meteorology Data Sets on Nitrogen Fate and Transport in the SWAT Watershed Model

    EPA Science Inventory

    In this study, we investigated how different meteorology data sets impact nitrogen fate and transport responses in the Soil and Water Assessment Tool (SWAT) model. We used two meteorology data sets: National Climatic Data Center (observed) and Mesoscale Model 5/Weather Research ...

  5. Blow-up for a three dimensional Keller-Segel model with consumption of chemoattractant

    NASA Astrophysics Data System (ADS)

    Jiang, Jie; Wu, Hao; Zheng, Songmu

    2018-04-01

    We investigate blow-up properties for the initial-boundary value problem of a Keller-Segel model with consumption of chemoattractant when the spatial dimension is three. Through a kinetic reformulation of the Keller-Segel system, we first derive some higher-order estimates and obtain certain blow-up criteria for the local classical solutions. These blow-up criteria generalize the results in [4,5] from the whole space R3 to the case of a bounded smooth domain Ω ⊂ R3. A lower global blow-up estimate on ‖n‖L∞(Ω) is also obtained based on our higher-order estimates. Moreover, we prove local non-degeneracy for blow-up points.
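    For reference, the chemotaxis system with consumption of chemoattractant is usually written as follows, with n the cell density, c the chemoattractant concentration, and no-flux boundary conditions on a bounded smooth domain Ω ⊂ R³ (stated from the standard literature rather than reproduced from the paper itself):

```latex
\begin{aligned}
  n_t &= \Delta n - \nabla\cdot(n\nabla c), && x\in\Omega,\ t>0,\\
  c_t &= \Delta c - nc,                     && x\in\Omega,\ t>0,\\
  \partial_\nu n &= \partial_\nu c = 0,     && x\in\partial\Omega,\ t>0.
\end{aligned}
```

    The "consumption" refers to the -nc term: cells degrade the chemoattractant rather than producing it, which is what distinguishes this variant from the classical Keller-Segel production model.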

  6. Spin-up simulation behaviors in a climate model to build a basement of long-time simulation

    NASA Astrophysics Data System (ADS)

    Lee, J.; Xue, Y.; De Sales, F.

    2015-12-01

    It is essential to develop start-up information when conducting a long-time climate simulation. If an initial condition is already available from a previous simulation with the same type of model, this is not necessary; otherwise, the model needs a spin-up simulation to obtain an initial condition that is adjusted to and balanced with the model climatology, and a severe spin-up may take several years. Model variables with long residual memories, such as deep soil temperature and temperature in the deep ocean layers, affect the model's subsequent long-time simulation through the initial fields. To investigate the important factors in spin-up simulations for producing an atmospheric initial condition, we conducted two different spin-up simulations for the case in which no atmospheric condition is available from existing datasets. One simulation employed an atmospheric general circulation model (AGCM), namely the Global Forecast System (GFS) of the National Center for Environmental Prediction (NCEP), while the other employed an atmosphere-ocean coupled general circulation model (CGCM), namely the Climate Forecast System (CFS) of NCEP. Both models share the atmospheric component; the only difference is the ocean coupling, provided in CFS by the Modular Ocean Model version 4 (MOM4) of the Geophysical Fluid Dynamics Laboratory (GFDL). During a decade of spin-up simulation, prescribed sea-surface temperature (SST) fields for the target year were forced onto the GFS on a daily basis, while the CFS ingested only the first-time-step ocean condition and iterated freely for the rest of the period. Both models were forced by the CO2 concentration and solar constant of the target year. Our analyses of the spin-up results indicate that a freely evolving interaction between the ocean and the atmosphere is more helpful for producing the initial condition for the target year than a fixed SST forcing. Since the GFS used prescribed forcing taken exactly from the target year, this result is unexpected.

  7. Complex fuzzy soft expert sets

    NASA Astrophysics Data System (ADS)

    Selvachandran, Ganeshsree; Hafeed, Nisren A.; Salleh, Abdul Razak

    2017-04-01

    Complex fuzzy sets and their accompanying theory, although in their infancy, have proven to be superior to classical type-1 fuzzy sets, due to their ability to represent time-periodic problem parameters and to capture the seasonality of the fuzziness that exists in the elements of a set. These are important characteristics that are pervasive in most real-world problems. However, there are two major problems inherent in complex fuzzy sets: they lack a sufficient parameterization tool, and they have no mechanism to validate the values assigned to the membership functions of the elements of a set. To overcome these problems, we propose the notion of complex fuzzy soft expert sets, a hybrid model of complex fuzzy sets and soft expert sets. This model incorporates the advantages of complex fuzzy sets and soft sets, with the added advantage of allowing users to know the opinions of all the experts in a single model without the need for any additional cumbersome operations. As such, this model effectively improves the accuracy of representation of problem parameters that are periodic in nature, while having a higher level of computational efficiency than similar models in the literature.
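    The defining feature of a complex fuzzy grade is an amplitude in [0, 1] carrying the ordinary membership degree and a phase carrying the periodic context; a minimal sketch of how a complex fuzzy soft-expert-style mapping might be laid out (all names and values are invented for illustration, not taken from the paper):

```python
import cmath

def complex_membership(r, omega):
    """Complex fuzzy grade r * e^{i*omega}: the amplitude r in [0, 1]
    carries the ordinary membership degree, the phase omega carries
    the time-periodic context."""
    assert 0.0 <= r <= 1.0
    return r * cmath.exp(1j * omega)

# A soft-expert-style structure: each (parameter, expert) pair maps
# elements of the universe to complex grades (illustrative names).
F = {("demand", "expert1"): {"x1": complex_membership(0.8, cmath.pi / 4)},
     ("demand", "expert2"): {"x1": complex_membership(0.6, cmath.pi / 3)}}
z = F[("demand", "expert1")]["x1"]
print(round(abs(z), 3), round(cmath.phase(z), 3))  # → 0.8 0.785
```

    Keying the mapping by (parameter, expert) pairs is what lets a single structure hold every expert's opinion at once, which is the convenience the abstract highlights.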

  8. Integrating Soft Set Theory and Fuzzy Linguistic Model to Evaluate the Performance of Training Simulation Systems

    PubMed Central

    Chang, Kuei-Hu; Chang, Yung-Chia; Chain, Kai; Chung, Hsiang-Yu

    2016-01-01

    The advancement of high technologies and the arrival of the information age have caused changes to modern warfare. The military forces of many countries have partially replaced real training drills with training simulation systems to achieve combat readiness. However, many types of training simulation systems are used in military settings. In addition, differences in system set-up time, functions, the environment, and the competency of system operators, as well as incomplete information, have made it difficult to evaluate the performance of training simulation systems. To address these problems, this study integrated the analytic hierarchy process, soft set theory, and the fuzzy linguistic representation model to evaluate the performance of various training simulation systems. Furthermore, importance–performance analysis was adopted to examine the influence of cost savings and training safety of training simulation systems. The findings of this study are expected to facilitate the application of military training simulation systems, avoid the waste of resources (e.g., low utility and idle time), and provide data for subsequent applications and analysis. To verify the method proposed in this study, numerical examples of the performance evaluation of training simulation systems were adopted and compared with the numerical results of an AHP and a novel AHP-based ranking technique. The results verified that not only could expert-provided questionnaire information be fully considered to lower the repetition rate of performance rankings, but a two-dimensional graph could also be used to help administrators allocate limited resources, thereby enhancing the investment benefits and training effectiveness of a training simulation system. PMID:27598390

  10. Setting up a clinical trial for a novel disease: a case study of the Doxycycline for the Treatment of Nodding Syndrome Trial - challenges, enablers and lessons learned.

    PubMed

    Anguzu, Ronald; Akun, Pamela R; Ogwang, Rodney; Shour, Abdul Rahman; Sekibira, Rogers; Ningwa, Albert; Nakamya, Phellister; Abbo, Catherine; Mwaka, Amos D; Opar, Bernard; Idro, Richard

    2018-01-01

    A large amount of preparation goes into setting up trials, and different challenges are encountered and lessons learned along the way. Our trial, testing a treatment for nodding syndrome, an acquired neurological disorder of unknown cause affecting thousands of children in Eastern Africa, provides a unique case study. As part of a study to determine the aetiology, understand the pathogenesis and develop a specific treatment, we set up a clinical trial in a remote district hospital in Uganda. This paper describes our experiences and documents supportive structures (enablers), challenges faced and lessons learned during set-up of the trial. Protocol development started in September 2015 with phased recruitment of a critical study team. The team spent 12 months preparing trial documents, procuring supplies and training on procedures. Potential recruitment sites were pre-visited, and district and local leaders were engaged as key stakeholders. Key enablers were supportive local leadership and investment by the district and Ministry of Health. The main challenges were community fears about nodding syndrome, adverse experiences of the community during previous research, and political involvement. Other challenges included the number of protocol approvals required, delays in obtaining them, and lengthy procurement processes. This hard-to-reach area has frequent power and Internet fluctuations, which may affect cold chains for study samples, communication and data management. These concerns decreased with a pilot community engagement programme. The experiences and lessons learnt described here can reduce the duration of the processes involved in trial-site set-up. A programme of community engagement and local-leader involvement may be key to the success of a trial and to reducing community opposition towards participation in research.

  11. Does the Danube exist? Versions of reality given by various regional climate models and climatological data sets

    NASA Astrophysics Data System (ADS)

    Lucarini, Valerio; Danihlik, Robert; Kriegerova, Ida; Speranza, Antonio

    2007-07-01

    We present an auditing (intercomparison and verification) of several regional climate models (RCMs) nested into the same run of the same atmospheric global circulation model (AGCM) regarding their representation of the statistical properties of the hydrological balance of the Danube river basin for 1961-1990. We also consider the data sets produced by the driving AGCM and by the European Centre for Medium-Range Weather Forecasts (ECMWF) and National Centers for Environmental Prediction (NCEP)-National Center for Atmospheric Research (NCAR) reanalyses. The hydrological balance is computed by integrating the precipitation and evaporation fields over the area of interest. Large discrepancies exist among RCMs for the monthly climatology as well as for the mean and variability of the annual balances, and only a few data sets are consistent with the observed discharge values of the Danube at its delta, even though the driving AGCM itself provides an excellent estimate. We find consistently that, for a given model, increases in resolution do not alter the net water balance, while speeding up the hydrological cycle through the enhancement of both precipitation and evaporation by the same amount. Since the considered approach relies on the mass conservation principle and bypasses the details of the air-land interface modeling, we propose that the atmospheric components of RCMs still face difficulties in representing the water balance even on a relatively large scale, and their reliability on smaller river basins may be even more problematic. Moreover, since for some models the hydrological balance estimates obtained with the runoff fields do not agree with those obtained via precipitation and evaporation, some deficiencies of the land models are also apparent. The driving AGCM greatly outperforms the NCEP-NCAR and ECMWF 40-year (ERA-40) reanalyses, which prove largely inadequate for representing the hydrology of the Danube river basin, both for the reconstruction of the long

  12. Constraints from the CMB temperature and other common observational data sets on variable dark energy density models

    NASA Astrophysics Data System (ADS)

    Jetzer, Philippe; Tortora, Crescenzo

    2011-08-01

    The thermodynamic and dynamical properties of a variable dark energy model with density scaling as ρx ∝ (1+z)^m, z being the redshift, are discussed following the outline of Jetzer et al. [P. Jetzer, D. Puy, M. Signore, and C. Tortora, Gen. Relativ. Gravit. 43, 1083 (2011)]. Models of this kind are shown to lead to the creation/disruption of matter and radiation, which affects the cosmic evolution of both the matter and radiation components of the Universe. In particular, we have concentrated on the temperature-redshift relation of radiation, which has been constrained using a very recent collection of cosmic microwave background (CMB) temperature measurements up to z ∼ 3. For the first time, we have combined this observational probe with a set of independent measurements (Supernovae Ia distance moduli, CMB anisotropy, large-scale structure and observational data for the Hubble parameter) which are commonly adopted to constrain dark energy models. We find that, within the uncertainties, the model is indistinguishable from a cosmological constant that does not exchange any particles with other components. However, while temperature measurements and Supernovae Ia tend to favour slightly decaying models, the opposite holds when CMB data are included. Future observations, in particular measurements of the CMB temperature at large redshift, will allow firmer bounds to be placed on the effective equation-of-state parameter weff of this kind of dark energy model.

  13. Effects of topography on the spin-up of a Venus atmospheric model

    NASA Astrophysics Data System (ADS)

    Herrnstein, A.; Dowling, T. E.

    2007-04-01

    We study how topography affects the spin-up from rest of a model of the atmosphere of Venus. The simulations are performed with the EPIC model using its isentropic, terrain-following hybrid vertical coordinate, and are forced with the Newtonian-cooling profile used to achieve superrotation in a Venus model with no topography by Lee et al. (2005). We are able to reproduce their results with our model, which was developed independently and uses a different vertical coordinate. Both groups use a horizontal resolution of 5°, which is dictated by the need for reasonable computer runtime and is not a claim of numerical convergence. We find that the addition of topography substantially changes both the evolution and end state of the model's spin-up: the magnitude of the superrotation is diminished from 55 m s⁻¹ to 35 m s⁻¹, and it reaches steady state faster, in a few years instead of a few decades. A large, stationary eddy associated with Ishtar Terra forms that has a local horizontal temperature anomaly of order 2 K at the 0.7 bar level; such a feature may be observable in high-resolution infrared images.

  14. Shear rheology and 1H TD-NMR combined to low-field RheoNMR: Set-up and application to quiescent and flow-induced crystallization of polymers

    NASA Astrophysics Data System (ADS)

    Räntzsch, Volker; Özen, Mürüvvet Begüm; Ratzsch, Karl-Friedrich; Guthausen, Gisela; Wilhelm, Manfred

    2017-05-01

    Rheology provides access to the flow properties of soft matter, while 1H TD-NMR is a useful technique for the characterization of molecular dynamics. To achieve greater insight into the interplay of these domains, especially under flow, it is desirable to combine these two methods in one set-up. We present a low-field RheoNMR set-up based on a portable 30 MHz 1H NMR unit that was integrated into a commercial strain-controlled shear rheometer. This unique combination can simultaneously conduct a full rheological characterization (G', G", |η*|, FT-Rheology: I3/1, Q0) while monitoring molecular dynamics in-situ via 1H TD-NMR for temperatures from -15 to +210 °C. Possible applications include the quantitative measurement of the composition in multiphase systems (fats, polymers, etc.) and soft matter during the application of flow, e.g. measurements on the flow-induced crystallization of polymers.

  15. Benchmark Data Set for Wheat Growth Models: Field Experiments and AgMIP Multi-Model Simulations.

    NASA Technical Reports Server (NTRS)

    Asseng, S.; Ewert, F.; Martre, P.; Rosenzweig, C.; Jones, J. W.; Hatfield, J. L.; Ruane, A. C.; Boote, K. J.; Thorburn, P.J.; Rotter, R. P.

    2015-01-01

    The data set includes a current representative management treatment from detailed, quality-tested sentinel field experiments with wheat from four contrasting environments in Australia, The Netherlands, India and Argentina. Measurements include local daily climate data (solar radiation, maximum and minimum temperature, precipitation, surface wind, dew point temperature, relative humidity, and vapor pressure), soil characteristics, frequent growth observations, nitrogen in crop and soil, crop and soil water, and yield components. Simulations include results from 27 wheat models and a sensitivity analysis with 26 models over 30 years (1981-2010) for each location, for elevated atmospheric CO2 and temperature changes, a heat-stress sensitivity analysis at anthesis, and a sensitivity analysis with soil and crop management variations and a Global Climate Model end-of-century scenario.

  16. Mathematical models of the simplest fuzzy PI/PD controllers with skewed input and output fuzzy sets.

    PubMed

    Mohan, B M; Sinha, Arpita

    2008-07-01

    This paper unveils mathematical models for fuzzy PI/PD controllers which employ two skewed fuzzy sets for each of the two input variables and three skewed fuzzy sets for the output variable. The basic constituents of these models are Gamma-type and L-type membership functions for each input, trapezoidal/triangular membership functions for the output, intersection/algebraic-product triangular norm, maximum/drastic-sum triangular conorm, Mamdani minimum/Larsen product/drastic product inference methods, and the center-of-sums defuzzification method. The existing simplest fuzzy PI/PD controller structures derived via symmetrical fuzzy sets become special cases of the mathematical models revealed in this paper. Finally, a numerical example along with its simulation results is included to demonstrate the effectiveness of the simplest fuzzy PI controllers.
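    The structure analysed in this paper can be illustrated with a minimal sketch of the simplest fuzzy PI controller: an L-type ("negative") and a Gamma-type ("positive") membership function on each input, Mamdani minimum inference, and weighted-average (height) defuzzification, which for equal-base symmetric output sets is a common simplification of center of sums. All breakpoints and output centroids below are illustrative choices, not the paper's models:

```python
def mf_negative(x, L=1.0):
    """L-type membership: 1 below -L, falling linearly to 0 at +L."""
    if x <= -L:
        return 1.0
    if x >= L:
        return 0.0
    return (L - x) / (2 * L)

def mf_positive(x, L=1.0):
    """Gamma-type membership, complementary to the L-type set."""
    return 1.0 - mf_negative(x, L)

def fuzzy_pi_increment(e, de, L=1.0, centroids=(-1.0, 0.0, 1.0)):
    """Incremental control output du for the rule base:
       (N, N) -> Negative, (N, P) or (P, N) -> Zero, (P, P) -> Positive."""
    eN, eP = mf_negative(e, L), mf_positive(e, L)
    dN, dP = mf_negative(de, L), mf_positive(de, L)
    # Mamdani minimum inference, rule strengths grouped per output set
    w_neg = min(eN, dN)
    w_zero = max(min(eN, dP), min(eP, dN))
    w_pos = min(eP, dP)
    # Height defuzzification over the three output-set centroids
    num = w_neg * centroids[0] + w_zero * centroids[1] + w_pos * centroids[2]
    den = w_neg + w_zero + w_pos
    return num / den if den else 0.0
```

    At the input extremes the controller saturates at the outer output centroids, and at (0, 0) the output vanishes by symmetry.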

  17. Aerostructural Level Set Topology Optimization for a Common Research Model Wing

    NASA Technical Reports Server (NTRS)

    Dunning, Peter D.; Stanford, Bret K.; Kim, H. Alicia

    2014-01-01

    The purpose of this work is to use level set topology optimization to improve the design of a representative wing box structure for the NASA common research model. The objective is to minimize the total compliance of the structure under aerodynamic and body force loading, where the aerodynamic loading is coupled to the structural deformation. A taxi bump case was also considered, where only body force loads were applied. The trim condition that aerodynamic lift must balance the total weight of the aircraft is enforced by allowing the root angle of attack to change. The level set optimization method is implemented on an unstructured three-dimensional grid, so that the method can optimize a wing box with arbitrary geometry. Fast matching and upwind schemes are developed for an unstructured grid, which make the level set method robust and efficient. The adjoint method is used to obtain the coupled shape sensitivities required to perform aerostructural optimization of the wing box structure.

  18. A moist Boussinesq shallow water equations set for testing atmospheric models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zerroukat, M., E-mail: mohamed.zerroukat@metoffice.gov.uk; Allen, T.

    The shallow water equations have long been used as an initial test for numerical methods applied to atmospheric models, with the test suite of Williamson et al. being used extensively for validating new schemes and assessing their accuracy. However, the lack of physics forcing within this simplified framework often requires numerical techniques to be reworked when applied to fully three-dimensional models. In this paper a novel two-dimensional shallow water equations system that retains moist processes is derived. This system is derived from a three-dimensional Boussinesq approximation of the hydrostatic Euler equations where, unlike the classical shallow water set, we allow the density to vary slightly with temperature. This results in extra (buoyancy) terms in the momentum equations, through which a two-way moist-physics-dynamics feedback is achieved. The temperature and moisture variables are advected as separate tracers with sources that interact with the mean flow through a simplified yet realistic bulk moist-thermodynamic phase-change model. This moist shallow water system provides a unique tool to assess the usually complex and highly non-linear dynamics-physics interactions in atmospheric models in a simple yet realistic way. The full non-linear shallow water equations are solved numerically in several case studies, and the results suggest quite realistic interaction between the dynamics and physics, in particular the generation of cloud and rain. - Highlights: • Novel shallow water equations which retain moist processes are derived from the three-dimensional hydrostatic Boussinesq equations. • The new shallow water set can be seen as a more general one, of which the classical equations are a special case. • This moist shallow water system naturally allows a feedback mechanism from the moist physics increments to the momentum via buoyancy. • Like full models, temperature and moistures are advected as tracers that
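    For orientation, the classical (dry) shallow water set that this system generalises can be written in its standard rotating form; these are the textbook equations, not the paper's moist extension, which additionally carries buoyancy terms in the momentum equation and tracer equations with phase-change sources for temperature and moisture:

```latex
\frac{\partial \mathbf{u}}{\partial t} + (\mathbf{u}\cdot\nabla)\mathbf{u}
  + f\,\hat{\mathbf{k}}\times\mathbf{u} = -g\,\nabla h,
\qquad
\frac{\partial h}{\partial t} + \nabla\cdot(h\,\mathbf{u}) = 0,
```

    where u is the horizontal velocity, h the fluid depth, f the Coriolis parameter and g gravity.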

  19. Evaluating a scalable model for implementing electronic health records in resource-limited settings.

    PubMed

    Were, Martin C; Emenyonu, Nneka; Achieng, Marion; Shen, Changyu; Ssali, John; Masaba, John P M; Tierney, William M

    2010-01-01

    Current models for implementing electronic health records (EHRs) in resource-limited settings may not be scalable because they fail to address human-resource and cost constraints. This paper describes an implementation model which relies on shared responsibility between local sites and an external three-pronged support infrastructure consisting of (1) a national technical expertise center, (2) an implementers' community, and (3) a developers' community. This model was used to implement an open-source EHR in three Ugandan HIV clinics. A pre-post time-motion study at one site revealed that primary care providers spent a third less time in direct and indirect care of patients (p<0.001) and 40% more time on personal activities (p=0.09) after EHR implementation. Time spent by previously enrolled patients with non-clinician staff fell by half (p=0.004) and with pharmacy by 63% (p<0.001). Surveyed providers were highly satisfied with the EHR and its support infrastructure. This model offers a viable approach for broadly implementing EHRs in resource-limited settings.

  20. A Model Process for Institutional Goals-Setting. A Module of the Needs Assessment Project.

    ERIC Educational Resources Information Center

    King, Maxwell C.; And Others

    A goals-setting model for the community/junior college that would interface with the community needs assessment model was developed, using as the survey instrument the Institutional Goals Inventory (I.G.I.) developed by the Educational Testing Service. The nine steps in the model are: Establish Committee on College Goals and Identify Goals Project…

  1. Linking mathematics with engineering applications at an early stage - implementation, experimental set-up and evaluation of a pilot project

    NASA Astrophysics Data System (ADS)

    Rooch, Aeneas; Junker, Philipp; Härterich, Jörg; Hackl, Klaus

    2016-03-01

    Too difficult, too abstract, too theoretical - many first-year engineering students complain about their mathematics courses. The project MathePraxis aims to resolve this disaffection. It links mathematical methods as they are taught in the first semesters with practical problems from engineering applications, and thereby aims to give first-year engineering students a vivid and convincing impression of where they will need mathematics in their later working life. But since real applications usually require more than basic mathematics, and first-year engineering students typically have no experience with construction, mensuration or the use of engineering software, such an approach is hard to realise. In this article, we show that it is possible. We report on the implementation of MathePraxis at Ruhr-Universität Bochum. We describe the set-up and implementation of a course on designing a mass damper, which combines basic mathematical techniques with an impressive experiment. In an accompanying evaluation, we examined the students' motivation relating to mathematics. This opens up new perspectives on how to address the need for a more practically oriented mathematical education in the engineering sciences.

  2. Chemical Topic Modeling: Exploring Molecular Data Sets Using a Common Text-Mining Approach.

    PubMed

    Schneider, Nadine; Fechner, Nikolas; Landrum, Gregory A; Stiefl, Nikolaus

    2017-08-28

    Big data is one of the key transformative factors which increasingly influences all aspects of modern life. Although this transformation brings vast opportunities, it also generates novel challenges, not the least of which is organizing and searching this data deluge. The field of medicinal chemistry is no different: more and more data are being generated, for instance by technologies such as DNA-encoded libraries, peptide libraries, text mining of large literature corpora, and new in silico enumeration methods. Handling these huge sets of molecules effectively is quite challenging and requires compromises that often come at the expense of the interpretability of the results. In order to find an intuitive and meaningful approach to organizing large molecular data sets, we adopted a probabilistic framework called "topic modeling" from the text-mining field. Here we present the first chemistry-related implementation of this method, which allows large molecule sets to be assigned to "chemical topics" and the relationships between those topics to be investigated. In this first study, we thoroughly evaluate this novel method in different experiments and discuss both its disadvantages and advantages. We show very promising results in reproducing human-assigned concepts, using the approach to identify and retrieve chemical series from sets of molecules. We have also created an intuitive visualization of the chemical topics output by the algorithm. This is a huge benefit compared to other unsupervised machine-learning methods, like clustering, which are commonly used to group sets of molecules. Finally, we applied the new method to the 1.6 million molecules of the ChEMBL22 data set to test its robustness and efficiency. In about 1 h we built a 100-topic model of this large data set in which we could identify interesting topics like "proteins", "DNA", or "steroids". Along with this publication we provide our data sets and an open-source implementation of the new method (CheTo) which
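    The core idea, molecules as "documents" whose "words" are substructure fragments, can be sketched with standard latent Dirichlet allocation. The fragment labels and count matrix below are invented stand-ins; the published method (CheTo) derives fragments from Morgan fingerprints via RDKit, which is not reproduced here:

```python
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation

# Toy vocabulary of substructure "words" (column labels only, illustrative)
fragments = ["c1ccccc1", "C(=O)O", "N", "S(=O)(=O)", "C1CCNCC1"]

# Rows: molecules; columns: fragment counts (invented toy data)
X = np.array([
    [3, 1, 0, 0, 0],
    [2, 2, 0, 0, 0],
    [0, 0, 2, 1, 1],
    [0, 0, 3, 2, 0],
])

# Fit a 2-topic LDA model on the molecule-fragment count matrix
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)
doc_topics = lda.transform(X)   # molecule -> topic distribution (rows sum to 1)
topic_words = lda.components_   # topic -> fragment weight matrix
print(doc_topics.shape, topic_words.shape)
```

    Each molecule then gets a soft assignment over chemical topics, which is what makes the grouping more interpretable than hard clustering.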

  3. LexValueSets: An Approach for Context-Driven Value Sets Extraction

    PubMed Central

    Pathak, Jyotishman; Jiang, Guoqian; Dwarkanath, Sridhar O.; Buntrock, James D.; Chute, Christopher G.

    2008-01-01

    The ability to model, share and re-use value sets across multiple medical information systems is an important requirement. However, generating value sets semi-automatically from a terminology service is still an unresolved issue, in part due to the lack of linkage to clinical context patterns that provide the constraints in defining a concept domain and invocation of value sets extraction. Towards this goal, we develop and evaluate an approach for context-driven automatic value sets extraction based on a formal terminology model. The crux of the technique is to identify and define the context patterns from various domains of discourse and leverage them for value set extraction using two complementary ideas based on (i) local terms provided by the Subject Matter Experts (extensional) and (ii) semantic definition of the concepts in coding schemes (intensional). A prototype was implemented based on SNOMED CT rendered in the LexGrid terminology model and a preliminary evaluation is presented. PMID:18998955

  4. Detection performance assessment of hand-held mine detection systems in a procurement process: test set-up for MDs and MD/GPRs

    NASA Astrophysics Data System (ADS)

    Schoolderman, Arnold J.; Roosenboom, Jacques H. J.

    2005-06-01

    The Engineers Centre of Expertise of the Royal Netherlands Army (RNLA) has conducted a study on countermine operations in peace operations. This study, finished in 2002, concluded that the final solution to countermine will depend in the first place on better detection of buried low-metal mines, e.g. by direct detection of the explosive components in mines. Until such detection systems are available, intermediate solutions are necessary in order to assure freedom of movement in peace operations. Because countermine operations consist of a number of different activities (area preparation, detection, clearance, etc.) and the suitability of the different types of available equipment depends on the scenario, the toolbox concept for countermine equipment was adopted. In 2003 a procurement process was started in order to fill this toolbox with commercial-off-the-shelf and military-off-the-shelf equipment. This paper gives a concise description of the study on countermine operations and the procurement process, and subsequently focuses on the set-up of the tests that were conducted in the framework of the procurement of hand-held mine detection systems, such as metal detectors and dual-sensor mine detectors. Programs of requirements for these systems were drawn up, aiming at systems for general use and special-purpose systems. Blind tests to check compliance with the detection performance requirements were designed and conducted in the short timeframe that was available in the procurement process. These tests are discussed in this paper, including the set-up of the test lanes, the targets used and their depths, and the role of the operator. The tests of the capability of the detectors to discriminate small targets adjacent to large targets were conducted according to the guidelines of the CEN Workshop Agreement on metal detector tests. Although the results of the tests are commercially confidential, conclusions and lessons learned from the execution of these tests are presented.

  5. Discretisation Schemes for Level Sets of Planar Gaussian Fields

    NASA Astrophysics Data System (ADS)

    Beliaev, D.; Muirhead, S.

    2018-01-01

    Smooth random Gaussian functions play an important role in mathematical physics, a main example being the random plane wave model conjectured by Berry to give a universal description of high-energy eigenfunctions of the Laplacian on generic compact manifolds. Our work is motivated by questions about the geometry of such random functions, in particular relating to the structure of their nodal and level sets. We study four discretisation schemes that extract information about level sets of planar Gaussian fields. Each scheme recovers information up to a different level of precision, and each requires a maximum mesh-size in order to be valid with high probability. The first two schemes are generalisations and enhancements of similar schemes that have appeared in the literature (Beffara and Gayet in Publ Math IHES, 2017. https://doi.org/10.1007/s10240-017-0093-0; Mischaikow and Wanner in Ann Appl Probab 17:980-1018, 2007); these give complete topological information about the level sets on either a local or global scale. As an application, we improve the results in Beffara and Gayet (2017) on Russo-Seymour-Welsh estimates for the nodal set of positively-correlated planar Gaussian fields. The third and fourth schemes are, to the best of our knowledge, completely new. The third scheme is specific to the nodal set of the random plane wave, and provides global topological information about the nodal set up to `visible ambiguities'. The fourth scheme gives a way to approximate the mean number of excursion domains of planar Gaussian fields.
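    A toy version of the simplest vertex-sign discretisation can be sketched as follows: sample a smooth planar Gaussian field on a mesh and record the sign of the field at each vertex, so that connected sign clusters approximate nodal and excursion domains. The spectral smoothing used to generate a stationary field, the kernel width, and the mesh size are all arbitrary illustrative choices, not the schemes of the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
n, sigma = 64, 4.0

# Smooth white noise with a Gaussian kernel via FFT to obtain a
# (discretely sampled) stationary smooth Gaussian field on the grid.
white = rng.standard_normal((n, n))
kx = np.fft.fftfreq(n)[:, None]
ky = np.fft.fftfreq(n)[None, :]
kernel = np.exp(-2 * (np.pi * sigma) ** 2 * (kx**2 + ky**2))
field = np.fft.ifft2(np.fft.fft2(white) * kernel).real
field /= field.std()  # normalize to unit variance

signs = field > 0          # vertex-sign discretisation of the nodal set
excursion = field > 1.0    # discretised level set at level 1
print(signs.mean(), excursion.mean())
```

    Refining the mesh relative to the field's correlation length is what the maximum mesh-size conditions in the paper control: too coarse a grid and the vertex signs no longer determine the level-set topology.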

  6. Blow-up in nonlinear models of extended particles with confined constituents

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alvarez, A.; Ranada, A.F.

    1988-11-15

    It is shown that the indefinite character of the charge in classical models of extended particles with confined constituents is a serious handicap since infinite amounts of positive and negative charge can be emitted in some solutions, causing a blow-up in finite time.

  7. Dissecting the genetic make-up of North-East Sardinia using a large set of haploid and autosomal markers.

    PubMed

    Pardo, Luba M; Piras, Giovanna; Asproni, Rosanna; van der Gaag, Kristiaan J; Gabbas, Attilio; Ruiz-Linares, Andres; de Knijff, Peter; Monne, Maria; Rizzu, Patrizia; Heutink, Peter

    2012-09-01

    Sardinia has been used for genetic studies because of its historical isolation, genetic homogeneity and increased prevalence of certain rare diseases. Controversy remains concerning the genetic substructure and the extent of genetic homogeneity, which has implications for the design of genome-wide association studies (GWAS). We revisited this issue by examining the genetic make-up of a sample from North-East Sardinia using a dense set of autosomal, Y chromosome and mitochondrial markers to assess the potential of the sample for GWAS and fine mapping studies. We genotyped individuals for 500K single-nucleotide polymorphisms, Y chromosome markers and sequenced the mitochondrial hypervariable (HVI-HVII) regions. We identified major haplogroups and compared these with other populations. We estimated linkage disequilibrium (LD) and haplotype diversity across autosomal markers, and compared these with other populations. Our results show that within Sardinia there is no major population substructure and thus it can be considered a genetically homogenous population. We did not find substantial differences in the extent of LD in Sardinians compared with other populations. However, we showed that at least 9% of genomic regions in Sardinians differed in LD structure, which is helpful for identifying functional variants using fine mapping. We concluded that Sardinia is a powerful setting for genetic studies including GWAS and other mapping approaches.
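    The LD comparisons described above rest on the standard r² statistic: the squared Pearson correlation between allele counts at two loci. A minimal sketch with invented 0/1/2-coded genotypes, not data from the study:

```python
import numpy as np

# Rows: individuals; columns: SNPs, coded as minor-allele counts (0/1/2).
# Toy data: SNP1 and SNP2 are identical (complete LD), SNP3 differs.
geno = np.array([
    [0, 0, 2],
    [1, 1, 1],
    [2, 2, 0],
    [1, 1, 2],
    [0, 0, 1],
    [2, 2, 0],
])

def ld_r2(g1, g2):
    """Squared Pearson correlation between allele counts at two loci."""
    r = np.corrcoef(g1, g2)[0, 1]
    return r * r

r2_12 = ld_r2(geno[:, 0], geno[:, 1])  # identical columns -> complete LD
r2_13 = ld_r2(geno[:, 0], geno[:, 2])
print(r2_12, r2_13)
```

    Regions where the decay of r² with physical distance differs between populations are exactly the kind of LD-structure differences the study quantifies.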

  8. Psychometric evaluation of 3-set 4P questionnaire.

    PubMed

    Akerman, Eva; Fridlund, Bengt; Samuelson, Karin; Baigi, Amir; Ersson, Anders

    2013-02-01

    This is a further development of a specific questionnaire, the 3-set 4P, to be used for measuring former ICU patients' physical and psychosocial problems after intensive care and their need for follow-up. The aim was to psychometrically test and evaluate the 3-set 4P questionnaire in a larger population. The questionnaire consists of three sets: "physical", "psychosocial" and "follow-up". The questionnaires were sent by mail to all patients with a length of stay of more than 24 hours in four ICUs in Sweden. Construct validity was measured with exploratory factor analysis with Varimax rotation. This resulted in three factors for the physical set, five factors for the psychosocial set and four factors for the follow-up set, with strong factor loadings and a total explained variance of 62-77.5%. Thirteen questions in the SF-36 were used for concurrent validity, showing Spearman's r(s) of 0.3-0.6 in eight questions and less than 0.2 in five. Test-retest was used for stability reliability. In the follow-up set the correlations were strong to moderate, and in the physical and psychosocial sets the correlations were moderate to fair; this may have been because physical and psychosocial status changed rapidly during the test period. All three sets had good homogeneity. In conclusion, the 3-set 4P showed overall acceptable results, but it has to be further modified in different cultures before being considered a fully operational instrument for use in clinical practice. Copyright © 2012 Elsevier Ltd. All rights reserved.

  9. Setting Up a Patient Care Call Center After Potential HCV Exposure.

    PubMed

    Friedman, Candace; Bucholz, Brigette; Anderson, Susan G; Dwyer, Shon A; Aguirre, Josephine

    2016-09-01

    Notify patients of a potential exposure to hepatitis C virus, coordinate testing, and provide follow-up counseling. A team was convened to identify various needs in developing a patient care call center. The areas addressed included the following: location, hours, and duration; telephone accessibility; tracking calls and test results; billing; staffing; notification; and potential issues requiring additional evaluation. Disclosure letters were sent to 1275 patients; 57 letters were not deliverable. There were 245 calls to the helpline from October 25 through November 15. Lessons learned centered on hours of availability, staffing, use of an automated phone system and email communication, tracking results, and billing issues. A successful patient notification and follow-up effort requires a multidisciplinary team, internal and external communication, collection of data over an extended period, and coordination of patient information.

  10. 'Scaling-up is a craft not a science': Catalysing scale-up of health innovations in Ethiopia, India and Nigeria.

    PubMed

    Spicer, Neil; Bhattacharya, Dipankar; Dimka, Ritgak; Fanta, Feleke; Mangham-Jefferies, Lindsay; Schellenberg, Joanna; Tamire-Woldemariam, Addis; Walt, Gill; Wickremasinghe, Deepthi

    2014-11-01

    Donors and other development partners commonly introduce innovative practices and technologies to improve health in low and middle income countries. Yet many innovations that are effective in improving health and survival are slow to be translated into policy and implemented at scale. Understanding the factors influencing scale-up is important. We conducted a qualitative study involving 150 semi-structured interviews with government, development partners, civil society organisations and externally funded implementers, professional associations and academic institutions in 2012/13 to explore scale-up of innovative interventions targeting mothers and newborns in Ethiopia, the Indian state of Uttar Pradesh and the six states of northeast Nigeria, which are settings with high burdens of maternal and neonatal mortality. Interviews were analysed using a common analytic framework developed for cross-country comparison, and themes were coded using NVivo. We found that programme implementers across the three settings require multiple steps to catalyse scale-up. Advocating for government to adopt and finance health innovations requires: designing scalable innovations; embedding scale-up in programme design and allocating time and resources; building implementer capacity to catalyse scale-up; adopting effective approaches to advocacy; presenting strong evidence to support government decision making; involving government in programme design; invoking policy champions and networks; strengthening harmonisation among external programmes; aligning innovations with health systems and priorities. Other steps include: supporting government to develop policies and programmes and strengthening health systems and staff; promoting community uptake by involving media, community leaders, mobilisation teams and role models. We conclude that scale-up has no magic bullet solution - implementers must embrace multiple activities, and require substantial support from donors and governments in

  11. Costs and outcomes of VCT delivery models in the context of scaling up services in Indonesia.

    PubMed

    Siregar, Adiatma Y M; Komarudin, Dindin; Wisaksana, Rudi; van Crevel, Reinout; Baltussen, Rob

    2011-02-01

    To evaluate costs and outcomes of voluntary counselling and testing (VCT) service delivery models in urban Indonesia. We collected primary data on utilization, costs and outcomes of VCT services in a hospital clinic (568 clients), HIV community clinic (28 clients), sexually transmitted infection (STI) community clinic (784 clients) and prison clinic (574 clients) in Bandung, Indonesia, in the period January 2008-April 2009. The hospital clinic diagnosed the highest proportion and absolute number of HIV infections, but with the lowest average CD4 cell count and with the highest associated travelling and waiting time. The prison clinic detected fewer cases, but at an earlier stage, and all enrolled in HIV care. The community clinics detected the smallest number of cases, and only 0-8% enrolled in HIV care. The unit cost per VCT was highest in the hospital clinic (US$74), followed by the STI community clinic (US$65), the HIV community clinic (US$39) and the prison (US$23). We propose a reorientation of the delivery models for VCT and related HIV/AIDS treatment in this setting. We call for the scaling up of community clinics for VCT to improve access, promote earlier detection and to perform (early) treatment activities. This would reduce the burden of the hospital clinic to orient itself towards the treatment of AIDS patients. This is one of very few studies addressing this issue in Asia and the first of its kind in Indonesia, which has a rapidly growing HIV epidemic. The conceptual framework and overall conclusions may be relevant to other low-income settings. © 2010 Blackwell Publishing Ltd.

  12. Multiple data sets and modelling choices in a comparative LCA of disposable beverage cups.

    PubMed

    van der Harst, Eugenie; Potting, José; Kroeze, Carolien

    2014-10-01

    This study used multiple data sets and modelling choices in an environmental life cycle assessment (LCA) to compare typical disposable beverage cups made from polystyrene (PS), polylactic acid (PLA; bioplastic) and paper lined with bioplastic (biopaper). Incineration and recycling were considered as waste processing options, and for the PLA and biopaper cup also composting and anaerobic digestion. Multiple data sets and modelling choices were systematically used to calculate average results and the spread in results for each disposable cup in eleven impact categories. The LCA results of all combinations of data sets and modelling choices consistently identify three processes that dominate the environmental impact: (1) production of the cup's basic material (PS, PLA, biopaper), (2) cup manufacturing, and (3) waste processing. The large spread in results for impact categories strongly overlaps among the cups, however, and therefore does not allow a preference for one type of cup material. Comparison of the individual waste treatment options suggests some cautious preferences. The average waste treatment results indicate that recycling is the preferred option for PLA cups, followed by anaerobic digestion and incineration. Recycling is slightly preferred over incineration for the biopaper cups. There is no preferred waste treatment option for the PS cups. Taking into account the spread in waste treatment results for all cups, however, none of these preferences for waste processing options can be justified. The only exception is composting, which is least preferred for both PLA and biopaper cups. Our study illustrates that using multiple data sets and modelling choices can lead to considerable spread in LCA results. This makes comparing products more complex, but the outcomes more robust. Copyright © 2014 Elsevier B.V. All rights reserved.
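
    The study's core procedure, computing an average result and a spread over every combination of data set and modelling choice, can be sketched as below; the data sets, end-of-life options, and climate-change scores are invented for illustration:

```python
# Invented example: climate-change scores (kg CO2-eq per cup) for each
# combination of data set and end-of-life modelling choice for one cup type.
from itertools import product

datasets = ["dataset_1", "dataset_2"]
choices = ["incineration", "recycling"]
ps_cup = {("dataset_1", "incineration"): 4.1, ("dataset_1", "recycling"): 3.2,
          ("dataset_2", "incineration"): 5.0, ("dataset_2", "recycling"): 3.6}

def summarize(impact):
    """Average result and min-max spread over all combinations."""
    values = [impact[combo] for combo in product(datasets, choices)]
    return sum(values) / len(values), (min(values), max(values))

average, spread = summarize(ps_cup)
# overlapping spreads between cup types are what prevented the study from
# preferring one material over another
```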

  13. Single well thermal tracer test, a new experimental set up for characterizing thermal transport in fractured media

    NASA Astrophysics Data System (ADS)

    de La Bernardie, Jérôme; Bour, Olivier; Guihéneuf, Nicolas; Chatton, Eliot; Labasque, Thierry; Longuevergne, Laurent; Le Lay, Hugo; Koch, Floriant; Gerard, Marie-Françoise; Le Borgne, Tanguy

    2017-04-01

    Thermal transport in fractured media depends on the hydrological properties of fractures and the thermal characteristics of rock. Tracer tests using heat as a tracer can thus be a good alternative for characterizing fractured media for shallow geothermal needs. This study investigates the possibility of implementing a new thermal tracer test set-up, the single well thermal tracer test, to characterize hydraulic and thermal transport properties of fractured crystalline rock. The experimental setup is based on injecting hot water into a fracture isolated by a double straddle packer in the borehole while pumping and monitoring the temperature in a fracture crossing the same borehole at greater elevation. One difficulty comes from the fact that injection and withdrawal are achieved in the same borehole, involving thermal losses along the injection tube that may disturb the heat recovery signal. To localize the heat influx accurately, we implemented Fiber-Optic Distributed Temperature Sensing (FO-DTS), which allows temperature monitoring with high spatial and temporal resolution (29 centimeters and 30 seconds, respectively). Several tests, at different pumping and injection rates, were performed in a crystalline rock aquifer at the experimental site of Ploemeur (H+ observatory network). We show through signal processing how the thermal breakthrough may be extracted from the fiber-optic distributed temperature measurements. In particular, we demonstrate how detailed distributed temperature measurements were useful for identifying different inflows and estimating how much heat was transported and stored within the fracture network. Thermal breakthrough curves of single well thermal tracer tests were then interpreted with a simple analytical model to characterize the hydraulic and thermal properties of the fractured media. We finally discuss the advantages of these tests compared to cross-borehole thermal tracer tests.

  14. Index-based groundwater vulnerability mapping models using hydrogeological settings: A critical evaluation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kumar, Prashant, E-mail: prashantkumar@csio.res.in; Academy of Scientific and Innovative Research—CSIO, Chandigarh 160030; Bansod, Baban K.S.

    2015-02-15

    Groundwater vulnerability maps are useful for decision making in land use planning and water resource management. This paper reviews the various groundwater vulnerability assessment models developed across the world. Each model has been evaluated in terms of its pros and cons and the environmental conditions of its application. The paper further discusses the validation techniques used for the vulnerability maps generated by the various models. Implicit challenges associated with the development of groundwater vulnerability assessment models have also been identified, with scientific consideration given to the parameter relations and their selection. - Highlights: • Various index-based groundwater vulnerability assessment models have been discussed. • A comparative analysis of the models and their applicability in different hydrogeological settings has been discussed. • Research problems of the underlying vulnerability assessment models are also reported in this review paper.
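
    Most index-based models of the kind reviewed here combine parameter ratings through a weighted sum. A minimal sketch of a DRASTIC-type index follows; the weights are the commonly cited standard DRASTIC weights, while the site ratings are fabricated:

```python
# DRASTIC-type weighted-sum index: Depth to water, net Recharge, Aquifer
# media, Soil media, Topography, Impact of vadose zone, hydraulic
# Conductivity. Weights are the standard DRASTIC weights; the ratings
# (1-10) below are fabricated for a hypothetical site.
weights = {"D": 5, "R": 4, "A": 3, "S": 2, "T": 1, "I": 5, "C": 3}
ratings = {"D": 7, "R": 6, "A": 5, "S": 4, "T": 9, "I": 3, "C": 5}

index = sum(weights[p] * ratings[p] for p in weights)
# a higher index means higher intrinsic vulnerability of the mapped cell
```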

  15. Quantized Step-up Model for Evaluation of Internship in Teaching of Prospective Science Teachers.

    ERIC Educational Resources Information Center

    Sindhu, R. S.

    2002-01-01

    Describes the quantized step-up model developed for the evaluation purposes of internship in teaching which is an analogous model of the atomic structure. Assesses prospective teachers' abilities in lesson delivery. (YDS)

  16. Approximation Set of the Interval Set in Pawlak's Space

    PubMed Central

    Wang, Jin; Wang, Guoyin

    2014-01-01

    The interval set is a special set, which describes the uncertainty of an uncertain concept or set Z with its two crisp boundaries, named the upper-bound set and lower-bound set. In this paper, the concept of similarity degree between two interval sets is defined first, and then the similarity degrees between an interval set and its two approximations (i.e., upper approximation set R¯(Z) and lower approximation set R_(Z)) are presented, respectively. The disadvantages of using the upper-approximation set R¯(Z) or the lower-approximation set R_(Z) as the approximation set of the uncertain set (uncertain concept) Z are analyzed, and a new method for finding a better approximation set of the interval set Z is proposed. The conclusion that the approximation set R0.5(Z) is an optimal approximation set of the interval set Z is drawn and proved successfully. The change rules of R0.5(Z) under different binary relations are analyzed in detail. Finally, a kind of crisp approximation set of the interval set Z is constructed. We hope this research work will promote the development of both the interval set model and granular computing theory. PMID:25177721
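
    The approximation sets discussed above can be illustrated on a toy universe. Two assumptions are flagged in the code: the similarity degree shown is a Jaccard-style stand-in for the paper's definition, and R0.5(Z) is read as the union of equivalence classes at least half contained in Z.

```python
# Sketch over a toy universe: a partition into equivalence classes and a
# target set Z. similarity() is a Jaccard-style stand-in for the paper's
# similarity degree; r_half() reads R0.5(Z) as the union of classes at
# least half contained in Z (an assumption about the notation).
def lower(partition, Z):
    out = set()
    for c in partition:
        if c <= Z:          # class entirely inside Z
            out |= c
    return out

def upper(partition, Z):
    out = set()
    for c in partition:
        if c & Z:           # class meets Z
            out |= c
    return out

def r_half(partition, Z):
    out = set()
    for c in partition:
        if len(c & Z) / len(c) >= 0.5:
            out |= c
    return out

def similarity(X, Y):
    return len(X & Y) / len(X | Y) if X | Y else 1.0

partition = [{1, 2}, {3, 4}, {5, 6}]
Z = {1, 2, 3}
```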

  17. MUTILS - a set of efficient modeling tools for multi-core CPUs implemented in MEX

    NASA Astrophysics Data System (ADS)

    Krotkiewski, Marcin; Dabrowski, Marcin

    2013-04-01

    The need for computational performance is common in scientific applications, and in particular in numerical simulations, where high resolution models require efficient processing of large amounts of data. Especially in the context of geological problems, the need to increase the model resolution to resolve physical and geometrical complexities seems to have no limits. Alas, the performance of new generations of CPUs no longer improves simply through increased clock speeds. Current industrial trends are to increase the number of computational cores. As a result, parallel implementations are required in order to fully utilize the potential of new processors, and to study more complex models. We target simulations on small to medium scale shared memory computers: laptops and desktop PCs with ~8 CPU cores and up to tens of GB of memory to high-end servers with ~50 CPU cores and hundreds of GB of memory. In this setting, MATLAB is often the environment of choice for scientists that want to implement their own models with little effort. It is a useful general purpose mathematical software package, but due to its versatility some of its functionality is not as efficient as it could be. In particular, the challenges of modern multi-core architectures are not fully addressed. We have developed MILAMIN 2 - an efficient FEM modeling environment written in native MATLAB. Amongst others, MILAMIN provides functions to define model geometry, generate and convert structured and unstructured meshes (also through interfaces to external mesh generators), compute element and system matrices, apply boundary conditions, solve the system of linear equations, address non-linear and transient problems, and perform post-processing. MILAMIN strives to combine the ease of code development and the computational efficiency. Where possible, the code is optimized and/or parallelized within the MATLAB framework.
Native MATLAB is augmented with the MUTILS library - a set of MEX functions that

  18. Setting up a clinical trial for a novel disease: a case study of the Doxycycline for the Treatment of Nodding Syndrome Trial – challenges, enablers and lessons learned

    PubMed Central

    Anguzu, Ronald; Akun, Pamela R; Ogwang, Rodney; Shour, Abdul Rahman; Sekibira, Rogers; Ningwa, Albert; Nakamya, Phellister; Abbo, Catherine; Mwaka, Amos D; Opar, Bernard; Idro, Richard

    2018-01-01

    ABSTRACT A large amount of preparation goes into setting up trials. Different challenges and lessons are experienced. Our trial, testing a treatment for nodding syndrome, an acquired neurological disorder of unknown cause affecting thousands of children in Eastern Africa, provides a unique case study. As part of a study to determine the aetiology, understand pathogenesis and develop specific treatment, we set up a clinical trial in a remote district hospital in Uganda. This paper describes our experiences and documents supportive structures (enablers), challenges faced and lessons learned during set-up of the trial. Protocol development started in September 2015 with phased recruitment of a critical study team. The team spent 12 months preparing trial documents, procurement and training on procedures. Potential recruitment sites were pre-visited, and district and local leaders met as key stakeholders. Key enablers were supportive local leadership and investment by the district and Ministry of Health. The main challenges were community fears about nodding syndrome, adverse experiences of the community during previous research and political involvement. Other challenges included the number and delays in protocol approvals and lengthy procurement processes. This hard-to-reach area has frequent power and Internet fluctuations, which may affect cold chains for study samples, communication and data management. These concerns decreased with a pilot community engagement programme. Experiences and lessons learnt can reduce the duration of processes involved in trial-site set-up. A programme of community engagement and local leader involvement may be key to the success of a trial and in reducing community opposition towards participation in research. PMID:29382251

  19. Adaptation of clinical prediction models for application in local settings.

    PubMed

    Kappen, Teus H; Vergouwe, Yvonne; van Klei, Wilton A; van Wolfswinkel, Leo; Kalkman, Cor J; Moons, Karel G M

    2012-01-01

    When planning to use a validated prediction model in new patients, adequate performance is not guaranteed. For example, changes in clinical practice over time or a different case mix than the original validation population may result in inaccurate risk predictions. To demonstrate how clinical information can direct updating a prediction model and development of a strategy for handling missing predictor values in clinical practice. A previously derived and validated prediction model for postoperative nausea and vomiting was updated using a data set of 1847 patients. The update consisted of 1) changing the definition of an existing predictor, 2) reestimating the regression coefficient of a predictor, and 3) adding a new predictor to the model. The updated model was then validated in a new series of 3822 patients. Furthermore, several imputation models were considered to handle real-time missing values, so that possible missing predictor values could be anticipated during actual model use. Differences in clinical practice between our local population and the original derivation population guided the update strategy of the prediction model. The predictive accuracy of the updated model was better (c statistic, 0.68; calibration slope, 1.0) than the original model (c statistic, 0.62; calibration slope, 0.57). Inclusion of logistical variables in the imputation models, besides observed patient characteristics, contributed to a strategy to deal with missing predictor values at the time of risk calculation. Extensive knowledge of local, clinical processes provides crucial information to guide the process of adapting a prediction model to new clinical practices.
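
    One common updating step of the kind described, reestimating a coefficient, can be sketched as logistic recalibration of the old model's linear predictor on local data. The data below are simulated and this is not the authors' exact procedure:

```python
# Simulated data, not the authors': recalibrate an existing model's linear
# predictor lp on local outcomes by refitting the intercept and slope.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
lp = rng.normal(size=500)                         # old model's linear predictor
p_true = 1.0 / (1.0 + np.exp(-(0.4 + 0.6 * lp)))  # invented local "truth"
y = rng.binomial(1, p_true)                       # observed local outcomes

recal = LogisticRegression().fit(lp.reshape(-1, 1), y)
slope = recal.coef_[0, 0]        # calibration slope; 1.0 means no update needed
intercept = recal.intercept_[0]
# a slope well below 1 (as in the study's original model, 0.57) signals
# that the old predictions are too extreme for the local population
```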

  20. An Efficient Soft Set-Based Approach for Conflict Analysis

    PubMed Central

    Sutoyo, Edi; Mungad, Mungad; Hamid, Suraya; Herawan, Tutut

    2016-01-01

    Conflict analysis has been used as an important tool in economic, business, governmental and political disputes, games, management negotiations, military operations, etc. Many mathematical formal models have been proposed to handle conflict situations, and one of the most popular is rough set theory. With its ability to handle vagueness in conflict data sets, rough set theory has been used successfully. However, computational time is still an issue when determining the certainty, coverage, and strength of conflict situations. In this paper, we present an alternative approach to handling conflict situations based on ideas from soft set theory. The novelty of the proposed approach is that, unlike rough set theory, which uses decision rules, it is based on the concept of co-occurrence of parameters in soft set theory. We illustrate the proposed approach by means of a tutorial example of voting analysis in conflict situations. Furthermore, we apply the proposed approach to a real-world dataset of political conflict in the Indonesian Parliament. We show that the proposed approach achieves up to 3.9% lower computational time than rough set theory. PMID:26928627
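
    The co-occurrence idea can be sketched as follows; the soft set below maps each parameter (e.g. a voting issue) to the set of supporting objects, with entirely fabricated names, and the conflict reading at the end is illustrative rather than the paper's exact measure:

```python
# Fabricated soft set: each parameter (issue) maps to the set of objects
# (e.g. members) supporting it; co-occurrence counts shared supporters.
soft_set = {
    "issue_a": {"m1", "m2", "m3"},
    "issue_b": {"m2", "m3", "m4"},
    "issue_c": {"m5"},
}

def cooccurrence(soft, p, q):
    return len(soft[p] & soft[q])

# few or no shared supporters suggests the two issues divide the voters;
# this is an illustrative reading of conflict, not the paper's measure
```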

  1. An Efficient Soft Set-Based Approach for Conflict Analysis.

    PubMed

    Sutoyo, Edi; Mungad, Mungad; Hamid, Suraya; Herawan, Tutut

    2016-01-01

    Conflict analysis has been used as an important tool in economic, business, governmental and political disputes, games, management negotiations, military operations, etc. Many mathematical formal models have been proposed to handle conflict situations, and one of the most popular is rough set theory. With its ability to handle vagueness in conflict data sets, rough set theory has been used successfully. However, computational time is still an issue when determining the certainty, coverage, and strength of conflict situations. In this paper, we present an alternative approach to handling conflict situations based on ideas from soft set theory. The novelty of the proposed approach is that, unlike rough set theory, which uses decision rules, it is based on the concept of co-occurrence of parameters in soft set theory. We illustrate the proposed approach by means of a tutorial example of voting analysis in conflict situations. Furthermore, we apply the proposed approach to a real-world dataset of political conflict in the Indonesian Parliament. We show that the proposed approach achieves up to 3.9% lower computational time than rough set theory.

  2. USE OF ROUGH SETS AND SPECTRAL DATA FOR BUILDING PREDICTIVE MODELS OF REACTION RATE CONSTANTS

    EPA Science Inventory

    A model for predicting the log of the rate constants for alkaline hydrolysis of organic esters has been developed with the use of gas-phase mid-infrared library spectra and a rule-building software system based on the mathematical theory of rough sets. A diverse set of 41 esters ...

  3. Be a Healthy Role Model for Children: 10 Tips for Setting Good Examples

    MedlinePlus

    ... model for children 10 tips for setting good examples You are the most important influence on your ... make mealtime a family time! 1 show by example Eat vegetables, fruits, and whole grains with meals ...

  4. How to Set up Oral Homework: A Case of Limited Technology

    ERIC Educational Resources Information Center

    Mendez, Elba

    2010-01-01

    Homework usually consists of the learners' written account of how they interpreted a task set by the teacher, and is generally defined as out-of-class assignments that are handed in for the instructor to grade. Learners may work individually or with partners to answer simple or challenging linguistic exercises, sketch out a mind map, or develop a…

  5. Sensitivity of the Properties of Ruthenium “Blue Dimer” to Method, Basis Set, and Continuum Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ozkanlar, Abdullah; Clark, Aurora E.

    2012-05-23

    The ruthenium “blue dimer” [(bpy)2RuIIIOH2]2O4+ is best known as the first well-defined molecular catalyst for water oxidation. It has been the subject of numerous computational studies, primarily employing density functional theory. However, those studies have been limited in the functionals, basis sets, and continuum models employed. The controversy in the calculated electronic structure and the reaction energetics of this catalyst highlights the necessity of benchmark calculations that explore the role of density functionals, basis sets, and continuum models upon the essential features of blue-dimer reactivity. In this paper, we report Kohn-Sham complete basis set (KS-CBS) limit extrapolations of the electronic structure of the “blue dimer” using GGA (BPW91 and BP86), hybrid-GGA (B3LYP), and meta-GGA (M06-L) density functionals. The dependence of solvation free energy corrections on the different cavity types (UFF, UA0, UAHF, UAKS, Bondi, and Pauling) within the polarizable and conductor-like polarizable continuum models has also been investigated. The most common basis sets of double-zeta quality are shown to yield results close to the KS-CBS limit; however, large variations are observed in the reaction energetics as a function of the density functional and continuum cavity model employed.
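
    For context, a widely used two-point CBS extrapolation has the Helgaker-style X^-3 form; the paper's KS-CBS scheme is not detailed here and may differ, and the energies below are fabricated:

```python
# Two-point extrapolation of the Helgaker X**-3 type, commonly applied to
# correlation energies; shown only for context, since the paper's KS-CBS
# procedure may differ. x and y are basis-set cardinal numbers (e.g. 3
# for triple-zeta, 4 for quadruple-zeta); the energies are fabricated.
def cbs_two_point(e_x, x, e_y, y):
    return (x**3 * e_x - y**3 * e_y) / (x**3 - y**3)

e_tz = -1.17230   # invented triple-zeta energy (hartree)
e_qz = -1.17370   # invented quadruple-zeta energy (hartree)
e_cbs = cbs_two_point(e_qz, 4, e_tz, 3)
# the extrapolated energy lies below the largest finite-basis value
```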

  6. [Multiple sclerosis epidemiological situation update: pertinence and set-up of a population based registry of new cases in Catalonia].

    PubMed

    Otero, S; Batlle, J; Bonaventura, I; Brieva, Ll; Bufill, E; Cano, A; Carmona, O; Escartín, A; Marco, M; Moral, E; Munteis, E; Nos, C; Pericot, I; Perkal, H; Ramió-Torrentà, Ll; Ramo-Tello, C; Saiz, A; Sastre-Garriga, J; Tintoré, M; Vaqué, J; Montalban, X

    2010-05-16

    The first epidemiological studies on multiple sclerosis (MS) around the world pictured a north-south latitudinal gradient that led to the first genetic and environmental pathogenic hypotheses. MS incidence seems to have been increasing during the past 20 years based on recent data from prospective studies performed in Europe, America and Asia. This phenomenon could be explained by better case ascertainment as well as a change in causal factors. The few prospective studies in our area, together with the increase in the disease in other regions, justify an epidemiological MS project in order to describe the incidence and temporal trends of MS. A prospective multicenter MS registry has been established according to the current requirements of an epidemiological surveillance system. Case definition is based on fulfillment of the McDonald diagnostic criteria. The registry setting is the geographical area of Catalonia (northeastern Spain), using a wide network of hospitals specialized in MS management. Recent epidemiological studies have described an increase in MS incidence. In order to contrast this finding in our area, we consider it appropriate to set up a population based registry.

  7. Exhaustively characterizing feasible logic models of a signaling network using Answer Set Programming.

    PubMed

    Guziolowski, Carito; Videla, Santiago; Eduati, Federica; Thiele, Sven; Cokelaer, Thomas; Siegel, Anne; Saez-Rodriguez, Julio

    2013-09-15

    Logic modeling is a useful tool to study signal transduction across multiple pathways. Logic models can be generated by training a network containing the prior knowledge to phospho-proteomics data. The training can be performed using stochastic optimization procedures, but these are unable to guarantee a global optimum or to report the complete family of feasible models. This, however, is essential to provide precise insight into the mechanisms underlying signal transduction and to generate reliable predictions. We propose the use of Answer Set Programming to explore exhaustively the space of feasible logic models. Toward this end, we have developed caspo, an open-source Python package that provides a powerful platform to learn and characterize logic models by leveraging the rich modeling language and solving technologies of Answer Set Programming. We illustrate the usefulness of caspo by revisiting a model of pro-growth and inflammatory pathways in liver cells. We show that, if experimental error is taken into account, there are thousands (11 700) of models compatible with the data. Despite the large number, we can extract structural features from the models, such as links that are always (or never) present or modules that appear in a mutually exclusive fashion. To further characterize this family of models, we investigate the input-output behavior of the models. We find 91 behaviors across the 11 700 models and we suggest new experiments to discriminate among them. Our results underscore the importance of characterizing in a global and exhaustive manner the family of feasible models, with important implications for experimental design. caspo is freely available for download (license GPLv3) and as a web service at http://caspo.genouest.org/. Supplementary materials are available at Bioinformatics online. santiago.videla@irisa.fr.
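
    Setting the ASP machinery aside, the underlying idea of exhaustively characterizing feasible models, keeping every model consistent with the data rather than a single optimum, can be sketched by brute force over a tiny, invented model library:

```python
# Toy stand-in for exhaustive enumeration (the paper uses Answer Set
# Programming; this brute force only illustrates the concept). The model
# library and the "experimental" observations are invented.
library = {
    "AND": lambda a, b: int(a and b),
    "OR":  lambda a, b: int(a or b),
    "A_ONLY": lambda a, b: a,
    "B_ONLY": lambda a, b: b,
}
# (input_a, input_b) -> observed output
observations = [((1, 0), 1), ((0, 1), 1), ((0, 0), 0)]

feasible = [name for name, gate in library.items()
            if all(gate(a, b) == out for (a, b), out in observations)]
# every model consistent with the data is kept, not just one optimum
```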

  8. Exhaustively characterizing feasible logic models of a signaling network using Answer Set Programming

    PubMed Central

    Guziolowski, Carito; Videla, Santiago; Eduati, Federica; Thiele, Sven; Cokelaer, Thomas; Siegel, Anne; Saez-Rodriguez, Julio

    2013-01-01

    Motivation: Logic modeling is a useful tool to study signal transduction across multiple pathways. Logic models can be generated by training a network containing the prior knowledge to phospho-proteomics data. The training can be performed using stochastic optimization procedures, but these are unable to guarantee a global optimum or to report the complete family of feasible models. This, however, is essential to provide precise insight into the mechanisms underlying signal transduction and to generate reliable predictions. Results: We propose the use of Answer Set Programming to explore exhaustively the space of feasible logic models. Toward this end, we have developed caspo, an open-source Python package that provides a powerful platform to learn and characterize logic models by leveraging the rich modeling language and solving technologies of Answer Set Programming. We illustrate the usefulness of caspo by revisiting a model of pro-growth and inflammatory pathways in liver cells. We show that, if experimental error is taken into account, there are thousands (11 700) of models compatible with the data. Despite the large number, we can extract structural features from the models, such as links that are always (or never) present or modules that appear in a mutually exclusive fashion. To further characterize this family of models, we investigate the input–output behavior of the models. We find 91 behaviors across the 11 700 models and we suggest new experiments to discriminate among them. Our results underscore the importance of characterizing in a global and exhaustive manner the family of feasible models, with important implications for experimental design. Availability: caspo is freely available for download (license GPLv3) and as a web service at http://caspo.genouest.org/. Supplementary information: Supplementary materials are available at Bioinformatics online. Contact: santiago.videla@irisa.fr PMID:23853063

  9. Musculoskeletal Simulation Model Generation from MRI Data Sets and Motion Capture Data

    NASA Astrophysics Data System (ADS)

    Schmid, Jérôme; Sandholm, Anders; Chung, François; Thalmann, Daniel; Delingette, Hervé; Magnenat-Thalmann, Nadia

    Today computer models and computer simulations of the musculoskeletal system are widely used to study the mechanisms behind human gait and its disorders. The common way of creating musculoskeletal models is to use a generic musculoskeletal model based on data derived from anatomical and biomechanical studies of cadaverous specimens. To adapt this generic model to a specific subject, the usual approach is to scale it. This scaling has been reported to introduce several errors because it does not always account for subject-specific anatomical differences. As a result, a novel semi-automatic workflow is proposed that creates subject-specific musculoskeletal models from magnetic resonance imaging (MRI) data sets and motion capture data. Based on subject-specific medical data and a model-based automatic segmentation approach, an accurate modeling of the anatomy can be produced while avoiding the scaling operation. This anatomical model coupled with motion capture data, joint kinematics information, and muscle-tendon actuators is finally used to create a subject-specific musculoskeletal model.

  10. Responsive Urban Models by Processing Sets of Heterogeneous Data

    NASA Astrophysics Data System (ADS)

    Calvano, M.; Casale, A.; Ippoliti, E.; Guadagnoli, F.

    2018-05-01

    This paper presents some steps in experimentation, carried out following the series of earthquakes that affected a vast area of central Italy starting on 24 August 2016, aimed at describing urban spaces. More specifically, these spaces pertain to historical centres of limited size and case studies that can be called "problematic" (due to complex morphological and settlement conditions, because they are difficult to access, or because they have been affected by calamitous events, etc.). The main objectives were to verify the use of sets of heterogeneous data that are already largely available, to define a workflow, and to develop procedures that would allow some of the steps to be automated as much as possible. The most general goal was to use the experimentation to define a methodology for approaching the problem, aimed at developing responsive descriptive models of the urban space, that is, morphological and computer-based models capable of being modified in relation to the constantly updated flow of input data.

  11. Partition-based acquisition model for speed up navigated beta-probe surface imaging

    NASA Astrophysics Data System (ADS)

    Monge, Frédéric; Shakir, Dzhoshkun I.; Navab, Nassir; Jannin, Pierre

    2016-03-01

    Although gross total resection in low-grade glioma surgery leads to a better patient outcome, the in-vivo control of resection borders remains challenging. For this purpose, navigated beta-probe systems combined with 18F-based radiotracers, relying on estimation of the activity distribution at the surface, have been proposed to generate reconstructed images. Their clinical relevance has been outlined by early studies, where intraoperative functional information is leveraged despite the low spatial resolution of the reconstruction. To improve reconstruction quality, multiple acquisition models have been proposed. They involve the definition of an attenuation matrix modeling the radiation detection physics. Yet they require high computational power for efficient intraoperative use. To address this problem, we propose a new acquisition model, called the Partition Model (PM), building on an existing model in which the coefficients of the matrix are taken from a look-up table (LUT). Our model is based on dividing the LUT into averaged homogeneous values for assigning attenuation coefficients. We validated our model using in vitro datasets in which tumors and peri-tumoral tissues were simulated. We compared our acquisition model with the off-the-shelf LUT model and the raw method. The acquisition models outperformed the raw method in terms of tumor contrast (7.97:1 mean T:B), but at a computational cost that hinders real-time use. Both acquisition models reached the same detection performance relative to the references (0.8 mean AUC and 0.77 mean NCC), where PM slightly improves the mean tumor contrast (up to 10.1:1 vs. 9.9:1 with the LUT model) and, more importantly, reduces the mean computation time by 7.5%. Our model gives a faster solution for intraoperative use of a navigated beta-probe surface imaging system, with improved image quality.
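    The core step of such a Partition Model, collapsing a fine-grained attenuation look-up table into a few averaged homogeneous values, can be sketched as below. This is a hypothetical illustration, not the authors' implementation; the function name `partition_lut` and the equal-width value binning are assumptions.

```python
import numpy as np

def partition_lut(lut, n_bins):
    """Collapse a fine-grained attenuation LUT into n_bins averaged,
    homogeneous partitions (hypothetical sketch of the PM idea)."""
    edges = np.linspace(lut.min(), lut.max(), n_bins + 1)
    # Assign each coefficient to a value bin (clip so the maximum value
    # lands in the last bin rather than overflowing).
    idx = np.clip(np.digitize(lut, edges) - 1, 0, n_bins - 1)
    # Replace every coefficient by the mean of its bin.
    means = np.array([lut[idx == b].mean() if np.any(idx == b) else 0.0
                      for b in range(n_bins)])
    return means[idx]

lut = np.array([0.01, 0.02, 0.03, 0.90, 0.95, 1.00])
coarse = partition_lut(lut, 2)  # two homogeneous partitions
```

    Reconstruction then only has to index a much smaller set of averaged coefficients, which is where a computation-time saving would come from.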

  12. Teacher-Led Professional Development: A Proposal for a Bottom-up Structure Approach

    ERIC Educational Resources Information Center

    Macias, Angela

    2017-01-01

    This article uses current research recommendations for teacher-led professional development as well as qualitative data from a set of grassroots conferences to propose a new model for bottom-up teacher-led professional development. This article argues that by providing a neutral space and recruiting expertise of local experts, a public sphere can…

  13. Impact of CAMEX-4 Data Sets for Hurricane Forecasts using a Global Model

    NASA Technical Reports Server (NTRS)

    Kamineni, Rupa; Krishnamurti, T. N.; Pattnaik, S.; Browell, Edward V.; Ismail, Syed; Ferrare, Richard A.

    2005-01-01

    This study explores the impact on hurricane data assimilation and forecasts from the use of dropsondes and remotely sensed moisture profiles from the airborne Lidar Atmospheric Sensing Experiment (LASE) system. We show that the use of these additional data sets, beyond those from the conventional World Weather Watch, has a positive impact on hurricane predictions. The forecast tracks and intensity from the experiments show a marked improvement compared to the control experiment, in which such data sets were excluded. A study of the moisture budget in these hurricanes showed enhanced evaporation and precipitation over the storm area. As a result, these data sets made a large impact on the estimates of mass convergence and moisture fluxes, which were much smaller in the control runs. Overall, this study points to the importance of high-vertical-resolution humidity data sets for improved model results. We note that the forecast impact from the moisture profiling data sets for some of the storms is even larger than the impact from the use of dropwindsonde-based winds.

  14. Mathematical modeling of vibration processes in reinforced concrete structures for setting up crack initiation monitoring

    NASA Astrophysics Data System (ADS)

    Bykov, A. A.; Matveenko, B. P.; Serovaev, G. S.; Shardakov, I. N.; Shestakov, A. P.

    2015-03-01

    The contemporary construction industry is based on the use of reinforced concrete structures, but emergency situations resulting in fracture can arise during their service. In the majority of cases, reinforced concrete fracture proceeds through crack formation and development. As a rule, the appearance of the first cracks does not lead to the complete loss of carrying capacity, but it is a fracture precursor. One method for ensuring the safe operation of building structures is based on crack initiation monitoring. A vibration method for the monitoring of reinforced concrete structures is justified in this paper. The example of a reinforced concrete beam is used to consider all stages related to the analysis of the behavior of natural frequencies during the development of a crack-shaped defect, and the use of the obtained numerical results in the vibration test method. The efficiency of the method is illustrated by modeling results for the physical part of the method, namely the analysis of the natural frequency evolution in response to an impact action during crack development.

  15. Teaching and Learning Structural Geology Using SketchUp

    NASA Astrophysics Data System (ADS)

    Rey, Patrice

    2017-04-01

    The books and maps we read, the posters we pin on our walls, the TV sets and computer monitors we spend hours watching, the white (or black) boards we use to teach, all reduce our world into planar images. As a result, and through years of oblivious practice, our brain is conditioned to understand the world in two dimensions (2D) only. As structural geologists, we know that the most challenging aspect of teaching and learning structural geology is that we need to be able to mentally manipulate 2D and three-dimensional (3D) objects. Although anyone can learn through practice the art of spatial visualisation, the fact remains that the initial stages of learning structural geology are for many students very challenging, as we naively use 2D images to teach 3D concepts. While interactive 3D holography is not far away, some inexpensive tools already exist allowing us to generate interactive computer images, the free rotation, scaling and manipulation of which can help students to quickly grasp the geometry and internal architecture of 3D objects. Recently, I have experimented with SketchUp (works on Mac and Windows). SketchUp was initially released in 2000 by @Last Software, as a 3D modelling tool for architects, designers and filmmakers. It was acquired by Google in 2006 to further the development of GoogleEarth. Google released SketchUp for free, and provided a portal named 3D Warehouse for users to share their models. Google sold SketchUp to Trimble Navigation in 2012, which added Extension Warehouse for users to distribute add-ons. SketchUp models can be exported in a number of formats including .dae (digital asset exchange) useful to embed interactive 3D models into iBooks and html5 documents, and .kmz (keyhole markup language zipped) to embed interactive 3D models and cross-sections into GoogleEarth. SketchUp models can be exported into 3D pdf through the add-on SimLab, and .stl for 3D printing through the add-on SketchUp STL. A free licence is available for

  16. A Model for Teaching Rational Behavior Therapy in a Public School Setting.

    ERIC Educational Resources Information Center

    Patton, Patricia L.

    A training model for the use of rational behavior therapy (RBT) with emotionally disturbed adolescents in a school setting is presented, including a structured, didactic format consisting of five basic RBT training techniques. The training sessions, lasting 10 weeks each, are described. Also presented is the organization for the actual classroom…

  17. The role of empathy and emotional intelligence in nurses' communication attitudes using regression models and fuzzy-set qualitative comparative analysis models.

    PubMed

    Giménez-Espert, María Del Carmen; Prado-Gascó, Vicente Javier

    2018-03-01

    To analyse the link between empathy and emotional intelligence as predictors of nurses' attitudes towards communication, while comparing the contribution of emotional aspects and attitudinal elements to potential behaviour. Nurses' attitudes towards communication, empathy and emotional intelligence are key skills for nurses involved in patient care. There are currently no studies analysing this link, and its investigation is needed because attitudes may influence communication behaviours. Correlational study. To attain this goal, self-reported instruments (the Attitudes Towards Communication of Nurses scale, the Trait Emotional Meta-Mood Scale and the Jefferson Scale of Nursing Empathy) were collected from 460 nurses between September 2015-February 2016. Two different analytical methodologies were used: traditional regression models and fuzzy-set qualitative comparative analysis models. The results of the regression model suggest that the cognitive dimension of attitude is a significant and positive predictor of the behavioural dimension. The perspective-taking dimension of empathy and the emotional-clarity dimension of emotional intelligence were significant positive predictors of the dimensions of attitudes towards communication, except for the affective dimension (for which the association was negative). The results of the fuzzy-set qualitative comparative analysis models confirm that the combination of high levels of the cognitive dimension of attitudes, perspective-taking and emotional clarity explained high levels of the behavioural dimension of attitude. Empathy and emotional intelligence are predictors of nurses' attitudes towards communication, and the cognitive dimension of attitude is a good predictor of the behavioural dimension of attitudes towards communication of nurses in both regression models and fuzzy-set qualitative comparative analysis. 
In general, the fuzzy-set qualitative comparative analysis models appear
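    The fuzzy-set sufficiency analysis used here can be illustrated with Ragin's standard consistency score, which measures how far a (combined) condition is a subset of the outcome. The membership scores below are invented for illustration; only the formula itself is standard fsQCA.

```python
def consistency(x, y):
    """Ragin's consistency for 'X is sufficient for Y' on fuzzy-set
    membership scores in [0, 1]: sum_i min(x_i, y_i) / sum_i x_i."""
    return sum(min(a, b) for a, b in zip(x, y)) / sum(x)

# x: membership in the combined condition (fuzzy AND = minimum of the
# cognitive-attitude, perspective-taking and emotional-clarity scores);
# y: membership in the behavioural dimension. Values are hypothetical.
x = [0.9, 0.8, 0.7, 0.2]
y = [1.0, 0.9, 0.6, 0.3]
score = consistency(x, y)
```

    Scores close to 1 support a sufficiency claim of the kind reported in the abstract.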

  18. An intelligent knowledge mining model for kidney cancer using rough set theory.

    PubMed

    Durai, M A Saleem; Acharjya, D P; Kannan, A; Iyengar, N Ch Sriman Narayana

    2012-01-01

    Medical diagnosis processes vary in the degree to which they attempt to deal with different complicating aspects of diagnosis, such as the relative importance of symptoms, varied symptom patterns and the relations between diseases themselves. The rough set approach has two major advantages over other methods. First, it can handle different types of data, such as categorical and numerical data. Second, it makes no assumptions such as a probability distribution function in stochastic modeling or a membership grade function in fuzzy set theory. It involves pattern recognition through logical computational rules rather than approximating them through smooth mathematical functional forms. In this paper we use rough set theory as a data mining tool to derive useful patterns and rules for kidney cancer diagnosis. In particular, historical data from twenty-five research hospitals and medical colleges is used for validation, and the results show the practical viability of the proposed approach.
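    The rough-set machinery behind such rule derivation rests on lower and upper approximations of a concept with respect to an indiscernibility partition: rules are certain on the lower approximation and only possible on the boundary. A minimal sketch with an invented partition and concept (not the paper's clinical data):

```python
def approximations(partition, concept):
    """Lower/upper rough-set approximations of a concept (a set of
    objects) w.r.t. an indiscernibility partition (list of blocks)."""
    lower, upper = set(), set()
    for block in partition:
        if block <= concept:
            lower |= block   # block lies certainly inside the concept
        if block & concept:
            upper |= block   # block possibly overlaps the concept
    return lower, upper

# Objects 1-5 grouped into blocks of mutually indistinguishable
# symptom patterns (hypothetical data).
blocks = [{1, 2}, {3, 4}, {5}]
concept = {1, 2, 3}          # objects with a confirmed diagnosis
lo, up = approximations(blocks, concept)
```

    The boundary region `up - lo` contains the objects whose symptom pattern cannot decide the diagnosis, which is exactly where rule induction flags uncertainty.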

  19. The effects of climate downscaling technique and observational data set on modeled ecological responses

    Treesearch

    Afshin Pourmokhtarian; Charles T. Driscoll; John L. Campbell; Katharine Hayhoe; Anne M. K. Stoner

    2016-01-01

    Assessments of future climate change impacts on ecosystems typically rely on multiple climate model projections, but often utilize only one downscaling approach trained on one set of observations. Here, we explore the extent to which modeled biogeochemical responses to changing climate are affected by the selection of the climate downscaling method and training...

  20. Sample size and classification error for Bayesian change-point models with unlabelled sub-groups and incomplete follow-up.

    PubMed

    White, Simon R; Muniz-Terrera, Graciela; Matthews, Fiona E

    2018-05-01

    Many medical (and ecological) processes involve a change of shape, whereby one trajectory changes into another at a specific time point. There has been little investigation into the study design needed to investigate these models. We consider the class of fixed-effect change-point models with an underlying shape comprising two joined linear segments, also known as broken-stick models. We extend this model to include two sub-groups with different trajectories at the change-point, a change and a no-change class, and also include a missingness model to account for individuals with incomplete follow-up. Through a simulation study, we consider the relationship of sample size to the estimates of the underlying shape, the existence of a change-point, and the classification error of sub-group labels. We use a Bayesian framework to account for the missing labels, and the analysis of each simulation is performed using standard Markov chain Monte Carlo techniques. Our simulation study is inspired by cognitive decline as measured by the Mini-Mental State Examination, where our extended model is appropriate due to the commonly observed mixture of individuals within studies who do or do not exhibit accelerated decline. We find that even for studies of modest size (n = 500, with 50 individuals observed past the change-point) in the fixed-effect setting, a change-point can be detected and reliably estimated across a range of observation errors.
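    The underlying broken-stick shape is simple enough to state directly: an intercept, a pre-change slope, and an extra slope that switches on at the change-point. The coefficient values below are invented to mimic MMSE-like decline and are not taken from the paper.

```python
def broken_stick(t, b0, b1, b2, tc):
    """Two-segment ('broken-stick') trajectory: intercept b0, slope b1
    before the change-point tc, slope b1 + b2 after it."""
    return b0 + b1 * t + b2 * max(0.0, t - tc)

# Hypothetical decline: -0.1 points/year of normal drift, plus an extra
# -1.5 points/year of accelerated decline after a change-point at year 5.
before = broken_stick(4.0, 28.0, -0.1, -1.5, 5.0)
after = broken_stick(7.0, 28.0, -0.1, -1.5, 5.0)
```

    The "no change" class is the special case b2 = 0, which is what makes mixing the two sub-groups in one fixed-effect model straightforward.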

  1. Modeling the effects of diagenesis on carbonate clumped-isotope values in deep- and shallow-water settings

    NASA Astrophysics Data System (ADS)

    Stolper, Daniel A.; Eiler, John M.; Higgins, John A.

    2018-04-01

    The measurement of multiply isotopically substituted ('clumped isotope') carbonate groups provides a way to reconstruct past mineral formation temperatures. However, dissolution-reprecipitation (i.e., recrystallization) reactions, which commonly occur during sedimentary burial, can alter a sample's clumped-isotope composition such that it partially or wholly reflects deeper burial temperatures. Here we derive a quantitative model of diagenesis to explore how diagenesis alters carbonate clumped-isotope values. We apply the model to a new dataset from deep-sea sediments taken from Ocean Drilling Project site 807 in the equatorial Pacific. This dataset is used to ground truth the model. We demonstrate that the use of the model with accompanying carbonate clumped-isotope and carbonate δ18O values provides new constraints on both the diagenetic history of deep-sea settings as well as past equatorial sea-surface temperatures. Specifically, the combination of the diagenetic model and data support previous work that indicates equatorial sea-surface temperatures were warmer in the Paleogene as compared to today. We then explore whether the model is applicable to shallow-water settings commonly preserved in the rock record. Using a previously published dataset from the Bahamas, we demonstrate that the model captures the main trends of the data as a function of burial depth and thus appears applicable to a range of depositional settings.

  2. Modelling of pore coarsening in the high burn-up structure of UO2 fuel

    NASA Astrophysics Data System (ADS)

    Veshchunov, M. S.; Tarasov, V. I.

    2017-05-01

    The model for coalescence of randomly distributed immobile pores owing to their growth and impingement, applied by the authors earlier to the porosity evolution in the high burn-up structure (HBS) at the UO2 fuel pellet periphery (rim zone), was further developed and validated. Predictions of the original model, which considered only binary impingements of growing immobile pores, qualitatively correctly describe the decrease of the pore number density with increasing fractional porosity, but notably underestimate the coalescence rate at the high burn-ups attained in the outermost region of the rim zone. To overcome this discrepancy, the next approximation of the model, taking into consideration triple impingements of growing pores, was developed. The advanced model provides reasonable agreement with experimental data, thus demonstrating the validity of the proposed pore coarsening mechanism in the HBS.

  3. Image reconstructions from super-sampled data sets with resolution modeling in PET imaging.

    PubMed

    Li, Yusheng; Matej, Samuel; Metzler, Scott D

    2014-12-01

    Spatial resolution in positron emission tomography (PET) is still a limiting factor in many imaging applications. To improve the spatial resolution of an existing scanner with fixed crystal sizes, mechanical movements such as scanner wobbling and object shifting have been considered for PET systems. Multiple acquisitions from different positions can provide complementary information and increased spatial sampling. The objective of this paper is to explore an efficient and useful reconstruction framework for reconstructing super-resolution images from super-sampled low-resolution data sets. The authors introduce a super-sampling data acquisition model based on the physical processes, with tomographic, downsampling, and shifting matrices as its building blocks. Based on this model, the authors extend the MLEM and Landweber algorithms to reconstruct images from super-sampled data sets. They also derive a backprojection-filtration-like (BPF-like) method for the super-sampling reconstruction. Furthermore, they explore variant methods for super-sampling reconstruction: the separate super-sampling resolution-modeling reconstruction and reconstruction without downsampling, to further improve image quality at the cost of more computation. The authors use simulated reconstruction of a resolution phantom to evaluate the three types of algorithms with different super-samplings at different count levels. Contrast recovery coefficient (CRC) versus background variability, as an image-quality metric, is calculated at each iteration for all reconstructions. The authors observe that all three algorithms can significantly and consistently achieve increased CRCs at fixed background variability and reduce background artifacts with super-sampled data sets at the same count levels. For the same super-sampled data sets, the MLEM method achieves better image quality than the Landweber method, which in turn achieves better image quality than the BPF-like method. The authors also demonstrate
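    The MLEM variant described here follows the standard multiplicative update x <- x * A^T(y / Ax) / (A^T 1). The dense toy system matrix below merely stands in for the paper's product of tomographic, downsampling and shifting matrices and is purely illustrative.

```python
import numpy as np

def mlem(A, y, n_iter=200):
    """Standard MLEM iteration for a nonnegative linear system y = Ax."""
    x = np.ones(A.shape[1])
    sens = A.T @ np.ones(A.shape[0])           # sensitivity image A^T 1
    for _ in range(n_iter):
        ratio = y / np.maximum(A @ x, 1e-12)   # data / forward projection
        x *= (A.T @ ratio) / np.maximum(sens, 1e-12)
    return x

# Toy 3-measurement, 2-pixel system with noiseless, consistent data.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
x_true = np.array([2.0, 3.0])
x_rec = mlem(A, A @ x_true)
```

    With consistent noiseless data the iterates converge to the true activities; the super-sampling gain comes from stacking additional shifted acquisitions as extra rows of A.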

  4. A multivariate fall risk assessment model for VHA nursing homes using the minimum data set.

    PubMed

    French, Dustin D; Werner, Dennis C; Campbell, Robert R; Powell-Cope, Gail M; Nelson, Audrey L; Rubenstein, Laurence Z; Bulat, Tatjana; Spehar, Andrea M

    2007-02-01

    The purpose of this study was to develop a multivariate fall risk assessment model beyond the current fall Resident Assessment Protocol (RAP) triggers for nursing home residents using the Minimum Data Set (MDS). Retrospective, clustered secondary data analysis. National Veterans Health Administration (VHA) long-term care nursing homes (N = 136). The study population consisted of 6577 national VHA nursing home residents who had an annual assessment during FY 2005, identified from the MDS, as well as an earlier annual or admission assessment within a 1-year look-back period. A dichotomous multivariate model of nursing home residents coded with a fall on selected fall risk characteristics from the MDS, estimated with generalized estimating equations (GEE). There were 17 170 assessments corresponding to 6577 long-term care nursing home residents. The increased odds ratio (OR) of being classified as a faller relative to the omitted "dependent" category of activities of daily living (ADL) ranged from OR = 1.35 for the "limited" ADL category up to OR = 1.57 for "extensive-2" ADL (P < .0001). Unsteady gait more than doubles the odds of being a faller (OR = 2.63, P < .0001). The use of assistive devices such as canes, walkers, or crutches, or the use of wheelchairs, increases the odds of being a faller (OR = 1.17, P < .0005, and OR = 1.19, P < .0002, respectively). Foot problems may also increase the odds of being a faller (OR = 1.26, P < .0016). Alzheimer's disease or other dementias also increase the odds of being classified as a faller (OR = 1.18, P < .0219, and OR = 1.22, P < .0001, respectively). In addition, anger (OR = 1.19, P < .0065); wandering (OR = 1.53, P < .0001); or use of antipsychotic medications (OR = 1.15, P < .0039), antianxiety medications (OR = 1.13, P < .0323), or antidepressant medications (OR = 1.39, P < .0001) was also associated with the odds of being a faller. 
This national study in one of the largest managed healthcare systems in the United States has empirically
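    The reported effect sizes are odds ratios, i.e. exponentiated logistic/GEE coefficients; under additivity on the log-odds scale, ratios for co-occurring risk factors multiply. A small sketch (the additivity assumption is the sketch's, not necessarily the study's):

```python
import math

def odds_ratio(beta):
    """Odds ratio for a one-unit predictor change: OR = exp(beta)."""
    return math.exp(beta)

def combined_or(betas):
    """OR for several risk factors present at once, assuming additivity
    on the log-odds scale: exp(sum of coefficients)."""
    return math.exp(sum(betas))

# An OR of 2.63 for unsteady gait corresponds to a logistic coefficient
# of ln(2.63); combining it with wandering (OR = 1.53) multiplies them.
beta_gait = math.log(2.63)
or_both = combined_or([beta_gait, math.log(1.53)])
```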

  5. School Teams up for SSP Functional Models

    NASA Astrophysics Data System (ADS)

    Pignolet, G.; Lallemand, R.; Celeste, A.; von Muldau, H.

    2002-01-01

    Space Solar Power systems appear increasingly to be one of the major solutions to the upcoming global energy crisis: they collect solar energy in space, where this is easiest, and send it by microwave beam to the surface of the planet, where the need for controlled energy is located. While fully operational systems are still decades away, the need for major development efforts is with us now. Yet, for many decision-makers and for most of the public, SSP often still sounds like science fiction. Six functional demonstration systems, based on the Japanese SPS-2000 concept, have been built as a result of cooperation between France and Japan, and they are currently used extensively, in Japan, Europe and North America, for executive presentations as well as for public exhibitions. There is demand for more models, both for science museums and for use by energy-dedicated groups, and a senior high school in La Reunion, France, has taken up the challenge of making the production of such models an integrated practical school project for pre-college students. In December 2001, the administration and the teachers of the school evaluated the feasibility of the project and eventually took the "go" decision for the school year 2002-2003, when, for educational purposes, a temporary "school business company" will be incorporated with the goal of studying and manufacturing a limited series of professional-quality SSP demonstration models, and of selling them worldwide to institutions and advocacy groups concerned with energy problems and with the environment. The different sections of the school will act as the different services of an integrated business: based on the current existing models, the electronics section will redesign the energy management system and the microwave projector module, while the mechanical section of the school will adapt and re-conceive the whole packaging of the demonstrator. 
The French and foreign language sections will write up a technical manual for

  6. Invariance, Artifact, and the Psychological Setting of Rasch's Model: Comments on Engelhard

    ERIC Educational Resources Information Center

    Michell, Joel

    2008-01-01

    In the following, I confine my comments mainly to the issue of invariance in relation to Rasch's model for dichotomous, ability test items. "It is senseless to seek in the logical process of mathematical elaboration a psychologically significant precision that was not present in the psychological setting of the problem." (Boring, 1920)

  7. PR-Set7 is degraded in a conditional Cul4A transgenic mouse model of lung cancer

    DOE PAGES

    Wang, Yang; Xu, Zhidong; Mao, Jian -Hua; ...

    2015-06-01

    Background and objective. Maintenance of genomic integrity is essential to ensure normal organismal development and to prevent diseases such as cancer. PR-Set7 (also known as Set8) is a cell-cycle-regulated enzyme that catalyses monomethylation of histone 4 at Lys20 (H4K20me1) to promote chromosome condensation and prevent DNA damage. Recent studies show that CRL4CDT2-mediated ubiquitylation of PR-Set7 leads to its degradation during S phase and after DNA damage. This might occur to ensure appropriate changes in chromosome structure during the cell cycle or to preserve genome integrity after DNA damage. Methods. We developed a new model of lung tumor development in mice harboring a conditionally expressed allele of Cul4A, and have thereby used a mouse model to demonstrate for the first time that Cul4A is oncogenic in vivo. With this model, staining of PR-Set7 in the preneoplastic and tumor lesions in AdenoCre-induced mouse lungs was performed, and changes in the protein levels of γ-tubulin and pericentrin were assessed by IHC. Results. The level of PR-Set7 was down-regulated in the preneoplastic and adenocarcinomatous lesions following over-expression of Cul4A. We also identified higher levels of the proteins pericentrin and γ-tubulin in Cul4A mouse lungs induced by AdenoCre. Conclusion. PR-Set7 is a direct target of Cul4A for degradation and is involved in the formation of lung tumors in the conditional Cul4A transgenic mouse model.

  8. PR-Set7 is degraded in a conditional Cul4A transgenic mouse model of lung cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Yang; Xu, Zhidong; Mao, Jian -Hua

    Background and objective. Maintenance of genomic integrity is essential to ensure normal organismal development and to prevent diseases such as cancer. PR-Set7 (also known as Set8) is a cell-cycle-regulated enzyme that catalyses monomethylation of histone 4 at Lys20 (H4K20me1) to promote chromosome condensation and prevent DNA damage. Recent studies show that CRL4CDT2-mediated ubiquitylation of PR-Set7 leads to its degradation during S phase and after DNA damage. This might occur to ensure appropriate changes in chromosome structure during the cell cycle or to preserve genome integrity after DNA damage. Methods. We developed a new model of lung tumor development in mice harboring a conditionally expressed allele of Cul4A, and have thereby used a mouse model to demonstrate for the first time that Cul4A is oncogenic in vivo. With this model, staining of PR-Set7 in the preneoplastic and tumor lesions in AdenoCre-induced mouse lungs was performed, and changes in the protein levels of γ-tubulin and pericentrin were assessed by IHC. Results. The level of PR-Set7 was down-regulated in the preneoplastic and adenocarcinomatous lesions following over-expression of Cul4A. We also identified higher levels of the proteins pericentrin and γ-tubulin in Cul4A mouse lungs induced by AdenoCre. Conclusion. PR-Set7 is a direct target of Cul4A for degradation and is involved in the formation of lung tumors in the conditional Cul4A transgenic mouse model.

  9. An integrated health sector response to violence against women in Malaysia: lessons for supporting scale up.

    PubMed

    Colombini, Manuela; Mayhew, Susannah H; Ali, Siti Hawa; Shuib, Rashidah; Watts, Charlotte

    2012-07-24

    Malaysia has been at the forefront of the development and scale-up of One-Stop Crisis Centres (OSCC) - an integrated health sector model that provides comprehensive care to women and children experiencing physical, emotional and sexual abuse. This study explored the strengths and challenges faced during the scaling up of the OSCC model in two States in Malaysia in order to identify lessons for supporting successful scale-up. In-depth interviews were conducted with health care providers, policy makers and key informants in 7 hospital facilities. This was complemented by a document analysis of hospital records and protocols. Data were coded and analysed using NVivo 7. The implementation of the OSCC model differed between hospital settings, with practice influenced by organisational systems and constraints. Health providers generally tried to offer care to abused women, but they are not fully supported within their facility due to lack of training, time constraints, limited allocated budget, or lack of a referral system to external support services. Non-specialised hospitals in both States struggled with a scarcity of specialised staff and limited referral options for abused women. Despite these challenges, even in more resource-constrained settings, staff who took the initiative found it possible to adapt and provide some level of OSCC services, such as referring women to local NGOs or community support groups, or training nurses to offer basic counselling. The national implementation of the OSCC provides a potentially important source of support for women experiencing violence. Our findings confirm that pilot interventions for health sector responses to gender-based violence can be scaled up only when there is a sound health infrastructure in place - in other words, a supportive health system. Furthermore, the successful replication of the OSCC model in other similar settings requires that the model - and the system supporting it - needs to be flexible enough to

  10. PIV study of the wake of a model wind turbine transitioning between operating set points

    NASA Astrophysics Data System (ADS)

    Houck, Dan; Cowen, Edwin (Todd)

    2016-11-01

    Wind turbines are ideally operated at their most efficient tip speed ratio for a given wind speed. There is increasing interest, however, in operating turbines at other set points to increase the overall power production of a wind farm. Specifically, Goit and Meyers (2015) used LES to examine a wind farm optimized by unsteady operation of its turbines. In this study, the wake of a model wind turbine is measured in a water channel using PIV. We measure the wake response to a change in operational set point of the model turbine, e.g., from low to high tip speed ratio or vice versa, to examine how it might influence a downwind turbine. A modified torque transducer after Kang et al. (2010) is used to calibrate in situ voltage measurements of the model turbine's generator operating across a resistance to the torque on the generator. Changes in operational set point are made by changing the resistance or the flow speed, which change the rotation rate measured by an encoder. Single camera PIV on vertical planes reveals statistics of the wake at various distances downstream as the turbine transitions from one set point to another. From these measurements, we infer how the unsteady operation of a turbine may affect the performance of a downwind turbine as its incoming flow. National Science Foundation and the Atkinson Center for a Sustainable Future.
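    The set point being varied in such experiments is the tip speed ratio, lambda = omega R / U: changing either the generator resistance (and hence the rotation rate omega) or the flow speed U moves the turbine between set points. The values below are hypothetical, not measurements from the experiment.

```python
def tip_speed_ratio(omega, radius, u):
    """Tip speed ratio: blade-tip speed (omega * R) over flow speed U."""
    return omega * radius / u

# A model turbine of 0.5 m blade radius in a 1 m/s water-channel flow,
# at a slower and a faster rotation set point (hypothetical values).
lam_low = tip_speed_ratio(8.0, 0.5, 1.0)
lam_high = tip_speed_ratio(14.0, 0.5, 1.0)
```

    Halving the generator resistance to double omega, or halving the flow speed at fixed omega, has the same effect on lambda, which is why either knob can drive the transition between set points.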

  11. Sensitivity study of the UHI in the city of Szeged (Hungary) to different offline simulation set-ups using SURFEX/TEB

    NASA Astrophysics Data System (ADS)

    Zsebeházi, Gabriella; Hamdi, Rafiq; Szépszó, Gabriella

    2015-04-01

    Urbanised areas modify the local climate through the physical properties of urban surfaces and their morphology. The urban effect on local climate and regional climate change interact, resulting in more serious climate change impacts (e.g., more heatwave events) over cities. The majority of people now live in cities and are thus affected by these enhanced changes; targeted adaptation and mitigation strategies in cities are therefore of high importance. Regional climate models (RCMs) are suitable tools for estimating the future climate change of an area in detail, although most of them cannot represent urban climate characteristics, because their spatial resolution is too coarse (in general 10-50 km) and they do not use a specific urban parametrization over urbanised areas. To describe the interactions between the urban surface and the atmosphere on a few-km spatial scale, we use the externalised SURFEX land surface scheme, including the TEB urban canopy model, in offline mode (i.e., the interaction is only one-way). The driving atmospheric conditions highly influence the impact results; thus the good quality of these data is particularly essential. The overall aim of our research is to understand the behaviour of the impact model and its interaction with the forcing coming from the atmospheric model, in order to reduce the biases, which can lead to qualified impact studies of climate change over urban areas. As a preliminary test, several short (few-day) 1 km resolution simulations are carried out over a domain covering the Hungarian town of Szeged, which is located in the flat southern part of Hungary. The atmospheric forcing is provided by ALARO (a new version of the limited-area model of the ARPEGE-IFS system running at the Royal Meteorological Institute of Belgium) applied over Hungary. The focal point of our investigations is the ability of SURFEX to simulate the diurnal evolution and spatial pattern of the urban heat island (UHI). 
Different offline simulation set-ups have

  12. Using an Agenda Setting Model to Help Students Develop & Exercise Participatory Skills and Values

    ERIC Educational Resources Information Center

    Perry, Anthony D.; Wilkenfeld, Britt S.

    2006-01-01

    The Agenda Setting Model is a program component that can be used in courses to contribute to students' development as responsible, effective, and informed citizens. This model involves students in finding a unified voice to assert an agenda of issues that they find especially pressing. This is often the only time students experience such a…

  13. Pan-European stochastic flood event set

    NASA Astrophysics Data System (ADS)

    Kadlec, Martin; Pinto, Joaquim G.; He, Yi; Punčochář, Petr; Kelemen, Fanni D.; Manful, Desmond; Palán, Ladislav

    2017-04-01

    Impact Forecasting (IF), the model development center of Aon Benfield, has been developing a large suite of catastrophe flood models on a probabilistic basis for individual countries in Europe. Such natural catastrophes do not follow national boundaries: for example, the major flood in 2016 was responsible for Europe's largest insured loss of USD 3.4bn and affected Germany, France, Belgium, Austria and parts of several other countries. Reflecting such needs, IF initiated the development of a pan-European flood event set which combines cross-country exposures with country-based loss distributions to provide more insightful data to re/insurers. Because observed discharge data are not available across the whole of Europe in sufficient quantity and quality to permit detailed loss evaluation, a top-down approach was chosen. This approach is based on simulating precipitation from a GCM/RCM model chain followed by a calculation of discharges using rainfall-runoff modelling. IF set up this project in close collaboration with the Karlsruhe Institute of Technology (KIT) regarding the precipitation estimates and with the University of East Anglia (UEA) for the rainfall-runoff modelling. KIT's main objective is to provide high-resolution daily historical and stochastic time series of key meteorological variables. A purely dynamical downscaling approach with the regional climate model COSMO-CLM (CCLM) is used to generate the historical time series, using re-analysis data as boundary conditions. The resulting time series are validated against the gridded observational dataset E-OBS, and different bias-correction methods are employed. The generation of the stochastic time series requires transfer functions between large-scale atmospheric variables and regional temperature and precipitation fields. These transfer functions are developed for the historical time series using reanalysis data as predictors and bias-corrected CCLM simulated precipitation and temperature as

  14. Standardized network order sets in rural Ontario: a follow-up report on successes and sustainability.

    PubMed

    Rawn, Andrea; Wilson, Katrina

    2011-01-01

    Unifying, implementing and sustaining a large order set project requires strategic placement of key organizational professionals to provide ongoing user education, communication and support. This article outlines the successful strategies implemented by the Grey Bruce Health Network's Evidence-Based Care Program to reduce length of stay, increase patient satisfaction and increase the use of best practices, resulting in quality outcomes, safer practice and better allocation of resources, by using standardized order sets within a network of 11 hospital sites. Audits conducted in 2007 and again in 2008 revealed that length of stay was reduced by 0.96 in-patient days when order sets were used on admission, and that readmission for the same or a related diagnosis within one month decreased from 5.5% without order sets to 3.5% with order sets.

  15. 2018 update to the HIV-TRePS system: the development of new computational models to predict HIV treatment outcomes, with or without a genotype, with enhanced usability for low-income settings.

    PubMed

    Revell, Andrew D; Wang, Dechao; Perez-Elias, Maria-Jesus; Wood, Robin; Cogill, Dolphina; Tempelman, Hugo; Hamers, Raph L; Reiss, Peter; van Sighem, Ard I; Rehm, Catherine A; Pozniak, Anton; Montaner, Julio S G; Lane, H Clifford; Larder, Brendan A

    2018-06-08

    Optimizing antiretroviral drug combinations on an individual basis can be challenging, particularly in settings with limited access to drugs and genotypic resistance testing. Here we describe our latest computational models to predict treatment responses, with or without a genotype, and compare their predictive accuracy with that of genotyping. Random forest models were trained to predict the probability of virological response to a new therapy introduced following virological failure, using up to 50 000 treatment change episodes (TCEs) without a genotype and 18 000 TCEs including genotypes. Independent data sets were used to evaluate the models. This study tested the effects on model accuracy of relaxing the baseline data timing windows, of a new filter to exclude probable non-adherent cases, and of the addition of maraviroc, tipranavir and elvitegravir to the system. The no-genotype models achieved area under the receiver operator characteristic curve (AUC) values of 0.82 and 0.81 using the standard and relaxed baseline data windows, respectively. The genotype models achieved AUC values of 0.86 with the new non-adherence filter and 0.84 without. Both sets of models were significantly more accurate than genotyping with rules-based interpretation, which achieved AUC values of only 0.55-0.63, and were marginally more accurate than previous models. The models were able to identify alternative regimens that were predicted to be effective for the vast majority of cases in which the new regimen prescribed in the clinic failed. These latest global models predict treatment responses accurately even without a genotype and have the potential to help optimize therapy, particularly in resource-limited settings.
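The AUC values quoted above can be computed directly from predicted probabilities and observed outcomes via the rank-sum (Mann-Whitney) identity. A minimal sketch with toy labels and scores, not the HIV-TRePS data:

```python
def auc(labels, scores):
    """AUC via the Mann-Whitney identity: the probability that a
    randomly chosen positive outscores a randomly chosen negative,
    with ties counted as one half."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy example: two of the three positives outrank every negative.
labels = [1, 1, 1, 0, 0, 0]
scores = [0.9, 0.8, 0.4, 0.7, 0.3, 0.2]
print(auc(labels, scores))  # 8 of 9 positive/negative pairs are ordered correctly
```

An AUC of 0.5 corresponds to random ranking and 1.0 to perfect separation, which is why the reported 0.82-0.86 values substantially outperform the 0.55-0.63 of rules-based genotype interpretation.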

  16. Modeling Mode Choice Behavior Incorporating Household and Individual Sociodemographics and Travel Attributes Based on Rough Sets Theory

    PubMed Central

    Chen, Xuewu; Wei, Ming; Wu, Jingxian; Hou, Xianyao

    2014-01-01

    Most traditional mode choice models are based on the principle of random utility maximization derived from econometric theory. Alternatively, mode choice modeling can be regarded as a pattern recognition problem in which the explanatory variables determine the choices between alternatives. The paper applies the knowledge discovery technique of rough sets theory to model travel mode choices incorporating household and individual sociodemographics and travel information, and to identify the significance of each attribute. The study uses the detailed travel diary survey data of Changxing County, which contain information on both household and individual travel behaviors, for model estimation and evaluation. The knowledge is presented in the form of easily understood IF-THEN statements or rules which reveal how each attribute influences mode choice behavior. These rules are then used to predict travel mode choices from information held about previously unseen individuals and the classification performance is assessed. The rough sets model shows high robustness and good predictive ability. The most significant condition attributes identified to determine travel mode choices are gender, distance, household annual income, and occupation. Comparative evaluation with the MNL model also shows that the rough sets model gives superior prediction accuracy and coverage in travel mode choice modeling. PMID:25431585
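The rough-sets construction behind such IF-THEN rules is the pair of lower and upper approximations of a decision class under the indiscernibility relation induced by the condition attributes. A minimal sketch on a toy travel-diary table (hypothetical data, not the Changxing survey):

```python
from collections import defaultdict

def approximations(table, attrs, decision_attr, decision_value):
    """Rough-set lower and upper approximations of a decision class
    under the indiscernibility relation induced by `attrs`."""
    # Group objects that are indiscernible on the condition attributes.
    blocks = defaultdict(set)
    for name, row in table.items():
        blocks[tuple(row[a] for a in attrs)].add(name)
    target = {n for n, r in table.items() if r[decision_attr] == decision_value}
    lower, upper = set(), set()
    for block in blocks.values():
        if block <= target:
            lower |= block   # certainly in the class (supports a certain rule)
        if block & target:
            upper |= block   # possibly in the class (supports a possible rule)
    return lower, upper

# Toy table: condition attributes gender/distance, decision attribute mode.
table = {
    "p1": {"gender": "M", "distance": "long",  "mode": "car"},
    "p2": {"gender": "M", "distance": "long",  "mode": "bus"},
    "p3": {"gender": "F", "distance": "short", "mode": "walk"},
    "p4": {"gender": "F", "distance": "long",  "mode": "car"},
}
lower, upper = approximations(table, ["gender", "distance"], "mode", "car")
print(sorted(lower), sorted(upper))
```

Here p1 and p2 are indiscernible yet choose different modes, so only p4 supports a certain "IF female AND long THEN car" rule; the boundary region (upper minus lower) measures how uncertain the rules are.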

  17. A rough set-based measurement model study on high-speed railway safety operation.

    PubMed

    Hu, Qizhou; Tan, Minjia; Lu, Huapu; Zhu, Yun

    2018-01-01

    To address the safety problems of high-speed railway operation and management, a new method is urgently needed, constructed on the basis of rough set theory and uncertainty measurement theory. The method should carefully consider every factor of high-speed railway operation and realize measurement indexes for its safe operation. After analyzing in detail the factors that influence high-speed railway safety operation, a rough measurement model is constructed to describe the operation process. Based on the above considerations, this paper organizes the safety influence factors of high-speed railway operation into 16 measurement indexes covering staff, vehicle, equipment and environment. The paper thereby provides another reasonable and effective theoretical method for solving the safety problems of multiple-attribute measurement in high-speed railway operation. Analyzing the operation data of 10 pivotal railway lines in China, this paper uses both the rough set-based measurement model and a value function model (a model for calculating the safety value) to calculate operation safety values. The calculation results show that the safety value curve of the proposed method has smaller error and greater stability than that of the value function method, which verifies its feasibility and effectiveness.

  18. Relationship between source clean up and mass flux of chlorinated solvents in low permeability settings with fractures

    NASA Astrophysics Data System (ADS)

    Bjerg, P. L.; Chambon, J. C.; Christiansen, C. M.; Broholm, M. M.; Binning, P. J.

    2009-04-01

    Groundwater contamination by chlorinated solvents, such as perchloroethylene (PCE), often occurs via leaching from complex sources located in low permeability sediments such as clayey tills overlying aquifers. Clayey tills are mostly fractured, and contamination migrating through the fractures spreads to the low permeability matrix by diffusion. This results in a long term source of contamination due to back-diffusion. Leaching from such sources is further complicated by microbial degradation under anaerobic conditions to sequentially form the daughter products trichloroethylene, cis-dichloroethylene (cis-DCE), vinyl chloride (VC) and ethene. This process can be enhanced by addition of electron donors and/or bioaugmentation and is termed Enhanced Reductive Dechlorination (ERD). This work aims to improve our understanding of the physical, chemical and microbial processes governing source behaviour under natural and enhanced conditions. That understanding is applied to risk assessment, and to determine the relationship and time frames of source clean up and plume response. To meet that aim, field and laboratory observations are coupled to state-of-the-art models incorporating new insights into contaminant behaviour. The long term leaching of chlorinated ethenes from clay aquitards is currently being monitored at a number of Danish sites. The observed data are simulated using a coupled fracture flow and clay matrix diffusion model. Sequential degradation is represented by modified Monod kinetics accounting for competitive inhibition between the chlorinated ethenes. The model is constructed using Comsol Multiphysics, a generic finite-element partial differential equation solver. The model is applied at well-characterised field sites with respect to hydrogeology, fracture network, contaminant distribution and microbial processes (lab and field experiments). At one of the study sites (Sortebrovej), the source areas are situated in a clayey till with fractures and
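Sequential Monod kinetics with competitive inhibition, as used in the degradation model described above, can be sketched with a simple explicit-Euler integration of the PCE -> TCE -> cis-DCE -> VC -> ethene chain. All rate constants and half-saturation values below are illustrative placeholders, not parameters from the Sortebrovej site model:

```python
def euler_step(conc, kmax, Ks, dt):
    """One explicit-Euler step of sequential dechlorination with
    Monod kinetics and competitive inhibition among the ethenes."""
    chain = ["PCE", "TCE", "cDCE", "VC", "ethene"]
    rates = {}
    for s in chain[:-1]:                      # ethene is a terminal product
        # Competitive inhibition: the other chlorinated ethenes inflate
        # the effective half-saturation constant of species s.
        inhibition = sum(conc[o] / Ks[o] for o in chain[:-1] if o != s)
        rates[s] = kmax[s] * conc[s] / (Ks[s] * (1.0 + inhibition) + conc[s])
    new = dict(conc)
    for i, s in enumerate(chain[:-1]):
        new[s] -= rates[s] * dt               # parent degraded
        new[chain[i + 1]] += rates[s] * dt    # daughter produced (1:1 molar)
    return new

# Illustrative run: 100 uM PCE, identical placeholder kinetics per step.
conc = {"PCE": 100.0, "TCE": 0.0, "cDCE": 0.0, "VC": 0.0, "ethene": 0.0}
kmax = {s: 2.0 for s in ("PCE", "TCE", "cDCE", "VC")}
Ks = {s: 10.0 for s in ("PCE", "TCE", "cDCE", "VC")}
for _ in range(1000):                         # 100 time units at dt = 0.1
    conc = euler_step(conc, kmax, Ks, 0.1)
print({s: round(c, 2) for s, c in conc.items()})
```

Because each transformation step conserves moles, the total inventory stays constant while mass moves down the chain; the competitive-inhibition term is what slows every step while other chlorinated ethenes are still abundant.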

  19. Implementing the Career Domain of the American School Counselor Association's National Model into the Virtual Setting

    ERIC Educational Resources Information Center

    Terry, Laura Robin

    2012-01-01

    The implementation of the American School Counselor Association (ASCA) national model has not been studied in nontraditional settings such as in virtual schools. The purpose of this quantitative research study was to examine the implementation of the career domain of the ASCA national model into the virtual high school setting. Social cognitive…

  20. Consideration Sets and Their Role in Modelling Doctor Recommendations About Contraceptives.

    PubMed

    Fiebig, Denzil G; Viney, Rosalie; Knox, Stephanie; Haas, Marion; Street, Deborah J; Hole, Arne R; Weisberg, Edith; Bateson, Deborah

    2017-01-01

    Decisions about prescribed contraception are typically the result of a consultation between a woman and her doctor. In order to better understand contraceptive choice within this environment, stated preference methods are utilized to ask doctors about what contraceptive options they would discuss with different types of women. The role of doctors is to confine their discussion to a subset of products that best match their patient. This subset of options forms the consideration set from which the ultimate recommendation is made. Given the existence of consideration sets we address the issue of how to model appropriately the ultimate recommendations. The estimated models enable us to characterize doctor recommendations and how they vary with patient attributes and to highlight where recommendations are clear and when they are uncertain. The results also indicate systematic variation in recommendations across different types of doctors, and in particular we observe that some doctors are reluctant to embrace new products and instead recommend those that are more familiar. Such effects are one possible explanation for the relatively low uptake of more cost effective longer acting reversible contraceptives and indicate that further education and training of doctors may be warranted. Copyright © 2015 John Wiley & Sons, Ltd.

  1. Priority setting in health care: trends and models from Scandinavian experiences.

    PubMed

    Hofmann, Bjørn

    2013-08-01

    The Scandinavian welfare states have public health care systems with universal coverage and traditionally low influence of private insurance and private provision. Due to rising costs, elaborate public control of health care, and significant technological development in health care, priority setting came onto the public agenda comparatively early in the Scandinavian countries. The development of health care priority setting has been partly homogeneous and appears to follow certain phases. This can be of broader interest as it may shed light on alternative models and strategies in health care priority setting. Some general trends have been identified: from principles to procedures, from closed to open processes, and from experts to participation. Five general approaches have been recognized: the moral principles and values based approach, the moral principles and economic assessment approach, the procedural approach, the expert based practice defining approach, and the participatory practice defining approach. There are pros and cons to all of these approaches. For the time being the fifth approach appears attractive, but its lack of true participation and the lack of clear success criteria may pose significant challenges in the future.

  2. High-Order Model and Dynamic Filtering for Frame Rate Up-Conversion.

    PubMed

    Bao, Wenbo; Zhang, Xiaoyun; Chen, Li; Ding, Lianghui; Gao, Zhiyong

    2018-08-01

    This paper proposes a novel frame rate up-conversion method through high-order model and dynamic filtering (HOMDF) for video pixels. Unlike the constant brightness and linear motion assumptions in traditional methods, the intensity and position of the video pixels are both modeled with high-order polynomials in terms of time. Then, the key problem of our method is to estimate the polynomial coefficients that represent the pixel's intensity variation, velocity, and acceleration. We propose to solve it with two energy objectives: one minimizes the auto-regressive prediction error of intensity variation by its past samples, and the other minimizes video frame's reconstruction error along the motion trajectory. To efficiently address the optimization problem for these coefficients, we propose the dynamic filtering solution inspired by video's temporal coherence. The optimal estimation of these coefficients is reformulated into a dynamic fusion of the prior estimate from pixel's temporal predecessor and the maximum likelihood estimate from current new observation. Finally, frame rate up-conversion is implemented using motion-compensated interpolation by pixel-wise intensity variation and motion trajectory. Benefiting from the advanced model and dynamic filtering, the interpolated frames have much better visual quality. Extensive experiments on the natural and synthesized videos demonstrate the superiority of HOMDF over the state-of-the-art methods in both subjective and objective comparisons.

  3. A fast bottom-up algorithm for computing the cut sets of noncoherent fault trees

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Corynen, G.C.

    1987-11-01

    An efficient procedure for finding the cut sets of large fault trees has been developed. Designed to address coherent or noncoherent systems, dependent events, shared or common-cause events, the method - called SHORTCUT - is based on a fast algorithm for transforming a noncoherent tree into a quasi-coherent tree (COHERE), and on a new algorithm for reducing cut sets (SUBSET). To assure sufficient clarity and precision, the procedure is discussed in the language of simple sets, which is also developed in this report. Although the new method has not yet been fully implemented on the computer, we report theoretical worst-case estimates of its computational complexity. 12 refs., 10 figs.
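The cut-set reduction step the abstract attributes to SUBSET can be illustrated with the standard absorption rule: any cut set that contains another cut set is redundant and can be dropped, leaving only the minimal cut sets. A minimal sketch of that rule, not the SHORTCUT implementation itself:

```python
def reduce_cut_sets(cut_sets):
    """Return the minimal cut sets: drop any set absorbed by a subset."""
    # Sorting by size guarantees any potential absorber is seen first.
    ordered = sorted((frozenset(cs) for cs in cut_sets), key=len)
    minimal = []
    for cs in ordered:
        # cs is redundant if some already-kept cut set is contained in it.
        if not any(kept <= cs for kept in minimal):
            minimal.append(cs)
    return minimal

# Example: {A} absorbs {A, B}; {C, D} has no subset and is kept.
sets = [{"A", "B"}, {"A"}, {"C", "D"}]
print(sorted(sorted(s) for s in reduce_cut_sets(sets)))
```

The pairwise-subset test makes this quadratic in the number of cut sets, which is exactly why fast reduction algorithms matter for large fault trees.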

  4. Generating extreme weather event sets from very large ensembles of regional climate models

    NASA Astrophysics Data System (ADS)

    Massey, Neil; Guillod, Benoit; Otto, Friederike; Allen, Myles; Jones, Richard; Hall, Jim

    2015-04-01

    Extreme events can have large impacts on societies and are therefore being increasingly studied. In particular, climate change is expected to impact the frequency and intensity of these events. However, a major limitation when investigating extreme weather events is that, by definition, only a few events are present in observations. A way to overcome this issue is to use large ensembles of model simulations. Using the volunteer distributed computing (VDC) infrastructure of weather@home [1], we run a very large number (10,000s) of RCM simulations over the European domain at a resolution of 25 km, with an improved land-surface scheme, nested within a free-running GCM. Using VDC allows many thousands of climate model runs to be computed. Using observations for the GCM boundary forcings we can run historical "hindcast" simulations over the past 100 to 150 years. This allows us, due to the chaotic variability of the atmosphere, to ascertain how likely an extreme event was, given the boundary forcings, and to derive synthetic event sets. The events in these sets did not actually occur in the observed record but could have occurred given the boundary forcings, with an associated probability. The event sets contain time-series of fields of meteorological variables that allow impact modellers to assess the loss the event would incur. Projections of events into the future are achieved by modelling projections of the sea-surface temperature (SST) and sea-ice boundary forcings, by combining the variability of the SST in the observed record with a range of warming signals derived from the varying responses of SSTs in the CMIP5 ensemble to elevated greenhouse gas (GHG) emissions in three RCP scenarios.
Simulating the future with a

  5. Global long-term ozone trends derived from different observed and modelled data sets

    NASA Astrophysics Data System (ADS)

    Coldewey-Egbers, M.; Loyola, D.; Zimmer, W.; van Roozendael, M.; Lerot, C.; Dameris, M.; Garny, H.; Braesicke, P.; Koukouli, M.; Balis, D.

    2012-04-01

    The long-term behaviour of stratospheric ozone amounts during the past three decades is investigated on a global scale using different observed and modelled data sets. Three European satellite sensors GOME/ERS-2, SCIAMACHY/ENVISAT, and GOME-2/METOP are combined and a merged global monthly mean total ozone product has been prepared using an inter-satellite calibration approach. The data set covers the 16-year period from June 1995 to June 2011 and it exhibits excellent long-term stability, which is required for such trend studies. A multiple linear least-squares regression algorithm using different explanatory variables is applied to the time series and statistically significant positive trends are detected in the northern mid-latitudes and subtropics. Global trends are also estimated using a second satellite-based Merged Ozone Data set (MOD) provided by NASA. For a few selected geographical regions, ozone trends are additionally calculated using well-maintained measurements of individual Dobson/Brewer ground-based instruments. A reasonable agreement in the spatial patterns of the trends is found amongst the European satellite, the NASA satellite, and the ground-based observations. Furthermore, two long-term simulations obtained with the chemistry-climate models E39C-A (provided by the German Aerospace Center) and UMUKCA-UCAM (provided by the University of Cambridge) are analysed.
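A multiple linear least-squares regression of the kind described, a linear trend plus an annual harmonic fitted to a monthly total-ozone series, can be sketched as follows. The data are synthetic and the explanatory variables are simplified (a real trend model would add solar, QBO and other proxies); the normal equations are solved by Gaussian elimination:

```python
import math

def ols(X, y):
    """Ordinary least squares via the normal equations, solved with
    Gaussian elimination and partial pivoting. X: list of rows."""
    p = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(p)] for i in range(p)]  # X'X
    b = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(p)]            # X'y
    for c in range(p):
        piv = max(range(c, p), key=lambda r: abs(A[r][c]))
        A[c], A[piv] = A[piv], A[c]
        b[c], b[piv] = b[piv], b[c]
        for r in range(c + 1, p):
            f = A[r][c] / A[c][c]
            for k in range(c, p):
                A[r][k] -= f * A[c][k]
            b[r] -= f * b[c]
    beta = [0.0] * p
    for c in range(p - 1, -1, -1):
        beta[c] = (b[c] - sum(A[c][k] * beta[k] for k in range(c + 1, p))) / A[c][c]
    return beta

# Synthetic monthly "total ozone" (DU): offset + linear trend + annual cycle.
months = list(range(192))  # 16 years of monthly means
y = [300.0 + 0.05 * m + 10.0 * math.sin(2 * math.pi * m / 12) for m in months]
X = [[1.0, m, math.sin(2 * math.pi * m / 12), math.cos(2 * math.pi * m / 12)]
     for m in months]
beta = ols(X, y)
print([round(v, 4) for v in beta])  # [offset, trend per month, sin, cos]
```

The fitted trend coefficient (here DU per month) is the quantity whose sign and statistical significance the abstract's trend detection is about; the harmonic terms absorb the seasonal cycle so it does not alias into the trend.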

  6. Set-up and calibration of an indoor nozzle-type rainfall simulator for soil erosion studies

    NASA Astrophysics Data System (ADS)

    Lassu, T.; Seeger, M.

    2012-04-01

    Rainfall simulation is one of the most prevalent methods used in soil erosion studies on agricultural land. In-situ simulators have been used to relate soil surface characteristics and management to runoff generation, infiltration and erosion, e.g. the influence of different cultivation systems, and to parameterise erosion models. Laboratory rainfall simulators have been used to determine the impact of soil surface characteristics such as micro-topography, surface roughness, and soil chemistry on infiltration and erosion rates, and to elucidate the processes involved. The purpose of the following study is to demonstrate the set-up and the calibration of a large indoor, nozzle-type rainfall simulator (RS) for soil erosion, surface runoff and rill development studies. This RS is part of the Kraijenhoff van de Leur Laboratory for Water and Sediment Dynamics at Wageningen University. The rainfall simulator consists of a 6 m long and 2.5 m wide plot with a metal lateral frame and one open side. Infiltration can be collected in different segments. The plot can be inclined up to a 15.5° slope. From a height of 3.85 m above the plot, two Lechler 460.788 nozzles sprinkle water onto the surface at constant intensity. A Zehnder HMP 450 pump provides a constant water supply. An automatic pressure switch on the pump keeps the pressure constant during the experiments. The flow rate is controlled for each nozzle by independent valves. Additionally, solenoid valves are mounted at each nozzle to interrupt the water flow. The flow is monitored for each nozzle with flow meters and can be recorded within the computer network. For calibration of the RS we measured the rainfall distribution with 60 gauges equally distributed over the plot during 15 minutes, for each nozzle independently and for a combination of 2 identical nozzles. The rainfall energy was recorded on the same grid by measuring drop size distribution and fall velocity with a laser disdrometer.
We applied 2 different

  7. Mixture modeling of multi-component data sets with application to ion-probe zircon ages

    NASA Astrophysics Data System (ADS)

    Sambridge, M. S.; Compston, W.

    1994-12-01

    A method is presented for detecting multiple components in a population of analytical observations for zircon and other ages. The procedure uses an approach known as mixture modeling to estimate the most likely ages, proportions and number of distinct components in a given data set. Particular attention is paid to estimating errors in the estimated ages and proportions. At each stage of the procedure several alternative numerical approaches are suggested, each having their own advantages in terms of efficiency and accuracy. The methodology is tested on synthetic data sets simulating two or more mixed populations of zircon ages. In this case the true ages and proportions of each population are known and compare well with the results of the new procedure. Two examples are presented of its use with sets of SHRIMP 238U-206Pb zircon ages from Palaeozoic rocks. A published data set for altered zircons from bentonite at Meishucun, South China, previously treated as a single-component population after screening for gross alteration effects, can be resolved into two components by the new procedure and their ages, proportions and standard errors estimated. The older component, at 530 +/- 5 Ma (2 sigma), is our best current estimate for the age of the bentonite. Mixture modeling of a data set for unaltered zircons from a tonalite elsewhere defines the magmatic 238U-206Pb age at high precision (2 sigma +/- 1.5 Ma), but one-quarter of the 41 analyses detect hidden and significantly older cores.
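The mixture-modeling idea can be sketched with a simple expectation-maximization fit of a two-component normal mixture with a shared, known analytical sigma. This illustrates the general approach on synthetic "ages" only; it is not the authors' numerical procedure, and the values below are invented:

```python
import math
import random

def em_two_normals(data, mu1, mu2, sigma, iters=200):
    """EM for a two-component normal mixture with shared, known sigma.
    Returns (mu1, mu2, pi1): the component means and mixing proportion."""
    pi1 = 0.5
    for _ in range(iters):
        # E-step: responsibility of component 1 for each observation.
        r = []
        for x in data:
            p1 = pi1 * math.exp(-0.5 * ((x - mu1) / sigma) ** 2)
            p2 = (1 - pi1) * math.exp(-0.5 * ((x - mu2) / sigma) ** 2)
            r.append(p1 / (p1 + p2))
        # M-step: update means and mixing proportion from responsibilities.
        n1 = sum(r)
        mu1 = sum(ri * x for ri, x in zip(r, data)) / n1
        mu2 = sum((1 - ri) * x for ri, x in zip(r, data)) / (len(data) - n1)
        pi1 = n1 / len(data)
    return mu1, mu2, pi1

# Synthetic "ages": 60% around 530 Ma, 40% around 560 Ma, sigma = 5 Ma.
random.seed(0)
ages = [random.gauss(530, 5) for _ in range(60)] + \
       [random.gauss(560, 5) for _ in range(40)]
m1, m2, p1 = em_two_normals(ages, 520.0, 570.0, 5.0)
print(round(m1, 1), round(m2, 1), round(p1, 2))
```

The recovered means and proportion approximate the values used to generate the data; choosing the *number* of components, as the paper does, additionally requires comparing fits with different component counts.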

  8. Development and Validation of Decision Forest Model for Estrogen Receptor Binding Prediction of Chemicals Using Large Data Sets.

    PubMed

    Ng, Hui Wen; Doughty, Stephen W; Luo, Heng; Ye, Hao; Ge, Weigong; Tong, Weida; Hong, Huixiao

    2015-12-21

    Some chemicals in the environment possess the potential to interact with the endocrine system in the human body. Multiple receptors are involved in the endocrine system; estrogen receptor α (ERα) plays very important roles in endocrine activity and is the most studied receptor. Understanding and predicting estrogenic activity of chemicals facilitates the evaluation of their endocrine activity. Hence, we have developed a decision forest classification model to predict chemical binding to ERα using a large training data set of 3308 chemicals obtained from the U.S. Food and Drug Administration's Estrogenic Activity Database. We tested the model using cross validations and external data sets of 1641 chemicals obtained from the U.S. Environmental Protection Agency's ToxCast project. The model showed good performance in both internal (92% accuracy) and external validations (∼ 70-89% relative balanced accuracies), where the latter involved the validations of the model across different ER pathway-related assays in ToxCast. The important features that contribute to the prediction ability of the model were identified through informative descriptor analysis and were related to current knowledge of ER binding. Prediction confidence analysis revealed that the model had both high prediction confidence and accuracy for most predicted chemicals. The results demonstrated that the model constructed based on the large training data set is more accurate and robust for predicting ER binding of chemicals than the published models that have been developed using much smaller data sets. The model could be useful for the evaluation of ERα-mediated endocrine activity potential of environmental chemicals.

  9. Guidelines on Diabetic Eye Care: The International Council of Ophthalmology Recommendations for Screening, Follow-up, Referral, and Treatment Based on Resource Settings.

    PubMed

    Wong, Tien Y; Sun, Jennifer; Kawasaki, Ryo; Ruamviboonsuk, Paisan; Gupta, Neeru; Lansingh, Van Charles; Maia, Mauricio; Mathenge, Wanjiku; Moreker, Sunil; Muqit, Mahi M K; Resnikoff, Serge; Verdaguer, Juan; Zhao, Peiquan; Ferris, Frederick; Aiello, Lloyd P; Taylor, Hugh R

    2018-05-24

    Diabetes mellitus (DM) is a global epidemic and affects populations in both developing and developed countries, with differing health care and resource levels. Diabetic retinopathy (DR) is a major complication of DM and a leading cause of vision loss in working middle-aged adults. Vision loss from DR can be prevented with broad-level public health strategies, but these need to be tailored to a country's and population's resource setting. Designing DR screening programs, with appropriate and timely referral to facilities with trained eye care professionals, and using cost-effective treatment for vision-threatening levels of DR can prevent vision loss. The International Council of Ophthalmology Guidelines for Diabetic Eye Care 2017 summarize and offer a comprehensive guide for DR screening, referral and follow-up schedules for DR, and appropriate management of vision-threatening DR, including diabetic macular edema (DME) and proliferative DR, for countries with high- and low- or intermediate-resource settings. The guidelines include updated evidence on screening and referral criteria, the minimum requirements for a screening vision and retinal examination, follow-up care, and management of DR and DME, including laser photocoagulation and appropriate use of intravitreal anti-vascular endothelial growth factor inhibitors and, in specific situations, intravitreal corticosteroids. Recommendations for management of DR in patients during pregnancy and with concomitant cataract also are included. The guidelines offer suggestions for monitoring outcomes and indicators of success at a population level. Copyright © 2018 American Academy of Ophthalmology. All rights reserved.

  10. Determining the reliability of a custom built seated stadiometry set-up for measuring spinal height in participants with chronic low back pain.

    PubMed

    Steele, James; Bruce-Low, Stewart; Smith, Dave; Jessop, David; Osborne, Neil

    2016-03-01

    Indirect measurement of disc hydration can be obtained through measures of spinal height using stadiometry. However, specialised stadiometers for this are often custom-built and expensive. Generic wall-mounted stadiometers, by contrast, are common in clinics and laboratories. This study examined the reliability of a custom set-up utilising a wall-mounted stadiometer for measurement of spinal height using custom built wall mounted postural rods. Twelve participants with non-specific chronic low back pain (CLBP; females n = 5, males n = 7) underwent measurement of spinal height on three separate consecutive days at the same time of day, where 10 measurements were taken at 20 s intervals. Comparisons were made using repeated measures analysis of variance for 'trial' and 'gender'. There were no significant effects by trial or interaction effects of trial x gender. Intra-individual absolute standard error of measurement (SEM) was calculated for spinal height using the first of the 10 measures, the average of 10 measures, the total shrinkage, and the rate of shrinkage across the 10 measures examined as the slope of the curve when a linear regression was fitted. SEMs were 3.1 mm, 2.8 mm, 2.6 mm and 0.212, respectively. The absence of significant differences between trials and the reported SEMs suggest this custom set-up for measuring spinal height changes is suitable for use as an outcome measure in either research or clinical practice in participants with CLBP. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.
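One common way to obtain an absolute SEM from repeated measures is the square root of the mean within-participant variance across the repeat days; whether this matches the paper's exact calculation is an assumption, and the spinal-height numbers below are hypothetical:

```python
import math

def sem_within(trials):
    """Absolute standard error of measurement from repeated trials:
    sqrt of the mean within-participant variance (one common definition)."""
    variances = []
    for measures in trials:                      # one list per participant
        m = sum(measures) / len(measures)
        variances.append(sum((x - m) ** 2 for x in measures)
                         / (len(measures) - 1))  # sample variance per person
    return math.sqrt(sum(variances) / len(variances))

# Hypothetical spinal-height measurements (mm) on three days, two participants.
participant_days = [[1800.0, 1803.0, 1799.0], [1765.0, 1766.0, 1764.0]]
print(round(sem_within(participant_days), 2))
```

An SEM expressed in the measurement's own units (mm here), as in the reported 2.6-3.1 mm, lets clinicians judge whether an observed change in spinal height exceeds measurement noise.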

  11. Source term evaluation model for high-level radioactive waste repository with decay chain build-up.

    PubMed

    Chopra, Manish; Sunny, Faby; Oza, R B

    2016-09-18

    A source term model based on a two-component leach flux concept is developed for a high-level radioactive waste repository. The long-lived radionuclides associated with high-level waste may give rise to a build-up of activity because of radioactive decay chains. The ingrowth of progeny is incorporated in the model using the Bateman decay chain build-up equations. The model is applied to different radionuclides present in the high-level radioactive waste which form a part of decay chains (4n to 4n + 3 series), and the activity of the parent and daughter radionuclides leaching out of the waste matrix is estimated. Two cases are considered: one in which only the parent is initially present in the waste, and another in which the daughters are also initially present in the waste matrix. The incorporation of in situ production of daughter radionuclides in the source is important for realistic estimates. It is shown that the inclusion of decay chain build-up is essential to avoid underestimation in the radiological impact assessment of the repository. The model can be a useful tool for evaluating the source term of the radionuclide transport models used for the radiological impact assessment of high-level radioactive waste repositories.
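The Bateman solution referred to above gives the inventory of the n-th member of a linear decay chain (no branching, distinct decay constants) in closed form. A sketch with illustrative decay constants, not the repository model's nuclide data:

```python
import math

def bateman_activity(n0, lambdas, t):
    """Atoms of each chain member at time t for a linear decay chain
    starting with n0 atoms of the parent (Bateman, 1910). lambdas:
    decay constants of members 1..n; assumes they are all distinct."""
    amounts = []
    for n in range(1, len(lambdas) + 1):
        coeff = n0
        for lam in lambdas[: n - 1]:     # product of upstream decay constants
            coeff *= lam
        total = 0.0
        for i in range(n):
            denom = 1.0
            for j in range(n):
                if j != i:
                    denom *= lambdas[j] - lambdas[i]
            total += math.exp(-lambdas[i] * t) / denom
        amounts.append(coeff * total)
    return amounts

# Two-member chain: the parent term reduces to plain exponential decay.
N0 = 1.0e6
parent, daughter = bateman_activity(N0, [0.01, 0.002], 100.0)
print(parent, daughter)
```

A quick consistency check: with a stable daughter (its decay constant set to zero), the parent and daughter inventories must sum to the initial parent inventory at all times, which is the "build-up" the abstract warns against neglecting.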

  12. Review and evaluation of performance measures for survival prediction models in external validation settings.

    PubMed

    Rahman, M Shafiqur; Ambler, Gareth; Choodari-Oskooei, Babak; Omar, Rumana Z

    2017-04-18

    When developing a prediction model for survival data it is essential to validate its performance in external validation settings using appropriate performance measures. Although a number of such measures have been proposed, there is only limited guidance regarding their use in the context of model validation. This paper reviewed and evaluated a wide range of performance measures to provide some guidelines for their use in practice. An extensive simulation study based on two clinical datasets was conducted to investigate the performance of the measures in external validation settings. Measures were selected from categories that assess the overall performance, discrimination and calibration of a survival prediction model. Some of these have been modified to allow their use with validation data, and a case study is provided to describe how these measures can be estimated in practice. The measures were evaluated with respect to their robustness to censoring and ease of interpretation. All measures are implemented, or are straightforward to implement, in statistical software. Most of the performance measures were reasonably robust to moderate levels of censoring. One exception was Harrell's concordance measure, which tended to increase as censoring increased. We recommend that Uno's concordance measure is used to quantify concordance when there are moderate levels of censoring. Alternatively, Gönen and Heller's measure could be considered, especially if censoring is very high, but we suggest that the prediction model is re-calibrated first. We also recommend that Royston's D is routinely reported to assess discrimination, since it has an appealing interpretation. The calibration slope is useful for both internal and external validation settings, and we recommend that it is reported routinely. Our recommendation would be to use any of the predictive accuracy measures and provide the corresponding predictive accuracy curves. In addition, we recommend investigating the characteristics
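Harrell's concordance, the measure flagged above as sensitive to censoring, has a simple pairwise definition that can be sketched directly. This is a minimal illustrative implementation (the quadratic pair loop, with ties in the risk score counted as one half), not the paper's code; in practice a library implementation would be used.

```python
def harrell_c(times, events, risk_scores):
    """Harrell's concordance for right-censored data: the fraction of
    usable pairs in which the subject who fails earlier has the higher
    predicted risk. A pair (i, j) is usable only when i's earlier time
    is an observed event (events[i] == 1), which is why heavy censoring
    biases the estimate."""
    concordant = comparable = 0.0
    n = len(times)
    for i in range(n):
        for j in range(n):
            if events[i] and times[i] < times[j]:
                comparable += 1
                if risk_scores[i] > risk_scores[j]:
                    concordant += 1
                elif risk_scores[i] == risk_scores[j]:
                    concordant += 0.5
    return concordant / comparable
```

Uno's measure, recommended above, differs by reweighting these pairs with the inverse probability of censoring, which removes the dependence on the censoring distribution.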

  13. Parameter Set Cloning Based on Catchment Similarity for Large-scale Hydrologic Modeling

    NASA Astrophysics Data System (ADS)

    Liu, Z.; Kaheil, Y.; McCollum, J.

    2016-12-01

    Parameter calibration is a crucial step to ensure the accuracy of hydrological models. However, streamflow gauges are not available everywhere for calibrating a large-scale hydrologic model globally. Thus, assigning parameters appropriately for regions where the calibration cannot be performed directly has been a challenge for large-scale hydrologic modeling. Here we propose a method to estimate the model parameters in ungauged regions based on the values obtained through calibration in areas where gauge observations are available. This parameter set cloning is performed according to a catchment similarity index, a weighted sum index based on four catchment characteristic attributes: IPCC Climate Zone, Soil Texture, Land Cover, and Topographic Index. The catchments with calibrated parameter values are donors, while the uncalibrated catchments are candidates. Catchment characteristic analyses are first conducted for both donors and candidates. For each attribute, we compute a characteristic distance between donors and candidates. Next, for each candidate, weights are assigned to the four attributes such that higher weights are given to properties that are more directly linked to the dominant hydrologic processes. This ensures that the parameter set cloning emphasizes the dominant hydrologic process in the region where the candidate is located. The catchment similarity index for each donor-candidate pair is then computed as the sum of the weighted distances of the four properties. Finally, parameters are assigned to each candidate from the donor that is "most similar" (i.e. with the shortest weighted distance sum). For validation, we applied the proposed method to catchments where gauge observations are available, and compared simulated streamflows using the parameters cloned from other catchments to the results obtained by calibrating the hydrologic model directly using gauge data.
The comparison shows good agreement between the two sets of simulations.
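The donor-selection step described above reduces to a weighted nearest-neighbour search. The sketch below assumes the four attribute distances have already been normalised to comparable scales; the attribute names and data layout are illustrative, not from the paper.

```python
def most_similar_donor(candidate, donors, weights):
    """Pick the donor catchment with the smallest catchment similarity
    index, i.e. the smallest weighted sum of per-attribute distances.

    `candidate` maps attribute name -> normalised value; `donors` is a
    list of (name, attributes) pairs; `weights` gives higher weight to
    attributes tied to the region's dominant hydrologic process.
    """
    def similarity_index(donor):
        _, attrs = donor
        return sum(w * abs(candidate[a] - attrs[a])
                   for a, w in weights.items())
    best_name, _ = min(donors, key=similarity_index)
    return best_name
```

The calibrated parameter set of the returned donor would then be cloned onto the candidate catchment.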

  14. Modeling Primary Breakup: A Three-Dimensional Eulerian Level Set/Vortex Sheet Method for Two-Phase Interface Dynamics

    NASA Technical Reports Server (NTRS)

    Herrmann, M.

    2003-01-01

    This paper is divided into four parts. First, the level set/vortex sheet method for three-dimensional two-phase interface dynamics is presented. Second, the LSS model for the primary breakup of turbulent liquid jets and sheets is outlined and all terms requiring subgrid modeling are identified. Then, preliminary three-dimensional results of the level set/vortex sheet method are presented and discussed. Finally, conclusions are drawn and an outlook to future work is given.

  15. Using climate models to estimate the quality of global observational data sets.

    PubMed

    Massonnet, François; Bellprat, Omar; Guemas, Virginie; Doblas-Reyes, Francisco J

    2016-10-28

    Observational estimates of the climate system are essential to monitoring and understanding ongoing climate change and to assessing the quality of climate models used to produce near- and long-term climate information. This study poses the dual and unconventional question: Can climate models be used to assess the quality of observational references? We show that this question not only rests on solid theoretical grounds but also offers insightful applications in practice. By comparing four observational products of sea surface temperature with a large multimodel climate forecast ensemble, we find compelling evidence that models systematically score better against the most recent, advanced, but also most independent product. These results call for generalized procedures of model-observation comparison and provide guidance for a more objective observational data set selection. Copyright © 2016, American Association for the Advancement of Science.

  16. Preference Mining Using Neighborhood Rough Set Model on Two Universes.

    PubMed

    Zeng, Kai

    2016-01-01

    Preference mining plays an important role in e-commerce and video websites for enhancing user satisfaction and loyalty. Some classical methods are not applicable to the cold-start problem, when the user or the item is new. In this paper, we propose a new model, called parametric neighborhood rough set on two universes (NRSTU), to describe the user and item data structures. Furthermore, the neighborhood lower approximation operator is used to define the preference rules. Then, we provide the means for recommending items to users by using these rules. Finally, we give an experimental example to show the details of NRSTU-based preference mining for the cold-start problem. The parameters of the model are also discussed. The experimental results show that the proposed method presents an effective solution for preference mining. In particular, NRSTU improves the recommendation accuracy by about 19% compared to the traditional method.
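The neighborhood lower approximation operator mentioned above has a compact definition that can be sketched on a toy universe. This is a generic single-universe illustration of the operator (the paper's NRSTU model works over two universes and adds parameters), with an assumed distance function:

```python
def neighborhood(x, universe, dist, delta):
    """delta-neighborhood of x: all objects within distance delta of x."""
    return {y for y in universe if dist(x, y) <= delta}

def lower_approximation(target, universe, dist, delta):
    """Neighborhood lower approximation: objects whose whole
    delta-neighborhood lies inside the target set. These objects
    certainly belong to the target, so they yield certain rules."""
    return {x for x in universe
            if neighborhood(x, universe, dist, delta) <= target}
```

In a recommender setting, `universe` would hold user (or item) feature vectors, `dist` a feature-space metric, and `target` the set of objects sharing a preference, so the lower approximation extracts the users for whom the preference rule holds with certainty.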

  17. Dynamic graph of an oxy-fuel combustion system using autocatalytic set model

    NASA Astrophysics Data System (ADS)

    Harish, Noor Ainy; Bakar, Sumarni Abu

    2017-08-01

    The evaporation process is one of the main processes, besides the combustion process, in an oxy-combustion boiler system. An Autocatalytic Set (ACS) model has been successfully applied to develop a graphical representation of the chemical reactions that occur in the evaporation process in the system. Seventeen variables identified in the process are represented as nodes, and the catalytic relationships are represented as edges in the graph. In this paper, the graph dynamics of the ACS are investigated further. Using the Dynamic Autocatalytic Set Graph Algorithm (DAGA), the adjacency matrix of each graph and its relation to the Perron-Frobenius theorem are investigated. The connection of the resulting dynamic graph to type-1 fuzzy graphs is then established.
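The role of the Perron-Frobenius theorem here follows the standard graph-theoretic ACS criterion (due to Jain and Krishna): a directed catalytic graph contains an autocatalytic set if and only if the largest eigenvalue of its adjacency matrix is at least 1. A minimal sketch of that test (not the DAGA algorithm itself):

```python
import numpy as np

def has_autocatalytic_set(adjacency: np.ndarray) -> bool:
    """Jain-Krishna criterion: a directed catalytic graph contains an
    ACS iff the Perron-Frobenius (largest-magnitude) eigenvalue of its
    adjacency matrix is >= 1. A cycle pushes the spectral radius to 1."""
    eigenvalues = np.linalg.eigvals(adjacency)
    spectral_radius = max(abs(eigenvalues))
    return bool(spectral_radius >= 1.0)
```

A two-node catalytic cycle passes the test, while a one-way chain (no node catalyses its own production, even indirectly) fails it.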

  18. Selection of experimental modal data sets for damage detection via model update

    NASA Technical Reports Server (NTRS)

    Doebling, S. W.; Hemez, F. M.; Barlow, M. S.; Peterson, L. D.; Farhat, C.

    1993-01-01

    When using a finite element model update algorithm for detecting damage in structures, it is important that the experimental modal data sets used in the update be selected in a coherent manner. In the case of a structure with extremely localized modal behavior, it is necessary to use both low and high frequency modes, but many of the modes in between may be excluded. In this paper, we examine two different mode selection strategies based on modal strain energy, and compare their success to the choice of an equal number of modes based merely on lowest frequency. Additionally, some parameters are introduced to enable a quantitative assessment of the success of our damage detection algorithm when using the various set selection criteria.
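One of the strain-energy-based selection strategies contrasted above can be sketched as ranking mode shapes by their modal strain energy, phi^T K phi, instead of by frequency. This is an illustrative version under assumed conventions (mode shapes stored as columns, stiffness matrix K already assembled), not the paper's exact criterion:

```python
import numpy as np

def select_modes_by_strain_energy(K: np.ndarray,
                                  modes: np.ndarray,
                                  n_select: int):
    """Rank mode shapes (columns of `modes`) by modal strain energy
    phi^T K phi and return the indices of the n_select highest-energy
    modes -- in contrast to simply taking the n_select
    lowest-frequency modes."""
    energies = np.array([phi @ K @ phi for phi in modes.T])
    order = np.argsort(energies)[::-1]        # descending energy
    return sorted(order[:n_select].tolist())
```

For a structure with localized modal behavior, this kind of ranking can pick up the high-frequency local modes that a lowest-frequency cutoff would exclude.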

  19. Fluvio geomorphic set-up of Noctis Fossae in Noctis Labyrinthus of Syria-Planum Provenance, Mars

    NASA Astrophysics Data System (ADS)

    Chavan, A. A.; Bhandari, S.

    2017-12-01

    The modern era of planetary exploration has revealed fluvial or fluvial-like landforms on the extraterrestrial surfaces of planets and moons of our solar system. This has posed interesting challenges for advancing our fundamental understanding of fluvial processes and their associated landforms on planetary surfaces, especially on Mars. Earlier studies have recognised that channels and valleys extensively dissect the Martian surface. The valleys are low-lying, elongate troughs surrounded by elevated topography. Moreover, valley networks on Mars are the most noticeable features attesting that different geological processes, and possibly climatic conditions, prevailed in the past and played a vital role in shaping the Martian topography. Channel incisions result from the combined effects of tectonics, surface runoff and groundwater sapping. The components of surface runoff have been deciphered with the help of morphometric exercises. Further, geomorphological studies of these landforms are critical to understanding the regional tectonics. The present work is an assessment of the fluvio-geomorphic set-up of Noctis Fossae in Noctis Labyrinthus of the Syria Planum province, Mars. This study focuses on the fluvio-geomorphology of the southern highlands (0° to 40°S, 85° to 120°W) to determine how these features were formed, which processes formed these valleys, and the probable causes resulting in the development of the topography. Keywords: Noctis Fossae; Noctis Labyrinthus; Syria Planum; Mars

  20. Impact of the choice of the precipitation reference data set on climate model selection and the resulting climate change signal

    NASA Astrophysics Data System (ADS)

    Gampe, D.; Ludwig, R.

    2017-12-01

    Regional Climate Models (RCMs) that downscale General Circulation Models (GCMs) are the primary tool to project future climate and serve as input to many impact models to assess the related changes and impacts under such climate conditions. Such RCMs are made available through the Coordinated Regional climate Downscaling Experiment (CORDEX). The ensemble of models provides a range of possible future climate changes around the ensemble mean climate change signal. The model outputs, however, are prone to biases compared to regional observations. A bias correction of these deviations is a crucial step in the impact modelling chain to allow the reproduction of historic conditions of, e.g., river discharge. However, the detection and quantification of model biases are highly dependent on the selected regional reference data set. Additionally, in practice it is usually not computationally feasible to consider the entire ensemble of climate simulations, with all members, as input for impact models which provide information to support decision-making. Although more and more studies focus on model selection based on the preservation of the climate model spread, a selection based on validity, i.e. the representation of the historic conditions, is still a widely applied approach. In this study, several available reference data sets for precipitation are selected to detect the model bias for the reference period 1989 - 2008 over the alpine catchment of the Adige River located in Northern Italy. The reference data sets originate from various sources, such as station data or reanalysis. These data sets are remapped to the common RCM grid at 0.11° resolution and several indicators, such as dry and wet spells, extreme precipitation and general climatology, are calculated to evaluate the capability of the RCMs to reproduce the historical conditions. 
The resulting RCM spread is compared against the spread of the reference data sets to determine the related uncertainties and
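One of the indicators named above, dry spell length, is straightforward to compute from a daily precipitation series. A minimal sketch, assuming the common convention of a 1 mm/day wet-day threshold (the study's exact definition is not given here):

```python
def longest_dry_spell(precip, wet_threshold=1.0):
    """Longest run of consecutive days with precipitation below the
    wet-day threshold (mm/day). Applied per grid cell, this is one of
    the indicators used to compare RCM output against reference data."""
    longest = current = 0
    for p in precip:
        current = current + 1 if p < wet_threshold else 0  # extend or reset run
        longest = max(longest, current)
    return longest
```

Running the same indicator over each reference data set and each RCM gives the two spreads whose comparison the abstract describes.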